Apple is building its own Mac CPUs, does this mean ARM has won? | Upscaled

  • Published: 21 Oct 2024

Comments • 1K

  • @juanmj93
    @juanmj93 4 years ago +520

    This video is amazingly informative and goes well beyond the typical “ARM is more efficient but instructions are different so apps will have to be ported and that’s it”

    • @campkira
      @campkira 4 years ago +8

      x86 is still better for a normal CPU...

    • @arcowo
      @arcowo 4 years ago +5

      @@yougoonie3338 "consumer world", sure, but the world as a whole runs on Linux

    • @cokesucker9520
      @cokesucker9520 4 years ago +7

      For now at least, but in the long haul ARM looks like it will overtake x86. The most powerful supercomputer in the world runs on ARM, costs way less to build than an x86 equivalent, and uses less power. The writing is on the wall here; it's just a matter of when.

    • @thebuddercweeper
      @thebuddercweeper 4 years ago

      @@arcowo I was literally about to say this lol

    • @thebuddercweeper
      @thebuddercweeper 4 years ago +6

      @@campkira Did you see Craig Federighi editing 3 streams of 4K video on what is essentially a 2-year-old tablet CPU using integrated graphics? Yeah, no, I think I'll go with ARM.

  • @DadIsntMad
    @DadIsntMad 4 years ago +209

    One of the most thorough videos explaining the actual importance of ARM vs x86 and its history. Nice work

  • @8aravindk
    @8aravindk 4 years ago +552

    Superb explainer, thank you. I read 50 articles yesterday saying the Mac is switching to ARM, and no one bothered to explain why.

    • @Rex55590
      @Rex55590 4 years ago +8

      I think Apple explained it pretty well in the conference but who has the time to watch the whole thing

    • @RaquelFoster
      @RaquelFoster 4 years ago +16

      The only reason "why" is that it'll give them a much higher profit margin. It'll be a while before we start seeing things recompiled for ARM and can talk about what the performance difference actually is.

    • @runcmd1419
      @runcmd1419 4 years ago +21

      @@RaquelFoster Laptops not acting as space heaters is another.

    • @gogobrasil7185
      @gogobrasil7185 4 years ago +12

      @@RaquelFoster also the number of apps mac can run has more than tripled? Maybe that's a bit of an incentive too?

    • @RoHan-83
      @RoHan-83 4 years ago +5

      Gogo Brasil, RUN CMD - don’t try logic and facts on people plz, it might crush their beliefs

  • @MiltonGeorges
    @MiltonGeorges 4 years ago +111

    I just wanna say, I really appreciate these very well produced, highly technical, yet still accessible videos from Engadget. This series is awesome!

  • @AU-hs6zw
    @AU-hs6zw 4 years ago +472

    A video on RISC-V? Soon??
    Yes of course.....:)

    • @DuvJones
      @DuvJones 4 years ago +2

      You know, now that he has basically gone through the Cliff's Notes version of "RISC-I: what is that?"...
      Yeah, I would like him to lay out the landscape of architectures and chips, because for the last 2 years (x86 aside) it's been rather hot.
      Take something he passes over here: the POWER ISA (Instruction Set Architecture) has never really gone away. It's still big in the server (and supercomputer) space, and two things have kept it in the discussion. First, POWER (as of POWER5+) is open-sourced, and every so often IBM updates the open ISA (now at version 3.1), which it did this year. Second, Samsung's semiconductor division will be involved in producing some of those chips (POWER10 on its 7nm node), and the open licensing means any fab can produce said chips (TSMC included). Given how well 7nm has been going for AMD, Apple and the entire ARM industry over at TSMC, and the fact that POWER historically has nasty issues with wattage and heat (which 7nm seems to address, judging by AMD's Threadripper)... keep an eye on this old dog. It's got some new tricks, and IBM seems to want the core-count crown with POWER10.
      Then add to this that RISC-V has been the talk of the architecture town when it comes to compute, with a lot of movement from the industry side of things...
      Yeah, it's about time we got a quick, Cliff's Notes overview of chips and architectures. I think we will all need it.

    • @anix86
      @anix86 4 years ago +3

      Apple might use RISC-V and get rid of ARM... they call it Apple silicon, maybe for a reason. Over the years, the ARM partnership would have given them all the knowledge they need to make their own chips.

    • @kusumayogi7956
      @kusumayogi7956 4 years ago

      RISC V is LAME

    • @dbtest117
      @dbtest117 4 years ago

      @@anix86 Apple, Acorn, and a few others formed the Advanced RISC Machines company. That's why "Acorn" was dropped, as the current ARM was based on Apple's need for a processor for their Newton machine.

    • @DuvJones
      @DuvJones 4 years ago +3

      @@anix86
      To be honest, it would be VERY hard for Apple to hide that, and Apple likes staying hidden. The community around RISC-V LIKES being open and tends to have its discussions in the open. They would be VERY leery of Apple taking an interest, considering how many of Apple's open projects tend to flounder and how hostile Apple tends to be toward outside projects at times.
      Also, it would likely need access to some internal modules of XNU, and Apple doesn't talk about its kernel much (if at all).

  • @yensteel
    @yensteel 4 years ago +214

    Dude, get into teaching hardware/software engineering. This was explained REALLY well! Call it Eng-get-it XD

  • @JimmyLinPhD
    @JimmyLinPhD 4 years ago +20

    Very well done video! Just a couple of minor comments:
    • ARM is simply pronounced “arm”, it isn’t spelled out. In fact, the company Arm Holdings has changed the spelling to “Arm” to emphasize this.
    • While David Patterson is a professor at UC Berkeley, John Hennessy is a professor at Stanford. His group did not work on Patterson’s RISC-I; they had their own architecture called MIPS - the same MIPS that Silicon Graphics eventually used.

  • @martinsauer5944
    @martinsauer5944 4 years ago +316

    I learned more about computer architecture here than I did in my first semester of university.
    Thanks for the great video!
    I didn’t know you before this video but I just subbed :)

    • @locatedeatar3642
      @locatedeatar3642 4 years ago

      x2

    • @elck3
      @elck3 4 years ago

      You didn’t know Engadget?

    • @poppygin4807
      @poppygin4807 4 years ago +1

      well, it was only first semester.

    • @BlackEagle352
      @BlackEagle352 4 years ago +5

      You probably didn't listen, which you will deny.

    • @TimSheehan
      @TimSheehan 4 years ago +1

      They're more focused on the fundamentals of programming and maths in first year; this kind of thing gets covered in much greater detail in second or third year.

  • @anmolagrawal5358
    @anmolagrawal5358 4 years ago +95

    What compression algorithm do you use to pack all that info into a mere 14-minute video? Nice job, really!

    • @zaphenath6756
      @zaphenath6756 4 years ago +9

      middle out. i think pied piper came up with it

    • @dbtest117
      @dbtest117 4 years ago

      He left out a lot of details (the who, the what, and so on). He also left out that x86 today is a hybrid design between RISC and CISC, which AMD was the first to implement and Intel adopted later. Most of it is irrelevant, but interesting; e.g., when the A in ARM was changed from Acorn to Advanced, it was in large part because of Apple and their Newton. But there were other companies involved too.

    • @sdwvit
      @sdwvit 4 years ago

      gzip

    • @nicholaslandolina
      @nicholaslandolina 4 years ago

      Info phishing

  • @Vojoor
    @Vojoor 4 years ago +44

    Damn, that’s a great video, Kudos to everyone involved.

  • @ZaprenK
    @ZaprenK 4 years ago +59

    You're back! Finally

    • @Travisharger
      @Travisharger 4 years ago +1

      Stoked about Apple making their own Mac chips.

  • @FromScratchWithLove
    @FromScratchWithLove 4 years ago +1

    Good explainer; however, the current iPad Pro does not *approach* the speeds of *some* laptops; it is FASTER than over 90% of them. Also, Apple's approach to this is different from everyone else's. Apple only licenses the instruction set, not any of ARM's reference designs. Apple's chips are based on the work of PA Semi, which they purchased in 2008 and which used to design PowerPC chips. Think of Apple's SoCs as modern PowerPC chips, and you'll start to understand why they have such a huge performance advantage over everyone else who makes this type of processor.

  • @deiviuds
    @deiviuds 4 years ago +32

    Please do a RISC-V video!

  • @TonkarzOfSolSystem
    @TonkarzOfSolSystem 4 years ago +126

    When you find out that ARM descends from Acorn

    • @moow950
      @moow950 4 years ago +7

      Yes, my first computer was the Acorn BBC Model B!!

    • @mrrolandlawrence
      @mrrolandlawrence 4 years ago +6

      I cut my teeth programming ARM assembler :) on the Acorn A440!

    • @gareginasatryan6761
      @gareginasatryan6761 4 years ago +1

      Roland Lawrence how old are u

    • @axethepenguin
      @axethepenguin 4 years ago +2

      @@gareginasatryan6761 Read the comment; the Acorn A440 is a very old computer. Think before you comment.

    • @IanValentine147
      @IanValentine147 4 years ago +1

      I cut my teeth programming games in 6502 assembler on the BBC; ARM was my wet dream at the time. (Author of Galaxy Raiders for the BBC Micro - this was pre-Elite!)

  • @Sacchidanand
    @Sacchidanand 4 years ago +42

    Yes, please do a video on RISC-V

  • @Martin-cc5xn
    @Martin-cc5xn 4 years ago +1

    11:00 Wrong! Macs were running Motorola 68k (CISC) series processors, not x86. They used the 68k series from the first Mac model until transitioning to PowerPC (RISC) in the mid 90s.
    The PowerPC-based Macs used machine-code-level translation (a technology purchased from a British company, Transmeta if I recall, which they renamed Rosetta) to translate 68k instructions into PowerPC instructions. This would emulate 68k software so it could run on PowerPC-based Macs, aid the transition for users running older software, and buy time for developers to re-engineer their apps as native PowerPC. Later they did the transition from PowerPC to Intel x86 with universal binaries that contained both PowerPC- and x86-compiled code in one app, and the OS would choose which to run based on the Mac's processor hardware. In the keynote they've suggested the same blended approach for the transition to ARM. So the Mac started life as 68k CISC, then PowerPC RISC, then x86 CISC, and its future is now ARM RISC (Apple's variant of it).
    Apple's interest in ARM began with the search for a low-power, high-performance processor needed for the Apple Newton. They were uncomfortable about working with Acorn because Acorn's desktop computer manufacturing was a conflict of interest, so Acorn spun out its processor design division as ARM and Apple bought in. All of which ultimately led to ARM-architecture chips being in the iPhone, iPad and later iPods. Everyone in the UK pronounces it as "arm" and not A-R-M, by the way 😉
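
    The universal-binary idea above amounts to compiling the same source once per architecture and letting the OS pick the matching slice at launch. Below is a minimal sketch in C, assuming the architecture macros that Clang/GCC commonly predefine (__x86_64__, __aarch64__, Apple's __arm64__); exact macro names and build flags depend on the toolchain:

        /* One source file that a fat/universal build compiles once per slice. */
        #include <stdio.h>

        static const char *build_arch(void) {
        #if defined(__x86_64__)
            return "x86_64";                      /* Intel slice */
        #elif defined(__aarch64__) || defined(__arm64__)
            return "arm64";                       /* ARM slice */
        #else
            return "unknown";
        #endif
        }

        int main(void) {
            /* Each slice simply reports which architecture it was compiled for;
             * the loader chooses the slice that matches the machine. */
            printf("running the %s slice\n", build_arch());
            return 0;
        }

    Translation layers such as Rosetta only come into play when no native slice exists for the machine's architecture.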

  • @SalarKalantari
    @SalarKalantari 4 years ago +4

    Yes, please make a video on RISC-V. RISC-V needs all the publicity it can get.

  • @suhassreenivas1878
    @suhassreenivas1878 4 years ago

    This video should be in a computer architecture course. The history, context and explanation are so good and accessible. Video on RISC-V, please!

  • @h00b00
    @h00b00 4 years ago +16

    The video equivalent of reading an Anandtech/Ars article.
    Much thanks.

  • @JoeMarler58
    @JoeMarler58 4 years ago

    This is an excellent video. Whoever wrote and produced this did a good job. They managed to compress into a 14 min video a tutorial on machine code, an accurate history of RISC vs CISC, and the Apple ARM transition. They mentioned the often overlooked point about lower RISC instruction density impacting instruction and data cache. The choice of graphics and archival b-roll was good.

  • @jaytorr6701
    @jaytorr6701 4 years ago +15

    I had an Acorn Archimedes 300. At that time I had a windowing interface and could do video editing and image editing with thousands of polygons and a 64K colourmap... it could do 3 MIPS back then. I never really understood why the x86 technology took over...

    • @moow950
      @moow950 4 years ago

      Because of legacy applications

    • @jaytorr6701
      @jaytorr6701 4 years ago

      @@moow950 True. IBM and other manufacturers did not have the guts to go with what was the better solution. At the same time, this will be the 4th time Apple changes its core processor architecture.

  • @potawatomi100
    @potawatomi100 4 years ago +1

    Very well explained and your narration is evidence that you have talent: use of language, facial expressions, emotional disposition, hand gestures - really well narrated. You’d make a great investigative reporter.

  • @akashbahety3865
    @akashbahety3865 4 years ago +20

    RISC-V PLEASE

  • @Sitti2300
    @Sitti2300 4 years ago +1

    Finally understood why ARM is such a big deal. Great video!! Others' videos on this topic were quite vague and didn't sound like they actually knew what they were talking about. This could be something worth upgrading to from my still-going-strong 2012 rMBP.

  • @demonitized6208
    @demonitized6208 4 years ago +108

    Imagine macbooks with 15 hours battery 😲

    • @juanmj93
      @juanmj93 4 years ago +24

      They could also keep it at around 10 hours but drastically reduce battery size, and therefore weight and thickness. I actually hope they aim at something in the middle; 15 hours seems very reasonable.

    • @juanmj93
      @juanmj93 4 years ago +15

      @Demonitized But something like 20-plus hours, charging your laptop every other day, could be a huge marketing point for Apple.

    • @stephenborgella2312
      @stephenborgella2312 4 years ago +17

      Imagine a 24 hour MacBook Air 🤯

    • @Blovmag
      @Blovmag 4 years ago +11

      I think they will give the consumer all the options. A thin and light Macbook with long battery life, and a „Pro“ Macbook which is more powerful

    • @utubekullanicisi
      @utubekullanicisi 4 years ago

      @@juanmj93 Making a MacBook Air even thinner could be considered a good thing, since that machine's aim is to be as thin and light as possible. But even if they made the batteries of the Pro models smaller, not needing discrete GPUs that pull an extra 50W from the battery would nearly double battery life, and the space saved by removing the dGPU also allows for a bigger battery.

  • @notlessgrossman163
    @notlessgrossman163 4 years ago

    Superb exposition; these types of videos give good understanding above all the hype.

  • @neoasura
    @neoasura 4 years ago +7

    Eh, I'll wait for a few years. The Surface Pro X is ARM based, and it has a serious lack of ARM compatible programs, and running anything x86 based on it is a pain in the ass.

    • @nameless-user
      @nameless-user 4 years ago +1

      @neoasura From what I've seen so far, Apple seems pretty on the ball on compatibility. Microsoft has never played nice with ARM (RT and the X experiment), and Apple has done this rodeo twice already. They're staking a lot of their future prospects on this, not just some one-off product.

    • @VeritasVortex
      @VeritasVortex 4 years ago +1

      That's because Microsoft was never serious about making x86 apps compatible with ARM-based computers, even with the new emulation in the Surface Pro X. I have a feeling Apple will get it right.

  • @shanereagan3498
    @shanereagan3498 4 years ago

    I love these Upscaled videos so much. Please keep them coming!

  • @gunaseelanselvam2374
    @gunaseelanselvam2374 4 years ago +3

    Sure, please do the video on RISC-V

  • @DarthTechwarrior
    @DarthTechwarrior 4 years ago

    Great video. I'm actually really liking this Upscaled series now. Everything is explained so well.

  • @rufioh
    @rufioh 4 years ago +24

    I'm just curious to see if Apple's silicon will be directly purchasable by the consumer. Also, will Apple's work on Rosetta make Microsoft re-evaluate ARM-based Surfaces?

    • @marvean2519
      @marvean2519 4 years ago +4

      @@ricardo-neves Right. What I don't understand is that the MacBook Pro just received an upgrade with a 10th-gen Intel CPU. Just seems like bad timing for consumers.

    • @josephlemons
      @josephlemons 4 years ago +3

      Marvin Marker that’s why I waited lol

    • @marvean2519
      @marvean2519 4 years ago +3

      @@josephlemons Sure. Though we still don't know which Macs will be receiving those ARM chips first, nor do we know how well those first ARM Macs will perform. It's a bit of a gamble (depending on your needs) either way.

    • @dawidvanstraaten
      @dawidvanstraaten 4 years ago +3

      @@marvean2519 They'll have to support the Intel Macs for some time. They released a Mac Pro not too long ago, then the MacBook Pros, and there are also some unreleased Intel Macs still coming.

    • @vedshah8784
      @vedshah8784 4 years ago +2

      @@marvean2519 That was for the keyboard fix. As far as I can tell, Apple is saving the brand-new designs for their own silicon; that's how you drive up consumer adoption, and devs will follow along.

  • @arvindkanesan
    @arvindkanesan 4 years ago

    Coming from someone who loved & excelled in computer architecture back in college a few years back, this video does an amazing job at being technically accurate & packing a lot of information into a mainstream tech video.

  • @mrrolandlawrence
    @mrrolandlawrence 4 years ago +19

    the dev machine with 12 cores... might just end up being the "bottom end" ;) when the arm mac is eventually launched...

    • @1idd0kun
      @1idd0kun 4 years ago +4

      Probably. Apple has done wonders with their custom ARM chips, but they still don't have the single-core performance of Intel and AMD's best x86 architectures. 12 Apple ARM cores are probably equivalent to 6 Comet Lake cores. That being the case, an ARM-based iMac would probably need at least 24 ARM cores to have good performance.

    • @taurus20077
      @taurus20077 4 years ago

      @@1idd0kun Maybe ARM cores/chips will take up the whole back of the iMac.

    • @radun70
      @radun70 4 years ago +1

      @@1idd0kun The 2020 iPad Pro is about equal to a MacBook Pro 13"... if the fall 2020 MacBook Pro has 50% more performance cores, then we'll begin to see things really taking off. And that's on first-gen 5nm; they're probably not maxing out the process node's potential. So the following year we'll see another increase in performance, and then another one, etc. Intel has been on 14nm for like 8 years and is only starting on 10nm. I have almost no use for macOS and I want to get one just because it's so interesting.

    • @1idd0kun
      @1idd0kun 4 years ago

      @@radun70 "2020 iPad pro is about equal to an Macbook Pro 13"
      About as equal in a single benchmark, Geekbench. If you know anything about Geekbench, you'd know is hardly representative of actual workloads.
      Besides, Macbook 13 probably doesn't have enough cooling to let the Intel CPU going all out anyway.

    • @blaze6400
      @blaze6400 4 years ago +1

      @@kevin.callens Geekbench is designed around ARM processors, so x86 processors get lower results than they should.

  • @BillRey
    @BillRey 4 years ago +1

    Only one thing was slightly wrong: it was mentioned that the chips inside the iPad Pro come close to the performance of some laptops. The reality is that the iPad Pro is already faster than the vast majority of PC laptops today.

  • @ArianSharifian
    @ArianSharifian 4 years ago +4

    I've been a software engineer for way longer than I want to admit, and I still learned a lot from this video. Very good job 👍🏽

    • @cat-.-
      @cat-.- 3 years ago

      Same!!

  • @julzriosdirector
    @julzriosdirector 3 years ago

    Awesome explanation and very well illustrated, very well done guys! I finished up a bit confused at the end but I'm looking forward to seeing the second part. Cheers! 🇨🇦

  • @ilovelimpfries
    @ilovelimpfries 4 years ago +35

    All i know is TSMC is the winner in this "war."

    • @DuvJones
      @DuvJones 4 years ago

      You think?

    • @subjord
      @subjord 4 years ago +5

      They would also win with AMD processors. They've probably already won. Intel doesn't take orders from other companies for chips, and Samsung is mainly producing chips for itself. TSMC is the biggest and probably the best option to go for when you want to produce your own chips.

    • @miguelpereira9859
      @miguelpereira9859 4 years ago +1

      Yep TSMC has gotten used to winning

    • @FE0070
      @FE0070 4 years ago

      But their stock value has not grown much in recent years; that's a mystery.

    • @miguelpereira9859
      @miguelpereira9859 4 years ago +1

      @@FE0070 That's not true, their stock has nearly tripled in the last 5 years. That's excellent performance

  • @rdsm9374
    @rdsm9374 4 years ago

    That was by far the best video about this subject! I hope you do the RISC-V video!

  • @FunfakeElectronics
    @FunfakeElectronics 4 years ago +3

    This news is amazing, can't wait to see the new ARM based macbooks.

  • @stachowi
    @stachowi 4 years ago

    Coming from an EE/CS professional... this is SO GOOD. Educational and not dry. Love this series.

  • @Powhart
    @Powhart 4 years ago +4

    I AM HYPED About this!!! Can't wait to see what Apple has prepared!

  • @danielwoods7325
    @danielwoods7325 4 years ago

    Love these Upscaled vids - perfect balance of actual technical information and understandable explanation!

  • @XJLCA
    @XJLCA 4 years ago +39

    Nice. A cute presenter who also knows his stuff. Impressive.

    • @jhtrico1850
      @jhtrico1850 4 years ago +13

      Stop objectifying him

    • @audy6947
      @audy6947 4 years ago +1

      Why r u geh

    • @eerereps
      @eerereps 4 years ago +2

      @@audy6947 who says he's geh?

  • @tomboss9940
    @tomboss9940 4 years ago +1

    The role of Apple is a bit downplayed here. ARM was a joint venture of Acorn and Apple, which needed the ARM and StrongARM chips for their Newton. So the Newton is the grandfather of all modern mobile devices. And PowerPC was a joint venture of Apple, Motorola and IBM.

  • @JulianAndresKlode
    @JulianAndresKlode 4 years ago +3

    What's up with mispronouncing ARM as A.R.M. while RISC, SPARC, and MIPS are right?

  • @arunkumarnsmorpheus1983
    @arunkumarnsmorpheus1983 4 years ago

    To the point and precise. Wish I had teachers of your aptitude. Good work, mate. Keep up the good work and stay blessed.

  • @wepranaga
    @wepranaga 4 years ago +4

    cloud computing? maybe no
    desktop/personal computing? yes, definitely

    • @josephbargo5024
      @josephbargo5024 4 years ago +1

      Depends on how well it scales. If it can scale even somewhat reasonably, I disagree with you, but we don’t know yet.

    • @bartomiejkomarnicki7506
      @bartomiejkomarnicki7506 4 years ago +1

      The most powerful supercomputer literally runs on ARM, and there are plenty of ARM-based processors.

    • @UlfricStromcloak
      @UlfricStromcloak 4 years ago

      @@bartomiejkomarnicki7506 The Fujitsu A64FX you are talking about is very different from your run-of-the-mill ARM CPU. To put it simply, it can act as both CPU and GPU; it's a hybrid design. Cortek has a video on it.

    • @wepranaga
      @wepranaga 4 years ago

      for cloud, we're talking about million upon millions of dollars worth of hardware from cloud providers. it'd take billions to migrate from existing x86/64 to arm hardware given that's only running in the background for your device to talk to so it's not like hardly anyone would notice it.
      not to mention that companies had enterprise softwares that runs on these machines needs to be ported too. that'd take resource. not to mention translating would cost some performance overhead to run software that's not intended for it. if not complete rewrite is necessary.
      meanwhile, apple is in their class of itself in terms of arm's processor. so it's hardly other companies would likely at their hardware level soon.
      but the fact that apple is bringing developers and encouraging them to transition to their platform (putting xcode and write native apps for their arm hardware) it's not impossible that developer community would come up with a lot of software written for arm hardware. and that might translate to the arm software landscape as a whole.

  • @sadatnafis2032
    @sadatnafis2032 4 years ago

    This series has started to grow on me. Great info, well researched and fairly in-depth, paired with your presentation; can't ask for more. And hell yes to an episode on RISC-V.

  • @59Goku
    @59Goku 4 years ago +4

    "Apple and ARM"
    Non-tech people: -wtf face-

  • @WarriorsPhoto
    @WarriorsPhoto 4 years ago +1

    I couldn’t watch this whole video the first time but I knew you were going to explain modern CPUs very well. You did and I am very impressed. Very good info about modern CPUs and how RISC and CISC chips are similar yet different, thank you.

  • @SamuelSarette
    @SamuelSarette 4 years ago +3

    I've never in my life as an engineer heard someone spell out ARM, and it's especially jarring since you don't do the same for the other acronyms.
    It's just "arm", not "ayy arr emm". The video was great, but that just kept taking me out of the moment.

  • @DannyWilliamH
    @DannyWilliamH 4 years ago +1

    Yeah, there's the story of ARM: when even THEY realized that the computation delivered compared to the power used was almost miraculous, they didn't believe it. They checked everything about 50 times over before finally accepting that it was a major breakthrough.
    It's a good story, and videos covering it are online.

  • @AarshParashar
    @AarshParashar 4 years ago +3

    RISC-V + Linux + Steam = Future of Personal Computing!

  • @EduardoRFS
    @EduardoRFS 4 years ago

    Seriously bro, you did a weirdly good job here: easy-to-digest content, and it's actually correct. I was expecting "RISC good, CISC bad", but you even pointed out that it's not so simple.

  • @_digifish
    @_digifish 4 years ago +4

    I think you are oversimplifying the software (and hardware) compatibility issues. Apple has a terrible track record with backward and forward compatibility for both software and hardware. While this won't affect most users of social media apps, browsers and email, I am sure there will be huge compatibility issues for creatives using specialised software. It also means Boot Camp is probably toast, along with a whole range of engineering-first applications.

  • @mikeandersonwa
    @mikeandersonwa 4 years ago +1

    I am so damn excited about this change! I plan on buying the first ARM MacBook that they release later this year, and testing everything out using that while keeping my desktop for work purposes. It's going to be really exciting to see what kind of performance we are able to get from these machines!

  • @BaghaShams
    @BaghaShams 4 years ago +5

    Nobody calls it A-R-M. It's just pronounced "arm".

  • @jekae61
    @jekae61 3 years ago +1

    Wow, I stumbled on this by mistake, and man, thank god I did!
    It's informative, thank you, more please!

  • @GinoObiso
    @GinoObiso 4 years ago +6

    I just graduated in software engineering in just 14 minutes

  • @sledgehamma
    @sledgehamma 4 years ago +4

    Extremely well researched video, as usual from you! Cheerio

  • @vigd6298
    @vigd6298 4 years ago +11

    2018: Apple claims it's making its own 5G modem
    2019: Apple buys Intel's modem business
    But 2020: Apple gives up and uses a 5G modem from Qualcomm. LOOOOL

    • @officialANON001
      @officialANON001 4 years ago +4

      Apple bought Intel's modem business so it can design its own in the next 4 years. In the meantime they're gonna buy modems from Qualcomm but make their own antennas for them.

    • @Dazzxp
      @Dazzxp 4 years ago +1

      @@officialANON001 Don't think it will be in your best interest; they are only doing it so they can go from a 300% profit margin to a 600% profit margin on their products.

    • @montex66
      @montex66 4 years ago +2

      And your point is what? That Apple was "wrong"? Guess what, nobody (not even Apple) can foresee the future and what the market is going to do. This idea that Apple is dumb because they changed plans according to market or developmental forces is vapid and immature.

  • @EricsonHerbas
    @EricsonHerbas 4 years ago

    Well explained video. Good editing too. Great job!

  • @ToxikDnB544
    @ToxikDnB544 4 years ago +5

    Intel: **Hahaha, you're never gonna beat me, AMD!**
    AMD: **Hahaha, you're never gonna beat me, Intel!**
    Apple: **You're right... BYEE!!**

  • @mitchelligbafa2365
    @mitchelligbafa2365 4 years ago

    I'll have to rewatch the video to understand it in depth, but you did a very good job explaining. Nice one ✌️

  • @imhafdhom
    @imhafdhom 4 years ago +5

    Those new Macs are going to be just another iPad
    ...with extra parts

    • @Seatux
      @Seatux 4 years ago

      As long as it's not basically an extra-large iPad with fixed RAM and storage. If that's the case, just buy an iPad really; it's basically the same thing then.

    • @kaiwalpanchal5728
      @kaiwalpanchal5728 4 years ago +1

      @@Seatux I'd buy an iPad with 16 gigs of RAM!

  • @onlythunder3323
    @onlythunder3323 4 years ago

    That is really well explained dude! Nice work.

  • @TheoWerewolf
    @TheoWerewolf 4 years ago +5

    Apple represents a small fraction of all desktop and laptop systems for Intel (around 10% worldwide if we're being generous). Losing Apple (which won't even happen for two more years, BTW) just isn't that big a thing. Then there's the server and workstation markets, in which Apple is essentially a non-starter (expensive cheese graters included).
    And we keep forgetting about AMD who, at the moment, are the real threat to Intel: they're coming out with faster, lower-power x86 CPUs, and they own the ATI GPU platform, unlike Intel, which has never really risen above "here's something to keep you going until you get a real GPU".
    So, no, ARM hasn't won; it's not even really playing the same game.
    ARM has the mobile market. That's inarguable, and it's a big (and mainly low-end) market with a LOT of ARM chip makers in it: Samsung, Huawei, Broadcom, Fujitsu, Marvell, Qualcomm, Rockchip and so on. If THAT wasn't a threat, Apple isn't either.
    Apple isn't even the first company to make an ARM version of their main devices. Microsoft tried it with the Surface RT, then introduced the "Always On" computer design that Asus and HP used to create an ARM tablet and laptop running Windows 10 for ARM, and then released their own Surface Pro X.
    Seriously, get yer butt out of Apple's walled hole and realise there's more to the world than this one company.

    • @nameless-user
      @nameless-user 4 years ago +7

      I'd beg to differ on downplaying the impact ARM has had. You mention that Apple is only about 10% of the PC industry, yet the modern PC industry has been suffering heavily from the gradual shift to ARM-based mobile devices. There's a good reason why both Intel and AMD are primarily aiming their products at the gaming crowd: that market segment is the only one still growing, with literally every other segment shrinking as people switch to mobile devices for their computing instead.

    • @ricoswave2326
      @ricoswave2326 4 years ago +2

      If I was losing a client which made up 10% of my sales, I would be apoplectic. Even at half that amount, which is closer to the mark, it is a *very* big deal for Intel. Having a Billion dollar client evaporate is not a minor setback.

  • @tthurlow
    @tthurlow 4 years ago

    Very well explained. I did notice one slide that makes no sense: "RISC: 1 instruction per Hz". This reads as "1 instruction per 'per second'", since Hz is a measure of frequency (some number of occurrences per unit of time). What it should say, and what I think the video is actually saying, is "1 instruction per clock cycle".
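
    For what it's worth, the relationship that slide is gesturing at is simply throughput = instructions per cycle (IPC) x clock frequency. A tiny illustrative sketch in C; the IPC and clock values below are made-up example numbers, not figures from the video:

        #include <stdio.h>

        int main(void) {
            double ipc = 1.0;        /* assume one instruction retired per clock cycle */
            double clock_hz = 3.0e9; /* assume a 3 GHz clock */
            printf("%.1e instructions per second\n", ipc * clock_hz);
            return 0;
        }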

  • @chaobintang
    @chaobintang 4 years ago

    I am surprised I learned this much history of hardware and software in 14 minutes. I am mesmerised by how well the content is presented; everything appears well connected and perfectly contextual.

  • @KieferNguyen
    @KieferNguyen 4 years ago +3

    Mark my words, this will be a huge flop. Apple just partnered with the wrong company for the CPU. Here is why AMD will win the laptop and mobile space: Apple may try as hard as they can, but Apple just doesn't have the GPU knowledge AMD has. RDNA 2, which is going to be running in Samsung's mobile flagship, will be a game changer. That's the mobile space. On the laptop and desktop side of things, AMD will leverage deep integration between their GPU and CPU; AMD SmartShift technology will help with that integration. We are already seeing the 4700U and 4800U destroying the Intel equivalents in laptops. For Apple to achieve 4800U-equivalent performance, their MacBook Pro is using a Core i9. A 12-core ARM CPU? Ryzen will just beat it. Apple's GPU? RDNA 2 will just destroy it.

  • @jecelassumpcaojr890
    @jecelassumpcaojr890 4 years ago

    John Hennessy did MIPS at Stanford while Dave Patterson did RISC-I at Berkeley. They wrote a fundamental book on computer architecture together, but their projects were different.

  • @capNsgt
    @capNsgt 4 years ago +4

    They should've gone with AMD

    • @BlownMacTruck
      @BlownMacTruck 4 years ago +1

      You clearly missed the point. It's kind of amazing how badly you did.

    • @ICEMANZIDANE
      @ICEMANZIDANE 4 years ago +1

      Exactly, Apple feels kinda patriotic tbh. They definitely should have moved to AMD

  • @BramdeVogel
    @BramdeVogel 4 years ago

    Absolute best video on the topic I've seen to date, please consider making a video on RISC-V too :)

  • @ksoman953
    @ksoman953 4 years ago +1

    Apple has been rumored to be doing this since the A10 hit the market, so this was a long time coming... but in Apple's typical way, they took their time to launch with exactly the feature set they intended.

  • @yourstrulysidhu
    @yourstrulysidhu 4 years ago +2

    Such detail! My goodness! Wonderful job, sir ☺

  • @utubekullanicisi
    @utubekullanicisi 4 years ago +1

    The thing I'm most excited about is that this will change laptop designs FOREVER. While the competition has chunky, beefy fans that run super loud, Apple's machines will be hyper thin, super light and absolutely silent (while still having better performance than laptops with fans). And the tradeoff of not being able to run some apps smoothly for a few years is absolutely worth it for what Apple's SoCs will bring to the table. Not only will not needing discrete GPUs save space for a bigger battery, but the A-series chips' outstanding efficiency will allow those Pro machines to run for HOURS even if you run Pro apps on them.

  • @deedas
    @deedas 4 years ago

    You are the only people I’ve seen so far talk about Apple’s return to RISC. Thank you for being more in touch with our roots.

    • @sonofage
      @sonofage 4 years ago

      That's true. I forgot Apple did this before. At least he did his research well.

  • @hiitsnate
    @hiitsnate 4 years ago

    Would love more videos of this topic; fantastic job!

  • @robertotomas
    @robertotomas 4 years ago

    RISC was the particular approach born of Stanford and UC Berkeley work, but in essence it was a return to earlier architectures from prior decades. I think you are right to highlight the reduction aspects of RISC and credit the scholarship (although even there, those ideas did not come out of the blue, just the particular approaches used). Overall, I think it is only fair to credit RISC as the slow, global approach to computing. CISC approaches were its natural extensions, and many were great local minima, but eventually a return to the central ideas was inevitable. What is amazing is just how well some (particularly minimal, RISC-like) CISC architectures persevered, such as x86.

  • @DjebbZ
    @DjebbZ 4 years ago +2

    Please do an episode on Risc-V. And thanks for this episode, really good one.

  • @povelvieregg165
    @povelvieregg165 4 years ago

    Thank you!! I knew a lot of this already, but you put it together really well, and it was refreshing to finally see someone comment on this switch who has done their homework properly and isn't totally clueless.

  • @rivercenter9068
    @rivercenter9068 4 years ago

    This... is one of the most informative videos I've seen in a while. WELL DONE.

  • @eerereps
    @eerereps 4 years ago +1

    very informative, Thanks!

  • @ezmod0
    @ezmod0 4 years ago

    Very nice video, I actually learned something. Most coverage of this is talking heads repeating what was said in the keynote.

  • @charleschaimkohl
    @charleschaimkohl 4 years ago

    Excellent video!! After watching numerous videos explaining this, I finally understand..

  • @rcgnetworks
    @rcgnetworks 4 years ago +1

    Very detailed video. Thanks bro

  • @Jsmith1611
    @Jsmith1611 4 years ago

    5 stars. Super clear, explanations. Best tech video this month.

  • @johningraham6547
    @johningraham6547 4 years ago

    My understanding is that a key difference between CISC and RISC processors is that CISC processors can operate directly on memory, i.e. you can multiply two numbers in memory directly. RISC processors require you to load the value from memory into a register before you can work with it. Modern CISC processors have an instruction decoder that internally breaks such memory instructions into the same kind of load-and-operate steps.
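
    As a rough illustration of that load/store difference, here is a minimal C sketch; the assembly in the comments is hand-written to suggest what a compiler might typically emit, not output captured from any particular toolchain:

        /* Increment a counter that lives in memory. */
        void add_in_memory(int *counter, int x) {
            *counter += x;
            /* A CISC (x86-64) compiler can fold the memory access into one instruction:
             *     add dword ptr [rdi], esi
             * A RISC (AArch64) compiler goes through registers: load, modify, store:
             *     ldr w8, [x0]
             *     add w8, w8, w1
             *     str w8, [x0]
             */
        }

    Either way the memory location is read and written; the CISC encoding just packs those steps into one instruction, which modern x86 decoders then split back into micro-ops internally.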

  • @homeworld1031tx
    @homeworld1031tx 4 years ago

    This is really an amazingly well put together and researched video. Thank you engadget.

  • @NavamNiles
    @NavamNiles 3 years ago

    Love Upscaled. Well done

  • @Alster763
    @Alster763 4 years ago

    This is the best video explaining the RISC vs CISC situation currently at play and Apple's new announcement. Great work!!

  • @nocivolive
    @nocivolive 4 years ago +1

    What most people forget is that Apple only licenses the ARM instruction set. The rest is all Apple's design, while most other companies license a whole ARM core.

  • @FullNarutoIdiot
    @FullNarutoIdiot 4 years ago

    Awesome explanation! Looking forward to the RISC-V video!

  • @slap_my_hand
    @slap_my_hand 4 years ago

    RISC-V is only an open instruction set, not a design. Anyone can make a RISC-V CPU, and these designs can be either open source or proprietary. The instruction set can also be extended, so software won't be compatible across chip vendors unless they agree on a standard set of extensions.
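
    To make that concrete: software built for RISC-V targets a base ISA plus a chosen set of extensions, and toolchains expose that choice to code. A small sketch, assuming the __riscv, __riscv_xlen and __riscv_compressed macros that RISC-V GCC/Clang ports commonly predefine (check your toolchain before relying on them):

        #include <stdio.h>

        int main(void) {
        #if defined(__riscv)
            /* __riscv_xlen is the base register width the build targets (32 or 64). */
            printf("built for a %d-bit RISC-V target\n", (int)__riscv_xlen);
        #if defined(__riscv_compressed)
            /* Defined when the compressed (C) extension was enabled for this build. */
            printf("this build assumes the C (compressed) extension\n");
        #endif
        #else
            printf("not a RISC-V build\n");
        #endif
            return 0;
        }

    A binary built assuming an extension won't run correctly on a core that lacks it, which is why agreeing on a common set of extensions matters for cross-vendor compatibility.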

  • @DomainObject
    @DomainObject 4 years ago

    Awesome video. Very detailed and informative. Thanks!

  • @samskyverareddy3135
    @samskyverareddy3135 4 years ago

    A video with solid content and entertaining. Great job!

  • @00bikeboy
    @00bikeboy 4 years ago

    Super impressive summary, well done.

  • @Agreedtodisagree
    @Agreedtodisagree 4 years ago +1

    This video should be used in university-level classes to give historical background on the modern processors we use and to get upcoming CS majors excited about the future.

  • @Raffix394
    @Raffix394 4 years ago

    In the early days .. Assembler...
    As a software developer I can tell you assembler is still more commonly used than you might think, especially in devices that need to be as cheap as possible when thousands of them get produced. ECUs in cars come to mind, and I'm sure many other applications too. All to reduce production cost, even though storage and CPU power are really cheap these days.