
Why Apple Stopped Using Intel Chips

  • Published on Feb 8, 2026
  • 🌎 Get an exclusive 15% discount on Saily data plans! Use code APPLEEXPLAINED at checkout. Download Saily app or go to saily.com/appl... ⛵
    ________________________
    You may’ve recently discovered that Apple stopped using Intel chips in their computers. Which is a bit surprising, given Steve Jobs praised Intel back in 2005. And what Jobs said was true, but that was almost two decades ago, and the relationship between Apple and Intel has been rocky.

Comments •

  • @AppleExplained
    @AppleExplained  Year ago +84

    🌎 Get an exclusive 15% discount on Saily data plans! Use code APPLEEXPLAINED at checkout. Download Saily app or go to saily.com/appleexplained ⛵

    • @EternalSnivy1196
      @EternalSnivy1196 Year ago +14

      After Honey I don’t trust any YouTube sponsor

    • @benjaminanddanielle
      @benjaminanddanielle Year ago +3

      Hi

    • @eksmad
      @eksmad Year ago

      @EternalSnivy1196 You should never trust sponsorships/ads on YouTube and just assume from the beginning that it's a scam.

    • @PeterBorbas_3
      @PeterBorbas_3 Year ago +4

      @appleexplained Can you please make a video about why chromebooks failed?
      Or about why apple switched to M series chips?
      Or why don’t iphones have M chips like ipads, if they’re better?
      Or why do ipad M chips have “more M’s” than macbooks?

    • @Locutus
      @Locutus Year ago +2

      Don't advertise Fum vapes anymore...

  • @velociraptor5962
    @velociraptor5962 Year ago +1542

    90nm... and here we are talking about 2nm. wow.

    • @betag24cn
      @betag24cn Year ago +59

      after 14nm, only the name changes
      no wow anywhere

    • @dennyroozeboom4795
      @dennyroozeboom4795 Year ago +70

      @betag24cn Well not exactly. It is more dense, the numbers just don't reflect actual measurements anymore.

    • @betag24cn
      @betag24cn Year ago +13

      @dennyroozeboom4795 true, all you can do is look for benchmarks

    • @slipoch6635
      @slipoch6635 Year ago +2

      @betag24cn The hardest part is finding benchmarks that do not favour particular accelerations so you can get a true apples to apples comparison in a variety of workloads.

    • @ryanboscoe9670
      @ryanboscoe9670 Year ago +33

      I remember the 338nm days 😂

  • @devsda1
    @devsda1 Year ago +4236

    Yeah, I “recently discovered” this 4 years ago…

    • @carlweston4808
      @carlweston4808 Year ago +83

      recently discovered? recently discovered? recently discovered?

    • @DuesenbergJ
      @DuesenbergJ Year ago +163

      Thank you for playing the engagement game. The algorithm will be pleased.

    • @d33zknots88
      @d33zknots88 Year ago +5

      And?

    • @mendodave
      @mendodave Year ago +80

      I heard Apple makes an M1 chip!

    • @rileyinbali
      @rileyinbali Year ago

      No way!!! That's amazing! Is it x86 or Arm? ​@mendodave

  • @saifahmad1921
    @saifahmad1921 Year ago +855

    Best decision they’ve ever made.

    • @mr.dude1338
      @mr.dude1338 Year ago +17

      agreed

    • @gabrielreisinger8047
      @gabrielreisinger8047 Year ago +18

      Intel, AMD, and Qualcomm can still match the M series Macs in single-core performance.

    • @alexcuevas5633
      @alexcuevas5633 Year ago +40

      the biggest reason why I and a lot of people are buying Macs now

    • @Websurger
      @Websurger Year ago +27

      @gabrielreisinger8047 Do they match performance per watt though?

    • @jinraigami3349
      @jinraigami3349 Year ago +11

      @gabrielreisinger8047 They couldn't. Where they could compete is multicore performance.

  • @SteveMichael
    @SteveMichael Year ago +685

    Another side benefit of switching to Intel back in 2006 was that, for the first time, it allowed people with MacBooks to run Microsoft Windows without emulation. This was huge and got a ton of people who were scared of trying a Mac to adopt one. I was one of them, getting a Mac Pro with Linux and Windows VMs running at near-native speed.

    • @tino94
      @tino94 Year ago +41

      Now we return to emulation due to the different architectures between x86 and ARM

    • @VolkerHett
      @VolkerHett Year ago +14

      Yes, this includes me! I bought the white Macbook with C2D because it won a test against a Sony Vaio of comparable weight and size. With Windows Vista it was faster than the Sony.
      Guess what, I never installed Vista on it.

    • @paradoxzee6834
      @paradoxzee6834 Year ago +11

      One of the biggest downsides of using Intel was that people saw Macs as just PCs with Mac OS.
      Many people just did not see it as worth paying more just to have a different OS.
      I remember the debates on online forums back in the day about whether Macs are PCs or not.

    • @tino94
      @tino94 Year ago +24

      @paradoxzee6834 By definition, Macs are computers, and they still are, regardless of whether they use PowerPC, Intel or Arm processors. They have the basic elements of a PC, such as memory, storage, a processing unit, and input and output interfaces. There is nothing that a Mac has that differentiates it from a PC, only the operating system, that is, Mac OS. The debate is silly, as it has been tainted by a good "Mac vs PC" marketing strategy created by Apple. It should actually be Mac vs Windows, since there are also other operating systems like GNU/Linux.
      And yes, by that definition, smartphones are also computers.

    • @everettputerbaugh3996
      @everettputerbaugh3996 Year ago +4

      Did anybody notice that Mac OS is based on a version of Unix, a Ma Bell development? Nice GUI, though.

  • @lazyman2451
    @lazyman2451 Year ago +302

    I don’t know, but my MacBook Air gets me through 12 hours of harsh work, from onsite to remote. It’s nuts how such a thin laptop can deliver that kind of performance and durability: about 18 hours of use and still 20% battery left.

    • @user-Old_Ben
      @user-Old_Ben Year ago +35

      Apple's SoC design with the "M" chips is very power efficient. This is similar to iOS/iPhone, which was easier to start as an SoC design because those devices had to run mostly on battery.
      An SoC, or system on a chip, runs macOS (or Windows in emulation) on a single chip... I love the versatility of Intel-style processors, allowing greater compatibility and sometimes interesting or useful upgrade options. The newer M4 Macs look like a great value at this point.

    • @jeremymoore145
      @jeremymoore145 9 months ago +1

      I’m having battery issues with my MacBook with Intel.

    • @Niibaah
      @Niibaah 8 months ago +2

      We used to have phones run all day without dying …

    • @shootz2344
      @shootz2344 8 months ago +9

      ​@Niibaah yeah and they ran way less shit than modern phones so

    • @BrooklynBalla
      @BrooklynBalla 6 months ago +3

      ⁠@Niibaah lol ya my old flip phone used to run for like 3-4 days before needing a charge

  • @CherryColaWizard
    @CherryColaWizard Year ago +340

    Apple's chips are impressive. Fast and power efficient, which is something Intel struggles to do. I just wish that Apple cared about making products that are easy to repair. Nobody wants to have to replace their laptop because the permanently attached SSD stopped working.

    • @mwangidanson2615
      @mwangidanson2615 Year ago +21

      In a capitalist world, that's a double profit opportunity.

    • @CherryColaWizard
      @CherryColaWizard Year ago +10

      @mwangidanson2615 Sad, but true

    • @DragonsinGenesisPodcast
      @DragonsinGenesisPodcast Year ago +11

      The SSDs are now easily replaced. So the thing you wish they had, they now have.

    • @CherryColaWizard
      @CherryColaWizard Year ago +7

      @DragonsinGenesisPodcast That's good news. Let's hope they continue to do this for all of their products

    • @narrativeless404
      @narrativeless404 Year ago +8

      That's not because of Apple being good at it. It's just because of how inefficient and bloated (like Windows 11 lel) the x86 architecture is.
      If Intel were to make an ARM chip, they wouldn't be any worse than Apple.

  • @itsokayrozay
    @itsokayrozay Year ago +91

    You and Brandon Butch have a competition on who has the best news anchor voice 😂😂😂

  • @concertvids34
    @concertvids34 Year ago +76

    I literally was frying an egg as part of breakfast for dinner when you used the analogy of a computer chip being so hot you could fry an egg on it.

  • @ScribblingOnTheWalls
    @ScribblingOnTheWalls Year ago +185

    Basically, Intel's inability to innovate to Apple's liking caused them to look at alternatives before realising they already had one in the form of their iPhone ARM processors, which would be more than powerful enough if scaled up a bit.
    And yeah, Intel's refusal to collaborate on the iPhone will probably go down in history alongside Blockbuster laughing Netflix out of the office as one of the worst calls in tech industry history, even if at the time it looked like a sensible decision.

    • @jesfel14
      @jesfel14 Year ago +17

      Add Yahoo knocking back offers from Google and Microsoft to the list.

    • @Bewefau
      @Bewefau Year ago +1

      they used last gen intel chips lol

    • @oakspines7171
      @oakspines7171 Year ago +1

      Not an easy job for Apple to switch HW and all the corresponding ecosystems. They did it. Apple is an innovative and successful company, and the market rewards it accordingly.

    • @jesfel14
      @jesfel14 Year ago +1

      @ I can see AMD doing it if the hand is forced.

    • @BradHouser
      @BradHouser Year ago +4

      @rosl-80 Otellini was not for it, but he later admitted that his gut had told him "yes", and he regrets the decision. The sad thing is Intel had an ARM processor called X-Scale, which they could have promoted to Apple, but they wanted x86 chips in the iPhone, which was too expensive.

  • @smidgeondutchrabbit
    @smidgeondutchrabbit Year ago +1032

    intel used to be able to make good processors but now they can only make hot plates

    • @Joe-TechandBlog
      @Joe-TechandBlog Year ago +59

      This is what happens when you don't have competition for the last decade or so and then you are hit in the face with AMD Ryzen. I think Nvidia is next, as the power on the RTX 5090 is insane.

    • @scottgfx
      @scottgfx Year ago +53

      Intel started making hotplates in 2000 with the Pentium 4. If my understanding is correct, they went back to the Pentium 3 and re-engineered it to become the "Core" line of chips.

    • @OctavioGaitan
      @OctavioGaitan Year ago +19

      I use an old Pentium 4 Prescott hot plate to cook hot dogs. LOL :P

    • @tichaonanhlangano1426
      @tichaonanhlangano1426 Year ago +5

      Man shows his face, hmm, 2025 kicks off

    • @auritro3903
      @auritro3903 Year ago +10

      Chill man... it's winter. Someone needs to stay warm.

  • @OKZ15
    @OKZ15 Year ago +518

    Apple Explained's face showing is such a weird concept to me. But I think I like it.

    • @copyer9088
      @copyer9088 Year ago +25

      He’s done it before, are you new to this channel?

    • @YahmahaR7
      @YahmahaR7 Year ago +26

      Looks like AI

    • @glacieawn
      @glacieawn Year ago +15

      @YahmahaR7 not really

    • @FruityKoala
      @FruityKoala Year ago +11

      It almost kind of looks AI. Could just be a high frame rate or something though.
      But the voice does sound off

    • @alexej123
      @alexej123 Year ago +6

      It looks like the channel was hijacked, because I don't remember videos on this channel spreading so much BS before. This is the second of his videos I've seen recently after a long time of not watching (and before that I only watched a very few, on topics that interested me), but both of the recent videos came with very obvious BS, treating the viewer as dumb, or Apple a$$ licking, that wasn't there before. It used to be a pretty okay channel, now it's just...

  • @kevikiru
    @kevikiru Year ago +222

    21:44 The A4 wasn't introduced with the iPhone 4, it was introduced with the original iPad. In fact, the A4 in the iPhone was significantly downclocked. Small detail though... awesome documentary.
    Edit: Another small detail... when you say that they did not have to pay their chip supplier, it doesn't make sense. Maybe what you mean is that they did not have to pay for the margins of an off-the-shelf chip. They still had to pay Arm to license the ARMv7 ISA, they had to pay Samsung to manufacture the chips, and they had to pay licensing fees to Imagination Tech. for GPU designs.

    • @dlnishantabhishek2815
      @dlnishantabhishek2815 Year ago +12

      Samsung doesn't manufacture the chips; it's TSMC

    • @KrisiCrossi
      @KrisiCrossi Year ago

      @dlnishantabhishek2815 they actually DO make their own chips, but they partner with TSMC for the phones (for example, they put their own chips, named "Exynos", in the non-American S20s)

    • @MikeinAustin
      @MikeinAustin Year ago

      @dlnishantabhishek2815 Samsung doesn’t manufacture the ARM chips for Apple. They do for others.

    • @EddieStarr
      @EddieStarr Year ago +12

      Apple announced the A4 in January 2010, and Steve Jobs confirmed the iPhone 4 would use it in June 2010. The A4 was also used in the first-generation iPad, fourth-generation iPod Touch, and second-generation Apple TV. I was there the day it was announced.

    • @triadwarfare
      @triadwarfare Year ago

      ​@dlnishantabhishek2815 They used to manufacture chips with Samsung, until Apple's lawsuit alleging that Samsung was copying them; the year after that, they made an exclusivity contract with TSMC and had first dibs on new nodes.

  • @qwerty6789x
    @qwerty6789x 7 months ago +40

    That's what you get when companies are run by MBAs instead of engineers

    • @RunsWithChainsaw
      @RunsWithChainsaw Month ago +1

      And idiotic MBAs, at that.
      No, Pat, it wasn't OK for your CPUs to run at 95C.
      I'd still buy a B580 if they went on sale for $200. Taking a big chance buying intel these days. It might just explode after 6 months...
      Too bad management can't properly identify the B580 and B570 as loss leaders. Their chance to regain some public trust. Gamers may not make up a big part of the revenue stream but they are 98% of the hype train. If the people in enterprise don't even want to buy intel for their kid's gaming PC, why would anyone think they'd make a multi-million dollar purchase from intel at work?

    • @Varshan_Prabu_M
      @Varshan_Prabu_M Month ago +2

      U need both mate.

    • @randomname435
      @randomname435 17 days ago

      @Varshan_Prabu_M Bros trying to justify his degree.

  • @anthonykoller4459
    @anthonykoller4459 Year ago +57

    I had an Intel MacBook, and when I changed over to the M series I was amazed how fast they are, the battery lasts for ages, and more importantly there's no more fan noise

    • @TomJones-tx7pb
      @TomJones-tx7pb Year ago +6

      And now I use an M4 mini, and my M1 laptop feels slow!

    • @dheerajb1883
      @dheerajb1883 5 months ago +3

      Wow man, I wish they'd become compatible with my CAD application, then many of us would be able to feel the power!

    • @biffbobfred
      @biffbobfred 3 months ago +1

      When I used to do a multi vid stream Zoom call, on the Intel MacBook the fans would spin up like a jet engine. No longer.

  • @Ad-skip
    @Ad-skip Year ago +76

    01:31 ad skip

  • @theredrighteye
    @theredrighteye Year ago +13

    I think our bro lagged by making this video like the intel chips.

  • @IgorsPlay
    @IgorsPlay Year ago +159

    I’m really glad they did, M chips are fantastic! 🎉

    • @chillinwithluis
      @chillinwithluis Year ago +12

      Damn right!

    • @tzacks_
      @tzacks_ Year ago +8

      sure they are, 99% of them are used for office-type workloads.

    • @IgorsPlay
      @IgorsPlay Year ago +9

      @ You are mistaken, 99.9% of them are used for creative work.

    • @tzacks_
      @tzacks_ Year ago +6

      @IgorsPlay these days every cpu has hw video encode, 3d accelerator.. so yea, office type.

    • @Tryna-B-Good
      @Tryna-B-Good Year ago +17

      It’s amusing to see many Apple haters stating Apple never invents anything, they just copy…yet they have the fastest domestic/private computers in the world built with chips that they designed and built lol

  • @DavidRavenMoon
    @DavidRavenMoon Year ago +74

    It should be noted that ARM was founded as a joint venture between Acorn Computers, Apple, and VLSI Technology. Acorn provided 12 employees, VLSI provided tools, and Apple provided a US$3 million investment (equivalent to $7 million in 2023).
    Larry Tesler, an Apple VP, was a key person, and he helped recruit the joint venture's first CEO, Robin Saxby.
    Apple had been using Acorn’s Archimedes in the Newton.

    • @TheEulerID
      @TheEulerID Year ago +15

      No, Apple did not use the Archimedes in the Newton device, or in anything else for that matter. Archimedes was a line of Acorn computers, not a processor architecture. That was always called ARM, which powered the Archimedes but then stood for Acorn RISC Machine, and was originally developed by the company on a shoestring with a tiny handful of engineers. When the Acorn Archimedes got steamrollered by the PC architecture, the processor business was split off and targeted at the mobile market due to its astonishingly low power consumption, and Acorn RISC Machines became Advanced RISC Machines, which was where that critical Apple investment came in, with the use of the ARM processor architecture in the Newton device.
      Ironically, the Newton failed in the market, but the ARM processor became wildly successful, and Apple divested their share in ARM when they were in desperate need of capital.
      Nb. by coincidence or not, Isaac Newton was a professor at Cambridge, and Acorn, and therefore ARM, was founded in that university city.

    • @felipe367
      @felipe367 Year ago +8

      @TheEulerID imagine if Apple had kept their share of ARM 😊

    • @TheEulerID
      @TheEulerID Year ago

      @felipe367 Then it might not have been so successful. Those in the mobile appliance market would not have been keen on being reliant on IPR and licenses granted by a company that was in large part owned by a competitor. It's especially lucky that it didn't happen, as Apple's track record on trying to block out competitors and using its IPR as a means of doing so is not great. ARM, majority owned by SoftBank, does not really compete with its own customers, and when Nvidia tried to buy ARM it caused major competition concerns among both the company's customers and regulators in the UK, USA and EU.
      Besides which, Apple were in a desperate way at the time. They needed the money.

    • @kurt9395
      @kurt9395 Year ago

      @felipe367 During the late 90's - early 2000's, if you followed Apple's financial reports, they would regularly report sales of their seemingly bottomless pit of ARM shares. The reason was most likely to bring their numbers up to Wall Street analyst expectations.

    • @kris-u4n6j
      @kris-u4n6j 10 months ago

      @TheEulerID Acorn and VLSI were on the brink of bankruptcy, and that's when Apple came in and ARM was created. All three owned a third of the shares. Later, Apple had to sell their shares to survive. Now Apple again owns 10% of the company.

  • @UserfeedbackbyTea
    @UserfeedbackbyTea Year ago +42

    130-90nm is wild... In a world where 5nm is seen as old tech and 3nm is the standard... How far we have come

    • @jonathanruiz8723
      @jonathanruiz8723 Year ago +10

      It’s actually more like >13nm but still impressive! Node sizes since the late 90’s are mostly a marketing gimmick and not representative of the actual minimum feature size of a transistor. The figure is now derived from the performance gains realized from improved semiconductor topology. That or the marketing team just throws out a number that is lower than last time .

    • @XashA12Musk
      @XashA12Musk Year ago

      @jonathanruiz8723 you mean if TSMC gets 2nm-node performance from a revised 4nm node, then they can call it a 2nm node?

    • @jonathanruiz8723
      @jonathanruiz8723 Year ago +4

      ​@XashA12Musk Correct... kind of... It's really mostly marketing. For example Intel's 10nm node technically has higher transistor density than TSMC’s 7nm node. The figure did originally match the minimum gate length of a given process' transistors but it hasn't since the mid 90's (My original 2008 guesstimate was off by over 10 years). Most of the gains nowadays are from placing transistors closer together and effective usage of 3D space, rather than just decreasing transistor size. Hence all the hot chips lately.

    • @osdenza
      @osdenza 7 months ago

      "5nm" or "3nm" are just marketing terms; they no longer refer to the gate length of a transistor

    • @wolfy6631
      @wolfy6631 5 months ago

      ​@osdenza Source?
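The "Intel 10nm is denser than TSMC 7nm" point above can be illustrated with numbers. This is a minimal sketch; the density figures are approximate, commonly cited estimates of peak logic-transistor density (not official vendor specs), included only to show that the marketing node name does not order the nodes by density:

```python
# Approximate peak logic-transistor densities in millions of transistors
# per mm^2 (MTr/mm^2). Illustrative estimates, not official figures.
density = {
    "Intel 10nm": 100.8,
    "TSMC N7": 91.2,    # the "7nm" node, despite the smaller name, is less dense
    "TSMC N5": 171.3,
}

# Print nodes from densest to least dense:
for node, d in sorted(density.items(), key=lambda kv: -kv[1]):
    print(f"{node:>10}: {d:6.1f} MTr/mm^2")
```

Sorting by density rather than by node name is exactly why benchmarks, not node labels, are the useful comparison.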

  • @njerurichard3581
    @njerurichard3581 Year ago +10

    25:00 They wanted to build the fastest, most efficient computer possible -while remaining reasonably affordable- .

  • @SeverusBlue
    @SeverusBlue Year ago +18

    Great video, but you completely left out the iPad. I think that would’ve been relevant, since the M1 is a modified A12Z.

  • @Samsgaming310
    @Samsgaming310 Year ago +40

    Yeah… this isn’t “recent” at all, as this has been out for a couple of years now.

  • @whophd
    @whophd Year ago +11

    10:50 I was attending the keynote that year, and in the days that followed, with after-parties every night in San Francisco, word was getting around that Steve made this announcement against the wishes of IBM. Apparently they weren't willing to stick their necks out for "3 GHz", but he stuck their necks out for them. In fact the IBM engineers were saying there's no way they'd reach that speed in 12 months, and were only promising around 2.5 GHz, which is exactly what happened.
    When you think about it, "the Osborne effect" was exactly the sort of thing you try to avoid, especially when the memory of Osborne's mistake was fresh in the 1990s and 2000s. So the fact that Steve directly counteracted the golden rule of the Osborne effect, the rule of not pre-announcing next year's technology while announcing this year's technology, meant that something must have really been up.

  • @IOOISqAR
    @IOOISqAR Year ago +7

    When Apple asked Intel for a smartphone processor, the ARM architecture was not yet settled on. Instead, using ARM was a result of Intel declining to build the processor.

  • @devcybiko
    @devcybiko Year ago +35

    20:32 Commodore owned their own chip manufacturing company, MOS Technology, Inc., which they acquired in 1976. MOS Technology was instrumental in creating some of the most iconic chips of the era, including the 6502 processor, which powered many early personal computers such as the Apple II, Atari 8-bit computers, and of course, Commodore’s own computers.
    Under Commodore, MOS Technology developed custom chips that gave their machines a competitive edge. For example:
    1. The VIC-II - Graphics chip for the Commodore 64.
    2. The SID (Sound Interface Device) - Famous sound chip in the Commodore 64.
    3. The 6510 CPU - Used in the Commodore 64, a variant of the 6502.
    Owning MOS Technology allowed Commodore to reduce costs by vertically integrating their chip production, which was a significant advantage in the competitive home computer market of the 1980s. This strategy contributed to the affordability and success of systems like the Commodore 64.

    • @Bewefau
      @Bewefau Year ago +1

      Jack Tramiel :D "yes, you're a business, of course you want to make money, but you don't need to make a stupid amount of money"

    • @csjakat
      @csjakat Year ago

      And when the Amiga prototype was shown to Steve Jobs, he said "Too much hardware!". I guess he was not a big fan of custom chips back then.

    • @uzlonewolf
      @uzlonewolf Year ago +1

      But... but... but the MBA farm says outsourcing everything is always cheaper! lol

    • @kirishima638
      @kirishima638 4 months ago

      Commodore should have dominated the market but they threw it all away.
      The Amiga in particular was so ahead of its time.

  • @obscuracamaria2931
    @obscuracamaria2931 Year ago +6

    Waiting for a video on the Siri lawsuit

  • @ONE_GEN_X
    @ONE_GEN_X Year ago +19

    I usually don’t like long videos but this was great information and you kept it moving very well. I appreciate it

    • @asaeed86
      @asaeed86 Year ago

      He could’ve just said that people installed Windows instead of buying Macs, and it would’ve taken less than a minute, but that won’t get any monetization

  • @michaelelder3945
    @michaelelder3945 9 months ago +9

    I was there and saw a lot of this as it was happening until I left the computer industry in 2007. I first started working at IBM when we were using the code name "Rios" for what would later be named the RISC/6000. I remember when Apple switched to Intel due to the cooling problem with the G5 processor. I even went to COMDEX '94 in Las Vegas just before Windows 95 was released. Now? ... I'm working as a musician in Monterrey, Mexico. I got into computers because I wanted to use them in Music production and I ended up as an O.S. Specialist. It's fun to take a trip down memory lane, (No pun intended), but now I'm working on learning web design and Latin American Spanish. .. And piano (which will be my 5th and final instrument). I'm still a UNIX nerd, but now I'm more interested in using computers as tools for producing results.

  • @jriver226
    @jriver226 Year ago +8

    TSMC doesn't create their own ARM chips; they are a foundry. They make chips for others.

  • @betamax1091
    @betamax1091 Year ago +5

    and thank goodness they did. I no longer have a room heater / Macbook Pro.

  • @Gaisenberg
    @Gaisenberg Year ago +64

    Dude, where's your signature music?

    • @Pearloryx
      @Pearloryx Year ago +5

      Yeah I kinda miss it, probably because he doesn’t have to add awkward cuts with videos illustrating the chip situation.

    • @thecapone45
      @thecapone45 Year ago +4

      I don’t miss it… I used to binge his videos and the same song over and over would get on my nerves…

    • @euromarkand
      @euromarkand Year ago +1

      it would be silly in such a long video

  • @AchyutChaudhary
    @AchyutChaudhary Year ago +34

    *Next video idea:* who’s the best Apple leaker!
    Mark Gurman, Ross Young, Jon Prosser, Ming-Chi Kuo, etc…

  • @thetypebeast
    @thetypebeast Year ago +6

    Apple's PowerBook sales _were not_ 64% in 1981, as the first Macintosh was released in 1984. The first PowerBook was released in 1991. The G5 chip launched in 2003, so is the pie chart supposed to be 2001 Macintosh sales?

  • @DolapoAmusan
    @DolapoAmusan Year ago +7

    Watching this on my Apple Silicon M1-powered MacBook. The M1 was such a revolutionary laptop chip, especially for tech enthusiasts in Nigeria, for its exceptional power management. Start the day with a full battery and you are free to stay unplugged till nightfall! It gave incredible mobility to the tech worker's life. It's so good I know friends who have not needed to upgrade in the 4 years since it was introduced.

  • @ivanmaglica264
    @ivanmaglica264 Year ago +12

    Moore's Law is, first and foremost, despite the name, an observation. Not even consensus. So there was no violation.

    • @BradHouser
      @BradHouser Year ago +3

      So true, and it is amazing that Moore's predictions were made on something like four generations of technology. That it has lasted this long is incredible. It provided a roadmap that semiconductor equipment manufacturers could help fulfill, sticking to a technology drumbeat of sorts.
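The observation being discussed above, that transistor counts double roughly every two years, is just exponential growth, and can be sketched in one line. The baseline figure (about 2,300 transistors on the Intel 4004 in 1971) is a commonly cited number used here purely for illustration:

```python
def projected_transistors(base_count, base_year, target_year, doubling_years=2.0):
    """Moore's-law-style projection: count doubles every `doubling_years` years."""
    return base_count * 2 ** ((target_year - base_year) / doubling_years)

# Starting from ~2,300 transistors in 1971, 50 years is 25 doublings:
print(f"{projected_transistors(2300, 1971, 2021):,.0f}")  # → 77,175,193,600
```

That the naive projection lands in the same ballpark as real modern chip transistor counts (tens of billions) is exactly why the observation held up as a roadmap for so long.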

  • @HelpfulAssistant
    @HelpfulAssistant Year ago +4

    1:28 the worst part of this is: if your phone is carrier locked, you can’t add eSIMs. Carrier locks are usually placed if you are financing your phone through your phone bill, and will generally lift shortly after you pay off your phone.

  • @portfolio91
    @portfolio91 8 months ago +5

    One of the things not mentioned here is that each change to a new CPU architecture was done in a seamless way. Before I saw it, I had thought such a thing was impossible, because it was so complex.
    Each CPU architecture runs on a different instruction set. This is the low-level Machine language that the CPU runs on. (Your email program, accounting program, word processor and web browser are different applications, although sometimes these days, there are websites that also do it, and run on the browser.) The mac started on the 68000 architecture, transitioned to PowerPC, then to Intel, and now the 'apple silicon' ARM chips (M1, M2, so on). Most of the applications you run on your machine are compiled down to Machine Language so it can just load up and run - often called the "binary" because the raw bytes of the program are totally different. But during these transitions, old applications fundamentally don't work on the new architecture. I'm not just talking about like Italian vs Spanish, I'm talking Italian versus Chinese or Sign Language. Nothing in common besides bytes being 8 bits. Sometimes, the way they numbered bytes was in reverse order!
    Typically, the programmers have to make two different applications, running the compiler & linker once for each architecture. You have to install the one that works on your machine. Then, there were all sorts of details that were different and often the user had to make changes, which were a hassle (for thousands or millions of users). Electronics were different, interfaces were different, plugs were different. Windows usually can't do that, so it's always on Intel's x86 architecture, which gets old.
    But, each time, Apple pulled it off, seamlessly. I remember one friend who got a new machine saying he unplugged the cables from his old machine, put the new machine in its place, plugged in the cables, and was checking email within an hour. Absolutely astounding.
    One technique they used was a 'fat binary'. The application file actually contained two binaries, and the machine would decide which it could run. Apple would supply the software to compile an application that way.
    Another technique they used was emulation. The new chip was faster, so Apple wrote software that would pretend to be the older chip, and could run apps for the older architecture. (the Emulator, it would be slower than the old machine, but everything worked, and you buy a new machine eventually.) I remember attending an Apple lecture describing how they did that in one case. This one programmer was explaining differences between the new emulator and the old chip. All of the differences were very low-level and obscure, and nothing I, as a programmer, had to deal with. I was amazed they did such a good job.
    So, they've gone through three transitions like this, each one seamless. I've got two Macs in my apartment, one Intel, and this one ARM. I can just move apps back and forth, no problems. Absolutely incredible work.
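The "fat binary" idea described above is concrete enough to sketch. A macOS universal binary begins with a big-endian fat header: a magic number (0xCAFEBABE), a slice count, then one 20-byte record per architecture giving its CPU type, offset, and size; the loader picks the slice matching the machine. A minimal Python parser of that header, run against synthetic bytes (the CPU-type table covers only two common values, and the demo header is fabricated for illustration):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # magic number at the start of a universal ("fat") binary
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}  # common cputype values

def list_architectures(blob):
    """Return the CPU architectures packed into a Mach-O fat header."""
    magic, nfat = struct.unpack_from(">II", blob, 0)  # header fields are big-endian
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat):
        # each fat_arch record: cputype, cpusubtype, offset, size, align (20 bytes)
        cputype, _sub, _off, _size, _align = struct.unpack_from(">iiIII", blob, 8 + i * 20)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic two-slice header, just to exercise the parser:
demo = struct.pack(">II", FAT_MAGIC, 2)
demo += struct.pack(">iiIII", 0x01000007, 3, 0x4000, 0, 14)  # x86_64 slice
demo += struct.pack(">iiIII", 0x0100000C, 0, 0x8000, 0, 14)  # arm64 slice
print(list_architectures(demo))  # → ['x86_64', 'arm64']
```

Shipping both slices in one file is what made each transition feel seamless: the same application file "just ran" on old and new machines alike.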

  • @aptiveviennapro
    @aptiveviennapro Year ago +7

    Now it's the reverse.

  • @whophd
    @whophd Year ago +8

    13:46 bit unfair to compare the Pentium I to the PowerPC G3. Surely the better comparison would be Pentium III with PowerPC G3, or Pentium I with PowerPC G1 (e.g. 601 or 604).

  • @darthslackus
    @darthslackus Year ago +23

    As a Windows user for 20+ years this is why I switched to the M4 mac mini.
    Intel and MS can have the 'finger' from me.
    Though Apple still gets the 'finger' from me for being a too tightly closed ecosystem.

    • @tibyanne
      @tibyanne Year ago +1

      Same

    • @xsleeptodreamx
      @xsleeptodreamx Year ago +1

      how do you like it so far? macOS isn’t really a closed system i’d say, just iOS, which is good imo considering their stance on privacy
      i don’t particularly want my phone with GPS, mics, cameras, etc that i have with me at all times to be an open system. there’s little to gain for the loss in security (though people still blow it by installing social media apps and giving them mic and camera access lol)

    • @darthslackus
      @darthslackus Year ago +4

      @sleeptodreamx MacOS is absolutely a closed ecosystem. If it weren't, we'd have more Linux running natively on Apple silicon. Even the M3 & M4 can't run Linux natively.

    • @CPpuppies
      @CPpuppies Year ago +1

      @darthslackus You are interchanging the Apple hardware and software in your comment. Confusing.

  • @pyrioncelendil
    @pyrioncelendil Year ago +6

    27:12 and yet no mention of Intel's via oxidation issue in its 13th and 14th gen chips? I'd argue that was a bigger contributor to their stock price crashing out in 2024.

  • @timothymarks6992
    @timothymarks6992 Year ago +8

    I like all your videos, but will you post more of your 2-3 minute short videos explaining things like what's new in iOS 18.2 or rumors about upcoming apple products or something like that? Still enjoy all your videos, they're good quality content.

  • @robertsteel3563
    @robertsteel3563 Year ago +3

    Long time no see with the face to face conversation Type Video!!

  • @bemacbe
    @bemacbe Year ago +12

    The pie chart at 10:09 says laptops made up 64% of Mac sales in 1981. This is TOTALLY wrong. Consider this - the first Macintosh Portable computer was introduced in 1989. 😂

    • @OscarUnrated
      @OscarUnrated 11 months ago +5

      He must’ve put the year wrong it’s supposed to be around 2003

  • @cocoon2520
    @cocoon2520 Month ago +1

    anddd now we got chips like the 13980HX and 7945HX that constantly stay at 90+ degrees, hot enough to fry an egg

  • @iRedMCYT
    @iRedMCYT 7 months ago +3

    If only game developers would catch up now!

  • @rainwatertea
    @rainwatertea Year ago +5

    One of the BEST narrators on YT

  • @mijmijrm
    @mijmijrm Year ago +9

    i'm still getting over Apple giving up on the 6502

  • @jerelull9629
    @jerelull9629 Year ago +2

    I used an early Compaq "portable" -- It was luggable, but had to be plugged in to work, and floppies formatted by Compaq didn't register on IBM computers, and vice versa. Formatting the floppies on my Mac at home worked for both sides of the "compatible divide".

  • @metyaricioglu9077
    @metyaricioglu9077 Year ago +3

    the best explanation of Apple switching to M chips 👏🏻 congratulations

  • @johnsartain4160
    @johnsartain4160 Year ago

    This is the first video where the sponsor actually has something that I need

  • @Rokuscreensaverwithmusic

    You should do the history of text messaging: from when it first blew up, to the introduction of SMS, then iMessage, then how social media apps began allowing the feature, then when RCS was released, when iPhones got RCS, and how many companies are starting to kill off SMS ❤

  • @gnocci42069
    @gnocci42069 Year ago +2

    This video has me looking up old Apple WWDC events. The way Steve Jobs talks is truly captivating

  • @DankyMankey
    @DankyMankey Year ago +13

    I didn’t know they used to call building a PC “cloning an IBM” in the 90’s

    • @mendodave
      @mendodave Year ago +1

      Yep IBM clone. I had several of those. Even built a few myself.

    • @Pene7942
      @Pene7942 Year ago

      Yea - department stores like Sears, KMart, JC Penney’s, they even used the term in their marketing and in the stores, while also featuring the IBM PC.

    • @gooseknack
      @gooseknack Year ago +2

      Because many were literally "clones" or a carbon copy. This was possible because IBM's original PC system was an open architecture system. This was done to encourage third party peripheral manufacturers to make peripherals.
      The result, was IBM Clones.
      Others, they did the right thing and created "IBM Compatible Computers". These were different from IBM's system, but compatible with IBM software and peripherals.

    • @Bewefau
      @Bewefau Year ago

      IBM was king, 20k computers in the 80's, woot woot. 8088 chips. Your keyboard layout? IBM baby

    • @BradHouser
      @BradHouser Year ago +1

      While it was easy to duplicate the hardware, the IBM PC BIOS firmware was copyrighted, so Compaq had to reverse-engineer its own compatible BIOS. They were successful, and the acid test of compatibility was whether it could boot Flight Simulator, which (before MS bought it) shipped as a bootable disk. Some clone makers were not as good at creating a fully-compatible BIOS and failed the FS boot test. Eventually companies like Phoenix Technologies created "clean room" copies that were licensed to manufacturers. IBM later realized it had made things too easy to duplicate, so it came out with the PS/2 architecture, which had to be licensed by other manufacturers. No one wanted the PS/2, as it was not any better.

  • @LastOneLeft99
    @LastOneLeft99 Year ago

    Triumph of the Nerds is one of my favorite documentaries. I have seen the 4 hour version of it like 20 times when I was a teen.

  • @YahyaRahimov
    @YahyaRahimov Year ago +7

    And people say Apple stopped innovating...

    • @MyWifesBoyfriend-pp8kq
      @MyWifesBoyfriend-pp8kq 10 months ago +3

      They don’t innovate often but when they do it’s impactful, for better or worse.

  • @robertplatt643
    @robertplatt643 9 months ago

    What's that long yellow thing at 16:05?

  • @whophd
    @whophd Year ago +4

    16:32 What really made "both companies are engineering driven" ring true, was the way Intel rolled out new technology much faster with Apple. Gone was the BIOS boot mode, finally replaced with UEFI. 64-bit was missing for 5 minutes but came up pretty quickly with the Core 2 Duo. Work must have begun pretty soon after that on Thunderbolt 1 (joint project between Intel and Apple), and remember how USB 1.0 existed but adoption was going nowhere until the iMac G3. Even the "Core" naming sequence commencement coincided with Apple's transition.

  • @abzzeus
    @abzzeus Year ago +2

    The reason ARM is so power efficient goes back to the very first one, in 1984/85: they wanted to use plastic packaging rather than ceramic because it was cheaper. And that power efficiency? When they went to measure the power usage it was ZERO, because they'd forgotten to attach the power connector; it was scavenging power off the memory bus!

  • @SB7-r9i
    @SB7-r9i Year ago +13

    25:45 on the laptop market. Desktop still has the Ryzen 9 7950X3D. And don't forget about the Threadripper 7995WX.

    • @akyhne
      @akyhne Year ago +2

      The Threadripper is not a desktop CPU. It's a workstation CPU.

    • @StevenSSmith
      @StevenSSmith Year ago

      @akyhne I beg to differ

    • @akyhne
      @akyhne Year ago +1

      @StevenSSmith So, you don't know what a workstation is?

    • @StevenSSmith
      @StevenSSmith Year ago

      @akyhne how do you say "I have autism" without saying "I have autisim"

    • @akyhne
      @akyhne Year ago +2

      @StevenSSmith Say you don't know anything about computers, without saying you don't know anything about computers.

  • @stijn27112
    @stijn27112 8 months ago

    Where did the apple explained sound go

  • @FennecTECH
    @FennecTECH Year ago +3

    I'm watching this on an M1 iPad Pro 12.9, 3 years on. It's still screaming fast and even got Apple Intelligence

  • @replex8889
    @replex8889 2 months ago +1

    Vro has the Homer Simpson Beard

  • @DEXXXO
    @DEXXXO Year ago +3

    Hey Greg when you gonna talk about the Failure of apples Vision Pro Thanks 😉

  • @HiFiGuy197
    @HiFiGuy197 Year ago

    Were the first “Intel Inside” ads rendered on Macintoshes…? I thought I recalled that from 30 years ago…

  • @gregoryhouse7140
    @gregoryhouse7140 Year ago +12

    I am from Poland, I use a Mac Mini M2 Pro and a MacBook M1. After switching from Intel I felt the power of the computer, it is a technological abyss. The Apple M1 is not the first ARM computer, but only now are people ready. The first ARM computer was Acorn's ARCHIMEDES, in 1985. Greetings from Poland.

    • @Bewefau
      @Bewefau Year ago

      with no backwards compatibility

    • @CPpuppies
      @CPpuppies Year ago

      @Bewefau haha backwards

    • @KasakyaGeorge
      @KasakyaGeorge 11 months ago

      Have you tried running Maya renders on your beautiful MacBook ?

  • @hkiajtaqks5253
    @hkiajtaqks5253 2 months ago +1

    Bro started at when Dinosaurs were using phones.

  • @NightSky360
    @NightSky360 Year ago +3

    Best move for Apple developing the M line of chips. I just got the Mac Mini M4 and its the fastest computer I've ever owned.

  • @itchyisvegeta
    @itchyisvegeta Year ago +1

    7:07 - Wow, it's literally Blast Processing

  • @msvd1152
    @msvd1152 2 months ago +3

    You can make a new clip..."Why is Apple using Intel chips again" 😂

  • @Nathan15038
    @Nathan15038 Year ago

    I mean, definitely seeing you sitting from a mic talking on Apple Explained is weird, but I’m all for it😂

  • @ryok8090
    @ryok8090 11 months ago +4

    1:56 anyone else whisper "Risc is good" under their breath?

    • @OroborOSX11
      @OroborOSX11 10 months ago +1

      RISC is king as far as I’m concerned. CISC isn’t worth the complexity and power consumption. Apple Silicon and other RISC CPUs have proven this now for non-mobile use!

  • @Billionairelnvestment
    @Billionairelnvestment 5 months ago +1

    Not everyone follows things the moment they happen - for some people, even 4 years later can still feel like a fresh discovery. No need to dismiss it.

  • @kris-u4n6j
    @kris-u4n6j 10 months ago +3

    Why is it that people forget that Apple founded ARM with VLSI and Acorn? Without Apple, ARM would not exist.

  • @utopian123
    @utopian123 11 months ago

    at 10:09 the pie chart says 64% of all Macintosh sales were laptops... in 1981?

  • @estajosue
    @estajosue 11 months ago +5

    You missed a key point. Yes… reliability, availability, release schedule, and power efficiency all played very big roles in Apple's move away from Intel. However, one of the biggest reasons was thermal efficiency. Apple's brand has always been power AND aesthetics, not one or the other. Apple's aggressively thin and sleek form factors made Intel chips absolutely useless in portable Macs. They were throttled back so far once the computer was asked to do any meaningful task, all because the chip would get way too hot for the cooling system to properly mitigate given the form factor of Apple's designs. The Skylake nightmare broke the camel's back, though it was the final straw more than the deciding one.

  • @jeffevarts8757
    @jeffevarts8757 Year ago +1

    Great video. Great use of the advertisements & presentations. Great technical details, and as far as I could tell, you got em all RIGHT, which is rare

  • @orangejjay
    @orangejjay 11 months ago +9

    22:20 to get to the point.

  • @NeonGreenKing
    @NeonGreenKing Year ago

    Love this format. It’s been a while.

  • @nicholashennessy4543
    @nicholashennessy4543 Year ago +27

    And Intel has no one else to blame but themselves. They were the best on the market for good reason back when Steve was still around, even into the 2010s. Intel at a point got comfortable and stopped innovating, then in a rush to fend off the AMD Zen architecture, they produced flawed products that drew way too much power and produced too much heat. Apple was very smart to pull out when they could.

    • @louiswilliamterminator2887
      @louiswilliamterminator2887 Year ago +2

      they were also incredibly arrogant, bullying and greedy. Probably criminal too

    • @Bewefau
      @Bewefau Year ago

      No it was Meltdown and Spectre that did them in.

    • @systemBuilder
      @systemBuilder 5 months ago

      Marketing people ran Intel from 2008 until 2024. Pat Gelsinger lied so often, made so many stupid moves, and wasted so much money needlessly that he qualifies as a marketing person... They really blew it with 14 nanometers (14nm+, 14nm++, 14nm+++, 14nm++++ !!). They were stuck there for 6 years! They bet against ASML and it was an asinine mistake to make! Now Intel does not have enough market share for a 2nm fab. Even if someone gave it to them they would have a hard time affording the cost of running it! Without cell phone chips to fabricate, any advanced semiconductor manufacturer is completely screwed... The fabs cost too much and the annual volumes from CPUs and GPUs are too small!

    • @avarise5607
      @avarise5607 4 months ago

      @Bewefau funny that Apple chips have been hit by a similar silicon-level bug, so critical that it had to be disabled and worked around in software, essentially the same as Meltdown/Spectre 😂

  • @Astinsan
    @Astinsan Year ago

    powerpc .. I remember those days.. got a xserve dual g5 rack server.

  • @AchyutChaudhary
    @AchyutChaudhary Year ago +141

    *Bro…I appreciate your long-form-style content, but can you also please continue your to-the-point quick explainer videos, and your frequent macOS/ iOS update series please :)*

    • @zellzoi
      @zellzoi Year ago +8

      I love these videos

    • @IonasalPreciel
      @IonasalPreciel Year ago

      Personally, I like the long form videos much more.

  • @Royaleoake
    @Royaleoake Year ago

    I remember those early 90’s Intel and Motorola commercials. Such an awesome time for computers.

  • @apa5749
    @apa5749 Year ago +9

    Apple Silicon is such an amazing breakthrough. Apple is the only company that has managed to disrupt the AMD-Intel x86-64 duopoly in the market. consumers didn't realize how terrible mobile computers are with x86-64 chips. Apple Silicon is proven to be so great that Qualcomm just started to make their own laptop-grade ARM64 chips with Snapdragon Elite series. and guess who was involved there? Gerard Williams, the former CPU engineer of Apple who made the A series chips, i think he was involved in designing the M1 too.
    but there's something that still doesn't make sense to me. it's the existence of Apple Silicon Mac Pro. it's a high end consumer grade workstation. the whole appeal of a Mac Pro has always been self-upgradability while the whole appeal of Apple Silicon is performance per watt and long lasting battery on Macbooks. people who buy this grade of computers don't care about power efficiency, they just want performance. self-upgradability is not possible with Apple Silicon, you can't upgrade the memory or the graphics. once you got an Apple Silicon Mac Pro then the only thing you can add is just the PCIE expansions so an Apple Silicon Mac with a case that big doesn't make sense.

    • @lycanthoss
      @lycanthoss 7 months ago

      Apple using ARM for Macs doesn't make sense to you because you have the wrong idea about CPU instruction sets. The instruction set is only the "language" the CPU uses to understand instructions. The ISA does impact the decoders and things around it, but in modern CPUs the impact to efficiency is minimal. ARM CPUs aren't really any more efficient than x86 CPUs, at least not to a meaningful degree. Intel or AMD can 100% make a CPU with similar or even better efficiency using x86. In fact, under heavier loads Apple chips don't really perform much better than the new AMD or Intel chips.

  • @barrymaslow8441
    @barrymaslow8441 6 months ago +1

    Dude said "had shrank"... LOL!
    Editor pleeeeease!

  • @riseofthethorax
    @riseofthethorax Year ago +4

    Thanks! In essence, Apple custom designs their chips to address the needs of the platform, users, and applications... and I agree. Oh, and thanks for talking about the PowerPC line, I had wondered what happened with that. Part of the selling point of those PCs was that Apple could run custom hardware emulations at suitable speeds on native hardware... the capacity to run Windows and macOS on the same computer.

    • @ibranidi
      @ibranidi Year ago

      can i also have $1.99 😢

  • @Indrid__Cold
    @Indrid__Cold Year ago

    6:00 I saw this video at the Intel Overdrive product launch banquet. Still makes the hairs on the back of my neck stand up!

  • @pratronald
    @pratronald Year ago +8

    25:09 Affordable?!

  • @HairyDalek
    @HairyDalek Year ago +10

    ARM processors are not “in their infancy” - they’ve been around since the 1980s. en.wikipedia.org/wiki/ARM_architecture_family

    • @yeahthatkornel
      @yeahthatkornel 10 months ago

      He didn't say they're in their infancy NOW. He meant back then.

  • @markmaz56
    @markmaz56 6 months ago +1

    Your chart at 10:05 is wrong. There were no Macintosh sales in 1981 and definitely no laptops!

  • @PaulzoR2
    @PaulzoR2 Year ago +6

    strange vid... i was an IBM PC dev in the 80s, and it was a different world for sure. I just installed an apple mini with the M4 chip, and it's light years ahead of the olden days. This tiny box blows me away re: power consumption and performance. Everything up until 2022 was misery on a board, but now everything is buttery smooth across the entire line of Apple PCs.

  • @ikeman1972
    @ikeman1972 Year ago +2

    This video is exceptionally well-produced and informative. I eagerly anticipate your future videos, particularly those of this nature, as they are both engaging and educational.

  • @salteveline
    @salteveline Year ago +9

    the WINNER here is TSMC - it has been manufacturing Processors designed by Apple, nVidia, AMD .. 😄

    • @Bewefau
      @Bewefau Year ago

      And China wants it.

    • @NCHLT
      @NCHLT Year ago

      Mediatek and Qualcomm too, even intel I think

  • @TheFPSChannel
    @TheFPSChannel Year ago

    Great video (as usual)
    I like the addition of you on camera… and I LOVE the fact that you opted to drop the repetitive music track in behind the video. So, so much better this way. 👏👏👏👏👏👏👏👏👏

  • @swschilke
    @swschilke Year ago +11

    That video could have been 5-10 minutes long

  • @StevenSSmith
    @StevenSSmith Year ago

    The problem is the limitation of programs available on PC not being on Mac

  • @D.von.N
    @D.von.N Year ago +8

    Not an apple person and had no knowledge about this, but I have noticed the terrible performance of intel's 13th and 14th gen processors.

    • @Bewefau
      @Bewefau Year ago

      yeah they know it was a thing I think they tried to fix it with update.

    • @DavidSmith-dm8ew
      @DavidSmith-dm8ew Year ago

      Intel's new CPUs are even worse than the 13th and 14th

    • @NCHLT
      @NCHLT Year ago

      Ah yes, the processors with a near 100% failure rate within a year

  • @thetailgunner777

    10:31 I worked for over 15 years in that building. good memories. Also it had nothing to do with the powerpc.

  • @HeyNiagraFalls
    @HeyNiagraFalls Year ago +4

    Thank you, Greg.

    • @GregMoress
      @GregMoress Year ago +1

      (looks around) Umm... You're welcome.