Why Apple Stopped Using Intel Chips
- Published on Feb 8, 2026
- 🌎 Get an exclusive 15% discount on Saily data plans! Use code APPLEEXPLAINED at checkout. Download Saily app or go to saily.com/appl... ⛵
________________________
You may have recently discovered that Apple stopped using Intel chips in their computers, which is a bit surprising given that Steve Jobs praised Intel back in 2005. What Jobs said was true, but that was almost two decades ago, and the relationship between Apple and Intel has been rocky ever since.
After Honey I don’t trust any RUclips sponsor
Hi
@EternalSnivy1196 You should never trust sponsorships/ads on RUclips and just assume from the beginning it's a scam.
@appleexplained Can you please make a video about why chromebooks failed?
Or about why apple switched to M series chips?
Or why don’t iphones have M chips like ipads, if they’re better?
Or why do ipad M chips have “more M’s” than macbooks?
Don't advertise Fum vapes anymore...
90nm... and here we are talking about 2nm. wow.
after 14nm, it's only the name that changes
no wow anywhere
@betag24cn Well not exactly. It is more dense, the numbers just don't reflect actual measurements anymore.
@dennyroozeboom4795 true, all you can do is look for benchmarks
@betag24cn The hardest part is finding benchmarks that do not favour particular accelerations so you can get a true apples to apples comparison in a variety of workloads.
I remember the 338nm days 😂
Yeah, I “recently discovered” this 4 years ago…
recently discovered? recently discovered? recently discovered?
Thank you for playing the engagement game. The algorithm will be pleased.
And?
I heard Apple makes an M1 chip!
No way!!! That's amazing! Is it x86 or Arm? @mendodave
Best decision they’ve ever made.
agreed
Intel, AMD, and Qualcomm can still match the M series Macs in single-core performance.
the biggest reason why I and a lot of people are buying Macs now
@gabrielreisinger8047 Do they match performance per watt though?
@gabrielreisinger8047 They couldn't. Where they could compete is multicore performance.
Another side benefit of switching to Intel back in 2006 was that, for the first time, it allowed people with MacBooks to run Microsoft Windows without emulation. This was huge and got a ton of people who were scared of trying a Mac to adopt one. I was one of them: I got a Mac Pro and had Linux and Windows VMs running at near-native speed.
Now we return to emulation, due to the architecture differences between x86 and ARM
Yes, this includes me! I bought the white MacBook with a C2D because it won a test against a Sony Vaio of comparable weight and size. With Windows Vista it was faster than the Sony.
Guess what, I never installed Vista on it.
One of the biggest downsides of using Intel was that people saw Macs as just PCs with Mac OS.
Many people just did not see it as worth paying more just to have a different OS.
I remember the debates on online forums back in the day: are Macs PCs or not?
@paradoxzee6834 By definition, Macs are computers, and they still are, regardless of whether you use PowerPC, Intel or Arm processors. They have the basic elements of a PC such as memory, storage, a processing unit, and input and output interfaces. There is nothing that a Mac has that differentiates it from a PC, only the Operating System, that is, Mac OS. The debate is silly, as it has been tainted by a good "Mac vs PC" marketing strategy created by Apple. It should actually be Mac vs Windows, since there are also other Operating Systems like GNU/Linux
And yes, by that definition, smartphones are also computers.
Did anybody notice that Mac OS is based on a version of Unix, a Ma Bell development? Nice GUI, though.
I don’t know, but my MacBook Air gets me through 12 hours of harsh work, from onsite to remote. It’s nuts how such a thin laptop can have such performance and durability: about 18 hours of use and still 20% battery left.
Apple's SoC design with the "M" chips is very power efficient. This is similar to iOS/iPhone, which was easier to start as an SoC design because those devices had to run mostly on battery.
An SoC, or system on a chip, runs macOS (or Windows in emulation) on a single chip... I love the versatility of the Intel-style processors allowing greater compatibility and sometimes interesting or useful upgrade options. The newer M4 Macs look like a great value at this point.
I’m having battery issues with my MacBook with Intel.
We used to have phones run all day without dying …
@Niibaah yeah and they ran way less shit than modern phones so
@Niibaah lol ya my old flip phone used to run for like 3-4 days before needing a charge
Apple's chips are impressive. Fast and power efficient, which is something Intel struggles to do. I just wish that Apple cared about making products that are easy to repair. Nobody wants to have to replace their laptop because the permanently attached SSD stopped working.
In a capitalist world, that's a double profit opportunity.
@mwangidanson2615 Sad, but true
The SSDs are now easily replaced. So the thing you wish they had, they now have.
@DragonsinGenesisPodcast That's good news. Let's hope they continue to do this for all of their products
That's not because Apple is especially good at it. It's just because of how inefficient and bloated (like Windows 11 lel) the x86 architecture is.
If Intel were to make an ARM chip, they wouldn't be any worse than Apple.
You and Brandon Butch have a competition on who has the best news anchor voice 😂😂😂
I literally was frying an egg as part of breakfast for dinner when you used the analogy of a computer chip being so hot you could fry an egg on it.
did you happen to be using a computer chip for that? lol
Basically, Intel's inability to innovate to Apple's liking caused Apple to look at alternatives, before realising they already had one in the form of their iPhone ARM processors, which would be more than powerful enough if scaled up a bit.
And yeah, Intel's refusal to collaborate on the iPhone will probably go down in history alongside Blockbuster laughing Netflix out of the office as one of the worst calls in the tech industry's history, even if at the time it looked like a sensible decision.
Add Yahoo knocking back offers from Google and Microsoft to the list.
they used last gen intel chips lol
Not an easy job for Apple to switch HW and all the corresponding ecosystems. They did it. Apple is an innovative and successful company, and the market rewards it accordingly.
@ I can see AMD doing it if their hand is forced.
@rosl-80 Otellini was not for it, but he later admitted that his gut had told him "yes", and he regrets the decision. The sad thing is Intel had an ARM processor called XScale, which they could have promoted to Apple, but they wanted x86 chips in the iPhone, which was too expensive.
intel used to be able to make good processors but now they can only make hot plates
This is what happens when you don't have competition for the last decade or so and then you are hit in the face with AMD Ryzen. I think Nvidia is next, as the power on the RTX 5090 is insane.
Intel started making hotplates in 2000 with the Pentium 4. If my understanding is correct, they went back to the Pentium 3 and re-engineered it to become the "Core" line of chips.
I use an old Pentium 4 Prescott hot plate to cook hot dogs. LOL :P
Man shows his face, hmm, 2025 kicks off
Chill man... it's winter. Someone needs to stay warm.
Apple Explained's face showing is such a weird concept to me. But I think I like it.
He’s done it before, are you new to this channel?
Looks like AI
@YahmahaR7 not really
It almost kind of looks AI. Could just be a high frame rate or something though.
But the voice does sound off
It looks like the channel was hijacked, because I don't remember videos on this channel spreading so much BS before. This is the second of his videos I've seen recently after a long time of not watching (and before that I only watched a very few, on topics that interested me), but both of the recent videos came with very obvious BS, either assuming the viewers are dumb or just Apple a$$ licking, which wasn't there before. It used to be a pretty okay channel, now it's just...
21:44 The A4 wasn't introduced with iPhone 4, it was introduced with the original iPad. In fact, the A4 in the iPhone was significantly downclocked. Small detail though...awesome documentary.
Edit: Another small detail... when you say that they did not have to pay their chip supplier, it doesn't make sense. Maybe what you mean is that they did not have to pay for the margins of an off-the-shelf chip. They still had to pay Arm to license the ARMv7 ISA, they had to pay Samsung to manufacture the chips, and they had to pay licensing fees to Imagination Tech. for GPU designs.
Samsung doesn't manufacture chips, it's TSMC
@dlnishantabhishek2815 they actually DO make their own chips, but do partner with TSMC for the phones (for example, they put their own chips, named "Exynos", in the non-American S20s)
@dlnishantabhishek2815 Samsung doesn’t manufacture the ARM chips for Apple. They do for others.
Apple announced the A4 in January 2010, and Steve Jobs confirmed the iPhone 4 would use it in June 2010. The A4 was also used in the first-generation iPad, fourth-generation iPod Touch, and second-generation Apple TV. I was there the day it was announced.
@dlnishantabhishek2815 They used to manufacture chips with Samsung until their lawsuit alleging Samsung was copying Apple; hence, the year after that, they made an exclusivity contract with TSMC and had first dibs on new nodes.
That's what you get when companies are run by MBAs instead of engineers
And idiotic MBAs, at that.
No, Pat, it wasn't OK for your CPUs to run at 95C.
I'd still buy a B580 if they went on sale for $200. Taking a big chance buying intel these days. It might just explode after 6 months...
Too bad management can't properly identify the B580 and B570 as loss leaders. Their chance to regain some public trust. Gamers may not make up a big part of the revenue stream but they are 98% of the hype train. If the people in enterprise don't even want to buy intel for their kid's gaming PC, why would anyone think they'd make a multi-million dollar purchase from intel at work?
U need both mate.
@Varshan_Prabu_M Bro's trying to justify his degree.
I had an Intel MacBook, and when I changed over to the M series I was amazed how fast they are and how the battery lasts for ages, and more importantly, no more fan noise
And now I use an M4 mini, and my M1 laptop feels slow!
Wow man, wish they'd become compatible with my CAD application, then many of us would be able to feel the power!
When I used to do a multi vid stream Zoom call, on the Intel MacBook the fans would spin up like a jet engine. No longer.
01:31 ad skip
Thank you
@beyondmemoX5 You're welcome
Thank you
I think our bro lagged in making this video, just like the Intel chips.
I’m really glad they did, M chips are fantastic! 🎉
Damn right!
sure they are, but 99% of them are used for office-type workloads.
@ You are mistaken, 99.9% of them are used for creative work.
@IgorsPlay these days every CPU has HW video encode and a 3D accelerator... so yeah, office type.
It’s amusing to see many Apple haters stating Apple never invents anything, they just copy…yet they have the fastest domestic/private computers in the world built with chips that they designed and built lol
It should be noted that ARM was founded as a joint venture between Acorn Computers, Apple, and VLSI Technology. Acorn provided 12 employees, VLSI provided tools, and Apple provided a US$3 million investment (equivalent to $7 million in 2023).
Larry Tesler, an Apple VP, was a key person, and he helped recruit the joint venture's first CEO, Robin Saxby.
Apple had been using Acorn’s Archimedes in the Newton.
No, Apple did not use the Archimedes in the Newton device, or anything else for that matter. Archimedes was a line of Acorn computers, not a processor architecture. The architecture was always called ARM (then standing for Acorn RISC Machine); it powered the Archimedes and was originally developed by the company on a shoestring with a tiny handful of engineers. When the Acorn Archimedes got steamrollered by the PC architecture, the processor business was split off and targeted at the mobile market due to its astonishingly low power consumption, and Acorn RISC Machines became Advanced RISC Machines, which was where that critical Apple investment came in, with the use of the ARM processor architecture in the Newton device.
Ironically, Newton failed in the market, but the ARM processor became wildly successful and Apple divested themselves of their share in ARM as they were in desperate need of the capital.
Nb. by coincidence or not, Isaac Newton was a professor at Cambridge, and Acorn, and therefore ARM, was founded in that university city.
@TheEulerID imagine if Apple had kept their share of ARM 😊
@felipe367 Then it might not have been so successful. Those in the mobile appliance market would not have been keen on being reliant on IPR and licenses granted by a company which was in large part owned by a competitor. It's especially lucky that it didn't happen, as Apple's track record on trying to block out competitors and seeking to use its IPR as a means of doing so is not great. ARM, majority owned by SoftBank, does not really compete with its own customers, and when Nvidia tried to buy ARM it caused major competition concerns from both the company's customers and regulators in the UK, USA and EU.
Besides which, Apple were in a desperate way at the time. They needed the money.
@felipe367 During the late 90's - early 2000's, if you followed Apple's financial reports, they would regularly report sales of their seemingly bottomless pit of ARM shares. The reason was most likely to bring their numbers up to Wall Street analyst expectations.
@TheEulerID Acorn and VLSI were on the brink of bankruptcy, and that's when Apple came in and ARM was created. All 3 owned 1/3 of the shares. Later Apple had to sell the shares to survive. Now Apple again owns 10% of the company.
130-90nm is wild... In a world where 5nm is seen as old tech and 3nm is the standard... How far we have come
It’s actually more like >13nm, but still impressive! Node sizes since the late 90’s are mostly a marketing gimmick and not representative of the actual minimum feature size of a transistor. The figure is now derived from the performance gains realized from improved semiconductor topology. That, or the marketing team just throws out a number that is lower than last time.
@jonathanruiz8723 You mean if TSMC gets the performance of a 2nm node from a revised 4nm node, then they can call it a 2nm node?
@XashA12Musk Correct... kind of... It's really mostly marketing. For example Intel's 10nm node technically has higher transistor density than TSMC’s 7nm node. The figure did originally match the minimum gate length of a given process' transistors but it hasn't since the mid 90's (My original 2008 guesstimate was off by over 10 years). Most of the gains nowadays are from placing transistors closer together and effective usage of 3D space, rather than just decreasing transistor size. Hence all the hot chips lately.
"5nm" or "3nm" are just marketing terms they no longer refer to gate length in a transistor
@osdenza Source?
25:00 They wanted to build the fastest, most efficient computer possible -while remaining reasonably affordable-.
Great video, but you completely left out the iPad - I think that would’ve been relevant, since the M1 is a modified A12Z.
Yeah… this isn’t “Recent” at all, as this has been out for a couple years now.
More than a couple
10:50 I was attending the keynote that year, and in the days that followed, with after-parties every night in San Francisco, word was getting around that Steve made this announcement against the wishes of IBM. Apparently they weren't willing to stick their necks out for "3 GHz", but he stuck their necks out for them. In fact the IBM engineers were saying there's no way they'd reach that speed in 12 months, and were only promising around 2.5 GHz, which is exactly what happened.
When you think about it, "the Osborne effect" was exactly the sort of thing you try to avoid, especially when the memory of Osborne's mistake was fresh in the 1990s and 2000s. So the fact that Steve directly counteracted the golden rule of the Osborne effect, the rule of not pre-announcing next year's technology while announcing this year's technology, meant that something must have really been up.
When Apple asked Intel for a smartphone processor, the ARM architecture was not yet settled on. Instead, using ARM was a result of Intel declining to build the processor.
20:32 Commodore owned their own chip manufacturing company, MOS Technology, Inc., which they acquired in 1976. MOS Technology was instrumental in creating some of the most iconic chips of the era, including the 6502 processor, which powered many early personal computers such as the Apple II, Atari 8-bit computers, and of course, Commodore’s own computers.
Under Commodore, MOS Technology developed custom chips that gave their machines a competitive edge. For example:
1. The VIC-II - Graphics chip for the Commodore 64.
2. The SID (Sound Interface Device) - Famous sound chip in the Commodore 64.
3. The 6510 CPU - Used in the Commodore 64, a variant of the 6502.
Owning MOS Technology allowed Commodore to reduce costs by vertically integrating their chip production, which was a significant advantage in the competitive home computer market of the 1980s. This strategy contributed to the affordability and success of systems like the Commodore 64.
Jack Tramiel :D "Yes, you're a business, of course you want to make money, but you don't need to make a stupid amount of money"
And when the Amiga prototype was shown to Steve Jobs, he said "Too much hardware!". I guess he was not a big fan of custom chips back then.
But... but... but the MBA farm says outsourcing everything is always cheaper! lol
Commodore should have dominated the market but they threw it all away.
The Amiga in particular was so ahead of its time.
Waiting for a video on the Siri lawsuit
I usually don’t like long videos but this was great information and you kept it moving very well. I appreciate it
He could’ve just said that people installed Windows instead of Macs and it would’ve taken less than a minute, but that won’t get any monetization
I was there and saw a lot of this as it was happening until I left the computer industry in 2007. I first started working at IBM when we were using the code name "Rios" for what would later be named the RISC/6000. I remember when Apple switched to Intel due to the cooling problem with the G5 processor. I even went to COMDEX '94 in Las Vegas just before Windows 95 was released. Now? ... I'm working as a musician in Monterrey, Mexico. I got into computers because I wanted to use them in Music production and I ended up as an O.S. Specialist. It's fun to take a trip down memory lane, (No pun intended), but now I'm working on learning web design and Latin American Spanish. .. And piano (which will be my 5th and final instrument). I'm still a UNIX nerd, but now I'm more interested in using computers as tools for producing results.
TSMC doesn't create their own ARM chips; they are a foundry. They make chips for others.
and thank goodness they did. I no longer have a room heater / Macbook Pro.
Dude, where's your signature music?
Yeah, I kinda miss it, probably because he doesn’t have to add awkward cuts to the videos illustrating the chip situation.
I don’t miss it… I used to binge his videos and the same song over and over would get on my nerves…
would be silly in such a long video
*Next video idea:* who’s the best Apple leaker!
Mark Gurman, Ross Young, Jon Prosser, Ming-Chi Quo, etc…
Who is Ming-Chi Quo?
Ming who?
No we dont care about "trust me bro" people
Apple's PowerBook sales _was not_ 64% in 1981, as the first Macintosh was released in 1984. The first PowerBook was released in 1991. The G5 chip launched in 2002, so is the pie chart supposed to be 2001 Macintosh sales?
Watching this on my Apple Silicon M1 powered MacBook. The M1 was such a revolutionary laptop chip, especially for tech enthusiasts in Nigeria, for its exceptional power management. Start the day with a full battery and you are free to stay unplugged till nightfall! It gave incredible mobility to the tech worker's life. It's so good I know friends who have not needed to upgrade in the 4 years since it was introduced.
Moore's Law is, first and foremost, despite the name, an observation. Not even consensus. So there was no violation.
So true, and it is amazing that Moore's predictions were made on something like four generations of technology. That it has lasted this long is incredible. It provided a roadmap that semiconductor equipment manufacturers could help fulfill, sticking to a technology drumbeat of sorts.
1:28 The worst part of this is: if your phone is carrier locked, you can’t add eSIMs. Carrier locks are usually placed if you are financing your phone through your phone bill, and will generally lift shortly after you pay off your phone.
One of the things not mentioned here is that each change to a new CPU architecture was done in a seamless way. Before I saw it, I had thought such a thing was impossible, because it was so complex.
Each CPU architecture runs on a different instruction set. This is the low-level machine language that the CPU runs on. (Your email program, accounting program, word processor and web browser are different applications, although these days there are sometimes websites that do the same job and run in the browser.) The Mac started on the 68000 architecture, transitioned to PowerPC, then to Intel, and now the 'Apple silicon' ARM chips (M1, M2, and so on). Most of the applications you run on your machine are compiled down to machine language so they can just load up and run - often called the "binary" because the raw bytes of the program are totally different between architectures. But during these transitions, old applications fundamentally don't work on the new architecture. I'm not just talking Italian vs Spanish, I'm talking Italian versus Chinese or sign language. Nothing in common besides bytes being 8 bits. Sometimes, the way they numbered bytes was even in reverse order!
Typically, the programmers have to make two different applications, running the compiler & linker once for each architecture. You have to install the one that works on your machine. Then, there were all sorts of details that were different, and often the user had to make changes, which was a hassle (for thousands or millions of users). Electronics were different, interfaces were different, plugs were different. Windows usually can't do that, so it's always on Intel's x86 architecture, which gets old.
But, each time, Apple pulled it off, seamlessly. I remember one friend who got a new machine saying he unplugged cables from his old machine, put the new machine in its place, plugged in cables, and was checking email within an hour. Absolutely astounding.
One technique they used was a 'fat binary'. The application file actually contained two binaries, and the machine would decide which one it could run. Apple supplied the software to compile an application that way.
Another technique they used was emulation. The new chip was faster, so Apple wrote software that would pretend to be the older chip and could run apps for the older architecture. (The emulator would be slower than the old machine, but everything worked, and you'd buy a new machine eventually.) I remember attending an Apple lecture describing how they did that in one case. One programmer was explaining differences between the new emulator and the old chip. All of the differences were very low-level and obscure, nothing I, as a programmer, had to deal with. I was amazed they did such a good job.
So, they've gone through three transitions like this, each one seamless. I've got two Macs in my apartment, one Intel, and this one ARM. I can just move apps back and forth, no problems. Absolutely incredible work.
Now it's the reverse.
13:46 bit unfair to compare the Pentium I to the PowerPC G3. Surely the better comparison would be Pentium III with PowerPC G3, or Pentium I with PowerPC G1 (e.g. 601 or 604).
As a Windows user for 20+ years this is why I switched to the M4 mac mini.
Intel and MS can have the 'finger' from me.
Though Apple still gets the 'finger' from me for being too tightly closed an ecosystem.
Same
how do you like it so far? macOS isn’t really a closed system i’d say, just iOS, which is good imo considering their stance on privacy
i don’t particularly want my phone with GPS, mics, cameras, etc that i have with me at all times to be an open system. there’s little to gain for the loss in security (though people still blow it by installing social media apps and giving them mic and camera access lol)
@sleeptodreamx MacOS is absolutely a closed ecosystem. If it weren't, we'd have more Linux running natively on Apple silicon. Even the M3 & M4 can't run Linux natively.
@darthslackus You are interchanging the Apple hardware and software in your comment. Confusing.
27:12 and yet no mention of Intel's via oxidation issue in its 13th and 14th gen chips? I'd argue that was a bigger contributor to their stock price crashing out in 2024.
I like all your videos, but will you post more of your 2-3 minute short videos explaining things like what's new in iOS 18.2 or rumors about upcoming apple products or something like that? Still enjoy all your videos, they're good quality content.
Long time no see with the face-to-face conversation type video!!
The pie chart at 10:09 says laptops made up 64% of Mac sales in 1981. This is TOTALLY wrong. Consider this - the first Macintosh Portable computer was introduced in 1989. 😂
He must’ve put the year wrong, it’s supposed to be around 2003
anddd now we got chips like the 13980HX and 7945HX that constantly stay at 90+ degrees and could fry an egg
If only game developers would catch up now!
One of the BEST narrators on YT
i'm still getting over Apple giving up on the 6502
😆🤣
I used an early Compaq "portable" -- It was luggable, but had to be plugged in to work, and floppies formatted by Compaq didn't register on IBM computers, and vice versa. Formatting the floppies on my Mac at home worked for both sides of the "compatible divide".
The best explanation of Apple switching to M chips 👏🏻 congratulations
This is the first video where the sponsor actually has something that I need
You should do the history of text messaging, from when it first blew up, to the introduction of SMS, then iMessage, then how social media apps began allowing the feature, then when RCS was released, when iPhones got RCS, and how many companies are starting to kill off SMS ❤
This video has me looking up old Apple WWDC events. The way Steve Jobs talks is truly captivating
I didn’t know they used to call building a PC “cloning an IBM” in the 90’s
Yep IBM clone. I had several of those. Even built a few myself.
Yea - department stores like Sears, KMart, JC Penney’s, they even used the term in their marketing and in the stores, while also featuring the IBM PC.
Because many were literally "clones", or carbon copies. This was possible because IBM's original PC was an open-architecture system, done to encourage third-party manufacturers to make peripherals.
The result was IBM clones.
Others did the right thing and created "IBM Compatible Computers". These were different from IBM's system, but compatible with IBM software and peripherals.
IBM was king, 20k computers in the 80's, woot woot. 8088 chips. Your keyboard layout? IBM baby
While it was easy to duplicate the hardware, the IBM PC BIOS firmware was copyrighted, so Compaq had to reverse-engineer its own compatible BIOS. They were successful, and the acid test of compatibility was if it could boot Flight Simulator, which at the time before MS bought it was a bootable disk. Some clone makers were not as good at creating a fully-compatible BIOS and failed the FS boot test. Eventually companies like Phoenix Technologies created "clean room" copies that were licensed to manufacturers. IBM later realized it made things too easy to duplicate so they came out with the PS/2 architecture, which had to be licensed by other manufacturers. No one wanted the PS/2 as it was not any better.
Triumph of the Nerds is one of my favorite documentaries. I have seen the 4 hour version of it like 20 times when I was a teen.
And people say Apple stopped innovating...
They don’t innovate often but when they do it’s impactful, for better or worse.
What's that long yellow thing at 16:05?
16:32 What really made "both companies are engineering driven" ring true, was the way Intel rolled out new technology much faster with Apple. Gone was the BIOS boot mode, finally replaced with UEFI. 64-bit was missing for 5 minutes but came up pretty quickly with the Core 2 Duo. Work must have begun pretty soon after that on Thunderbolt 1 (joint project between Intel and Apple), and remember how USB 1.0 existed but adoption was going nowhere until the iMac G3. Even the "Core" naming sequence commencement coincided with Apple's transition.
The reason ARM is so power efficient is that for the very first one, way back in 1984/5, they wanted to use plastic packaging rather than ceramic, as it was cheaper. Oh, and that power efficiency? When they went to measure the power usage it was ZERO, because they'd forgotten to attach the power connector; the chip was scavenging power off the memory bus!
25:45 On the laptop market. Desktop still has the Ryzen 9 7950X3D. And don't forget about the Threadripper 7995WX.
The Threadripper is not a desktop CPU. It's a workstation CPU.
@akyhne I beg to differ
@StevenSSmith So, you don't know what a workstation is?
@akyhne how do you say "I have autism" without saying "I have autism"
@StevenSSmith Say you don't know anything about computers, without saying you don't know anything about computers.
Where did the apple explained sound go
I'm watching this on an M1 iPad Pro 12.9, 3 years on. It's still screaming fast and even got Apple Intelligence
Vro has the Homer Simpson Beard
Hey Greg, when are you gonna talk about the failure of Apple's Vision Pro? Thanks 😉
Were the first “Intel Inside” ads rendered on Macintoshes…? I thought I recalled that from 30 years ago…
I am from Poland. I use a Mac Mini M2 Pro and a MacBook M1; after switching from Intel I felt the power of the computer, it is a technological abyss. Although the Apple M1 is not the first ARM computer, only now are people ready. The first ARM computer was Acorn's Archimedes, in 1987. Greetings from Poland.
with no backwards compatibility
@Bewefau haha backwards
Have you tried running Maya renders on your beautiful MacBook?
Bro started at when Dinosaurs were using phones.
Best move for Apple, developing the M line of chips. I just got the Mac Mini M4 and it's the fastest computer I've ever owned.
7:07 - Wow, it's literally Blast Processing
You can make a new clip..."Why is Apple using Intel chips again" 😂
I mean, definitely seeing you sitting at a mic talking on Apple Explained is weird, but I’m all for it 😂
1:56 anyone else whisper “RISC is good” under their breath?
RISC is king as far as I’m concerned. CISC isn’t worth the complexity and power consumption. Apple Silicon and other RISC CPUs have proven this now for non-mobile use!
Not everyone follows things the moment they happen - for some people, even 4 years later can still feel like a fresh discovery. No need to dismiss it.
Why is it that people forget that Apple founded ARM with VLSI and Acorn? Without Apple, ARM would not exist.
at 10:09 the pie chart says 64% of all Macintosh sales were laptops... in 1981?
You missed a key point. Yes… reliability, availability, release schedule, and power efficiency all played very big roles in Apple’s move away from Intel. However, one of the biggest reasons was thermal efficiency. Apple’s brand has always been power AND aesthetics, not one or the other. Apple’s aggressively thin and sleek form factors made Intel chips absolutely useless in portable Macs. They were throttled back so far once the computer was asked to do any meaningful task, all because the chip would get way too hot for the cooling system to properly mitigate given the form factor of Apple’s designs. Once the Skylake nightmare happened, the camel’s back was broken, but it was more the final straw than the deciding one.
Great video. Great use of the advertisements & presentations. Great technical details, and as far as I could tell, you got 'em all RIGHT, which is rare
22:20 to get to the point.
Love this format. It’s been a while.
And Intel has no one else to blame but themselves. They were the best on the market for good reason back when Steve was still around, even into the 2010s. Intel at a point got comfortable and stopped innovating, then in a rush to fend off the AMD Zen architecture, they produced flawed products that drew way too much power and produced too much heat. Apple was very smart to pull out when they could.
they were also incredibly arrogant, bullying and greedy. Probably criminal too
No it was Meltdown and Spectre that did them in.
Marketing people ran Intel from 2008 until 2024. Pat Gelsinger lied so often, made so many stupid moves, and wasted so much money needlessly that he qualifies as a marketing person... They really blew it with 14 nanometers (14nm+, 14nm++, 14nm+++, 14nm++++!!). They were stuck there for 6 years! They bet against ASML and it was an asinine mistake to make! Now Intel does not have enough market share for a 2nm fab. Even if someone gave it to them, they would have a hard time affording the cost of running it! Without cell phone chips to fabricate, any advanced semiconductor manufacturer is completely screwed... The fabs cost too much and the annual volumes from CPUs and GPUs are too small!!!
@Bewefau funny that Apple chips have been hit by a similar silicon-level bug, so critical that it has to be disabled and worked around in software, essentially the same as Meltdown/Spectre 😂
PowerPC... I remember those days... got an Xserve dual G5 rack server.
*Bro… I appreciate your long-form-style content, but can you please also continue your to-the-point quick explainer videos and your frequent macOS/iOS update series :)*
I love these videos
Personally, I like the long form videos much more.
I remember those early 90’s Intel and Motorola commercials. Such an awesome time for computers.
Apple Silicon is such an amazing breakthrough. Apple is the only company that has managed to disrupt the AMD-Intel x86-64 duopoly in the market. Consumers didn't realize how terrible mobile computers were with x86-64 chips. Apple Silicon has proven to be so great that Qualcomm just started making their own laptop-grade ARM64 chips with the Snapdragon Elite series. And guess who was involved there? Gerard Williams, the former Apple CPU engineer who made the A series chips; I think he was involved in designing the M1 too.
But there's something that still doesn't make sense to me: the existence of the Apple Silicon Mac Pro. It's a high-end consumer-grade workstation. The whole appeal of a Mac Pro has always been self-upgradability, while the whole appeal of Apple Silicon is performance per watt and long battery life on MacBooks. People who buy this grade of computer don't care about power efficiency, they just want performance. Self-upgradability is not possible with Apple Silicon; you can't upgrade the memory or the graphics. Once you've got an Apple Silicon Mac Pro, the only thing you can add is PCIe expansion cards, so an Apple Silicon Mac with a case that big doesn't make sense.
Apple using ARM for Macs doesn't make sense to you because you have the wrong idea about CPU instruction sets. The instruction set is only the "language" the CPU uses to understand instructions. The ISA does impact the decoders and things around it, but in modern CPUs the impact to efficiency is minimal. ARM CPUs aren't really any more efficient than x86 CPUs, at least not to a meaningful degree. Intel or AMD can 100% make a CPU with similar or even better efficiency using x86. In fact, under heavier loads Apple chips don't really perform much better than the new AMD or Intel chips.
Dude said "had shrank"... LOL!
Editor pleeeeease!
Thanks! In essence, Apple custom designs their chips to address the needs of the platform, users and applications. And I agree. Oh, and thanks for talking about the PowerPC line, I had wondered what happened with that. Part of the selling point of those PCs was that Apple could have custom hardware emulation at suitable speeds on native hardware... the capacity to run Windows and Mac OS on the same computer.
can i also have $1.99 😢
6:00 I saw this video at the Intel Overdrive product launch banquet. Still makes the hairs on the back of my neck stand up!
25:09 Affordable?!
ARM processors are not “in their infancy” - they’ve been around since the 1980s. en.wikipedia.org/wiki/ARM_architecture_family
He didn't say they're in their infancy NOW. He meant back then.
Your chart at 10:05 is wrong. There were no Macintosh sales in 1981 and definitely no laptops!
strange vid... i was an IBM PC dev in the 80s, and it was a different world for sure. I just installed an apple mini with the M4 chip, and it's light years ahead of the olden days. This tiny box blows me away re: power consumption and performance. Everything up until 2022 was misery on a board, but now everything is buttery smooth across the entire line of Apple PCs.
This video is exceptionally well-produced and informative. I eagerly anticipate your future videos, particularly those of this nature, as they are both engaging and educational.
The WINNER here is TSMC - it has been manufacturing processors designed by Apple, Nvidia, AMD... 😄
And China wants it.
Mediatek and Qualcomm too, even Intel I think
Great video (as usual)
I like the addition of you on camera… and I LOVE the fact that you opted to drop the repetitive music track in behind the video. So, so much better this way. 👏👏👏👏👏👏👏👏👏
That video could have been 5-10 minutes long
I disagree. It's fine as-is.
The problem is the limitation of programs that are available on PC but not on Mac
Not an Apple person and had no knowledge about this, but I have noticed the terrible performance of Intel's 13th and 14th gen processors.
Yeah, they know it was a thing; I think they tried to fix it with an update.
Intel's new CPUs are even worse than the 13th and 14th gen.
Ah yes, the processors with a near 100% failure rate within a year
10:31 I worked for over 15 years in that building. Good memories. Also, it had nothing to do with the PowerPC.
Thank you, Greg.
(looks around) Umm... You're welcome.