Intel’s FAILED gamble against Apple
- Published: 4 Jan 2024
- Is Intel's latest CPU lineup a true challenger to Apple's M-series chips? This video provides a deeper look into the new Intel Core Ultra processors, their performance in comparison to Apple's M3, M2, and M2 Pro, and what this rivalry means for the future of computing.
Run Windows on a Mac: prf.hn/click/camref:1100libNI (affiliate)
Use COUPON: ZISKIND10
🛒 Gear Links 🛒
🍏💥 New MacBook Air M1 Deal: amzn.to/3S59ID8
💻🔄 Refurb MacBook Air M1 Deal: amzn.to/45K1Gmk
🎧⚡ Great 40Gbps TB4 enclosure: amzn.to/3JNwBGW
🛠️🚀 My NVMe SSD: amzn.to/3YLEySo
📦🎮 My gear: www.amazon.com/shop/alexziskind
🎥 Related Videos 🎥
🌗 Intel's Meteor Lake vs Apple M3 - Performance Showdown - • Video
👨💻 Intel's AI Integration - Game Changer or Gimmick? - • Video
🤖 Apple M-Series Dominance in 2024 - Unstoppable? - • Video
💰 Intel vs. Apple: The Cost of Power - • Video
🛠️ CPU Wars Playlist - • Playlist
Huang blog post - blog.hjc.im/spec-cpu-2017
- - - - - - - - -
❤️ SUBSCRIBE TO MY YOUTUBE CHANNEL 📺
Click here to subscribe: www.youtube.com/@azisk?sub_co...
- - - - - - - - -
📱LET'S CONNECT ON SOCIAL MEDIA
ALEX ON TWITTER: / digitalix
- - - - - - - - -
#intel #macbookpro #m3 - Science
You’re comparing potatoes and tomatoes.
Apple's M3 is a chip embedded in a complete ecosystem, organised and optimised for its architecture.
Meteor Lake has no optimised software yet and runs on architectures that are not "made" exclusively for it.
The M3 is the “son” of M1 and M2, so the 3rd iteration.
Meteorlake is the first.
You’re comparing a 7nm chip against a 3nm chip.
Arrow Lake will be out in 4 months and it will be 4nm; Lunar Lake will be here at the end of 2024 and it will be 2nm.
You forgot to mention that with just a BIOS update, Meteor Lake is 30% better…
Let's wait 3-4 months and see the results. Let's wait one year and see what you will say.
It's not even a matter of partiality, but of being precise.
P.S: 2 years ago everyone said it would be impossible for the x86 architecture to move below 12nm and that Windows on ARM was the only possible future.
Intel said that the purpose for this release would be "energy".
Job done: the new Ultra laptop (without a dedicated RTX graphics card) will last hours and hours longer than the previous generation.
I feel like your comment has valuable information for others, but when you start out your constructive criticism with "nonsense" it puts the reader (me) on the defensive right away. If you remove the "nonsense" part, I think your comment could be pinned. Thanks!
@@AZisk Oh, ok, it wasn't intended as an insult. But I'm sorry if you felt that way; removed, no problem.
Who is the "everyone" that 2 years ago said it would be impossible for x86 to move below 12nm?
Asking for the three-year-old 7nm x86 chip in the computer I'm typing this on. :P
Intel's process nodes ≠ TSMC process nodes. 7nm is now called Intel 4, fittingly, as it beats TSMC 5 and probably sits between 4 and 3. Intel is making a weird move by rushing out all these architectures, suddenly gripped by power efficiency after years of 100W+.
It's not even that. The only reason Apple's ARM CPUs get those numbers is the faster, lower-latency RAM: it's soldered next to the chip, and the memory controller is simpler because it only has to work with that one RAM configuration rather than a huge variety. And then there are all the other disadvantages of ARM CPUs.
I hope Intel can turn it around. They've not been the most consumer friendly company but AMD and Nvidia both use TSMC and we really don't need a chip manufacturer monopoly.
TSMC is Taiwan's insurance policy against China invading them. If TSMC fails, Taiwan fails, they could (and would) run at a complete loss and it would still be worth it for them, who can compete with that?
Parts of the Intel Ultra CPUs are already manufactured by TSMC.
@@WhiteoutTech So?
@@kaveanto "So?"
The reason Intel is stumbling right now, is because they became comfortable with their monopoly, and it isn't easy to catch back up.
The history and management of Intel suggest a preference for prioritizing profits, often leaving consumers disadvantaged. To illustrate, consider Apple's situation. Apple was exclusively using Intel processors, but Intel did not move past their 14nm processors for years. During this time, Apple was designing computer chassis based on the expectation that new processors would be available. However, the absence of these updates left Apple with pre-designed laptops that couldn't be equipped with new processors. Imagine a company waiting years, continuously promised new technology, only to be told repeatedly to wait. Intel, being a bloated company, seems unable to move quickly enough to keep pace, a fact that is disappointing to me as an Intel fan. Lastly, why is it that smaller companies are outpacing Intel in the consumer market? Intel needs a wake-up call. (How do you lose Apple as a customer unless you are short-sighted?)
100% this right here
It’s difficult to get the focus required without external pressure. Intel profits taking a nose dive is the kind of pressure Intel needs to focus. Looking forward to what they release in the next few years!!
Man, I'm really thankful for your videos, not many youtubers test laptops from a developer perspective.
Agreed. When you used to search for which laptops are good for development, no videos would show up, but not any more. I remember when I was searching for a laptop for my development work. Now I have a Legion 7 with AMD and Nvidia and I love it. It's been a year since I got it and its performance is great.
You realize he didn't test any laptops, right? All he did was summarize someone's article for you.
@@xungnham1388 In this video yes, but I was talking about in general.
@@kristofszabo666 Ok. Probably more appropriate to make this comment on a video that he actually tests laptops in, cause doing it for this video makes you sound like an AI hallucination.
@@xungnham1388 maybe, but this is his latest video
Also, what is the point of comparing Apple's M3 with Intel when we know there is no way an M3 can be used in a PC? And I don't think we will ever get a $300 Mac that can be used by students or people who don't have much money.
So the real question is does the few seconds or a couple of minutes difference in processing time matter for people who have to choose between having a computer and not having one?
As far as battery life is concerned, mobility for a laptop is different from what mobility means for a tablet or smartphone. Mobility for a laptop means we should be able to fold it, keep it in a backpack, and take it wherever we go. But it's very impractical to use one on a train, in a garden, or in a lift. It is generally used in a home or office setup where plugging it in is not a challenge. Hence, I don't care much about battery life as long as I get the performance I need and as long as I don't need movers and packers to transport it to my office or home.
Smartphones or tablets on the other hand are designed to be used while on bed, in a metro, on road etc and this needs good battery life.
It's a big leap for Intel on efficiency, even if they don't gain leadership everywhere. The IGP getting to M1-M2 isn't bad considering gains slowed from M1. I think it'll be a very successful product, even if it doesn't beat AMD and especially Apple at everything, they'll still ship more volume and get all those NPUs out there.
Gains since the M1 didn't slow; they are significant, industry leading actually.
Intel and efficiency are two concepts that can't truly coexist, and when they do it's at the cost of performance.
And who EXACTLY started including NPUs 😋
Do you work at intel 😅
@@Conntrailed Would I be saying it's not the best if I did 🤣
It's a win for consumers because now any laptop model we can get (some companies only use Intel) is not that far behind. 13th gen was a power hog compared to AMD.
The main problem is that Apple has full control over their operating system and can tune the OS to their liking to get the most out of the hardware, whereas in the Intel world they are not developing the OS; Microsoft is. And when Microsoft develops Windows, it has to consider a huge range of hardware: RAM, GPUs, and more. The moment Microsoft and Intel work together to optimise the OS, the performance will be good, but it's still not going to be great. That's the whole story.
"The main problem is that Apple has full control over their operating system" - many of us think that not only is that _not_ a problem, but is indeed a great thing.
"The moment Microsoft and Intel work together to optimise the OS" - but Microsoft is currently pushing Nvidia and Qualcomm to make ARM processors, for Windows for ARM. There is no reason for Microsoft to be too closely tied to Intel.
@@TheDanEdwards But still, it's the hardware. When Microsoft develops Windows it has to take into consideration the multitude of hardware available on the market, whereas with the Mac there is only one choice. It's about control and optimization: controlling the hardware and using software to get the maximum out of it. Even ARM on Windows is still not optimized.
@@TheDanEdwards It is a good thing that Apple has full control of both hardware and software but comparing the two (Intel VS Apple M series) is literally apples and oranges.
Even if MS were to focus on ARM with Qualcomm, they still run into the same issue with not having full control over the hardware down the line as there's going to be an explosion of various hardware combinations to support.
We can still say Intel just sucks by looking at AMD CPUs. The 14nm joke was not so long ago.
@@ondrejsedlak4935 that's the point
The right approach is to get more instructions per clock cycle. Most designers know this, yet the focus has been on more cores instead. That helped a little, but also created new bottlenecks.
Both Intel and AMD should focus on efficiency rather than performance for laptop CPUs/GPUs. CPUs are not the bottleneck when it comes to Windows for average people. If they could sell laptops with battery life comparable to Apple silicon, even at 50-75% of Apple's performance, everyday users wouldn't notice the difference. Focus on CPU/GPU efficiency, the fastest storage available, and a minimum of 16GB of RAM, and they could actually compete.
I’ve never even used a Mac, and in my eyes PC laptops are dead (extremely poor value) compared to MacBooks for the average user. The only reason you should be buying a PC over a Mac laptop is for gaming, engineering software, and other niche software that simply doesn’t work on MacOS. The fact that you can get an M1 MacBook Air for $750 new, that genuinely lasts 12+ hours and feels fast, is insane value for the average person. Like I’m going back to college to be a nurse and had I not been gifted a new laptop through a friend who works in IT, I would’ve bought an M1 MacBook Air, it’s genuinely a no brainer decision.
I would say that it's too early in the game (or perhaps too late; I never really know) to discuss these new CPUs. My experience with new releases like this has been mixed. The only recent product that truly impressed me on the CPU front was Apple's MacBook Air with the M1 chip. That was one of those remarkable moments in history, and I had the privilege of witnessing it firsthand!
Apples chips are incredible, but I’m also rooting for Intel. I really hope they can turn things around. I hope the same for Samsung’s Exynos processors as well.
Enjoying your videos so much, keep ‘em coming Alex!
Meteor Lake is looking pretty good to me. We are still probably waiting on those OS level optimizations for the new low power core type and perhaps some application level stuff, but even currently things look good. Mac is a total fail for me due to lack of data processing software (no Power BI and gimped Excel for starters).
This isn't a failure exactly, as Intel probably knew it wasn't good enough: their roadmap showed that only Arrow Lake would give them 'Performance Leadership'. Only by using gate-all-around tech and backside power delivery will Intel get back to competing at the top. That will be later this year; the whole thing is planned.
The real issue is that the GPU memory isn't truly shared. It has to be pre-allocated, so it won't run LLMs well compared to what Apple silicon does by automatically allocating up to 128GB.
Apple really has an advantage when you work with big models also outside of ML.
You'll need a high-specced setup though, e.g. a Max with 64GB… that's quite the investment.
I agree with what you’re saying here, but IMO it would make sense to not just look at performance and efficiency, but also relate it to cost.
@@friedpicklezzz the cost of that 64GB Mac versus a PC with an equivalent GPU is peanuts in comparison.
And a 32GB Mac can run any 7B and most 13B models on the GPU. You need a 16GB card to run most 7B models and a 24-32GB card to run most 13B models.
And at 128GB, which can run 70B models, you're talking less than half the price for the Mac versus a PC with GPUs.
And remember: three 4K screens on a PC will use 5GB of VRAM. So on the PC you start with 2 GPUs just to do any of this, which isn't true with the Mac.
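A rough back-of-the-envelope sketch of where those model-size numbers come from. The 20% overhead factor for KV cache and activations is an illustrative assumption, not a measured figure; real requirements vary with context length and quantization:

```python
def model_memory_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Estimate memory to run an LLM: raw weights plus ~20% for KV cache and activations."""
    weights_gb = params_billion * bytes_per_param  # 1e9 params * bytes each = GB of weights
    return weights_gb * overhead

# fp16 weights are 2 bytes per parameter
print(round(model_memory_gb(7, 2), 1))   # 7B at fp16  -> ~16.8 GB (tight on a 16GB card)
print(round(model_memory_gb(13, 2), 1))  # 13B at fp16 -> ~31.2 GB (needs a 24-32GB card or unified memory)
print(round(model_memory_gb(70, 2), 1))  # 70B at fp16 -> ~168 GB (quantization needed to fit in 128GB)
```

This is why a 4-bit quantized 70B model (0.5 bytes per parameter) can fit in a 128GB unified-memory Mac while fp16 cannot.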
@@jameshancock you're literally trying to justify shit based on emerging tech?
These neural networks are like the old phones, hella bulky, weird, and not that great. Give it time before you think raw power is the solution to this. It's gonna be bad for consumers if it were to be that way.
@@TragicGFuel this is about chips available right now. And Intel can't compete with the Mac because of architecture RIGHT NOW.
The nicest thing about the future is that I will be alive to witness its unveiling, and it promises to be excellent. Thank you for another excellent video Alex.
I think Windows itself needs a complete reboot for ARM, but it concerns me to an extent. Would ARM ruin/change PC building forever?
Yeah I love my Mac as my laptop but I also enjoy building and upgrading my desktop for gaming. ARM is clearly amazing compared to x86 but I’m really hoping that not everything becomes an SoC. I really hope PC building doesn’t die.
@@-Burb Probably replaceable SOCs? That would be awesome.
Intel, AMD, Nvidia and x86 are dead for good.
It's all being replaced by ARM, and everybody is making their own SoC, which means no customers.
This is it.
Maybe we need a future with both: x86 for desktop PCs, where you don't need as much power efficiency and have lots of room for cooling, and ARM for mobile devices with a battery.
I hate fan noise. It came back in M2 and M3 :(
Is the test using the latest BIOS update from Intel? Because I read there's a new update that improves performance on newly released Meteor Lake platforms.
Yes, it's a recent test. That BIOS update is already a few weeks old. Remember, he's pointing out single-core performance here.
I kinda blame the OS for some tests; Windows has been really lazy with its memory and power management for applications, to the point it could be a bottleneck for the hardware. Of course I could be wrong, but it's just a thought.
You need to look carefully at the test sheet. It says all chips were running Debian 12 and GCC, so these tests were carried out on Linux, not Windows, and this actually shows Intel's Meteor Lake is inferior to the old-gen Raptor Lake. For the M1, M2 and M3 it's macOS, since you can't really run Debian on those MacBooks yet.
@@user-ft3by4sc2q So you mean on Windows they're way worse?
Tests were done on Linux - MS has nothing to do with it. But @OP is right: the reason Apple is in the top percentile is very much because they approach performance end-to-end on a few very well known devices that they, once again, tuned themselves from the beginning. So even if Intel came up with a processor that beats Apple on benchmarks, it would be questionable whether that carries over into practice, due to the zoo of puzzle pieces shaping the overall "experience".
In this case, it's Intel's fault. It has been proven again and again that Windows actually runs faster… on a Mac 😂
@@user-ft3by4sc2q I might be wrong, but isn't Linux currently not really optimized for Intel's Meteor Lake? I think over at Phoronix they were talking about this. So I would say it's currently a bit too early to say that Meteor Lake is inferior. Give it a bit more time so that Linux can catch up, and then let's see again.
Would love to see developer experience review on the ultra 9 laptop Zenbook Duo 2024. And see speedometer results.
What about the Intel Xeon W-10855M and W-11955M for Dell Precision and Lenovo P-series workstations at 230 watts? How efficient are they when running LLMs and AI-related tasks?
Can you try to run llama2 on any of the new intel Core ultra CPUs and compare with m1-3 ?
Hey @AZisk, did you know asitop on the Mac builds up memory pressure over time? I ran it for 3 hours and it took almost 4.5 GB. Weirdly, you'll see Python running high on memory, but under the hood it's asitop!
Are Apple and Intel really competing though? I’m an Apple user and find Apple laptops incredibly expensive. A decently specced M3 Max is around $4K.
For that you can buy a higher-end Nvidia-equipped PC and go to town running AI models and a huge game library. Apple is more about creative productivity and a tight integration between hardware and software.
Does it make sense to compare both as if they were the same ecosystem?
Well, competition is definitely good for moving tech forward, but I'm not so sure it actually keeps prices at bay. I mean, if we look at the phone market, competition is strong but prices keep going up every year.
on the one hand i could say “imagine how much an iphone would cost now if others weren’t doing the same thing”. On the other hand, maybe they all are enabling each other to keep climbing
@@AZisk Yeah, after a while it always ends up to the latter.
I always assumed that Intel had at least 5 years of incremental upgrades ready, and they could skip 2 generations ahead on a whim. But they didn't even do that? No preparation for competitors catching up? Wow. What terrible business practices. Or rather: what short-term vision.
Is there still a big performance difference with Intel Core Ultra laptops running on battery vs being plugged into AC power?
good question - I’ll have to check that out when I get one in
The performance difference is like 10~20%, whereas on the AMD 7840HS it's about 30~40%.
Yes, of course there is a huge hit. These "efficient" Core Ultras have plenty of benchmark "cheats" implemented; they go over 120 watts in power mode.
While Excel can use up to 8 threads for its recalculation, that applies only to intrinsic Excel functions. Any VBA code is single threaded. You can’t even run two totally different VBA macros on two separate threads, the entire VBA environment is single threaded. So, if you’re using Excel with any notable amount of VBA, single threaded performance is critical.
Yeah, and even otherwise many calculations are such that there is one determining thread, which can at points spread out into multithreaded parts only to converge again later. Excel is a good example, though, where MacBooks are rather useless, as they only have a gimped Excel without its more advanced features.
@@InnocentiusLacrimosa Clearly, you haven't used Excel on a Mac recently. Virtually feature-for-feature complete, and it runs faster on an M1 than on any PC I've tested.
@@geoffstrickler Does Excel for Mac perform well at a basic and intermediate level?
@@waqasahmad6111 Is there some part of “It runs faster on an M1 than on any PC I’ve tested” that wasn’t clear? Did you not read the part about “virtually feature for feature complete”?
@@geoffstrickler English isn't my first language, so I don't fully comprehend it even now. What I know about the MacBook is that it lacks advanced features like VBA, Power Query, pivot tables, various lookup functions, etc. I'm in the accountancy field, so I need a laptop, and I was advised to avoid a MacBook because online exam software has issues with MacBooks, unlike Windows PCs.
Idk, that toothbrush looks solid. You may want to keep it as a backup.
maybe i accidentally stumbled on a dental industry disruptor
1:06 use an AI Electric toothbrush. That's way more cool and advanced.
As someone getting back into coding, and also needing a new computer, PLEASE test the Ultra 7 chips against the M3 Pro (they're the closest in price range-ish). Currently stuck on whether I should go for the Asus 14 w the Ultra 7, or an M3 Pro.
Yes, because you need a spacecraft of a CPU to code 🤦♂
@@KrypteiaXi Well, not exactly, but I am looking into learning Mixed and/or Augmented Reality development, so…..in a way, sorta lol
@Phantraas if you are a developer and you have the budget you always pick Mac, it doesn't even need further research as to why.
@@KrypteiaXi🤦
Great one!
I've got an Asus Vivobook with an i9 13980HX. Not too impressed with its compute performance, but it certainly keeps the room warm during this winter cold snap.
Great video! But what's the cat noise at 1:53? 1:54? xD around that area
it was a cat
@@AZisk the fact that it was an actual cat, amazing.
I do have a question:
Will Intel/AMD match Apple's performance and efficiency with x86? Or is ARM the only way to go?
@@aienp IMO ARM is the way to go.
I wish I could go back to a mac at times but I need to run more than ARM based VMs for courses.
It seems Apple has it figured out, while Intel still doesn't.
Well, an M3 Max MacBook Pro is £4,000, whereas a similarly specced Lenovo Ultra 7 is £1,199. I don't think it's right to compare them, imo.
That's not a similar spec at all. The M3 Max is a game changer with so many P-cores; it's a monster mobile workstation, and really we can't even compare any CPU to Apple's SoC, there are so many more benefits. @@maphazar7549
I love the focus on efficiency that Apple has started. My ears can finally take a rest from all the fan noise.
“Who gets a head… or who stays ahead.” 😂😂😂
Not really sure this is the biggest change in 40 years for Intel. I'm thinking of the i386 moving from 16-bit to 32-bit, or maybe even the shift to 64-bit chips, as bigger changes for them.
Qualcomm Snapdragon X Elite based laptops are what I'm waiting for this year before I buy a new one. Hopefully they're worth it, alongside the enhanced ARM version of Windows.
Samsung Galaxy Book 4 14-inch models have lower performance due to power limitations, and the 16-inch models have performance similar to the M1 but consume twice as much power. That's also a benchmark when plugged in; when powered only by the battery, it lasts 1 hour and 10 minutes. So much for Intel power efficiency.
Efficiency is critical when you are talking about mobile devices, it drives better battery life and also allows better form factors. Efficiency in desktop machines is less important but excess heat still makes the machines need more cooling and so more noisy. When it comes to datacentres, compute power per watt once again becomes a game changer. Since there are so many machines crammed into the same space, keeping the machines cool becomes a problem, and the power consumption means cost. Not only does it cost a lot more to run, but the double whammy of costing more to cool as well makes it an issue.
How does this relate to Apple and Intel? It's lucky for Intel that Apple doesn't concentrate on the datacentre space. If we started to see the same improvements in performance per watt there, then the savings and incentive to innovate would be pretty major. (Yes, I know that a lot of the applications in that space are Intel based - for now - but when the threat is economic, things can change rapidly.)
Intel blamed the poor performance on a bad BIOS. If a software update can unlock major performance gains then I have a bridge to sell you.
It's about efficiency, and they failed at that.
I've held the stock for years, so I'm hoping the ultra series will do well despite the early disappointment.
Imagine the current Intel processors with their heat and power consumption in a thin apple product. That would've been a disaster.
Meteor Lake is great for laptops, which I believe was its intended purpose. It does what it needs to do. Since it is based on legacy architecture, it's not surprising there's no leap in performance per watt comparable to what the M1 demonstrated at release.
So when will Intel make the same type of leap in performance per Watt?!? I can't be the only one who wants a Windows Laptop that's at least capable of running light tasks without sounding like a Jet Engine...
Great video. Intel has a LONG way to go to catch Apple’s power management as the os is not tuned and they need more power to keep close on performance.
The Core Ultra H series is the successor to the 13th-gen P series, not the H series; the Core Ultra 9 185H succeeds the 13th-gen Intel P series.
Why do Intel chips have lower performance than Apple silicon at the same power envelope?
I believe that a few BIOS updates will improve Intel performance at least a bit.
That most recent one already improved performance by 10%. Not sure how much headroom is left. But even then 10% is already quite good imho.
@@woeye3251 I think the same happened with their ARC GPUs. I don't have one, but the first reviewers had nothing good to say about them. Nowadays, some reviewers even recommend them for a cost conscious build.
Question. Could Apple put mutiple SOC’S inside a Mac Pro? Would love to see that.
The M2 Ultra should not be the biggest chip; the M2 has more than one interconnect side, so more than just two of them are fusable. They probably can't yet reach acceptable extra performance with a doubled Ultra chip, so they are still working on it. The M2 Ultra Mac Pro was released only because they had to give us the last piece of Apple Silicon hardware to keep their promise (finishing the Apple Silicon transition in two years).
Just check the Snapdragon X Elite benchmark scores at 23W and at 80W; the difference is only 10% extra performance. Really, 10% extra performance for 250% extra power consumption? This is probably why Apple still hasn't released a doubled Ultra chip.
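The perf-per-watt arithmetic in that comparison can be checked directly. Taking the commenter's figures (23W vs 80W, roughly 10% uplift) at face value:

```python
low_watts, high_watts = 23.0, 80.0
speedup = 1.10  # ~10% extra performance at the higher power limit

# Fractional increase in power draw when going from 23W to 80W
extra_power = (high_watts - low_watts) / low_watts

# Efficiency of the high-power mode relative to the low-power mode
perf_per_watt_ratio = (speedup / high_watts) / (1.0 / low_watts)

print(f"{extra_power:.0%} extra power")        # ~248% extra power
print(f"{perf_per_watt_ratio:.2f}x perf/watt") # high-power mode keeps only ~0.32x of the efficiency
```

In other words, the chip gives up roughly two-thirds of its efficiency to buy that last 10% of performance, which is the trade-off the comment is pointing at.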
Tbh I think the Mac Pro is dead; there's just no need for such a tower anymore, the Studio can handle anything.
Gluing 8 Ultras together, then yes, but why?
Haha you say catch up like ketchup
🤭
It is also due to the OS. Apple has its CPUs highly optimized for its own OS.
IMO, laptop computers should come with the option for efficiency and/or power whereas desktop machines should bias towards performance. I'd be happier if the M3 came in a version with more performance cores and only 2 efficiency cores.
I ain't seeing Intel doing much of anything against their first "real" ARM CPU competition. I'll put my eggs in the Qualcomm basket.
It's funny; there are people who have said this would be the case for about a year now. Intel didn't seem worried about increasing traditional performance with these chips. Not sure if it has anything to do with incorporating an NPU in the SoC. What I can say for sure is that comparing an x86 CPU to an ARM CPU is folly. It always has been. You look plenty old enough to have learned this at least 30 years ago.
AMD upstaged Intel too back when they hit 1GHz decades ago.... but it's good someone finally woke Intel up... they were power hungry, heat generators.... and it's good to see Qualcomm and Apple giving Intel a run for their money.
I don't know if intel can catch up but AMD is making strides. There are rumours that they will move towards a more Intel-like dual ISA architecture after Zen 6
@@QuangHaMinh-oy1bc Intel made 2 versions of the x86 instruction set. One for low power, and the other full set they share with AMD. The rumours say AMD will do the same. But I guess my original comment was ambiguous.
I hope not, I hate efficiency cores.
What are your thoughts on the Gaudi2 processor?
i haven’t heard of it. will google it.
looks interesting. It’s not a proc that I will likely get my hands on physically, but I’d be curious to test it out if a cloud provider exposes these directly for ml use.
@@AZisk It's Intel and I'm sure they'll work something out with you to get the buzz going. Worth an interview with them anyway.
looks like there might be a provider, and much better priced than A100’s. x.com/Aki__Singh/status/1743346282121339023?s=20
Honestly it's a pointless conversation. We can't be comparing performance numbers just yet.
Even if Intel can only get close to the power draw of the Mx, that's enough.
They can add more performance later with newer nodes.
What about power consumption?
That's what Core Ultra is all about; they failed miserably while cheating to shine on benchmarks.
Intel: How about 30 efficiency cores? You can use those, right? Meteor Lake is a pretty massive failure imo, especially given the timing. At least their iGPU is catching up to AMD, though.
I am interested in all CPUs / SoCs. For me it's important how strong the single-core performance is, since starting programs and running most apps uses only a single core today. So multicore performance just tries to blind you to weak single-core power. Some programs DO use more than one performance core. Yet even the iPad Pro may not use more than a single performance core at a time, which really sucks. Even with Stage Manager or a connected power supply (up to 40W PD), my experience shows that all four performance cores are never used at all. The reason: too many constraints in iPadOS and power savings. But is this "professional"?
But isn't this Intel's FIRST iteration? M-THREE versus a first edition release? Well done. Apple had enough time to improve and streamline their stuff. Let's come back to this discussion after 2 years.
Here’s an idea for another episode. Can you please 🙏 do a comparison that shows real life usage for a .NET developer (me 😅) using a MacBook Air m2 with 24 GB of RAM. The idea is to see how the machine performs when running the VM attempting to build and do other real-life tasks in comparison to a Windows native machine? This could showcase the true potential for having MacBook as a .NET developer
Well, I can't speak for the MacBook Air, but I recently purchased a MacBook Pro M3 Pro with 18GB of RAM and it compiles faster than my desktop PC with 64GB of DDR5 and an i9 13900K. I tried building some personal projects plus Docker and the memory usage was around 12GB, so I think 18 is enough. Not to mention the fans never kick in; it stays silent and cool too.
@@KarlynGR I can also confirm this behavior. Keep in mind that I am a Windows developer and I have about 7GB of RAM allocated for Parallels (VM). Aside from that, the CPU utilization is phenomenal and build times are substantially faster. In fact, compared to my i7, I have seen up to 70% gains on build times.
8gb ram. Yes, Apple is the best!
8GB of RAM is plenty for the vast majority of users, just like 128GB would be enough for the vast majority of users. Why not have an entry level?
@@JohnSmith-pn2vl Because the price of upgrading from 8GB to 16GB on an Apple machine is worth some people's salary...
I think people should keep their eyes on the Ryzen 7/9 AI + 780M SoC. AMD is catching up to the M3/4 in another generation or two.
Are those the chips that wake up when the PC goes to sleep? 😂
Although other CPU features are a plus, high CPU performance offers longevity. Compare an i7 and a 10-year-old Celeron: the i7 still handles a lot of things, while the Celeron today struggles even with the web.
I believe Apple is playing it well in delivering exceptional performance and high mobility in their laptops; as a software engineer I care about both.
m3 max is an insane machine, a true mobile workstation.
Well said.
I think Intel really needs to focus on power efficiency for laptop chips. AMD is ahead of Intel in that regard, and it doesn't help that Windows software isn't well optimized for their architecture because of how different it is. In the gaming handheld world AMD is probably going to be hands-down better than Intel. Meteor Lake's APUs might be a little more powerful when enough wattage is thrown at them, but the fact that MSI hasn't allowed reviewers with pre-production units to test battery life is a very bad sign.
Seriously, do use an electric toothbrush.
Apple's CPUs and GPUs are great, and it's good what they're doing, but for me gaming is up there with development, so I don't want to switch from x86 to ARM. The day gaming is good on macOS and on ARM, I'll make the switch. Until then I'm good with other products.
It did a great job. Its focus was power consumption, and it delivered on that, which makes a lot of sense in laptops. Yes, single core saw a regression, but multicore should fare fine. This is a completely new architecture and a stepping stone to Lunar Lake, which shows high promise. So the important thing is how good Lunar Lake turns out: if it underperforms, Intel is headed toward a dead end. Whether that would be the end of Intel, who knows, but it would be very grim.
Hope Intel can catch up; it's really unfortunate that the most powerful option out there is Apple's.
Great video. Apple receives a ton of criticism over their pricing, but we should also note the price reductions on many devices once they kicked Intel out. Apple knew how good their M1 chips were compared to Intel and could easily have taken advantage.
Well, I recently compared prices for top tablets and the MacBook Air M1, and the MacBook wins by a huge margin, at least on functionality. So the $1000 price is kind of justified.
To be honest, the pricing is more than fair just considering the display quality, speaker quality, and build quality of the devices. Apple has a huge advantage because they excel in many details compared to other laptops.
@@Roma-tz9vgoh yeah, totally, for an unusable 8GB of RAM 😂😂😂 🐑
@@najeebshah. While 8GB of RAM isn't really good, you have to realise that most people who use Windows don't realise how much RAM Windows uses, while the Mac handles RAM much more efficiently. If you aren't coding or editing, you don't need more RAM; however, I personally would prefer they change the base RAM to 16GB.
It's also worth noting that checking RAM usage on a Mac isn't a good way to compare, because macOS considers unused RAM wasted space, so it tries to max it out. Checking how much swap space is used is a better way to compare.
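A quick way to check that yourself on a Mac is `sysctl vm.swapusage`. The sketch below parses a captured sample of that command's output rather than calling it live, so it runs anywhere; the sample numbers and the awk field position are assumptions for illustration:

```shell
# Sample of what `sysctl vm.swapusage` prints on macOS; run the real
# command on your own machine to get live numbers.
usage='vm.swapusage: total = 2048.00M  used = 1312.50M  free = 735.50M  (encrypted)'

# Field 7 is the "used" value. Sustained growth here indicates real memory
# pressure, unlike the always-high "memory used" figure in Activity Monitor.
used=$(printf '%s\n' "$usage" | awk '{print $7}')
echo "swap used: $used"
```

If swap used stays near zero under your normal workload, the base RAM configuration is probably enough for you.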
It starts to bother me that Intel stays so far behind Apple in terms of battery performance. I hate Apple's upgrade pricing, but I might switch to them if Windows laptops can't catch up.
Intel is like: "POWER OVERWHELMING!!!!! We burn!!!! "
Amazing cosmic power, itty-bitty living space. Apple Silicon is like the genie in Aladdin.
I think Intel and AMD aren't even trying to catch up; they're focusing on providing good CPUs for the average consumer who doesn't have $5-6k to drop on an iMac. The only reasons people buy one are work or having too much money. Better to focus on the 90% of the market and compete with AMD than to fight Apple for the 10%.
Efficiency is the main focus of this line of CPUs; Intel's been really adamant about it. So yeah, if you just want efficiency and need an x86 system, Intel is really fighting for your attention.
Still not efficient; they've got plenty of benchmark cheats going. Let's see the battery life and noise, which is what efficiency actually shows.
Apple is using 3nm; Intel is using a packed 7nm-class node running at higher frequencies. Of course it's not as efficient, but it will get better on the next node. 😅😊
I can see where this is going. Every year after the new Apple processor beats the new Intel processor: "It will get better soon".
Honestly, Intel needs to dump backward compatibility to really gain efficiency. Either that or they should go all-in on RISC-V to start fresh. Gamers Nexus shows just how hot and inefficient the latest Intel CPUs run. Windows is going to support ARM, so that's probably easier than ripping legacy stuff out of the CPU designs.
Windows already supports ARM and has for years
@@benammiswift But Windows on ARM, from what I've heard, isn't good right now.
@@polarfr_st I mean, good or not it exists
I don't like Intel recently, but I need them to succeed.
could you install any free operating system on your m3 and compare? 😉
if i had a ferrari, would I install 4 different used tires on it?
@@AZiskIf it's from Apple, of course not. You have to buy a new car when the tires are worn out.
Joking aside, I wanted to express that they are completely different approaches: a closed, proprietary, specialized system versus an open one. I use Apple for development myself because native iOS clients are easiest to develop in Xcode, but I also develop on Linux and Windows. For example, when I compare IntelliJ between my M3 and my 14900K running Fedora 39, there's a light-years difference in feel between my Linux machine and my M3 Pro, in favor of the 14900K + Fedora (with all the disadvantages like power consumption, etc.).
There is simply no holy grail. Only light and shade, depending on the perspective. That's why I'm absolutely against polarization.
Yes, I also use all those systems. Unfortunately, not everyone can have or afford all of them, especially the latest and greatest. If I could only pick one, I would still go with Apple, and not only because I have to build iOS apps, but also for the simplicity and stability.
Chiplet designs are a mistake.
Intel user here. It's good to hear that Intel is focusing on efficiency. That said, only laptop users really care about this. Power users rarely care about efficiency. I certainly don't.
your voice is just incredible
If Intel had the most efficient x86 chips, one could argue that nobody beats them at their own game, but AMD's CPUs offer so much more battery endurance. Intel is lucky that AMD doesn't have enough supply to keep tons of OEMs stocked. From the looks of it, Meteor Lake is still not as efficient as the Phoenix CPUs, so for anyone who needs the x86 architecture and a laptop with great battery life, it's a no-brainer to go with an AMD CPU so far.
Yes, Apple has several years' head start, so it will be a hard, uphill battle for the competition to even catch up. Great for consumers though lol
We're talking decades here. Apple controls everything, while Windows and Android are the clusterfuck they are; that's a huge problem when going SoC and ARM.
Very hard and time-consuming. There are just too many devices, companies, and factors involved. Super hard.
Surprised you didn't mention AMD... they have been killing it on efficiency these last few years.
i did
did you watch the full video ??:)
Intel's inefficiency is due to maintaining backward compatibility with older software (remember SSE4? *cough*...)
Just my 2 cents ✌🏼
This dude is so wrong.
Intel did great with the Core Ultra processors.
Instead of going for the fastest performance, they did a good job streamlining performance between the plugged-in and unplugged states.
The fans don't run loud even at full load.
Task scheduling improved a lot with the latest BIOS update and will improve further.
The iGPU is great; you don't even need a dedicated GPU to play a lot of games now.
Battery life is on par with other processors.
The NPU is already great if you're a streamer, and it will further help AI processes.
More and more software is being optimized for the NPU, and that will play a huge part in its longevity.
which machine did you buy?
Apple isn't actually that efficient under load anymore; they're way more efficient at idle and low-power tasks, but under high load AMD is basically already on par with Apple Silicon.
Apple's chip is much more expensive to make, but they control the whole device, so they can recover the chip's cost elsewhere (RAM upgrades, etc.). This is because the whole chip is manufactured on a very expensive node.
Intel has to compete on price for just the chip, not the whole device, so the economics are harder. And their tile approach makes much more sense: it reduces cost by a lot and allows mixing and matching tiles depending on the application. And they cover a much wider and more varied market than Apple.
Yup, I want Intel to smash that Apple Silicon.
LPDDR is faster than DDR actually 😅
No, "low power" just means lower power, that's it; the SoC design is what makes everything so fast.
@@JohnSmith-pn2vl But LPDDR is often indeed faster in transfer rate: DDR5 is 5600 MT/s, and LPDDR5 is 6400, IIRC.
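For what it's worth, those transfer rates translate to theoretical peak bandwidth as transfers per second times bus width. A minimal sketch of that arithmetic (assuming a 64-bit channel for both; real laptop SoCs vary in channel count and width, and Apple's chips in particular use much wider buses):

```python
def peak_bandwidth_gbs(mt_per_s: int, bus_bits: int = 64) -> float:
    """Theoretical peak bandwidth in GB/s for one memory channel.

    mt_per_s: transfer rate in megatransfers per second (e.g. 5600).
    bus_bits: channel width in bits (64 assumed here for illustration).
    """
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

print(peak_bandwidth_gbs(5600))  # DDR5-5600: 44.8 GB/s per 64-bit channel
print(peak_bandwidth_gbs(6400))  # LPDDR5-6400: 51.2 GB/s per 64-bit channel
```

So the raw-rate gap between the two is real but modest; the larger differences between laptops come from how many channels (and how wide a bus) the SoC actually wires up.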
Intel has been failing pretty hard for the last 3 years or so, this isn't really a surprise sadly.