Apple feel snappy, no My MacBook Air M1 16 GB often feels SLOWER than my old Lenovo laptop !!! So what I hear you are saying is that AMD, Intel, Qualcomm and Apple will be very close at the end of this year, Intel and Apple are no-upgradeable, are AMD and Qualcomm upgradeable ? What will the price/performance be for the four ? I guess that Apple will be the most expensive !
The problem with Apple is that there are only a handful of apps & games that can work on the ARM platform when compared to AMD, Intel & Nvidia. And Apple's superiority over AI has narrowed, if not been surpassed by their competition in 2024 and going forward it's only going to get a lot harder for Apple to maintain a lead if any. However, we all benefit from this competition.
In late 2020 Apple ignited the competition with the M1, now almost 4 years later we have 4 great chips to choose from. This is why competition always helps the consumer.
@@amirtorhan2762 lol no, AMD already competes well with apple in terms of both battery life and performance. While not sacrificing upgreadeable RAM or SSDs. Wake up, Macs are no longer gaining market share like they used to.
Intel and AMD will definitely have their share of the market. TSMC is at max capacity and investing in other semiconductor companies will be an absolute power move, I keep increasing my shares manageably. Different chips are good at different things and Nvidia has been very specialised, which leaves other aspects of Al open.
Of course, i had bought NVDA shares at 300, 475 cheap before the split and with huge interest I keep adding, i’m currently doings the same for PLTR and AMD constructively. Best possible way to get ahead, is participating behind top experienced performers.
I agree, I'm compiling and picking stocks that l'd love to hold on to for a few years before retirement, do you think these stocks would do better over the years.
You are buying a company to own it and not a piece of paper, The market is a zero-sum game (2 sides), Know what you are buying not just out of trend interest.
Not 4 years to recognize, 4 years to pivot. Especially when chips like this are generally developed for a decade, this was a last ditch rush job to catch up
It is institutional inertia, people running the BU don’t want to disrupt their existing book of business, until it is too late. That is how lot of great tech companies end up in history’s dust bin.
Apple spend 10 years for developing M1 .... do you know who are the top development leader at qualcomm x elite ? Some former top engineers, manager and so on from apple... Intel will bankrupt in maybe 10 years, like kodak because they are unable to improve and innovate, they just follow with 4 years delays and try to copy the m1 chip ... and still with x86
In my opinion it doesn’t matter who is making the best chip.The important thing is that there is competition and competition is in the end always good for the users
Keep in mind Intel isn’t abandoning upgradable memory. As of now, the integration is just for Lunar lake. Arrow lake (desktop) due later this year will utilize CAMM-2 memory. I believe the processor will still have integrated memory such as Lunar Lake as well… so the big difference is that it will not be “unified”
It is likely that all future ultrabook chips will have soldered RAM, and CAMM2 will be only for desktop, gaming, and pro machines due to the expense and the need for a separate board (space and weight). Also, the problem with CAMM2 is that if you buy a machine with 16GB and upgrade to 32GB, you will have to buy a new 32GB board. The 16GB board becomes worthless e-waste, and the 32GB board currently costs $279 (64GB DIMM costs $150). 95% of users will not care about it being modular, especially as it is currently of poor value. The main advantage is that we can get the fastest RAM in these performance machines rather than the current limitations--like no LPDDR5X.
@@ahaimes6320no the new memory. Modules are extremely. Thin. And made specifically for thin. Laptops handhelds etc desktops have plenty of space and don't need thin modules. Apple could also allow for memory upgrades with these by having soldered unified and slots for camp which would actually be as fast as the sidered unified memory for most things Instead of going to the SSD when. Memory runs out, it will dynamically use the much faster camm memory modules. If apple does this it will be huge for consumers and professionals who needs change over time stop waste and help performance.
@@ahaimes6320 I've seen an article quoting one of the OEMs as stating that CAMM2 is actually cheaper to produce due to only one board and less materials being needed compared to the typical 2-4 ram sticks that have existed with DDR; if that's true the high prices right now are probably just because it's new and new always commands a higher price for first adopters. Prices should come down to affordable levels before too long just like DDR always has after a new variant. A 16 GB board is hardly worthless either as it can be repurposed in another machine that doesn't require more. 16 GB system ram is still more than enough for most computers that are only used for basic office work and media consumption.
Hate to break it to you but the M4 is not going to get 972 in cinebench. the uplift in geekbench mainly came from the new support of matrix calculations on the CPU side. cinebench does nothing with that. It will be slightly better than m3 maybe 750
didn't m4 get miniscule upgrade since they got all massive gains and now it is mature platform that can only be refined? some people here praise x64 as it's the only way forward, but engineers say that we're nowhere near limits of x86, from leaks, zen 7 is supposed to work on power efficiency over the whole voltage curve, so better power managment might be even possible on x86, i don't feel like x64 is the only way to make things power efficient, more like all x64 were designed with mobile in mind, so they are for lower power scenarios
@@1Grainer1x86 is essentially dead, Many new Intel and some new AMD chips have dropped support for “real-mode” 8/16-bit x86. It’s virtually all been x64 (more accurately x86-64, an AMD invention) for the last 20 years. And both Intel and AMD have labeled even 32-bit x86 instructions as deprecated or “not recommended” because it’s much more useful to optimize the x64 instructions, so they don’t focus on optimizing the performance of the 32-bit x86 instructions.
@@geoffstrickler so it is essentially dead since windows 7 or vista (dunno if vista had 64-bit version), so you're just proving that this video and comments are wrong the only reason i know of that windows didn't fully commit to x64 is because most programs would have stopped working, so most probably all engineering stuff that never worked on apple
@@geoffstrickler 8bit and 16 bit processors are for low power embedded applications, recent macOS X versions are 64 bit - not even smartphones use them. 64 bit is used today because the bus and address widths can be wider meaning bigger data (loaded, stored or moved at a time) and more complex instruction can be executed in ~the same number of clock cycles (amount of time to execute an instruction), that's why we're using x86-64 (The x86 instruction set - list of all the most basic instruction a CPU can perform - has seen changes from AMD and Intel). x86 also focused earlier on floating point computation which was what beat PowerPC and the reason it's not in your Mac today. Intel and AMD have innovated chip design, bus, cooling, microarchitectures - that's how we've had faster chips. x86 is dead...x86 is garbage...but many improvements in computing came (or because of competition) from x86 because it is just that good.
The lunar lake chip can go under 1 watt in idle and 2 watt decoding. It is way more efficient then the z1 extreme/7840u/8840u. At most that uses 4-5 watts of power, and 8-10 watts of total system power. It is way better for laptops if you are concerned about battery life compared to amd.
@@marcogoncalves1073 was in the pc world video. The intel people had 3dmark on but the benchmark wasn't running, it went from 3 watts down to 0.958 watts.
Apple defines smart phone, Apple defines tablet, and after 4 years of released of M1, Intel and Qualcomm finally confirm Apple also defines laptop cpu.
Apple defines consumer desktop CPU, the m1ultra was way more powerful than 12900K, m2ultra is slightly more powerful than 14900K. Apple is the definition of CPU SPEC CPU® 2017
@@PKperformanceEU Yet, Windows Intel and AMD run the desktop market. ARM chips are best for laptops where battery life and portability are important. Not desktops within the USA. Add on gaming is drastically easier on X86 architecture. Same for CAD software used by engineers (Civil, Mechanical, and architects). For laptops, ARM chips > X86 chips. For desktops X86 > ARM chips.
@@ricarmig CoPilot is a travesty and I foresee MS in W12 not allowing you to disable it, given their history of monitoring. One reason why I only use my PC for gaming only, and my personal stuff on other OSes.
You're right. Learn from the time when Apple switch to arm. Those early adopters just pull their hairs out when things don't work (even with Rosetta. I expect it to be worse with Prism simply because microsoft has a track record of writing bad software compared to Apple). Don't pay to be beta testers for the new chip. 😂 Wait for the entire industry to switch to Windows on Arm before you switch. That tiny gain in terms of power efficiency (compared to Lunar Lake) is not worth pulling your hairs out, especially when you pay for it. You'll end up spending even more time and offsetting that power efficiency trying to make unsupported things work. (from experience). Having said that, we still need guinea pigs to be beta testers to make things work. I just don't want to be that guinea pig 🙂
It sounds like Lunar lake might fulfill my dream of playing esports games with an igpu and getting consistently decent frames on low settings. Id love to have the battery efficiency and portability of a thin and light and still be able to comfortably play games
You will need frame interpolation (AI), which reduces the quality. But, yes, if you are ok with artificial frame rates then the iGPU in these new ultrabook chips may suffice for casual gaming.
@@thecreativepeasant8781 ask ai to write code that you would usually have to search GitHub for, then correct and repurpose it for what you need. Speeds up your workflow
@visitante-pc5zc theoretically a lot of things, but we need to wait and see if it will turn out well.For sure the tech at the base of the node should be very advanced, and the first among every other foundry with back side power and GAA transistors. This two things together should brings big power efficiency gains compared to normal FinFet transistors, the ones used till today.
They arent using N3E. They are using tsmc to bide time for their own foundry. Dont forget that intel is planning to compete directly with TSMC to make nvidia, apple, qualcomm and amd chips, their goal is to manufacture chips for everyone. That will be their main competitor, not nvidia, amd, qualcomm or apple.
Hey Vadim, Intel also redid their new iGPU too! It's supposed to help much better in rendering, better Ray tracing support, and now it'll properly support XeSS.
The vast vast majority use their computers for basic stuff like surfing, RUclipsrs, emails, and simple word processing and spreadsheets. These people could care less about AI and cutting edge processors. Battery life is however very important in the laptop arena.
@@oldgrub x86 has the best compatibility. Besides better performance, there's nothing going on at ARM. You want to emulate everything? Best-case scenario would be x86 becoming just as efficient, not Intel also making ARM chips.
Will be interesting to see how snapdragon will be supported in Linux, I would really like a good Synology NAS with such a cpu and maybe clients as well, like a raspberry sized board.
Should I wait to buy a new laptop with one of these new intel lunar lake chips or AMD 9000 as I need one by August or should I just buy now? With everything going on I am just very confused?
Intel's battlemage is set to match AMD's 890m at a lower power draw Source: videocardz.com/newz/intel-core-ultra-200v-lunar-lake-reportedly-hits-4100-points-in-timespy-at-30w
@@NothingNewNerd would u say that both will deliver roughly same experience? But I mean I seen the first review for new amd chip and they all point out that the computer get very hot
Looks to me like it is Apple, Qualcomm and Intel efficiency chips for laptops and 2-in-1s and the x86 chip still for power desktops. For me if I upgrade this fall it will be Windows (Snapdragon or Intel).
Yes and no. Think about it like this, the power users buy M4 sell m3, the M1 users buy me sell M1. Also depends if it's ultra pro or base model when consumers upgrade. When m1 was new the power was too much, now it's normal. So If I want to buy M1 or X plus I'd be hoping the new stuff sells to the power users so the M1 becomes more affordable
@@WarAlex16 The Galaxy Tab s7 came out in August 2020 - OS updates until 2023 and security updates until 2024 - In regards of duration not even close to the ipad air 2.
What do you mean by useable ? Are you talking about Android developed by Google or the hardware developed by Samsung ? Android is at the whim of Google so Samsung has no say in that if Google pulls support for an OS . An iPad air 2 is hardly usable. What do you actually do on the tablet?
We keep seeing various benchmarks where Intel is faster than Apple, or Apple is faster than AMD, etc, etc, but the real world results show how much Apple has worked on integration and system-wide efficiencies. I hope MS take note of this part and don't just rely on brute force of the fastest SoC to get results. This is an opportunity for MS to get rid of some of the baggage associated with legacy Windows and to work with vendors such as Adobe to fine tune their apps to the new SoC's.
Sadly software has never been Microsoft's strong point (which is kinda' weird if you think about it). I like Windows 11 more than 10, but it's still quite inconsistent at times.
If they got rid of the baggage why would one any use windows ? People use Windows because they know that there software will work on it. if they did what Apple did with their silicon or got rid of backwards support it would kill the Windows Market. If anything they should have continued the development of Singularity and Midori and instead just launched two OS's one that is the advancement of current Windows and the second being a new OS. This way they could have developed the new OS without the baggage while still having Windows but with limited new features and then eventually kill it when the new OS is up to par.
I really hope Snapdragon X takes off this time so it makes it worthwhile for developers. For the small chance that then Apple would bring back bootcamp for a native Windows experience on Mac again.
@@Vashei it doesnt take a crash. Its a feature that helps to steal hardware sales from Windows. MacOS still only has a 14% market share while Windows is at 70% +. The same applies today as it did in 2007. And since laptop and desktop sales have been stagnant, its the perfect time to do so if Arm based Windows takes off. That being said, if Snapdragon ends up a flop, no point in doing so.
@@bort7258 you can not compare the situations. Besides battery runtime there was NO advantage for Apple. And with Snapdragon there IS NO advantage for Apple. If you pay the same money as for Apple hardware - e.g. Microsoft Surface - you a much better experience on the Micosoft side. Sad bad true. Would have liked an Apple Surface Studio... but there was never one. So I switched to Windows after 25 years using only Macs ... (beside Server stuff)
@@kaptnwelpe5322 For me and many of my peers and co-workers, running Windows natively via bootcamp on our Intel MacBook Pro's was a much better experience vs using Windows on our ThinkPads. In fact, I knew several that got MacBooks just to run Windows all the time via BC and never boot into MacOS. I fail to see why this can't be the same today if ARM takes off this time and Win developers support the platform. Touch interface on a laptop is not crucial to a great Windows experience as there are still a ton of Windows laptops that ship today without touch enabled screens. I currently use Windows 9 to 5 / mon to Friday, have a touch enabled HP work laptop and can't remember the last time my hands used the screen to navigate. For a niche group of people that require input via touch, perhaps so, but that would not be the MacOS target market, rather iPadOS. The Surface Studio is quite the niche product. Pretty fantastic for certain use cases and very well designed and engineered, but still niche and doesn't portray a typical Windows user. Even MS knows this and is why they are super slow to update them with the current one shipping with mostly 2 to 3 generation old specs at astonomical pricing.
@@VasheiI wouldn’t be so sure. Craig Federighi said a couple years ago that the ball was in Microsoft’s court. At the time, Microsoft had an exclusive contract with Qualcomm, which has since expired. Craig F basically said that if Microsoft was willing to allow Windows to run on a Mac, they would be all for it. Remember that Apple is a hardware company while Microsoft is a software company. If Apple can entice people to buy a MacBook by using Windows as a marketing hook, that becomes a win-win for both Apple and Microsoft.
Seems like we finally have a true chip CONVERGENCE. X-86 finally doesn't suck anymore thanks to arm. Fun fact.. the X Elite has actually been delayed for two years so it uses the "old" Oryon cores. Next generation will provide huge improvements, with Qualcomm solving the teething problems and gaining experience.
Yeah I'm interested in seeing if we are going to get the next generation of Snapdragon X-Elite being announced later this year alongside Snapdragon 8 Gen 4... cuz this one was delayed for 2 years.
Well when they have the new cores ready, then we will see. There's a reason why the used those old cores. And the industry won't stand static waiting for them to release them. So the odds are uncertain.
This is the most exciting time to consider upgrading my PC I have experienced in years. The Qualcomm Windows announcement is in a word, fantastic! But then I watched the Intel Lunar Lake SOC announcement and Wow. I did not expect this level of advancement from Intel this year. I was thinking of upgrading in a couple of years, but with Intel Lunar Lake and Microsoft Windows 11 Copilot+ PC announcements, I am looking for an upgrade to my Windows 11 laptop later this year. For me things can only get better from here. I just hope the promise of AI on Windows 11 PCs shows real progress. To my way of thinking software developers will have a priority to develop AI apps for X86 with an eye on ARM Windows development. But I would think most will let emulation be the bridge to ARM for the time being. In my case the Microsoft Windows 11 Copilot+ PC and Intel Lunar Lake SOC platform are the best of the best for thin and light laptops. However, I will have to wait to see how the Windows 11 PC makers design for the new world of Microsoft Windows 11 Copilot+ PCs.
As we learned that Apple with M3 uses the N3B node temporarily till M4 with N3E node is fully replacing M2 (PRO, MAX & ULTRA). It looks like that TSMC will not stop producing N3B as suggested. That production stops for Apple is likely now Intel uses the N3B node.
It's good news. Less power consumption is always good. I would never buy an Apple product because I don't like their OS and especially loathe the company and its policies. That being said, they make good products
A point that should be made when discussing the ability of the various CPUs/SOCs to run AI is the implementation within the computers which incorporate them. None of the Windows based laptops announced to date have enough onboard resources to run very large LLMs. It doesn't matter how fast the NPU is when the computer doesn't have enough RAM to load and process the LLM. If you want a laptop that can run the LLMs having higher accuracy rates, then your best option is the MacBook Pro M3 MAX (or the M4 MAX version when it ships) with 16-core CPU, 40-core GPU, and 128GB RAM (or better, depending on options available with the M4).
Microsoft demanded NPUs on all windows machines because they want to run them full time spying (uh “learning”) on all users. If this spying can’t be turned off I will never run a windows computer again. For all of us with IP to protect, we will run quantized models not much bigger than they need to be to understand our voices and the apps we run.
Thats BS. WHO on earth uses a Mac for AI seriously with NO Cuda-support at all?? And: for interference its all about the fine-tuning of the model. Excaclty what Microsoft delivers. Even more so with their ground breaking Copilot... Apple is loosing big time. But they don't care as they gave up the Mac long time ago...
Qualcomm's 1st generation Oryon cores in the the Snapdragon X Elite beats Apple's 1st - 3rd generation M series chips and is only beaten by 4th generation M4 chips. Does that mean... by 2nd generation..... Qualcomm's Snapdragon Dragon X Elite chips will equal Apple's M6 chip? 😅
@@dsblue1977 but apple has 4 generations of power efficiency refinements. compare 1st Gen M1 with 1st Gen snapdragon x elite. all I'm saying is... their first go at it.... 1st Gen already matches apples 3rd Gen 😜
One of the best responsive pc’s i had were dual processors. I had a dual processor pentium 2 that was awesome and super snappy. I also had a dual p3 1000mhz that was super responsive. Todays are fast because of ssd but dual processor pcs could be awesome. And shut off one full processor on battery life.
The 780m already gets around 36 fps in 3dmark Wild Life Extreme, the 890m is claimed to be as much as 36% faster per an OEM with its 25% increase in core count and the chips support for much faster ram. That would put it right at around 50 fps. Here's the thing though, it's been proven time and time again that performance in synthetic benchmarks often don't translate to real world performance in games. Intel's top end xe-lpg in Meteor Lake scores similar to or better than the 780m in 3dmark but gets beaten soundly in nearly every game. This is why I believe that the HX 370's 890m will be the fastest of the lot. AMD has lots of experience and optimization in the gaming segment compared to Intel, Apple and Qualcomm.
So, does Qualcomm expect people to pre-order based on their marketing materials and pseudo-ads on tech RUclips channels? Why haven't we seen benchmarks conducted by actual users instead of the performance numbers that Qualcomm wants to believe their processors can achieve?
Thanks for the video. It's nice to have both operating system and CPU options. Windows runs the office and gaming, MacOS runs the studio and Linux runs the Internet.
@@prianshubhatia2759Creative studios, yes, and more, and it is cannibalising windows. Gaming is also increasing whether you liked it or not. The difference is in the strategy, one is lost and the other is focused.
@@prianshubhatia2759 Windows is not dominant everywhere. There are some industries in which Apple is the standard. Not everyone buys computers for gaming.
There are no m4 laptops. Thats like comparing the m4 to competitors uocoming chio thats not out yet. We have m1 m2 and m3 airs out now lets compare rhem.
@@SeanLi-i7n This outdated argument continues to be repeated uncritically by digital influencers within the Apple bubble. This concept is at least four years behind and demonstrates these influencers' disconnect from the current market reality. They are unaware of the high-end and gaming laptop lines, which are far more advanced than any MacBook and sold with good profit margins. Of course, these brands do not operate with exorbitant margins, as their audience is not the typical conformist Apple customer.
Check the software you need for the tasks. If it doesn't run on ARM architecture, then Intel or AMD laptops. If it runs on ARM architecture, then Apple or Snapdragon X Elite laptops.
Also, laptop or desktop? Lunar lake is for laptops, Arrow lake processors are due later this year for desktops, they should have upgradable RAM with CAMM-2 memory
Nope. Apple has a contract with TSMC giving them like "first access" rights to the newest chip technologies. So Apple has an advantage worth about 1 year. It must be scary for Apple to think about how fast the Snapdragons will be in their second generation...
the fact apple was able to get their own cpu's under control and will never again depend on others (motorola, ibm, intel) is all the win apple needed. They don't need to 'be the fastest' they just need to be fast enough for their products and it seems they will be doing fine for the next few years. The fact other chip makers and youtubers compare to their chips is a bonus for PR.
@@visitante-pc5zc something is only overpriced if people who buy it don't see the value. clearly apple has found a market segment in all areas they play in that disagree with you.
@kleanthisgroutides7100 true or not (won't reply to that) how is apple making their own chips instead of Motorola, IBM or intel have any relation to this, do you feel they would be in a better position if they stayed with any of their old chip providers instead of doing it them selves?
@@ibraheem1224 Even 2200 single core is still excellent for daily office work. 2400+ is overkill unless you are an engineer or EA shooter game competitor.
@@akin242002 Bruhh phone chips are getting 2200 single-score nowadays. Laptop chips obviously need to be better. For reference, Snapdragon 8 Gen 3 gets around 2300 in single-core.
@@ibraheem1224 They are getting better at an amazing rate, but it's like saying you need rocket fuel to ride a bike in a park. The improved CPU single core performances are great for large CPU tasks, but not a big deal for smaller tasks unless you are in a rush.
People don’t just buy laptops based on raw performance, I think OS is far more a deciding factor. I have been a long time Pc user, switch to MAC lately, got to say, MAC OS is just better in so many way.
My windows laptop just failed yesterday. Seeing Microsofts plan for always-on AI has put my transition to Mac into emergency mode. If Apple announces an M4 Mac mini and a reasonable privacy based AI development plan at WWDC it will be the end of windows for me.
Could you elaborate on why it is so much better? I have been a Windows user (mostly Windows Pro) my entire life and my wife has an M3 Air. I don't see the advantages beyond less software updates and better video editing quality.
One of the most important feature for AMD based Laptops is emulation. It doesn't matter how good the processing power is if you cant use it. In that area apple will continue to be the obvious choice for ARM based laptops.
Glad to see all the CPU manufacturers are getting better. The real winner is the customer. - For business laptops, the X1 Carbon G13 will be amazing with Intel Lunar Lake. For tasks the ARM architecture is capable of doing, Snapdragon X Elite will take that market share. - For gaming desktops and mini-PCs, AMD will dominate from a CPU standpoint. Still X86, so gamers will not lose their games. Also, "Recall" is not included. - Apple will keep their current market. The M4 Pro MacBook Pro will continue to dominate the photography and video editing industry. Same in the front-end programming with JavaScript and backend with Java.
Apple will lose market share just by the existance of a new level of Windows battery runtime. When the economy tanks - which it looks like it will - Apple will get in big trouble as it could not afford to substantially lower the price of their Mac Mini or entry level MacBooks.
According to Statcounter, MacOS increased from roughly 15% of the USA market to 25% of the USA market. Impressive growth, but still not the majority. Edit: This is from June 2014 to May 2024 to give historical reference.
Can you please help me with some information I have bought an macbook air M2,but I have a samsung s22 ultra.Do you think those match with each other.Is it easy to share data from one another??
Of course Not. Base on the new functions of iOS18, I suggest you waiting for the new iPhone. In my opinion, lunar lake still have a great distance from Apple M4, or newer chips.
Non upgradable RAM doesn't mean Apple was right. It just means Apple paved the way for Intel to implement anti consumer ideas without backlash. Apple's strength is that it faces far less scrutiny over it's decisions so they can make design choices that only increase revenue and their customers will usually give them the benefit of the doubt. Once they normalize those choices companies like Intel and Samsung become free to follow and thus profit. This has been the cycle for a long time now and based on the opinions in the video it seems this cycle will continue on for the foreseeable future.
It is important to note that only the Snapdragon X Elite will be widely available for the important back-to-school season. Youths that are used to the battery life of iPad and mobile phones will gravitate towards Snapdragon X Elite devices.
Man I don’t think Intel’s approach for Lunar Lake especially when it comes to ram memory is a mere copy of Apple’s chip. This was the right call for laptop mean to be efficient, a new complete architecture (without hypertreading) and tile design that also helps a lot. This chip design took more than five years so it is more than just a copy
It is unfortunately a copy when the ran out of ideas and see themselves falling. The last call before abandoning the ship! They took too much time. First M1 should’ve sounded the alarm.
@@fmax3000 You guys are gullible. CPU design take 3-5 years, it's not a plug and play process with of the shelf components. It took Apple 10 years to develop the M1 and it took Intel 4 years to answer while AMD is still working on its own answer. Intel hasn't fall behind like RUclipsrs make you believe.
Let's remember - this is all about battery life, nothing else. If someone is like me, and works mostly with the laptop plugged into the electricity network (doing programming, gaming, watching videos, etc.), they shouldn't care much about the new chips, only about performance. My current Intel laptop with Nvidia GPU beats the crap (performance-wise) out of any current apple computer and the Snapdragon laptops are not going perform better. I guess about 95% of all computer users should really only care about the performance and not saving a few Watts-per-hour on their machines. If you really care about the environment, there are much better ways to do that than using inferior computers.
So what I've said a few years ago that when Apple "ditched" Intel for their own CPU, they gave Intel some technology in the process, tech that we saw in the M1. That big-little architecture Intel came out with is from Apple. And now, on-board memory, come on. It is obvious Apple "saved" Intel by giving them some tech.
So, here's what I want you to test, in regards to power consumption. I feel as though it is extremely hard to visualize computer power so, here’s a scenario. I live in a house, there was a storm and a tree branch fell and snapped just the powerline off of the pole to my house but, any Internet lines are still working. I have a Tesla, or a Ford lightning that can power my house and I still want a game after my power is back up until the power company comes out and fixes the line to the grid. how much can I do running on a fixed battery? Because I feel SA standard PC would wear the battery out a lot faster than an M-series Mac. especially considering the immense wattage, a gaming PC uses, versus a Windows laptop, versus a Mac laptop, with everything else being exactly the same because I also think that Power is using, and then it’s generating, would show a difference in your electric bill every month.
Snapdragon X Elite benchmarks look so damn better. I compared it with Ryzen 9 8850HS and it blew my mind entirely. No wonder why x86 seems to be declining in the future if not immediately.
2:29 NO, this is just a false assumption. This just means BOTH want to make max profit! btw: On mobile devices RAM isn´t upgradeable anyway since years, which is "ok", as those devices will be thrown away after some years anyway.
You forgot to tell the biggest downside of Apple chips. At the same price range all the competitors will be offering base models with 2x RAM and 2x-4x larger SSD. If you want to match these with an Apple laptop, you need to pay significantly more, since Apple prices their RAM and SSD upgrades very high. M3 Max 14” isn’t really a thick and heavy laptop either. Smallest and most power efficient high end laptop out there. But costs over 5000€ when equipped with 2TB SSD. It’s a awesome small laptop if price is not a consideration.
Amd & Intel conquered X86 APU ecosystem. Apple & Qualcomm conquered ARM APU ecosystem. Nvidia with their own ARM CPU design will entering ARM APU ecosystem next year. WIN - WIN SITUATION FOR CONSUMERS ❤❤❤
The small and lite window laptops are usually pretty cheap. Is Intel going to lose money on this new chip since it's probably more expensive than their legacy chips that don't have built in memory, ect?
The small and lite windows laptops are going to be Snapdragon. Once they have sold to the early adaptors... Guess why Microsoft didn't announce THE Surface where ARM would make the biggest difference - the Surface Go...
Yeah, I think there's going to be a huge disparity in marketing specifications. We have Qualcomm citing INT8, Intel & AMD combining multiple function blocks then comparing them to the competition's single function blocks. All very under-handed as CoreML allows multiple blocks to be used. The picture painted by GeekbenchML does not concur with any of these points BTW. You should start publishing these.
What wasn't covered are the beer tokens you'll need to exchange to equip a business, school, or family, with a stack of them, not forgetting the power bill, in a total cost of ownership, over 3+ years, calculation.
AMD, NVIDIA, Intel and Qualcomm will give Apple many ideas on what to improve for the next A/M chip release. Apple is ahead of the game as it pivoted four years ago and has now gotten its release cycle down to a year for the M chips. Apple will always be the first to release the next generation of TSMC chips. Apple is now in a position to integrate the best features of the other chip manufacturers into their SoC chip designs for the next release.
N3B a.k.a 3.5 nm is half backed techby, N3E is what is next (both Apple and Intel went with N3B just for the sake of being first, Apple being first with their M3, and ideally the M4 should have been M3).
Why are you comparing Apple Chips with Windows systems? After you choose either windows or apple, you can no longer choose the other chip. Choosing between Windows and Apple is the first decision any buyer makes.
The big loser with these announcements from Qualcomm, Intel, and AMD is Apple, which will no longer experience the advantages it has enjoyed in the last three years as the only option for high-performance, low-consumption PC chips on the market. The Snapdragon X has come to share space in this show, and who knows, it may even gain the leading role due to its adoption by several computer manufacturers. It virtually offers the same things offered in Macs with M-series chips, such as high performance, low power consumption, and emulation of the x86 instruction set (Rosetta 2 in the case of Apple and Prism in the case of Qualcomm). In addition to its adoption by several PC manufacturers, another advantage of Snapdragon X over Apple chips is that Windows is the platform for gaming par excellence, while the offering of games for Mac is quite limited.
I must have missed something, but I did not see any releases by Intel, AMD or Qualcomm that will match the M3 Max let alone the upcoming M4 Max & Ultra! These chips at Computex were ultrabook chips competing with the MB Air, not an M3 Max MB Pro 16, which beats out these easily.
@@ahaimes6320 I've seen a lot of these "but the Max" comments. While the M3 Max is impressive, the cheapest M3 Max MacBook is $3199. That's a very expensive laptop for a niche of a niche. Basically, only sensible for serious video editors who work on battery almost exclusively. All other professional use cases would be better served by some permutation of slightly longer processing times, plugging in more often, and dedicated graphics with broad industry support. Meanwhile, it's all irrelevant to average consumers. They just want inexpensive, reliable laptops with long battery runtimes and performance suitable for casual computing.
Saying Apple was right for not having user upgradable ram is a very poor take. Just because something is the standard doesn't make it right. If something goes wrong which can happen then you need a whole logic board as opposed to a ram stick. I don't mean this disrespectfully, but anyone with half a brain has known that integrated ram or soc's would be the standard, but us as consumers should not accept it. With that said, it is quite an exciting time to be a tech enthusiast with all the innovation.
There are some fundamental points you are missing. X64 is not inherently less efficient than arm, as there are trade offs each direction, and the architectures borrow from each other. Rather, there is a lot of backwards compatibility that x86/64 has to support, plus the differences on the operating systems running on the chips Also intel was/is servers first, consumers second, whereas Apple is consumers first. This results in some fundamental differences in how a particular chip is optimized. You would not sell a server chip with on die ram Once Intel was willing to make this shift, and windows can let go of baggage, then you will see efficiencies start to even out. Intel has some brilliant engineers, which Apple enticed to their side for awhile, and now Qualcomm is benefitting from the protégées coming out of intel and ibm. Look, intel managed 2.5x the npu power on lunar lake of apples m4. They added the various hardware encoders which propelled Apple silicon on so many you tube review sites, and reduced power consumption greatly, including next gen on die interconnect layer that they have been working on for over a descale. By the way, I really love the vision Apple has brought forth, and I really enjoy their products and ecosystem, although I use windows, Mac, and Ubuntu, depending on usage.
Snapdragon is amazing, but it's early and not yet widely adapted. Microsoft needs to go all in and use it in all products so OEMs start making software run natively on their snapdragon processor. Right now it's spotty at best.
THIS is the BEST 45W Fast Charger for only $18 (Anker) ➡ geni.us/TjjzD
THIS is the Laptop you should buy right now ($50 OFF) ➡ geni.us/qk6qhg
Apple feel snappy, no My MacBook Air M1 16 GB often feels SLOWER than my old Lenovo laptop !!! So what I hear you are saying is that AMD, Intel, Qualcomm and Apple will be very close at the end of this year, Intel and Apple are no-upgradeable, are AMD and Qualcomm upgradeable ? What will the price/performance be for the four ? I guess that Apple will be the most expensive !
The problem with Apple is that there are only a handful of apps & games that can work on the ARM platform when compared to AMD, Intel & Nvidia. And Apple's superiority over AI has narrowed, if not been surpassed by their competition in 2024 and going forward it's only going to get a lot harder for Apple to maintain a lead if any. However, we all benefit from this competition.
Intel bought SiFive ? …was that a mistake , I DNT think they have. They tried in 2021 i remember that
In late 2020 Apple ignited the competition with the M1, now almost 4 years later we have 4 great chips to choose from. This is why competition always helps the consumer.
agreed
get 'modern' options if you're still not ready to move over to ARM64
Apple is still kind of in a league of their own, let's be honest. The only one that can somewhat compete would be Elite X
The issue with apple is that its apple
@@amirtorhan2762 lol no, AMD already competes well with apple in terms of both battery life and performance. While not sacrificing upgreadeable RAM or SSDs. Wake up, Macs are no longer gaining market share like they used to.
@@FronosElectronics exactly. Limited options, boring designs and no upgradeable parts.
Intel and AMD will definitely have their share of the market. TSMC is at max capacity and investing in other semiconductor companies will be an absolute power move, I keep increasing my shares manageably. Different chips are good at different things and Nvidia has been very specialised, which leaves other aspects of Al open.
This is the type of in-depth detail on the semiconductor market that investors need, also the right moment to focus on the rewarding AI manifesto.
Im unconvinced about intel future. Do you really think nvidia can keep up this surge rate and stream for the better part of the decade?
Of course, i had bought NVDA shares at 300, 475 cheap before the split and with huge interest I keep adding, i’m currently doings the same for PLTR and AMD constructively. Best possible way to get ahead, is participating behind top experienced performers.
I agree, I'm compiling and picking stocks that l'd love to hold on to for a few years before retirement, do you think these stocks would do better over the years.
You are buying a company to own it and not a piece of paper, The market is a zero-sum game (2 sides), Know what you are buying not just out of trend interest.
I'm just happy there's options and competition all throughout the laptop space.
Lots of fine options but my hope is that Apple will finally offer 16G standard memory and make 32G reasonably priced.
Agreed, it's a much happier position than a decade ago when it was Intel galore 😕
Linux on M4 and on iPad would be killer as well as Mac
It seems Qualcomm will face steep battle as imperium strikes back :😅
What's ironic though is unlike Apple, Intel offers 16 gigs standard. Granted it only ends at 32 gigs this time.@@TechOverwrite
I like this manner of Vadim, not hyped, just natural
It's interesting that it took Intel 4 years to recognise their failures.
Not 4 years to recognize, 4 years to pivot. Especially when chips like this are generally developed for a decade, this was a last ditch rush job to catch up
It is institutional inertia, people running the BU don’t want to disrupt their existing book of business, until it is too late. That is how lot of great tech companies end up in history’s dust bin.
It takes 3-5 years to develop a CPU, so...
Apple spend 10 years for developing M1 .... do you know who are the top development leader at qualcomm x elite ? Some former top engineers, manager and so on from apple...
Intel will bankrupt in maybe 10 years, like kodak because they are unable to improve and innovate, they just follow with 4 years delays and try to copy the m1 chip ... and still with x86
They have realised their mistake many years ago, but they couldn't change everything overnight.
In my opinion it doesn’t matter who is making the best chip.The important thing is that there is competition and competition is in the end always good for the users
Keep in mind Intel isn’t abandoning upgradable memory. As of now, the integration is just for Lunar lake. Arrow lake (desktop) due later this year will utilize CAMM-2 memory. I believe the processor will still have integrated memory such as Lunar Lake as well… so the big difference is that it will not be “unified”
It is likely that all future ultrabook chips will have soldered RAM, and CAMM2 will be only for desktop, gaming, and pro machines due to the expense and the need for a separate board (space and weight). Also, the problem with CAMM2 is that if you buy a machine with 16GB and upgrade to 32GB, you will have to buy a new 32GB board. The 16GB board becomes worthless e-waste, and the 32GB board currently costs $279 (64GB DIMM costs $150).
95% of users will not care about it being modular, especially as it is currently of poor value. The main advantage is that we can get the fastest RAM in these performance machines rather than the current limitations--like no LPDDR5X.
@@ahaimes6320no the new memory. Modules are extremely. Thin. And made specifically for thin. Laptops handhelds etc desktops have plenty of space and don't need thin modules.
Apple could also allow for memory upgrades with these by having soldered unified and slots for camp which would actually be as fast as the sidered unified memory for most things
Instead of going to the SSD when. Memory runs out, it will dynamically use the much faster camm memory modules.
If apple does this it will be huge for consumers and professionals who needs change over time stop waste and help performance.
@@ahaimes6320 I've seen an article quoting one of the OEMs as stating that CAMM2 is actually cheaper to produce due to only one board and less materials being needed compared to the typical 2-4 ram sticks that have existed with DDR; if that's true the high prices right now are probably just because it's new and new always commands a higher price for first adopters. Prices should come down to affordable levels before too long just like DDR always has after a new variant.
A 16 GB board is hardly worthless either as it can be repurposed in another machine that doesn't require more. 16 GB system ram is still more than enough for most computers that are only used for basic office work and media consumption.
Hate to break it to you but the M4 is not going to get 972 in cinebench. the uplift in geekbench mainly came from the new support of matrix calculations on the CPU side. cinebench does nothing with that. It will be slightly better than m3 maybe 750
didn't m4 get miniscule upgrade since they got all massive gains and now it is mature platform that can only be refined?
some people here praise x64 as it's the only way forward, but engineers say that we're nowhere near limits of x86, from leaks, zen 7 is supposed to work on power efficiency over the whole voltage curve, so better power managment might be even possible on x86, i don't feel like x64 is the only way to make things power efficient, more like all x64 were designed with mobile in mind, so they are for lower power scenarios
Apple in R23 Cinebench 😂😂😂
@@1Grainer1x86 is essentially dead, Many new Intel and some new AMD chips have dropped support for “real-mode” 8/16-bit x86. It’s virtually all been x64 (more accurately x86-64, an AMD invention) for the last 20 years. And both Intel and AMD have labeled even 32-bit x86 instructions as deprecated or “not recommended” because it’s much more useful to optimize the x64 instructions, so they don’t focus on optimizing the performance of the 32-bit x86 instructions.
@@geoffstrickler so it is essentially dead since windows 7 or vista (dunno if vista had 64-bit version), so you're just proving that this video and comments are wrong
the only reason i know of that windows didn't fully commit to x64 is because most programs would have stopped working, so most probably all engineering stuff that never worked on apple
@@geoffstrickler 8bit and 16 bit processors are for low power embedded applications, recent macOS X versions are 64 bit - not even smartphones use them.
64 bit is used today because the bus and address widths can be wider meaning bigger data (loaded, stored or moved at a time) and more complex instruction can be executed in ~the same number of clock cycles (amount of time to execute an instruction), that's why we're using x86-64 (The x86 instruction set - list of all the most basic instruction a CPU can perform - has seen changes from AMD and Intel).
x86 also focused earlier on floating point computation which was what beat PowerPC and the reason it's not in your Mac today.
Intel and AMD have innovated chip design, bus, cooling, microarchitectures - that's how we've had faster chips.
x86 is dead...x86 is garbage...but many improvements in computing came (or because of competition) from x86 because it is just that good.
The lunar lake chip can go under 1 watt in idle and 2 watt decoding. It is way more efficient then the z1 extreme/7840u/8840u. At most that uses 4-5 watts of power, and 8-10 watts of total system power.
It is way better for laptops if you are concerned about battery life compared to amd.
Just saying the battery life drains too fast on a rog ally and legion go.
Lunar lake is not on the market yet. Neither AMD's next gen.
@@mmo0Jthere has already been a benchmark on RUclips that intel showed showing it goes under 1 watt on idle and 2-3 watts decoding video.
@@oo--7714 Wheres the under 1 watt video? I saw the 2-3 watt compared to 30 watt decoding of meteor lake, but not the 1 watt idle.
@@marcogoncalves1073 was in the pc world video. The intel people had 3dmark on but the benchmark wasn't running, it went from 3 watts down to 0.958 watts.
Apple defines smart phone, Apple defines tablet, and after 4 years of released of M1, Intel and Qualcomm finally confirm Apple also defines laptop cpu.
Apple defines consumer desktop CPU, the m1ultra was way more powerful than 12900K, m2ultra is slightly more powerful than 14900K.
Apple is the definition of CPU SPEC CPU® 2017
@@PKperformanceEU lol what are you on?
@@PKperformanceEUNo it doesn’t lol. Intel and AMD are still the go to when people build desktops😭
@@PKperformanceEU”Yeah bro for my cpu I’m going for the M2 Ultra” -said nobody ever
@@PKperformanceEU Yet, Windows Intel and AMD run the desktop market. ARM chips are best for laptops where battery life and portability are important. Not desktops within the USA. Add on gaming is drastically easier on X86 architecture. Same for CAD software used by engineers (Civil, Mechanical, and architects).
For laptops, ARM chips > X86 chips.
For desktops X86 > ARM chips.
Intel and AMD are still based on x86-64, so they have 1 less work to do compared to Qualcomm, and I will stick with x86-64 for now.
unfortunately windows is not innovating nor improving from here on
@@qwerty6789x you wrong bro there's new windows for arm
@@qwerty6789xW12 is on the works. x86 is still the best at full power and compatibility; for the rest ARM wins hands down…
@@ricarmig CoPilot is a travesty and I foresee MS in W12 not allowing you to disable it, given their history of monitoring. One reason why I only use my PC for gaming only, and my personal stuff on other OSes.
You're right. Learn from the time when Apple switch to arm. Those early adopters just pull their hairs out when things don't work (even with Rosetta. I expect it to be worse with Prism simply because microsoft has a track record of writing bad software compared to Apple).
Don't pay to be beta testers for the new chip. 😂 Wait for the entire industry to switch to Windows on Arm before you switch.
That tiny gain in terms of power efficiency (compared to Lunar Lake) is not worth pulling your hairs out, especially when you pay for it. You'll end up spending even more time and offsetting that power efficiency trying to make unsupported things work. (from experience).
Having said that, we still need guinea pigs to be beta testers to make things work. I just don't want to be that guinea pig 🙂
Awesome roundup!😊
Hi Alex !
The Lunar Lake chip looks crazy efficient. Looking forward to see what sort of systems it ships in.
Intel has been promising a lot and disappointing even more. I wouldn't expect anything from them anymore. The future is clear, and it's called ARM.
Ur newbie , dont compare arm to x86 , x86 is superior @@CesarPeron
It sounds like Lunar lake might fulfill my dream of playing esports games with an igpu and getting consistently decent frames on low settings. Id love to have the battery efficiency and portability of a thin and light and still be able to comfortably play games
You will need frame interpolation (AI), which reduces the quality. But, yes, if you are ok with artificial frame rates then the iGPU in these new ultrabook chips may suffice for casual gaming.
Esport games usually run already very well on the steam deck so this shouldnt be totally new?
Please stop saying tops… there are like 3 people here that actually care about the AI performance.
And this is coming from a dev.
Yeah, that's true. Most of us don't give a rat's ass about TOPS.
True, AI is the biggest marketing con for a while and OEMs are desperate to rebrand to squeeze money from customers.
If you’re a dev you should definitely be integrating ai into your workflow
@@christianr.5868 and how to do that?
@@thecreativepeasant8781 ask ai to write code that you would usually have to search GitHub for, then correct and repurpose it for what you need. Speeds up your workflow
Everything blew up all of a sudden.
The "Best" is too much for me. "Good enough" is better. You can keep the best. I don't need it or want it. And I'm a retired programmer.
But if Intel is using the N3B node; doesn't that mean they will have to redesign all chips in future like Apple did? In order to use N3E?
They will have their own 18A process.
@OlegZhuravel whats the advantage of that?
@@visitante-pc5zc Smaller node size compared to N3E, GAA transistors and powervia
@visitante-pc5zc theoretically a lot of things, but we need to wait and see if it will turn out well.For sure the tech at the base of the node should be very advanced, and the first among every other foundry with back side power and GAA transistors. This two things together should brings big power efficiency gains compared to normal FinFet transistors, the ones used till today.
They arent using N3E. They are using tsmc to bide time for their own foundry. Dont forget that intel is planning to compete directly with TSMC to make nvidia, apple, qualcomm and amd chips, their goal is to manufacture chips for everyone. That will be their main competitor, not nvidia, amd, qualcomm or apple.
Hey Vadim, Intel also redid their new iGPU too! It's supposed to help much better in rendering, better Ray tracing support, and now it'll properly support XeSS.
Might be old-school, but I’d like to see where the desktop chips go
Nothing old school about raw performance
The vast vast majority use their computers for basic stuff like surfing, RUclipsrs, emails, and simple word processing and spreadsheets. These people could care less about AI and cutting edge processors. Battery life is however very important in the laptop arena.
I love how Intel is sticking with x86 and trying to bring the best efficiency competing with arm.
why do you love that
@@oldgrub More competition means better products
@@oldgrub x86 has the best compatibility. Besides better performance, there's nothing going on at ARM. You want to emulate everything? Best-case scenario would be x86 becoming just as efficient, not Intel also making ARM chips.
Will be interesting to see how snapdragon will be supported in Linux, I would really like a good Synology NAS with such a cpu and maybe clients as well, like a raspberry sized board.
elite x as a stand alone chip might be supported, but i'm afraid that other components of a laptop won't work on linux. as always.
Locked bootloader dude
I would stick with changeable ECC RAM for any kind of server workload...
Should I wait to buy a new laptop with one of these new intel lunar lake chips or AMD 9000 as I need one by August or should I just buy now? With everything going on I am just very confused?
Wait for Lunar lake it will be worth it :)
@@Magnus0891I think you mean wait for AMD Ryzen 9 ai hx 300.
Ryzen AI 300 will be available early August.
Intel's battlemage is set to match AMD's 890m at a lower power draw
Source: videocardz.com/newz/intel-core-ultra-200v-lunar-lake-reportedly-hits-4100-points-in-timespy-at-30w
@@NothingNewNerd would u say that both will deliver roughly same experience? But I mean I seen the first review for new amd chip and they all point out that the computer get very hot
Looks to me like it is Apple, Qualcomm and Intel efficiency chips for laptops and 2-in-1s and the x86 chip still for power desktops. For me if I upgrade this fall it will be Windows (Snapdragon or Intel).
Thanks Qualcomm for bringing up this competition. Is Snapdragon X Elite the only one that integrates with the 5G modem?
The X Elite still lacks a 5G option, and we don't know if that will be available until Q4.
I wonder if the intel chips will run x86 apps faster than the others because they use emulation?
I think you already know the answer to that.
Do people really use this amount of raw processing power of chips in daily usage?
Yes and no. Think about it like this, the power users buy M4 sell m3, the M1 users buy me sell M1.
Also depends if it's ultra pro or base model when consumers upgrade.
When m1 was new the power was too much, now it's normal.
So If I want to buy M1 or X plus I'd be hoping the new stuff sells to the power users so the M1 becomes more affordable
gamers and content creators
The programs will use that stuff, windows and other programs will be adapted
I am still using the iPad air 2 from 2014.
I highly doubt that a Samsung Tablet would be this useable after 10 years.
Galaxy Tab s7, s8 and s9 will last.
Lol my S8 is lagging now 😂🤮 @@WarAlex16
@@Akkk117 mine is perfectly fine. You must have damaged it or you have a FE version. One Ui is a great optimized OS for S8 and S9.
@@WarAlex16 The Galaxy Tab s7 came out in August 2020 - OS updates until 2023 and security updates until 2024 - In regards of duration not even close to the ipad air 2.
What do you mean by useable ? Are you talking about Android developed by Google or the hardware developed by Samsung ? Android is at the whim of Google so Samsung has no say in that if Google pulls support for an OS . An iPad air 2 is hardly usable. What do you actually do on the tablet?
We keep seeing various benchmarks where Intel is faster than Apple, or Apple is faster than AMD, etc, etc, but the real world results show how much Apple has worked on integration and system-wide efficiencies. I hope MS take note of this part and don't just rely on brute force of the fastest SoC to get results. This is an opportunity for MS to get rid of some of the baggage associated with legacy Windows and to work with vendors such as Adobe to fine tune their apps to the new SoC's.
Sadly software has never been Microsoft's strong point (which is kinda' weird if you think about it).
I like Windows 11 more than 10, but it's still quite inconsistent at times.
If they got rid of the baggage why would one any use windows ? People use Windows because they know that there software will work on it. if they did what Apple did with their silicon or got rid of backwards support it would kill the Windows Market.
If anything they should have continued the development of Singularity and Midori and instead just launched two OS's one that is the advancement of current Windows and the second being a new OS. This way they could have developed the new OS without the baggage while still having Windows but with limited new features and then eventually kill it when the new OS is up to par.
I'm genuinely looking forward to Lunar Lake laptops.
I really hope Snapdragon X takes off this time so it makes it worthwhile for developers. For the small chance that then Apple would bring back bootcamp for a native Windows experience on Mac again.
Unless Mac sales crash it will never happen.
@@Vashei it doesnt take a crash. Its a feature that helps to steal hardware sales from Windows. MacOS still only has a 14% market share while Windows is at 70% +. The same applies today as it did in 2007. And since laptop and desktop sales have been stagnant, its the perfect time to do so if Arm based Windows takes off. That being said, if Snapdragon ends up a flop, no point in doing so.
@@bort7258 you can not compare the situations. Besides battery runtime there was NO advantage for Apple. And with Snapdragon there IS NO advantage for Apple. If you pay the same money as for Apple hardware - e.g. Microsoft Surface - you a much better experience on the Micosoft side. Sad bad true. Would have liked an Apple Surface Studio... but there was never one. So I switched to Windows after 25 years using only Macs ... (beside Server stuff)
@@kaptnwelpe5322 For me and many of my peers and co-workers, running Windows natively via bootcamp on our Intel MacBook Pro's was a much better experience vs using Windows on our ThinkPads. In fact, I knew several that got MacBooks just to run Windows all the time via BC and never boot into MacOS. I fail to see why this can't be the same today if ARM takes off this time and Win developers support the platform. Touch interface on a laptop is not crucial to a great Windows experience as there are still a ton of Windows laptops that ship today without touch enabled screens. I currently use Windows 9 to 5 / mon to Friday, have a touch enabled HP work laptop and can't remember the last time my hands used the screen to navigate. For a niche group of people that require input via touch, perhaps so, but that would not be the MacOS target market, rather iPadOS. The Surface Studio is quite the niche product. Pretty fantastic for certain use cases and very well designed and engineered, but still niche and doesn't portray a typical Windows user. Even MS knows this and is why they are super slow to update them with the current one shipping with mostly 2 to 3 generation old specs at astonomical pricing.
@@VasheiI wouldn’t be so sure. Craig Federighi said a couple years ago that the ball was in Microsoft’s court. At the time, Microsoft had an exclusive contract with Qualcomm, which has since expired. Craig F basically said that if Microsoft was willing to allow Windows to run on a Mac, they would be all for it. Remember that Apple is a hardware company while Microsoft is a software company. If Apple can entice people to buy a MacBook by using Windows as a marketing hook, that becomes a win-win for both Apple and Microsoft.
x86 emulation on Windows for ARM still sucks, I have a bunch of applications which won’t run.
What's your laptop? The new MS surface laptop with X Elite?
You just are a very small demographic….everything I run on x elite has been flawless.
The Xe2 in the intel is insanely powerful. Raytracing that's better than Nvidia is a thin and light is mind boggling.
Seems like we finally have a true chip CONVERGENCE.
X-86 finally doesn't suck anymore thanks to arm.
Fun fact.. the X Elite has actually been delayed for two years so it uses the "old" Oryon cores.
Next generation will provide huge improvements, with Qualcomm solving the teething problems and gaining experience.
Yeah I'm interested in seeing if we are going to get the next generation of Snapdragon X-Elite being announced later this year alongside Snapdragon 8 Gen 4... cuz this one was delayed for 2 years.
Well when they have the new cores ready, then we will see. There's a reason why the used those old cores. And the industry won't stand static waiting for them to release them.
So the odds are uncertain.
This is the most exciting time to consider upgrading my PC I have experienced in years. The Qualcomm Windows announcement is in a word, fantastic! But then I watched the Intel Lunar Lake SOC announcement and Wow. I did not expect this level of advancement from Intel this year. I was thinking of upgrading in a couple of years, but with Intel Lunar Lake and Microsoft Windows 11 Copilot+ PC announcements, I am looking for an upgrade to my Windows 11 laptop later this year. For me things can only get better from here. I just hope the promise of AI on Windows 11 PCs shows real progress.
To my way of thinking software developers will have a priority to develop AI apps for X86 with an eye on ARM Windows development. But I would think most will let emulation be the bridge to ARM for the time being. In my case the Microsoft Windows 11 Copilot+ PC and Intel Lunar Lake SOC platform are the best of the best for thin and light laptops. However, I will have to wait to see how the Windows 11 PC makers design for the new world of Microsoft Windows 11 Copilot+ PCs.
As we learned that Apple with M3 uses the N3B node temporarily till M4 with N3E node is fully replacing M2 (PRO, MAX & ULTRA). It looks like that TSMC will not stop producing N3B as suggested. That production stops for Apple is likely now Intel uses the N3B node.
It's good news. Less power consumption is always good. I would never buy an Apple product because I don't like their OS and especially loathe the company and its policies. That being said, they make good products
A point that should be made when discussing the ability of the various CPUs/SOCs to run AI is the implementation within the computers which incorporate them.
None of the Windows based laptops announced to date have enough onboard resources to run very large LLMs. It doesn't matter how fast the NPU is when the computer doesn't have enough RAM to load and process the LLM.
If you want a laptop that can run the LLMs having higher accuracy rates, then your best option is the MacBook Pro M3 MAX (or the M4 MAX version when it ships) with 16-core CPU, 40-core GPU, and 128GB RAM (or better, depending on options available with the M4).
Microsoft demanded NPUs on all windows machines because they want to run them full time spying (uh “learning”) on all users. If this spying can’t be turned off I will never run a windows computer again.
For all of us with IP to protect, we will run quantized models not much bigger than they need to be to understand our voices and the apps we run.
You mean for right now. Later, it all changes.
Thats BS. WHO on earth uses a Mac for AI seriously with NO Cuda-support at all?? And: for interference its all about the fine-tuning of the model. Excaclty what Microsoft delivers. Even more so with their ground breaking Copilot... Apple is loosing big time. But they don't care as they gave up the Mac long time ago...
No one is running such large models without Cuda support on MacBooks.
@@ibraheem1224 Yes, they are. These models actually run very well on the more powerful Macs.
So which new pc is best? i was looking into i9 intel gen 14th and now this...
If you specifically speak about PC, then at this point better wait for Arrow Lake.
IF I could have good performance DaVinci resolve in a low weight low power windows environment Ill take it.
Yes ✋
Good thing then that DaVinci Resolve is releasing their native Arm app for Windows by the end of this month.
What's up with you tech guy using 4k 24 hrz ?
Unless we use catodic projection, it will render laggy af in any screen.
Qualcomm's 1st generation Oryon cores in the the Snapdragon X Elite beats Apple's 1st - 3rd generation M series chips and is only beaten by 4th generation M4 chips. Does that mean... by 2nd generation..... Qualcomm's Snapdragon Dragon X Elite chips will equal Apple's M6 chip? 😅
Qualcomm is only beating Apple when energy consumption is not an issue.
@@dsblue1977 but apple has 4 generations of power efficiency refinements. compare 1st Gen M1 with 1st Gen snapdragon x elite. all I'm saying is... their first go at it.... 1st Gen already matches apples 3rd Gen 😜
What one run a cold thermal on the gpu with better performance the snap drargon or the lunar lake I need to know
One of the best responsive pc’s i had were dual processors. I had a dual processor pentium 2 that was awesome and super snappy. I also had a dual p3 1000mhz that was super responsive. Todays are fast because of ssd but dual processor pcs could be awesome. And shut off one full processor on battery life.
The 780m already gets around 36 fps in 3dmark Wild Life Extreme, the 890m is claimed to be as much as 36% faster per an OEM with its 25% increase in core count and the chips support for much faster ram. That would put it right at around 50 fps. Here's the thing though, it's been proven time and time again that performance in synthetic benchmarks often don't translate to real world performance in games. Intel's top end xe-lpg in Meteor Lake scores similar to or better than the 780m in 3dmark but gets beaten soundly in nearly every game. This is why I believe that the HX 370's 890m will be the fastest of the lot. AMD has lots of experience and optimization in the gaming segment compared to Intel, Apple and Qualcomm.
Fascinating this competition exists when most people just write messages to each other.
So, does Qualcomm expect people to pre-order based on their marketing materials and pseudo-ads on tech RUclips channels? Why haven't we seen benchmarks conducted by actual users instead of the performance numbers that Qualcomm wants to believe their processors can achieve?
Thanks for the video. It's nice to have both operating system and CPU options. Windows runs the office and gaming, MacOS runs the studio and Linux runs the Internet.
Macos doesn't run the studios. Music studios? Maybe. Other than that, windows is dominant everywhere.
@@prianshubhatia2759Creative studios, yes, and more, and it is cannibalising windows. Gaming is also increasing whether you liked it or not. The difference is in the strategy, one is lost and the other is focused.
@@prianshubhatia2759 Windows is not dominant everywhere. There are some industries in which Apple is the standard. Not everyone buys computers for gaming.
@@dsblue1977 ik, but even in professional space, windows is dominant.
There are no m4 laptops.
Thats like comparing the m4 to competitors uocoming chio thats not out yet.
We have m1 m2 and m3 airs out now lets compare rhem.
It is soon possible that Nvidia's ARM chips will also be on the market, competing with Qualcomm's Snapdragon X and Apple's M-series.
Nvidia hasn't announced any Arm chips. Maybe seeing all the hype, Nvidia will also jump on the bandwagon next year.
Doubt it - simple, why enter a cut-throat, low margin market? It isn't much interested in low-mid GPU market.
@@SeanLi-i7n This outdated argument continues to be repeated uncritically by digital influencers within the Apple bubble. This concept is at least four years behind and demonstrates these influencers' disconnect from the current market reality. They are unaware of the high-end and gaming laptop lines, which are far more advanced than any MacBook and sold with good profit margins. Of course, these brands do not operate with exorbitant margins, as their audience is not the typical conformist Apple customer.
They will, nvidia and mediatek collab for making nvidia chip with rtx igpu
Features aside, pricing? That's a big part of the equation.
Which of these chips should I buy if I want new computer before the end of the year?
Check the software you need for the tasks. If it doesn't run on ARM architecture, then Intel or AMD laptops. If it runs on ARM architecture, then Apple or Snapdragon X Elite laptops.
@@akin242002 The software I run will run on Arm or it will very soon run on Arm. Which of these is the best platform if that is the case?
Also, laptop or desktop? Lunar lake is for laptops, Arrow lake processors are due later this year for desktops, they should have upgradable RAM with CAMM-2 memory
Wait and see the comments after they are all out and used for a month.
Crazy how the M series of chips are so powerfull they for real shut down the competition
What do you mean shut down 😂
Are they powerful enough to correct your grammar?
Not really, but they will keep their market share. Especially in photography where CPU matters more than the GPU.
Nope. Apple has a contract with TSMC giving them like "first access" rights to the newest chip technologies. So Apple has an advantage worth about 1 year. It must be scary for Apple to think about how fast the Snapdragons will be in their second generation...
Yes shutted down, the ever growing android market share at 71,5%, or the 90% windows market share is dominated by the chip?
the fact apple was able to get their own cpu's under control and will never again depend on others (motorola, ibm, intel) is all the win apple needed. They don't need to 'be the fastest' they just need to be fast enough for their products and it seems they will be doing fine for the next few years. The fact other chip makers and youtubers compare to their chips is a bonus for PR.
Apple stuff is overpriced. Enough is not enough
@@visitante-pc5zc something is only overpriced if people who buy it don't see the value. clearly apple has found a market segment in all areas they play in that disagree with you.
@kleanthisgroutides7100 true or not (won't reply to that) how is apple making their own chips instead of Motorola, IBM or intel have any relation to this, do you feel they would be in a better position if they stayed with any of their old chip providers instead of doing it them selves?
@kleanthisgroutides7100but again would them sticking to their last supplier in this case intel have solved that ?
@kleanthisgroutides7100 how exactly have they killed photo and video??
Best would be a x elite combined with nvidia gpu, is that even possible?
Upgradable memory and repairable components has been forgotten for long time
Snapdragon X Elite ❤❤❤
Bro actually X Elite have 3100+ points in single core
Where? I am checking on Geekbench 6 and it shows the Snapdragon X Elite - X1E84100 chip (most powerful version) at 2800 to 2900 single core.
Nahh. Most leaked benchmarks are in the range 2700-2900.
That's still quite good for a first gen product running via emulation!
@@ibraheem1224 Even 2200 single core is still excellent for daily office work. 2400+ is overkill unless you are an engineer or EA shooter game competitor.
@@akin242002 Bruhh phone chips are getting 2200 single-score nowadays. Laptop chips obviously need to be better.
For reference, Snapdragon 8 Gen 3 gets around 2300 in single-core.
@@ibraheem1224 They are getting better at an amazing rate, but it's like saying you need rocket fuel to ride a bike in a park. The improved CPU single core performances are great for large CPU tasks, but not a big deal for smaller tasks unless you are in a rush.
People don’t just buy laptops based on raw performance, I think OS is far more a deciding factor. I have been a long time Pc user, switch to MAC lately, got to say, MAC OS is just better in so many way.
My windows laptop just failed yesterday. Seeing Microsofts plan for always-on AI has put my transition to Mac into emergency mode. If Apple announces an M4 Mac mini and a reasonable privacy based AI development plan at WWDC it will be the end of windows for me.
Which way exactly is Mac OS better?
i mean you can install any OS
linux is an option
@@niveZz-people who buy macs specifically want macOS, not Linux, because of the ecosystem or exclusive apps.
Could you elaborate on why it is so much better? I have been a Windows user (mostly Windows Pro) my entire life and my wife has an M3 Air. I don't see the advantages beyond less software updates and better video editing quality.
What about the X Elite Pro, they
have 2 chips!
X Plus (multiple skus) and X Elite (multiple skus)
Total TOPS is a meaningless stat since no models use more than one device at a time...
Your estimated chart showed the Intel iGPU being faster than AMD... 😂
You guys are forgetting that ARM provides the technology to make all of this happen. Without it none of this is possible
One of the most important feature for AMD based Laptops is emulation.
It doesn't matter how good the processing power is if you cant use it.
In that area apple will continue to be the obvious choice for ARM based laptops.
These are definitively exciting times!
What’s the name of the background music?
How is that Apple was right? Not at all, now both of them are wrong!
Glad to see all the CPU manufacturers are getting better. The real winner is the customer.
- For business laptops, the X1 Carbon G13 will be amazing with Intel Lunar Lake. For tasks the ARM architecture is capable of doing, Snapdragon X Elite will take that market share.
- For gaming desktops and mini-PCs, AMD will dominate from a CPU standpoint. Still X86, so gamers will not lose their games. Also, "Recall" is not included.
- Apple will keep their current market. The M4 Pro MacBook Pro will continue to dominate the photography and video editing industry. Same in the front-end programming with JavaScript and backend with Java.
Apple will lose market share just by the existance of a new level of Windows battery runtime. When the economy tanks - which it looks like it will - Apple will get in big trouble as it could not afford to substantially lower the price of their Mac Mini or entry level MacBooks.
@@kaptnwelpe5322Bruhh Apple captured a wide market of Windows with their M-series chip laptops... No wonder Microsoft is desperate to get it back.
According to Statcounter, MacOS increased from roughly 15% of the USA market to 25% of the USA market. Impressive growth, but still not the majority.
Edit: This is from June 2014 to May 2024 to give historical reference.
Can you please help me with some information
I have bought an macbook air M2,but I have a samsung s22 ultra.Do you think those match with each other.Is it easy to share data from one another??
Of course Not. Base on the new functions of iOS18, I suggest you waiting for the new iPhone. In my opinion, lunar lake still have a great distance from Apple M4, or newer chips.
@@ZhuozhenYou well,I have a samsung now and macbook voming next week.
Non upgradable RAM doesn't mean Apple was right. It just means Apple paved the way for Intel to implement anti consumer ideas without backlash. Apple's strength is that it faces far less scrutiny over it's decisions so they can make design choices that only increase revenue and their customers will usually give them the benefit of the doubt. Once they normalize those choices companies like Intel and Samsung become free to follow and thus profit.
This has been the cycle for a long time now and based on the opinions in the video it seems this cycle will continue on for the foreseeable future.
Intel following Apple is not a proof that Apple was right, most people would prefer upgradable RAM over small performance gains.
It is important to note that only the Snapdragon X Elite will be widely available for the important back-to-school season. Youths that are used to the battery life of iPad and mobile phones will gravitate towards Snapdragon X Elite devices.
Surface Go ... if the price is right and Microsoft "gets it" - upgradeable SSD for the non-school customer s - killer...
Man I don’t think Intel’s approach for Lunar Lake especially when it comes to ram memory is a mere copy of Apple’s chip. This was the right call for laptop mean to be efficient, a new complete architecture (without hypertreading) and tile design that also helps a lot. This chip design took more than five years so it is more than just a copy
It is unfortunately a copy when the ran out of ideas and see themselves falling. The last call before abandoning the ship!
They took too much time. First M1 should’ve sounded the alarm.
@@fmax3000 You guys are gullible. CPU design take 3-5 years, it's not a plug and play process with of the shelf components. It took Apple 10 years to develop the M1 and it took Intel 4 years to answer while AMD is still working on its own answer.
Intel hasn't fall behind like RUclipsrs make you believe.
Let's remember - this is all about battery life, nothing else.
If someone is like me, and works mostly with the laptop plugged into the electricity network (doing programming, gaming, watching videos, etc.), they shouldn't care much about the new chips, only about performance. My current Intel laptop with Nvidia GPU beats the crap (performance-wise) out of any current apple computer and the Snapdragon laptops are not going perform better. I guess about 95% of all computer users should really only care about the performance and not saving a few Watts-per-hour on their machines. If you really care about the environment, there are much better ways to do that than using inferior computers.
So what I've said a few years ago that when Apple "ditched" Intel for their own CPU, they gave Intel some technology in the process, tech that we saw in the M1. That big-little architecture Intel came out with is from Apple. And now, on-board memory, come on. It is obvious Apple "saved" Intel by giving them some tech.
So, here's what I want you to test, in regards to power consumption. I feel as though it is extremely hard to visualize computer power so, here’s a scenario. I live in a house, there was a storm and a tree branch fell and snapped just the powerline off of the pole to my house but, any Internet lines are still working. I have a Tesla, or a Ford lightning that can power my house and I still want a game after my power is back up until the power company comes out and fixes the line to the grid. how much can I do running on a fixed battery? Because I feel SA standard PC would wear the battery out a lot faster than an M-series Mac. especially considering the immense wattage, a gaming PC uses, versus a Windows laptop, versus a Mac laptop, with everything else being exactly the same because I also think that Power is using, and then it’s generating, would show a difference in your electric bill every month.
Snapdragon X Elite benchmarks look so damn better.
I compared it with Ryzen 9 8850HS and it blew my mind entirely.
No wonder why x86 seems to be declining in the future if not immediately.
The only con with Qualcomm is the adreno gpu. If they make a highigher end gpu to compete with nvidia and amd gpus this will be a big deal.
2:29
NO, this is just a false assumption. This just means BOTH want to make max profit!
btw: On mobile devices RAM isn´t upgradeable anyway since years, which is "ok", as those devices will be thrown away after some years anyway.
You forgot to tell the biggest downside of Apple chips. At the same price range all the competitors will be offering base models with 2x RAM and 2x-4x larger SSD. If you want to match these with an Apple laptop, you need to pay significantly more, since Apple prices their RAM and SSD upgrades very high.
M3 Max 14” isn’t really a thick and heavy laptop either. Smallest and most power efficient high end laptop out there. But costs over 5000€ when equipped with 2TB SSD. It’s a awesome small laptop if price is not a consideration.
Amd & Intel conquered X86 APU ecosystem. Apple & Qualcomm conquered ARM APU ecosystem. Nvidia with their own ARM CPU design will entering ARM APU ecosystem next year. WIN - WIN SITUATION FOR CONSUMERS ❤❤❤
I wonder what happens when these chips reaches the desktop pc market.
The small and lite window laptops are usually pretty cheap. Is Intel going to lose money on this new chip since it's probably more expensive than their legacy chips that don't have built in memory, ect?
The small and lite windows laptops are going to be Snapdragon. Once they have sold to the early adaptors... Guess why Microsoft didn't announce THE Surface where ARM would make the biggest difference - the Surface Go...
Yeah, I think there's going to be a huge disparity in marketing specifications. We have Qualcomm citing INT8, Intel & AMD combining multiple function blocks then comparing them to the competition's single function blocks. All very under-handed as CoreML allows multiple blocks to be used.
The picture painted by GeekbenchML does not concur with any of these points BTW. You should start publishing these.
What wasn't covered are the beer tokens you'll need to exchange to equip a business, school, or family, with a stack of them, not forgetting the power bill, in a total cost of ownership, over 3+ years, calculation.
AMD, NVIDIA, Intel and Qualcomm will give Apple many ideas on what to improve for the next A/M chip release. Apple is ahead of the game as it pivoted four years ago and has now gotten its release cycle down to a year for the M chips. Apple will always be the first to release the next generation of TSMC chips. Apple is now in a position to integrate the best features of the other chip manufacturers into their SoC chip designs for the next release.
N3B a.k.a 3.5 nm is half backed techby, N3E is what is next (both Apple and Intel went with N3B just for the sake of being first, Apple being first with their M3, and ideally the M4 should have been M3).
Since when M4 is a laptop chip?
Not yet. But it will be later this year.
Why are you comparing Apple Chips with Windows systems? After you choose either windows or apple, you can no longer choose the other chip. Choosing between Windows and Apple is the first decision any buyer makes.
The big loser with these announcements from Qualcomm, Intel, and AMD is Apple, which will no longer experience the advantages it has enjoyed in the last three years as the only option for high-performance, low-consumption PC chips on the market. The Snapdragon X has come to share space in this show, and who knows, it may even gain the leading role due to its adoption by several computer manufacturers. It virtually offers the same things offered in Macs with M-series chips, such as high performance, low power consumption, and emulation of the x86 instruction set (Rosetta 2 in the case of Apple and Prism in the case of Qualcomm). In addition to its adoption by several PC manufacturers, another advantage of Snapdragon X over Apple chips is that Windows is the platform for gaming par excellence, while the offering of games for Mac is quite limited.
You forgot to mention that Apples M4 is on a newer node and the Snapdragon comes close.... Imagine TSMC offers enough production capacity for all...
I must have missed something, but I did not see any releases by Intel, AMD or Qualcomm that will match the M3 Max let alone the upcoming M4 Max & Ultra! These chips at Computex were ultrabook chips competing with the MB Air, not an M3 Max MB Pro 16, which beats out these easily.
@@ahaimes6320 I've seen a lot of these "but the Max" comments. While the M3 Max is impressive, the cheapest M3 Max MacBook is $3199. That's a very expensive laptop for a niche of a niche. Basically, only sensible for serious video editors who work on battery almost exclusively. All other professional use cases would be better served by some permutation of slightly longer processing times, plugging in more often, and dedicated graphics with broad industry support. Meanwhile, it's all irrelevant to average consumers. They just want inexpensive, reliable laptops with long battery runtimes and performance suitable for casual computing.
Intel is smart, they will probably still offer Intel Opane ram memory on board as expanded ram for large tasks
Apple took the lead because they ditched 32 bit apps in favor of optimal design for new software. It was bold, but it definitely payed off.
Intel didnt copy Apple's strategy, they are copying AMD's strategy of using TSMC, but they'll use their own foundry for panther lake
Saying Apple was right for not having user upgradable ram is a very poor take. Just because something is the standard doesn't make it right. If something goes wrong which can happen then you need a whole logic board as opposed to a ram stick.
I don't mean this disrespectfully, but anyone with half a brain has known that integrated ram or soc's would be the standard, but us as consumers should not accept it.
With that said, it is quite an exciting time to be a tech enthusiast with all the innovation.
There are some fundamental points you are missing. X64 is not inherently less efficient than arm, as there are trade offs each direction, and the architectures borrow from each other.
Rather, there is a lot of backwards compatibility that x86/64 has to support, plus the differences on the operating systems running on the chips
Also intel was/is servers first, consumers second, whereas Apple is consumers first. This results in some fundamental differences in how a particular chip is optimized. You would not sell a server chip with on die ram
Once Intel was willing to make this shift, and windows can let go of baggage, then you will see efficiencies start to even out. Intel has some brilliant engineers, which Apple enticed to their side for awhile, and now Qualcomm is benefitting from the protégées coming out of intel and ibm.
Look, intel managed 2.5x the npu power on lunar lake of apples m4. They added the various hardware encoders which propelled Apple silicon on so many you tube review sites, and reduced power consumption greatly, including next gen on die interconnect layer that they have been working on for over a descale.
By the way, I really love the vision Apple has brought forth, and I really enjoy their products and ecosystem, although I use windows, Mac, and Ubuntu, depending on usage.
Snapdragon is amazing, but it's early and not yet widely adapted. Microsoft needs to go all in and use it in all products so OEMs start making software run natively on their snapdragon processor. Right now it's spotty at best.
Great analysis. Thank you!
Awesome & Excited :) Nice video & Thanks :)