Not true. They claimed a 40% performance uplift over the M1; the 60% figure is just for the GPU, where Qualcomm is king. Currently the Qualcomm Snapdragon 8 Gen 2 GPU is more than twice as fast as the Apple A17 Pro's.
But keep in mind the base M2 and M3 have only 4 performance cores and 4 efficiency cores, and you're comparing those to 12 performance cores, and the result is not 3x better. If you compare it with the M3 Max, which also has 12 performance cores, their results are nowhere close (their GPU also just barely surpasses the base M2 level; even the M2 Pro gives way better fps in the tested games, not to mention the M2 or M3 Max).
Very, very happy to hear G. Williams on your channel! Great job and a nice interview. I don't really understand how they can use the same 12-core CPU to address both mass-market fanless laptops and pro-market laptops. OK, you can use binning and lower frequencies, but a single CPU design that covers every need... ARM, Qualcomm, Apple, and soon Intel are not stupid: if they implement big and little cores, there are many good reasons. Williams talks about SKUs, OK, but that is not a real answer. He also doesn't talk about their possible hardware attention to Windows optimizations.
The downside is that Qualcomm management and culture might squash the progress. If they leave the Nuvia team alone to do what they're good at, they'll give Apple solid competition. Congrats to the Nuvia team.
SemiAccurate revealed a backstory on the OEM/component side that makes me wonder if Qualcomm really wants to succeed with this design, e.g. forcing OEMs to use their PMICs, which are not optimized for laptop workloads, leading to cost and power inefficiencies. It would be great if Qualcomm got everything right this time. Can you confirm that story, Ian?
Using mobile PMICs and board layouts won't make it inefficient, but it will make it very expensive: phones have much smaller material requirements and PCB sizes compared to laptops.
@@aravindpallippara1577 Mobile PMICs are optimized for mobile current flows, not for 80W current flows. Just read the SemiAccurate article, it's free. By the way, they'd need more PMICs and an additional four PCB layers to make it work. That makes it needlessly expensive for the OEMs, since they could design a cheaper PCB with standard PMICs that would deliver an even more efficient solution in the end. But SemiAccurate claims that Qualcomm executives tried to force the OEMs into a bundle deal with their mobile PMICs, and as OEMs threatened to withdraw, instead of doing the sane thing and giving in to the OEMs' demands, Qualcomm just hushed it over with money, which undercuts the whole bundle deal in the first place and might even hurt their bottom line more than giving in would have.
I honestly don't think they can realistically do it unless they have some sort of hardware translation layer built into the silicon, like Apple does for backwards compatibility.
Have there been any statements on how open/documented the platform will be and whether it requires proprietary drivers? One of the biggest strengths of the PC platform is that it's (mostly) open enough to be able to have hardware run the latest software and drivers for a very long time (and open source operating systems beyond that time). Whereas I thought one of the reasons for Google building their own SoC was the lack of Qualcomm long-term platform support and proprietary drivers (I haven't verified this though). If these will be able to run a Linux distribution without binary drivers, then I'd definitely be interested. If not, then potentially small efficiency advantages are simply not worth the migration effort.
I really wonder if we will see Qualcomm take on the top-tier flagship SoCs from Apple now. Keep in mind that Apple just launched the M3 series, which this Qualcomm SoC will face.
They just compared their chip with the 8- or 10-core M2, all of which have efficiency cores (just like all the Intel CPUs in the comparison), so they didn't even show a fair 12 performance cores vs 12 performance cores on their chart. The base M2 has 4 performance cores and 4 efficiency cores; not too impressive if you need a 12-performance-core chip to beat it.
What really matters is how well this will run stuff not made for it. It needs to be like Apple's switch to ARM. Sure, the native stuff is ideal and will run better, but all the x86 stuff needs to run well too. Well enough that it doesn't suck, and that the average buyer in the space isn't going to notice it being slow. Good enough that, where the competing x86 chips provide a good experience, the ARM machine isn't a friction point. Then your average consumer can and will buy machines with these in them and be happy with the user experience and the performance. Then the people who write the software people use may actually make native ARM versions. If it runs x86 programs like crap, it's dead in the water.
If they really help out people who do audio production, I would not have any reason to jump to Apple's OS at that point. I would freakin' love that! So many people do audio production, even as a hobby, which is how it started for me. But now it's a passion, and I would love to see Windows put at least some focus on audio production creators. DAW and plugin developers should work with y'all to take that niche market.
Will it run Starfield? :| I mean, it has support for DX12.2. Just asking tho. Is there any x86-to-ARM translator built into Windows on ARM? Gaming is the one thing I love most on every single device, and this chip is ARM, which is power efficient. And I'd choose it over the M3 because Windows is also contributing to it, and it has DX12 support, unlike that damn Apple Metal API, which is unable to run Cyberpunk properly. :) I know this Snapdragon ecosystem is not going to kill my expectations. I'd prefer the 45-to-60-watt model of this chip. :)
Windows 11 on ARM has a very decent x64 emulator. Windows also has dual-mode DLLs, so you can move parts of an app to ARM code and still call functions in an x64 third-party DLL that isn't yet available as native ARM code. So far x64 emulation only works for user-mode code, so older x64 drivers will not work on ARM processors. A big unknown is whether add-on hardware vendors will support ARM versions of Windows drivers. Today you can build a 128-ARM-core Windows workstation, but you can't get ARM drivers for your workstation-class GPU. My guess is that for a while, only hardware that ships with a system will be supported, plus USB devices that conform to standard classes.
Recently read that AMD is going to launch ARM CPUs (aimed at the desktop consumer market). Anything exciting to say about those yet? Really excited by this interview and what's to come.
@@fanban2926 Wrong design, and they lost credibility with server customers 5 years ago. Server customers don't care about single-thread performance; throughput matters.
I'm sure you guys are cordial, but it felt a little like he was overselling how close they were to shipping something. I'd love some desktop Qualcomm parts, but it feels like step one is dedicating a decent amount of your SoC area to x86 compatibility like Apple did. (I have no idea what I'm talking about.) Windows on ARM isn't happening anytime soon, I don't think.
Also, Ian, you mentioned not getting your hands on the hardware for your benchmarks. Where do you think the deltas would lie if you got to run the full test suite? Obviously it's falling over somewhere.
Wow!! Nuvia comes to fruition under Qualcomm leadership. A match made in "Digital Heaven". Apple Silicon competition arrives from a modern silicon processor perspective. Oh boy! 👋👍
This conversation is very reassuring about the future, and it has shed some light on the idea of taking bold steps to remodel traditional concepts. Sometimes we can redesign things from the ground up and not be afraid of contemporaries who love to stick to certain conventions. A specific design may not be the only design that can work, and there are non-traditional concepts to be tried or discovered. It appears that Snapdragon has raised some concerns among the competitors and also sparked some thought-provoking, revamping ideas that need to be considered. 😎💯💪🏾👍🏾
It won't. ARM chips scale well with frequency up to a certain point, and then there are basically no performance gains. If you overclock it to 5 GHz you'll get like 2% more performance for 2x the power. Not worth it.
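A rough way to see why pushing clocks that far costs so much: dynamic power scales roughly with C·V²·f, and voltage has to rise with frequency near the top of the curve, so power grows close to cubically while performance grows at best linearly. A quick sketch (idealized textbook model, not measured Oryon numbers):

```python
# Idealized dynamic-power model: P ~ C * V^2 * f, with V rising roughly
# linearly with f near the top of the V/f curve, so P grows ~f^3.
def relative_power(f_ratio):
    return f_ratio ** 3

f_ratio = 5.0 / 4.3                        # pushing a ~4.3 GHz part to 5 GHz
extra_clock = f_ratio - 1                  # ~16% more frequency
extra_power = relative_power(f_ratio) - 1  # ~57% more power, before leakage
print(f"~{extra_clock:.0%} clock for ~{extra_power:.0%} more power")
```

In practice the tail is even worse than cubic once leakage and memory-bound diminishing returns kick in, which is where "2% more performance for 2x the power" comes from.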
Basically he justifies why Apple would have sued him: "you build trend lines of where the competition will be..." Or you just leave the competition with all that data in your head... 🤦‍♂️ Anyhow, a pointless "me, me, me... how great I am" exercise.
Highest single-thread performance, wow. That means higher than the Intel 14900K, on an ARM CPU; now that is crazy. The Intel CPU uses over 300 watts to achieve that, but if an ARM CPU can do it at 50 watts, that will be a monumental shift. I welcome a new ARM CPU to the PC.
0:00 ~"We study competition, we copy and we paste." 14:00 Basically copied AMD's approach? There are no efficiency cores by design in the schematics. In AMD's case the execution in silicon is either stretched or relaxed to meet thermal and frequency targets: blobs of different sizes, routed a little differently, but the exact same cores. Later called "efficiency cores", but with no functional difference, in contrast to what ARM and Intel are doing. Am I right?
Yes, you caught it. Besides, Intel is late to the efficiency sweet spot: they copied an old dense core and called it efficient. QC has taken its first step into the room of this high-stakes game, but isn't able to do on-the-fly legacy x86 binary translation.
BTW, I don't mean it's a bad thing; plus, "good artists copy; great artists steal". I'm fully against the patent system: no idea should be protected for more than 5 years, maybe except names and logos.
Interesting that he mentioned single-thread performance crown (at 10:30), yet Qualcomm explicitly omitted any single-thread performance metrics in their presentation. Someone's not being entirely forthcoming!
@@TechTechPotato Huh, that's strange. I read Ryan's review at AnandTech and he explicitly says in this paragraph: "With 12 performance cores, Qualcomm is pushing hard on multi-threaded performance. In fact, multi-threaded performance is the only CPU performance comparisons Qualcomm makes, as there are no single-threaded comparisons to speak of. Make of that what you will." and I also didn't notice any mention of it in ArsTechnica. Yet, looking at comments in various places, there is mention of ST performance!
Hmmm, that feels like Nokia when the iPhone came out 🤦. Going for performance per watt makes sense when x86 is getting more and more expensive and difficult to make every time the node shrinks. We're running out of road with Moore's Law 🤷. MS appears to be expending a lot of effort to make Windows multi-platform, and Nvidia and AMD are interested in other CPU architectures, so I think x86 might be in trouble long term. On the other hand, stuff like box86 to run x86 on ARM already exists. You can run Windows titles on Linux operating systems, and people have gotten Windows titles running on Raspberry Pis using that software. There's nothing that says PCs have to be x86 🤷.
00:02 Gerard Williams is the lead architect for the new compute chip at Snapdragon Summit 2023.
02:28 Developing energy-efficient designs for CPUs on Snapdragon
04:55 The performance of the CPU exceeded expectations and was above predicted numbers.
07:15 Being competitive in the market requires studying competitors and trends.
09:32 Snapdragon X Elite (CPU code name Oryon) is a CPU that will enable Qualcomm to expand into other markets.
12:02 Snapdragon X Elite is targeting lightweight and portable laptop designs with high-performance graphics and NPUs.
14:30 Windows on Snapdragon provides a solid operating experience with good battery life and potential for performance in gaming and high-end applications.
16:48 Snapdragon X Elite opens new doors for third party applications on Windows
Thanks, you saved my time
Andrei Frumusanu is a member of the CPU engineering team at Qualcomm? Wow. What a career for an AnandTech editor. I remember how he predicted the rise of ARM on the forums and most people attacked him for it. Now he is part of one of the best CPU teams in the world. What an example of success. Don't listen to haters and follow your dream.
Ha glad I wasn't the only one that caught that
@@claudyla I wonder why they couldn't mention his name. Maybe because AMD and Intel are hiring ARM engineers and QC wants to protect their dream-team from them?
To be fair, he was way better than just about any tech editor out there when he was with AnandTech.
@@edrd6257 Yeah, his CPU deep-dive analyses were legendary and gave so much knowledge to others. That's gone now. I wish he could do an Oryon analysis for us.
so romantic 10/10
So much sustained eye contact 😩
Bro - you got that eye too lol. It’s crazy to me how a lot of people can’t/don’t pick up vibes from other people. He looks outright smitten lol - don’t get me wrong, I don’t know the guy so it could be just heaps and bounds of professional idolatry, but I’ll eat my fuggin shoe if I’m wrong 😂
Stumblin and stuttering lol awwwww so adorable ❤️….
smitten lmao@@NightRogue77
@@NightRogue77 It's called social anxiety.
I have a heavy stutter when talking to people @@NightRogue77
As a reminder, they legally have to say the design for the chip started on day 1 of the acquisition by Qualcomm. Arm sued over the IP developed beforehand, over the license type.
when he said they started designing the chip ages ago and then had to backpedal that statement lmao
A good interview, nice to see Gerard so pleased with what his team have achieved. It's helpful to hear him say that there is a lot of research on the state of play before they get started. I wish them well as we need to see a vibrant semiconductor ecosystem at a time when there is a bit too much emphasis on the teams at Google and Apple.
So, thanks Ian.
Nicely done, Ian. I'm glad I got the early scoop on the technical snafu! LOL! Turned out great! Great seeing you in Maui. Thanks for hopping on my show.
There were two moments in this interview (and also a slide) that show Qualcomm isn't pursuing desktop anytime soon. I also wish you had asked about die size, transistor count, and the expected laptop price range; I have a bad feeling we won't see Snapdragon X in laptops under $1000.
Had already asked those questions off camera, didn't get an answer. Still a few months away from having them on shelves. Die size, I think people have already estimated from images of the silicon out there.
I hope they release an updated MSFT dev kit. They'll probably go after markets similar to what Apple has (Mac mini and iMac). I doubt they'll go after DIY until they've established themselves in mobile; they'd probably spread themselves too thin if they did. Either way, the competition in mobile will force Intel and AMD to be more aggressive.
That would be a mistake if you ask me. Getting a couple more hours for the same performance but, say, 30% more cost isn't going to entice people to switch to an ARM platform and deal with software compatibility issues. They'd be smart to try to just break even or even sell at a loss, and M$ would be smart to subsidize the prices if that's out of the question for Qualcomm.
Oryon has 52% higher IPC than the best x86.
- Oryon: 3227 pts ST GB6 / 4.3 GHz = 750 pts/GHz
- Zen 4: 2918 pts / 5.9 GHz = 494 pts/GHz
- 750/494 = 1.52x
This is absolutely insane IPC. Not sure how Intel and AMD can compete with this Oryon monster. Even if Zen 5 brings a 25% IPC gain, it will only get halfway there. Not to mention the excellent efficiency, which allows Oryon to fit in smartphones.
@@richard.20000 that's what instructions are for. I'll wait for real world testing before believing that kind of IPC claim
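For what it's worth, the points-per-GHz arithmetic quoted above checks out numerically; the real question is whether score/GHz is a fair IPC proxy at all, since clocks, memory latency, and ISA all interact. A quick check of the quoted numbers:

```python
# Check the Geekbench 6 points-per-GHz figures quoted in the comment above.
# Caveat: score/GHz is only a crude proxy for IPC.
def pts_per_ghz(score, ghz):
    return score / ghz

oryon = pts_per_ghz(3227, 4.3)   # ~750.5 pts/GHz
zen4 = pts_per_ghz(2918, 5.9)    # ~494.6 pts/GHz
ratio = oryon / zen4             # ~1.52x
print(f"Oryon {oryon:.0f} vs Zen 4 {zen4:.0f} pts/GHz -> {ratio:.2f}x")
```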
Qualcomm has clearly delivered, even surprising itself and surpassing its own expectations. Apple's chips on one hand vs Qualcomm's chips on the other, similar to what we have with Intel vs AMD in the x86 universe. Thanks a lot for making this interview, Dr. Cutress! 👏👍
The past examples of ARM + Windows have been lackluster; I really hope they make great strides in the near future.
I'm wondering whether it's on Microsoft, or developers. I'm guessing a bit of both.
@@mistamaog And more importantly Qualcomm, or whoever else is making the silicon.
They need something in there that accelerates the translation layer. Relying on software alone is a complete failure, unless they can wave a magic wand and suddenly make software translation way faster.
AI this and that, NPU this and that, yawn. "It can generate an AI picture so much faster than XYZ"... Get some silicon-level x86-to-ARM acceleration going that actually works well and you can change the market.
If I can buy an ARM Windows machine and more or less not care whether the app is native, if it still runs very well and the general consumer can just install what they normally install and use their machine just like before, it will succeed.
Just like Apple did. If you went from an Intel MacBook Air or Pro to the M-series ones, your battery life got way better, performance went way up depending on the task, and even when it didn't, nothing was so much slower that it made for a bad UX.
No Apple Silicon version at launch, or even now? Just install the x86 version like normal. The majority of M-series Mac users don't know what is and isn't ARM or x86, or what any of that even means. And it all works really well.
Compared to the dumpster fire of the PC industry and ARM.
It's not the speed of the processor. It's the lack of ARM versions of software and backwards compatibility that's the issue.
Windows 11 on ARM is shit.
How is Intel gonna transition to ARM if 30 years' worth of apps suddenly run like a sack of potatoes?
It'll take 10 years to transition.
Hope these processors will support Linux from the get-go
Moar!! We want more Gerard Williams!!
I hope this will finally turn Windows laptops into what I and many other people want. Cool, sub-25W, little noise, long battery life. We prefer that over 10% extra performance. Zen 5 and Meteor Lake will also scale poorly with power, here's hoping...
Hopefully the Linux support will be good :)
Yeah bro, can't wait for a full day linux laptop
@@Tatar_Piano exactly! My Thinkpad x1 carbon lasts for around 10 hours but that's nothing compared to a MacBook the same size and power
Seeing that ROCm is an open standard, will Qualcomm be able to make hardware that's compatible with it?
If so, I'd be interested in using their hardware as an inference machine, depending on how good and cost-effective it is.
I really like the M1 Ultra. I can get the maxed-out version used for $5000 CAD now. 128GB of unified memory is great for large language models. That may very well be my next computer.
No. ROCm is very much designed with AMD GPUs in mind. Open source isn't open standard.
ROCm isn't an open standard but SYCL (used by Intel's oneAPI) is, so they could possibly integrate with that
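On the 128GB-of-unified-memory point: a back-of-envelope for LLM weight memory is just parameter count times bytes per weight (hypothetical model sizes below, purely for illustration; real usage adds KV cache and runtime overhead on top):

```python
# Back-of-envelope LLM weight memory: parameters * bytes per weight.
# Hypothetical sizes for illustration; KV cache and activations add more.
def weight_gib(params_billions, bytes_per_weight):
    return params_billions * 1e9 * bytes_per_weight / 2**30

fp16_70b = weight_gib(70, 2.0)   # ~130 GiB: too tight for 128 GB unified memory
q4_70b = weight_gib(70, 0.5)     # ~33 GiB: fits with plenty of headroom
print(f"70B fp16: {fp16_70b:.0f} GiB, 70B 4-bit: {q4_70b:.0f} GiB")
```

This is why quantization, not raw capacity alone, decides what fits on a given machine.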
This man is the GOAT of ARM CPUs. Massive respect for him and his team.
I still think that for ARM to succeed in the desktop space, they'd need a common desktop platform at first, just like back in the Socket 7 days. For the companies that would mean more competition, but consumer confidence would be higher and the adoption rate faster, as consumers would have more options to choose from, with an upgrade path from the competition. Especially since Qualcomm exited the ARM server market prematurely, there is still too much uncertainty whether they can keep it up or will lose interest once again.
Even in the laptop space. Hardware upgradability aside, we need at least a common firmware/bootloader to be able to install different OSes, instead of needing an ISO for each platform out there.
Currently, just forget the desktop part. Based on their published test results, that is a far, far away option: compare the 23W results with the 80W results, almost 250% extra power consumption for less than 10% extra performance.
I disagree. I think what we will see is multiple players in the ARM CPU space selling laptops. Once that occurs, it will take off for desktops too. We might even see HP and Dell roll their own down the road. We already have Windows 11 on ARM, which is a big plus versus prior years.
@@freckledtrout3299 For the laptop market, different platforms for each ARM vendor might not be a big deal, as OEMs design their custom motherboards already. But in the desktop market you'd need to get motherboard vendors on board, and if each ARM vendor starts at 0% market share, the platform fragmentation would hinder the market penetration of them all as there is no guarantee which of them will succeed in the end. From the perspective of the motherboard vendors, it would be a big gamble as they'd need to invest in all of these different platforms in advance without knowing which ones will be more popular in the end. It would be way less risky if there were such a shared platform for all ARM vendors, making support from the motherboard vendors more likely.
@@TamasKiss-yk4st Sure, Qualcomm's CPU design might not be great for scaling up right now. But I also thought of AMD and Nvidia entering the ARM desktop space, as rumored lately. AMD might rejuvenate their ambidextrous computing initiative with a common platform for their x86 and ARM offerings. So AMD is in a better position here than Nvidia and other ARM vendors that'd need buy-in from the motherboard vendors to support their platform.
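The 23W-vs-80W observation a few comments up works out roughly like this (using just those two wattages and the sub-10% gain quoted there):

```python
# Power scaling between the two operating points quoted above.
low_w, high_w = 23, 80
extra_power = (high_w - low_w) / low_w              # ~2.48 -> ~248% extra power
perf_gain = 0.10                                    # assumed <10% perf uplift
perf_per_watt = (1 + perf_gain) / (high_w / low_w)  # ~0.32x the efficiency
print(f"extra power: {extra_power:.0%}, relative perf/W at 80 W: {perf_per_watt:.2f}x")
```

So at the 80W point the chip delivers roughly a third of the perf-per-watt of the 23W point, which is why the desktop pitch looks weak on these numbers.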
Gonna be tough supporting everything that is required on a Windows PC
I'm less excited about speed and more interested in high efficiency.
Imagine if an entire motherboard for a 14" laptop were shrunk down to the size of the Steam Deck's SBC, but used a 6800U instead, with a turbo TDP of 5W, a usual TDP of 3W, and idle/low usage of 0.1W/1W. Then you cram in a 96WHr battery and you decide what I/O, storage, or hot-swap GPU you want (similar to the Framework laptop idea).
That would be amazing for a workstation.
I know it's always better to be faster, but a 6/8-core Haswell is seriously perfectly usable 10 years later (assuming you clock it over 4.5GHz).
Increases in CPU performance are no longer as important; it is a lot more interesting to be more efficient.
Just my 2 cents
I thought that the exclusive Microsoft deal they got was a major part of their motivation to actually produce this. I expected to hear more about what it can do in Windows and about application support, even expansion support for regular desktop PCIe cards.
Fantastic interview.
Wow, you got a legend!🎉
The idea started as soon as we walked in the door...with all our ideas and work we developed at Apple over the previous 9 years
The reason why interviewers typically use lapel mics is so you don't have to sit so uncomfortably close lol
The funny thing is that under that foam windbreaker is a rode wireless go. A lapel mic. Maybe he just didn't have a second transmitter.
@@POVwithRC I lost the second one a few months back. I might bite the bullet and get a new set. The DJI ones seem to be widely used these days
Very tactful use of words starting around 4:45 regarding the history of this architecture, in order to avoid providing proof against Qualcomm in the ARM lawsuit. "We started ages ago. Well, actually 3 years ago, but you know, that feels like ages" 😂 No disrespect, it's just interesting/funny to see Qualcomm trying to maintain that this design was started at Qualcomm and not at Nuvia before being acquired by Qualcomm.
Yes, also conveniently no mention of working at Apple for the previous 9 years on ARM CPUs, which are veerry similar
Awesome interview man, can't wait to see X Elite go up against M3!
Well, Apple claims the M3 Pro has 60% more performance than the M2 Pro.
Not true. They claimed a 40% performance uplift over the M1. The 60% figure is just for the GPU, where Qualcomm is the king: currently the Snapdragon 8 Gen 2's GPU is more than twice as fast as the Apple A17 Pro's.
@@anandsuralkar2947 oh, my bad, I should have paid better attention. Thanks for the correction.
But keep in mind the base M2 and M3 have only 4 performance cores and 4 efficiency cores, and you're comparing those to 12 performance cores; the result is not 3x better. If you compare it with the M3 Max, which also has 12 performance cores, their results are nowhere close (their GPU also barely surpasses the base M2 level; even the M2 Pro gives way better fps in the tested games, not to mention the M2 or M3 Max).
@@anandsuralkar2947 "The SD 8 Gen 2's GPU is more than twice as fast as the A17 Pro's GPU" is not true either
I wonder what overhaul the VR Snapdragon will get for the Quest 3. I expect a good CPU upgrade.
Very, very happy to see G. Williams on your channel! Great job and a nice interview. I don't really understand how they can address both mass-market fanless laptops and pro-market laptops with the same 12-core CPU. OK, you can use binning and lower frequencies, but a single CPU design that covers every need… ARM, Qualcomm, Apple, and soon Intel are not stupid; if they implement big and little cores, there are many good reasons.
Williams talks about SKUs… OK, but that is not a real answer.
Williams doesn't talk about their possible hardware attention to Windows optimizations.
The downside is that Qualcomm management and culture might squash the progress. If they leave the Nuvia team alone to do what they're good at, they'll give Apple solid competition. Congrats to the Nuvia team.
I think the culture is better with Cristiano
I doubt M$ would allow that. The exclusivity contract they have is a pretty big deal
Semiaccurate revealed a backstory on the OEM/component side that makes me wonder if Qualcomm really wants to succeed with this design, e.g. forcing OEMs to use their PMICs that are not optimized for laptop workloads, leading to cost and power inefficiencies. It would be great if Qualcomm made everything right this time. Can you confirm that story, Ian?
Using mobile PMICs and board layouts won't make it inefficient, but it will make it very expensive; mobiles have much smaller material requirements and PCB sizes compared to laptops
@@aravindpallippara1577 Mobile PMICs are optimized for mobile current flows, not for 80W current flows. Just read the Semiaccurate article, it's free. By the way, they'd need more PMICs and an additional four PCB layers to make it work. That makes it needlessly expensive for the OEMs, as they could design a cheaper PCB with standard PMICs, which would deliver an even more efficient solution in the end. But Semiaccurate claims that Qualcomm executives tried to force the OEMs into a bundle deal with their mobile PMICs, and as OEMs threatened to withdraw, instead of the sane solution of giving in to the OEMs' demands, Qualcomm just hushed it over with money that undercuts the whole bundle deal in the first place and might hurt their bottom line even more than giving in to the OEMs would have.
I am very excited to finally see ARM compete within the Windows PC space.
I honestly don't think they can realistically do it unless they have some sort of hardware translation layer built into the silicon, like Apple does for backwards compatibility
@@octagonPerfectionist even then...
Let's wait for the end products and their pricing. Something tells me that these products won't come cheap.
@@Capeau well yeah it’d need to actually be materially better enough to be worthwhile lol
Will this CPU have a Linux driver or only a Windows driver?
Have there been any statements on how open/documented the platform will be and whether it requires proprietary drivers?
One of the biggest strengths of the PC platform is that it's (mostly) open enough to be able to have hardware run the latest software and drivers for a very long time (and open source operating systems beyond that time).
Whereas I thought one of the reasons for Google building their own SoC was the lack of Qualcomm long-term platform support and proprietary drivers (I haven't verified this though).
If these will be able to run a Linux distribution without binary drivers, then I'd definitely be interested. If not, then potentially small efficiency advantages are simply not worth the migration effort.
Will it have Linux support?
Snapdragon X Elite ❤❤
I really wonder if we will see Qualcomm take on the top-tier flagship SoCs from Apple now. Keep in mind that Apple just launched the M3 series, which will face this Qualcomm SoC.
They just compared their chip with the 8- or 10-core M2; all of those have efficiency cores (just like all the Intel CPUs in the comparison), so they didn't even show you a fair 12 performance cores vs 12 performance cores on their chart. The base M2 has 4 performance cores and 4 efficiency cores; it's not too great if you need 12 performance cores to beat it.
What really matters is how well this will run stuff not made for it. It needs to be like Apple's switch to ARM. Sure, the native stuff is ideal and will run better, but all the x86 stuff needs to run well. Well enough that it does not suck, and that the average buyer in this space is not going to notice it being slow. Good enough that where the competing x86 chips provide a good experience, the ARM machine isn't a friction point.
Then your average consumer can and will buy machines with these in them and be happy with the user experience and the performance. Then the people who write the software people use may actually make native ARM versions.
If it runs x86 programs like crap, it's dead in the water.
If they really help out people who do audio production, I would not have any reason to jump to Apple's OS at that point. I would freakin' love that! So many people do audio production, even as a hobby, which is how it started for me. But now it's a passion, and I would love to have Windows put at least some focus on audio production creators. DAW and plugin developers should work with y'all to take that niche market.
I forgot about VR/AR headsets; that's why I wanted to see Snapdragon get better.
Scary faster and hopefully not scary expensive.
Well, maybe you can hope for not scary expensive...
Can't wait for next-gen VR/MR/XR headsets
Will it run Starfield? :| I mean, it has support for DX12.2.
Just asking tho.
Is there any x86-to-ARM translator built into Windows on ARM?
@_@ Gaming is the thing I love most on every single device. And this chip is ARM, which is power efficient. And I'd choose it over the M3 because Windows is also contributing to it, and it has DX12 support, unlike that damn Apple Metal API,
which is unable to run Cyberpunk properly. :) And I know this Snapdragon ecosystem is not going to kill my expectations.
@_@ I prefer the 45 to 60 watt model of this chip. :)
Windows 11 on ARM has a very decent x64 emulator. Windows also has dual-mode DLLs, so you can move parts of an app to ARM code and still call functions in an x64 third-party DLL that is not yet available as native ARM code. So far x64 emulation only works for user-mode code, so older x64 drivers will not work on ARM processors. A big unknown is whether add-on hardware vendors will support ARM versions of Windows drivers. Today you can build a 128-ARM-core Windows workstation, but you can't get ARM drivers for your workstation-class GPU. My guess is that for a while, only hardware that comes with a system will be supported, plus USB devices that conform to standard classes.
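As a rough illustration of how transparent that emulation is to a program: a process can only see the architecture the OS presents to it. This is a portable Python sketch using just the standard `platform` module (on Windows the authoritative check would be the Win32 `IsWow64Process2` API, which this sketch deliberately does not call):

```python
import platform

# platform.machine() reports the architecture the OS exposes to this process.
# Under Windows-on-ARM x64 emulation, an x64 Python build sees "AMD64" even
# though the physical CPU is ARM64 -- the emulation layer is transparent.
machine = platform.machine()
print(f"reported architecture: {machine}")

# A native ARM build of Python on the same machine would instead report
# "ARM64" (Windows) or "aarch64" (Linux).
is_arm = machine.lower() in ("arm64", "aarch64")
print("native ARM build" if is_arm else "x86/x64 build (possibly emulated)")
```

That transparency is exactly why user-mode emulation works well while kernel-mode drivers cannot be emulated: drivers run below the layer doing the translation.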
Recently read that AMD is going to launch ARM CPUs (aimed at the desktop consumer market); anything exciting to say about those yet?
really excited by this interview and what's to come
The good news is that they are not talking about using the Nuvia design for servers. When will this chip ramp in customer designs?
Why is that good news?
@@fanban2926 Wrong design, and they lost credibility with server customers 5 years ago. Server customers don't care about single-thread performance; throughput matters.
Do they have RISCV projects at Qualcomm?
They announced they're designing RISC-V cores for WearOS a couple weeks ago.
Qualcomm also has a proposal for a RISC-V ISA extension called Znew. So we can guess they are actively working on RISC-V.
Do you think their leaving Apple affected Apple's progress?
I'm sure you guys are cordial, but it felt a little like he was overselling how close they were to shipping something. I'd love some desktop Qualcomm parts, but it feels like step one is dedicating a decent amount of your SoC area to x86 compatibility, like Apple did. (I have no idea what I'm talking about.) Windows on ARM isn't happening anytime soon, I don't think.
Also, Ian, you mentioned not getting your hands on the hardware for your own benchmarks; where do you think the deltas would lie if you got to run a full test suite? Obviously it's falling over somewhere.
Wow!! Nuvia comes to fruition under Qualcomm leadership. A match made in "Digital Heaven". Apple Silicon competition arrives from a modern silicon processor perspective. Oh boy! 👋👍
Which camera is the interviewee looking at? Why doesn't he look into this video's camera? Weird. Also, didn't this guy work on Apple's chips earlier?
15:16 "By the way I won't mention him but he is a part of the CPU engineering team". Is this where Mr Ryan Shrout has moved to?
They're likely talking about former AnandTech writer Andrei Frumusanu
Shrout has gone back to being an analyst
Ten billion, that's impressive
This conversation is very reassuring about the future, and it sheds some light on the idea of taking bold steps to remodel traditional concepts. Sometimes we can redesign things from the ground up and not be afraid of our contemporaries who love to stick to certain conventions. Sometimes a specific design may not be the only design that can work, and there are non-traditional concepts to be tried or discovered. It appears that Snapdragon has raised some concerns among the competitors and sparked some thought-provoking, revamping ideas that need to be considered. 😎💯💪🏾👍🏾
Can it compete with Meteor Lake?
I doubt it
that's nuts!
These people are God-like...
I wonder if they'll come unlocked?
It won't. ARM chips are very sensitive to frequency up to a certain point, and then there are basically no performance gains. If you overclock it to 5GHz you'll get like 2% more performance for 2x the power. Not worth it.
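The "2x the power for 2% more performance" intuition follows from the classic dynamic-power relation P ≈ C·V²·f: pushing frequency past the efficient range also forces voltage up, so power grows much faster than clock speed. A toy sketch, where the voltage/frequency operating points are invented for illustration only (real V/f curves are chip-specific):

```python
# Toy dynamic-power model: P = C * V^2 * f, with the capacitance term C
# treated as a constant. The operating points below are made up to show the
# shape of the curve, not measured from any real chip.
def dynamic_power(voltage: float, freq_ghz: float, c: float = 1.0) -> float:
    return c * voltage**2 * freq_ghz

base = dynamic_power(voltage=0.90, freq_ghz=3.8)  # hypothetical stock point
oc   = dynamic_power(voltage=1.25, freq_ghz=5.0)  # hypothetical overclock

print(f"frequency gain: {5.0 / 3.8 - 1:.0%}")  # ~32% more clock
print(f"power increase: {oc / base - 1:.0%}")  # ~154% more power
```

Even before accounting for memory-bound workloads that don't scale with clock at all, the quadratic voltage term alone makes the last few hundred MHz disproportionately expensive.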
As usual: drivers, software, OS, and application support.
I think, given the performance of all these Arm-based CPUs, the first person who should be fired is the Intel CTO.
Fascinating!
Who wants Windows, the spyware? I want Linux!
AMD will also have Strix APUs with 45 TOPS in mid-2024.
Game consoles became x86 to make development easier, as PCs are x86... right before we begin switching to ARM.
Remember, before switching to x86, the Xbox 360 and the PlayStation 3 ran on IBM's PowerPC chips (RISC chips, just like ARM's)
X DOUBT
Pakala 🎉
No Onion treats for Ian?
*Oryon
basically he justifies why apple would have sued him: “u build trend lines of where the competition will be… “ … or u just leave the competition with all that data in your head… 🤦♂️ … anyhow, a pointless “me mmeeeee mee .. how great i am … me me me” exercise
I love this channel, but other new viewers will view this as too slow. I'm watching this at 2X speed and I would probably do 3.5X.
I use a chrome plug in and watch most stuff at 2-4x
@@TechTechPotato Nice.
They need to add more RAM to the SoC to run big LLMs. It has to run a 130B GPT-4-class model.
Snapdragon phones are good
tech tech gelato
Mr. Potato, you have to invest in better/more robust hardware 😅😅
But amazing content nonetheless
Highest single-thread performance, wow; that means higher than Intel's 14900K, on just an ARM CPU. Now that is crazy.
Intel's CPU uses over 300 watts to achieve that, but if an ARM CPU can do it at 50 watts, that will be a monumental shift. I welcome a new ARM CPU to run PCs.
Who is Cristiano?
Cristiano Amon, the CEO of Qualcomm
0:00 ~"We study the competition; we copy, we copy and we paste." 14:00 Basically copied AMD's approach? There are no efficiency cores by design/schematic: in AMD's case, the same core's layout in silicon is either stretched or relaxed to meet thermal and frequency targets. Blobs of different sizes, routed a little bit differently, but the exact same cores; the denser one is later called an "efficiency core", but there is no functional difference.
In contrast to what ARM and Intel are doing.
Am I right?
Yes, you caught it. Besides, Intel is late to the implementation sweet-spot show; they copy an old dense core and call it efficient. QC is taking its first step into this high-stakes game, but isn't able to do on-the-fly legacy x86 binary translation via LLVM.
BTW, I don't mean it's a bad thing; plus, "good artists copy; great artists steal". I'm fully against the patent system; no idea should be protected for more than 5 years, maybe except names and logos.
Interesting that he mentioned single-thread performance crown (at 10:30), yet Qualcomm explicitly omitted any single-thread performance metrics in their presentation. Someone's not being entirely forthcoming!
Er what? plenty of ST numbers were presented and given
@@TechTechPotato Huh, that's strange. I read Ryan's review at AnandTech and he explicitly says in this paragraph: "With 12 performance cores, Qualcomm is pushing hard on multi-threaded performance. In fact, multi-threaded performance is the only CPU performance comparisons Qualcomm makes, as there are no single-threaded comparisons to speak of. Make of that what you will." and I also didn't notice any mention of it in ArsTechnica. Yet, looking at comments in various places, there is mention of ST performance!
Was anyone from Ars even there? They did cinebench and geekbench ST numbers against the competition. I still need to record my video
Oryon is not pronounced Orion lol
Wanna bet?
@@TechTechPotato Should put two Oryons together and make an Oreon cookie to munch on.
Hopefully it's priced reasonably; the Apple M3 Pro and Max are way overpriced
Snapdragon X Elite beats the Apple M2 Max and Intel's 14th-gen i9
Garbage e-waste chips that will end up in e-waste devices; they will never be as fast as x86, never as compatible, and with zero efficiency advantage
Hmmm, that feels like Nokia when the iPhone came out 🤦. Going for performance per watt makes sense when x86 is getting more and more expensive and difficult to make every time the node shrinks. We're running out of road with Moore's Law 🤷.
MS appears to be expending a lot of effort to make Windows multi-platform, and Nvidia and AMD are interested in other CPU architectures; I think x86 might be in trouble long term.
On the other hand, stuff like box86, which runs x86 on ARM, already exists. You can already run Windows titles on Linux operating systems, and people have gotten Windows titles running on Raspberry Pis using that software. There's nothing that says PCs have to be x86 🤷.
Wow, lucky me? First :D
Third?
You guys in love? 😂
Incredible interview 👍