As always, be civil in the comments. Check out the semiconductors playlist for other videos on this topic: ruclips.net/p/PLKtxx9TnH76QEYXdJx6KyycNGHePJQwWW
Remember to cross-verify everything you get from WCCF with other sources; they're not exactly known for accuracy. I know news sometimes truly breaks over there, but just as often they're the source of false information.
You know, asking people to behave "civilized" on a video that's low-key throwing shade at China's advancement and intentions, on an online video platform dominated by Western audiences and hardcore Chinese nationalists on VPNs, is on par with asking people to be civil about atheism at Sunday mass in some Southern Baptist church. Either you completely disable the comment section or just ignore it completely. Hey, as long as you've got millions of views it doesn't matter if the CCP bots and the incels trash-talk each other, right? Just grab the popcorn, sit back, and enjoy the circus while you rack up that monetization from the views.
Great content as always. I was surprised that you did not mention probably the biggest reason why x86 is hard to replace: virtually all desktop software is written and compiled for x86.
@Waldel Martell Good point. I was thinking of x86 as both 32-bit (x86) and 64-bit (x86-64/x64). At least in current and previous Windows environments, 64-bit machines are compatible with 32-bit binaries, and I believe that will remain so, so I don't see 32-bit x86 binaries going away any time soon. It has been a really good thing that we have had x86 as the de facto ISA; it has given today's software a common platform. But I'm looking forward to seeing where this period of cross-development madness ends up, with billions of ARM devices side by side with x86 devices.
@@jarvenpaajani8105 Maybe so, but thinking forward, that argument is out of date now that we have AI able to program for us, like OpenAI's new Codex update. Maybe it needs some more fine-tuning at the moment, but in a year or two your argument won't hold any longer...
Love your work. I did read through your amazing piece on the Hygon benchmarks and enjoyed it. But in the end, I couldn't fit it into the video without butchering it. I'll put it in the description, though.
@@Asianometry Something else I think might be missing from the video: x86 patents start expiring in the coming years. So if China or some other country can make competitive chips, maybe they can start selling them in the West too. Or maybe we'll see more x86 extensions in ARM or RISC-V chips for compatibility.
There aren't "various reasons" that Intel licensed its architecture to AMD, there was just one. When IBM designed the IBM PC, they told Intel that they wouldn't design their chip in unless there was a second source. IBM knew better than to get locked in to a single vendor on the hardware side and they had the clout to force Intel to agree. What they didn't understand was how important the operating system and software was going to be, and let that strategic part walk right out the front door. Later on they tried to take things back with OS/2, but by then the market had decided.
That was the reason forty years ago, not so much now. Those patents have long since expired, and IBM no longer has any ability to compel Intel to do anything. The primary reasons Intel and AMD have cross-licensing today are a) they both hold patents that could restrict the other from making microprocessors, and b) Intel has had to hedge against antitrust exposure. I strongly suspect that when AMD was stumbling back around 2010, Intel was quite concerned that AMD would be acquired by a company not interested in Intel's patents (one that would extort Intel to let it continue producing processors). At the time, AMD held the patents on the x86-64 architecture.
@@JasonDoege Yeah, Intel is responsible for x86, while AMD is responsible for x64. They kinda needed to work together to at least *some* degree, plus there are potential antitrust issues.
@@JasonDoege lol, AMD, Intel, and IBM are all owned by the same people, as is practically every other publicly traded company: namely Vanguard, BlackRock, and State Street, among many smaller owners. It's all the same people's property.
It matters. China is aiming to be self-reliant in chips, for national security and to shift its economy from manufacturing alone to services and high tech. Being reliant on foreign chips is China's big disadvantage.
ATM, what China needs most is the photolithography equipment needed to manufacture leading-edge chips. Developing this technology will take most of a decade. It also hurts that scams permeate the Chinese capital market -- look up Wuhan Hongxin for an example. OTOH, rocket science seems to be less of a problem. 🤷‍♂️
Actually, that's why they are training experts at the Sirius education center; they're going to develop Russian chips in the next decade. Hopefully it'll be a good competitor.
@@permafrosty What Russia lacks and needs most is fabrication, and what China should look to is a non-Western architecture, which Elbrus provides. It would be better for them both to join efforts to develop these.
Another interesting thing to point out about instruction sets: some CPUs that fall into the same family can still have instructions not seen in other chips of that family.
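This is why software that wants the fastest code path has to probe feature flags at runtime rather than trust the family name. A minimal sketch of that dispatch pattern in Python; the chip names and feature bitmaps here are made up for illustration, loosely mirroring how x86 CPUID feature bits work:

```python
# Toy feature-flag dispatch: two hypothetical chips from the same family
# advertise different instruction-set extensions, so software picks the
# widest kernel each one actually supports.

SSE4_2 = 1 << 0
AVX    = 1 << 1
AVX2   = 1 << 2

# Hypothetical feature bitmaps (not real CPUID values):
CHIPS = {
    "family7_model_a": SSE4_2 | AVX | AVX2,  # full-featured part
    "family7_model_b": SSE4_2,               # same family, AVX fused off
}

def supports(chip: str, feature: int) -> bool:
    """True if the chip advertises the given feature bit."""
    return bool(CHIPS[chip] & feature)

def pick_kernel(chip: str) -> str:
    """Dispatch to the widest vector kernel the chip supports."""
    if supports(chip, AVX2):
        return "avx2_kernel"
    if supports(chip, SSE4_2):
        return "sse42_kernel"
    return "scalar_kernel"

print(pick_kernel("family7_model_a"))  # avx2_kernel
print(pick_kernel("family7_model_b"))  # sse42_kernel
```

Real libraries do the same thing with the actual CPUID instruction (or compiler builtins like GCC's `__builtin_cpu_supports`), which is how one binary can run on every chip in a family while still using the extra instructions where present.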
This reminds me of the Pentium/Celeron debacle, where Celerons were way cheaper than Pentiums, but during the manufacturing process all Celeron chips were Pentium chips with some features physically disabled. And Pentiums were like twice as expensive.
@@markdsm-5157 Idk if it's true, but I remember hearing that a lot of the failures from the higher-tier chips get converted into lower-tier chips if they still meet those specs. It does seem like taking a V8 with 4 broken cylinders and selling it as a V4, but I guess with chips the extra weight of dead silicon doesn't really make a difference.
This was done with the Cell CPU in the PS3: chips were fabricated with 8 cores, but only seven were used. This allowed for one defective core, increasing the yield. picture.iczhiku.com/resource/ieee/sYIlaPwJufluTnnb.pdf
@@ragnarok7976 That is the later approach. Because the majority of old chips were single-core, the internal structures of the core were what got disabled for marketing/quality control. Now they mostly turn off additional cores.
All these chips may have open instruction sets, but closed design, which can lead to companies and governments building backdoors or hidden vulnerabilities into them. I want to see not only the instruction set, but also the chip design getting open sourced, that way we can have many peering eyes from the public to make sure nothing shady is going on.
The only way you could verify that they aren’t malicious is by destructively scanning random samples in the supply chain. It’s not worth the effort, IMO. Especially when China has shown clear intent by building hardware back doors in devices previously (Cisco networking equipment and of course Huawei come to mind).
Slow and steady. With this x86 processor technology acquired, China will probably be able to progress to more advanced processor and chip technologies of its own and become self-sufficient.
@@MrSvenovitch He never complimented things that grow organically, he only complimented organic growth itself as the best growth. Reading comprehension is seriously lacking.
Cheap or expensive, whatever; if it has good performance, there's no need to care whether it's Chinese or American, cheap or expensive... just buy it. If China can give better or the same performance at a cheaper price, so much the better.
China should focus on open source architecture such as RISC-V. More than that, they should double their efforts on lithography as well. I hear their SMIC is now 14nm-capable. That'd be good news. Most consumer-grade products don't need 7nm or better chips anyway.
Well, the issue is that China's tech development always needs a "benchmark", aka something to copy from. There are no really advanced processors based on RISC-V yet. I am sure they'd be all over it if someone else made a very fast RISC-V processor on an advanced node.
lol, tell that to Intel moving to 3nm for consumer-grade products. All progress in consumer-grade products is skyrocketing because of smaller fab processes; what do you mean there's no need for 7nm or better chips? Maybe China doesn't need them, but then again nothing is coming out of there in that arena anyway.
@@siggydj Bro, you don't even need a 14nm processor in your toaster. Yes, the industry is moving toward smaller nodes, but do you really want a 300 dollar toaster?
We are. RISC-V is getting heavy funding and commercial interest, a lot more than the x86 architecture mentioned here. The tech here will mainly be used as a bridge before the industry fully transitions to RISC-V, from chipset to software. And it will be used to power legacy systems that are eventually, slowly phased out.
@@vashsun4908 Soooo you're saying China plans to compete with Intel and AMD computer processors by making toaster tech? K 🤣 You do realize the video is specifically about computer CPUs, right, and so is this discussion? Consumer products don't just mean toaster chips, lol. Or are you saying we're gonna see RISC-V-loaded toasters with 14nm lithography chips soon too, something the OP wants China to make progress in and what I was replying to? Toaster chips I don't think even reach 28nm lithography, let alone 14nm; this is computer and cell phone CPU chips etc. we're talking about here.
Chinese engineers (and companies) are very good at iterative design. Now that they have a baseline x86 chip design, in ten years they will be challenging AMD and Intel.
@James Moss I should have prefaced with "on the low end." They won't be able to make the latest and greatest gaming CPUs. But, they will push Intel and AMD out of the low end CPU market. I have been in FABs in China and seen what they have done in Solar and Display.
This is, quite honestly, one of the stupidest comments I've EVER read, not just on RUclips, but the entire internet. And I've been on here since before the WWW even existed.
@@bradallen8909 Indeed, some people just don't understand how hard it is for China to truly "catch up" to the West, even if it's in their own domestic market; they make a lot of "claims" in the technology space, all very much unverified, and it's all a far cry from what Chinese media portray it to be.
That's a polite way of saying that China (like East Asians in general) is extremely good at improving upon tech innovations and IP stolen from the West/Europe. I agree. Frankly, I'm looking forward to the shift of hegemons. I welcome our new Chinese overlords, on the basis that they don't seem interested in dismantling our culture or civilisation. Nothing that happens will change the per capita inventiveness of each pole's respective natives, but when you have over a billion people under a hyperorganised central government, per capita metrics just don't matter.
I have a feeling I am one of the few here who actually used a Hygon-based server; I had to put up with all sorts of wacky I/O characteristics... Thanks for the explanation of the joint-venture structure with AMD.
The 6502 was the first desktop microprocessor I ever used. It was in the Apple II and the Commodore VIC-20, both of which I had experience with before getting an old 80286 in the shop in the early 1990s. The 486DX was the go-to machine at that time.
x86 is the perfect all-in-one architecture for leveraging most ASIC-based architectures without developing something completely new. So yes, I believe it does matter for now, but eventually we will evolve past it, especially now that custom chips are becoming a method of protecting IP/trade secrets and creating walled gardens.
Using an open architecture should remove the IP problem for them. The issue here, though, is foundries. For that you'll need lithography, and that is basically ASML. Nor does it help to invest billions in companies that do not actually do things but just collect government funds.
Easier said than done. China is good at copying and stealing technologies; coming up with their own original design is a huge challenge, since creativity and critical thought aren't taught in their educational system.
@@iMadrid11 Couldn't agree more. "Sticking your neck out" and bringing innovation or something weird or different or plainly "fun" is tantamount to insurrection. If you stifle free thought, you stifle innovation, creativity, and plain inventiveness. Look at the Jack Mas, paraded and humbled as they are forced out of their companies and sent to "reeducation". The Japanese tech sector blossomed in the 70s and 80s thanks to US money and relative freedom.
@@iMadrid11 The real challenge is not coming up with a new design but market adoption. Using existing standards and specifications can ensure faster adoption of their products to replace the competitors'.
x86 on older processes definitely has a market, albeit in the same way the Z80, 68000, and other legacy chips still have their uses. It is not like it is DEC Alpha or PA-RISC and therefore at a dead end. Hopefully they can do an AMD and evolve the architecture rather than copy it. This could suit a new era of homegrown products for the Chinese market.
Don't forget, AMD checks those designs before fabrication. So ultimately they can get some new design features from China too. It's a win-win cooperation.
It matters to the Chinese. Think about it this way. China had practically no chip design prior to Trump’s sanctions and Taiwan’s limit of export. After just a few years, they are self-sufficient in x86 chips. By 2025, they aim to be fully self-reliant. This is pretty important news considering they are the world’s top semiconductor market.
Not true. Huawei had their Kirin lineup of processors (their own design) that was comparable to Qualcomm Snapdragon and MediaTek. But they used TSMC to build their chips, so they had to stop developing Kirin chips because of sanctions.
I don't know if this one matters from your perspective, but if you have ever heard your PC or Mac described as a "64-bit system", you should know that the x86 architecture was originally 32-bit. AMD then developed an extension of that 32-bit x86 architecture into a 64-bit one, called x86_64, which almost every computer system, Intel or AMD, uses and supports; it has become the standard these days. Because of this, Intel has to license a 64-bit extension of its own IP. I like to think the reason Intel and AMD can have a (sort of) deadlocked competition, even though AMD has used Intel's IP for so long, is that both are balanced by their circular IP dependencies: if you pull the rug out from under your enemy, your own rug gets pulled at the same time. Note: IP = intellectual property.
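The practical meaning of "AMD extended 32-bit x86 to 64-bit" is wider registers: the same arithmetic that wraps around at 2^32 in 32-bit mode doesn't wrap until 2^64 in x86_64 mode. A small sketch of that difference (Python integers are arbitrary-precision, so fixed-width registers are mimicked here with explicit masks):

```python
# Illustrating 32-bit vs 64-bit register width: the same addition wraps
# at 2**32 on a 32-bit register but not on a 64-bit one.

MASK32 = (1 << 32) - 1
MASK64 = (1 << 64) - 1

def add32(a, b):
    """Addition as a 32-bit x86 register (e.g. EAX) would perform it."""
    return (a + b) & MASK32

def add64(a, b):
    """Addition as a 64-bit x86-64 register (e.g. RAX) would perform it."""
    return (a + b) & MASK64

big = 3_000_000_000            # fits in 64 bits, near the 32-bit ceiling
print(add32(big, big))         # wraps: 6_000_000_000 - 2**32 = 1_705_032_704
print(add64(big, big))         # 6_000_000_000, no wrap
```

The extension also added more registers and a larger address space, but the wraparound above is the simplest way to see why "64-bit" matters to software.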
I think RISC-V is the only right way. RISC-V will be number one in the market in the future. We will see more and more companies moving from ARM and x86 to RISC-V.
The end product is where the tech gets used, and the more complexity we integrate into the chip, the harder it is going to be. We are in such a state of flux that even Intel needs a couple of years to see where things are going.
It is worth noting that the patents covering much of the basic x86 ISA have expired. In theory, anyone can legally make CPUs supporting all the instructions that existed in 2003. x86_64 and many of the SIMD and other acceleration instructions would, however, still require licensing.
It would have been true if it were from any other country. But this is coming from China, and I don't know how I feel about a dictatorial regime obsessed with gaining power getting its hands on something like this. You're Indian, so you know what I am talking about.
@@icollectstories5702 ARM is in the process of doing just that. RISC-V may be the long-term solution (though it'll meet resistance from the Western powerbrokers) but ARM is ready to do battle with the American duopoly for the laptop, desktop, server, and supercomputer space over the next ten years. At least it's some competition for the US, which has had things entirely their own way for far too long (probably the last time anyone else was on top was just after WW2 when the UK finally passed the baton after reaching most of the computational firsts with everything from the Difference Engine to the Ferranti Mk1, Colossus, and the Manchester Baby; Germany potentially could've been on top for a while if they'd taken that lone wolf Conrad whatshisname more seriously). At any rate, ARM's big advantage is that nobody has to worry about Britain building and backdooring the chips itself -- you can choose your own backdoor based on country of origin.
@@sheilaburrowes9081 Well, technically, ARM has been doing that since last century. How much longer do I have to wait?😴 US dominance in technical fields is mostly due to having the largest economy post-WWII. An example is when the UK killed its space program because the US offered them lower prices. You can compete with the US for a while, especially if you form international consortia, but in the long term, you have to bring American money to the table. Or you can specialize in niche markets the US doesn't care about like Swedish fighter jets.
Making a chip of this generation and engineering the next generation of chips for manufacture are two different things worth billions. Just because you can make this gen doesn't mean you can make the next gen.
To all the commenters: I'm sure the reason RISC-V moved to Switzerland is to avoid the political conflict between the States and China. Besides, it's not like there hasn't been RISC-V development in the latter.
AFAIR it was mostly a reaction to the Trump government's interference in ARM's and Huawei's business. RISC-V wanted to be independent of any country, so they chose the most neutral one. And yes, both the US and the PRC are developing on RISC-V, since it is a clean ISA tailored for specialized microarchitectures; it's royalty-free, and the BSD license allows proprietary designs.
funny that an ISA that began life as the arithmetic engine for a desktop calculator has grown organically to encompass almost the entire computing world.
A loose connection there! - That was the Intel 4004 - a 4 bit processor. Other than being created by Intel, it bears little resemblance to any x86 ISA. But yes it is remarkable that the inventor of the first successful microprocessor is still basically the leader of the same industry over 50 years later.
I remember those Hewlett Packard calculators that were programmable with little strips of magnetic tape - for the early 1970s it was pretty rad. Our college actually had a special conference to decide if calculators would be allowed in finals, can you imagine?
I'm assuming most of the China sales are to OEMs that manufacture their systems in China and export many of them to other parts of the world, including the US. Taiwan is certainly also a very large originator of revenue for Intel, and it's not because of processors that end up being run in Taiwan.
A very old microprocessor, the 8-bit, 2 MHz 8080, worked miracles when well used, as on Space Shuttle Columbia on 12 April 1981. There is no problem making x86 processors; it won't change anything. Here's an example: a simple Pentium 4 with Puppy Linux can be better than a brand-new Windows 11 computer. Eager users want faster computers for doing very, very simple jobs (like viewing pictures, typing text, or browsing the net). I compare it to driving a space rocket on a road with a 55-mile-per-hour speed limit.
What happened to the Loongson MIPS chips? I had a Loongson laptop (netbook) 10 years ago that was pretty functional, cheap, and had super long battery life... It seems the gap has gotten bigger, not smaller, over the years; now, with Apple's and Microsoft's ARM adventures, it seems completely obsolete...
I think with Apple and Windows aiming for ARM, and Intel still struggling with its 7nm line, they are trying to squeeze the last bit of revenue out of the x86 platform before the portable market evolves to ARM, or the M1 in Apple's case. The desktop will soon follow. Most software will be recompiled for ARM, so most applications will be available to customers. Specific software will run in an emulator or an on-the-fly interpreter.
The M1 is a dead end because of Apple's business model. Apple doesn't make technologies that others can then use and extend, but instead keeps a walled garden to cross-promote its closed ecosystem.
@@hamobu Kind of. But if Apple can get there, so can ARM. It's just a question of whether a British company can find the kind of R&D budget needed. But with its long queue of international buyers, that might not be a problem.
Well, x86 is in a sense both. The encoding is CISC, but not extremely so, because, unlike the 68000 and many others, it was based on the simplistic 8-bit Datapoint 2200 (via the Intel 8008 & 8080), not the more capable minicomputers or "mainframes" of the era. The internal execution in the 486 and the original Pentium (and contemporary competitors' x86 CPUs) was very similar to RISC designs for simple arithmetic instructions, although more complex and involving microcode for larger instructions. Even many old and (fully) microcoded CISC processors had an execution core similar to a RISC processor's. That's where the RISC idea came from back in 1975: let a high-level-language compiler generate code for that simple core directly.
Just curious, are you a native Chinese speaker? I saw from some of your videos that you lived in Taiwan, but your Mandarin pronunciations are unique, to say the least...
I speak both Mandarin Chinese and British English at a native level. What I can say is when I'm interposing a word of another language into a sentence, I will most likely follow the inflection conventions of the sentence language. Some examples I can think of include: 1. When I include a Chinese name (often for a place) in an English context, I often drop the tone; 2. When I'm reading out a string of Latin letters (over the phone for example) in a Chinese context, I often assign each letter a fixed tone in Mandarin regardless of its place in the sequence.
@@scheimong Agreed, but the uploader doesn't always drop the tone. E.g.: ruclips.net/video/vyVyphJebfw/видео.html I was just wondering if it was due to a dialect or something.
It matters. Even if x86_64 is going to be obsoleted in some years as RISC-V rapidly rises, it's still good for China to master existing technical solutions as well as it can. The x86_64 architecture does have crappy old fundamentals, but in order to compete it has to be filled with leading-edge stuff, so it pays to be there to learn that stuff.
x86 is crappy but it is the most common, and it is "fact on the ground". RISC-V may be the future, but not now. However, China's whole-country approach is very promising. They can instruct universities to teach RISC-V. Systems sold to ASEAN, Africa, South America, by China, can benefit from RISC-V.
Obsoleted in maybe some decades, not just some years... It takes an enormous amount of product support and supply to make a certain bit of sand obsolete compared to another bit of sand... Edit: sand is a metaphor for the small material cost in a product and should not be taken literally (duh). Look at SBCs for small projects and you'll quickly find a variety of different processors proving that "obsolete" has a very steep requirement, from a variety of sources, before anything is truly obsolete in the consumer world.
Bullshit, ARM is decades behind x86. ARM has a massive issue with latency; you can't get it to respond as fast as x86. It's more likely a hybrid architecture will be the future, using ARM for low-power tasks and x86 for its speed.
The wikipedia article you linked to clearly states that the chips you are talking about are AMD64 not "x86"... even the 32bit intel designs are NOT "x86" they are x88, the command set changed.
For critical areas, China uses its own designed and manufactured chips, such as the Loongson 3. In 2021, Loongson Technology released its own ISA (instruction set architecture), LoongArch, which makes the Loongson 3 5000 series a purely Chinese chip.
It is likely that the Chinese will design and develop their own chips based on an open ISA or even their own ISA. Since the restrictions on technology imposed by Trump, China has been determined to be self-reliant in nearly all technology sectors, including aero engines, machinery, robots, and medicine. GREAT VIDEO, thank you.
They still have unfettered access to Britain's ARM IP. But I suppose that depending on the ARM ISA is still ultimately leaving them dependent on an external player. There's only so many Cortexes you can shove on an SoC and only so much you can do to optimise the chip designs as an end manufacturer.
Espressif is the best thing that's happened in recent years. I can't stand buying a prototype that costs more than 20 bucks; with Espressif I can buy a prototype and chips for pocket change.
Whatever the joint venture may be, China is aiming to be self-reliant in microchips, and that is commendable; my belief is China will surely develop its own technology very soon. The Americans will always try to create the impression that it is their technology being used, but it will not be so for very long.
Remember Intel once held patents that belonged to IBM, which has gone through several innovations and relocations but still remains a force throughout the world.
Really interesting video! I think it's interesting to discuss x86's value as we move into the shaky continuation of Moore's law. The industry needs chips to double in power every two years to meet targets, and now the only way to do that is more cores. More cores, and more complex sub-cores, are 100% in RISC territory. I think that over the next two decades, RISC chips will be able to make significantly greater performance gains than x86. Just take a look at the 64-core AMDs; it was obviously a huge engineering challenge to fit 64 cores onto one chip using one system bus. As a side note, there's a slight high-pitched whine in the audio feed that I think a notch (or low-pass) filter could get rid of.
The elephant in the room: chip manufacturing these days is really, really hard and expensive. There are only 3 leading-edge fabs (Intel, Samsung, TSMC), unlike in the 80s-90s when there were 10-15. TSMC's 7nm process with EUV wasn't a single person/company/country effort; it was the effort of all stakeholders: Intel, AMD, Nvidia, Samsung... So, can China make their chips by themselves? No. Even Intel or AMD can't make chips by themselves, because we are at the end of the current manufacturing process, and taking the next step (or the next iterative manufacturing process) will take a lot of smart people working together.
China has 20% of humanity behind it. And rather than bringing China closer, making it a consumer and user of US chip design and manufacturing, the US has decided to push China into being a competitor. China has no choice. There might be cheaper and better chips outside China, but if the US could ban their export at any moment and starve China's industry at a moment's notice, what does cost matter? China has to develop homegrown chip fab capability or die trying. China already has homegrown DUV.
@@nickl5658 You really didn't understand. EUV research began in the 80s; 1995-2002 saw the first iteration, and in 2019 it was put into production (EUV alone probably cost more than 50 billion dollars). China or any other country on a ban list can just copy current designs, nothing more, and I will say current chip designs are pretty good in performance; it's not like the 90s, where in just 5 years performance changed completely. But being on the cutting edge is very difficult, and it's always a combined effort of all stakeholders in the business, not just one company or country.
I reckon arguing about whether China will lead in the semiconductor industry has become a subject of the past. Remember, the Chinese never publicly announce how well their R&D is going; I believe in the next couple of years we might hear some surprising news from the Chinese tech industry.
@@bambur1 If China couldn't produce their own RAM/memory chips, as last year showed, prices would double or even triple because of monopolized supply, like what happened with graphics cards.
@@Aka.Aka. It works for the rest of the world, except China. Unfortunately there is no patent protection in China; it's like state-sponsored tech theft. They encourage companies to copy and steal technologies from the West. Look how many fake handbags, iPhones, and shoes come out of China, yet they are not doing anything to stop it.
@@user-mhgu6om9mj2t You literally don't know what I am talking about. There are fake Louis Vuittons made to the exact same dimensions and from the same materials. iPhones are copied to the same dimensions and same interface, but using an Android skin. ruclips.net/video/dwfikpYT1jQ/видео.html My classmate who went to China to work for a high-tech company literally told me they reverse engineer iPhones and copy them to the exact dimensions.
This was an excellent video detailing the situation of Chinese development and economy, and you didn't bring international politics into it. God bless your boots.
However, the best PC CPU in China is the 3A5000, which is on the MIPS/LoongArch architecture. From benchmarks run on Linux, it matched the performance of first-gen Ryzen, and it can translate x86 or ARM code, but you wouldn't do that.
Your video listed the Zen 3 µP with 4nm technology. If that is fact, then how did a Chinese manufacturer get access to EUV technology? It had to be through ASML. That needs an entire explanation. DUV and EUV at Silicon Valley Group were my job for 20 years, and it was a constant concern and challenge to keep this technology out of China.
It's great because China will gain experience and was provided the tech for Zen processors; China also designed a great ARM processor for the Huawei phone, so this is a sign they understand a variety of processor architectures. In a few years they may be ready to innovate further, or at least they can protect themselves from sanctions.
@@arenzricodexd4409 Yeah, let them conquer the MCU market, then annihilate ARM in embedded SoCs and integrated cores, then in low-power servers. x86 currently has too much software inertia, and RISC-V will need a lot of time, and share in other markets, to be considered an inevitable ISA for mainstream software development.
It's not that hard to design a faster CPU than x86; multiple others did in the 90s: MIPS, SPARC, Alpha, POWER. All have either died out or become small niches. The power of x86 is the 40 years of software designed for it, and RISC-V is starting from scratch.
China's entry into x86 processors is nothing but a stopgap before the adoption of RISC-V chips. And it is not just China; many other countries want to use RISC-V to reduce dependence on Western IP.
I share the same view. x86 will be used to power legacy systems in infrastructure or transportation management. It is frustrating when you have to do a hard migration to a newer system just because you are running out of spare parts.
Ngl, China has some strong RISC-V stuff happening, and they are mass-training people for the industry. I think they are gonna close off their country once they succeed.
x86 is only a translation layer on RISC processors, has been for many years. And x86 is nearing its end of useful life. So in 10 years or so China can have its own RISC equal running custom Linux and be ahead in 20 years. But generally, if the anglo-imperialists think they can go on acting like they do now, life surely has a few surprises in store for them.
A half truth. The x86 line was considered dead by many in the 1980s too. And you could equally well argue that old CISC processors from the 1950s-70s used "a translation layer on RISC". That translation layer is the embedded microcode in these processors, while the central core in them executes the microcode in a very similar way to how a RISC processor executes machine code generated by a compiler. The main difference is that x86 CPUs (from the mid 1990s onwards) perform this translation dynamically, i.e. on the fly during execution. This is one of the factors that gives them a much higher performance than old CISC designs, and even higher than contemporary RISC designs. The downside is that this dynamic translation and optimization consumes some power, so it's not the way to go for most small battery-powered devices.
There's no x86-to-RISC translation. This was a popular myth from the 90s, when the x86 vendors (not only Intel) began implementing RISC-like features in their designs. The blame should probably fall on the former NexGen Corp. for calling their x86 implementation "RISC86", found in the Nx586 CPUs. All CPU architectures, RISC or CISC, translate their higher-level ISA (x86, ARM, etc.) to the actual machine operations specific to the underlying hardware. It is these machine ops that the core of a CPU is actually scheduling, ordering and executing. There's no point in "translating" from CISC to RISC and then again down to machine ops; that's the running confusion among the public. The issue for x86 specifically is that the ISA allows variable-length instructions, which makes parallelizing the decoder very resource-heavy and power-inefficient compared to RISC designs. That's the main difference. On the other hand, x86 code is generally more compact and doesn't require large caches, but advancements in semiconductor manufacturing have mostly negated that advantage.
@@Ivan-pr7ku Not all CPUs "translate". Many execute on the spot, directly via gates and latches; that's what RISC does. CISC often runs an internal microcode program that *interprets* the machine language and performs the actions. It's very much like a BASIC (or script) interpreter: it does not _translate_ a program, it just _executes_ it, i.e. performs the actions. A compiler, on the other hand, _translates_ the code (to another language). In modern x86, microcode is dynamically emitted, buffered, scheduled and reordered as a means to efficiently perform the semantics of a sequence of x86 instructions. Just like normal static microcode, these wide instructions are somewhat similar in structure to (horizontal) RISC instructions. (Static microcode was used not only in the 8086-386, but also in the RISC-like 486 and Pentium for some actually complex instructions. It's used even in today's x86, but mainly for "housekeeping" tasks.)
Well, it needs you sino-chauvinist fascists to come up with something then, instead of blabbering speculative BS. The half-truth narratives you've laid out here clearly indicate you have no idea about x86 tech.
@@herrbonk3635 Very few instructions are microcoded. Most instructions ARE broken down into microops on most x86 implementations. You are a bit confused.
This is interesting. I love the explosion of tech in China; it has been great for people like myself who are electronics and IoT hobbyists! I will definitely seek out some of these chips and give them a try in the future.
RISC and CISC have always existed alongside each other; sometimes one becomes a bit more popular than the other. China is not aiming for the cutting edge, so the trend doesn't matter that much. If they see that x86 fits their initial market, they will do it. Once they have a good x86 chip, making an ARM-equivalent chip is trivial.
@@supertnt318 Trivial. Sure, tell that to Intel. But yeah, I do agree that x86 has a huge market but the main factor in my opinion of why x86 will still be relevant for the foreseeable future is that software is written for x86 and it takes a lot of time and money to rewrite it. When I wrote my original comment I was thinking of developing countries skipping building land line infrastructure all together and embracing 3G/4G, which made those countries much more digital savvy whether it is digital payments/banking, etc. and jumpstart a whole new array of possibilities just because there was such a high penetration of mobile phone users. I do hope China will succeed in the chip industry, as users we all benefit.
@@vitalis Comparing x86 and ARM is like comparing apples to oranges. No expert will tell you ARM is more advanced than x86; it all depends on the application. The low-level software has to be rewritten for every new chip anyway, even for different ARM chips. The tech bottleneck for China is at the silicon level, which is not that much different between an ARM architecture and an x86 architecture. The growth you're seeing from ARM these days is purely market strategy, not technology. If China sees that the market benefits them, they will do their own RISC processor once they no longer have a bottleneck at the silicon level.
@@supertnt318 Steve Jobs mentioned in one of his interviews that at Apple they invest in the early stages of the tech life cycle, not when it is past the top of the curve, because there are diminishing returns in optimisation and it is an eventual downtrend from there. That's why they ditched the floppy disk and CD-ROM much earlier than anyone, which was a massive deal at the time, and it's the reason they are where they are with ARM after a decade of development, refinement, data sets, and having the time to plan, strategise and build a flawless transition of their ecosystem. A decade that would have been lost if they didn't have such a strategy. You seem to take this so personally that you are not getting my points and are solely focusing on defending x86, as if I had anything against it. The world trend is reducing power usage, reducing pollutants, recycling precious minerals and basically making everything more power efficient. That's it. That's the trend I'm talking about, and it so happens ARM is leaps and bounds ahead of x86. We are way past the trend of just competing for pure GHz at whatever cost, because there is another massive trend: the increase in digital consumption and, by extension, power usage, a problem compounded by rising cost per watt. That's a huge factor for server farms and businesses, and something consumers take into account when buying A-rated appliances, further encouraged by government subsidies.
@@vitalis If Jobs were still there, Apple would rather invest in RISC-V, like many companies are doing now. ARM is past the top of the curve, now trapped losing more and more market share in the low-power segment every year and unable to compete with the software inertia of x86, which even Apple's recent move could not entirely resolve (because the M1 is not a standard architecture, and Apple's closed environment is a no-no for many potential users and devs).
ARM isn't even close to x86 in performance. The best ARM chips barely match mid-tier desktop processors. You have to understand that ARM has a fatal flaw: it will always have higher internal latency, because it has to process more data than x86 to get the same result.
ARM has top-tier server chips which are a lot better than "barely matching mid-tier desktops", not to mention that the M1 Mac beats x86 laptops in single-core workloads, if I'm not mistaken.
@@arewealone9969 Close, but not faster. Apple hypes its single-core performance, but Ryzen eats the M1 for breakfast once you start pushing load onto it. It's not really that the instruction set is worse; an 8-core processor is still an 8-core processor. The issue with ARM is that no one has really built a chip as large as an AMD64 chip, not that it can't be done.
The 8080 was designed in 1973 using about 5,000 transistors; the Pentium MMX in 1997, using millions of transistors. There's a *much* wider performance gap between the 8080 (~0.001 MFLOPS) and the Pentium MMX (~250 MFLOPS) than between the latter and CPUs sold today (~8,000 MFLOPS): 250/0.001 = 250,000, while 8000/250 = 32. (I wrote this using a CPU from 2005 :D)
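A quick sanity check of the ratios in the comment above. The MFLOPS numbers are the commenter's rough order-of-magnitude estimates, not measured benchmarks:

```python
# Ballpark peak-throughput figures quoted above (MFLOPS).
# These are rough estimates from the comment, not benchmark results.
mflops_8080 = 0.001        # Intel 8080
mflops_pentium_mmx = 250   # Pentium MMX
mflops_2005_cpu = 8000     # typical mid-2000s desktop CPU

# Relative speedups between the generations.
ratio_8080_to_mmx = mflops_pentium_mmx / mflops_8080
ratio_mmx_to_2005 = mflops_2005_cpu / mflops_pentium_mmx

print(f"8080 -> Pentium MMX: {ratio_8080_to_mmx:,.0f}x")
print(f"Pentium MMX -> 2005 CPU: {ratio_mmx_to_2005:.0f}x")
```

The point stands: the first two decades of x86-era scaling dwarf the decade that followed, at least by this crude metric.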
I'm pretty sure that it's not really as practical for China to get into making their own x86 chips for the purposes of domestic supply chain security... It seems to me the more obvious strategy is to leverage their large population, and plentiful talent in the software space to continue encouraging popular cross platform apps and software to wean their population off of x86 reliance over time and to move to more readily onshored platforms such as Risc V and ARM. I'm not entirely sure this is the best example as they've had clashes with the Chinese government in the past, but Mihoyo and Genshin Impact come to mind, as a readily cross platform application (mostly, I'm still salty about their Linux support), which has absolutely blown up in popularity, but I do think it's a good example of the strategy I'm describing. China absolutely does have plenty of talented people in the software and artistic spaces, so to me it feels natural that they'd conglomerate and work together in a way that lets them produce software which aids their goals (while letting them get American dollars from foreign fans at the same time), but of course, I'm a rando on the internet, and have little knowledge of statecraft.
Really rather too bad that x86 became the standard. It is not a good instruction set, but we are stuck with it. It was designed for microcode, and that only had a short time when it was a good idea. In actuality the IBM 360 was better. LOL.
The Chinese chips cost as much as or more than a superior Intel or AMD variant. x86 CPUs are extremely complex; their blueprints can be measured in the miles of paper required for each chip. Also, China is unable to buy EUV lithography machines from Europe, which all but guarantees they cannot catch up in the near future.
For home use, Chinese innovation strikes again, with tons of X99 mainboards and Xeon CPUs for very little money! ruclips.net/video/9sWOnX-yitE/видео.html
I hope there are more Hygon and Zhaoxin processors ready for use on the market. I wanted to buy some but couldn't find them, since the fear of sanctions legislation is still there.
x86 is at the end of its life. Intel is now adopting RISC-V to survive this decade. It's interesting to see more money coming to this new architecture. China has a leg up there, though.
X86 crushes most other CPUs quite easily. People are overlooking software compatibility in this, they think you can just turn up in a market with a radically different processor and magically have mature compilers, IDEs, experienced developers with decades of experience and huge software libraries.
@@SerBallister It's not software compatibility that made x86 fast though. (And there were p-code systems already in the 1960s, similar to todays "byte code" and "just in time" compilation.) Sure the first 8086/88 was a little slow, but already the 286 killed a lot of competition (even within Intel, such as the iAPX 432). The 486 was as fast as most "RISC" processors of the day and the Pentium had a new fantastic floating point unit that had learned *a lot* from the competition in the area, like Cyrix and others that made arithmetics coprocessors for x86 at the time (more or less x87 compatible).
@@pinkipromise Here comes the genius. Fabrication for Shakti is done in Oregon on order from IIT Madras. That means it's a commercial paid work order. Even otherwise, my country can do that indigenously; can the land-grabbing wannabe superpowers say the same?
If the Chinese are making it... Then it does matter...
How are they getting american tech you say? The patent office 😏
It's over, the domino effect has begun.
...which is too bad really, since it has truly been an old, outdated architecture for a long time already...
Apple has switched architectures several times, and covered its back with emulation.
Rosetta 2 is actively showcasing why it doesn't matter. A great emulation layer easily solves the issue.
Surprised you didn't talk about the benchmarks of Hygon over at AnandTech or Level1Tech.
what's your minimum specifications?
Love your work. I did read through your amazing piece on the Hygon benchmarks and enjoyed it. But in the end, I couldn't fit it into the video without butchering it. I'll put it in the description, though.
Wow hello Dr. Cutress :D crossover episode when?
@@Asianometry that's fair, always hard to boil down. You got the Corp stuff in which is always a key point!
@@Asianometry Something else which I think might be missing from the video: patents for x86 are starting to expire in the coming years. That means if China or some other country can make competitive chips, maybe they can start to sell them in the West too. Or maybe we'll see more x86 extensions in ARM or RISC-V chips for compatibility.
One more time thanks for the subtitles on every video, this channel is much easier to follow now
There aren't "various reasons" that Intel licensed its architecture to AMD, there was just one. When IBM designed the IBM PC, they told Intel that they wouldn't design their chip in unless there was a second source. IBM knew better than to get locked in to a single vendor on the hardware side and they had the clout to force Intel to agree. What they didn't understand was how important the operating system and software was going to be, and let that strategic part walk right out the front door. Later on they tried to take things back with OS/2, but by then the market had decided.
That was the reason forty years ago, not so much now. Those patents have long since expired and IBM does not have any ability to compel Intel to do anything anymore. The primary reason Intel and AMD have cross-licensing today is a) because they both hold patents that could restrict the other from making microprocessors and b) Intel has had to hedge against anti-trust exposure. I suspect strongly that when AMD was stumbling back in 2010-ish, that Intel were quite concerned that AMD would be acquired by a company not interested in Intel's patents (and that would extort them to let them continue producing processors.) At the time, AMD held the patent on the x86-64 architecture.
@@JasonDoege Yeah, Intel is responsible for x86, while AMD is responsible for x64. They kinda needed to work together to at least *some* degree, plus there were potential antitrust issues.
That still counts as one of various reasons, so he is correct; he just didn't give the background, which is not the point of the video.
Imagine thinking the only thing your end users will ever see isn’t that important.
@@JasonDoege lol, AMD, Intel, IBM are all owned by the same people, as is practically every other publicly owned company, namely Vanguard, Blackrock, and State Street, among many smaller owners. It's all the same people's property (like mine).
It matters: China is aiming to be self-reliant in chips, for national security and to shift its economy from manufacturing only to services and high tech. Being reliant on foreign chips is China's big disadvantage.
China can have potato chips only
@@jontopham2742 For government purposes they are useful. The important information is always on boring potato PCs.
ATM, what China needs most is the photolithography equipment needed to manufacture leading-edge chips. Developing this technology will take most of a decade. It also hurts that scams permeate the Chinese capital market -- look up Wuhan Hongxin for an example. OTOH, rocket science seems to be less of a problem. 🤷♂️
@@icollectstories5702 My estimation is that they will try to conquer Taiwan between 2030 and 2048, so they have time to develop leading-edge chips.
@@profounddamas yeah a lot of people don't understand that Samsung and TSMC started 7nm with DUV... And yes - Chinese companies have DUV
You could make a video about Soviet / Russian Elbrus class of processors. If you'll dig into their history you might be surprised what you will find.
They should be investing in and developing the Elbrus architecture for their processors.
Actually, that's why they are preparing experts in the Sirius education center; they're gonna develop Russian chips in the next decade. Hopefully it's gonna be a good competitor.
With the Russia-China distrust, and now Russia supporting an independent Taiwan besides cutting off other military tech from China, that seems unlikely.
@@permafrosty What Russia lacks and need more is fabrication and what China should look to, is a non-western architecture, which Elbrus provides. It would be better for them both to join efforts to develop these.
@@PlanetFrosty I thought Sino-Russian cooperation was at all-time highs.
Another interesting thing to point out about instruction sets, some CPU lineups that fall into the same family can still have instructions not seen in other chips of the same family.
This reminds me of the Pentium/Celeron debacle, where Celerons were way cheaper than Pentiums, but during the manufacturing process all Celeron chips were Pentium chips with some features physically disabled, and Pentiums were like twice as expensive.
@@markdsm-5157 Idk if it's true, but I remember hearing that a lot of the failures from the higher-tier chips are sometimes converted into lower-tier chips if they still meet those specs.
Does seem like having a V8 with four broken cylinders to get a V4, but I guess with chips the extra weight of dead silicon doesn't really make a difference.
This was done with the Cell CPU in the PS3: chips were fabricated with eight cores, but only seven were used. This allowed for one defective core, increasing the yield.
picture.iczhiku.com/resource/ieee/sYIlaPwJufluTnnb.pdf
@@ragnarok7976 That is the later approach. Because the majority of old chips were single-core, internal structures within the cores were the subject of marketing/quality-control disabling. Now they mostly turn off additional cores.
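The yield benefit of the spare-core trick discussed above can be sketched with a simple binomial model. This assumes, purely for illustration, that per-core defects are independent with a fixed probability; real defects cluster, so real numbers differ:

```python
from math import comb

def yield_with_spares(cores: int, usable: int, p_defect: float) -> float:
    """Probability that at least `usable` of `cores` cores are defect-free,
    assuming independent per-core defects with probability p_defect."""
    p_good = 1.0 - p_defect
    return sum(
        comb(cores, k) * p_good**k * p_defect**(cores - k)
        for k in range(usable, cores + 1)
    )

# Illustrative numbers only: 10% per-core defect rate, Cell-style 8 cores.
perfect = yield_with_spares(8, 8, 0.10)    # all 8 cores must work
one_spare = yield_with_spares(8, 7, 0.10)  # 7 of 8 is enough (PS3 Cell)
print(f"all-8 yield: {perfect:.1%}, 7-of-8 yield: {one_spare:.1%}")
```

Under these toy numbers, tolerating one dead core nearly doubles the share of sellable dies, which is exactly why binning exists.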
All these chips may have open instruction sets, but closed design, which can lead to companies and governments building backdoors or hidden vulnerabilities into them. I want to see not only the instruction set, but also the chip design getting open sourced, that way we can have many peering eyes from the public to make sure nothing shady is going on.
Risc v it is
The only way you could verify that they aren’t malicious is by destructively scanning random samples in the supply chain. It’s not worth the effort, IMO. Especially when China has shown clear intent by building hardware back doors in devices previously (Cisco networking equipment and of course Huawei come to mind).
How did the other 8-, 16-, 32-, and 64-bit chip designs get made?
@@PutineluAlin It's the same as going from solving 2+2=4 to solving the 3x+1 problem.
@@ArabianKnight63 Risc-v processors I've seen gaining traction are only 5% open hardware, most is proprietary for things like video encoding.
Slow and steady. Probably, with this x86 processor technology acquired, China will be able to progress to more advanced processor and chip technologies of its own and become self-sufficient.
Without R&D, China will never be a leader in chip manufacturing; they always copy and steal from the US.
@@kentofficial62 I believe that may be true 20 years ago... But many Chinese companies have very interesting R&D efforts these days.
@@kentofficial62 Are you living under a rock? Last time I checked, 1.4 trillion dollars is enough R&D for semiconductors.
@@sophisticatedthumb5364 Search for semiconductor scams in China so that you are also aware of the corruption happening there.
@@chd0043 They still can't mass-produce high-end chips due to ASML lithography technology being banned by the US.
Thanks!
Being 5 or 6 years away from top US chip makers is not bad at all tbh!
If AMD hadn't had their Ryzen moment then the Chinese manufacturers would be the ones to do the job
@@TuskForce What quality products do you get, then? In my experience, all the global exports from China are cheap junk.
@@kaliban4758 no China is best.
There are many products China makes that I can't find anywhere else in the world.
62k subscribers! Nicely growing. Nothing beats organic growth.
Cancer grows organically. Viruses. Nutcase religions. Yeah great stuff all around, Einstein.
@@MrSvenovitch He never complimented things that grow organically, he only complimented organic growth itself as the best growth.
Reading comprehension is seriously lacking.
Cheap or expensive, whatever: if it has good performance, there's no need to care whether it's Chinese or American, cheap or expensive.
Just buy.
If China can give better or the same performance at a cheaper price, so much the better.
Yup - cost and the ability to port programs are the two main things
China should focus on open source architecture such as RISC-V. More than that, they should double their efforts on lithography as well. I hear their SMIC is now 14nm-capable. That'd be good news. Most consumer-grade products don't need 7nm or better chips anyway.
Well, the issue is that China's tech development always needs a "benchmark", aka something to copy from. There are no real advanced processors based on RISC-V yet. I am sure they'd be all over it once someone else made a very fast RISC-V processor on an advanced node.
lol, tell that to Intel moving to 3nm for consumer-grade products. All progress in consumer-grade products is skyrocketing because of smaller fab processes; what do you mean there's no need for 7nm or better chips? Maybe China doesn't need them, but then again nothing is coming out of there in that arena anyway.
@@siggydj Bro, you don't even need a 14nm processor in your toaster. Yes, the industry is moving toward smaller nodes, but do you really want a $300 toaster?
We are. RISC-V is getting heavy funding and commercial interest, a lot more than the x86 architecture mentioned here. The tech here will mainly be used as a stopgap before the industry fully transitions to RISC-V, from chipset to software, and to power legacy systems as they are slowly phased out.
@@vashsun4908 soooo you're saying china plans to compete with intel and amd computer processors by making toaster tech? K 🤣
You do realize that the video is specifically about computer CPUs, right, and so is this discussion? Consumer products don't just mean toaster chips, lol. Or are you saying we're going to see RISC-V-loaded toasters with 14nm chips soon too, something the OP wants China to make progress in and what I was replying to? Toaster chips don't even reach 28nm lithography, let alone 14nm; this is computer and cell phone CPU chips we're talking about here.
Chinese engineers (and companies) are very good at iterative design. Now that they have a baseline x86 chip design, In ten years they will be challenging AMD and Intel.
@James Moss I should have prefaced with "on the low end." They won't be able to make the latest and greatest gaming CPUs. But, they will push Intel and AMD out of the low end CPU market. I have been in FABs in China and seen what they have done in Solar and Display.
LOLOLOLOL
This is, quite honestly, one of the stupidest comments I've EVER read, not just on RUclips, but the entire internet. And I've been on here since before the WWW even existed.
@@bradallen8909 Indeed, some people just don't understand how hard it is for China to truly "catch up" to the West, even if it's in their own domestic market; they make a lot of "claims" in the technology space, all very much unverified, and it's all a far cry from how Chinese media portray it.
That's a polite way of saying that China (like East Asians in general) is extremely good at improving upon tech innovations and IP stolen from the West/Europe. I agree. Frankly, I'm looking forward to the shift of hegemons. I welcome our new Chinese overlords, on the basis that they don't seem interested in dismantling our culture or civilisation. Nothing that happens will change the per capita inventiveness of each pole's respective natives, but when you have over a billion people under a hyperorganised central government, per capita metrics just don't matter.
Excellent video Asianometry. Great summary on what happened and how we got here.
Thank you for the information; now I see how business and profits flow on both sides.
I have a feeling I am one of the few here who has actually used a Hygon-based server; I had to put up with all sorts of wacky I/O characteristics... Thanks for the explanation of the joint-venture structure with AMD.
I'd love to get one to be honest.
It would have been cool if you explained your experience more with what you used it for in basic words
The 6502 was the first desktop microprocessor I ever used. It was in the Apple II and the Commodore VIC-20, both of which I had experience with before getting an old 80286 in the shop in the early 1990s. The 486DX was the go-to machine at that time.
I wonder how their backdoors compare to AMD's or Intel's.
Given that it is Russia, China, and other Red-bloc communist states primarily exploiting the backdoors?
the best chip always has the best backdoors
@@joseywales7463 Just a heads up, Russia stopped being communist 30 years ago. You seem to not have gotten the memo.
@@hackhenk Right... It is what Putin likes to call a Controlled Democracy (really an Autocracy).
> How did they acquire x86 technologies in the first place.
They've been the world's #1 chip maker for 20 years. This is the least surprising thing.
x86 is the perfect all-in-one architecture for leveraging most ASIC-based architectures without developing something completely new. So yes, I believe it does matter for now, but eventually we will evolve past it, especially now that custom chips are becoming a method of protecting IP/trade secrets and creating walled gardens.
Arm is a perfect example
Could depend on whether China brings out an OS that gains popularity within China, then they could eventually replace the hardware underneath that OS
They (The Chinese) may be better off with RISC-V
It should remove the IP problem for them to use an open architecture. The issue here, though, is foundries. For that you'll need lithography, and that is basically ASML. Nor does it help to invest billions in companies that do not actually do things but collect government funds.
Easier said than done. China is good at copying and stealing technologies; coming up with their own original design is a huge challenge, since creativity and critical thought aren't taught in their educational system.
@@iMadrid11 Couldn't agree more. "Sticking your neck out" and bringing innovation or something weird or different or plainly "fun" is tantamount to insurrection, if you stifle free thought you stifle innovation, creativity and plain inventiveness. Look at the Jack Mas paraded and humbled as they are being forced out of their companies and sent to "reeducation". Japanese tech sector blossomed in the 70s and 80s thanks to US money and relative freedoms.
@@iMadrid11 The real challenge is not coming up with new design, but market adoption. Using existing standard and specification can ensure faster adoption of their products and replace the competitors.
Great idea, let's blow it up
x86 on older processes definitely has a market, albeit in the same way the Z80, 68000 and other legacy chips still have their uses. It is not like it is DEC Alpha or PA-RISC and therefore at a dead end. Hopefully they can do an AMD and evolve the architecture rather than copy it. This could suit a new era of homegrown products for the Chinese market.
Don't forget, AMD checks those designs before fabrication. So ultimately, they can get some new design features from China too. It is a win-win cooperation.
It matters to the Chinese.
Think about it this way. China had practically no chip design prior to Trump’s sanctions and Taiwan’s limit of export. After just a few years, they are self-sufficient in x86 chips. By 2025, they aim to be fully self-reliant. This is pretty important news considering they are the world’s top semiconductor market.
Not true. Huawei had their Kirin lineup of processors (their own design) that was comparable to Qualcomm Snapdragon and MediaTek. But they both used TSMC to build their chips, and Huawei had to stop developing Kirin chips because of sanctions.
@@subipan4593 sanctions by who(m) ?
i hope they choke on those chips
@@bartkoens5246 The USA
@@SanFranciscoFatboy Not much chance of choking in the current market :D
I don't know if this matters from your perspective, but if you have ever heard your PC or Mac described as a "64-bit system", you should know that the x86 architecture was originally 32-bit. AMD then developed an extension of that 32-bit architecture to 64 bits, called x86_64, which almost all computer systems, Intel or AMD, now use and support, and which has become the standard. Because of this, Intel has to license the 64-bit extension of its own ISA back from AMD.
I'd like to think the reason Intel and AMD can have a (sort of) deadlocked competition, even though AMD has used Intel's IP for so long, is that both sit in a tipping balance created by their circularly dependent IP: it is a position where, if you pull the rug out from under your rival, your own rug gets pulled at the same time.
Note : IP = Intellectual Property
It was originally 16-bit, or at least the original 8086 was.
@@Noise-Bomb Ah right, the x86 name comes from the original 8086's name, I think.
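As an aside to the 32-bit/64-bit discussion above, you can check whether the interpreter you're running is a 32-bit or 64-bit build, which on an x86 machine roughly distinguishes x86 from x86-64 code. A minimal sketch using Python's standard library:

```python
import platform
import struct

# platform.architecture() reports the bitness of the running interpreter;
# struct.calcsize("P") gives the native pointer size in bytes
# (4 on a 32-bit build, 8 on a 64-bit build).
bits, _linkage = platform.architecture()
pointer_bytes = struct.calcsize("P")

print(f"interpreter: {bits}, pointer size: {pointer_bytes * 8} bits")
```

Note that a 64-bit OS can still run a 32-bit interpreter, which is the same backward compatibility the thread is describing: x86-64 machines happily run 32-bit x86 binaries.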
What is their status on RISC processors?
You should cover Russian Elbrus processors too
Mostly running in servers, telco equipment (routers, switches, etc.) and maybe military equipment. Not widely available as consumer products yet.
Nice video shot, thanks for sharing with us, well done :)
I think RISC-V is the only right way. RISC-V will be number one in the market in the future. We will see more and more companies moving from ARM and x86 to RISC-V.
Why?
Free and open.
@@tomlxyz EU and other major powers, even India are investing in RISC-V for fear of being blocked by the USA.
@@elgs1980 and the BSD license giving opportunity to close in-house designs and ISA extensions, just like ARM Holding did in the past.
@@elgs1980 Only the instruction set. The chip blueprints are not open or free. Either you design it or buy from chip designers.
Great video! Thanks!
The end-product use is where the tech is; the more complexity we integrate into the chip, the harder it is going to be. We are in such a state of flux that even Intel needs a couple of years to see where things are going.
It is worth noting that patents covering much of the basic x86 ISA have expired. Anyone can, in theory, legally make CPUs supporting all the instructions that existed in 2003.
x86_64 and lots of the SIMD and other acceleration instructions would however still require licensing.
It is so nice to see someone at least trying to disrupt the monopoly in the semiconductor industry. Very good for consumers in the end.
It would have been true if it came from any other country. But this is coming from China, and I don't know how I feel when a dictatorial regime obsessed with gaining power gets its hands on something like this. You're Indian, so you know what I am talking about.
IMO x86 needs to be killed off by a better architecture, not just copied.
@@icollectstories5702 ARM is in the process of doing just that. RISC-V may be the long-term solution (though it'll meet resistance from the Western powerbrokers) but ARM is ready to do battle with the American duopoly for the laptop, desktop, server, and supercomputer space over the next ten years. At least it's some competition for the US, which has had things entirely their own way for far too long (probably the last time anyone else was on top was just after WW2 when the UK finally passed the baton after reaching most of the computational firsts with everything from the Difference Engine to the Ferranti Mk1, Colossus, and the Manchester Baby; Germany potentially could've been on top for a while if they'd taken that lone wolf Conrad whatshisname more seriously). At any rate, ARM's big advantage is that nobody has to worry about Britain building and backdooring the chips itself -- you can choose your own backdoor based on country of origin.
@@sheilaburrowes9081 Well, technically, ARM has been doing that since last century. How much longer do I have to wait?😴
US dominance in technical fields is mostly due to having the largest economy post-WWII. An example is when the UK killed its space program because the US offered them lower prices.
You can compete with the US for a while, especially if you form international consortia, but in the long term, you have to bring American money to the table.
Or you can specialize in niche markets the US doesn't care about like Swedish fighter jets.
Making a chip of the current generation and engineering the next generation of chip for manufacture are two different things, each worth billions. Just because you can make this gen doesn't mean you can make the next gen.
To all the comments: I'm sure the reason for RISC-V moving to Switzerland is to avoid the political conflict between the States and China. Besides, it's not like there hasn't been RISC-V development in the latter.
AFAIR it was mostly a reaction to the Trump government's interference in ARM's and Huawei's business. RISC-V wanted to be independent from any country, so they chose the most neutral one.
And yes, both the US and the PRC are developing on RISC-V, since this is a clean ISA tailored for specialized microarchs; it's royalty-free and the BSD license allows proprietary designs.
I’m not concerned about tech transfer. But concern over back door entries by governments does worry me (whether that’s USA, Uk, Russia or China)
funny that an ISA that began life as the arithmetic engine for a desktop calculator has grown organically to encompass almost the entire computing world.
A loose connection there! - That was the Intel 4004 - a 4 bit processor. Other than being created by Intel, it bears little resemblance to any x86 ISA. But yes it is remarkable that the inventor of the first successful microprocessor is still basically the leader of the same industry over 50 years later.
it sure as hell wasn't organic. It was one hack after another hack after another hack.
@@socialistsuccubus822 that what usually is meant by “organic”
I remember those Hewlett Packard calculators that were programmable with little strips of magnetic tape - for the early 1970s it was pretty rad. Our college actually had a special conference to decide if calculators would be allowed in finals, can you imagine?
@@stevengill1736 The 4004 was used in a most basic desk calculator - not programmable and not even scientific - just basic functions +-x/
I am surprised the author didn’t mention “Loongson” at all
Loongson is modified MIPS
Read the title first
I'm assuming most of the China sales are to OEMs that manufacture their systems in China and export many of them to other parts of the world, including the US. Taiwan is certainly also a very large originator of revenue for Intel, and it's not because of processors that end up being run in Taiwan.
So Zhaoxin is producing Cyrix (VIA) processors?
A very old microprocessor, the 8-bit 2 MHz 8080, works miracles when well used, like on Space Shuttle Columbia on 12 April 1981.
THERE ISN'T ANY PROBLEM making x86 processors; it won't change anything.
Here's an example: a simple Pentium 4 with Puppy Linux can be better than a brand new Win 11 computer.
Eager users want faster computers for doing very, very simple jobs (like viewing pictures, typing texts, or browsing the net).
I compare it to a space rocket driving on a road with a 55 mph speed limit.
When the software is too bloated, it can slow down even a supercomputer.
An excellent essay, thanks for the research!
How recent is this? In how many years will we have Chinese x86 in the real computers in the open market?
I doubt it will come soon, and who knows what the landscape will look like in another 5-10 years; Apple's ARM-based CPUs are absolute monsters.
What happened to the Loongson MIPS chips? I had a Loongson laptop (netbook) 10 years ago that was pretty functional, cheap, and had super long battery life... It seems the gap has gotten bigger, not smaller, over the years; now, with Apple's and Microsoft's ARM adventures, it seems completely obsolete...
I think with Apple and Windows aiming for ARM, and Intel still struggling with its 7 nm line, they are trying to squeeze the last bit of revenue out of the x86 platform before the portable market moves to ARM (or M1 in Apple's case). The desktop will soon follow. Most software will be recompiled for ARM, so most applications will be available to customers. Specific software will run in an emulator or an on-the-fly translator.
M1 is a dead end because of Apple's business model. Apple doesn't make technologies that others can then use and extend upon; instead it has a walled garden to cross-promote its closed ecosystem.
@@hamobu Kind of. But if Apple can get there, so can ARM. It's just a question of whether a British company can find the kind of R&D budget needed. But with its long queue of international buyers, that might not be a problem.
The age-old contest between CISC- and RISC-based CPUs. ARM has its niche. Thank you for the video; it sheds some light on recent developments.
Well, x86 is in a sense both. The encoding is CISC, but not extremely so, because, unlike the 68000 and many others, it was based on the simplistic 8-bit Datapoint 2200 (via the Intel 8008 & 8080), not the more capable minicomputers or "mainframes" of the era. The internal execution in the 486 and the original Pentium (and contemporary competitors' x86 CPUs) was very similar to RISC designs for simple arithmetic instructions, although more complex and involving microcode for larger instructions. Even many old and (fully) microcoded CISC processors had an execution core similar to a RISC processor. That's where the RISC idea came from back in 1975: let a high-level language compiler generate code for that simple core directly.
Just curious are you a native chinese speaker? I saw from some of your videos that you lived in Taiwan, but your mandarin pronunciations are unique to say the least..
In some of his videos the pronunciation is better. I think it has to do with "English mode" vs. "Chinese mode", especially when reading a script.
I speak both Mandarin Chinese and British English at a native level. What I can say is when I'm interposing a word of another language into a sentence, I will most likely follow the inflection conventions of the sentence language.
Some examples I can think of include:
1. When I include a Chinese name (often for a place) in an English context, I often drop the tone;
2. When I'm reading out a string of Latin letters (over the phone for example) in a Chinese context, I often assign each letter a fixed tone in Mandarin regardless of its place in the sequence.
@@scheimong Agreed, but the uploader doesn't always drop the tone. E.g: ruclips.net/video/vyVyphJebfw/видео.html
I was just wondering if it was due to a dialect or something.
super old news bruhh, but appreciate the completeness to your subscribers!
It matters, even if x86_64 is going to become obsolete in some years as RISC-V rapidly rises.
It's still good for China to master existing technical solutions as well as it can.
The x86_64 architecture does have crappy old fundamentals, but in order for it to compete it has to be filled with leading-edge stuff. So it pays to be there to learn that stuff.
x86 is crappy but it is the most common, and it is "fact on the ground". RISC-V may be the future, but not now. However, China's whole-country approach is very promising. They can instruct universities to teach RISC-V. Systems sold to ASEAN, Africa, South America, by China, can benefit from RISC-V.
@@oceanwave4502 Why is x86 crappy? I'm not saying it isn't, but what aspects of the instruction set and programming model do you not like?
Obsoleted in maybe some decades... Not just some years... It takes an enormous amount of product support and supply to make a certain bit of sand obsolete compared to another bit of sand...
Edit: sand is a metaphor for the small material cost in a product and should not be taken literally (duh). Look to sbc for small projects and you'll find very quickly that a variety of different processors prove the idea of obsolete has a very steep requirement from a variety of sources before being obsolete in the consumer world.
@@Bajicoy Bit of sand? There are several dozens of materials and chemicals in an IC, not just silicon. You cannot find all of them in sand.
Bullshit, ARM is decades behind x86.
ARM has a massive issue with latency; you can't get it to respond as fast as x86. It's more likely a hybrid architecture will be the future, using ARM for low-power tasks and x86 for its speed.
Providing the X86 chip design file and firmware to China would be illegal under the US State Department’s ITAR export law.
The wikipedia article you linked to clearly states that the chips you are talking about are AMD64 not "x86"... even the 32bit intel designs are NOT "x86" they are x88, the command set changed.
GlobalFoundries is owned by the UAE's sovereign wealth fund, and it might be worth adding that the Chinese JV also manufactured a Radeon Vega 64-based GPU.
For critical areas, China uses its own designed and manufactured chips, such as the Loongson 3. In 2021, Loongson Technology released its own ISA (Instruction Set Architecture), LoongArch, which makes the Loongson 3 5000 series a purely Chinese chip.
It is likely that the Chinese will design and develop their own chips based on an open ISA or even their own ISA. Since the technology restrictions imposed by Trump, China has been determined to become self-reliant in nearly all technology sectors, including aero engines, machinery, robots and medicine. GREAT VIDEO, thank you.
China developed Corona and distributed to the world.
@@kapilbhardwaj4680 no legitimate proof. Do you have one? Then shut up
@@rodrozil6544 cope
@@rodrozil6544 Yes we do. It's called Google. Now you shut up.
They still have unfettered access to Britain's ARM IP. But I suppose that depending on the ARM ISA is still ultimately leaving them dependent on an external player. There's only so many Cortexes you can shove on an SoC and only so much you can do to optimise the chip designs as an end manufacturer.
Why not make RISC-V chips ?
ARM != RISC-V, but yeah, it's an option. Espressif changed to it also because of the Nvidia thing.
I love Espressif because of this
Espressif is the best thing to happen in recent years. I can't stand buying a prototype that costs more than 20 bucks; with Espressif I can buy a prototype and chips for pocket change.
Love the videos mate.
Whatever the joint venture may be, China is aiming to be self-reliant in microchips, and that is commendable; my belief is China will surely develop its own technology very soon.
The Americans will always try to create the impression that it is their technology being used, but it will not be so for very long.
They’ve been trying for 20 years
Those days are over. It's British technology they're building on at the moment. (For the most part, anyway.)
I wish they could build chips for the PGA sockets with reverse compatibility.
It's massive. Russia has already been making MIPS based networking hardware to avoid cisco, and light desktop use.
Who does Russia use to fab their designs? I am not aware of Russia having its own fabs, even low-end (45 nm and above) ones. Just curious.
@@tweedy4sg tsmc....most likely (it handles over 35% of the fab business...)
thank you very much for speaking so understandably in your great videos!
VIA produced pretty good x64 Intel compatible CPU Isaiah (VIA Nano). But I haven't seen it in real devices...
Remember intel once had patents that belonged to IBM which has gone through several innovations and locations but still remains a force throughout the world.
Really interesting video! It's interesting to discuss x86's value as we move into the shaky continuation of Moore's law. The industry needs chips to double in power every two years to meet targets, and now the only way to do that is more cores. More cores, and more complex sub-cores, are 100% in RISC territory. I think that over the next two decades, RISC chips will be able to make significantly greater performance gains than x86. Just take a look at the 64-core AMDs: it was obviously a huge engineering challenge to fit 64 cores onto one chip using one system bus.
As a side note though: there's a slight high-pitched whine in the audio feed that I think a notch filter could get rid of.
The elephant in the room:
Chip manufacturing these days is really, really hard and expensive. There are only 3 leading-edge fabs (Intel, Samsung, TSMC), unlike the 80s-90s when there were 10-15.
TSMC's 7nm process with EUV wasn't a single person/company/country effort; it was an effort of all the stakeholders: Intel, AMD, Nvidia, Samsung...
So, can China make its chips by itself?
No. Even Intel or AMD can't make chips by themselves, because we are at the end of the current manufacturing process, and taking the next step (or the next iterative manufacturing process) will take a lot of smart people working together.
China has 20% of humanity behind it. And rather than bringing China closer as a consumer and user of US chip design and manufacturing, the US has decided to push China into being a competitor. China has no choice. There might be cheaper and better chips outside China, but if the US could ban their export at any moment and starve China's industry at a moment's notice, what does cost matter? China has to develop homegrown chip fab capability or die trying.
China already has home grown DUV.
@@nickl5658 You really didn't understand. EUV research started in the 80s; 1995-2002 saw the first iteration, and in 2019 it was put into production (EUV alone probably cost more than 50 billion dollars).
China, or any other country on a ban list, can just copy current designs, nothing more. And I will say current chip designs are pretty good in performance; it's not like the 90s, when in just 5 years you'd get completely different performance.
But being on the cutting edge is very difficult, and it's always a combined effort of all the stakeholders in the business, not just one company or country.
I reckon arguing about whether China will lead the semiconductor industry has become a moot point. Remember, the Chinese never publicly announce how well their R&D is doing. I believe in the next couple of years we might hear some surprising news from the Chinese tech industry.
Absolutely, on what tech they have stolen from the U.S. and how they've modified it to fit their needs. Why do R&D when it's there for the stealing?
@@hralf6041 Precisely. Look at 5G tech. They stole from the US but they have more patents on it. Strange.
@@zafir7007 They stole a huge portion of it from Canada. That op is actually the whole reason China is even a player in 5G today.
@@liesdamnlies3372 China has gone 6G now.
Perhaps they stole it from.......... (fill in your favorite western country)
@@zafir7007 👍
Nice video. ZhaoXin means MegaCore
Its good to see more competition. Drives prices lower and lower.
Clone chips are a huge problem.
@The Professional 1 Terrabyte drive reads 1 terra only has
@@bambur1 If China hadn't produced its own RAM/memory chips last year, prices would have doubled or even tripled because of monopolized supply, like what happened with graphics cards
love the channel, can you do a video about the 1st and 2nd opium wars?
I hope China makes good processors. More competition means cheaper computers for us consumers!
But it will drive engineering salaries down. Then no one will want to be an engineer in the US, and then innovation will slow down again.
@@mlai2546 Wow I thought "THE FREE MARKET SOLVES ALL PROBLEMS" where's your capitalism now? Competition breeds innovation blablbla
@@Aka.Aka. It works for the rest of the world except China. Unfortunately there is no patent protection in China. It's like state-sponsored tech theft. They encourage companies to copy and steal technologies from the West. Look how many fake handbags, iPhones, and shoes come out of China. Yet they are not doing anything to stop it.
@@mlai2546 you are comparing apples and oranges. Cell phones and cars all look alike and have similar features so maybe get the lawsuits out lol.
@@user-mhgu6om9mj2t you literally don't know what I am talking about. There are fake Louis Vuitton made to the exact same dimensions and same material. Iphones are copied to the same dimension and same interface but using Android skin. ruclips.net/video/dwfikpYT1jQ/видео.html
My classmate who went to China to work for a high tech company literally told me they reverse engineer iphones and copy it to the exact dimension.
Can I buy one? How much is their best one? I bet it's better than a Phenom II X6.
This was an excellent video detailing a situation about Chinese development and economy, and you didn't bring international politics into it. God bless your boots.
we cannot escape politics brother. You can only imagine what happens when the CCP gets their hands on this.
Their political system is one to absolutely avoid so yea, really can't ignore it
However, the best PC CPU in China is the 3A5000, which is on the MIPS/LoongArch architecture. From benchmarks run on Linux, it matched the performance of first-gen Ryzen, and it could translate x86 or ARM, but you wouldn't do that.
Your video listed the Zen 3 µP with 4nm technology. If that is fact, then how did a Chinese manufacturer get access to EUV technology? It had to be through ASML. That needs an entire explanation. DUV and EUV with Silicon Valley Group was my job for 20 years, and it was a constant concern and challenge to keep this technology out of China.
It's great because China will gain experience. Having been provided the tech for Zen processors, and having designed a great ARM processor for the Huawei phone, this is a sign they understand a variety of processor architectures. In a few years they may be ready to innovate further, or at least they can protect themselves from sanctions.
I can't wait for the new RISC-V processors to be competition for x86. I hope they catch up soon.
Soon? Probably not going to happen for another decade.
@@arenzricodexd4409 yeah, let them conquer the MCU market, then annihilate ARM in embedded SoCs and integrated-core markets, then low-power servers. x86 currently has too much software inertia, and RISC-V will need a lot of time and market share elsewhere to be considered an inevitable ISA for mainstream software development.
It's not that hard to design a faster CPU than x86; multiple others did in the 90s: MIPS, SPARC, Alpha, POWER. All have either died out or become a small niche. The power of x86 lies in 40 years of software designed for it, and RISC-V is starting from scratch.
Nice video. And, surprisingly I don't see many hateful comments here.
You should look more carefully.
OK, they're concerned about American tech... so they create their own processor, then install Windows on its computers... genius!
China's entry to the x86 processors is nothing but a stop gap for adoption of RISC-V chips. It is not just China, but many other countries want to use RISC-V to reduce dependence from Western IP's.
I share the same view. x86 will be used to power legacy systems in infrastructure or transportation management. It is frustrating when you have to do a hard migration to a newer system just because you are running out of spare parts.
RISC-V is the future because it'll break the damn monopoly
True, but RISC-V needs some work, e.g. virtualization instructions and real-time support.
Ngl, China has some strong RISC-V stuff happening, and they are mass-training people for the industry. I think they are gonna close off their country once they succeed
@@93hothead well, they will just follow what Apple is trying to do on its customers.
interesting video - thanks!
x86 is only a translation layer on RISC processors, has been for many years. And x86 is nearing its end of useful life. So in 10 years or so China can have its own RISC equal running custom Linux and be ahead in 20 years.
But generally, if the anglo-imperialists think they can go on acting like they do now, life surely has a few surprises in store for them.
A half-truth. The x86 line was considered dead by many in the 1980s too. And you could equally well argue that old CISC processors from the 1950s-70s used "a translation layer on RISC". That translation layer is the embedded microcode in these processors, while the central core in them executes the microcode very similarly to how a RISC processor executes machine code generated by a compiler.
The main difference is that x86 CPUs (from the mid 1990s and onwards) performs this translation dynamically, i.e. on the fly during execution. This is one of the factors that gives them a much higher performance than old CISC designs, and even higher than contemporary RISC designs. The downside is that this dynamic translation and optimization consumes some power, so not the way to go for most small battery powered devices.
There's no x86-to-RISC translation. This was a popular myth from the 90s, when the x86 vendors (not only Intel) began implementing RISC-like features in their designs. The blame should probably fall on the former NexGen Corp. for calling their x86 implementation "RISC86", found in the Nx586 CPUs. All CPU architectures, RISC or CISC, translate their higher-level ISA (x86, ARM, etc.) to the actual machine operations specific to the underlying hardware. It is these machine ops that the core of a CPU is actually scheduling, ordering and executing. There's no point in "translating" from CISC to RISC and then again down to machine ops; that's the running confusion among the public. The issue for x86 specifically is that the ISA defines instructions of variable length, which makes decoding parallelization very resource-heavy and power-inefficient compared to RISC designs. That's the main difference. On the other hand, x86 code is generally more compact and doesn't require large caches, but advancements in semiconductor manufacturing have mostly negated that advantage.
@@Ivan-pr7ku Not all CPUs "translate". Many execute on the spot, directly via gates and latches. That's what RISC does. CISC often runs an internal microcode program that *interprets* the machine language and performs the actions. It's very much like a BASIC (or script) interpreter: it does not _translate_ a program, it just _executes_ it, i.e. performs the actions. A compiler, on the other hand, _translates_ the code (to another language).
In modern x86, microcode is dynamically emitted, buffered, scheduled and reordered as a means to efficiently perform the semantics of a sequence of x86 instructions. Just like normal static microcode, these wide instructions are somewhat similar in structure to (horizontal) RISC instructions.
(Static microcode was used not only in the 8086-386, but also in the RISC-like 486 and Pentium for some actually complex instructions. It's used even in today's x86, but mainly for "housekeeping" tasks.)
well it needs you sino chauvinist fascists then to come up with something instead of blabbering speculative BS. And half truth narratives that you've laid t here clearly indicating you have no idea of x86 tech
@@herrbonk3635 Very few instructions are microcoded. Most instructions ARE broken down into micro-ops on most x86 implementations. You are a bit confused.
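To illustrate the decode-parallelization point raised in this thread: with a fixed-width ISA, every instruction boundary can be computed independently, while a variable-length encoding forces the decoder to walk the stream sequentially (or speculate) just to find where the next instruction starts. Here is a toy Python sketch of that idea; the opcode length table is invented for illustration and is NOT the real x86 encoding.

```python
# Toy illustration: finding instruction boundaries in a fixed-width
# vs. a variable-length instruction stream.

def fixed_width_boundaries(stream: bytes, width: int = 4) -> list[int]:
    # Every boundary is computable independently: trivially parallel.
    return list(range(0, len(stream), width))

def variable_length_boundaries(stream: bytes) -> list[int]:
    # Each instruction's length depends on its leading byte, so the
    # decoder must scan sequentially to locate the next boundary.
    length_of = {0x01: 1, 0x02: 2, 0x03: 3, 0x05: 5}  # made-up opcodes
    boundaries, pos = [], 0
    while pos < len(stream):
        boundaries.append(pos)
        pos += length_of[stream[pos]]
    return boundaries

fixed = bytes(16)
variable = bytes([0x02, 0xAA, 0x01, 0x05, 0, 0, 0, 0, 0x03, 0, 0])

print(fixed_width_boundaries(fixed))         # [0, 4, 8, 12]
print(variable_length_boundaries(variable))  # [0, 2, 3, 8]
```

Real wide decoders mitigate this with predecode bits, boundary caches, or micro-op caches, but the serial dependency in the raw byte stream is exactly the cost being discussed above.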
Actually Accurate Content. Pog
People said RISC-V is the thing, this chip thingy is getting interesting. 😊
This is interesting. I love the explosion of tech in China; it has been great for people like myself who are electronics and IoT hobbyists! I will defo seek out some of these chips and give them a try in the future.
Enjoy the Spyware built into the chips.
thank you, very interesting. I remember when RISC was the big deal. I totally forgot about it until you mentioned it just now.
x86 is the only non-RISC architecture in wide use at this point. I don't think any non-RISC arch was developed after 1985 at least.
Makes no sense trying to get into x86 when ARM is the trend
RISC and CISC have always existed alongside each other; sometimes one becomes a bit more popular than the other. China is not aiming for the cutting edge, so the trend doesn't matter that much. If they see x86 fits their initial market, they will do it. Once they get a good x86, making an ARM-equivalent chip is trivial.
@@supertnt318 Trivial? Sure, tell that to Intel. But yeah, I do agree that x86 has a huge market; the main factor, in my opinion, in why x86 will still be relevant for the foreseeable future is that software is written for x86, and it takes a lot of time and money to rewrite it.
When I wrote my original comment I was thinking of developing countries skipping building land line infrastructure all together and embracing 3G/4G, which made those countries much more digital savvy whether it is digital payments/banking, etc. and jumpstart a whole new array of possibilities just because there was such a high penetration of mobile phone users.
I do hope China will succeed in the chip industry, as users we all benefit.
@@vitalis x86 vs ARM is like comparing apples to oranges. No expert will tell you ARM is more advanced than x86; it all depends on the application. The low-level software has to be rewritten for every new chip anyway, even for different ARM chips. The tech bottleneck for China is at the silicon level, which isn't that different between an ARM architecture and an x86 architecture. You are seeing ARM growing these days purely because of market strategy, not technology. If China sees that a market benefits them, they will do their own RISC processor once they no longer have a bottleneck at the silicon level.
@@supertnt318 Steve Jobs mentioned in one of his interviews that at Apple they invest in the early stages of the tech life cycle, not when it is past the top of the curve because there is diminishing returns in optimisation and it is an eventual downtrend from there. That's is why they ditched the floppy disc and CD-ROM much earlier than anyone which was a massive deal at the time and the reason they are where they are with ARM after a decade of development, refinement, data sets and having the time to plan, strategise and build a flawless transition of their ecosystem. A decade that would be lost if they didn't have such a strategy.
You seem to take this so personally that you are not getting my points and are solely focusing on defending x86, as if I had anything against it.
The world trend is reducing power usage, reducing pollutants, recycling precious minerals and basically making everything more power efficient. That's it. That's the trend I'm talking about. And it so happens ARM is leaps and bounds ahead of the x86.
It's way past the trend of just competing for pure GHz at whatever cost, because there is another massive trend: the increase in digital consumption and, by extension, power usage, a problem compounded by the rising cost per watt. It's a huge factor for server farms and businesses, and something consumers take into account when buying A-rated appliances, further encouraged by government subsidies.
@@vitalis If Jobs were still there, Apple would rather invest in RISC-V like many companies are doing now. ARM is past the top of the curve, trapped losing more and more market share in low-power every year and unable to compete with the software inertia of x86, which the recent move by Apple could not entirely resolve (because M1 is not a standard arch, and Apple's closed environment is a no-no for many potential users and devs).
I had a cyrix 486 knockoff back in the mid 90’s. It worked great and was much cheaper than an intel 486.
I ran a Cyrix for years in a dumpy server; it was fun. 😂
ARM isn't even close to X86 in power. The best ARM chips barely match mid tier desktop processors.
You got to understand that ARM has a fatal flaw in that it always will have higher internal latency because it has to process more data than X86 to get the same result.
Arm has top-tier server chips which are a lot better than "barely matching mid-tier desktops"
not to mention that the m1 Mac is better than x86 laptops in single core workloads if I'm not mistaken
use a raspberry pi.
That $35 computer is good enough for 4K video output, basic photo editing, Facebook, etc.
Apple's M1 Macs are not only power efficient, they are overall more efficient than x86.
@@arewealone9969 Close, but not faster. Apple hypes its single-core performance, but Ryzen eats the M1 for breakfast when you start piling on load. It's not really because the instruction set is worse; an 8-core processor is still an 8-core processor. The issue with ARM is that no one has really built a chip as large as an AMD64 chip, not that it can't be done.
Do any of the viewers use an 8080 processor from Intel, AMD or Cyrix? Or a Pentium 1 MMX?
The 8080 was designed in 1973 using about 5000 transistors; the Pentium MMX in 1993, using many millions of transistors. There's a *much* wider performance gap between the 8080 (~0.001 MFLOPS) and the Pentium MMX (~250 MFLOPS) than between the latter and CPUs sold today (~8000 MFLOPS).
250/0.001 = 250 000, while 8000/250 = 32.
(I wrote this using a CPU from 2005 :D)
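The ratio arithmetic in that comment checks out and can be verified in a couple of lines (a trivial sketch; the MFLOPS figures are the rough ones quoted above, not measurements):

```python
# Sanity-check the rough MFLOPS ratios quoted above:
# 8080 ~0.001 MFLOPS, Pentium MMX ~250 MFLOPS, modern CPU ~8000 MFLOPS.
mflops = {"8080": 0.001, "Pentium MMX": 250, "modern CPU": 8000}

gap_old = mflops["Pentium MMX"] / mflops["8080"]        # ~250 000x
gap_new = mflops["modern CPU"] / mflops["Pentium MMX"]  # 32x

print(f"8080 -> Pentium MMX: {gap_old:,.0f}x")
print(f"Pentium MMX -> today: {gap_new:,.0f}x")
```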
I used a P1 and a Cyrix when I was young. The Cyrix I didn't like :D
I'm pretty sure that it's not really as practical for China to get into making their own x86 chips for the purposes of domestic supply chain security... It seems to me the more obvious strategy is to leverage their large population, and plentiful talent in the software space to continue encouraging popular cross platform apps and software to wean their population off of x86 reliance over time and to move to more readily onshored platforms such as Risc V and ARM. I'm not entirely sure this is the best example as they've had clashes with the Chinese government in the past, but Mihoyo and Genshin Impact come to mind, as a readily cross platform application (mostly, I'm still salty about their Linux support), which has absolutely blown up in popularity, but I do think it's a good example of the strategy I'm describing.
China absolutely does have plenty of talented people in the software and artistic spaces, so to me it feels natural that they'd conglomerate and work together in a way that lets them produce software which aids their goals (while letting them get American dollars from foreign fans at the same time), but of course, I'm a rando on the internet, and have little knowledge of statecraft.
Really rather too bad that x86 became the standard. It's not a good instruction set, but we're stuck with it. It was designed for microcode, and that was only a good idea for a short time. In actuality the IBM 360 was better. LOL.
The Chinese chips cost as much as, or more than, a superior Intel or AMD variant.
x86 CPUs are extremely complex; their blueprints can be measured in miles of paper per chip.
Also, China is unable to buy EUV lithography machines from Europe, which all but guarantees they cannot catch up in the near future.
They don't mind the British ARM ISA, then?
I am definitely stoked to buy cheaper Chinese x86
For home use, Chinese innovation strikes again with tons of X99 mainboards and Xeon CPUs for very little money!
ruclips.net/video/9sWOnX-yitE/видео.html
very accurate info !!!!
I hope there are more HYGON and Zhaoxin processors ready for use on the market. I wanted to buy some but couldn't find them, since the fear of legislation is still there.
AMD is genius for that Hygon deal.
x86 is at the end of its life. Intel is now adopting RISC-V to survive this decade. It's interesting to see more $ coming to this new architecture. China has a leg up there, though.
Well, x86 was "soon dead" in the 1980s too, according to academics as well as all the competition :)
X86 crushes most other CPUs quite easily.
People are overlooking software compatibility here; they think you can just turn up in a market with a radically different processor and magically have mature compilers, IDEs, developers with decades of experience, and huge software libraries.
@@SerBallister It's not software compatibility that made x86 fast, though. (And there were p-code systems already in the 1960s, similar to today's "byte code" and "just in time" compilation.) Sure, the first 8086/88 was a little slow, but the 286 already killed a lot of competition (even within Intel, such as the iAPX 432). The 486 was as fast as most "RISC" processors of the day, and the Pentium had a fantastic new floating-point unit that had learned *a lot* from the competition in that area, like Cyrix and others that made arithmetic coprocessors for x86 at the time (more or less x87 compatible).
I think we are already self-reliant, being in the elite club of owning the indigenously built supercomputer PARAM and a great processor, SHAKTI.
It's only assembled in India, but made using Western technology.
@@pinkipromise Here comes the genius. Fabrication for SHAKTI is done in Oregon on order from IIT-M. That means it's a commercial, paid work order. Even otherwise, my country can do that indigenously; compared to land-grabbing wannabe superpowers, who can't?
lol