I am an entry-level embedded systems engineer and I have always wanted to go beyond just making PCBs. I wanted to make my own ASIC, so I started looking into the VLSI world, but there isn't much out there that can help you start a career.
Your videos are awesome, keep up the good work!
One side note: please put the websites/resources you talk about in your videos in the description box.
Also, if you could make a video about a roadmap of VLSI/chip architecture design/chip manufacturing, etc., that would be great!
You should learn FPGAs using Verilog first. Then you can learn VLSI. There are so many high-salary job opportunities in FPGA design.
Get any job you can in any microchip design group. Study beforehand the books and sites that cover layout and circuits. Outline the circuit logic forms and understand their limitations. Sadly, the software for design, PDKs, and layout is behind a high cost barrier. I did use the freeware Electric and Magic for layout in a real environment, but that required at least one software person to support them. PDKs are perhaps the biggest barrier. Good luck.
Homemade ASICs are the dream. Being able to make your own processor/SoC, deploy AI models, etc.
@@BB-iq4su This video showed both free PDKs and software, so it should be possible to start experimenting and learning at least 😊
@@jongxina3595 Google "Tiny Tapeout". I'm waiting for my first ASIC to be shipped this month.
Interesting how on one hand everything moves towards getting more closed, restrictive, and unfree. On the other hand a lot develops towards a fully open, free, and transparent system. I wonder what will win.
Winning is possible only in (closed) games. Infinities balancing (life, civilization, family, peerage, etc.) are research, in findable perenniality.
Akin to discerning between "coherence" and "completeness": endable (completable) games mapping forms of extendable (coherent) interactivity.
Keeping any ship sailing involves a lot of recatch and a lot of release. Breathing in seconds/minutes, sleeping in minutes/hours, eating in hours/days, etc. Playfulness in seriality enables variable mindfulness.
Fabricating each smaller node iteration takes billions of dollars of research;
giving away the results of that research so other fabs can use it for free, letting someone else pay for it, will disincentivize progress that requires billions to develop.
This is the way it should be: let the patents run out and let open source loose on mature silicon. Lots of use cases could value novel ingenuity over failsafe reliability.
A lot of the cost of proprietary architectures comes from needing failsafe design in extreme environments like a factory. AI/machine learning could be used to optimize data logging, for example, where a worst-case problem simply shuts down a database. But when the consequences of logic that doesn't properly handle edge cases could kill someone, the logic-design environment must be heavily constrained by design, for example using prebuilt, heavily constrained, protected logic blocks to build the failsafe control logic before any actual sequencing or process logic is programmed.
In manufacturing, Rockwell AB, Siemens SIMATIC, and OMRON or Mitsubishi will continue to dominate the automation control of large plants with heavy platform lock-in;
most industry-specific automation builders who make custom interfaces with their logos everywhere will build on one of these.
It's much easier to get help with a specific problem on a Siemens forum than by searching Stack Overflow for something that might be similar.
Open has been winning for decades in the most meaningful ways.
@@wedchidnaok1150 That's awesome. Is there an Asianosophy channel? 😂
Thanks!
It would be interesting if PDKs were made to follow a certain standard, like XML or what have you. That way the "big professional stuff" and the open source stuff would be able to work with the leading-edge nodes, provided you're in the good graces of the leading foundries.
You'll need some sort of access control mechanism to get any sort of adoption from the current industry. The best implementation might be something like a PDK server where the PDK lives in a black box, either locally or on the fab's servers somewhere. The latter would probably not be great since the latency is annoying. Some PDKs I've worked on have required you to submit your design to the fab for checking, so a chunk of the design rule manual is hidden on their server.
There was Interoperable PDK:
en.wikipedia.org/wiki/Interoperable_PDK_Libraries
which goes a little further than GDS2
I think that will take new technology in semiconductor manufacturing: cheaper, faster machines making for a more competitive marketplace.
There are only 3 leading-edge fabs; I don't see any financial or technological gains to be had from standardized PDKs.
@@gatocochino5594 I didn't mean standardized PDKs. I meant having all PDKs in a standardized format so any EDA tool can work with any PDK.
Excellent technical video. Well researched!
Thanks a lot for this overview on open source semiconductor design tools, very happy to know there are some amazing projects ongoing. Cheers from France, love your content.
my top channel right now, keep going!
You might want to talk to Matt Venn about his "Zero to ASIC" course, which fully uses open source tools.
I don't know how often you do interviews, but I think you would both benefit from exposure to each other's user base.
We've been talking already! I tried to link my course but it got spam binned I think.
Here in Israel there are many projects that would love to have open source EDA even WITHOUT a PDK (e.g. the moon lander project, SpaceIL) and many university projects in robotics & motion control. Also, small startups can benefit from not spending millions on tools. I was at a satellite comm startup like that. Cadence & Synopsys don't much care who the customer is: small startup = Intel = Qualcomm, more or less. This makes DARPA's idea a good innovation "base". These are national technology people, they think different (sorry for the pun).
Thanks man, I have learnt a lot from your channel. Keep up the good work. Thank you!
I think these go in two different directions. The old flow is graphical floorplanning, cell-based design, etc. The new way is Verilog, tested on FPGAs and sent off to a "conversion" firm or to the end fab, which takes the design and runs it through automated floorplanning and process prep. All of the SoCs I have been involved with for the last 15-20 years have been done this way. The nice thing is the Verilog route is going to be much more amenable to open source processing.
The open source crowd will take any and all paths toward getting to silicon ;-)
This is one of the best YouTube channels. Thanks for the great video
My dream: design my own SoC, but... at hobbyist-level costs. It doesn't matter if I can only afford micrometer scale, not nanometer!
Sam Zeloof here on YouTube has managed something like 100 transistors on a homemade chip (just an array of the things for testing, though he has previously done an op-amp). Not sure what the feature size actually is, but if we say 10 micrometres, then that's early 1970s kind of scale.
You can start with an FPGA prototype 🙂
@@yxyk-fr Good answer, and I think @nandland has the Go board, which seems like a good way to start with FPGAs.
Same. Even with the AI craze, you can implement many models directly on an ASIC for deployment after they've been trained. It would be very performant.
Why is a channel length of less than a micrometer even a problem? How many illuminations do we need? Isn't there something like self-alignment? Just create the fast gate at a small scale; everything else can be bigger.
I work for one of the three big EDA companies. Until you experience the amount of support customers require, and how many thousands of engineers are needed to create and support tools and IP, there's no way of comprehending the effort involved. Sure they charge a lot of money, and they're profitable, but they do burn huge sums of cash creating the products.
Despite my employment, I'm a big fan of open source everything. But trying to kick off the hardware product flow without paying tens of thousands of engineers is like how Mao Zedong tried to have backyard steel furnaces replace large steel plants.
Gotta love EDA support. I still remember the guys at Snps; I would get prompt replies from them whenever I hit a wall trying to access obscure functionality and emailed them questions. However, as a power user I would rather have had access to the source code. Most users of the software will follow the flow defined by the vendor and need a lot of hand-holding, but power users like to bend software to their will and don't really need support; they just figure it out. Those are the guys that make open source work. Linux is the best example: the big users of Linux don't get it for free, they just pay the kernel engineers directly to get it to do what they want. Regular users will either pay for support from someone like Red Hat, or pay for it in lost productivity doing their own support.
Look at all the open source software companies whose business is just charging for support on top of an open source thing. Open source and support are not opposed. Maybe the EDA world will look the same in a few years.
I want to design an EDA tool like Virtuoso. What technologies do I need to learn, and are there any good learning-roadmap resources? Thank you!
From your verification video I took away "oh, so verification in the silicon space is not related to formal verification, it's just testing". Formal verification is mathematically proving properties about the outputs of the system, as you do in Bluespec. That is a fun subject to dig into, if you get the chance, and very few design shops have the budget to do it.
Formal verification can also be very difficult if you think in nanometer structures, because there are always capacitances between two things next to each other. So even if the nominal behavior of your components is what you want, you also have to make sure there are no side effects from unwanted capacitances and similar things. And no, I don't really know how this "magic" is done; I have enough to think about in normal high-frequency stuff like radio, but no idea how to do it at the micro scale.
@SirWickenshire My mother tongue is not English, so I often use more complicated wordings for relatively simple things.
And yes, quantum tunneling can be a problem too, but capacitance is relevant at much larger distances than tunneling.
@SirWickenshire It's not that simple, and it's got little or nothing to do with quantum tunneling. There's stuff like "ground bounce", where a transistor that turns on can suck all of the electrons out of the local ground, raising it above the local Vcc for another transistor (which is experiencing Vcc sag), and so ones turn into zeros and vice versa. Then there are timing issues; you need to have clean "eyes" on any serial line, and high capacitance can cause eyes to close. Then there are issues with poorly balanced clock trees, where basically not everyone switches in time.
Anyway, formal verification will not catch any of this, because formal verification hasn't the slightest clue about these crosstalk-type effects.
Worse, you can't model/simulate this stuff until you have the floorplan of the chip and an accurate model of the transistors, the assorted metal layers, and the dielectric constants of the various layers, because all of those affect the propagation of the signals and the likelihood of any of these effects.
Formal verification is also used in some cases! See, for example, the paper "Replacing Testing with Formal Verification in Intel Core i7 Processor Execution Engine Validation"
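To make the "proving properties" idea concrete, here's a minimal, made-up sketch of what a formal property looks like on a toy counter, in the style accepted by the open-source yosys/SymbiYosys flow (the module and property are invented for illustration). A solver explores every reachable state, so passing is a proof rather than a few lucky simulation runs:

module counter (input clk, input rst, output reg [3:0] q);
  always @(posedge clk)
    if (rst) q <= 4'd0;
    else     q <= q + 4'd1;

`ifdef FORMAL
  // Formal-only block: ignored by synthesis, read by the proof tool.
  reg f_past_valid = 1'b0;
  always @(posedge clk) f_past_valid <= 1'b1;  // $past() is meaningless on the very first step

  always @(posedge clk)
    if (f_past_valid && !$past(rst))
      assert (q == $past(q) + 4'd1);           // the counter always steps by exactly one (mod 16)
`endif
endmodule

The crosstalk and ground-bounce points above still stand, though: a proof like this only covers the RTL's logical behaviour, not the analog effects underneath it.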
Bharath semiconductor society members: the esteemed Suresh Kumar, CEO of Chipware Technologies; Srinivas Venkatraman, CEO of VerifWorks and Asfigo; Amit, CEO of Cadre Design; and many more...
An open source PDK sounds like a good fit for RISC-V. I wonder if it would be possible to have an open, non-foundry-specific PDK for, say, something around 100nm, and then have foundries agree to fab from this PDK. Also, I think part of the problem here is that enterprise-level tech usually comes down to the consumer level eventually - not so with fabs ;) Unless there are some major breakthroughs in 3D printing, I don't see enthusiast-scale fab (kits?) in the near future.
Thanks for this video! That subject is "my turf" and I've been invested in this field for more than 20 years!
It's turf I'm trying my hand at from home; I've been a software engineer for 20 years.
I'm starting with FPGAs and VHDL. Where should I go next once I've learnt that?
@@MostlyPennyCat You know, you never stop learning!
Have a look at the GHDL compiler, which enables things even the VHDL designers couldn't imagine (I have done actual real-time simulations based on the computer's clock, input-output to the printer port or the Linux Framebuffer...)
VHDL takes time to get your head around but it's well worth it. GHDL changed the rules too. I do most of my EDA work with VHDL now, with a bit of bash scripting to glue it all.
After you feel comfortable comes the real question: what do you want to do with this technology?
@@leyasep5919
What I Want To Do:
My hobby target is one of two things.
1) A hobby games console: 2D, advanced stereo synth and sampled sound, ROM-based cartridge games, basic SD and HD (up to 1920 x 1080 x 32-bit RGBA colour)
2) Variable bit-length microprocessor. Rather than one instruction cycle performing one piece of 8-bit maths in parallel (e.g. 8 XOR bits at once), it performs one piece of bit maths per instruction cycle.
So, 8 bits require 8 cycles.
5 bits requires 5 cycles. But, because they are so small, the clock speed is much higher. So it's my take on a 1-bit computer. I realise that probably makes no sense.
Also, somebody needs to start a global business that lets you queue up your prototype ASIC on a shared wafer.
Who's up for that?
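On the bit-serial idea in (2): it does make sense, it's how some very early computers and tiny FPGA cores worked. Here's a rough, untested Verilog sketch of the datapath core, just a serial full adder with a carry flip-flop (all names invented). An N-bit add then takes N clocks, LSB first, exactly as you describe:

// One result bit per clock, LSB first. Pulse 'start' high on the LSB cycle to clear the carry.
module bitserial_add (
  input      clk,
  input      start,
  input      a_bit,    // next bit of operand A
  input      b_bit,    // next bit of operand B
  output reg sum_bit,
  output reg carry
);
  wire c_in = start ? 1'b0 : carry;  // no carry into the least significant bit
  always @(posedge clk) begin
    sum_bit <= a_bit ^ b_bit ^ c_in;                        // full-adder sum
    carry   <= (a_bit & b_bit) | (c_in & (a_bit ^ b_bit));  // full-adder carry out
  end
endmodule

Wrap that with shift registers for the operands and the result and you have the guts of a variable-width ALU; the logic is tiny, which is part of why it can be clocked fast.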
@@MostlyPennyCat As for the prototype queue, this has existed since the early '80s as MOSIS.
Since 2020, the Google-SkyWater collaboration has offered a multi-project wafer service through Efabless.
I recently bumped into OpenROAD and am slowly getting up to speed. Great video!
It's like 3D software and other graphics software... A while back in college I thought it was kind of pointless for me personally to learn Maya (and I was right), since it's way too freaking expensive for me to keep improving my skills once my student license runs out; same goes for Photoshop. Now, the Blender and GIMP replacements were a bit rough around the edges, but they were free long after my discounted school license ran out. Plus, looking back, I had zero chance of getting hired for Maya work after school, since even though they asked a lot from us for a passing grade, it was not even close to what the industry expects of you.
Thanks a lot for the overview! I learned about the existence of some new tools :D
This is the best YT channel so far. Keep it up.
This channel is a gem! Great work.
Have you heard about CIRCT, an LLVM project aiming to use MLIR concepts to create an open source intermediate representation and optimization infrastructure for circuits, while solving the biggest performance shortcomings of Verilog and VHDL?
I recently watched this interview with Palmer Luckey. (Here is the link for those interested: ruclips.net/video/fFUgtZT1vwE/видео.html )
He said that if Taiwan were ever invaded, a few patriotic engineers would likely destroy the foundries as a scorched-earth policy. And China would be hurt by losing that manufacturing capacity the same as everyone else, so the downside of invasion would be really severe. The more I learn about it, the more I believe that Taiwan becoming the world leader in semiconductor manufacturing is probably the greatest stroke of geopolitical genius in modern history.
I’d say this is off topic
@@Asianometry ouch
@@falsch4761 How is he hoping they go to war?
He's literally talking about how great it is that Taiwan is so prevalent in the fab industry, and that they will never go to war as a result of it.
@@falsch4761 You're saying that massacring Jews single-handedly led to a free Asia?
China is not being suppressed by the US.
If by suppression you mean every country in the world calling China out on its Uyghur genocide, IP theft, African colonialism, and the rest, then yeah, I guess that's a form of suppression.
The US just sets up empty bases in Europe and the Middle East, and rarely exercises anything with them. All these bases are for is better, more efficient logistics. These bases rarely house troops, and are more or less empty airstrips with a hangar and a warehouse.
In fact, from my perspective, it's the other way around. China has become a larger colonial power than Britain/the USA has ever been. In my home country of Sri Lanka, China bribed government officials and put the country on the same colonial loans and contracts that Britain put on us, just to steal our ports and land. Now China sets up its carriers and naval fleet there.
Remember when Mao said "if China devolved into tyranny and colonialism, then it is China's duty to destroy it"?
Seems like right now, China is the exact colonial tyrant that Mao predicted it would become.
@@falsch4761 Reading your comments gives me an aneurysm. Like, somehow I'm a Nazi because I think Taiwan is good at foreign policy LMFAO!
I can't find the video you reference at 5:15
@0:25 at first I heard 'fapping on the leading edge' and thought I was on the wrong tube site for a second.
12? No I'm 40. 🤣
It's nice to hear from you about things I'm involved with.
How can you tell commercial EDA is fully optimized? It's closed. It's likely similar to other proprietary software, binary chip firmware blobs, or security over obscurity.
I think you meant "security by obscurity", given the context.
I wonder if you could have a standardised intermediate PDK.
Something for which you could automate the process of going from the intermediate to a vendor-specific PDK.
So, a standardised intermediate 14nm PDK.
And then companies with the NDA write automation to go from i14 to Samsung 14nm or TSMC 14nm.
Like how JIT-compiled languages like Java and C# work.
Keep up the good job. 🙂
I feel like there was an essay on this channel in the past that discussed how far home-grown EDA software from the PRC was lagging behind industry leaders from the West. How would these open-source tools, which as you state aren't necessarily the most cutting edge, compare to the efforts from the PRC? Are these DARPA-funded projects potentially shooting the national interest in the foot by providing this technology know-how openly, or does the thinking go that the democratization of chip design outweighs those concerns overall?
If you're of a more sinister mindset, that might be exactly what DARPA, and other related agencies, hope might happen, making large chunks of Chinese semiconductor design depend on US-made software.
Or more charitably, it would also give such engineering tools and know-how to their rivals equally as well.
My guess is that DARPA sees it the other way around: that it is the US that needs to catch up to China. I don't mean that in terms of current positions, but trajectories. They're looking twenty years into the future and preparing for a worst case scenario - a darkest timeline if you will - of the PRC closing the design quality gap purely through internal efforts *and* successfully annexing Taiwan (giving them TSMC and UMC).
Nice, the audio is much improved!
Who are you man? You cover such a broad range of cool stuff - thanks 👍🏾
Quite interesting. What books have you read in your educations?
Can you recommend books on IC design, IC fab production ?
The comment section for Asianometry is fire. Go Asianometry !
Incredible, just what I thought I wanted to learn but did not know how to. If you can add more self-learning courses on this in the comments, I will try my best to learn as much as possible.
How will the open source EDA tools compare to industry veterans Cadence, Mentor (now Siemens), and Synopsys, and Chinese newcomers Amedac, BGI, and X-Epic?
They do not compare on the same basis.
On one side you'll have all kinds of close-to-nothing-to-pay software that is portable, scalable (you can run as many instances on any computer as you want), and easy to hack, but low performance and low gate count.
OTOH you have extremely expensive, hyper-featured, industry-proven, ultra-high-scale systems that you can only run on cherry-picked computers "if the SW provider agrees".
Thank you for changing the video title :)
At some point, node shrinks are going to hit their physical limit and all the foundries will have their super-mature 12 angstrom (or whatever) node running; then there'll be no reason to keep their PDKs closed source.
EDA = Electronic Design Automation
How do they compare with the commercial ones, Synopsys, Cadence, etc.?
Those cost millions. And they're worth it if you're big enough... And therein lies the problem: it's a huge barrier to entry. You need to get big before they're worth it... and you won't get big if you don't have them.
Cadence and Synopsys are the results of decades of work by hundreds and thousands of engineers. It's going to take a lot of effort to produce something even remotely comparable.
Open-source tools have existed for longer than any surviving proprietary product. Ones like SPICE have undergone decades of real-world use and refinement.
@@lawrencedoliveiro9104 It's not about time of existence, it's about time of active development. Also, SPICE is just one of thousands of components involved. I do agree, though, that there are good open-source SPICE implementations, e.g. Xyce.
I did an interview on my Zero to ASIC channel with Tom Spyrou, lead dev of OpenROAD; he had some comments about comparisons.
Please add links in the description for the things you referred to in the video.
Maybe you could do a video about Bluespec SystemVerilog - it's supposedly a way, way easier-to-type (C-centered) HDL... I had some classes at my uni on it... seemed very bleeding edge.
I can't find your vid on Mead-Conway VLSI revolution, can you help me out?
What kind of investment would one be looking at to startup a fabrication facility to produce these open source chips?
Please do not stop, this is so much fun to learn about.
Good job, and it presents, without saying so, one of the big challenges for RISC-V moving into more advanced process nodes and capabilities. So what would change this? New machine technologies which could be produced more cheaply than current EUV.
RISC-V has been implemented using carbon nanotubes. We just need a kind of 3D printer which can harvest those tubes, measure their characteristics, and build a circuit. Pick and place. I have seen gold wires one atom thick. Maybe a copper wire can be routed around and stretched thin?
Like on a breadboard, but smaller.
Good summary!
Hello sir, I want to ask what software is used by IBM Semiconductor to design processors?
Chuck Moore wrote his own VLSI editor and then used it to create 144-core CPUs.
Newsletter link is not working... :(
Now, let's wait for publicly accessible foundries that let you print your own chip, say on 40-60 nm node, as easily as 3d printing.
E-beam writing was a candidate for maskless prototyping, but the market was too small and the resist requirements were different from the mask-based process.
Knowing what we know of the schedules at marketable-quality microchip foundries like TSMC, you would wait a millennium. Literally.
Synthetic biology => grow an FPGA in a petri dish...
Honestly, synthetic biology is just a way to bootstrap the holy grail of nano-tech: codified molecular self-assembly.
In your dreams, kiddo.
@@eone199 That’s your dream? That’s disgusting
Great Video!
Awesome video
Apropos tools, I think it would be cool to see a video with your take on "Turing Complete", which poses as a video game but is actually capable of generating VHDL...
Is it? I probably haven't gotten far enough to unlock that functionality.
Where is the video on Mead-Conway? I can't find it on the channel. It would be helpful if there were links in the description to other videos you reference or in the (i) button thing
That one is in early access.
I would love to see something on optical computers 😁
Qubits and quantum computers is my vote...
Optics have far too large wavelengths (yellow light is >300 nm, which means the transistor is larger than that). Speed-of-light lag, ironically, means that such a computer goes too slowly.
@@evannibbe9375 In a way that is actually an advantage, since you don't have to deal with those pesky quantum nuisances of going super small.
Also, you need to understand that working with photons is fundamentally different from working with electrons. Optical gates can go way, way faster than electronic ones, more than making up for the transistor density difference.
Optical lines can achieve literally hundreds of Tb/s over very long distances; try doing that with electrons, good luck. Oh yeah, and you can multiplex optical channels.
I'm a software guy and only had very basic HW stuff in college. How realistic are optical computers and why is or isn't that something that's taking off? This is actually the first I have heard of it.
@@TheJeremyKentBGross It isn't that hard to create something like a PCB that is very good at the single job it has.
It is hard to design hardware with optics that is good at every task you throw at it.
That is probably why you have not heard of it. It simply doesn't exist for now, I think.
There is (successful) research to mix optics into the design of regular chips, but only to transport data faster (still inside the chip).
there is no previous video
Great stuff!
Is "The Semiconductor Design Revolution" video public?
No, it’s in early access
@@Asianometry thanks. I thought it was an older video.
This is so exciting
Why is this video unlisted?
Ha you managed to come across a video in Early Access. I’ll release it this week.
@@Asianometry
Aah! Early Access thing.
I found it on Twitter, someone shared the link.
best vid on yt
Outstanding!
What do you think of RISC-V?
The fabs have to support the EDA software.
What if someone made one tool that binds all the tools into one workflow, like KiCad did for PCB design?
Governments ought to require that any tools used to create chips be released under an open source model in order to sell those chips in their country. The same for all the schematics, designs, processes, etc. It should all be online for anyone interested to see.
No mention of KiCad? I gather CERN has been sponsoring its development.
KiCad is for designing PCBs, not for semiconductors/ICs.
yay unlisted asianometry
Hi, can you make a video about Amkor?
How do you make so many Videos? Doesn't this take days or weeks to research, script and record?
Hi;
if you have a proposal for a new processor architecture and it is much more efficient than any existing architecture, as a simple designer,
what do you think one should do to develop it and bring it to life? What would be the safest way?
You prototype it in simulatable or synthesizable form, in simulation or on an FPGA. It can even be shipped in products as an FPGA.
But I bet you will come up with something that either isn't 'new' (you just missed part of the history lesson), or will turn out not to be efficiently implementable in silicon, or will turn out inefficient for practical reasons, like the lack of a great tooling ecosystem. I welcome you to prove me wrong.
Also, saying "more efficient" begs the question "... for WHAT exactly?". There's a reason a microcontroller and a server CPU are very dissimilar, in spite of potentially sharing parts of an instruction set. Are you designing an instruction set or a microarchitecture? Those are two very different things.
@@SianaGearz I'm talking about a completely different method from the existing system. By efficiency I mean much less energy and real parallel speed, working in the terahertz range. Do you think that can be described as efficient?
@@empatikokumalar8202 Have you invented a transistor with a sub-0.1ps rise time? If not, why even mention terahertz?
@@SianaGearz Furthermore, wiring capacitance and all other kinds of interference will destroy the transition time.
@@SianaGearz I'm talking about the clock speed it can run at.
Where is KLayout?
Magic in Efabless.
Song in Hebrew : Micha Shitrit -El Ninyo--מיכה שיטרית-אל ניניו
6:30 - "apple has 3000 people working on a modem so that it's hyper optimized", yet the same company used such a shit video driver on it's g1 apple silicon MacBook/Mac mini that a 3rd party developer had to make a virtual driver to allow ultrawide monitors. These companies always amaze me with how much they focus on things and leave others to rot.
No offense, but the channel might as well not even be called Asianometry, since it's basically a science channel now lol. I'm not saying change your name, I'm just saying it's kinda funny lol.
Great video. I never knew about this entire world of just fabrication. I wish it really trickled down to hobbyists like us to experiment and play around with, y'know.
I like to mix it up. My last video was about North Korea, after all.
@@Asianometry By chance, do you work in the chip-fab field at all?
I was thinking of going into Comp-sci in Uni, but it seems cooler to learn about the fabrication process and stuff.
Let's hope open source will become the standard for photonic ICs from the very beginning - at the moment it's not looking like it :/
This sounds like an uber fancy version of Fritzing to my uninitiated mind 😂
Chip design is to Fritzing what a performance of the Bolshoi Ballet is to a toddler "dancing" on their bed. It involves some of the same muscles, but on an entirely different level. ;-)
ASML! All is ASML!
I await the day when silicon chips will be as easily produced as mugs on a 3D printer.
😂😂😂😂me too.. but mugs on a 3d printer is funny 😂😂
It will be accomplished through my 3D algorithm. 🧐
woah unlisted asianometry
Thx
If it were viable, then design companies would have done it already. Even FOSS cannot solve EDA hell. Sigh.
design companies are not incentivized to do this.
So I can just download ram?
I think it's redonk that these companies that create EDA tools aren't immediately nationalized and all of their resources made open source. Design tools should never be put behind a paywall for individuals or nonprofits to use; only for-profit corporations should have to be charged to use the software, which is how the nationalized companies would make their money back.
Start your own software company and give it away then.
@@edenassos OP's view seems too extreme for me, but charging the big players while letting small ones off the hook is a valid business strategy. See Unreal Engine for example.
@@thatguy7595 That's why I'm telling him to start his own software company so he can practice what he preaches.
That is a huge national security threat. No state that has leading devs in this sector will let its people do that fully. There are reasons things like GitHub are restricted for Iran, etc.
@@edenassos If the companies won't share resources with the public, then why should the public share resources with the companies?
Cut off all use of public infrastructure. No roads. No utilities. No use of air space. No land. No law enforcement. No access to courts. If companies want roads and power and police protection, they can start their own country and give them away.
How does the deer in your profile pic feel about being famous? Do an interview with him?
Open source is now basically free labor for big companies.
What the heck is a 'nitch'?
KiCAD?
I was hoping for a mention as well. But to be fair: KiCad does not do chip level design (yet).
@@KonradTheWizzard Yeah true.
It's not a fool's errand, it's just that the volunteers in the open source community lack the necessary competence, time, or both.
EDA ??
Electronic Design Automation
@@kneekoo ohkay, thanks
i like this one
SRAM - or in English - Static Random Access Memory.
sram - or in Polish - I'm taking a s**t.
In Slavic languages it means 'shame'.
"open source hardware innovations like RISC-V"
sorry but RISC-V is not innovation, it's more like standardisation on the least common denominator... it is a VERY polished RISC but brings close to nothing new to the table :-/
Sorry I disappointed you with my wording.
@@Asianometry heh, don't feel sorry for my nitpicking :) BTW check your inbox ;-)
@@leyasep5919
RISC-V _enables_ people to bring something new to the table.
_Because_ it's open source and license-free.
@@MostlyPennyCat RISC-V is not as open as you seem to think. It's a standard for an ISA that is controlled by a cartel of companies who pay to have first dibs on new extensions, and who fight to impose their own view by being "first to market". You are only free to follow, not innovate (because any change will make you "incompatible").
Furthermore, the architecture is polished but brings nothing new to the table, because it is based on a 40-year-old architecture that has already been stretched to the extremes. There is nothing new, no real innovative concept that brings us into the 21st century; only extensions to the same old RISC-1 and MIPS concepts and instruction format.
The YASEP is another "free" ISA with features derived from 20 years of observing where other ISAs evolved and failed. It learned from the failure of the F-CPU project (started in 1998), and I am developing yet another, even simpler ISA right now. I can start from scratch and reconsider every single choice, but RISC-V has 4 decades of legacy...
Feel free to contact me to discuss more about it.
@@leyasep5919 I'm a total layman, but wasn't RISC-V introduced precisely to drop this baggage? What would you want to see in a new ISA then? Do you think RISC-V will be able to tackle ARM? Is RISC-V better than ARM? Is RISC-V a modern enough ISA to be realistically considered? I mean, the Chinese are literally copying MIPS with LoongArch, so the ISA doesn't seem to be a very important factor.
go bears
Designing chips (SoC or ASIC) on a small scale was never economical or a good idea! Start with an FPGA; then, if the volume is there, you can turn it into an ASIC. The concept of open source tools is ridiculous.
They should have a pipeline for putting multiple ASICs on one wafer, so that small companies can have a dozen prototypes made.
Well, put an end to this "Asians are best at computers" bullcrap.
Verilog is trash. VHDL is way better.