This is the most glorious part. "LDA 14. What does this mean, where does it come from? Well, we're building the computer. We get to decide what stuff means!" For some reason that just puts a beaming smile on my face. And I'm not even building along.
Well, typically processors follow an ISA (instruction set architecture), and the registers and instructions are not arbitrary.
@@Ne012 Look who came: destroyer of fun, remover of children's smiles.
@@Ne012 Well, ISAs aren't based on stone tablets found on some mountain, so someone still had to arbitrarily (with a lot of thought) define them.
What is super sweet about your code? Well, it's your own cryptic bull chip! )
Load accumulator
Thanks Ben, now I've blown $300 in breadboards & components, and lost a month of sleep. I keep waking up midnight thinking of chip enables and op codes.
100% worth it
Mike B, where did you go to get the stuff, in general?
eBay and random electronic component stores, I assume, because that's how I do this stuff. Avoid the local stores as much as you can, though; they have huge markups.
GTpcGaming Not mine! I refuse to make cheap components outrageously priced, and I love seeing people build things and expand their minds.
Hi, I've built my own 8-bit CPU in Logisim, but I'm having trouble choosing my 7400-series TTL components; I don't see which parts to pick. For example, I can't find any RAM with an 8-bit address in the 7400 series, and I can't find an 11-bit counter either. As for the multiplexers/decoders/demultiplexers, I don't understand what "4 to 1" and the like mean. Logisim says "select bits" and "data bits" (and I understand that), but I don't see what refers to what. Does anyone have some tips? This would be my first real project, since I've never done anything like this before and have only used some components plugged into an Arduino to see how they work. Sorry if my English seems strange. After choosing my components, what software do you prefer for making the PCB? I already have Fritzing; is it good enough for this sort of project? Do I need to know anything before designing my PCB, or do I just place my components in Fritzing, link them together, and tidy up the connections a bit? Thanks for any answer made to help ;)
@Nicholas, Welcome to the world of internet shopping, the place where you can buy anything. Just google for an electronics webshop, or eBay indeed.
I'm an electrical engineering student and you have cleared up, in less than 30 minutes, material that I've been searching hours on end for, regarding control logic and its implementation in an easy-to-understand manner. I cannot commend you highly enough for this incredibly lucid presentation. Please do not ever take these videos down. As someone else commented, they should be required for electrical/computer engineering students.
I feel like this whole series should be required viewing material for all computer engineering students. It covers the whole gamut of topics from software to hardware to architecture design in a detailed but clear way. Thank you so much, Ben Eater, for all the time and work you have put into these videos.
^ this
Isn't this a little bit "low" for engineering students?
not initially.
it does lay out the flow nicely ... and is a great starting point for first year engineers ... get them thinking about doing more with the absolute minimum
This and Scott CPU
This is a childhood dream come true, except someone else is doing the hard part.
Ben Eater's brain is an actual working CPU, won't even skip a beat talking about it, just amazing. :D
@Dr. M. H. A good debouncer is always a good idea; we don't want him to malfunction. ;P
🌝
You can also buy the kit and do it yourself.
Ben, I am at a loss for words, but will still try.
This is by far the most "INFORMATIVE" lecture/demo/tutorial I have seen on how a computer works.
Of all the great things this video stands for, the most interesting is the INTUITION it builds.
If someone is interested they can watch the entire series, but even if they only go through this lecture, I am sure they would get insight into how a computer works.
I am also amazed at the effort and the amount of hard work that has gone into creating this content.
If there is any award out there for best teacher, you ought to get it.
Last but not least, I think this should be required content for every computer engineering student.
👏👏👏👏👏👏👏👏👏👏👏👏👏👏👏
Can't wait to see the 64-bit 24 core version
Ravo, with the size that CPU would be, I'm sure you'd be able to see it from anywhere in the world! It would be awesome to have a macro-processor.
A CPU you can see from space.
Woah, that would be the size of a room -.- also as if someone else thought of it...
That would be super cool, but with the number of transistors, he would have to buy acres and acres of land to build it.
Someday: Ben Eater releases his 24 core breadboard computer series
The next day: breadboard stocks are at an all time high
The day after that: global plastic shortage
This video is beautiful. The content, the didactics, the troubleshooting, the 42 at the end, the lack of audiovisual fillers, the simplicity, etc.
Thank you very much.
LMAO
"That zero went over there, I'm not sure if you saw it."
That zero is really annoying. I tried to hunt it yesterday because it was disturbing me while I was sleeping, but I didn't succeed in catching it...
The simplicity of explanation is so wonderful. Just from watching your videos, I've been able to connect the dots on how assembly works at a low level, how a cpu processes information, how bit-shifting works, etc. This is fascinating stuff.
Elijah Doern also check out his networking tutorial, it's the best
ikr
nice name mr. led
I'm a Computer Science instructor at a community college in California; I have my students build the entire SAP-1 for their Machine Language class. The students seem to really look forward to lab. This will be the 2nd semester that we've incorporated the SAP-1 into the curriculum via the Ben Eater videos. Last semester the students made it to the output register; this semester we are hoping to finish.
Jeff Barnett, that is awesome. I wish these exercises were around when I was learning this stuff as a teen. I always wanted to look into CPU internals because I was baffled about how it really worked. Luckily Charles Petzold's book CODE was available to fill any gaps in my understanding.
SAME HERE
I wish I was taking your class. I'm a computer science student somewhere in Africa, and all I do is miss classes trying to learn on my own, because I gave up hope after being disappointed when I joined campus.
For me this is the channel with the most anticipated uploads. Imagine if school classes had been this interesting, how much smarter we would all be. Good job Ben!
That's exactly why school classes _aren't_ that way: they _don't_ want you to get smarter. They'd rather you be obedient. So they only teach you enough that you can follow their orders, but overwhelm you with confusion and penalize you for making errors, so that you're scared to think on your own or to ask questions.
you should also watch Primitive Technology, it's like this channel, but with rocks - ruclips.net/channel/UCAL3JXZSzSm8AlZyD3nQdBAvideos
@@scitwi9164 Public schools were deliberately modeled on Prussian military academies to make good worker bees. They don't want thinkers, they want workers. Classical education at private schools are meant to make the thinkers.
@@scitwi9164 That's not how my college classes worked, but as usual, no video, no matter how apolitical, is immune to political posts.
@@michaelbauers8800 here in 2022, they are saying math is racist, so... good luck to anyone building a computer with that mindset!
Stopped Netflix immediately when the notification came. Still, there were already 5 likes... Ben Eater is the best beast at Easter...
I didn't understand anything from this video, but it's still a pleasure to watch. It makes me feel like a genius.
Watching this kind of stuff just puts into perspective how much we take our modern computers for granted.
Ben,
I can't begin to express how informative and lucid this (and your other videos) are. Too many computer architecture and digital design books gloss over the minutiae of how CPUs are designed and carry out their tasks, and your videos explain these details very well.
Even though modern CPU architecture is orders of magnitude smaller and more complex, the basics remain the same and these videos are worth a king's ransom if you want to get a real conceptual understanding of CPUs.
I nearly screamed when the output didn't update. There was so much anticipation.
I watch your videos now not necessarily because I want to understand how a CPU works, but because I admire your attention to detail and the way you can communicate something really complex and make it sound easy. Good job!
I just discovered and watched this series. You have a great way of explaining all this complicated stuff. I can't wait for this last bit that ties it all together.
Keep up this good stuff!!
Lucky you, you watched it in a day. I've been (like many others) following it for a year now! This series started around Easter last year, and it's Easter again.
Oh man, I'd found it a week or two ago and having not checked the upload history that closely, was excited to get an update so quickly. From what I've seen so far, the series is definitely worth waiting for, much as it'd be awesome to have the whole thing already. As far as I can tell the control logic is all that's left, and I think it's just a counter and combinational logic to implement that, but I don't know how much explanation will go into the design, because it's definitely a complex part with a lot of impact on the whole device.
Frankly, I'm shocked that anyone can demonstrate these concepts so accessibly. I'm a software dev by trade, and I have never felt like I understood low-level computing more than I do now. Keep it coming!!!!
Nowadays there are simulator apps which allow you to "glue" together (drag 'n' drop) simple gates to simulate exactly what he did in hardware. Nonetheless this is impressive, since it was done with so much love the hard way.
This video tells you everything about what's going on inside a computer, from software code to hardware operations. Great.
We could run the CE (program counter increment) during cycle 2, shaving a clock cycle off the fetch stage. It doesn't require the bus, so won't collide as long as it's not a counter out cycle.
I was typing this as a question when I saw your comment :) He probably did this so it's easier to understand for the video though.
Depending on how fast the signal propagates, it might even be possible to do it on the same clock cycle (like with the ALU), but considering that instruction fetching requires at least 2 cycles anyways, it doesn't really matter.
Yeah, that was also my idea. And cycle 1 would never be a CO cycle, so that should never run into any trouble.
Nice optimization! Now I'll have to change my copy of this to work that way. Gotta save every cycle we can!
I've been trying to think of a good way to optimize for instructions that only have one cycle (like out or noop) so they don't have to go through all 6 microinstructions. I could add a microinstruction counter clear to the end of the shorter instructions, but then I would only be improving by one cycle. I'd like to cut out the extra steps entirely.
Sometimes saving clock pulses won't give you much, if you have to wait for the RAM to respond anyway. E.g. in a certain microcontroller I used (85C92), it outputs the PSEN signal (Program Store ENable = enable the output from external program memory) for as many as 3 clock pulses, doing pretty much nothing, just to make sure the external memory had enough time for its internal circuitry to put the actual data on the bus after it received the enable signal. The microcontroller then latches the instruction from the bus on the rising edge of the PSEN signal at the next clock pulse.
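To make the optimization in this thread concrete, here is a minimal sketch of a microcode table with CE folded into the second fetch step and a hypothetical reset bit that ends short instructions early. The signal names follow the series' style, the opcode encodings are the ones used later in the series, and the RST bit is an invented illustration, not part of Ben's actual design.

```python
# Control-signal bits in the style of the series; the bit positions here are arbitrary.
MI, RO, II, IO, AI, AO, BI, EO, OI, CE, CO = (1 << i for i in range(11))
RST = 1 << 11  # hypothetical "reset step counter" bit for ending short instructions early

# Two-step fetch: CE (program counter increment) never needs the bus,
# so it can ride along with the RO|II step instead of taking its own cycle.
FETCH = [CO | MI, RO | II | CE]

MICROCODE = {
    0b0001: FETCH + [IO | MI, RO | AI, RST],           # LDA: A <- RAM[operand]
    0b0010: FETCH + [IO | MI, RO | BI, EO | AI, RST],  # ADD: B <- RAM[operand], A <- A + B
    0b1110: FETCH + [AO | OI, RST],                    # OUT: output register <- A
    0b0000: FETCH + [RST],                             # NOP
}

for op, steps in MICROCODE.items():
    print(f"{op:04b}: {len(steps)} microsteps instead of a fixed 6")
```

In hardware the RST bit would simply feed the microinstruction counter's reset input, which is the "microinstruction counter clear" idea described above.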
This is such a nice project, really well done and perfectly explained. It reminds me of something similar we did in the digital electronics lab one semester, back at the Technical University of Berlin. It is so great to see, after so many years, someone bring the deepest roots of our digital age to light in such an instructional and well-made video. We couldn't use as many breadboards for our "CPU", because our class was split into groups of only four people each and we ran a bit short on breadboards. We were actually told to build a 4-bit CPU, but I wanted an 8-bit one, which I regretted many times later ;-). We ended up with seven or eight densely packed breadboards and wiring that looked like an aerial photograph of the Amazon basin rain forest. But in the end it worked up to a 2 MHz clock frequency. We used an EEPROM as the instruction decoder, in a way much like what you would call microcode. The instruction opcode was the address of the "microcode subroutine" in the EEPROM, and the data bits in the EEPROM represented the different states of the multiplexers etc. There was a common microcode routine which incremented the program counter by one, put its contents on the address bus, loaded the data into the instruction register and put its contents on the address bus of the EEPROM. The LDA command would increment the program counter once more (because we used 8-bit opcodes and 8 bits for data), load that byte into the instruction data register, put the value of the instruction data register on the address bus, load the addressed contents into register A, and jump to the common "load next instruction" subroutine in the EEPROM afterwards. Other groups went with an FPGA as the instruction decoder and had a hard time even with their 4-bit CPUs. Our EEPROM approach was so plain dead simple and straightforward that, although we had officially been given the choice between FPGA and EEPROM, it felt a little bit like cheating ;-).
After this lab, in the era of the Z80, 8086 and 68000, I felt like I knew what every transistor in my computer was doing when I was writing a program. Something I wouldn't dare to say today ;-)
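For anyone trying to picture the EEPROM scheme described here, this is one way its contents could be laid out: each word holds the control bits plus a "next address" field, the opcode itself is the entry address of that instruction's routine, and every routine ends by jumping back to the shared fetch routine. All names, field widths and addresses below are invented for illustration; they are not taken from the original build.

```python
# Hypothetical control bits for the sketch.
PC_OUT, MAR_IN, RAM_OUT, IR_IN, PC_INC, A_IN = (1 << i for i in range(6))

FETCH = 0xF0      # assumed address of the shared "load next instruction" routine
FROM_IR = "IR"    # marker: the next address comes from the opcode latched in the IR

# EEPROM contents: address -> (control bits, next address)
microcode = {
    # shared fetch: address RAM with the PC, read the opcode into the IR,
    # bump the PC, then dispatch to the routine whose entry address *is* the opcode
    FETCH:     (PC_OUT | MAR_IN,          FETCH + 1),
    FETCH + 1: (RAM_OUT | IR_IN | PC_INC, FROM_IR),
    # an LDA-style routine, entered at address 0x01 (its opcode):
    0x01: (PC_OUT | MAR_IN,           0x02),   # the PC now points at the operand byte
    0x02: (RAM_OUT | MAR_IN | PC_INC, 0x03),   # the operand is an address: latch it, bump the PC
    0x03: (RAM_OUT | A_IN,            FETCH),  # load the addressed byte into A, back to fetch
}
```

In hardware the next-address field just feeds the EEPROM's address register, which is what turns each instruction's microcode into a little subroutine.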
That was the best demonstration of how a computer is hardwired to run software commands. Your manually stepping through the subroutines makes it very clear how signals are processed. When they say "machine language": every machine has its own architecture, so the designer actually hardwires the command codes, then implements the command syntax (software) used to execute that hardwired coding.
Very good demonstration.
The hardware debugging at the end was strangely the most interesting part of this video. Really appreciate you keeping the whole process on screen.
For me, this is the video that makes the whole series worth it. You build all these modules that each do their own thing, which is fun, and then here when it all comes together, magic happens and the whole thing becomes a programmable computer
8:20 "The beginning of executing any command is a *Fetch* cycle"
-- as somebody who's taken countless computer science courses, and learned about the Fetch-Decode-Execute cycle many tens of times, hearing it come up here, for some reason, blew my mind. Never before have I understood something so completely and so instantaneously. The entire reason for the FDE cycle makes so much sense when I've seen you building the CPU from scratch -- how else would it work?
Not that I didn't get it before: but now, suddenly, in that moment, I felt like I understood every facet of it. I had an epiphany and it was wonderful. Thank you so very much for these masterful videos!!!
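For readers who want the fetch-decode-execute loop in one place, here is a toy version of what the breadboard build does one control signal at a time, collapsed into a few lines of Python. The opcode encodings (LDA=0001, ADD=0010, OUT=1110, HLT=1111) and the 28+14 program are assumptions based on how the series continues; treat it as a sketch, not Ben's code.

```python
def run(ram):
    """Toy fetch-decode-execute loop for a 16-byte, 4-bit-opcode machine."""
    a = b = out = 0
    pc = 0
    while True:
        instr = ram[pc]                              # fetch: IR <- RAM[PC]
        pc = (pc + 1) & 0x0F                         # ...then increment the PC
        opcode, operand = instr >> 4, instr & 0x0F   # decode
        if opcode == 0b0001:                         # LDA: A <- RAM[addr]
            a = ram[operand]
        elif opcode == 0b0010:                       # ADD: B <- RAM[addr]; A <- A + B
            b = ram[operand]
            a = (a + b) & 0xFF
        elif opcode == 0b1110:                       # OUT: output register <- A
            out = a
            print(out)
        elif opcode == 0b1111:                       # HLT (an encoding added later in the series)
            return out

# LDA 14, ADD 15, OUT, HLT, with 28 and 14 sitting in the last two RAM slots
run([0x1E, 0x2F, 0xE0, 0xF0] + [0] * 10 + [28, 14])   # prints 42
```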
This is SO much better than the traditional approach for teaching computer architecture, which typically involves describing a fictitious computer and going through the motions of describing what will happen at each step of execution. Seeing the LEDs change states as the clock is cycled makes it REAL. Especially when it doesn't behave as expected! Sure, machine simulators are great for demonstrating historic machines, but here it's like we're watching it come to life.
It's quite amazing: the control logic really isn't any more difficult than anything that has gone before (hey, that multiplexed decimal display was a work of art!), but this is where the magic happens - it stops being a pile of modules and begins to be a computer.
Now I'm enlightened about the control unit, wow. I can get a little sense of how we're going to define LDA and other commands, just as we did with the display: combinational logic or a ROM, or maybe some other way.
What's said is very clear and thorough; you can clearly understand the CPU architecture and logical operation. Hard work!
Hi, I'm a mechatronics student currently in my sixth semester. I have to say your explanation in this video is phenomenal; your way of showing how a computer actually runs a program is intuitive, you point out all the important details, and I just love the structure you gave to the computer and how you actually program the commands. Also, the way you searched for the issue at the very end, measuring highs and lows until you found the specific place where it was happening, and looking at the datasheet for the component to see if it was working the way it's supposed to: I can totally relate to that struggle; it's just that my projects have obviously not been this complex lol. Anyway, great video, thank you very much for the knowledge you're spreading, and greetings from Mexico!
I really appreciate your methodical approach: step through it manually then think how can we get logic to automate this step, rinse and repeat. It has been a dream of mine to build an 8-bit computer and you make it seem a lot closer.
Hey I just found your channel and I just started learning the basics of electrical engineering.
I don’t exactly want to be an electrical engineer but I do want to get into embedded systems engineering so I’m going to start out learning computer architecture along with the electronics side of things as I already know programming.
But I cannot say enough how much of a gold mine your channel is. There is nothing else like it on YouTube that I have found so far, and you explain things so well that even my puny little brain that only knows Python and Rust can understand it.
Please continue doing the amazing work that you are doing.
I pulled out an old microcontroller I used in EE school years ago because my kids are starting to show interest in computer science, and I figured it could be a fun way of making simple projects. When trying to start it up and program it, I quickly got overwhelmed by everything I had forgotten or never bothered to learn. This video series revived all that knowledge and those memories. When the modules started transferring information on clock pulses, I was just about giddy to the point of tears.
Thank you for such a great explanation. This series is bringing me some clarity around assembler and a deeper understanding of modern-day computers. Thank you! By the way, I think the best part of the video series is when something doesn't work and troubleshooting mode commences.
Man, this simple program takes so much; glad we have microprocessor ICs.
This video helped me understand the difference between assembly and machine code and just how it actually works. What a good feeling of dopamine
I have learned so much from these videos, I wish I could have seen them when I was in school. Thanks for showing the troubleshooting, I think it is so important in any design.
Love that you include your debugging steps when something goes wrong - helps a lot!
I wish I could pick your brain for a day. A great breakdown/demo, thanks for sharing.
This entire series is by far the most valuable thing I have ever found on YouTube. Thank you so much for putting this out. And not even for me, I'm a seasoned developer, but the educational value of these videos cannot be calculated, not even by your Turing-complete computer ;-)
I have been watching your videos today (rainy Easter here in England). I am inspired to build this and have been gathering parts lists from the videos and sourcing components etc. I built my own 8 bit machine back in the day and this will be a journey into even more fundamental architecture. Thank you.
just watched every electronics video on this channel in 3 days - and I cannot wait for more :-D
Ha, it was pretty great to be following along on the computer I built myself and getting "42" on the first try while Ben was busy troubleshooting. Thanks for getting me this far, Ben!
Greetings from Belgium, that's pure dope. I don't know why we don't learn this way; a graphical approach is so important, easier to learn and so much more interactive, and we can also feel how deeply you love the tech.
Thank you Ben for all these videos so far! I have watched every one of them :) Keep up the good work!
Great video, now I understand how 1s and 0s turn into instructions. Added bonus was watching you troubleshoot a problem.
I went through a computer science degree program and I absolutely love your videos. It's so much better than what I learned in university 😂. Especially this part.
Can't thank @Ben enough. Truly a treasure trove. I don't know why it took me this long to find these videos.
Wow, what an absolutely excellent series. As a self-taught (high-level) programmer this stuff has always been a bit of a mystery to me, so glad I can go through it now for free. Thanks!
The most amazing computer architecture lecture I have ever seen
I just found this playlist the other day and watched through it. Great explanations! I'm very excited for the control logic.
This is my favorite episode. Can't wait to build this myself
These videos have blown my mind! Thanks Ben. I got all excited by your videos and bought components to build one myself. Some need to come from China, so it will take time.
I don't know why YouTube recommended this video to me, but I'm happy it did. I'm a handyman, but I have a degree in electrical engineering, so I remember learning this stuff in college. Assembly was my favorite language and this video was a delight to watch!
Ben Eater makes it the Best Easter
And Ben gets to Eat the Easter eggs too. ;D
This is the most fucking amazing video series you can possibly find on the internet
I studied an 8-bit CPU built here at our local university. One of the best classes I've taken in my computer science course. Super cool to see it working (and to code it).
You're a Wizard, I've always wanted to understand this, it's just never gone into my "memory", thanks to you, I'm now starting to "See" what makes it "Chooch" absolutely brilliant series of videos...Thanks.
All the questions I had about computer science over the last years have been answered while watching this amazing channel! Thanks, Ben.
I work in SW engineering, so I know how this stuff works. But I have never seen it explained as f****g well as in this video! Thanks and subscribed!
Epic videos, man. I can't believe I learned so much about computers from you on YouTube. It all started with me just wanting to discover how software and hardware interact, and I ended up watching hours after hours of your videos. Thanks for this awesome information.
Can't wait for the next video!
Been watching this series breathlessly over the past few days, thank you!
One of the greatest videos I saw. And the answer to the “Great Question of Life, the Universe and Everything” at the end makes it EPIC:)
Thank you sir! Connected the dots of years of questioning
Coolest YouTube series ever!
Saw yesterday's and have been eagerly anticipating this one.
That moment between 0:40 and 0:55 is THE very best moment for those following the series from the beginning. If you understand this deeply, you understand it all.
Thank you so much. This is eye-opening. I'm new to this kind of logic, so it's a crash course for me. 5.5 GHz is really fast! Thanks for slowing it down so we can see what happens.
All the best for you! 🤗
Fantastic! I have long understood how your basic logic works for arithmetic, but was fuzzy on how instruction decoding worked. This is really helpful along those lines.
Thank you. I really enjoyed this video. I know that setting up the board took a long time, but it was very educational to see it all happen.
We can do whatever we want. We can add, we can subtract. We can... That's all we can do.
Kidding. This is awesome!
From time to time I come back and rewatch your videos, and your amazing teaching skills never cease to impress me!
I've been watching the whole playlist and basically sitting aghast. And the end of this video was as good as any thriller I've ever seen. On to the next vid. Cheers.
I always wondered how microcode worked on modern CPUs, but after watching your video I feel I have a much better understanding of what microcode actually does. The implication is that you could take an EEPROM (or two) and have it drive the control lines: it takes the instruction (probably multiplied by some value to divide the EEPROM into equal regions for the bits that go to the control lines) and dumps a series of bits onto the control lines on each clock cycle, until some signal indicates that the "microcode" for that instruction is complete (perhaps a special signal on the control lines, or just a constant value). Granted, you would need some additional control logic just for sequencing the output of the EEPROM(s) onto the control lines: a multiplier for the opcode to form the start address fed into the EEPROM, and perhaps some way of sequencing the clock pulses to ensure the bits from the EEPROM reach the control lines before the other components are clocked after the EEPROM address is incremented. But the benefit is being able to reprogram the computer's instruction set on a whim :) Awesome series Ben, keep it up!
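A rough sketch of the "multiply the opcode" idea from this comment: with a 3-bit step counter, the multiplication is just a left shift, so the EEPROM address is (opcode << 3) | step and the word stored there is the control word for that step. The sizes (16 opcodes, 8 steps) are assumptions for illustration.

```python
STEPS = 8   # 3-bit microinstruction step counter

def eeprom_image(microcode):
    """Flatten {opcode: [control word per step, ...]} into a 128-entry EEPROM image,
    addressed as (opcode << 3) | step."""
    image = [0] * (16 * STEPS)
    for opcode, words in microcode.items():
        for step, word in enumerate(words):
            image[(opcode << 3) | step] = word
    return image

# If the control word is wider than one EEPROM's data bus, split it across two
# chips sharing the same address lines, e.g. the high byte is word >> 8.
```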
Thanks Ben! Much appreciated for the upload of all the videos.
I am currently watching your videos avidly, and am in the process of creating a full 8-bit computer with an 8-bit data bus and address space. I am using a single 64 KB RAM chip with I/O pins, and have that working. (I think I might create a secret instruction to "page" memory too) LOL.
Anyway - so far with your instructions, I have a clock, Ram Module, Program Counter, A, B and ALU. My next step is doing the EEPROM programmer, Command register, and the CPU control logic. Cannot wait to see more of this series.
The biggest issue I think I am going to have fun with is going to be designing commands that are two bytes long. For example, LDA #FF, which would load A with address 256, requiring a PC increment to get to the instruction's expected address location. And then of course an LSA (load static value to A) command, which hard-loads a number into the register rather than from a memory location.
Having a great time - and it's so much fun realizing that the technology of an Apple II is kind of pretty much reachable by the common man.
Thanks again.
I'm just working this out now for my design. How did you get on? Regarding the LSA command, the number you want to load is actually already in a memory address - the next one along from the command!
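A hedged sketch of what the reply above describes, written as microsteps in the style of the series: after the normal fetch the program counter already points at the operand byte, so a "load immediate" (the LSA idea) just reads that byte straight into A and steps the PC past it. Signal names and steps are illustrative assumptions, not the original poster's design.

```python
# Hypothetical control bits in the style of the series.
MI, RO, II, IO, AI, CE, CO = (1 << i for i in range(7))

FETCH = [CO | MI, RO | II, CE]      # the three-pulse fetch shown in this video

# LDA addr: the operand nibble in the instruction register is a memory address.
LDA = FETCH + [IO | MI, RO | AI]

# LSA / load-immediate with the operand in the *next* program byte:
# the PC already points at it, so read it into A and bump the PC past it.
LSA = FETCH + [CO | MI, RO | AI | CE]
```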
23:34 "Do something" - literally everybody following along and building this kit. Loving it.
This video is especially fascinating for me. The step-by-step procedure done here is essentially how I would make a program in a high-level language (e.g. Python)! That also includes the very interesting "debugging" bit there. I'd literally insert print statements at various parts of the code to inspect the flow and see where it started going wrong.
It is also very nice to see how each part of the computer works together.
Suggestion: I think this video (or kind of an abridged version of it?) would be a very nice introductory video for your 8-bit computer building playlist :D
I think you have convinced me to build my own Ben, great series.
Your breadboarding wiring is next level.
Amazing. I'm glued. Thanks
Nice explanation how a control unit works in general. Two thumbs up!
I just binge-watched the whole series from the start. Fantastic, I tip my hat to you. I don't know how many hours you have put into this thing. I had an idea of how registers, memory and buses worked, but seeing it makes a big difference. Your delivery is perfect: the right amount of speed and ease in explaining things.
I'm an old fart that recently got into microprocessors and automated controls, I'm absolutely fascinated with electronics.
Waiting to see how you are going to control all these signals, can't wait.
Thanks for your time
God bless you, man. You enlightened me; I have been looking all around the web, and I found all my answers here. Thank you so much for what you do. This is just incredible. I wish you happiness always, and keep on exploring amazing things. Thanks.
Cannot begin to tell you how gratifying it is to have made it this far and have everything work the first time! At least in this vid; plenty of troubleshooting to get to this point. It's almost a computer!
The fetch cycle in your example takes three clock pulses. Can it be sped up to two clock pulses by incrementing the program counter while loading the instruction into the instruction register?
I literally just closed the previous video as this one popped up in my notifications.
Same here!
I watched the previous one on full screen, and when I exited full screen, this video was there!! Talk about awesome!
this one is just.......
You have tons and tons of patience, really nice.
One of the best examples to understand how assembly instructions actually work.
At last! A nuts-n-bolts explanation of what goes on with logic. It's great to see the 'machinery' behind the fancy screen graphics explained so well.
I loved the debugging part. Of course the rest too, but the debugging part was more relatable to me :)
I think I finally understand how the clock works, and I've been overclocking your CPU at 2x speed.
Oh wow, I actually learnt this in college and never really fully understood it until now. Bit bashing definitely helps, and you now have a new subscriber.
This is going to help me so much as a React developer
Did I just find someone who explains how a CPU works??
Finally I can sleep in peace :D
THANK YOU!!!!!
Thanks Ben, I have always wanted to know how microcontrollers and computers work, and with your explanations, videos and knowledge, I have finally been able to. Thanks a lot 🙏🏻🙏🏻🙏🏻
So glad I found this! My digital electronics class left me hungry for more of this kind of thing. Can't wait for more :)
Super cool stuff! I've been following along for awhile, but this is going to be the most interesting part for me: implementing the CPU control logic. I've always wondered how it works on this low a level, and most projects are based on existing CPUs that handle the instruction decoding and everything, so you don't quite get to learn what happens "under the hood," getting from the op codes to the control logic. I'm excited! Keep up the good work, my man!
All those LEDs going off automatically with the control logic is going to look pretty impressive.
23:27 Wow, that was the most anxiety-inducing moment of this series! All the anticipation of working through the code, aaaaaaand... nothing. Lol Glad to see it was a quick fix, and I'm stoked to have finally stepped through the methods of a computer!