I Designed My Own 16-bit CPU

  • Published: 23 Nov 2024

Comments • 2.1K

  • @AstroSamDev
    @AstroSamDev  2 years ago +648

    It sure has been a while since my last video, and there are so many new faces here!
    If you want to interact more with the community, I have a discord server where we talk about cool and random programming things, and you can see sneak peeks of future videos! discord.gg/9p82dTEdkN
    I will also start streaming programming soon on my second channel here: ruclips.net/channel/UCQSOig4wEV_pAFPRg0qtSmQ

    • @offsetmonkey5382
      @offsetmonkey5382 2 years ago

      Hmmm 7 minutes before video was uploaded

    • @Gamer-ct6hb
      @Gamer-ct6hb 2 years ago +1

      @@offsetmonkey5382 sus

    • @kingofthegoats7032
      @kingofthegoats7032 2 years ago

      Discord server is very good everyone should instantly join

    • @stickworldanimated9545
      @stickworldanimated9545 2 years ago

      Very Nice

    • @ItsCOMMANDer_
      @ItsCOMMANDer_ 2 years ago

      Why don't you team up with @Sam Zeloof to make a homemade computer? He made a 1000-transistor silicon-based chip.

  • @ByWire-yk8eh
    @ByWire-yk8eh 2 years ago +1833

    I designed a 16-bit CPU (microprocessor) in 1977 that was used in a mainframe as the I/O processor. It had a unique instruction set (RISC). Lots of fun, and a lot of work.

    • @helloworld145
      @helloworld145 1 year ago +94

      16 bit in 1977? That's insane

    • @OpenGL4ever
      @OpenGL4ever 1 year ago +183

      @@helloworld145 Not for a mainframe. But the price tag sure was insane.

    • @trol0001
      @trol0001 1 year ago +14

      Cool

    • @herrbonk3635
      @herrbonk3635 1 year ago +71

      @@helloworld145 There were 64-bit computers in the early 1960s.

    • @KuroParasite
      @KuroParasite 1 year ago +29

      @@herrbonk3635 64-bit computers must have been crazy expensive in those years lol.

  • @davidm7333
    @davidm7333 1 year ago +2057

    As a programmer, there's nothing that makes you feel quite as much like a fraud as seeing someone create their own CPU, build an emulator for it, create a programming language and compiler, then write a game in it 😅
    Well done and keep up the good work!

    • @AstroSamDev
      @AstroSamDev  1 year ago +334

      Thank you!
      Also, I think the biggest problem with this project is time. I believe anyone can do it, easily, given there are so many resources online. But finding time to do the research and make the design can be difficult. I didn't know anything about computer hardware, but I did research (and watched a lot of Ben Eater), and then a month later I made this CPU. It is relatively basic, but it is a start.

    • @seannibecker5500
      @seannibecker5500 1 year ago +15

      Wow, are you just super smart or something @@AstroSamDev

    • @Thiquid
      @Thiquid 1 year ago +17

      ​@AstroSamDev Even still, it takes incredible dedication to complete. Even if there's enjoyment to it, what you've done is beyond impressive!

    • @monad_tcp
      @monad_tcp 1 year ago +2

      Do you use programming languages that were not made for browsers? (i.e., not JS)
      If you answer yes to that, you're not a fraud.

    • @okie9025
      @okie9025 11 months ago +14

      @@monad_tcp you wouldn't be able to write this comment without JS

  • @Azeal
    @Azeal 2 years ago +1848

    You have a level of knowledge about low-level computing that I wish I had in, well... anything. Incredible work, continue to hone your talents and I'm looking forward to that build when the day arrives!

    • @alperkaya8919
      @alperkaya8919 2 years ago +25

      Just read textbooks

    • @elakstein
      @elakstein 2 years ago +16

      These things are taught in engineering.

    • @PflanzenChirurg
      @PflanzenChirurg 2 years ago +31

      @@elakstein no they don't!

    • @elakstein
      @elakstein 2 years ago +7

      @@PflanzenChirurg ok. I will correct myself. I learnt these things in college. Obviously not as deep as he showed in this video.

    • @vishnuskandata2355
      @vishnuskandata2355 2 years ago +3

      @@elakstein true... but we somehow don't understand or learn it there, yet a person like this can teach it in just hours...

  • @timedmondson1307
    @timedmondson1307 1 year ago +256

    I have been involved in computer engineering since the '90s, and seeing how easily accessible the knowledge is now is mind-blowing. I had to spend years on what he has done in a few months. This is truly the best time to be alive.

    • @handlehaggler
      @handlehaggler 11 months ago +10

      I can't imagine what it must be like for the people who were there from around the beginning of all this craze to where we are now. At some point I imagine you knew everything there was to know, but as time moved on new products, programs and such became available, and after a while it just becomes too much to learn and you just watch in awe at how dense and complex it's become.

    • @MarkieMark-vy7hg
      @MarkieMark-vy7hg 8 months ago +1

      @@handlehaggler the lady who helped invent VoIP is still alive… she was one of the hallmark individuals in the creation of the World Wide Web. It's so wild to me.

    • @onlyms4693
      @onlyms4693 2 months ago

      That's why we need to support the people who bet their lives on cracking paywalled research papers with the newest knowledge and publishing them for free. You can call it not right, but people always have the right to seek knowledge.

  • @UNIT398
    @UNIT398 2 years ago +39

    Simply amazing. The fact that you did all this for fun (mostly) speaks to your intelligence and understanding of how technology works. Yes, my brain didn't understand most of it, but seeing you construct each layer, part, and program was fascinating! I hope to see that creation of the CPU and other parts!

    • @JesusisJesus
      @JesusisJesus 1 year ago +2

      I understood about 0000000000000000% of it.
      Wondering WHY he did it, when there's stuff already available and cheap these days. I guess it boils down to "Because I can".

  • @maindepth8830
    @maindepth8830 2 years ago +4320

    The fact he did all this in under 4 months is impressive

    • @dimi5862
      @dimi5862 2 years ago +55

      I can do it in one

    • @maindepth8830
      @maindepth8830 2 years ago +263

      @@dimi5862 go on, show me, everything from scratch, from nothing. As of this comment the date is Saturday, 23rd July 2022.
      You can work during live streams to make sure you didn't cheat

    • @AkariInsko
      @AkariInsko 2 years ago +168

      i can do it in one
      (millenium).

    • @brysonw7270
      @brysonw7270 2 years ago +8

      @@maindepth8830 lol but true

    • @brysonw7270
      @brysonw7270 2 years ago +10

      @@dimi5862 prove it

  • @theburntcrumpet8371
    @theburntcrumpet8371 2 years ago +974

    I've been programming for 8 years. I have recently fallen out of love with it. You have reminded me why I loved it in the first place - the ingenuity of simple yet expansible design, the thrill of problem solving and the marriage of hardware and software. Thank you.

    • @memeology4138
      @memeology4138 2 years ago +1

      Hello, can u help me? I'm new to coding, can u link me to any good sources for learning Python

    • @elakstein
      @elakstein 2 years ago +38

      @@memeology4138 Learn C first. Python abstracts just too much; your first language should be C if you want to understand things. I would recommend "Let Us C" for learning C. We used it to learn C in college, then I learnt Java for understanding object-oriented languages, then Python.

    • @abcdefghjijklfgu
      @abcdefghjijklfgu 2 years ago +2

      @@memeology4138 I'm new too but I'm doing C and JS

    • @memeology4138
      @memeology4138 2 years ago +2

      @@elakstein thx for ur help

    • @tacorito1809
      @tacorito1809 2 years ago +6

      @@abcdefghjijklfgu why C and JS? Learn one language first, master it. Plus, JS is usually nothing without HTML/CSS, unless you're learning React and stuff.

  • @le9038
    @le9038 2 years ago +5926

    >> creates own programming language
    >> learns and creates a CPU with its own assembly and microcode
    this man is becoming a god.

    • @smallcatgirl
      @smallcatgirl 2 years ago +96

      Broo this is not 4chan

    • @smallcatgirl
      @smallcatgirl 2 years ago +11

      @@le9038 just assumed it was

    • @nuclear-salmon
      @nuclear-salmon 2 years ago +46

      @@le9038 ew... reddit

    • @KatzRool
      @KatzRool 2 years ago +4

      @@le9038 I think you'll find it's quite literally the opposite lmaoo

    • @vaisakh_km
      @vaisakh_km 2 years ago +30

      Godlike, like Terry himself...
      waiting for an alternate TempleOS :)

  • @ChaseDetrick
    @ChaseDetrick 4 months ago +4

    Watching this after my first year as a computer engineering major makes me feel so incredibly proud of how far I've come in the past year.

  • @estebanod
    @estebanod 2 years ago +24

    This guy is building his whole DIY computer and I struggle to write a three-line-long Python code to download pictures from a website ...

    • @eekee6034
      @eekee6034 1 year ago +3

      Yeah. I've wanted to build my own computer for decades, but my brain just goes, "what are these facts NOPE!" lol but it's getting better slowly.

  • @D.a.n_D
    @D.a.n_D 2 years ago +236

    So I just barely passed my computer architecture course. We did mostly a 32-bit MIPS processor, and we also had to design one ourselves, but on 16 bits, and we didn't have to add multiplication or division. We only implemented about 15 instructions, and I can tell you that what this man did right here is no easy work. You deserve way more subscribers for the amount of work you put in, man. Great work!

  • @Auxilor
    @Auxilor 2 years ago +592

    Amazing. Deserves far more views! Would be great to see you make this CPU for real like the jdh-8

    • @AstroSamDev
      @AstroSamDev  2 years ago +125

      I was planning on doing just that, it might just take some time though. I also want to do a design that doesn't use breadboards, and instead have a few custom PCBs printed.

    • @AstroSamDev
      @AstroSamDev  2 years ago +58

      @@avinadadmendez4019 Cool, I have never heard of that program, I'll look into it more. Thanks!

    • @pixobit5882
      @pixobit5882 2 years ago +14

      @@AstroSamDev I would buy it if it's not too expensive! I really like this project and I think it would be really fun to develop my own programs that run on real hardware!

    • @cemalettincembelentepe8943
      @cemalettincembelentepe8943 2 years ago +21

      @@AstroSamDev You can also put your design on an FPGA; I don't think I have ever seen that done on YouTube before for a homebrew CPU. For those who don't know: FPGAs are chips onto which you can load any digital circuit. For example, you can put your circuit on an FPGA, map the input/output pins of the top module, and the FPGA now is the circuit you created.

    • @xanderkyron
      @xanderkyron 2 years ago +9

      @@cemalettincembelentepe8943 FPGAs have been used for soft CPUs for a very long time now (well over a decade); it would be more interesting to see a hard CPU manufacturing approach, which would be unique among these homebrew CPUs, unlike using an FPGA. Granted, he'd be stuck getting them as samples on a really old process node (>=28nm), but that's more than good enough for this purpose.

  • @polic72andDrD3ath
    @polic72andDrD3ath 2 years ago +95

    It's been a while since I've taken a comp org class; this was the perfect bridge of software and hardware that reminded me why I loved it so much!

  • @nichonaugle5659
    @nichonaugle5659 2 years ago +5

    Incredible!! This is the most complex thing I've ever picked up on so quickly. You literally made this sound so straightforward, I love it. Thank you so much!!!

  • @daviddafitt
    @daviddafitt 2 years ago +3

    This is amazing. I've always wanted to try things like this but I'm just a busy CS student for now. I'll come back to this if I do get time to start.

  • @lewisd56
    @lewisd56 2 years ago +301

    I did something similar to this a good few years ago: I designed an 8-bit CPU in LogiSim, wrote an emulator in C#, and designed an assembly language and assembler. Unfortunately, I made a lot of mistakes in designing the architecture, things like fixed instruction lengths and no way to perform operations on values larger than 8 bits (technically it was possible, but with 256 bytes of memory and 8 of them reserved for registers it wasn't very useful anyway). Having recently written a Z80 simulator, and learning a lot about CPU architecture in the process, helped me identify all of the issues in my initial design. With that said, it wasn't bad for an A-level project that confused the hell out of everyone who tried to mark it.

    • @absalomdraconis
      @absalomdraconis 2 years ago +5

      256 addresses and some external logic should be enough to implement a microcode architecture.

    • @TechTino
      @TechTino 2 years ago +13

      A-level??? Dude that's unreal. I built an SQL table viewer in VB...

    • @shinyhappyrem8728
      @shinyhappyrem8728 2 years ago +4

      Fixed instruction length isn't that bad, it's what RISC uses.

    • @lewisd56
      @lewisd56 2 years ago +2

      @@shinyhappyrem8728 considering the memory limitations, every instruction being 3 bytes long is an issue (command operand1 operand2); some instructions didn't need operands, such as halt, but would still take 3 bytes in RAM.

    • @lewisd56
      @lewisd56 2 years ago +4

      @Ramtin Javanmardi If you understand how the architecture works, writing an emulator isn't really too hard. You could even write a simulator if you wanted, but then you need to get instruction timings and all sorts of hardware details simulated too, which can be a hassle.
      Basically, you need two core things for an emulator: an ISA, which you should have if you designed the microarchitecture, and some way to interpret the ISA; really this will probably be high-level logic in your chosen programming language.
      I simply used a switch on binary opcodes that called methods with the operands (remember, because this is an emulator, memory read and write timings don't matter), but a better approach (and the one I used in my Z80 simulator) would be to use instruction classes and loop through an array of these classes (well, cast to interfaces), testing whether the opcode can execute a given function in the class.
      How you handle IO is up to you. I didn't, but memory-mapped IO is pretty easy to implement and can be done by simply accessing the RAM object of your emulator.
      My Z80 simulator is open source and was written for a university project, so it might help; it can be found here: github.com/Lewinator56/z80Sim (you will need .NET Core 3.1)
      Writing an assembler is kind of different. In reality, at the easiest level, you are simply going to take a defined assembly language and convert it into opcodes, which is pretty easy to do. If you want to include functions and variables (by name) in the ASM, then you need to process symbols too. I simply stuck with a conversion from ASM to binary.
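The switch-on-binary-opcodes emulator loop described in this thread can be sketched in a few dozen lines of Python. The ISA here is hypothetical and invented for illustration (3-byte instructions, four registers, made-up opcode values), not the commenter's or the video's actual design:

```python
# Minimal sketch of a switch-on-opcode emulator, assuming a hypothetical ISA:
# fixed 3-byte instructions (opcode, operand1, operand2), 256 bytes of RAM,
# four 8-bit general-purpose registers.

RAM_SIZE = 256

class Emulator:
    def __init__(self, program):
        self.ram = bytearray(RAM_SIZE)
        self.ram[:len(program)] = program
        self.regs = [0] * 4
        self.pc = 0
        self.halted = False

    def step(self):
        # Fetch one fixed-length instruction, then dispatch on the opcode.
        op, a, b = self.ram[self.pc], self.ram[self.pc + 1], self.ram[self.pc + 2]
        self.pc += 3
        if op == 0x00:        # HLT: stop execution (operands unused)
            self.halted = True
        elif op == 0x01:      # LDI r, imm: load immediate into register
            self.regs[a] = b
        elif op == 0x02:      # ADD r1, r2: r1 = (r1 + r2) mod 256
            self.regs[a] = (self.regs[a] + self.regs[b]) & 0xFF
        elif op == 0x03:      # STO r, addr: store register to RAM (memory-mapped IO would hook in here)
            self.ram[b] = self.regs[a]
        else:
            raise ValueError(f"unknown opcode {op:#04x}")

    def run(self):
        while not self.halted:
            self.step()

# Usage: load 2 into r0, 3 into r1, add them, store r0 at address 255, halt.
prog = bytes([0x01, 0, 2,  0x01, 1, 3,  0x02, 0, 1,  0x03, 0, 255,  0x00, 0, 0])
emu = Emulator(prog)
emu.run()
print(emu.ram[255])  # → 5
```

Note how, exactly as the comment says, read/write timings never appear: the dispatch loop only models architectural state, which is what separates an emulator from a cycle-accurate simulator.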

  • @Roter_Wolf
    @Roter_Wolf 2 years ago +162

    When I was 16 i just started learning my first programming language. Meanwhile this guy is casually creating his own languages and also designing a computer from scratch while he's at it.
    You're going places, man

    • @angryvaultguy
      @angryvaultguy 6 months ago +4

      You think you feel underskilled? I'm 22 and just learning Visual Studio Code and Django

    • @FluixMakesGames
      @FluixMakesGames 1 month ago

      @@angryvaultguy dang

  • @comandercrypto1318
    @comandercrypto1318 2 years ago +180

    You could learn Verilog or VHDL to create the CPU on an FPGA, then use off-the-shelf ICs for RAM and ROM. FPGAs, if you don't know, are programmable hardware that rewires itself based on code. Intel, AMD, and other companies use multi-million-dollar ones for prototyping chips before sending them off to print. Consumer models range from $50 to $300, and have been used for soft implementations of retro systems and modern 32- and 64-bit RISC-V based CPUs.

    • @proxy1035
      @proxy1035 2 years ago +15

      alternatively you can also be lazy like me and simply build your circuits in a logic simulator like "Digital" (it's like Logisim but faster and with extra features) and then just export them as Verilog/VHDL.
      it works surprisingly well actually.

    • @hedgehuug1603
      @hedgehuug1603 2 years ago +1

      yes this

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt 2 years ago +3

      And at this point I would have thought that chips from Intel and AMD are pure evolution... no revolutionary steps. Steps are not agile. Maybe they include experimental modules in every chip, test them, and if it works, you get the next iteration (more expensive). An FPGA and a chip from Intel have nothing in common: Intel chips are optimized down to the photolithography layout and full analog simulation (SPICE) of the transistors, and an FPGA does not give you this. The days of the Pentium FDIV bug, where a professor from a university used an ad hoc script to get his algorithm from university software into some proprietary CAD at Intel, are long gone.

    • @OpenGL4ever
      @OpenGL4ever 1 year ago

      @@ArneChristianRosenfeldt The advantage of using an FPGA instead of individual chips for this project is that others could make use of it too by simply reprogramming their FPGA. There is already a real CPU design called Magic-1 by Bill Buzbee. The problem is, others have to recreate it if they want a copy of it. Search for Magic-1 on YT; there are some videos about this impressive project.

    • @darkcoeficient
      @darkcoeficient 1 year ago

      @@ArneChristianRosenfeldt wouldn't an FPGA be a bit more like a drawing board at that level?

  • @raychrash5139
    @raychrash5139 2 years ago +2

    Dude, legit, it's my first year at computer science university, and the fact that I understood everything you said in this video makes me happy. You inspired me to try messing around with CPU design.

  • @mixmaster5150
    @mixmaster5150 2 years ago +2

    I love that you're using the Game Boy Sims music for this video. I mean, those are incredibly underrated games, but they made my childhood.

  • @nathanaeltrimm2720
    @nathanaeltrimm2720 2 years ago +413

    Man, the sequel to templeOS is looking wild, I’m just waiting for the twist the writers have planned.

    • @BigOrse
      @BigOrse 2 years ago +103

      It's just TempleOS minus the schizophrenia

    • @HBP27
      @HBP27 2 years ago +37

      @@BigOrse so it's just boring then

    • @BigOrse
      @BigOrse 2 years ago +13

      @@test-zg4hv What so I should praise the man for being completely insane?

    • @tsclly2377
      @tsclly2377 2 years ago +4

      @@BigOrse I'll wait on that diagnosis..

    • @BigOrse
      @BigOrse 2 years ago +15

      Not coming here to defend myself, I don't care if what I said was stupid. To anyone saying I'm disrespectful though I suggest you maybe actually look into Terry Davis as a person, because he was a lot more disrespectful than I am.
      Anyway RIP Terry. I may not agree with his ideology but can't deny the man was a pretty legendary programmer.

  • @martinooliveri7310
    @martinooliveri7310 2 years ago +66

    Amazing, and actually inspiring. I'm looking forward to making my own operating system in the future, and these videos bring me so much joy. Cheers!

    • @WTBuilder
      @WTBuilder 2 years ago

      is it linux based?

    • @4.0.4
      @4.0.4 2 years ago +1

      @@WTBuilder _"Interjection!!"_
      - Stallman Ace Attorney

  • @asyncawait5335
    @asyncawait5335 2 years ago +7

    Congrats on everything man, I'm glad to see the progress you are making!

  • @Hylianmonkeys
    @Hylianmonkeys 2 years ago +5

    HELL YEAH.
    I unknowingly taught myself how to build computers using redstone in Minecraft.
    At this point I've now built dozens of redstone machines, including an entire game system in Minecraft. It has a 7x7 pixel screen, a 7-bit CPU, more than 80 4-bit registers, and a ROM size currently of only 5.456KB or 496 bytes. However, the ROM can be expanded to indefinite sizes.

  • @jayanthpandit3437
    @jayanthpandit3437 11 months ago +1

    watched this video once before college and didn't understand much. watched it again after my first semester as an ECE major and understood a lot more. thanks!

  • @Bobbias
    @Bobbias 2 years ago +54

    For smoother movement of the paddles, you could have created 2 more characters (either added to the charset, or overwriting some unused characters) with half height paddle characters, so your paddle could move in half-tile increments.

    • @Maykeye
      @Maykeye 2 years ago +4

      if it supports color for text, 1 character is enough: you simply swap background and foreground color, which turns ▀ into ▄.

    • @morgan0
      @morgan0 2 years ago +1

      or a way to write arbitrary data into unused characters and then display that with the faster character rendering

    • @OpenGL4ever
      @OpenGL4ever 1 year ago

      This would make the drawing routine more complex and thus slower. Better to design a bit blitter and let the blitter hardware do the rest.
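The half-tile trick suggested in this thread can be sketched quickly. The glyph choices below ('█' full block, '▀' top half, '▄' bottom half) and the function shape are illustrative assumptions, not the video's actual character set:

```python
# Sketch of half-tile paddle rendering: the paddle's position and height are
# given in half-tile units, and each screen row picks the glyph (full block,
# top half, bottom half, or blank) that covers the occupied halves.
def paddle_column(y_half, height_halves, rows):
    """Render one column of `rows` character cells for a paddle spanning
    half-tiles [y_half, y_half + height_halves)."""
    col = []
    for row in range(rows):
        top, bottom = 2 * row, 2 * row + 1   # the two half-tiles in this cell
        has_top = y_half <= top < y_half + height_halves
        has_bottom = y_half <= bottom < y_half + height_halves
        col.append('█' if has_top and has_bottom else
                   '▀' if has_top else
                   '▄' if has_bottom else ' ')
    return col

# A paddle 4 half-tiles tall, starting at half-tile 3, on a 4-row column:
print(paddle_column(3, 4, 4))  # → [' ', '▄', '█', '▀']
```

Moving the paddle by incrementing `y_half` by 1 instead of 2 gives the smoother half-cell motion the comment describes, at the cost of two extra glyphs in the character set.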

  • @leeaffdraws8502
    @leeaffdraws8502 2 years ago +8

    Woah, I remember last year one of the subjects I had during the semester was about computer architecture. We saw how the ALU works, the RAM, the control unit, instruction and microinstruction sets, and even peripheral handling. Our tests were about designing some use cases for each component, which was a demanding process. Seeing you put that knowledge to work makes me remember how amazed I was while taking those classes. Nice video! And excellently explained too.

    • @mathismartin3092
      @mathismartin3092 2 years ago +1

      I did that last year too! I don't know what software you used, but here we did it with diglog and it was a lot of fun ^^

    • @leeaffdraws8502
      @leeaffdraws8502 2 years ago

      @@mathismartin3092 Our exams were mainly done on paper, but we used logisim for practicing! Of course we stuck to no more than 4 or 5 bits for the instruction set, and mostly doing just read, write and math instructions, but it was really interesting to figure out how to do certain things :D

  • @markify8019
    @markify8019 2 years ago +7

    Your videos are extremely well edited and your explanations are in depth and easy to understand. Can’t wait to see you hit 100k soon!

  • @gegaomega5038
    @gegaomega5038 2 years ago +1163

    i am too stupid to understand it

    • @Schnickenpick
      @Schnickenpick 7 months ago +85

      Finally something I can relate to

    • @0x_nietoh
      @0x_nietoh 7 months ago +30

      At least you’re not in denial. Less competition for us folks!

    • @filipec0rreia
      @filipec0rreia 7 months ago +17

      ​@@0x_nietoh "folks"

    • @kiwi7556
      @kiwi7556 7 months ago +4

      Womp womp

    • @aleclandman8194
      @aleclandman8194 7 months ago +42

      Noooo. You just haven't gone deep enough to understand. Not understanding YET is totally different from being stupid.
      :)

  • @snowflakezzonPC
    @snowflakezzonPC 2 years ago +4

    I really feel like it's understated how incredible this video really is. Amazing work, man!

  • @monemperor1559
    @monemperor1559 2 years ago +67

    this is insane. Making a CPU, then your own compiler specifically to run programs on it and then an emulator for it.
    Cool stuff

  • @ggre55
    @ggre55 2 years ago +7

    Damn bro
    I can't believe how smart u actually are, I never thought someone like u would exist, really really awesome
    Damn

  • @lucarosania1358
    @lucarosania1358 2 years ago +12

    This is probably the most impressive thing I've ever seen on YouTube, very very good job! Thank you so much for sharing this 😊

  • @elijahknox4421
    @elijahknox4421 2 months ago +5

    15:20 it's been 2 years. Can you make a video of the physical version now? 😀

  • @fieldmouselegos
    @fieldmouselegos 1 month ago +1

    I watched this video twice before today and never really understood anything.
    I just played NandGame the other day, and now I actually understood most of the stuff. I realized how much more impressive this was.

  • @M3LP
    @M3LP 2 years ago +7

    This is both beautiful and confusing; it's truly art. I wish I could truly understand everything here, but this flies way over my head.

  • @DsiPro1000
    @DsiPro1000 2 years ago +30

    but can it run doom?

    • @gradykinn
      @gradykinn 3 months ago +3

      Probably not

  • @totallynotabot151
    @totallynotabot151 2 years ago +36

    Now grab an FPGA devkit and turn it into real hardware! (In case you haven't done that before, designing circuitry with Verilog / VHDL is actually quite an interesting puzzle).

    • @AxelMontini
      @AxelMontini 2 years ago +7

      Agree. My university had us build a single cycle 8-bit MIPS processor on a Xilinx FPGA. A very simple processor, yet probably the best project we've done so far.

  • @FloatSamplesGT710
    @FloatSamplesGT710 7 months ago +1

    You deserve way more subs dude, very underrated, and hardworking/smart-working and dedicated too

  • @waffle8364
    @waffle8364 10 months ago +1

    As a CS major and a software engineer, I designed my own 8-bit microcontroller. I never got past the logic diagram because I didn't need to; I just used Logisim with my own JAR files for custom functionality, and most of it was embedded diagrams made in the sim. The whole goal was to create a compiler for machine code and load the bytes in with a file in Logisim. It was cool writing my own assembly language and executing programs I wrote. I can't remember the program I used to simplify my logic diagrams though.

  • @adama7752
    @adama7752 2 years ago +62

    Man relives computer history. Punch cards, assembly, etc..

  • @AshtonSnapp
    @AshtonSnapp 2 years ago +41

    Using a 16-bit address bus would get you 64 kibibytes. Although, since you’re storing a 16-bit word at each address rather than a byte, you have 64 kibiwords - or 128 kibibytes - of memory to work with.
    Le edit: 1 kibibyte = 1024 bytes. 1 kilobyte = 1000 bytes. One is a binary prefix, the other is a metric prefix.

    • @axmoylotl
      @axmoylotl 2 years ago +16

      how many habibibytes is that

    • @DeeezNuts
      @DeeezNuts 2 years ago +8

      @@axmoylotl 2 hamoodibytes

    • @anon_y_mousse
      @anon_y_mousse 2 years ago

      I'm never going to use that childish sounding garbage. 1024 bytes is a kilobyte and SI has no business even being included in the discussion, I don't even care that they're using the same letters for prefixing, it's still going to be powers of two. Hard drive manufacturers have caused and are continuing to cause harm to the industry by being such cheats and now we've got douche-nozzles who want to redefine all of our terms instead of taking the manufacturers to task. It's almost as bad as changing BC/AD to BCE/CE because they don't like the implications of how the names originated. Stop redefining terms.

    • @tissuepaper9962
      @tissuepaper9962 2 years ago +8

      Nobody cares about kibi, kilobyte is 1024 in my heart.

    • @Anon.G
      @Anon.G 2 years ago +2

      @@tissuepaper9962 same here
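The address-space arithmetic at the top of this thread checks out and can be verified directly, using the comment's own numbers:

```python
# A 16-bit address bus addresses 2**16 distinct locations.
addresses = 2 ** 16
print(addresses)            # → 65536 (64 Ki addresses)

# With a 16-bit word (2 bytes) stored at each address rather than a byte,
# the total addressable memory doubles to 128 KiB.
total_bytes = addresses * 2
print(total_bytes // 1024)  # → 128 (KiB)

# Binary vs. metric prefixes, as the comment notes:
print(2 ** 10)              # → 1024 (1 kibibyte, binary prefix)
print(10 ** 3)              # → 1000 (1 kilobyte, metric prefix)
```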

  • @fillupdidier5514
    @fillupdidier5514 2 years ago +29

    The quality of your videos is like that of a youtuber with a million views per video. I guess I found your channel at the right time, as I am getting more into computers, though not even close to where you're at.

  • @yossefnabil2122
    @yossefnabil2122 6 months ago

    Bro, I've just come from my computer organization and architecture final exam, and I wish they had taught us by making a similar CPU model in the course. Well done and keep going, you're amazing!

  • @evandossett3897
    @evandossett3897 8 months ago

    Love the video and the content! The only issue I had was getting caught off guard by the changes in music; it startled me a few times because it would change out of nowhere while I wasn't fully focused on it, and I thought something was making noise in a different tab or process lol.

  • @nullmemaddress
    @nullmemaddress 2 years ago +44

    Let’s get this strait:
    - Man makes own cpu
    - Man then makes own language
    - Man then makes own compiler
    - Man then makes his own emulator
    WTF

    • @msmith2961
      @msmith2961 2 years ago +5

      Don't forget he also wrote a port of Pong to run on said computer...

    • @notsa_s
      @notsa_s 2 years ago +2

      Bro this is like jdh shit right here

    • @number1salesman-1997
      @number1salesman-1997 2 years ago

      *straight

    • @PiyushBhakat
      @PiyushBhakat 2 years ago +1

      And he's ONLY 16. When I was 16, I was barely writing hello world in C++.

    • @alexyo2440
      @alexyo2440 1 year ago +1

      ​@@PiyushBhakatBits per age. This man will only grow stronger in time and consume more bits

  • @kingofthegoats7032
    @kingofthegoats7032 2 years ago +30

    What's next? Own OS on this CPU programmed with z#? Lol
    Great vid btw worth the wait

    • @AstroSamDev
      @AstroSamDev  2 years ago +15

      Probably, that text editor is just begging to be transformed into an interactive command line.

    • @kingofthegoats7032
      @kingofthegoats7032 2 years ago +8

      @@AstroSamDev AstroOS coming :000

    • @haralabospap7091
      @haralabospap7091 2 years ago +3

      @@AstroSamDev rewrite the Linux kernel in Z# and make z#/Linux
      /s of course

    • @Maykeye
      @Maykeye 2 years ago +1

      Lol aside, that's actually not impossible: see Collapse OS for how to bootstrap from almost nothing to working OS.

  • @jeffparent2159
    @jeffparent2159 2 years ago +7

    Very cool project. This was the stuff I loved doing at uni. Sadly FPGAs are still crazy expensive; the CPUs we designed there were all in HDL languages, which allowed us to run them on hardware. Makes me wanna dig out that code again.

  • @amardevsharma1223
    @amardevsharma1223 2 years ago +2

    Yo, this helped so much. I always appreciate the content; since I found the channel and got the energy from you in the previous video, you've been nothing but real, and I can vouch for the amazing content and how down to earth you are with everything. All the most love, respect, and appreciation.

  • @kevincozens6837
    @kevincozens6837 1 year ago

    A nice fun little(?) project. I would make some changes to the instruction set. Most instructions have the register after the mnemonic, but you have AIN, BIN, CIN; they should be INA, INB, and INC. However, you have 5 bits for the opcode and 11 spare bits. You can use the spare bits to indicate the source and/or destination register, as that can be coded in only two bits each. For jump instructions, the spare bits can indicate which flag(s) to check. That would free up 7 instructions for other possible opcodes and make IN, LDI, SWP, and the math operations able to work with any (pair of) registers.
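The bit-packing this comment suggests (5-bit opcode, two spare bits each for destination and source registers) can be sketched with encode/decode helpers. The exact field layout below is an assumption for illustration, not the video's actual instruction format:

```python
# Hypothetical 16-bit instruction layout, following the comment's suggestion:
# bits 15-11: opcode (5 bits, 32 possible instructions)
# bits 10-9:  destination register (2 bits, 4 registers)
# bits  8-7:  source register (2 bits)
# bits  6-0:  remaining spare bits (immediate, or flag mask for jumps)
def encode(opcode, dst, src, imm=0):
    assert 0 <= opcode < 32 and 0 <= dst < 4 and 0 <= src < 4 and 0 <= imm < 128
    return (opcode << 11) | (dst << 9) | (src << 7) | imm

def decode(word):
    return ((word >> 11) & 0x1F, (word >> 9) & 0x3, (word >> 7) & 0x3, word & 0x7F)

word = encode(opcode=0b00101, dst=2, src=1, imm=42)
print(f"{word:016b}")   # → 0010110010101010
print(decode(word))     # → (5, 2, 1, 42)
```

This is exactly why generic register fields free up opcodes: one ADD opcode plus the 2-bit fields covers every register pair, instead of burning a separate opcode per register as AIN/BIN/CIN do.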

  • @hamburgerhamburgerv2
    @hamburgerhamburgerv2 2 years ago +4

    2025: I designed the world’s most powerful computer

  • @gawkersdeathrattle1759
    @gawkersdeathrattle1759 2 years ago +11

    You may want to expand the architecture to add interrupts, although that complicates things somewhat, given that you'll probably want a stack to handle them, and you currently don't have anything stack-ish like jump-to-subroutine or return.

    • @Ehal256
      @Ehal256 2 года назад +1

      Was going to mention this. It would make timing quite a bit simpler, no more arbitrary wait N cycle loops, just execute game logic once on every vertical sync interrupt. :)
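    The vsync-driven loop this reply describes can be mocked up in host code. A toy sketch, not the video's emulator (the 60 Hz rate and the `run_frame` name are assumptions): instead of burning "wait N cycles" delay loops, game logic runs exactly once per sync tick.

    ```python
    import time

    def run_frame(state):
        # Stand-in for one tick of game logic triggered by the sync signal.
        state["frame"] += 1

    state = {"frame": 0}
    next_sync = time.monotonic()
    for _ in range(3):              # three simulated vertical syncs
        next_sync += 1 / 60         # next tick is one 60 Hz period away
        run_frame(state)
        # Sleep until the next sync instead of spinning in a cycle-count loop.
        time.sleep(max(0.0, next_sync - time.monotonic()))

    assert state["frame"] == 3
    ```

    On real hardware the sleep would be replaced by the CPU halting until the video circuit raises the interrupt line.
    
    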

  • @Koljon
    @Koljon 2 года назад +4

    Very cool, can't wait to see you build it irl ;)

  • @3v068
    @3v068 2 года назад +2

    This is the first video of yours I'm watching and I love it! I've always been interested in computers, specifically how they work, and I've also wanted to get into circuit building, so this is RIGHT up my alley. Your editing is fucking amazing too! Keep it up dude!

  • @zencibalina2926
    @zencibalina2926 2 года назад

    Thanks for the motivation. I wasn't sure if I could do it, but I might try it eventually.

  • @ad.i
    @ad.i 2 года назад +4

    THIS WAS EXPLAINED SO WELL WHAT THE HELL? SERIOUSLY KUDOS TO YOU MAN

  • @kazzookaE
    @kazzookaE 7 месяцев назад +5

    Me still unable to understand recursion in JavaScript

  • @CYON4D
    @CYON4D 2 года назад +8

    Does your emulator skip the logic layer and just execute instructions on the list?
    You did an amazing job with your custom CPU.
    By the way I noticed the original Sims OST playing in the background. I love that OST.

  • @philippepanayotov9632
    @philippepanayotov9632 5 месяцев назад

    I salute you! I always wanted to do this but never had enough free time. You have motivated me to try again. Keep on inspiring people!

  • @Random-os5rd
    @Random-os5rd 2 года назад +1

    Great tutorial, links and program worked fine for me. Thanks for sharing.

  • @atomspalter2090
    @atomspalter2090 2 года назад +4

    congrats for that achievement. You actually designed a own computer. Wow!

  • @rabbitcreative
    @rabbitcreative 2 года назад +3

    0:17 I'll complain. Green-field-project-owner neglects updating critical value after making changes.

  • @lordbarron3352
    @lordbarron3352 5 месяцев назад +4

    But did God give you a dream telling you to make it as his 4th temple?

  • @doctorweile
    @doctorweile 2 года назад +2

    Nice. If anyone would like to know the principles behind memory, GPU, ALU etc. I can recommend a course, I took a couple of years ago: "From NAND to Tetris", where you start up with a NAND-gate and end up with a working system. No hardware is involved, all simulated, using a hardware description language.
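    The course's premise — that every other gate can be built from NAND alone — is easy to demo outside its HDL. A toy Python sketch (my own, not the course's materials):

    ```python
    def NAND(a, b):
        # The only primitive: output 0 only when both inputs are 1.
        return 0 if (a and b) else 1

    # Everything else is composed purely from NAND.
    def NOT(a):    return NAND(a, a)
    def AND(a, b): return NOT(NAND(a, b))
    def OR(a, b):  return NAND(NOT(a), NOT(b))
    def XOR(a, b): return AND(OR(a, b), NAND(a, b))

    # Truth-table check for XOR built from nothing but NAND:
    assert [XOR(a, b) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]
    ```

    From there the course stacks the same trick upward: adders from XOR/AND, registers from latches, and eventually a whole machine.
    
    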

  • @alexjohnson5677
    @alexjohnson5677 Год назад

    I had a similar project during my computer science degree. This was a good refresher on the different individual components of a CPU. Makes me want to fire up Logisim again! Great video.

  • @Frankabyte
    @Frankabyte 2 года назад +18

    This is seriously impressive, and I'm blown away by the amount of time and effort that went into this...
    ...but can it run DOOM?

  • @Bobbias
    @Bobbias 2 года назад +7

    One small thing about how you explained your emulator. You conveniently forgot to mention that part of why logisim is so much slower is that it actually simulates things far more in depth than your emulator. A very minor thing, but the way you glossed over it gives the impression that logisim is slow because... It's slow. Rather than because it's doing way more work than your emulator.

    • @DFX2KX
      @DFX2KX 2 года назад +1

      and Logisim's speed does depend on whether or not one is using the default 'chips' inside it or not, it has math operators and register modules, and whatnot, but if you eschew that and go down to the individual gates in sub-circuits....

    • @Ehal256
      @Ehal256 2 года назад +1

      He does say that logisim is simulating every logic gate, and that's why it's slow. Seemed sufficient to me.

    • @Costelad
      @Costelad 2 года назад

      Yeah I was a bit confused at that. Why would they differ in speed so much when they’re both emulators?

    • @Ehal256
      @Ehal256 2 года назад +1

      @@Costelad Imagine instead of executing an add instruction directly, you simulate the propagation of electricity through each part of the circuit implementing an add operation.
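    The gap this thread is describing can be made concrete with a toy sketch (my own illustration; the function names are not Logisim internals). Both functions compute the same 8-bit add, but one does it in a single host operation while the other walks a ripple-carry chain, evaluating each "gate" individually — which is roughly why a gate-level simulator is orders of magnitude slower than an instruction-level emulator.

    ```python
    def add_direct(a, b):
        # Instruction-level emulator: one host operation per guest ADD.
        return (a + b) & 0xFF

    def add_gate_level(a, b):
        # Circuit-level simulation: propagate a carry through 8 full adders,
        # evaluating every gate of each adder separately.
        result, carry = 0, 0
        for i in range(8):
            x, y = (a >> i) & 1, (b >> i) & 1
            s = x ^ y ^ carry                    # sum bit (two XOR gates)
            carry = (x & y) | (carry & (x ^ y))  # carry-out (AND/OR gates)
            result |= s << i
        return result

    assert add_gate_level(200, 100) == add_direct(200, 100) == 44
    ```

    A real simulator also tracks signal propagation and timing for every wire, so the overhead per guest instruction is even larger than this sketch suggests.
    
    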

  • @rezqy_
    @rezqy_ 2 года назад +22

    "I built my own earth"

    • @SusDoctor
      @SusDoctor 7 месяцев назад +4

      I built the entire observable universe from scratch, with real breathing life forms in it.

    • @Anirudhfou4
      @Anirudhfou4 6 месяцев назад

      👍

  • @Vivek2062
    @Vivek2062 4 месяца назад

    I just discovered your channel and instantly loved it. Since you're so skilled at programming, here's an idea: why don't you start a project where you design your own version of NFS Most Wanted 2005, with flagship cars designed by you and a bit of a story change? Do you think that's interesting enough for a YT video? After these 20 years the nostalgia for NFS MW hits hard. There are a bunch of mods for the game, but something as original as what you do on your channel would be of immense value to us developers and game lovers. Thank you for all your hard work, amazing channel!!

  • @ArtemArist
    @ArtemArist Год назад

    What the hell, man, this is nuts..
    Great work, i cannot imagine how tough it was to this point, damn..

  • @MegaBlox_YT
    @MegaBlox_YT 7 месяцев назад +7

    Why did you disappear from RUclips?

  • @SirSans842
    @SirSans842 Год назад +13

    Can it run doom?

    • @pfaffnoel
      @pfaffnoel 4 месяца назад +2

      16 bit computers CAN run Doom

  • @goteer10
    @goteer10 2 года назад +6

    Next step, TempleOS!

  • @pan2285
    @pan2285 Год назад

    Thanks brother, your content helps me a lot! Keep up the great work! This is exactly what yt is made for!!

  • @neomage2021
    @neomage2021 Год назад

    Fun project! I remember having to do this (but 8 bit) in digital electronics class in undergrad for EE.

  • @gachastorys5129
    @gachastorys5129 2 года назад +12

    Hey, so I have an idea! I think Z# would fit perfectly as the embedded language for the computer, under its own name, kind of like Batch is for Windows. I'm very impressed by this! Hope to see more videos from you in the future.

  • @7.12_am
    @7.12_am 2 года назад +9

    Bet in 2030 he'll be competition for Intel, AMD and NVIDIA after he figures out how to chemically make transistors

    • @LuaanTi
      @LuaanTi 2 года назад

      You want Sam Zeloof for that :P ruclips.net/video/s1MCi7FliVY/видео.html

  • @The_hot_blue_fire_guy
    @The_hot_blue_fire_guy 2 года назад +4

    Now you need to build this with actual hardware and add long-term storage, like an SD card slot or something that could keep data even when the system is turned off.

  • @guaren6611
    @guaren6611 Год назад

    Impressive work. In my computer architecture course we are creating a similar CPU, fun project

  • @cispower
    @cispower 2 месяца назад +1

    2 years ago I watched this video trying to understand computers (I didn't). Now I'm watching it again, knowing and understanding everything you did here.

  • @janikarkkainen3904
    @janikarkkainen3904 2 года назад +5

    Awesome project. I love stuff like this; it even inspired my own project, though I've been working on and off on it for a few years now, using a 6502 instead of raw logic, and on breadboards. Fun stuff, and great work getting so deep into this!
    Now make it in real hardware! 😏 ...I mean, you used Logisim, so IC counterparts should be relatively easy to come by. VGA might pose some problems, but the VGA signal is quite simple in the end...

  • @hades4real265
    @hades4real265 Год назад +1

    here jus to say ur a legend bro keep up the work

  • @thomasvogel4340
    @thomasvogel4340 Год назад

    I've always wanted to know how computers work at lower levels. Thanks a lot, this was very insightful, and was very entertaining to watch. You got a new subscriber now.

  • @TimothyChapman
    @TimothyChapman 2 года назад +1

    I've also recently made my own 16-bit CPU and intend to put it on real hardware. My CPU has a much larger instruction set, much larger register set, and the massive oversight of not having any add or subtract immediate instructions.

  • @jigneshsolanki2486
    @jigneshsolanki2486 2 года назад

    Been watching your vids for a good few weeks now, learning new shit each day. My workflow has improved so much since watching.

  • @redluck01
    @redluck01 2 года назад

    Impressive. I wrote a language in college, but I never thought of creating a CPU. Since there are only maybe two CPU instruction sets left in widespread use now, creating a whole new instruction set is rather interesting.

  • @jimenezharlenjoyf.8158
    @jimenezharlenjoyf.8158 2 года назад

    You are literally the best, I've been looking for a tutorial for three days and yours works

  • @suvetar
    @suvetar 9 месяцев назад

    Thanks for this! Very inspirational and please keep up the good work!

  • @Azurdos
    @Azurdos 2 года назад

    My goodness, your content is awesome!
    I think that a video explaining your path to achieving such detailed knowledge of low-level computing would also be very interesting :)

  • @ana303_
    @ana303_ Год назад

    I had a microprocessor course last semester; you did well explaining everything as simply as possible.

  • @Gghhhngbvg
    @Gghhhngbvg Год назад

    I just found your channel today and you're crazy, dude! I'm just getting into computers, but making a language AND creating a whole new CPU is amazing, man! If you were to ask a professor how to do this, he would look at you like you were insane! Keep up the great work man!!

  • @morchedlafferty8614
    @morchedlafferty8614 Год назад

    Impressed!!! Great content! Well presented, I loved it!

  • @LeicaM11
    @LeicaM11 Год назад

    I designed and implemented a cross compiler (assembler) for a 72-bit bit-slice processor back in 1987, with programming experience in only C, Assembler and Pascal. 😅 I made the assembler fairly comfortable: you got a listing with errors, and you were able to comment code out.

  • @Ponanoix
    @Ponanoix 5 месяцев назад

    I had a similar project as part of my Architecture of Computers 2 course at my university, albeit a much simpler one. The groups consisted of two people each, and my friend and I chose to create a fixed-point CPU able to calculate floating-point values for four different mathematical operations, namely addition, subtraction, multiplication and division. We stuck with the IEEE-754 standard and had quite a bit of trouble with the processor; the teacher eventually passed the project, although it wasn't the best 😂. Still, not only were we allowed to work in groups, the subject itself was simpler than yours, and we didn't have to use anything besides Logisim.

  • @codeplayer8575
    @codeplayer8575 2 года назад +1

    My friend, you are truly a legend. Congrats!

  • @petercortens6019
    @petercortens6019 Год назад

    Very nice work! There is still a use case for this line of work, albeit a niche one: powering new games on retro console cartridges. Powerful for retro purposes, but with a reasonably small footprint and power consumption under 1 W, and a specific, reduced instruction set optimized to complement the original hardware and/or game type as much as possible.

  • @PainkillerX12
    @PainkillerX12 4 месяца назад

    I was looking into N64 turbo-nerd assembly stuff and I realized something (maybe I'm unaware of some intermediary steps, though; I'm just a terminally online dude with way too much time): the CPU being basically a glorified calculator, the bits fed into it might "just" open/close logic gates to tell the CPU what to do with the following bits of the instruction.
    For instance: if the first 5 bits are linked to, say, the ADD instruction, they open the "gates" for the following bits to be fed through the ADD section of the transistor array.
    I believe that's why, in every assembly language I've ever read about, the instruction comes first, then the appropriate values: usually at least one register address, plus a fixed value or another register to read a value from. IDK if I'm clear. Very interesting project overall! Thank you for the video!
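    The gating intuition in the comment above maps directly onto how a software emulator dispatches on the opcode field: the top bits select exactly one handler, which then interprets the remaining bits. A toy sketch with a hypothetical layout (5-bit opcode in the top bits, 11-bit operand below; opcode values and register names are mine):

    ```python
    def ADD(cpu, word):
        # Add the 11-bit operand to register A, wrapping at 16 bits.
        cpu["A"] = (cpu["A"] + (word & 0x7FF)) & 0xFFFF

    def LDI(cpu, word):
        # Load the 11-bit immediate into register A.
        cpu["A"] = word & 0x7FF

    # The top 5 bits "open the gate" to exactly one handler.
    HANDLERS = {0b00010: ADD, 0b00100: LDI}

    def step(cpu, word):
        HANDLERS[word >> 11](cpu, word)

    cpu = {"A": 0}
    step(cpu, (0b00100 << 11) | 40)  # LDI 40
    step(cpu, (0b00010 << 11) | 2)   # ADD 2
    assert cpu["A"] == 42
    ```

    In silicon the dictionary lookup becomes a decoder circuit enabling one datapath, but the structure — opcode first, operands after — is the same, which is the convention the comment noticed.
    
    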

  • @dan_mirnejhad
    @dan_mirnejhad Год назад +1

    I was fully invested until I heard a Halo 3: ODST soundtrack halfway through and got hit by a wave of nostalgia.

  • @mind_over_matterz
    @mind_over_matterz 3 месяца назад

    thank you for sharing, waiting to see a board soon! :)