How Machine Language Works

  • Published: 12 Feb 2021
  • Support The 8-Bit Guy on Patreon:
    / 8bitguy1
    Visit my website:
    www.the8bitguy.com/

Comments • 3.5K

  • @Sladen70
    @Sladen70 3 years ago +2224

    Learning machine language with the 8-bit guy over a couple hours sounds like fun to me.

    • @jwjones1979
      @jwjones1979 3 years ago +95

      I thought the same thing. He should do another subset of videos on just that. I'd watch it.

    • @markstrickland438
      @markstrickland438 3 years ago +24

      One of the frequent contributors over at David's Commander X16 forum/FB page has been publishing machine language tutorials for the big brother of the 6502, the 65c816. His name is Matt Heffernan. I think he had published 8 parts of the tutorial series. He has also made some nice games for the X16.

    • @sjjjvideo1
      @sjjjvideo1 3 years ago +56

      I'd pay real money to take that class.

    • @WarrenGarabrandt
      @WarrenGarabrandt 3 years ago +30

      @@sjjjvideo1 Oh, for sure! I'd pitch in $20 at least to have access to watch that video.

    • @mfaizsyahmi
      @mfaizsyahmi 3 years ago +37

      Check out Ben Eater's channel. He's been doing a video series on programming a breadboard computer in assembly. He also did the thing David said about devices being memory addresses to the CPU.
      Best of all, he sells the same kit as he featured in his videos, so you can follow along and create your own breadboard computer!

  • @grandmaster1004
    @grandmaster1004 3 years ago +408

    I feel like I walked into the wrong class in college and was too confused and embarrassed to leave.

    • @Blitterbug
      @Blitterbug 3 years ago +28

      Why? David did a fairly good job of covering simple basics here.

    • @technopoptart
      @technopoptart 3 years ago +29

      @@Blitterbug it isn't everyone's strong suit, even with an explanation

    • @Mr.Pop0
      @Mr.Pop0 3 years ago +14

      You shoulda seen the faces of complete devastation people had when our online instructor told the class, on the day of the exam, that they couldn't use the codex to read program outputs. That meant they had to actually memorize the hex values.

    • @GORF_EMPIRE
      @GORF_EMPIRE 3 years ago +19

      @@Blitterbug It's not something easily grasped from a 20 minute video for some. Back 40 years ago, I had a bit of trouble. That is because most people who explained it went redonkulous with technical jargon. One day it all was made clear by a gentleman in Sears..... in a matter of 10 minutes he removed the road block with a very layman-like explanation. I went home and started coding my ass off in 6502 assembler on my Vic-20. The dam was broken.

    • @the_kombinator
      @the_kombinator 3 years ago +7

      I'm learning this at the graduate level - I'm not the greatest coder, but using Python makes it rather simple. *Edit: never mind - Machine language != Machine learning. Guess I'd better go back to studying...

  • @jaymartinmobile
    @jaymartinmobile 3 years ago +309

    Back in '81-'83 I used to teach a few "introductory to computer programming" courses as part of my job repairing the computers of the time (mostly C64 and VIC-20). I was teaching the basics of BASIC to a class when one student asked me to explain ML. I gave a few examples very similar to yours to try to show the similarities and differences. The next question was "why write in ML when BASIC was so readable?" I then wrote a program to poke a character to all screen locations and then poke the next character etc. in both ML and BASIC. Showing the speed difference and explaining why made for a very good class, with most students signing up for the next level course. Thanks for the memories.
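
The speed gap in that classroom demo comes down to interpreter overhead: BASIC re-reads and re-parses every statement each time it executes, while machine language does only the work itself. A toy Python sketch of that idea (the names here are illustrative, not C64 code; the 1000-cell size does match the C64's 40 x 25 text screen):

```python
# Toy illustration (Python, not C64 code) of the interpreter overhead:
# BASIC re-parses each statement every time it runs, while machine
# language does only the work itself.
SCREEN_SIZE = 1000                 # the C64 text screen is 40 x 25 = 1000 cells
screen = bytearray(SCREEN_SIZE)

def run_interpreted(char):
    """Re-parse a 'POKE addr,value' line on every iteration, BASIC-style."""
    for i in range(SCREEN_SIZE):
        stmt = f"POKE {i},{char}"
        _, args = stmt.split(" ", 1)            # parsing overhead, every time
        addr, val = (int(x) for x in args.split(","))
        screen[addr] = val

def run_direct(char):
    """All decoding done up front; only the stores remain, ML-style."""
    for i in range(SCREEN_SIZE):
        screen[i] = char

run_interpreted(65)                # same end result either way...
run_direct(66)                     # ...but far less work per cell this way
```

Both fill the same "screen"; timing them shows the interpreted version paying a parsing tax on every single cell, which is the effect the class was shown.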

    • @whocares4598
      @whocares4598 2 years ago +10

      Darn dude, your job must have been cool. My teacher is similar to you, as he started teaching in the 80s.

    • @gruberstein
      @gruberstein 2 years ago +12

      That's about the same time I was learning how computers work with a Digiac board, Comtran 10 and SDK85 at Lincoln Tech. It got me a job at Bell Labs where I never used any of that knowledge again.
      The Comtran was amazing: it was basically a CPU built of logic chips on a huge board, with a set of schematics and timing charts that explained the operation of each available instruction. The instructor would open a connection with a switch in the back, and you'd have to trace the problem back to the logic gate that lost its connection. Instructions could be put in one at a time or with a punch-tape reader. You could manually clock the thing one cycle at a time or run a program. There were lights to show the contents of the registers.

    • @jackiemowery5243
      @jackiemowery5243 1 year ago +4

      @@gruberstein I used the SDK85 at Eastern Washington University back in 81. Had a blast! Found myself thinking in 8085 assembler and having to reboot my brain whenever someone would ask me a question in English. Took a few seconds to change language and operating system.

  • @pilotkid2011
    @pilotkid2011 3 years ago +319

    As someone who has taken 3 assembly language classes in college, I must say this was an EXCELLENT overview.

    • @chitlitlah
      @chitlitlah 3 years ago +5

      Yes. I learned the basics of IA32 assembly in high school. Then I took a microprocessors class in college that had us coding various processors like the 6502, Z80, and Motorola 68000 in hex. Recently I got into IA64 assembly a bit.

    • @mrInSaNiTyDuDe
      @mrInSaNiTyDuDe 2 years ago +2

      I took a year of learning C++ and VB and I have to agree that this is the best. I wish he was my coding instructor.

    • @informitas0117
      @informitas0117 2 years ago +2

      Assembly can die in a fire.

    • @robb1324
      @robb1324 2 years ago +7

      I did as well... And now I find myself sometimes struggling in 4hr technical interviews because I forgot that one Python library that I'm "supposed" to know (there's always one that I forget).
      Turns out saying "I know assembly, I'm sure I can figure that one Python library out when needed..." Doesn't fly with employers. ☹️

  • @tompov227
    @tompov227 3 years ago +416

    I feel like this is just a 20-minute PSA of David saying "STOP ASKING ME TO PORT Petscii Robots". Understandable, honestly

    • @ZX3000GT1
      @ZX3000GT1 3 years ago +14

      If he wants to give out the source code to Petscii Robots, someone else can port it. Of course, that's IF he actually wants to.

    • @joechristo2
      @joechristo2 3 years ago +7

      @@asificam1 But if they want to port it to computers with a different cpu, they have to rewrite the entire game

    • @glenndoiron9317
      @glenndoiron9317 3 years ago +8

      @@ZX3000GT1 IN THE VIDEO he shows the status of someone else's port of PETSCII ROBOTS to the Apple II.

    • @ZX3000GT1
      @ZX3000GT1 3 years ago +1

      @@glenndoiron9317 Well, I stand corrected. I honestly haven't watched the video properly, so pardon the mistake.

    • @KeMeEscupaUnPollo
      @KeMeEscupaUnPollo 3 years ago +5

      Maybe, but David is direct enough to just say it plain and simple.

  • @binixx
    @binixx 3 years ago +402

    1:15 "I'm not gonna teach you how to code in machine language because that will probably take like 10 hours"
    Where do i sign?

    • @PJBonoVox
      @PJBonoVox 3 years ago +11

      He's not doing that because that's not the style of the channel. There are hundreds of resources to learn 6502 assembly and C64 programming. Make the effort to learn if you really care.

    • @stevejquest
      @stevejquest 3 years ago +29

      10 hours is a joke, it actually takes much longer than that.

    • @SteveJones172pilot
      @SteveJones172pilot 3 years ago +22

      @@PJBonoVox But.. He's got a great teaching style, and having him post his own version would be a great benefit.. I'd especially be interested in some of his graphics routines and techniques

    • @binixx
      @binixx 3 years ago +17

      @@PJBonoVox I was trying to make a joke. Imagine if you could learn to code, or draw, or animate, or something else in only 10 hours? Also, I would like it if the 8-Bit Guy taught us how to code the Commander X16 in BASIC and assembly.

    • @puckyMaXxx
      @puckyMaXxx 3 years ago +1

      wait after BYTE assemble of 1111 1111

  • @austinpatkos7563
    @austinpatkos7563 3 years ago +427

    Dude it's insane how much he actually knows and how talented he really is.

    • @rcs2003
      @rcs2003 2 years ago +42

      Like almost any programmer from the '80s.

    • @chrisfusion6945
      @chrisfusion6945 2 years ago +15

      *skilled

    • @SteveMorrow8859
      @SteveMorrow8859 2 years ago +13

      Machine language (assembly) just takes focus over a long period of time, like anything a person desires to learn, and you will get better. The 8-Bit Guy has inspired me in all kinds of ways, including electronics and various projects. I follow assembly language game design on my channel for those interested.

    • @youngwaveaudio9390
      @youngwaveaudio9390 2 years ago +14

      The man hides his powerlevel as to not frighten us mere mortals using languages like C#.

    • @lilmsgs
      @lilmsgs 1 year ago +7

      All of this was standard coursework in a two-year computer science degree.

  • @neodimium
    @neodimium 1 year ago +34

    In my college we used to program Atmel MCUs in assembler. That was in 2000, and I remember it as a great time. Working directly with addresses and limited operations pushed us to be more creative than ever.
    I would gladly start that class again!

  • @shrekinabox1730
    @shrekinabox1730 3 years ago +729

    His new setup looks like he's in heaven

    • @millsyinnz
      @millsyinnz 3 years ago +63

      It doesn't look as cosy as the last one, but I'm guessing that it is still unfinished, so benefit of the doubt applies here.

    • @kjrehberg
      @kjrehberg 3 years ago +36

      Needs some sound dampening panels for sure.

    • @jakobv8
      @jakobv8 3 years ago +11

      Matrix loading screen:-)

    • @CasperHulshof
      @CasperHulshof 3 years ago +5

      Including a halo!

    • @WillOnSomething
      @WillOnSomething 3 years ago +6

      It looks like an episode of The Good Place, haha

  • @rickwest2818
    @rickwest2818 3 years ago +200

    Assembly really does give a person a good grasp of how a computer processor works. Even if it's outdated, it's still worth learning.

    • @joechristo2
      @joechristo2 3 years ago +15

      Yup, it’s not as hard as people think

    • @abizair1832
      @abizair1832 3 years ago +6

      And maybe you can make a computer yourself, *from scratch* like SAP-1 or Scott CPU, and program it.

    • @CZghost
      @CZghost 3 years ago +14

      @@abizair1832 Ben Eater stuff?

    • @alliejr
      @alliejr 3 years ago +17

      As David correctly points out, it's not the assembly language one needs to learn; rather, it's the specific interfaces to all of the hardware controllers and I/O.

    • @rickwest2818
      @rickwest2818 3 years ago +5

      @@alliejr it's more than that though. Program counter, accumulator, etc.

  • @janhruby9379
    @janhruby9379 3 years ago +200

    FYI: C# is the same as Java: both are compiled to bytecode and then JITed to machine code.

    • @rogervanbommel1086
      @rogervanbommel1086 2 years ago +4

      Also, Python (and maybe other interpreted languages too) can be turned into executables/binaries that both increase speed and remove the need for an interpreter, using PyInstaller

    • @Kitulous
      @Kitulous 2 years ago +2

      @Fabian Fernlund Python (at least CPython) is actually compiled to bytecode and run in a VM

    • @yeppiidev
      @yeppiidev 2 years ago

      @@Kitulous Well, isn't that what it does normally? It's still kind of the same thing, interpreting code, just a bit faster. It doesn't really matter, because you will barely notice any issue unless you're making a CPU/GPU-intensive application

    • @Kitulous
      @Kitulous 2 years ago +3

      @@yeppiidev well i divide code execution into three categories:
      1. slowest: interpreting code at runtime (lex -> parse -> ast -> invoke ast nodes at runtime)
      2. somewhere in between: compiling to bytecode (lex -> parse -> ast -> bytecode) and then interpreting bytecode
      3. fastest: compiling to native instructions (x86, arm, etc)
      2 can sometimes be faster than poorly written 3 because of JIT optimizations (if there are any, of course)

    • @yeppiidev
      @yeppiidev 2 years ago

      @@Kitulous Oh yeah, I forgot the individual components of an interpreter exist, but yeah, the closer it is to machine code the faster it runs. Modern computers are too fast for the difference to be noticeable, though (at least in normal apps)
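
The middle tier of that list (compile to bytecode, then interpret the bytecode) is easy to see in CPython itself: the standard library's `dis` module prints the bytecode a function was compiled to at definition time.

```python
import dis

def add(a, b):
    return a + b

# CPython compiled the body to bytecode when the function was defined;
# dis shows the instructions the bytecode interpreter will execute.
listing = dis.Bytecode(add).dis()
print(listing)    # includes LOAD_FAST loads of a and b, and RETURN_VALUE
```

The exact instruction names vary a little between Python versions, but the shape is the same: a handful of stack-machine opcodes, not x86 or ARM instructions.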

  • @eduardolarrymarinsilva76
    @eduardolarrymarinsilva76 3 years ago +62

    16:18 "Wait it's all RAM?"
    *CPU about to execute a kill opcode* Always has been...

  • @stmchale
    @stmchale 3 years ago +124

    I liked the scene you pointed out from "The Terminator" movie. I saw it in the theater in '84 in Dallas with my brother, who had just graduated from college as an electrical engineer. During that scene he whispered to me, "That is assembly language." I whispered back, "You're telling me they're going to use assembly language 50 years from now?" Well, here we are now.

    • @hackmattr
      @hackmattr 3 years ago +9

      I got my CSE degree a few years ago. One project for a game development class was to make a simple game using assembly for the Atari 2600. I had another class where we had to use assembly for the 8080. I'm glad we use higher level languages for everything.

    • @Okurka.
      @Okurka. 3 years ago +11

      I can't believe 1984 was 50 years ago. Time does fly.

    • @pocoloco8075
      @pocoloco8075 3 years ago +16

      @@Okurka. Get Your math right, bro. 1984 was even 69D years ago!

    • @Defianthuman
      @Defianthuman 3 years ago +4

      @@Okurka. It's not even 2024 yet, so it's 37 years ago

    • @PWingert1966
      @PWingert1966 3 years ago +2

      We watched that in our Robotics Class at University. We even had a question on the final exam about it!

  • @BixbyConsequence
    @BixbyConsequence 3 years ago +54

    Wrote a puzzle-solving program for TRS-80. Initial run in BASIC was 48 hours. Took it down to about 12 hours with heavy optimizing. I broke down and rewrote in Z-80 assembler and got solutions in 4 minutes.

    • @legion162
      @legion162 3 years ago +2

      Ohhhh, my little sister had a TRS-80. The only thing I remember about it was that it took cartridge games that loaded pretty much instantly, whereas my Oric-1 loaded games from tape and took ages to load, if ever lol

    • @BertGrink
      @BertGrink 3 years ago +7

      That pretty much sums up the difference between BASIC and Machine Code.

    • @TheGuyThatEveryoneIgnores
      @TheGuyThatEveryoneIgnores 3 years ago +2

      I used to write programs in BASIC on an Apple II+ and manually convert them to machine language. It was typical for them to run 200 to 300 times as fast. I found most of the speed improvement was from working with binary numbers directly rather than numbers stored as character strings in BASIC.
      Also, I calculated Pi out to over 20,000 decimal places, though the math was done in binary. It still took five days to run even with everything written in machine language! There is no doubt it would have taken years to run if I had written it in BASIC.

    • @blahorgaslisk7763
      @blahorgaslisk7763 3 years ago +1

      @@TheGuyThatEveryoneIgnores Interesting that the conversion of numbers from strings to binary was such a huge bottleneck. I say that because I remember programming the ABC 80 in BASIC back in the very early '80s. It was a Z80-based computer, and the version of BASIC it ran had a lot of strange quirks, one of which was that you could enter numbers explicitly as string values just by putting them inside quotation marks. As far as the output went, the two following lines were equal:
      Let A=10 : For X = 1 to 500 : Print A+X-6 : Next X
      Let A="10" : For X = "1" to "500" : Print A+X-"6" : Next X
      There were however differences in how they are stored in memory and how fast they are interpreted. If I remember correctly the editor stored any number that didn't have a qualifier as a 9 byte floating point value. But by entering the numbers as strings they were also stored as strings. so in the statement Let A=10 the number will take up 9 bytes of memory while if it was written as a string it would only take up four bytes. To make things even more fun the interpreter handled the numbers that were entered as strings in fewer clock cycles. The variables were the same, so for optimal performance you would have used string variables for everything except possibly integer value variables as those were only 16 bits. But if I remember correctly integer variables were still slower when used in calculations than string variables. Always thought this was really weird, but back then there was a lot of that going around in the computer industry.
      Another fun thing was that if you were doing calculations that required high precision then you would convert all numbers to strings and use only string variables in all calculations. Not only was it faster but the results were more exact. While I can see why it could be more exact I still can't figure out how the heck it could be faster. But I remember verifying this myself.
      Those machines were very limited, but at the same time just so simple to work with. You had no libraries of functions to load and the hardware was no more advanced than that you could easily read the circuit diagram and figure out how it worked. You could actually buy the ABC 80 as a kit with a bare PCB and bags of components and solder everything together yourself if you wanted. The simple architecture also made it so easy to write programs in machine language or assembler for these 8 bit processors. After all you had at the most 64KB of memory to worry about...

    • @TheGuyThatEveryoneIgnores
      @TheGuyThatEveryoneIgnores 3 years ago

      @@blahorgaslisk7763 Before I ever had heard the word "compiler", I had written a very simple one for the Apple II which could compile BASIC programs that only used the hi-res statements. All of the coordinates for plotting dots and drawing lines had to be hard-coded. My compiler simply converted these coordinates, which were stored as character strings in BASIC, to binary, loaded them into registers and called the same routines in ROM that BASIC called for plotting dots and drawing lines. I tested my compiler on a BASIC program that drew a picture in about 20 seconds. The compiled program drew the same picture instantaneously.

  • @lemagreengreen
    @lemagreengreen 3 years ago +40

    The whole explanation of the CPU simply viewing other chips as memory is great and really demystifies a lot of this stuff.
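
A minimal Python sketch of that idea (a toy: here every address really is backed by RAM, whereas on a real C64 the VIC-II video chip, not RAM, answers at $D020, the border-color register):

```python
# Toy sketch: to the CPU, a chip's register is just another address.
# In this model everything really is RAM; on a real C64 the VIC-II
# chip responds at $D020 instead.
memory = bytearray(65536)           # one flat 64 KB address space

BORDER_COLOR = 0xD020               # a VIC-II register on a real C64

def poke(addr, value):              # what BASIC's POKE boils down to
    memory[addr] = value & 0xFF

def peek(addr):                     # and PEEK
    return memory[addr]

poke(BORDER_COLOR, 0)               # on real hardware: the border turns black
```

From the program's point of view there is no special "device instruction"; a store to the right address is all it takes, which is exactly why the CPU neither knows nor cares which chip is listening.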

  • @user-wj9xq7ig2v
    @user-wj9xq7ig2v 3 years ago +425

    The limited power of these machines forced programmers to be elegant and efficient in ways that are no longer seen. Amazing skills.

    • @BruceCarbonLakeriver
      @BruceCarbonLakeriver 3 years ago +4

      you can see it by watching code demo conventions :)

    • @JivanPal
      @JivanPal 3 years ago +9

      Wirth's law: en.wikipedia.org/wiki/Wirth%27s_law

    • @BruceCarbonLakeriver
      @BruceCarbonLakeriver 3 years ago +27

      @@JivanPal Fully agree, crap code is all over the place... it is a shame. Just check out a demo competition and see what those guys are capable of with sometimes insane limitations, like a full-blown 3D render with AA and textures within a 64 kB file, not to mention the Commodore 64 hacks or the Game Boy Color demos xD - lean code really makes things better.

    • @onearthonelegion
      @onearthonelegion 3 years ago +7

      I think more is better. Why do 64kB when you can do 64GB.

    • @calebfuller4713
      @calebfuller4713 3 years ago +9

      True. If you go to "Code Golf" on Stack Exchange some programmers still enjoy the challenge of trying to make the smallest possible code, competing on the minimum number of bytes (yes bytes) they can use to fulfil a given task.

  • @retep8891
    @retep8891 3 years ago +193

    Ben Eater recently (past year) did a series on assembling and programming a 6502 computer to say hello world. Worth checking out if you like this sort of thing.

    • @robertmudry4242
      @robertmudry4242 3 years ago +23

      Seriously, anyone with any interest in the topic should watch Ben’s videos. His approach to teaching is remarkable, his videos are top notch, and you’ll learn more than you thought you wanted to know about the topic! Plenty of “Ah ha!” moments, even if you already have some knowledge of the topic.

    • @DaVince21
      @DaVince21 3 years ago +9

      And another Ben, Ben Heck, has been programming an absolutely tiny chip to run as a video game console. Very technical, but very interesting.

    • @halonothing1
      @halonothing1 3 years ago +13

      Ben's videos have been ABSOLUTELY ESSENTIAL in me realizing my dream of one day building a computer from scratch. I mean building one completely out of chips and other components. I love playing with this program called Logisim, which lets you build digital circuits of pretty much any size and complexity you can come up with from things like logic gates, RAM, counters, flip flops and other components.
      The first time I played with it, I told myself one day I would make a computer with it. Although I've since learned how to build physical electronics and now have the means to build such circuits in real life. So my goal now is to make an actual breadboard computer, like Ben did. Though there are some things I would like to change.
      Anyways, if anyone wants Logisim, it's a free program. The page for it can be found here:
      www.cburch.com/logisim/
      And in case you have trouble finding the download link, here's a link to the download page on sourceforge:
      sourceforge.net/projects/circuit/
      Whether you just want to play around with digital circuits, or do some serious learning, or build a model of a full, working computer. I cannot recommend this program highly enough. It's been essential in my journey from being curious about computers but not even knowing what a transistor is or does to designing computer and CPU components and then building them in real life.
      And while I'm at it, the other program that's been just as important is a circuit simulator called circuitmod, which is also free and runs in your browser, and it can be found here:
      lushprojects.com/circuitjs/circuitjs.html
      This amazing program will allow you to build both analog and digital circuits. It simulates just about every electrical/electronic component you can imagine: resistors, capacitors, bipolar transistors, MOSFETs, JFETs, diodes, zener diodes, tunnel diodes, logic gates, 555 timers, various logic chips, and soooo much more. Plus it allows for a simulation of oscilloscope output that even includes a fast Fourier transform of the output. Or it lets you export the waveform into a sound file, or raw data if it's a digital circuit. This program is superb not only for building circuits, but for understanding how they work on an intuitive level by letting you see where current is going and what voltage everything's at at a single glance. It also has a ton of example circuits, which you could spend hours on end playing with and learning about.
      This program along with Logisim probably single handedly helped me transition from just screwing around on the computer, to building actual real life electronic circuits. Again, I cannot overstress how useful and fun both of these incredible programs are whether you just wanna have some fun, or do some serious learning. Or both!

    • @lightningmcqueen1577
      @lightningmcqueen1577 2 years ago +2

      @@halonothing1 Wow, that's a great introduction to digital circuits and design. If you want to go further, I'd recommend learning VLSI design using Verilog, which is the standard in the industry today - 100 years of technological advancements

    • @halonothing1
      @halonothing1 2 years ago +2

      @@lightningmcqueen1577 Thanks! I'm really glad you appreciated my comment. I put out so many comments like this, and this is the only time somebody has actually replied. So at least ONE person's taken the time to read it, and I appreciate that a lot.
      My post was more geared towards hobbyists, especially those starting out. So I wouldn't recommend FPGAs for beginners. Hell, I had enough trouble figuring out the serial drivers for my knock off Arduino back in the day. But if you're already familiar with electronics and embedded design. Then absolutely HDLs are a must. And even as a hobbyist, I'd lie if I said I hadn't looked at buying an entry level dev-board. But I have enough to keep me busy as it is.

  • @MikeBramm
    @MikeBramm 3 years ago +123

    Without a memory map of the hardware, you end up having to reverse engineer the board to determine which addresses control which hardware (through address decoders to chip select or enable inputs). If you're lucky, you can get a schematic; otherwise you'll spend A LOT of time following traces, making your own schematic and downloading data sheets.

    • @SeanBZA
      @SeanBZA 3 years ago +12

      Add to that the fact that a lot of those on-chip registers can be write only, read only, or have different characteristics on write compared to read - or just the act of reading them causes the action, and a write does another thing. Very common in the CRTC world, where you have a multitude of registers to control stuff, and also on modern chipsets, where you have a massive set of registers that need to be set up with "write these registers, in this order, with these bytes, to get the chipset to operate" with no other information on what that does. Get that wrong and you run into all sorts of issues, up to and including chips failing.

    • @sinephase
      @sinephase 3 years ago

      nuts

    • @TheGuyThatEveryoneIgnores
      @TheGuyThatEveryoneIgnores 3 years ago +2

      Fortunately, the Apple II+ computers came with a good user guide that detailed the memory mapped hardware. You could also reverse engineer existing software to figure out how things work. I taught myself 6502 machine language when I was 13 and a lot of what I did was looking at existing software to see how it worked. This is known as white hat hacking.

    • @absalomdraconis
      @absalomdraconis 3 years ago

      @@SeanBZA : The IBM PC world provided a pretty easy example of differing read/write behavior- read from the hardware and you get the current status of buttons & timers, whereas writing resets those timers, leading to potentially very confusing results.

    • @markstrickland438
      @markstrickland438 3 years ago +1

      Even with the stock BASIC on the C64, and to a lesser extent the Atari 8 bit machines, you were forced to learn hardware registers.
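
What the address decoder in that thread decides can be sketched as a lookup from address range to chip select. The ranges below are simplified C64-style values (the real machine also has bank switching that can swap ROM and I/O in and out of these regions, which this toy ignores):

```python
# Sketch of an address decoder's job: map an address to the chip whose
# select line gets asserted. Simplified C64-flavored ranges; no banking.
def chip_select(addr):
    if 0xA000 <= addr <= 0xBFFF: return "BASIC ROM"
    if 0xD000 <= addr <= 0xD3FF: return "VIC-II (video)"
    if 0xD400 <= addr <= 0xD7FF: return "SID (sound)"
    if 0xD800 <= addr <= 0xDBFF: return "color RAM"
    if 0xDC00 <= addr <= 0xDCFF: return "CIA 1 (keyboard, joysticks)"
    if 0xDD00 <= addr <= 0xDDFF: return "CIA 2 (serial, user port)"
    if 0xE000 <= addr <= 0xFFFF: return "KERNAL ROM"
    return "RAM"

print(chip_select(0xD020))   # the VIC-II border-color register lives here
```

Reverse engineering an undocumented board is essentially reconstructing this function from the decoder logic on the PCB.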

  • @ossietee7562
    @ossietee7562 3 years ago +7

    This is the best technical video/lecture I've attended in ages. Seriously!
    I am a retired chemistry prof, aged 79+, who worked on data acquisition in my research while on sabbatical in California.
    I bought my first Apple II in 1979 with my own money, $1500, and taught myself to use it.
    A bit later, I wrote 6502 machine language to convert data from BCD to hex to do graphics on the Apple II prior to doing data analysis.

  • @herrbonk3635
    @herrbonk3635 3 years ago +41

    15:10 Intel and Zilog assembly used a suffixed "h" (F143h). So the notation 0xF143 wasn't common at all in 8088 assembly. It comes from C and has crept into the x86 world via the C-based Unix/Linux in later years.

    • @DaedalusYoung
      @DaedalusYoung 3 years ago +3

      In MSX-BASIC, it was the prefix &H, for example in POKE &HF346,1. MSX of course used Zilog's Z80 processor.

    • @herrbonk3635
      @herrbonk3635 3 years ago +2

      @@DaedalusYoung Sure, there are a lot of different notations used for hex, binary and octal numbers. But this is not dependent on the processor, as you know. A Z80 processor does not mean that Zilog (or Intel) wrote the assembler or BASIC for it. Also, a similar assembly syntax (say AT&T) may be used for many quite different processors, such as 80386 and 68000, for instance. But the first assembler for the IBM PC was actually written by Intel, and firms like Microsoft, Borland and others followed suit in many syntactical details.
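
For the record, every spelling in this thread names the same bit pattern; only the notation differs. In Python:

```python
# All four spellings discussed above name the same 16-bit value:
value = 0xF143                       # C-style prefix (0x...)
assert value == int("F143", 16)      # the bare digits of F143h / $F143 / &HF143
assert value == 0b1111000101000011   # the bit pattern the CPU actually sees
assert value == 61763                # the same number written in decimal
```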

  • @microbuilder
    @microbuilder 3 years ago +10

    4:07 Purple ball on the right going down to the bottom left corner makes a perfect corner hit lol

  •  3 years ago +14

    I made the transition from BASIC to assembly around the age of 9 or 10. My young brain had a hard time wrapping around the fact that instructions were just numbers in memory; surely it had to be something more special. Then one lunch break at school, my friends and I realized that if instructions were numbers in memory, we could write code that modifies itself! It was almost like stepping into our own sci-fi movie

    • @SreenikethanI
      @SreenikethanI 3 years ago +2

      Wow that feels like a sweet experience
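
That lunch-break realization can be reproduced with a toy machine in a few lines of Python: opcodes, operands, and data share one array, so a store instruction can patch the program's own bytes. (The three opcodes here are made up for this sketch; they are not 6502 encodings.)

```python
# Toy machine with made-up opcodes: code and data share one memory,
# so the store at address 4 overwrites the program's own bytes.
mem = bytearray([1, 5,   # addr 0: LDI 5  -> A = 5
                 3, 1,   # addr 2: ADD 1  -> A = A + 1
                 2, 1,   # addr 4: STA 1  -> store A over our own LDI operand
                 0])     # addr 6: HLT

def run():
    a, pc = 0, 0
    while mem[pc] != 0:                      # opcode 0 = halt
        op, arg = mem[pc], mem[pc + 1]
        if op == 1:   a = arg                # load immediate
        elif op == 3: a = (a + arg) & 0xFF   # add immediate
        elif op == 2: mem[arg] = a           # store (may patch the code!)
        pc += 2
    return a

first, second = run(), run()   # 6, then 7: the program rewrote itself
```

Each run bumps the immediate operand at address 1, so the "same" program computes a different answer every time it executes - which is exactly the trick (and the hazard) of self-modifying code.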

  • @jmhg92
    @jmhg92 3 years ago +19

    Back in college I made a small Mega Man 3 animation in assembly, and it took quite a while to make, with hundreds of lines written to paint and create a tone. Still, it was a fun project that got me an A in my class.

  • @battlefox6481
    @battlefox6481 1 year ago +3

    Came back to rewatch this video after learning 6502 assembly and I love it so much more!

  • @mikesimms1
    @mikesimms1 3 years ago +32

    As someone who has written assembly professionally (fairly recently, I might add), I still think there's some value in it. Mostly because it helps you understand computer architecture at a level you're not going to get from the perspective of a high-level language.

    • @DFX2KX
      @DFX2KX 3 years ago +4

      Not to mention that, if I understand correctly, it's still used in microcontrollers, which are *everywhere*. People have made Atmel chips do some crazy stuff that way.

    • @BertGrink
      @BertGrink 3 years ago +5

      @@DFX2KX There's no doubt that programs that are originally written in assembler/machine code take up the least amount of space, and run the fastest.

    • @wbahnassi
      @wbahnassi 3 years ago +6

      If you're debugging your high-level language program and symbols aren't available, then (dis)assembly is your only option.. so yeah.. it is very relevant even today in various domains.

    • @glenndoiron9317
      @glenndoiron9317 3 years ago +2

      @@DFX2KX Assembler in microcontrollers is slowly going away, as microcontrollers are getting more and more powerful each year.
      The last serious microcontroller project I did, around 13 years ago, had a 40 MHz MIPS32 embedded controller; the bulk of the microcontroller code was written in C, and only 6 or 7 subroutines were later recoded in assembler (bitbanging of words/blocks of words into an SRAM, and bitbanging IDE bus transfers).
      This resulted in a 10x speed improvement, but compilers have gotten even better since then, and I doubt there would be as much improvement if it was (properly) done in C today.

    • @richardlighthouse5328
      @richardlighthouse5328 3 года назад

      @@glenndoiron9317 C means I have to think less about multiplying 32-bit or 64-bit numbers on an 8-bit microcontroller.

  • @Kris_M
    @Kris_M 3 года назад +65

    Just a note: software that turns assembly into machine language was (is) commonly known as an assembler.

    • @joesterling4299
      @joesterling4299 3 года назад +13

      Right. He's conflating assemblers with compilers, which is not at all correct.

    • @lorensims4846
      @lorensims4846 3 года назад +1

      Yeah, I do commonly hear references to "Assembly Language compilers" but I think that's just laziness.
      A lot of people use a C compiler to assemble their AL code, but this bypasses the compiler to go straight to the assembly step.
      A compiler performs the difficult task of converting human-friendly program code into roughly equivalent assembly code for whatever machine it's targeting.

    • @rty1955
      @rty1955 3 года назад +3

      The difference between a compiler and an assembler is that each line of source code in assembly is one machine instruction (not counting macros).
      A compiler produces one to MANY machine code instructions per line of source code.
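
The one-to-one mapping described above is why an assembler is so much simpler than a compiler; here is a toy sketch in Python with a hypothetical three-entry opcode table (the opcode values themselves are real 6502 ones):

```python
# Minimal sketch of an assembler: each source line maps to exactly one
# machine instruction. The table is deliberately tiny; real 6502 assemblers
# cover ~56 mnemonics across 13 addressing modes.
OPCODES = {
    ("LDA", "imm"): 0xA9,  # LDA #value  - load accumulator, immediate
    ("STA", "abs"): 0x8D,  # STA address - store accumulator, absolute
    ("RTS", None):  0x60,  # RTS         - return from subroutine
}

def assemble_line(line):
    """Translate one assembly line into its machine-code bytes."""
    parts = line.split()
    mnemonic = parts[0]
    if len(parts) == 1:                        # implied addressing, e.g. RTS
        return bytes([OPCODES[(mnemonic, None)]])
    operand = parts[1]
    if operand.startswith("#$"):               # immediate, e.g. #$41
        return bytes([OPCODES[(mnemonic, "imm")], int(operand[2:], 16)])
    value = int(operand.lstrip("$"), 16)       # absolute, e.g. $0400
    # the 6502 stores 16-bit addresses low byte first (little-endian)
    return bytes([OPCODES[(mnemonic, "abs")], value & 0xFF, value >> 8])

program = ["LDA #$41", "STA $0400", "RTS"]
machine_code = b"".join(assemble_line(l) for l in program)
print(machine_code.hex())  # a9418d000460
```

Three source lines, three instructions — while a single line of C or BASIC may expand to dozens of instructions.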

    • @PJBonoVox
      @PJBonoVox 3 года назад +2

      Listen to you all nitpicking just so you can let everyone in the comments know how smart you are.

    • @AmstradExin
      @AmstradExin 3 года назад +7

      @@PJBonoVox When people get paid for saying something, it is always valid to point out mistakes.

  • @dhpbear2
    @dhpbear2 3 года назад +6

    6:46. Thank you, David, thank you! I started programming in assembly back in the early 80s. This is the FIRST mention of 'assembly language' NOT being defined as 'machine language'!

  • @lazyhominid
    @lazyhominid 3 года назад +6

    C# also runs through an interpreter; it does not compile to machine language, but just like Java, it compiles to a byte code which is run on a runtime. The Java runtime is called JRE, and the C# runtime is called CLR. Java and C# are designed in exactly the same way when it comes to machine code.
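
Python, as it happens, follows the same compile-to-bytecode-then-run-on-a-VM design as the JRE and CLR, and its `dis` module makes the intermediate code easy to inspect firsthand:

```python
# Like Java and C#, Python compiles source to an intermediate bytecode
# that a runtime (here, the CPython VM) executes; `dis` shows that code.
import dis

def add(a, b):
    return a + b

bytecode = dis.Bytecode(add)
ops = [ins.opname for ins in bytecode]
print(ops)  # exact names vary by Python version, e.g. LOAD_FAST, BINARY_OP
```

The instruction names differ between interpreter versions (e.g. `BINARY_ADD` before 3.11, `BINARY_OP` after), which itself illustrates the point: bytecode is a contract with the runtime, not with the CPU.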

  • @wimwiddershins
    @wimwiddershins 3 года назад +19

    Back in 1980sumthin', I had a friend at school who'd sit in front of the C64 coding in assembly, while we all struggled to learn BASIC. It helped that both his parents were programmers.

    • @gilles111
      @gilles111 3 года назад +3

      Had such a friend too; he wrote his own racing game while we were still struggling to copy a BASIC program out of a magazine...

    • @aikou2886
      @aikou2886 3 года назад +1

      That sounds like lots of fun!
      Yeah, I'm not being sarcastic. I genuinely find this both fun and interesting; I just wish I'd been able to do it when I was a kid/younger.

    • @gilles111
      @gilles111 3 года назад

      Was always kind of jealous (in the same way as you describe — I wished I could do the same, as I found and still find this interesting). My friend was also the kind of kid that pulled a defective radio apart and managed to get it back working.

  • @leviwuzere07
    @leviwuzere07 3 года назад +122

    Holy hell, that cleanly, clearly, quickly and distinctly explained quite a lot about assembly languages that I've been struggling to figure out for months, as a modern programmer interested in learning how old computers worked.
    Excellent video

    • @joesterling4299
      @joesterling4299 3 года назад +3

      Assembly was the way to go even in the 1980s. You'd be amazed how well an 8088 IBM PC can perform if the graphics libraries are written in assembler. This was a forte of mine back in the day. Eventually, the technology advanced to the point where dabbling in machine code became rarely necessary. But the difference was night and day until graphics accelerators came into being.

    • @thewhitefalcon8539
      @thewhitefalcon8539 3 года назад +1

      @@joesterling4299 there's a demo called 8088 Domination which plays full-motion video with sound on an original IBM PC. Apparently, it is just barely possible.

    • @povilasstaniulis9484
      @povilasstaniulis9484 3 года назад +1

      Modern computers work very similarly. It's just that they are fast enough to run high-level languages at acceptable speeds and there's almost no need to write code in machine language directly.

    • @TheThomSirveaux
      @TheThomSirveaux 3 года назад +1

      @@joesterling4299 My first job after graduating was programming aircraft mission computers for the "Legacy" F-18 Hornets. Not the SuperHornets, the old mainstays, since they were still running on an Intel 8088 processor (well, there were a BUNCH of them, but yeah).
      This was less than ten years ago. While I was there, Boeing had a contract to upgrade the tech to... dual-core PowerPC chips and our group had to migrate from assembly to C++.
      At another position I worked, I had to set a register on an embedded platform, and for some reason the C++ code wasn't working. Thankfully, our IDE let you embed assembly code in C++, and that worked.
      Sometimes, assembly is just the way to go.

    • @mgjk
      @mgjk 3 года назад

      @@TheThomSirveaux I was once trying to demonstrate to somebody how memory protection worked. I set a pointer in C and would write to an address out of range to trigger a protection fault.... it didn't work. Then it hit me... because I never *read* the value, the optimizer discarded it. Once I turned off optimizations, it worked fine. Mental.

  • @tjsynkral
    @tjsynkral 3 года назад +19

    I would've just said that Java and C# work a lot like BlitzBASIC... They kinda do!

    • @__blackjack__
      @__blackjack__ 3 года назад +2

      Except that in an additional step their "P-Code" is almost always further compiled into native machine language.

    • @Spelter
      @Spelter 2 года назад

      @@__blackjack__ But only on the fly, so that takes time, and that's the reason many say it's slow. The VM cleans up the code in RAM, and if you run the same method again, it's compiled again.

  • @davidbonner4556
    @davidbonner4556 3 года назад +1

    As a Hardware Tech in the 70s I found it extremely useful to teach myself all manner of 8-bit machine code. I worked on S-100 gear (like the Altair 8800 shown in the video) but could only afford a Rockwell Aim-65 (almost identical to the KIM-1 shown) for personal use.
    For example, to test a memory problem at a known address in RAM, I'd enter a machine code program at a known good RAM address via the toggle switches or keypad, whether hex or octal, to load a value into the accumulator, store it at the bad address, and then loop. This would set all the address, data, and control lines to a known repeating pattern which I could then look at on an oscilloscope to isolate the bad chip or shorted lines or whatever caused the issue. This led to quicker repair of the problem than staring at a schematic and trying to "logic" it out.
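
The repeating-pattern loop described above is only three instructions; here is a sketch of how it hand-assembles on a 6502 (the addresses are made up for illustration — program entered at $0300, suspect RAM byte at $4000):

```python
# The hardware tech's memory-test loop, hand-assembled for a 6502:
# load a test pattern, store it at the suspect address, jump back forever.
PROGRAM_BASE = 0x0300   # hypothetical known-good RAM address for the loop
BAD_ADDR = 0x4000       # hypothetical address of the failing byte

code = bytes([
    0xA9, 0xA5,                                    # LDA #$A5  ; test pattern
    0x8D, BAD_ADDR & 0xFF, BAD_ADDR >> 8,          # STA $4000 ; low byte first
    0x4C, PROGRAM_BASE & 0xFF, PROGRAM_BASE >> 8,  # JMP $0300 ; loop forever
])
print(code.hex())  # a9a58d00404c0003
```

Those eight bytes are what would be toggled or keyed in — and once running, every address, data, and control line repeats on a fixed period an oscilloscope can lock onto.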

  • @MrRandomgamerdkHD
    @MrRandomgamerdkHD 3 года назад +236

    I read the title as "How Machine Learning Works" and wondered why David would make a video about that 😄

  • @misterhat5823
    @misterhat5823 3 года назад +69

    Assembly is not a lost art. It's still used for microcontrollers.

    • @mapesdhs597
      @mapesdhs597 3 года назад +6

      @@SimonWoodburyForget The caveat there is for industrial process control systems where available memory is often extremely limited, so doing it directly is preferable. Otherwise one is relying on however good the compiler is, and that's too much of a luxury when available RAM is counted in K. It's just that public awareness of these fields is basically nil.

    • @neutrino78x
      @neutrino78x 3 года назад

      @@mapesdhs597 wouldn't it be more because there's no K&R C or C++ compiler for that particular chip or system? There was a K&R C compiler for the Apple II, and its memory is measured in kilobytes :) Same with the IBM PC Model 5150. :) (Aztec C, Microsoft C)

    • @darranrowe174
      @darranrowe174 3 года назад +5

      Assembly is very useful for debugging too.

    • @misterhat5823
      @misterhat5823 3 года назад +4

      @@mapesdhs597 I've had to work with micros where RAM is counted in bytes. For a BLDC controller even 256 bytes is gross overkill when using assembly.

    • @misterhat5823
      @misterhat5823 3 года назад +2

      @@SimonWoodburyForget That's the complicated (and resource wasteful) way to do it. With microcontrollers we're speaking of RAM measured in K or even bytes and program ROM measured in K.

  • @CF542
    @CF542 3 года назад +18

    I wish that I had teachers like this in school. I might have learned to code and have stuck with it.

    • @mikeklaene4359
      @mikeklaene4359 3 года назад +1

      Not everyone is meant to be a programmer. To write assembler or even "C" requires a level of understanding of exactly what the computer/processor is doing.
      Languages like COBOL or C# or Swift assume that the coder is ignorant and must be protected from their own stupidity.
      Assembler and machine language just say "Yes sir!" and do what you tell them — even if it will just lock up the machine.

    • @mikeklaene4359
      @mikeklaene4359 2 года назад +3

      @randomguy8196 The programming language is a tool. Some are better suited for different tasks than are others.
      In the bad old days when a job had to run in 8 or 16K of memory and the processor was rather slow, assembler was the only choice.

  • @michaellong8812
    @michaellong8812 2 года назад +14

    This was a great video.
    Assembly language seems esoteric at first, but once you learn it, you realize that a computer is very much a machine with levers that do different things.
    Thanks for sharing.

  • @astrocapsule
    @astrocapsule 3 года назад +105

    Me: "mmmm I wonder how David started this 2021..."
    David Murray: BALLZ 64

  • @BenHeckHacks
    @BenHeckHacks 3 года назад +816

    Great video. Suggestion though, you should have discussed the "endianess" of how 16 bit values are stored in memory.

    • @thekornreeper
      @thekornreeper 3 года назад +11

      🖤

    • @TheNewVortexChannel
      @TheNewVortexChannel 3 года назад +4

      Yes

    • @johndododoe1411
      @johndododoe1411 3 года назад +30

      See www.ietf.org/rfc/ien/ien137.txt for the definitive endian-ness introduction.

    • @TheBigBigBlues
      @TheBigBigBlues 3 года назад +9

      Second Ben comment on a video I've watched today...

    • @bastscho
      @bastscho 3 года назад +36

      Endianess can be needlessly confusing.
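
A quick way to see the endianness being discussed; a sketch using Python's `struct` module (the 6502 and x86 are little-endian, storing the low byte of a 16-bit value first):

```python
# The same 16-bit value laid out in memory in little-endian byte order
# (as on the 6502 or x86) versus big-endian order (as on the 68000).
import struct

value = 0x1234
little = struct.pack("<H", value)  # low byte first: 34 12
big = struct.pack(">H", value)     # high byte first: 12 34
print(little.hex(), big.hex())  # 3412 1234
```

This is exactly why a 6502 instruction like `JMP $0300` is stored as `4C 00 03`: the address bytes appear "backwards" in a hex dump.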

  • @memsom
    @memsom 3 года назад +5

    A compiler and an assembler are two distinct stages; a compiler can use an assembler to assemble the assembly-language files it generated. The terminology we usually use in modern CS is: compiler for a high-level language; assembler for assembly language (what you call "machine language" is either an opcode/instruction or assembly language for processor-specific instructions); object code or machine code for the binary output; and binary/executable for the linked object code — the bit you can actually execute. On older machines without an OS that supports the notion of exe files it's a little murkier. Using "compile" for the assembly stage goes against normal usage.

  • @TheRetroShack
    @TheRetroShack 3 года назад

    Another great video David! As one of the inspirations for me starting my own channel recently, I just want to say ‘Thank You!’ For the years of great content so far, and hoping for many more years to come. Stay safe and well.

  • @drdeath893
    @drdeath893 3 года назад +93

    Minor nitpick: Java and C# are in the same category. C# gets compiled to CLI byte code to be interpreted by the CLR (or other CLI compatible runtime).

    • @mumblic
      @mumblic 3 года назад +9

      I was thinking the same:
      There are in essence four kinds of languages: assembly transformed to ML, compiled to ML, compiled to IL, and interpreted at runtime.
      Although the line between some of them is getting thinner and thinner.

    • @DylanMatthewTurner
      @DylanMatthewTurner 3 года назад +8

      But C# bytecode is actually not interpreted. It uses a JIT compiler to convert assemblies to machine code, which still makes it faster on average than traditional interpreters.
      I don't know if Java bytecode uses a JIT yet; it has been taking a lot from C# the past couple of years, such as the var keyword, but I don't believe it originally used a JIT compiler.

    • @drdeath893
      @drdeath893 3 года назад +10

      "Compiled" typically refers to languages that produce machine-specific instructions at compile-time. While languages like C# and Java (and even Python) can use a JIT runtime to run natively, the fact remains that they are not machine-specific until runtime.

    • @mumblic
      @mumblic 3 года назад +4

      @@DylanMatthewTurner Java and C# are very similar in the way they work: compiled to IL, and both use a JIT compiler. Actually C# — or rather the .NET JIT/RT — is developed around the ideas/principles of Java.

    • @alexmller1442
      @alexmller1442 3 года назад

      You can compile C# to native now; idk about Java though

  • @orondf343
    @orondf343 3 года назад +23

    5:20 C# and Java are actually in neither of the categories mentioned, and are much more similar to each other. They are compiled to an intermediate language (IL and bytecode, respectively), which is then interpreted by a runtime (CLR and JVM, respectively, using tricks to improve performance such as JIT (just-in-time) compilation). They are not entirely interpreted, yet not directly compiled to machine code — rather, both. (There exist some other languages that are also built on those same runtimes.)

    • @6581punk
      @6581punk 3 года назад

      The term is compiler-interpreter languages. I've not had that interview question for a while, but I've had it in the past. These days, with containerisation and cloud, it becomes less important. We're getting even further away from the metal.

    • @wenyuwang3203
      @wenyuwang3203 3 года назад

      Just a side note: there are works doing ahead-of-time compilation for Java to make standalone binaries.

    • @donpalmera
      @donpalmera 3 года назад

      >They are compiled to an intermediate language (IL and Bytecode, respectively)
      And so is C with any modern toolchain like GCC or LLVM. The fact that C started out life as a language that was translated directly to assembly, and that Java was VM-based from the start, means very little now. Everything meets in the middle: languages are just a frontend that is turned into some sort of generic representation, and that becomes machine code eventually. The JVM at this point is basically a massive compiler that takes Java bytecode as its IR and spits out machine code at run time. If you don't believe me, look at what Android did: they took Java's javac as their frontend to get Java bytecode and replaced the bit that generates machine code. Originally it was an interpreted virtual machine, but AFAIK now they do completely ahead-of-time code generation.

    • @natestickeler924
      @natestickeler924 3 года назад +3

      Somebody didn't watch the whole video before commenting 😂

    • @adamsfusion
      @adamsfusion 3 года назад +3

      @@natestickeler924 Saying "before you rush to the comments" does not make up for being wrong. If you're going to present yourself as an authority on a subject to beginners, at least get the easy information right.

  • @brendanhoffmann8402
    @brendanhoffmann8402 Год назад +3

    This is so cool. I remember talking about this with my friend in high school in the early 90s. He was a mad scientist... always tinkering with old computers. He ended up becoming the CFO of an ISP. Smart kid.

  • @Mirauge
    @Mirauge 2 года назад

    These are some of my favorite videos of yours. Thanks so much for sharing.

  • @give_me_my_nick_back
    @give_me_my_nick_back 3 года назад +19

    It's a thing of beauty when you first write the code for a big task requiring a lot of IO, then spend a day eliminating everything that can be eliminated, packing as much as possible into the fewest possible loops full of breaks and other ugly constructs you were always told not to use — making it kind of hard for anyone else to grasp — and voilà, a 3-minute task now takes 30 s.

    • @d2factotum
      @d2factotum 3 года назад +1

      I always remember reading about how Mike Singleton (RIP) wrote the original "Lords of Midnight" on the Spectrum. Basically, he needed to save every byte he could in order to fit the thing in memory, and he would even re-order subroutines so that, if one subroutine always called another at the end, he'd put them one after the other to save 3 whole bytes (the CALL and the extra RET in the original subroutine). If only modern OS programmers had the same mindset!

    • @evensgrey
      @evensgrey 3 года назад +5

      @@d2factotum Yes, we'd have a LOT less software. Not being miserly leads to bloat, but not needing to be miserly leads to much faster development.

    • @SerBallister
      @SerBallister 3 года назад

      @@evensgrey Exactly, it's a balancing act of resources. It's easy to scoff at modern software without realising the constraints that it was developed in, namely deadlines.

    • @davidbonner4556
      @davidbonner4556 3 года назад

      Add to that if you write machine code or even Assembly source on a 6502, the most efficient branch statements only work within the same page... For example, a branch occurring in page 0x0400 cannot branch into page 0x0300 or 0x0500.
      If you got a branch violation you would have to re-arrange code to keep the branch legal by moving subroutines around or padding with NOPs (No OPeration commands).
      If I had a dime for every time someone added a few bytes to an early bit of code and I (as the "Clean-up" coder) had to restructure the rest of the source to fix such violations, usually on a Friday night before a Monday customer demo...

    • @SerBallister
      @SerBallister 3 года назад

      @@davidbonner4556 Reminds me of Gameboy development with its ROM banking. Bank 0 would be basically full of page-swapping trampolines (jump to bank 0, switch banks, jump to target bank) for all the cases of functionality spanning many pages.
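
The restructuring described in the comments above comes from the 6502's conditional branches taking a signed one-byte offset relative to the next instruction; here is a sketch of the range check an assembler performs (addresses are hypothetical):

```python
# Sketch of a 6502 assembler's branch-range check: a conditional branch
# (BNE, BEQ, ...) encodes its target as a signed byte relative to the
# address of the *next* instruction, so the reach is -128..+127 bytes.
def branch_offset(branch_addr, target_addr):
    """Return the offset byte for a branch, or raise if out of range."""
    offset = target_addr - (branch_addr + 2)  # branch instructions are 2 bytes
    if not -128 <= offset <= 127:
        raise ValueError("branch out of range; restructure or pad the code")
    return offset & 0xFF  # two's-complement byte as stored in memory

print(hex(branch_offset(0x0400, 0x0400)))  # 0xfe (branch to self)
```

When the check fails, the usual fixes are exactly those described above: move subroutines around, pad with NOPs, or branch to a nearby JMP that can reach anywhere.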

  • @mateusbmedeiros
    @mateusbmedeiros 3 года назад +291

    *Starts typing well-meant feedback comment about Java and C# with video playing on BG *
    *Gets to the part where he says "Now, before you go rush into the comments to tell me how wrong I am about Java and C#" *
    "...goddamnit"

    • @Gurux13
      @Gurux13 3 года назад +24

      I'm in the same league.
      But... why not put them into one bucket of interpreted/compiled? It doesn't matter which one; they're basically the same in terms of all the interpreter/compiler, VM, JIT, etc. machinery!

    • @steffennilsen2132
      @steffennilsen2132 3 года назад +4

      Oh I admit I was triggered while hearing this in the background myself lol

    • @stewcoil2481
      @stewcoil2481 3 года назад +2

      @@Gurux13 Yeah, I think C# is compiled into Intermediate Language (IL), then the Just-In-Time compiler (JIT) compiles it on the end user's machine (and saves it) as machine language optimized for the CPU it runs on, so it might differ between AMD/Intel or different versions of the processor (like a new instruction might be available on a newer processor).

    • @borrrden
      @borrrden 3 года назад +17

      I’d certainly put c# and Java into the same category on this simplification. Both of them are compiled into an intermediate representation before being compiled into machine language :)

    • @BonkedByAScout
      @BonkedByAScout 3 года назад +13

      C# is not compiled and shouldn't be bundled with them. It's cached pre-interpretation at the VM level at best, which other languages like Python also do. Posing C# as a compiled to assembly language is just dumb.

  • @InvalidUserProfile
    @InvalidUserProfile 3 года назад +1

    I love your videos and I love the intro theme song! It gets me hyped for the day ahead. Thank you 8-Bit Guy!

  • @XFanmarX
    @XFanmarX Год назад +1

    13:28 Wow that's so silly but so easily memorable, I love the terms programmers come up with sometimes.

  • @pufaxx
    @pufaxx 3 года назад +14

    I'm not kidding: I really noticed that empty studio at about minute 19:00. Content was too interesting. 👍

  • @bongodoug
    @bongodoug 3 года назад +64

    I've learned more about code in 20 minutes than I have in 20 years, but to be fair, I live on the hardware side of computer knowledge.

    • @miles7267
      @miles7267 3 года назад +1

      You should look into the breadboard computer series and programming assembly for the TI-83. Those two things taught me so much about how computers work.

    • @kingkrispy5289
      @kingkrispy5289 3 года назад +1

      so you’re a really good electrical engineer?

    • @farhanyousaf5616
      @farhanyousaf5616 3 года назад

      What kinds of HW things? I'm a Software Engineer.

    • @hariranormal5584
      @hariranormal5584 3 года назад +1

      Same. I am more of a hardware/networking based guys

    • @Okurka.
      @Okurka. 3 года назад +2

      @@kingkrispy5289 He probably assembles PCs; not design motherboards.

  • @martinhertog5357
    @martinhertog5357 Год назад +1

    In 1983 I got my first Commodore 64. I learned Basic, machine language, all the peeks and pokes, even got a tiny C compiler. Wonderful times playing games, programming, disassemble games to see how they work. I even wrote a 6502 emulator on my PC just for fun. For a full C64 emulation I prefer using VICE!

  • @ctrlaltrees
    @ctrlaltrees 3 года назад +3

    We had an entire module on 6502 assembly language in my Computer Science degree course 15 years ago. I'm mainly a C# guy in my day job now. Anyway, this is a really excellent and information-dense introduction, nice work! I particularly liked how the part about everything being a memory address as far as the CPU is concerned was explained - getting your head around this is really fundamental and helps all of the other pieces fall into place. 👍

    • @stargazer7644
      @stargazer7644 Год назад +1

      But not all processors work this way. The Z80 and x86/x64 (as 2 examples) can talk to devices via memory addresses, but they also have an I/O port system where they can be mapped to IO ports instead so that the very limited system memory address spaces are not wasted for hardware.
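
The "devices are just memory addresses" idea from this thread is easy to model; here is a toy sketch (a 64 KB byte array, with the C64's well-known $0400 text-screen base standing in for a memory-mapped device):

```python
# Memory-mapped I/O in miniature: to the CPU a device is just an address.
# A single 64 KB "memory" array doubles as screen RAM, the way the C64
# maps its default text screen at $0400.
SCREEN_BASE = 0x0400  # C64 default text-screen address

memory = bytearray(64 * 1024)  # the full 16-bit address space

def poke(addr, value):
    """An ordinary store; if hardware is mapped at addr, it reacts."""
    memory[addr] = value

poke(SCREEN_BASE, 0x01)     # screen code 1 = 'A' in the top-left corner
print(memory[SCREEN_BASE])  # 1
```

On a Z80 or x86, by contrast, a separate `IN`/`OUT` port space exists alongside this memory map, as the reply above notes.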

  • @googlepask7551
    @googlepask7551 3 года назад +21

    Holy camera angle batman! Shelves intersecting the head in dead center. Looking forward how the studio ends up looking...

  • @undergroundman1993
    @undergroundman1993 3 года назад +12

    Back when I used to program my own Apple II games I would write out assembly subroutines and assemble them by hand on grid paper before typing in the hex manually

    • @fredsmith1970
      @fredsmith1970 3 года назад +1

      ha ha - I used to do exactly that with the ZX Spectrum and the Amstrad CPC464 back in the '80s.

    • @nuk1964
      @nuk1964 3 года назад +1

      Yep... similar story with me and the TRS-80 Model I. Write assembly code, then translate it into byte values (using the Z-80 reference guide), then write a BASIC program to POKE it into memory. One annoyance when hand-translating the assembly code is when you discover the branch target is outside the range of the offset value, forcing you to adjust the code — which breaks another branch, then another... Suffice it to say that that method of writing machine code involved enough tedium that I was only writing small programs that way.
      I do remember being somewhat envious when I found out on the Apple ][ you can type CALL -151 to get into the monitor program.

    • @johngaltline9933
      @johngaltline9933 3 года назад +1

      I still have notebooks full of this stuff my dad wrote 40 years ago. I never understood any of it until the last 5 years or so.

    • @adamengelhart5159
      @adamengelhart5159 3 года назад +1

      These folks assemble 👍

    • @billr3053
      @billr3053 3 года назад

      @@nuk1964 Model I, yay!!! I wrote a disassembler in BASIC for the Z80. Somewhere along the line I acquired an assembler and did proper assembly language programming though. I don't think I put in hex via POKEs in the fashion you described. Maybe a few bytes' worth to write text to the screen, but that's it.

  • @derecwilsom4546
    @derecwilsom4546 3 года назад

    thanks for the great video david, hope you are getting things back to normal after the storm, stay safe.

  • @TimsCabana
    @TimsCabana Год назад +1

    I have been writing assembly language code for over 30 years now. It's the only language that I know, and I still love it! I design a lot of embedded stuff, and write all the code in assembly. In fact, many years ago, I designed a timer system for my home recording studio that had 16 outputs, and could time events to the 1/100th of a second. I wrote all the code in machine code. At the time... I had no idea what a compiler was because I taught myself how to do this. I used to go through lots of paper!

  • @environmentNow
    @environmentNow 3 года назад +12

    When I was in high school (I just finished), I used to write simple programs in 8088 machine language (not assembly) using opcodes and counting the memory addresses; when I got back home, I typed them into a hex editor, saved the result as a .com file, and tried running it in DOSBox (there were usually a few mistakes).

    • @BigOlSmellyFlashlight
      @BigOlSmellyFlashlight 3 года назад +2

      reminds me of the period of time in 8th grade where i didn't have internet at home so i would look up my python errors at the school classroom laptop and write the possible solutions on paper and try those once I got home
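
The workflow in the parent comment can be reproduced in a few lines; here is a sketch that emits a classic 7-byte DOS .com program (real 8088 opcodes; the filename is arbitrary) which prints 'A' via INT 21h when run in DOSBox:

```python
# Building a tiny DOS .com file byte-by-byte, the way the commenter
# describes doing it in a hex editor: raw 8088 opcodes, no assembler.
program = bytes([
    0xB4, 0x02,  # MOV AH, 02h  ; DOS "print character" function
    0xB2, 0x41,  # MOV DL, 41h  ; character to print ('A')
    0xCD, 0x21,  # INT 21h      ; call DOS
    0xC3,        # RET          ; return to DOS
])
with open("hello.com", "wb") as f:  # .com files are raw code, no header
    f.write(program)
print(program.hex())  # b402b241cd21c3
```

A .com file has no header at all — DOS loads the raw bytes at offset 0100h and jumps straight in, which is exactly what made this hand-assembly workflow possible.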

  • @AltriusCodes
    @AltriusCodes 3 года назад +10

    It's worryingly convenient that I began researching the basics of using an assembly language only a day before this video came out. Thanks for the explanation!

    • @llamathenerd1672
      @llamathenerd1672 3 года назад +1

      Same, I was just thinking of learning how to code in machine language on my VIC-20.

  • @JeffRyman69
    @JeffRyman69 Год назад +1

    Reminds me of my first programming class in 1966 on an IBM 1620 (operated by a console typewriter and punched cards). First we learned machine language. Second we learned Symbolic Programming System (assembler). Then we learned Fortran II.

  • @wings9925
    @wings9925 Год назад +1

    Ah, the nostalgia! This takes me back to studying Computer Science in the late 80s and very early 90s! Writing interpreted code on the Sinclair ZX81 and Acorn/BBC Micro B, then progressing on to Pascal, C and ML itself. They teach little of the fundamentals these days: Little about hardware or the abstraction layer. Really enjoyed this trip down memory lane. Thanks.

  • @gzozulin
    @gzozulin 3 года назад +53

    You caught me when I was "rushing into the comments to tell" lol

  • @Lokalaskurar
    @Lokalaskurar 3 года назад +15

    Ooof! We REALLY need that blue colour back on the walls. It's become part of your journey :)

    • @6581punk
      @6581punk 3 года назад +2

      When doing photography, to get accurate metering they often use a "grey card" to set the camera exposure. So a darker background definitely stops exposure problems.

    • @The8BitGuy
      @The8BitGuy  3 года назад +25

      I do plan on painting the wall blue behind me...

    • @pedrofelck
      @pedrofelck 3 года назад +3

      @@The8BitGuy VIC-20 blue like your neighbour's house? I mean, like the outside of the studio. :P

    • @wadereynoldsgm
      @wadereynoldsgm 3 года назад +3

      C64 blue with light blue trim all the way around...

    • @mfaizsyahmi
      @mfaizsyahmi 3 года назад +1

      @@6581punk I saw that on Technology Connections.

  • @nicodermcq
    @nicodermcq 3 года назад +2

    Absolutely fantastic video. I don't think I've ever seen anyone describe or demonstrate this so well.

  • @-r-495
    @-r-495 Год назад

    Your content is right up in the top three of my favourite yt creators.
    And that‘s just me.
    You‘ve got at least one!

  • @mattfojtik7130
    @mattfojtik7130 3 года назад +76

    There is a pronounced 60 Hz hum in your new studio. It's present in every scene where you are talking in front of your desk.

    • @Defianthuman
      @Defianthuman 3 года назад +9

      60hz is the American power standard and there are computers and things running

    • @HighPassGuy
      @HighPassGuy 3 года назад +12

      It actually sounds lower than 60 Hz. It sounds like it could be the AC or some fan running. A high-pass filter would fix this for you. Most mics have one built in.

    • @richardlighthouse5328
      @richardlighthouse5328 3 года назад +3

      @@HighPassGuy Bandpass filter would be better.

    • @markusbl80
      @markusbl80 3 года назад +1

      Bass roll off would do the trick

    • @HighPassGuy
      @HighPassGuy 3 года назад +4

      Why would a bandpass filter be better? and a "bass roll off" is a high-pass, just said a different way.

  • @user-tb5ns7hc5i
    @user-tb5ns7hc5i 3 года назад +16

    I learned assembly and machine language on an Apple II in grade 8 in the mid 80s after learning to code in basic on a Vic 20. Great video David. Memories. Pun intended.

    • @TheRainHarvester
      @TheRainHarvester 3 года назад +1

      This interests me. What books/people/resources did you have that allowed you to find info for learning? I was similar in age and desperately looked in every book in every book store and library but couldn't find info.

    • @user-tb5ns7hc5i
      @user-tb5ns7hc5i 3 года назад +1

      @@TheRainHarvester There was a unique school 'nerd' culture at the time, 40 years ago when home computing was so new, so there was a lot of shared interest, and resources were found mostly by networking and hanging out in computer labs, clubs, and online BBS chat rooms with similarly minded enthusiasts. We learned from each other in real time by dissecting and sharing each other's code, watching older kids code, and 'hacking' software to learn how the applications ran. There were also many magazine publications with programming code examples in them that we would religiously type out and learn from. It was a very social, intense time for computing, with a lot of exciting new ideas happening quickly. Today I would recommend joining a local computer club and joining online groups dedicated to specific programming languages, catering to your level of experience. If I were looking to get into it today, I would immerse myself in artificial intelligence, machine learning, VR, and neural networks for sure — the next big things, along with nano and bio tech.

    • @markyacoubian1911
      @markyacoubian1911 Год назад

      @@user-tb5ns7hc5i Seems like a similar era today (I am currently in CompSci1900 at the local state U); even though I was only in primary school in the mid/late 80's I have a pretty solid sense of historical time periods. They also tried to teach me Apple II stuff and some simple programming (2nd grade thru 5th grade in 1988-1992) but I didn't get much further than Oregon Trail (version 1.0) and the version 2.0 of Oregon Trail with slightly better graphics. Haha.

  • @MrSloppyLoppy
    @MrSloppyLoppy 3 года назад

    Hey, the construction of that was quite nice, and thank you for sharing it as well. After seeing you put in the wiring and how much care you put into that place, it's going to be worth the wait — and worth the while watching you make some new videos and come out with some new projects. Thanks for sharing.

  • @001GrandExplorer
    @001GrandExplorer 2 года назад

    The best programming intro I've ever had!
    I'll even recommend this to my colleagues!!
    Thank you so much for this great work!

  • @kjrehberg
    @kjrehberg 3 года назад +14

    Great video. I'm old enough to call it "assembler" instead of "assembly language."

    • @maighstir3003
      @maighstir3003 3 года назад +9

      I was under the impression that the language (well, languages, one for each CPU) is called assembly while the application that assembles the text into machine code is called an assembler. That may very well be a more recent change though, I've only been messing about with computers for less than 30 years and the lowest-level language I've dabbled in is C - and only barely at that - I'm much more familiar with Python, Bash, and Powershell.

    • @junbird
      @junbird 3 года назад

      Those are two very different things, even though this misconception is very common even to these days (I heard computer science teachers confuse the two concepts). Assemblers are basically compilers, they are programs that compile assembly language to machine language.

    • @countzero1136
      @countzero1136 3 года назад

      You too eh?

  • @234laptop
    @234laptop 3 года назад +95

    Feedback on the new set: The audio has a bit of reverb/echo. Apart from that, looks nice.

    • @KanawhaCountyWX
      @KanawhaCountyWX 3 года назад +4

      I sincerely hope he moves over the bracket mounted keyboards.

    • @GP1138
      @GP1138 3 года назад +12

      There also appears to be some low-frequency hum in the audio.

    • @cumulus888
      @cumulus888 3 года назад +3

      @Cheese Touch Gaming Wonder if that is from the LED lights.

    • @whiffysole
      @whiffysole 3 года назад

      @Cheese Touch Gaming it's probably something to do with the bright white background and how he has the lighting set up

    • @FerroequinologistofColorado
      @FerroequinologistofColorado 3 года назад

      @@KanawhaCountyWX I hope so too

  • @richardsd8526
    @richardsd8526 3 года назад

    New studio looks great. Always a good video!!!

  • @benjaminmoseslieb9856
    @benjaminmoseslieb9856 Год назад +1

    This was so eye-opening for me. Thank you so much! I've been a web developer for 15 years and never really had to look into this. The advent of WebAssembly though is changing that.

  • @XanthinZarda
    @XanthinZarda 3 года назад +39

    Remember, Roller Coaster Tycoon was made in assembly with just a little C to make it executable.
    That's how powerful and versatile it is, and why it was so easy to run on a vast swathe of systems with few problems; perfectly timed out.
    (It's explained on Chris Sawyer's personal website.)

    • @MrMatteNWk
      @MrMatteNWk 3 года назад +3

      "I haven't finished my tentacle yet."

    • @Tatsh2DX
      @Tatsh2DX 3 года назад +8

      RCT only runs on x86. This is the nature of writing in assembly.

    • @kurtbitner6675
      @kurtbitner6675 3 года назад +1

      @@Tatsh2DX it was ported to ARM because it’s available on iOS.

    • @FatCatCooper
      @FatCatCooper 3 года назад +1

      @@kurtbitner6675 and android!

    • @Tatsh2DX
      @Tatsh2DX 3 года назад +8

      @@kurtbitner6675from Wikipedia: 'In March 2016, Sawyer affirmed he had started work on RollerCoaster Tycoon Classic with Origin8, to be released for mobile devices.[1] As with the rework of Transport Tycoon, this required Sawyer and Origin8 to rework the assembly code from RollerCoaster Tycoon 2 into C. They were also able to add new elements to the game during this period.[3]'

  • @andie_pants
    @andie_pants 3 года назад +11

    So I clicked on ONE silly 22-second long video and suddenly my YT feed is nothing but thumbnails with red arrows and circles. I was about to lose hope when suddenly... The 8-Bit Guy appeared!

  • @gryfandjane
    @gryfandjane 3 года назад +2

    This was fascinating, thanks! Back in the day, I would stay up late keying BASIC programs into my Atari 400 from books that didn’t target Atari machines. I learned a lot. Then I took a data processing class at a local community college, in which we entered machine code into punch cards and executed them on a mainframe, with the output appearing on a huge dot-matrix printer. Yeah, really old-school for the mid-1980s; there were complaints from the class, along with suggestions that learning a higher-level language would make more sense. So the teacher explained that the language we were using had been a pet project of one of their IT faculty, and we were basically stuck with it. And the funny thing was, the class gave me a solid concept of what actually goes on under the hood. I learned things like not sending my outputs to the same memory space as my instructions, causing my program to eat its leg off and die, for example. All told, it was great fun. Again, thanks for all the great content!

  • @memepatroller9533
    @memepatroller9533 3 года назад

    The new studio looks amazing! I love your channel.

  • @alphabeets
    @alphabeets 3 года назад +7

    I’ve waited decades for a video like this explaining these things in simple terms. Thank you.

  • @KLa35
    @KLa35 3 года назад +7

    This is the Mecha Hyper Jesus Animatronic with lasers of 8-Bit Guy episodes.

  • @AnimationByDylan
    @AnimationByDylan Год назад +1

    C# is originally compiled into an intermediate byte code (tokenized), which is then just-in-time compiled on execution. It requires a runtime interpreter.

  • @anjinmiura6708
    @anjinmiura6708 3 года назад +6

    It's good that he's getting back to work again. And even better that he's discussing Assembly language. I used to write assembly language for CoCo and OS-9 (6809).
    I think you SHOULD make a side channel for C64 Assembly language development. It would be awesome. PLEASE!

  • @discoHR
    @discoHR 3 года назад +7

    11:55 We used to call it "assembler" over here in Europe while "compiler" was used for high level languages (Pascal etc). Yes, they basically do the same thing.

    • @bozimmerman
      @bozimmerman 3 года назад +1

      Same in the U.S., though I have to say that yall had MUCH more fun with the word 'assembler' when naming them than we did.

    • @rty1955
      @rty1955 3 года назад

      Not at all. You CAN create machine code by hand from assembler source code, but you cannot create machine code by hand from a compiled language such as C

    • @discoHR
      @discoHR 3 года назад

      @@rty1955 They both do the same thing.
      Assembler: assembly -> machine code
      Compiler: C -> machine code
      You can also inline assembly language into C source if you want to speed up your code or if you want to skip unnecessary machine code instructions compilers usually generate. I used to do it a lot back in the (DOS) days.

    • @rty1955
      @rty1955 3 года назад +1

      @@discoHR no they don't... An assembler assembles and a compiler compiles....
      Assembly language is one line of source to one machine instruction; a compiler produces MANY machine instructions from one line of source. BIG difference

    • @discoHR
      @discoHR 3 года назад

      @@rty1955 Doesn't matter. They both produce machine code. Humans can write unnecessary instructions too, such as CPX after INX shown in this video. INX already updates the zero flag so CPX is not needed and it just slows everything down.
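
The one-to-one translation this thread keeps coming back to, one assembly mnemonic mapped to one machine-code opcode via a lookup table, is easy to sketch. The following is a hypothetical minimal table-driven assembler in Python, not a real tool; the opcode bytes, however, are the genuine 6502 encodings for these addressing modes.

```python
# Minimal sketch (not a real assembler): a 6502-style one-to-one
# mnemonic -> opcode lookup, the table-driven translation described
# in the thread. These opcode bytes are the real 6502 encodings for
# these specific addressing modes.
OPCODES = {
    ("LDA", "imm"): 0xA9,   # LDA #$nn  (load accumulator, immediate)
    ("STA", "abs"): 0x8D,   # STA $nnnn (store accumulator, absolute)
    ("INX", None):  0xE8,   # INX       (implied, no operand)
    ("RTS", None):  0x60,   # RTS       (return from subroutine)
}

def assemble(lines):
    """Translate (mnemonic, mode, operand-bytes) tuples into machine code."""
    out = []
    for mnemonic, mode, operand in lines:
        out.append(OPCODES[(mnemonic, mode)])  # one opcode per instruction
        out.extend(operand)                    # 0, 1, or 2 operand bytes
    return bytes(out)

program = [
    ("LDA", "imm", [0x41]),         # LDA #$41  ('A')
    ("STA", "abs", [0x00, 0x04]),   # STA $0400 (little-endian address)
    ("INX", None,  []),
    ("RTS", None,  []),
]
print(assemble(program).hex())      # a9418d0004e860
```

Running it prints `a9418d0004e860`, the same bytes a hand-assembler would produce for LDA #$41 / STA $0400 / INX / RTS, which is exactly the 1:1 correspondence a compiler for a high-level language does not give you.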

  • @greenslimy
    @greenslimy 3 года назад +18

    I wish I could like this twice. It explains all of the intricacies involved in Computer Science.

    • @pvc988
      @pvc988 3 года назад +4

      Not even close.

  • @michaeldefries3583
    @michaeldefries3583 Год назад

    this is the best explanation I've ever heard/seen. I started programming over 25 years ago with C++ and had no idea how it actually worked. I realised early on that I wasn't driven enough to keep up with the pace of learning different techniques and languages, but it has always interested me. You've just answered one of my main questions about machine language and assembly. thank you.

  • @josephgaviota
    @josephgaviota 2 года назад +1

    Mr. 8-bit. I'm watching this a second time today.
    You're really good at making things easy to understand, and many of us really appreciate your efforts.
    PS: As a FORTY-year computer guy (CCI / ATEX / PENTA / KORN / PYTHON) ... I can really appreciate what it takes to get the computer to do what we need for a given task.

  • @RWSCOTT
    @RWSCOTT 3 года назад +12

    I always used to giggle when they'd mention the lords of *Kobol* on Battlestar Galactica :D

  • @xyzzyfrobozz
    @xyzzyfrobozz 3 года назад +6

    The new studio needs a sign saying, "This space not intentionally left blank." :) Nice progress though.

  • @trevinbeattie4888
    @trevinbeattie4888 3 года назад +4

    Learning assembly in the 80’s and 90’s, I’ve found that 6502 is relatively trivial (that’s the first one I learned), 8086 (and its successors) was nightmarish, 68000 is clean and elegant, and VAX is rather fun (it’s one of the most complex instruction sets I’ve ever had the pleasure to program in - it even has string operations!). I would have liked to try my hand at a RISC processor such as the T400 or ARM, but never got my hands on one.

    • @stargazer7644
      @stargazer7644 Год назад

      Take a look at like the Microchip PIC series of microcontrollers. Those are RISC processors and are often programmed in assembly, though C compilers are available.

  • @carpdude73
    @carpdude73 3 года назад

    It is fun watching your videos (even though I am only following a fraction of your explanations)! Great new space! Can't wait to see it completed!

  • @Jesse__H
    @Jesse__H 3 года назад +6

    You have the best intro on YouTube.

  • @keatondickens2532
    @keatondickens2532 3 года назад +8

    can't wait for PART 3 studio construction but i do miss the old blue background

  • @nervousquirrel
    @nervousquirrel 3 года назад

    Awesome video man, thank you for putting in the time on it. This was very cool.

  • @davidtucker4044
    @davidtucker4044 3 года назад

    Love your passion dude. Keep up the good work.

  • @wanderingheath
    @wanderingheath 3 года назад +5

    This is probably my favorite 8bit video, and that’s saying a lot.

  • @thepillageroutpostguy6121
    @thepillageroutpostguy6121 3 года назад +21

    Hey look the 8 bit guy starts filming in his new studio!

    • @HelloKittyFanMan.
      @HelloKittyFanMan. 3 года назад +3

      Videoing.

    • @n9wox
      @n9wox 3 года назад +1

      Recording

    • @Okurka.
      @Okurka. 3 года назад

      Taping

    • @nhand42
      @nhand42 3 года назад

      Shooting

    • @HelloKittyFanMan.
      @HelloKittyFanMan. 3 года назад +3

      Nahh, @@Okurka., I bet you he's not using tape-based cameras anymore. That term is fairly outdated, even though "filming" is even more so. But since "flashing" doesn't sound very recording-specific, and since he's still shooting on video, that's why "videoing" is best.

  • @ChrisM541
    @ChrisM541 2 года назад

    Excellent video! This brings back memories. Subbed!
    As someone who has programmed a lot of assembly software in the 80's for the C64, there's a couple of things that would benefit from a little expansion. At 5:12 assembly language isn't compiled - it's converted to machine code by a process known as assembly. This distinction is important to note - particularly for the way in which many assemblers 'of that era' worked...your hand-written assembly program, literally, was a 1:1 version of machine code. Every assembly mnemonic op-code was converted to its machine language via a simple lookup table, a table which also determined the type of operand, where applicable (CLC, SEC, INX, etc have no operand). In these early days, the programmer would always know what the machine code output would be before assembly.
    Confusingly (and I guess where we're seeing 'assembly as a compiled language' in this video), some of today's compilers e.g. C++, through the compilation process, can also produce an assembly version output. But this is not hand written(huge distinction!!) - it's been generated from a high level language source i.e. C++ - the programmer isn't writing in assembly language.
    In later years assemblers would offer increasing levels of macro abstraction together with machine code optimisations e.g. an assembly multiply or divide instruction by a power of 2 might get replaced by a faster rotate instruction. Depending on the use case (and for maximum speed), a pre-computed look-up table could also be used (used extensively in all those shoot-em-up's).
    The primary focus with these early computers, particularly for games programmers, was on execution speed and program size. Today's extremely fast CPUs, GPUs and large RAM sizes, perhaps unfortunately, require programmers to pay much less attention to the level of manual optimisation required in the past.
    A word of caution, though, for compiled languages - don't assume the compiler will always produce the fastest possible machine code - manual assembly optimisation in certain time-critical loops (e.g. main game loop) is more common than you might think in today's AAA studios!
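
The optimisations described in the comment above, power-of-two multiplies replaced by shifts, and pre-computed lookup tables used in place of run-time arithmetic, can be sketched in a few lines. This is an illustrative Python sketch, not 6502 code; the C64 text-screen base address ($0400) and 40-column rows are the real values.

```python
# Illustrative sketch (Python, not 6502 assembly) of two classic
# 8-bit speed tricks.

# 1) Multiply/divide by a power of two as a bit shift, the substitution
#    an optimising assembler or programmer would make (ASL/LSR on a 6502):
x = 53
assert x * 8 == x << 3   # three left shifts instead of a multiply
assert x // 4 == x >> 2  # two right shifts instead of a divide

# 2) A pre-computed lookup table, e.g. screen-row start addresses on the
#    C64 (40 columns per row, text screen at $0400), baked in at assembly
#    time so the inner loop never has to multiply at all:
SCREEN_BASE = 0x0400
ROW_ADDR = [SCREEN_BASE + 40 * row for row in range(25)]
print(hex(ROW_ADDR[10]))  # row 10 starts at 0x590
```

On a 6502, which has no multiply instruction, the table version turns a row-address calculation into a single indexed load, which is why those tables show up in so many action games of the era.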

  • @elanman608
    @elanman608 3 года назад +1

    Real nostalgia trip. I remember using the RML 380z at school in the very late 70's and early 80's, which had a halfway decent Basic compiler as well as an assembly language interpreter, and competing to perform tasks in the fewest clock cycles.

  • @niyagentleman8143
    @niyagentleman8143 3 года назад +4

    The 8-Bit Guy and the thousand others, for those people precisely, we should feel SO THANKFUL towards them,
    for spreading their ENORMOUS knowledge worldwide! RESPECT
    ( big GEEKY BRAINS ) !!!
    LOVE U!

  • @jsmunroe
    @jsmunroe 3 года назад +28

    C# is a language that is compiled into MSIL which is not machine code but an intermediate language. MSIL is executed on the Common Language Runtime which JITs it into machine code native to the machine it's running on during execution. As a result, managed DLLs and EXE's are encoded in MSIL and can pretty easily be decompiled into C# that is just a little syntactically verbose but still easily readable. Visual Studio now automatically decompiles MSIL assemblies when you perform a Go To Definition (F12) on something that is not in your own code. This is the world we live in now, and it is wonderful.

    • @veschyoleg
      @veschyoleg 3 года назад +3

      And C# and Java aren’t that different in terms of syntax and learning curve. Much more similar to each others than to C++ or any other languages listed.

    • @jsmunroe
      @jsmunroe 3 года назад +2

      @@veschyoleg C# is a cross between Java and Delphi. It is much better than both IMO.

    • @grn1
      @grn1 3 года назад

      @@jsmunroe How does Java's Bytecode differ from MSIL? I learned a bit of Java in college (forgot most of it as I've not used it) and was taught that Java is an interpreted language that speeds itself up somewhat by compiling into Bytecode.

    • @joechristo2
      @joechristo2 3 года назад +1

      what is JIT

    • @veschyoleg
      @veschyoleg 3 года назад +2

      @@joechristo2 Just-In-Time compiling, i.e. compiling that happens at the end user machine before execution.
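
The "compile to an intermediate form, then interpret or JIT it" model this thread describes for C#/MSIL and Java bytecode also exists in CPython, which makes it easy to poke at directly. A small sketch, using the standard-library `dis` module (exact opcode names vary between Python versions):

```python
import dis

def add(a, b):
    return a + b

# CPython compiled `add` to bytecode when the function was defined;
# `dis` decompiles that bytecode, much like inspecting MSIL with a
# .NET decompiler or Java bytecode with javap.
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)  # opcode names vary by version, e.g. LOAD_FAST ... RETURN_VALUE
```

This is the answer to the Java-bytecode question in miniature: the intermediate instruction sets differ in detail (stack layout, typing, metadata), but the pipeline of source → portable bytecode → per-machine execution is the same idea in all three runtimes.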

  • @philipgrice1026
    @philipgrice1026 Год назад

    Back in the 1960s I operated and programmed an 8K EMIDEC 1100 computer. It was used extensively by a large company to do things like calculate the payroll for eight thousand employees, math modeling for petroleum cracking plants, and physical inventory. Everything was written in machine code but some programs still took days to execute. We replaced the EMIDEC with an IBM 360 Model 40 with 64K. It was not the panacea we anticipated, as half the memory was consumed by the operating system and the MTBF was measured in hours. But now we learned to use Assembly language. Cobol and Fortran were used by the commercial application programmers, but we 'system programmers' used Assembly to create 'tight' code that executed faster. My favorite instruction was the BXLE, Branch on Index Low or Equal. I loved writing tight loops using 'Bixlees', without explanatory remarks in the source code. Keep 'em guessing how it works. Today's software does little computing, it seems. It is all about painting screens for video. So sad to see phenomenal computational power idling along painting games or streaming trash TV programs.

  • @Deniii4000
    @Deniii4000 3 года назад +1

    This is one of your best videos.
    More like this one, please.