Why should I learn assembly language in 2020? (complete waste of time?)
- Published: 10 Jul 2024
- Patreon ➤ / jacobsorber
Courses ➤ jacobsorber.thinkific.com
Website ➤ www.jacobsorber.com
---
Why should I learn assembly language in 2020? (complete waste of time?) // Assembly language is one of the most hated things about computer science education. It's also the foundation for nearly everything that happens on your computer. In this video, I describe a few reasons why you still might want to learn assembly in 2020.
***
Welcome! I post videos that help you learn to program and become a more confident software developer. I cover beginner-to-advanced systems topics, including network programming, threads, processes, operating systems, and embedded systems. My goal is to help you get under the hood and better understand how computers work, so you can become a stronger student and a more capable professional developer.
About me: I'm a computer scientist, electrical engineer, researcher, and teacher. I specialize in embedded systems, mobile computing, sensor networks, and the Internet of Things. I teach systems and networking courses at Clemson University, where I also lead the PERSIST research lab.
More about me and what I do:
www.jacobsorber.com
people.cs.clemson.edu/~jsorber/
persist.cs.clemson.edu/
To Support the Channel:
+ like, subscribe, spread the word
+ contribute via Patreon --- [ / jacobsorber ]
+ rep the channel with nerdy merch --- [teespring.com/stores/jacob-so...]
Source code is also available to Patreon supporters. --- [jsorber-youtube-source.heroku...]
Writing in assembly makes you realize how comfy a job you have with all these high-level languages
And it explains to some degree why games in the 80s took just a few KB, while today even the most trivial window program requires a double digit MB. Abstraction is nice, but comes at a cost. Now that Moore's law is dead, I'm pretty sure that in the near future we'll rediscover the value of "efficient" programming without the terrible waste of using vast amounts of resources incredibly inefficiently.
yea, funny thing is high level programmers think they're legends..
@@saulgoodman5662 My personal experience is that it's primarily the problematic-to-bad coders (e.g. writing completely unmaintainable code, no inline documentation, not following coding guidelines, ignoring business requirements, altering external interfaces at will without joint agreement, never meeting any deadline, wasting time optimizing code that is not the bottleneck, always having disputes with every other coder, etc.) who think they are the real deal and god's greatest gift to humanity.
The real good ones, that simply produce perfectly fine and highly maintainable code, stick to agreements, are reliable and always meet their deadline usually don't claim such things.
@@frankschneider6156 Moore's law isn't dead. Although the speed of CPUs isn't increasing at the rate it once did, GPUs are.
@@Extreme10s Nope even GPUs are taking longer to iterate than before. It used to get more than 30% every year and now it's more like 40% every 2 years while being slightly more expensive.
Learning assembly will make you:
1. Understand that 1 KB is a lot.
2. Able to command the computer yourself.
3. Understand the computer from the lowest level.
4. Able to write an operating system *YOURSELF*.
5. Understand what the compiler produced.
@Donald Duc Check out Ben Eater's 6502 breadboard computer project (ruclips.net/p/PLowKtXNTBypFbtuVMUVXNR0z1mu7dp7eH), no kernel (or "kernal" if you're from the CBM world). You'll get a deep understanding of how computers work on the molecular level (sort of). Every little thing you want the computer to do has to be spelled out (via assembly); many of those things are taken care of for you by high-level languages. Being concerned with the nitty-gritty details of simply adding 2 numbers together makes you a much better programmer in higher level languages. The ultra plus side is when the wife asks "Whatcha doin'?", you can simply say "it's technical", show them some code and they quickly walk away :).
@@dcc1165 Better learning tool is Ben eater's 8-bit breadboard computer from TTL logic gates. Allows you to be in full control over your machine!
@@richardlighthouse5328 Been there, "watched" (not done yet) that :). I've watched all the videos, just haven't gotten the kits yet. Work and life keep getting in the way :)
Don Costello My wife can’t speak anyway
How to make your computer your slave 101.
If you enjoy something it is never a waste of time 😊
You have a point. But what if I enjoyed collecting stamps? Or what if I enjoyed playing the guitar? Or some other useless thing?
@@omairtech6711 Useless in what regard? If it makes your life better in any way is it really useless?
@@hansisbrucker813 Good point.
so fucking true
I once wrote a simple binary tree program in both C and assembly, and the C one ran faster :(
"The compiler is smarter than you" is a phrase I've seen devs say here and there, and it's pretty true; the compiler knows a lot of optimization tricks...
Makes me cry at night... ):
I feel so weak...
@@J_Tanzanite It's not that simple. If you use the wrong values or addresses in registers and memory, it will warn you; otherwise it keeps you going. In my opinion, nothing beats assembly language, because it's very fast and can interact directly with the hardware.
@hameed yousfi How fast is it? Is it the fastest language in computer programming?
Betcha you didn't look up the number of cycles for each instruction and its addressing mode, and write the code a bunch of different ways and stuff. Modern processors have so many instructions that "straightforward" code is unlikely to be the fastest.
I should find the video I saw of a "compiler explorer" I think it was, probably a GOTO conference talk, where he showed the asm code various compilers generated (and for various Intel processors) for one example: multiplying by a constant. Sometimes it called the multiply instruction, other times it did shift-and-add, other times it used an addressing mode and used LEA to capture the result.
This must have been it:
ruclips.net/video/w0sz5WbS5AM/видео.html
When I was writing 6502 asm (okay, this was just a few short decades back, think "More Than A Feeling" on the radio), when I wrote a routine I could feel reasonably confident it was an optimal implementation of the algorithm I was using. Not any more.
I'd say "Get off my lawn" but we only had dirt back then.
@@TranscendentBen
Help I can't press that
"assembly is human readable machine language"
Next screen: unreadable gibberish.
All it is is a simple Hello World program.
Well compare that to a screen of nothing but 1's and 0's, because that's what machine code is.
@@Lucky10279 machine code can be readable opcodes. People used to code in machine code back in the day, believe it or not
@@Victor-vc9br You'd want to use a binary analyzer such as Ghidra or IDA
Was quite readable to me.
Learning assembler early on, and understanding how C compiles into it, has been a significant influence on my coding technique and a benefit to my career.
How would you recommend approaching learning this?
@@jask00 Watch Jacob's video and try it out, use the compiler to output assembly from your programs. Then find a tutorial or reference for your hardware. There are some tutorials on youtube.
May I ask how much money you make?
I have learned this and I hate it
Writing compilers and all that is never fun
@@azophi buddy, you're like 15 years old, you aren't writing any compilers
A lot of game companies hire engineers that know assembly to reverse engineer cheating software
Do you have any sources of that claim?
@@floatingchimney www.linkedin.com/jobs/view/1843928024/?alternateChannel=search
This one I found in a few minutes on LinkedIn. I've seen it for EA Games as well, and some other mobile game company in SF I can't remember.
@@floatingchimney im no code guru but I've developed cheating software for personal testing as I want to be a penetration tester. It involves a pretty decent amount of assembly.
@@jh-cv4lh Bro, you don't need to be a super hacker to create cheats.
I "hacked" into games' memory locations that are variables that hold game score and directly manipulated the memory, thus changing its value in-game.
That was without assembly.
@@floatingchimney What you're doing isn't nothing, but you can do so much more with assembly. Frankly, you don't even need to write Assembly to do game hacking, you just need to know how to read it. If you can read and understand assembly, you can actually just write your modifications in C using something like Detours. It's critical for anything past cheating in single player games. While I don't encourage hacking in multiplayer games, hacking has many uses. For instance, making games run on modern computers which don't work out of the box.
I am studying ARM Assembly at uni. It's really useful when it comes to raw performance, but the code is much harder to organize. Respect for developers back in the 80/90's
Well, more so the 60s-70s, when they built operating systems solely in assembler. In the 80s and 90s we mainly programmed assembler for demos, or small subroutines that needed to be highly efficient, but we would link them to C applications as extern functions.
@@rdoetjes I was referring to game developers for handhelds or maybe devs for embedded systems and whatnot. I mainly program in C++ and I do understand the potential of C languages to be a friendly, nicer, more elegant substitute for assembly languages
@@rdoetjes even to this day some parts that require extreme efficiency will be made purely on assembly, but one can become a very good software engineer without knowing a single opcode these days
People think I’m crazy for enjoying ASM, but the Raspberry Pi and ARM architecture has been a great incentive to get back into playing with it again!
I love programming in assembly as a hobby: I first learnt about microprocessors at college by programming a 6800 using nothing but switches and discrete LEDs, i.e. programming in machine code. This is the best way to learn!
Well, as an embedded engineer on the low and deep level, as I am, there is hardly any way to avoid assembly. My profession includes writing parts (or even the entirety) of device drivers in assembly, startup code in assembly, parts of operating systems in assembly, optimize specialized functions in assembly and so on. I actually write hex code on the fly too (with the help of the OP code TRM for the particular architecture of course). Sometimes I am too lazy to flash code changes (or write them in the first place), so I patch that stuff on-the-fly in RAM via debugger (given the code runs not from ROM). Also, high level software (wrappers) can make use of assembly too. I once wrote an ANSI C over ASM to C++ private method invoker for example. We had to call several C++ classes' private methods from the ANSI C context, but rewriting the classes would have broken the code base at that point. So writing a custom invoker in assembly mixed with C/C++ was the perfect intermediate solution. The compiler could not complain about "wrong access scopes", we could easily make our release and schedule a refactoring of said classes for the next time.
But true, I have one of those highly specialized jobs, but I still would recommend anyone seriously thinking working with embedded architectures (and I mean the real baremetal stuff, not this wanna be Arduino type of things) to dip their nose into it. See if you are fit for low or deep level (and ready to take the challenges) or if you rather should stay in the mid or high level embedded engineering where there is practically no assembly in your daily work (except seeing it in the debugger).
Sounds like you're one of the people who has a genuine need to link C and C++ object files together to produce a mixed C and C++ binary, which in some contexts is discouraged as bad practice. But you have a genuine purpose for it. Good job.
@@tacokoneko Well, mixing C and C++ is not that uncommon in certain industries. For instance, some safety requirements specify you to have hardware drivers in plain ANSI C and assembly if needed. Application software can be any depending on what performance goal and modularity is required.
MISRA for the win. You're doing industrial embedded ?
@@kayakMike1000 Yes, Automotive
@@leviathanx0815 nice. I am in industrial machine controllers, motor drives, PLC, so on... You think RISC-V is going to show up at the day job anytime soon?
I definitely agree. The value of a CS education is that you understand the underlying bits and pieces. If you want an applied approach to what you'll most likely be doing at your employer go to a bootcamp. Of course it's like they say: "easy come, easy go". If your skills can be learned in 3-4 months then you're easily replaceable and therefore less valuable in the marketplace. By the way I recently came across your channel and gotta say it's chock full of extremely useful information. Thank you for the high quality content!
Superb video. Superb advice. Your students are lucky to have you.
I loved my computer architecture classes. Learning how activation records are handled, for example, sharpens you.
And when I came to understand that authors of the compilers and the assemblers choose how those tools create their output, it kinda blew my mind. It also kind of forced me to experiment with writing op codes with a hex editor. Which is a bit like reading/understanding Joyce.
Interesting video. Please keep creating content like this; not many content creators can or will make this type of content. Thank you.
At the university I'm at now, I was lucky to go even further than assembly (we learned MIPS). Our course had us, concurrently with learning assembly, build a custom-made CPU. We took assembly code, converted it to its binary representation, and then passed it through our own custom-made 5-stage MIPS-based 32-bit processor. It was by far the most difficult class I've ever taken, but I learned what was truly happening at such a low level. My professor at the time was a retired engineer who had worked at Intel for 20 years making CPUs, among other companies. I am studying computer engineering at the University of Arizona, and the class was ECE369 if anyone wants to look it up.
nice !
Go Devils!
sounds hard core ! Kudos to your University . You might want to give it a shout out ?
Hello
Solving 'crack-mes' or binary-exploitation exercises in CTFs (capture-the-flag competitions) is a great way to make learning assembly fun, and as an added bonus you learn about security.
Really nice channel. I'm thankful I searched for embedded systems and found this channel.
Welcome. Glad you found it.
With assembly language, you control the implementation.
That's something that no HLL will give you. Coming from the embedded hardware side, there are times when you really do need that control. Usually it is when bringing up a machine (you might not have memory yet, and no stack either), or code that must execute with exact defined timing and access order (too fast and you violate timing, too slow and the hardware watchdog will fire), or maybe you are writing a thread switcher.
or a life or death app where you don't want the compiler messing with the code
Great video Jacob !
I think that more assembly content would be great, at least I would love to watch it.
me too
Personally, I want to learn assembly because I _really_ want to understand how software actually controls hardware. Even learning C++ and C, we've already gone so many layers of abstraction above what's happening at the hardware level that I feel like there's so much information I'm missing. I've been curious about the relationship between hardware and software for a long time, and then I took a logic design course, which gave me a taste of how it all works, which just made me want to understand more.
There is a nice book called "But How Do It Know?". In it, the author develops a "Scott" CPU step by step, from the simplest circuits at the beginning to a fully functional CPU at the end of the book, so you really understand how it works.
Answer to your question: a CPU is just a large circuit. "Code" is just a set of encoded electrical impulses. If the impulses are applied to the circuit (=code is loaded into the CPU) predetermined actions (determined by the structure of the CPU) are performed. That's all. The circuitry of a CPU is just so mindblowing, that it seems like a kind of magic. But in reality its just feeding electrical signals into a circuit, which does what it does, due to its structure.
When optimizing signal processing algorithms (audio/video/electromagnetic/...) for digital signal processors, it is quite common to write certain critical parts in assembly. Before doing that, there are usually special C functions that map directly to special asm instructions, and you have to deal with exact memory layout and memory usage (usually no dynamic allocations whatsoever)... fun times, fun times :p
The reason I learned x86, x86-64, 32-bit ARM, and 64-bit ARM assembly programming was most likely a midlife crisis. I recall looking cross-eyed at some assembly instructions in a book about the Magnavox Odyssey or some other old game console that my father picked up from the flea market, which also included a TRS-80 computer along with cassette tape storage. I didn't understand any of it. So many years later I decided to actually study this ancient art. I'm glad I did. It helped me better appreciate the old-school programmers, the modern programmers, and computer technology overall. That said, it's good to know for the sake of application debugging, and it's very therapeutic; especially if you are a control freak. :)
I can vouch for what you said. I'm an embedded engineer and knowing assembly, I can read a core dump. Over time you can even recognize what compiler generated the code, you'll recognize for-loops vs do-while loops in assembly making it easy to line up patterns in the higher level c/c++ code. And of course, you get major bragging rights when you're the person called upon to debug an issue mere mortals struggle with ;)
I've never heard of you before, but you were in my feed today only (most likely) because of my mass searching of assembly language tutorials on YT. Now subscribed.
Welcome!
Thanks for the amazing content! The quality is always top notch :)
Very well stated. You echo the same words I tried getting across to programmers that were writing in COBOL, REXX, VOICE & FORTRAN. The ones that could also find their way around ASSEMBLER always had better code; could debug faster and had faster programs regardless of the language.
Well, I am learning assembly language (NASM) for shellcode development. It is really fun to write programs in assembly language, and you also understand how code actually gets executed on the CPU and in process memory.
Those are the reasons why I started learning electronics! All of it together, combined with programming, gives you insight into what is really happening inside the computer! And that is the thing!
Completely agree. Acquiring low level awareness is without any doubt an advantage for any programmer.
Learning Assembly is mandatory. It is fundamental. You don’t need to use it every day but you must use it at least once to write some bare metal code like a simple boot loader.
It's a good video. Well structured and hits all the important points.
I enjoy assembler myself and feel it's essential for truly understanding programming.
Thanks Jacob, this is really good and an answer I was looking for
Hello, nice video.
I used to program in assembly, and also programmed in machine code. It really makes you understand the system you are using a lot more.
In my computer organization class, in the function-call section, I wrote some recursive functions in assembly. It really helped me understand proper management of the call stack.
Love your channel, so glad I came across it!
Welcome!
After I learned A86 Assembly years ago, I started to write Macros that basically did the same things higher-level languages did. But Assembly is just cool. Although there was no OS memory protection in A86 so I often wiped out DOS and had to reload it. 😀
Nice video clip, keep it up, thank you for sharing it :)
Gosh dang Computer Architecture class was a monster! Hearing you talk about Assembly almost triggers flashbacks :) in all seriousness, I'm glad my CS degree required this course.
Nice. :)
Sorry for reviving painful memories.
If you want to understand microprocessor architecture, compilers, and a lot more, learning assembly is essential (mandatory in most CS courses). Beyond assembly, you also have to go a level down and understand machine code. It was challenging, but translating your C program code to 0s and 1s manually has a pleasure of its own 😊
Thank you for the video.
Some computer systems do require direct assembly code for narrow functionality. I've spent some time studying specific assembly instructions for AVR and STM8 cores. There was no other way than assembly tricks!
In my opinion, proper C program execution should be optimization-level-independent. Writing such code requires deep knowledge from a few dozen areas of science. The compiler is just a tool.
I love this quality content!
Simply put, it's still very good for educational purposes, and when you finally switch to C you will thank god that somebody created an amazing language like this.
As a software developer, I still use asm (only for my personal projects).
Asm code is only used for critical parts, where there is room for optimization AND the cost of the rework is worth the performance boost.
For that, my asm code makes heavy use of the vector instructions (SSE, AVX, ...).
I'm learning assembly to do time-critical stuff (braking systems, linear actuators, safety measures) for industrial applications. I understood nothing, but it made me appreciate C a lot more.
I wrote quite a lot of assembly for PIC microcontrollers and the EasyPic development board (I worked for a local company that makes them). Later on I became smarter and switched to backend development (lol).
I work for a top-500 company that still tracks its inventory and pricing using an assembly program written back in 1981. One of the old bosses told me that other companies are jealous of it even today for its simple UI and reliability.
I've been interested in OS development, so I had to learn the GNU syntax (I think, I forgot all the stupid million syntaxes) for x86_64. It may seem stupid to do, because it is, but it's also really fun. It really makes you think about how a system works, how to deal with memory, the stack, argument, and many other things. I am still working in real mode because I find it simpler, but it's always fun to look at, if you don't mind spending months to years ripping your hair out trying to understand what you're doing.
Thanks for the great video content. Wondering if you might address this for 2023 and possibly which chipset / tools you would suggest to "get enough" experience to help along the embedded path. Thanks again and keep up the great content.
Did learn ASM in the early 90s. Was busy using it during the Y2K preparations on a mainframe. A lot of centralized modules like date calculations etc were done in Assembler.
With multi-core CPUs, it is hard to believe a few lines of an assembly subroutine would make that much of a difference. But it's great if you are into retro coding on old 8-bit machines.
I learned 6502/6510 assembly as a kid on the C64, then moved to 68000 on the Amiga, and later to x86 on the PC. Now I use it on microcontrollers and for some debugging. This is the best way to understand hardware, logic, addressing, banking, flags, stack, cycles, IRQs, buses... You are a complete master of the device.
Assembly Language was the first programming class I ever took. It was their introductory 'wash-out' class and introduced me to the way the computer works at its very lowest level. Two-thirds of our class had dropped out by the end of the quarter.
It was IBM 360/370 Macro Assembler but gave me the confidence to dive into 6502 Assembler on my Atari800 when Atari 8K Basic needed all the help it could get.
It was down on the chip level where all the Atari Magic was.
When I started learning Motorola 68000 Assembly Language I learned what a wonderful Real Computer this chip really was.
Recently I've started taking a look at x86_64 Assembler and was appalled at how this reminded me more of the 6502 than the 680X0 code (which had reminded me more of the mainframe Assembler I had first learned).
In the early '90s, I moved on to the original 'portable assembler', C.
In the late '90s, I became convinced that C compilers can assemble much more efficiently than I ever could.
I may look into ARM AL now that Apple has thankfully bailed on Intel's backward-looking technology, but I've found a lot of folks discouraging anyone from looking into ARM AL the same way they discouraged us from looking into PowerPC AL. "You really don't want to go so low-level on a RISC machine."
Assembly on Raspberry Pi! Worth the effort. 6502 Atari Rulez!
BALR 14,15
.....
B 15
Look familiar? I laugh when folks complain about modern assembly. Try programming without a stack! Yes, the IBM 360/370/XA architectures didn't have a stack, just 16 general-purpose registers. Somehow we managed just fine in a 32K memory partition.
This is a good video!! I've programmed assembler at various times since the 1980's. Learning assembly allows you to understand what's going on behind the scenes.
Honestly, I found taking COBOL in college to be most horrible.
I had a job writing COBOL. Even more horrible 😀 I had to keep explaining nested If's to the other programmers and one finally got it.
If you find COBOL horrible, try PL/1... yes, it CAN get worse. Brainfuck or obfuscated code is also pretty bad, but for other reasons.
Taught myself assembly so I can reverse malware. Still only 4 months in and it's so interesting!!
Where did you learn it from? I know very little of it
I learned a bit of assembly for a school final project. I did a maze, and of course I had to write thousands of pixels manually... XOR MY_BRAIN, MY_BRAIN
Wish I could use it for malware
You also need experience in reverse engineering, debugging, and how malware works; learning assembly is not enough by itself. You must study how the (I'm guessing x86) architecture works, how (again guessing) Windows subsystems work, etc. You should study very simple viruses first, and debug/reverse simple programs, before going for the malware. It's a lot of work and time, but if you go for it, it's doable. Besides, having good knowledge of the architecture, Windows, debugging, etc. is far more important than just reversing a malware sample.
That's so funny! I wrote a keylogger in assembler in 1990 when I got into college and they had this weird thing called "Novell fileserver" it asked for a username and password. And I realised that teachers had higher privileges (more drives) than us students. So I wanted to know where those "drives" came from. So I wrote a TSR keylogger. Then I realised it was a bit shite, because the key log file would be on those computers and you would need to go along each computer you had used before with a floppy to copy that file. And I also had to install it manually in each system.
So I actually built a worm/virus that carried the TSR keylogger and wrote the key log file onto the J: drive, which was a community disk to share stuff between teachers/students. And I would put a version of that com file there too, hoping that people would start it and it would infect netx.com, which I found was always started before the login. And after the login was done I would clear the TSR to avoid random crashes (terminate-stay-resident was finicky).
And soon I had the supervisor password, which in the 4 years had not been changed once! Those were different times back then.
@@ImTouchkv2 Learning assembly is simple: you can look up the x64 instruction set, or ARM if you are on a Pi.
Then find out how to start a file (.data/.text boilerplate) and how to assemble it and pass it to gcc on Linux.
And voila...
To make stuff that is more interesting quickly, I would suggest getting DOSBox and an old version of Turbo Assembler, as you can do easy VGA programming in DOS, unlike on Linux. And manipulating memory *aka pixels* is a nice visual way into the core of assembler -- moving memory around.
This is what I recently did (after 30 years) instagram.com/p/CC-8YaIgj2l/?
You can find information on VGA mode 13h with even some assembler examples.
Assembly language is my FAVORITE. I need to re-learn it for the new architectures; I learned 8086 assembler.
self-taught here. i really like assembly and low-level stuff, been messing with x86 and disassemblers for years now, reverse engineering games as a hobby. it's just endlessly intriguing to me. i wonder what sorts of jobs demand this kind of skillset.
It sounds cool! Can you reverse engineer games with that? Did you learn x86?
@@diogofarias1822 yes to both! it's quite relaxing actually, to sit down with a game and explore its code and document how it works :)
Me too, self-taught, but working on PIC micro devices from Microchip, and I have a channel for anyone wanting to work hands-on with hardware using MPASM assembly. My channel:
www.youtube.com/@PICMICRO
LOL, I remember back in the mid-nineties I studied assembly, and the tutor who helped me asked why I bothered to learn it when I could do everything easily with Visual Basic, C and C++.
Back then I wanted to do some embedded programming.
I studied electrical engineering 12 years ago. I've completely forgotten computer architecture and assembly language.
Now that I'm a developer, I am relearning assembly and architecture.
Have you ever tackled Knuth's The Art of Computer Programming? If so, what sections would you recommend students to study thoroughly?
Just wait until all volumes have been published before starting.
I wonder if he'll drop MIX by then.
Does that have assembly?
The only assembly I learned was a RISC architecture for a microcontroller. It wasn't that hard. We wrote programs in both C and assembly for that processor, often on the same project. It was actually easier to handle interrupt service in assembly than in C. All you had to do in assembly was put the branch instruction to the service routine at the interrupt address. With C you had to go through a multi-step process to write the branch instruction in binary to the interrupt address. Then you had the issue of bloat the compiler added to the interrupt routine if you wrote it in C. Most of that didn't matter for the main program body, but it was a problem for extremely fast routines like interrupt service.
Turn off optimization! Good call!
I once had to debug a program, and I knew it had to output some data over the IEEE card we had and I was experiencing timing issues in that piece of hardware.
And I couldn’t find an OUT. So I started a debug and then I realized it used self modifying code to cleverly inc the value on an address that had opcode $ED (in AX, DX) to $EF (out DX, AX)
What the driver was, was a state machine that basically switched between a read and a write loop, and only self-modifying code changed that IN and OUT instruction depending on whether it was an even or odd count in the infinite loop.
And again, it was my years as a kid hacking 6502 on the C64 that taught me about self modifying code. Because in college we weren’t taught that.
Now when I read that disassembled driver code (and I understood the clever trickery) I realized that the problem was not the driver. It turned out to be a faulty terminator.
C64 had 6510
@@richardlighthouse5328 Yes but the assembler was 6502. The only difference was that the 6510 could bank switch but it was 6502 assembler.
Most important reason: otherwise you can't reverse engineer binaries. Yeah, IDA Pro and Ghidra come with a "decompiler" deriving compatible C code from the disassembly, which speeds things up a lot, but still, it helps understanding the disassembled code. And be it just for the purpose of patching the binary.
I've never used Assembly on PC/Mac, but I use Assembly exclusively when I'm programming microcontrollers. I know that you can use C or other higher-level languages on these chips, but I've never found enough reason to use them, and I wrote a lot of complicated things for a few microcontrollers. The size of the program, and in many cases how many clock cycles you spend, is very often critical, and nothing can beat hand-written assembly code at this task.
Would love a video series on Assembly!
Any preference which one? x86_64?
@@JacobSorber yes! I believe that works with Mac
@@JacobSorber Please RISC-V.
More Assembly please!
If function pointers confuse you, a little assembly language will clear that right up. It's as simple as "call ebx"
I have seen C compilers produce some really interesting things. The strangest thing I remember seeing is calling a function pointer implemented with a "ret" instruction.
That's pretty common. At least for tail calls or switch statements. On 6502 a PHA;PHA;RTS sequence is smaller than (but a little slower than) storing to adjacent Zero Page locations and then an indirect jump. On RISC-V the "ret" instruction is actually an alias for "jalr r0,0(r1)" i.e. do a subroutine call to the address in r1 but don't bother to save the return address (because r0 is hard wired to 0, so storing a result there does nothing).
I'm using assembly when I have to design control, observer or filter algorithms in mcus/dsps. It gets the work done, and usually it's faster and more deterministic than some pre made libraries, or even bare-bone C functions. That being said. I've never created a full asm project, only the critical functions.
Hi Jacob, thanks for this quick video. i wanted to ask you what book would you suggest to learn assembly as beginner ?
i have used assembly to debug issues which are reproduced only when the compiler optimizations are enabled. it was easy to find how the code behaves during runtime in gdb disassembly.
In my case I am currently designing and building a discrete component processor and creating my own AL instruction set so it is a from the ground up creation and AL/Binary is the only option and learning a lot from the project.
Actually, you can write real assembly code in a C compiler: all you have to do is put it in an asm{} block, and everything inside the brackets will be compiled as assembly. I did some testing with OllyDbg, and it's a one-to-one translation. Also, if you really want to learn assembly, you should use OllyDbg; that thing shows you how the registers and RAM are being filled.
The most useful application of assembly language these days is in the cybersecurity industry, especially in malware analysis. It's really important to analyze malware in order to understand how far a piece of malware can go.
I've heard that if you've got a use for 6502 or 68000 Assembly, it's worth learning it. Anything higher or more recent, and it's best to just leave it to a compiler. I'd be interested in doing it some day, but it's very daunting compared to any other language I've studied.
That said, I like your argument a lot better.
I loved assembly language in uni, it was fun! Why would anyone complain about it!?
I wrote quicksort in MIPS64 assembly for one class.
I’ve also since written PIC24 microcontroller assembly and what I really liked about that is knowing exactly how long it will take, every cycle (on the microcontroller I used) took exactly 12.5 nanoseconds and the exact number of cycles were documented for each instruction. This meant you can determine exactly how long a given piece of code will run.
I love it too and have planned for 5 PIC devices from Microchip used in tutorials with Assembly and their MPASM assembler for hardware control. My channel for anyone interested in hardware control.
www.youtube.com/@PICMICRO
For embedded designers like myself, who design around high-speed, high-MIPS applications, assembly is a must, and that depends on the application and the tasks that the design performs. Certain tasks cannot be performed in C or Java because they cannot run at the same speed as assembly. I used it in many of my projects.
Good for learning what assembly the compilers produce.
Wow, I'm not alone in this universe. Learning assembly is worth the time :)
I don't know if there is interest or not, but this video is really good! Thank you!
Thanks.
Taking computer arch this semester my head hurts!
The problem in education is that the value of knowing something is often conflated with the value of teaching something. Institutions have a penchant for turning highly useful but non-essential skills into mandatory prerequisites, making them function more like barriers to entry than anything else.
very true :/
If you happen to be idle enough, have plenty of leisure time and a strong interest in learning machine code (opcodes), I suggest you pick a simple midrange microcontroller such as the Arduino's ATmega328P, download its IDE, read the documentation (part datasheet, instruction set, compiler manuals), learn the basics of digital electronics, pick a development board (or build it yourself) and go ahead.
There may be joy in it, but not without many tears.
'Thank GOD', and thank you very much for your support and time 😉 I am interested in learning assembly and C for the PIC processors 🤔 I would like 👍🏼 to see more examples 👍🏿
"PIC Microcontroller and Embeded Systems" by Muhammad Ali Mazidi et. al. is an inexpensive book on amazon and will teach you assembly and C for the PIC. They have a series that cover PIC, AVR, ST, TI, and ARM microcontrollers. I am working through the AVR one and highly recommend it. Next will be the PIC book.
@@stevengrosse2166 'Thank You', thank you very much ☺️
@@stevengrosse2166 I just realized 🙄 I can use the tutorials that comes with the unit 🤔
Inquire98 There are also lots of tutorials here on YouTube. Check out Microchip's channel; they've got some good starters. If you want to go in deeper, check out the book. I've found most tutorials only take you so far or just get you going. I've found books are still the best overall sources of in-depth info, with some exceptions perhaps. What PIC are you using? One of the Curiosity boards with a built-in debugger?
@@stevengrosse2166 I want to develop autonomous vehicles using various PIC microcontrollers 🤔
Whenever I work with Microchip's PICs I have to program in Assembly. The memory size is very small and there are times where I have to make it run as fast as possible. My favorite was creating a signal generator using lookup tables and every pass through the loop took the exact same number of clock cycles.
@Mdmchannel Welcome to the club. (Smile) It's a royal pain in the rear, but when I have no other options, I have to deal with it. It's like Taxes and Death.
Why? I use their C compiler for PICs, and it's pretty damn good. However, I did do a couple of PIC routines in assembly, a bootloader and EEPROM writer, then linked in with the C object files.
Mark West C works very well when you have the larger PICs to work with, but on the smaller PICs the memory is very limited and the PIC can't hold the compiled code. The plus is I have total control over how many cycles a routine takes and that's very important when speed is critical.
AT&T vs Intel?
Definitely Intel! It just makes more sense, especially when coming from other ASM syntaxes where the destination comes before the source (ARM, RISC-V, ...). It also resembles C-style assignment, where you assign the thing on the left from the thing on the right.
Although I prefer Intel too, in the end it doesn't really matter. The differences are so insignificant, that every developer should get used to the syntax (eg required by coding guidelines) within less than 1 afternoon. More a thing of a habit.
prefer amd assembly
AT&T is ugly. Damn
I don't think he actually asked that question. It was just an example of large differences.
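For anyone weighing the two, here are the same three x86 instructions in each syntax (NASM-style Intel on top, GAS-style AT&T below), purely as an illustration:

```asm
; Intel syntax: destination first, no register sigils
mov eax, 5            ; eax = 5
add eax, ebx          ; eax += ebx
mov eax, [rbx+8]      ; load 32 bits from memory at rbx+8

# AT&T syntax: source first, % on registers, $ on immediates
movl $5, %eax         # eax = 5
addl %ebx, %eax       # eax += ebx
movl 8(%rbx), %eax    # load 32 bits from memory at rbx+8
```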
Cool video, and yet I think it misses one great application of assembly language: profiling. I've only seen two profilers out there so far, VTune on Windows (which I have never yet managed to download, by the way) and perf on Linux. Both let you see how much of your program's runtime each function took, and then you can inspect the actual disassembly of a selected function to see how much time each of its instructions took.
The punchline here is that profiling is the only way to truly* measure the performance of your program. And if you program in C or C++ then it may be an obvious choice for you.
* profiler is sometimes not really helpful without a microbenchmark
Ahh Seka Assembler on my AMIGA 500 .. those were the days 😏
Need to start learning Assembly.
I laugh at the fool who says there is no need for assembly, OS developers and VM developers know the story.
Lol how many os and vm developers do you know ?
@@typedef_ how many poorly paid os and vm developers do you know?
@@theterribleanimator1793 I don't know any developers. Are you implying that os developers are well paid ?
@@typedef_ only the sane ones usually.
@@YxG6kTfZ3HhcAQP people still use cobol?
I'm a speed demon, addicted to it. Haven't done it in a while, but I've used ASM from time to time to make things a bit faster.
I can tell you from experience: I took a "break" from my annoying Python project to learn assembly for a few weeks. I never complained about my project again.
What was your project
I just started learning assembly in a class today... also C in another class, but the teacher started from zero with basic programming language concepts thrown in... but yeah, about 8 weeks into uni and just now I start learning how to code... kind of unsatisfying, but next year we'll get bombarded with programming-related classes, so I'm not complaining... assembly seems more interesting than I previously thought, but it's also kind of hard to grasp compared to C or C++... well, either way, algebra is still harder
I write assembly as part of adding support for new single-board computers and chips in the Linux kernel.
Fun exercise I once did: write a virtual machine in...java :) Absolute minimum, just to understand the principles.
It had 2 registers A and B, a program pointer (PC) and a stack pointer (SP), but no heap. Handful of commands: loadA, storA,...add, sub, call, pushSP, popSP, pushPC, popPC, ret. I added a printf pseudo-command - a cheat, so I could easily output values.
This is simpler to build than you think. And it allows you to understand the operation of a CPU better.
Much more difficult: make a parser for a mini C language that allows function calls, adding and subtracting, assignment and printing. Nothing more.
The parser needs to translate all C stuff into ASM stuff. Biggest challenge here was to implement function calls. This means putting all arguments on the stack, including the return address, jumping to the function, do stuff with the arguments and finally reset the stack pointer (SP) and jump back to the calling point.
What I learned was that translating C to ASM is essentially stack maintenance.
My resulting "program" was an array where each element was a tuple of a command id and an argument (I think I stuck with one argument), thus mimicking how a program is stored in memory.
After a lot of headaches, it all worked but...for some reason my virtual machine was really slow...
If you can build all of the above, next level would be expanding the parser to parse more complex arithmetic expressions, like a = 1*-2/(3-4)+5. This involves operator precedence and is quite tricky to implement. Next, allow expressions to contain function results, as in a = 3 +f(4). Once you have that, you'll see the pattern of recursion and the rest is easy.
What the author here is describing is "inline assembly", where you use compiler extensions to put assembly code directly into your C code. Very powerful.
If I had the time, I would dive in. The mind is capable of amazing things with time and patience, and an acquired taste and appreciation for intense engineering. If I did it though, I might never come back! :) I have so many other interests, I won't do it, until I live forever maybe :) So - balance, that's what I choose. I want to continue to taste and experience art, writing, music, and other things in life, along with programming (I do C, C++, and FreeBasic). I believe that an array of diverse interests transfer to each other, relate to each other, strengthen each other, and give you a unique perspective - how awesome is this universe in which we live, and how awesome is the human mind to be capable of seemingly infinite, passionate learning.
I program embedded systems. To me, a large processor has >4kB of RAM; I had one project where I used a whopping 60 bytes of RAM. I don't generally write assembly, but I dissect generated assembly frequently. I only write assembly when doing very low level things like implementing CAS.
Thanks. Sounds a lot like my work. People always give me an odd look when I refer to 2kB as a LOT of memory.
As someone who's self-taught, this video helped me understand people's mindsets when they ONLY know high level languages. I find they're entitled, and have seemingly no understanding of how even programming works, meanwhile they identify as software engineers. Educational, thank you for making content like this. It's comfortable.
If you just want to be a programmer, writing what people ask you to, do whatever you want. You want to be deep into comp sci, or software engineering, you need to understand assembly!! You don't need to immediately understand what an assembly file says, but you need to have *some* ability to interpret it, and solve problems using that information sometimes.
It's one thing to be willing to dabble in low level like this, it's something else entirely to consider those who don't unworthy of the title. Do you think CERN physicists don't deserve the credit just because they don't do all of their calculations on paper like Newton used to?
@@yarpen26 that’s not even close to what I said? The equivalent would be if CERN physicists didn’t understand the underlying physics at all.
I think the point is that with assembly you learn how the computer works. That's the main benefit and what you should focus on, instead of inessential stuff, like differences between Intel vs AT&T syntaxes etc.
A recommendation on assembly books: Programming from the Ground Up by Jonathan Bartlett, savannah.nongnu.org/projects/pgubook
I don't program in assembly but I sometimes have to debug in assembly. So there you go. You need to know assembly even in 2021 if you're gonna work with embedded software.
Thank you for this informative programming video.
Suggestion: video on how to debug your code remotely using syslog server 😁
Cheers