Love that you just kept on going into how the human anatomy processes the info as well. :)
Me too, I audibly laughed at that part!
Very thorough... except it totally missed the many steps involved in turning the 0s and 1s of the source code into the 0s and 1s of an executable! Actually... there's a lot to be covered even earlier, in how the brain controls fingers to hit keys, which are interpreted by the computer and turned into that source code to begin with. Probably should contain some background info on the mating habits of your parents to get you in front of the keyboard in the first place.
Very much appreciated, though there might have been a bit of omission. Merely seeing the words "Hello World" isn't what triggers the dopamine rush.
There has to be some context processing going on somewhere to combine the interpreted visual signal with memory of what you are currently doing and are hoping to achieve… that probably doesn't belong here.
Yes, I also had to laugh 😅 ... that is one totally complete specification from input to output to input 😆
I kept listening until eventually I went "hold on, isn't that part of the brain?" and then realised we aren't talking about the computer anymore.
We should all be grateful for the programmers that came before us, and built all the lower level code that can easily be taken for granted! This video serves to remind us of the complex processes that run automatically on our behalf.
Grateful until it's the source of a massive security issue that is. :)
And grateful for the inconsistent, buggy user interface that keeps changing so that those of us who grew up on Windows 95 to Windows 7 can unregrettably defenestrate the CPU. The cycle is complete when the CPU is properly grounded, preferably at ɡ0 from four stories.
@gackmcshite4724 Inconsistent GUI? Talk about KDE, GNOME, Tcl, you name it... And the consistent chase to mimic the Windows UI, yet failing miserably again and again. Yep, that's GNU/Linux for ya.
Teach them, Dave.
Absolutely. It's thanks to them I never have to write things in assembly.
With that said, the more we stand on their shoulders, do you reckon things are getting more bloated? Aspects of "I don't know what that file does, but if I take it out things stop working so... better leave it in." Rinse and repeat that over the years and software gets fatter? Happy to be corrected.
There's still merit in understanding how the old stuff works and what it does, so you know when some code can be dropped or refactored; you can maintain your libraries, patch them as new vulnerabilities emerge and security shifts, and remove old bloat that's no longer required.
Personally I work with industry software (Sage SalesLogix - Infor CRM). Sage sold this software to Infor back in around 2010, but if you check the program files, there are still references to "sage" everywhere years later. Is that a time-saving thing, or (more likely) do the new devs not actually know what some of these files, binaries and libraries are really doing, but if they remove them the software stops working - so they remain and persist in the background of the software for each new version?
Microsoft left Dave's Task Manager in the software for years mostly untouched (although it's replaced now I think - dunno, I run Debian daily). I daresay there's still some of Dave's code present in Windows 11 - sitting there in the background untouched like it has been for years. Is that code still there because it works well and does its job (likely - it's DP), or is it that Microsoft don't want to remove it because things stop working if they do? To take it out, a programmer has to sit there, go through how it functions and rebuild/refactor it, which involves them understanding it. I'm sure Dave and Microsoft religiously comment code so it's actually easy to understand - but this is 100% not the case everywhere. Poor documentation and code practices are all over the place, and there are tight deadlines on many software releases. Is a company going to pay a programmer to sit there and go through old, badly commented and documented code to update it, patch it and improve it, or are they going to take the "if it ain't broke, don't fix it" attitude? Ad infinitum over years of software updates and devs coming and going... you end up in a situation where the software is bloating because no one really understands how these old functions work...
Often, instead of fixing core problems, a developer may just patch over the top of an issue. Let's say some old code is calculating a date incorrectly, but the developer isn't sure where it's coming from or why the calculation is wrong - but that developer knows he can hook into that date field later on in their stack and in their code and patch over/correct the date so it calculates fine. Now the software is calculating a date incorrectly initially, thanks to some old, non-working code, which is then patched/overwritten later by the developer's new correction (a rough sketch of this pattern follows below). Speaking somewhat from experience, if I've got a deadline to hit - is it going to be quicker for me to hunt down the old code and correct it, or just patch my own function that does it correctly over the top? Again ad infinitum. The software bloats out more and consumes more resources unnecessarily because it's calculating the date twice now.
(In reality, I'm a nerd - so I will actually burn the midnight oil trying to work out why the old code isn't working right and patch that. I also have access to good fellow nerds with far brighter minds than me who are in it for the same reason I am. We'll find the problem given enough coffee. Guarantee not everyone is like this though.)
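To make the "patch over the top" pattern above concrete, here's a purely hypothetical C sketch (invented names and a made-up bug, just to illustrate the shape of it): the legacy routine still runs, a newer layer silently recomputes the value, and the work is done twice forever.

```c
#include <time.h>

/* Hypothetical illustration of "patching over the top" of a legacy bug. */
time_t legacy_next_billing_date(time_t start)
{
    return start + 30 * 24 * 60 * 60;          /* old bug: assumes every month has 30 days */
}

time_t corrected_next_billing_date(time_t start)
{
    struct tm t = *localtime(&start);
    t.tm_mon += 1;                              /* roll forward one real calendar month */
    return mktime(&t);                          /* mktime normalizes the overflowed month */
}

time_t next_billing_date(time_t start)
{
    (void)legacy_next_billing_date(start);      /* legacy code still runs, result discarded */
    return corrected_next_billing_date(start);  /* the later "patch" wins */
}
```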
Dave - please, *PLEASE* do more videos like this one. It helps so much for those of us who only really started with higher-level languages and never "really" learned how the low-level/byte-code side of things actually worked. The magic between the operator and the voltages jumping across pins is still astounding to me.
If you want to learn more about how computers work at the low level, I highly recommend Ben Eater's channel. He has a series where he builds a very basic 8-bit CPU on a breadboard, and another where he builds a breadboard system around a 6502 CPU.
@maximum988 He also sells both projects as kits where you can follow along and build them yourself.
It is we who wrote assembly code on the Commodore 64 who gained an understanding of how a computer does the magic of bringing something on-screen. At that time, computers were so much simpler that it was humanly possible to learn these things by studying the main board schematic diagrams, OS disassembly, etc.
I feel sorry for today's young people, who have missed that natural opportunity. On modern, hugely more complex computers and OS's, that level of learning is no longer possible...
This video by Dave is a phenomenal piece of work in that direction!
Yes please
I don't think there's anything lower than machine code that we can possibly write on our end to tell the CPU to talk to the other devices connected to the motherboard. In time we will learn, if we keep looking. I've learnt so much from Dave in the last 2 days.
With 50 years of C programming behind me, all I can say, Dave, is that was absolutely delightful. I especially loved how seamlessly you had the information jump the "air gap" via photonic coupling. Well done, my friend.
As an old 8 bit single stepper 8085 developer, your level of detail speaks to me. Oh how things were simpler before memory management and virtual machines. I could feel my neurons tingling as I watched this amazing video.
8085? Was that number associated with Wang systems as a server?
When you ask Dave how something works at the lowest level, he literally describes it down to how you process photons. Good work Dave!
It's not a good idea to ignore that part of the system: that's where 99% of all errors occur.
Except your brain does not process photons. 😂
Yeah got down into that level when writing a short paper on players responding to visual events in Minecraft - how the bloody creeper got me - their excellent camouflage with grass and sky !
@LTVoyager The retina does, and that's sometimes considered part of the brain. (It's highly debatable in some cases where the boundaries between "brain" and "things connected to the brain" actually are. Where you draw the line can depend on what particular process you're trying to analyse.)
These Windows videos of yours are by far my favorites! Really enjoyed the humor in this one as well!
Dude so much extra credit for the neurology lesson. This was awesome. Five stars.
I remember getting a job doing assembly programming after using just Basic and Fortran in school. I had fascinating discussions with a system engineer who enlightened me on what actually took place in the processor and OS. This was 1970. I've loved programming ever since and still do it. Love your content, Dave!
Upon graduating with a Computer Engineering degree, I was happy to know all levels of what happened between the event handler of a mouse click down to the p-n junction, along with its related chemistry and physics. Watching today's video made me realize I didn't consider the entire stack as the human part of it was missing. Always nice to have my horizons expanded.
This is the best analysis and explanation of 'hello world' I've ever heard. I think my rods and cones just ... never mind!
This cracked me up! Thank you for the great belly laugh! 😂 And thank you for the great guide through the details. It feels like it's becoming more and more difficult to find folks that relish getting into the details. 😊
I have almost no programming experience, but Dave’s videos are always so well explained that I still always learn something.
As a Linux guy, I found this positively fascinating. Thanks Dave!
I remember there was a video on YouTube that explained how to write a Linux kernel module in C...
one that writes Hello World (with kernel-level privileges, of course!) xD (see the minimal sketch after this thread)
@@igorthelight Making Simple Linux Kernel Module in C - Nir Lichtman
@@ryonagana Looks like it! Thanks!
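For reference, a minimal "Hello World" kernel module along the lines of the video mentioned above looks roughly like this (standard Linux module boilerplate; built against the kernel headers with a small Makefile and loaded with insmod):

```c
#include <linux/init.h>
#include <linux/module.h>
#include <linux/kernel.h>

/* Prints from kernel mode; the message lands in the kernel log (dmesg). */
static int __init hello_init(void)
{
    printk(KERN_INFO "Hello, World! (from kernel mode)\n");
    return 0;
}

static void __exit hello_exit(void)
{
    printk(KERN_INFO "Goodbye, World!\n");
}

module_init(hello_init);
module_exit(hello_exit);
MODULE_LICENSE("GPL");
```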
Love that you covered how "Hello World!" is presented on not one but two complex systems. Love the sense of humour of a true nerd. Thanks for reminding me that we aren't odd, it's the rest of the world that is odd, we just get to be ourselves.
"Informative"? Absolutely!
"Entertaining"? Very much so!
The fact that you not only described in deep detail how 'printf("Hello, World\n");' sends a cascade of actions down the chain of command of the Windows OS, from the user application level to the hardware (the user-mode end of that hand-off is sketched just after this comment) - but also covered how an eye turns photons hitting the rods and cones into nerve signals that excite certain areas in the brain? - *THAT* is why I (a long time ago) clicked the Notification Bell *AND* clicked the "Like" button!
I only wish there were more "Notification" and "Like" equivalent buttons to hit to let you know that I DEEPLY appreciate your willingness to share all the information the way you do.
Thank you, Dave.
[Oh, by the way: what does it say about a nerd that, when his LED lights illuminating a window behind him flicker during a shoot, he adds in post a text panel for a few frames to tell us other nerds what happened? Priceless!! 👌]
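As a simplified sketch of roughly where that cascade starts in user mode, before the call crosses into the kernel: the C runtime ultimately hands the bytes to the OS on the standard output handle, something like the following (ignoring CRT buffering and encoding; for an interactive console the real path may use WriteConsole instead).

```c
#include <windows.h>
#include <string.h>

int main(void)
{
    /* Roughly what printf("Hello, World\n") boils down to in user mode:
     * the C runtime passes the bytes to the OS on the stdout handle.
     * (Simplified: the real CRT buffers output and converts \n to \r\n.) */
    const char *msg = "Hello, World\r\n";
    HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD written = 0;
    WriteFile(out, msg, (DWORD)strlen(msg), &written, NULL);
    return 0;
}
```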
That reminded me of a lecture back in college where the teacher mapped out all the steps an 8088 processor takes to add two numbers and store the result in a new memory location. It took almost 3 blackboards to show all the steps: reading the instructions into the instruction register, decoding them, doing the fetch from memory and copying the values into CPU registers, then adding the two values in the CPU registers, and finally copying the result back out to a memory location.
I don't recall exactly how many lines of assembly code it was, but it wasn't much, maybe 5-7 lines.
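For flavour, the whole exercise is one line of C; the rough instruction-level sequence the lecture expanded across those blackboards is sketched in the comments (hypothetical 8086-style mnemonics, not the original listing, and ignoring the micro-steps inside each fetch/decode cycle):

```c
/* c = a + b at the source level. Roughly, at the instruction level:
 *     MOV AX, [a]   ; fetch a from memory into a register
 *     ADD AX, [b]   ; fetch b and add it in the ALU
 *     MOV [c], AX   ; write the result back out to memory
 */
int a = 2, b = 3, c;

int main(void)
{
    c = a + b;
    return 0;
}
```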
Loved the video! It literally brings back memories. I'm an Electrical Engineer by degree but have been a programmer for over 30 years, and an embedded programmer for 18+ years. "Back in the day" I would have to give a similar explanation to the powers that be, so that they could understand that "Hello World" was half the battle for a bare-metal embedded controller app.
Brilliant, professional-level explanation. From now on, I will play this video to my students after they have programmed and tested the Hello World program.
Brother that was epic. Amazing that ALL of that happens in the tiniest fraction of a second, right down to the brain processing the optical signals.
I loved the description of the wetware involved in this complex process. Nice!
I was already excited for this video because I really enjoy how you present information, and then we got to around where I thought the end would be and you just kept going! Even better than my already high expectations. Thanks so much for making this!
Pure good ol' nerdom, I sincerely love it. It just put a big smile on my face at the end, thank you. :)
I know people rag on the NT kernel architecture (and the Win32 API) but I'll always love it
The description of how the program works on the brain's "hardware"!! The journey is great; the whole journey. Thank you!
Thanks for that video. Would have taken weeks if not months to study all this with different user mode and kernel mode debuggers. Now I got this in less than 14 mins.
I watched this at 4am, zoned out for what seemed like 10 seconds, and woke up hearing Dave talking neurochemistry. Felt like a fever dream 😂
We are grateful for you to share your knowledge and experiences in a very friendly way. Thanks Mr. Dave! ❤
To get a degree in computer engineering I had to write a basic Linux kernel from scratch, and to do that I had to write the code that did all this stuff - well, the non-biological parts. That same degree program also required me to learn how the CPU works, from the physics to the devices to the circuits and so on. Looking forward to part 2 where you explain that whole other half of things lol. Really cool to learn some more about how this process works in Windows as compared to Linux, though.
Love this. Superb!
Yes, yes, yes!
I love this.
More of these super in depth explanations.
I want to know every single step.
I have a feeling I could have great conversations with you Dave.
This is exactly how I like to learn and I know very few people interested in such granular information.
Hey Dave, big fan of your work! I really admire how you bridge the gap between technology and the autistic community. I’ve been on this tech journey since my Apple //e running Applesoft, jumping to IBM PC/PS2, and building my own rigs along the way. I’ve been a Microsoft user since GWBASIC and MS-DOS 1.0, and also worked with IBM/DR/4DOS. Recently, I’ve been exploring Hackintosh, Windows 10, and Linux, and it’s been an exciting experience. It’s incredible how technology always keeps us connected, especially for those of us on the spectrum. Thanks for all the insights you share and for making me feel even more connected to tech. Greetings from Mexico! 🌍💻
My first computer was an Apple //e as well. Loading the OS, then turning on the computer to get a >. It was interesting stuff! But my line of work took me away from computers and put me in the landscaping trades. I did program MS Access to design a program to run the business - was more of a power user than a coder. I still have this slight desire to learn programming but never broke through. Maybe one of these days. Greetings from San Jose, California!
Nice! Love to see how Windows works under the hood!
You crack me up, Dave. Keep it up!
11:28 Your continuation past this point was so funny! 🙂
Thank you for this, genuinely!
I’m building my own OS and your perspective and experience has been invaluable in my efforts. Keep sharing your light with the world, it is very appreciated! ❤
I'm reinventing the wheel. Any tips ?
@ yes - try enjoying what you do 😉
@@bobert286 I tried a square , but Im thinking round is better
@@loupasternak Make it roll better!
Dave, would love another video like this covering the Linux kernel 😮
I love how you explain things. Thank you.
Amazing , the human interface and internal human process included , great video dave
Magnificent
Miles and miles ahead of the simple processing and Direct Memory Access used in the earliest microcomputers.
Great video.
When I learn about things like this, it makes me respect the inventors and programmers of the first computers even more than before. We have so many wonderful layers of abstraction that let us focus on the problem at hand. They had to coax unreliable vacuum tubes into producing the right electrical signal at just the right time, and were able to do some extraordinary things.
Great video! I would like to see more topics like that one.
I've always marveled, contemplating the massive amount of work all of these systems do just to manage a mouse movement on screen without destroying the presentation, always knowing the context of, and restoring, the screen at any specific location.
The mouse cursor is a special case. This could be done by double-buffering but actually there is often hardware support on the graphics card for this, similar to how sprites worked on old 8-bit computers such as the C64: special logic on the graphics card overlays the mouse cursor image onto the framebuffer image without modifying the framebuffer data.
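For anyone curious what the non-hardware fallback looks like, here is a simplified "save-under" sketch of what a driver has to do when there is no hardware cursor overlay (all names and the 32-bit framebuffer layout are assumptions for illustration; edge clipping is omitted):

```c
#include <stdint.h>

#define CUR_W 16
#define CUR_H 16

static uint32_t saved[CUR_W * CUR_H];   /* pixels currently hidden beneath the cursor */

/* Software cursor: restore what the cursor covered, save the new spot, draw on top.
 * 'pitch' is the framebuffer width in pixels. */
void move_cursor(uint32_t *fb, int pitch, int old_x, int old_y,
                 int new_x, int new_y, const uint32_t *cursor_image)
{
    for (int y = 0; y < CUR_H; y++)          /* 1. restore the old position */
        for (int x = 0; x < CUR_W; x++)
            fb[(old_y + y) * pitch + (old_x + x)] = saved[y * CUR_W + x];

    for (int y = 0; y < CUR_H; y++)          /* 2. save what the cursor is about to cover */
        for (int x = 0; x < CUR_W; x++)
            saved[y * CUR_W + x] = fb[(new_y + y) * pitch + (new_x + x)];

    for (int y = 0; y < CUR_H; y++)          /* 3. draw the cursor image on top */
        for (int x = 0; x < CUR_W; x++)
            fb[(new_y + y) * pitch + (new_x + x)] = cursor_image[y * CUR_W + x];
}
```

A hardware cursor, as described above, skips all of this: the overlay is composited on scan-out and the framebuffer data is never touched.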
Thank you for your content. I enjoy how you explain how Windows works in a manner that we can understand.
Cool! Thanks, these subjects are the reason I started in computing in the first place.
How absolutely 'simple' all those computing layers are in relation to our own eyes and brain :-):-) Great post!! Every programmer who has ever done a 'Hello world' in any combination of languages should be made aware!
So much energy is converted layer by layer - thank you for the detailed explanation 🙏
In my early days as a programmer, I did display, scanner, fax compressor card drivers for the company I worked for in MS-DOS. The days of TSRs.
Thank YOU Dave... You make it all so simple with concise, straight talk (about the complexity of the world), with an actual understanding of what is going on!
Hahaha I love how much detail you put into explaining the part of Hello World that takes place in the retina, optic nerve and synapses, all the way to the resulting dopamine release. I enjoy your style of humor 😂
When I was first trying to learn programming, I had Visual C++ 6 and a Windows API reference book from my vocational school. I wasn't actually enrolled in any programming courses; I just asked the IT teacher if I could borrow them. "Sure! We don't actually use them for anything here!"
I had a bit of trouble figuring out how to make actual Windows GUI applications, because from the reference manual I had already decided that the GDI graphics API was obviously the thing to use for drawing stuff on the screen (a tiny sketch of that approach is below)!
At that time, I only had sporadic access to the internet and there wasn't really anyone around me to show me how to actually do stuff :) Oh well, I turned out OK!
/me Goes back to programming my Haskell based Web UI framework...
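For anyone who never met GDI, the "drawing stuff on the screen" referred to above boils down to getting a device context and drawing into it. A minimal quick-and-dirty taste (a real GUI app would create a window and paint inside WM_PAINT with BeginPaint/EndPaint instead of scribbling on the screen DC):

```c
#include <windows.h>

int main(void)
{
    HDC dc = GetDC(NULL);                         /* device context for the whole screen */
    TextOutA(dc, 100, 100, "Hello, World!", 13);  /* draw 13 characters at (100, 100) */
    ReleaseDC(NULL, dc);
    return 0;
}
```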
Appreciate that your explanation of the human anatomy highlights the similarities in processing information, like that we too have separate specialized parts of our body that focus on just one task, but communicate with each other.
Always fun listening to Dave.
I could listen to you describe how processes work all day. Wow! I learned so much from this video!
lol, I see some other comments and think you made good choices for your allotted time. You picked which sharks to jump and delivered a very informative and yet short and thorough look at one single call/example. I applaud your knowledge and delivery, sir! Thank you for this and all the others! :)
Well, a sidebar in physiology - just the twist we need. It all boils down to dopamine, which is exactly why we watch this channel.
Thank you so much for all you share with us! Always excited to watch your videos.
TMI! TMI! Wow! That was very interesting. Love the way you explained it. 😀
I absolutely love this look at the software, hardware and wetware involved in Hello World! This was awesome! Just the fact that blobs of ink (or pixels) can invoke voices and images is miraculous in so many ways. Thank you for helping us appreciate it. It's crazy when you get down to the level of the rhodopsin! :-)
This is the full cycle - he's the only man to lay it out. I am sure he could have gone down into each function and variable, but you/we got what we needed.
Thanks Dave, that was interesting. I smiled when you mentioned the 6545 CRTC - it reminded me of the early '80s, when SRAM ruled and it took a lot of Z80 assembly code to fill 64 kB of memory. Abstraction wasn't nearly as much of a thing back then, and being an EE, I was always trying to avoid 'system overhead' code to squeeze as much performance out of the hardware as possible.
Dave's boss at Microsoft: SHIP IT
Dave: But it has lots of bugs
Dave's boss: SHIP IT
Absolutely awesome
I didn't expect this to go and end where it did... what a great video! At the very end I was just waiting for the last part where it jumps the person-to-person barrier and ends up with someone asking where your unit test for the code is.
Always such great videos. A fantastic source of knowledge and trivia for those of us who enjoy experimenting and programming with the Win32 API!
I said it before, and will say it again, these videos would be great to have as a podcast format!
With that detailed explanation of how we perceive computer-generated output, I'm glad that we don't rely on Windows once we get past about the DDI layer. A blue screen much farther downstream would be a very bad day.
Love it Dave. Your content is great. Thank you!
I was very surprised, but pleasantly so, when you kept talking after the text was displayed on the screen and then provided a detailed description of the human visual system.
11:30 you just had to add a lengthy neuroanatomy bit as a set up for the dopamine joke. lol. That’s classic Dave.
Loving this content. Others are too, approaching 1M subscribers! Do you have something special planned for when we hit that milestone ?
As usual in computer science: it's abstractions all the way down. Love it.
Your channel is a source of wonder and inspiration. Thank you for your creativity and diligence!🐳🚕😍
That sounds almost exactly like how I would explain or describe it. It's all a function of light, sound and the human mind. Simple yet complex at the same time.
Awesome stuff. Thanks. Please give more doses of Windows internals and its Kernel stuff. 🙏🏼🙏🏼🙏🏼🤘🏼🤘🏼🤘🏼
11:30 LOL, that's a hell of detailed description 😁
Thanks Dave, I really enjoyed this video
Excellent. Loved the human interaction.
Good stuff Dave!
Great video as always. Keep up the good work! IMO, understanding of the lower software levels of computers (OS, compilers etc.) should be part of every software developer's knowledge. Being involved with OS/compiler development should be part of a programmer's 'rite of passage'. Certainly when I did my CS degree, more years ago than I like to remember, we did practical OS and compiler development 'lab' work to supplement the lectures. I remember once finding a bug in a Fortran compiler I was using: in a particular circumstance it would produce an erroneous result. I reported this and was told 'Well, the source is stored at xxx, go fix it'. Took me a couple of weeks but I finally found and fixed it. Absolutely great. I learnt as much from that work as I did from the lectures.
This is just so great. As devs we often hear "why don't you just". Here's why.
Wonderful!
It’s 4:30 and I could not sleep. Found this video and thought it could help me get tired. Boy, was I wrong. Fantastic video, Dave! Did I miss the part about how encoding works in the console, or is that in part 2?
What a great video. I really like it. 🤟
Dave, you really have a wicked sense of humour defining TMDS as Too Much Data Standard. For those who are interested, there is a Wikipedia article on the actual meaning of TMDS. 🙂
Your 6545 comment made me laugh, out of nostalgia, and wondering how many of your viewers caught that reference. I actually used that thing in a 6502 project, some unspeakable number of years ago. Great video, thanks for that.
Going really deep on something very "basic" at the surface level. I really like it
Good summary Dave, thanks. However I feel forced to say that I think "Gratuitous Embellishment" at around 11:12 is an epic, cosmos level understatement. Well done!
I love that he went literally through all of the steps including the human anatomy section
Fantastic video
Good video. I knew there was lots involved in just writing to the screen. Thanks for the explanation..
Absolutely loved this video 😁
That was breathtakingly fun.🤣 I see that you also have the ability to go: *_"Full-Tism"_*...
OK, so even a simple "Hello world!" string meets kernel mode twice: first when the system processes the program as it fires up, and again when the actual rendering of pixels happens, during the driver's conversation with the hardware. Goodness gracious! It's quite complex, isn't it?
Brings me back to my assembly language class in college, triggering software interrupts to make BIOS calls. Fun stuff.
I expected turtles on the way down. Turtles take days to get there. The screen is a secure place where not everyone gets to do a "Hello World".
LOL. That's a true deep insight into the process!
Dave, I'm stuck with a stack overflow! 🤣 My dirty glasses manipulate the transmitted image! 😁
From the software to the firmware to the hardware to the wetware, this was quite the journey.
I love that the Windows API involves the deeper inner workings of my being lol
The true level of integration of the Windows APIs into the Human API is underappreciated
Life was so much easier on the IMSAI when all you had to do was know the base memory address for the video card and start moving eight-bit ASCII characters to it one by one. On the PDP-11 you had to set a couple of registers on the serial interface and the ASR-33 would start pounding out your "Hello World" at 110 baud. I long for the days to be able to write code on bare metal again and toggle the program in through the front panel switches :)
Grab an STM32 and you'll be flipping registers to get 'hello' again.
You can do that if you build your own CPU.
A lot of new computers will happily allow you to run embedded programs, and all you have to do is write to a (now standard) memory address to get hello world - even these latest Nvidia AI cards support this stuff (a bare-metal sketch follows below).
Apple is the only manufacturer, to my knowledge, that doesn't allow you to do embedded programming like on old, old computers.
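In the spirit of the IMSAI approach above, here is a bare-metal C sketch of "Hello World by poking video memory one character at a time". It uses the classic PC text-mode buffer at 0xB8000 purely as an illustrative address (not the IMSAI's), and it only works on bare metal or in an emulator, never under a modern protected-mode OS:

```c
#include <stdint.h>

/* Write ASCII characters (plus an attribute byte) straight into text-mode
 * video memory, one by one. 0xB8000 is the classic PC colour text buffer. */
void hello_bare_metal(void)
{
    volatile uint16_t *vram = (volatile uint16_t *)0xB8000;
    const char *msg = "Hello, World!";

    for (int i = 0; msg[i] != '\0'; i++)
        vram[i] = (uint16_t)(0x0700 | (uint8_t)msg[i]);   /* grey on black */
}
```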
I found a couple of glaring omissions. You left out how electrons in the monitor were excited by a slight surge of controlled power to move to a higher shell in their atoms, and how, when they returned to their normal shells, photons were released, which then travelled through space via the electromagnetic field until they THEN travelled through your corneas, lenses, and vitreous humour, ultimately reaching your retinas. Other than those things: great video! Mentioning all those components in Windows sure brought back a lot of blue screen memories!
Send me an image of five turtles stacked atop one another that doesn't offend you and I'll A/B test it.
😂😂
ELECTRON SHELLS, DAVE! XD
Your humour kills me every time, hahaha!