@@MrLunithy You can often find leaked schematics for modern computers. While there's a ton going on, it's actually pretty easy to understand if you only look at one piece at a time. Chips as complicated as today's don't need much support circuitry other than power supplies.
@Janek Winnicki That's funny, I started programming at the age of 10 on an Acorn Electron. I was utterly fascinated that I could make a computer do stuff.
Computers still come with manuals... including a rough schematic. You just need to purchase your motherboard separately (and not a fully-built Dell, for example) to get that manual. You can also download them as a PDF from the manufacturer.
@@JonRowlison "Rough schematic": it tells you which holes to plug things into, and each type of plug will only fit one type of hole. You've no idea how the LPC bus connects the ROM to the CPU, or whether there even IS an LPC bus, what it replaced, and why. There's still a basic IBM 5150 in there _somewhere!_ However many of its interrupt controllers and timers are emulated in software at some level by some in-chipset CPU (and there are probably at least a couple of those, and you'll _never_ see the source code!). Just the spec for PCIe is only open to members of its Special Interest Group, which will cost your company a shit-ton to join if you Need To Know, and to download it you search for which _pieces_ of it you're interested in. Possibly nobody in the world has read the entire thing. Certainly nobody in the world understands a modern PC from gate level to the OK button on the screen. And there's stuff beneath the gate level. Modern PC "manuals" are nothing _near_ as comprehensive as the ones that came with computers back in the day. Those had entire circuit diagrams. A modern motherboard might contain, essentially, one big chip that's more complex than 1,000 chips on the original PC. There are billions of logic gates in there. With a PET or Spectrum or whatever, you could find out what bits to set to get immediate results in the hardware: no OS, no other code interpreting your wishes, you just DID it, right there and then, yourself! You could repair, replace, and upgrade parts that aren't even visible today cos they're all in one gigantic chip, and you'd need something like an atomic force microscope to see the parts if you got the lid off. Computers today really do have thousands or millions of times more transistors, no exaggeration. They all do something. Then there's the millions of lines of code, all interdependent, where you used to have perhaps a few thousand that did exactly what they said they did.
Now, a program you write might be running on a CPU like you expect. Or it might be emulated. Or hypervised, or trapped and partly emulated. On a machine with an architecture similar to, or massively different from, what you thought you were programming for. It might well be running on some cloud server halfway across the world, and only part of it's actually running on the so-called user's PC. Parts of it might be operating on different servers at the same time. You and I might be centuries dead, running as AIs in a virtual world a tiny fraction as interesting and sensual as the real one was. And nobody's told us! We really wouldn't know, and software doesn't, can't know. There is no comparison. A Ferrari versus a roller skate, really, on that order. Pardon all the hyperbole, but I don't feel I've exaggerated at all; computers now really are ludicrously detailed inside, whereas the stuff of the '70s and '80s wasn't so far on from the machines they used in WW2. Go look up transistor counts of PC CPUs from the 8086 to the modern Core Whatever. The 8088 CPU of the first PC contained 29,000 transistors. Some recent thing from AMD contains 39 billion! Admittedly a lot of that is cache RAM, silicon monoculture rather than the jungle of other parts, but I think it makes the point!
I started a new project and went from C# to embedded C and C++. I went from thinking I understood computers to spending hours relearning what I thought I understood. I fell from the shoulders of giants but gained a giant respect for them climbing back up.
I'd love to see you walk through everything that happens from the moment of powerup, through someone typing PRINT "HELLO WORLD" on the keyboard, and finally hitting return and the output shows. It would be quite a journey!
Thanks for the idea! Not a bad one - could start with turning the machine on, then what happens until it goes input idle, then how it parses and executes the line of code. I'll think about that!
@@DavesGarage *Builds TPF file from the original, sprouting as BMP from the default texmod tool* *Upscales multiple ways and purges the old .log file with removed associations so it's absolutely fresh and associated, while making sure the name matched after removing the non-upscaled exacting format* "I can't get Jurassic Park back online without Dennis Nedry." -Ray Arnold. The brain-inflammation stage from staring at a screen for 65 million years is setting in. I'll cycle through the different compression technologies and see if I can definickize this blasted tool, which, with very little effort, I'm glad I figured at least the above out about.
@@DavesGarage The Eagle has landed. The trick was not using the BMP default, since Gigapixel can't meet that. JPG in the .log file and outputting in Gigapixel worked. My T. rex looks far less compressed. I'll have to figure out the multiple-assets-same-name tree of funsies in time, since it's the only thing left to figure out; then it's just pure drudgery.
This is so valuable. I also grew up in that era and slightly before. By the time I was teaching programming at university, students were already getting abstract information without any base history. It was a little painful to watch. I still teach, and I’m now having to make videos on steps to programming that I never thought would be necessary. Students come to my courses with nearly no understanding of what a computer is or does, even if they are reasonably good at using one. The horrors of tech support are hilarious reading, but also a tale of shame. If you get a chance to read Reddit’s tech support tales, highly recommended.
@@wandererstraining My university has some classes that deal with writing assembly and how logic gates and an ALU work. I don't remember if they ever expanded that into the full picture though.
The first time I did much with computing, it was to reprogramme a training simulator for up to 32 staff. I did not have access to the assembler, for programming, so wrote code long hand in hex, and entered it byte by byte into memory. You reminded me of how much I have forgotten in over 40 years. I did not have a keyboard-map for the function-specific keyboard, so I wrote my routine to map the codes. In contrast, I have just updated my PC from Win-10 to Win-11. I spent the last two days locating, and removing, all the OEM software to render it stable once more. Times change, not always for the best.
It's nice when you have the schematic for something, but even nicer when you fully understand it and know exactly what each part does and why that's useful.
Indeed, but we have so many layers of complexity and abstraction that no single person can fully understand it on anything reasonably "modern" to be useful to an average user today. You can pick and choose which layers are useful/interesting to learn, although some parts have little to no public documentation and are essentially secrets (like firmware and microcode).
The book "Microprocessor Interfacing Techniques" by Rodnay Zaks is a great introduction to this kind of stuff. First published in the late '70s, it's a brilliant read.
I bought a second-hand PET in about 1988 off the company I worked for at the time. The only experience I had then was with cheap home computers such as the ZX81 and Oric-1, plus a week-long 6502 assembly course when I was 13, in 1984. With no instructions, and of course no Internet, it was a great learning experience. I actually managed to program a rudimentary BASIC horse-racing game where you could bet on the outcome of each race. Each horse was just a number 1-6 moving along a track made of minus symbols, with a random number generator deciding whether each horse moved based on its odds. If I remember correctly, the entire screen was redrawn on every pass through the code. Years later I managed to get a job in IT support, and knowing how to code certainly helped my career, especially in the early days of DOS and UNIX support. I still dabble in C-based languages to this day. Videos like this remind me of those days and how exciting computers were before they became a tool just for work, and help reignite the passion :)
My career started at Texas Instruments in June of 1980, in the MOS memory group based in Houston, TX. Static RAM came in two flavors: fast static, like you might think of in relation to cache, and slow (as in not-so-fast) static RAM, usually found in 600-mil packages and pin-for-pin compatible, except for one pin, with ROM chips. If memory serves me moderately, they had 250 ns to 150 ns access times instead of the 15 ns access times of the much more expensive 300-mil fast static devices. My very first project was to reverse engineer the CMOS-based Toshiba 5516, a 16K static RAM, which I did quite successfully, revealing that Toshiba was much farther ahead in the wafer-fab process business than my company, the number-one behemoth semiconductor manufacturer that TI was at the time. It was an early indicator that these new Japanese semiconductor companies were becoming serious competitors in that industry, especially with that ever more important DRAM chip. P.S. jump (IBM / Intel 16-bit 8088. The jump-to location being the battle Intel and AMD fought over, with claims of copyright infringement.)
Hi ChesslsJustAGame. Did you work on any memory test equipment when you were at TI? I believe the testers I was developing were purchased by TI for those parts. The SRAM business was brutally competitive at that time...
Literally the first computer I ever used, and I've always had a soft spot for it. I rebuilt one recently on my channel (barely more than 100 subs) and got it looking like a museum piece. I have made the point many times that old 8-bit machines were totally different in that, even as a teenager, I could understand the assembly language, understand the hardware, and understand the OS. I wrote an expansion ROM for the CPC range that added 150 new commands while still at school, and a disc copier better than literally any on the market for the BBC Micro. Videos like this really take me back!
As a 14-year-old, I cut my coding teeth handwriting machine code and punching it into a ZX81 (and then a TS 1000 after I fried the ZX81 when clumsily trying to upgrade its SRAM to 4K :P). Your video reminded me of those old days when you could actually fully understand both the hardware and software of a machine. Zoom forward to 2022 and I have no idea how or what, let alone why, my Windows machine is doing what it is doing :S
The IBM PC, while more complicated, was still comprehensible to me on a software level; the hardware design was well past my understanding. DOS 1-3 and the IBM PC BIOS were well documented, and you could literally debug through most of it and make sense of it. I learned assembly from the examples in the MS-DOS 3 reference, which also had an appendix with the IBM PC BIOS reference. Amazing that someone (or, more likely, a team) at Microsoft wrote assembly examples to demonstrate literally every single DOS 1, 2 and 3 function!
@@linusfu515 The nice thing about having libraries that interact with the OS for you is that they might be OS-independent so it'll work on Windows, MacOS and Linux without you having to write it 3 times.
@@linusfu515 I have three books on my shelf specifically there for computer stuff: O'Reilly's "Learning Python", Sobell's "A Practical Guide to Linux Commands, Editors, and Shell Programming", and CompTIA's 7th edition A+ guidebook. I mention this because I personally think you mostly have the right of it on how people gravitate towards Python when first learning, despite having other resources available to them. I don't know if that's a good or a bad thing, but I do blame my Raspberry Pi and its camera, which needed Python to use properly (at the time at least), for why Python became one of my first languages instead of something else like Assembly, or C and its variants.
This is where I learned to code; I ended up having three of them: a 2001, an uncommon purple-screen 4032, and a SuperPET 9000. I still have all three tucked away.
I love that you took as an example the journey of a character from the keyboard to the screen. This is one of my favorite examples to show the complexity of computers as the character has to traverse many layers of hardware and software. Yet it looks obviously trivial to everyone.
VIC-20 owner here, loved this video. The first computer I wrote any code on was the ZX81. We couldn't afford one at the time (£100!), so I would go to the local electronics store after school to read the manual and code on the demo machine on display. Later I got a VIC-20, which was cheaper when it came out, and later an Atari 800XL. I learned so much just from magazines and books; I ended up knowing more than my computer studies teacher, who used to let me take home one of the school's computers (a Research Machines 480Z) over the summer breaks. Ever thankful to that teacher for the encouragement.
Ben Eater has an awesome series of videos on how to make a tiny computer based on the 6502 and breadboards. If you like Dave's videos on this topic, Ben's are a must-watch. They are maintained as a playlist on his channel.
My first programming class used a KIM-1 single board computer with a 6502 and a whole 1K of RAM. I got to where I could perform the startup from memory. That class led to a field service career for CNC machine tools that were using punched paper tape, but we had to key in by hand the 30 or 40 step program that made the tape reader usable - no ROM. I really miss front panels and bit manipulations. 🖖
Dave, thank you so much! Your story telling skills are pretty much godlike, metaphorically speaking of course. But yeah, the way you present the information in your stories is so much fun to watch and absorb. Big hug!
"You may not know every detail but when you have a firm grasp of the big picture, the details you do know can be viewed in context". This is such a good statement I had to type it out. Will definitely be in my head as I continue my education.
@@greenaum I use vim, but it can be annoying that, because it's a modal editor, you have to put it into insert mode once you get to the line you want, and then you have to leave insert mode again before you can move on to a different line.
It was Microsoft BASIC that did it. The PC's version does the same thing. It also maintains an extra byte in RAM for each line on the screen to indicate whether the text on one line wraps into the next line, so that the screen scraper knows where to stop interpreting.
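That wrap-flag scheme can be sketched in a few lines of C. This is a hypothetical model of the idea, not the actual BASIC ROM layout: one flag per screen row says "this row's text continues onto the next row", and the screen scraper walks back to the start of the logical line, then copies rows forward until a row's flag is clear.

```c
#include <string.h>

#define COLS 40   /* illustrative screen geometry, not the real memory map */
#define ROWS 25

/* Collect the logical line containing `row` into `out` (NUL-terminated).
 * Returns the number of screen characters copied. */
int scrape_logical_line(char screen[ROWS][COLS],
                        const unsigned char *wrap,
                        int row, char *out)
{
    /* Walk back while the previous row wraps into this one. */
    while (row > 0 && wrap[row - 1])
        row--;

    int len = 0;
    for (;;) {
        memcpy(out + len, screen[row], COLS);
        len += COLS;
        if (!wrap[row] || row == ROWS - 1)   /* logical line ends here */
            break;
        row++;
    }
    out[len] = '\0';
    return len;
}
```

The point of the extra byte per row is exactly this: without it, the scraper couldn't tell a long wrapped line from two separate short lines that happen to be adjacent.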
“In the beginning…”, I used assembly language for my coding. I had to write drivers, I/O interfaces, BCD-to-binary conversion, clock timing, etc. I wrote floating-point algorithms for fixed-point machines with 16-bit words, covering 16-, 32-, 64-, 128-, and 256-bit operands using some great algorithms; graphics for screens, radar displays, antenna patterns, all in critical time. Real time is "I hit a key and an answer shows up"; critical time is x equations in your time budget. Any time left over was used for background stuff, hence interrupt structures. I loved it: absolute control of everything! I even restrung core memory!
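For readers who never met fixed-point arithmetic of the kind described above, here is a minimal modern sketch in C using a Q16.16 format (16 integer bits, 16 fractional bits). This is an illustration of the general technique, not the word layout of any particular machine:

```c
#include <stdint.h>

/* Q16.16 fixed point: value = raw / 65536. */
typedef int32_t q16;

#define Q16_ONE (1 << 16)

static inline q16 q16_from_int(int n) { return (q16)(n << 16); }
static inline int q16_to_int(q16 a)   { return a >> 16; }

/* Multiply: widen to 64 bits, then shift the extra scale factor back out. */
static inline q16 q16_mul(q16 a, q16 b)
{
    return (q16)(((int64_t)a * b) >> 16);
}

/* Divide: pre-shift the dividend so the quotient keeps its scale. */
static inline q16 q16_div(q16 a, q16 b)
{
    return (q16)(((int64_t)a << 16) / b);
}
```

On a 16-bit machine without a wide multiply, the 64-bit intermediate here would itself be synthesized from partial products, which is where multi-word routines like those described above come in.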
Just blew my mind with the "READY" thing, I used C64 and still have it around, but I've never thought of doing that. In my mind, the interpreter printouts were still somehow magically different than the stuff I typed, even though I know better, when I actually think about it. Cool trick, love it.
Wow, does this ever take me back. My first computer was a Commodore PET in 1977. I got it and a "semi-useful" manual (guide book) and started my journey by teaching myself programming. My first retirement from the tech industry was in 2002, and it ushered in the worst (most boring) couple of years of my life. I went back to work, retrained myself back into relevance, and now run the R&D development network for a major systems house. I'm celebrating my 46th year in high tech this year and don't plan on retiring anytime soon... thanx Dave. I think I still have my old PET in one of my outbuildings. Gonna have to wander out back and fire it up again soon...
If the PET could use characters from RAM, it would have to access main memory for a byte around once every microsecond, starving the CPU the way the ZX81 suffered. The C64 alternated main-memory access between the CPU and the VIC, but that would have taken a lot of discrete logic chips to implement (which is why the VIC does so much in the '64), and probably faster RAM chips. RAM chip access times from the '70s have, alas, faded from my memory...
This is also my theory. You'd have to dual-port whatever RAM the character map lived in; as it is, they had to dual-port the 1K of screen memory. Note the ZX81 didn't only have to deal with dual-ported memory, it actually generated the video directly, so the CPU was quite busy while video was being displayed. It's possible to do the necessary dual-porting, but that would have complicated things quite a bit. So they allowed two different character sets and that was that.
RAM and ROM speeds generally kept pace with CPU clock speeds, and where the specification demanded it, you could spend more money on memory chips with faster timing. To switch which device was using the address and data buses, you would use tri-state buffers, multiplexers, and transceivers built from low-power Schottky transistor-transistor logic chips (LS TTL). The three states are high, low, and high impedance (the "tri", or third, state); the high-impedance state allows some other chip to drive the given bus lines high or low.

A bus conflict happens when two devices drive, say, the address bus at the same time, causing a short-circuit condition on any line where one driver is pulling low while another is trying to pull high. Chip damage happens if this lasts long enough for the power dissipation per chip to exceed design specifications and "smoke" the drivers on a given pin, or the whole IC fails and needs replacement. Any downstream device listening to this electrical fighting would likely see the ground-side driver win, because TTL outputs have a lower impedance toward ground.

The internal circuitry of the TTL outputs is spelled out in the TTL data books for the chips involved. Usually the outputs used a totem-pole configuration of two stacked NPN bipolar junction transistors: the upper transistor's collector tied to the +5 V supply (Vcc) through a pull-up resistor of very roughly a hundred ohms, and the lower transistor's emitter tied to ground.
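The tri-state sharing and bus-conflict behavior described above can be modeled in a few lines of C. This is a toy logical model for illustration (the names and the enum are invented), not an electrical simulation: each device's output is high, low, or high-impedance, and two active drivers that disagree constitute a conflict.

```c
/* A shared bus line can be driven low or high, left floating (Z),
 * or be in conflict when two active drivers disagree. */
typedef enum { LINE_LO = 0, LINE_HI = 1, LINE_Z = 2, LINE_CONFLICT = 3 } line_t;

/* Resolve what a single bus line carries, given every device's output. */
line_t resolve_line(const line_t *drivers, int n)
{
    line_t bus = LINE_Z;                 /* nobody driving yet */
    for (int i = 0; i < n; i++) {
        if (drivers[i] == LINE_Z)
            continue;                    /* high impedance: not driving */
        if (bus == LINE_Z)
            bus = drivers[i];            /* first active driver wins */
        else if (bus != drivers[i])
            return LINE_CONFLICT;        /* two drivers fighting: short */
    }
    return bus;
}
```

In the real electrical case the fight isn't a clean "conflict" value, of course: as the comment says, the low-side driver tends to win while both parts cook.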
The P and N regions of bipolar transistors are formed by doping the semiconductor material with donor or acceptor elements, whose outer electron shells have either excess electrons or empty "hole" slots in their outer orbitals. In the crystal lattice, each silicon atom's neighbours (above and below, in front and behind, left and right) contribute these excess electrons or excess holes (positive charge carriers). Flow of charge across a junction is favoured in only one direction. The physics says that if the applied flow opposes and depletes the junction's built-in field, the flow continues unabated; if the attempted flow reinforces the doped junction's field, the field just grows bigger and bigger until the voltage is so high that breakdown suddenly allows current to flow backward through the junction. If that current is too large, excessive heat will smoke the device unless the current is limited to sane levels within the heat-dissipation abilities of the part. That is exactly the situation in a reverse-breakdown junction known as a Zener diode, where the breakdown voltage is a known, calibrated number of volts.

Field-effect transistors, or FETs, use a channel through which flow is either made to happen by enhancing the channel's conductivity (enhancement mode) or pinched off by depleting the field (depletion mode). The channel can be made to carry positive holes (P-channel) or electrons, i.e. negative charge carriers (N-channel). The control lines into the various bus-switching chips determine which device is allowed to talk or listen to the addressed memory or I/O component.
Way back in the late 1970s, the speed of non-volatile magnetic core memory was in the low microseconds. Semiconductor memory ran from about 1 microsecond down to about 450 nanoseconds, and each successive year semiconductor RAM got faster. The DRAM chips in the Apple II were around 200 to 300 ns. The first 386 processors at 16 MHz used 150 ns DRAM chips in sockets on the mainboard and on expansion RAM boards. Expansion RAM boards in the early PCs could not necessarily use ultra-fast static RAM cache sets, nor faster SCSI bus-mastering DMA controller boards, unless specifically designed to do so on the newer 486 mainboards with premium-level chipsets and bus expansion slot standards that permitted such things.

CPUs and chipsets started to be fabricated in advanced high-speed CMOS (complementary metal-oxide semiconductor), where signaling used smaller current flows to swing the voltages involved. MOS parts used field-effect transistors, with complementary NMOS and PMOS devices forming CMOS; the fields on the signaling gates turn the transistors on or off in either depletion or enhancement mode. Instead of a base input and collector and emitter outputs, these newer transistors have a gate input and drain and source outputs. Gates require a field voltage rather than an input current to do the switching; the only current flows required are those that charge or discharge the parasitic capacitance of the tiny gate circuitry, and leakage currents were reduced with each newer generation, lowering power consumption.

DIMM RAM started to come out for the Pentium-series CPUs in the mid-to-late 1990s, with speeds of 80 ns or better, and by the early 2000s it was running close to 7 ns. Eventually double data rate (DDR) chips arrived that do block transfers to and from the CPU's internal cache a bunch of words at a time: DDR2, DDR3, DDR4, and so on.
It turns out that every step forward came from shrinking the chip die each new generation, making the transistors smaller and closer together, with less parasitic capacitance and inductance delay. A lot of this was done initially with semiconductor etching masks made of shrinkable material: the pattern was transferred, the mask was shrunk, and then it was applied to the semiconductor wafer layers. As these shrinking advances came in, so did drops in supply voltage. The switching time of the complementary totem-pole transistors determines the power used, because as the signal frequency increases toward and beyond the switching speed of the two transistors, there is a growing window of time in which the bottom and top of the totem pole can both be on at once, drawing current straight from Vdd to ground through both transistors: momentary short circuits causing huge power dissipation. With bipolar junction transistors such heat can cause what is called thermal runaway, where excess charge carriers prevent the transistor from shutting off the current flow without a much greater base input current. From what I can recall, CMOS field-effect transistors do not tend toward thermal runaway; in fact, increased temperature tends to somewhat increase resistance to such flows.
Let me refresh your memory: 150 ns was common, and anything faster got exponentially more expensive. In the end, 286s etc. ran 60 or 70 ns RAM, some 10+ years later.
I would solve that differently: if the character ROM were really RAM, it could be mapped into a block, with CPU access taking priority over the character generator. Given that the system would load this RAM from ROM at boot, the extra moment of blank screen at startup would go unnoticed. And since dynamic access to the character RAM would be unusual and minimal, it could be timed during the blanking period without creating visual aberrations.
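The scheme proposed above (boot-time copy from ROM, later writes deferred to blanking) can be sketched in C. This is a conceptual model with invented names and sizes, not real PET hardware; the 2 KB figure matches a 256-glyph, 8-bytes-per-glyph character set:

```c
#include <string.h>

#define CHARSET_SIZE 2048   /* 256 glyphs x 8 bytes per glyph */

/* Pretend ROM image and the RAM copy the video circuit would read from. */
static unsigned char char_rom[CHARSET_SIZE];
static unsigned char char_ram[CHARSET_SIZE];

/* One pending CPU write, applied only during vertical blanking. */
static int pending_addr = -1;
static unsigned char pending_val;

/* Boot: initialize the RAM character set from ROM. */
void charset_init(void)
{
    memcpy(char_ram, char_rom, CHARSET_SIZE);
}

/* CPU requests a glyph change; it is queued, not applied immediately. */
void charset_poke(int addr, unsigned char val)
{
    pending_addr = addr;
    pending_val  = val;
}

/* Vertical blanking: the video circuit isn't reading, so apply the write. */
void on_vblank(void)
{
    if (pending_addr >= 0) {
        char_ram[pending_addr] = pending_val;
        pending_addr = -1;
    }
}
```

A single-entry queue is the simplest possible version; a real design would buffer several writes and drain them all during each blanking interval.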
A series on the C64 (assuming there is too much for a single episode) would be amazing. Interesting stuff as always; I look forward to seeing more in the future.
I love this. I could listen to you for hours talking about PETs and C64s, and enjoy myself while doing so. I hope you continue with this old school stuff now and then! :) The PET was the first actual computer I ever saw and got to play a bit with, back in the late 70s. I was totally hooked. A few years later and after endless begging, my father got me a C64. The world changed forever that day.
Absolutely right. I think it was around the mid '90s for me; I was brought up building and designing my own computers in the '70s and '80s. Once we hit 80386 protected mode and NT, and I could no longer bit-bang the parallel port without having to hack a driver, I lost my grip on having a comprehensive understanding of the entire stack. I did write display drivers for Windows 3.0, but only a very few basic drivers for NT, and of course the driver model kept changing. Thankfully, since WinUSB we hardware guys can largely avoid writing drivers these days, but it's taken a couple of decades of USB to get here.

Regarding the PET's character display, AIUI the character generator ROM isn't memory mapped; it's on its own private bus. There isn't enough time to fetch the character and look up the character's bitmap on the same bus. The 6502 has a dual-phase clock: on one phase the CPU owns the bus (instruction and data fetches/stores), and on the other phase the video can use it. As the clock is 1 MHz, a period of 1 µs, there's 500 ns of bus access time for the CPU and then 500 ns for the video. RAM and ROM of the day had typical access times of 250 to 450 ns, and by the time you've added the glue-logic propagation delays, there's not enough time to allow for more than those two memory accesses inside that 1 µs. Thus, to achieve the timing constraints of video refresh and the 1 MHz clock, there's a second address/data bus, physically inaccessible from the CPU's buses, for the character generator ROM. The dual-phase clock of the 6502 architecture was a key reason for its popularity among manufacturers, precisely because you could easily memory-map the VDU memory and avoid timing contention, thanks to the time-definite phases. Life was a bit harder for hardware designers on processors like the Z80, which doesn't have this feature.
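The timing argument above reduces to simple arithmetic, sketched here in C. The helper names and the glue-delay figure are illustrative; the 500 ns window and the 250-450 ns access times are the numbers quoted in the comment:

```c
/* Width of one of the 6502's two clock phases, in nanoseconds. */
int phase_window_ns(long clock_hz)
{
    return (int)(1000000000L / clock_hz / 2);
}

/* Do `accesses` memory cycles plus glue-logic delay fit in one phase? */
int fits_in_phase(int window_ns, int accesses, int mem_ns, int glue_ns)
{
    return accesses * mem_ns + glue_ns <= window_ns;
}
```

Run the numbers: one 450 ns access plus a modest 30 ns of glue delay squeaks into a 500 ns phase, but two 250 ns accesses plus glue do not, which is exactly why the character generator needs its own bus.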
Shoving the character ROM on its own bus certainly wouldn't be all that odd a choice: the NES did the same with tiles, and I think the IBM PC monochrome adapter _may_ have done the same with its character ROM. It would always have been possible to add some circuitry (e.g. a DMA circuit in the video processor) to get characters moved, but that would have (slightly) complicated the video circuitry, and would have required extra chips (and as I recall, ROM was cheaper than RAM).
This is why I migrated to Linux relative early. I even own the T-shirt "Ich bin /root, Ich darf das!" = "I am /root, I am allowed to do that!" I also stopped developing hardware stuff for the PC and migrated to embedded systems only using Ethernet and browser based interfaces. Less pain and no sweat anticipating the next OS upgrade.
I am a kid who is into computers but has no in-depth knowledge, and I'm able to understand what is being talked about in the video. I don't really have a full grasp yet, but I am able to register it in my brain, so I guess you did a great job. I do suspect this video has a very high CTR but that people click off early on, because it's big-brain stuff, folks.
What a great video. Takes me back to my younger days when I was all in on programming my C64 in BASIC as well as in 6502/6510 assembly language. Back then I really did know almost everything there was to know about how that computer functioned.
Nice video! Reminds me of 1981 or so, when I got a Tandy TRS-80 with 4K of RAM. The same evening I replaced the RAM with 16K chips. Later I upgraded it to 48K by soldering two RAM ICs on top of each RAM chip and adding an address-decoding circuit, all piggybacked on the original ICs. Next came a lower-case character generator, and after that an APL character generator for that language's specific symbols. Two 5¼-inch floppy drives with a home-built controller. A real-time clock and a 75/1200-baud modem. That was indeed the last system I understood 100%.
It used to be more fun when we could still at least pretend to understand the whole stack, from digital circuits up through all levels of software. To become a software or systems person, it was possible, and even beneficial, to start from digital circuits and hardware and move upward.
You still can, although it requires quite a bit of patience, but at least there's tons of easy-to-reach documentation on how to do it today. Want to build your own TTL computer? You can; there are even entire series on YouTube on how to design and build your own custom CPU if you want. Want to write your own OS? There are tons of wikis, tutorials and books on the subject, and entire open-source projects that you can download for free. Want to build an 8-bit or 16-bit computer? There are kits you can buy and build, and tutorials on how to design your own. Microcontrollers (and their development kits) and basic logic analyzers are essentially commodities now. And FPGAs went from something you could only dream of, unless you worked in very specific fields, to something you can buy decently sized at a reasonable price and use to build quite complex digital circuits. There are even video series on how to develop cores for FPGAs today, and tons of open-source cores that you can learn from. If you are so inclined, take a look at the MiSTer FPGA project: lots of retro gaming fun, and also a very nice learning tool if you want to either dabble or deep-dive into digital circuit design. Sure, it's "expensive", but it still costs less than a modern video game console or a gaming computer, and I promise you, it will give you way more hours of fun than those other options.
@@gcolombelli If you build your own, that's a different situation. Also, open source and open designs are different things; I agree with that. I used to work mainly with closed-source systems, where access to all the hardware levels and low-level interfaces and functionality is not so accessible, for commercial reasons. Nowadays there are exceptions, I agree. In my university years I had access to OS source code from DEC and other companies, but later, even working in a computer company, getting access to the source of the software I was selling and supporting was restricted. Open source wasn't really part of that company's business plans, and in-house CPU development information was even more siloed and restricted.
Thanks, Dave, for yet another amazing episode. What a rich plethora of information that I, like most other grandpas, have forgotten or at least pushed further back into the recesses of near oblivion I call my memories.
I used the KERNAL routines on the VIC-20 to print to the screen; that way, when a memory cartridge is installed and the screen memory moves, the code still works because the KERNAL routine adjusts for that.
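The point of going through the KERNAL rather than POKEing a hardcoded address is indirection. A minimal Python sketch of the idea (hypothetical names; the screen base addresses are the commonly documented VIC-20 values, but treat them as illustrative):

```python
# Hypothetical sketch (not real KERNAL code): printing through a
# published routine survives a screen-memory relocation, while a
# hardcoded POKE to a fixed address would not.

class Machine:
    def __init__(self, screen_base):
        self.ram = {}
        self.screen_base = screen_base  # moves when RAM expansion is added
        self.cursor = 0

    def chrout(self, ch):
        # The "KERNAL" looks up the current screen base on every call,
        # so callers never need to know where screen RAM lives.
        self.ram[self.screen_base + self.cursor] = ch
        self.cursor += 1

m = Machine(screen_base=0x1E00)   # unexpanded VIC-20 screen (assumed)
for ch in "HI":
    m.chrout(ord(ch))
assert m.ram[0x1E00] == ord("H")

m2 = Machine(screen_base=0x1000)  # screen relocated by expansion (assumed)
m2.chrout(ord("H"))
assert m2.ram[0x1000] == ord("H")  # identical calling code still works
```

The same calling code works on both configurations, which is exactly the property the comment describes.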
I remember discovering software interrupts when I was 12 years old. I bought all the books I could find on anything PC. In one summer I made TSRs to cheat in games, made a program to add echo to live microphone input, wrote a simple graphics library to use all VESA modes on a Tseng labs 4000 and read data from a CD-ROM. Best summer ever! Just one summer later I had internet connectivity and Pathfinder landed on Mars. Epic times.
I find this all interesting because I was a kid when Windows 95 was released. It's interesting seeing the trade-off going from the PET to the VIC-20. The problems I faced when I was younger were: trying to use the Compaq my mom bought in '96 through the year 2005 because I had no money to buy a computer, and everything was a hand-me-down.
In my early teens, I had a Sinclair ZX81, followed by two Atari 8-bits. Although I never got into assembly language, I did have a lot of fun programming. Poking values from BASIC, altering characters, setting bits on the video and sound chips on the Atari to achieve different effects, etc. References like "Mapping the Atari" and "De Re Atari" were incredible resources on the system architecture and OS, even if I only understood a fraction of it. Nowadays, everything is a black box, closed off and accessed by a gazillion gigantic libraries, and most kids today have no idea about bits and bytes.
The PET was the first computer I used, and I was hooked when it dawned on me that I could make it do anything I wanted. I just needed to learn how, and I'm still learning today thanks to the highly complex hardware and software systems of today. Today I'm learning about Serial Peripheral Interface (SPI) programming so I can unbrick a BIOS.
I didn't enter the computer field at the PET stage. I entered in 1982 with the Heath model H89 running a 64K CP/M system with a Z80 CPU (2 MHz). Even within 64K of RAM, the thing could run a good word processor that, surprisingly, had many useful features considering it ran in what was left of the 64K after the CP/M system filled its share of the RAM. I even had an MBASIC compiler and a FORTRAN compiler on that H89. Storage used 5.25" floppies, hard- and/or soft-sectored.
Loved the PET. Cut my programming teeth on it at school, ducking out of many other classes to be found in the computer lab (2 PETs), coding away with no idea what I was doing but generally succeeding anyway.
Just found your channel (by random chance, I just had the 'utube on for background noise as I was playing on my MiniPET trying to fix a BASIC program) and holy crap, I think I could listen to you for hours just talking about this stuff.
Growing up during the 8-bit era of computers was definitely a special time period because you could have access to nearly every byte in memory at will on most machines. The only OS that even comes close to emulating this behavior in the modern era might be TempleOS, though that is a very obscure, fringe case.
I spent Friday nights in the downtown Toronto computer store Batteries Included, in the mall Village by the Grange, where Jim Butterfield introduced a wide-eyed Grade 9 student to the Commodore PET, including music! Awe-struck. In the '90s he wanted me to digitize his First Book of KIM. Wonderful mentor. RIP, Jim Butterfield.
The reason the PET didn't have user defined characters is that the character ROM isn't in the 6502's address space, it is hard-wired into the video circuitry. You can't read the character ROM on a PET from a program even if you wanted to. The video circuitry in the PET essentially operates asynchronously on its own, with the CPU grabbing access to the video RAM when it wants it - do this at the wrong moment and you'd see "snow" on the screen where the wrong data gets shifted out to the electron beam! That is why BASIC usually waited for the vertical retrace interval before writing to the video memory. Later PETs, the 16K and 32K pets (2016, 2032, 3016, 3032 etc) did use dynamic ram. These are often referred to as the "Dynamic PETs" by forums.
Dave, this is such a great walk back through history. My own experiences reflect those early days where you were close to the system, writing an input routine to grab the keystrokes and then move on to the next task and be back in time to pick up the next keystroke. Thank you for putting a wide smile on my face.
Nice, brings back the 8-bit days; I loved being able to have my head around 95% of my machine. "Character gen ROM not in memory map": same as my TRS-80 M1. Having the actual character 'ROM' in RAM was a problem back then: the video refresh circuit has already pushed the CPU aside so as to read the video RAM, and now it also has to index back into RAM to get the character patterns for that screen position; usually that was too much bandwidth for the main address bus to handle. I did it as a hack mod on my TRS by having my programmable character 'RAM' chip as a dedicated character-generation IC that was not 'normally' in the main memory map, so its operation did not affect normal memory bus operations. However, when I wanted the CPU to write to it, it was quickly switched from the video circuit back into the main memory map. On my TRS, there was not a handy 2K hole in the memory map (I used a 6116 2K RAM chip), so I mapped it on top of the 12K BASIC ROM. If you do a read, you get the ROM; if you do a write, it goes into the character gen RAM. The only downside is that I could not read that RAM, only write to it. In the PET days, adding all of this would have made the circuit board somewhat bigger. By the time the C64 came out, Commodore were able to make their own custom chips, making this a lot easier, as the circuit board could be shrunk quite a bit. The downside for the C64 is a few 'unobtainium' ICs that make repairs a tad hard. (Yes, I know there are modern solutions now, but only due to the C64's popularity.)
I did something similar on my ZX80 in around September 1980. I had a little board that I designed that plugged into the ROM socket, but also added some extra logic and a couple of 2114 RAM chips. My ZX80 had the normal 64 characters, plus 63 user defined characters and their inverses as well as the inverse of the 64th user defined character that you could not display normally due to the way the ZX80 display generation worked. Fun days!
After 7 years of being a computer enthusiast, and when I completed my Computer Science degree in 1980 (which makes me the same vintage as this presenter), I had learned most of this stuff. Additionally, my first few years of professional low-level process-control SW development taught me all about 8-bit computers... and I mean ALL. Interrupt handling, O/S structure, Assembler, machine code, how input and output is multiplexed, RS232, and a myriad of other stuff. I am SO GRATEFUL to have been on the ground floor of microcomputers. You haven't lived if you haven't put a 16-channel logic analyser onto a single-board computer, looking for why a data fetch hasn't worked correctly :)
At one time the university I worked for moved from a mish-mash of early computers, mainly the Commodore PET, to PC clones. In the basement of one building they had several hundred Commodore PETs stacked with shrink wrap on wood pallets because they couldn't get rid of them!!! No other institution wanted them. Illegal to send them to the landfill. They were offered free to employees and students but there were few takers, including me! I believe they eventually had to pay a recycler to take them. These now sell for well over $1000 on eBay!!!!! Coulda, woulda, shoulda!!!! AAArrrggg
I feel your pain. I still kick myself often for not taking the opportunity to snag my family's first computer, an Apple II Plus, when my dad was cleaning out his old house prior to selling it. I also enjoy watching videos of car enthusiasts restoring antique cars. Hearing restorers talk about their finds, I used to find myself shaking my head and wondering why such beautiful machines were just abandoned and left to rot, referring to both the computers and the cars. I have to stop myself and remember that at the time they were replaced, they were just old, obsolete machines, often crippled by problems that weren't worth fixing at the time. They couldn't run modern productivity software, Internet access was beyond them, and there were thousands more just like them here, there and everywhere, so except for special editions, they weren't worth anything. And a quick look at eBay suggests that these old gems still aren't worth much to the market at large, and probably never will be. I currently have a Compaq Deskpro from 1998 or '99 that was retired just a little over a year ago, long after most of its peers had been chucked. I'm currently using it to demonstrate some long-gone storage media to a group of high school students. I'm preservation-minded enough that I'm not going to throw that Deskpro out when I'm done with it, especially since it's in pretty good working order. As far as what I *will* do with it, that's a good question. It's too old to be useful, but too new to be a museum piece.
@@mar4kl Absolutely. I'm seeing 5 1/4" floppy drives, old sound and video cards, and memory sticks bringing in crazy $$$. Granted, these are not making someone an instant millionaire overnight, but I personally have thrown away many of these items and it's painful to pay any $$$ for stuff you once trashed!! What are we throwing away or neglecting today that will have value in the future? It's just difficult to know! As for your Apple II Plus: over time they, along with the first Macs, joined the pallets of unwanted computers.
Thanks for the walk down memory lane. I started with the Commodore 64. I recently did a hobby project for my IBM PC-XT and wrote an assembly language program to interact with the battery-backed-up clock board on the I/O bus. The original program was long gone from the hard drive, which required low-level formatting to bring back to life. I had to trace out the board, create a schematic, and then look up the spec sheet for the chip (MM58167). I used MASM to write the code. The project was very satisfying and a lot of fun. The last time I did any assembly language coding was 30 years ago.
I'd love a deep dive into the c64. In fact, a series on the c64, maybe walking us through a complete program start to finish, would be freakin' awesome!
I miss my machine code programming days on the VIC-20 and C64. Writing entire games that wouldn't even fit into the memory space needed for a simple icon nowadays was very rewarding, especially for a 14 year old kid.
I love it! Yes!!!!! This is exactly what I try to express to people why I love vintage computing. It was simpler back in the day. When I teach cyber basic classes for my organization, I try to teach these young kids (25-35 is young by my standard) they miss the elegance of the old equipment. For sure, cool to have a fast computer in your pocket (mobile device) but everyone has become a user and so few understand the what's happening under the hood. It was easier then to be a mile deep in many areas of computing. Now you can only be an inch or so deep in multiple areas and you have to choose a specific area to go a mile deep. Sure more sophisticated? I'm not sure. I miss the days of actually being able to understand, diagnose, and repair.....like you did with your PET. Cheers
Very well presented. I admire your dedication to these legacy systems. They serve well as a way of explaining the basics of computing. I really enjoy your channel.
Very cool video. I started with an RCA 1802 Elf in the 1970s. A friend and I built a system to replace a controller board for an IBM Selectric printer, with 2K of 2102s and serial I/O to control the solenoids that drove the mechanics from a serial input. We built our own hardware emulator and programmed in machine code, including an EPROM. The 1802 was a good platform since it was static, so we could control the clock and step through the routines to get the timing right. I agree with you that it is helpful to have a knowledge of how these tools work. It was fun, but I enjoy the current capabilities of running MS SQL and TensorFlow.
I always appreciate your explanations. Having grown up in the commodore era, it's always worth it to me to sit back and listen and remember the amazing things these computers were able to do. If we could share photos here on YT, I'd gladly post a photo I found last year of me and my brand new Vic-20 on Christmas 1982. I would've been 12yrs old. That quickly led to acquiring a C64, multiple 1541's (because I ran a BBS at 13-14yrs old), and eventually a C128 with a couple more 1571 drives. Back in the early 90's, I found a fully working PET on the trash. I knew it worked because it came home with me and I plugged it in. All of that equipment landed in the trash. How I wish I could go back and convince myself to store it someplace safe just so I could play with it today. Who remembers Person-to-person transfers on Qlink? :)
Life was good when I could stop toggling the boot loader into the front panel of a PDP-8, and use the boot loader in Core RAM in a PDP-11. Imagine my delight when the bootloader and monitor were burned into EEPROM in my 1975 Digital Group Z-80 computer that I hand-soldered together, all 1024 bytes of static RAM, later expanded to a voluminous 8 kbytes. It would load and save using audio tape. I wrote a lobby kiosk annunciator board app, in hand-written machine code, for that sucker. Booting Windows or Linux from SSD is an incredible luxury! (By the way, get off my lawn).
Dave a lot of the first generation of "homebrew" hackers and gamers went through the knowledge bootstrapping process. Many even independently invented or discovered how to build computers from scratch. Early software and hardware are so intertwined that it was not possible to know one without the other. Computer magazines in the 80s actually had complete program source codes or circuit diagrams, and Byte magazine was our industry bible.
I once tried to explain old-school bootstrapping all the way from power-up to sector zero loading, and jumping vectors in the MBR. The person I was explaining it to just looked at me like I was purple, and didn't understand a word I was saying! KIDS TODAY! SHEESH!
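For anyone who never got the explanation: the classic BIOS path is to read sector zero (the 512-byte MBR) into memory at 0x7C00 and jump to it, but only after checking the boot signature in the last two bytes. A minimal, illustrative sketch of that signature check:

```python
# Sketch of the classic BIOS sanity check: sector zero (the MBR) is
# 512 bytes, and the last two must be 0x55 0xAA before the BIOS will
# jump to the code it loaded at 0x7C00.
def looks_bootable(sector: bytes) -> bool:
    return len(sector) == 512 and sector[510] == 0x55 and sector[511] == 0xAA

blank = bytes(512)                    # all zeros: no signature
mbr = bytes(510) + b"\x55\xaa"        # empty code, valid signature
assert not looks_bootable(blank)
assert looks_bootable(mbr)
```

The real BIOS does no more validation than this, which is why a corrupted-but-signed sector zero could happily hang a machine at boot.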
I can make a BASIC Commodore 64 program but I never learned any POKE commands except the one to change the screen colour. I'm Australian and we have the extra U too! Anyway, as long as you don't want fancy graphics or sound, I'm pretty sure I could knock up any program you need. Mind you, trying to make a spreadsheet application in BASIC... I'm not sure I could do that. It would be interesting to try. Anyway, I'm just leaving a comment to boost user engagement. I love this channel and I loved this video. Yes, more, please!
Back in the early '90s I was at one of two startups that were doing compression that could be used on disks. Thanks to books like Undocumented DOS and a penchant for disassembling code, I was able to contribute to that effort. Once we had a product, and I thought about what happened to Lotus 1-2-3, I walked into the boardroom and told the guys we had 2 years to make money on this and then Microsoft was going to have this compression in the operating system. It's as sure as Moore's law. So YOU'RE ONE OF THE GUYS THAT MADE MY PREDICTION COME TRUE! Bravo, much respect!
Would love to see you collaborate with Bill Herd, he's an ex Commodore engineer (though he was there after the PET), he always has amazing stories to tell!
I remember writing my first basic program from the commodore 64 manual. The hot air balloon pixel graphic that would float across the screen. I then spent days writing code to use the keyboard to move it around. So much fun as a kid with tech that seemed like it was from the future. Thanks Dave for bringing back fond memories. Keep it up.
I remember cutting my teeth on a TI 99-4/a back in the day. I didn’t understand it at this level, because I was too young and there wasn’t an Internet. BBSes came a bit later for me, but the early days were spent in TI Extended Basic. Thanks Dave, for shedding light on the old school.
Takes me back to the early '80s when I learned the same things about the PETs we were using for industrial machine controllers in the UK... thanks for reigniting those memories.
Dave really should setup an old school BBS for his userbase (rather than a discord or other) I think it would go down pretty well :P Perhaps not with a phone dialin though, some sort of emulation over TCP instead :)
In fact... it could be done: a standard BSD server, users sign in with ssh as expected, then they use minicom or such to a local BBS... this must be tried!
Even IRC would be a good midpoint, old enough that you can get old hardware to talk to it, but not that old that modern hardware can't also talk to it! You can even get smartphone apps for IRC, just!
I used to help a friend run an Acorn BBS with 3 lines, allowing a whopping 4 people to chat together - well until their parents got their monthly phone bill and banned them 😏
Something I constantly have to teach students - input/output. They get that a little with PHP or with CICS. But in normal programming, they ignore the WFA form (or WPF, etc.) until the last minute. I was always taught to use spacing charts to plan I/O before coding (partially because it helped plan some of the variables).
When you mentioned the part about not understanding some of the other sublayers and bits of the system, it got me remembering a thing I watched a while back about ALSA (Advanced Linux Sound Architecture). The discussion was about how without some kind of added layer of control modules and extensive system components in the middle of him and ALSA - he would have virtually no idea how to make it useful. And that's just the sound. Imagine also trying to interface on a machine-code level with a modern graphics card, with all of their complex functions. 😳 Everything has been reduced to specialized modules that entirely do things for us. The problem is that those modules are extremely power-hungry, when you consider resource uses. I started using computers when an entire OS would boot from a floppy. Now single video games are significantly bigger than most of the hard drives I used for the first 10 years of using computers. 😳
Modern video game sizes shouldn't really be compared to those old disk drives, because most of the consumed space is graphics (whether images or 3d models).
I had not heard the name Jim Butterfield in a long time. I had some books of his learning how to program the C64 1541 floppy drive. Thanks for the trip down memory lane.
7:34 I think it is a bit misleading to show the electron gun covering a full character, as the scanline went through the first row of all the characters together. I am curious if there was any caching-like mechanism here, as the video controller had to access each character's bit pattern 8 times.
The character ROM seems to feed data into a 74165, a shift register. So the byte is read from ROM only once per character per scanline, not once per pixel.
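A toy simulation of that ROM-plus-shift-register pipeline makes the access pattern concrete (the glyph bitmap below is made up; real PET character ROM data would differ):

```python
# Toy model of the character pipeline: per scanline, the video circuitry
# reads ONE byte from the character ROM for each character cell, and a
# shift register (like the 74165) clocks out that byte's 8 pixels.
CHAR_ROM = {
    # hypothetical 8x8 glyph, one byte per pixel row (row 0 = top)
    "A": [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00],
}

def scanline(text, row):
    """Return (pixel list, ROM read count) for one scanline of a text row."""
    pixels, rom_reads = [], 0
    for ch in text:
        byte = CHAR_ROM[ch][row]      # one ROM read per cell per scanline
        rom_reads += 1
        for bit in range(7, -1, -1):  # shift register: MSB shifts out first
            pixels.append((byte >> bit) & 1)
    return pixels, rom_reads

pixels, reads = scanline("AA", 0)
assert reads == 2                  # not 16: the shift register makes the pixels
assert len(pixels) == 16
assert pixels[:8] == [0, 0, 0, 1, 1, 0, 0, 0]  # 0x18, top row of the glyph
```

So the answer to the caching question above is that no cache is needed: each byte is fetched once per cell per scanline, and the shift register covers the other 7 pixel times.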
I remember a BASIC program called FLOORMAT which looked like it was formatting your system drive, with no way to stop it, till it ended with a personal message: "Got you, Dave"
The stuff on how characters are transferred from a map on a chip to electrons on a screen was interesting. I appreciate the time and detail you put into the videos you make! Keep up the good work, Dave.
Tell me more! Email me at davepl@davepl.com with the story! I assume you mean the display? I just swapped one and noticed the PCB wasn't a rev, it was COMPLETELY different than the old one!
I never programmed the PET or Commodore. I had a TRS-80 with a Z80 processor. The layout of memory is comparable. I've forgotten lots of the details of that machine, but I remember that it was great fun to work with it and talk about it with people in our little club (8 people) and show off what you had made. Later I bought the Atari ST with a Motorola 68000 processor. That was a huge step forward. Too bad Atari didn't manage to stay in the game. Today I'm a retired programmer after 47 years in IT and still love computers. My current machine in comparison with the old machines is so much more powerful but also very much more complex. I understood the architecture of the Z80, but the Ryzen 3900X with 12 cores (24 hyperthreaded) is so complex that I can't make head or tails of it anymore. I leave it to the people that do understand. Thanks, Dave, for this little journey into the past. Relived old memories.
@@therealb888 We do learn how computers work at a fairly low level but it's more focused on modern platforms which he mentioned are huge compared to these older machines. This video is like looking at how a go kart works to understand a passenger jet.
@@coolbrotherf127 Good for you man. My backwards college is stuck with Turbo C & the 8085 (not even the 8086). But low-level understanding goes a long way. Instead of having a course on modern wireless comm systems like 4G/LTE networks, we were taught the GSM/CDMA standard.
@@therealb888 If you want deep dives, I'm studying assembly x86 as a Comp E student. We didn't go over basic yet, but we either will in future (I hope) or we will go back to C++ I'm typing cuz CS doesn't really go into oldschool, hardware type stuff, that's our territory. CS usually goes into performance engineering and such with more modern stuff. What i mean to say is that they really don't need to focus on controlling hardware as much, CS is more about applications and etc. Maybe @@coolbrotherf127 will go into Rust and the like in the future
I had a Commodore PET. I even remember playing Space Invaders on it as displayed in this video. I remember loading the game from a tape deck attached to the computer. It's been an amazing time.
Re: character generator ROM: My guess is that allowing both the CPU and the video circuitry to access the RAM would require additional circuitry to implement a dual-ported RAM solution, adding cost to the system. It was also the case with the TRS-80.
When the CPU was slow (1 MHz range), static RAM was at one time more than twice as fast as the CPU, allowing interleaved access by the CPU and graphics. So no dual-port RAM was needed; just wait for the CPU to release the address enable signal and have the graphics circuitry step in and read. I have a home-built Z80 machine using this logic. Nice, clean, and fun to implement.
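The timing argument is just arithmetic, sketched here with assumed, era-plausible numbers (a 1 MHz CPU and 450 ns SRAM; your parts may differ):

```python
# Back-of-envelope (illustrative numbers): a 1 MHz CPU cycle is 1000 ns,
# and a 450 ns static RAM can complete two full accesses inside that
# window, so CPU and video can interleave without dual-port RAM.
cpu_clock_hz = 1_000_000
cycle_ns = 1_000_000_000 / cpu_clock_hz
sram_access_ns = 450                      # assumed SRAM access time
accesses_per_cycle = cycle_ns // sram_access_ns

assert cycle_ns == 1000
assert accesses_per_cycle >= 2            # one slot for the CPU, one for video
```

Once CPUs outran memory, this free lunch disappeared, which is why later designs needed dual-port RAM, bus stealing, or retrace-window tricks instead.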
I may be wrong, but I think the character ROM on the PET (unlike the C64) was not hooked up to the main CPU--it was only attached to the CRTC. So, the actual hardware made it impossible to relocate the ROM.
@@whollymindless Yes, contention is definitely the issue. But there is a cheat possible in some situations: during CRT horizontal and vertical retrace, the graphics do not need to read out data, so it's possible to switch ownership for accesses. Not a perfect solution for graphics RAM, since it limits the available draw time per frame, but enough time to redefine sprites in a RAM-mapped character generator. Edit: but it does require some MUX chips to switch whether addresses come from the CPU or the CRTC or whatever is used to drive the graphics display. And it requires an extra dual-direction buffer chip on the data lines, since the character generator data must be kept off the CPU data bus while graphics are being emitted. Easy to build, but adds cost and PCB space.
@@d2factotum That is why MUX chips are needed: to switch ownership of the data and address lines, plus a means to synchronize the code with the horizontal and vertical retraces.
Amongst the first dedicated computers were the AES-100/103 word processors. Although powered by an Intel 8080 CPU, most of the functions you describe were done in hardware. Yes, for example, the on-screen cursor had a series of TTL (7490-type) horizontal counters counting up to 80 (characters) and a vertical counter counting up to 25 (lines). The half-tone cursor was driven on the video board with a resistor injecting half the current into the gun, hence the tone difference. The boot sequence was controlled by a bootstrap loader in ROM. The disk access, head placement, rotation, data transfer, etc. were all hardware-implemented. The same video ROM idea from the PET was used (different countries' characters were available with different SKUs), although the text on the screen was stored and "remembered" in DRAM memory. This was also hardware-controlled, to the point that the text on screen and off screen was stored in contiguous memory as it was displayed. The text moved on screen because there were hardware counters doing that. The keyboard scanning was all implemented in hardware, only issuing an interrupt on any "Hall effect key" keypress. In effect, at the time the limiting factor in all systems was the CPU itself. The CPUs were so burdened with all the interrupts and housekeeping duties that they barely had time left for anything other than coordinating the processes. It was faster to just dump a command or a couple of bytes to the hardware and let it do its own magical business, then come back on an interrupt to look at the outcome of whatever was pending. There was extensive use of DMA for anything data-related. Communications were handled the same way. A respected engineer (especially in mission-critical areas like communications) at the time would not trust anything to the CPU! We would implement a series of lap-time watchdog timers (Bill P.), tight timings, wake-up call timers, etc., and make sure the hardware was ready to press the reset button at 4 milliseconds' notice!!
That's the way it was done then... Debugging a video display was simply a matter of looking at the screen; with experience we knew exactly what had failed, since it was a sort of inverted WYSIWYG! Remember AA55 and the quick brown fox generator!?
I started on the Apple II. Very, very, similar to the PET. About 80% of this video also applies to the Apple II. The only major difference is the Apple II had 2 graphics modes in addition to the text mode. Great video. Took me back in time.
Damn, if you hadn't said in the beginning that it was not a guarantee (that I would understand how it works), I'd be suing you right now, cuz I still don't understand anything lol
Thanks for that ride down memory lane, Dave. I started off (if we discount a 4-bit AIM development board running a 4004) with a Tandy (Radio Shack) Model One. It was even simpler than the PET. I too enjoyed the luxury of knowing what every (well, most) component was for, and thanks to reverse engineering (or "screwing around with the Monitor in the ROM", as I would have called it at the time) I learnt what every bit of code (including those Microsoft routines) did. It was a time of rapid development, and personal computers costing £1000s in today's money would be obsolete in a couple of years. It wasn't just the hardware; the knowledge also suffered from obsolescence. Within 5 years, I wouldn't even have been able to tell you how a sound card worked (though I knew it didn't use the relay that switched the tape recorder on and off to make the sound). Thanks again, Dave.
It made me uncomfortable that you described the data going to the electron gun one character at a time instead of one row of each character across the whole screen
I didn't mean (or say!) that. The one little image shows the scanning of each line, which is how it happens. If your screen says COMMODORE first it draws the top of the C, then the top of the O, the the top of the Ms, etc. Then the next scanline of the C, the O, the M, and so on. It's definitely one ROW of pixels, not a character at a time!
The first computer I ever used was a TRS-80 Model 1 in 1982 in Year 7 at school. All I could do was a BASIC "Hello" program. The first computer I owned was a TRS-80 MC-10 in 1983 with a whopping 4K of RAM. I typed in many BASIC programs and ended up writing some, including a very crude financial accounting program. I learnt how to POKE the screen to do unusual text and graphics. I did some elementary Assembler programming. Yes, that tiny computer could do all that. I later upgraded to a CoCo 2 and then a CoCo 3 with the 512K RAM upgrade module and 2 x 5.25" disk drives in the end, before moving to PCs in about 1990. The good thing about PCs was the open architecture, which lent itself to learning all about it if you wanted or needed to know.
Sadly no one seems to be watching this one, but they like it when they do, so I'd love to! I have to think of some clickbait name to make people click though :-)
@@DavesGarage I think the title might be putting off people who don't care about the PET. If the title were more generic maybe more people would watch.
I used to repair Commodore PETs back in the '80s. I had a reference system with the 40-pin chips socketed, a cabinet with spare chips, and a reference floppy for setting up the heads.
Thanks, sir! I truly appreciate you compressing my ol' brain cells, dusting off the grey matter and making room for some new items in the cracks. I didn't realize how similarly the PET and TRS-80 operated, though I did do some cross-platform translations for a couple of Avalon Hill programs. In those days, I treated other systems as "competitors" instead of embracing their similarities and treating them as cousins. (We do / should learn as we get older.) I wish that we could have shared corridors in Redmond. To stroke a well-deserved ego: you continue to impress me, kind Sir!
I miss when computers used to come with manuals. And those manuals had schematics.
I think the schematic for even a Z68 chipset with an i7-2700K would be stupendous 🙄🤯🤔
@@MrLunithy You can often find leaked schematics for modern computers. While there's a ton going on it's actually pretty easy to understand if you only look at one piece at a time. When chips are as complicated as they are now they don't need much support circuitry other than power supplies.
@Janek Winnicki that's funny, I started programming at the age of 10 on an Acorn Electron. I was highly fascinated that I could make a computer do stuff.
Computers still come with manuals... including a rough schematic. You just need to purchase your motherboard separately (and not a fully-built Dell, for example) to get that manual. You can also download them as a PDF from the manufacturer.
@@JonRowlison "rough schematic": it tells you which holes to plug things into, and each type of plug will only fit one type of hole. You've no idea how the LPC bus connects the ROM to the CPU, or if there even IS an LPC bus, and what it replaced and why. There's still a basic IBM 5150 in there _somewhere!_ However many of its interrupt controllers and timers are emulated in software at some level by some in-chipset CPU (and there's probably at least a couple of those, and you'll _never_ see the source code!). Just the spec for PCIe is only open to members of its Special Interest Group, which will cost your company a shit-ton to join, if you Need To Know, and to download it you search for which _pieces_ of it you're interested in. Possibly nobody in the world has read the entire thing. Certainly nobody in the world understands a modern PC from gate level to the OK button on the screen. And there's stuff beneath the gate level.
Modern PC "manuals" are nothing _near_ as comprehensive as the ones that came with computers back in the day. They had entire circuit diagrams. A modern motherboard might contain, essentially, 1 big chip, that's more complex than 1,000 chips on the original PC. There are billions of logic gates in there.
With a PET or Spectrum or whatever, you could find out what bits to set to get immediate results in the hardware, no OS, no other code interpreting your wishes, you just DID it, right there and then, yourself! You could repair, replace, and upgrade parts that aren't even visible today cos they're all in one gigantic chip, and you'd need something like an atomic force microscope to see the parts if you got the lid off.
Really computers today... thousands or millions of times more transistors, no exaggeration. They all do something. Then there's the millions of lines of code, all interdependent, where you used to have perhaps a few thousand that did exactly what they said they did. Now, a program you write might be running on a CPU like you expect. Or it might be emulated. Or run under a hypervisor, or trapped and partly emulated. On a machine with a similar or massively different architecture to what you thought you were programming for. It might well be running on some cloud server halfway across the world and only part of it's actually running on the so-called user's PC. Parts of it might be operating on different servers at the same time. You and I might be centuries dead, running as AIs in a virtual world a tiny fraction as interesting and sensual as the real one was. And nobody's told us! We really wouldn't know, and software doesn't, can't know.
There is no comparison. Or rather, it's the comparison between a Ferrari and a roller skate, really, on that order.
Pardon all the hyperbole, but I don't feel I've exaggerated at all: computers now really are ludicrously detailed inside, whereas the stuff of the '70s and '80s wasn't so far on from the machines they used in WW2. Go look up transistor counts of PC CPUs from the 8086 to the modern Core Whatever.
The 8088 CPU of the first PC contained 29,000 transistors. Some recent thing from AMD contains 39 billion transistors! Admittedly a lot of that is cache RAM, silicon monoculture rather than the jungle of other parts, but I think it makes a point!
I started a new project and went from c# to embedded c and c++. I went from thinking I understood computers to spending hours relearning what I thought I understood. I fell from the shoulders of giants but gained a giant respect for them climbing back up.
Wow! That went Deep!
70's computer nerds were on another level. Gentleman you have my respect
"When we use a modern computer, we're standing on the shoulders of giants so tall that the details are obscured by clouds."
That's just poetic, man.
There are at least two album names in this sentence
@@luisclaudiofugolin6250 That's what I thought as well when I read it.
@@luisclaudiofugolin6250 where
_"... obscured by _*_clouds_*_ "_ ☁ 🤓
@@RichardNobel lol
I'd love to see you walk through everything that happens from the moment of powerup, through someone typing PRINT "HELLO WORLD" on the keyboard, and finally hitting return and the output shows. It would be quite a journey!
Thanks for the idea! Not a bad one - could start with turning the machine on, then what happens until it goes input idle, then how it parses and executes the line of code. I'll think about that!
Kinda boring imo, the DOS interrupt vectors are more fun. ;)
@@DavesGarage
*Builds TPF file from original sprouting as BMP from default texmod tool*
*Upscales multiple ways and purges the old .log file with removed associations so it's absolutely fresh and associated, while making sure the name matched after removing the non-upscaled exacting format*
I can't get Jurassic Park back online without Dennis Nedry.
-Ray Arnold
The brain inflammation stage from staring at a screen for 65 million years is setting in. I'll cycle through the different compression technologies and see if I can definickize this blasted tool. I'm glad that, with very little effort, I figured at least the above out.
@@DavesGarage
My first lead. Gigapixel is a liar. Compression is TIFF despite the setting of keeping it the same.
@@DavesGarage
The Eagle has landed.
The trick was not using the BMP default, since Gigapixel can't meet that. JPG in the .log file and outputting in Gigapixel worked. My T. rex looks far less compressed. I'll have to figure out the multiple-assets-same-name tree of funsies in time, since it's the only thing left to figure out; then it's just pure drudgery.
This is so valuable. I also grew up in that era and slightly before. By the time I was teaching programming at university, students were already getting abstract information without any base history. It was a little painful to watch. I still teach, and I’m now having to make videos on steps to programming that I never thought would be necessary. Students come to my courses with nearly no understanding of what a computer is or does, even if they are reasonably good at using one. The horrors of tech support are hilarious reading, but also a tale of shame. If you get a chance to read Reddit’s tech support tales, highly recommended.
Perhaps some university programs could have some retro programming classes! Could be fun.
@@wandererstraining My university has some classes that deal with writing assembly and how logic gates and an ALU work.
I don't remember if they ever expanded that into the full picture though.
@@cameron7374 OSSU has in its curriculum a course called NAND to Tetris, which is apparently very challenging but fun.
Nothing on planet Earth warrants a visit to Reddit.
Imagine all of the people who watched TV and had no idea how they worked.
The first time I did much with computing, it was to reprogramme a training simulator for up to 32 staff. I did not have access to the assembler for programming, so I wrote code longhand in hex and entered it byte by byte into memory. You reminded me of how much I have forgotten in over 40 years. I did not have a keyboard map for the function-specific keyboard, so I wrote my routine to map the codes. In contrast, I have just updated my PC from Win-10 to Win-11. I spent the last two days locating, and removing, all the OEM software to render it stable once more. Times change, not always for the best.
It's nice when you have the schematic for something, but even nicer when you fully understand it and know exactly what each part does and why that's useful.
Indeed, but we have so many layers of complexity and abstraction that no single person can fully understand it on anything reasonably "modern" to be useful to an average user today. You can pick and choose which layers are useful/interesting to learn, although some parts have little to no public documentation and are essentially secrets (like firmware and microcode).
The book "Microprocessor interfacing techniques" by Rodnay Zaks is a great introduction to this kind of stuff, first published in the late 70s, it's a brilliant read
I bought a second-hand PET in about 1988 off the company I worked for at the time. The only experience I had then was with cheap home computers such as the ZX81 and Oric-1, plus a week-long 6502 assembly course when I was 13 in 1984. With no instructions, and of course no Internet, it was a great learning experience. I actually managed to program a rudimentary BASIC horse racing game where you could bet on the outcome of each race: each horse was just a number 1-6 moving along a track made of the minus symbol, with a random number generator deciding if each horse moved, based on its odds. If I remember correctly, the entire screen was redrawn on every pass through the code. Years later I managed to get a job in IT support, and knowing how to code certainly helped my career, especially in the early days of DOS and UNIX support. I still dabble in C-based languages to this day. Videos like this remind me of those days, and how exciting computers were before they became a tool just for work, and help reignite the passion :)
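For fun, here's roughly what that kind of BASIC horse racer looks like translated into Python. This is a hypothetical reconstruction from the description above, not the original code: each horse is a digit on a row of minus signs, the odds control movement, and the whole "screen" is redrawn each pass.

```python
import random

TRACK_LEN = 20
NUM_HORSES = 6

def draw(positions):
    """Redraw the whole track, as the original redrew the whole screen:
    each horse is its number sitting on a row of minus signs."""
    rows = []
    for i, pos in enumerate(positions):
        row = ["-"] * (TRACK_LEN + 1)
        row[min(pos, TRACK_LEN)] = str(i + 1)
        rows.append("".join(row))
    return "\n".join(rows)

def run_race(odds, seed=None):
    """odds[i] is the chance (0..1) that horse i advances one space
    on each pass through the loop; returns the winning horse (1-6)."""
    rng = random.Random(seed)
    positions = [0] * NUM_HORSES
    while max(positions) < TRACK_LEN:
        for i in range(NUM_HORSES):
            if rng.random() < odds[i]:
                positions[i] += 1
    return positions.index(max(positions)) + 1
```

On a PET-era machine the redraw-everything approach was slow but simple; modern terminal versions of this game would usually only repaint the cells that changed.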
My career started at Texas Instruments in June of 1980 in the MOS memory group based in Houston, TX.
Static RAM came in two flavors: fast static, like you might think of related to cache, but also slow, as in not-so-fast, static RAM, usually found in 600-mil packages and pin-for-pin compatible (except for one pin) with ROM chips. If memory serves me moderately, they had 250 ns to 150 ns access times instead of the 15 ns access times of the much more expensive 300-mil fast static devices.
My very first project was to reverse engineer the CMOS-based Toshiba 5516, a 16K static RAM, which I did quite successfully, revealing that Toshiba was much farther ahead in the wafer fab process business than my company, the number one behemoth semiconductor manufacturer that TI was at the time. It was an early indicator that these new Japanese semiconductor companies were becoming serious competitors in that industry, especially with that ever increasingly important DRAM chip.
P.S. jump (IBM / Intel 16 bit 8088. The jump to location being the battle Intel and AMD fought over with claims of copyright infringement.)
Hi ChesslsJustAGame. Did you work on any memory test equipment when you were at TI? I believe the testers I was developing were purchased by TI for those parts. The SRAM business was brutally competitive at that time...
Literally the first computer I ever used, and I've always had a soft spot for it. I rebuilt one recently on my channel (barely more than 100 subs), and I got it like a museum piece. I have made the point many times that old 8-bit machines were totally different in that, even as a teenager, I could understand the assembly language, understand the hardware, and understand the OS. I wrote an expansion ROM for the CPC range that added 150 new commands while still at school, and a disc copier better than literally any on the market for the BBC Micro. Videos like this really take me back!
Link to channel please?
@@LordAssClown click on his name
@@LordAssClown You can just click his username or profile picture.
As a 14-year-old, I cut my coding teeth handwriting machine code and punching it into a ZX81 (and then a TS 1000 after I fried the ZX81 when clumsily trying to upgrade its SRAM to 4K :P). Your video reminded me of those old days when you could actually fully understand both the hardware and software of a machine. Zoom forward to 2022 and I have no idea how or what, let alone why, my Windows machine is doing what it is doing :S
The IBM PC, while more complicated, was still comprehensible to me on a software level. The hardware design was well past my understanding. DOS 1-3 and the IBM PC BIOS were well documented, and you could literally debug through most of it and make sense of it. I learned assembly from the examples in the MS-DOS 3 reference, which also had an appendix with the IBM PC BIOS reference. Amazing that someone (or more likely a team) at Microsoft wrote assembly examples to demonstrate literally every single DOS 1, 2 and 3 function!
Or NOT doing. Me too.
@@linusfu515 The nice thing about having libraries that interact with the OS for you is that they might be OS-independent so it'll work on Windows, MacOS and Linux without you having to write it 3 times.
@@linusfu515 I have 3 books on my shelf specifically there for computer stuff: O'Reilly's "Learning Python", Sobell's "A Practical Guide to Linux Commands, Editors, and Shell Programming", and CompTIA's 7th edition A+ guidebook.
Mentioning this, because I personally think you have the right of it mostly on how people kind of gravitate towards Python when first learning, despite having other resources available to them. I don't know if that's a good or bad thing, but I do blame my raspberry pi and its camera for needing to learn Python to use the camera properly (at the time at least) on why it ultimately became one of my first languages being learned; instead of something else like Assembly and C with its variants.
This is where I learned to code; I ended up having 3 of them: a 2001, an uncommon purple-screen 4032, and a SuperPET 9000. I still have all three tucked away.
I love that you took as an example the journey of a character from the keyboard to the screen. This is one of my favorite examples to show the complexity of computers as the character has to traverse many layers of hardware and software. Yet it looks obviously trivial to everyone.
Yeah, I use this with selected people when they complain that the video conferencing room has a hiccup so the people overseas look a bit blurry.
VIC-20 owner here, loved this video. The first computer I wrote any code on was the ZX81. We couldn't afford one at the time (£100!), so I would go to local electronics store after school to read the manual and code on the demo machine on display. Later I got VIC-20 which was cheaper when it came out, and later an Atari 800XL. I learned so much just from magazines and books; I ended up knowing more than my computer studies teacher that he used to let me take home one of the school's computers (Research Machines 480z) over the summer breaks. Ever thankful to that teacher for the encouragement.
Ben Eater has an awesome series of videos on how to make a tiny computer based on the 6502 and breadboards. If you like Dave's videos about that topic, Ben's are a must-watch. They are maintained as a playlist on his channel.
I also recommend James Sharman's builds.
My first programming class used a KIM-1 single board computer with a 6502 and a whole 1K of RAM. I got to where I could perform the startup from memory.
That class led to a field service career for CNC machine tools that were using punched paper tape, but we had to key in by hand the 30 or 40 step program that made the tape reader usable - no ROM. I really miss front panels and bit manipulations. 🖖
@@markcoleman9892 I'm not saying you LIKE pain, but I'm not saying you don't either. :)
Throughout the early 90s, as the only programmer on a project, Visual Basic for DOS made me a hero. In some cases literally overnight.
Dave, thank you so much!
Your story telling skills are pretty much godlike, metaphorically speaking of course.
But yeah, the way you present the information in your stories is so much fun to watch and absorb.
Big hug!
"You may not know every detail but when you have a firm grasp of the big picture, the details you do know can be viewed in context". This is such a good statement I had to type it out. Will definitely be in my head as I continue my education.
Commodore's way of editing was so underappreciated, but so innovative.
who invented screen editing? I didn't think it was Commodore
Ah, you should see Vi !
@@greenaum I use vim, but it can be annoying that, because it's more of a line editor at heart, you have to put it in insert mode once you get to the line you want, and then exit insert mode before you can move on to a different line.
I think my old Apple ][+ did the same thing, including the ReadY thing. (I could be wrong - it's been a REALLY long time!)
It was Microsoft BASIC that did it. The PC's version does the same thing. It also maintains an extra byte in RAM for each line on the screen to indicate whether the text on one line wraps into the next line, so that the screen scraper knows where to stop interpreting.
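That wrap-link scheme is easy to model. Here's a minimal Python sketch (names invented, not Microsoft's actual data layout): one flag per physical screen row marks "this row continues onto the next", so the screen scraper walks rows until the flag is clear and then strips the trailing screen padding.

```python
def read_logical_line(screen_rows, link_flags, start_row):
    """Collect one logical line of input starting at start_row.
    link_flags[r] is True when row r wraps into row r+1, mirroring
    the extra per-line byte the BASIC editor kept in RAM."""
    parts = []
    r = start_row
    while True:
        parts.append(screen_rows[r])
        # stop at the last physical row or where the wrap flag is clear
        if r >= len(screen_rows) - 1 or not link_flags[r]:
            break
        r += 1
    # trailing screen padding isn't part of what the user typed
    return "".join(parts).rstrip()
```

Without the link flags, the interpreter couldn't tell a long line that wrapped from two short lines typed one under the other, which is exactly the ambiguity the extra byte resolves.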
“In the beginning…”, I used assembly language for my coding. I had to write drivers, I/O interfaces, BCD to binary, clock timing, etc. I wrote floating point algorithms for fixed-point machines that were 16-bit words but included 16-bit, 32, 64, 128, and 256-bit using some great algorithms, graphics for screens, radar screens, antenna patterns, all in critical time - real-time is I hit a key and an answer shows up, critical time is x eqns in your time. Any time left over was used for background stuff, so interrupt structures. I loved it - absolute control of everything! I even restrung core memory!
Just blew my mind with the "READY" thing, I used C64 and still have it around, but I've never thought of doing that. In my mind, the interpreter printouts were still somehow magically different than the stuff I typed, even though I know better, when I actually think about it. Cool trick, love it.
Wow, does this ever take me back. My 1st computer was a Commodore PET in 1977. I got it and a "semi useful" manual (guide book) and started my journey by teaching myself programming. My 1st retirement from the tech industry was in 2002, and it ushered in the worst (most boring) couple of years of my life. I went back to work, retrained myself back into relevance, and now run the R&D development network for a major system house. Celebrating my 46th year in high tech this year and don't plan on retiring anytime soon.... thanx Dave. I think I still have my old PET in one of my outbuildings. Gonna have to wander out back and fire it up again soon.....
If the PET could use characters from RAM, it would have to access main memory for a byte around once every microsecond, starving the CPU like the ZX81 suffered. The C64 alternated accesses of main memory between the CPU and VIC, but that would have taken a lot of discrete logic chips to implement (which is why the VIC does so much in the '64), and probably faster RAM chips. RAM chip access times from the 70's have, alas, faded from my memory..
Well the newest LTT video had them trying to use a networked swap partition and that definitely starved the CPU
This is also my theory. You'd have to dual-port whatever RAM the character map lived in. As it is, you had to dual-port the 1K of screen memory. Note the ZX81 didn't only have to deal with dual-ported memory; it actually generated the video directly, so the CPU was quite busy while video was being displayed. It's possible to do the necessary dual-porting, but that would have complicated things quite a bit. So they allowed two different character sets and that was that.
RAM and ROM speeds generally kept pace with CPU clock speeds. When the specifications demanded it, you could opt to spend more money and get memory chips with faster timing.
To switch which device was using the address and data buses, you would use tri-state buffers, multiplexers, transceivers using low power Schottky transistor-transistor logic chips (LS TTL). The three states are Hi, Lo, and high impedance (the tri or third state). The tri-state high impedance state would allow some other chip to drive the given bus lines high or low.
A bus conflict happens when two devices are driving, say, the address bus at the same time, causing a short-circuit state on the lines where one driver is pulling a line low while another device is trying to pull it high. Chip damage would happen if this situation lasted too long and made the power dissipation per chip exceed design specifications for long enough to "smoke" the chip drivers on a given pin, or maybe the whole IC would fail and need replacement with a working one.
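The tri-state behaviour described above can be sketched in a few lines of Python (a toy model of the logic, not real electronics): each driver on a bus line is either driving high, driving low, or in the high-impedance state, and two active drivers disagreeing is the destructive conflict case.

```python
from enum import Enum

class Drive(Enum):
    HI = 1      # actively driving the line high
    LO = 0      # actively driving the line low
    Z = "z"     # high impedance: not driving at all

def resolve_bus_line(drivers):
    """Resolve one bus line given each chip's drive state.
    Two chips actively driving opposite levels is a conflict
    (the short-circuit 'fight' described above)."""
    active = [d for d in drivers if d is not Drive.Z]
    if not active:
        return None              # floating line, nobody driving
    if all(d is active[0] for d in active):
        return active[0]         # all active drivers agree
    raise RuntimeError("bus conflict: opposing drivers on the same line")
```

The whole point of tri-state outputs is that bus arbitration logic only ever enables one driver at a time, so the conflict branch should be unreachable in a correctly designed system.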
Positive holes flow toward negative polarity ground potential causing unnecessarily high current flow or a short circuit. Any downstream device listening to this electrical fighting would likely see the ground or negative driving circuit win these fights because TTL chips had lower impedance to ground polarity.
The internal circuitry of the TTL outputs would be specified in the TTL data book that listed the given chips. Usually the outputs used the totem-pole configuration (in standard TTL this is actually two NPN bipolar junction transistors driven in antiphase by a phase splitter, rather than a true complementary PNP/NPN pair).
The upper schematic transistor would be tied to +5 Volts collector circuit voltage or Vcc through about a 260 or so Ohm resistor, whereas the lower schematic transistor would have its emitter tied to ground through a pretty low valued resistance like 10 Ohms or so.
The P and N junctions for bipolar transistors are formed by doping the semiconductor material with either a donor or an acceptor element, whose outer electron shells have either excess electrons or hole slots in their outer orbitals. In the crystal lattice, each silicon atom's neighbors (top and bottom, front and back, left and right) contribute these excess electrons or excess holes (positive charge carriers).
Flow of charges in junctions would only be in favor of one direction.
The physics says that if the charge flow opposes and depletes the charges in the native PN junction field, the flow shall continue unabated.
However if the charge flow being attempted reinforces the doped field junction, then the electric field magnitude would just grow bigger and bigger until the voltage was so high that breakdown would suddenly allow the current to flow backward through the junction.
If the current flow magnitude was too large through the junction, excessive heat shall smoke the device unless the current flow was limited to sane levels within designed specifications allowable by heat dissipation abilities of the device. This would be the situation in a reverse breakdown diode junction known as a Zener diode where the reverse breakdown voltage would be a known calibrated X number of volts.
Field effect transistors, or FETs, use by design either a channel through which flow is made to happen by enhancing the conductivity of the channel (enhancement mode), or a channel through which flow is pinched off or blocked by depleting the field (depletion mode). The channel can be made to carry positive holes, as in P-channel, or electrons (negative charge carriers), as in N-channel FETs.
The control lines into these various bus switching governing chips would determine who was allowed to talk or listen to the given addressed memory or I/O component.
Way back in the late 1970s the speed of core memory (non-volatile magnetic RAM) was in the low microseconds. Semiconductor memory was about 1 microsecond down to about 450 nanoseconds. Each successive year the memory speeds got faster for semiconductor RAM. The DRAM chips for the Apple II were around 200 to 300 ns.
The first 386 processors at 16 MHz were using 150 ns DRAM chips in sockets on the mainboard and expansion RAM boards. Expansion RAM boards in the early PCs would not necessarily be able to use ultra fast static RAM cache sets, nor could they use faster SCSI bus mastering DMA controller boards unless specifically designed to do so on the newer 486 mainboards with premium level chip sets and bus expansion slot standards that would permit such stuff.
CPUs and chipsets started to be fabricated using advanced high speed CMOS--complementary metal oxide semiconductor. The signaling used smaller current flows to swing the voltages involved. MOS transistors are field effect transistors; complementary NMOS and PMOS transistors form CMOS.
The fields on the signaling gates would either use depletion mode or enhancement mode to turn the transistors on or off. Instead of base input, collector or emitter outputs, the newer transistors would have gate input, drain and source outputs.
Gates would require field voltages rather than input currents to do switching. The only current flows required were to charge or discharge the parasitic capacitance involved in the small gate circuitry. Leakage currents were reduced with each newer generation, thereby lowering power consumption.
DIMM RAM started to come out for the Pentium series CPUs in the mid to late 1990s. It had RAM speed at or faster than 80 ns and eventually in the early 2000s was running close to 7 ns.
Eventually double data rate (DDR) chips started to come out that would do block transfers to and from the CPU's internal cache a bunch of words at a time. DDR2, DDR3, DDR4, and so on.
It turns out that every step forward was made by shrinking the chip die with each new generation, making the transistors smaller and closer together, with less parasitic capacitance and inductance delay. A lot of this was done initially with semiconductor etching masks made of shrinkable fabrics, where the pattern was transferred, and then the mask was shrunken and applied to the semiconductor wafer layers.
As these shrinking advances came in, so did power supply voltage drops come into play.
The switching times of the complementary totem-pole transistors determined the power used, because as the signal switching frequency increased toward and beyond the switching speeds of the two transistors, there was more and more of a window of time when the bottom and top of the totem pole could be on at the same time, drawing current straight from Vdd to ground through both transistors, causing momentary short circuits and huge power dissipation. With bipolar junction transistors such heat can cause what is called thermal runaway, where excess charge carriers prevent the transistors from shutting off the current flow without a much greater base input current to attempt to shut the flow off.
From what I can recall, CMOS field effect transistors do not tend to have thermal runaway and in fact the increased temperature would tend to somewhat increase resistance to such flows.
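The power story above boils down to the textbook first-order CMOS switching-power formula (my addition, not from the comment): dynamic power scales with load capacitance, the square of the supply voltage, and frequency, which is why the supply voltage drops that came with each shrink mattered so much.

```python
def cmos_dynamic_power(cap_farads, vdd_volts, freq_hz, activity=1.0):
    """First-order CMOS dynamic power: P = a * C * V^2 * f,
    where a is the switching activity factor (0..1).
    Shoot-through and leakage currents add on top of this."""
    return activity * cap_farads * vdd_volts ** 2 * freq_hz

# Because of the V^2 term, halving Vdd cuts switching power
# by 4x at the same frequency and capacitance:
p5v = cmos_dynamic_power(10e-15, 5.0, 1e6)    # 10 fF gate at 5 V, 1 MHz
p25v = cmos_dynamic_power(10e-15, 2.5, 1e6)   # same gate at 2.5 V
```

This quadratic dependence on voltage is the main reason logic supplies fell from 5 V to around 1 V over the generations described here.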
Let me refresh your memory: 150 ns was common, and anything faster got exponentially more expensive. In the end 286s etc. ran 60 or 70 ns RAM, some 10+ years later.
I would solve that differently: if the character ROM was really RAM, it could be mapped into a block and CPU access would take priority over the character generator. Given that the system would load this RAM from ROM at boot, the extra blank screen at start would go unnoticed. Given that dynamic access to the character RAM would be unusual and minimal, it could be timed during the blanking period without creating visual aberrations.
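The scheme proposed here amounts to queueing CPU writes to the character RAM and applying them only during the blanking interval. A hypothetical Python sketch of that idea (not any real machine's design; names invented):

```python
class CharRam:
    """Sketch of a RAM-based character generator where CPU writes
    are deferred to vertical blanking, so the display circuitry
    never sees a half-updated glyph mid-frame."""

    GLYPHS = 256
    BYTES_PER_GLYPH = 8   # 8x8 pixel characters, one byte per row

    def __init__(self):
        self.ram = bytearray(self.GLYPHS * self.BYTES_PER_GLYPH)
        self.pending = []   # (address, value) writes waiting for vblank

    def cpu_write(self, addr, value):
        """CPU-side write: queued, not applied immediately."""
        self.pending.append((addr, value))

    def on_vblank(self):
        """Video-side hook: flush queued writes while nothing is drawn."""
        for addr, value in self.pending:
            self.ram[addr] = value
        self.pending.clear()
```

On real hardware the "queue" would more likely be a simple write-enable gated by the blanking signal, but the effect is the same: the CPU and the character generator never contend for the same cycle.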
A series on the C64 (assuming there is too much for a single episode) would be amazing. Interesting stuff as always; I look forward to seeing more in the future.
The 8-bit Guy did an entire series on Commodore computers.
@@eriksiers true. But Dave has a different perspective that would nicely complement David's.
Definitely look up the 8-Bit Guy, he's got so much on old computers, Commodores, and the like. He will definitely fill what you're looking for.
Dave is the man!! Really enjoy these trips down memory lane. I was a TRS-80 Mod-1 guy and loved Z-80 assembler.
I love this. I could listen to you for hours talking about PETs and C64s, and enjoy myself while doing so.
I hope you continue with this old school stuff now and then! :)
The PET was the first actual computer I ever saw and got to play a bit with, back in the late 70s. I was totally hooked.
A few years later and after endless begging, my father got me a C64. The world changed forever that day.
hear hear!
Absolutely right, I think it was around the mid 90s for me, I was brought up building and designing my own computers in the 70s and 80s. Once we hit 80386 protected mode and NT, and I could no longer bit bang the parallel port without having to hack a driver, I lost my grip of having a comprehensive understanding the entire stack.
I did write display drivers for Windows 3.0, but only a very few basic drivers for NT, and of course, the driver model kept changing. Thankfully since WinUSB we hardware guys can largely avoid writing drivers these days, but it's taken a couple of decades of USB to get here.
Regarding the PET's character display, AIUI the character generator ROM isn't memory mapped; it's on its own private bus. There isn't enough time to fetch the character and look up the character's bitmap on the same bus.
The 6502 has a dual phase clock. On one phase, the CPU owns the bus (eg, instruction and data fetches/stores), and on the other phase the video can use the bus. As the clock is 1MHz, or a period of 1us, there's 500ns of bus access time for the CPU and then 500ns for the video. RAM and ROM of the day had typical access times of 250 to 450ns. By the time you've added the glue logic propagation delay times, there's not enough time to allow for any more than those two memory accesses inside that 1us.
Thus, to achieve the timing constraints of video refresh and the 1MHz clock, there's a second address/data bus, physically inaccessible from the CPU's buses, for the character generator ROM.
The dual phase clock of the 6502 architecture was a key reason for its popularity among manufacturers precisely because you could easily memory map the VDU memory and avoid timing contention with the time definite phases. Life was a bit harder for hardware designers on processors like the Z80 which doesn't have this feature.
Shoving the character ROM on its own bus certainly wouldn't be all that odd a choice: the NES did the same with tiles, and I think the IBM PC monochrome adapter _may_ have done the same with its character ROM. It would always have been possible to add some circuitry (e.g. a DMA circuit in the video processor) to get characters moved, but that would have (slightly) complicated the video circuitry, and would have required extra chips (and as I recall, ROM was cheaper than RAM).
This is why I migrated to Linux relative early. I even own the T-shirt "Ich bin /root, Ich darf das!" = "I am /root, I am allowed to do that!"
I also stopped developing hardware stuff for the PC and migrated to embedded systems only using Ethernet and browser based interfaces. Less pain and no sweat anticipating the next OS upgrade.
@@CC-ke5np thank you for teaching us German
I am a kid who is into computers but has no in-depth knowledge, and I'm able to understand what is being talked about in the video. I don't have a full grasp yet, but I am able to register it in my brain, so I guess you did a great job.
I do foresee that this video has a very high CTR but that people will be clicking off early on, because it's big-brain stuff, folks.
What a great video. Takes me back to my younger days when I was all in on programming my C64 in BASIC as well as in 6502/6510 assembly language. Back then I really did know almost everything there was to know about how that computer functioned.
Nice video!
Reminds me of 1981 or so. Got a Tandy TRS-80 with 4K RAM. The same evening I replaced the RAM with 16K chips. Later I upgraded it to 48K by soldering 2 RAM ICs on top of each RAM chip and adding an address decoding circuit, all piggy-backed on the original ICs. Next came a lower-case character generator, and after that an APL character generator for the specific symbols of the APL language.
2 pcs of 5 1/4" floppy drives with a home-built controller. A real-time clock and a 75/1200 baud modem.
That was indeed the last system I understood 100%.
It used to be more fun when we could still at least pretend to understand the whole stack, from digital circuits to all levels of software. It was possible, and even beneficial, to become a software or systems person by starting from digital circuits and hardware and moving upwards.
that's what I'm doing right now lmfao.
Getting a shit ton of TTL chips and making my own computer, and then moving on to an OS.
You still can, although it requires quite a bit of patience, but at least there's a ton of easier-to-reach documentation on how to do it today.
Want to build your own TTL computer? You can; there are even entire series on YouTube on how to design and build your own custom CPU if you want. Want to write your own OS? There are tons of wikis, tutorials and books on the subject, and entire open source projects that you can download for free. Want to build an 8-bit or 16-bit computer? There are kits you can buy and build, and tutorials on how to design your own.
Microcontrollers (and their development kits) and basic logic analyzers are essentially commodities now. And FPGAs went from a thing you could only dream of, unless you worked for very specific fields, to something you can buy a decently sized one for a reasonable price and build quite complex digital circuits with it. There are even video series on how to develop cores for FPGAs today and tons of open source cores that you can learn from.
If you are so inclined, take a look at the MiSTer FPGA project, lots of retro gaming fun and also a very nice learning tool if you want to either dabble or deep dive into digital circuit design. Sure, it's "expensive", but still costs less than a modern video game console or a gaming computer and I promise you, it will give you way more hours of fun than those other options.
@@gcolombelli
If you build your own, that is different situation. Also open source and open designs are different. I agree to that.
I used to work with closed source systems mainly.
But having access to all hardware levels and low level interfaces and functionality is not so accessible, for commercial reasons.
Nowadays there are exceptions, I agree.
In my university years I had access to OS source code from DEC and other companies. But later even working in a computer company, getting access to the source of the software I was selling and supporting was restricted. Open source wasn't really part of the business plans of that company.
In-house CPU development was even more walled off, and information about it even more restricted.
Thank you for sharing your priceless experience with software engineers all over the world. Greetings from Morocco.
Heyy, I'm also Moroccan!
Thanks, Dave, for yet another amazing episode. What a rich plethora of information that I, like most other grandpas, have forgotten, or at least pushed further back into the recesses of near oblivion I call my memories.
I thought it was going to be about things for common users. I didn't understand a thing, but it was good hearing you.
I used the kernel routines on the VIC-20 to print to the screen; that way, when a memory cartridge is installed and the screen memory location changes, the code still works, because the kernel routine adjusts for that.
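The point above can be sketched in Python (this is a toy model, not real VIC-20 code): a program that pokes a hardcoded screen address breaks when the screen base moves, while one that calls the kernel's print routine keeps working, because the kernel looks the current base up. The $1E00/$1000 screen bases are the real unexpanded/expanded VIC-20 values; the class and method names are invented for illustration.

```python
# Toy model of why calling a ROM routine beats poking a hardcoded
# screen address on the VIC-20, where the screen base moves when
# RAM expansion is fitted.

class Vic20:
    def __init__(self, screen_base):
        self.ram = {}
        self.screen_base = screen_base   # the kernel tracks this

    def kernel_print(self, offset, code):
        # kernel routine: always resolves the *current* screen base
        self.ram[self.screen_base + offset] = code

naked = Vic20(screen_base=0x1E00)       # unexpanded machine
naked.kernel_print(0, 0x01)             # screen code for 'A'
assert naked.ram[0x1E00] == 0x01

expanded = Vic20(screen_base=0x1000)    # with RAM expansion fitted
expanded.kernel_print(0, 0x01)
assert expanded.ram[0x1000] == 0x01     # same call still lands right
```

A program that instead did the equivalent of `ram[0x1E00] = code` directly would write into the wrong place on the expanded machine.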
I remember discovering software interrupts when I was 12 years old. I bought all the books I could find on anything PC. In one summer I made TSRs to cheat in games, made a program to add echo to live microphone input, wrote a simple graphics library to use all VESA modes on a Tseng labs 4000 and read data from a CD-ROM.
Best summer ever!
Just one summer later I had internet connectivity and Pathfinder landed on Mars.
Epic times.
I find this all interesting because I was a kid when Windows 95 was released. It's interesting seeing the trade off going from the PET to the VIC20. The problems I faced when I was younger was: trying to use the Compaq my mom bought in 96 through the year 2005 because I had no money to buy a computer, and everything was a hand me down
In my early teens, I had a Sinclair ZX81, followed by two Atari 8 bits. Although I never got in to assembly language, I did have a lot of fun programming. Poking values from BASIC, altering characters, setting bits on the video and sound chips on the Atari to achieve different effects etc. References like "Mapping the Atari" and "De Res Atari" were incredible resources on the system architecture and OS, even if I only understood a fraction of it. Nowadays, everything is a black box, closed off and accessed by a gazillion gigantic libraries, and most kids today have no idea about bits and bytes.
The PET the was first computer I used and I was hooked when it dawned on me that I could make it do anything I wanted. I just needed to learn how and I'm still learning today thanks to the highly complex hardware and software systems of today. Today I'm learning about Serial Peripheral Interface (SPI) programming so I can unbrick a BIOS.
I didn't enter the computer field at the PET stage. I entered in 1982 with the HEATH model H89 running a 64k CP/M system with a Z80 CPU (2MHz). Even within 64k RAM, the thing could run a good word processor that, surprisingly, had many useful features considering it ran in what was left of the 64k after the CP/M system filled its share of the RAM. I even had an MBASIC compiler and a FORTRAN compiler on that H89. Storage used 5“ floppies, hard- and/or soft-sectored.
Loved the PET. Cut my programming teeth on it at School. Ducking out many other classes to be found in the computer lab (2 PETs) coding away with no idea what I was doing but generally succeeding anyway.
Just found your channel (by random chance, I just had the utube on for background noise as I was playing on my MiniPET trying to fix a BASIC program) and holy crap, I think I could listed to you for hours just talking about this stuff.
Growing up during the 8-bit era of computers was definitely a special time period because you could have access to nearly every byte in memory at will on most machines. The only OS that even comes close to emulating this behavior in the modern era might be TempleOS, though that is a very obscure, fringe case.
I spent Friday nights in the downtown Toronto computer store Batteries Included in the mall Village by the Grange. where Jim Butterfield introduced a wide eyed Grade 9 student the Commodore PET including music! Awe struck. In the 90's he wanted me to digitize his First Book of KIM. Wonderful mentor. RIP. JIm Butterfield.
The reason the PET didn't have user defined characters is that the character ROM isn't in the 6502's address space, it is hard-wired into the video circuitry. You can't read the character ROM on a PET from a program even if you wanted to. The video circuitry in the PET essentially operates asynchronously on its own, with the CPU grabbing access to the video RAM when it wants it - do this at the wrong moment and you'd see "snow" on the screen where the wrong data gets shifted out to the electron beam! That is why BASIC usually waited for the vertical retrace interval before writing to the video memory.
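The retrace trick described above can be sketched as follows (a toy model with made-up timing, not PET-accurate hardware): spin until the beam is in vertical retrace, then write to video RAM, so the write never collides with the video circuitry's reads and produces "snow". The $8000 screen base is the real PET value; the `Crt` class and its tick counts are invented for illustration.

```python
# Toy model: video RAM may only be written cleanly during vertical
# retrace; writing mid-scan would produce "snow" on a real PET.

class Crt:
    def __init__(self):
        self.tick = 0
    def in_retrace(self):
        # pretend the last 4 of every 20 ticks are vertical retrace
        return self.tick % 20 >= 16
    def advance(self):
        self.tick += 1

def safe_write(crt, vram, addr, value):
    while not crt.in_retrace():   # busy-wait for the blanking interval
        crt.advance()
    vram[addr] = value            # beam isn't reading VRAM right now

crt, vram = Crt(), {}
safe_write(crt, vram, 0x8000, 0x20)   # $8000 is the PET screen base
assert vram[0x8000] == 0x20
assert crt.in_retrace()               # write happened during retrace
```

The same busy-wait-then-write pattern is what the comment describes BASIC doing before touching video memory.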
Later PETs, the 16K and 32K models (2016, 2032, 3016, 3032, etc.), did use dynamic RAM. These are often referred to as the "Dynamic PETs" on forums.
I suppose that RAM was also a precious resource, and by using ROM, more RAM would be available for programs.
Dave, this is such a great walk back through history. My own experiences reflect those early days, when you were close to the system, writing an input routine to grab the keystrokes, then moving on to the next task and being back in time to pick up the next keystroke. Thank you for putting a wide smile on my face.
Nice, bring back the 8 bit days, I loved being able to have my head around 95% of my machine.
"Character gen ROM not in memory map": same as my TRS-80 Model 1. Having the actual character 'ROM' in RAM was a problem back then; the video refresh circuit has already pushed the CPU aside so as to read the video RAM, and now it would have to also index back into RAM to get the character patterns for that screen position. Usually that was too much bandwidth for the main address bus to handle. I did it as a hack mod on my TRS by having my programmable character 'RAM' chip as a dedicated character-generation IC that was not 'normally' in the main memory map, so its operation did not affect normal memory bus operations. However, when I wanted the CPU to write to it, it was quickly switched from the video circuit back into the main memory map.
On my TRS, there was not a handy 2K hole in the memory map (I used a 6116 2K RAM chip), so I mapped it on top of the 12K BASIC ROM. If you do a read, you get the ROM; if you do a write, it goes into the character-gen RAM. The only downside is that I could not read that RAM, only write to it.
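The overlay trick above can be modeled in a few lines of Python (a sketch of the idea, not the actual TRS-80 circuit; the class name, base address, and fake ROM contents are invented): reads from the region always hit the ROM, while writes are diverted into a shadow character-generator RAM the CPU can never read back.

```python
# Sketch of a read-ROM / write-RAM overlay: the 2K character RAM
# shares addresses with the ROM but only ever sees writes.

class OverlayBus:
    def __init__(self, rom_bytes, base=0x0000):
        self.rom = rom_bytes             # e.g. a 12K BASIC ROM image
        self.char_ram = bytearray(2048)  # write-only 6116-style 2K RAM
        self.base = base

    def read(self, addr):
        return self.rom[addr - self.base]     # reads always hit ROM

    def write(self, addr, value):
        # writes are diverted into character RAM, mirrored every 2K
        self.char_ram[(addr - self.base) % 2048] = value

bus = OverlayBus(rom_bytes=bytes(range(256)) * 48)  # fake 12K ROM
assert bus.read(0x0005) == 5        # ROM contents visible
bus.write(0x0005, 0xAA)             # lands in char-gen RAM instead
assert bus.read(0x0005) == 5        # ROM unchanged: write-only RAM
assert bus.char_ram[5] == 0xAA
```

In hardware this is just address decoding: the chip-select for the RAM is gated on the write strobe, and the ROM's output enable on the read strobe, which is why the RAM ends up unreadable.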
In the PET days, adding all of this would have made the circuit board somewhat bigger. By the time the C64 came out, Commodore was able to make its own custom chips, making this a lot easier, as the circuit board could be shrunk quite a bit. The downside for the C64 is a few 'unobtainium' ICs that make repairs a tad hard. (Yes, I know there are modern solutions now, but only due to the C64's popularity.)
I did something similar on my ZX80 in around September 1980. I had a little board that I designed that plugged into the ROM socket, but also added some extra logic and a couple of 2114 RAM chips. My ZX80 had the normal 64 characters, plus 63 user defined characters and their inverses as well as the inverse of the 64th user defined character that you could not display normally due to the way the ZX80 display generation worked. Fun days!
After 7 years of being a computer enthusiast, when I completed my Computer Science degree in 1980 (which makes me the same vintage as this presenter), I had learned most of this stuff. Additionally, my first few years of professional low-level process-control SW development taught me all about 8-bit computers... and I mean ALL. Interrupt handling, O/S structure, assembly, machine code, how input and output are multiplexed, RS-232, and a myriad of other stuff. I am SO GRATEFUL to have been on the ground floor of microcomputers.
You haven't lived if you haven't put a 16 channel logic analyser onto a single board computer, looking for why a data fetch hasn't worked correctly :)
At one time the university I worked for moved from a mish mash of early computers, mainly the Commodore PET, to PC clones.
In the basement of one building they had several hundred Commodore PETs stacked with shrink wrap on wood pallets because they couldn't get rid of them!!!
No other institution wanted them. Illegal to send them to the landfill. They were offered free to employees and students but there were few takers, including me!
I believe they eventually had to pay a recycler to take them. These now sell for well over $1000 on eBay!!!!! Coulda, woulda, shoulda!!!! AAArrrggg
I feel your pain. I still kick myself often for not taking the opportunity to snag my family's first computer, an Apple II Plus, when my dad was cleaning out his old house prior to selling it. I also enjoy watching videos of car enthusiasts restoring antique cars. Hearing restorers talk about their finds, I used to find myself shaking my head and wondering why such beautiful machines were just abandoned and left to rot, referring to both the computers and the cars. I have to stop myself and remember that at the time they were replaced, they were just old, obsolete machines, often crippled by problems that weren't worth fixing at the time. They couldn't run modern productivity software, Internet access was beyond them, and there were thousands more just like them here, there and everywhere, so except for special editions, they weren't worth anything. And a quick look at eBay suggests that these old gems still aren't worth much to the market at large, and probably never will be.
I currently have a Compaq Deskpro from 1998 or '99 that was retired just a little over a year ago, long after most of its peers had been chucked. I'm currently using it to demonstrate some long-gone storage media to a group of high school students. I'm preservation-minded enough that I'm not going to throw that Deskpro out when I'm done with it, especially since it's in pretty good working order. As far as what I *will* do with it, that's a good question. It's too old to be useful, but too new to be a museum piece.
@@mar4kl Absolutely. I'm seeing 5.25" floppy drives, old sound and video cards, and memory sticks bringing in crazy $$$. Granted, these are not making anyone an instant millionaire overnight, but I personally have thrown away many of these items, and it's painful to pay any $$$ for stuff you once trashed!! What are we throwing away or neglecting today that will have value in the future? It's just difficult to know! As for your Apple II Plus: over time they, along with the first Macs, joined the pallets of unwanted computers.
Thanks for the walk down memory lane. I started with the Commodore 64.
I recently did a hobby project for my IBM PC-XT and wrote an assembly language program to interact with the battery-backed clock board on the I/O bus. The original program was long gone from the hard drive, which required low-level formatting to bring back to life.
I had to trace out the board, create a schematic, and then look up the spec sheet for the chip (MM58167). I used MASM to write the code. The project was very satisfying and a lot of fun. The last time I did any assembly language coding was 30 years ago.
I'd love a deep dive into the c64. In fact, a series on the c64, maybe walking us through a complete program start to finish, would be freakin' awesome!
I miss my machine code programming days on the VIC-20 and C64. Writing entire games that wouldn't even fit into the memory space needed for a simple icon nowadays was very rewarding, especially for a 14 year old kid.
Dave, from another old guy, you're the man! Thanks for this! It was great!
Glad you enjoyed it!
I love it! Yes!!!!! This is exactly what I try to express to people why I love vintage computing. It was simpler back in the day. When I teach cyber basic classes for my organization, I try to teach these young kids (25-35 is young by my standard) they miss the elegance of the old equipment. For sure, cool to have a fast computer in your pocket (mobile device) but everyone has become a user and so few understand the what's happening under the hood. It was easier then to be a mile deep in many areas of computing. Now you can only be an inch or so deep in multiple areas and you have to choose a specific area to go a mile deep. Sure more sophisticated? I'm not sure. I miss the days of actually being able to understand, diagnose, and repair.....like you did with your PET. Cheers
Very well presented. I admire your dedication to these legacy systems. They serve well as a way of explaining the basics of computing. I really enjoy your channel.
Thanks for the kind words!
My first computer was a Commodore 16. I still use the skills I developed with it. Awesome times.
Thanks for such a great video!
Keep up the good work. Really enjoying your videos.
Very cool video. I started with an RCA 1802 Elf in the 1970s. A friend and I built a system to replace a controller board for an IBM Selectric printer, with 2K of 2102s and serial I/O to drive the solenoids that controlled the mechanics from a serial input. We built our own hardware emulator and programmed in machine code, including burning an EPROM. The 1802 was a good platform since it was static, so we could control the clock and step through the routines to get the timing right. I agree with you that it is helpful to have a knowledge of how these tools work. It was fun, but I enjoy the current capabilities of running MS SQL and TensorFlow.
"Standing on the shoulders of giants" - great write up!
I always appreciate your explanations. Having grown up in the Commodore era, it's always worth it to me to sit back and listen and remember the amazing things these computers were able to do. If we could share photos here on YT, I'd gladly post a photo I found last year of me and my brand new VIC-20 on Christmas 1982. I would've been 12 years old. That quickly led to acquiring a C64, multiple 1541s (because I ran a BBS at 13-14 years old), and eventually a C128 with a couple more 1571 drives. Back in the early 90's, I found a fully working PET in the trash. I knew it worked because it came home with me and I plugged it in. All of that equipment landed in the trash. How I wish I could go back and convince myself to store it someplace safe just so I could play with it today. Who remembers person-to-person transfers on Q-Link? :)
When Dave wears the jacket, he gains +4 speed and agility.
DAVE: let's get back to basics
DAVE: *crash course in concepts considered advanced by beginners with minimal context for jargon*
Life was good when I could stop toggling the boot loader into the front panel of a PDP-8, and use the boot loader in Core RAM in a PDP-11. Imagine my delight when the bootloader and monitor were burned into EEPROM in my 1975 Digital Group Z-80 computer that I hand-soldered together, all 1024 bytes of static RAM, later expanded to a voluminous 8 kbytes. It would load and save using audio tape. I wrote a lobby kiosk annunciator board app, in hand-written machine code, for that sucker.
Booting Windows or Linux from SSD is an incredible luxury! (By the way, get off my lawn).
Dave, a lot of the first generation of "homebrew" hackers and gamers went through the knowledge bootstrapping process. Many even independently invented or discovered how to build computers from scratch. Early software and hardware were so intertwined that it was not possible to know one without the other. Computer magazines in the 80s actually printed complete program source code and circuit diagrams, and Byte magazine was our industry bible.
I once tried to explain old-school bootstrapping all the way from power-up to sector zero loading and jumping through the vectors in the MBR.
The person I was explaining it to just looked at me like I was purple and didn't understand a word I was saying! KIDS TODAY! SHEESH!
Everyone should know how it happens!
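The power-up-to-MBR sequence mentioned above can be sketched in Python (a hedged model of the checks only, not real-mode execution; the function name and return string are invented). The fixed facts it relies on are real: the BIOS loads the 512-byte sector zero to 0000:7C00 and requires the 0x55 0xAA signature at offsets 510-511 before jumping to it.

```python
# Sketch of classic PC bootstrapping: BIOS reads sector zero (the
# MBR), validates the boot signature, then jumps to the loaded code.

MBR_SIZE = 512

def bios_boot(sector_zero):
    if len(sector_zero) != MBR_SIZE:
        raise IOError("short read from sector zero")
    # the two-byte boot signature lives at offsets 510-511
    if sector_zero[510] != 0x55 or sector_zero[511] != 0xAA:
        raise IOError("not bootable: missing 55 AA signature")
    return "jump to 0000:7C00"   # where BIOS placed the 512 bytes

mbr = bytearray(MBR_SIZE)
mbr[510], mbr[511] = 0x55, 0xAA
assert bios_boot(mbr) == "jump to 0000:7C00"
```

From there the tiny MBR code typically loads a bigger loader, which loads the OS proper: bootstrapping by stages, exactly as the comment describes.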
I can make a BASIC Commodore 64 program, but I never learned any POKE commands except the one to change the screen colour. I'm Australian and we have the extra U too! Anyway, as long as you don't want fancy graphics or sound, I'm pretty sure I could knock up any program you need. Mind you, trying to make a spreadsheet application in BASIC... I'm not sure I could do that. It would be interesting to try. Anyway, I'm just leaving a comment to boost user engagement. I love this channel and I loved this video. Yes, more, please!
I found it easier to do calculations in BASIC than using a spreadsheet program. :-)
Love this retro deep dive, please keep them coming!
Back in the early '90s I was at one of two startups doing compression that could be used on disks. Thanks to books like Undocumented DOS and a penchant for disassembling code, I was able to contribute to that effort.
Once we had a product, and I thought about what happened to Lotus 1-2-3, I walked into the boardroom and told the guys we had 2 years to make money on this, and then Microsoft was going to have this compression in the operating system. It's as sure as Moore's law.
So YOU WERE ONE OF THE GUYS THAT MADE MY PREDICTION COME TRUE!
Bravo, much respect!
Would love to see you collaborate with Bil Herd; he's an ex-Commodore engineer (though he was there after the PET), and he always has amazing stories to tell!
I remember writing my first basic program from the commodore 64 manual. The hot air balloon pixel graphic that would float across the screen. I then spent days writing code to use the keyboard to move it around.
So much fun as a kid with tech that seemed like it was from the future.
Thanks Dave for bringing back fond memories. Keep it up.
If you've made it this far to get to this video, you're doing alright for yourself.
I remember cutting my teeth on a TI-99/4A back in the day. I didn't understand it at this level, because I was too young and there wasn't an Internet. BBSes came a bit later for me, but the early days were spent in TI Extended BASIC.
Thanks Dave, for shedding light on the old school.
Actually, I knew 99% of all that; but nevertheless the video was still fascinating and nostalgic.
I knew 100% of everything and still found it interesting. 😉
I knew 90% of it and had fun doing it :-)
I knew 0% and enjoyed myself 100% of the time
Takes me back to the early 80s when I learned the same things about the pets we were using for industrial machine controllers in the UK.....thanks for reigniting those memories
Dave really should set up an old-school BBS for his userbase (rather than a Discord or the like); I think it would go down pretty well :P
Perhaps not with a phone dial-in though; some sort of emulation over TCP instead :)
In fact... it could be done: a standard BSD server, users sign in with ssh as expected, then they use minicom or similar to reach a local BBS... this must be tried!
Even IRC would be a good midpoint: old enough that you can get old hardware to talk to it, but not so old that modern hardware can't also talk to it! You can even get smartphone apps for IRC, just!
I used to help a friend run an Acorn BBS with 3 lines, allowing a whopping 4 people to chat together - well until their parents got their monthly phone bill and banned them 😏
In the *nix world there'd be no emulation required to work over TCP and/or serial line.
I bet he already has 😜
Something I constantly have to teach students: input/output. They get that a little with PHP or with CICS. But in normal programming, they ignore the WFA form (or WPF, etc.) until the last minute. I was always taught to use spacing charts to plan I/O before coding (partially because it helped plan some of the variables).
When you mentioned not understanding some of the other sublayers and bits of the system, it got me remembering a thing I watched a while back about ALSA (Advanced Linux Sound Architecture). The discussion was about how, without some added layer of control modules and extensive system components between him and ALSA, he would have virtually no idea how to make it useful. And that's just the sound. Imagine also trying to interface at the machine-code level with a modern graphics card, with all of its complex functions. 😳
Everything has been reduced to specialized modules that entirely do things for us. The problem is that those modules are extremely power-hungry, when you consider resource uses.
I started using computers when an entire OS would boot from a floppy. Now single video games are significantly bigger than most of the hard drives I used for the first 10 years of using computers. 😳
Modern video game sizes shouldn't really be compared to those old disk drives, because most of the consumed space is graphics (whether images or 3d models).
I had not heard the name Jim Butterfield in a long time. I had some books of his learning how to program the C64 1541 floppy drive. Thanks for the trip down memory lane.
7:34 I think it is a bit misleading to show the electron gun covering a full character, since each scanline goes through the first row of all the characters together. I am curious whether there was any caching-like mechanism here, as the video control had to access the same N character bit patterns 8 times.
The character ROM seems to feed data into a 74165, a shift register. So the byte is read from ROM only once per character per scanline, not once per pixel.
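The fetch-and-shift scheme in the reply above can be sketched numerically (a model of the addressing only; the glyph data and function names are invented, and the ROM layout of 8 bytes per glyph indexed by screen code times 8 plus scanline row is the standard arrangement for this kind of character generator):

```python
# Sketch: per scanline, the video circuit fetches ONE byte per
# character cell from the character ROM, and a 74165-style shift
# register serializes its 8 pixels out to the beam, MSB first.

def char_rom_addr(screen_code, row):
    return screen_code * 8 + row          # 8 bytes per glyph

def shift_out(byte):
    # what the 74165 does: emit bits serially, high bit first
    return [(byte >> bit) & 1 for bit in range(7, -1, -1)]

# a made-up 8x8 glyph: solid top row, empty elsewhere
rom = {char_rom_addr(1, r): (0xFF if r == 0 else 0x00) for r in range(8)}

assert char_rom_addr(1, 0) == 8           # glyph 1 starts at byte 8
assert shift_out(rom[char_rom_addr(1, 0)]) == [1] * 8
assert shift_out(rom[char_rom_addr(1, 3)]) == [0] * 8
```

So no caching is needed: the shift register itself is the one-byte "cache" for the current character cell, and the ROM is simply re-read for each of the 8 scanlines of a text row.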
I remember a BASIC program called FLOORMAT which looked like it was formatting your system drive, with no way to stop it, until it ended with a personal message: "Got you, Dave".
Have you seen Ben eater's channel? You might get a kick out of it.
I have, he is spectacular. Love his "Hello World" episode on the 6502!
The stuff on how characters are transferred from a map on a chip to electrons on a screen was interesting. I appreciate the time and detail you put into the videos you make! Keep up the good work, Dave.
I designed the Monitor on the PET.
Tell me more! Email me at davepl@davepl.com with the story!
I assume you mean the display? I just swapped one and noticed the PCB wasn't a rev, it was COMPLETELY different than the old one!
I never programmed the PET or Commodore. I had a TRS-80 with a Z80 processor; the layout of memory is comparable. I've forgotten lots of the details of that machine, but I remember that it was great fun to work with it, talk about it with people in our little club (8 people), and show off what you had made. Later I bought the Atari ST with a Motorola 68000 processor. That was a huge step forward. Too bad Atari didn't manage to stay in the game. Today I'm a retired programmer after 47 years in IT and I still love computers. My current machine, in comparison with the old machines, is so much more powerful but also very much more complex. I understood the architecture of the Z80, but the Ryzen 3900X with 12 cores (24 hyperthreaded) is so complex that I can't make head or tail of it anymore. I leave it to the people that do understand. Thanks, Dave, for this little journey into the past. Relived old memories.
As a computer science student, there's so much to learn that it sometimes seems insane that I'm even trying.
I feel you. How much of this video is included in your courses?
@@therealb888 We do learn how computers work at a fairly low level but it's more focused on modern platforms which he mentioned are huge compared to these older machines. This video is like looking at how a go kart works to understand a passenger jet.
@@coolbrotherf127 Good for you man. My retarded college is stuck with turbo c & 8085 (not even 8086). But low level understanding goes a long way. Instead of having a course on modern wireless comm sys like 4G/LTE networks, we were thought the GSM/CDMA standard.
@@therealb888 If you want deep dives, I'm studying x86 assembly as a Comp E student. We haven't gone over BASIC yet, but either we will in the future (I hope) or we'll go back to C++.
I'm typing this because CS doesn't really go into old-school, hardware-type stuff; that's our territory. CS usually goes into performance engineering and such with more modern stuff. What I mean to say is that they really don't need to focus on controlling hardware as much; CS is more about applications, etc. Maybe @@coolbrotherf127 will go into Rust and the like in the future.
I had a Commodore PET. I even remember playing Space Invaders on it as displayed in this video. I remember loading the game from a tape deck attached to the computer. It's been an amazing time.
Re: character generator ROM:
My guess is that allowing both the CPU and the video circuitry to access the RAM would require additional circuitry to implement a dual-ported RAM solution, adding cost to the system.
It was also the case with the TRS-80.
When the CPU was slow (in the 1 MHz range), static RAM was at one time more than twice as fast as the CPU, allowing interleaved access by the CPU and graphics. So no dual-port RAM was needed: just wait for the CPU to release the address enable signal and have the graphics circuit step in and read.
I have a home-built Z80 machine using this logic. Nice, clean, and fun to implement.
Easier to debug as separate units and both circuits were already well known. But I was thinking bus contention as well.
I may be wrong, but I think the character ROM on the PET (unlike the C64) was not hooked up to the main CPU--it was only attached to the CRTC. So, the actual hardware made it impossible to relocate the ROM.
@@whollymindless Yes, contention is definitely the issue. But there is a cheat possible in some situations.
During CRT horizontal and vertical retrace, the graphics do not need to read out data, so it's possible to switch ownership of accesses.
Not a perfect solution for graphics RAM since it limits available draw time per frame. But enough time to redefine sprites in a RAM-mapped character generator.
Edit: but it does require some MUX chips to select whether addresses come from the CPU or the CRTC (or whatever is used to drive the graphics display). And it requires an extra bidirectional buffer chip on the data lines, since the character generator data must be kept off the CPU data bus while graphics are being emitted. Easy to build, but it adds cost and PCB space.
@@d2factotum That is why MUX chips are needed: to switch ownership of the data and address lines. And a means to synchronize the code with the horizontal and vertical retraces.
Amongst the first dedicated computers were the AES-100/103 word processors. Although powered by an Intel 8080 CPU, most of the functions you describe were done in hardware. For example, the on-screen cursor had a series of TTL (7490-type) counters: a horizontal counter counting up to 80 (characters) and a vertical counter counting up to 25 (lines). The half-tone cursor was driven on the video board with a resistor injecting half the current into the gun, hence the tone difference. The boot sequence was controlled by a bootstrap loader in ROM. The disk access, head placement, rotation, data transfer, etc. were all implemented in hardware. The same video ROM idea from the PET was used (different countries' character sets were available as different SKUs), although the text on the screen was stored and "remembered" in DRAM. This was also hardware-controlled, to the point that the text on screen and off screen was stored in contiguous memory as it was displayed. The text moved on screen because there were hardware counters doing that. The keyboard scanning was all implemented in hardware, only issuing an interrupt on a "Hall effect key" keypress. In effect, at the time, the limiting factor in all systems was the CPU itself. The CPU was so burdened with all the interrupts and housekeeping duties that it barely had time left for anything other than coordinating the processes. It was faster to just dump a command or a couple of bytes into the hardware and let it do its own magical business, then come back on an interrupt to look at the outcome of whatever was pending. There was extensive use of DMA for anything data-related. Communications were handled the same way. A respected engineer at the time (especially for mission-critical work like communications) would not trust anything to the CPU! We would implement a series of lap-time watchdog timers (Bill P.), tight timings, wake-up call timers, etc., and make sure the hardware was ready to press the reset button on 4 milliseconds' notice!!
That's the way it was done then... Debugging a video display was simply a matter of looking at the screen; with experience we knew exactly what had failed, since it was a sort of inverted WYSIWYG! Remember AA55 and the quick brown fox generator!?
We NEED an 8 bit guy/Dave's garage crossover episode!
I started on the Apple II. Very, very similar to the PET; about 80% of this video also applies to the Apple II. The only major difference is that the Apple II had 2 graphics modes in addition to the text mode. Great video. Took me back in time.
Damn, if you hadn't said in the beginning that it was not a guarantee (that I would understand how it works), I'd be suing you right now, cuz I still don't understand anything lol
Thanks for that ride down memory lane, Dave. I started off (if we discount a 4-bit AIM development board running a 4004) with a Tandy (Radio Shack) Model 1. It was even simpler than the PET. I too enjoyed the luxury of knowing what every (well, most every) component was for, and thanks to reverse engineering (or "screwing around with the Monitor in the ROM", as I would have called it at the time) I learnt what every bit of code (including those Microsoft routines) did. It was a time of rapid development, and a personal computer costing £1000s in today's money would be obsolete in a couple of years. It wasn't just the hardware; the knowledge also suffered from obsolescence. Within 5 years, I wouldn't even have been able to tell you how a sound card worked (though I knew it didn't use the relay that switched the tape recorder on and off to make the sound). Thanks again, Dave.
It made me uncomfortable that you described the data going to the electron gun one character at a time instead of one row of each character across the whole screen
I didn't mean (or say!) that. The little image shows the scanning of each line, which is how it happens. If your screen says COMMODORE, first it draws the top of the C, then the top of the O, then the tops of the Ms, etc. Then the next scanline of the C, the O, the M, and so on.
It's definitely one ROW of pixels, not a character at a time!
The first computer I ever used was a TRS-80 Model 1 in 1982, in Year 7 at school. All I could do was a BASIC "Hello" program. The first computer I owned was a TRS-80 MC-10 in 1983, with a whopping 4K of RAM. I typed in many BASIC programs and ended up writing some, including a very crude financial accounting program. I learnt how to POKE the screen to do unusual text and graphics. I did some elementary assembler programming. Yes, that tiny computer could do all that. I later upgraded to a CoCo 2, and then a CoCo 3 with the 512K RAM upgrade module and 2 x 5.25" disk drives, before moving to PCs in about 1990. The good thing about PCs was the open architecture, which lent itself to learning all about the machine if you wanted or needed to know.
Do the same for the C64 and Amiga 500.
Sadly, no one seems to be watching this one, but they like it when they do, so I'd love to! I have to think of some clickbait name to make people click, though :-)
@@DavesGarage I think the title might be putting off people who don't care about the PET. If the title were more generic maybe more people would watch.
I used to repair Commodore PETs back in the 80s. I had a reference system with the 40-pin chips socketed, a cabinet with spare chips, and a reference floppy for setting up the heads.
Thanks, sir! I truly appreciate you compressing my ol' brain cells, dusting off the grey matter, and making room for some new items in the cracks. I didn't realize how similarly the PET and TRS-80 operated, though I did do some cross-platform translations for a couple of Avalon Hill programs. In those days, I treated other systems as "competitors" instead of embracing their similarities and treating them as cousins. (We do / should learn as we get older.) I wish that we could have shared corridors in Redmond. To stroke a well-deserved ego, you continue to impress me, kind sir!
You are one of those pillars of computering!