If you want a 6502 programming challenge, I can't think of anything better than the Atari 2600, actually. You learn to appreciate every cycle and figure out the best ways to achieve anything on the console.
@@64jcl I'd say building your own 6502 computer and programming its system ROM is more of an accomplishment. Ben Eater has a project for building one on breadboards and people are crazy for expanding on it.
@@YaroKasear , indeed, Ben Eater's videos are great. But I'd say that is perhaps more about learning to interface hardware with a 6502. As a pure programming exercise, the Atari 2600 requires you not to be sloppy with your code, which you can get away with in most other programming. In general, game programming, e.g. for the Commodore 64 which I work with a lot, is a good challenge, as you have to make sure things are as fast as possible since you are often striving to keep 50/60 fps running smoothly.
I now feel even more sympathy for the one guy that had to rush to create the ET game. It’s incredible what he managed to create in such a short amount of time considering that this is the nonsense he had to deal with on the programming side. Good heavens what a delicate system.
Incredible indeed! At first, I thought -- "Wow, there's so little abstraction that I bet I could learn this in a weekend. How quaint, so much freedom!" *Five minutes later*: Yawning, existential horror at the concept of HAND COUNTING MACHINE CYCLES and PERFECTLY TIMED LOCKSTEP EXECUTION all while remembering to account for the fact that different rendering instructions have VARYING DISPLAY LAG TIMINGS.
It would have taken days just to create the start screen! Astonishing that anyone ever created ANYTHING for this system. The ET guy should be remembered as a hero. Of course the game was awful, look at what he was dealing with!
@@MasterChaoko Welcome to the joys of the 2600. It's not so bad once you get used to it. Batari made it a bit easier with Batari BASIC. But even then, you need to write a custom kernel (the term commonly used for the display code) if you want to do anything very sophisticated. Search for my game Deimos Lander for an example of a game that blends Batari BASIC for gameplay and a custom kernel for cut scenes. I'm actually pretty proud of the printer-paper screens, as no one managed to fit as much text as I did, flicker free.
17:35 It made me laugh that the "V-blank for graphics, active scan for game logic" pattern from the SNES/etc. is inverted here. But it makes sense upon reflection, since the Atari CPU is acting more like a graphics chip that happens to do game logic when it can spare a moment.
Many systems have a graphics coprocessor that reads VRAM and writes to the screen, and that's the reason graphics updates can only happen in blanking periods. And even that is kind of avoidable by using double VRAM, which would basically be double buffering, just like how modern GPUs handle things. The trade-offs would be that it's twice as expensive, there are no more DMA effects when using a double buffer, and there's one more frame of delay. (There's also screen tearing, but that probably didn't matter at the time.)
@@chyza2012 Hence I used the word "basically". I think for vintage hardware, the only way to do it is to use two VRAM chips and have the CPU and the graphics coprocessor swap chips each frame, so that they never access the same one at the same time. Unlike modern hardware, which is fast enough that having to wait for VRAM access isn't a big deal.
@@FlameRat_YehLon "does that allows two devices to access the same ram at the same time?" Essentially yes. The RAM has double the number of I/O ports, one set for input and one for output. The CPU can write values while the PPU accesses the memory within the same clock domain; they don't need to wait for each other's transaction to finish before starting their own. Only one device can read at a time and only one can write at a time, though, which means the CPU will interrupt the PPU if it needs to read memory back. This kind of memory was also used in computers like the Mac IIfx to reduce latency and boost performance for the CPU. In that use its benefits are more limited, but it does give some pretty big boosts to DMA transfers.
Normal sprite management: Ok, you want this sprite at (120,200)? Sure thing. Atari sprite management: Ok, you want a sprite? And you want it to be 5 px offset? How far along the screen do you want it? Y'know what, I'll drag it along the screen and you can tell me when to stop. Tick. Tick. Tick. Okay stop? Sure. Now what graphic do you want to put on your sprite? Actually, hold on, I'm gonna need you to hand it to me row-by-row...
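For anyone curious what "tell me when to stop" looks like in practice, here is a minimal sketch of the divide-by-15 positioning idiom that shows up in most 2600 tutorials. The equates are the standard TIA register addresses; the routine name is just a label, and the exact cycle alignment is the part real kernels agonize over.

```
; Coarse/fine horizontal positioning of player 0, A = desired X (0-159).
WSYNC = $02      ; halt the CPU until the start of the next scanline
RESP0 = $10      ; strobe: latch player 0's position wherever the beam is now
HMP0  = $20      ; fine horizontal motion for player 0 (-8..+7 pixels)
HMOVE = $2A      ; strobe: apply all pending fine motions

SetHorizPos:
        sta WSYNC        ; start from a known point in the line
        sec
DivideBy15:
        sbc #15          ; each trip through this loop burns 5 CPU cycles,
        bcs DivideBy15   ; i.e. 15 pixels of beam travel
        eor #7           ; turn the remainder into a -8..+7 fine offset...
        asl
        asl
        asl
        asl              ; ...and park it in the register's high nibble
        sta RESP0        ; "okay, stop": coarse position = current beam spot
        sta HMP0         ; queue the fine nudge
        sta WSYNC
        sta HMOVE        ; applied at the start of the next line
        rts
```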
Reminds me of the way early mechanical telephone exchanges worked. So you want to select this particular circuit and the mechanism starts moving and sends electric pulses every time it hits another terminal on the panel. Then the sender (control unit) tells it to stop when the brush is on the desired terminal. Of course, it also needs to translate your phone number into a terminal position first.
@@IronicHavoc I got curious about this so I actually looked up some technical documentation about the 2600's contemporaries. The Odyssey2 (released a year later) had four sprites that just took an X/Y coordinate and 8 bytes of graphics data. So, basically what you'd expect. However, you don't really get a proper "background layer"; instead you get a 9x10 grid of lines (which you can specify as filled-in) and the ability to place text on the screen along with your four sprites. So your ability to draw a playfield is really limited. The Fairchild Channel F (released a year *earlier*) had a proper framebuffer of... 95x58 pixels. No sprites as far as I can tell. I'd like to know how the hell they were able to fit video RAM in their design given that both Atari and Magnavox avoided it like the plague for cost reasons. All that aside, I'm still going to call out the 2600 TIA *anyway*, purely because of the whole "position sprites with cycle-timed code" thing.
@@SuperSmashDolls The Fairchild Channel F used dynamic RAM for its frame buffer. Dynamic RAM was about 1/4 the cost of static RAM, but the hardware to support random access was rather a nuisance because one had to multiplex the high-order and low-order parts of the address bus, and also ensure that no row would go very long without being accessed. The Fairchild and Apple II both exploited video memory scanning to take care of refresh, but if I recall the Fairchild was rather limited in terms of when the CPU could access the display.
Here's what weirds me out about it: you wait to write to RESP0 or RESP1, but you only have to do it once. The system remembers it until you update it. So why couldn't you just write a value to whatever internal register handles that?
In the Atari 2600 enthusiast community, programmers (especially ones that programmed the classics like Pitfall) are often given titles of great respect such as "programming wizard" and "code magician" and "miracle worker". This video illustrates why these titles are not entirely cheeky little jokes made by the connoisseurs of the classic gaming community. :)
I've heard several former Atari employees refer to themselves as "engineers" rather than "programmers," as they viewed programming for the VCS to be not simply writing code, but examining systems and figuring out ways to make said systems more efficient to be able to execute said code on the VCS hardware.
@@XanthinZarda The thing is, if you look at other consoles of those days, the Atari ended up being able to outperform some of them BECAUSE the lack of video memory allowed fancy tricks that systems locked to their small framebuffers couldn't pull off. It was very much a trade-off: harder to develop for, but with a higher ceiling on what could be accomplished.
@@XanthinZarda It wasn't so much cheap at the time as not ungodly expensive. Memory chips were ridiculously expensive back in the 70s. The fact that Apple used a frame buffer in the Apple II was kind of shocking back then. Using registers and shifter chips was a way of getting around making the console over a thousand dollars. The 2600 did have some working RAM, but it turned out to be only 128 bytes (!). That was less than a single addressable page using 8-bit addressing. Stack overflows were... interesting.
@@josugambee3701 The stack counts downward from the end of memory, so you would usually point the stack pointer at the end of the 128 bytes at startup. Then you'd put the variables you tracked at the beginning of memory and hope that these two pools never collided. You really had to watch subroutine calls, as it was incredibly easy to push too much onto the stack and blow past the 12 to 32 bytes you might reserve for it. If I remember correctly, attempting to access the memory hole would just wrap around the 128 bytes. But it's been a long time, so don't quote me on that.
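A minimal sketch of that startup convention. On the 2600, RAM sits at $80-$FF and is mirrored into the stack page, so variables and stack really do share these 128 bytes; the variable names below are made up.

```
; 2600 RAM is $80-$FF (128 bytes), mirrored at $0180-$01FF for the stack.
        cld              ; the 6502 decimal flag is undefined at power-up
        ldx #$FF
        txs              ; stack pointer to the very top of RAM

; Variables are doled out from the bottom; every JSR pushes 2 bytes from
; the top, and nothing warns you when the two pools meet in the middle.
PlayerX  = $80           ; hypothetical layout
PlayerY  = $81
Score    = $82
ScanLine = $83
```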
You know what topic I think would be perfect for this channel, that I haven't seen a proper explanation of in video form anywhere? The way level data is stored in Super Mario Bros., explaining why exactly the minus worlds look like normal levels and not garbage tiles. I think that would be really cool.
I'm just guessing, but I think most levels were built from objects with dimensional parameters. With a few level primitives you could build fairly complex levels using very little storage. I think SMW used a similar method, but at this point I'm just talking out of my ass.
@@IntegerOfDoom Kind of, it's a few pre-made layouts, which can be changed with extra objects that "spawn" chains of blocks, coins, or other background objects on top of the base layout. So to have a line of say, nine ? blocks, you only need a single object at the right place with the property that says "spawn 9 ? blocks to the right as the screen scrolls". For example, those staircases made from blocks at the end of many levels are actually all from a single object. This is also the technical reason why going backwards in SMB1 was not feasible to allow.
The reason the minus worlds look like normal levels, I think, is that the calculated index for the level data sometimes lands in the normal range, so they use real levels (I think the game wraps the level index around in some way if it's out of range). For the minus world the index just happens to end up pointing at a water level. Edit: The actual reason it looks the same is that it reads unrelated data. It first gets the index for the respective level id from an array holding these indices, but because the world number is way higher than 8 (it's 36), it ends up reading a value from the array of level ids located right after it, which happens to be 0x33. It then plugs this value into the level id array, but ends up reading a value out of bounds in the enemy data after the array (1), which corresponds to 7-2.
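A hedged sketch of the failure mode described above, with invented labels and data (this is not the actual SMB source), just to show how an unchecked table index quietly "loads" a real-looking level:

```
; Illustrative only: the labels, addresses and values are made up.
worldNumber      = $80         ; hypothetical zero-page variable
WorldOffsets:    .byte 0, 5, 10, 14, 18, 22, 26, 30   ; 8 worlds' worth
SomeOtherTable:  .byte $33, $01, $02                   ; unrelated data after it

GetLevelOffset:
        ldy worldNumber        ; expected 0-7; the minus world arrives as 36
        lda WorldOffsets,y     ; Y overruns the 8-entry table, so this quietly
        rts                    ; reads whichever ROM byte happens to follow it
```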
This is probably as close as you could get to bare metal. It's a miracle this system works at all, and it had such a long lifespan alongside other much more advanced systems.
I'd argue it's just as much bare metal as anything else, you just have to handle a lot of things typically handled by the hardware in software. If anything, it's less bare metal, cause more things are handled in software ;)
@@vyor8837 If you're saying it means anything vastly more specific than "no operating system, interfacing directly with hardware", then no, I don't know what it means. I'm just suggesting that not having specialized graphics hardware doesn't make the Atari 2600 "more" bare-metal than (for example) the NES; they both have no OS and directly interface with hardware (the only difference is how advanced that hardware is). The last part was a joke, hence the ";)".
@@leandersmainchannel4493 Yeah. Not the greatest name. batari BASIC itself is pretty sweet though. Look up Princess Rescue for an example of what you can do.
To be clear: this "VRAM-less" design was not only never done since, it hadn't been done before either. The Fairchild Channel F, which predated the Atari 2600, actually had 2K of traditional VRAM. This was done because RAM was pricey. It was kind of a blessing in disguise: beam racing is what made the Atari 2600 thrive, since you could get full-color games of similar quality to competitors' for cheaper. At the cost of programmer sanity, of course.
There are a few other systems that used beam-racing, although not quite so intensely, and usually with coprocessor support, such as the Copper on the Amiga, HDMA on the SNES, and the polygon setup coprocessor on the Nintendo DS that let it fake 3D graphics using 2D sprites. And there were a lot of great programming techniques which were enabled by these primitive graphics architectures which have been more or less forgotten by today’s programmers.
@@fluffycritter Just to add one more that is much less widely known: the Apple IIgs had relatively weak graphics hardware that certain programmers were able to stretch with beam-racing techniques. Nothing like this, though; at least it had real frame buffers.
@@fluffycritter I imagine it was at least possible on nearly all 8- and 16-bit platforms, although they were less likely to need it given how much they could achieve without it. I know the Commodore 64 lets you do it; 8-Bit Guy demonstrated a program that cycles the background color as quickly as the CPU can manage, and it was changing every few characters' worth of pixels.
@@stevethepocket Yeah, beam racing is possible pretty much everywhere, with a few exceptions (like the original PC CGA, which shows visible snow if VRAM changes outside of HBLK), but I was talking more about things that were *designed* around it and made it a fundamental part of its operation. Beam-racing was used in a lot of C64 games and demos, though, yes - you could get some really neat effects with it, and you could also multiplex sprites to get more than 8 on the screen. Lots of NES games did similar things as well.
@@Toonrick12 this is what made the Amiga difficult to emulate, there was lots of cycle level accuracy needed to get it to work properly, especially with the copper but also video output in general. Early emulators wouldn't handle demos or late generation games and it took a lot of years for host platforms to be fast enough to get it right.
@Toonrick12 That might change with the advent of a new game called Former Dawn. It relies on cycle-accurate timings on the NES to push more graphics to the screen than was thought possible. In particular, it uses the rarely used masking bits to get more color out of the system. Anything less will not work; only real hardware can do it right now.
This is insane, and I thought sprite management on the Game Boy was complicated... Great work, you must have spent a lot of time researching and animating this and it shows!
There are quite a few systems out there that use the CPU to generate the video output. The ZX80/81, the Galaksija, and the Gigatron among others all generate video with the processor. But they all have video memory, so they can treat the display as a state machine. Then, when it comes time to actually *draw* the video on-screen, the CPU just bitbangs whatever's in video memory to the display. The Atari doesn't even have that: Games actually have to use logic in real-time to decide what gets displayed, as the electron beam scans the screen. I suppose it could be thought of as a really fancy data compression, since fundamentally all the data that you need to figure out what's on-screen is still within memory, just not in a format that you typically associate with video displays. And data can usually be compressed *extremely* efficiently.
The inverse of this would be to pre-render frames for every possible combination of inputs you could make. But this extremely quickly adds up to *many many googols* of combinations, so it's practically impossible to do except for extremely simple games. For the record, this is basically what laserdisc games do. The game engine could be thought of mathematically as an extremely sophisticated single-use data encoding/compression, which encodes every possible frame that the game could ever output. By playing the game, you're *decompressing* the data into output frames. From an abstract mathematical view, anyway. But the benefit of this is that the game is kilobytes instead of googolbytes. The way the Atari makes its video is basically the same general concept.
@@jonathanfaber3291 Don't worry. Gameboy sprites are a bit weird, but not too weird. Certainly not *nearly* as bad as the stuff shown in this video.

The 8x8 or 8x16 (chosen by a bit in the LCDC hardware register) 2-bit-depth tile data is stored in a certain part of VRAM, the same as any other tile you would use for the background or window. Sprites themselves are stored in OAM (Object Attribute Memory) as 4-byte entries. You can have a maximum of 40 sprites (though there is a way to get around that with DMA transfers if you really need to) at a time, but a maximum of 10 will display on a line. Each entry contains the X position, Y position, which tile to draw, and some flags like flipping, priority over background/window, and which palette to use. The tile data and palette stuff is a bit different on the Gameboy Color, but largely the same.

TL;DR: Put a little picture of the sprite somewhere in memory and then tell it the position, which picture to use, and some other stuff somewhere else in memory.

I would highly recommend the Ultimate Gameboy Talk video for a nice visual explanation of sprites and other aspects of the Gameboy (you can skip the pixel FIFO section if it confuses you, it won't matter much to most programmers). The Pan Docs website is also a good general reference for the system. If you want to get into development, look into gbdk-2020 and the myriad of videos and tutorials for making Gameboy games. You can skip assembly and program in C if you really want to. There are also tons of tools for designing graphics and animations that take care of most of the code for you.
Management : so what did you get done this week? Dev: I got a single pixel of a sprite to display on screen at the right position. Management: good job 👍
I really enjoy how, when you demonstrate the features of a particular system, you use typography and graphics that match what real software looks like.
This is the most fucking cursed method of drawing graphics to the screen that I've ever seen, by a long shot. This video is really good and an interesting breakdown of it, but I honestly had trouble getting through it. It was like watching someone get kicked in the balls for 40 minutes.
As a 2600 homebrew programmer (Skeleton) I've raced the beam. But you've done an excellent job explaining the 48 pixel sprite routine and how games can perform mid-line updates to create different effects with such a small number of properties. One quibble is vertical resolution isn't a fixed value on the VCS - the number of active lines varied considerably between games. And because VSYNC is under software control, some games also didn't have the "correct" number of lines per screen (which can cause issues with some modern TVs).
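To make the "VSYNC is under software control" point concrete, here is the conventional NTSC frame skeleton; every one of these line counts is just a loop the programmer promises to get right, which is exactly why they drift from game to game.

```
VSYNC = $00     ; TIA: vertical sync on/off (bit 1)
WSYNC = $02     ; TIA: wait for the end of the current scanline

StartFrame:
        lda #2
        sta VSYNC        ; sync on...
        sta WSYNC
        sta WSYNC
        sta WSYNC        ; ...held for exactly three scanlines
        lda #0
        sta VSYNC        ; sync off
        ; by convention: 37 lines of VBLANK, 192 visible lines, 30 lines of
        ; overscan = 262 total -- but nothing in the hardware enforces it.
```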
Some trivia: I read somewhere that the sinking-into-water animation/effect in Pitfall -happened by accident- _(that seems deliberate, see the replies)_ . They moved the sprite further down the screen, but they hadn't written any code to draw the sprite after a certain scanline. Thus, the bottom part of Pitfall Harry gradually disappears, as if he was sinking.
I'm not sure I buy the accident story; the code seems to be specifically designed for this effect and I'm not sure why they'd write a routine to move the sprite down if you didn't already have the sinking effect in mind. Pitfall uses a complicated 9 part kernel routine to draw the screen - with each kernel drawing a separate strip. Kernel 6 (@$F2C2 in memory) is specifically written to not draw Harry, just the bottom of the ground objects and tops of holes/pits. Kernel 7 (@$F466) which is called instead when there is a ladder on screen, immediately starts with code to see if it should start drawing Harry. The lower strip kernels call the routine to draw him (hence why he can go down the ladder and into the tunnel). I suppose they could have gotten the effect by accident in an earlier version of the code, but what's in the final game is very purposeful in the effect.
@@3vi1J I found my source! But it seems it was deliberate, not by accident. So, you're right. “Pitfall Classic Postmortem With David Crane Panel at GDC 2011 (Atari 2600)” at 32:40 ruclips.net/video/MBT1OK6VAIU/видео.html
This is OUTSTANDING! I have been working on a long-term Atari 2600 project, and have done my fair share of research - never have I come across a presentation this organized and tight, which didn't dumb down the concepts. I would very much like to see you deconstruct, in such clear and detailed fashion, topics like the venetian blinds technique or multicolor through flickering. The ultimate would be a full .asm deconstruction.
Same, although I think the audio is interesting. There's not much in the realm of game music on this system, and part of the reason was how much attention the display needed.
@@vuurniacsquarewave5091 That, plus the fact that the sound hardware is unable to reproduce most notes in a scale with sufficient accuracy that they don't sound out of tune. I think Pitfall 2 has a custom chip in the cartridge to be able to get the frequencies right (and to fix a few other missing features in the hardware).
@@possible-realities Yes, it's incredibly restrictive. You have to come up with your arbitrary tuning frequency and relate the existing pitches to that to get anything that sounds like music, relatively speaking. But this will still drive someone with absolute pitch mad.
The 2600 is in some ways surprisingly easy to code for with the aid of a modern emulator. Basically, you have to figure out what will need to happen at each spot on the screen, and then work from top to bottom, writing and testing the code that does each strip. In many cases, the screen-drawing parts of the code will be surprisingly straightforward: mostly cycle-counted loops which load something using (zp),y addressing, write it to a display register, load something else with (zp),y addressing, write that to a display register, etc.

The hard parts are the zones between display strips where it seems like there's the smallest amount of stuff going on, but the code is actually preparing the pointers for use in the next zone. If one leaves a few lines between zones in the screen layout where all one needs to do is keep showing one sprite, however, even those parts aren't too bad.

In my Toyshop Trouble game, I used 32 bytes of RAM to hold the shape of the 17-row player sprite in the two 16-line zones where it appears. There were, I think, about four different coloring patterns, all of which were green in the top and bottom row, so I could simply have one set of zero-page pointer values for even zones and one for odd zones, and show a suitably offset 16-color pattern. The game looks insanely complicated, and eking out the last couple of cycles when clipping the left or right edges of sprite shapes was hard, but if one isn't trying to draw four independent-shape sprites per scan line, the approach of setting up zero-page pointers to minimize the amount of decision making during scan lines will leave one with enough time to do a surprising amount.
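A stripped-down sketch of the kind of (zp),y display loop described above. The pointer names are hypothetical, and real kernels interleave far more work into the 76-cycle line; this is only the shape of the idea.

```
WSYNC  = $02
GRP0   = $1B    ; player 0 graphics
COLUP0 = $06    ; player 0 colour
PF1    = $0E    ; middle playfield byte

SpritePtr  = $80  ; hypothetical zero-page pointers, set up between zones
ColorPtr   = $82
PlayfldPtr = $84

; Y = number of scanlines left in this zone, set up by the caller.
Kernel:
        sta WSYNC           ; resynchronize to the start of the line
        lda (SpritePtr),y   ; 5 cycles: this line's row of the sprite
        sta GRP0            ; 3
        lda (ColorPtr),y    ; 5: this line's colour
        sta COLUP0          ; 3
        lda (PlayfldPtr),y  ; 5: one playfield byte, if there's time
        sta PF1             ; 3
        dey                 ; 2
        bne Kernel          ; 3  -- roughly 30 of the 76 cycles spent
```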
NES is pretty fast, but you have to race the vertical blanking interval rather than the beam. If your graphics code takes too long you can get glitches
So the thing is, Atari originally intended for it to just run Pong, Combat and a few other super simple games. They were even planning to replace it with a more powerful console after 1977 before their parent company, Warner Communications, made them keep supporting it. And so after that, pretty much every game made for the 2600 had to be made with extremely clever and complex programming tricks and exploits that Atari never even knew existed at first.
It was a Pong clone machine with a few bells and whistles added to handle Combat type games. Every one of its capabilities made it easy to program Pong or Combat type games, but sorcery to program anything more complicated. It was meant to have maybe 5-10 games, all of which would be Pong or Combat ripoffs, and then hopefully it would generate enough money to pull something better together. Unfortunately programmers figured out how to trick more and more life out of the darned thing. The hardware that was meant to become its successor became the Atari 400/800, and didn't get released as another dedicated games console until the 5200 finally came out, by which point it was dated.
I tried doing some 2600 coding as a kid; boy, was it frustrating to learn, but extremely satisfying when it worked! Later I would do some 6502 on the Atari 8-bit systems, and while still extremely primitive it was much more flexible. Today's programmers never had it so good: with hardware acceleration, APIs, 3D and multiple engines to do the heavy lifting, they don't know what it's like to sync hardware by effectively counting execution time. Good in-depth video as usual, brought back some of the nightmares I'd almost forgotten!
I think this is kind of a detriment in a way, to be honest, that modern programmers don't have to worry about any of this stuff. It's why we get games like CP2077 that are so terribly optimized: programmers don't have to care, so they don't care, and they just jumble spaghetti code together - whatever works, throw it in there. Meanwhile, back in the 8-bit days, you had to count machine cycles to get things to line up just right, and you had to squeeze every last bit you could out of the code. If modern coders did this, counting down to the individual bits, games would be insanely optimized, but it would also take years and years to make anything, sadly. But still, I think some modern developers could do a little better, stop rushing buggy messes out the door, and try to optimize a little more.
The Atari 8-bit was primitive? Have you ever looked at the other three home computers that were out at the same time - the Radio Shack, the Commodore PET, and the Apple? Talk about primitive. The Atari 8-bit had a graphics coprocessor, 128 colors (later models had 256), built-in horizontal and vertical scrolling, a sound chip (4 8-bit channels, or 2 8-bit channels and 1 16-bit channel, or 2 16-bit channels), and three different ways of doing graphics (bit-mapped, character graphics, and sprites, known as player/missile graphics). Its 6502C microprocessor was clocked ~79% faster than an Apple's. Those other three computers didn't even come close to the Atari.
Incredible video! As an experienced Commodore programmer, I'm very familiar with 8-bit low-level software. I've read documentation on the 2600 and was horrified. You have brought that horror into plain view for the masses. It is TRULY amazing what the programmers could do with 128 bytes of RAM, 4K of ROM, no operating system, and no GPU. Anyway, thanks for explaining it with real-life examples in detail!
I am blown away by this, both the complexity of the 2600 and how you were able to explain it so painlessly. It took me 10 minutes to realize that the weird gap in the repeated sprites was deliberate, so they could mesh into a 6-wide block - so all that register juggling was the method expected by the hardware designers.
I'm not sure that was the main purpose, or even realized that it was possible at the time. I think it was mostly there to have more things to shoot at (e.g in Combat) or have a "team" to maneuver (e.g. Football). I can't think of any games that used this 48-pixel wide "six sprite" programming strategy for scores or other things in the first few years of game releases.
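For reference, the register setup behind that six-sprite text trick looks roughly like the sketch below; the genuinely hard part, the cycle-exact loop that re-feeds GRP0/GRP1 mid-line, is omitted here.

```
NUSIZ0 = $04    ; number/size of player 0 copies
NUSIZ1 = $05
VDELP0 = $25    ; vertical-delay latch for player 0
VDELP1 = $26

        lda #$03
        sta NUSIZ0       ; three copies of player 0, close together
        sta NUSIZ1       ; three copies of player 1, close together
        lda #$01
        sta VDELP0       ; the delay latches let the kernel stage the next
        sta VDELP1       ; GRP write while the current one is still on screen
        ; With player 1 positioned 8 pixels right of player 0, the copies
        ; interleave as P0 P1 P0 P1 P0 P1: one seamless 48-pixel strip.
```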
So what you're telling me is the people who designed the Atari 2600 hardware are pure sadists and enjoyed watching poor early software engineers squirm
Keep in mind that the 2600 was only supposed to run a handful of games like Pong, Breakout, and Combat. Once you realize that the hardware was designed around these games, it makes a lot more sense. Not only that, it looks downright easy to program for. What was unexpected was that the development teams kept figuring out how to push the system further and further. Eventually most of them left and formed Activision which created MIND-BLOWING quality on the 2600. (e.g. Pitfall, H.E.R.O., Kaboom!, etc.)
@@thewiirocks Sure; but unlike a certain company known for Hanafuda cards and leaving luck to Heaven, Atari didn't intend for their pong machine to be expandable. Which would later kick them in the goiters, stunting the growth of Atari entirely.
If I understand correctly... the Atari 2600 combines the worst parts of sprite and CPU driven graphics into the most unholy display system of all time?
@@someguystudios23 The POKEY is quite an interesting beast of a sound chip. It's one of my favorites, actually - well, not because of its tuning issues. The TIA's tuning is utter crap.
@@TheBeeshSpweesh Have you heard the BTP2 music driver used by Stella's Stocking? Four voices of five octave full chromatics with two volume levels, using 46 cycles out of every 76.
It's stuff like this that makes me so obsessed with early video games... in our age of near-infinite memory and processing power, it's just never going to be as impressive a feat to me if you create a stunning AAA perfect-graphics title as it would be to create something interesting within the extreme limitations of early consoles and computers.
Limitations breed creativity. So many pieces of hardware or genius bits of code were invented because someone wanted one extra sprite on the screen, or a few more polygons on the character model without nuking the framerate.
@@googiegress My opinion is that back then the complexity was on the software side, the hardware was simple. Nowadays it's the opposite: writing the software (in user mode at least, API, drivers and OSs are still hard to program) is kind of simple (if you don't have huge optimization constraints), but the hardware is a lot more complicated than back then.
@@benjib2691 What do you think about the efficiency loss when using software development tools that aren't purpose-built to that specific implementation? An example might be a video game using an engine that was designed for all types of games, whereas when writing the game in an engine designed specifically and only for that type of game you have more limitations but also no extraneous bloat in the engine. One might argue that there's an efficiency loss at each layer from the hardware instructions up to the customer booting up the .exe. I feel that the end result, of possibly less-efficient software, is still pretty great because today's development tools lower the barrier to producing the software in the first place. Using standard tools means other people can more easily build upon it later. And, even if the developer went through all the effort to write his own ~everything~ that his game or whatever depends on, maybe he would come up with less-efficient specialized solutions than the industry standard generalized ones.
There were more devices that didn't - for example the NES, which did everything using tiled backgrounds and 8x8 sprites. Still, heaven compared to 2600 coding.
I now understand every Atari game I ever played. Or rather, I now understand the idiosyncrasies in their graphics.

In Yars' Revenge, for example, the Missile and the Zorlon Cannon must both be the two missile sprites. The pellet shot is the bullet sprite, and the Qotile/Swirl and Yar are the two player sprites. The neutral zone in the middle is just a constantly fluctuating background; that's why it's made of thin bars. The shield is a background, with the rotating version just being a cool way to make it work. And it moves up and down because you have pixel precision on vertical movements for backgrounds, but not horizontally. Indeed, that last bit explains *so many* Atari games and their penchant for vertical scrolling.

In Adventure, I knew objects flickered when 3 or more were on the screen, but I now know why (they're all player sprites, and you can't have more than two render on the same scanline, so to make things simple, it only ever renders two of them). Also, the player is a missile sprite, so that's why it doesn't flicker. I also know how the labyrinths work and why they flicker like an object: each is a "player" sprite. The background is made the same color as the default color, but this "player" sprite is made to appear behind the background, so it makes the background visible. And I even know why the famous dot is a *dot* (it's the bullet sprite, since that's all that's left at this point). And I *bet* the barriers the dot makes disappear are actually the other missile sprite.
The neutral zone is more than just a constantly fluctuating background: it's a graphical representation of the actual game code in the Yars' Revenge ROM itself, overlaid on itself and counter-scrolling, with some x-y offsets and random color processing thrown in.
The "famous dot" was a player sprite which just so happened to only have a single pixel set. The adventurer's sprite was the Ball, which shares its color with the playfield. Rooms that had just a left wall or just a right wall would use the missile sprites for those purposes, and their colors were shared with the player sprites. The game was coded to allow passage through the right wall if its color matched the background, which would only occur if the dot was in the room with at least two other objects.
@@flatfingertuning727 My mistake. The dot would have to be a player sprite, since the way you find it is that its room causes flickering when there's 1 fewer items than you would expect to cause flickering (ie: when you bring any item to that room).
Developing games today: "If our game is too heavy on the console's cpu, the frames will take longer to render and it will look choppier... Let's launch it like that" Developing games back in the day: "The game won't f**king work if I write/read too many variables before the next line needs to be rendered"
Typically, if a game used WSYNC after tricky scan lines, trying to put too much into such scan line would cause all subsequent lines to be shifted down on the screen, and possibly cause the frame length to be a line too long (depending upon whether the RIOT timer was set at the start or end of the visible part of the frame). There were quite a few games where this would happen if some characters were too far toward the right edge of the screen.
@@flatfingertuning727 These devs managed the impossible with the cpu of a calculator, meanwhile I get frustrated when I forget a semicolon and the whole code won't compile.
Modern publishers: You spent eight years getting this game to a barely playable state. Mind doing another two years to make the women uglier? Make them look like men. Our investors are mad at us for having women in our game - "oversexualised" they call it. No, don't try to fix those bugs or make the game a little harder - in fact we'll replace you with this guy with the pronoun pin. Ancient publishers: Six months. Six months to make E.T. for our bare metal platform without even so much as a sodding BIOS. We're also going to print millions of them despite the fact that our install base isn't even that big. No, we won't give you another week to fix the collision detection or make E.T. brown.
Your explanation of vertical delay is hands down the best I've seen. I've been working on an Atari 2600 emulator for fun and mostly going off the Stella guide, and even though I've read its one cryptic paragraph on vertical delay 50 times at least, I never made the connection between that feature and its application to complex graphics. Using your video, I was able to get E.T.'s face (and so many other things) to display properly in my emulator. Thank you!
Gamers: I know ET for the A2600 is the worst game ever. A2600: I require a phd to draw a funny face, and a sacrifice of your smallest limb to make a game. Kneel before me tiny human.
E.T. was overprinted, it actually sold quite well. It was also horrendously rushed. There's a romhack that fixes its problems and makes it one of the best A2600 games ever made. For ACTUAL bad games, we have... Concord? Dustborn? Veilguard? To name but three...
17:36 Quite hilarious how the relationship between v-blank and drawing space ended up being reversed on the NES and newer consoles. Then again, the Atari 2600 is such a primitive system that the CPU has to handle almost all of its graphics manually (leaving only the pixels themselves to the TIA, the Atari 2600's equivalent of a PPU), so the only time it can do game logic is when it isn't busy drawing the screen. As a comment below mentions, it truly is more like a graphics processor that only does the expected CPU stuff when it isn't drawing.

This contrasts with e.g. the NES, which instead tells its comparatively advanced PPU how to draw the final screen during v-blank and only interferes with the drawing process if something needs to be changed mid-screen (e.g. status bars, parallax scrolling) - to the point that it doesn't even come with raster interrupts besides the non-programmable NMI by default, and needs them to be added by MMCs!

Either way, the developers, much like computer scientists back in the day, surely did a lot of theoretical work first before tackling the game proper, while nowadays you don't even need knowledge of programming beyond some very abstract code to create a new game.
There was an IRQ functionality planned for the NES. A decap of the 2A03 revealed an unfinished IRQ timer on the die that was never connected to anything. So in the end it almost actually happened.
@@vuurniacsquarewave5091 It's possible to set up IRQs to happen at predictable times on the frame using just the NES hardware, using the DMA audio to output dummy data. If one writes the DMA rate shortly after an IRQ, and then a certain minimum time after that, the DMA time will be the first value programmed plus seven times the second value.
@@flatfingertuning727 Yes, I've used this trick before, but there was always a small amount of jitter, because the DMC channel is not synced up with the picture (why would it be though).
@@vuurniacsquarewave5091 The jitter can be kept to within about 10 cycles or so. If software sets up the interrupt to be synchronized with the DMC, it should remain synchronized, though the fact that the number of cycles per frame isn't an integer means the interrupt handler needs to use a pattern of short-short-long frames. I'm not aware of anyone using the trick of using two DMC rate values for each interrupt, but that greatly reduces the amount of overhead necessary to maintain stable raster splits. What really puzzles me is the use of a programmed 4-bit value as a lookup into a table of 16 output rates, many of which are so slow as to be pretty well useless for outputting audio.
@@flatfingertuning727 They are still usable for very low-fi sounds. If you make the base of your sound correct at say Pitch 7, you can use only two small samples to cover almost two octaves. I'm not aware of any games using this, but I did it when I added DPCM sample sounds to a romhack of mega man 3 (never released though).
Unbelievable system. The book "Racing the Beam" was a fantastic read, and anyone who loved this video should pick up a copy. I'd love to see a technical tear-down on how Solaris (my favorite 2600 game, and an absolute technical tour-de-force!!) was done.
Man your videos help me sleep late at night. I’m not saying you’re boring, or that it’s night right now, but you absolutely help, so thanks, my insomnia can be a bitch.
Watch the Chrontendo videos, where the guy plays every NES game in order of release. His droning voiceover is perfect for nodding off to a fine, nerdy sleep. Chrontendo is as much of my nightly sleep routine as brushing my teeth.
Learning 6502 Assembly at present, this is so fascinating. It's like a super sonic relay race, so much overlapping (and yet not) code just to draw out pixels to a screen. Thanks for the explanation - my tutorial only briefly touched on it.
I knew the 2600 was a very obtuse machine to develop for, I did not know it required some kind of cosmic super intelligence to figure out. Please, I must know more about it!
Nah. You just need to memorize all the instruction timings, otherwise it would take you a very long time to get anything working. Fortunately, the CPU is pretty simple.
Fantastic video! You've got everything about the graphics right and presented it very eloquently. I would certainly like to see you do other Atari videos. Perhaps even something about the Atari 8-bit home computers? Those have a very interesting architecture, which is in many ways an "upgraded" version of what the 2600 has to work with. A little mistake, though: the 6507 does not have a store-zero instruction, as depicted at 16:48. That instruction is only present on 65C02 and 65816 processors.
Outstanding video as always!!! Wow, I can't imagine what it takes for someone to be able to keep track over each processor instruction. I suppose that that's the "magic" of mesozoic computer hardware, hehe. Cheers!
I've dabbled in Atari 2600 programming before, and while it is constrained in some senses, it gives you some ability to produce certain effects that are more difficult or impossible on other systems through the 80s. I think it's fun being able to control vsync/vblank. Supposedly it is possible to create an interlaced picture if you time it right. And 128 colors! This was in 1977!
IIRC, the Atari 2600 did "rainbow" effects really well because you could change the color for each scanline to produce a reasonably smooth gradient - no other classic system could do it as well as the Atari 2600 did.
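The "rainbow" really does boil down to one store per scanline; a minimal sketch using the standard TIA background-colour register:

```
WSYNC  = $02
COLUBK = $09    ; background colour

        ldx #192         ; one value per visible NTSC line
Rainbow:
        sta WSYNC        ; wait for the start of the next line...
        stx COLUBK       ; ...then hand it its own background colour
        dex
        bne Rainbow      ; 192 lines, 192 colour values = a smooth gradient
```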
This is a great video - excellent job! Being a current 2600 developer, I must say you explained everything perfectly - even the sometimes-confusing VDELPx registers! Subscribed! :)
Honestly: thank you. Thank you for making such videos. They teach very technical things using a topic (video games) that really interests me, and videos like yours showed me that I have a passion for low-level hardware interaction.
One cool thing to highlight, if there is ever a follow up video, is why some Atari 2600 games have black bars on the left side of the screen. I have heard that it is due to game code being executed instead of writing to screen and after this video, I can see how that is the case. Still a video on it might be interesting.
I cannot even tell you how awesome this video is. I bought a book about Atari 2600 game programming and was totally lost. This video did such an amazing job of explaining everything to me. Thank you so much.
Funnily enough, the concept of timed CPU cycles is still relevant even to this day - mostly for security and hacking prevention. The amount of time a computer takes to decrypt data can give away the data, so to make the code more secure, it's written so that it takes the same amount of time regardless of input. My understanding is that Pokemon Sun and Moon's random number generator was cracked by measuring how long it took to load your character's face on the file select screen.
Yep. The Wii's optical drive password was cracked by timing how long a wrong password took to be rejected. If you got the first character of the password correct, it would take slightly longer to be rejected than if the first character was wrong. Then you can do the same with the second character and so forth. It's called a "timing attack" and it is pretty ingenious.
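The usual fix for that class of leak is a comparison that does the same amount of work no matter where the mismatch is. A toy sketch in 6502 assembly, keeping with the thread's CPU; all labels and addresses are invented and this isn't from any real firmware.

```
Entered = $80     ; hypothetical buffers and temp storage
Stored  = $88
Diff    = $90
PassLen = 8

CheckPassword:
        ldy #0            ; running OR of every byte difference
        ldx #PassLen-1
Compare:
        lda Entered,x
        eor Stored,x      ; 0 if this byte matches, non-zero otherwise
        sta Diff
        tya
        ora Diff          ; fold it in -- but never branch on it here
        tay
        dex
        bpl Compare       ; always runs the full length
        cpy #0            ; only now does the result touch control flow,
        rts               ; so a wrong first character costs nothing extra
```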
In Pokémon S/M's case, this could be made less reliable if you install the game rather than playing on a cartridge: the 3DS' SD reader is capable of significantly faster I/O than what retail cartridges support. (I don't believe the full potential of the cartridge slot was ever officially used)
These videos that you've put on your channel explain things in a level of detail that not many other youtubers tend to tread into. This content is really insightful and you are very underrated. Subscribed!
I tried to learn how to make Atari 2600 games back when I was a high schooler with entire summers of free time. After reading all the requisite manuals and learning the basics of machine code, I got as far as making a happy face move around the screen with joystick control.
I tried to follow a 2600 programming book thinking it would be a good introduction to assembly language. It was a good introduction, but all I managed to do was get some colors and a sprite on the screen before I gave up and switched to NES.
Imagine one person coming up with a game concept, designing the mechanics and balance, creating all the artwork, writing 500 lines of code just for a basic title screen, and then your boss refuses to put your name on something that was solely your creation.
Amazing video! Yes, we would like to see more about the Atari 2600. Looking at games like River Raid and Keystone Kapers with their scrolling playfields and many sprites, they seem like wizardry after watching this video.
This amazing explanation brings so much respect to the developers who delivered so many video games back in the day. It is just amazing to understand how constrained they were while developing games with some sort of quality.
Interestingly enough, Jay Miner was involved in the development of the TIA of the 2600, the OCS of the Amiga and the ANTIC of the Atari 8-bit computers. The latter two share the idea of display lists, offloading the CPU from a lot of the beam racing tasks.
I guess the hardware guys designing the Atari 2600 did have only Pong and a couple more games as a reference, and designed the hardware around those games. Having "ball" and "missile" sprites and a mirrored background confirms that. Any other game apart from Pong (e.g. Pitfall, E.T., etc.) is literally a software hack shoehorning the hardware to different needs. Later machines (Commodore 64, Amiga, etc.) made the hardware more general, providing tiled backgrounds and a set of movable sprites all equal to each other.
Wonderful explanation! It's hard enough to pull off these synchronizations with modern emulators and debuggers like Stella; I can't imagine trying to work this out on a late-70s workstation with nothing but the code and a ROM burner. Thanks for another great video!
The 2600 is an allegory for life. We're all racing the beams of the sun, trying to time everything perfectly so that it syncs up with everyone's schedules. Only during that vblank period we call the weekend do we really have time to ourselves.
Programmers today don't really appreciate how easy they have it in programming a game. Not only do they have virtually unlimited hardware resources (storage space, processor speed, RAM, etc.), but the actual TOOLS available to create the programs (games) are FAR beyond anything that was available back then. These guys were literally using graph paper and calculators! Sometimes getting the timing right would take weeks. David Crane, the programmer of "Pitfall!", tells the story of how he initially only wanted to give the player ONE life to do the whole thing, but after he was talked out of that idea it took him several weeks to optimize and reduce his code size enough to add the feature of having (and keeping track of) more than one life. Imagine that!

It's worth mentioning that in addition to the limitations mentioned in this video, the 2600 also only had *128 bytes (yes, BYTES)* of RAM to store variables and anything else that needed to change during program execution. And although programming any kind of game for the system was a challenge, due to the programmer being responsible for EVERYTHING, that was also the thing that allowed the system to last far beyond its expected lifespan, because it meant the sky was the limit instead of the specific capabilities of a video chip.
There were only 128 bytes in the base hardware, but some cartridges released late in the 2600's production run included additional RAM (e.g. another 128 or 256 bytes that could be accessed through special reads and writes in the ROM space, and probably always coupled with expanded ROM space too, by that point). But this made the cartridges more complex to design and build, and therefore more expensive.
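One common scheme (Atari's "Superchip"/SARA, 128 extra bytes) sidestepped the cartridge port's missing read/write line by exposing the RAM twice in the ROM space; roughly like this sketch, with the label names invented:

```
; The same 128 bytes appear at two address windows inside the 4K cart space.
ExtraRAMWrite = $F000    ; $F000-$F07F: storing here latches data into the RAM
ExtraRAMRead  = $F080    ; $F080-$F0FF: loading from here reads it back

        lda #42
        sta ExtraRAMWrite+5   ; write byte 5 of the cartridge RAM
        lda ExtraRAMRead+5    ; ...and read it back through the other window
```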
@@michaellosh1851 True. RAM at that time was super duper expensive so they did the best they could at the price point they were shooting for. The original plan was to only have 64 bytes! But they were able to cost-reduce in other places and double it to a massive 128 bytes. And yeah I've heard of later games having their own additional hardware such as RAM. _Pitfall 2: The Lost Caverns_ had additional hardware for 2 extra sound channels and better display capabilities, for example. But limitations in game consoles cause something very interesting to happen: Programmers are forced to actually spend time on gameplay to get the most out of the system. And although consoles today have hundreds of times more capability I can't really say the games themselves are hundreds of times more fun to play than some of those old Atari games. I used to spend HOURS playing _Adventure_ and that's about as simple as you can get. It didn't even have any music and VERY few sound effects. But it was very well designed, especially considering the limitations of the machine.
@@sandal_thong8631 I think back in the early days manufacturers didn't know if this was going to be a short-lived fad or not, so they invested the bare minimum. Later on, though, when it was obvious that this was going to continue to evolve for many years, they actually did get sound men and women to do such things (and graphics people to work on graphics, etc.). But in the beginning the programmer had to do everything him or herself, with very limited tools and slow compilers.
This is absolutely mindblowing and, from an outsider's perspective whose job isn't on the line for this, actually sounds extremely fun in a way to experiment with.
Another fun thing about this console: it gave you a whopping 128... *BYTES* of RAM to store all your variables, shared with the call stack. If you wanted more, you had to provide it in your own cartridge. Though to be fair, you could easily have several KiB of that. It's understandable why they did it, considering the need to minimize cost and the simplicity of the games it was originally designed for. Still a shocking limitation in retrospect, though.
@@fllthdcrb I know, right? I actually knew about the 128 byte RAM because of some research I did a few years back (was curious how the technical specs of old consoles compared), but yeah, that was pretty much my reaction. IIRC, you only get 4KB of ROM too, unless you bankswap. These games are *puny* .
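Getting past the 4K ROM limit was done with address tricks as well, since the console only ever sees a 4K window. A sketch of the common "F8" scheme for 8K cartridges, assuming the standard hotspot addresses:

```
; The cartridge watches for accesses to two hotspot addresses and swaps
; which 4K half of the 8K ROM is visible to the console.
SelectBank0 = $FFF8
SelectBank1 = $FFF9

        lda SelectBank1   ; merely touching this address flips bank 1 in;
                          ; the byte that comes back is irrelevant.
                          ; The very next instruction is fetched from bank 1,
                          ; so both banks keep matching code at this address.
```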
This was in the days where computers were still a very technological and niche thing which was more in line with hard electronics rather than entertainment systems. It was a games machine in name, but it was by and large repurposed from tech with no frills.
@@fllthdcrb When I first read about the 128 bytes of RAM, I was surprised and wondered how all the graphics and game code could be stored in 128 bytes. You see, at that time, I thought that the Atari was like a modern system where EVERYTHING has to be in RAM, since the CPU/GPU cannot directly access the hard disk/SSD. But then I learned that the Atari can directly access cartridge memory. I guess that is one of the advantages of cartridge memory.
@@cylemons8099 Yup, that's a thing with these old systems. There was so little internal memory compared to the address space (not too unlike these days, come to think of it, unless you're running a NUMA system, that is), and the CPU was slow enough compared to the memory that you could attach more through a cartridge or other expansion port and have the memory-mapping hardware map it to some address range, so the CPU sees it as just more memory. Well, that wasn't always the case, though. The Commodore 64 had a full 64 KiB of RAM - already the size of the 6510's 16-bit address space - in addition to some memory-mapped I/O and 20 KiB of ROMs, so it already had to employ bank switching (hence having the 6510 with its extra pins, some of which were used to change banks, instead of the 6502 that was far more common in other microcomputers). RAM expansions had to be accessed differently (e.g. through transfers) instead of being mapped.
I knew that programming for the Atari was a struggle, but HOLY SHIT, how much thinking just to display a bitmap or a sprite! This could be as easy as setting 2 pointers, or at most a MOV loop to screen memory, if we had a decent amount of it - like, for example, the whole screen, which is not too large anyway. And it is not even interactive yet, where input must be processed, decisions must be made, and the graphics should look like something real, not just squares of the same colour... (Another thing: the animations in the video are GREAT and helpful, and they must have been much work - thanks!)
It's amazing what General Computer did for Atari - the creators of the silver-label games such as Centipede, Pole Position, and especially Galaxian! The Atari 2600 had no business playing that generation of games. Only the 5200 was supposed to be able to do those things, but General Computer Corporation made it happen.
The fact that actual human beings sat down and programmed games on this pile of garbage is absolutely horrifying to me. No wonder they were so pissed about not getting credited.
This makes the 2600 seem even more miraculous. It seems like everything about coding for this thing was some sort of compromise or bodge.
You’re not wrong.
In a big way. Lots of math involved. Plus you had to "think" like the Atari. It was challenging but fun to code for.
@@RUclipsMrP Well, I imagine the routine to do the character portrait on the start screen was a standard routine that the Atari devs had lying around.
It's very robust, actually. The real hard part is flicker management.
This was just the norm for people at the time, nothing else really existed.
"I'm not procrastinating, I'm just waiting for the scanline to sync"
What do you mean, you don't know what a scanline is?
@@ecernosoft3096 waiting for the scanline to get where it needs to be to start writing to the screen
17:35 It made me laugh that the "V-blank for graphics, active scan for game logic" pattern from the SNES/etc. is inverted here. But it makes sense upon reflection, since the Atari CPU is acting more like a graphics chip that happens to do game logic when it can spare a moment.
Many systems has a graphics co processor that reads VRAM and writes to the screen, and that's the reason graphics updates can only happen in blanking periods.
And even that is kinda omittable, by using double VRAM, which would basically be double buffer, just like how modern GPU handles things. The trade-off would be, it's twice as expensive, and there's no more DMA effects when using double buffer, also there's one more frame of delay.
(There's also screen-tearing, but that probably doesn't matter at that time)
@@chyza2012 Hence I used the word "basically". I think for vintage hardware, the only way to do it is to use 2 VRAM chip, and both the CPU and the graphic coprocessor swap chip each frame, so that they don't access any one of the two at the same time.
It's unlike modern hardware that are fast enough so having to wait for VRAM access isn't a big deal.
@@FlameRat_YehLon Dual ported ram solved this problem.
@@wishusknight3009 does that allows two devices to access the same ram at the same time?
@@FlameRat_YehLon "does that allows two devices to access the same ram at the same time?"
Essentially yes. The ram has double the number of IO ports, one set for input and one for output. The CPU can write values while the PPU can access the memory within the same clock domain. They do not need to wait for the other to finish the transaction to start.
Now only 1 can read at a time and only one can write at a time, which means that the CPU will interrupt the PPU if it needs to read memory back.
This kind of memory has also been used in computers like the Mac II FX to reduce latency and boost performance for the CPU. However in that use its benefits are more limited, it does however give some pretty big boosts to DMA transfers.
Normal sprite management: Ok, you want this sprite at (120,200)? Sure thing.
Atari sprite management: Ok, you want a sprite? And you want it to be 5 px offset? How far along the screen do you want it? Y'know what, I'll drag it along the screen and you can tell me when to stop. Tick. Tick. Tick. Okay stop? Sure. Now what graphic do you want to put on your sprite? Actually, hold on, I'm gonna need you to hand it to me row-by-row...
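For anyone curious, the "drag it along and tell me when to stop" part is pretty literal: the usual coarse-positioning trick looks roughly like this in 6502 assembly (just a sketch, untested, assuming the standard vcs.h register names and a made-up zero-page variable XPos holding the desired column):

        lda XPos       ; XPos: hypothetical variable with the desired horizontal position
        sta WSYNC      ; wait for the start of a fresh scanline
        sec
Div     sbc #15        ; each loop pass burns 5 CPU cycles = 15 colour clocks of beam travel
        bcs Div
        sta RESP0      ; strobe: player 0 latches its position wherever the beam is right now
                       ; (fine adjustment would then follow via HMP0 and HMOVE)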
Reminds me of the way early mechanical telephone exchanges worked. So you want to select this particular circuit and the mechanism starts moving and sends electric pulses every time it hits another terminal on the panel. Then the sender (control unit) tells it to stop when the brush is on the desired terminal. Of course, it also needs to translate your phone number into a terminal position first.
Thing is, convoluted sprite management used to be "normal". We take it for granted now.
@@IronicHavoc I got curious about this so I actually looked up some technical documentation about the 2600's contemporaries.
The Odyssey2 (released a year later) had four sprites that just took an X/Y coordinate and 8 bytes of graphics data. So, basically what you'd expect. However, you don't really get a proper "background layer"; instead you get a 9x10 grid of lines (which you can specify as filled-in) and the ability to place text on the screen along with your four sprites. So your ability to draw a playfield is really limited.
The Fairchild Channel F (released a year *earlier*) has a proper framebuffer of... 95x58 pixels. No sprites as far as I can tell. I'd like to know how the hell they were able to fit video RAM in their design given that both Atari and Magnavox avoided it like the plague for cost reasons.
All that aside I'm still going to call out the 2600 TIA *anyway*, purely because of the whole "position sprites with cycle-timed code" thing.
@@SuperSmashDolls The Fairchild Channel F used dynamic RAM for its frame buffer. Dynamic RAM was about 1/4 the cost of static RAM, but the hardware to support random access was rather a nuisance because one had to multiplex the high-order and low-order parts of the address bus, and also ensure that no row would go very long without being accessed. The Fairchild and Apple II both exploited video memory scanning to take care of refresh, but if I recall the Fairchild was rather limited in terms of when the CPU could access the display.
Here's what weirds me out about it: you wait to write to RESP0 or RESP1, but you only have to do it once. The system remembers it until you update it. So why couldn't you just write a value to whatever internal register handles that?
In the Atari 2600 enthusiast community, programmers (especially ones that programmed the classics like Pitfall) are often given titles of great respect such as "programming wizard" and "code magician" and "miracle worker". This video illustrates why these titles are not entirely cheeky little jokes made by the connoisseurs of the classic gaming community. :)
I've heard several former Atari employees refer to themselves as "engineers" rather than "programmers," as they viewed programming for the VCS to be not simply writing code, but examining systems and figuring out ways to make said systems more efficient to be able to execute said code on the VCS hardware.
I'm only half joking when I say programming for the 2600 is only a step above chiseling 1s and 0s into a stone tablet lol.
Fitting, because i don't know much about coding so this all sounds like black magic to me
this is seriously some exapunks/tis-100 type stuff but even harder
Dev: "nice console, how much video memory does it have?"
Atari: "no"
I knew that Atari had built the 2600 cheap, but I didn't realize it was this cheap.
@@XanthinZarda The thing is, if you look at other consoles of those days, the Atari ended up being able to outperform some of them BECAUSE the lack of video memory allowed fancy tricks that couldn't be done with the low amounts of video memory that other systems had. It was very much a trade-off of being harder to develop for, but with a higher bar of what could be accomplished.
@@XanthinZarda It wasn't so much cheap at the time as not ungodly expensive. Memory chips were ridiculously expensive back in the 70s. The fact that Apple used a frame buffer in the Apple II was kind of shocking back then. Using registers and shifter chips was a way of getting around making the console over a thousand dollars. The 2600 did have some working RAM, but it turned out to be only 128 bytes (!). That was less than a single addressable page using 8-bit addressing. Stack overflows were... interesting.
@@thewiirocks Let me guess... Most of the stack was a big hole?
@@josugambee3701 Stack counts downward from the end of memory. So you would usually put your stack pointer at the end of the 128 bytes at startup. Then you'd put the variables you tracked at the beginning of memory and hope that these two pools of memory never collided. You really had to watch sub-routine calls as it was incredibly easy to push too much onto the stack and exceed the 12 to 32 bytes you might have reserved for it.
If I remember correctly, attempting to access the memory hole would just wrap around the 128 bytes. But it's been a long time so don't quote me on that.
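For a feel of how that 128 bytes gets divvied up, here's a minimal sketch in DASM-style syntax (label names are made up, assuming the usual vcs.h include):

        SEG.U vars
        ORG $80         ; the single 128-byte RAM page lives at $80-$FF
Lives   ds 1            ; variables get claimed from the bottom up...
Score   ds 2

        SEG code
        ORG $F000
Start   ldx #$FF
        txs             ; ...while the stack pointer starts at the very top and grows down
                        ; (on the 6507, $01FF mirrors the same physical byte as $FF)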
you know what topic I think would be perfect for this channel, that I haven't seen a proper explanation of in video form anywhere? a video about the way level data is stored in super mario bros, explaining why exactly the minus worlds look like normal levels and not garbage tiles. I think that would be really cool
It totally makes sense to me you're in this channel
I'm just guessing, but I think most levels were built from objects with dimensional parameters. With a few level primitives you could build fairly complex levels using very little storage.
I think SMW used a similar method, but at this point I'm just talking out of my ass.
@@IntegerOfDoom Kind of, it's a few pre-made layouts, which can be changed with extra objects that "spawn" chains of blocks, coins, or other background objects on top of the base layout. So to have a line of say, nine ? blocks, you only need a single object at the right place with the property that says "spawn 9 ? blocks to the right as the screen scrolls". For example, those staircases made from blocks at the end of many levels are actually all from a single object. This is also the technical reason why going backwards in SMB1 was not feasible to allow.
Groovy, now if only these answers were in video form
The reason the minus worlds look like normal levels, I think, is that the calculated index into the level data sometimes lands in the normal range, so they use real levels (I think the game wraps the level index around in some way if it's out of range). For the minus world the index just happens to end up pointing at a water level.
Edit: The actual reason it looks the same is that the game reads unrelated data. It first gets the index for the respective level id from an array holding these indices, but because the world number is way higher than 8 (it's 36), it ends up reading a value from the array located right after, the one with the level ids, which happens to be 0x33. It then plugs this value into the level id array, but ends up reading a value out of bounds in the enemy data after the array (1), which corresponds to 7-2.
This is probably as close as you could get to bare metal. It's a miracle this system works at all, and it had such a long lifespan alongside other much more advanced systems.
I'd argue it's just as much bare metal as anything else, you just have to handle a lot of things typically handled by the hardware in software. If anything, it's less bare metal, cause more things are handled in software ;)
@@berylliosis5250 you don't understand what that term means, do you?
@@vyor8837 If you're saying it means anything vastly more specific than "no operating system, interfacing directly with hardware", then no, I don't know what it means.
I'm just suggesting that not having specialized graphics hardware doesn't make the Atari 2600 "more" bare-metal than (for example) the NES; they both have no OS and directly interface with hardware (the only difference is how advanced that hardware is). The last part was a joke, hence the ";)".
@@berylliosis5250 Fun fact: NES programmers couldn't really interface with the RAM the way the atari did. Thus: the atari was more bare metal.
@@vyor8837 Is there software in between the NES and it's RAM? I doubt it. I think "more" or "less" bare metal is pretty meaningless.
Me: *struggling with Unity*
The chads making Atari 2600 games:
Me: *Breezing with Unreal Engine 4*
Me: procrastinating on working on my Unity projects
@@GconduitYTubeAccount lol 🅱️atari unfunny
@@leandersmainchannel4493 Yeah. Not the greatest name. batari BASIC itself is pretty sweet though. Look up Princess Rescue for an example of what you can do.
Strangely I struggle with Unity but I can create Atari 2600 games lol
To be clear. This "vramless design" was not only never done since. It hasn't been done before. The Fairchild Channel F which predated the Atari 2600 actually had 2k of traditional VRAM. This was done because RAM pricey.
It was kind of a blessing in disguise. Beam racing is what made the Atari 2600 thrive, since you could deliver full-color games of similar quality to competitors' for cheaper. At the cost of programmer sanity, of course.
There are a few other systems that used beam-racing, although not quite so intensely, and usually with coprocessor support, such as the Copper on the Amiga, HDMA on the SNES, and the polygon setup coprocessor on the Nintendo DS that let it fake 3D graphics using 2D sprites. And there were a lot of great programming techniques which were enabled by these primitive graphics architectures which have been more or less forgotten by today’s programmers.
@@fluffycritter just to add one more that is much less widely known, the Apple IIgs had relatively weak graphics hardware that certain programmers were able to stretch with beam racing techniques. Nothing like this, though; at least it had real frame buffers.
@@fluffycritter I imagine it was at least possible on nearly all 8- and 16-bit platforms, although they were less likely to need it given how much they could achieve without it. I know the Commodore 64 lets you do it; 8-Bit Guy demonstrated a program that cycles the background color as quickly as the CPU can manage, and it was changing every few characters' worth of pixels.
@@stevethepocket Yeah, beam racing is possible pretty much everywhere, with a few exceptions (like the original PC CGA, which shows visible snow if VRAM changes outside of HBLK), but I was talking more about things that were *designed* around it and made it a fundamental part of its operation.
Beam-racing was used in a lot of C64 games and demos, though, yes - you could get some really neat effects with it, and you could also multiplex sprites to get more than 8 on the screen. Lots of NES games did similar things as well.
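The C64 background-colour demo mentioned a couple of comments up really is about as simple as it sounds; the core of it is something like this (a sketch, not the actual program from the video):

        sei             ; keep interrupts from stealing cycles
Loop    inc $d021       ; bump the background colour register as fast as possible
        jmp Loop        ; ~9 cycles per change vs ~63-65 CPU cycles per scanline,
                        ; so the colour flips several times across each line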
Lol that's like the opposite of developers philosophies now
And this is why all Atari VCS emulators must, by definition, be cycle-accurate.
It's a mixed blessing that there's so little to reproduce. Imagine the insanity if even the NES had to have that amount of bit perfect emulation.
@@Toonrick12 this is what made the Amiga difficult to emulate, there was lots of cycle level accuracy needed to get it to work properly, especially with the copper but also video output in general. Early emulators wouldn't handle demos or late generation games and it took a lot of years for host platforms to be fast enough to get it right.
@Toonrick12 that might change with the advent of a new game called Former Dawn. It relies on cycle-accurate timing on the NES to push more graphics to the screen than was thought possible. In particular, the colors use the rarely used masking bits to get more color out of the system. Anything less than cycle accuracy will not work; only real hardware can do it right now.
This is insane, and I thought sprite management on the Game Boy was complicated... Great work, you must have spent a lot of time researching and animating this and it shows!
“Game boy sprites are complex”
Me: *cries in wannabe retro game maker
There are quite a few systems out there that use the CPU to generate the video output.
The ZX80/81, the Galaksija, and the Gigatron among others all generate video with the processor. But they all have video memory, so they can treat the display as a state machine.
Then, when it comes time to actually *draw* the video on-screen, the CPU just bitbangs whatever's in video memory to the display.
The Atari doesn't even have that: Games actually have to use logic in real-time to decide what gets displayed, as the electron beam scans the screen.
I suppose it could be thought of as a really fancy data compression, since fundamentally all the data that you need to figure out what's on-screen is still within memory, just not in a format that you typically associate with video displays. And data can usually be compressed *extremely* efficiently.
The inverse of this would be to pre-render frames for every possible combination of inputs you could make.
But this extremely quickly adds up to *many many googols* of combinations, so it's scientifically impossible to do except for extremely simple games.
For the record this is basically what laserdisc games do.
The game engine could be thought of mathematically as an extremely sophisticated single-use data encoding/compression, which encodes every possible frame that the game could ever output. By playing the game, you're *decompressing* the data into output frames. From an abstract mathematical view anyway.
But the benefit of this is that the game is kilobytes instead of googolbytes.
The way the Atari makes its video is basically the same general concept.
@@jonathanfaber3291 Don’t worry. Gameboy sprites are a bit weird, but not too weird. Certainly not *nearly* as bad as the stuff shown in this video.
The 8x8 or 8x16 (chosen by a bit in the LCDC hardware register) 2 bit depth tile data is stored in a certain part of VRAM the same as any other tile you would use for the background or window.
Sprites themselves are stored in OAM (Object Attribute Memory) as 4 byte entries. You can have a maximum of 40 sprites (though there is a way to get around that with DMA transfers if you really need to) at a time, but a maximum of 10 will display on a line.
Each entry contains the X position, Y position, which tile to draw, and some flags like flipping, priority over background/window, and which palette to use. The tile data and palette stuff is a bit different on the Gameboy Color, but largely the same.
TL;DR: Put a little picture of the sprite somewhere in memory and then tell it the position, which picture to use, and some other stuff somewhere else in memory.
I would highly recommend the Ultimate Gameboy Talk video for a nice visual explanation about sprites and other aspects of the Gameboy (you can skip the pixel FIFO section if it confuses you, it won’t matter much to most programmers). The Pan Docs website is also a good general reference for the system.
If you want to get into development, look into gbdk-2020 and the myriad of videos and tutorials for making Gameboy games. You can skip assembly and program in C if you really want to. There are also tons of tools for designing graphics and animations that take care of most of the code for you.
When I learned how graphic compression on the Gameboy worked, I was like "oh, that's real clever". With the 2600 it's only "oh dear God, why?!"
Management : so what did you get done this week?
Dev: I got a single pixel of a sprite to display on screen at the right position.
Management: good job 👍
Your colleagues might realize that was a good job, I'm not so sure that management ever did.
"Well hurry up, your game is going to production in a week."
"Here's my doctoral thesis on how to draw a static image on the screen."
I really enjoy how, when you demonstrate the features of a particular system, you use typography and graphics that match what real software looks like.
E
It's a bloody damn miracle anything was programmed for this machine
This is the most fucking cursed method of drawing graphics to the screen that I've ever seen, by a long shot. This video is really good and an interesting breakdown of it, but I honestly had trouble get through it. It was like watching someone get kicked in the balls for 40 minutes
On the bright side, you now probably know why graphics on the 2600 were so simple and chunky, right?
And each ball kick is started 3 cycles, plus 68 before it needed to land.
it was smart lol big brain time
Is it bad that I kinda love it
As a 2600 homebrew programmer (Skeleton) I've raced the beam. But you've done an excellent job explaining the 48 pixel sprite routine and how games can perform mid-line updates to create different effects with such a small number of properties.
One quibble is vertical resolution isn't a fixed value on the VCS - the number of active lines varied considerably between games. And because VSYNC is under software control, some games also didn't have the "correct" number of lines per screen (which can cause issues with some modern TVs).
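To illustrate the "VSYNC is under software control" point: the whole frame structure is just code the program runs every frame, roughly like this (a simplified sketch using the usual vcs.h names; the exact timer value and line counts vary from game to game, which is exactly why some games end up with the "wrong" number of lines):

Frame   lda #2
        sta VBLANK      ; blank the beam for the top of the frame
        sta VSYNC       ; start the vertical sync pulse...
        sta WSYNC       ; ...and hold it for the customary 3 scanlines
        sta WSYNC
        sta WSYNC
        lda #0
        sta VSYNC
        lda #43
        sta TIM64T      ; RIOT timer: roughly 37 lines' worth of vertical blank
        ; ... game logic runs here while the beam is off screen ...
Wait    lda INTIM
        bne Wait        ; spin until the timer says the blank period is over
        sta WSYNC
        sta VBLANK      ; A is 0 here: beam back on, now go draw the visible lines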
Some trivia: I read somewhere that the sinking-into-water animation/effect in Pitfall -happened by accident- _(that seems deliberate, see the replies)_ . They moved the sprite further down the screen, but they hadn't written any code to draw the sprite after a certain scanline. Thus, the bottom part of Pitfall Harry gradually disappears, as if he was sinking.
I'm not sure I buy the accident story; the code seems to be specifically designed for this effect and I'm not sure why they'd write a routine to move the sprite down if you didn't already have the sinking effect in mind. Pitfall uses a complicated 9 part kernel routine to draw the screen - with each kernel drawing a separate strip. Kernel 6 (@$F2C2 in memory) is specifically written to not draw Harry, just the bottom of the ground objects and tops of holes/pits. Kernel 7 (@$F466) which is called instead when there is a ladder on screen, immediately starts with code to see if it should start drawing Harry. The lower strip kernels call the routine to draw him (hence why he can go down the ladder and into the tunnel). I suppose they could have gotten the effect by accident in an earlier version of the code, but what's in the final game is very purposeful in the effect.
@@3vi1J I found my source! But it seems it was deliberate, not by accident. So, you're right.
“Pitfall Classic Postmortem With David Crane Panel at GDC 2011 (Atari 2600)” at 32:40
ruclips.net/video/MBT1OK6VAIU/видео.html
RGME spent more time making this video than HSW was given to program ET
This is OUTSTANDING! I have been working on a long term Atari 2600 project, and have done my fair share of research - never have I come across a presentation this organized and tight, which didn't dumb down the concepts. I would very much like to see you deconstruct, in such clear and detailed fashion, topics like the venetian blinds technique or multicolor through flickering. The ultimate would be a full .asm deconstruction.
Every time I learn more about the A2600, my apprehension about fooling around with it gets stronger.
Same, although I think the audio is interesting. There's not much in the realm of game music on this system, and part of the reason was how much attention the display needed.
@@vuurniacsquarewave5091 That, plus the fact that the sound hardware is unable to reproduce most notes in a scale with sufficient accuracy that they don't sound out of tune. I think Pitfall 2 has a custom chip in the cartridge to be able to get the frequencies right (and to fix a few other missing features in the hardware).
@@possible-realities Yes, it's incredibly restrictive. You have to come up with your arbitrary tuning frequency and relate the existing pitches to that to get anything that sounds like music, relatively speaking. But this will still drive someone with absolute pitch mad.
@@vuurniacsquarewave5091 Idk, I think my port of some Zelda 1 music (check my youtube) sounds pretty decent.
The 2600 is in some ways surprisingly easy to code for with the aid of a modern emulator. Basically, you have to figure out what will need to happen at each spot in the screen, and then work from top to bottom, writing and testing the code that does each strip. In many cases, the screen-drawing parts of the code will be surprisingly straightforward. Mostly cycle-counted loops which load something using (zp),y addressing, write it to a display register, load something else with (zp),y addressing, write that to a display register, etc. The hard parts are the zones between display stripes where it seems like there's the smallest amount of stuff going on, but code is actually preparing the pointers for use in the next zone. If one leaves a few lines between zones in the screen layout where all one needs to do is keep showing one sprite, however, even those parts aren't too bad.
In my Toyshop Trouble game, I used 32 bytes of RAM to hold the shape of the 17-row player sprite in the two 16-line zones where it appears. There were I think about four different coloring patterns, all of which were green in the top and bottom row, so I could simply have one set of zero-page pointer values for even zones and one for odd zones, and show a suitably offset 16-color pattern. The game looks insanely complicated, and eking out the last couple cycles when clipping the left or right edges of sprite shapes was hard, but if one isn't trying to draw four independent-shape sprites per scan line the approach of setting up zero-page pointers to minimize the amount of decision making during scan lines will leave one with enough time to do a surprising amount.
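For anyone who hasn't seen one, the kind of cycle-counted (zp),y display loop described above looks something like this when stripped to its bones (just a sketch; PFPtr and GrpPtr are hypothetical zero-page pointers set up during the blank, and a real kernel is a lot hairier):

        ldy #191        ; one pass per visible scanline
Kernel  sta WSYNC       ; re-sync to the left edge of the line
        lda (PFPtr),y   ; (zp),y fetch of this line's playfield byte...
        sta PF1         ; ...written to the TIA before the beam gets there
        lda (GrpPtr),y  ; same idea for the player sprite slice
        sta GRP0
        dey
        bne Kernel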
Have we gone too deep? Never. The steps the programmers had to take just to draw one square on the screen is crazy
Brains so wrinkled they clipped into themselves.
This makes the NES seem like a supercomputer
NES is pretty fast, but you have to race the vertical blanking interval rather than the beam. If your graphics code takes too long you can get glitches
The NES was decent for its time. It had hardware support with the PPU and its own sprite layer - something even the Amiga didn't have.
The NES really seems like the first major console that wasn't a complete nightmare to program on.
@@williamdrum9899 not as bad as it sounds but yeah it's hard to cram in stuff like nametable changes
At least with the Amiga they gave you a hardware bit-blit, so you didn't really need hardware sprites.
holy, poor atari guys lol
got my head spinning of so much stuff to do just to display a single image..
awesome video dude, you always deliver quality
From what I understand, the Atari was basically designed to be a Combat-at-home machine. Everything else is just a bonus.
Sure; the Famicom was supposed to run a competent port of Donkey Kong.
It can also run Elite, Prince of Persia, and Kirby's Adventure.
No, it's meant to run pong.
So the thing is, Atari originally intended for it to just run Pong, Combat and a few other super simple games. They were even planning to replace it with a more powerful console after 1977 before their parent company, Warner Communications, made them keep supporting it. And so after that, pretty much every game made for the 2600 had to be made with extremely clever and complex programming tricks and exploits that Atari never even knew existed at first.
@@XanthinZarda Only the PAL Famicom can run Elite. The NTSC versions don't have nearly enough scan lines in vblank.
It was a Pong clone machine with a few bells and whistles added to handle Combat type games. Every one of its capabilities made it easy to program Pong or Combat type games, but sorcery to program anything more complicated. It was meant to have maybe 5-10 games, all of which would be Pong or Combat ripoffs, and then hopefully it would generate enough money to pull something better together.
Unfortunately programmers figured out how to trick more and more life out of the darned thing. The hardware that was meant to become its successor became the Atari 400/800, and didn't get released as another dedicated games console until the 5200 finally came out, by which point it was dated.
Programmers back then were arcane wizards.
Limitations breed creativity
The world need more arcane wizards
Technomancy is a beautiful thing
I tried doing some 2600 coding as a kid, boy was it frustrating to learn, but extremely satisfying when it worked!
Later I would do some 6502 on the atari 8 bit systems and while still extremely primitive was much more flexible.
Today's programmers never had it so good, what with hardware acceleration, APIs, 3D and multiple engines to do the heavy lifting; they don't know what it's like to sync hardware by effectively counting execution time.
Good indepth video as usual, brought back some of the nightmares I'd almost forgot!
Hardware acceleration is one of the best things ever imo, makes drawing animated UIs that much easier
I think this is kind of a detriment in a way, to be honest, that modern programmers don't have to worry about any of this stuff. It's why we get games like CP2077 that are just so terribly optimized because programmers just don't have to care, so they don't care and they just jumble spaghetti code together, whatever works, throw it in there. Meanwhile back in the 8 bit days, you had to count in machine cycles to get things to line up just right, and you had to squeeze every last bit you could out of the code. If modern coders did this, counting down to the singular bits, games would be insanely optimized, but it would also take years and years to make anything, sadly. But still. I think some modern developers could do a little better and stop rushing buggy messes out the door and try to optimize a little better.
The Atari 8-bit was primitive? Have you ever looked at the other 3 home computers that were out at the same time? Radio Shack, Commodore Pet, and the Apple? Talk about primitive. The Atari 8-bit had a graphics co-processor, 128 colors (later models had 256), built-in horizontal and vertical scrolling, a sound chip (4 8-bit channels, or 2 8-bit channels and 1 16-bit channel, or 2 16-bit channels), and three different ways of doing graphics (bit-mapped, character graphics, and sprites, known as player/missile graphics). Its 6502C microprocessor was clocked ~79% faster than an Apple's.
Those other 3 computers didn't even come close to the Atari.
Incredible video! As an experienced Commodore programmer, I'm very familiar with 8-bit low-level software. I've read documentation on the 2600 and was horrified. You have brought that horror into plain view for the masses. It is TRULY amazing what the programmers could do with 128 bytes of RAM, 4K of ROM, no operating system, and no GPU. Anyway, thanks for explaining it with real-life examples in detail!
I am blown away by this, both the complexity of the 2600 and how you were able to explain it so painlessly.
It took me 10 minutes to realize that the weird gap in the repeated sprites was deliberately so they could mesh into a 6-wide block, so all that register juggling was the expected method by the hardware designers.
I'm not sure that was the main purpose, or even realized that it was possible at the time. I think it was mostly there to have more things to shoot at (e.g in Combat) or have a "team" to maneuver (e.g. Football). I can't think of any games that used this 48-pixel wide "six sprite" programming strategy for scores or other things in the first few years of game releases.
I think it was originally worked out by someone at Activision, and quickly became a widespread technique.
@@michaellosh1851 I believe the infamous Dragster uses the "six sprite" technique to draw the two dragster sprites, and maybe the timers.
So what you're telling me is the people who designed the Atari 2600 hardware are pure sadists and enjoyed watching poor early software engineers squirm
Don't blame the hardware guys, blame the management who wanted something as cheap as possible. That's the REAL story here.
Keep in mind that the 2600 was only supposed to run a handful of games like Pong, Breakout, and Combat. Once you realize that the hardware was designed around these games, it makes a lot more sense. Not only that, it looks downright easy to program for. What was unexpected was that the development teams kept figuring out how to push the system further and further. Eventually most of them left and formed Activision which created MIND-BLOWING quality on the 2600. (e.g. Pitfall, H.E.R.O., Kaboom!, etc.)
@@gordontaylor2815 I’m not really sure if blaming management would even be right. The market probably wasn’t able to bear the 50% price increase
@@thewiirocks Sure; but unlike a certain company known for Hanafuda cards and leaving luck to Heaven, Atari didn't intend for their pong machine to be expandable. Which would later kick them in the goiters, stunting the growth of Atari entirely.
@@XanthinZarda - the Atari was made in 1976. It used cartridges, which made it expandable. Nintendo didn't outsell it for the next 10 years.
The fact this could pull off more than simple pong-style games boggles my mind. Huge kudos to anyone who has developed for the 2600.
If I understand correctly... the Atari 2600 combines the worst parts of sprite and CPU driven graphics into the most unholy display system of all time?
Basically. Everything about this system is chosen because of price. Even if it means programming for the system is hell.
Don't forget the detuned-piece-of-crap POKEY chip.
@@someguystudios23 The POKEY is quite an interesting beast of a sound chip. It's one of my favorite ones actually, well not because of its tuning issues. The TIA's tuning is so utter crap.
@@TheBeeshSpweesh Have you heard the BTP2 music driver used by Stella's Stocking? Four voices of five octave full chromatics with two volume levels, using 46 cycles out of every 76.
@@flatfingertuning727 I've heard the Stella's Stocking music, yes.
It's stuff like this that makes me so obsessed with early video games... in our age of near-infinite memory and processing power, it's just never going to be as impressive a feat to me if you create a stunning AAA perfect-graphics title as it would be to create something interesting within the extreme limitations of early consoles and computers.
Limitations breed creativity. So many pieces of hardware or genius bits of code were invented because someone wanted one extra sprite on the screen, or a few more polygons on the character model without nuking the framerate.
Console = computer
Imagine what developers could wring out of modern hardware if they used resources as carefully and creatively as they did back then.
@@googiegress My opinion is that back then the complexity was on the software side, the hardware was simple. Nowadays it's the opposite: writing the software (in user mode at least, API, drivers and OSs are still hard to program) is kind of simple (if you don't have huge optimization constraints), but the hardware is a lot more complicated than back then.
@@benjib2691 What do you think about the efficiency loss when using software development tools that aren't purpose-built to that specific implementation?
An example might be a video game using an engine that was designed for all types of games, whereas when writing the game in an engine designed specifically and only for that type of game you have more limitations but also no extraneous bloat in the engine.
One might argue that there's an efficiency loss at each layer from the hardware instructions up to the customer booting up the .exe.
I feel that the end result, of possibly less-efficient software, is still pretty great because today's development tools lower the barrier to producing the software in the first place. Using standard tools means other people can more easily build upon it later. And, even if the developer went through all the effort to write his own ~everything~ that his game or whatever depends on, maybe he would come up with less-efficient specialized solutions than the industry standard generalized ones.
Imagine not having a framebuffer to work with in 1977.
*This post made by the Channel F gang*
There were more devices that didn't, example the NES, it did everything using tiling backgrounds and 8x8 sprites. Still, heaven compared to 2600 coding
@@iDontProgramInCpp Yeah but the Channel F was actually released before the Atari. Not over half a decade after.
@@stevethepocket I know, I just added a remark about the NES
The thought of programming for the 2600 made my brain collapse.
Thank you for a superb explanation of this headache rabbit hole.
I was cringing in horror not even five minutes in.
I'm never going to complain about any other language again dear god.
I now understand every Atari game I ever played. Or rather, I now understand the idiosyncrasies in their graphics.
In Yars' Revenge for example, the Missile and the Zorlon Cannon must both be the two missile sprites. The pellet shot is the bullet sprite, and the Quotile/Swirl and Yar are the two player sprites. The neutral zone in the middle is just a constantly fluctuating background; that's why it's made of thin bars. The shield is a background, with the rotating version just being a cool way to make it work. And it moves up and down because you have pixel precision on vertical movements for backgrounds, but not horizontally.
Indeed, that last bit explains *so many* Atari games and their penchant for vertical scrolling.
In Adventure, I knew objects flickered when 3 or more were on the screen, but I now know why (they're all player sprites, and you can't have more than two render on the same scanline, so to make things simple, it only renders two of them ever). Also, the player is a missile sprite, so that's why it doesn't flicker.
I also know how the labyrinths work and why it flickers like an object: it's a "player" sprite. The background is made the same color as the default color, but this "player" sprite is made to appear behind the background, so it makes the background visible.
And I even know why the famous dot is a *dot* (it's the bullet sprite, since that's all that's left at this point). And I *bet* the barriers the dot makes disappear are actually the other missile sprite.
The neutral zone is more than just a constantly fluctuating background, It's the graphical representation of the actual game code in the Yars' Revenge Rom itself, overlaid on itself and counterscrolling, with some x-y offsets and random color processing thrown in.
The "famous dot" was a player sprite which just so happened to only have a single pixel set. The adventurer's sprite was the Ball, which shares its color with the playfield. Rooms that had just a left wall or just a right wall would use the missile sprites for those purposes, and their colors were shared with the player sprites. The game was coded to allow passage through the right wall if its color matched the background, which would only occur if the dot was in the room with at least two other objects.
@@flatfingertuning727 My mistake. The dot would have to be a player sprite, since the way you find it is that its room causes flickering when there's 1 fewer items than you would expect to cause flickering (ie: when you bring any item to that room).
Developing games today: "If our game is too heavy on the console's cpu, the frames will take longer to render and it will look choppier... Let's launch it like that"
Developing games back in the day: "The game won't f**king work if I write/read too many variables before the next line needs to be rendered"
Atari 2600 is so slow, you gotta budget your CPU usage by scanline sometimes :)
@@mzxrules ._.
Typically, if a game used WSYNC after tricky scan lines, trying to put too much into such scan line would cause all subsequent lines to be shifted down on the screen, and possibly cause the frame length to be a line too long (depending upon whether the RIOT timer was set at the start or end of the visible part of the frame). There were quite a few games where this would happen if some characters were too far toward the right edge of the screen.
@@flatfingertuning727 These devs managed the impossible with the cpu of a calculator, meanwhile I get frustrated when I forget a semicolon and the whole code won't compile.
Modern publishers: You spent eight years getting this game to a barely playable state. Mind doing another two years to make the women uglier? Make them look like men. Our investors are mad at us for having women in our game - "oversexualised" they call it. No, don't try to fix those bugs or make the game a little harder - in fact we'll replace you with this guy with the pronoun pin.
Ancient publishers: Six months. Six months to make E.T. for our bare metal platform without even so much as a sodding BIOS. We're also going to print millions of them despite the fact that our install base isn't even that big. No, we won't give you another week to fix the collision detection or make E.T. brown.
Your explanation of vertical delay is hands down the best I've seen. I've been working on an Atari 2600 emulator for fun and mostly going off the Stella guide, and even though I've read its one cryptic paragraph on vertical delay 50 times at least, I never made the connection between that feature and its application to complex graphics. Using your video, I was able to get E.T.'s face (and so many other things) to display properly in my emulator. Thank you!
This is a brilliant explanation of this process! I can't imagine having to write code for this console without going mad.
There's a reason why when Atari 2600 enthusiasts call those programmers "wizards" or "magicians", it's only partly in jest. :)
Gamers: I know ET for the A2600 is the worst game ever.
A2600: I require a phd to draw a funny face, and a sacrifice of your smallest limb to make a game. Kneel before me tiny human.
A2600: My soundchip uses an alien scale. Good luck making music your puny human mind can understand.
E.T. was overprinted, it actually sold quite well. It was also horrendously rushed. There's a romhack that fixes its problems and makes it one of the best A2600 games ever made.
For ACTUAL bad games, we have... Concord? Dustborn? Veilguard? To name but three...
17:36 Quite hilarious how the relationship between v-blank and drawing space ended up being reversed on the NES and newer consoles. Then again, the Atari 2600 is such a primitive system that the CPU has to handle almost all of its graphics manually (leaving only the pixels themselves to the TIA, the Atari 2600's equivalent of a PPU), so the only time it can do game logic is when it isn't busy drawing the screen. As a comment below mentions, it truly is more like a graphics processor that does the expected CPU stuff only when it's not drawing.
This contrasts with e.g. the NES, which instead tells the comparatively advanced PPU how to draw the final screen during v-blank and only interferes with the drawing process if something needs to be changed mid-screen (e.g. status bars, parallax scrolling), to the point that it doesn't come with raster interrupts by default (besides the non-programmable NMI) and needs them to be provided by MMCs!
Either way, the developers, much like computer scientists back in the day, surely did a lot of theoretical work first before tackling the game proper, while nowadays you don't even need much programming knowledge beyond some very abstract code to create a new game.
There was an IRQ functionality planned for the NES. A decap of the 2A03 revealed an unfinished IRQ timer on the die that was never connected to anything. So in the end it almost actually happened.
@@vuurniacsquarewave5091 It's possible to set up IRQs to happen at predictable times on the frame using just the NES hardware, using the DMA audio to output dummy data. If one writes the DMA rate shortly after an IRQ, and then a certain minimum time after that, the DMA time will be the first value programmed plus seven times the second value.
@@flatfingertuning727 Yes, I've used this trick before, but there was always a small amount of jitter, because the DMC channel is not synced up with the picture (why would it be though).
@@vuurniacsquarewave5091 The jitter can be kept to within about 10 cycles or so. If software sets up the interrupt to be synchronized with the DMC, it should remain synchronized, though the fact that the number of cycles per frame isn't an integer means the interrupt handler needs to use a pattern of short-short-long frames. I'm not aware of anyone using the trick of using two DMC rate values for each interrupt, but that greatly reduces the amount of overhead necessary to maintain stable raster splits.
What really puzzles me is the use of a programmed 4-bit value as a lookup into a table of 16 output rates, many of which are so slow as to be pretty well useless for outputting audio.
@@flatfingertuning727 They are still usable for very low-fi sounds. If you make the base of your sound correct at say Pitch 7, you can use only two small samples to cover almost two octaves. I'm not aware of any games using this, but I did it when I added DPCM sample sounds to a romhack of mega man 3 (never released though).
Unbelievable system. The book "Racing the Beam" was a fantastic read, and anyone who loved this video should pick up a copy. I'd love to see a technical tear-down on how Solaris (my favorite 2600 game, and an absolute technical tour-de-force!!) was done.
As a developer, this was deeply fascinating to me, I love all of this retro stuff. Thank you for the great content.
Man your videos help me sleep late at night. I’m not saying you’re boring, or that it’s night right now, but you absolutely help, so thanks, my insomnia can be a bitch.
Watch the Chrontendo videos, where the guy plays every NES game in order of release. His droning voiceover is perfect for nodding off to a fine, nerdy sleep. Chrontendo is as much of my nightly sleep routine as brushing my teeth.
Learning 6502 Assembly at present, this is so fascinating. It's like a super sonic relay race, so much overlapping (and yet not) code just to draw out pixels to a screen. Thanks for the explanation - my tutorial only briefly touched on it.
understanding the content of your videos makes me feel smarter and for that I say thank you
I knew the 2600 was a very obtuse machine to develop for, I did not know it required some kind of cosmic super intelligence to figure out. Please, I must know more about it!
There's a reason why the enthusiast community calls the programmers "wizards" or "magicians" - it's not entirely a joke. ;)
Nah. You just need to memorize all the instruction timings, otherwise it would take you a very long time to get anything working. Fortunately, the CPU is pretty simple.
Fantastic video! You've got everything about the graphics right and presented it very eloquently.
I would certainly like to see you do other Atari videos. Perhaps even something about the Atari 8-bit home computers? Those have a very interesting architecture, which is in many ways an "upgraded" version of what the 2600 has to work with.
A little mistake, though: the 6507 does not have a store-zero instruction, as depicted at 16:48. That instruction is only present on 65C02 and 65816 processors.
Outstanding video as always!!!
Wow, I can't imagine what it takes for someone to be able to keep track over each processor instruction. I suppose that that's the "magic" of mesozoic computer hardware, hehe.
Cheers!
I've dabbled in Atari 2600 programming before, and while it is constrained in some senses, it gives you some ability to produce certain effects that are more difficult or impossible on other systems through the 80s. I think it's fun being able to control vsync/vblank. Supposedly it is possible to create an interlaced picture if you time it right. And 128 colors! This was in 1977!
IIRC, the Atari 2600 did "rainbow" effects really well because you could change the color for each scanline to produce a reasonably smooth gradient - no other classic system could do it as well as the Atari 2600 did.
And it came out at a time when most video games, including arcade games, were in black and white.
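The gradient trick is about as simple as 2600 effects get; the heart of it is just something along these lines (a sketch, standard vcs.h names):

        ldx #0
Lines   sta WSYNC       ; once per scanline...
        stx COLUBK      ; ...hand the TIA a new background colour
        inx             ; step the colour value as the beam moves down the screen
        cpx #192
        bne Lines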
This is a great video - excellent job! Being a current 2600 developer, I must say you explained everything perfectly - even the sometimes-confusing VDELPx registers! Subscribed! :)
Honestly: thank you
Thank you for making such videos.
They teach very technical things using a topic (video games) that really interests me and videos like yours just showed me that I have a passion for low level hardware interaction.
same, you put my thoughts into words
One cool thing to highlight, if there is ever a follow up video, is why some Atari 2600 games have black bars on the left side of the screen. I have heard that it is due to game code being executed instead of writing to screen and after this video, I can see how that is the case. Still a video on it might be interesting.
I cannot even tell you how awesome this video is. I bought a book about Atari 2600 game programming and was totally lost. This video did such an amazing job of explaining everything to me. Thank you so much.
Funnily enough the concept of timed CPU cycles is still relevant even to this day. Mostly for security and hacking prevention. The amount of time a computer takes to decrypt data can give away the data, so to make the code more secure it's done in a manner so that it takes the same amount of time regardless of input. My understanding is that Pokemon Sun and Moon's random number generator was cracked by measuring how long it took to load your character's face on the file select screen
Yep. The Wii's optical drive password was cracked by timing how long a wrong password took to be rejected. If you got the first character of the password correct, it would take slightly longer to be rejected than if the first character was wrong. Then you can do the same with the second character and so forth. It's called a "timing attack" and it is pretty ingenious.
@@3rdalbum Sounds not unlike listening for the clicks when cracking a safe combination
In Pokémon S/M's case, this could be made less reliable if you install the game rather than playing on a cartridge: the 3DS' SD reader is capable of significantly faster I/O than what retail cartridges support. (I don't believe the full potential of the cartridge slot was ever officially used)
These videos that you've put on your channel explain things like this in a level of detail that not many other youtubers go into. This content is really insightful and you are very underrated. Subscribed!
A 40 minute long video about Atari 2600 development? Awesome!
Edit: please stop stalking me
hi tux
no way it's tux one
@@ultlang hi ultlang
*stalking*
Tux :)
Welcome back! We missed you 💜
I tried to learn how to make Atari 2600 games back when I was a high schooler with entire summers of free time. After reading all the requisite manuals and learning the basics of machine code, I got as far as making a happy face move around the screen with joystick control.
I tried to follow a 2600 programming book thinking it would be a good introduction to assembly language. It was a good introduction, but all I managed to do was get some colors and a sprite on the screen before I gave up and switched to NES.
We have it so easy now. We can concentrate more on creating the game than how the hardware works. Doesn't mean we can create better games though.
Imagine one person coming up with a game concept, designing the mechanics and balance, creating all the artwork, writing 500 lines of code just for a basic title screen, and then your boss refuses to put your name on something that was solely your creation.
Amazing work! So detailed and meticulous and well-presented!
I did understand about 5% but I found this insanely entertaining to watch and listen to!
Nearly 40 minute long video, 3 month wait, this is going to be great
40 minutes later... "my brain hurts"
I'm glad this came up in my recommendations because for some reason my notifications got turned off. Great video glad I ended up catching this video!!
Amazing video! Yes, we would like to see more from the Atari 2600. Looking at games like River Raid and Keystone Kapers with their scrolling playfields and many sprites, they seem like wizardry after watching this video.
This amazing explanation brings so much respect for the developers who delivered so many video games back in the day. It is just amazing to understand how constrained they were while developing games with some sort of quality.
The absolute creativity and pure ingenuity of these old console devs just amazes me
Back in the mid-to-late 80s that could be achieved on the Commodore Amiga using the Copper (part of the Agnus chip), aka copper list instructions.
Interestingly enough, Jay Miner was involved in the development of the TIA of the 2600, the OCS of the Amiga and the ANTIC of the Atari 8-bit computers. The latter two share the idea of display lists, offloading the CPU from a lot of the beam racing tasks.
I guess the hardware guys designing the Atari 2600 hardware did have only Pong and a couple more of games as a reference, and designed the hardware around these two games. Having "ball" and "missile" sprites and a mirrored background confirms that. Any other game apart from Pong (e.g. Pitfall, E.T. etc..) is literally a software hack shoehorning the hardware to different needs. Later machines (Commodore64, Amiga etc..) made the hardware more general, providing tiled background and a set of movable sprites all equal to each other.
Wonderful explanation! It's hard enough to pull off these synchronizations with modern emulators and debuggers like Stella; I can't imagine trying to work this out on a late-70s workstation with nothing but the code and a ROM burner. Thanks for another great video!
Programming back then: this video
Programming today: How do I center a div?
so cool visualising clock cycles as the CRT scans. Amazing content and fantastically visualised. Looking forward to checking out your other videos!
The 2600 is an allegory for life. We're all racing the beams of the sun, trying to time everything perfectly so that it syncs up with everyone's schedules. Only during that vblank period we call the weekend do we really have time to ourselves.
That's deep
Im an atari 2600 and this is deep
I'm a deepfake and this is deep
It’s a miracle this could be figured out as workable. Really great and thorough explanation, I loved this!
This makes the NES look like an absolutely luxurious console to program for
Programmers today don't really appreciate how easy they have it today in programming a game. Not only do they have virtually unlimited hardware resources (storage space, processor speed, RAM, etc) but the actual TOOLS available to create the programs (games) are FAR beyond anything that was available back then. These guys were literally using graph paper and calculators! Sometimes getting the timing right would take weeks. David Crane, the programmer of "Pitfall!" tells the story of how he initially only wanted to give the player ONE life to do the whole thing, but after he was talked out of that idea it took him several weeks to optimize and reduce his code size enough to where he could add the feature of having (and keeping track of) more than one life. Imagine that!
It's worth mentioning that in addition to the limitations mentioned in this video, the 2600 also only had *128 bytes (yes, BYTES)* of RAM to store variables and anything else that needed to change during program execution. And also, although programming any kind of game for the system was a challenge due to the programmer being responsible for EVERYTHING, it also was the thing that allowed the system to last far beyond its expected lifespan because it meant the sky was the limit instead of the specific capabilities of a video chip.
128 bytes only in the base hardware, but some cartridges release late in the 2600 production included additional RAM (e.g. another 128 or 256 bytes that could be accessed through special reads and writes in the ROM space, and probably always coupled with expanded ROM space too, by that point). But this made the cartridges more complex to design and build and therefore more expensive.
@@michaellosh1851 True. RAM at that time was super duper expensive so they did the best they could at the price point they were shooting for. The original plan was to only have 64 bytes! But they were able to cost-reduce in other places and double it to a massive 128 bytes.
And yeah I've heard of later games having their own additional hardware such as RAM. _Pitfall 2: The Lost Caverns_ had additional hardware for 2 extra sound channels and better display capabilities, for example. But limitations in game consoles cause something very interesting to happen: Programmers are forced to actually spend time on gameplay to get the most out of the system. And although consoles today have hundreds of times more capability I can't really say the games themselves are hundreds of times more fun to play than some of those old Atari games. I used to spend HOURS playing _Adventure_ and that's about as simple as you can get. It didn't even have any music and VERY few sound effects. But it was very well designed, especially considering the limitations of the machine.
Seems like there should have at least been a sound guy to help with sound effects and music for various programmers.
@@sandal_thong8631I think back in the early days manufacturers didn't know if this was going to be a short-lived fad or not, so they invested the bare minimum. Later on though, when it was obvious that this was going to continue to evolve for many years, they actually did get sound men and women to do such things (and graphics people to work on graphics, etc). But in the beginning the programmer had to do everything him or herself, with very limited tools and slow compilers.
So, what you're saying is that every atari 2600 game was a goddamn miracle.
Froggo was the master of damning us with their miracles.
It’s nice to see you talk about the Atari 2600
Not even halfway into the video, and the 2600 sounds like it was HELL to program games for it...
Back then you didn't have emulation either so testing took much more time
i have so much respect for everyone who made games before modern times with C and IDEs and stuff
also, best channel on yt!
Wow, what a nightmare that programming must have been. Great video! I always wondered about this, and your explanation is the best that I've seen.
Blown away by this video. Really good and really accurate and beautifully presented. Atari 2600 development requirements are no joke.
Thank you a lot for uploading
An incredibly well made video! :)
I am so sorry for everyone who had to write for this.
This is absolutely mindblowing and, from the perspective of an outsider whose job isn't on the line, it actually sounds like something that would be extremely fun to experiment with.
The legend has returned
Holy crap. How did *anyone* code *anything* on this!?
Another fun thing about this console: it gave you a whopping 128... *BYTES* of RAM to store all your variables, shared with the call stack. If you wanted more, you had to provide it in your own cartridge. Though to be fair, you could easily have several KiB of that.
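To give a feel for what that means in practice, here's roughly how a 2600 source file carves up those 128 bytes (a sketch in dasm-style 6502 assembly; the segment names and variable labels are made up, only the $80-$FF range and the stack trick are the real constraints):

            SEG.U Variables
            ORG $80             ; every byte of RAM lives at $80-$FF
    PlayerX     ds 1
    PlayerY     ds 1
    Score       ds 2
    FrameCount  ds 1
                ; ...whatever you don't claim here is all the stack you get

            SEG Code
            ORG $F000
    Start       LDX #$FF
                TXS             ; the stack grows down from $FF into that same page,
                                ; so every JSR quietly eats two of your precious bytes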
It's understandable why they did it, considering the need to minimize cost and the simplicity of the games it was originally designed for. Still a shocking limitation in retrospect, though.
@@fllthdcrb I know, right? I actually knew about the 128 byte RAM because of some research I did a few years back (was curious how the technical specs of old consoles compared), but yeah, that was pretty much my reaction.
IIRC, you only get 4 KB of ROM too, unless you bank-switch. These games are *puny*.
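Right, 4 KB is all the 6507 can see of the cartridge at once, so the bigger games used "hotspot" bank switching: merely touching a magic address in ROM space makes the cartridge hardware swap which 4 KB bank is visible. A rough sketch of the common "F8" 8 KB scheme (the addresses are the usual ones, but treat the details as illustrative):

    ; Two 4 KB banks; accessing $FFF8 selects bank 0, $FFF9 selects bank 1.
    ; Both banks keep a copy of this routine at the same address, because the
    ; swap happens out from under the running code.
    SwitchToBank1
            LDA $FFF9       ; the value loaded is junk; the *access* flips the bank
            RTS             ; execution carries on at the return address, now reading bank 1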
This was in the days when computers were still a very technical, niche thing, closer to raw electronics than to entertainment systems. It was a games machine in name, but it was by and large repurposed from no-frills tech.
@@fllthdcrb When I first read about the 128 bytes of RAM, I was surprised and wondered how all the graphics and game code could be stored in 128 bytes. You see, at that time I thought the Atari was like a modern system where EVERYTHING has to be in RAM, since the CPU/GPU cannot directly access the hard disk/SSD. But then later I learned that the Atari can directly access cartridge memory. I guess that is one of the advantages of cartridge memory.
@@cylemons8099 Yup, that's a thing with these old systems. There was so little internal memory compared to the address space (not too unlike these days, come to think of it, unless you're running a NUMA system), and the CPU was slow enough relative to the memory, that you could attach more through a cartridge or other expansion port and have the memory-mapping hardware place it in some address range so the CPU sees it as just more memory.
Well, that wasn't always the case, though. The Commodore 64 had a full 64 KiB of RAM (already the size of the 6510's 16-bit address space) in addition to some memory-mapped I/O and 20 KiB of ROMs, so it already had to employ bank switching (hence the 6510 with its extra pins, some of which were used to change banks, instead of the 6502 that was far more common in other microcomputers). RAM expansions had to be accessed differently (e.g. via block transfers) rather than being mapped in.
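For a concrete example of those extra pins at work: the 6510 exposes an on-chip I/O port at address $01, and a few of its bits decide whether the BASIC/KERNAL ROMs or the RAM underneath them is visible. A sketch, assuming the usual LORAM bit assignment (bit 0):

            LDA $01
            AND #%11111110     ; clear LORAM: BASIC ROM out, RAM at $A000-$BFFF in
            STA $01
            ; ... use the 8 KiB of RAM normally hidden under BASIC ...
            LDA $01
            ORA #%00000001     ; flip the bit back to map BASIC in again
            STA $01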
I knew that programming for the Atari was a struggle, but HOLY SHIT, so much thinking just to display a bitmap or a sprite! On most machines this could be as easy as setting two pointers, or at most a MOV loop into screen memory, if only there were a decent amount of it, say enough for the whole screen, which isn't all that large anyway. And that's not even interactive yet, where input must be processed, decisions must be made, and the graphics should look like something real, not just squares of the same colour... (a sketch of what a single line of 2600 drawing involves is just below)
(another thing: the animations in the video are GREAT and helpful, and they must have been a lot of work, thanks!)
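To put some meat on that: the 2600 has no screen memory at all, so the code babysits every scanline and stuffs the TIA registers in the brief moment before the beam needs them. A bare-bones sketch of a display loop (dasm-style; register names as in the usual vcs.h include, the data tables are made up):

    Kernel      LDY #192           ; 192 visible scanlines to feed, one at a time
    NextLine    STA WSYNC          ; halt the CPU until the next scanline starts
                LDA PFData0,Y      ; fetch this line's playfield pattern...
                STA PF0            ; ...and hand it to the TIA right now
                LDA PFData1,Y
                STA PF1
                LDA ColourData,Y
                STA COLUPF         ; even the colour has to be re-fed every line
                DEY
                BNE NextLine       ; then do the whole dance 191 more times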
thanks for the vid, must have been a ton of work, i love it
It's amazing what General Computer (the creators of the silver-label games such as Centipede, Pole Position, and especially Galaxian!) did for Atari. The Atari 2600 had no business playing that generation of games. Only the 5200 was supposed to be able to do those things, but General Computer Corporation made it happen.
They went on to develop the Atari 7800 and many of its games as well.
The guy making these videos is a certified genius. FAANG should be beating down his door with offers.
the fact that actual human beings sat down and programmed games on this pile of garbage is absolutely horrifying to me. no wonder they were so pissed about not getting credited
Memory was expensive in the late 1970s, so they couldn't afford to put much in the 2600. The ROMs are usually 4 KB.
But nowadays demo makers ROTATE A FUCKING TEXTURE-FILLED CUBE on this shit.
Thank you for documenting this.
You read my comments last time thx for the vid I enjoyed it
This is an amazing illustration of how the 2600 works. Thanks for putting together this high quality explanation!