I bought the CP/M cartridge for my C64 when I was in college! Great video! Brings back memories.
5:20 Love the Australian Commodore (Computer) advert! Now watch a 1980s Holden Commodore car advert :).
Mr Know It All, we need to see more of you. Hope you're well. Greetings from the YouTube comments section
Intriguing! Gratefully appreciate the breakdown and marketing timeline of the Commodore 64 CP/M cartridge.
Great video!
Re the comparison: to be fair, you should probably have compared a cheaper monochrome monitor for the C64, as the TRS-80 was monochrome and I think the monitor included in your Apple II example was also monochrome. (The C64 produces an excellent monochrome picture from the "luma" pin.)
P.S. Speaking of CP/M computers: since you speak German, my suggestion would be something about Robotron! In particular, was their SCP a CP/M-compatible OS, or was it just an unlicensed and renamed copy of the actual CP/M?
Oh yes, Robotron: the absolute unobtainium from the former GDR (ex-DDR).
I'd love to get my hands on one of these, absolutely. Intriguing machines, certainly worth an exploration.
That box art! 😍
I wish something could be done showing the C128 running its CP/M mode. There are still many C128s around (compared with the few CP/M carts for the C64 still available). I own a C128, but have never done anything with its CP/M.
@@ReadrOFilz I'm scouting for a C128. May take some time, but I'm curious as well.
Ah, yes, a benchmark battle between two Commodore CP/M solutions.
The 128 would have the advantages of an 80-column display and speedier floppy disk access. But it is said the Z80 was clocked at about half of what was typical for a Z80 computer of the day, and I'm betting the same is true of the Z80 cartridge for the C64.
One question about CP/M on the C64 is whether its floppy drive access is compatible with JiffyDOS.
@@THEPHINTAGECOLLECTOR, a task on my agenda is to have a 3.5-inch disk made (maybe through an 8-bit friend of mine) to boot CP/M on my C128, because I own a 1581 drive.
Fun fact: apparently in some cases it's possible to use the C64 CP/M cartridge on a C128. Totally useless as it's way inferior in all ways, but still :)
@@TheSulross Apparently it is, see the YouTube video "CP/M v2.2 on a Commodore 64!" by Mike Paull.
It's interesting that the CP/M cartridge was a dismal failure, and yet one of the key selling points of the C=128 was its CP/M compatibility.
Also, you should consider direct video capture instead of relying on your camera locking to a very slow CRT refresh rate.
Yeah, I was too lazy to redo the footage.
All follow-up screen captures look a lot better after fixing the camera settings.
I agree, direct video capture would be of higher quality.
I'm saving up to afford some good capture devices; until then, camera captures it is.
A major difference is that the C128 CP/M is actually usable.
The fun fact is that the C128 got its CP/M mode because Commodore was aiming for it to be able to do everything a C64 could do; they had a hard time getting the C64 CP/M cart to work with the early C128 prototypes, so they decided to add CP/M capabilities directly to the C128 :)
@@THEPHINTAGECOLLECTOR Re capture: for this type of capture, where you don't need such long sequences and whatnot, I've heard that a DVD recorder does a decent job. Of course there is the downside of having to burn a DVD (at best an RW disc), but if you would only need it for a few videos per year, it might be an option if you find a DVD recorder at a thrift store.
P.S. Don't take my word on this, but AFAIK most DVD recorders and DVD players from the 00's use regular PC IDE/PATA optical drives, just with a different front panel, so if the drive of a DVD recorder seems bad, it might be possible to swap it out with a generic PC one.
Otherwise you might want to try filming a 00's or early 2010's flat screen TV connected to your C64 perhaps?
You are most probably the first YouTuber who tells the truth about the number of C64s sold. All the others just repeat, like a bunch of stupid idiots, what Tramiel said: 30 million. Without any thinking! Thank you for this detail. I really, really appreciate that.
@@bobns509 Indeed, the information about total produced units ranges between extremes, depending on who was asked.
The English Wikipedia has the best perspective, showing both the "official" (not proven) copy-paste statement and the unofficial investigations that come to the 12-ish million conclusion.
I too think Tramiel was exaggerating the numbers.
Interesting video... looking forward to part two
Very interesting story! Seriously! Thanks for this research and the video.
@@SobieRobie you're welcome!
4:57 Your math on the table is off. Specifically the C=64 column: $595+399+399+130 = 1523. Clearly, you forgot to add the cost of the CP/M cartridge. That's still significantly less than the competition, so you're not wrong in saying that the C=64 option was the cheapest.
But damn, computers were expensive back then. It kind of makes me feel bad for my folks after all these years, having such an expensive hobby as a kid and costing them so much money. But then again, kids are expensive no matter what, eh? At least now computers are cheap (well, the older ones, anyway), now that I have to buy my own.
Oopsie, my bad! Good spotting there! ^^
Nice to see you got your CP/M cartridge running. That yellow cartridge is apparently Dirk Wouters' re-engineered design (see the YouTube video "CP/M v2.2 on a Commodore 64!" by Mike Paull); it seems to work like the original but runs reliably on a C64C.
Really good video. I do remember the CP/M cartridge at the time, but I never got to use it as I had a VIC-20 and then a Spectrum 128K, but it's nice to see it here. Thank you for sharing this important piece of computer history
@@NiceCakeMix my pleasure!
That's one hell of a cliffhanger!
In the '80s, I paid $199 for the C64 and $220 for the 1541 disk drive. I didn't get the CP/M, but I remember it being $99.
(in Mr. Know-it-all's voice) You got it booting and _didn't_ type DIR immediately? What kind of CP/M demonstration is that?!
In all honesty though, it's an interesting story. The whole idea of putting a second CPU of a different architecture just to run some non-native software seems so alien today, even though it seems it was somewhat common in the 80s and even 90s (like the PC cards for mid-90s Macs).
@@kFY514 part II next week will be a bit more hands-on …
@@kFY514 And in my own narrative voice: "Sure, sure, but don't you think people will be bored by staring at a simple DIR listing?"
"The whole idea of putting a second CPU of a different architecture just to run some non-native software seems so alien today"
That's because of the ever-growing abstractions and complexities of system chipsets, which don't allow these things today. MMUs, for example, don't allow you to share memory between CPUs of different architectures. PCI (by default, without an IOMMU) is also not shareable between memory controller hosts. You need a specific chip complex to allow CPUs of different architectures to share a high-bandwidth link between them. SGI had that XBar bus thing. The last attempt at a unified bus for multiple coprocessors connected to a master complex was HyperTransport in the 2000s.
@@hyoenmadan I don't think _that's_ the reason. We have the main RAM being shared with GPUs for example. We could also have some kind of translation layer. Or the second CPU could have its own RAM, just like the Apple PC Compatibility Cards I mentioned, which were essentially full PCs on a board, just with an unusual IO arrangement for keyboard, mouse and the like.
Of course the boring (and true) answer is that software just got portable (at either the source level or through various interpreters and VMs) and there is no longer any need to do that. And when there is, things like emulation and virtualization do the job well enough, because the CPUs just got so powerful. Still, it _feels_ ridiculous. Imagine someone today selling the solution to "I can't run Windows on my Apple Silicon Mac" in the form of a Core i5 Thunderbolt dongle - imagine even just the idea of that, regardless of how technically feasible it is.
@@kFY514 Connecting CPUs of different architectures to the same RAM is different from connecting a GPU (and TPUs, btw, which are the same category) to your system. For one, an extra CPU is a coprocessor, while a GPU, even a bolted-on APU, is a device, so the semantics are also different. Your CPU accesses main RAM, and you access your GPU through strict hardware and firmware semantics which software APIs and applications have accommodated to, while each CPU vendor and architecture has its own semantics and requirements, different and diverse between them. You can't just make alien CPUs coexist without a bunch of circuitry and glue logic in the form of chip complexes, like the HyperTransport in my post.
Also, systems on a board, like the Apple compatibility hardware or the SunPCi cards, were never really an option because they were SLOOOOOW in practice. Suitable for business programs and applications, but subpar for everything else. They were also expensive. They never stood a chance of success, and they were practically killed off by the advent of advanced graphics and GPUs.
Now, a true coprocessor, as a chip or a card, would still have its uses today, for example for hybrid ARM/x86 machines for development and highly geeky stuff. But the complexity of the MMUs on both CPUs and of the HyperTransport complex connecting them would make such a solution prohibitively expensive. The CPU makers of both architectures would also have to agree on and tailor their solutions to HyperTransport as the connectivity link between them. In the end, it's cheaper to just have separate machines for your needs.
Regarding the performance of the CP/M cartridge - I'm pretty sure that the cartridge's Z80 CPU was driven from the C64's 6510 clock (especially as the cartridge was designed so that the Z80 used the C64's RAM, rather than having its own).
The upshot of this is that the Z80 would have been clocked at 1 MHz (there or thereabouts, depending on whether you had an NTSC or PAL model), limiting its performance.
(The "original" Z80 could be clocked at up to 2.5 MHz, the Z80A at up to 4 MHz, and the Z80B at up to 6 MHz; Acorn's Z80 second processor units for the BBC Micro used Z80B parts, clocked, IIRC, at the "full" 6 MHz.)
And, by comparison, I also *think* that Microsoft's Z80 card for the Apple ][ worked in a broadly-similar fashion to Commodore's CP/M cartridge - it had a Z80 and glue-logic on the card, but relied on the host computer for RAM and CPU clock.
I don't know how the C64 CP/M cart is actually implemented, but the 8 MHz dot clock is available on the cartridge port. Similarly, the 7 MHz dot clock is available in the Apple II card slots. So for either of those systems the Z80 could be clocked fast and have wait states inserted to sync up with the 1 MHz synchronous bus. A Z80 always uses at least three clock cycles for each memory read/write cycle, so ignoring "bad lines" on a C64, an optimized CP/M cart would perform like a 3 MHz Z80 when accessing memory. Not sure if the Z80 "idles" the bus when doing internal processing; if so, the Z80 could be clocked at a higher speed, and instructions that take many cycles but don't use memory that much, like ADD, would run at almost the full clock speed.
(Btw, the Z80 is stable down to 0 MHz, i.e. you can single-step the clock pulses. Thus you can insert wait states by just stopping the clock.)
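To put rough numbers on that wait-state argument, here is a minimal back-of-the-envelope sketch in Python. The 8 MHz Z80 clock, the 3-T-state memory cycle, and the two instruction mixes are illustrative assumptions, not measurements of the actual cartridge:

```python
# Back-of-the-envelope check of the claim that a wait-stated Z80 on a 1 MHz bus
# behaves like a ~3 MHz part for memory-bound code. All figures are assumptions
# for illustration, not measurements of the real CP/M cartridge.

BUS_CLOCK_HZ = 1_000_000       # C64 system bus: roughly one memory access per microsecond
Z80_CLOCK_HZ = 8_000_000       # hypothetical Z80 clock derived from the 8 MHz dot clock
T_STATES_PER_MEM_CYCLE = 3     # minimum T-states a Z80 spends on a memory read/write

def effective_mhz(mem_cycles: int, internal_t_states: int) -> float:
    """Equivalent 'unconstrained Z80' clock for a given instruction mix."""
    # Each memory cycle has to wait for the next 1 MHz bus slot: ~1 microsecond each.
    mem_time = mem_cycles / BUS_CLOCK_HZ
    # Internal T-states run at the full (assumed) Z80 clock.
    internal_time = internal_t_states / Z80_CLOCK_HZ
    # T-states an unconstrained Z80 would have needed for the same work.
    ideal_t_states = mem_cycles * T_STATES_PER_MEM_CYCLE + internal_t_states
    return ideal_t_states / (mem_time + internal_time) / 1e6

# Purely memory-bound code: three T-states squeezed into each 1 us bus slot.
print(f"memory-bound code:   ~{effective_mhz(mem_cycles=100, internal_t_states=0):.1f} MHz equivalent")
# An internal-heavy instruction (roughly a 16-bit ADD: one opcode fetch plus
# seven internal T-states) barely touches the bus and runs much closer to full speed.
print(f"internal-heavy code: ~{effective_mhz(mem_cycles=1, internal_t_states=7):.1f} MHz equivalent")
```

Running it prints about 3.0 MHz for memory-bound code and roughly 5.3 MHz equivalent for the internal-heavy example, which is the gist of the comment above.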
The real problem is the 40-column display. They should have put an 80-column display in the cartridge.
Its downside is that the C64 floppy drive was pitifully slow, and CP/M is a lot about floppy disk interaction (a lot of C64 software, in contrast, was available via cartridge, so that was not as much of an issue). Its second big downfall is that the 40-column display limitation was not a great match for CP/M software, much of which was geared to 80 columns. And the third big problem was access to CP/M software, given the infamous proliferation of floppy disk formats - that one was actually more the fault of Digital Research for not imposing some standards on floppy disk formats.
I bought Turbo Pascal and told them that I would use it on my C64. So they sent me a perfect C64 disk with TP on it. Yeah, I had no 80 columns and only one 1541, but I coded quite a few lines on my C64. My biggest problem was the frequent crashes, so I had to save my sources all the time.
I had the CP/M cartridge for the C64 in early high school before getting the C128. Don't really remember the cartridges being rare.
I never knew anyone who had a C64; we had CPC 464s/6128s or Speccys, or if you were rich you had an MSX2.
I had a C64; it was popular here. I knew a few people at school and more at college. My dual-tape stereo was used a lot. The Spectrum was what poor people had. The Electron, Amstrad and BBC were what you got if you were unlucky and your parents were sold a turkey. Nobody admitted to owning a C16 or Plus/4. A friend found one in a skip.
I've never seen a working C64 in person...
I would've thought it would be better to run CP/M from the cartridge rather than a floppy disk.
Booting, perhaps, though this would have required extra components on the cartridge PCB - perhaps something that Commodore didn't want to do for cost reasons.
(As an aside, the company Torch developed an alternative to Acorn's Z80 second processor for the BBC Micro - and this included "CPN", a Torch-developed CP/M look-alike OS, and I believe this could boot a minimal version from the built-in ROM.)
Beyond booting, I think that CP/M (and CP/M software) were generally written on the assumption of floppy drives (well, at least one floppy drive) being used...
Given that the floppy disk contains roughly 100K of data, it would be interesting to evaluate what 100K of ROM would have cost in 1982.
Plus, whether any additional discrete logic would be required to make this cartridge a co-processor and ROM cartridge hybrid.
The EPYX Fastload sold for around US$40 in 1984.
Given it had a 64K ROM, prices for ROM couldn't have been too high, actually.
So adding a 100K ROM to the CP/M cartridge would probably have been moderately cheap.
Cost could have been a reason for not doing this, but I think it wasn't mainly the hardware cost.
I could imagine the effort of adapting CP/M to boot from ROM would have been too time-consuming, or maybe would even have required architectural adaptations to the C64 itself, hence increasing cost and time delays there.
Going with the floppy was probably more straightforward, and had a nice side-effect:
selling the customers a floppy drive.
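For illustration, here is a tiny sketch of how such a parts estimate could be framed. The chip capacity and the per-chip price are purely hypothetical placeholders, not 1982 price-list data:

```python
# Hypothetical parts estimate for putting the ~100K CP/M system disk into cartridge ROM.
# Chip capacity and unit price below are illustrative assumptions only.
import math

DISK_IMAGE_KB = 100          # approximate size of the CP/M boot disk, per the comment above
ROM_CHIP_KB = 16             # e.g. a 27128-class EPROM (16 KB) - an assumption
PRICE_PER_CHIP_USD = 5.00    # hypothetical early-80s volume price per chip

chips_needed = math.ceil(DISK_IMAGE_KB / ROM_CHIP_KB)
rom_cost = chips_needed * PRICE_PER_CHIP_USD

print(f"{chips_needed} x {ROM_CHIP_KB} KB ROMs, roughly ${rom_cost:.2f} in parts")
# -> 7 x 16 KB ROMs; whatever the real unit price, the bigger unknown is the extra
#    address-decoding/banking logic needed to map that much ROM into the C64.
```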
The only parts that would have made much sense to have in ROM would be the BIOS+BDOS and possibly the CCP, as that would free up some RAM (well, it would if either tricky custom bank switching had been added, or CP/M 3 (also known as CP/M Plus) had been used).
Everything else would just have been expensive. Compare with the fact that not even the Amiga, with its 256/512K ROM, has for example "dir" (or "list") in ROM.
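As a rough illustration of how much RAM that would free up, here is a minimal sketch using the stock CP/M 2.2 CCP/BDOS sizes; the BIOS size and the total RAM figure are assumptions, not measurements of the C64 cartridge setup:

```python
# Sketch of the TPA (user RAM) gained by moving CCP+BDOS into ROM on a CP/M 2.2 system.
# CCP/BDOS sizes are the standard CP/M 2.2 figures; BIOS size and total RAM are assumed.

CCP_BYTES  = 0x0800   # Console Command Processor, 2 KB in stock CP/M 2.2
BDOS_BYTES = 0x0E00   # Basic Disk Operating System, 3.5 KB in stock CP/M 2.2
BIOS_BYTES = 0x0600   # hypothetical machine-specific BIOS kept in RAM
TPA_START  = 0x0100   # CP/M programs load at 0100h; the page below is reserved

def tpa_size(total_ram: int, resident_bytes: int) -> int:
    """TPA = RAM below the resident OS components, minus the reserved base page."""
    return total_ram - resident_bytes - TPA_START

ram = 48 * 1024  # assumed RAM visible to CP/M in the cartridge setup

everything_in_ram = tpa_size(ram, CCP_BYTES + BDOS_BYTES + BIOS_BYTES)
ccp_bdos_in_rom   = tpa_size(ram, BIOS_BYTES)   # only the BIOS (plus banking glue) stays in RAM

print(f"TPA with everything in RAM: {everything_in_ram} bytes")
print(f"TPA with CCP+BDOS in ROM:   {ccp_bdos_in_rom} bytes "
      f"(gain: {ccp_bdos_in_rom - everything_in_ram} bytes)")
```

The gain works out to about 5.5 KB (the CCP+BDOS footprint), which is the "some RAM" the comment above refers to, bank-switching caveats aside.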