I love this era of computing. Everything was happening so fast and the market was expanding so quickly, that you had to keep innovating just to survive the cutthroat competition.
I remember as a young professional during this period needing to buy a new computer just about every year. Now I have a 13 year old Macbook. I have replaced the battery. I can't think one one good reason why I should buy a new computer...
That has its upsides and downsides. One of the things powering that dynamic was early computers were horrifically under powered. Someone back then allegedly said "Nobody will ever need more than 640k" But this obviously wrong even at the time. A plain text file of the average book is around 800k to around 1MB. Not only does the whole book not fit in memory, but it wouldn't even fit on a single floppy disk of the time. In 1981 when the PC first shipped, they didn't have anywhere near even 640k. Most were like 128-256kb of RAM. So an author writing a book on a PC would have to do it by chapter across multiple disks. So they would also need to print the book out as they go to make referencing older parts of the book easy.
@@tarstarkusz That's mostly true, but I would point out that other contemporary 8088 machines, like the Victor 9000/Sirius 1 had GCR-encoded floppies with variable speed zone recording that could hold 1.2 MB. Also, because those pre-dated the IBM PC's memory-mapped I/O reservation of the addresses from 0xA000 to 0xFFFFF, you could run MS-DOS with 896K of RAM. (But of course, no one could afford the chips!)
@@WardCo I think Commodore used the GCR encoding with the 1541 and was also variable speed. IIRC, it was able to put more sectors on the outer tracks. While it is true the 8088 has a 20 bit address bus to get to 1mb, they decided to put the video card bios and video ram in the C segment (also the hard disk controller) So that effectively limited you to 768k, at least contiguously. And, of course, due to the soft segmenting, a new segment could start at FFFF:FFF0 which gave you 64k (-15 bytes) above 1 megabyte That's what himem-sys (I use dash because a period would make it a link) does. It loads a portion of dos into that 64k above the 1MB mark and you can access it without having to switch into protected mode. Note, it only works on a 286 or higher. Qemm (an aftermarket memory manager) did something to give much larger conventional memory. It did some kind of shadowing and freed up basically all of the upper segments. (But, it generally doesn't do this. IIRC, it will only ad the A segment with the rest being for TSRs, device drivers and the like) EMS allowed basically unlimited RAM with a bank-switching scheme. I used to have a BUNCH of these cards loaded with chips. I retired XT class PCs (about 100 of them) with 2-4MB of RAM on these cards. It would be worth a decent amount today on ebay
It was exciting to follow it. I bought a lot of computer magazines and went to several computer shows back then. I miss those days. Now everything is just an improvement on things that existed before.
This isn’t *that* surprising- lotus notes may be one of the stickiest bits of software ever created. In addition to just email it also had what we’d today call a low-code toolset and enabled relatively simple line of business applications, those apps ended up becoming core to a lot of large businesses operations, usually with a bunch of poorly documented business logic built in in somewhat hard to decipher ways since they were rarely built by dedicated programming teams, unwinding all of that without bringing the business to a halt was a hard problem.
@@ajax700 lotus was rebranded to HCL Verse but effectively still lotus. They dropped it last year but surprisingly there was quite a lot of resistance to dropping Verse. They now run outlook (in line with all their clients). I wouldn’t say they’re irrelevant- they still are a massive player in enterprise, for personal computers they exited that business with the sale of the PC business to Lenovo. (And in hindsight that was probably a good move)
It's also not surprising because changing something as fundamental as the core email platform can have bad consequences for archival requirements, will result in a lot of expensive training/retraining of the staff, and could break lots of automation.
Yes its a mistake. Apple II was introduced in 1977, and that's what stormed the market, the visicalc machine. In fact Apple Is are rare. Only 2 years later the Apple II+ was in the market and that was the king by 1981, absolutely not the I. There was the III in that year but it was a flop, in 1983 came the IIe which was the biggest seller of this era for Apple.
Both the Alto computer and Aldus Pagemaker are shoutouts to Aldus Manutius, a Rennaissance printer who designed "small" books that could fit into saddlebags and worked with the legendary Humanist Erasmas (who is the guy on the Pagemaker logo). Among other things, Aldus developed the first fonts used to print copies of classic Greek literature, as well as the original texts of the New Testament.
The Xerox Alto computer got its name from the Xerox Palo Alto Research Center (PARC), alto being the Spanish word for tall/high/upper. The GUI technology was first developed at the Standford Research Center (SRI), Xerox licensed the tech, and created the Alto computer at the Xerox Palo Alto Research Center.
One of the things that made the IBM PC really take off was the 'clones'. The BIOS was reverse engineered along with the entire open architecture, prices quickly dropped, new generations of Intel CPUs supplied more and more compute power (286/386/486/Pentium), and in a relative few years PC clones were EVERYWHERE. I loved that era, got to work on a lot of cool projects. :)
Great video. One quibble. Atari did not "collapse." It contracted but immediately began development on the Atari ST computer which was released in 1985.
Well, Atari broke up, which is perhaps not the same as collapse, but it certainly isn't the same as just contracting, either. Tramiel bought one of the divisions, which led to the ST, but the arcade business went to Namco. And Tramiel had already started the groundwork for the ST before buying Atari, so there is an argument that such work would have continued without any Atari purchase.
The education market was basically what kept Apple alive during those early years after the IBM PC came out. IBM took away the high-margin business market (because IBM) and Commodore was eating their lunch on the high-volume home market (because Apple stuff was stupidly expensive and C64s were cheap). Apple really had few customers other than schools.
Our school computer room had one lonely Mac and five C-64s each with 1541 drive and monitor. Everyone played Summer Games by Epyx or Spy vs Spy on them. No one touched the Mac, I am not sure anyone knew how to use it.
Apple has always sold style, not science. That's a good way of making money: the richest men in the world are just stylists, in cars and handbags. It doesn't add a lot of our general well-being over the long haul. Technology? Jobs just went out and copied. The original Apple PC is simply a cheap knock-off of the $9,000 Xerox machine of the time.
@@TheDavidlloydjones The original Apple had nothing to do with Xerox PARC stuff. It was more a heavily cost-reduced machine built around the 6502 using off-the-shelf parts and some very clever custom circuitry from Woz to tie it together as best as they could without having access to their own fab (which is something Commodore leveraged to make their stuff cheap). The later Lisa and Macintosh were the ones that borrowed heavily from PARC's Alto, but that became a theme later on. Again, Woz had some good input in the Macintosh. Jobs indeed was responsible for tons of styling and marketing decisions (some of them smart, some of them very stupid -- for instance his aversion to fans and expandibility), but Woz made a lot of genius designs turning out a far better machine than such a small company had any right to be making. It was Jobs decision for high profit margins that essentially killed Apple in the home market. Computing devices wouldn't become a fashion statement until decades later, so their price competitors were in institutional use (business and education).
@@TheDavidlloydjones Yup, it was always about style then and now. And with their closed-source mentality, the majority (entire?) of the industrial, engineering, scientific, and gaming community uses Windows, not Apple OS.
Some of my friends bought one. It had 128K of RAM which we could not imagine what to use for. It is famous in my circle of friends because my buddy Dan beat the chess program SARGON on it while flying on acid one time -- something he was never able to do sober. Good times.
I worked for a publishing company in the 80s; We published a group of magazines targeted at computer manufacturers (DEC, IBM, HP). We had an 'art department', as you describe it. People cutting and pasting strips of text into the shape of a printed page. And then along came Adobe with Aldus Page Maker. We bought a Mac and an IBM PC-AT. Installed Pagemaker on both. And sat them down side by side in the Art Department. It took a day to decide. I ended up buying every new Mac that came out. And the SAVINGS were insane. Before you print 250000 copies of the magazine, you have to print ONE copy and have the advertisers approve it. That costs about $6000 per page. Until you can do it yourself. And the big machine that spewed out columns of printed text to be cut/pasted....gets unplugged. And I got to avoid having to support the IBM PC and Windows. Thank you Adobe. I named my son Aldus.
But it limited you to black and white. Don't most magazines get printed in 4 color? How did you get around the color limitation? It was slow and time consuming. On an early Mac, I would think everything is 1 page at a time. You simply didn't have the RAM to try and load even a black and white magazine length file or even just more than 1 page.
@@tarstarkuszearly on Aldus Pagemaker was used mostly for mock ups, with the main work still being cut and pastes. I think OP is describing the mock up use case and the move into replace paste ups later. Though OP, aldus was a separate company. Adobe and Aldus were rivals. They weren't together. And Aldus sued Adobe multiple times for copying their software (and own more often than they lost)
@@medes5597Plus, PageMaker was an absolute joke for most publishing houses when DTP *really* took off. If you were serious, you used QuarkXpress. Aldus PageMaker was common for form houses and in house SMB. Adobe didn't have crap to compete until they bought Aldus and got PageMaker. But still, they were a non issue until InDesign was released and even then, Quark had to drop the ball on the OSX release of QuarkXpress 4 for InDesign to take over.
Ah, easy way to ensure that the son will be bullied in school. I helped out at times in an intrnal publishing division of a scientific research institution. I remember Xerox Ventura; and then QuarkXpress.
This channel is constantly blowing me away with the topics and depth.... i always learn new things about topics i think i already know about! Kudos, glad to see this channel is growing!
May I submit a correction? The original disk drive controller for the Apple II was called the "Disk II controller". The "Integrated Woz Machine" was the single-chip version used in later models like the IIGS and IIc.
I bought my first computer in 1991. The choice was between a used 286 with a 14" VGA monitor and MS-DOS 5 (maybe 4.x) and Win 3 or a Apple Machintosh with a 10" BW display. The price difference was that enormous. What made me choose the 286 was the fact that unlike the mac there was almost an unlimited number of games/software to choose from while the mac only had stuff from Apple. Also the UI on the Apple was only available in Icelandic and even though that's the country I live in I hate using electronic products that are not running on english.
The early Mac definitely didn't just rely on software from Apple - certainly not in 1991. Back then, if you were writing a report on a Mac it was probably done on Microsoft Word. If you were crunching numbers, it was probably done on Excel. If you were connecting to a BBS, you would have used Telefinder by Silicon Beach Software or one of the many terminal emulators. Yes you might have used some Apple software too, such as HyperCard. But the average user could well have used a system with no Apple software at all.
@@OldAussieAds Not where I was living. Getting Apple software in shops that sold PCs was hardly an option. Then there was the issue with the monitor, the only one available for the Apple I could afford in the early 90's was 20" while the one with the 286 was 14" and the whole package was still cheaper than the Apple.
@@bjarkih1977 Yeah I agree there would have been differences some regions. That said, I'm in Australia and we certainly weren't first class citizens to US companies either. I'm not sure I I understand your monitor comparison. The Mac monitor was 20" while the PC monitor was 14"? Did you get those measurements around the wrong way? 12"- 14" was standard for a colour Mac in 1991. 20" would have been absolutely huge, even for a PC.
The Integrated Woz Machine came later. It was a single-chip (= cheaper) version of his original disk controller which mainly consisted of an EPROM + some off-the-shelf TTL chips (+ some help from the main CPU). The IWM was key to making later Apple II and III machines cheaper to produce.
@@shorttimer874I agree. The IWM was a single chip version of the disk ][ controller card. The original card used discrete chips. And really that was the innovation. Woz took a lot of the operation out of hardware and into software (255 bytes of software!) and used commodity (74 series) chips for the apple ][ disk drive. He took drive controller chips off the card. He removed the data separator from the disk drive itself. It was an amazing design. Later he worked on the IWM, around the //e or //c timeframe. That was a single chip. Company production volumes had risen and the cost of custom chips was down so making their own made sense. The iigs added the Mega II which put basically a //c in a chip. I think the //c+ used a variant of that. Also Wendell Sander worked on the IWM revising it to the SWIM (Sander Woz Integrated Machine) which added support for 1.44M floppies (SuperDrive). If nothing else that finally gave interoperability between Apple and IBM floppies. This was done partially by putting back the old MFM data separator that Woz had removed in the disk ][. It was the right move so floppies could be standardized between systems. But the funny thing was the age of GCR (group code recording) was finally here just as Woz had used with the disk ][. Hard drives left MFM for RLL and never looked back. Now on your computer your disk (HDMI or DisplayPort), Ethernet, slots (PCIe) and USB all use GCR. it has various names, 8b/10b or 128b/130b or such. But it’s all similar in design to the disk ][ design which could be named 5b/8b (up to DOS 3.2, 13 sectors or 6b/8b (DOS 3.3, 16 sectors) in modern parlance. Anyway I think the video gets it wrong. There are other small errors too like showing a genuine Apple Mac 7200 (easily one of the worst Macs ever made) when it speaks of clones. None of this undercuts the story. He gets the tale right even despite some small mistakes.
If you are curious about the 'tie-in' of the "Little Tramp" character (seen at 10:40) by Charlie Chaplin and the IBM-PC? Possibly the advertising agency that came up with the idea was using the "Modern Times" movie Chaplin produced and starred in back in 1936. After all, the tag-line for that IBM-PC Ad campaign was: "A tool for _modern times."_
Not going to lie, I want to live in the alternate universe where Commodore came out on top. The Amiga was so far ahead of the Mac and PC in the mid-to-late 80s...
In the end nothing could compete with the clones. Apple, commodore, atari etc. where all custom machines trying to carve out their own incompatible market. The flexibility, expandability, competition and innovation of the pc once the cloners entered the market made competing with it pretty much impossible. It would have been nice if they where able to keep going for longer with new products.
S-100 computers were the king when the Apple II was brought to the market. At first it appeared that the Apple II would be an open design so small companies started making various add-on boards for various tasks. It was when Franklin and a couple other firms decided to start making Apple II clones. Apple went after them with a vengeance and soon the clones were history. When IBM came out with the PC, immediately clones started appearing. The only thing holding them back was the firmware BIOS that was owned by IBM. Companies like Compaq developed clean versions of the BIOS using software engineers who wrote the bios from scratch using only the specifications of the BIOS, never looking at any of the source. So before long there were 10's of companies building PC compatibles at low price. I don't think affected IBM very much because most of the IBM customers were not going to take their chances on a clone. Apple locked down their design thus limiting the installation base of the Apple II and Mac architectures.
and it was this openness and competition that led to so much success and innovation in the PC space. Today Intel cuts its own throat. eg. think about the Atom line. Generally bootloader ('bios') was locked. how cool would it be to be able to boot full x86/64 windows on your phone eg (zenphone2) or $50-100 tablets. eventually is was possible on the zenphone2 but... this should've been a feature, not a chore for the average consumer!
@@PRH123 IBM selling off their personal computers business is a dumb move. The name recognition factor alone will make them some serious money. The Thinkpad was so good that can be milk for many years to come and only need minimal investment. Yet they sell it for quick cash. Now the money is gone. All they have now is service and a R&D department hoping for a jackpot, But they don't have enough money and resources to make it happen.
False. Not many people cared about the 'openness' of architectures in the 80s. A computer back then was considered open of it had expansion slots and people didn't expect computers to be back compat. The IBM pc didn't succeed because it was ,ore 'open'. People largely expected that the IBM PC would win the PC market and drive out apple and commodore and others like they had done to Honeywell and Burroughs in the 60s and 70s. Truth is that for 3 years after the first IBM PC did not have any competitive clones.
Have you ever thought of doing a video on how qwerty and other key layout setups came about? Apologies if you already have, it's just something that I keep thinking about when I look at all those weird attempts at keyboards in the early eighties.
Note that the 6502 is not based on the Motorola 6800 design. MOS Technology made a version, 6501, that was pin compatible with the 6800. But pin compatibility was the only compatibility. The 6501 was dropped after a lawsuit from Motorola. The 6502 was constructed in a different way than the 6800, and this made it possible to make it for cents and sell it for a few dollars. So the case with 6800 vs 6502 is very different than 8080 vs Z80, which were machine code compatible, even if the Z80 were constructed very differently than 8080, and had its own assembly language. And, btw, thanks for your high quality and very interesting content! I love your channel.
@@peterfireflylund : agreed. Job’s was a visionary, but his ideas exceeded the hardware/software capabilities at the time. He wanted it so bad, he was obsessed. You couldn’t spend money on development, like he was doing, and not have investors get upset.
@@deepsleep7822 Jobs' one talent was marketing himself and convincing people that he was a 'visionary'. What he was was a gold-plated arsehole. Wozniak was the reason Apple existed; Jobs was nearly the reason it didn't.
I have used the Apple Lisa computer and LaserWriter while back in the late 1980's, maybe first in South Korea. These machines were revolutionary piece of art, science and engineering marvel . Graphic screen, UI and mouse and beautiful printed output, rivalling phototypeset book. Window desktop and object-oriented drawing tool software. The software itself worth the money. IT was easy to use and understand. It was culture shock. I wrote my CS master's degree thesis about object-oriented language with Lisa (then Mac XL) & LaserWriter. So I couldn't understand why this revolutionary machine didn't sell well enough in market.
According to Bill Atkinson it took the 8 MHz Macintosh at least half a second to draw a screenful of characters - 1.75 times faster than Lisa, and that's before accounting for Lisa's larger screen. Lisa didn't sell because it was unusably slow, and also cost $10,000 (±$31,000 today).
Very nice. The Fall 1985 marketing push you cite resulted in my company becoming a Mac / LaserWriter / PageMaker customer at the time. I was the one sent out to "the seminar." I reported back that it was great, and we should buy it.
This review of the early history of the PC is extremely interesting to me. I was watching this field intently during the mid-70's to mid-80's. In the late '70's the Z80 was the best 8 bit micro but the 8080 had a head start and good marketing. The Motorola 6800 was very good too but it didn't have the marketing clout behind it. The Texas Instruments TI-994A was more of a toy computer, ok for games but not for serious work applications. When the IBM PC came out, its hardware wasn't very good but the IBM name was magical and everyone wanted something that was "IBM Compatible". The Texas Instruments PC ("Professional Computer") in the early '80's had much better hardware but without the feature of IBM compatibility, it wasn't a marketing success.
Yeah. I'm of the age group that grew with Apple and Microsoft, and it's bewildering they're basically retreading the mistakes of the companies they disrupted.
You seem to be mixing up CPUs (Z80 and 6800) and computers (TI-99/4A). The 6809 derivative of the 6800 was well loved by its programmers but apparently too expensive compared to the competition. The TMS9900 as the CPU in the TI-99/4A was a true 16-bit CPU when all the competition was 8-bit, but other parts of the computer's design was so compromised that the result ended up slower than the 8-bit competitors. (I'm and old Z80 coder and never coded for the 6800, 6809, or TMS9900 but read a lot of what other people thought about them.)
@@andrewdunbar828 In the late '70's I thought the Z80 was the best of the 8-bit micros. I compared it to the TMS9900 and finally decided on the TMS9900 since I was working at TI. I built my own homebrew computer and wrote a lot of software at home for the TMS9900 that I later used on a real project at TI. Unfortunately TI didn't continue to develop their micros. In 1982 they gave me a TI994/A with all the accessories and software so I could evaluate it. This was fair for games but not too good for general programming. The IBM PC stole the show and set the standard, (except for Apple). The Z80 and 6800 and 68000 were much better hardware-wise than the 8080 family but Intel's marketing prevailed.
@@bob456fk6 Intel's marketing didn't prevail against the Z80: a _lot_ more designs used the Z80 than the 8080 or 8085. In part that's because the Z80 was an 8080 clone with some extra features. Intel won with the 8086 because IBM used it and the IBM PC took off. It's kind of a crap design compared to a 68000, but Intel got working versions into production first; the 68000 apparently wasn't really yet ready when IBM was evaluating it for the upcoming PC. I wouldn't say the 6800 was better than the 8080, no. Not even as good, really, except that it needed only +5V rather than +5/-5/+12. (I've done a fair amount of programming on the 6800.) The 6502 was probably about as good as the 8080: it didn't look as good on specs, but it ran pretty darn fast if you knew what you were doing. (A 1 MHz 6502 easily is faster than a 2 MHz Z80.)
The Apple II E was our family's first PC and I remember it very fondly. My university had a UNIX network on campus, and every student got an account. My first very own computer was a used 386 that friends who worked at Siemens gifted me. Those last two weren't Apple of course, but I always think of all those systems together, with tons of nostalgia! Thanks for the memories! 😊
Or may the history of Atari. Despite the claim that they went out of business in 1983, Atari came out with the Atari ST in 1985. One of the fun things about it was that you could take the roms out of a Mac, put them in a cartridge, insert it into the Atari ST, and with a small program it could turn your ST into a Mac that also ran FASTER than the Mac.
Seeing those rooms with the old computers reminds me of an urbex video I saw the other year where an abandoned textiles factory in Italy was explored and there remained so many PCs that part of me wanted to dash over there to find one a new home. Sure, they would require repairs but it's fun seeing relics of the past still exist, even if they're in bad shape.
Jobs never understood engineers or computer users in general. Worst purchase I ever made was an original Macintosh. I regret every penny I spent on that piece of 'stuff'. We were completely abandoned, with no capabilities and no upgrade path. To this day I've never spent another dime on anything Apple.
I too bought one of the early Macs. It was cute, the pulldown help menus did help to understand it, but it had just those three programs (word processor, spreadsheet, something) which I had no use for, it had no storage to speak of, it had that dinky b/w screen, and after bringing it work (I was a Unix programmer) and everyone getting a chance to play with it, sent it to my sister, who used it for years with her church. The only reason it was not my worst purchase is because I had no great expectations and it was intended for my sister right from the start. It did also teach me about Steve Jobs dictatorial attitude. I have had jobs using Macs, and it drove me crazy how hard they were to customize. Steve Jobs attitude was "my way or the highway", Bill Gates was "who cares and it's crap", and Unix/Linux was a zillion choices and all the customization you could want, but no one made it easy. I use System76 laptops now, wish they had a Gentoo option, but they are fine hardware.
The Altair 8800 was hardly a personal computer. It, like the Kim I, were really just hobbyist kits. Basically, it was a box with switches and some LEDs on the front of it. While it was affordable, there was no keyboard for input and no display for output. Im my opinion, a more realistic view of what makes a personal computer is that at the least it REQUIRES keyboard (or something more advanced) and a display of some sort (teletype and LEDs notwithstanding). In my view, it would have to be something akin to an IBM 5150, an Apple II, or a VIC 20. I wouldn't even count an Apple I, since it was a kit which keeps it in "kit" territory. While much of the hardware has changed from the IBM 5150 or Apple II to now, the basic paradigm for using a computer hasn't drastically changed. I'm typing on a laptop that has a keyboard for input and a video display showing letters (or characters) when typed. Modern computers do generally have GUIs and pointing devices, however micr are not new as everyone who has ever seen "The Mother of All Demos" can attest. The only thing really keeping Douglas Engelbart's system from being a PC was the fact that it was so horrifically expensive. The truth is that before there were affordable systems with keyboards and displays, the systems that preceded them were too difficult to be used by anyone but a hobbyist or someone in the industry. The advent of the actual PC changed that.
The Kim 1, Altair, Imsai, etc. were all very capable and expandable computers. But you needed a separate terminal to interact with them - which was a significant added cost.
@@markmuir7338 From what I've seen of those machines, getting them working with a keyboard and display is not a trivial affair and requires toggling memory locations using switches on the front of the case and extra, expensive cards and such. If my understanding is correct, this disqualifies them at least in my mind. The point of the "P" in "PC" was that you could: A) Afford the thing. B) Take it out of the box and do something useful with it as a normal person. At 8, I was able to teach myself Apple II basic, TRS 80 basic, and IBM PC basic. With one of those other computers, I sincerely doubt I would have had a clue what to do with them.
@@kevintrumbull5620 These hobbyist micro computers worked the same way as mini computers and mainframes of the era. They were intended for people who already used computers, but who wanted to have one to themselves. That's why they relied on serial terminals - that's what their users were used to. Once you had all the necessary equipment (terminal, disk drives, disk controller card, memory expansion card) you could run 'proper' operating systems on them (CP/M) along with plenty of commercial software. As you said, adding a built-in terminal which worked out of the box would expand the potential market substantially - by including people who had never used a computer before. I also got into computing that way via a Commodore VIC-20 when I was 6, then an Apple II when I was 8. It's important to recognize there were a few of these integrated systems before Apple (eg MCM/70) which didn't have the marketing genius of Steve Jobs, hence are lost to time. Also, several hobbyist computing companies (eg SWTPC) sold their own terminals alongside their micro computers, which were pre-configured. These companies and their products continued along until the IBM PC stole away most of their customers. The RUclips channels 'Adrian's Digital Basement' and 'Usagi Electric' have vastly expanded my knowledge of these early systems, and made me less of an Apple fanboy along the way. There was plenty of innovation all around.
If you were willing to add a terminal, serial port and ROM board to an Altair you could have a system that booted straight up into whatever the ROM does, with no toggling. The Apple II was sold built, as well as in kit form. You could just go down to The Byte Shop, buy one that included a case and PSU, and plug it into your TV at home. Though of course it booted up into WozMon, rather than BASIC or anything like that. But you're right that for consumer adoption, something "akin to an IBM 5150, an Apple II or a VIC 20" was required. But that didn't happen in 1981 with the first IBM PC: that happened in 1977 with the Commodore PET, Apple II and TRS-80 (later called the TRS-80 Model I).
That "other" on the first pie chart, where the collective CP/M (S100) machines, from many small manufacturers. The MITS Altair, and others similarly built, could run Digital Research's CP/M operating system (OS). Yes, Apple had it own proprietary OS. Though Microsoft provided a plugin card, that could run CP/M. The bulk of the CP/M machines exited the market by 1985. As for floppy drives, those first ones were 8" drives. You don't know "floppy", unless you've really handled an 8" floppy ! At the time of the IBM PC, getting patents on software was difficult. Most software was only protected by copyright. I had access to an IBM Technical Reference Manual. It had all schematics, and full printout of the BIOS source code. Used that manual a lot.
Next you should do a video on Commodore and the Amiga. I grew up on the Timex Sinclair and Commodore 64 as a kid. The Demo scene on the Amiga with 4096 color palette far exceeded what the Mac could do. But the best 3rd party add in card was Newtek's Video Toaster. Video switching, compositing, and 3D Computer graphics creation. You could do a series about the Computer Giants of yesteryear. The CPU wars never get old.
The Amiga topic is interesting but I feel like it would end up being just a footnote in the history of microcomputers The best would be a story about Commodore, a company that made millions but in the end only the memory of it remains
I feel like Steve's hiatus from Apple (@24:30 onward) was what made him Steve Jobs. He finally went out into the world and watched people use computers. It's after this that he started saying those iconic things about "starting with the customer, and working your way back to the technology", and so on. Before this he was a great manager and idea guy, but he didn't have a concept of the mission. That's not to say he was "wrong" in redesigning the first Macintosh and quintupling the price from $500 to $2495; the price was not the issue but the technologies and software it shipped with.
the Lisa embodied much of all the same characteristics, but clearly showed that price was very much an issue. What would have changed the dynamics for the Macintosh intro would have been to ship with 512K base memory instead of 128K with expansion capable up to 2MB. All those 128K Macs were useless MacPaint doodlers only and over $3000 was a lot to pay for a doodling computer. And it needed that SCSI port from day one so that serious users (business) could attach large capacity storage. IOW, they needed to have inrtoduced something much more like the Mac Plus from day one and then they would have had a very serious contender against the business serious IBM PC
@@acmefixer1 Basically they made a very expensive device with not a very great idea of who the intended customer was supposed to be. Computers aren't just specs, they are the software they ship with, the software they can run, and the things you use them for.
@@TheSulrossIIRC the SCSI standard wasn't finalised until 1986 - even the Macintosh Plus, released in January of that year, isn't fully compliant with SCSI-1 specifications.
The price of the first Mac absolutely was a major problem. That, and the gross mismanagement of the IIGS, cost Apple what should have been a position of permanent dominance in the computer market.
Yeah! The $3K [US] amount was pricy for its time, and not affordable for the average household, as adjusted for inflation, that 1983 price would equal to near $9K in 2023 [US] dollars.
my first computer c. 1981 was a eurapple. an apple with a non ntsc video output. I clearly remember I got big PCB which needed all the parts and a bunch of cuts and straps to get ntsc. I had one mistake. I built a linear power supply, used a nice wooden Hammond case and surplus keyboard. I first I used a TV set and taped into the circuit with the ntsc signal. Later I got a green screen monitor using a phono jack for the video. Next came the floppy drives, 2 of them Asuka drives. and much later a HDD. A big thing back then was to an 80 column card. I also remember getting a Z80 card to run a spread sheet. I designed and built several cards that plugged into the slots.
Thank you for this fascinating tour through this history. I lived through this whole period, buying an Apple ][+ in 1980. What I particularly liked about your video was, in your usual fashion, its completeness -- I learned a lot that I did not know before, and I feel like I have a much broader view of what was happening at the time. Being so immersed in it, I didn't see the forest for the trees. As always, I am looking forward to your next video!
HP wasn't in the PC market in the early 80's. Their computers may have looked like PC's but they were aimed at professionals and so were the prices. They didn't sell in any meaningful numbers. Not sure where you sourced your data for the pie chart at 11:05. It also conflicts with the market share chart you show later at 22:48. That chart is missing the "other" category as it doesn't include computers like the VIC-20 which was still selling in good numbers into the early 80's. Even the Ti-99 was still on the shelves in those years. One of the reasons Commodore was able to weather the gaming console crash and price wars of the early 80's is that it owned MOS Technologies which made the 6502 CPU. Every Apple II sold was generating revenue for Commodore!
HP had various models that could run CP/M in the early 1980s including the HP-86 and HP-125. I think that pretty much makes them personal computers, albeit not IBM PC-compatible ones. Yes, they were expensive and noted as such at the time, but then there was a perception of enhanced quality over the most comparable competition. It is an odd statement indeed to distinguish between personal computers and computers "aimed at professionals", since a lot of the money made in early microcomputing was made precisely by selling computers to professionals. That was kind of the point of the IBM PC, after all.
Found 'Nokia: The Inside Story' book at a thrift store and was an amazing read. Maybe want to take a look at it for a future video in case you were not already working on it. Thanks for the great content.
Minor note, I bought the Apple 2 Plus, the second Apple 2 generation, in 1979. The actual Apple "1" was a kit with no case or keyboard, very few of those were sold, so I am guessing in the beginning of this video the count of Apple "1"s actually includes Apple 2s.
the biggest problem with the Apple III wasn't chips falling out of their sockets (yes, that was a problem to a small extent, but not the big deal breaker), the BIG problem was it's poor ventilation design caused the components inside to overheat significantly. It had no propose built ventilation and no fan, causing it to overheat badly if left on for more then a couple hours. I remember one Apple III in a computer lab long ago and I remember a note on it said "turn off after one hour" meaning no one could use it for more then an hour cause it was so prone to overheating.
[10:14] The infamous Wall Street Journal ad: "Over the next decade, the growth of the personal computer will continue in logarithmic leaps." It's odd that they didn't say exponential, and instead said the inverse, logarithmic. Anyway, thank you for your fascinating and well-researched videos, they are always worthwhile.
The IBM PC was very similar to a CP/M-based computer, but with the advantage of being able to use more memory, and having a more powerful computer - and a single standard disk format. So it started by swallowing up that segment of the market. The Commodore 64 was a direct competitor to the Apple II in the home market; but since the Apple II was a factor in the business market, IBM was also a competitor for it.
this is when i grew up using computers: aldus pagemaker, photoshop (just photoshop, no number or date) illustrator and premiere. did my senior project (wrote and produced a pilot, but only had time to shoot the commercial during school year) and incorporated 3D graphics and overlays using a 660av and a VHS VCR. had that tape for 20 years till i lost it in a move. it was a spectacular disaster, but it was my disaster. if i still had it, it would be on RUclips for everyone to gawk in horror. 😮😅 Considering I'm nowhere near Hollywood, there was never a risk of my becoming an actual producer, but it was still fun learning the process.
Closed products were Jobs' MO from day one. The only reason Apple started out open was because Wozniak had influence in the early days as the chief designer. However, as his health and interest in the company waned, Jobs' influence rose. Enter the Apple 3 disaster. That was a direct result of Jobs' form over function design philosophy. And he kept making that mistake over and over again. See the Power Mac G4 Cube. He only gained eventual success because the smartphone was a product class where design in fact does beat function. (Cough *AntennaGate* cough)
@@JimmyDoresHairDye The original iMac needed to get out fast, given Apple's very limited resources in 1997. Jobs' input on the product was curtailed as a result. He literally threw a fit when they showed him the prototype with a sliding DVD drive instead of a slot loader. For the same reason they adopted USB, which allowed 3rd party expansion and compatibility with PC devices. The iPod only really took off when they opened it up for Windows users. If they kept it a Mac exclusive accessory, it would never have been as successful as it was.
Question about the ending ... What was Scully's new "game changing" project for apple that became a disaster? Is there another video that explores this?
5 месяцев назад
It’s worth noting that Wozniak’s disk drive lacked the capability to track the position of its drive heads. Consequently, when it needed to reset, the motor would forcefully push the drive heads for about a second, hoping to return them to the starting point. This process generated a loud and distinctive noise, making the disk drive significantly noisier than others available at the time. All in the name of using as few of components and saving cost
Its a pretty strange era there were a lot of good computer companies in the early 80s that where way better than apple and ibm. like commodore Atari Sinclair ti but they all got crushed and forgotten after the 8 bit era. my guess is that they overestimated the home user market instead of focussing on the office market. in the 80s the average person didn't need a computer it could have been handy and fun for gaming. but it wasn't until the 90s and the rise of the internet that the average person wanted a computer for more than just gaming. i think another advantage ibm had and apple to a lesser extend was that their computers where more expandable and open so other companies could improve the pc. and even the old apple computers had more expansion options than the c64 .and when someone did make a hardware upgrade for the c64 nobody supported it.
@belstar1128 At least commodre and Atari did survive into the 16bit era and I do agree that in the 80s a lot of people had no reason to own a computer beyond games but they did try to find success in the business market with the commodre plus 4 and Atari portfolio and stacy. The fact that sometimes as you say expansions don't get supported alongside fragmenting the userbase is maybe why some computer companies don't or didn't support the idea of more expandable systems
The problem is Apple doesn't license its operating system to manufactures which means don't allow these PC manufactures to make computers with MacOS as Microsoft does with Windows. This limits your choices
I remember buying a computer in 1985 for under $200.00 and it used two cassette drives. I could use my music cassettes in it. It had no monitor so I hooked it up to an old black and white large screen TV that was in the basement. It took a while for the computer to move the cassette tapes back and forth while operating. I used a word processing program on it to prepare pamplets, etc.
The early IBM PC’s had a cassette interface. I had an early PC and tried it. It worked, but it was something I wouldn’t rely on. The first IBM PC’s had a max of 64k on the mobo. Mine was the version when they first switched to 256k on the mobo.
Strictly speaking the "integrated Woz Machine" was not the first superb Woz design, which was like 6 chips. It was a one-chip Woz machine that was used on the Apple IIc and early Macs.
So Steve Jobs wasn't so great as many creating him. But I could guess it from words of Jim Keller, which he was telling about how they must hid products from Jobs in early stages, because he don't understand product development cycle, that prototype is not as final product.
My Mac IIe was so s-l-o-w I had CompUSA bust the case and double its memory for "only" $150. Wish I had instead put that money into Apple common stock! ¯\_(ツ)_/¯
Shout to the Performa 5200 series, I loved the industrial design of that machine and it was fairly expandable at least via SCSI, if not obviously slow relative to Windows PCs. Still, it was and still is an extremely fun machine to use during the glory days of CRT. Definitely a nice prelude to the iMac. And way better than all those PowerPC/Performa 6xxx series of which there were way too many configurations (mac ppl know what I’m talking about) System 7.6 - 8.6 looked awesome on that Performa, Ethernet was effortless, 100Mb Zip Drives (mostly) worked for low to mid-size files. I’ve made a point to keep everything compatible with my 5215 still working. SoundEdit 16 v2 was/is also a really awesome piece of software, even if it was slow, the ability to *see* raw sound and music was amazing. Photoshop, Illustrator and QuarkXpress also did ok on this machine. Even when it lagged, it was still an amazing creative tool. It could also handle a certain type of file called the .mp3 ;)
Many people forget how groundbreaking the original Apple II was, it had: BASIC on ROM, expansion, color, graphics, & sound in 1977. It's only "competition" were the TRS-80 & Commodore PET which were text only, no sounds, no graphics. Then the same people brag about Commodore 64, which did have sound & graphics, but was not released until 1982.
There is a briliant story, of Apple investment (way back when Jobs was on Lisa team I belive) into a Hungarian startup called Graphisoft, who subsequently went on and invented new way of designing buildings on computer. Their flagship software Archicad battles the hegemon of AEC industry - Autodesk to this day in much similair fashion as Apple and IBM once did. I would love to see your take on the story, @Asionometry.
Small Costs were such a huge driving force… my high school of 800 kids shared three Trash 80s… cassettes were mostly used for program storage… one of the TRS80s had a disk drive… 1983. Buying a computer to go to college… my class was the first to require one computer for each student… class of ‘87… DEC Pro30, compatible with the DEC10 mainframe on campus…. Word11 was a great word processor. Took a year to get a dot matrix printer (expensive and limited supply) Dad went Commodore64 after that… low cost word processing that looked better than a typewriter could do… huge savings in manpower. Or Mom-power…. 😀 Do typewriters still exist?
As a teenager interested in computers, much if this was happening around me, but I never really understood the impact of all this until much later. Well done.
16:35 "Inspired". Xerox had a great team building a new object-oriented language called Smalltalk with Alan Kay, Dan Ingalls, Adele Goldberg and others (based on Simula 67, the very first object-oriented programming language) and Steve Jobs happened to visit them one day and requested (or mandated) to be shown it. Adele refused but the directives forced her to give him a tour including a demo. Jobs mentions later in an interview that he was shown three things but one of them hit him hard: there were square things, the so called "windows" in the screen and Ingalls was manipulating it with a mouse. He knew he had to "be inspired" by it. The anecdote says that he complained that the window was redrawing too slowly when dragged and Ingalls fixed it on the go (as only with Smalltalk you would be able to do it, with its own integrated debugger) so that the window now fluently moved around. Check video id J33pVRdxWbw "How Steve Jobs got the ideas of GUI from XEROX " at 6:26 for that. All the features that "unique interface" had mentioned at 16:55 were already at Palo Alto with their version of Smalltalk.
i have the apple iie enhanced from 86, the ibm 5150, trs80 model 3, vic 20 (2 prong pwr cord), radio shack coco2 and TI 99+4. the apple has mitsumi key switches and the model 3 and TI have alps, they all feel nice.
The Apple IIe was the very first computer I ever used. Prolly back around 1988-90 using it to play simple games to keep me entertained after school around the age of 7.
I actually used the Apple //e in school, Virginia School for the deaf and blind in Hampton, where I was, had a computer lab with Apple //e's and //e Platinums… and a version of "Dragon Maze" that was made with the option to play it with or without the dragon…
We had some Apple IIe computers with dual disk-drives when I was in Matric (year 11 & 12) - the guys that used them had to have two sets of floppies. One set for hot weather and one set for cold. Neither set could be read in the the other season. (I stuck to the PDP 11/70 via terminals; far more reliable!)
I don't remember the Apple II have color. The monitors were either B/W, Green or Amber. I think later a color card could be inserted to work with a television. That was in the UK. I loved the Apple II which I bought for home use.
The first disk drive for Apple ii did bid NOT use Integrated Woz Machine. The Disk ][ board has a prom and logic ICs only. IWM was the IC put in the //c, later IIe cobtrollersy, and Iigs
Give me an Apple //e or GS over anything else in that era. The Woz machines were so much better than anything else that Apple produced and it’s inconceivable as to why they deliberately crippled the power of the GS as it was a far better machine than the Mac of the same era. I believe that the GS especially with an accelerator card that allowed GSOS to run properly outperformed the MS DOS based PCs of the era. Jobs never liked the open design philosophy of the Apple 2 line, time has definitely shown Woz’s vision was the future for Personal Computing
I loved the irony of it. Big Blue, the immobile corporate behemoth, ended up creating the hacker's computer. Not apple, not Commodore, not Texas Instruments or Atari. IBM did.
Yup, and disappointing. OS/2 was a fantastic operating system, but they recognized their mistake WAY too late. When the PC came out, IBM was of the mind the money was all in the hardware - which it was (and is) with the mainframe, although in fairness, they make a ton of $ with mainframe software. It’s such a shame they literally cut their own throat introducing a personal computer with off the shelf parts. Had they taken the time to either develop or buy an operating system, the world, with no exaggeration, would be different.
@@ronjon7942 : well, IBM did that with the machines they developed using microchannel. A true innovation at the time. However, IBM got greedy and wanted to license the design. If you want to build an add-on card, you had to pay a fee. No one else in the industry was doing that. In the end IBM lost.
@@ronjon7942 "It’s such a shame they literally cut their own throat introducing a personal computer with off the shelf parts." Taking the time to develop everything in-house would have been a HUGE mistake on their part. You're forgetting what a behemoth they were. They were institutionally incapable of getting something quickly to market using their in-house procedures. It would have taken them several years to develop the PC if they had done that, and the market would have passed them by. That market was moving VERY quickly. It was precisely because they used off the shelf parts that they were able to get the PC out as quickly as they did. Could they/should they have patented the BIOS and put an exclusivity clause in the deal with Microsoft for the OS? Perhaps, but they did make a lot of money from the PC. I don't consider what happened to be a shame at all. The consumers benefited, and that's what it's all about, not the fate of one company.
Fortunately That error is what meant that now we do not have to move between different architectures and different systems, today it's only x86 or ARM and Windows or Linux, it makes things much simpler
The time between 1975-1995 was the golden age of computing. The one aspect of both the IBM PC and the Apple II that I really liked was their open architecture (especially the PC), It you wanted to tinker and experiment with different configurations and enable new functions that the machine was not designed for, you could do it. I got an understanding of how computers worked at the hardware and basic OS level that one really can't get today with today's machines. Nowadays, too much is locked down or abstracted to a point where people are just users. Apple especially has tight control of hardware and software. But the Windows world isn't that much different. And the last refuge of the tinkerer is the mid to high end gaming PC that one can custom build. But even then, you are reduced down to a few choices of hardware and software to install.
I was about 12 years old in the early 80s. I had gotten a Vic 20 then a Commodore 64, and loved programming them. I remember going in the local computer store, and they had one brand new Macintosh you could try. It was cool, that was the first mouse I ever used, and it was fun scribbling my name in the paint program. But it was thousands and thousands of dollars, walked away and eventually ended up buying a refurbished Commodore Amiga 1000 for $999 (my Dad bought it for me for Christmas). I can tell you that the Amiga was a superior machine at the time, unfortunately it never really fully took off.
@jr: agreed. And you couldn’t easily exchange data with a PC - different disk formats. Job’s wasn’t dumb, he just wouldn’t accept the fact. He thought the Mac was a better machine and perhaps it was. But you know what his arrogance got him.
Apple... Even from the start it was more about style than substance. That alongside enforced obsolescence of the old models. But you have to give to them for the original Macintosh. I remember seeing that when I was a teenager and LOVED the interface. It was gorgeous!
19:00 Just so you know, Lisa was actually the name of Job's daughter, and the computer was named after her. Even though he didn't accept her as his daughter. He... Was a complicated man to say the least Losing control over his child (the computer project) was devastating for Jobs, although it was the correct decision by the board at the time. In completely unrelated news, this bickering with the board will come to a head later and would directly lead to the re-founding of Pixar as we know it. Yes, the one that makes animated movies.
There was a single networked 1st gen LaserWriter in the Mac lab in my art/design college around 1990. It was named "Godot"...as you were constantly waiting for it.
I love this era of computing. Everything was happening so fast and the market was expanding so quickly, that you had to keep innovating just to survive the cutthroat competition.
I remember as a young professional during this period needing to buy a new computer just about every year. Now I have a 13 year old Macbook. I have replaced the battery. I can't think one one good reason why I should buy a new computer...
That has its upsides and downsides. One of the things powering that dynamic was early computers were horrifically under powered.
Someone back then allegedly said "Nobody will ever need more than 640k" But this obviously wrong even at the time. A plain text file of the average book is around 800k to around 1MB. Not only does the whole book not fit in memory, but it wouldn't even fit on a single floppy disk of the time. In 1981 when the PC first shipped, they didn't have anywhere near even 640k. Most were like 128-256kb of RAM. So an author writing a book on a PC would have to do it by chapter across multiple disks. So they would also need to print the book out as they go to make referencing older parts of the book easy.
@@tarstarkusz That's mostly true, but I would point out that other contemporary 8088 machines, like the Victor 9000/Sirius 1 had GCR-encoded floppies with variable speed zone recording that could hold 1.2 MB. Also, because those pre-dated the IBM PC's memory-mapped I/O reservation of the addresses from 0xA000 to 0xFFFFF, you could run MS-DOS with 896K of RAM. (But of course, no one could afford the chips!)
@@WardCo I think Commodore used the GCR encoding with the 1541 and was also variable speed. IIRC, it was able to put more sectors on the outer tracks.
While it is true the 8088 has a 20 bit address bus to get to 1mb, they decided to put the video card bios and video ram in the C segment (also the hard disk controller) So that effectively limited you to 768k, at least contiguously. And, of course, due to the soft segmenting, a new segment could start at FFFF:FFF0 which gave you 64k (-15 bytes) above 1 megabyte That's what himem-sys (I use dash because a period would make it a link) does. It loads a portion of dos into that 64k above the 1MB mark and you can access it without having to switch into protected mode. Note, it only works on a 286 or higher.
Qemm (an aftermarket memory manager) did something to give much larger conventional memory. It did some kind of shadowing and freed up basically all of the upper segments. (But, it generally doesn't do this. IIRC, it will only ad the A segment with the rest being for TSRs, device drivers and the like)
EMS allowed basically unlimited RAM with a bank-switching scheme. I used to have a BUNCH of these cards loaded with chips. I retired XT class PCs (about 100 of them) with 2-4MB of RAM on these cards. It would be worth a decent amount today on ebay
It was exciting to follow it. I bought a lot of computer magazines and went to several computer shows back then. I miss those days. Now everything is just an improvement on things that existed before.
Trivia: IBM only ditched Lotus as their corporate email platform... last year in 2022.
This isn’t *that* surprising- lotus notes may be one of the stickiest bits of software ever created. In addition to just email it also had what we’d today call a low-code toolset and enabled relatively simple line of business applications, those apps ended up becoming core to a lot of large businesses operations, usually with a bunch of poorly documented business logic built in in somewhat hard to decipher ways since they were rarely built by dedicated programming teams, unwinding all of that without bringing the business to a halt was a hard problem.
What do they use now?
IBM kind of disappeared into irrelevance now.
Best wishes.
@@ajax700 lotus was rebranded to HCL Verse but effectively still lotus. They dropped it last year but surprisingly there was quite a lot of resistance to dropping Verse. They now run outlook (in line with all their clients).
I wouldn’t say they’re irrelevant- they still are a massive player in enterprise, for personal computers they exited that business with the sale of the PC business to Lenovo. (And in hindsight that was probably a good move)
HCL Verse was recently uninstalled from all of their laptops, so I guess only the government is using it now.
It's also not surprising because changing something as fundamental as the core email platform can have bad consequences for archival requirements, will result in a lot of expensive training/retraining of the staff, and could break lots of automation.
Slight correction - only around 200 Apple Is were made - I believe the 300,000 number you mentioned are Apple IIs.
That's correct. By 1981 the Apple I was already deprecated as the Apple II line expanded.
No. He actually said only around 200 Apple Is were sold. No idea where you got 300,000 from. He never said that.
It sure sounds like he says they had an install base of 300,000 Apple 1s in the first ten seconds of the video
Yes its a mistake. Apple II was introduced in 1977, and that's what stormed the market, the visicalc machine. In fact Apple Is are rare.
Only 2 years later the Apple II+ was in the market and that was the king by 1981, absolutely not the I. There was the III in that year but it was a flop, in 1983 came the IIe which was the biggest seller of this era for Apple.
Check the subtitles, he just mispronounced.
Both the Alto computer and Aldus Pagemaker are shoutouts to Aldus Manutius, a Rennaissance printer who designed "small" books that could fit into saddlebags and worked with the legendary Humanist Erasmas (who is the guy on the Pagemaker logo). Among other things, Aldus developed the first fonts used to print copies of classic Greek literature, as well as the original texts of the New Testament.
Erasmus.
The Xerox Alto computer got its name from the Xerox Palo Alto Research Center (PARC), alto being the Spanish word for tall/high/upper.
The GUI technology was first developed at the Standford Research Center (SRI), Xerox licensed the tech, and created the Alto computer at the Xerox Palo Alto Research Center.
Who here remembers buying the doorstop posing as a monthly magazine called Computer Shopper?
OMG yes I had forgotten that! It was huge!
One of the things that made the IBM PC really take off was the 'clones'. The BIOS was reverse engineered along with the entire open architecture, prices quickly dropped, new generations of Intel CPUs supplied more and more compute power (286/386/486/Pentium), and in a relative few years PC clones were EVERYWHERE. I loved that era, got to work on a lot of cool projects. :)
it was the clones that eventually droveibm our of the PC market
IBM went to Atari HQ in 1980-1981 and briefly considered licensing or using the Atari 800 as the basis for the original PC.
That would have been much better than what we got lol. the Atari 800 was genuinely almost 10 years ahead of the pc.
IBM legitimized PC’s for business. The old adage back then was youcan’t go wrong buying IBM products.
@@belstar1128another Jay Miner miracle.
Great video. One quibble. Atari did not "collapse." It contracted but immediately began development on the Atari ST computer which was released in 1985.
Well, Atari broke up, which is perhaps not the same as collapse, but it certainly isn't the same as just contracting, either. Tramiel bought one of the divisions, which led to the ST, but the arcade business went to Namco. And Tramiel had already started the groundwork for the ST before buying Atari, so there is an argument that such work would have continued without any Atari purchase.
The education market was basically what kept Apple alive during those early years after the IBM PC came out. IBM took away the high-margin business market (because IBM) and Commodore was eating their lunch on the high-volume home market (because Apple stuff was stupidly expensive and C64s were cheap). Apple really had few customers other than schools.
Our school computer room had one lonely Mac and five C-64s each with 1541 drive and monitor. Everyone played Summer Games by Epyx or Spy vs Spy on them. No one touched the Mac, I am not sure anyone knew how to use it.
Apple has always sold style, not science. That's a good way of making money: the richest men in the world are just stylists, in cars and handbags. It doesn't add a lot of our general well-being over the long haul.
Technology? Jobs just went out and copied. The original Apple PC is simply a cheap knock-off of the $9,000 Xerox machine of the time.
@@TheDavidlloydjones The original Apple had nothing to do with Xerox PARC stuff. It was more a heavily cost-reduced machine built around the 6502 using off-the-shelf parts and some very clever custom circuitry from Woz to tie it together as best as they could without having access to their own fab (which is something Commodore leveraged to make their stuff cheap).
The later Lisa and Macintosh were the ones that borrowed heavily from PARC's Alto, but that became a theme later on. Again, Woz had some good input in the Macintosh.
Jobs indeed was responsible for tons of styling and marketing decisions (some of them smart, some of them very stupid -- for instance his aversion to fans and expandibility), but Woz made a lot of genius designs turning out a far better machine than such a small company had any right to be making.
It was Jobs decision for high profit margins that essentially killed Apple in the home market. Computing devices wouldn't become a fashion statement until decades later, so their price competitors were in institutional use (business and education).
Funny how American academics tend to be the dumbest yet most financially stable customer base😂
@@TheDavidlloydjones Yup, it was always about style then and now. And with their closed-source mentality, the majority (entire?) of the industrial, engineering, scientific, and gaming community uses Windows, not Apple OS.
Your video is the first place where I heard a mention of an "Apple III".
Some of my friends bought one. It had 128K of RAM which we could not imagine what to use for. It is famous in my circle of friends because my buddy Dan beat the chess program SARGON on it while flying on acid one time -- something he was never able to do sober. Good times.
I worked for a publishing company in the 80s;
We published a group of magazines targeted at computer manufacturers (DEC, IBM, HP).
We had an 'art department', as you describe it. People cutting and pasting strips of text into the shape of a printed page.
And then along came Adobe with Aldus Page Maker.
We bought a Mac and an IBM PC-AT. Installed Pagemaker on both. And sat them down side by side in the Art Department.
It took a day to decide.
I ended up buying every new Mac that came out.
And the SAVINGS were insane.
Before you print 250000 copies of the magazine, you have to print ONE copy and have the advertisers approve it.
That costs about $6000 per page.
Until you can do it yourself.
And the big machine that spewed out columns of printed text to be cut/pasted....gets unplugged.
And I got to avoid having to support the IBM PC and Windows.
Thank you Adobe.
I named my son Aldus.
But it limited you to black and white. Don't most magazines get printed in 4 color? How did you get around the color limitation?
It was slow and time consuming. On an early Mac, I would think everything is 1 page at a time. You simply didn't have the RAM to try and load even a black and white magazine length file or even just more than 1 page.
@@tarstarkuszearly on Aldus Pagemaker was used mostly for mock ups, with the main work still being cut and pastes. I think OP is describing the mock up use case and the move into replace paste ups later.
Though OP, aldus was a separate company. Adobe and Aldus were rivals. They weren't together. And Aldus sued Adobe multiple times for copying their software (and own more often than they lost)
@@medes5597 Thanks.
@@medes5597Plus, PageMaker was an absolute joke for most publishing houses when DTP *really* took off. If you were serious, you used QuarkXpress. Aldus PageMaker was common for form houses and in house SMB. Adobe didn't have crap to compete until they bought Aldus and got PageMaker. But still, they were a non issue until InDesign was released and even then, Quark had to drop the ball on the OSX release of QuarkXpress 4 for InDesign to take over.
Ah, easy way to ensure that the son will be bullied in school.
I helped out at times in an intrnal publishing division of a scientific research institution. I remember Xerox Ventura; and then QuarkXpress.
I noticed you said "Apple I" instead of "Apple II" around 0:10. There were only a total of roughly 175 Apple Is sold.
This channel is constantly blowing me away with the topics and depth.... i always learn new things about topics i think i already know about!
Kudos, glad to see this channel is growing!
You kudos are too lazy to read, needing video content !
apple was always the #1, who made money on IBM PC or Gateway 2000 ?
May I submit a correction? The original disk drive controller for the Apple II was called the "Disk II controller". The "Integrated Woz Machine" was the single-chip version used in later models like the IIGS and IIc.
In addition, the ImageWriter was 72 or 144dpi
I bought my first computer in 1991. The choice was between a used 286 with a 14" VGA monitor and MS-DOS 5 (maybe 4.x) and Win 3 or a Apple Machintosh with a 10" BW display. The price difference was that enormous. What made me choose the 286 was the fact that unlike the mac there was almost an unlimited number of games/software to choose from while the mac only had stuff from Apple. Also the UI on the Apple was only available in Icelandic and even though that's the country I live in I hate using electronic products that are not running on english.
A wise choice.
The early Mac definitely didn't just rely on software from Apple - certainly not in 1991. Back then, if you were writing a report on a Mac it was probably done on Microsoft Word. If you were crunching numbers, it was probably done on Excel. If you were connecting to a BBS, you would have used Telefinder by Silicon Beach Software or one of the many terminal emulators. Yes you might have used some Apple software too, such as HyperCard. But the average user could well have used a system with no Apple software at all.
@@OldAussieAds Not where I was living. Getting Apple software in shops that sold PCs was hardly an option. Then there was the issue with the monitor, the only one available for the Apple I could afford in the early 90's was 20" while the one with the 286 was 14" and the whole package was still cheaper than the Apple.
@@bjarkih1977 Yeah I agree there would have been differences some regions. That said, I'm in Australia and we certainly weren't first class citizens to US companies either.
I'm not sure I I understand your monitor comparison. The Mac monitor was 20" while the PC monitor was 14"? Did you get those measurements around the wrong way? 12"- 14" was standard for a colour Mac in 1991. 20" would have been absolutely huge, even for a PC.
@@OldAussieAds That 20"was well out of my reach.
The Integrated Woz Machine came later. It was a single-chip (= cheaper) version of his original disk controller which mainly consisted of an EPROM + some off-the-shelf TTL chips (+ some help from the main CPU). The IWM was key to making later Apple II and III machines cheaper to produce.
Bought my first Apple floppy drive in 1979. To the best of my memory the code running the drive was already being referred to as IWM.
@@shorttimer874I agree. The IWM was a single chip version of the disk ][ controller card. The original card used discrete chips. And really that was the innovation. Woz took a lot of the operation out of hardware and into software (255 bytes of software!) and used commodity (74 series) chips for the apple ][ disk drive. He took drive controller chips off the card. He removed the data separator from the disk drive itself. It was an amazing design.
Later he worked on the IWM, around the //e or //c timeframe. That was a single chip. Company production volumes had risen and the cost of custom chips was down so making their own made sense. The iigs added the Mega II which put basically a //c in a chip. I think the //c+ used a variant of that.
Also Wendell Sander worked on the IWM revising it to the SWIM (Sander Woz Integrated Machine) which added support for 1.44M floppies (SuperDrive). If nothing else that finally gave interoperability between Apple and IBM floppies. This was done partially by putting back the old MFM data separator that Woz had removed in the disk ][. It was the right move so floppies could be standardized between systems. But the funny thing was the age of GCR (group code recording) was finally here just as Woz had used with the disk ][. Hard drives left MFM for RLL and never looked back. Now on your computer your disk (HDMI or DisplayPort), Ethernet, slots (PCIe) and USB all use GCR. it has various names, 8b/10b or 128b/130b or such. But it’s all similar in design to the disk ][ design which could be named 5b/8b (up to DOS 3.2, 13 sectors or 6b/8b (DOS 3.3, 16 sectors) in modern parlance.
Anyway I think the video gets it wrong. There are other small errors too like showing a genuine Apple Mac 7200 (easily one of the worst Macs ever made) when it speaks of clones. None of this undercuts the story. He gets the tale right even despite some small mistakes.
Steve Woz is a genius when it came to hardware design. Some of the software he developed was amazing, also, for its tight code and small footprint.
If you are curious about the 'tie-in' of the "Little Tramp" character (seen at 10:40) by Charlie Chaplin and the IBM-PC?
Possibly the advertising agency that came up with the idea was using the "Modern Times" movie Chaplin produced and starred in back in 1936. After all, the tag-line for that IBM-PC Ad campaign was: "A tool for _modern times."_
they also did it becaise os washad fallen into public c domain and hey didn't ahve to pay royalties
Not going to lie, I want to live in the alternate universe where Commodore came out on top. The Amiga was so far ahead of the Mac and PC in the mid-to-late 80s...
And the C64/128 were impressive in their day before that.
In the end nothing could compete with the clones. Apple, commodore, atari etc. where all custom machines trying to carve out their own incompatible market.
The flexibility, expandability, competition and innovation of the pc once the cloners entered the market made competing with it pretty much impossible.
It would have been nice if they where able to keep going for longer with new products.
S-100 computers were the king when the Apple II was brought to the market. At first it appeared that the Apple II would be an open design so small companies started making various add-on boards for various tasks. It was when Franklin and a couple other firms decided to start making Apple II clones. Apple went after them with a vengeance and soon the clones were history. When IBM came out with the PC, immediately clones started appearing. The only thing holding them back was the firmware BIOS that was owned by IBM. Companies like Compaq developed clean versions of the BIOS using software engineers who wrote the bios from scratch using only the specifications of the BIOS, never looking at any of the source. So before long there were 10's of companies building PC compatibles at low price. I don't think affected IBM very much because most of the IBM customers were not going to take their chances on a clone. Apple locked down their design thus limiting the installation base of the Apple II and Mac architectures.
and it was this openness and competition that led to so much success and innovation in the PC space.
Today Intel cuts its own throat.
eg. think about the Atom line.
Generally bootloader ('bios') was locked. how cool would it be to be able to boot full x86/64 windows on your phone eg (zenphone2) or $50-100 tablets.
eventually is was possible on the zenphone2 but... this should've been a feature, not a chore for the average consumer!
There were countless Apple II clone in Asia.
Although Apple still produces notebooks and desktops now, and IBM has not for a long, long time…
@@PRH123 IBM selling off their personal computers business is a dumb move. The name recognition factor alone will make them some serious money. The Thinkpad was so good that can be milk for many years to come and only need minimal investment.
Yet they sell it for quick cash. Now the money is gone. All they have now is service and a R&D department hoping for a jackpot, But they don't have enough money and resources to make it happen.
False. Not many people cared about the 'openness' of architectures in the 80s. A computer back then was considered open of it had expansion slots and people didn't expect computers to be back compat. The IBM pc didn't succeed because it was ,ore 'open'. People largely expected that the IBM PC would win the PC market and drive out apple and commodore and others like they had done to Honeywell and Burroughs in the 60s and 70s. Truth is that for 3 years after the first IBM PC did not have any competitive clones.
Have u started this full time now, very regular uploads .
Have you ever thought of doing a video on how qwerty and other key layout setups came about? Apologies if you already have, it's just something that I keep thinking about when I look at all those weird attempts at keyboards in the early eighties.
Thanks!
This video illustrates to me why this is one of the best channels on RUclips.
You left us all on a cliffhanger! 😅
Note that the 6502 is not based on the Motorola 6800 design. MOS Technology made a version, 6501, that was pin compatible with the 6800. But pin compatibility was the only compatibility. The 6501 was dropped after a lawsuit from Motorola. The 6502 was constructed in a different way than the 6800, and this made it possible to make it for cents and sell it for a few dollars. So the case with 6800 vs 6502 is very different than 8080 vs Z80, which were machine code compatible, even if the Z80 were constructed very differently than 8080, and had its own assembly language.
And, btw, thanks for your high quality and very interesting content! I love your channel.
Maybe I misunderstood something, but it sounded like Steve Jobs almost killed the company in the 80s.
He did. John Scully saved it.
@@peterfireflylund : agreed. Job’s was a visionary, but his ideas exceeded the hardware/software capabilities at the time. He wanted it so bad, he was obsessed. You couldn’t spend money on development, like he was doing, and not have investors get upset.
Steve Jobs succeeded in killing himself in 2011.
@@deepsleep7822 Jobs' one talent was marketing himself and convincing people that he was a 'visionary'. What he was was a gold-plated arsehole. Wozniak was the reason Apple existed; Jobs was nearly the reason it didn't.
@@peterfireflylund That's not exactly how the Apple story ended.
I have used the Apple Lisa computer and LaserWriter while back in the late 1980's, maybe first in South Korea. These machines were revolutionary piece of art, science and engineering marvel . Graphic screen, UI and mouse and beautiful printed output, rivalling phototypeset book. Window desktop and object-oriented drawing tool software. The software itself worth the money. IT was easy to use and understand. It was culture shock. I wrote my CS master's degree thesis about object-oriented language with Lisa (then Mac XL) & LaserWriter. So I couldn't understand why this revolutionary machine didn't sell well enough in market.
According to Bill Atkinson it took the 8 MHz Macintosh at least half a second to draw a screenful of characters - 1.75 times faster than Lisa, and that's before accounting for Lisa's larger screen.
Lisa didn't sell because it was unusably slow, and also cost $10,000 (±$31,000 today).
Very nice. The Fall 1985 marketing push you cite resulted in my company becoming a Mac / LaserWriter / PageMaker customer at the time. I was the one sent out to "the seminar." I reported back that it was great, and we should buy it.
This review of the early history of the PC is extremely interesting to me.
I was watching this field intently during the mid-70's to mid-80's.
In the late '70's the Z80 was the best 8 bit micro but the 8080 had a head start and good marketing.
The Motorola 6800 was very good too but it didn't have the marketing clout behind it.
The Texas Instruments TI-994A was more of a toy computer, ok for games but not for serious work applications.
When the IBM PC came out, its hardware wasn't very good but the IBM name was magical and everyone wanted something that was "IBM Compatible".
The Texas Instruments PC ("Professional Computer") in the early '80's had much better hardware but without the feature of IBM compatibility, it wasn't a marketing success.
Yeah. I'm of the age group that grew with Apple and Microsoft, and it's bewildering they're basically retreading the mistakes of the companies they disrupted.
You seem to be mixing up CPUs (Z80 and 6800) and computers (TI-99/4A). The 6809 derivative of the 6800 was well loved by its programmers but apparently too expensive compared to the competition. The TMS9900 as the CPU in the TI-99/4A was a true 16-bit CPU when all the competition was 8-bit, but other parts of the computer's design was so compromised that the result ended up slower than the 8-bit competitors. (I'm and old Z80 coder and never coded for the 6800, 6809, or TMS9900 but read a lot of what other people thought about them.)
@@andrewdunbar828 In the late '70's I thought the Z80 was the best of the 8-bit micros. I compared it to the TMS9900 and finally decided on the TMS9900 since I was working at TI. I built my own homebrew computer and wrote a lot of software at home for the TMS9900 that I later used on a real project at TI. Unfortunately TI didn't continue to develop their micros. In 1982 they gave me a TI994/A with all the accessories and software so I could evaluate it. This was fair for games but not too good for general programming. The IBM PC stole the show and set the standard, (except for Apple).
The Z80 and 6800 and 68000 were much better hardware-wise than the 8080 family but Intel's marketing prevailed.
@@bob456fk6 Intel's marketing didn't prevail against the Z80: a _lot_ more designs used the Z80 than the 8080 or 8085. In part that's because the Z80 was an 8080 clone with some extra features.
Intel won with the 8086 because IBM used it and the IBM PC took off. It's kind of a crap design compared to a 68000, but Intel got working versions into production first; the 68000 apparently wasn't really yet ready when IBM was evaluating it for the upcoming PC.
I wouldn't say the 6800 was better than the 8080, no. Not even as good, really, except that it needed only +5V rather than +5/-5/+12. (I've done a fair amount of programming on the 6800.) The 6502 was probably about as good as the 8080: it didn't look as good on specs, but it ran pretty darn fast if you knew what you were doing. (A 1 MHz 6502 easily is faster than a 2 MHz Z80.)
Sinclair was for kids only, never better than apple one !
Only the Gateway 2000 compaq people made PC bigger than apple ever was, apple won it all !
The Apple II E was our family's first PC and I remember it very fondly. My university had a UNIX network on campus, and every student got an account. My first very own computer was a used 386 that friends who worked at Siemens gifted me. Those last two weren't Apple of course, but I always think of all those systems together, with tons of nostalgia! Thanks for the memories! 😊
Or may the history of Atari. Despite the claim that they went out of business in 1983, Atari came out with the Atari ST in 1985. One of the fun things about it was that you could take the roms out of a Mac, put them in a cartridge, insert it into the Atari ST, and with a small program it could turn your ST into a Mac that also ran FASTER than the Mac.
Merci!
Seeing those rooms with the old computers reminds me of an urbex video I saw the other year where an abandoned textiles factory in Italy was explored and there remained so many PCs that part of me wanted to dash over there to find one a new home. Sure, they would require repairs but it's fun seeing relics of the past still exist, even if they're in bad shape.
Apple II and II+ is what you meant in 1981 as the Apple I was the 1976 single board computer requiring soldering…
Jobs never understood engineers or computer users in general. Worst purchase I ever made was an original Macintosh. I regret every penny I spent on that piece of 'stuff'. We were completely abandoned, with no capabilities and no upgrade path. To this day I've never spent another dime on anything Apple.
I too bought one of the early Macs. It was cute, the pulldown help menus did help to understand it, but it had just those three programs (word processor, spreadsheet, something) which I had no use for, it had no storage to speak of, it had that dinky b/w screen, and after bringing it work (I was a Unix programmer) and everyone getting a chance to play with it, sent it to my sister, who used it for years with her church. The only reason it was not my worst purchase is because I had no great expectations and it was intended for my sister right from the start.
It did also teach me about Steve Jobs dictatorial attitude. I have had jobs using Macs, and it drove me crazy how hard they were to customize. Steve Jobs attitude was "my way or the highway", Bill Gates was "who cares and it's crap", and Unix/Linux was a zillion choices and all the customization you could want, but no one made it easy. I use System76 laptops now, wish they had a Gentoo option, but they are fine hardware.
I mean, Unix programmer probably wasn’t their target audience? Seems like it worked really well for your sister.
Same, *NEVER* understood the hype for Apple, when Android is better in every way, atleast to me, imo!
I used Android for a decade until iPhone 14 Pro. To say that Android is better in every way is delusionally biased.
@@jonhall2274 Don't forget Android is a grandchild of Unix and that its distant cousin is OS X.
The Altair 8800 was hardly a personal computer. It, like the Kim I, were really just hobbyist kits. Basically, it was a box with switches and some LEDs on the front of it. While it was affordable, there was no keyboard for input and no display for output.
Im my opinion, a more realistic view of what makes a personal computer is that at the least it REQUIRES keyboard (or something more advanced) and a display of some sort (teletype and LEDs notwithstanding). In my view, it would have to be something akin to an IBM 5150, an Apple II, or a VIC 20. I wouldn't even count an Apple I, since it was a kit which keeps it in "kit" territory.
While much of the hardware has changed from the IBM 5150 or Apple II to now, the basic paradigm for using a computer hasn't drastically changed. I'm typing on a laptop that has a keyboard for input and a video display showing letters (or characters) when typed. Modern computers do generally have GUIs and pointing devices, however micr are not new as everyone who has ever seen "The Mother of All Demos" can attest. The only thing really keeping Douglas Engelbart's system from being a PC was the fact that it was so horrifically expensive.
The truth is that before there were affordable systems with keyboards and displays, the systems that preceded them were too difficult to be used by anyone but a hobbyist or someone in the industry. The advent of the actual PC changed that.
The Kim 1, Altair, Imsai, etc. were all very capable and expandable computers. But you needed a separate terminal to interact with them - which was a significant added cost.
@@markmuir7338 From what I've seen of those machines, getting them working with a keyboard and display is not a trivial affair and requires toggling memory locations using switches on the front of the case and extra, expensive cards and such. If my understanding is correct, this disqualifies them at least in my mind.
The point of the "P" in "PC" was that you could:
A) Afford the thing.
B) Take it out of the box and do something useful with it as a normal person.
At 8, I was able to teach myself Apple II basic, TRS 80 basic, and IBM PC basic. With one of those other computers, I sincerely doubt I would have had a clue what to do with them.
@@kevintrumbull5620 These hobbyist micro computers worked the same way as mini computers and mainframes of the era. They were intended for people who already used computers, but who wanted to have one to themselves. That's why they relied on serial terminals - that's what their users were used to. Once you had all the necessary equipment (terminal, disk drives, disk controller card, memory expansion card) you could run 'proper' operating systems on them (CP/M) along with plenty of commercial software.
As you said, adding a built-in terminal which worked out of the box would expand the potential market substantially - by including people who had never used a computer before. I also got into computing that way via a Commodore VIC-20 when I was 6, then an Apple II when I was 8. It's important to recognize there were a few of these integrated systems before Apple (eg MCM/70) which didn't have the marketing genius of Steve Jobs, hence are lost to time. Also, several hobbyist computing companies (eg SWTPC) sold their own terminals alongside their micro computers, which were pre-configured. These companies and their products continued along until the IBM PC stole away most of their customers.
The RUclips channels 'Adrian's Digital Basement' and 'Usagi Electric' have vastly expanded my knowledge of these early systems, and made me less of an Apple fanboy along the way. There was plenty of innovation all around.
If you were willing to add a terminal, serial port and ROM board to an Altair you could have a system that booted straight up into whatever the ROM does, with no toggling.
The Apple II was sold built, as well as in kit form. You could just go down to The Byte Shop, buy one that included a case and PSU, and plug it into your TV at home. Though of course it booted up into WozMon, rather than BASIC or anything like that.
But you're right that for consumer adoption, something "akin to an IBM 5150, an Apple II or a VIC 20" was required. But that didn't happen in 1981 with the first IBM PC: that happened in 1977 with the Commodore PET, Apple II and TRS-80 (later called the TRS-80 Model I).
That "other" on the first pie chart, where the collective CP/M (S100) machines, from many small manufacturers. The MITS Altair, and others similarly built, could run Digital Research's CP/M operating system (OS). Yes, Apple had it own proprietary OS. Though Microsoft provided a plugin card, that could run CP/M. The bulk of the CP/M machines exited the market by 1985.
As for floppy drives, those first ones were 8" drives. You don't know "floppy", unless you've really handled an 8" floppy !
At the time of the IBM PC, getting patents on software was difficult. Most software was only protected by copyright. I had access to an IBM Technical Reference Manual. It had all schematics, and full printout of the BIOS source code. Used that manual a lot.
Next you should do a video on Commodore and the Amiga. I grew up on the Timex Sinclair and Commodore 64 as a kid. The Demo scene on the Amiga with 4096 color palette far exceeded what the Mac could do. But the best 3rd party add in card was Newtek's Video Toaster. Video switching, compositing, and 3D Computer graphics creation. You could do a series about the Computer Giants of yesteryear. The CPU wars never get old.
Amiga only, commodore did too many weird systems, who needed that ?
kids, play a game on it !
The Amiga topic is interesting but I feel like it would end up being just a footnote in the history of microcomputers
The best would be a story about Commodore, a company that made millions but in the end only the memory of it remains
I feel like Steve's hiatus from Apple (@24:30 onward) was what made him Steve Jobs. He finally went out into the world and watched people use computers. It's after this that he started saying those iconic things about "starting with the customer, and working your way back to the technology", and so on. Before this he was a great manager and idea guy, but he didn't have a concept of the mission. That's not to say he was "wrong" in redesigning the first Macintosh and quintupling the price from $500 to $2495; the price was not the issue but the technologies and software it shipped with.
the Lisa embodied much of all the same characteristics, but clearly showed that price was very much an issue.
What would have changed the dynamics for the Macintosh intro would have been to ship with 512K base memory instead of 128K with expansion capable up to 2MB. All those 128K Macs were useless MacPaint doodlers only and over $3000 was a lot to pay for a doodling computer. And it needed that SCSI port from day one so that serious users (business) could attach large capacity storage.
IOW, they needed to have inrtoduced something much more like the Mac Plus from day one and then they would have had a very serious contender against the business serious IBM PC
I'm not sure what that last sentence, after $2495, means; it needs to be explained or rephrased. What was the issue?
@@acmefixer1 Basically they made a very expensive device with not a very great idea of who the intended customer was supposed to be. Computers aren't just specs, they are the software they ship with, the software they can run, and the things you use them for.
@@TheSulrossIIRC the SCSI standard wasn't finalised until 1986 - even the Macintosh Plus, released in January of that year, isn't fully compliant with SCSI-1 specifications.
The price of the first Mac absolutely was a major problem. That, and the gross mismanagement of the IIGS, cost Apple what should have been a position of permanent dominance in the computer market.
This article does not seem to tell the cost of the IBM PC. I had one in 1983 or 84 and it cost almost $3K with a monitor and printer.
Yeah!
The $3K [US] amount was pricy for its time, and not affordable for the average household, as adjusted for inflation, that 1983 price would equal to near $9K in 2023 [US] dollars.
my first computer c. 1981 was a eurapple. an apple with a non ntsc video output. I clearly remember I got big PCB which needed all the parts and a bunch of cuts and straps to get ntsc. I had one mistake. I built a linear power supply, used a nice wooden Hammond case and surplus keyboard. I first I used a TV set and taped into the circuit with the ntsc signal. Later I got a green screen monitor using a phono jack for the video. Next came the floppy drives, 2 of them Asuka drives. and much later a HDD. A big thing back then was to an 80 column card. I also remember getting a Z80 card to run a spread sheet. I designed and built several cards that plugged into the slots.
Thank you for this fascinating tour through this history. I lived through this whole period, buying an Apple ][+ in 1980. What I particularly liked about your video was, in your usual fashion, its completeness -- I learned a lot that I did not know before, and I feel like I have a much broader view of what was happening at the time. Being so immersed in it, I didn't see the forest for the trees. As always, I am looking forward to your next video!
Personal computing is one of the few that's grown up in an entire human generation.
This was a great video. It brought back so many memories. Filled in so many details. Thank you.
HP wasn't in the PC market in the early 80's. Their computers may have looked like PC's but they were aimed at professionals and so were the prices. They didn't sell in any meaningful numbers. Not sure where you sourced your data for the pie chart at 11:05. It also conflicts with the market share chart you show later at 22:48. That chart is missing the "other" category as it doesn't include computers like the VIC-20 which was still selling in good numbers into the early 80's. Even the Ti-99 was still on the shelves in those years.
One of the reasons Commodore was able to weather the gaming console crash and price wars of the early 80's is that it owned MOS Technologies which made the 6502 CPU. Every Apple II sold was generating revenue for Commodore!
Mainly through licensing to companies like Rockwell and Synertek.
@@brodriguez11000 Yes second source manufacturing
HP had various models that could run CP/M in the early 1980s including the HP-86 and HP-125. I think that pretty much makes them personal computers, albeit not IBM PC-compatible ones. Yes, they were expensive and noted as such at the time, but then there was a perception of enhanced quality over the most comparable competition.
It is an odd statement indeed to distinguish between personal computers and computers "aimed at professionals", since a lot of the money made in early microcomputing was made precisely by selling computers to professionals. That was kind of the point of the IBM PC, after all.
Been waiting for you to do this topic! Thank you
Found 'Nokia: The Inside Story' book at a thrift store and was an amazing read. Maybe want to take a look at it for a future video in case you were not already working on it. Thanks for the great content.
Minor note, I bought the Apple 2 Plus, the second Apple 2 generation, in 1979. The actual Apple "1" was a kit with no case or keyboard, very few of those were sold, so I am guessing in the beginning of this video the count of Apple "1"s actually includes Apple 2s.
you need to do a video on Novell
Concur.
Thank you! Your videos are just excellent!
the biggest problem with the Apple III wasn't chips falling out of their sockets (yes, that was a problem to a small extent, but not the big deal breaker), the BIG problem was it's poor ventilation design caused the components inside to overheat significantly. It had no propose built ventilation and no fan, causing it to overheat badly if left on for more then a couple hours. I remember one Apple III in a computer lab long ago and I remember a note on it said "turn off after one hour" meaning no one could use it for more then an hour cause it was so prone to overheating.
Okay, you can't just end on a cliffhanger like that. Can't wait for part two.
300,000 Apple I computers instaled? Less than 200 were ever made....
[10:14] The infamous Wall Street Journal ad: "Over the next decade, the growth of the personal computer will continue in logarithmic leaps." It's odd that they didn't say exponential, and instead said the inverse, logarithmic. Anyway, thank you for your fascinating and well-researched videos, they are always worthwhile.
The IBM PC was very similar to a CP/M-based computer, but with the advantage of being able to use more memory, and having a more powerful computer - and a single standard disk format. So it started by swallowing up that segment of the market. The Commodore 64 was a direct competitor to the Apple II in the home market; but since the Apple II was a factor in the business market, IBM was also a competitor for it.
this is when i grew up using computers: aldus pagemaker, photoshop (just photoshop, no number or date) illustrator and premiere. did my senior project (wrote and produced a pilot, but only had time to shoot the commercial during school year) and incorporated 3D graphics and overlays using a 660av and a VHS VCR. had that tape for 20 years till i lost it in a move.
it was a spectacular disaster, but it was my disaster. if i still had it, it would be on RUclips for everyone to gawk in horror. 😮😅
Considering I'm nowhere near Hollywood, there was never a risk of my becoming an actual producer, but it was still fun learning the process.
the fact that Apple started out open and expandable is so bizarre. Now they operate solely as a completely walled garden ecosystem by design.
Closed products were Jobs' MO from day one. The only reason Apple started out open was because Wozniak had influence in the early days as the chief designer. However, as his health and interest in the company waned, Jobs' influence rose. Enter the Apple 3 disaster. That was a direct result of Jobs' form over function design philosophy. And he kept making that mistake over and over again. See the Power Mac G4 Cube. He only gained eventual success because the smartphone was a product class where design in fact does beat function. (Cough *AntennaGate* cough)
What about the iMac and iPod?
@@JimmyDoresHairDye The original iMac needed to get out fast, given Apple's very limited resources in 1997. Jobs' input on the product was curtailed as a result. He literally threw a fit when they showed him the prototype with a sliding DVD drive instead of a slot loader. For the same reason they adopted USB, which allowed 3rd party expansion and compatibility with PC devices. The iPod only really took off when they opened it up for Windows users. If they kept it a Mac exclusive accessory, it would never have been as successful as it was.
Question about the ending ... What was Scully's new "game changing" project for apple that became a disaster? Is there another video that explores this?
It’s worth noting that Wozniak’s disk drive lacked the capability to track the position of its drive heads. Consequently, when it needed to reset, the motor would forcefully push the drive heads for about a second, hoping to return them to the starting point. This process generated a loud and distinctive noise, making the disk drive significantly noisier than others available at the time. All in the name of using as few of components and saving cost
Apple sold hundreds of apple 1s, not 300k. That would be the Apple II.
Its a pretty strange era there were a lot of good computer companies in the early 80s that where way better than apple and ibm. like commodore Atari Sinclair ti but they all got crushed and forgotten after the 8 bit era. my guess is that they overestimated the home user market instead of focussing on the office market. in the 80s the average person didn't need a computer it could have been handy and fun for gaming. but it wasn't until the 90s and the rise of the internet that the average person wanted a computer for more than just gaming.
i think another advantage ibm had and apple to a lesser extend was that their computers where more expandable and open so other companies could improve the pc. and even the old apple computers had more expansion options than the c64 .and when someone did make a hardware upgrade for the c64 nobody supported it.
@@ghost_mall English is a dumb language
@belstar1128 At least commodre and Atari did survive into the 16bit era and I do agree that in the 80s a lot of people had no reason to own a computer beyond games but they did try to find success in the business market with the commodre plus 4 and Atari portfolio and stacy. The fact that sometimes as you say expansions don't get supported alongside fragmenting the userbase is maybe why some computer companies don't or didn't support the idea of more expandable systems
The problem is Apple doesn't license its operating system to manufactures which means don't allow these PC manufactures to make computers with MacOS as Microsoft does with Windows. This limits your choices
They licensed the OS for a while, but found out that other companies made a better Mac.
That was a very well done research. Especially looking for markets and applications for (existing) products.
A great episode that needs a part 2. Is there one or will there be one? 😊
I remember buying a computer in 1985 for under $200.00 and it used two cassette drives. I could use my music cassettes in it. It had no monitor so I hooked it up to an old black and white large screen TV that was in the basement. It took a while for the computer to move the cassette tapes back and forth while operating. I used a word processing program on it to prepare pamplets, etc.
The early IBM PC’s had a cassette interface. I had an early PC and tried it. It worked, but it was something I wouldn’t rely on. The first IBM PC’s had a max of 64k on the mobo. Mine was the version when they first switched to 256k on the mobo.
Strictly speaking the "integrated Woz Machine" was not the first superb Woz design, which was like 6 chips. It was a one-chip Woz machine that was used on the Apple IIc and early Macs.
I think I see an Apple Chapter 2 video in my future. Great Video, Jon - Thanks!
So Steve Jobs wasn't so great as many creating him. But I could guess it from words of Jim Keller, which he was telling about how they must hid products from Jobs in early stages, because he don't understand product development cycle, that prototype is not as final product.
My Mac IIe was so s-l-o-w I had CompUSA bust the case and double its memory for "only" $150. Wish I had instead put that money into Apple common stock!
¯\_(ツ)_/¯
Shout to the Performa 5200 series, I loved the industrial design of that machine and it was fairly expandable at least via SCSI, if not obviously slow relative to Windows PCs. Still, it was and still is an extremely fun machine to use during the glory days of CRT. Definitely a nice prelude to the iMac. And way better than all those PowerPC/Performa 6xxx series of which there were way too many configurations (mac ppl know what I’m talking about) System 7.6 - 8.6 looked awesome on that Performa, Ethernet was effortless, 100Mb Zip Drives (mostly) worked for low to mid-size files. I’ve made a point to keep everything compatible with my 5215 still working. SoundEdit 16 v2 was/is also a really awesome piece of software, even if it was slow, the ability to *see* raw sound and music was amazing. Photoshop, Illustrator and QuarkXpress also did ok on this machine. Even when it lagged, it was still an amazing creative tool. It could also handle a certain type of file called the .mp3 ;)
Many people forget how groundbreaking the original Apple II was, it had: BASIC on ROM, expansion, color, graphics, & sound in 1977. It's only "competition" were the TRS-80 & Commodore PET which were text only, no sounds, no graphics. Then the same people brag about Commodore 64, which did have sound & graphics, but was not released until 1982.
The best content of these topics on youtube. Big fucking like for you
Huh? The narration opens by stating that by 1981 Apple had an installed base of over 300,000 Apple 1s.
Did you mean 300,000 Apple iis?
Yeah I have 5 apple IIs. I would trade them all plus my mac plus for an apple I.
you bring back so many memories... VisiCalc, 20/20, etc
.
There is a briliant story, of Apple investment (way back when Jobs was on Lisa team I belive) into a Hungarian startup called Graphisoft, who subsequently went on and invented new way of designing buildings on computer. Their flagship software Archicad battles the hegemon of AEC industry - Autodesk to this day in much similair fashion as Apple and IBM once did. I would love to see your take on the story, @Asionometry.
Small Costs were such a huge driving force… my high school of 800 kids shared three Trash 80s… cassettes were mostly used for program storage… one of the TRS80s had a disk drive… 1983.
Buying a computer to go to college… my class was the first to require one computer for each student… class of ‘87…
DEC Pro30, compatible with the DEC10 mainframe on campus…. Word11 was a great word processor. Took a year to get a dot matrix printer (expensive and limited supply)
Dad went Commodore64 after that… low cost word processing that looked better than a typewriter could do… huge savings in manpower. Or Mom-power…. 😀
Do typewriters still exist?
Saw one in an antique store. I’d have bought it if it was an IBM.
it's hard to comprehend how expensive apple products were in the 80s here in Germany. a private person owning an Apple was practically unheard of.
3:22 Why am I not surprised that Steve Jobs argued against an expandable system
As a teenager interested in computers, much if this was happening around me, but I never really understood the impact of all this until much later. Well done.
Congrats on 500k, well earned!
16:35 "Inspired". Xerox had a great team building a new object-oriented language called Smalltalk with Alan Kay, Dan Ingalls, Adele Goldberg and others (based on Simula 67, the very first object-oriented programming language) and Steve Jobs happened to visit them one day and requested (or mandated) to be shown it. Adele refused but the directives forced her to give him a tour including a demo. Jobs mentions later in an interview that he was shown three things but one of them hit him hard: there were square things, the so called "windows" in the screen and Ingalls was manipulating it with a mouse. He knew he had to "be inspired" by it. The anecdote says that he complained that the window was redrawing too slowly when dragged and Ingalls fixed it on the go (as only with Smalltalk you would be able to do it, with its own integrated debugger) so that the window now fluently moved around. Check video id J33pVRdxWbw "How Steve Jobs got the ideas of GUI from XEROX " at 6:26 for that. All the features that "unique interface" had mentioned at 16:55 were already at Palo Alto with their version of Smalltalk.
I love your videos. Well researched and presented
Apple's history of overpriced computers continues to this day. 😝
Yeah, even in the used market, which is where I play.
Its a good thing tbh
They are literally cheaper than anything else now
You get what you pay for...
You can get one for 450 dollars on sale now. Cheaper than it ever has been especially considering inflation
i have the apple iie enhanced from 86, the ibm 5150, trs80 model 3, vic 20 (2 prong pwr cord), radio shack coco2 and TI 99+4. the apple has mitsumi key switches and the model 3 and TI have alps, they all feel nice.
The Apple IIe was the very first computer I ever used. Prolly back around 1988-90 using it to play simple games to keep me entertained after school around the age of 7.
I actually used the Apple //e in school, Virginia School for the deaf and blind in Hampton, where I was, had a computer lab with Apple //e's and //e Platinums… and a version of "Dragon Maze" that was made with the option to play it with or without the dragon…
We had some Apple IIe computers with dual disk-drives when I was in Matric (year 11 & 12) - the guys that used them had to have two sets of floppies. One set for hot weather and one set for cold. Neither set could be read in the the other season. (I stuck to the PDP 11/70 via terminals; far more reliable!)
I don't remember the Apple II have color. The monitors were either B/W, Green or Amber. I think later a color card could be inserted to work with a television. That was in the UK. I loved the Apple II which I bought for home use.
The first disk drive for Apple ii did bid NOT use Integrated Woz Machine. The Disk ][ board has a prom and logic ICs only. IWM was the IC put in the //c, later IIe cobtrollersy, and Iigs
Give me an Apple //e or GS over anything else in that era. The Woz machines were so much better than anything else that Apple produced and it’s inconceivable as to why they deliberately crippled the power of the GS as it was a far better machine than the Mac of the same era. I believe that the GS especially with an accelerator card that allowed GSOS to run properly outperformed the MS DOS based PCs of the era. Jobs never liked the open design philosophy of the Apple 2 line, time has definitely shown Woz’s vision was the future for Personal Computing
One of my uncles built an Altair 8800. It seemed like science fiction then.
1:55 I confused this Altair with the Altair Engineering (CAE company)
I loved the irony of it. Big Blue, the immobile corporate behemoth, ended up creating the hacker's computer. Not apple, not Commodore, not Texas Instruments or Atari. IBM did.
1:48 We had two of these, they where amazing for the time . . .
It seems so strange that IBM didn't think it should own/control the operating system it's putting into everyone's homes.
Yup, and disappointing. OS/2 was a fantastic operating system, but they recognized their mistake WAY too late. When the PC came out, IBM was of the mind the money was all in the hardware - which it was (and is) with the mainframe, although in fairness, they make a ton of $ with mainframe software.
It’s such a shame they literally cut their own throat introducing a personal computer with off the shelf parts. Had they taken the time to either develop or buy an operating system, the world, with no exaggeration, would be different.
@@ronjon7942 : well, IBM did that with the machines they developed using microchannel. A true innovation at the time. However, IBM got greedy and wanted to license the design. If you want to build an add-on card, you had to pay a fee. No one else in the industry was doing that. In the end IBM lost.
@@ronjon7942 Agree. OS/2 was awesome. And Warp was REALLY good. Sad to say it was too late.
@@ronjon7942 "It’s such a shame they literally cut their own throat introducing a personal computer with off the shelf parts."
Taking the time to develop everything in-house would have been a HUGE mistake on their part. You're forgetting what a behemoth they were. They were institutionally incapable of getting something quickly to market using their in-house procedures. It would have taken them several years to develop the PC if they had done that, and the market would have passed them by. That market was moving VERY quickly. It was precisely because they used off the shelf parts that they were able to get the PC out as quickly as they did. Could they/should they have patented the BIOS and put an exclusivity clause in the deal with Microsoft for the OS? Perhaps, but they did make a lot of money from the PC. I don't consider what happened to be a shame at all. The consumers benefited, and that's what it's all about, not the fate of one company.
Fortunately
That error is what meant that now we do not have to move between different architectures and different systems, today it's only x86 or ARM and Windows or Linux, it makes things much simpler
The time between 1975-1995 was the golden age of computing. The one aspect of both the IBM PC and the Apple II that I really liked was their open architecture (especially the PC), It you wanted to tinker and experiment with different configurations and enable new functions that the machine was not designed for, you could do it. I got an understanding of how computers worked at the hardware and basic OS level that one really can't get today with today's machines. Nowadays, too much is locked down or abstracted to a point where people are just users. Apple especially has tight control of hardware and software. But the Windows world isn't that much different. And the last refuge of the tinkerer is the mid to high end gaming PC that one can custom build. But even then, you are reduced down to a few choices of hardware and software to install.
I was about 12 years old in the early 80s. I had gotten a Vic 20 then a Commodore 64, and loved programming them. I remember going in the local computer store, and they had one brand new Macintosh you could try. It was cool, that was the first mouse I ever used, and it was fun scribbling my name in the paint program. But it was thousands and thousands of dollars, walked away and eventually ended up buying a refurbished Commodore Amiga 1000 for $999 (my Dad bought it for me for Christmas). I can tell you that the Amiga was a superior machine at the time, unfortunately it never really fully took off.
Fascinating story! Well done, sir. 😊
“Why isn’t the Mac selling?”.. it was way overpriced..
Even a single expansion slot would have helped allowing for a co-processor even though they're different architectures.
@jr: agreed. And you couldn’t easily exchange data with a PC - different disk formats. Job’s wasn’t dumb, he just wouldn’t accept the fact. He thought the Mac was a better machine and perhaps it was. But you know what his arrogance got him.
Apple... Even from the start it was more about style than substance. That alongside enforced obsolescence of the old models.
But you have to give to them for the original Macintosh. I remember seeing that when I was a teenager and LOVED the interface. It was gorgeous!
19:00 Just so you know, Lisa was actually the name of Job's daughter, and the computer was named after her. Even though he didn't accept her as his daughter. He... Was a complicated man to say the least
Losing control over his child (the computer project) was devastating for Jobs, although it was the correct decision by the board at the time.
In completely unrelated news, this bickering with the board will come to a head later and would directly lead to the re-founding of Pixar as we know it. Yes, the one that makes animated movies.
There was a single networked 1st gen LaserWriter in the Mac lab in my art/design college around 1990. It was named "Godot"...as you were constantly waiting for it.
27:05 The ad agency Chiat Day is pronounced "SHY-uh day" (not Chat Day)