I love this era of computing. Everything was happening so fast, and the market was expanding so quickly, that you had to keep innovating just to survive the cutthroat competition.
I remember as a young professional during this period needing to buy a new computer just about every year. Now I have a 13-year-old MacBook. I have replaced the battery. I can't think of one good reason why I should buy a new computer...
That has its upsides and downsides. One of the things powering that dynamic was that early computers were horrifically underpowered. Someone back then allegedly said "Nobody will ever need more than 640k," but this was obviously wrong even at the time. A plain text file of the average book is around 800k to 1MB. Not only does the whole book not fit in memory, it wouldn't even fit on a single floppy disk of the time. In 1981 when the PC first shipped, machines didn't have anywhere near even 640k; most had something like 128-256kb of RAM. So an author writing a book on a PC would have to do it chapter by chapter across multiple disks, and would also need to print the book out as they went to make referencing older parts of the book easy.
@@tarstarkusz That's mostly true, but I would point out that other contemporary 8088 machines, like the Victor 9000/Sirius 1, had GCR-encoded floppies with variable-speed zone recording that could hold 1.2 MB. Also, because those pre-dated the IBM PC's memory-mapped I/O reservation of the addresses from 0xA0000 to 0xFFFFF, you could run MS-DOS with 896K of RAM. (But of course, no one could afford the chips!)
@@WardCo I think Commodore used GCR encoding with the 1541 too, and it was also variable speed. IIRC, it was able to put more sectors on the outer tracks.

While it's true the 8088 has a 20-bit address bus to reach 1 MB, IBM reserved the upper segments for things like video RAM (the A and B segments), the video BIOS and the hard disk controller ROM, which effectively limited you to 640k of contiguous conventional memory. And, due to how segmented addressing works, a segment register of FFFF lets offsets from 0x10 up reach above 1 MB, giving you 64k (minus 16 bytes) of addressable memory past the 1 MB mark. That's what himem.sys does: it loads a portion of DOS into that 64k above the 1 MB mark, and you can access it without having to switch into protected mode. Note, it only works on a 286 or higher.

QEMM (an aftermarket memory manager) did something to give much larger conventional memory. It did some kind of shadowing and freed up basically all of the upper segments. (Though it generally doesn't go that far; IIRC it will only add the A segment, with the rest reserved for TSRs, device drivers and the like.)

EMS allowed basically unlimited RAM with a bank-switching scheme. I used to have a BUNCH of these cards loaded with chips. I retired XT-class PCs (about 100 of them) with 2-4MB of RAM on these cards. They would be worth a decent amount today on eBay.
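The address arithmetic behind that HMA trick is easy to sketch. This is just an illustrative helper (not anything from himem.sys itself): a real-mode physical address is segment × 16 + offset, so segment FFFF with offsets 0x0010 through 0xFFFF lands above the 1 MB line:

```python
def phys(segment: int, offset: int) -> int:
    """Real-mode 8086 address calculation: segment * 16 + offset.
    On an 8086 the 21st bit wraps around; on a 286+ with the A20 line
    enabled it doesn't, which is what makes the HMA reachable."""
    return (segment << 4) + offset

# Last byte below the 1 MB line:
print(hex(phys(0xFFFF, 0x000F)))   # 0xfffff
# First and last bytes of the High Memory Area:
print(hex(phys(0xFFFF, 0x0010)))   # 0x100000
print(hex(phys(0xFFFF, 0xFFFF)))   # 0x10ffef
# HMA size: 64 KiB minus 16 bytes
print(phys(0xFFFF, 0xFFFF) - phys(0xFFFF, 0x0010) + 1)   # 65520
```

The "minus 16 bytes" falls straight out of the math: segment FFFF starts 16 bytes below the 1 MB mark, so only offsets 0x10 and above spill past it.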
It was exciting to follow it. I bought a lot of computer magazines and went to several computer shows back then. I miss those days. Now everything is just an improvement on things that existed before.
Yes, it's a mistake. The Apple II was introduced in 1977, and that's what stormed the market: the VisiCalc machine. In fact, Apple Is are rare. Only two years later the Apple II+ was on the market, and that was the king by 1981, absolutely not the I. There was the III in that year, but it was a flop; in 1983 came the IIe, which was the biggest seller of this era for Apple.
Both the Alto computer and Aldus PageMaker are shoutouts to Aldus Manutius, a Renaissance printer who designed "small" books that could fit into saddlebags and worked with the legendary Humanist Erasmus (who is the guy on the PageMaker logo). Among other things, Aldus developed the first fonts used to print copies of classic Greek literature, as well as the original texts of the New Testament.
The Xerox Alto computer got its name from the Xerox Palo Alto Research Center (PARC), alto being the Spanish word for tall/high/upper. Some of the underlying GUI technology was first developed at the Stanford Research Institute (SRI); Xerox licensed the tech and created the Alto computer at PARC.
This isn’t *that* surprising: Lotus Notes may be one of the stickiest bits of software ever created. In addition to email, it also had what we’d today call a low-code toolset and enabled relatively simple line-of-business applications. Those apps ended up becoming core to a lot of large businesses' operations, usually with a bunch of poorly documented business logic built in, in somewhat hard-to-decipher ways, since they were rarely built by dedicated programming teams. Unwinding all of that without bringing the business to a halt was a hard problem.
@@ajax700 Lotus was rebranded to HCL Verse but was effectively still Lotus. They dropped it last year, though surprisingly there was quite a lot of resistance to dropping Verse. They now run Outlook (in line with all their clients). I wouldn’t say they’re irrelevant; they’re still a massive player in enterprise. As for personal computers, they exited that business with the sale of the PC division to Lenovo. (And in hindsight that was probably a good move.)
It's also not surprising because changing something as fundamental as the core email platform can have bad consequences for archival requirements, will result in a lot of expensive training/retraining of the staff, and could break lots of automation.
May I submit a correction? The original disk drive controller for the Apple II was called the "Disk II controller". The "Integrated Woz Machine" was the single-chip version used in later models like the IIGS and IIc.
There was a single networked 1st gen LaserWriter in the Mac lab in my art/design college around 1990. It was named "Godot"...as you were constantly waiting for it.
Have you ever thought of doing a video on how qwerty and other key layout setups came about? Apologies if you already have, it's just something that I keep thinking about when I look at all those weird attempts at keyboards in the early eighties.
The education market was basically what kept Apple alive during those early years after the IBM PC came out. IBM took away the high-margin business market (because IBM) and Commodore was eating their lunch on the high-volume home market (because Apple stuff was stupidly expensive and C64s were cheap). Apple really had few customers other than schools.
Our school computer room had one lonely Mac and five C-64s each with 1541 drive and monitor. Everyone played Summer Games by Epyx or Spy vs Spy on them. No one touched the Mac, I am not sure anyone knew how to use it.
Apple has always sold style, not science. That's a good way of making money: the richest men in the world are just stylists, in cars and handbags. It doesn't add a lot to our general well-being over the long haul. Technology? Jobs just went out and copied. The original Apple PC is simply a cheap knock-off of the $9,000 Xerox machine of the time.
@@TheDavidlloydjones The original Apple had nothing to do with Xerox PARC stuff. It was a heavily cost-reduced machine built around the 6502, using off-the-shelf parts and some very clever custom circuitry from Woz to tie it together as best they could without access to their own fab (which is something Commodore leveraged to make their stuff cheap). The later Lisa and Macintosh were the ones that borrowed heavily from PARC's Alto, but that became a theme later on. Again, Woz had good input on the Macintosh. Jobs indeed was responsible for tons of styling and marketing decisions (some of them smart, some of them very stupid -- for instance his aversion to fans and expandability), but Woz made a lot of genius designs, turning out a far better machine than such a small company had any right to be making. It was Jobs's decision to pursue high profit margins that essentially killed Apple in the home market. Computing devices wouldn't become a fashion statement until decades later, so their price competitors were in institutional use (business and education).
@@TheDavidlloydjones Yup, it was always about style then and now. And with their closed-source mentality, the majority (entire?) of the industrial, engineering, scientific, and gaming community uses Windows, not Apple OS.
I worked for a publishing company in the 80s; We published a group of magazines targeted at computer manufacturers (DEC, IBM, HP). We had an 'art department', as you describe it. People cutting and pasting strips of text into the shape of a printed page. And then along came Adobe with Aldus Page Maker. We bought a Mac and an IBM PC-AT. Installed Pagemaker on both. And sat them down side by side in the Art Department. It took a day to decide. I ended up buying every new Mac that came out. And the SAVINGS were insane. Before you print 250000 copies of the magazine, you have to print ONE copy and have the advertisers approve it. That costs about $6000 per page. Until you can do it yourself. And the big machine that spewed out columns of printed text to be cut/pasted....gets unplugged. And I got to avoid having to support the IBM PC and Windows. Thank you Adobe. I named my son Aldus.
But it limited you to black and white. Don't most magazines get printed in 4-color? How did you get around the color limitation? It must have been slow and time-consuming. On an early Mac, I would think everything was 1 page at a time; you simply didn't have the RAM to load even a black-and-white magazine-length file, or even just more than 1 page.
@@tarstarkusz Early on, Aldus PageMaker was used mostly for mock-ups, with the main work still being cut and paste. I think OP is describing the mock-up use case and the move to replace paste-ups later. Though, OP, Aldus was a separate company. Adobe and Aldus were rivals; they weren't together. And Aldus sued Adobe multiple times for copying their software (and won more often than they lost).
@@medes5597 Plus, PageMaker was an absolute joke for most publishing houses when DTP *really* took off. If you were serious, you used QuarkXPress. Aldus PageMaker was common for form houses and in-house SMB work. Adobe didn't have anything to compete until they bought Aldus and got PageMaker. But still, they were a non-issue until InDesign was released, and even then Quark had to drop the ball on the OS X release of QuarkXPress 4 for InDesign to take over.
Ah, an easy way to ensure that the son will be bullied in school. I helped out at times in an internal publishing division of a scientific research institution. I remember Xerox Ventura, and then QuarkXPress.
One of the things that made the IBM PC really take off was the 'clones'. The BIOS was reverse engineered along with the entire open architecture, prices quickly dropped, new generations of Intel CPUs supplied more and more compute power (286/386/486/Pentium), and in a relative few years PC clones were EVERYWHERE. I loved that era, got to work on a lot of cool projects. :)
Great video. One quibble. Atari did not "collapse." It contracted but immediately began development on the Atari ST computer which was released in 1985.
Well, Atari broke up, which is perhaps not the same as collapse, but it certainly isn't the same as just contracting, either. Tramiel bought one of the divisions, which led to the ST, but the arcade business went to Namco. And Tramiel had already started the groundwork for the ST before buying Atari, so there is an argument that such work would have continued without any Atari purchase.
S-100 computers were king when the Apple II was brought to market. At first it appeared that the Apple II would be an open design, so small companies started making add-on boards for various tasks. Then Franklin and a couple of other firms decided to start making Apple II clones; Apple went after them with a vengeance, and soon the clones were history. When IBM came out with the PC, clones immediately started appearing. The only thing holding them back was the firmware BIOS that was owned by IBM. Companies like Compaq developed clean-room versions of the BIOS using software engineers who wrote it from scratch using only the specifications of the BIOS, never looking at any of the source. So before long there were tens of companies building PC compatibles at low prices. I don't think it affected IBM very much, because most IBM customers were not going to take their chances on a clone. Apple locked down their design, thus limiting the installed base of the Apple II and Mac architectures.
And it was this openness and competition that led to so much success and innovation in the PC space. Today Intel cuts its own throat. E.g., think about the Atom line: generally the bootloader ('BIOS') was locked. How cool would it be to be able to boot full x86/64 Windows on your phone (e.g. the ZenFone 2) or on $50-100 tablets? Eventually it was possible on the ZenFone 2, but... this should've been a feature, not a chore for the average consumer!
@@PRH123 IBM selling off their personal computer business was a dumb move. The name-recognition factor alone would make them some serious money. The ThinkPad was so good that it could have been milked for many years to come with only minimal investment. Yet they sold it for quick cash. Now the money is gone. All they have now is services and an R&D department hoping for a jackpot, but they don't have enough money and resources to make it happen.
False. Not many people cared about the 'openness' of architectures in the 80s. A computer back then was considered open if it had expansion slots, and people didn't expect computers to be backward compatible. The IBM PC didn't succeed because it was more 'open'. People largely expected that the IBM PC would win the PC market and drive out Apple and Commodore and the others, like IBM had done to Honeywell and Burroughs in the 60s and 70s. The truth is that for its first 3 years the IBM PC did not have any competitive clones.
Not going to lie, I want to live in the alternate universe where Commodore came out on top. The Amiga was so far ahead of the Mac and PC in the mid-to-late 80s...
In the end nothing could compete with the clones. Apple, Commodore, Atari, etc. were all custom machines trying to carve out their own incompatible markets. The flexibility, expandability, competition and innovation of the PC once the cloners entered the market made competing with it pretty much impossible. It would have been nice if they were able to keep going for longer with new products.
Some of my friends bought one. It had 128K of RAM, which we could not imagine what to do with. It is famous in my circle of friends because my buddy Dan beat the chess program SARGON on it while flying on acid one time -- something he was never able to do sober. Good times.
I bought my first computer in 1991. The choice was between a used 286 with a 14" VGA monitor, MS-DOS 5 (maybe 4.x) and Win 3, or an Apple Macintosh with a 10" B/W display. The price difference was that enormous. What made me choose the 286 was the fact that, unlike the Mac, there was an almost unlimited amount of games and software to choose from, while the Mac only had stuff from Apple. Also, the UI on the Apple was only available in Icelandic, and even though that's the country I live in, I hate using electronic products that are not running in English.
The early Mac definitely didn't just rely on software from Apple - certainly not in 1991. Back then, if you were writing a report on a Mac it was probably done on Microsoft Word. If you were crunching numbers, it was probably done on Excel. If you were connecting to a BBS, you would have used Telefinder by Silicon Beach Software or one of the many terminal emulators. Yes you might have used some Apple software too, such as HyperCard. But the average user could well have used a system with no Apple software at all.
@@OldAussieAds Not where I was living. Getting Apple software in shops that sold PCs was hardly an option. Then there was the issue with the monitor, the only one available for the Apple I could afford in the early 90's was 20" while the one with the 286 was 14" and the whole package was still cheaper than the Apple.
@@bjarkih1977 Yeah, I agree there would have been differences in some regions. That said, I'm in Australia and we certainly weren't first-class citizens to US companies either. I'm not sure I understand your monitor comparison. The Mac monitor was 20" while the PC monitor was 14"? Did you get those measurements around the wrong way? 12"-14" was standard for a colour Mac in 1991. 20" would have been absolutely huge, even for a PC.
@@peterfireflylund Agreed. Jobs was a visionary, but his ideas exceeded the hardware/software capabilities at the time. He wanted it so bad, he was obsessed. You couldn’t spend money on development like he was doing and not have investors get upset.
@@deepsleep7822 Jobs' one talent was marketing himself and convincing people that he was a 'visionary'. What he was was a gold-plated arsehole. Wozniak was the reason Apple existed; Jobs was nearly the reason it didn't.
16:35 "Inspired". Xerox had a great team building a new object-oriented language called Smalltalk, with Alan Kay, Dan Ingalls, Adele Goldberg and others (based on Simula 67, the very first object-oriented programming language), and Steve Jobs happened to visit them one day and requested (or mandated) to be shown it. Adele refused, but management forced her to give him a tour, including a demo. Jobs mentions in a later interview that he was shown three things, but one of them hit him hard: there were square things on the screen, the so-called "windows", and Ingalls was manipulating them with a mouse. He knew he had to "be inspired" by it. The anecdote says that he complained that a window was redrawing too slowly when dragged, and Ingalls fixed it on the spot (as only Smalltalk, with its own integrated debugger, would let you do), so that the window then moved around fluently. Check the video with id J33pVRdxWbw, "How Steve Jobs got the ideas of GUI from XEROX", at 6:26 for that. All the features of the "unique interface" mentioned at 16:55 already existed at Palo Alto in their version of Smalltalk.
If you are curious about the tie-in between Charlie Chaplin's "Little Tramp" character (seen at 10:40) and the IBM PC: possibly the advertising agency that came up with the idea was drawing on the movie "Modern Times", which Chaplin produced and starred in back in 1936. After all, the tag-line for that IBM PC ad campaign was: "A tool for _modern times."_
Jobs never understood engineers or computer users in general. Worst purchase I ever made was an original Macintosh. I regret every penny I spent on that piece of 'stuff'. We were completely abandoned, with no capabilities and no upgrade path. To this day I've never spent another dime on anything Apple.
I too bought one of the early Macs. It was cute, and the pulldown menus did help in understanding it, but it had just those three programs (word processor, spreadsheet, something) which I had no use for, it had no storage to speak of, and it had that dinky B/W screen. After bringing it to work (I was a Unix programmer) and everyone getting a chance to play with it, I sent it to my sister, who used it for years with her church. The only reason it was not my worst purchase is that I had no great expectations and it was intended for my sister right from the start. It did also teach me about Steve Jobs' dictatorial attitude. I have had jobs using Macs, and it drove me crazy how hard they were to customize. Jobs' attitude was "my way or the highway", Bill Gates' was "who cares, and it's crap", and Unix/Linux was a zillion choices and all the customization you could want, but no one made it easy. I use System76 laptops now; I wish they had a Gentoo option, but they are fine hardware.
This channel is constantly blowing me away with the topics and depth.... i always learn new things about topics i think i already know about! Kudos, glad to see this channel is growing!
The Integrated Woz Machine came later. It was a single-chip (= cheaper) version of his original disk controller which mainly consisted of an EPROM + some off-the-shelf TTL chips (+ some help from the main CPU). The IWM was key to making later Apple II and III machines cheaper to produce.
@@shorttimer874 I agree. The IWM was a single-chip version of the Disk ][ controller card. The original card used discrete chips, and really that was the innovation: Woz took a lot of the operation out of hardware and into software (255 bytes of software!) and used commodity 74-series chips for the Apple ][ disk drive. He took the drive controller chips off the card and removed the data separator from the disk drive itself. It was an amazing design.

Later he worked on the IWM, around the //e or //c timeframe. That was a single chip. Company production volumes had risen and the cost of custom chips was down, so making their own made sense. The IIgs added the Mega II, which put basically a //c in a chip; I think the //c+ used a variant of that. Also, Wendell Sander revised the IWM into the SWIM (Sander-Woz Integrated Machine), which added support for 1.44M floppies (SuperDrive). If nothing else, that finally gave interoperability between Apple and IBM floppies. This was done partially by putting back the old MFM data separator that Woz had removed in the Disk ][. It was the right move so floppies could be standardized between systems.

But the funny thing was that the age of GCR (group code recording) was finally arriving, just as Woz had used it in the Disk ][. Hard drives left MFM for RLL and never looked back. Now on your computer, your display (HDMI or DisplayPort), Ethernet, slots (PCIe) and USB all use similar codes. They go by various names, 8b/10b or 128b/130b and such, but they're all similar in design to the Disk ][ scheme, which in modern parlance could be called 5b/8b (up to DOS 3.2, 13 sectors) or 6b/8b (DOS 3.3, 16 sectors).

Anyway, I think the video gets this wrong. There are other small errors too, like showing a genuine Apple Mac 7200 (easily one of the worst Macs ever made) when it speaks of clones. None of this undercuts the story; he gets the tale right despite some small mistakes.
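The core idea of that 6-and-2 scheme is simple to sketch: only bytes satisfying certain run-length constraints are safe to record, so each 6 data bits get mapped to one valid 8-bit "disk byte". Here's a toy illustration; the constraints are roughly the Disk ][ rules (MSB set, no run of three zero bits, at most one pair of adjacent zeros), but the table below is just the first 64 qualifying bytes, NOT Apple's actual hand-picked table, which also reserves values like 0xAA and 0xD5 for address marks:

```python
def valid_disk_bytes():
    """Bytes with the MSB set, no run of 3+ zero bits, and at most one
    pair of adjacent zero bits (roughly the Disk ][ 6-and-2 rules)."""
    good = []
    for b in range(0x80, 0x100):          # MSB must be set
        bits = format(b, "08b")
        if "000" not in bits and bits.count("00") <= 1:
            good.append(b)
    return good

# Toy translate table: the first 64 qualifying bytes (the real table
# is hand-picked and reserves special values for address/data marks).
TABLE = valid_disk_bytes()[:64]
DECODE = {db: i for i, db in enumerate(TABLE)}

def encode(six_bits: int) -> int:
    """Map a 6-bit value to a safe-to-record 8-bit disk byte."""
    return TABLE[six_bits & 0x3F]

def decode(disk_byte: int) -> int:
    """Recover the original 6-bit value from a disk byte."""
    return DECODE[disk_byte]

# Round-trip check over all 64 possible 6-bit values:
assert all(decode(encode(v)) == v for v in range(64))
```

Modern line codes like 8b/10b enforce different constraints (DC balance, clock recovery) but follow the same pattern: a lookup from raw data groups to a restricted set of channel symbols.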
Give me an Apple //e or GS over anything else in that era. The Woz machines were so much better than anything else Apple produced, and it's inconceivable why they deliberately crippled the power of the GS, as it was a far better machine than the Mac of the same era. I believe that the GS, especially with an accelerator card that allowed GS/OS to run properly, outperformed the MS-DOS-based PCs of the era. Jobs never liked the open design philosophy of the Apple II line; time has definitely shown that Woz's vision was the future of personal computing.
We had some Apple IIe computers with dual disk drives when I was in Matric (years 11 & 12). The guys that used them had to have two sets of floppies: one set for hot weather and one set for cold. Neither set could be read in the other season. (I stuck to the PDP 11/70 via terminals; far more reliable!)
Question about the ending... What was Sculley's new "game changing" project for Apple that became a disaster? Is there another video that explores this?
This review of the early history of the PC is extremely interesting to me. I was watching this field intently from the mid-70's to mid-80's. In the late '70's the Z80 was the best 8-bit micro, but the 8080 had a head start and good marketing. The Motorola 6800 was very good too, but it didn't have the marketing clout behind it. The Texas Instruments TI-99/4A was more of a toy computer, OK for games but not for serious work applications. When the IBM PC came out, its hardware wasn't very good, but the IBM name was magical and everyone wanted something that was "IBM Compatible". The Texas Instruments PC ("Professional Computer") in the early '80's had much better hardware, but without the feature of IBM compatibility it wasn't a marketing success.
Yeah. I'm of the age group that grew with Apple and Microsoft, and it's bewildering they're basically retreading the mistakes of the companies they disrupted.
You seem to be mixing up CPUs (Z80 and 6800) and computers (TI-99/4A). The 6809 derivative of the 6800 was well loved by its programmers but apparently too expensive compared to the competition. The TMS9900, as the CPU in the TI-99/4A, was a true 16-bit CPU when all the competition was 8-bit, but other parts of the computer's design were so compromised that the result ended up slower than the 8-bit competitors. (I'm an old Z80 coder and never coded for the 6800, 6809, or TMS9900, but I read a lot of what other people thought about them.)
@@andrewdunbar828 In the late '70's I thought the Z80 was the best of the 8-bit micros. I compared it to the TMS9900 and finally decided on the TMS9900 since I was working at TI. I built my own homebrew computer and wrote a lot of software at home for the TMS9900 that I later used on a real project at TI. Unfortunately TI didn't continue to develop their micros. In 1982 they gave me a TI994/A with all the accessories and software so I could evaluate it. This was fair for games but not too good for general programming. The IBM PC stole the show and set the standard, (except for Apple). The Z80 and 6800 and 68000 were much better hardware-wise than the 8080 family but Intel's marketing prevailed.
@@bob456fk6 Intel's marketing didn't prevail against the Z80: a _lot_ more designs used the Z80 than the 8080 or 8085. In part that's because the Z80 was an 8080 clone with some extra features. Intel won with the 8086 because IBM used it and the IBM PC took off. It's kind of a crap design compared to a 68000, but Intel got working versions into production first; the 68000 apparently wasn't really yet ready when IBM was evaluating it for the upcoming PC. I wouldn't say the 6800 was better than the 8080, no. Not even as good, really, except that it needed only +5V rather than +5/-5/+12. (I've done a fair amount of programming on the 6800.) The 6502 was probably about as good as the 8080: it didn't look as good on specs, but it ran pretty darn fast if you knew what you were doing. (A 1 MHz 6502 easily is faster than a 2 MHz Z80.)
My Mac IIe was so s-l-o-w I had CompUSA bust the case and double its memory for "only" $150. Wish I had instead put that money into Apple common stock! ¯\_(ツ)_/¯
I used the Apple Lisa computer and LaserWriter back in the late 1980s, maybe the first in South Korea. These machines were a revolutionary piece of art, science and engineering. Graphic screen, UI and mouse, and beautiful printed output rivaling phototypeset books. A windowed desktop and object-oriented drawing software. The software alone was worth the money. It was easy to use and understand. It was a culture shock. I wrote my CS master's degree thesis about an object-oriented language with the Lisa (by then the Mac XL) & LaserWriter. So I couldn't understand why this revolutionary machine didn't sell well enough in the market.
According to Bill Atkinson, it took the 8 MHz Macintosh at least half a second to draw a screenful of characters, and that was 1.75 times faster than the Lisa, before even accounting for the Lisa's larger screen. The Lisa didn't sell because it was unusably slow, and also because it cost $10,000 (about $31,000 today).
Note that the 6502 is not based on the Motorola 6800 design. MOS Technology made a version, the 6501, that was pin-compatible with the 6800, but pin compatibility was the only compatibility. The 6501 was dropped after a lawsuit from Motorola. The 6502 was constructed in a different way than the 6800, and this made it possible to make it for cents and sell it for a few dollars. So the case of 6800 vs 6502 is very different from 8080 vs Z80, which were machine-code compatible, even though the Z80 was constructed very differently from the 8080 and had its own assembly language. And, btw, thanks for your high-quality and very interesting content! I love your channel.
So Steve Jobs wasn't as great as many make him out to be. I could already guess that from the words of Jim Keller, who told how they had to hide products from Jobs in their early stages, because he didn't understand the product development cycle, and that a prototype is not the final product.
That "other" on the first pie chart was the collective CP/M (S-100) machines from many small manufacturers. The MITS Altair, and others built similarly, could run Digital Research's CP/M operating system (OS). Yes, Apple had its own proprietary OS, though Microsoft provided a plug-in card that could run CP/M. The bulk of the CP/M machines exited the market by 1985. As for floppy drives, the first ones were 8" drives. You don't know "floppy" unless you've really handled an 8" floppy! At the time of the IBM PC, getting patents on software was difficult; most software was only protected by copyright. I had access to an IBM Technical Reference Manual. It had all the schematics and a full printout of the BIOS source code. I used that manual a lot.
Your history of the Macintosh, Apple, and the IBM microcomputers is just what the psychiatrist ordered. I would have relied on the Apple computer if only it had included all the business programs, including a relational database, found on the Windows computer. Your story makes me appreciate using the microcomputer and the color laser printer more, and I do not want to go back to the old-fashioned methods of doing things, regardless of how available they seem to be.💙
@jr: agreed. And you couldn’t easily exchange data with a PC; the disk formats were different. Jobs wasn’t dumb, he just wouldn’t accept the fact. He thought the Mac was a better machine, and perhaps it was. But you know what his arrogance got him.
It's a pretty strange era. There were a lot of good computer companies in the early 80s that were way better than Apple and IBM, like Commodore, Atari, Sinclair and TI, but they all got crushed and forgotten after the 8-bit era. My guess is that they overestimated the home user market instead of focusing on the office market. In the 80s the average person didn't need a computer; it could have been handy, and fun for gaming, but it wasn't until the 90s and the rise of the internet that the average person wanted a computer for more than just gaming. I think another advantage IBM had, and Apple to a lesser extent, was that their computers were more expandable and open, so other companies could improve the PC. Even the old Apple computers had more expansion options than the C64, and when someone did make a hardware upgrade for the C64, nobody supported it.
@belstar1128 At least Commodore and Atari did survive into the 16-bit era, and I agree that in the 80s a lot of people had no reason to own a computer beyond games. But they did try to find success in the business market, with the Commodore Plus/4 and the Atari Portfolio and Stacy. The fact that, as you say, expansions sometimes don't get supported, alongside fragmenting the userbase, is maybe why some computer companies didn't support the idea of more expandable systems.
The Altair 8800 was hardly a personal computer. It, like the KIM-1, was really just a hobbyist kit. Basically, it was a box with switches and some LEDs on the front of it. While it was affordable, there was no keyboard for input and no display for output. In my opinion, a more realistic view of what makes a personal computer is that at the least it REQUIRES a keyboard (or something more advanced) and a display of some sort (teletype and LEDs notwithstanding). In my view, it would have to be something akin to an IBM 5150, an Apple II, or a VIC-20. I wouldn't even count the Apple I, since it was a kit, which keeps it in "kit" territory. While much of the hardware has changed from the IBM 5150 or Apple II to now, the basic paradigm for using a computer hasn't drastically changed. I'm typing on a laptop that has a keyboard for input and a video display showing letters (or characters) as they're typed. Modern computers do generally have GUIs and pointing devices, but mice are not new, as anyone who has ever seen "The Mother of All Demos" can attest. The only thing really keeping Douglas Engelbart's system from being a PC was the fact that it was so horrifically expensive. The truth is that before there were affordable systems with keyboards and displays, the systems that preceded them were too difficult to be used by anyone but a hobbyist or someone in the industry. The advent of the actual PC changed that.
The Kim 1, Altair, Imsai, etc. were all very capable and expandable computers. But you needed a separate terminal to interact with them - which was a significant added cost.
@@markmuir7338 From what I've seen of those machines, getting them working with a keyboard and display is not a trivial affair; it requires toggling memory locations using switches on the front of the case, plus extra, expensive cards and such. If my understanding is correct, this disqualifies them, at least in my mind. The point of the "P" in "PC" was that you could: A) afford the thing, and B) take it out of the box and do something useful with it as a normal person. At 8, I was able to teach myself Apple II BASIC, TRS-80 BASIC, and IBM PC BASIC. With one of those other computers, I sincerely doubt I would have had a clue what to do with them.
@@kevintrumbull5620 These hobbyist micro computers worked the same way as mini computers and mainframes of the era. They were intended for people who already used computers, but who wanted to have one to themselves. That's why they relied on serial terminals - that's what their users were used to. Once you had all the necessary equipment (terminal, disk drives, disk controller card, memory expansion card) you could run 'proper' operating systems on them (CP/M) along with plenty of commercial software. As you said, adding a built-in terminal which worked out of the box would expand the potential market substantially - by including people who had never used a computer before. I also got into computing that way via a Commodore VIC-20 when I was 6, then an Apple II when I was 8. It's important to recognize there were a few of these integrated systems before Apple (eg MCM/70) which didn't have the marketing genius of Steve Jobs, hence are lost to time. Also, several hobbyist computing companies (eg SWTPC) sold their own terminals alongside their micro computers, which were pre-configured. These companies and their products continued along until the IBM PC stole away most of their customers. The YouTube channels 'Adrian's Digital Basement' and 'Usagi Electric' have vastly expanded my knowledge of these early systems, and made me less of an Apple fanboy along the way. There was plenty of innovation all around.
If you were willing to add a terminal, serial port and ROM board to an Altair you could have a system that booted straight up into whatever the ROM does, with no toggling. The Apple II was sold built, as well as in kit form. You could just go down to The Byte Shop, buy one that included a case and PSU, and plug it into your TV at home. Though of course it booted up into WozMon, rather than BASIC or anything like that. But you're right that for consumer adoption, something "akin to an IBM 5150, an Apple II or a VIC 20" was required. But that didn't happen in 1981 with the first IBM PC: that happened in 1977 with the Commodore PET, Apple II and TRS-80 (later called the TRS-80 Model I).
Really difficult to take this video seriously when it starts off with a false pie chart in the first 15 seconds. Atari 800/400 computers were outselling the Apple II by a large margin in 1981 and still outselling Apple in 1984. And Atari is just magically not mentioned. Facepalm.
This is when I grew up using computers: Aldus PageMaker, Photoshop (just Photoshop, no number or date), Illustrator, and Premiere. Did my senior project (wrote and produced a pilot, but only had time to shoot the commercial during the school year) and incorporated 3D graphics and overlays using a 660AV and a VHS VCR. Had that tape for 20 years till I lost it in a move. It was a spectacular disaster, but it was my disaster. If I still had it, it would be on YouTube for everyone to gawk at in horror. 😮😅 Considering I'm nowhere near Hollywood, there was never a risk of my becoming an actual producer, but it was still fun learning the process.
Seeing those rooms with the old computers reminds me of an urbex video I saw the other year where an abandoned textiles factory in Italy was explored and there remained so many PCs that part of me wanted to dash over there to find one a new home. Sure, they would require repairs but it's fun seeing relics of the past still exist, even if they're in bad shape.
Next you should do a video on Commodore and the Amiga. I grew up on the Timex Sinclair and Commodore 64 as a kid. The Demo scene on the Amiga with 4096 color palette far exceeded what the Mac could do. But the best 3rd party add in card was Newtek's Video Toaster. Video switching, compositing, and 3D Computer graphics creation. You could do a series about the Computer Giants of yesteryear. The CPU wars never get old.
The Amiga topic is interesting, but I feel like it would end up being just a footnote in the history of microcomputers. The best would be a story about Commodore, a company that made millions but of which, in the end, only the memory remains.
Very nice. The Fall 1985 marketing push you cite resulted in my company becoming a Mac / LaserWriter / PageMaker customer at the time. I was the one sent out to "the seminar." I reported back that it was great, and we should buy it.
I feel like Steve's hiatus from Apple (@24:30 onward) was what made him Steve Jobs. He finally went out into the world and watched people use computers. It's after this that he started saying those iconic things about "starting with the customer, and working your way back to the technology", and so on. Before this he was a great manager and idea guy, but he didn't have a concept of the mission. That's not to say he was "wrong" in redesigning the first Macintosh and quintupling the price from $500 to $2495; the price was not the issue but the technologies and software it shipped with.
The Lisa embodied much the same characteristics, but clearly showed that price was very much an issue. What would have changed the dynamics of the Macintosh intro would have been shipping with 512K base memory instead of 128K, expandable up to 2MB. All those 128K Macs were useless MacPaint doodlers, and over $3000 was a lot to pay for a doodling computer. And it needed that SCSI port from day one so that serious (business) users could attach large-capacity storage. IOW, they needed to have introduced something much more like the Mac Plus from day one, and then they would have had a very serious contender against the business-serious IBM PC.
@@acmefixer1 Basically they made a very expensive device with not a very great idea of who the intended customer was supposed to be. Computers aren't just specs, they are the software they ship with, the software they can run, and the things you use them for.
@@TheSulrossIIRC the SCSI standard wasn't finalised until 1986 - even the Macintosh Plus, released in January of that year, isn't fully compliant with SCSI-1 specifications.
The price of the first Mac absolutely was a major problem. That, and the gross mismanagement of the IIGS, cost Apple what should have been a position of permanent dominance in the computer market.
The Apple II E was our family's first PC and I remember it very fondly. My university had a UNIX network on campus, and every student got an account. My first very own computer was a used 386 that friends who worked at Siemens gifted me. Those last two weren't Apple of course, but I always think of all those systems together, with tons of nostalgia! Thanks for the memories! 😊
Or maybe the history of Atari. Despite the claim that they went out of business in 1983, Atari came out with the Atari ST in 1985. One of the fun things about it was that you could take the ROMs out of a Mac, put them in a cartridge, insert it into the Atari ST, and with a small program turn your ST into a Mac that also ran FASTER than the Mac.
The movies and all the narratives I've seen about Jobs' ousting paint a picture of personal strife and whatever, but it seems like he just couldn't handle the business well; as he himself seems to have said in the quote shown in the video, he didn't know how to fix the Macintosh. It sure seems to me that all that other personal stuff didn't matter; in the end, the numbers are what got him kicked out. Sure, the Steve that came back probably could have fixed what younger Steve couldn't, and indeed fixed Apple, but I've come to see the whole ordeal in a different light. Thanks for the video!
A couple of mistakes in your video: 1) Woz did not leave Apple; he was pushed out by Jobs because he wanted to continue the Apple II line, which at the time was supporting Apple while the Macintosh was losing money. Jobs saw this as a threat, eliminated Woz, and soon after tried to kill off the Apple II line. Seeing this, Sculley had the Board of Directors remove Jobs, and kept the Apple II line going until the early 90s. 2) You have it that Atari lowered the cost of the Atari 400 to around $299. The mistake here is that Atari lowered the cost of the Atari 800 to $299, with a $100 rebate ticket printed in some magazines and store outlets; the Atari 400 at the time went for $99. Consider that in 1979, when they both came out, the Atari 800 went for $799 and the 400 for $599. An interesting video, but you left out a lot, and beyond these two issues there were a lot of other minor mistakes of information. Oddly, you barely mention the Radio Shack TRS-80 systems, as they were a good part of the market, and CP/M; in those graphs where you mention "Others", that is where the TRS-80 and CP/M belong. On top of that, in the early 80s Radio Shack released the CoCo (Color Computer), which opened the market for many who did not have a Sears or Computer Factory store near them but did have a Radio Shack. Poor marketing on their end, as the CoCo could have been a major contender at the time. Hindsight is often 20/20, but some have it as 250/250.
Woz sort-of-kind-of left after the January 1985 stockholders' meeting but made sure he remained on a token salary so he'd get his ten-year pin in 1987. Jobs, still Chairman of the Board but put in charge of nothing as "Corporate Visionary" in May, quit in September.
Found 'Nokia: The Inside Story' book at a thrift store and was an amazing read. Maybe want to take a look at it for a future video in case you were not already working on it. Thanks for the great content.
My first computer, c. 1981, was a Eurapple: an Apple with a non-NTSC video output. I clearly remember I got a big PCB which needed all the parts and a bunch of cuts and straps to get NTSC; I made one mistake. I built a linear power supply and used a nice wooden Hammond case and a surplus keyboard. At first I used a TV set and tapped into the circuit with the NTSC signal. Later I got a green-screen monitor using a phono jack for the video. Next came the floppy drives, two of them, Asuka drives, and much later a HDD. A big thing back then was to add an 80-column card. I also remember getting a Z80 card to run a spreadsheet. I designed and built several cards that plugged into the slots.
Yeah! The $3K [US] amount was pricey for its time and not affordable for the average household; adjusted for inflation, that 1983 price would equal nearly $9K in 2023 [US] dollars.
Ed Roberts of MITS bought cosmetically defective but functional 8080 CPUs and used those in the Altair. The chips had package or finish defects but passed functional checks. Also, that photo on the cover of Popular Electronics is a lie; the box doesn't work. Roberts shipped one to the magazine for its photographer, but it was lost in transit, so the one pictured is empty. He had to scramble to get them a new one.
I didn't see any mention of Microsoft. I understand that IBM did write an operating system for its PC, but MS-DOS was far superior and deserves a lot of the credit for making the IBM PC a success. Also, I didn't see any mention of the PC Jr. There is mention of Apple creating an early reliable disk drive. The Apple II data read operation was largely implemented in software running on the main processor. Apple created a standard format for storing data on its disks. The Apple II ran most of the time without a resident operating system. To start a new program, the user would place a disk in the drive and reboot. The BIOS would read track 0 sector 0 into memory and turn over execution to it. That sector needed to take over the disk read operation, which typically used disk read code in the BIOS. IBM's disk drives, on the other hand, had hardware to read the sectors. Apple allowed each software vendor to place a copy of Apple's DOS on its disk. IBM's PC booted in DOS and then loaded the software vendor's program. Apple's system was highly resistant to viruses, since it typically got a fresh start from each disk. IBM's PC was vulnerable to one program modifying the resident operating system to modify the next program loaded, passing along a virus from disk to disk. Apple's system architecture allowed vendors to implement disk copy protection schemes based on non-standard data formats. The standard Apple disk copy program would not work. A program named Locksmith attempted to copy data in the non-standard formats, but wasn't always successful. In some cases, the software writers created their own custom format. Other times, the disk was basically in standard format but had some non standard data somewhere. A subroutine would check for the non-standard data. A clever hacker could find the subroutine and patch it out. 
If the entire disk was in a non-standard format, it was possible to let it load to memory and then write the image in standard format to a blank disk which would load and execute as a standard disk. Apple's PRODOS came later and was resident, allowing programs to run without reloading DOS each time. Appleworks integrated word processing, database and a spreadsheet software. Copy protection for the Apple disks was a headache for the schools. Without it, the schools could make backup copies, permitted under the copyright laws, and let the students run the backups, keeping the originals locked away. If the backup was damaged, the school could make a fresh copy of the original on a blank disk, without having to buy a new copy. Sometimes, the copy protected disks did not read properly in all disk drives while disks with standard format read well. So, hackers came to the rescue, breaking the copy protection schemes and writing the unprotected copies to blank disks for the schools to use.
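The boot flow described in these comments (ROM reads track 0, sector 0, then jumps to it) can be sketched in a few lines. This is a hedged illustration, not Apple's actual ROM code: it assumes a modern sector-ordered 140 KB .dsk image layout (35 tracks × 16 sectors × 256 bytes), whereas real Disk ][ hardware read GCR-encoded nibbles that software had to decode.

```python
SECTOR_SIZE = 256
SECTORS_PER_TRACK = 16
TRACKS = 35

def read_sector(image, track, sector):
    """Pull one 256-byte sector out of a linear .dsk-style image."""
    offset = (track * SECTORS_PER_TRACK + sector) * SECTOR_SIZE
    return image[offset:offset + SECTOR_SIZE]

# Build a blank 140 KB image and plant a marker where the boot code lives.
image = bytearray(TRACKS * SECTORS_PER_TRACK * SECTOR_SIZE)
image[0] = 0x01  # first byte of track 0, sector 0

boot_sector = read_sector(image, track=0, sector=0)  # what the ROM would execute
assert len(image) == 143360          # the classic 140 KB disk size
assert len(boot_sector) == SECTOR_SIZE
assert boot_sector[0] == 0x01
```

Because that 256-byte boot sector then took over the rest of the load itself, a vendor could make every subsequent track use whatever format it liked, which is exactly what made the copy-protection schemes described above possible.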
Apple's biggest problem, i think, was price. Price is why the first computer my dad bought was a C-64. PC clones were what got him to enter the bigger PC market.
The biggest problem with the Apple III wasn't chips falling out of their sockets (yes, that was a problem to a small extent, but not the big deal-breaker); the BIG problem was that its poor ventilation design caused the components inside to overheat significantly. It had no purpose-built ventilation and no fan, causing it to overheat badly if left on for more than a couple of hours. I remember one Apple III in a computer lab long ago with a note on it that said "turn off after one hour", meaning no one could use it for more than an hour because it was so prone to overheating.
Apple... Even from the start it was more about style than substance, alongside enforced obsolescence of the old models. But you have to give it to them for the original Macintosh. I remember seeing that when I was a teenager and I LOVED the interface. It was gorgeous!
Minor note, I bought the Apple 2 Plus, the second Apple 2 generation, in 1979. The actual Apple "1" was a kit with no case or keyboard, very few of those were sold, so I am guessing in the beginning of this video the count of Apple "1"s actually includes Apple 2s.
HP wasn't in the PC market in the early 80's. Their computers may have looked like PCs, but they were aimed at professionals, and so were the prices; they didn't sell in any meaningful numbers. Not sure where you sourced your data for the pie chart at 11:05. It also conflicts with the market share chart you show later at 22:48. That chart is missing the "other" category, as it doesn't include computers like the VIC-20, which was still selling in good numbers into the early 80's. Even the TI-99 was still on the shelves in those years. One of the reasons Commodore was able to weather the gaming console crash and price wars of the early 80's is that it owned MOS Technology, which made the 6502 CPU. Every Apple II sold was generating revenue for Commodore!
HP had various models that could run CP/M in the early 1980s including the HP-86 and HP-125. I think that pretty much makes them personal computers, albeit not IBM PC-compatible ones. Yes, they were expensive and noted as such at the time, but then there was a perception of enhanced quality over the most comparable competition. It is an odd statement indeed to distinguish between personal computers and computers "aimed at professionals", since a lot of the money made in early microcomputing was made precisely by selling computers to professionals. That was kind of the point of the IBM PC, after all.
Many people forget how groundbreaking the original Apple II was; it had BASIC in ROM, expansion slots, color graphics, and sound in 1977. Its only "competition" were the TRS-80 and Commodore PET, which were text only, with no sound and no graphics. Then the same people brag about the Commodore 64, which did have sound and graphics but was not released until 1982.
The first disk drive for the Apple II did NOT use the Integrated Woz Machine. The Disk ][ board has a PROM and logic ICs only. The IWM was the IC put in the //c, the later IIe controllers, and the IIgs.
Strictly speaking, the "Integrated Woz Machine" was not the first superb Woz disk design, which used around six chips. It was a one-chip version of the Woz machine, used in the Apple IIc and early Macs.
Apple had the Goldilocks problem. Their first try to leap forward was the Apple ///. Not hot enough. Still a 6502 CPU and limited to 64K of memory per bank. Then they tried with the Lisa. Way too hot, cost about $12,000, and too cool, speedwise, it ran really slowly. Then they tried the Macintosh, which was somewhat in-between, about the right amount of everything, a definite step up, but still flawed-- no hard disk, only 128K of RAM, and incompatible with everything else in the world. Real tough times for Apple. It would be years before the Macintosh line was profitable.
The failure of the Apple /// was mainly caused by the poor quality of the initial product. Steve Jobs insisted on not having ventilation slots or a fan. Instead it had an aluminum base that was supposed to act as a heat sink. The Apple /// had more memory than the Apple ][ and a faster processor, which caused the computer to generate more heat. The heat caused many of the socketed chips to become loose and cause hardware errors. Apple tech support suggested lifting the Apple /// several inches and dropping it to reseat the chips. The Apple /// had an Apple ][ compatible mode, but special chips were installed preventing access to any Apple /// features when in Apple ][ mode. This included expanded memory, 80 columns, and lowercase letters. When in Apple ][ mode, the Apple /// could only address 48K of memory.
Uh, the TRS-80 was not trash, and was not nicknamed that "for good reason"; it's just how the abbreviation came out. I considered the Apple II to be a dog compared to the TRS-80 Model III. The PET was the real dog, until a cheaper knockoff of it, the VIC-20, came out, which springboarded the C64. The C64 and 128 were the machines to have, until they got really eclipsed in speed and power by the Mac and IBM AT. Now, the TRS Color Computer was kind of trash, but then so were the Atari 800 and 400. Keyboards were a big deciding factor back then in whether you liked a computer or not. It seems laughable today, but yeah, a good keyboard would make or break a system. I know, I will only use an IBM Model M (1986) keyboard now... because it's a cult classic, and for good reason: it's a dream to type on for the average user.
[10:14] The infamous Wall Street Journal ad: "Over the next decade, the growth of the personal computer will continue in logarithmic leaps." It's odd that they didn't say exponential, and instead said the inverse, logarithmic. Anyway, thank you for your fascinating and well-researched videos, they are always worthwhile.
There is a brilliant story of an Apple investment (way back when Jobs was on the Lisa team, I believe) in a Hungarian startup called Graphisoft, which subsequently went on to invent a new way of designing buildings on a computer. Their flagship software Archicad battles the hegemon of the AEC industry, Autodesk, to this day, in much the same fashion as Apple and IBM once did. I would love to see your take on the story, @Asionometry.
It's worth noting that Wozniak's disk drive lacked the capability to track the position of its drive heads. Consequently, when it needed to reset, the motor would forcefully push the drive heads outward for about a second, hoping to return them to the starting point. This process generated a loud and distinctive noise, making the disk drive significantly noisier than others available at the time. All in the name of using as few components as possible and saving cost.
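The "blind" recalibration described here can be sketched as a loop: with no track-0 sensor, the controller simply steps the head outward more times than there are tracks, so wherever it started, it ends up banging against the mechanical stop at track 0. A minimal model, assuming the Apple II's 35-track geometry (the overshoot count is illustrative):

```python
TRACKS = 35

def recalibrate(current_track):
    """Return the head position after blindly stepping outward.

    The head is stepped out more times than tracks exist; the
    mechanical stop clamps it at track 0, which is what made
    the drive so loud on reset.
    """
    for _ in range(TRACKS + 5):                    # overshoot on purpose
        current_track = max(0, current_track - 1)  # the stop clamps at 0
    return current_track

# No matter where the head was, it ends up at track 0 -- noisily.
assert all(recalibrate(t) == 0 for t in range(TRACKS))
```

The design trade-off is visible even in the model: a position sensor would have cost parts, so the hardware just accepts a second of grinding instead.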
I wrote some successful software in the early '80s that ran on the Apple IIe. I looked at the Mac and, to my complete surprise, found it wouldn't run IIe software. That seemed dumb, so I moved over to the IBM PC until Windows 2.0 came out.
Thank you for this fascinating tour through this history. I lived through this whole period, buying an Apple ][+ in 1980. What I particularly liked about your video was, in your usual fashion, its completeness -- I learned a lot that I did not know before, and I feel like I have a much broader view of what was happening at the time. Being so immersed in it, I didn't see the forest for the trees. As always, I am looking forward to your next video!
I don't remember the Apple II having color. The monitors were either B/W, green, or amber. I think later a color card could be inserted to work with a television. That was in the UK. I loved the Apple II, which I bought for home use.
To say that the 6502 is based on the 6800 design is a bit reaching. The 6501 is pin compatible, as an upgrade path (the 6502 isn't), and they share some similarities in their logical code model, but they're absolutely software incompatible.
The IBM PC was very similar to a CP/M-based computer, but with the advantage of being able to use more memory, and having a more powerful computer - and a single standard disk format. So it started by swallowing up that segment of the market. The Commodore 64 was a direct competitor to the Apple II in the home market; but since the Apple II was a factor in the business market, IBM was also a competitor for it.
The video misses the Apple IIgs, which is where Apple dropped the ball. Steve Jobs realized the GUI was the future and tried to jump straight to it with the Mac, making the two mistakes mentioned in the video. The first was that the hardware wasn't powerful enough to support a GUI. The other was lack of software. Bill Gates also realized the GUI was the future but, given that hardware wasn't strong enough, decided to improve the operating system incrementally until the hardware caught up, which happened around the year 2000 with Windows NT and Windows 2000. As a side comment, Microsoft released Office for Mac before Office for Windows; it got experience developing for a GUI before Windows was available, making it easy to release a Windows version when the time came. Which is where the Apple IIgs comes in. It was as much of an Apple IIe upgrade as hardware at the time could offer: a 16-bit processor, 256KB to 8MB of RAM, better graphics (the Mac's display was literally black and white), better sound, and compatibility with Apple IIe slots and software. It was an easy upgrade path for IIe owners. It also had an OS with a GUI in '88, two years before Windows 3.0 became popular. Which makes the Mac's software availability problem all the clearer: if a customer has to choose between two computers that don't support any of the software he has, he might as well switch brands. Apple realized this a little late and released an "Apple IIe on an expansion card" for the Mac*. The problem was that the Mac was expensive, and adding the card made it even more expensive. [* That wasn't a new idea. The Apple IIe had a "Z80 computer on an expansion card", which allowed it to boot CP/M and run applications like WordStar.] Apple's mistake was trying to run faster than technology allowed. If Steve Jobs had been willing to take a slower route, Apple would have had a chance to compete with 80x86 & Windows.
The problem is that Apple doesn't license its operating system to manufacturers, meaning it doesn't allow PC manufacturers to make computers with MacOS as Microsoft does with Windows. This limits your choices.
Shout to the Performa 5200 series, I loved the industrial design of that machine and it was fairly expandable at least via SCSI, if not obviously slow relative to Windows PCs. Still, it was and still is an extremely fun machine to use during the glory days of CRT. Definitely a nice prelude to the iMac. And way better than all those PowerPC/Performa 6xxx series of which there were way too many configurations (mac ppl know what I’m talking about) System 7.6 - 8.6 looked awesome on that Performa, Ethernet was effortless, 100Mb Zip Drives (mostly) worked for low to mid-size files. I’ve made a point to keep everything compatible with my 5215 still working. SoundEdit 16 v2 was/is also a really awesome piece of software, even if it was slow, the ability to *see* raw sound and music was amazing. Photoshop, Illustrator and QuarkXpress also did ok on this machine. Even when it lagged, it was still an amazing creative tool. It could also handle a certain type of file called the .mp3 ;)
Curious perspectives, especially your use of the term "killer apps". How old are you, and what's your technological education? (It's a sincere question; I only want to know why some people see things in different ways. To be honest, a lot of this sounds like it came from wikis and/or other she-said/he-said books and articles...)
The Apple IIe was the very first computer I ever used. Prolly back around 1988-90 using it to play simple games to keep me entertained after school around the age of 7.
I actually used the Apple //e in school, Virginia School for the deaf and blind in Hampton, where I was, had a computer lab with Apple //e's and //e Platinums… and a version of "Dragon Maze" that was made with the option to play it with or without the dragon…
I remember these days. The Apple products were just too much for my budget, and like many I bought a Commodore 64. A bonus was the almost unlimited free software!!!! Also, why buy a game console when you could play similar games on the same computer you used for word processing!
Small Costs were such a huge driving force… my high school of 800 kids shared three Trash 80s… cassettes were mostly used for program storage… one of the TRS80s had a disk drive… 1983. Buying a computer to go to college… my class was the first to require one computer for each student… class of ‘87… DEC Pro30, compatible with the DEC10 mainframe on campus…. Word11 was a great word processor. Took a year to get a dot matrix printer (expensive and limited supply) Dad went Commodore64 after that… low cost word processing that looked better than a typewriter could do… huge savings in manpower. Or Mom-power…. 😀 Do typewriters still exist?
Trying to get APPLE STUFF on every desk in the schools was not a very good idea. Education did NOT have the budget for expensive computers and neither did MOST of the student's parents. Apple's idea was to get an APPLE computer on K-12 desks and then COLLEGE students would also want APPLE stuff for their college work. THAT idea did NOT pan out very well.
It's open architecture vs. closed architecture that decided IBM vs. Apple: third-party IBM compatibles made Apple look expensive, and cost is a main factor... plus an open architecture helped software development. Apple would also belong to a niche market...
I remember buying a computer in 1985 for under $200.00, and it used two cassette drives. I could use my music cassettes in it. It had no monitor, so I hooked it up to an old black-and-white large-screen TV that was in the basement. It took a while for the computer to move the cassette tapes back and forth while operating. I used a word processing program on it to prepare pamphlets, etc.
The early IBM PCs had a cassette interface. I had an early PC and tried it. It worked, but it was something I wouldn't rely on. The first IBM PCs had a max of 64K on the mobo. Mine was the version when they first switched to 256K on the mobo.
Tiny correction to this otherwise excellent overview: at the very beginning, the voiceover mentions Apple's installed base being Apple I; it should have been Apple II.
It was exciting to follow it. I bought a lot of computer magazines and went to several computer shows back then. I miss those days. Now everything is just an improvement on things that existed before.
Slight correction - only around 200 Apple Is were made - I believe the 300,000 number you mentioned are Apple IIs.
That's correct. By 1981 the Apple I had already been discontinued as the Apple II line expanded.
No. He actually said only around 200 Apple Is were sold. No idea where you got 300,000 from. He never said that.
It sure sounds like he says they had an install base of 300,000 Apple 1s in the first ten seconds of the video
Yes, it's a mistake. The Apple II was introduced in 1977, and that's what stormed the market: the VisiCalc machine. In fact, Apple Is are rare.
Only two years later the Apple II+ was on the market, and that was the king by 1981, absolutely not the I. There was the III in that year, but it was a flop; in 1983 came the IIe, which was the biggest seller of this era for Apple.
Check the subtitles, he just mispronounced.
Both the Alto computer and Aldus PageMaker are shoutouts to Aldus Manutius, a Renaissance printer who designed "small" books that could fit into saddlebags and worked with the legendary Humanist Erasmas (who is the guy in the PageMaker logo). Among other things, Aldus developed the first fonts used to print copies of classic Greek literature, as well as the original texts of the New Testament.
Erasmus.
The Xerox Alto computer got its name from the Xerox Palo Alto Research Center (PARC), alto being the Spanish word for tall/high/upper.
The GUI technology was first developed at the Stanford Research Institute (SRI); Xerox licensed the tech and created the Alto computer at the Xerox Palo Alto Research Center.
Trivia: IBM only ditched Lotus as their corporate email platform... last year in 2022.
This isn’t *that* surprising. Lotus Notes may be one of the stickiest bits of software ever created. In addition to email, it also had what we’d today call a low-code toolset and enabled relatively simple line-of-business applications. Those apps ended up becoming core to a lot of large businesses' operations, usually with a bunch of poorly documented business logic built in, in somewhat hard-to-decipher ways, since they were rarely built by dedicated programming teams. Unwinding all of that without bringing the business to a halt was a hard problem.
What do they use now?
IBM kind of disappeared into irrelevance now.
Best wishes.
@@ajax700 Lotus was rebranded to HCL Verse but was effectively still Lotus. They dropped it last year, but surprisingly there was quite a lot of resistance to dropping Verse. They now run Outlook (in line with all their clients).
I wouldn’t say they’re irrelevant; they are still a massive player in enterprise. For personal computers, they exited that business with the sale of the PC division to Lenovo. (And in hindsight that was probably a good move.)
HCL Verse was recently uninstalled from all of their laptops, so I guess only the government is using it now.
It's also not surprising because changing something as fundamental as the core email platform can have bad consequences for archival requirements, will result in a lot of expensive training/retraining of the staff, and could break lots of automation.
May I submit a correction? The original disk drive controller for the Apple II was called the "Disk II controller". The "Integrated Woz Machine" was the single-chip version used in later models like the IIGS and IIc.
In addition, the ImageWriter was 72 or 144dpi
I noticed you said "Apple I" instead of "Apple II" around 0:10. There were only a total of roughly 175 Apple Is sold.
There was a single networked 1st gen LaserWriter in the Mac lab in my art/design college around 1990. It was named "Godot"...as you were constantly waiting for it.
IBM went to Atari HQ in 1980-1981 and briefly considered licensing or using the Atari 800 as the basis for the original PC.
That would have been much better than what we got, lol. The Atari 800 was genuinely almost 10 years ahead of the PC.
IBM legitimized PCs for business. The old adage back then was you can’t go wrong buying IBM products.
@@belstar1128another Jay Miner miracle.
Have you ever thought of doing a video on how qwerty and other key layout setups came about? Apologies if you already have, it's just something that I keep thinking about when I look at all those weird attempts at keyboards in the early eighties.
The education market was basically what kept Apple alive during those early years after the IBM PC came out. IBM took away the high-margin business market (because IBM) and Commodore was eating their lunch on the high-volume home market (because Apple stuff was stupidly expensive and C64s were cheap). Apple really had few customers other than schools.
Our school computer room had one lonely Mac and five C-64s each with 1541 drive and monitor. Everyone played Summer Games by Epyx or Spy vs Spy on them. No one touched the Mac, I am not sure anyone knew how to use it.
Apple has always sold style, not science. That's a good way of making money: the richest men in the world are just stylists, in cars and handbags. It doesn't add a lot of our general well-being over the long haul.
Technology? Jobs just went out and copied. The original Apple PC is simply a cheap knock-off of the $9,000 Xerox machine of the time.
@@TheDavidlloydjones The original Apple had nothing to do with Xerox PARC stuff. It was more a heavily cost-reduced machine built around the 6502 using off-the-shelf parts and some very clever custom circuitry from Woz to tie it together as best as they could without having access to their own fab (which is something Commodore leveraged to make their stuff cheap).
The later Lisa and Macintosh were the ones that borrowed heavily from PARC's Alto, but that became a theme later on. Again, Woz had some good input in the Macintosh.
Jobs indeed was responsible for tons of styling and marketing decisions (some of them smart, some of them very stupid, for instance his aversion to fans and expandability), but Woz made a lot of genius designs, turning out a far better machine than such a small company had any right to be making.
It was Jobs' decision to chase high profit margins that essentially killed Apple in the home market. Computing devices wouldn't become a fashion statement until decades later, so their price competitors were in institutional use (business and education).
Funny how American academics tend to be the dumbest yet most financially stable customer base😂
@@TheDavidlloydjones Yup, it was always about style then and now. And with their closed-source mentality, the majority (entire?) of the industrial, engineering, scientific, and gaming community uses Windows, not Apple OS.
I worked for a publishing company in the 80s;
We published a group of magazines targeted at computer manufacturers (DEC, IBM, HP).
We had an 'art department', as you describe it. People cutting and pasting strips of text into the shape of a printed page.
And then along came Adobe with Aldus Page Maker.
We bought a Mac and an IBM PC-AT. Installed Pagemaker on both. And sat them down side by side in the Art Department.
It took a day to decide.
I ended up buying every new Mac that came out.
And the SAVINGS were insane.
Before you print 250000 copies of the magazine, you have to print ONE copy and have the advertisers approve it.
That costs about $6000 per page.
Until you can do it yourself.
And the big machine that spewed out columns of printed text to be cut/pasted....gets unplugged.
And I got to avoid having to support the IBM PC and Windows.
Thank you Adobe.
I named my son Aldus.
But it limited you to black and white. Don't most magazines get printed in four-color? How did you get around the color limitation?
It was slow and time-consuming. On an early Mac, I would think everything was one page at a time. You simply didn't have the RAM to load even a black-and-white magazine-length file, or even more than one page.
@@tarstarkusz Early on, Aldus PageMaker was used mostly for mock-ups, with the main work still being cut and paste. I think OP is describing the mock-up use case and the later move to replace paste-ups.
Though OP, Aldus was a separate company. Adobe and Aldus were rivals; they weren't together. And Aldus sued Adobe multiple times for copying their software (and won more often than they lost).
@@medes5597 Thanks.
@@medes5597 Plus, PageMaker was an absolute joke for most publishing houses when DTP *really* took off. If you were serious, you used QuarkXPress. Aldus PageMaker was common for form houses and in-house SMB. Adobe didn't have anything to compete until they bought Aldus and got PageMaker. But still, they were a non-issue until InDesign was released, and even then, Quark had to drop the ball on the OS X release of QuarkXPress 4 for InDesign to take over.
Ah, easy way to ensure that the son will be bullied in school.
I helped out at times in an internal publishing division of a scientific research institution. I remember Xerox Ventura, and then QuarkXPress.
One of the things that made the IBM PC really take off was the 'clones'. The BIOS was reverse engineered along with the entire open architecture, prices quickly dropped, new generations of Intel CPUs supplied more and more compute power (286/386/486/Pentium), and in a relative few years PC clones were EVERYWHERE. I loved that era, got to work on a lot of cool projects. :)
It was the clones that eventually drove IBM out of the PC market.
Great video. One quibble. Atari did not "collapse." It contracted but immediately began development on the Atari ST computer which was released in 1985.
Well, Atari broke up, which is perhaps not the same as collapse, but it certainly isn't the same as just contracting, either. Tramiel bought one of the divisions, which led to the ST, but the arcade business went to Namco. And Tramiel had already started the groundwork for the ST before buying Atari, so there is an argument that such work would have continued without any Atari purchase.
S-100 computers were king when the Apple II was brought to market. At first it appeared that the Apple II would be an open design, so small companies started making add-on boards for various tasks. Then Franklin and a couple of other firms decided to start making Apple II clones. Apple went after them with a vengeance, and soon the clones were history. When IBM came out with the PC, clones started appearing immediately. The only thing holding them back was the firmware BIOS owned by IBM. Companies like Compaq developed clean-room versions of the BIOS using software engineers who wrote it from scratch, working only from the BIOS specifications and never looking at any of the source. So before long there were dozens of companies building PC compatibles at low prices. I don't think it affected IBM very much, because most IBM customers were not going to take their chances on a clone. Apple locked down their design, thus limiting the installed base of the Apple II and Mac architectures.
and it was this openness and competition that led to so much success and innovation in the PC space.
Today Intel cuts its own throat.
eg. think about the Atom line.
Generally the bootloader ('BIOS') was locked. How cool would it be to boot full x86-64 Windows on your phone (e.g. the ZenFone 2) or on $50-100 tablets?
Eventually it was possible on the ZenFone 2, but... this should've been a feature, not a chore for the average consumer!
There were countless Apple II clones in Asia.
Although Apple still produces notebooks and desktops now, and IBM has not for a long, long time…
@@PRH123 IBM selling off their personal computer business was a dumb move. The name-recognition factor alone would have made them serious money. The ThinkPad was so good it could have been milked for many years to come with only minimal investment.
Yet they sold it for quick cash. Now the money is gone. All they have now is services and an R&D department hoping for a jackpot, but they don't have enough money and resources to make it happen.
False. Not many people cared about the 'openness' of architectures in the 80s. A computer back then was considered open if it had expansion slots, and people didn't expect computers to be backward compatible. The IBM PC didn't succeed because it was more 'open'. People largely expected that the IBM PC would win the PC market and drive out Apple and Commodore and the others, like IBM had done to Honeywell and Burroughs in the 60s and 70s. The truth is that for three years after launch the IBM PC did not have any competitive clones.
Not going to lie, I want to live in the alternate universe where Commodore came out on top. The Amiga was so far ahead of the Mac and PC in the mid-to-late 80s...
And the C64/128 were impressive in their day before that.
In the end nothing could compete with the clones. Apple, Commodore, Atari etc. were all custom machines trying to carve out their own incompatible markets.
The flexibility, expandability, competition and innovation of the pc once the cloners entered the market made competing with it pretty much impossible.
It would have been nice if they were able to keep going for longer with new products.
Your video is the first place where I heard a mention of an "Apple III".
Some of my friends bought one. It had 128K of RAM which we could not imagine what to use for. It is famous in my circle of friends because my buddy Dan beat the chess program SARGON on it while flying on acid one time -- something he was never able to do sober. Good times.
I bought my first computer in 1991. The choice was between a used 286 with a 14" VGA monitor, MS-DOS 5 (maybe 4.x) and Windows 3, or an Apple Macintosh with a 10" B&W display; the price difference was that enormous. What made me choose the 286 was the fact that, unlike the Mac, there was an almost unlimited amount of games/software to choose from, while the Mac only had stuff from Apple. Also, the UI on the Apple was only available in Icelandic, and even though that's the country I live in, I hate using electronic products that are not running in English.
A wise choice.
The early Mac definitely didn't just rely on software from Apple - certainly not in 1991. Back then, if you were writing a report on a Mac it was probably done on Microsoft Word. If you were crunching numbers, it was probably done on Excel. If you were connecting to a BBS, you would have used Telefinder by Silicon Beach Software or one of the many terminal emulators. Yes you might have used some Apple software too, such as HyperCard. But the average user could well have used a system with no Apple software at all.
@@OldAussieAds Not where I was living. Getting Apple software in shops that sold PCs was hardly an option. Then there was the issue with the monitor, the only one available for the Apple I could afford in the early 90's was 20" while the one with the 286 was 14" and the whole package was still cheaper than the Apple.
@@bjarkih1977 Yeah I agree there would have been differences some regions. That said, I'm in Australia and we certainly weren't first class citizens to US companies either.
I'm not sure I understand your monitor comparison. The Mac monitor was 20" while the PC monitor was 14"? Did you get those measurements the wrong way around? 12"-14" was standard for a colour Mac in 1991. 20" would have been absolutely huge, even for a PC.
@@OldAussieAds That 20" was well out of my reach.
Maybe I misunderstood something, but it sounded like Steve Jobs almost killed the company in the 80s.
He did. John Scully saved it.
@@peterfireflylund: agreed. Jobs was a visionary, but his ideas exceeded the hardware/software capabilities of the time. He wanted it so bad he was obsessed. You couldn't spend money on development like he was doing and not have investors get upset.
Steve Jobs succeeded in killing himself in 2011.
@@deepsleep7822 Jobs' one talent was marketing himself and convincing people that he was a 'visionary'. What he was was a gold-plated arsehole. Wozniak was the reason Apple existed; Jobs was nearly the reason it didn't.
@@peterfireflylund That's not exactly how the Apple story ended.
16:35 "Inspired". Xerox had a great team building a new object-oriented language called Smalltalk, with Alan Kay, Dan Ingalls, Adele Goldberg and others (based on Simula 67, the very first object-oriented programming language), and Steve Jobs happened to visit them one day and requested (or mandated) to be shown it. Adele refused, but the directors forced her to give him a tour, including a demo. Jobs mentions in a later interview that he was shown three things, but one of them hit him hard: there were square things on the screen, the so-called "windows", and Ingalls was manipulating them with a mouse. He knew he had to "be inspired" by it. The anecdote goes that he complained the window was redrawing too slowly when dragged, and Ingalls fixed it on the spot (as only Smalltalk, with its own integrated debugger, would allow) so that the window then moved fluidly. Check video id J33pVRdxWbw, "How Steve Jobs got the ideas of GUI from XEROX", at 6:26 for that. All the features of that "unique interface" mentioned at 16:55 were already at Palo Alto in their version of Smalltalk.
If you are curious about the 'tie-in' between Charlie Chaplin's "Little Tramp" character (seen at 10:40) and the IBM PC:
Possibly the advertising agency that came up with the idea was using the "Modern Times" movie Chaplin produced and starred in back in 1936. After all, the tag-line for that IBM-PC Ad campaign was: "A tool for _modern times."_
They also did it because the character had fallen into the public domain and they didn't have to pay royalties.
Jobs never understood engineers or computer users in general. Worst purchase I ever made was an original Macintosh. I regret every penny I spent on that piece of 'stuff'. We were completely abandoned, with no capabilities and no upgrade path. To this day I've never spent another dime on anything Apple.
I too bought one of the early Macs. It was cute, the pulldown help menus did help to understand it, but it had just those three programs (word processor, spreadsheet, something) which I had no use for, it had no storage to speak of, it had that dinky b/w screen, and after bringing it work (I was a Unix programmer) and everyone getting a chance to play with it, sent it to my sister, who used it for years with her church. The only reason it was not my worst purchase is because I had no great expectations and it was intended for my sister right from the start.
It did also teach me about Steve Jobs' dictatorial attitude. I have had jobs using Macs, and it drove me crazy how hard they were to customize. Steve Jobs' attitude was "my way or the highway", Bill Gates' was "who cares, it's crap", and Unix/Linux was a zillion choices and all the customization you could want, but no one made it easy. I use System76 laptops now; I wish they had a Gentoo option, but they are fine hardware.
I mean, Unix programmer probably wasn’t their target audience? Seems like it worked really well for your sister.
Same, I *NEVER* understood the hype for Apple, when Android is better in every way, at least to me, imo!
I used Android for a decade until iPhone 14 Pro. To say that Android is better in every way is delusionally biased.
@@jonhall2274 Don't forget Android is a grandchild of Unix and that its distant cousin is OS X.
This channel is constantly blowing me away with the topics and depth.... i always learn new things about topics i think i already know about!
Kudos, glad to see this channel is growing!
You kudos-givers are too lazy to read, needing video content!
Apple was always the #1. Who made money on the IBM PC or Gateway 2000?
The Integrated Woz Machine came later. It was a single-chip (= cheaper) version of his original disk controller which mainly consisted of an EPROM + some off-the-shelf TTL chips (+ some help from the main CPU). The IWM was key to making later Apple II and III machines cheaper to produce.
Bought my first Apple floppy drive in 1979. To the best of my memory the code running the drive was already being referred to as IWM.
@@shorttimer874 I agree. The IWM was a single-chip version of the Disk ][ controller card. The original card used discrete chips, and really that was the innovation: Woz took a lot of the operation out of hardware and into software (255 bytes of software!) and used commodity (74-series) chips for the Apple ][ disk drive. He took drive controller chips off the card. He removed the data separator from the disk drive itself. It was an amazing design.
Later he worked on the IWM, around the //e or //c timeframe. That was a single chip. The company's production volumes had risen and the cost of custom chips was down, so making their own made sense. The IIGS added the Mega II, which put basically a //c in a chip. I think the //c+ used a variant of that.
Also, Wendell Sander worked on the IWM, revising it into the SWIM (Sander-Woz Integrated Machine), which added support for 1.44 MB floppies (the SuperDrive). If nothing else, that finally gave interoperability between Apple and IBM floppies. This was done partially by putting back the old MFM data separator that Woz had removed in the Disk ][. It was the right move so floppies could be standardized between systems. But the funny thing was that the age of GCR (group code recording) had finally arrived, just as Woz had used it in the Disk ][. Hard drives left MFM for RLL and never looked back. Now on your computer, your display (HDMI or DisplayPort), Ethernet, slots (PCIe) and USB all use group coding. It goes by various names, 8b/10b or 128b/130b and such, but it's all similar in design to the Disk ][ scheme, which could be named 5b/8b (up to DOS 3.2, 13 sectors) or 6b/8b (DOS 3.3, 16 sectors) in modern parlance.
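The shared idea behind all of these group codes can be sketched in a few lines. This is only an illustration of the principle (map small data groups onto larger disk/wire codes chosen to keep the bitstream self-clocking); the rules below are made up for the example and are NOT the real Apple 6-and-2 table or the 8b/10b standard:

```python
# Toy group-code: encode each 6-bit value as an 8-bit code chosen so
# that every code has the high bit set and never contains three zero
# bits in a row -- enough transitions for the drive to stay in sync.
def valid(code: int) -> bool:
    return code >= 0x80 and "000" not in format(code, "08b")

CODES = [c for c in range(0x80, 0x100) if valid(c)]
assert len(CODES) >= 64          # enough codes for all 64 6-bit values

ENC = dict(enumerate(CODES[:64]))       # 6-bit value -> 8-bit code
DEC = {v: k for k, v in ENC.items()}    # 8-bit code -> 6-bit value

def encode(groups):
    """Turn 6-bit data groups into constraint-satisfying disk codes."""
    return [ENC[g] for g in groups]

def decode(codes):
    """Recover the original 6-bit groups."""
    return [DEC[c] for c in codes]

data = [0, 1, 42, 63]
assert decode(encode(data)) == data  # lossless round trip
```

The real schemes differ only in the group sizes and the exact run-length rules (Woz's 6-and-2 table, 8b/10b's running-disparity bookkeeping), but the encode/decode-through-a-constrained-table structure is the same.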
Anyway, I think the video gets that wrong. There are other small errors too, like showing a genuine Apple Mac 7200 (easily one of the worst Macs ever made) when it speaks of clones. None of this undercuts the story; he gets the tale right despite some small mistakes.
Steve Woz is a genius when it came to hardware design. Some of the software he developed was amazing, also, for its tight code and small footprint.
Give me an Apple //e or GS over anything else in that era. The Woz machines were so much better than anything else Apple produced, and it's inconceivable why they deliberately crippled the power of the GS, as it was a far better machine than the Mac of the same era. I believe that the GS, especially with an accelerator card that allowed GS/OS to run properly, outperformed the MS-DOS-based PCs of the era. Jobs never liked the open design philosophy of the Apple II line; time has definitely shown Woz's vision was the future of personal computing.
Who here remembers buying the doorstop posing as a monthly magazine called Computer Shopper?
OMG yes I had forgotten that! It was huge!
Have you started doing this full time now? Very regular uploads.
You left us all on a cliffhanger! 😅
We had some Apple IIe computers with dual disk drives when I was in Matric (years 11 & 12) - the guys who used them had to have two sets of floppies: one set for hot weather and one set for cold. Neither set could be read in the other season. (I stuck to the PDP 11/70 via terminals; far more reliable!)
Question about the ending ... What was Scully's new "game changing" project for apple that became a disaster? Is there another video that explores this?
This review of the early history of the PC is extremely interesting to me.
I was watching this field intently during the mid-70's to mid-80's.
In the late '70's the Z80 was the best 8-bit micro, but the 8080 had a head start and good marketing.
The Motorola 6800 was very good too but it didn't have the marketing clout behind it.
The Texas Instruments TI-99/4A was more of a toy computer, OK for games but not for serious work applications.
When the IBM PC came out, its hardware wasn't very good but the IBM name was magical and everyone wanted something that was "IBM Compatible".
The Texas Instruments PC ("Professional Computer") in the early '80's had much better hardware but without the feature of IBM compatibility, it wasn't a marketing success.
Yeah. I'm of the age group that grew with Apple and Microsoft, and it's bewildering they're basically retreading the mistakes of the companies they disrupted.
You seem to be mixing up CPUs (Z80 and 6800) and computers (TI-99/4A). The 6809 derivative of the 6800 was well loved by its programmers but apparently too expensive compared to the competition. The TMS9900 in the TI-99/4A was a true 16-bit CPU when all the competition was 8-bit, but other parts of the computer's design were so compromised that the result ended up slower than the 8-bit competitors. (I'm an old Z80 coder and never coded for the 6800, 6809, or TMS9900, but I've read a lot of what other people thought about them.)
@@andrewdunbar828 In the late '70's I thought the Z80 was the best of the 8-bit micros. I compared it to the TMS9900 and finally decided on the TMS9900, since I was working at TI. I built my own homebrew computer and wrote a lot of software at home for the TMS9900 that I later used on a real project at TI. Unfortunately TI didn't continue to develop their micros. In 1982 they gave me a TI-99/4A with all the accessories and software so I could evaluate it. It was fair for games but not too good for general programming. The IBM PC stole the show and set the standard (except for Apple).
The Z80 and 6800 and 68000 were much better hardware-wise than the 8080 family but Intel's marketing prevailed.
@@bob456fk6 Intel's marketing didn't prevail against the Z80: a _lot_ more designs used the Z80 than the 8080 or 8085. In part that's because the Z80 was an 8080 clone with some extra features.
Intel won with the 8086 because IBM used it and the IBM PC took off. It's kind of a crap design compared to a 68000, but Intel got working versions into production first; the 68000 apparently wasn't really yet ready when IBM was evaluating it for the upcoming PC.
I wouldn't say the 6800 was better than the 8080, no. Not even as good, really, except that it needed only +5V rather than +5/-5/+12. (I've done a fair amount of programming on the 6800.) The 6502 was probably about as good as the 8080: it didn't look as good on specs, but it ran pretty darn fast if you knew what you were doing. (A 1 MHz 6502 easily is faster than a 2 MHz Z80.)
Sinclair was for kids only, never better than an Apple I!
Only the Gateway 2000 and Compaq people made the PC bigger than Apple ever was. Apple won it all!
My Mac IIe was so s-l-o-w I had CompUSA bust the case and double its memory for "only" $150. Wish I had instead put that money into Apple common stock!
¯\_(ツ)_/¯
I used the Apple Lisa computer and LaserWriter back in the late 1980s, maybe the first in South Korea. These machines were a revolutionary piece of art, science and engineering: a graphic screen, UI and mouse, and beautiful printed output rivalling phototypeset books. A windowed desktop and an object-oriented drawing tool. The software alone was worth the money. It was easy to use and understand. It was a culture shock. I wrote my CS master's thesis about an object-oriented language on the Lisa (by then the Mac XL) & LaserWriter. So I couldn't understand why this revolutionary machine didn't sell well enough in the market.
According to Bill Atkinson it took the 8 MHz Macintosh at least half a second to draw a screenful of characters - 1.75 times faster than Lisa, and that's before accounting for Lisa's larger screen.
Lisa didn't sell because it was unusably slow, and also cost $10,000 (roughly $31,000 today).
Note that the 6502 is not based on the Motorola 6800 design. MOS Technology made a version, the 6501, that was pin-compatible with the 6800, but pin compatibility was the only compatibility. The 6501 was dropped after a lawsuit from Motorola. The 6502 was constructed differently from the 6800, and this made it possible to build it for cents and sell it for a few dollars. So the case of 6800 vs 6502 is very different from 8080 vs Z80, which were machine-code compatible, even though the Z80 was constructed very differently from the 8080 and had its own assembly language.
And, btw, thanks for your high quality and very interesting content! I love your channel.
So Steve Jobs wasn't as great as many make him out to be. I could have guessed it from Jim Keller's stories about how they had to hide products from Jobs in the early stages, because he didn't understand the product development cycle: a prototype is not the final product.
That "other" on the first pie chart were the collective CP/M (S-100) machines from many small manufacturers. The MITS Altair, and others similarly built, could run Digital Research's CP/M operating system (OS). Yes, Apple had its own proprietary OS, though Microsoft provided a plug-in card that could run CP/M. The bulk of the CP/M machines exited the market by 1985.
As for floppy drives, those first ones were 8" drives. You don't know "floppy", unless you've really handled an 8" floppy !
At the time of the IBM PC, getting patents on software was difficult; most software was only protected by copyright. I had access to an IBM Technical Reference Manual. It had all the schematics and a full printout of the BIOS source code. I used that manual a lot.
Your history about the Macintosh, Apple, and the IBM microcomputers would be just what the psychiatrist ordered. I would have relied on the Apple computer if only it included all business programs, including a relational database, as found on the Windows computer. Your story makes me appreciate using the microcomputer and the color laser printer more, and I do not want to resort to the old-fashioned methods of doing things, regardless of how available they seem to be.💙
“Why isn’t the Mac selling?”.. it was way overpriced..
Even a single expansion slot would have helped allowing for a co-processor even though they're different architectures.
@jr: agreed. And you couldn't easily exchange data with a PC: different disk formats. Jobs wasn't dumb, he just wouldn't accept the fact. He thought the Mac was a better machine, and perhaps it was. But you know what his arrogance got him.
This video illustrates to me why this is one of the best channels on RUclips.
It's a pretty strange era. There were a lot of good computer companies in the early 80s that were way better than Apple and IBM, like Commodore, Atari, Sinclair and TI, but they all got crushed and forgotten after the 8-bit era. My guess is that they overestimated the home market instead of focusing on the office market. In the 80s the average person didn't need a computer; it could have been handy and fun for gaming, but it wasn't until the 90s and the rise of the internet that the average person wanted a computer for more than just gaming.
I think another advantage IBM had (and Apple to a lesser extent) was that their computers were more expandable and open, so other companies could improve the PC. Even the old Apple computers had more expansion options than the C64, and when someone did make a hardware upgrade for the C64, nobody supported it.
@@ghost_mall English is a dumb language
@belstar1128 At least Commodore and Atari did survive into the 16-bit era, and I do agree that in the 80s a lot of people had no reason to own a computer beyond games. But they did try to find success in the business market with the Commodore Plus/4 and the Atari Portfolio and Stacy. The fact that, as you say, expansions sometimes don't get supported, alongside fragmenting the user base, is maybe why some computer companies didn't support the idea of more expandable systems.
The Altair 8800 was hardly a personal computer. It, like the KIM-1, was really just a hobbyist kit. Basically, it was a box with switches and some LEDs on the front. While it was affordable, there was no keyboard for input and no display for output.
In my opinion, a more realistic view of what makes a personal computer is that at minimum it REQUIRES a keyboard (or something more advanced) and a display of some sort (teletypes and LEDs notwithstanding). In my view, it would have to be something akin to an IBM 5150, an Apple II, or a VIC-20. I wouldn't even count the Apple I, since it was a kit, which keeps it in "kit" territory.
While much of the hardware has changed from the IBM 5150 or Apple II to now, the basic paradigm for using a computer hasn't drastically changed. I'm typing on a laptop that has a keyboard for input and a video display showing letters (or characters) as they're typed. Modern computers do generally have GUIs and pointing devices, but mice are not new, as everyone who has ever seen "The Mother of All Demos" can attest. The only thing really keeping Douglas Engelbart's system from being a PC was the fact that it was so horrifically expensive.
The truth is that before there were affordable systems with keyboards and displays, the systems that preceded them were too difficult to be used by anyone but a hobbyist or someone in the industry. The advent of the actual PC changed that.
The KIM-1, Altair, IMSAI, etc. were all very capable and expandable computers. But you needed a separate terminal to interact with them, which was a significant added cost.
@@markmuir7338 From what I've seen of those machines, getting them working with a keyboard and display is not a trivial affair and requires toggling memory locations using switches on the front of the case and extra, expensive cards and such. If my understanding is correct, this disqualifies them at least in my mind.
The point of the "P" in "PC" was that you could:
A) Afford the thing.
B) Take it out of the box and do something useful with it as a normal person.
At 8, I was able to teach myself Apple II BASIC, TRS-80 BASIC, and IBM PC BASIC. With one of those other computers, I sincerely doubt I would have had a clue what to do with them.
@@kevintrumbull5620 These hobbyist micro computers worked the same way as mini computers and mainframes of the era. They were intended for people who already used computers, but who wanted to have one to themselves. That's why they relied on serial terminals - that's what their users were used to. Once you had all the necessary equipment (terminal, disk drives, disk controller card, memory expansion card) you could run 'proper' operating systems on them (CP/M) along with plenty of commercial software.
As you said, adding a built-in terminal which worked out of the box would expand the potential market substantially - by including people who had never used a computer before. I also got into computing that way via a Commodore VIC-20 when I was 6, then an Apple II when I was 8. It's important to recognize there were a few of these integrated systems before Apple (eg MCM/70) which didn't have the marketing genius of Steve Jobs, hence are lost to time. Also, several hobbyist computing companies (eg SWTPC) sold their own terminals alongside their micro computers, which were pre-configured. These companies and their products continued along until the IBM PC stole away most of their customers.
The YouTube channels 'Adrian's Digital Basement' and 'Usagi Electric' have vastly expanded my knowledge of these early systems, and made me less of an Apple fanboy along the way. There was plenty of innovation all around.
If you were willing to add a terminal, serial port and ROM board to an Altair you could have a system that booted straight up into whatever the ROM does, with no toggling.
The Apple II was sold built, as well as in kit form. You could just go down to The Byte Shop, buy one that included a case and PSU, and plug it into your TV at home. Though of course it booted up into WozMon, rather than BASIC or anything like that.
But you're right that for consumer adoption, something "akin to an IBM 5150, an Apple II or a VIC 20" was required. But that didn't happen in 1981 with the first IBM PC: that happened in 1977 with the Commodore PET, Apple II and TRS-80 (later called the TRS-80 Model I).
Really difficult to take this video seriously when it starts off with a false pie chart in the first 15 seconds. The Atari 800/400 computers were outselling the Apple II by a large margin in 1981 and were still outselling Apple in 1984. And Atari is just magically not mentioned. Face palm.
This is when I grew up using computers: Aldus PageMaker, Photoshop (just Photoshop, no number or date), Illustrator, and Premiere. I did my senior project (wrote and produced a pilot, but only had time to shoot the commercial during the school year) and incorporated 3D graphics and overlays using a 660AV and a VHS VCR. I had that tape for 20 years till I lost it in a move.
It was a spectacular disaster, but it was my disaster. If I still had it, it would be on YouTube for everyone to gawk at in horror. 😮😅
Considering I'm nowhere near Hollywood, there was never a risk of my becoming an actual producer, but it was still fun learning the process.
Seeing those rooms with the old computers reminds me of an urbex video I saw the other year where an abandoned textiles factory in Italy was explored and there remained so many PCs that part of me wanted to dash over there to find one a new home. Sure, they would require repairs but it's fun seeing relics of the past still exist, even if they're in bad shape.
Next you should do a video on Commodore and the Amiga. I grew up on the Timex Sinclair and Commodore 64 as a kid. The Demo scene on the Amiga with 4096 color palette far exceeded what the Mac could do. But the best 3rd party add in card was Newtek's Video Toaster. Video switching, compositing, and 3D Computer graphics creation. You could do a series about the Computer Giants of yesteryear. The CPU wars never get old.
Amiga only; Commodore did too many weird systems. Who needed that?
Kids, play a game on it!
The Amiga topic is interesting but I feel like it would end up being just a footnote in the history of microcomputers
The best would be a story about Commodore, a company that made millions but of which, in the end, only the memory remains.
I love your videos. Well researched and presented
Apple II and II+ is what you meant for 1981, as the Apple I was the 1976 single-board computer requiring soldering…
Very nice. The Fall 1985 marketing push you cite resulted in my company becoming a Mac / LaserWriter / PageMaker customer at the time. I was the one sent out to "the seminar." I reported back that it was great, and we should buy it.
I feel like Steve's hiatus from Apple (@24:30 onward) was what made him Steve Jobs. He finally went out into the world and watched people use computers. It's after this that he started saying those iconic things about "starting with the customer, and working your way back to the technology", and so on. Before this he was a great manager and idea guy, but he didn't have a concept of the mission. That's not to say he was "wrong" in redesigning the first Macintosh and quintupling the price from $500 to $2495; the price was not the issue but the technologies and software it shipped with.
The Lisa embodied many of the same characteristics, but clearly showed that price was very much an issue.
What would have changed the dynamics of the Macintosh intro would have been shipping with 512K base memory instead of 128K, expandable up to 2MB. All those 128K Macs were useless for anything but MacPaint doodling, and over $3000 was a lot to pay for a doodling computer. And it needed that SCSI port from day one so that serious (business) users could attach large-capacity storage.
IOW, they needed to have introduced something much more like the Mac Plus from day one, and then they would have had a very serious contender against the business-oriented IBM PC.
I'm not sure what that last sentence, after $2495, means; it needs to be explained or rephrased. What was the issue?
@@acmefixer1 Basically they made a very expensive device with not a very great idea of who the intended customer was supposed to be. Computers aren't just specs, they are the software they ship with, the software they can run, and the things you use them for.
@@TheSulross IIRC the SCSI standard wasn't finalised until 1986 - even the Macintosh Plus, released in January of that year, isn't fully compliant with SCSI-1 specifications.
The price of the first Mac absolutely was a major problem. That, and the gross mismanagement of the IIGS, cost Apple what should have been a position of permanent dominance in the computer market.
The Apple II E was our family's first PC and I remember it very fondly. My university had a UNIX network on campus, and every student got an account. My first very own computer was a used 386 that friends who worked at Siemens gifted me. Those last two weren't Apple of course, but I always think of all those systems together, with tons of nostalgia! Thanks for the memories! 😊
Or maybe the history of Atari. Despite the claim that they went out of business in 1983, Atari came out with the Atari ST in 1985. One of the fun things about it was that you could take the ROMs out of a Mac, put them in a cartridge, insert it into the Atari ST, and with a small program turn your ST into a Mac that also ran FASTER than the real Mac.
The movies and all the narratives I've seen about Jobs' ousting paint a picture of personal strife and whatever, but it seems like he just couldn't handle the business well. As he himself seems to have said in the quote shown in the video, he didn't know how to fix the Macintosh. It sure seems to me that all that other personal stuff didn't matter; in the end, the numbers are what got him kicked out. Sure, the Steve that came back probably could have fixed what younger Steve couldn't, and indeed fixed Apple, but I've come to see the whole ordeal in a different light. Thanks for the video!
A couple of mistakes in your video:
1) Woz did not leave Apple; he was fired by Jobs because he wanted to continue the Apple II line, which at the time was supporting Apple Inc. while the Macintosh was losing them money. Jobs saw this as a threat and eliminated Woz, and soon after tried to kill off the Apple II line. Seeing this, Sculley had the Board of Directors eliminate Jobs, and kept the Apple II line going until the early 90s.
2) You have it that Atari lowered the cost of the Atari 400 to around $299. The mistake here is that Atari lowered the cost of the Atari 800 to $299, with a $100 rebate coupon printed in some magazines and available at store outlets. The Atari 400 at the time went for $99. Consider that in 1979, when they both came out, the Atari 800 went for $799 and the 400 went for $599.
An interesting video, but you left out a lot, and beyond these two issues there were a lot of other minor mistakes of information.
Oddly, you barely mention the Radio Shack TRS-80 systems, as they were a good part of the market, or CP/M. In those graphs where you mention "Others", that is where the TRS-80 and CP/M belong. On top of that, in the early 80s Radio Shack released the CoCo (Color Computer), which opened the market for many who did not have a Sears or Computer Factory store near them but did have a Radio Shack. Poor marketing on their end, as the CoCo could have been a major contender at the time.
Hindsight is often 20/20 but some have it as 250/250.
Woz sort-of-kind-of left after the January 1985 stockholders' meeting but made sure he remained on a token salary so he'd get his ten-year pin in 1987. Jobs, still Chairman of the Board but put in charge of nothing as "Corporate Visionary" in May, quit in September.
Found the 'Nokia: The Inside Story' book at a thrift store and it was an amazing read. Maybe take a look at it for a future video, in case you were not already working on one. Thanks for the great content.
My first computer, c. 1981, was a Eurapple: an Apple with non-NTSC video output. I clearly remember I got a big PCB which needed all the parts and a bunch of cuts and straps to get NTSC. I made one mistake. I built a linear power supply and used a nice wooden Hammond case and a surplus keyboard. At first I used a TV set and tapped into its circuit with the NTSC signal. Later I got a green-screen monitor using a phono jack for the video. Next came the floppy drives, two of them, Asuka drives, and much later an HDD. A big thing back then was to add an 80-column card. I also remember getting a Z80 card to run a spreadsheet. I designed and built several cards that plugged into the slots.
This video doesn't seem to mention the cost of the IBM PC. I had one in 1983 or '84, and it cost almost $3K with a monitor and printer.
Yeah!
The $3K [US] amount was pricey for its time and not affordable for the average household; adjusted for inflation, that 1983 price would equal nearly $9K in 2023 [US] dollars.
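For what it's worth, that inflation adjustment is just a ratio of price indices. Here's a minimal sketch; the CPI figures below are rough annual-average approximations assumed for illustration, not official numbers:

```python
# Rough inflation adjustment: scale a historical price by the ratio of
# consumer price index values. The CPI-U values here are approximate
# annual averages, used only for illustration.
CPI_1983 = 99.6
CPI_2023 = 304.7

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of CPI values."""
    return price * cpi_now / cpi_then

pc_price_1983 = 3000  # IBM PC with monitor and printer, per the comment above
print(round(adjust_for_inflation(pc_price_1983, CPI_1983, CPI_2023)))
```

With these assumed index values the ratio comes out to roughly 3x, which is where the "nearly $9K" figure comes from.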
Ed Roberts of MITS bought cosmetically defective but functional 8080 CPUs and used those in the Altair. The chips had package or finish defects but passed functional checks. Also, that photo on the cover of Popular Electronics is a lie: the box doesn't work. Bob shipped one to the magazine for its photographer, but it was lost in transit, and the one pictured is empty. Bob had to scramble to get them a new one.
I didn't see any mention of Microsoft. I understand that IBM did write an operating system for its PC, but MS-DOS was far superior and deserves a lot of the credit for making the IBM PC a success. Also, I didn't see any mention of the PC Jr.
There is mention of Apple creating an early reliable disk drive. The Apple II data read operation was largely implemented in software running on the main processor. Apple created a standard format for storing data on its disks. The Apple II ran most of the time without a resident operating system. To start a new program, the user would place a disk in the drive and reboot. The BIOS would read track 0 sector 0 into memory and turn over execution to it. That sector needed to take over the disk read operation, which typically used disk read code in the BIOS. IBM's disk drives, on the other hand, had hardware to read the sectors.
Apple allowed each software vendor to place a copy of Apple's DOS on its disk. IBM's PC booted in DOS and then loaded the software vendor's program.
Apple's system was highly resistant to viruses, since it typically got a fresh start from each disk. IBM's PC was vulnerable to one program modifying the resident operating system to modify the next program loaded, passing along a virus from disk to disk. Apple's system architecture allowed vendors to implement disk copy protection schemes based on non-standard data formats. The standard Apple disk copy program would not work. A program named Locksmith attempted to copy data in the non-standard formats, but wasn't always successful. In some cases, the software writers created their own custom format. Other times, the disk was basically in standard format but had some non standard data somewhere. A subroutine would check for the non-standard data. A clever hacker could find the subroutine and patch it out. If the entire disk was in a non-standard format, it was possible to let it load to memory and then write the image in standard format to a blank disk which would load and execute as a standard disk.
Apple's ProDOS came later and was resident, allowing programs to run without reloading DOS each time. AppleWorks integrated word processing, database, and spreadsheet software.
Copy protection for the Apple disks was a headache for the schools. Without it, the schools could make backup copies, permitted under the copyright laws, and let the students run the backups, keeping the originals locked away. If the backup was damaged, the school could make a fresh copy of the original on a blank disk, without having to buy a new copy. Sometimes, the copy protected disks did not read properly in all disk drives while disks with standard format read well. So, hackers came to the rescue, breaking the copy protection schemes and writing the unprotected copies to blank disks for the schools to use.
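The crack described a few comments up (find the check subroutine and patch it out) can be sketched as a toy model. This is purely illustrative Python, not real 6502 code, and the single "marker byte" scheme is an invented stand-in for the non-standard sector formats those protection schemes actually used:

```python
# Toy model of a disk-based protection check and the classic patch.
# A real Apple II scheme would read a deliberately non-standard sector
# with the software-driven disk routines; here one marker byte stands in.

MAGIC = 0xD5  # hypothetical "non-standard" marker the check looks for

def protection_check(disk_image):
    """Return True if the disk carries the non-standard marker."""
    return disk_image[0] == MAGIC

def run_program(disk_image, check=protection_check):
    if not check(disk_image):
        return "This disk is not original."
    return "Program running."

original = bytes([MAGIC, 0x01, 0x02])
standard_copy = bytes([0x00, 0x01, 0x02])  # a standard-format copy loses the marker

print(run_program(original))        # the original passes the check
print(run_program(standard_copy))   # the copy fails the check

# The cracker's patch: replace the check with one that always succeeds,
# much as a hacker would NOP out the check subroutine in memory.
print(run_program(standard_copy, check=lambda img: True))
```

The point of the model is that once the check is located, defeating it doesn't require understanding the format at all, which is why hackers could then write unprotected standard-format copies for the schools.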
Thank you! Your videos are just excellent!
Apple's biggest problem, I think, was price. Price is why the first computer my dad bought was a C-64. PC clones were what got him to enter the bigger PC market.
The biggest problem with the Apple III wasn't chips falling out of their sockets (yes, that was a problem to a small extent, but not the big deal-breaker); the BIG problem was that its poor ventilation design caused the components inside to overheat significantly. It had no purpose-built ventilation and no fan, causing it to overheat badly if left on for more than a couple of hours. I remember one Apple III in a computer lab long ago with a note on it that said "turn off after one hour", meaning no one could use it for more than an hour because it was so prone to overheating.
Apple... even from the start it was more about style than substance. That, alongside the enforced obsolescence of the old models.
But you have to give it to them for the original Macintosh. I remember seeing that when I was a teenager and LOVED the interface. It was gorgeous!
Minor note, I bought the Apple 2 Plus, the second Apple 2 generation, in 1979. The actual Apple "1" was a kit with no case or keyboard, very few of those were sold, so I am guessing in the beginning of this video the count of Apple "1"s actually includes Apple 2s.
I remember playing Wavy Navy on my friend’s Apple IIe back in primary school… I also remember my Apple IIc, with its AMAZING RAM upgrade to… 128k
HP wasn't in the PC market in the early 80's. Their computers may have looked like PC's but they were aimed at professionals and so were the prices. They didn't sell in any meaningful numbers. Not sure where you sourced your data for the pie chart at 11:05. It also conflicts with the market share chart you show later at 22:48. That chart is missing the "other" category as it doesn't include computers like the VIC-20 which was still selling in good numbers into the early 80's. Even the Ti-99 was still on the shelves in those years.
One of the reasons Commodore was able to weather the gaming console crash and price wars of the early 80s is that it owned MOS Technology, which made the 6502 CPU. Every Apple II sold was generating revenue for Commodore!
Mainly through licensing to companies like Rockwell and Synertek.
@@brodriguez11000 Yes second source manufacturing
HP had various models that could run CP/M in the early 1980s including the HP-86 and HP-125. I think that pretty much makes them personal computers, albeit not IBM PC-compatible ones. Yes, they were expensive and noted as such at the time, but then there was a perception of enhanced quality over the most comparable competition.
It is an odd statement indeed to distinguish between personal computers and computers "aimed at professionals", since a lot of the money made in early microcomputing was made precisely by selling computers to professionals. That was kind of the point of the IBM PC, after all.
I suffered through ownership of an Apple III and they weren't reverse compatible.
I had to buy an emulator card so I could play Wasteland on it.
Many people forget how groundbreaking the original Apple II was. It had BASIC in ROM, expansion slots, color, graphics, and sound in 1977. Its only "competition" were the TRS-80 and Commodore PET, which were text-only, with no sound and no graphics. Then the same people brag about the Commodore 64, which did have sound and graphics but was not released until 1982.
The first disk drive for the Apple II did NOT use the Integrated Woz Machine. The Disk ][ controller board has only a PROM and logic ICs. The IWM was the IC put in the //c, later IIe controllers, and the IIgs.
Strictly speaking, the "Integrated Woz Machine" was not the first superb Woz disk design, which used something like 6 chips. It was a one-chip version of the Woz machine that was used on the Apple IIc and early Macs.
Apple had the Goldilocks problem. Their first try to leap forward was the Apple ///. Not hot enough. Still a 6502 CPU and limited to 64K of memory per bank. Then they tried with the Lisa. Way too hot, cost about $12,000, and too cool, speedwise, it ran really slowly. Then they tried the Macintosh, which was somewhat in-between, about the right amount of everything, a definite step up, but still flawed-- no hard disk, only 128K of RAM, and incompatible with everything else in the world. Real tough times for Apple. It would be years before the Macintosh line was profitable.
The failure of the Apple /// was mainly caused by the poor quality of the initial product. Steve Jobs insisted on not having ventilation slots or a fan. Instead it had an aluminum base that was supposed to act like a heat sink. The Apple /// had more memory than the Apple ][ and a faster processor. This caused the computer to generate more heat. The heat caused many of the socketed chips to become loose and cause hardware errors. Apple tech support suggested lifting the Apple /// several inches and dropping it to reseat the chips. The Apple /// had an Apple ][ compatible mode, but special chips were installed preventing access to any Apple /// features when in Apple ][ mode. This included expanded memory, 80 columns, and lowercase letters. When in Apple ][ mode, the Apple /// could only address 48K of memory.
The best content of these topics on youtube. Big fucking like for you
Apple IIe was a great computer and critical for its time, after that I lost interest in Apple’s heavily proprietary computers.
Uh the TRS-80 was not trash, and was not nicknamed that "for good reason".
It's just how the abbreviation came out. I considered the Apple II to be a dog compared to the TRS-80 Model III. The PET was the real dog, until a cheaper knockoff of it, the VIC-20, came out, which springboarded the C=64. The C=64 and 128 were the machines to have, until they were really eclipsed in speed and power by the Mac and IBM AT.
Now, the TRS Color Computer was kind of trash, but then so were the Atari 800 and 400. Keyboards were a big deciding factor back then in whether you liked a computer or not. It seems laughable today, but yeah, a good keyboard would make or break a system. I know, I will only use an IBM Model M keyboard from 1986 now... because it's a cult classic, and for good reason: it's a dream to type on for the average user.
[10:14] The infamous Wall Street Journal ad: "Over the next decade, the growth of the personal computer will continue in logarithmic leaps." It's odd that they didn't say exponential, and instead said the inverse, logarithmic. Anyway, thank you for your fascinating and well-researched videos, they are always worthwhile.
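To the commenter's point, a quick numerical sketch shows why "logarithmic leaps" is the opposite of what the ad presumably meant: the exponential explodes while its inverse, the logarithm, crawls.

```python
import math

# Compare exponential growth with logarithmic growth over the same inputs.
for x in (1, 10, 20):
    print(f"x={x:2d}  2^x={2 ** x:7d}  log2(x)={math.log2(x):.2f}")

# 2^x goes 2 -> 1024 -> 1048576, while log2(x) only goes 0.00 -> 3.32 -> 4.32:
# growth in "logarithmic leaps" would actually be growth that keeps slowing down.
```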
There is a brilliant story of an Apple investment (way back when Jobs was on the Lisa team, I believe) into a Hungarian startup called Graphisoft, who subsequently went on to invent a new way of designing buildings on computers. Their flagship software Archicad battles the hegemon of the AEC industry, Autodesk, to this day, in much the same fashion as Apple and IBM once did. I would love to see your take on the story, @Asionometry.
It’s worth noting that Wozniak’s disk drive lacked the capability to track the position of its drive head. Consequently, when it needed to reset, the motor would forcefully push the drive head for about a second, hoping to return it to the starting point. This process generated a loud and distinctive noise, making the disk drive significantly noisier than others available at the time. All in the name of using as few components as possible and saving cost.
This was a great video. It brought back so many memories. Filled in so many details. Thank you.
I wrote some successful software in the early '80s that ran on the Apple IIe. I looked at the Mac and, to my complete surprise, found it wouldn't run IIe software.
That seemed dumb so I moved over to the IBM pc until Windows 2.0 came out.
Thank you for this fascinating tour through this history. I lived through this whole period, buying an Apple ][+ in 1980. What I particularly liked about your video was, in your usual fashion, its completeness -- I learned a lot that I did not know before, and I feel like I have a much broader view of what was happening at the time. Being so immersed in it, I didn't see the forest for the trees. As always, I am looking forward to your next video!
Personal computing is one of the few industries that has grown up within a single human generation.
I don't remember the Apple II having color. The monitors were either B/W, green, or amber. I think later a color card could be inserted to work with a television. That was in the UK. I loved the Apple II, which I bought for home use.
To say that the 6502 is based on the 6800 design is a bit reaching. The 6501 is pin compatible, as an upgrade path (the 6502 isn't), and they share some similarities in their logical code model, but they're absolutely software incompatible.
Most people here bought Ataris and Amigas which were like higher quality, far more affordable and versatile versions of Apple products
That was very well-done research, especially the look at markets and applications for (existing) products.
The IBM PC was very similar to a CP/M-based computer, but with the advantage of being able to use more memory, and having a more powerful computer - and a single standard disk format. So it started by swallowing up that segment of the market. The Commodore 64 was a direct competitor to the Apple II in the home market; but since the Apple II was a factor in the business market, IBM was also a competitor for it.
The video misses the Apple IIgs, which is where Apple dropped the ball.
Steve Jobs realized that GUI was the future, and tried to jump straight to it with the Mac, making two mistakes mentioned in the video. The first is the hardware wasn't powerful enough to support GUI. The other was lack of software.
Bill Gates realized the GUI was the future, but given that hardware wasn't strong enough, decided to improve the operating system incrementally until the hardware caught up, which was around the year 2000, with Windows NT and Windows 2000. As a side comment, Microsoft released Office for Mac before Office for Windows. It got experience developing for a GUI before Windows was available, making it easy for them to release a Windows version when the time came.
Which is where the Apple IIgs comes in. It was as much an Apple IIe upgrade as hardware at the time could offer - 16 bit processor, 256KB to 8MB of RAM, better graphics (the Mac's display was literally black and white), better sound, and compatible with Apple IIe slots and software. It was an easy upgrade path for IIe owners. It also had an OS with GUI in '88, two years before Windows 3.0 became popular.
Which makes the Mac's software availability problem all the clearer. If a customer has to choose between two computers that don't support any of the software he has, he might as well switch brands. Apple realized this a little late, and released an 'Apple IIe on an expansion card' for the Mac*. Problem was, the Mac was expensive, and adding the card made it even more expensive.
[* That wasn't a new idea. The Apple IIe had a 'Z80 computer on an expansion card', which allowed it to boot CP/M and run applications like WordStar.]
Apple's mistake was trying to run faster than technology allowed it to. If Steve Jobs was willing to take a slower route, Apple would have had a chance to compete with 80x86 & Windows.
One of my uncles built an Altair 8800. It seemed like science fiction then.
The problem is Apple doesn't license its operating system to manufacturers, meaning it doesn't allow PC manufacturers to make computers with macOS the way Microsoft does with Windows. This limits your choices.
They licensed the OS for a while, but found out that other companies made a better Mac.
0:10 "300,000 Apple 1's"? Doesn't help the credibility of this video.
you need to do a video on Novell
Concur.
Shout out to the Performa 5200 series. I loved the industrial design of that machine, and it was fairly expandable, at least via SCSI, even if it was obviously slow relative to Windows PCs. Still, it was and still is an extremely fun machine to use from the glory days of CRT, and definitely a nice prelude to the iMac. And way better than all those PowerPC/Performa 6xxx series, of which there were way too many configurations (Mac people know what I'm talking about). System 7.6 - 8.6 looked awesome on that Performa, Ethernet was effortless, and 100MB Zip drives (mostly) worked for low- to mid-size files. I've made a point to keep everything compatible with my 5215 still working. SoundEdit 16 v2 was/is also a really awesome piece of software; even if it was slow, the ability to *see* raw sound and music was amazing. Photoshop, Illustrator, and QuarkXPress also did OK on this machine. Even when it lagged, it was still an amazing creative tool. It could also handle a certain type of file called the .mp3 ;)
Curious perspectives, especially your use of the term "killer apps". How old are you, and what's your technological education? (It's a sincere question; I only want to know why some people see things in different ways.) To be honest, a lot of this sounds like it came from wikis and/or other he-said/she-said books and articles...
The Apple IIe was the very first computer I ever used. Prolly back around 1988-90 using it to play simple games to keep me entertained after school around the age of 7.
I actually used the Apple //e in school, Virginia School for the deaf and blind in Hampton, where I was, had a computer lab with Apple //e's and //e Platinums… and a version of "Dragon Maze" that was made with the option to play it with or without the dragon…
I remember those days. The Apple products were just too much for my budget, and like many I bought a Commodore 64.
A bonus being almost unlimited free software!!!! Also, why buy a game console when you could play similar games on the same computer you used for word processing!
@kingforaday8725 Where did you get free games from? :P Commodore did make a consolized version of the C64 in 1990 called the C64GS.
@@cryptocsguy9282 Copying was very common back in the days. Often called "warez"!!!!
Been waiting for you to do this topic! Thank you
Costs were such a huge driving force… my high school of 800 kids shared three Trash-80s… cassettes were mostly used for program storage… one of the TRS-80s had a disk drive… 1983.
Buying a computer to go to college… my class was the first to require one computer for each student… class of ‘87…
DEC Pro30, compatible with the DEC10 mainframe on campus…. Word11 was a great word processor. Took a year to get a dot matrix printer (expensive and limited supply)
Dad went Commodore64 after that… low cost word processing that looked better than a typewriter could do… huge savings in manpower. Or Mom-power…. 😀
Do typewriters still exist?
Saw one in an antique store. I’d have bought it if it was an IBM.
1:48 We had two of these; they were amazing for the time . . .
Trying to get APPLE STUFF on every desk in the schools was not a very good idea. Education did NOT have the budget for expensive computers and neither did MOST of the student's parents. Apple's idea was to get an APPLE computer on K-12 desks and then COLLEGE students would also want APPLE stuff for their college work. THAT idea did NOT pan out very well.
It's open architecture vs. closed architecture that decided IBM vs. Apple. Third-party IBM compatibles made Apple look expensive, and cost is a main factor... plus an open architecture helped software development. Apple will always belong to a niche market...
I remember buying a computer in 1985 for under $200.00 that used two cassette drives. I could use my music cassettes in it. It had no monitor, so I hooked it up to an old black-and-white large-screen TV that was in the basement. It took a while for the computer to move the cassette tapes back and forth while operating. I used a word processing program on it to prepare pamphlets, etc.
The early IBM PCs had a cassette interface. I had an early PC and tried it. It worked, but it was something I wouldn't rely on. The first IBM PCs had a max of 64K on the motherboard. Mine was the version from when they first switched to 256K on the motherboard.
Tiny correction to this otherwise excellent overview... at the very beginning the voiceover mentions Apple's installed base being the Apple I; it should have been the Apple II.
300,000 Apple I computers installed? Fewer than 200 were ever made...