I loved this - as someone who was there - Acorn employee 1983-85. And I was in the room when that first ARM chip ran. What seems to have been lost in the history was just how fast it was - we kept cranking up the clock speed and it just kept working. Probably the fastest single stream computer on the planet at that time in April 1985. Also thank you for the kind words about the Domesday Project. Undoubtedly the hairiest decision I have taken in a lifetime in Computing was to say we could build the software within 2 years and it would run on twin 6502 processors with (was it?) 48K of RAM.
I recall seeing the Domesday project demonstrated on tomorrow's world and Blue Peter. Imagine my excitement when, a few years later on I found myself working in a company set up by a group of ex BBC ITU people, and it had a fully working system set up, including a few spares of those weird custom laserdisc players. The project was even ported to Hypercard on the Mac there just to keep it alive as technology changed.
@@erratic100 Yes. The project was conceived to run on those 12 inch hybrid analogue/digital laser discs by Peter Armstrong, a BBC Producer. But the sketch he and and Andy Finney had of how it would work would have been too slow and unusable at scale. There was a key meeting with Philips in Eindhoven where I think the plan was to offer us CD ROMs as the storage technology, but I reckon Philips' CD boffins thought they couldn't deliver in time. So this wonderful system was, sadly, initially delivered on top on a technological cul-de-sac.
@@johntait8422 Peter and Andy were both at the company I worked for. They are both lovely people, and great to work with. I learned a lot. And also about the innards of those Laserdisc players - there was often one with its case of being fiddled with to keep it functioning. There was one that was always problematic. I was moving it from storage one day and dropped it, heavily, on my foot. It worked perfectly after that! :) The company did end up doing a deal with Philips on an unrelated project using CDi. Another dead end technology, but it was fun to play with the range of dev kits that were sent. Also, the main project on which I worked was authored in Macromedia Director and Hypercard. Due to that it sadly no longer works on any modern kit, and two of the three multi-disc copies I own have succumbed to CD rot. I've made a bit-perfect copy of the remaining working discs, and I'll have to source a vintage Mac to play it one last time...
@@fermiLiquidDrinker It was extremely exciting. As I went through the door for the last time I was conscious I would never work in such an exciting workplace again. But I had family problems, would be in a serious financial mess if they made me redundant, and had a secure well paid job on offer.
I was involved with the electronics support team in the Aberdeen University Medical Physics department with BBC Micro in 1985. In 1989 I opened an Acorn StrongARM Risc PC dealership in a village called Fyvie in Aberdeenshire called "Tower Electronics" in the back of a Print shop. I was forced to make a living by selling custom build PCs based mostly on high-end AMD CPUs. As a dealer, I visited Acorn in Cambridge, which was terrific. Way ahead of the curve, I had a 2 slice Risc PC with an x86 CPU, a second CPU running Windows 3, and later Windows 95 inside the NATIVE ARM OS called RiscOS. Age 73 I have just finished building an AMD 7950X RTX 4090 GPU system and still enjoy computing.
Always loved the story of the first time they've measured the power consumption of the first CPU and it worked despite the fact it wasn't even connected to the power supply.
@@SchoolforHackers sadly, the power has to come from somewhere, and if you use parasitic power, it just means you add a diode drop in series with the associated additional power loss. It's better to supply power directly, rather than through a very convoluted path. Besides, if it ever happened that most lines on input into the processor were in state "low", it would cut power to the CPU. I can tell you from experience (with both ATMega and ARM) that this is quite a frustrating "randomly occurring" bug, and is the reason most new board bring up procedures in the industry start with verifying the power bus voltages.
I bought a $6k ARM single user licence. Hands down ARM has transformed the way I do things, especially over the PowerPC platforms I started with. We use ARM in safety critical devices.
Just to add, I also designed one of the two semi-custom chips in the BBC micro, the serial processor, which handled the cassette filing system interface. 🙂
I bought the Archimedes 305 in spring 1988. Amazing machine. After using it for about 8 years and expanding to 4 Mb and an ARM3, it currently sits in its original box, in the attic. It feels like an old friend that you stopped calling.
The point about testing in this video is _very_ important. The original ARM 1 & 2 are sufficiently simple that it was possible to show by formal methods that the processors work correctly.
15:52 AIUI Thumb mode was solely about code size reduction ( though may also have had speed advantages due to fewer memory fetches). Thumb instructions were 16 bit, not 8-bit, vs. 32 bit for ARM ones.Thumb instructions were translated to ARM ones before execution. so I don't think it had much impact on power, at least from the CPU - may have saved a little on memory access.
Thanks for pointing this out. I've written code for ARM chips and the have tried both (performance is basically the same.) On modern ARM processors I'm not sure this feature even has much of a reason to exist anymore (since memory used for data will usually greatly outpace memory used for code.)
@@cozy_kat Absolutely. Thumb came about when Nokia wanted a CPU, not Apple.The ARM7TDMI manual was published 2 1/2 years after the first Newton. I'm not sure when the cores themselves, or chips, were actually available. Also, aside from making it sound like "Fun" mode, almost every technical detail is wrong. Thumb instructions are 16 bits in size, not 8. They can address the whole address space. As they are decoded into regular ARM instructions you can't "turn off large parts of the CPU". Thumb instructions are faster only in that they still run at 1 per clock cycle if you cheap out and use 16 bit wide memory where the regular instructions take 2 cycles to load. Both run the same speed on 32 bit wide memory. ARM7TDMI systems often had a mix of memory widths, with for example ROM code (the OS/library) 32 bits wide, but RAM (where user-applications were loaded) 16 bits wide.
@aqualung2000 The L1i cache is still a valuable but limited resource. For some code thumb (and especially thumb-2) is actually performing better than arm. Control code would be the usual suspect. The introduction of low latency l2 caches has finally taken a bit of the pressure off the l1i.
@@ptefar indeed. L1 icache has stayed exactly the same size for 20 years, and only doubled once in 30 years, despite denser process nodes that can afford the silicon space for larger caches -- the problem is that larger is slower. Pentium MMX to Pentium 3 had 16 KB L1 icache. From 2003's Pentium M all the way to the current "12th gen" Alder lake performance cores L1 icache has stayed fixed at 32 KB. Alder Lake's slower efficiency cores finally have 64 KB. AMD used 64 KB L1 icache from Athlon in 1999 up to Zen and Zen+ (except for some experimentation in the "earthmover" cores), but Zen 2, 3, and 4 have actually reduced it to 32 KB. Note that AMD has often pursued a "lower clock rate, higher IPC" approach compared to Intel, and thus can afford slower but more effective caches. ARM with their lower clocked, higher efficiency, cores have also often had slightly larger icache than Intel, but also not growing with time. Apple M1&M2 are on a different planet :-)
I knew a lot of this already but I don't think anyone has done such a good job of assembling it all in such an easy-to-digest manner. Most informative, thank you!
I had the pleasure of working for a company owned mostly by ex Acorn employees. We had some early ARM2 prototypes to work with, which subsequently ended up being used in the animatronic control control systems for the Teenage Mutant Ninja Turtles film (amongst other projects that Jim Henson's Creature Shop (of Muppets fame) were involved with.) I miss those days!
It's fascinating to me that the ARM1 had roughly the same transistor count as the Intel 8086 from seven years earlier (Ken Shirriff recently counted actual transistors 19,618 on an example he's been reverse-engineering, although things like ROMs and programmable logic arrays that aid in decoding instructions and controlling the chip add many more, so there were 29,277 potential spaces for transistors on it -- high and low numbers pretty much exactly bracketing the reported 25,000 for the ARM1)
ARM processors have also hit it big in microcontrollers. For a long time those have been dominated by 8 bit microprocessors but the exceptional power efficiency of the ARM architecture has begun to displace them.
This is especially true in 3D printing. 8 bit microprocessors used to dominate consumer 3D printers but 32bit ARM is taking over and bringing along advanced features only made possible by the massive increase in processing power. One example is the Prusa Mk3 vs. the newer MK4 which share the same mechanics.
It had begun at least 10 to 20 years ago. ST, NXP, etc. This statement is best suited for 2010 but not 2024. At this moment all MCU cores are adopting RISC-V cores. I used to work on a project which has about 20+ CPU cores in a SoC. (Can’t say much but you probably have used our products) Initially all the small controller cores are Cortex-M3s. Beginning from around 2018 we started to swap them all out with RISC-V from SiFive. I’m no longer there but I think all those CM3s are gone by now. The AP cores are still from Arm but they have plans to replace that in the back of their hands. Another example: I just heard yesterday Renesas just released their new RISC-V SoCs replacing Arm’s. Getting rid of Arm, or at least having a backup plan is the new trend at this moment. One day it will come and the switch will be flipped, just like what Apple did to Intel. My current job is working a designing RISC-V CPUs. Even there we still have a lot of Arm’s standards like AMBA CHI etc., but we are gradually replacing them as well. Arm now is just as big or complex as Intel and they have become a dragon as well.
It's been a long time since I've been excited to see a channel I'm subbed to release a new video but here we are. I had to wait until things were quiet and private so I could enjoy this video allow to myself. Good stuff.
Despite Ms Wilson ending up at Broadcom and my somewhat bolshie FLOSS philosophy, she's definitely one of my personal heroes. Thanks so much for this, I'm glad such a well produced video about ARM's history has been done (my goodness I remember the 1W limit being such an incredible achievement by itself). Thanks!
I believed the other point when Acorn when looking for their next BBC (6502 )was also the cost of each uP. And yes with a stroke of luck, the end result of RISC is smaller number of transistors ,less silicon real estate, more chips per water = cheaper chips. A Great video down a very historical memory lane.
Loved this video. As always it’s fantastically well presented. I am very familiar with all the arm stories, this is the best I have seen that joins them all together. I really loved the insight about Sheffield house prices. Thank you.
the ARM1 had 25 thousand transistors. For CMOS a crude approximation is to divide the number of transistors by 4 (the number of transistors in a 2 input NAND gate), which would only be around 6250 gates. That allows more direct comparisons with current processors.
I had a A3010 and I loved it. I upgraded it to 2mb of ram to make it a A3020 and I got a Iomega 100mb ZIP Drive for it. I only ever needed 1 disk to hold every game and program I owned.
I loved my BBC Model B which I took to Riyadh in 1983. My most useful program was a Darts Scoring program that spoke your shotouts using the speech synthesis board. It revolutionised the Riyadh Winter Darts League.... 😉
That's fantastic, I love hearing the unsual use cases people found for these machines. 8 years ago I ran across someone using two A5000s for doing all the timing for runners for the sheffield 20k. He wrote all the software in BBC basic (initially on a model B), and was still running it all on Acron hardware. When I talked to him he was planning moving it over to an RPI running RiscOS.
This is a great and easy to learn video, thank you! I was always wondering how ARM pretty suddenly into the mainstream in the past 5 years from (somewhat) obscurity. I'm not that much of a computer buff, as all I knew when it came to processors were Intel & AMD made them. I thought it was a new architecture, but watching this video taught me that it was always there since the 80s, just hidden in plain sight. I did know about phones having ARM processors, but I didn't know they were in so many other products. Apple's shift from Intel to ARM was what really put them back on the map. I'm not an Apple fan, but I will say the performance and efficiency on the Apple Silicon macs compared their Intel models is very impressive. It has certainly piqued my interest in the history of this architecture and chip. Now to see if you can port Doom Eternal to a Beeb...
I use an ARM chromebook, and i have no complaints. Between the linux container and web based computing it can do anything i need it to and the battery lasts for ages.
I am honestly loving the videos these days now there's so little distracting stock video. You can tell a good story. Really glad you're focusing more on that now
Hi Simon! I worked with you at ANT in Cambridge for a couple of years in the early 2000s, then some 15 years later I was flabbergasted to see you again on stage at a beachside pavilion in Australia, acting in a melodrama a friend had written. Small world!
Hi indeed :-) I remember you well ! I enjoyed that play so much I took up acting as full time as I can manage ! I've got a podcast now, Tortytalks, on all good distributers - and Ambron Radio - when it works. Good to be in touch again :-) @@JonathanMaddox
I enjoy your videos. If I may make one request though. You use a lot of stock video loops, and most of the time that's not too much of a problem, but when the video loop is particularly eyecatching - such as the "chip with green sparkles coming out of it" animation, it gets very tedious and distracting to have to watch it over and over again. And you use that one continuously for ages, from about 11:15 all the way to 14:00. It was a blessed relief when you finally switched it off. Same goes for the "hyperspeed office" clip near the end, which you made heavy use of in the DEC video too. It's quite hectic and jarring to watch for more than a few seconds. Just something to watch out for in future. Your narration is great though and that's what keeps me coming back.
Excellent video, most interesting. I did some work with Newtons where I set them up for building system maintenance and inspection routines. The instructions for fitters/electricians were put onto the Newtons and readings etc were then input direct into the Newton and then downloaded to the mainframes. It saved a lot of time and waste and became the forerunner for better systems as newer hardware became available. They definitely brought something to the table.
Great video, well researched and presented, as a retro computing fan especially Acorn and Apple I've been waiting for a few years for someone to tell the story and you did a great job.
@@RetroBytesUK I think we both share the same obsession you've reviewed NeXT (I have 2). I also have a very rare machine a BeBox dual 133 (2x PPC 603e) running BeOS.
Thumb instructions are 16 bit wide, not 8 bit. The full arm instructions are 32 bit wide. Both use 32 bit registers and can address just a much memory. However, the number of (easily) accessed general purpose registers is halved in thumb mode. Code written in the two modes can interoperable freely.
Right on everything except registers being halved in Thumb mode. It's true that many Thumb instructions can only use r0-r7, but SP, LR, PC (r13, r14, r15) continue in their normal role so it's more like 8/13 (61.5%) or the registers freely available. Also, you can add, compare, and move between all 16 registers, so with just a little bit of planning you can keep things such as loop limits or frequently reused large constants in the high registers.
@Bruce Hoult It is a good tradeoff. Thumb-2 with its mix of 16 and 32 bit instructions removed a lot of the limitations from thumb. I guess the complexity for the branch predictor and instruction decode made arm go back to uniform instructions for A64 in aarch64.
@@ptefar Thumb2 and its code density absolutely made ARM what it is today. Dropping that feature for 64 bit was my biggest surprise when I read the ARMv8-A manual in 2012, and I firmly believe it is their biggest mistake. There is a very large gap between fixed length instructions and the instruction decode mess than is x86, and Thumb2 and RISC-V "C" are FAR towards the fixed length end in decode complexity while offering large benefits. Yes, there are 8-wide decode ARM ISA chips now (e.g. Apple) while x86 is stuck at a lot narrower. But there are 8-wide 2&4 byte length RISC-V coming quickly. MIPS have already formally announced availability of their core, and several others are following close behind, including open source CPU cores.
@Bruce Hoult The variable length decoders and multi-branch branch predictors become power hungry quickly. Especially if you try to keep the pipeline short for code with high branch entropy. On the other hand l1i is a valuable commodity, so it is a tough tradeoff.
Great video, thanks! But wait, at 16:00 it says that 1) the Thumb mode allows to address limited amount of memory, 2) has 8-bit instructions and 3) was one reason to use that CPU in the Newton. If I remember that correctly it's false on all 3 accounts. The Thumb is a 16-bit instruction set (reduced instead of the native 32-bit instruction set) that can address the same memory and appeared on the ARM7T (architecture v4T according to the ARM website), years after the Newton, which after checking, used the 610 (architecture v3 only).
Addicted to your videos now, so glad I found this channel, I have learnt so much and find your description of circuits very clear and easy to understand. Thank you sir
Great video. I suspect that one of the reasons why some of the details of the very early ARM days are so sketchy and inconsistent, apart from the distractions that you mentioned due to all the other stuff that was going on in Acorn at the time (Electron issues, attempts to conquer the US market), the extreme secrecy of the project within Acorn at the time meant that despite Acorn having probably over 300 staff by the end of 1984 (I was one of them) the existence of the project was only known to a very small group of people within the company and inclusion in that inner circle was strictly on a need-to-know basis so there really aren’t that many people in a position to have ever known all the details let alone remember them after all these years. It was quite impressive how during those early days most of the company employees had absolutely no idea at all that the ARM development project was happening right under their noses.
Brilliant work here, love that you didn't do what most people did and just skip over how they really became what they are today in the 90s and their legendary testing infrastructure. I also agree with your "Septic Peg" predictions (talk about obscure jokes / references), that they're going to play a major part in the datacentre market. Onwards and upwards it looks like, which I'm happy with.
Thank you for a very interesting & informative video, a lot of which I was unaware of! I used a BBC model B back in the 80's at night school for my 'O' level exam, these were exciting times indeed & my first intro to 'Basic' programming!!
Very enjoyable video, and I learned quite a few new things. Very good to see you emphasize the critical nature of test. I was involved with this sort of thing for many years and saw how chip bugs could make or break any given cpu. One obvious idea from your video is that there would seem to a great business opportunity for a (startup?) company to do RISC-V test suites, especially with the Balkanization risk inherent in the RISC-V spec.
This is a *very* thorough treatment of the subject. It's very rare for anyone to ever mention the test suite even in passing because clearly none of the other content creators have ever worked anywhere near the microprocessor industry.
I had an Acorn Electron as a kid, which suffered ULA instability problems until it got new innards to stop it crashing. I bought a BBC Master later in life that I still boot every now and again, plus several Archimedes A440's that are currently in the loft. I've always admired the tenacity of Acorn and how important ARM became- it's a shame it was sold to Softbank when this was such an asset to the UK economy, where profits now go elsewhere despite being based in the UK.
@@jyvben1520 ...which goes directly to HM Government. The real, more direct benefit to the UK's economy is from the salaries of those working at ARM, plus the secondary benefits of it using a British talent pool (fostering education etc).
I had an Electron with the Plus 1 joy stick, printer and ROM cartridge interface and Plus 3 floppy drive add on (Plus 3 cost me £200 in 1980 somthing 🙄)... loved it though... Defraging the drive was interesting as it used video ram as storage space.... and was it quick compared to tape....you bet your life it was. It still played Elite but at a lower res as the CPU ran at half the speed as its big brother the BBC. The best thing about the setup was how it all went together... Everything was BOLTED....no ZX80 problems here, they were solid in their connection...never had a ZX80 problem... lovely machine.
@@RetroBytesUK I was in my 20's then and working at GEC in Coventry England so was getting a decent wage.... My first printer I ever had was a Brother HR5, 30 charactors per second using a print cartridge (Black only) tape which would last maybe 10 A4 prints...pretty laughable by todays standards and that cost £150 ! 🙄😮back then but it did complete the setup ... The Electron with its add-ons looked really nice...I always wondered what the Plus 2 would have been, if it ever existed.
Those dudes at Acorn were really ahead of their time, i always loved the Acorn computers and still have my BBC micro running, its on the desk in front of me as i type this :)
I love Acorn, we had an A3000 in the early 90s at home alongside a Mega ST and I wouldn't have fallen in love with computers without growing up around those two machines!
Got to experience ARM for the first time with an Archimedes A310 on a YTS computer course in the 80s… I was an Amiga owner back then, but was amazed by the power of ARM. I was already well versed in BBC BASIC from being at school in the 80s, so set about seeing what I could do with BBC BASIC on the archimedes. One of the first things I wrote was a breakout/Arkanoid game… 256 color graphics and stereo sound FX that panned left and right depending on where your bat and the ball was. I was amazed at how fast it ran just using interpreted BASIC! If anyone comes across a copy of it ( I shared it with other Archimedes users) I’d love to get a copy of it again as I lost the original disks… it was called BallArcs!
I went to Sheffield Uni from 1988 to 1991 and love the city. If you live in the right bit you can walk half an hour one way and be in the middle of a big city, or walk half an hour the other way and be in the Peak District. Sheffield people are so friendly as well.
Around 2016-2017 I was applying for an internship at ARM. I was inclined to accept their offer and was very happy to be able to convince them of me, but 2 things made me refrain from that (which I sometimes regret): 1. The Brexit referendum (I am a German citizen) and 2. the horrendous living i.e. rent cost of Cambridge. So indeed rightly guess by RetroBytes :-)
I have a very similar story but it involves me, obviously, and Pixar. I was interview at Pixar to be on the IT team that supported Steve Jobs. But 2 things made me refrain from that, which I regret sometimes. 1. The cost of living in the San Francisco area. 2. Not quite the pay I was hoping for, (although it wasn’t bad, I was being a diva).
Enjoyed this video, thank you! The Centre for Computing History in Cambridge has a couple of BBC/Acorn prototypes made by Steve Furber himself. I might have even touched them :)
And to think that all that’s left of Acorn now is an antique shop. How the mighty are fallen! Nice to see that Alan Partridge has diversified in his content into the tech sector.
Not enough of RUclips espouses just how much of a beast the StrongARM processors were! We had a couple of RiscPC's in our lab with them and they ran absolute rings around anything. Sure, you don't have a particularly flash native TCP/IP stack but it's 1996 and who needs that?
Interesting point regarding test verification. When people talk about RISC V, I think they forget about the verification process which rules RISC V out of many critical applications, that and the tool suit and software libraries
In 1997 my dads friend who “bought everything” had grown tired of his Apple Newton and passed it on to me in exchange for some lawn care. I carried it to my senior year of high school in my giant-pocket skater jeans (Boss or Jnco, I forget which) and used it to type out essays for college entrance and scholarship submissions. It doesn’t hold up today, but for its day it was a pretty cool gadget to have when most teenagers didn’t even own their own graphing calculator, and many homes still didn’t have even one general computing device.
Pretty much all of the PDA computers of that era (until phased out by smartphones) ran some kind of ARM-based instruction set. They were interesting machines although my main complaints were that they were always proprietary in design and tedious to use. When obsolete they were pretty much useless unfortunately.
Doonesbury made a running gag out of the Apple Newton’s notorious handwriting recognition. “How about a working lunch? When are you free?” “Well, let me see ... how about today, sir?” “No can do. I’ve got a one-o’clock with Bdippl at Cafe Fwiblob.” “Hey, you got a digital assistant too, sir?”
@@garymarsh23 Nah, we can't forget to mention the SuperH processors (SH3) and the use of MIPS processors in that sector (f.e. Casio Cassiopeia). Back in the wild west days of mobile computing the landscape was more diverse than today (related to architecture and manufacturers). For more info see phonedb.
You can still purchase the W65C816, the 16-bit 6502 compatible processor (though not a slot-in replacement!). And yeah it’s not exactly the greatest. Because for some goddamn reason the data pins are SOMETIMES the highest 8 pins of the 24-bit address bus!
As a 6 year old our family moved to Cambridge in 1982. I started to learn that dad had been head-hunted to work at early-ish Acorn Computers... He was one of those invisible behind the scenes employees you mention early in your video, albeit he was pretty senior and rose up over the 80s/90s. He died last year (🙏) although over the years he became COO and actually created a spin off with Herman H, called EO (I think) that created a pre Apple Newton type PDA and was bought by Apple prior to launching, for their stylus/software tech. I could recount so many stories he told us! Eg Bill Gates visiting their office in Rose Crescent, Cambridge, a very long time ago trying to find an early home for his OS... "no thanks mate, we've got RISC OS (Archimedes?!), we all know how that turned out?! And many more about the actual presentation to the BBC, Italian dudes at Olivetti, Acorn US and their love for Jaguars (revenue Vs expenses!?) And many more. Johnny Ball (80/90 TV tech guy) used to visit our house and my sister featured as a photoshoot 'star' on some of the Electron boxes! When he left Acorn he got a framed BBC 16k mainboard, framed. With the inscription "Thanks Jim, for making it happen " god bless you Dad... I might not have been allowed a PC, Commodore (64 or Amiga!) But your family was never bored, despite all your travel! Only wish we'd kept those ARM options xxx
Great video. I think ARM chips compete with themselves. As you pointed out, companies licence them then add in their own tech. They share a common instruction set and some will share parts of the same silicon design but most are just custom processors (CPU or SOC). None of them designed to be compatible with another but all designed to work with a common operating system. Currently Android and iOS. Risc-V looks like the same idea minus silicon or the risk of a hostile takeover by a competitor. It may do for the server market what Linux did. Can't see Risc-V surplanting ARM in mobile devices anytime soon given the volume of devices and apps that currently use it.
I'm actually rooting for Arm desktop/single board computers . Some of the new chips are quite fine for general everyday computing. Web browsing, office software and casual games work very well on them.They're cheaper to buy, cheaper to run and take up less space. Besides that, they're just a lot of fun for the rest of us. What's not to like?
My A440 (With ARM3 upgrade) is one of my most beloved retro systems. (It had battery damaged and it took a LOT of work for me to get it going again. How does an A440 get bettery damage? Well, you leave in the original AA batteries and store it on its side, and the AAs melt and dribble down over the RAM.)
Thanks for explaining why I keep getting job posts sent to me for work at ARM in Sheffield! I had no idea they were so big there. Last I had heard, there were about 12 people there. Seems they have grown a wee bit! I think they have a GPU design facility there too?
ARM is proven, tested, mature and competitive right now. It's also open for everyone to license, but still a commercial product. RISC-V is free and open-source, and while still at a relatively early stage of development, it can grow huge in the future if the success of Linux can be replicated in the hardware market. Both may be the writing on the wall for the walled garden duopoly cartel that is x86. Which means more competition, which most likely means great things for the users.
There is a lot of potential in RISC-V, the difficulty for it taking off in the same way Linux did, is that I could modify the kernel at home compile, it then run it. With RISC-V I dont have a chip fab at home, so I'm limited to having to run it on an FPGA, which performs poorly compered to it ASIC implementation. Dont get me wrong I think open hardware is great, but we are limited by not being able to make our own ASICs
@@RetroBytesUK That is true, but I'm not sure how big of an impact hobbyists have these days. Open source software is marketed as a mob of enthusiasts pushing the industry for fun, and while there is some truth in it, the ugly part is that a lot - perhaps most - of the development of big FOSS projects like Linux is now done by paid experts employed by tech giants who have business incentive in adapting those projects to their needs. I did a tiny little bit of hobbyist kernel development myself, and when my patch landed I got asked by one of the maintainers who I work for. It's only anecdotal proof, but it looks like the commercial way is the norm these days. RISC-V is obviously not for any average hobbyist, but at the level of organizations who have the resources to work with silicon, the open source nature may very much play in favor. The biggest risk I see in RISC-V is that the talent who's pushing the project forward is mostly from developed western countries like US, UK, EU etc., but due to the free and open nature, those who will benefit from it the most are likely manufacturers from countries like China and Russia, who aren't exactly friendly towards those western countries. Contributing to an open project is helping everyone in the world, and that includes our enemies. Who need that help more than our allies, so proportionately we're helping them more. I still believe that freedom of knowledge is a value in itself and thus root for RISC-V, but I guess we still should be aware of these dangers.
There is also a huge aspect of software compatibility with even your competitors. The fact the linux kernel and gnu apps comes in supporting solution that you'll invest - are a unbelievable time and cost saving.
I was surprised when Softbank tried to sell off ARM; it seemed like a silly thing to do seeing as how they were actually profitable. You also didn't get into the kerfuffle over ARM China, and the big mess that caused (and is still causing) for everyone involved. How do you recon that's going to shake out?
Softbank paid an astoundingly high amount to buy ARM. They were presumably counting on a huge increase in cores made and/or on increasing royalty fees due to ARM's effective monopoly. RISC-V has rather ruined that for them and they may well lose money when they sell/IPO ARM.
I got visicalc as soon as it was launched and still have it though not run it for nearly 30 years. Sadly it arrived just after I had written a hire quote and inventory on Pet basic! At the time it was revolutionary and quickly became a solution for double entry accounting
Will have to see what happens with RISC-V in future as that may become a major CPU architecture, especially in datacentres. As for other platforms, still not sure.
There are already products that are price/performance-competitive with, say, the Raspberry π. Look for reviews of the latest VisionFive 2 board from StarFive, for example.
@@lawrencedoliveiro9104 VF2 is a huge step forward. I got my shipping notice as kickstarter backer #14 yesterday, so hopefully I can very soon compare it to my Pis, Odroids, and HiFive Unmatched. In most respects, the VF2 lies somewhere between a Pi 3 and Pi 4 in speed, but the lack of hardware SHA, AES, and SIMD will hold it back in some applications (and provide easy fodder for biased reviewers). Those "missing" instructions have already been added to the RISC-V ISA and will hopefully be present in next year's crop of boards. The VF2 GPU is about 4x as powerful as the Pi 4's GPU, so once proper software support is in place it should make a better desktop and media machine than the Pi 4, even if the CPU cores are a little slower.
The three ARM buildings on Fulbourn Road are behind the Cambridge Water building (which was modernised a bit later). I worked on all three ARM buildings. When I attended the first meeting for the first building with the ARM team including Hermann Hauser, it was in some temporary Portacabins. We didn’t realise just how large ARM were going to become.
Between the Newton, and the PSION Series 5 (which was ARM7 from Cirrus Login in 1997), were there any other well known portable deices based on ARM? As I recall, the Palm Pilot was based on 68k from motorola and the emerging Windows CE devices were on Intel. Of note: the PSION Series 3 from early 1990s were based on a NEC 80286 chip and ran for over 50 hours on 2 AA batteries (about 80 hours on Lithium AA batteries). (The Series 3a was also the source for Nokia building the okia 9000 in 1996 which was one of the first smartphones (PDA phone). When PSION rewrote its OS to go on ARM and make it 32bits, it had a view of licensing to phone companies. But as the PSION 5 came out, Microsoft threatened to enter the PDA market with Windows CE and PSION threw in the towel and spun off its new OS (EPOC32) into separate company called Symbian with a number of cellphone makers taking shares (Motocola and Nokia were the 2 main ones tI remember). This faltered and Nokia took it over and started o rewrite it to support open format (conrtrary to EPOC16 (later renmed SIBO), EPOC32 was highly proprietary with undocumented file formats and very weak SDKs to access the data from outside the device. Nokia never really did success with Symbian, despite it being a good OS ar the core, but it did set the path for use of ARM processors in anything portable.
Great history. Well done. It doesn't answer a question I have: What does all this success mean for England, or even the company who started it? Acorn doesn't exist anymore. How about Sophie and the original team? Are they still actively involved? With it being owned by SoftBank I would have though a lot of power has been taken from the hands of the Cambridge offices where this began. I'd also suspect that office probably doesn't have the same overarching involvement in the product anymore. How are profits distributed? I don't know but it seems that the bulk of the proceeds from the success fall into the hands of people implementing the design. How much of that flows back to Cambridge? Is it the case that the Cambridge office now is an HQ of a multinational company with power distributed widely and perhaps off shore?
The success of ARM, has helped to keep chip design as an industry to become successful in the UK, and has helped to keep a whole sector competitive. Sophie, now works for Broadcom, working on the firepath processor, she never actually left Acorn, the part of the business she worked for became Element14, which Boradcom bought. Steve Furber left the commercial world, and when over to Manchester University (in the 90s before Acorn was sold), and retired earlier this year. Softbanks ownership has not really had much of a practical effect on how ARM is run, there is maybe a bit more work done in the US now, but that trend started before Softbank bought them.
I actually designed a project around the StrongARM in 2002.. but that never came to fruition because the design engineer that Intel provided me to assist with the chipset integration went on to sell out my idea and designs to a competing company. How do I know? Because one of the presentations that other company made about their product contained a picture taken from my project proposal, and someone overlooked the watermark. Thanks, Intel.
Thanks. BTW the Thumb instruction (mostly 16 bit) mode addresses the same 32 bit address space as the original ARM.32 but can only address 8 of the 16 registers. It provides significantly greater code density than ARM.32. Thumb is the basis of the, still current production, ARM Cortex micro controllers. I noticed you sidestepped the Shirley complication.
I loved this - as someone who was there - Acorn employee 1983-85. And I was in the room when that first ARM chip ran. What seems to have been lost in the history was just how fast it was - we kept cranking up the clock speed and it just kept working. Probably the fastest single stream computer on the planet at that time in April 1985. Also thank you for the kind words about the Domesday Project. Undoubtedly the hairiest decision I have taken in a lifetime in Computing was to say we could build the software within 2 years and it would run on twin 6502 processors with (was it?) 48K of RAM.
I recall seeing the Domesday project demonstrated on tomorrow's world and Blue Peter. Imagine my excitement when, a few years later on I found myself working in a company set up by a group of ex BBC ITU people, and it had a fully working system set up, including a few spares of those weird custom laserdisc players. The project was even ported to Hypercard on the Mac there just to keep it alive as technology changed.
@@erratic100 Yes. The project was conceived to run on those 12 inch hybrid analogue/digital laser discs by Peter Armstrong, a BBC Producer. But the sketch he and and Andy Finney had of how it would work would have been too slow and unusable at scale. There was a key meeting with Philips in Eindhoven where I think the plan was to offer us CD ROMs as the storage technology, but I reckon Philips' CD boffins thought they couldn't deliver in time. So this wonderful system was, sadly, initially delivered on top on a technological cul-de-sac.
@@johntait8422 Peter and Andy were both at the company I worked for. They are both lovely people, and great to work with. I learned a lot. And also about the innards of those Laserdisc players - there was often one with its case of being fiddled with to keep it functioning. There was one that was always problematic. I was moving it from storage one day and dropped it, heavily, on my foot. It worked perfectly after that! :)
The company did end up doing a deal with Philips on an unrelated project using CDi. Another dead end technology, but it was fun to play with the range of dev kits that were sent.
Also, the main project on which I worked was authored in Macromedia Director and Hypercard. Due to that it sadly no longer works on any modern kit, and two of the three multi-disc copies I own have succumbed to CD rot. I've made a bit-perfect copy of the remaining working discs, and I'll have to source a vintage Mac to play it one last time...
@@fermiLiquidDrinker I second this - would love to hear more of John's stories.
@@fermiLiquidDrinker It was extremely exciting. As I went through the door for the last time I was conscious I would never work in such an exciting workplace again. But I had family problems, would be in a serious financial mess if they made me redundant, and had a secure well paid job on offer.
I was involved with the electronics support team in the Aberdeen University Medical Physics department with BBC Micro in 1985. In 1989 I opened an Acorn StrongARM Risc PC dealership in a village called Fyvie in Aberdeenshire called "Tower Electronics" in the back of a Print shop. I was forced to make a living by selling custom build PCs based mostly on high-end AMD CPUs. As a dealer, I visited Acorn in Cambridge, which was terrific. Way ahead of the curve, I had a 2 slice Risc PC with an x86 CPU, a second CPU running Windows 3, and later Windows 95 inside the NATIVE ARM OS called RiscOS. Age 73 I have just finished building an AMD 7950X RTX 4090 GPU system and still enjoy computing.
Yes, but will it run Crysis?
@@FlyingPhilUKno it will run arab invasion game in uk
Always loved the story of the first time they've measured the power consumption of the first CPU and it worked despite the fact it wasn't even connected to the power supply.
Parasitic draw
@@auzzierocks yes, we watched the video
Everyone watched it? Be nice pal.
They should have run with this idea. Imagine zero-power CPUs!
@@SchoolforHackers sadly, the power has to come from somewhere, and if you use parasitic power, it just means you add a diode drop in series with the associated additional power loss. It's better to supply power directly, rather than through a very convoluted path.
Besides, if it ever happened that most lines on input into the processor were in state "low", it would cut power to the CPU. I can tell you from experience (with both ATMega and ARM) that this is quite a frustrating "randomly occurring" bug, and is the reason most new board bring up procedures in the industry start with verifying the power bus voltages.
I bought a $6k ARM single user licence. Hands down ARM has transformed the way I do things, especially over the PowerPC platforms I started with. We use ARM in safety critical devices.
Just to add, I also designed one of the two semi-custom chips in the BBC micro, the serial processor, which handled the cassette filing system interface. 🙂
I did not know that, that's rather cool. Did you also work on the teletext adapter ?
I bought the Archimedes 305 in spring 1988. Amazing machine. After using it for about 8 years and expanding to 4 Mb and an ARM3, it currently sits in its original box, in the attic. It feels like an old friend that you stopped calling.
The point about testing in this video is _very_ important. The original ARM 1 & 2 are sufficiently simple that it was possible to show by formal methods that the processors work correctly.
15:52 AIUI Thumb mode was solely about code size reduction ( though may also have had speed advantages due to fewer memory fetches). Thumb instructions were 16 bit, not 8-bit, vs. 32 bit for ARM ones.Thumb instructions were translated to ARM ones before execution. so I don't think it had much impact on power, at least from the CPU - may have saved a little on memory access.
Thanks for pointing this out. I've written code for ARM chips and the have tried both (performance is basically the same.) On modern ARM processors I'm not sure this feature even has much of a reason to exist anymore (since memory used for data will usually greatly outpace memory used for code.)
Thumb and the Apple Newton also had absolutely no relation. Thumb came about years after the Newton
@@cozy_kat Absolutely. Thumb came about when Nokia wanted a CPU, not Apple.The ARM7TDMI manual was published 2 1/2 years after the first Newton. I'm not sure when the cores themselves, or chips, were actually available. Also, aside from making it sound like "Fun" mode, almost every technical detail is wrong. Thumb instructions are 16 bits in size, not 8. They can address the whole address space. As they are decoded into regular ARM instructions you can't "turn off large parts of the CPU". Thumb instructions are faster only in that they still run at 1 per clock cycle if you cheap out and use 16 bit wide memory where the regular instructions take 2 cycles to load. Both run the same speed on 32 bit wide memory. ARM7TDMI systems often had a mix of memory widths, with for example ROM code (the OS/library) 32 bits wide, but RAM (where user-applications were loaded) 16 bits wide.
@aqualung2000 The L1i cache is still a valuable but limited resource. For some code thumb (and especially thumb-2) is actually performing better than arm. Control code would be the usual suspect. The introduction of low latency l2 caches has finally taken a bit of the pressure off the l1i.
@@ptefar indeed. L1 icache has stayed exactly the same size for 20 years, and only doubled once in 30 years, despite denser process nodes that can afford the silicon space for larger caches -- the problem is that larger is slower. Pentium MMX to Pentium 3 had 16 KB L1 icache. From 2003's Pentium M all the way to the current "12th gen" Alder lake performance cores L1 icache has stayed fixed at 32 KB. Alder Lake's slower efficiency cores finally have 64 KB. AMD used 64 KB L1 icache from Athlon in 1999 up to Zen and Zen+ (except for some experimentation in the "earthmover" cores), but Zen 2, 3, and 4 have actually reduced it to 32 KB. Note that AMD has often pursued a "lower clock rate, higher IPC" approach compared to Intel, and thus can afford slower but more effective caches. ARM with their lower clocked, higher efficiency, cores have also often had slightly larger icache than Intel, but also not growing with time. Apple M1&M2 are on a different planet :-)
it seems like this is the peak of coincidence, when 2 youtubers made 2 long videos about 1 single topic at the same time.
Oh, who else has ?
@@RetroBytesUK LowSpecGamer
14:00 and this is accurate too, Alex from low spec gamer really did stop there
Hahah I was wondering if this was intended but clearly a coincidence!
@@JarrydHall Amazing !
I knew a lot of this already but I don't think anyone has done such a good job of assembling it all in such an easy-to-digest manner. Most informative, thank you!
Thanks David, nice of you to say.
And accurate!
I had the pleasure of working for a company owned mostly by ex Acorn employees. We had some early ARM2 prototypes to work with, which subsequently ended up being used in the animatronic control control systems for the Teenage Mutant Ninja Turtles film (amongst other projects that Jim Henson's Creature Shop (of Muppets fame) were involved with.) I miss those days!
That's a very cool use of an ARM2
Exciting to see RetroBytes knocking on the door of 50k subscribers! Well done!!
It's fascinating to me that the ARM1 had roughly the same transistor count as the Intel 8086 from seven years earlier (Ken Shirriff recently counted actual transistors 19,618 on an example he's been reverse-engineering, although things like ROMs and programmable logic arrays that aid in decoding instructions and controlling the chip add many more, so there were 29,277 potential spaces for transistors on it -- high and low numbers pretty much exactly bracketing the reported 25,000 for the ARM1)
All those transistors were probably needed to decode instructions.
ARM processors have also hit it big in microcontrollers. For a long time those have been dominated by 8 bit microprocessors but the exceptional power efficiency of the ARM architecture has begun to displace them.
PowerPC still dominant in automotive controllers though
This is especially true in 3D printing. 8 bit microprocessors used to dominate consumer 3D printers but 32bit ARM is taking over and bringing along advanced features only made possible by the massive increase in processing power.
One example is the Prusa Mk3 vs. the newer MK4 which share the same mechanics.
It had begun at least 10 to 20 years ago. ST, NXP, etc. This statement is best suited for 2010 but not 2024. At this moment all MCU cores are adopting RISC-V cores.
I used to work on a project which has about 20+ CPU cores in a SoC. (Can’t say much but you probably have used our products) Initially all the small controller cores are Cortex-M3s. Beginning from around 2018 we started to swap them all out with RISC-V from SiFive. I’m no longer there but I think all those CM3s are gone by now. The AP cores are still from Arm but they have plans to replace that in the back of their hands. Another example: I just heard yesterday Renesas just released their new RISC-V SoCs replacing Arm’s. Getting rid of Arm, or at least having a backup plan is the new trend at this moment. One day it will come and the switch will be flipped, just like what Apple did to Intel.
My current job is working a designing RISC-V CPUs. Even there we still have a lot of Arm’s standards like AMBA CHI etc., but we are gradually replacing them as well. Arm now is just as big or complex as Intel and they have become a dragon as well.
Thanks!
Got my first ARM based machine aged 15 and the platform has kept me in work for 35 years and counting. I still own that A440.
It's been a long time since I've been excited to see a channel I'm subbed to release a new video but here we are. I had to wait until things were quiet and private so I could enjoy this video allow to myself.
Good stuff.
Love the Black Adder reference!
My first computer was an Acorn Atom the model before BBC Acorn we had to assemble them ourselves (including soldering) - loved it
Despite Ms Wilson ending up at Broadcom and my somewhat bolshie FLOSS philosophy, she's definitely one of my personal heroes. Thanks so much for this, I'm glad such a well produced video about ARM's history has been done (my goodness I remember the 1W limit being such an incredible achievement by itself). Thanks!
Brilliant, multifaceted history lesson of digital development - so well organized and relatable.
I believed the other point when Acorn when looking for their next BBC (6502 )was also the cost of each uP.
And yes with a stroke of luck, the end result of RISC is smaller number of transistors ,less silicon real estate, more chips per water = cheaper chips.
A Great video down a very historical memory lane.
Loved this video. As always it’s fantastically well presented. I am very familiar with all the arm stories, this is the best I have seen that joins them all together. I really loved the insight about Sheffield house prices. Thank you.
the ARM1 had 25 thousand transistors. For CMOS a crude approximation is to divide the number of transistors by 4 (the number of transistors in a 2 input NAND gate), which would only be around 6250 gates. That allows more direct comparisons with current processors.
This video is an amazing piece of work. You've earned a new subscriber, keep it up!
I had a A3010 and I loved it. I upgraded it to 2mb of ram to make it a A3020 and I got a Iomega 100mb ZIP Drive for it. I only ever needed 1 disk to hold every game and program I owned.
I loved my BBC Model B which I took to Riyadh in 1983. My most useful program was a Darts Scoring program that spoke your shotouts using the speech synthesis board. It revolutionised the Riyadh Winter Darts League.... 😉
That's fantastic, I love hearing the unsual use cases people found for these machines. 8 years ago I ran across someone using two A5000s for doing all the timing for runners for the sheffield 20k. He wrote all the software in BBC basic (initially on a model B), and was still running it all on Acron hardware. When I talked to him he was planning moving it over to an RPI running RiscOS.
This is a great and easy to learn video, thank you!
I was always wondering how ARM pretty suddenly into the mainstream in the past 5 years from (somewhat) obscurity. I'm not that much of a computer buff, as all I knew when it came to processors were Intel & AMD made them. I thought it was a new architecture, but watching this video taught me that it was always there since the 80s, just hidden in plain sight. I did know about phones having ARM processors, but I didn't know they were in so many other products. Apple's shift from Intel to ARM was what really put them back on the map. I'm not an Apple fan, but I will say the performance and efficiency on the Apple Silicon macs compared their Intel models is very impressive. It has certainly piqued my interest in the history of this architecture and chip.
Now to see if you can port Doom Eternal to a Beeb...
Absolutely fantastic video. So well researched and narrated. Just brilliant.
I use an ARM chromebook, and i have no complaints. Between the linux container and web based computing it can do anything i need it to and the battery lasts for ages.
I am honestly loving the videos these days now there's so little distracting stock video. You can tell a good story. Really glad you're focusing more on that now
Yup, that's the story of my computing life. I worked at the BBC, then at ANT. You may have read my Archimedes World articles :-)
I may well have done.
Hi Simon! I worked with you at ANT in Cambridge for a couple of years in the early 2000s, then some 15 years later I was flabbergasted to see you again on stage at a beachside pavilion in Australia, acting in a melodrama a friend had written. Small world!
Hi indeed :-) I remember you well ! I enjoyed that play so much I took up acting as full time as I can manage ! I've got a podcast now, Tortytalks, on all good distributers - and Ambron Radio - when it works. Good to be in touch again :-) @@JonathanMaddox
Your presentation style is too good for a channel with 50 K subscribers. Meaning you have a very good chance of blowing up soon!
I enjoy your videos. If I may make one request though. You use a lot of stock video loops, and most of the time that's not too much of a problem, but when the video loop is particularly eyecatching - such as the "chip with green sparkles coming out of it" animation, it gets very tedious and distracting to have to watch it over and over again. And you use that one continuously for ages, from about 11:15 all the way to 14:00. It was a blessed relief when you finally switched it off. Same goes for the "hyperspeed office" clip near the end, which you made heavy use of in the DEC video too. It's quite hectic and jarring to watch for more than a few seconds. Just something to watch out for in future. Your narration is great though and that's what keeps me coming back.
I did just watch this video on a Mac Studio. Thanks ARM! 👍
Yeah! A new RetroBytes to shine up my weekend! Thanks
Man, this channel needs, and deserves, more subscribers! I am so glad this channel came across my home page about 2 years ago!
the prototype running on leakage current is just GOLD!
Excellent video, most interesting. I did some work with Newtons where I set them up for building system maintenance and inspection routines. The instructions for fitters/electricians were put onto the Newtons and readings etc were then input direct into the Newton and then downloaded to the mainframes. It saved a lot of time and waste and became the forerunner for better systems as newer hardware became available. They definitely brought something to the table.
Great video, well researched and presented, as a retro computing fan especially Acorn and Apple I've been waiting for a few years for someone to tell the story and you did a great job.
Thats very nice of you to say, you can probably tell I'm a bit of qn Acorn fan too.
@@RetroBytesUK I think we both share the same obsession you've reviewed NeXT (I have 2). I also have a very rare machine a BeBox dual 133 (2x PPC 603e) running BeOS.
@@limsolo I did always want to get a BeBox.
My goodness I love these RetroBytes videos. :)
Amazing video (as always)
Very kind of you.
Lovely video. Particularly nice to see Mike Holcombe featured - I used to work for him in the Verification and Testing group in Sheffield.
Are you still at DCS ?
Not for twenty years,@@RetroBytesUK! I run a research group in Copenhagen now.
Thumb instructions are 16 bit wide, not 8 bit. The full arm instructions are 32 bit wide. Both use 32 bit registers and can address just a much memory. However, the number of (easily) accessed general purpose registers is halved in thumb mode. Code written in the two modes can interoperable freely.
Right on everything except registers being halved in Thumb mode. It's true that many Thumb instructions can only use r0-r7, but SP, LR, PC (r13, r14, r15) continue in their normal role so it's more like 8/13 (61.5%) or the registers freely available. Also, you can add, compare, and move between all 16 registers, so with just a little bit of planning you can keep things such as loop limits or frequently reused large constants in the high registers.
@Bruce Hoult It is a good tradeoff. Thumb-2 with its mix of 16 and 32 bit instructions removed a lot of the limitations from thumb. I guess the complexity for the branch predictor and instruction decode made arm go back to uniform instructions for A64 in aarch64.
@@ptefar Thumb2 and its code density absolutely made ARM what it is today. Dropping that feature for 64 bit was my biggest surprise when I read the ARMv8-A manual in 2012, and I firmly believe it is their biggest mistake. There is a very large gap between fixed length instructions and the instruction decode mess than is x86, and Thumb2 and RISC-V "C" are FAR towards the fixed length end in decode complexity while offering large benefits. Yes, there are 8-wide decode ARM ISA chips now (e.g. Apple) while x86 is stuck at a lot narrower. But there are 8-wide 2&4 byte length RISC-V coming quickly. MIPS have already formally announced availability of their core, and several others are following close behind, including open source CPU cores.
@Bruce Hoult The variable length decoders and multi-branch branch predictors become power hungry quickly. Especially if you try to keep the pipeline short for code with high branch entropy. On the other hand l1i is a valuable commodity, so it is a tough tradeoff.
Great video, thanks! But wait, at 16:00 it says that 1) the Thumb mode allows to address limited amount of memory, 2) has 8-bit instructions and 3) was one reason to use that CPU in the Newton. If I remember that correctly it's false on all 3 accounts. The Thumb is a 16-bit instruction set (reduced instead of the native 32-bit instruction set) that can address the same memory and appeared on the ARM7T (architecture v4T according to the ARM website), years after the Newton, which after checking, used the 610 (architecture v3 only).
Addicted to your videos now, so glad I found this channel, I have learnt so much and find your description of circuits very clear and easy to understand. Thank you sir
With great processing power comes great processability.
Enjoyable video essay, thanks for sharing!
Great video. I suspect that one of the reasons why some of the details of the very early ARM days are so sketchy and inconsistent, apart from the distractions that you mentioned due to all the other stuff that was going on in Acorn at the time (Electron issues, attempts to conquer the US market), the extreme secrecy of the project within Acorn at the time meant that despite Acorn having probably over 300 staff by the end of 1984 (I was one of them) the existence of the project was only known to a very small group of people within the company and inclusion in that inner circle was strictly on a need-to-know basis so there really aren’t that many people in a position to have ever known all the details let alone remember them after all these years. It was quite impressive how during those early days most of the company employees had absolutely no idea at all that the ARM development project was happening right under their noses.
Brilliant work here, love that you didn't do what most people did and just skip over how they really became what they are today in the 90s and their legendary testing infrastructure. I also agree with your "Septic Peg" predictions (talk about obscure jokes / references), that they're going to play a major part in the datacentre market. Onwards and upwards it looks like, which I'm happy with.
Thank you for a very interesting & informative video, a lot of which I was unaware of! I used a BBC model B back in the 80's at night school for my 'O' level exam, these were exciting times indeed & my first intro to 'Basic' programming!!
What a great retrospective, you should be proud of this video
Very enjoyable video, and I learned quite a few new things.
Very good to see you emphasize the critical nature of test. I was involved with this sort of thing for many years and saw how chip bugs could make or break any given cpu.
One obvious idea from your video is that there would seem to a great business opportunity for a (startup?) company to do RISC-V test suites, especially with the Balkanization risk inherent in the RISC-V spec.
This is a *very* thorough treatment of the subject. It's very rare for anyone to ever mention the test suite even in passing because clearly none of the other content creators have ever worked anywhere near the microprocessor industry.
I had an Acorn Electron as a kid, which suffered ULA instability problems until it got new innards to stop it crashing. I bought a BBC Master later in life that I still boot every now and again, plus several Archimedes A440's that are currently in the loft. I've always admired the tenacity of Acorn and how important ARM became- it's a shame it was sold to Softbank when this was such an asset to the UK economy, where profits now go elsewhere despite being based in the UK.
still get the taxes of the people working there ...
@@jyvben1520 ...which goes directly to HM Government. The real, more direct benefit to the UK's economy is from the salaries of those working at ARM, plus the secondary benefits of it using a British talent pool (fostering education etc).
I had an Electron with the Plus 1 joy stick, printer and ROM cartridge interface and Plus 3 floppy drive add on (Plus 3 cost me £200 in 1980 somthing 🙄)... loved it though...
Defraging the drive was interesting as it used video ram as storage space.... and was it quick compared to tape....you bet your life it was.
It still played Elite but at a lower res as the CPU ran at half the speed as its big brother the BBC.
The best thing about the setup was how it all went together...
Everything was BOLTED....no ZX80 problems here, they were solid in their connection...never had a ZX80 problem... lovely machine.
@@michaelhawthorne8696 I only had the plus1 as a kid with my electron, I would have loved a plus3. Especially as it had adfs.
@@RetroBytesUK
I was in my 20's then and working at GEC in Coventry England so was getting a decent wage....
My first printer I ever had was a Brother HR5, 30 charactors per second using a print cartridge (Black only) tape which would last maybe 10 A4 prints...pretty laughable by todays standards and that cost £150 ! 🙄😮back then but it did complete the setup ...
The Electron with its add-ons looked really nice...I always wondered what the Plus 2 would have been, if it ever existed.
Those dudes at Acorn were really ahead of their time, i always loved the Acorn computers and still have my BBC micro running, its on the desk in front of me as i type this :)
I love Acorn, we had an A3000 in the early 90s at home alongside a Mega ST and I wouldn't have fallen in love with computers without growing up around those two machines!
Excellent presentation! I enjoyed this entirely.
Got to experience ARM for the first time with an Archimedes A310 on a YTS computer course in the 80s… I was an Amiga owner back then, but was amazed by the power of ARM. I was already well versed in BBC BASIC from being at school in the 80s, so set about seeing what I could do with BBC BASIC on the archimedes. One of the first things I wrote was a breakout/Arkanoid game… 256 color graphics and stereo sound FX that panned left and right depending on where your bat and the ball was. I was amazed at how fast it ran just using interpreted BASIC! If anyone comes across a copy of it ( I shared it with other Archimedes users) I’d love to get a copy of it again as I lost the original disks… it was called BallArcs!
Have long been a fan of Acorn and Chris Curry, but was unaware of `Micro Men`. Thanks was a quite enjoyable watch.
I went to Sheffield Uni from 1988 to 1991 and love the city. If you live in the right bit you can walk half an hour one way and be in the middle of a big city, or walk half an hour the other way and be in the Peak District. Sheffield people are so friendly as well.
Lived in S10 my whole life. Thanks for the nice review, thoroughly agree.
Around 2016-2017 I was applying for an internship at ARM. I was inclined to accept their offer and was very happy to be able to convince them of me, but 2 things made me refrain from that (which I sometimes regret): 1. The Brexit referendum (I am a German citizen) and 2. the horrendous living i.e. rent cost of Cambridge. So indeed rightly guess by RetroBytes :-)
I have a very similar story but it involves me, obviously, and Pixar. I was interview at Pixar to be on the IT team that supported Steve Jobs. But 2 things made me refrain from that, which I regret sometimes. 1. The cost of living in the San Francisco area. 2. Not quite the pay I was hoping for, (although it wasn’t bad, I was being a diva).
Enjoyed this video, thank you!
The Centre for Computing History in Cambridge has a couple of BBC/Acorn prototypes made by Steve Furber himself. I might have even touched them :)
And to think that all that’s left of Acorn now is an antique shop. How the mighty are fallen! Nice to see that Alan Partridge has diversified in his content into the tech sector.
Not enough of RUclips espouses just how much of a beast the StrongARM processors were! We had a couple of RiscPC's in our lab with them and they ran absolute rings around anything. Sure, you don't have a particularly flash native TCP/IP stack but it's 1996 and who needs that?
The StrongARM was the CPU used in the Newton MessagePad 2000. I was hoping Apple would put a camera in there also, but it didn't happen.
Interesting point regarding test verification. When people talk about RISC V, I think they forget about the verification process which rules RISC V out of many critical applications, that and the tool suit and software libraries
China will provide those in time. China are now all in on RISC-V
Wonderful video. Glad this was in my RUclips recommended.
In 1997 my dads friend who “bought everything” had grown tired of his Apple Newton and passed it on to me in exchange for some lawn care. I carried it to my senior year of high school in my giant-pocket skater jeans (Boss or Jnco, I forget which) and used it to type out essays for college entrance and scholarship submissions.
It doesn’t hold up today, but for its day it was a pretty cool gadget to have when most teenagers didn’t even own their own graphing calculator, and many homes still didn’t have even one general computing device.
Pretty much all of the PDA computers of that era (until phased out by smartphones) ran some kind of ARM-based instruction set. They were interesting machines although my main complaints were that they were always proprietary in design and tedious to use. When obsolete they were pretty much useless unfortunately.
Doonesbury made a running gag out of the Apple Newton’s notorious handwriting recognition.
“How about a working lunch? When are you free?”
“Well, let me see ... how about today, sir?”
“No can do. I’ve got a one-o’clock with Bdippl at Cafe Fwiblob.”
“Hey, you got a digital assistant too, sir?”
@@jimtekkit A bunch of the older Palm's ran Freescale Dragonball CPUs, but yeah, it didn't take long for ARM to take over.
@@jimtekkit Motorola's CPU32 core (type of m68k) was very popular, too. (aka "Coldfire")
@@garymarsh23 Nah, we can't forget to mention the SuperH processors (SH3) and the use of MIPS processors in that sector (f.e. Casio Cassiopeia).
Back in the wild west days of mobile computing the landscape was more diverse than today (related to architecture and manufacturers). For more info see phonedb.
You can still purchase the W65C816, the 16-bit 6502 compatible processor (though not a slot-in replacement!). And yeah it’s not exactly the greatest. Because for some goddamn reason the data pins are SOMETIMES the highest 8 pins of the 24-bit address bus!
As a 6 year old our family moved to Cambridge in 1982. I started to learn that dad had been head-hunted to work at early-ish Acorn Computers... He was one of those invisible behind the scenes employees you mention early in your video, albeit he was pretty senior and rose up over the 80s/90s. He died last year (🙏) although over the years he became COO and actually created a spin off with Herman H, called EO (I think) that created a pre Apple Newton type PDA and was bought by Apple prior to launching, for their stylus/software tech. I could recount so many stories he told us! Eg Bill Gates visiting their office in Rose Crescent, Cambridge, a very long time ago trying to find an early home for his OS... "no thanks mate, we've got RISC OS (Archimedes?!), we all know how that turned out?! And many more about the actual presentation to the BBC, Italian dudes at Olivetti, Acorn US and their love for Jaguars (revenue Vs expenses!?) And many more. Johnny Ball (80/90 TV tech guy) used to visit our house and my sister featured as a photoshoot 'star' on some of the Electron boxes! When he left Acorn he got a framed BBC 16k mainboard, framed. With the inscription "Thanks Jim, for making it happen " god bless you Dad... I might not have been allowed a PC, Commodore (64 or Amiga!) But your family was never bored, despite all your travel! Only wish we'd kept those ARM options xxx
Excellent stuff, well done.
Great video. I think ARM chips compete with themselves. As you pointed out, companies licence them then add in their own tech. They share a common instruction set and some will share parts of the same silicon design but most are just custom processors (CPU or SOC). None of them designed to be compatible with another but all designed to work with a common operating system. Currently Android and iOS.
Risc-V looks like the same idea minus silicon or the risk of a hostile takeover by a competitor. It may do for the server market what Linux did. Can't see Risc-V surplanting ARM in mobile devices anytime soon given the volume of devices and apps that currently use it.
I'm actually rooting for Arm desktop/single board computers . Some of the new chips are quite fine for general everyday computing. Web browsing, office software and casual games work very well on them.They're cheaper to buy, cheaper to run and take up less space. Besides that, they're just a lot of fun for the rest of us. What's not to like?
My A440 (With ARM3 upgrade) is one of my most beloved retro systems. (It had battery damaged and it took a LOT of work for me to get it going again. How does an A440 get bettery damage? Well, you leave in the original AA batteries and store it on its side, and the AAs melt and dribble down over the RAM.)
For the first few minutes I kept thinking you were saying Akon not Acorn! Great video
Great video, the good old days of 80’s computers and tech lol.
Thanks for explaining why I keep getting job posts sent to me for work at ARM in Sheffield! I had no idea they were so big there. Last I had heard, there were about 12 people there. Seems they have grown a wee bit! I think they have a GPU design facility there too?
Very good video, subscribed to your channel. Cheers
is that Minor Swing i hear in the background initially?
love the vid, excellently done.
It is indeed
ARM is proven, tested, mature and competitive right now. It's also open for everyone to license, but still a commercial product. RISC-V is free and open-source, and while still at a relatively early stage of development, it can grow huge in the future if the success of Linux can be replicated in the hardware market.
Both may be the writing on the wall for the walled garden duopoly cartel that is x86. Which means more competition, which most likely means great things for the users.
There is a lot of potential in RISC-V, the difficulty for it taking off in the same way Linux did, is that I could modify the kernel at home compile, it then run it. With RISC-V I dont have a chip fab at home, so I'm limited to having to run it on an FPGA, which performs poorly compered to it ASIC implementation. Dont get me wrong I think open hardware is great, but we are limited by not being able to make our own ASICs
@@RetroBytesUK That is true, but I'm not sure how big of an impact hobbyists have these days. Open source software is marketed as a mob of enthusiasts pushing the industry for fun, and while there is some truth in it, the ugly part is that a lot - perhaps most - of the development of big FOSS projects like Linux is now done by paid experts employed by tech giants who have business incentive in adapting those projects to their needs. I did a tiny little bit of hobbyist kernel development myself, and when my patch landed I got asked by one of the maintainers who I work for. It's only anecdotal proof, but it looks like the commercial way is the norm these days.
RISC-V is obviously not for any average hobbyist, but at the level of organizations who have the resources to work with silicon, the open source nature may very much play in favor.
The biggest risk I see in RISC-V is that the talent who's pushing the project forward is mostly from developed western countries like US, UK, EU etc., but due to the free and open nature, those who will benefit from it the most are likely manufacturers from countries like China and Russia, who aren't exactly friendly towards those western countries. Contributing to an open project is helping everyone in the world, and that includes our enemies. Who need that help more than our allies, so proportionately we're helping them more. I still believe that freedom of knowledge is a value in itself and thus root for RISC-V, but I guess we still should be aware of these dangers.
Everyone to license, except maybe Qualcomm.
Homies says to be fair like a true letterkennyian, great vid bro
There is also a huge aspect of software compatibility with even your competitors. The fact the linux kernel and gnu apps comes in supporting solution that you'll invest - are a unbelievable time and cost saving.
I was surprised when Softbank tried to sell off ARM; it seemed like a silly thing to do seeing as how they were actually profitable. You also didn't get into the kerfuffle over ARM China, and the big mess that caused (and is still causing) for everyone involved. How do you recon that's going to shake out?
Softbank paid an astoundingly high amount to buy ARM. They were presumably counting on a huge increase in cores made and/or on increasing royalty fees due to ARM's effective monopoly. RISC-V has rather ruined that for them and they may well lose money when they sell/IPO ARM.
Excellent video! Awesome!
I got visicalc as soon as it was launched and still have it though not run it for nearly 30 years. Sadly it arrived just after I had written a hire quote and inventory on Pet basic! At the time it was revolutionary and quickly became a solution for double entry accounting
Will have to see what happens with RISC-V in future as that may become a major CPU architecture, especially in datacentres. As for other platforms, still not sure.
There are already products that are price/performance-competitive with, say, the Raspberry π. Look for reviews of the latest VisionFive 2 board from StarFive, for example.
@@lawrencedoliveiro9104 VF2 is a huge step forward. I got my shipping notice as kickstarter backer #14 yesterday, so hopefully I can very soon compare it to my Pis, Odroids, and HiFive Unmatched. In most respects, the VF2 lies somewhere between a Pi 3 and Pi 4 in speed, but the lack of hardware SHA, AES, and SIMD will hold it back in some applications (and provide easy fodder for biased reviewers). Those "missing" instructions have already been added to the RISC-V ISA and will hopefully be present in next year's crop of boards. The VF2 GPU is about 4x as powerful as the Pi 4's GPU, so once proper software support is in place it should make a better desktop and media machine than the Pi 4, even if the CPU cores are a little slower.
@22:04 That's good, The Amiga had ye old GURU Meditation !
The facility in chyerryhinton was an older water processing plant on fulborne road near fulborne hospital, i lived less than a mile away from this!
The three ARM buildings on Fulbourn Road are behind the Cambridge Water building (which was modernised a bit later). I worked on all three ARM buildings. When I attended the first meeting for the first building with the ARM team including Hermann Hauser, it was in some temporary Portacabins. We didn’t realise just how large ARM were going to become.
A subject I know well, Expertly told. Great job!!
Thank you nice of you to say, I loved your Fisher Price controller video btw.
Love your videos dude, nice one 🙂
Between the Newton, and the PSION Series 5 (which was ARM7 from Cirrus Login in 1997), were there any other well known portable deices based on ARM? As I recall, the Palm Pilot was based on 68k from motorola and the emerging Windows CE devices were on Intel.
Of note: the PSION Series 3 from early 1990s were based on a NEC 80286 chip and ran for over 50 hours on 2 AA batteries (about 80 hours on Lithium AA batteries). (The Series 3a was also the source for Nokia building the okia 9000 in 1996 which was one of the first smartphones (PDA phone).
When PSION rewrote its OS to go on ARM and make it 32bits, it had a view of licensing to phone companies. But as the PSION 5 came out, Microsoft threatened to enter the PDA market with Windows CE and PSION threw in the towel and spun off its new OS (EPOC32) into separate company called Symbian with a number of cellphone makers taking shares (Motocola and Nokia were the 2 main ones tI remember). This faltered and Nokia took it over and started o rewrite it to support open format (conrtrary to EPOC16 (later renmed SIBO), EPOC32 was highly proprietary with undocumented file formats and very weak SDKs to access the data from outside the device.
Nokia never really did success with Symbian, despite it being a good OS ar the core, but it did set the path for use of ARM processors in anything portable.
en.wikipedia.org/wiki/List_of_products_using_ARM_processors
Great doc video on the convoluted history of ARM - a 'Great' British success story. Where does Apricot Computers fit into this?
10:40 lmao😄
This video is amazing.
Great history. Well done. It doesn't answer a question I have: What does all this success mean for England, or even the company who started it? Acorn doesn't exist anymore. How about Sophie and the original team? Are they still actively involved? With it being owned by SoftBank I would have though a lot of power has been taken from the hands of the Cambridge offices where this began. I'd also suspect that office probably doesn't have the same overarching involvement in the product anymore. How are profits distributed? I don't know but it seems that the bulk of the proceeds from the success fall into the hands of people implementing the design. How much of that flows back to Cambridge? Is it the case that the Cambridge office now is an HQ of a multinational company with power distributed widely and perhaps off shore?
The success of ARM, has helped to keep chip design as an industry to become successful in the UK, and has helped to keep a whole sector competitive.
Sophie, now works for Broadcom, working on the firepath processor, she never actually left Acorn, the part of the business she worked for became Element14, which Boradcom bought. Steve Furber left the commercial world, and when over to Manchester University (in the 90s before Acorn was sold), and retired earlier this year. Softbanks ownership has not really had much of a practical effect on how ARM is run, there is maybe a bit more work done in the US now, but that trend started before Softbank bought them.
Sophie Wilson also has some great 1st hand accounts of creating ARM on her channel.
She goes into some great details.
I actually designed a project around the StrongARM in 2002.. but that never came to fruition because the design engineer that Intel provided me to assist with the chipset integration went on to sell out my idea and designs to a competing company. How do I know? Because one of the presentations that other company made about their product contained a picture taken from my project proposal, and someone overlooked the watermark. Thanks, Intel.
That's awful, them just stealing it like that.
I enjoy this video thank you
Sheffield university was one of the first unis that opened a computer undergraduate course too
Thanks.
BTW the Thumb instruction (mostly 16 bit) mode addresses the same 32 bit address space as the original ARM.32 but can only address 8 of the 16 registers. It provides significantly greater code density than ARM.32. Thumb is the basis of the, still current production, ARM Cortex micro controllers.
I noticed you sidestepped the Shirley complication.
Is the music from jazz cafe guitar?