Historic ARM Presentation to Apple Computer - 1992

  • Published: 22 Oct 2024

Comments • 81

  • @ag3ntorange164
    @ag3ntorange164 3 years ago +18

    The piece of British history Silicon Valley likes to pretend didn't happen. No Acorn Computers, No iPhone. They are THAT important.

  • @thebiker5242
    @thebiker5242 3 years ago +8

    Gone from 359,000 transistors in the ARM610 to 16 billion in the M1, 29 years later. Always designed to be code-dense, clean and simple. Thanks for posting

  • @bprud6443
    @bprud6443 3 years ago +27

    54:28 "...if you want to ask me when do I think my radio-telephone or my little organizer would need 64 bits, I hope it's not for a very long time..."
    Wow, in 1992... These guys were a long way ahead.

    • @zacharylarson1245
      @zacharylarson1245 3 years ago +11

      The answer to that question... not for about another two decades. Apple would switch their "radio-telephone" running an ARM processor to a 64-bit architecture with the release of the A7 chip in 2013! :)

    • @RWL2012
      @RWL2012 3 years ago +4

      @@zacharylarson1245 haha yes the iPhone 5S, took 21 years

    • @patdbean
      @patdbean 2 years ago +1

      If I remember right, the Cortex-A7/A15 (and I think the A9) had extended 40-bit addressing a few years before full 64-bit. ARMv8 cores started coming out in 2013.

  • @dvdchas
    @dvdchas 6 years ago +19

    Such an amazing video. Through my childhood I was an owner of a fully upgraded BBC B Model 7, Archimedes 310 and RISC PC 600, slowly upgraded to the StrongARM. Listening to the speakers makes me realise why Acorn could technically compete with the big boys. Here I am watching and typing on my ARM-powered LG G4!
    THANKS GUYS

  • @allan.n.7227
    @allan.n.7227 6 years ago +27

    WOW.. what an interesting piece of IT history.. thanks for uploading!

  • @ZakKohler
    @ZakKohler 6 years ago +26

    What an insight into the strategy of ARM back in the day! Seems like following behind competitors and trying to go for as low power as possible turned out pretty well for them.

    • @6581punk
      @6581punk 3 years ago

      They were making an efficient processor. CISC was seen as too complex, and the solution was to have smaller instructions that were efficient and use those to build up the bigger ones. CISC has a minor advantage in binary size compared to RISC, of course, but a few decent compile-time or dynamic libraries would at least get the usual maths and other functions into a reusable form.
      When the ARM processor was tested for the first time they forgot to connect some of the power lines, and the chip still worked: it was running off the power to some of the address and data lines :)

  • @vijinho
    @vijinho 6 years ago +7

    Very important history, thanks for this. 10:00 hits the nail on the head as to why ARM has become so widespread.

  • @SecretTeaDrinker
    @SecretTeaDrinker 4 years ago +4

    Fascinating presentation. Thanks for making this available.

  • @izools
    @izools 6 years ago +9

    Love how the pair of them managed to say "Clock" every *single* time without fail, amidst all their fantastic humor.
    Their jovial demeanor and tongue-in-cheek presentation style is just so typical of the early pioneers: those who created ARM, those who created Paula, Agnus, and Denise, those who created Lisa.
    These guys had soul. You don't see that in computer engineering these days.

    • @Archimedes75009
      @Archimedes75009 6 years ago +2

      Except the C= people were amateurs who weren't serious, while the Acorn people were true chip makers with a vision. No surprise C= left no legacy, whereas Acorn's ARM technological legacy is present everywhere today.

    • @6581punk
      @6581punk 3 years ago

      Commodore left a huge legacy as did Sinclair. Many well known people got their start on those computers. The Amiga was way ahead of the PC, autoconfig in the 1980s (plug and play), pre-emptive multitasking in the Amiga kernel, even RiscOS didn't do that.

  • @antoniomemo636
    @antoniomemo636 3 years ago +4

    Love the amused and humorous asides interspersed through the retelling: "we'd be bought by a company called Olivetti and several Italians in sharp suits would come over to Acorn and..."

  • @gordonm2821
    @gordonm2821 3 years ago +3

    I find this video fascinating, along with the more up-to-date Mike Muller one on this channel. I learnt Z80 on the ZX Spectrum and can program in C, but never understood RISC like some friends did on the Archimedes. Little did I know RISC and ARM would become so big

  • @MDarkus3
    @MDarkus3 3 years ago +6

    It is interesting to see that the basic concepts of the ARM microarchitecture have not really evolved in 30 years. We are still using the same fundamentals. What a treat this video is for a young embedded engineer like me. I wish I could one day design a complex core architecture like ARM does.

    • @hrissan
      @hrissan 1 year ago +1

      The instruction set and architecture changed radically with ARMv8: the PC is no longer r15 and requires special instructions to access, there are very few conditional instructions, opcodes are a stream of 16- and 32-bit instructions, there is a hardware stack pointer with special instructions to use it, there is no more coprocessor interface, and there are different synchronization primitives.

  • @ttrjw
    @ttrjw 3 years ago +40

    Got an M1 Mac? It all started here.

    • @6581punk
      @6581punk 3 years ago

      The only thing Apple uses is the instruction set. They make their own cores.

    • @rabidbigdog
      @rabidbigdog 3 years ago +2

      Haha, wot? Newton nearly killed off both companies.

    • @ttrjw
      @ttrjw 3 years ago +3

      @@6581punk And where do you think the instruction set came from? Apple licenses ARM IP.

    • @0x1EGEN
      @0x1EGEN 1 month ago

      @@6581punk That's true of every ARM-based processor. ARM doesn't make chips, but they provide the HDL source to the companies that license them. Without that source, it would be very difficult for Apple to design a chip from the ground up.

  • @patdbean
    @patdbean 5 years ago +8

    54:00 when will we go 64-bit? Answer: 2012/13, with the Cortex-A53/A57 😀 In fact, if anything the movement in the mid-90s was down to 16-bit instructions, with the Thumb architecture in the ARM7TDMI and ARM9TDMI, and it was those processors that got ARM's volumes up in the mid-to-late 90s. 20/20 hindsight 😀

  • @Dan-TechAndMusic
    @Dan-TechAndMusic 4 years ago +6

    And now Apple's gone full circle :) Sculley-era Apple's choice of the ARM processor for the Newton really was one of the most historic steps for Apple, even if the Newton itself was deemed a failure.

    • @madmanincommand2968
      @madmanincommand2968 3 years ago +3

      The Newton's replacement (in the iPhone) was already well into development when the Newton was killed.

    • @6581punk
      @6581punk 3 years ago +2

      Newton was a huge mistake. Apple was an investor in General Magic, who were developing the mobile internet handheld idea in the late 80s and early 1990s. Sculley was privy to all that was going on at General Magic, and by taking some of their ideas and making a badly implemented Apple version he completely killed off interest in General Magic. That probably set the industry back years. Steve Jobs had seen General Magic's presentations and ideas at the time too, hence the iPhone.

  • @luisgonzalez1637
    @luisgonzalez1637 6 years ago +5

    I finally understand what’s going on after taking computer architecture

  • @richardpurves
    @richardpurves 3 years ago +3

    12:31 RIP Al Thomas, who I understand passed away not long after this video was originally recorded.

  • @lancelotxavier9084
    @lancelotxavier9084 3 years ago +4

    The contribution of marketing?
    "Let's name the next chip 6 from 3."
    Some people go to college for 4 years for that.

  • @kippie80
    @kippie80 3 years ago +2

    This is a good place to start with getting into ARM, Archimedes, Raspberry Pi, iPhone, Mx, etc ..

  • @RayR
    @RayR 3 years ago +1

    Love these talks. Thanks for uploading.

  • @antoniostorcke
    @antoniostorcke 3 years ago +2

    Sales is so very important.

  • @Harbourmaster68
    @Harbourmaster68 3 years ago +3

    The presentation worked!

  • @douro20
    @douro20 1 month ago

    A year after this they came out with the ARM7, which added something that made code density even better by making coding a bit harder: Thumb. But then Thumb was part of what made the Game Boy Advance possible.

  • @patricklepoutre
    @patricklepoutre 3 years ago +1

    What a piece of history!

  • @tarocalypse
    @tarocalypse 3 years ago +5

    I recall the murmurings at Apple developer / user events back in the 80's about the possibilities of RISC chips. It could have been a very different world if Jobs hadn't decided to kill off the original Apples to pursue the Motorola Mac. The Apple /// didn't help either. The world might also have been very different if the BBC Micro had been able to break out of the UK.

  • @danielkrajnik3817
    @danielkrajnik3817 4 years ago +5

    if AI resolution enhancers are so good please let's run one on this video

    • @6581punk
      @6581punk 3 years ago +2

      They are good. The lost Steve Jobs interview was lifted from a badly damaged VHS tape, and someone in the intelligence services ran it through their system to restore it. The system they used was typically used for cleaning up drone footage.

  • @j2simpso
    @j2simpso 3 years ago +3

    I frankly don't see this ARM thing taking off on an Apple computer.

  • @jkdsteve
    @jkdsteve 3 years ago +2

    Amazing technology for its time and clear why it's stood the test of time...

  • @wayando
    @wayando 2 years ago +4

    They won in the end ...

  • @Loganberrybunny
    @Loganberrybunny 6 years ago +3

    3:25 "Now I know some of the stories, but I don't think I should tell them." Curses!

    • @TheCentreforComputingHistory
      @TheCentreforComputingHistory  6 years ago +4

      Loganberrybunny See our interview with Mike Muller : ruclips.net/video/ljbdhICqETE/видео.html

    • @Loganberrybunny
      @Loganberrybunny 6 years ago +2

      @@TheCentreforComputingHistory Fantastic. Thanks!

  • @fiveminuteman
    @fiveminuteman 3 years ago +3

    180 billion ARM chips have been produced as of 2021. I'm guessing you are reading this message on a device that has one.😀👍🇬🇧🇬🇧🇬🇧🇬🇧🇬🇧#UKTECHNOLOGY

  • @doctordothraki4378
    @doctordothraki4378 3 years ago +2

    You have Apple Silicon. Before that, you had the iPhone. Before that, you had this.

  • @JonMasters
    @JonMasters 3 years ago +1

    ❤️ Mike

  • @Archimedes75009
    @Archimedes75009 6 years ago +6

    Awesome! How many more have you got hidden in your historical treasure vault?

    • @TheCentreforComputingHistory
      @TheCentreforComputingHistory  6 years ago +6

      A few ... Subscribe to be notified ... there's another one coming ;)

    • @Archimedes75009
      @Archimedes75009 6 years ago +2

      @@TheCentreforComputingHistory Hurrah! From a long-time subscriber ;-)

  • @kevinphillips6333
    @kevinphillips6333 1 year ago +2

    If you're wondering why Acorn Computers failed so badly there's a whole list of reasons. Thankfully ARM spun off before Acorn collapsed. As an Acorn Registered Developer (!) I don't ever remember being invited to one of these kinds of talks. Though I did get to briefly meet with Sophie Wilson and others in Cambridge, it seems Apple insiders had a lot more insight than we did! I'd figure out code overhead by adding up instructions, trying to avoid multiplying or dividing by anything other than a power of two. Those barrel shifters were a genius move. As he said, the ability to store blocks quickly using registers made a lot of sense compared to DMA or custom silicon (Amiga blitter chip). The simplest solutions often win in the long run :)

    • @hrissan
      @hrissan 1 year ago +1

      It is interesting how they played sound on these ARM machines. I remember programming the DMA, hooked up to transfer bytes to a SoundBlaster, which would generate an interrupt every time it passed half the memory block so my code could fill the other half.

    • @kevinphillips6333
      @kevinphillips6333 1 year ago

      @@hrissan Even more interesting, the Archimedes (and onwards) had 8-channel (pannable) 8-bit sound, logarithmically scaled. It sounded pretty good compared to other machines of the time. Companies (including the one I worked for) improved on that with 16-bit stereo, using high-end Analog Devices and Crystal Audio codecs and synth cards from Yamaha, Ensoniq and Kurzweil. :)

    • @hrissan
      @hrissan 1 year ago +1

      @@kevinphillips6333 The principle must be the same: the sound card issues an interrupt when it starts running out of samples, then the interrupt handler transfers a new chunk of data (samples) to the sound card. If we use a bulk load/store with enough registers (for example 12), then in theory we can transfer 12 consecutive words (one 16-bit stereo sample per word) in roughly 12 clocks, even from/to DRAM (plus a few clocks for address select). Sounds quite efficient. Programming close to the hardware is such fun! 😸

  • @rick57hart
    @rick57hart 1 year ago +1

    39:27 Is he calling the DEC Alpha "toasters"? 😅

    • @CommandLineCowboy
      @CommandLineCowboy 1 month ago

      At the time (1993?) the Alphas had the highest clock speeds, usually twice that of Intel's fastest 486s. So comparatively warm.

  • @JonMasters
    @JonMasters 3 years ago +2

    This is where Apple’s love of fiq began too 🤣

  • @robprupe
    @robprupe 6 years ago +2

    I'm guessing they were referring to the Newton as the product to use their chips.

  • @djtomoy
    @djtomoy 1 month ago

    it’s just a sales pitch, i’m sure the next guy to come in was trying to sell them fancy toilet seats for the office and the next guy was selling pens!!!

  • @SaiyanGokuGohan
    @SaiyanGokuGohan 1 year ago

    Now it's Brainchip's turn to move everything to the "far edge"

  • @mossby23
    @mossby23 6 years ago +4

    21:36 When the speaker refers to the slide of instructions and says "they're color coded here but you can program in black and white" and everyone laughs -- what's the joke? Are they laughing at the idea of color coding syntax or coding in black and white? Not sure what the status of color coding was in '92.

    • @Conenion
      @Conenion 6 years ago +6

      I /assume/ this refers to Apple's adherence to monochrome displays up to around 1992/93.
      en.wikipedia.org/wiki/Apple_displays
      The Xerox Alto and Star systems (where Steve Jobs got the GUI ideas from) had monochrome displays, since laser printers could only do b&w back then. Plus, it used less of the then-costly video buffer RAM.

    • @mossby23
      @mossby23 6 years ago +1

      Ah, that was my next assumption: it was more of a playful jab at a room full of Apple folks. Thanks!

    • @BruceHoult
      @BruceHoult 3 years ago +1

      @@Conenion The Mac II offered an 8-bit 640x480 colour display, a matching video card and Color QuickDraw at introduction in 1987.

    • @Conenion
      @Conenion 3 years ago +2

      @@BruceHoult
      That is true, but to the best of my knowledge most people in the Mac world used monochrome displays till around 1989/1990 (so the years in my posting above are probably too late). Apple released new monochrome display models as late as 1990.
      Also, AFAIK the Mac II wasn't really successful (way too expensive). With the LC family, color became affordable for the masses. That was 1990.

    • @BruceHoult
      @BruceHoult 3 years ago +1

      @@Conenion I would say the Mac II series was very successful! I knew so many people who had them. I remember sometime around mid to late 1987 getting together with others in Wellington at someone's home and pulling the video cards out of the machines and plugging six into one Mac. There was a variety of sizes and shapes of monitors and a mix of black and white, grayscale, and colour. It all worked perfectly! The Mac II I took to that meeting was owned by my employer but many of the others were privately owned. I later bought a IIcx myself. Only the IIfx was ludicrously expensive for the time -- otherwise prices were comparable to name brand PC/AT or 386 machines. The Mac IIsi and the LC were dogs, cut down to the point of unsuitability. The LC 3 was the first decent machine in that series.

  • @Aresydatch
    @Aresydatch 3 years ago

    Aged like Cheese

  • @josecarlosxyz
    @josecarlosxyz 4 years ago

    Wtf arm?

  • @avaughan585
    @avaughan585 2 years ago +1

    Acorn a UK version of Apple 😂😂😂 It's really quaint how they saw it like that all those years ago! Little did they know what Apple would become. The audio clipping on the video is absolutely atrocious. Is that a flaw in the digital conversion for YouTube, or is it on the original recording?

    • @StuartQuinn
      @StuartQuinn 2 years ago

      The comparison was accurate though - they were a computer manufacturer and platform vendor, it was essentially the same business model. Obviously, Apple was somewhat more successful!

  • @davidhart1674
    @davidhart1674 4 years ago +1

    An important chunk of history and very informative. A bit cringe, though, when Sophie Wilson, inventor and designer of the ARM instruction set, is referred to as a 'guy' and later a 'man'.