A History of The ARM Microprocessor | Dave Jaggar | Talks at Google

  • Published: 5 Sep 2024
  • Dave discusses the novel and inspiring career that led to the ARM architecture, which effectively powers the digital world and is the reason your phone is small and doesn't burn a hole in your pocket.
    In the late 1980s Acorn, a British one-hit-wonder computer company, faced imminent extinction, its only assets being 18 months of life-support rations and perhaps the worst microprocessor ever designed. What they did right was to hire Dave Jaggar, with the ink still wet on his master's thesis. Over eight years Dave systematically defined the entire ARM architecture, enabling it to become a popular embedded controller for the digital revolution, with around 100 billion units shipped. As ARM's Head of Architecture Design, Dave authored the ARM Architecture Reference Manual and is the Founding Director of the ARM Austin Design Center.
    Get the book here: goo.gle/2Xiheya.
    Moderated by Raymond Blum.

Comments • 51

  • @CraigTalbert
    @CraigTalbert 5 years ago +6

    This is fascinating. Would love more like this.

  • @NDakota79
    @NDakota79 3 years ago +4

    I would love to hear his comment on Apple's M1 chip and what they're planning with their silicon.

  • @ip2design
    @ip2design 3 years ago +1

    Minute 51 is of interest with a fair position on RISC-V.

  • @jecelassumpcaojr890
    @jecelassumpcaojr890 5 years ago +3

    His reply at around 56 minutes about what computers of the future will be like used the interesting analogy of male and female brains, but Alan Kay has another good way to describe it when he says that current computers are like clockwork and future computers will be more like biology.

    • @CkVega
      @CkVega 3 years ago

      Surprised they didn't throw him out then after what Google did to James Damore.

  • @icespittingfire
    @icespittingfire 4 years ago +8

    Rather harsh comments here! Like it or not, this fellow was responsible for the design of the ARM7, which was the chip that kick-started ARM's domination of the embedded space. His comments on and criticisms of the original ARM design are very interesting; of course, what he doesn't say is *why* they made these compromises and why they might have been good compromises in 1983 with a tiny design team and no money. The balance had completely changed by 1991, and the technical decisions he made (or helped to make) have certainly been proven by the test of time...

    • @RobBCactive
      @RobBCactive 2 years ago +1

      The details in a story can be uncomfortable; people prefer the ARM low-power CPU miracle story from the Archimedes days.
      Dave's presentation isn't charismatic or non-engineer friendly, and much of his criticism lacks the context and clear explanation needed for a general audience. In a 32-bit ISA, wasting opcode bits on conditional instructions is a luxury you can afford.

  • @kippie80
    @kippie80 3 years ago

    Doing some bare-metal coding on a Raspberry Pi now, this talk was illuminating on the question of 'why'. This one and the '92 presentation by ARM to Apple are the most insightful yet.

  • @erikengheim1106
    @erikengheim1106 2 years ago +1

    What am I misunderstanding: how can 16-bit instructions lead to 70% smaller programs relative to 32-bit instructions? With instructions half as big, shouldn't the programs be half as big? In other words, shouldn't the 16-bit programs be 50% of the size of the 32-bit programs? What am I misunderstanding here?
    Edit: It seems it is explained here: ruclips.net/video/_6sh097Dk5k/видео.html , where Dave talks about getting down to 2/3 of the ARM code size. So I assume what he really means is that 16-bit ARM instructions give programs which are 70% of the size of a 32-bit program. I think Dave's explanation then makes sense: he said changing the C compiler to output simpler instructions caused 40% code growth, so you get 140% as many instructions, but since each of these is half the size you end up with programs 70% of the original size.

    • @olivierdulac
      @olivierdulac 8 months ago

      A program using many complex 32-bit instructions may need more than double the number of 16-bit instructions to emulate it.
      For a simple example: fetching 32 bits of data with a single 32-bit instruction takes more than two 16-bit fetches on a 16-bit CPU, because you also need to move the registers pointing at the data or at the destination memory location.
      However, many simpler ("RISC") instructions will also often be faster to execute than larger, more complex ("CISC") instructions, as the CPU can be smaller and more energy-efficient, so the resulting larger (in instruction count) code can end up running faster.
      On the other hand, it may be more complicated than this, as different architectures may compensate for their bigger code size with complex prefetching, pipelining, and other sub-instruction-level optimizations that, depending on what kind of computing you do, may end up faster on the larger instruction set... it is all quite complex and evolving.
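The code-size arithmetic in the thread above can be sanity-checked with a quick back-of-envelope calculation. This is only a sketch of the figures quoted in the discussion (the ~40% instruction-count growth is the number attributed to the talk, not measured data):

```python
# Back-of-envelope check of the Thumb vs ARM code-size arithmetic.
arm_instruction_bytes = 4        # 32-bit ARM encoding
thumb_instruction_bytes = 2      # 16-bit Thumb encoding
instruction_count_growth = 1.40  # ~40% more instructions when compiling for Thumb

# Program size ratio = (relative instruction count) x (relative instruction size)
size_ratio = instruction_count_growth * (thumb_instruction_bytes / arm_instruction_bytes)
print(f"Thumb program size vs ARM: {size_ratio:.0%}")  # prints "Thumb program size vs ARM: 70%"
```

So 140% as many instructions at half the size gives programs 70% of the original size, matching the resolution of the question above.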

  • @maniralamkhan
    @maniralamkhan 1 year ago

    I would love to hear his comment on Apple's M1 chip and what they're planning with their silicon.

  • @anandbvs143
    @anandbvs143 1 year ago

    Excellent

  • @hir3npatel
    @hir3npatel 5 years ago +1

    Very interesting talk, thanks.

  • @rickyzhang
    @rickyzhang 3 years ago

    What a great talk. Where can I download the slides?

  • @chbrules
    @chbrules 3 years ago +6

    Recorded on a potato, eh?

  • @Archimedes75009
    @Archimedes75009 5 years ago +13

    'Perhaps the worst microprocessor ever designed'? Who dares state such a thing? The ARM crushed all other readily available CPUs at the time, at the same frequency. Big thumbs down.

    • @indahpratiwi4308
      @indahpratiwi4308 4 years ago +1

      I'm giving you a thumbs down

    • @ip2design
      @ip2design 3 years ago

      ARM7 was a major milestone, as it introduced a RISC-like, 32-bit architecture to low-power designs. There were few good alternatives at the time, and the ARM7TDMI with the Thumb extension was brilliant and could beat existing 8-bit proprietary architectures.

    • @bolshevikproductions
      @bolshevikproductions 3 years ago

      @@ip2design no not didn’t

    • @RobBCactive
      @RobBCactive 2 years ago

      The architecture was not designed for the long term with a market roadmap; Acorn over-optimised for the interleaved fast memory they had succeeded with in the BBC Micro, and for clever self-modifying code, which is cache-unfriendly.
      32-bit memory organisation did speed up bandwidth but was expensive; cacheline-sized transfers accelerate access without increasing memory costs.
      DRAM speed inherently increases more slowly than core speed due to signal length, so a clever system that is cache-unfriendly wasn't the "perfect processor for smartphones" touted around in Archimedes miracle stories.

  • @LiamProven
    @LiamProven 5 years ago +14

    The description is horribly biased, inaccurate, and loaded with disinformation. Acorn was not a one-hit wonder; the ARM was dazzling the entire computer industry at the time; it was already used in multiple machines and ran multiple OSes. It was the most efficient RISC CPU in existence, delivering more power per watt *and* more bang per buck than any other CPU in the world from any organization, from industry giant to elite research institution.
    Jaggar merely took this amazing, award-winning chip into cheap, resource-compromised devices rather than the high-end personal computers it was designed for.
    The talk and the speaker seem biased, with no appreciation of all that the company had achieved before he joined.

    • @patdbean
      @patdbean 4 years ago +2

      Acorn alone sold about 150k ARM2-based computers between 1987 and 1990, and another 300k ARM3/6/7/StrongARM machines between 1990 and 1997. The idea that "zero" ARMs shipped before 1995 is rubbish. As far as EU grants go, the UK always paid in more than it ever got out, so for every pound in grants ARM got from the EU, the UK taxpayer gave the EU over two pounds.

    • @SimonEatough
      @SimonEatough 2 years ago

      @@patdbean How is Brexit going then?

    • @patdbean
      @patdbean 2 years ago

      @@SimonEatough Brexit has gone fine; the worldwide shortages of everything from chips to natural gas (caused by the worldwide lockdowns) are a different issue. Just look at the blockages in US ports and the power cuts in China, or are they the fault of Brexit as well?

    • @RobBCactive
      @RobBCactive 2 years ago

      Acorn went bust! The BBC Micro sold far more.
      The Archimedes had a fast processor for the cost but didn't have the software for the market. Workstations used the 68020, then various high-performance RISC designs.
      Acorn had Acornsoft hack up an OS after the failure of ARX, when the workstation market needed portable, available software running on BSD UNIX.
      Having multiple OSes was a symptom of market failure, not a benefit.

    • @RobBCactive
      @RobBCactive 2 years ago

      @@patdbean UK pig and turkey farmers' experience says otherwise, as do the trade figures. 127 HGV driver applications lololol

  • @bolshevikproductions
    @bolshevikproductions 3 years ago +2

    Really bad sound quality. Was this filmed in the 1970s?

  • @terabyter9000
    @terabyter9000 3 years ago +1

    Advanced RISC Machines? Isn't it Acorn RISC Machines?

    • @jeffreyjoshuarollin9554
      @jeffreyjoshuarollin9554 3 years ago +2

      It started out as Acorn RISC Machines. Apple wanted to buy it for the Newton, but weren't pleased about it being made effectively by a competitor. Fortunately, Acorn had already been thinking of spinning out the chip design division, because they couldn't afford to make chips just for the small market they had (basically, computers for the education market in the UK and a few other Commonwealth countries, plus enthusiasts). So they spun it out as Advanced RISC Machines (ARM) Ltd. And now it's just Arm, which tbf is how it was always pronounced.

  • @7alken
    @7alken 1 year ago

    tnx, python is evil;