The problem with semi-automated systems - Bryan Cantrill

  • Published: Feb 6, 2025
  • #softwareautomation #softwaredevelopement #softwaredevelopmenttips #softwaredevelopmentprocess #softwaredevelopmentlifecycle
    In this video, Bryan Cantrill speaks on the perils of semi-automated systems and how easily they can lead to disaster. He recounts the Air Canada incident and relates it to the software outages all too familiar to developers. Thoughts?
    Sources:
    • Debugging Under Fire: ...

Comments • 8

  • @xealit
    @xealit 12 days ago +6

    To load a tank of fuel and check how much you've loaded not by an independent external metric, but by the same figure you yourself entered: that's wild. In any system, in any control loop, the success of an input action must be checked by its observed outcome in the system, not by the fact that it was performed.

    • @techmage89
      @techmage89 11 days ago +1

      My understanding is that there was supposed to be a sensor to check, but it was broken, so they took a manual measurement, and then messed up the calculation for the manual measurement. Really the issue was that no one had the necessary experience to handle the manual backup procedure when the automation failed.
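    The verification principle this thread describes (confirm an action by an independent measurement of its outcome, never by echoing back the value you entered) can be sketched as follows. The `Tank` class, `load_fuel` function, and gauge callbacks are hypothetical illustrations, not anything from the talk:

    ```python
    class Tank:
        """Hypothetical fuel tank whose loading step may silently go wrong."""
        def __init__(self):
            self.kg = 0.0

        def command_load(self, kg):
            # In a real system this step can fail, drift, or use wrong units.
            self.kg = kg

    def load_fuel(tank, requested_kg, read_gauge, tolerance=0.01):
        """Command a load, then confirm it via an independent gauge reading."""
        tank.command_load(requested_kg)   # the input action
        measured = read_gauge(tank)       # independent check of the outcome
        if abs(measured - requested_kg) > tolerance * requested_kg:
            raise RuntimeError(
                f"fuel mismatch: requested {requested_kg} kg, "
                f"gauge reads {measured} kg"
            )
        return measured
    ```

    With a healthy gauge the load is confirmed; a gauge miscalibrated into the wrong units makes the discrepancy fail loudly instead of passing silently, which is exactly what checking "the action was performed" cannot do.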

  • @IlijaStuden
    @IlijaStuden 12 days ago +6

    Looking forward to finding the rest of the talk. Fun anecdote, but it doesn't prove the point of the talk (yet).

    • @xealit
      @xealit 12 days ago +1

      Indeed, from what’s in the clip, Bryan exaggerates the point. He probably means that “full automation” is a system with a provable, verifiable, testable contract for how it works. But semi-automation doesn’t necessarily mean blind spots. You just need processes for your teams. And blaming “human error” all the time is lame: those people are what makes your system run and recover, whether it’s semi- or fully automated.

    • @AlexJordan
      @AlexJordan 11 days ago

      The link is in the description fyi

  • @Exilum
    @Exilum 11 days ago +1

    I like where it goes, in a way, since it fits my worldview, but I can't take it at face value from just this clip and still pretend to be even a tad objective.

  • @calebghormley2322
    @calebghormley2322 10 days ago

    "Human falibilty", assumes other systems like full automation, or say AI, are somehow infallible. Even fully automated systems will experience causal drift and error.

  • @chudchadanstud
    @chudchadanstud 11 days ago

    oookay? How about you talk about code rather than planes and skiing? Your point was not clear.