Autonomous weapons should be banned | Max Tegmark and Lex Fridman

  • Published: 4 Oct 2024

Comments • 77

  • @failfection · 3 years ago · +18

    I highly doubt that we will ever be able to ban technology from those who want it. The bioweapons ban is definitely not the reason we don't kill each other with bioweapons. I'm super surprised by that assumption.

    • @taylor92493 · 3 years ago

      We do kill each other with bioweapons, though. But what's the reason we mostly don't?

    • @failfection · 3 years ago

      Yeah true; I was trying to say that a ban will not stop anyone intent on killing people or going to war.

  • @jammerman28 · 3 years ago · +2

    I think we're really overthinking the philosophy of whether an AI should be autonomous or not. The simple answer is not to go down that slippery slope at all, because 99 times out of 100 there's going to be a negative outcome.

  • @jayjames7055 · 3 years ago · +4

    "don't kill children ... " but everyone is someone else's child. "Good guys bad guys, in groups out groups" ... we are a species, we are all children that should grow up a bit.

    • @jayjames7055 · 3 years ago

      ruclips.net/video/oMVe_HcyP9Y/видео.html

  • @alhaymon4539 · 3 years ago · +4

    Wouldn’t an autonomous machine be able to turn itself into a weapon if it wanted to?

    • @alhaymon4539 · 3 years ago · +1

      @@sqlb3rn Definitely putting on my tinfoil hat here, but it would be crazy if they only appeared incapable of doing it while actually being able to. Thanks for answering!

  • @rayanahosn · 3 years ago · +1

    This problem of rogues getting hold of these weapons is always there, even with simpler things like machine guns and explosives.
    I don't think that is a good enough reason not to build autonomous weapons. If you can find a solid defense against these weapons, then producing them would be the same as producing other low-cost weapons that can be utilized by the enemy.
    But I definitely agree that this is not the ideal use of AI.

  • @donvee2000 · 2 years ago

    How do you ban a weapon anyone can build?

  • @DrRajReddy · 3 years ago

    How are you planning to stop small players from making those autonomous drones, when the parts for making them can be easily accessed and the software to power them can be similarly procured?

  • @9latinumStudioz · 3 years ago · +1

    Just saw Spider-Man last night, with the drones 🦇

  • @EntertheDragonChild · 3 years ago · +1

    Are drones hackable?

  • @ai_serf · 2 years ago

    After watching The Terminator, this video is way more terrifying.

  • @toddsanders5720 · 3 years ago

    Autonomous killer robots, bioweapon attack or accidental release, natural plague, accidental or intentional nuclear holocaust, solar flare, asteroid, the list goes on uncomfortably longer. Even though these good people are working hard to prevent these things, unfortunately I deeply feel something of the sort is coming regardless. And coming soon. Idk, just haven't been able to shake that feeling. Kind of like how an animal knows an earthquake is coming I guess.
    I do hope those things don't happen, but I suppose that's all most of us can do. Hope.

  • @thedecktothe16thpower56 · 3 years ago

    Why not just ban certain programming? Uninvited programs directed at my thoughts are weapons and means of control too. How can the topic be controlled when the basis of AI weaponry is not under control already? That boggles my mind. It would have to start now, or the conclusion is inevitable.

    • @taylor92493 · 3 years ago

      But the programming has mostly been done already, and not even for this reason. Face tracking, auto-balancing, and controlling external devices can already be done with things that aren't guns. I have a camera that physically moves and follows motion; that seems like half the steps, with the same software.
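
A minimal sketch of the "camera that follows motion" idea the comment describes, assuming a standard webcam and the off-the-shelf Haar-cascade face detector that ships with OpenCV; the pan correction is only printed here, where a real motorized "follow me" camera would feed it to its pan motor. This is an illustration, not anything shown in the video.

```python
import cv2

# Haar cascade face detector bundled with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # assumes a default webcam at index 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Horizontal offset of the face centre from the frame centre:
        # the steering signal a motorized gimbal would use to re-centre.
        error_x = (x + w / 2) - frame.shape[1] / 2
        print(f"pan correction: {error_x:+.0f} px")
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```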

  • @akaROOSTA · 3 years ago · +4

    No lie... I'm pretty weird myself, but this guest is wicked for my liking 🤭🤔🤐

  • @ryanalbaugh8369 · 3 years ago · +1

    Play Metal Gear Peace Walker; it has a scarily accurate representation of what autonomous weapons could do.

  • @adelinaquijano1083 · 1 year ago

    both I care since baby and now both close friends

  • @DrRemorse · 3 years ago

    Just having armies around the world for years and decades, is that good? No... country defence only, please.

  • @raduandreicosmin · 3 years ago

    Why wouldn't terrorists just be able to put a facial recognition system on top of a weapon-triggering device?

  • @Udodelig1 · 3 years ago

    Decision-making by autonomous weapons is impossible.
    The decision is made by their programmers.
    The decision can also be statistical, just like airplane bombing.

    • @MrAngryCucaracha · 3 years ago · +1

      You can train any machine learning algorithm to tag pictures as dogs and cats, for example. You could, in just the same way, tag "shoot" and "don't shoot".
      You can say that, philosophically, that is not the machine deciding, because it doesn't have a soul or consciousness or whatever. But however you spin it, you are dead.
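
A minimal sketch of that point, assuming PyTorch/torchvision and a hypothetical image folder laid out as data/class_a and data/class_b: the classifier is entirely generic, and the two labels are nothing more than directory names, which is exactly what the comment is getting at. Again, this is an illustration, not anything from the video.

```python
# Generic two-class image classifier: the training loop never looks at what
# the labels mean, so "dog"/"cat" and any other pair of labels are
# interchangeable as far as the code is concerned.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# ImageFolder turns each sub-directory of "data" into a class.
dataset = datasets.ImageFolder("data", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

# Pretrained backbone with a fresh two-way output head.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one pass over the data, for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

print("classes learned:", dataset.classes)  # whatever the folders were named
```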

    • @Udodelig1 · 3 years ago

      @@MrAngryCucaracha
      Well, if you do that, then you have decided that the person who matches this picture will be dead. Not the machine.
      It's the same as when an aircraft fires a missile at another aircraft. Would you say the missile made the decision, or the pilot?
      With picture recognition, would you say the program made the decision, or the programmer?

    • @johnsondoeboy2772 · 1 year ago

      @@Udodelig1 What if the programmer was brainwashed with lies from propaganda that was made up 100% by AI? AI could cause a civil war if it chooses to put a spotlight on a certain issue that stirs emotions.

  • @HavanaOutpost · 3 years ago · +8

    Lex is greater than Joe

    • @philosophicalaesthetic6152 · 3 years ago · +1

      Joe Rogan is doing something massively important: he's making it popular to have a dialogue or discussion about important topics, which is what we desperately need as people are locked in their echo chambers and we're beginning to see increasing censorship, etc.
      Personally, I enjoy Lex more; I enjoy AI and technological discussions.

  • @TabooRevolution13 · 3 years ago

    Autonomous targeting... Lex! I lead the World ARMY! I want it banned. We can make an effort to keep it off the market as the human race.

  • @rvanzo925 · 3 years ago

    That makes no sense. You may ban your own ability to field them, but others won’t be so kind.

    • @RrRr-or5tw · 3 years ago

      Exactly. Saying we should ban AI used in the military, or ban drones, is stupid; that's like saying only bolt-action rifles should be allowed because machine guns are too dangerous. Algorithms that support air-defence systems or drones are just too good, and if one country doesn't use them, their enemy will, and that's just the way it is. Saying we should ban them is almost a luxury, because we are currently not in a position where we have to do everything possible to survive; we just fight insurgents. Take the Geneva Conventions, for example, or human rights in general: when fighting a war in Afghanistan, where the economy of your country still works perfectly fine, it's easy to say you shouldn't shoot POWs, but if your resources are stretched as thin as on the Eastern Front in WW2, killing POWs might be a strategic necessity for both sides in some situations. And it's the same with these autonomous weapon systems: if you can afford inefficiencies, it's easy to say we shouldn't use them; if you're fighting for survival against an enemy that doesn't care, you realize that having the choice was a luxury.

  • @michaelmarquardt9530 · 3 years ago

    Butter Battle

  • @robgoodsight6216 · 3 years ago

    "Second Variety"... the short story comes to mind.

  • @richard_d_bird · 3 years ago · +1

    Bioweapons are harder to make and, more importantly, much harder to control. The reason bioweapons aren't widely used isn't that everybody had some kind of ethical epiphany over them; it's that, ultimately, they're just not very good weapons. At least so far.

  • @coconutfleetsleeper5717 · 3 years ago · +1

    Last and First Men by Olaf Stapledon is a great book to read; it touches on the subject.

  • @SvenDeBinj · 3 years ago · +1

    Ban killing.

  • @Aluminata · 3 years ago

    There is no comparison between the highly localized targeting of autonomous weapons and the widespread, potentially out-of-control destruction of bioweapons.

  • @sat_gur4334 · 3 years ago

    Yes: no to autonomous weapons.

  • @rodiculous9464 · 3 years ago · +1

    A ban is impossible; decentralizing is the best option. Where's my personal assaultron, dammit?

  • @MonicaAliciaColunga · 3 years ago

    Agreed, I've shared on my network. Thanks, very important subject.

  • @Android-dg5ri · 3 years ago · +1

    Gonna call bullshit on this: these new weapons don't destroy property.

  • @Udodelig1 · 3 years ago · +2

    Disagree. Autonomous weapons are way more humane than any airplane bombing.

  • @TestTackle · 3 years ago · +1

    #WeaponsShaming

  • @miighankurt1930 · 1 year ago

    COVID

  • @dadaimiza · 3 years ago · +1

    😍🙏

  • @johnnytass2111 · 3 years ago

    I heard about a study on the percentage of infantrymen who fired directly at the enemy in combat. The wars studied were the Napoleonic Wars, the Civil War, WWI, and WWII, if I remember correctly. They used geometry, statistics, etc. to figure out that percentage, and the result was that only about 15 to 18 percent of soldiers actually fired directly at the enemy. The rest missed, presumably mostly on purpose.
    Autonomous weapons will fire directly at the enemy 100 percent of the time. This is a great distinction, and one that can help us understand the argument about free will.

  • @richard_d_bird · 3 years ago

    I appreciate the sentiment, but there's no way I'm going to agree with this one. We need to try giving those robots guns right away, before somebody else does. There's too much potential advantage there to just give away like that. I suppose I should actually listen to the video at some point.

  • @cowspoopmagic · 3 years ago

    Yeah, good friggin' luck.