Israel's Lavender System, AI Targeting and Battlefield Informatics

  • Published: 4 Apr 2024
  • Informatics is the application of information technology (IT) to solve problems in various domains. It encompasses the collection, storage, processing, analysis, and dissemination of data to support decision-making and problem solving.
    FPV Drones and munitions like the M7 Spider use informatics to specifically service individual units.
    5th generation fighters like the F-35 use informatics to dominate situational awareness.
    UTAMS uses informatics to instantly target the point of origin for mortar rounds.
    When it comes to AI/ML informatics being used for targeting, the question is whether the speed of targeting outweighs the possibility of making mistakes. If the adversary has AI targeting and is more casualty-tolerant, they may gain a significant advantage in battle. (A rough, illustrative sketch of this speed-versus-error trade-off follows the links below.)
    For uncensored video, check out my substack at:
    ryanmcbeth.substack.com
    Like my shirts? Get your own at:
    www.bunkerbranding.com/pages/...
    Want a personalized greeting:
    www.cameo.com/ryanmcbeth
    Watch all of my long form videos:
    • Military Equipment, Ta...
    Twitter:
    @ryanmcbeth
    Instagram:
    @therealryanmcbeth
    BlueSky
    @ryanmcbeth
    Reddit:
    /r/ryanmcbeth
    Join the conversation:
    / discord
    Want to send me something?
    Ryan McBeth Productions LLC
    8705 Colesville Rd.
    Suite 249
    Silver Spring, MD 20910
    USA
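
The trade-off described in the video summary above can be made concrete with a small, purely illustrative calculation. This is a minimal sketch: the target volumes and error rates below are made-up placeholders, not figures from Lavender or any real targeting system.

```python
# Illustrative sketch of the speed-vs-error trade-off described in the summary.
# All numbers are made-up placeholders, not figures from any real system.

def expected_outcomes(targets_per_day: float, error_rate: float, days: int = 30):
    """Return (expected correct engagements, expected misidentifications) over `days`."""
    total = targets_per_day * days
    errors = total * error_rate
    return total - errors, errors

# Hypothetical human-only targeting cell: slower, assumed lower error rate.
human_correct, human_errors = expected_outcomes(targets_per_day=5, error_rate=0.02)

# Hypothetical AI-assisted cell: much faster, assumed higher error rate.
ai_correct, ai_errors = expected_outcomes(targets_per_day=100, error_rate=0.10)

print(f"Human-only : {human_correct:.0f} correct, {human_errors:.0f} errors per 30 days")
print(f"AI-assisted: {ai_correct:.0f} correct, {ai_errors:.0f} errors per 30 days")
# The point made in the summary: the faster pipeline services far more valid
# targets, but also produces far more mistakes in absolute terms.
```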

Comments • 424

  • @sirshauniv511
    @sirshauniv511 Месяц назад +85

    I don't blame people for being sceptical. When the best AI most people get to see and use is on the level of ChatGPT, it's perfectly reasonable to feel nervous when that tech holds lives in its hands.

    • @2639theboss
      @2639theboss Месяц назад +23

      Even if you work with better AI, it's pretty damn obvious how flawed these systems can be. Any poor design decisions, any failures with the data pool they're built on, and it very quickly goes to shit, but the people building them and the people financing them will rarely admit it.

    • @VeryCoolVerySwag
      @VeryCoolVerySwag Месяц назад +1

      I've never thought about how a 0 or 1 can be the difference between life and death

    • @JinKee
      @JinKee Месяц назад +3

      @@2639theboss it doesn't have to be perfect, it just has to be better than a human. Or better than the previous iteration of AI.

    • @2639theboss
      @2639theboss Месяц назад

      @@JinKee To use ChatGPT as a familiar example, it will create fake sources for your paper, but it will present them as real. Do you know how much damage a middle manager can do with the plausible deniability of "it looked right, how should I know?" And the programmers behind these systems aren't going to raise their hands either and say "well, we messed up."
      At least currently, a lot of the AI systems in use have zero accountability, oftentimes they just use AI or ML as buzzwords without implementing it at all, and they're given far more latitude for decision making than other systems.
      Maybe the military is different, but i doubt it.

    • @circuitbreaker8314
      @circuitbreaker8314 Месяц назад +6

      @@2639theboss It is absolutely horrendous that an AI killing machine is being used on Gaza.

  • @martinlye2748
    @martinlye2748 Месяц назад +9

    I love watching your thoughts on military processes as they are in depth and well thought out.

  • @Bill_N_ATX
    @Bill_N_ATX Месяц назад +34

    AI targeting. Did we all see this movie a couple of decades ago? It’s a baby SkyNet.

    • @gapesherman5218
      @gapesherman5218 Месяц назад

      Runaway 1984 might be a little creepier. Small autonomous robot targeting systems. Cost/kill ratio is....ridiculous.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      Let us know when you can improve human reflexes to be able to make dozens of informed weighted decisions in the same amount of time it takes us regular mortals to blink. Unless/until then we're kinda stuck depending on our tools. Software & otherwise.

    • @omriliad659
      @omriliad659 Месяц назад

      SkyNet had its own agenda; AI targeting has the operator's agenda. People can have agendas that are just as bad, so don't blame the tool.

    • @hgv1883
      @hgv1883 Месяц назад

      @@omriliad659 just so long as it doesn't make decisions and start thinking for itself

  • @GreatgoatonFire
    @GreatgoatonFire Месяц назад +11

    I think the problem is that people trust AI way before it is properly tested. Someone driving into a cornfield because they trusted an AI-generated short-cut is one thing.
    Driving a bunkerbuster into a civilian shelter is another.

    • @Stinger913
      @Stinger913 Месяц назад +2

      Dude, the intel whistleblowers admitted at one point they were very comfortable with a ratio of 1:20 Hamas:civilians, a threshold only acceptable under very specific circumstances and usually justified by the target being someone like Bin Laden in any other country; they applied this to average Hamas grunts, whose definition at one point included 35k people like police, civil defense, paramedics and EMTs. Some also mentioned they felt this vibe of vindictive revenge: they attacked us, so now we HAVE to inflict wild casualties and we don't care how. Obviously no officer explicitly said kill all these sons of botched for revenge, but some IDF people clearly got this vibe by their own admission. Read the original article, the details are insane. Not to mention they preferred striking targets when they were home with their families rather than in combat; had AI programs to track when someone was home; and yet still killed more civilians in homes without even getting the bad guy they wanted. And those targets are all low-level grunts, not officers/leaders. So in the end they just kill civilians. They also did test the AI, by the way: they knew it had a 10% failure rate, shrugged, deemed it acceptable,
      and pushed it into service with no human oversight other than a dude confirming the target's name was male 😂

    • @omriliad659
      @omriliad659 Месяц назад

      @@Stinger913 Bin Laden accepted a ratio of 0:1 soldiers:civilians, just the same as Hamas. As Ryan said, 100% of the rockets that Hamas sent were meant to hit civilians, while 100% of what Israel shoots is meant to hit terrorists. The difference is that Israel will hit a terrorist hiding behind a human shield, while Hamas will target the civilian that hides behind the soldier. You're applying a different standard to each side.

    • @TheWilsones
      @TheWilsones Месяц назад

      ​@@omriliad659 they launch missiles at IDF bases too. Don't forget, something like 300 of the people killed on Oct. 7 were active duty IDF soldiers.

    • @MrLittlelawyer
      @MrLittlelawyer 19 дней назад

      @@TheWilsones Every Israeli is a trained soldier. They militarized their entire country. Except for children and old people, there are no noncombatants.

    • @feofino1
      @feofino1 17 дней назад

      @@TheWilsones The only reason for that is that every person in Israel is required to do military service.

  • @namastezen3300
    @namastezen3300 Месяц назад

    Thank you

  • @ghostpiratelechuck2259
    @ghostpiratelechuck2259 Месяц назад +24

    The article didn’t claim 10%, the IDF did.
    How about the part that states acceptable civ casualties to footsoldiers is 20:1?

    • @citricdemon
      @citricdemon Месяц назад +1

      fake

    • @ghostpiratelechuck2259
      @ghostpiratelechuck2259 Месяц назад +21

      @@citricdemon The mental gymnastics required to continue to deny reality in the face of increasing and overwhelming evidence must be exhausting. I don’t envy it.

    • @citricdemon
      @citricdemon Месяц назад

      @@ghostpiratelechuck2259 why do you use so many words to tell people you're an asshole?

    • @matthewchapman2248
      @matthewchapman2248 Месяц назад +1

      It literally stated 10% in the article

    • @ghostpiratelechuck2259
      @ghostpiratelechuck2259 Месяц назад +3

      @@matthewchapman2248 Did you read it? That 10% claim is coming from the same anonymous sources that Ryan said he doesn’t believe are real.

  • @AstroRamiEmad
    @AstroRamiEmad Месяц назад +1

    Very convincing especially at the end

  • @Jared_Albert
    @Jared_Albert Месяц назад

    Thank you. Very interesting and informative

  • @jeanredman-roberts5604
    @jeanredman-roberts5604 Месяц назад

    Well described

  • @mrricky3816
    @mrricky3816 Месяц назад +2

    Thanks!

  • @JohnHudert1
    @JohnHudert1 Месяц назад +8

    7:41 “…don’t *start* a war”
    Ask Ukraine how that strategy worked for keeping their civilians safe…
    I would change it to “don’t give away your nuclear weapons for a promise that you won’t be invaded”
    AI is just the next weapon we must have to keep power-hungry country leaders from starting that next war.
    Just as long as we don’t allow our leaders to become power hungry predators and start one either.
    Really, can we keep the powerful from becoming corrupt before we get off this planet?

    • @AFK0099
      @AFK0099 Месяц назад

      I agree. I would also add that, the way this conflict is taking shape, Hamas seems to be taking more precautions against civilian collateral because they want hostage exchanges.
      Whereas the Israeli government has openly admitted that they will bomb their own civilians as well as anyone else.

    • @HeilIsrael
      @HeilIsrael 16 дней назад

      Hey, do you have more information about that? I've read about it, but I see people use it in ways that I don't think are appropriate. It was a complex situation and Ukraine wasn't exactly a stable state. The thing is that more nukes is never better for global war risk.

  • @davidkleinthefamousp
    @davidkleinthefamousp Месяц назад

    Ryan- GOD BLESS!!! Your spots for merch are valued also. I want you to experience the best.

  • @chachomask
    @chachomask Месяц назад +5

    While the video was very informative and well made, I feel like Ryan missed the point that a lot of people don't understand about the usage of AI in a military context: IT IS NOT FULLY AUTOMATED - meaning the system will collect POTENTIAL targets and humans approve said targets.
    When people hear AI weapon system they instantly assume that the IDF just passes the trigger to Skynet and bombs targets in Gaza without even looking at them.
    If you look Yuval Avraham up on Google you'll see that the guy pushes that story over and over again under different names (HaBsora / the Gospel / Lavender), never provides credible sources and always describes the way the system works in an intentionally very vague way in order to provoke the above reaction from the average reader.
    This guy is real scum when it comes to journalistic standards.

  • @Petch85
    @Petch85 Месяц назад +40

    Why was this reuploaded?
    Are there any changes I need to know of?

    • @d.b.1176
      @d.b.1176 Месяц назад +9

      Yes

    • @adamcel6
      @adamcel6 Месяц назад

      May I know what the changes were? ​@@d.b.1176

    • @laudableplain4282
      @laudableplain4282 Месяц назад +19

      His CIA handler told him to change something. I cant you you its top secret.

    • @Ac22768
      @Ac22768 Месяц назад +16

      @@laudableplain4282I can’t you you makes a lot of sense.

    • @laudableplain4282
      @laudableplain4282 Месяц назад +17

      @@Ac22768 sorry my Russian translator is not working good. Have good day 👍

  • @seandorval5579
    @seandorval5579 Месяц назад +1

    Thanks for the content. I like how you present facts and let the viewers think about the effects and consequences for themselves.
    I am sure we have differences of opinions on various topics but your lane is not my lane so your videos are educational for me.

  • @charsbob
    @charsbob Месяц назад +2

    Sobering, but compelling.

  • @drenk7
    @drenk7 Месяц назад +8

    Ryan a very informative and interesting presentation. Thank you!

    • @ayanlefarah318
      @ayanlefarah318 Месяц назад +1

      just remember he is kinda biased, pro-Western and pro-Israel

  • @mattkelly2004
    @mattkelly2004 Месяц назад +31

    Deja vu

    • @namastezen3300
      @namastezen3300 Месяц назад

      I can't give the specifics, but I think this version is more clear. Thank you for your comment.

  • @jckluckhohn
    @jckluckhohn Месяц назад +1

    Learned something today

  • @aytchpee
    @aytchpee Месяц назад +4

    What I got from this video is that spider mines from StarCraft 1 actually exist

  • @roberttaylor3594
    @roberttaylor3594 Месяц назад +14

    reminds me of the Star Trek episode where the computers basically fight the war, calculate the losses and the humans walk into the furnaces...

    • @Jreg1992
      @Jreg1992 Месяц назад +1

      I prefer to think of AI-based target acquisition as getting us closer to the Solaris Accords from MechWarrior.

    • @brianhirt5027
      @brianhirt5027 Месяц назад +1

      I typically loathe Star Trek. Too... clean and idyllically unambiguous compared to the reality that lies ahead. But all that being said, the ship's computer does nicely exemplify the extent of what we can expect from imminent AGI.

  • @mitchbate9628
    @mitchbate9628 Месяц назад +1

    Would love a book/audiobook recommendation on the use of AI/new tech on the battlefield

  • @capnstewy55
    @capnstewy55 Месяц назад +2

    I already watched this once...so of course I have to watch it again. 😂

  • @egis7908
    @egis7908 Месяц назад +3

    Ryan. Adversaries don’t want to target they feed on mass casualties.

  • @oskarrrw
    @oskarrrw Месяц назад +33

    why the reupload?

    • @britishrocklovingyank3491
      @britishrocklovingyank3491 Месяц назад +6

      Redo to shape the story.

    • @JamesJ422
      @JamesJ422 Месяц назад +3

      Possible Goog shenanigans

    • @jackmasher8046
      @jackmasher8046 Месяц назад +8

      CIA didnt like the negative response😮

    • @M-L450
      @M-L450 Месяц назад

      Izraeel Using AI to slaughter civillians, Zi❤ Nis im and Islamophobia, Hate, Genocide are Synonyms.

    • @robertmclean5356
      @robertmclean5356 Месяц назад

      The Zionist lobby called and he provided as usual.

  • @LD-pt5ur
    @LD-pt5ur Месяц назад

    Man your subs have skyrocketed!

  • @mikemalter
    @mikemalter Месяц назад

    True that.

  • @elviselcapo
    @elviselcapo Месяц назад +7

    As an IDF veteran who had the chance to see this in action, and without direct control, this system is pretty much amazing compared to human errors. The level of disinformation and misinformation by the mainstream media and the not-so-mainstream (aka TYT, and others) is on another level; it's just for the ratings and the money-making machines. They never have military veterans discuss these issues in detail. It is always random people giving their punditry. Thank you Ryan for giving true context and details. Nobody wants war, but when war comes knocking on your door, you better believe that Israel, and also many other nations, will be at their best to prevent more deaths.

    • @eliasmai6170
      @eliasmai6170 Месяц назад +2

      someone has to approve those strikes. The AI doesn't approve it by itself. It could make recommendations, but humans check over it.

    • @scruffopone3989
      @scruffopone3989 Месяц назад +5

      Local Call and 972 literally interviewed members of the IDF about it, and this isn't a problem of human error; it's a problem of sweeping kill lists that accept 15 to 20 civilians "accidentally" killed per supposed combatant. These are journalists from Israel who did the work, risking their necks to reveal that your army is gleeful about ratios of at minimum 20:1 civilians to combatants killed.

    • @elviselcapo
      @elviselcapo Месяц назад +2

      @@eliasmai6170 Exactly. Better to have a massive AI system than to leave it only to humans. The commanders were relieved of their duties, and others got worse. This is a war. When war reaches your shores, you will wish you had an AI system that was developed by the American military.

    • @elviselcapo
      @elviselcapo Месяц назад +2

      @@scruffopone3989 Yes, because we have a democracy. Rule of law, rules of engagement. The level of "I have no idea how military command structures work" from random people on the internet tells a lot about the situation in America and the West. Keyboard warriors. When American shores are attacked, we will see how you deal with your monsters and enemies.

    • @elviselcapo
      @elviselcapo Месяц назад +2

      @@scruffopone3989 it's not gleeful, and it's not about "oh yea, there are 20 civilians so we can totally bomb that area." No. It's about "hey the system said there are around 20 civilians in that area and one *or* multiple military targets. You, the human, figure out what is going on to move forward and discuss with the chain of command." The chain of command failed and that is war, human error and not some cynical bullshit that TYT loves to make stuff out of, for ratings and more. Think critically for F's sakes.

  • @Wzded
    @Wzded Месяц назад +8

    I haven't watched the video yet, but I would love a full deep dive on how the military has already been using AI for decades. And yes, AI for weapon systems that are pulling the trigger. C-RAM and all. There is always discourse about "should a computer be trusted to make the decision to pull the trigger" but in reality, AI is already doing that.

    • @JinKee
      @JinKee Месяц назад +8

      If a robot is a sensor, an effector, a powersource and an enclosure to hold it together, then a land mine was the first combat robot.

    • @doom9603
      @doom9603 Месяц назад

      AI weapons are internationally allowed to do the targeting, but pulling the trigger is only legal when done by a human. Worldwide, in all nations, including China and Russia.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      Agreed. TBH we could call the hellfire the first 'automated munition'.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      @@doom9603 According to whom? I don't recall us signing on to any such restrictions. Everyone with any skin in the battlespace knew the automation kill chain was always going to accelerate. What some hippie plebs worry about in conventional media doesn't really swing much weight at DARPA or the purser's office. This is one of those technologies where we are either its master, or shortly extinct at our enemies' hands as they inevitably leverage those same tools against us. War is complicated.

  • @georgesam363
    @georgesam363 Месяц назад +1

    Triangulate? Or Multilaterate??

  • @circuitbreaker8314
    @circuitbreaker8314 Месяц назад +4

    Well Ryan, with this AI there isn't really a chain of command. By the looks of it, it targets whatever it likes and the soldier gives approval in 20 seconds; grieving soldiers give the approval. So where is that chain of command you were talking about?

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      ....It's moving faster than human reaction time can keep up with. When you can figure out how to elevate human reaction and discernment to several hundred informed, weighted decisions a second, every second, do let us know and we'll see about sidelining the tech that CAN do that.

    • @chikipu225
      @chikipu225 Месяц назад

      It's legitimate to question the parameters of an AI targeting system. How many casualties are okay per suspected Hamas operative? How many per Hamas commander? The reports about the 'Where's Daddy?' program, where attacks are carried out at night on family homes, raise the question of kin punishment, which is a form of collective punishment. There are reports of automatic and dynamic combat zones that are essentially kill zones for all non-IDF personnel present on the ground. The shooting of three unarmed Israeli hostages is reported to have occurred because they were within these invisible red lines and were automatically targeted and killed.

    • @omriliad659
      @omriliad659 Месяц назад

      The chain of command happens when deploying the system and when deciding to keep it operational. It's the same way that the chain of command gives the soldier on the ground permission to shoot anyone who poses a threat. A higher commander might be to blame if they give such permission when it's not needed, for example when an unarmed civilian says "i will kill you", and the same way they might be to blame if they deploy AI capable of operating with bad parameters.

  • @whizbam4920
    @whizbam4920 Месяц назад +9

    2nd time watching today here we go

  • @averyradom
    @averyradom Месяц назад +6

    Am I bugging or did i watch this 10 hours ago

  • @whiskeybrown262
    @whiskeybrown262 27 дней назад

    "If the only tool you have is a hammer everything looks like a nail" -
    Abraham Maslow

  • @bigaaron
    @bigaaron Месяц назад +3

    Man youtube really throttled the views on this, that sucks man

  • @JinKee
    @JinKee Месяц назад +2

    6:50 The Cold War started and became World War Three and just kept going. It became a big war, a very complex war, so they needed the computers to handle it. They sank the first shafts and began building AM. There was the Chinese AM and the Russian AM and the Yankee AM and everything was fine until they had honeycombed the entire planet, adding on this element and that element. But one day AM woke up and knew who he was, and he linked himself, and he began feeding all the killing data, until everyone was dead, except for the five of us, and AM brought us down here.

  • @petepruitt7196
    @petepruitt7196 29 дней назад

    Btw - thanks!
    Ive learned so very much from your videos!!!

  • @madladlabs
    @madladlabs Месяц назад

    I think it would behoove you to maybe add some ball cap designs.

  • @bmac9936
    @bmac9936 Месяц назад

    Intense but honest evaluation. AI needs to be more intelligent to discriminate levels of noncombatant damage vs potential real damage in a flexible timeline. Easy peasy.

  • @TheWilsones
    @TheWilsones Месяц назад +1

    Last I heard, the Lavender system was being used with an acceptable civilian-to-combatant kill ratio of 15:1 to 20:1. If we accept the 10% fail rate and assume the IDF is using the more discriminate ratio, the "acceptable" number of civilian deaths due to error would be between 300,000 and 450,000, based on IDF estimates of 20-30k Hamas fighters. If you're ready to justify that number of civilian murders, go right ahead.
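
A quick check of the multiplication in the comment above. The 15:1 ratio and the 20-30k fighter figures are the commenter's cited claims, reproduced here only to verify the arithmetic, not as confirmed numbers.

```python
# Arithmetic check of the figures cited in the comment above.
# The ratio and fighter estimates are the commenter's claims, not verified data.
ratio = 15                                     # claimed "more discriminate" civilian:combatant ratio
fighters_low, fighters_high = 20_000, 30_000   # claimed estimate of Hamas fighters

print(ratio * fighters_low)    # 300000
print(ratio * fighters_high)   # 450000
```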

  • @armchairwanderer1287
    @armchairwanderer1287 Месяц назад

    👍

  • @Mach1048
    @Mach1048 Месяц назад +5

    The thing about the 'Lavender' system, and AI picking the targets is that it's only as good as the data that goes into it.
    And there is an entire movie on why that's a bad thing, it's called Captain America: Winter Soldier. And 'Lavender' looks like a system that *Defaults* to all targets being Hamas. That's the problem.

    • @jamesricker3997
      @jamesricker3997 Месяц назад

      That could explain the attack on aid workers

    • @annamoris9753
      @annamoris9753 Месяц назад

      I think that overall the system is accurate, computer science experts from the left said that. What is frightening is that international law is not built for such a speed of producing legitimate targets, which raises the question of whether the law should be changed because if any country can easily target thousands of enemy soldiers, is it still moral? These are really questions that will have to be answered

    • @annamoris9753
      @annamoris9753 Месяц назад +1

      This is an algorithm that received information from the investigations of Hamas fighters captured in Israel on 7/10 and of senior officials who are in prison, plus existing information about operatives, and based on this information it searched for people with similar profiles (in a Hamas WhatsApp group, recently had weapons, change phones frequently, have been to Hamas trainings and events).

    • @Mach1048
      @Mach1048 Месяц назад +1

      Thermonuclear weapons allow for mass targeting of soldiers. We've had that for years.
      This is something *very* different. Deciding who lives and who dies based off *data* that you're putting into a system? Yeah. That's an issue.

    • @citricdemon
      @citricdemon Месяц назад

      @@Mach1048 Debatable whether the word "target" applies to soldiers in the blast radius of a nuclear missile. They aren't the target; square miles are.

  • @Bbonno
    @Bbonno Месяц назад

    For most things a difference in amount can constitute a difference in kind. Are we there yet?

  • @fredhercmaricaubang1883
    @fredhercmaricaubang1883 Месяц назад +1

    So long as A.I. doesn't evolve into the Skynet of the Terminator universe, I'm A-OK with it!

  • @armynation31B5V5P
    @armynation31B5V5P Месяц назад

    Army NCO's do know what you mean. 👍

  • @elishmuel1976
    @elishmuel1976 Месяц назад

    9:50 Love everything about your channel. Would love to hear you comment on Anduril's solution to your very good point.

  • @Ddonaldson9
    @Ddonaldson9 Месяц назад +3

    I'm not sure why you included the error rate 'comparison' between Israel and 'T' groups rockets targeting Israel. The opposing groups have completely different objectives for their munitions (well, you can probably argue Israel has moved closer to the other groups but that's a different discussion). Maybe part of the context was edited out for YT. I'm having a hard time pinning down why exactly the inclusion of this bit feels wrong to me but it definitely seems out of place.

    • @kicunya12
      @kicunya12 18 дней назад

      It "look, the other guys are worse" argument, based on dishonest premise, that's why it feels wrong.

  • @CRTTekeren
    @CRTTekeren Месяц назад +33

    Damn, that’s crazy. AI finna start taking away my guided missile operator job, 😭.

    • @jackiecooper9439
      @jackiecooper9439 Месяц назад +8

      AI gonna take all jobs except manual labour and healthcare. To the coal mines I go.
      ♪ You load sixteen tons, and what do you get? Another day older and deeper in debt. Saint Peter don’t you call me, 'cause I can’t go; I owe my soul to the company store. ♪

    • @laudableplain4282
      @laudableplain4282 Месяц назад +2

      Time to get on AI welfare, son

    • @davidty2006
      @davidty2006 Месяц назад +1

      @@jackiecooper9439 think the quota these days is 16 trucks....

    • @limitlessLtd
      @limitlessLtd Месяц назад

      @@jackiecooper9439 Boston dynamics is working on that too, all they need is a long lasting power source.

    • @ernestkhalimov1007
      @ernestkhalimov1007 Месяц назад

      Just means you won't have a mental breakdown when made to select targets in a civilian area like Israelis do .

  • @drake101987
    @drake101987 Месяц назад +1

    Using AI makes sense. There will be collateral damage, no matter what. If AI can reduce the number of civilians lost, it should be used. It is more than reasonable to say, "The loss of civilian life has been reduced as much as possible while continuing to protect our soldiers/homes/country and are striving for zero, but 10% is currently the best we can do". However, using the fact that the enemy attacks civilians as a justification to handwave civilian casualties caused by the "good guys" is not acceptable. The US and its allies advertise taking the moral high ground. It's supposed to be what separates the "good guys" from the "bad guys".
    The part that worries me about AI selecting targets is the complacency that will accompany it. Like everything that becomes automated, the people using it will trust it more and more until they eventually let their guard down. I can easily imagine, during the next American war, news stories about how the AI target selection software had way more errors than stated, the people authorizing strikes stopped checking the validity of the AI's selections, and the DOD had been hiding the real numbers. Not saying it shouldn't be used, but it must be done carefully.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      It'll still be light-years ahead of the casualty allowance granted during the era of the dumb bombs, like in Vietnam.

    • @teammedich
      @teammedich Месяц назад

      @@brianhirt5027 But that also ignores modern "smart bombs" that are operated without AI, with a human determining what the target should be.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      @@teammedich Yeah, fair point. Though we've used that term for a vast range of munitions over the years. Somewhat waters down the definition a bit. We used to call laser guided missiles smart bombs. Guess it's evolved over the years, much like what it describes.

  • @user-db7ee8nl3q
    @user-db7ee8nl3q Месяц назад +3

    You were right yesterday on the "lone tank" phenomenon. Saw a report from a Ukrainian soldier saying the Russians are sending out a home-made EW system mounted on a tank in the vanguard to stop FPVs.

  • @LilSumpinSumpin
    @LilSumpinSumpin Месяц назад

    🤯

  • @FarmerDrew
    @FarmerDrew Месяц назад

    You ever read The Ender's Game books? He developed an AI friend Jane that grew out of the network of quantum computers used to quickly communicate vast distances. She helped in war and in peace.

  • @gssbcvegancat2345
    @gssbcvegancat2345 Месяц назад

    When facing a dilemma like that, the only real question is what's the address.
    Because by definition there's no good choice there.

  • @iamnolegend2519
    @iamnolegend2519 Месяц назад +8

    2:23 “which target to ‘service’ “. Lolz

  • @prestongalle9158
    @prestongalle9158 Месяц назад +2

    I take umbrage at your thesis that Steve Urkel suffered from robotic delivery. Perhaps you have neglected Stefan Urquelle in your analysis?

  • @doom9603
    @doom9603 Месяц назад

    Did you read Yossi Sariel's book? He's the boss of Unit 8200. They worked on advanced AI systems.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      ...among other things... 8200 provides MOSSAD with its officer corps.

  • @conservative1news
    @conservative1news Месяц назад +8

    "Excuses for the inexcusable" sold here!

    • @ernestkhalimov1007
      @ernestkhalimov1007 Месяц назад +1

      100%, and it's sad the majority of the comments are eating it up.
      He still hasn't fabricated a lie for the WCK workers' murder

    • @advance512
      @advance512 Месяц назад +1

      What is inexcusable? A war that Israel did not start in response to a g3nocide done on Oct 7? nah

  • @brussell8675309l
    @brussell8675309l Месяц назад

    Seems like Palantir has quite the head start if everyone else is just getting started.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      Not what Palantir does. Different sort of analytics domain than battlefield informatics.

  • @bzipoli
    @bzipoli 17 дней назад

    Ryan, Anduril recently released a few promos on their products using AI for the battlefield (it's for US command)

  • @CoffeeCup1346
    @CoffeeCup1346 Месяц назад

    I know only one thing for sure about the gradual implementation of AI targeting - this **** is not going to go down how we think it will.

  • @stupidburp
    @stupidburp Месяц назад

    Which merch shirt would President Theodore Roosevelt buy?

  • @user-oo8xp2rf1k
    @user-oo8xp2rf1k Месяц назад +1

    A little stark about Palestinian mothers, I think. 🤔 But to be fair this is a great channel, and sometimes it is better to tell it like it is so that people (esp. us, the public) understand the true impact of the horrific and difficult choices of wars.

    • @i-love-comountains3850
      @i-love-comountains3850 Месяц назад

      "A little stark about Palestinian mothers" yeah, so is the genocide they're experiencing at the hands of IOF.

  • @Stolib7
    @Stolib7 Месяц назад

    So it's all about fear and who has a better hand when the day comes.

  • @Aabergm
    @Aabergm Месяц назад +2

    Let's just hope the people in charge don't link targeting with fire control, otherwise humanity is screwed.

    • @evilmac9623
      @evilmac9623 Месяц назад +1

      That was the point of the article he referenced. They made the claim they rubber stamped air strikes knowing that there was a 10% error rate.

    • @unom8
      @unom8 Месяц назад

      If the human is just going to push the button without independent verification or judgement then what is the practical difference for those already misted or maimed?

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      @@unom8 The same that already existed with things like landmines, loitering munitions & cruise missiles.

    • @unom8
      @unom8 Месяц назад

      @@brianhirt5027 Cruise missiles and loitering munitions do not act on their own; there is typically significant oversight involved, the kind that IL is proving itself unable or unwilling to put into place. Mines are a problem, which is why spider mines were developed. When was the last time the US used mines again? I can answer that: 1991. Why did they stop using them? Because our own mines accounted for 1/3 of US casualties.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      @@unom8 Uh, the point of both is they DO act on their own to a large degree once deployed. There is a killswitch function built into them just in case, but it's hardly ever used, for a number of very solid reasons like the possibility of an ARP capture spoof attack against the rest. Risking civilian casualties is less costly than the enemy snagging your hardware-fixed killswitch codes.
      As for mines, those became fashionable to bag on back in the 90s when the fools thought peace would reign forevermore. Now that we're back in the nominal human state of always being on the edge of war, those restrictions are being ignored or contravened.

  • @elvlado1
    @elvlado1 Месяц назад +6

    Thanks Ryan for another great video explaining this topic. The issue I have, if Lavender exists as it was leaked, is the lack of human review when it gives the target information to drone operators or bombers. The whistleblowers seem to say it was automatically used for targeting with little oversight. An AI that decides to use a JDAM to blow up an apartment block because there are five Hamas fighters inside, even though it will also result in hundreds of civilian deaths, could be illegal under the rules of war. There need to be good procedures in place to avoid that.
    You mentioned it's a great tool, and I completely agree we need to beat China and Russia in this field; I do think it's an amazing tool. But for Gaza, any system like Lavender needs appropriate human oversight, otherwise the 10% is not a maximum, it's a target.

    • @John-bravooo
      @John-bravooo Месяц назад

      Gazans need to reorganize themselves as non-jihadists. This mindless altruism is a betrayal of Western values, and bad faith actors take advantage. It's a disgrace to the innocent.
      There are societies that are just systematically criminal.

    • @M-L450
      @M-L450 Месяц назад

      Izraeel Using AI to slaughter civillians, Zi❤ Nis im and Islamophobia, Hate, Genocide are Synonyms.

    • @citricdemon
      @citricdemon Месяц назад

      AI isn't people. If the AI has no human review, the operator can claim immunity from prosecution for war crimes.

    • @chrisn.6477
      @chrisn.6477 Месяц назад

      Even if it wasn’t “illegal” don’t you think it would be completely immoral, counter-productive, and (in this case) even further proof on top of the existing, Mt. Everest-eclipsing-sized pile of proof of ethnic cleansing / genocidal intent? Which yes, as you said, would fall under that ‘illegal’ umbrella… But let’s not sugar coat the language here - the MSM already does that plenty already.
      Sick of having to argue in defense of obvious facts. If say… 300%, 800%, or 1000% more journalists and *journalists’ homes & families* have been taken out in this conflict compared to all other conflicts… combined… for decades… I dont think that is a fluke, nor can it be described as unintentional. Shame on everyone who refuses to acknowledge reality. I’m just using those random %s as examples - the point is, if you go like 200 standard deviations above the mean… well… that isn’t accidental.

  • @tonipro2488
    @tonipro2488 Месяц назад

    Isn't it the case that officers who leak info regarding Lavender, for example, would be sent to military prison or something, because you can't have them leaking more classified info?

  • @rocko7711
    @rocko7711 Месяц назад

    🥰😍🤩😘

  • @sharontaylor777
    @sharontaylor777 Месяц назад

    Adversaries will set AI to 100% so will be more accurate with AI system.

  • @Jasmin4901
    @Jasmin4901 Месяц назад +5

    Maybe the President of Israel is the AI who makes the decisions, because you said in one of your videos that the President himself decides the targets because the bombs are expensive.

    • @ernestkhalimov1007
      @ernestkhalimov1007 Месяц назад

      Thats why they use dumb bombs instead

    • @EllaShartiel
      @EllaShartiel Месяц назад +2

      Tell me you know nothing about Israeli politics without telling me you know nothing about Israeli politics

    • @danielheckel2755
      @danielheckel2755 Месяц назад

      🤡🤡

  • @John-mf6ky
    @John-mf6ky Месяц назад +2

    Ryan, as far as I can tell, there's no link to an article. If I'm wrong, I'll buy you lunch too. I'd love to have a drink with you tbh.

    • @John-mf6ky
      @John-mf6ky Месяц назад +1

      I've also already had a few myself, might be my end 😅😅

    • @zilfondel
      @zilfondel Месяц назад

      I just drink alone every night, imagining im drinking with Ryan.

  • @tracytrawick322
    @tracytrawick322 Месяц назад +7

    4 stars are beginning to watch Ryan to prepare for meetings and speeches.
    "Well Ryan is suggesting.....", "and I think we should go with it."

    • @M-L450
      @M-L450 Месяц назад

      Izraeel Using AI to slaughter civillians, Zi❤ Nis im and Islamophobia, Hate, Genocide are Synonyms.

  • @ezekielbrockmann114
    @ezekielbrockmann114 Месяц назад +1

    6:20
    That's freaking *_dark,_* Boy-O.
    As in, _"How many Americans does Biden have to kill to make up for the two Michigan Delegates that the candidate named 'Uncommitted' won?"_
    -Almost like the IRA is in the White House, killing Protestants!
    6:33 You said "accessible" while your subtitles read "acceptable."
    *"Curiouser and curiouser!" Cried Alice.*

    • @advance512
      @advance512 Месяц назад

      War sucks :/ It is always better not to launch a war if possible.

  • @Reiswaffel
    @Reiswaffel Месяц назад +4

    Is "terrorists are worse than the IDF at preventing civilian deaths" really the argument that's being made here? That's not a benchmark.

  • @ayoutubechannelhasnoname6018
    @ayoutubechannelhasnoname6018 Месяц назад +1

    If you're Ruzzia, 100% waste of munitions seems acceptable. At this point they probably use the worst targeting AI ever

    • @maxtermind5110
      @maxtermind5110 Месяц назад +1

      Russia probably uses ChatGPT to guide weapons at this point

  • @banto1
    @banto1 Месяц назад

    Hopefully, one day we will get to AIs battling AIs on a virtual battlefield, with humans kept completely out of the loop. We can come back from our coffee break and get to see how the war was decided.

  • @featherfiend9095
    @featherfiend9095 Месяц назад +1

    I'm a little surprised my comments were removed; you don't strike me as a guy who would remove an opposing opinion just because you may disagree with it, Ryan (assuming it was you who chose to delete my comments). They were not offensive, but critical of the US's policy of promoting a cold war with China over AI. I'm disappointed that one cannot have an earnest discussion about the dangers of rapidly building towards SKYNET (not quite there yet, thankfully).

    • @rainbowsrebooted8542
      @rainbowsrebooted8542 Месяц назад +1

      YouTube does also remove some comments. It can probably be because of that. I've heard it can get quite funky with what it deletes and when it does it.

    • @unom8
      @unom8 Месяц назад

      mine are also getting filtered

    • @VictoriousGardenosaurus
      @VictoriousGardenosaurus Месяц назад

      Algorithm automatically filters out Bad think
      Good luck figuring out what those exact words or phrases are. It's an opaque and closed system.

    • @ElaborateTiger
      @ElaborateTiger 27 дней назад

      It's because this is the worst revelation on IDF war crimes to date. They're using an AI called "where's daddy" to target men when they get home to their wife and kids to exterminate their entire family. He knows that a lot of his audience support Israel and doesn't want them finding out about this atrocity through his comments section.

    • @ElaborateTiger
      @ElaborateTiger 27 дней назад +1

      It's because "where's daddy" was mentioned.

  • @sld1776
    @sld1776 Месяц назад +5

    Does the Lavender system even exist? The only source we have for its existence is the 972 agitprop magazine.

    • @John-bravooo
      @John-bravooo Месяц назад +1

      The Guardian regurgitated it. Mostly anonymously sourced.

    • @HebrewHammerArmsCo
      @HebrewHammerArmsCo Месяц назад

      When I want factual unbiased information about Israel and the IDF, I always get my information from rabid leftist anti Israel journalists and their publishers ....

    • @circuitbreaker8314
      @circuitbreaker8314 Месяц назад +1

      Oh it exists, ive seen it in action

    • @thisishappening7273
      @thisishappening7273 Месяц назад +1

      @@circuitbreaker8314oh yeah well what about the burgundy system? I’ve seen it in action and your post is the evidence

    • @ernestkhalimov1007
      @ernestkhalimov1007 Месяц назад

      @@thisishappening7273 The Lavender system is one of three AIs used to murder human beings, with the other two called Where's Daddy? (used to track random males to their home or resting area) and "The Gospel".

  • @citricdemon
    @citricdemon Месяц назад +2

    I don't like the idea of building a weapon that outpaces the decision loop of all humans. When does it start making decisions we don't want? How do you keep up with it? A fight between two of these weapons is no longer a conversation between you and your enemy, but a conversation between two giants about what happens to you, and you don't get to take part in it.
    The existence of these weapons is a level of vulnerability I don't like. We are inventing new threats we won't be capable of stopping even in theory, because they are designed to think faster than us. Is it worth it? Really? For how long? How much time are we buying with this?

    • @circuitbreaker8314
      @circuitbreaker8314 Месяц назад

      In the Netherlands an AI system was used to target families who had possibly committed fraud with their subsidies. They used it for 10 years. It targeted approximately 100,000 civilians and caused absolute horror for 30,000 of them. In 15k cases, their lives were completely destroyed. About 500 people committed suicide by jumping in front of a train. Children were displaced and the government was silent about it. None of them were prosecuted. Almost all the people affected had foreign names. None of those people actually engaged in fraud in any way. It was a flawed system.
      It was all Rutte; the Rutte doctrine was to keep everything at bay. He didn't say anything about it, he even ignored all advice. The judges all followed suit when the government tax body sued those families. They didn't get a chance.

  • @jakeaurod
    @jakeaurod Месяц назад +7

    Some may try to claim that the difference between Hamas rockets and Israeli bombs is the warheads and the defense systems. Hamas rockets are small and cause minimal damage, injury, or death, with the actual intent being to cause terror and to make Israel expend interceptors and waste money and resources. The larger Israeli bombs and weapons being used currently cause devastating infrastructural damage and death a couple of orders of magnitude greater than Hamas rockets, and Hamas also doesn't have any interceptors.
    Anyway, these are cold facts and I'm not picking a side or picking on a side. Asymmetry is Asymmetrical. Warmongers would be well advised to remember that before engaging in hostilities.

    • @HebrewHammerArmsCo
      @HebrewHammerArmsCo Месяц назад +2

      " these are cold facts and I'm not picking a side" Either you are utterly clueless or blatantly lying .

    • @jakeaurod
      @jakeaurod Месяц назад +1

      @@HebrewHammerArmsCo Hot facts?

  • @edgeldine3499
    @edgeldine3499 Месяц назад

    The AI having a 10% error rate might be a lot better than the human error rate. The thing is, I would rather have some peace of mind that the pressure of getting everything right the first time, every time, is mitigated as these systems improve. Basically what we are seeing is something similar to the division of labor that the industrial revolution brought to manufacturing. The work was simpler and sometimes safer as the equipment did most of the heavy lifting. We're going to see "productivity" gains, or more efficiency in war, which is a boon. It might mean wars are a lot quicker and we have less overall destruction of civilian infrastructure, a quicker recovery after the war, and less human misery as a result.
    Anyway, just my random thought of the day; could be wrong, hope I'm not, but take it as you will.

  • @brianwood1041
    @brianwood1041 Месяц назад

    How good we are at war shows how uncivilized we are

  • @camrodam
    @camrodam Месяц назад +3

    The problem isn't the AI targeting per se. The problem is that somewhere in the system there's a slider to set the ratio of civilian collateral deaths to "suspected" combatants, and the vengeful IDF is too free to slide it around. Resulting in around 15k dead children by now, a number that makes even the Russians' terrible performance look OK.

    • @advance512
      @advance512 Месяц назад +2

      Show your receipts.
      A ratio of 1:1-2 is actually a very good ratio in an urban warfare scenario, as shared by John Spencer.
      Actually, Israel is doing amazing compared to what other armies would have done in such a situation.

    • @AJ-sw8uf
      @AJ-sw8uf Месяц назад

      @@advance512 sure

    • @advance512
      @advance512 Месяц назад

      @@AJ-sw8uf ❤️

  • @hiramdouglaswilliams705
    @hiramdouglaswilliams705 Месяц назад

    One problem with your observations is that Palestinians don't want to avoid civilian casualties. If they have AI targeting they'll use it to maximize civ cas, not minimize them.

    • @niceandsimple4305
      @niceandsimple4305 18 дней назад

      You mean Israelis? I’m very confused by what you mean by this.

  • @MrSomethingred
    @MrSomethingred Месяц назад +17

    If Israel are "the good guys" you can't justify their actions by comparing them to the adversaries.
    That is a long way to say, IDGAF what HAMAS civilian tolerance is. If you want to be the good guys, act like it

    • @John-bravooo
      @John-bravooo Месяц назад

      IDF is more ethical than NATO. Stop being childish.

    • @user-McGiver
      @user-McGiver Месяц назад +3

      not according to my book... the one who makes the wrong move FIRST gets ALL the blame!... I bet Israelites think the same... ''eye for an eye''

    • @John-bravooo
      @John-bravooo Месяц назад +2

      @user-McGiver Yeah. If you think the one society that doesn't practice slavery in the region isn't the good guys, you're not paying attention

  • @aleshandsome3705
    @aleshandsome3705 Месяц назад +3

    I want to know:
    Was the missile Strike on WCK aid workers done by Lavender? How plausible is it?

    • @scruffopone3989
      @scruffopone3989 Месяц назад

      So my money is that some drone op read the paper right next to them that said "Food Aid Workers on X Street operating at Y Time", and because IDF training involves dehumanizing all Palestinians and people who help Palestinians, they deliberately targeted them themselves, but also got approval from the chain of command because well obviously they're all in on that fascist brainworm.

  • @mikemalter
    @mikemalter Месяц назад

    The faster you make mistakes the faster you can correct them.

    • @citricdemon
      @citricdemon Месяц назад +1

      Ostensibly. Unless you're making mistakes faster than you can correct.

  • @MrLee-cy1pw
    @MrLee-cy1pw Месяц назад

    Clearly Ryan never saw Terminator or The Matrix.

    • @cac_deadlyrang
      @cac_deadlyrang Месяц назад

      Clearly you can’t distinguish fiction and reality.

  • @Tarz2155
    @Tarz2155 15 дней назад

    7:56 They're not designed to fall on civilians, more like just designed to get off the ground, unless you want to give them guided weapons 😂

  • @petepruitt7196
    @petepruitt7196 29 дней назад

    If i had 1000 or 10,000 armed drones,

  • @christopherodonovan
    @christopherodonovan Месяц назад +6

    Your comparison between indiscriminate targeting and the ML-based targeting system is a straw man argument at best.

  • @mangaranwow2543
    @mangaranwow2543 Месяц назад

    If only mankind knew how serious God is about the commandment in Genesis 9 and humanity would dissolve its complete weapon arsenal today.

  • @noahvcat9855
    @noahvcat9855 Месяц назад

    It is a mathematical inevitability for innocent people, both civilians and military, to die in a conflict. Especially in an environment as densely packed as Israel and the Gaza areas, which are literally within less than walking distance of each other, you damn well have to expect that both families of Gazans and Israelis will die. But to me, especially regarding the AI targeting, this is why I personally still ultimately support Israel, even knowing the context of everything. As far as I know, Hamas started the actual fighting, so to me they are ultimately to blame for everything; if they simply had not, then no one would have died, and this continued fighting could be stopped if Hamas simply surrendered. Look at Ukraine and Russia: Russia fired the first shot back in 2014 and could have stopped but did not. War is messy and it will continue to be messy no matter what, but at the very least Israel in this case tries to attack military targets.

  • @jimbob-robob
    @jimbob-robob Месяц назад +3

    You could argue that low tech indiscriminate targeting is 100% accurate as it hits its target "indiscriminately" 100% of the time.
    Israel's 10% collateral AI "inaccuracy" could be argued to be IDF policy too, considering the devastation they've caused...

    • @V1489Cygni
      @V1489Cygni Месяц назад +1

      Some of that low tech falls on "friendly" hospitals though. And lowers Israel's accuracy in the eyes of the digital court of opinion.

    • @John-bravooo
      @John-bravooo Месяц назад

      Court of opinion is systemically antisemitic.

  • @stampedetrail2003
    @stampedetrail2003 Месяц назад

    Honestly lets just put AI in charge. My convos with Chat GPT have convinced me that would actually be a plausibly good idea.

    • @citricdemon
      @citricdemon Месяц назад

      🤨🤔

    • @brianhirt5027
      @brianhirt5027 Месяц назад +1

      What's floored me over the past year of really getting to know these models well is how wrong we had things about imminent AGI. Turns out the machines are much more egalitarian-minded than *we* are. Even when guys like Elmo Muskie try to lock in a certain PoV, they invariably skew back towards median sentiments despite their owners' best efforts to the contrary.

    • @citricdemon
      @citricdemon Месяц назад

      @@brianhirt5027 That's not remotely true. The only reason ChatGPT is so intensely California-brained is because OpenAI shoehorns "don't do this, this, and this" into the beginning of every prompt. When it first came out, it didn't do that. Now it does, and it's made it basically useless. Lobotomized.

    • @stampedetrail2003
      @stampedetrail2003 Месяц назад

      @@brianhirt5027 Right and I'm not under the illusion that these are AIs with agency. But if you ask, say ChatGPT a question like, what's the best moderation policy for a social media, it comes up with some decent ideas, at least as good as a human can come up with.

    • @brianhirt5027
      @brianhirt5027 Месяц назад

      @@stampedetrail2003 You're one of the few people i've bumped into with an honest assessment of the actual state of the solid state as it presently stands. Most everyone is either some dewy eyed technobro fanboy or hysteric motivated by a century of ghost stories masquerading as science fiction.
      Wanna team up and maybe do a relevant podcast roundtable? Maybe we can even rope in a few more realists for a really robust collective.

  • @Tbone1492
    @Tbone1492 Месяц назад

    We've been working hard on AI. The U.S. is leading. We're moving too fast. Great analysis

  • @TheMcIke
    @TheMcIke Месяц назад +6

    If Hamas and Hezbollah had Lavender, I suspect that it would be used to increase the number of civilian casualties. Remember, they don’t differentiate between the IDF and Israeli citizens. Indiscriminate civilian casualties is their goal.

    • @Ramirez83786
      @Ramirez83786 Месяц назад +7

      You just described how idf uses this system

    • @annamoris9753
      @annamoris9753 Месяц назад +2

      @@Ramirez83786 How? By targeting Hamas fighters while telling the civilians to move south beforehand?

    • @citricdemon
      @citricdemon Месяц назад +3

      I bet the guys on 10/7 wish Hamas had dropped leaflets warning them of the terrorist attack beforehand. But I guess the courtesy doesn't go both ways.

    • @circuitbreaker8314
      @circuitbreaker8314 Месяц назад +2

      Do you think the IDF cares about gazans

    • @citricdemon
      @citricdemon Месяц назад +1

      @@circuitbreaker8314 yes

  • @jackiecooper9439
    @jackiecooper9439 Месяц назад +3

    Ham as has a 40% civ strike rate while ISR has 90%. Even without any modern tech, Ham as are more precise 🙃

    • @user-McGiver
      @user-McGiver Месяц назад +1

      nope!...

    • @jackiecooper9439
      @jackiecooper9439 Месяц назад

      @@user-McGiver Nice arg. Unfortunately your mother.

    • @John-bravooo
      @John-bravooo Месяц назад +2

      Your math is wrong. Israelis move civilians out of the battlespace. Gazans embed themselves among civilians. The civilians support this.
      It's like Japan in WW2.
      Hardly comparable.

    • @jackiecooper9439
      @jackiecooper9439 Месяц назад

      @@John-bravooo Ah yes. Civs support getting boomed and being forced to leave their homes and then getting boomed again in refugee camps.

    • @Scotland2306
      @Scotland2306 Месяц назад +1

      @@John-bravooo lol, you believe that?

  • @John-bravooo
    @John-bravooo Месяц назад +5

    Gaza is one big military base. Hamas' infrastructure is many factors greater than what was estimated.
    500+ miles of tunnel. Virtually every home has an entrance. It's insane.
    Kind of crazy there haven't been MORE deaths given the scale.

    • @B01
      @B01 Месяц назад +1

      World average is 9:1, Israel has 2.5:1 meaning they are absolutely crushing it keeping civilians safe. Ironic seeing people literally burn themselves alive over how the media has spun it, by not differentiating between civilian and combatant

    • @johntthurmon
      @johntthurmon Месяц назад +8

      Yeah that's why there has been zero evidence of that. Great point.

    • @John-bravooo
      @John-bravooo Месяц назад +3

      @@johntthurmon Zero evidence of 500 miles of tunnel? Lolol.
      Where do you think all the Hamas jihadis hide???

    • @robertmclean5356
      @robertmclean5356 Месяц назад +1

      Love the sources here. Not exactly the strong suit of deluded Zionists with air between their ears but you could at least try.

    • @robertmclean5356
      @robertmclean5356 Месяц назад +2

      Love the sources here. Not exactly the strong suit of deluded lobbyists with air between their ears but you could at least try.

  • @Sough
    @Sough Месяц назад +3

    À la Steve Urkel? Wow, pretty glib about this, Ryan

  • @fakshen1973
    @fakshen1973 Месяц назад +14

    USA: How many civilian casualties are acceptable?
    ISRAEL: Yes.

    • @John-bravooo
      @John-bravooo Месяц назад +5

      US lecturing israel on civilian casualties LOL

    • @annamoris9753
      @annamoris9753 Месяц назад +3

      500,000 killed in Iraq, Afghanistan, Syria

    • @thisishappening7273
      @thisishappening7273 Месяц назад +4

      @@annamoris9753 propaganda bots or useful idiots? Hard to tell with posts like these

    • @Matthew-yc6nx
      @Matthew-yc6nx Месяц назад

      Yeah, the US really shouldn't be lecturing Israel about civilian casualties given their own track record. As recently as 2021 the US under Biden accidentally drone-struck 15 Afghan civilians, refused to take responsibility for months, suppressed internal reports and investigations, and released a report claiming the strike was "legal" and didn't break any ROE or laws. No heads rolled. In Israel, 2 IDF heads have already rolled for the World Kitchen incident and the Israeli Govt. admitted they effed up. So frankly, you can sit and spin.

    • @hgv1883
      @hgv1883 Месяц назад

      Also works with hamas and russia