Apple Has Begun Scanning Users' Files EVEN WITH iCloud TURNED OFF

  • Published: 25 Nov 2024
  • In this video I discuss how several news outlets announced that Apple would no longer pursue scanning people's iCloud accounts for CSAM, and a blog post that appears to show that Apple is indeed scanning local filesystems on macOS without users' consent (iCloud and analytics turned off)
    sneak.berlin/2...
    ₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿
    Monero
    45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436
    Bitcoin
    3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV
    Ethereum
    0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079
    Litecoin
    MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
    Dash
    Xh9PXPEy5RoLJgFDGYCDjrbXdjshMaYerz
    Zcash
    t1aWtU5SBpxuUWBSwDKy4gTkT2T1ZwtFvrr
    Chainlink
    0x0f7f21D267d2C9dbae17fd8c20012eFEA3678F14
    Bitcoin Cash
    qz2st00dtu9e79zrq5wshsgaxsjw299n7c69th8ryp
    Ethereum Classic
    0xeA641e59913960f578ad39A6B4d02051A5556BfC
    USD Coin
    0x0B045f743A693b225630862a3464B52fefE79FdB
    Subscribe to my YouTube channel goo.gl/9U10Wz
    and be sure to click that notification bell so you know when new videos are released.

Comments • 2.1K

  • @paull1248
    @paull1248 Год назад +5231

    Every time I hear "Think about the children", it makes me sick to my stomach. I'm sure a trillion-dollar company with ridiculous margins, which exploits people in third-world countries to push production costs even lower, truly cares deeply about children. The fact that they're trying to encroach on your privacy is bad enough; hiding behind children while doing so is disgusting.

    • @O1OO1O1
      @O1OO1O1 Год назад +142

      "that's for using children"
      - Trinity, Matrix Resurrections

    • @powerdude_dk
      @powerdude_dk Год назад +26

      Solid point there

    • @christiangonzalez6945
      @christiangonzalez6945 Год назад +220

      Another thing that makes me sick to my stomach: "are you hiding something?" Yeah, put 24/7 cameras in your bedroom then, if you aren't hiding anything.

    • @Mustachioed_Mollusk
      @Mustachioed_Mollusk Год назад +95

      We’re going to need you to stop posting opposing opinions because what if some children read what you said? Be responsible.

  • @ichigo_nyanko
    @ichigo_nyanko Год назад +2419

    I can totally see this being (quietly) adapted to scan for copyrighted material and snitch to the police whenever it's found.

    • @jer1776
      @jer1776 Год назад +470

      Yep, as well as memes that promote "hate".

    • @RaaynML
      @RaaynML Год назад +51

      Anyone can have a different definition of freedom

    • @xXx_Regulus_xXx
      @xXx_Regulus_xXx Год назад +236

      that's definitely one of the primary goals, they just use the unobjectionable goal as the one to get their foot in the door.
      "we're gonna make sure there's no CP on your computer, which we know you don't have anon so don't worry about it. And while we're in there we're gonna check for pirated anime, maybe see if that netflix login in your keyring was borrowed from anybody. But you're not some TOS breaking freak, so you have nothing to worry about right bro?"

    • @zeppie_
      @zeppie_ Год назад +98

      I think it’s more likely that it straight up deletes those files from your hard drive and when you try to open it again you get some annoying as fucc notification like “we couldn’t find your files :(“

    • @youreyesarebleeding1368
      @youreyesarebleeding1368 Год назад +2

      @@jer1776 These heckin trolls on 4chan spreading H8 sp33ch memes are currently the BIGGEST THREAT TO OUR DeMoCrAcY!!!! We HAVE to scan your personal files, it's to protect Democracy! and the children! and minorities! and immigrants!!!

  • @SergioLeonardoCornejo
    @SergioLeonardoCornejo Год назад +777

    Protecting children is always the excuse to impose authoritarian measures.

    • @ilearncode7365
      @ilearncode7365 Год назад

      The same people that think a bad sequence of pixel values is the worst thing ever because they care about children so much, are the same people that support a woman’s right to kill their own child while in the womb.

    • @HisCarlnessI
      @HisCarlnessI Год назад +31

      We both thinking about gun rights, and statistically crazy rare events?

    • @flamestoyershadowkill
      @flamestoyershadowkill Год назад +14

      Or minorities

    • @UninspiredUsername40
      @UninspiredUsername40 Год назад

      Or terrorism

    • @Akab
      @Akab Год назад +16

      @@HisCarlnessI A bit of regulation like a background check should always happen (it already does in most states anyway), but guns should definitely not be banned or even over-regulated, yes.
      And yeah, "think of the children" is really such an old excuse; you would think people would've seen through that farce by now, but they still don't 🙄

  • @kinomoto3633
    @kinomoto3633 Год назад +458

    one step closer to the day when having "offline storage" will be a suspicious, borderline criminal thing to have

    • @BlazeEst
      @BlazeEst Год назад +11

      Bruh 💀

    • @swish6143
      @swish6143 Год назад +57

      Same as cash.

    • @leandrogl2
      @leandrogl2 Год назад +27

      You can be jailed for failing to decrypt a hard-drive or file.

    • @RogueAmendiaresyourgirl
      @RogueAmendiaresyourgirl Год назад +10

      Time to switch to SD cards and USB drives with Veracrypt-encryption.

    • @NihongoWakannai
      @NihongoWakannai Год назад +13

      @@leandrogl2 well yeah no shit, if they have a warrant then they have the right to demand you show it to them. The difference is that they need a court ordered warrant to do that, it's not a private company looking at your files whenever they want.

  • @DarkMetaOFFICIAL
    @DarkMetaOFFICIAL Год назад +556

    Trusting Apple with your privacy is like having Epstein as the family babysitter

    • @anglepsycho
      @anglepsycho Год назад +29

      Or Lena Dunham as the homeschool teacher.

    • @DarkMetaOFFICIAL
      @DarkMetaOFFICIAL Год назад +2

      @@anglepsycho lmao

    • @diablo.the.cheater
      @diablo.the.cheater Год назад +13

      Or trusting me with running a government.

    • @JayRagon
      @JayRagon Год назад +16

      @@diablo.the.cheater that's honestly not bad enough to compare to trusting apple with privacy

    • @Yorkshire42069
      @Yorkshire42069 Год назад

      What do you do to get around it?

  • @bremsberg
    @bremsberg Год назад +860

    Really miss the times when we had more hardware choices rather than between "don't be evil" trash bin and "think different" trash bin.

    • @nevabeensmart
      @nevabeensmart Год назад +126

      Monopoly is more than a kids game...

    • @Arimodu
      @Arimodu Год назад +98

      It's not really the hardware so much as the pre-packaged software. If someone took a recent Samsung, slapped a decent ROM on it and re-sold it, you would have your good choice, but that person would also have a BIG FAT LAWSUIT in their lap the next day.

    • @villagernumber77
      @villagernumber77 Год назад +9

      You forgot the "Where do you want to go today?" rubbish bin

    • @TheOfficalAndI
      @TheOfficalAndI Год назад +29

      And the first trashbin dropped the whole not being evil thing.

    • @vallisdaemonumofficial
      @vallisdaemonumofficial Год назад +74

      @@Arimodu This is why it should be as easy to build a phone as it is a PC. Seriously, fuck a lotta this hardware that's glued together.

  • @kevina.4036
    @kevina.4036 Год назад +582

    "We know what is best for you. Think of the children". Atrocities have been committed for less.

    • @wrongthinker843
      @wrongthinker843 Год назад

      "Think of the children" - a megacorporation that exploits child labor

    • @YouKnowMeDuh
      @YouKnowMeDuh Год назад +3

      Less? You can drop a lot of words: "We know what's best."
      Yep, it's happened.

  • @mad_vegan
    @mad_vegan Год назад +133

    Soon we'll have "personalized ads" based on the types of files we have on our devices.

    • @Dave102693
      @Dave102693 Год назад +23

      Google and MS already do this

    • @riftsquid7659
      @riftsquid7659 Год назад +14

      You’re already getting ads based on every single thing that comes out of your mouth. 99% of your apps have baked in microphone access.

    • @Dave102693
      @Dave102693 Год назад

      @@riftsquid7659 exactly

    • @Hackanhacker
      @Hackanhacker Год назад

      @@Dave102693 They aren't on my device, and it shows. Mostly I see no ads anywhere, and when one or two slip through they have nothing related to me. It's pretty hard to get rid of those systems though (on a phone).

  • @Hypnotically_Caucasian
    @Hypnotically_Caucasian Год назад +229

    company: “wOnT sOmEbOdY pLeAsE tHiNk oF tHe cHiLdRen?!?”
    Also company: * uses child labor in India to make $1,000 phones in horrific conditions *

    • @decoy3418
      @decoy3418 Год назад +25

      It didn't happen in America so it's not real.

    • @sorryi6685
      @sorryi6685 Год назад +4

      There is definitely no child labour in India making iPhones. Child labour exists in India, but in the unorganised sector and in the interior of the country, definitely not for iPhones. That would be a PR nightmare. Why take the risk when there is a huge adult population who will work for very low wages?

    • @avyam7509
      @avyam7509 Год назад +7

      It's china.

    • @justaweeb14688
      @justaweeb14688 Год назад +3

      It's impossible to avoid human rights violations somewhere in the manufacturing process. Fairphone tried and failed.

    • @Schopenhauer69
      @Schopenhauer69 Год назад

      China maybe. Child labor is a crime in India.

  • @Blood-PawWerewolf
    @Blood-PawWerewolf Год назад +908

    I called it the moment they “quietly” abandoned the CSAM BS last year.
    We all knew it would be snuck in without our knowledge

    • @21N13
      @21N13 Год назад

      Do you have any evidence that this has been snuck in? The process that this YouTuber who earns money off of this video mentions, without going into detail on while basing the entire video on it, has existed on macOS for over a decade. You can now select text in images, images are scanned for subjects and Visual Look Up fetches related information over the web. Both you and the YouTuber earning money off of this clickbait video could've done 1 web search and discovered everything you could possibly want to know about how the process works. Instead you're spreading fake news and FUD here.

    • @timewave02012
      @timewave02012 Год назад +22

      Apple's original whitepaper described the scanning as host-side, using a form of homomorphic encryption to avoid revealing the perceptual hashes needed to craft preimage or collision attacks (think: attackers being able to craft false positive images). I don't support it in any way, but it should be no surprise, considering this is exactly how the system was originally designed to work.

    • @GareWorks
      @GareWorks Год назад +6

      Yeah, I think a lot of us had a pretty good idea of what they were really up to.

    • @RD-eh3tz
      @RD-eh3tz Год назад

      @@timewave02012 typical Apple being homomorphic, when will they learn to tolerate minorities.

  • @nigeltheoutlaw
    @nigeltheoutlaw Год назад +1484

    I'm really grateful to guys like you and Upper Echelon calling attention to problems like this that the mainstream is in full, unhesitating support of. This is some truly frightening, dystopian crap, and it's eternally disappointing how unintelligent the average NPC is that they lack any and all pattern recognition to realize "protect the children" is never the goal.

    • @ProteinFeen
      @ProteinFeen Год назад +24

      Thanks for the shoutout to upper echelon bouta check him out❤

    • @gamtax
      @gamtax Год назад +32

      @@ProteinFeen That guy went from a gaming channel to a commentary channel. Worth checking out.

    • @ProteinFeen
      @ProteinFeen Год назад +8

      @@gamtax yeah I just subscribed to him, I really like the cyber security info channels especially this one. Keeps things fresh.

    • @maxia8302
      @maxia8302 Год назад

      While I like UEG, he sometimes gets stuff wrong. Problem is, we don't track false predictions. Remember his claims that Musk would never buy Twitter, or that SBF would be Epsteined?

    • @RT-qd8yl
      @RT-qd8yl Год назад +5

      That's why we need to round up the NPCs and put them all in a prison camp.

  • @arghpee
    @arghpee Год назад +181

    >petite woman nudes false flagged by Apple Bot
    >sent to Apple HQ
    all part of the plan.

    • @appalachiabrauchfrau
      @appalachiabrauchfrau Год назад +22

      tfw 4ft11 womanlet with babyface realizing Apple has turned me into a weapon, gg.

    • @arghpee
      @arghpee Год назад +6

      @@appalachiabrauchfrau mfw 6'5 and own an Android. 🗿

    • @BlazeEst
      @BlazeEst Год назад +21

      Just deleted nsfw content of petite adult women because this video got me paranoid; in the future there are gonna be laws against being attracted to youthful petite women

    • @synexiasaturnds727yearsago7
      @synexiasaturnds727yearsago7 Год назад

      @@arghpee damn, should've trolled big bro
      whatever, you're putting the "big bro" in the first sentence anyways

    • @anglepsycho
      @anglepsycho Год назад +7

      Lmao, I'm concerned for East Asian middle-aged women now, better get that surgery if you don't want anything bad done to your chest online.

  • @paulosullivan3472
    @paulosullivan3472 Год назад +548

    I think it's safe to assume all tech companies are doing this with the full support of the governments.

    • @Micchi-
      @Micchi- Год назад +61

      Always has been *gunshot

    • @-_Somebody_
      @-_Somebody_ Год назад +1

      @@Micchi- I see what you did there

    • @warlockpaladin2261
      @warlockpaladin2261 Год назад +32

      Apple's relationship with China adds an interesting wrinkle to this fabric.

    • @benni1015
      @benni1015 Год назад +5

      If you look at what the EU Commission is doing right now, I feel like many governments want this to go much, much further.

  • @4.0.4
    @4.0.4 Год назад +70

    Using children as human shields for surveillance technology is in line with using them in sweatshops to make the iPhones 👍

  • @cubusek5849
    @cubusek5849 Год назад +316

    Maybe they should also detect photos of you repairing your Apple device, so they can send the secret anti-right-to-repair police

    • @canardchronique3477
      @canardchronique3477 Год назад +11

      Are you under the impression that they aren't currently doing exactly that? On the remote chance they aren't yet, maybe you shouldn't give them any ideas...

    • @Hackanhacker
      @Hackanhacker Год назад

      at some point apple is gunna make transformers for real damn it

    • @Network126
      @Network126 Год назад +2

      @@Hackanhacker Lol how old are you?

    • @KyzoVR
      @KyzoVR 8 месяцев назад

      @@Network126 i am 4

  • @janpapaj4373
    @janpapaj4373 Год назад +826

    HOW THE FUCK DID APPLE TRAIN THE DETECTION MODEL

    • @Bloom_HD
      @Bloom_HD Год назад +466

      3 letter agencies have huge databases of cp. That's where it comes from whenever one of their whistleblowers wakes up to find 18TB of it uploaded onto his HDD and the local police "anonymously" tipped.

    • @alifahran8033
      @alifahran8033 Год назад

      The elites' private collection. A lot of people were regulars on Epstein's island.

    • @o-hogameplay185
      @o-hogameplay185 Год назад +50

      it is actually not that difficult. they just hash every image and see if it matches hashes of known csam

    • @hihihihi3806
      @hihihihi3806 Год назад

      @@Bloom_HD the agencies r the real pedos

    • @Bloom_HD
      @Bloom_HD Год назад +209

      @@o-hogameplay185 Take the image, slightly alter it by shifting the brightness 0.1% or adding a single off-color pixel in the corner, and boom: new hash.
      That's not what they do (see the sketch after this thread).
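
A minimal sketch of the distinction discussed in this thread, in pure-stdlib Python: a cryptographic checksum changes on any one-pixel edit, while a toy perceptual "average hash" does not. The 8x8 "image" is synthetic and the hash is purely illustrative, not Apple's NeuralHash or any real CSAM matcher.

```python
import hashlib

def sha256_of(pixels):
    """Cryptographic hash of the raw pixel bytes: any change flips it."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image mean. Tiny brightness tweaks rarely flip bits."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

original = [10, 200, 30, 180, 50, 220, 40, 190] * 8  # fake 8x8 grayscale image
tweaked = original.copy()
tweaked[0] += 1                                      # one-pixel brightness nudge

print(sha256_of(original) == sha256_of(tweaked))        # False: checksum broke
print(average_hash(original) == average_hash(tweaked))  # True: still "matches"
```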

  • @VBYTP
    @VBYTP Год назад +650

    One more reason I'm really glad I didn't buy an Apple device this Christmas. IMO if you want to protect children, start putting in real punishments for child abusers and leave good normal people alone.

    • @notafbihoneypot8487
      @notafbihoneypot8487 Год назад +74

      It's like getting rid of encryption, it would only hurt everyone. Even if it's for a good cause. I don't trust that it wouldn't be abused on a massive level

    • @pluto8404
      @pluto8404 Год назад +87

      That would be the logical thing to do. But as we have seen in the gun debate or war on drugs, it is about controlling the normal people, they dont care about the criminals.

    • @ryderostby
      @ryderostby Год назад

      Its always the safety excuse, always when it comes to taking away your privacy because ‘the less control you have the better’. Yeah I feel even better giving my data to the biggest data whores on this fucking planet

    • @m0-m0597
      @m0-m0597 Год назад +23

      come on guys, who really needs privacy? Let them just look through you. All will be fine. Also, isn't it just annoying being constantly concerned about stuff like that? Enjoy life and stop thinking :-)

    • @SergioLeonardoCornejo
      @SergioLeonardoCornejo Год назад +39

      Children are the excuse because they know that appeals to the feelings of people

  • @Angelarius82
    @Angelarius82 Год назад +105

    One thing about this that no one seems to have considered: whatever this software is, it will only ever be able to make suggestions about what a photo "might" be. Before any police or authorities are involved, a real human will need to look at the suggestions the AI has given. If you have private photos of yourself on your computer, that might be enough to trigger a suggestion, and next thing you know some stranger is looking at them because an AI said they "might" be something.

    • @Axman6
      @Axman6 Год назад +9

      The technology they were proposing to use was specifically not AI based, but Mental Outlaw just made assumptions because he’s too lazy to do any research at all. There’s not a single piece of evidence that anything here has anything to do with CSAM - go read the original blog post and see if you can find anything. IIRC, they were even going to require several hits against known CSAM before any action at all would be taken.

    • @nonormies
      @nonormies Год назад +53

      @@Axman6 do corporate boots actually taste like apples?

    • @Axman6
      @Axman6 Год назад +14

      @@Channel-gz9hm I'm not a fan of any of this, and I'm glad Apple reversed the decision, but just making up facts without evidence and getting mad is pathetic and delusional. This is QAnon-level leaps of logic. Paid shill, I wish; I'm just interested in verifiable facts instead of sensationalism with zero evidence.

    • @anglepsycho
      @anglepsycho Год назад +2

      It probably will be both AI and human error in the way since, currently, AI has to mess up and blend in with brush tools to make a form of a match like DALLE.

    • @thesenamesaretaken
      @thesenamesaretaken Год назад +8

      @@Axman6 It's a QAnon-level leap of logic to think tech companies that datamine customers and collude with governments might do both at the same time? wew

  • @anthonyeid6534
    @anthonyeid6534 Год назад +366

    TLDR: mediaanalysisd is a process that has been around for years; its purpose is to send neural hashes to Apple and get back information on what those hashes mean, such as whether there's a cat in a photo, a painting, etc. It can be disabled by turning off Siri Suggestions under System Settings > Siri & Spotlight. Note that disabling mediaanalysisd will turn off Visual Look Up.
    mediaanalysisd is a part of Visual Look Up (VLU), a system Apple uses to recognize objects in photos (buildings, animals, paintings and more) and give users information about them.
    VLU works by computing a neural hash locally that distinguishes photos/videos by the objects within them. During VLU, the device sends Apple the neural hash; Apple computes what that hash represents and sends the result back to the user. I'm assuming the database used to interpret the hash is too massive to be stored locally and couldn't otherwise stay updated with new information.
    I really didn't like the article this video is based on, because it makes claims without any real evidence or research. mediaanalysisd has been around for years, and it's very easy to look up what it's used for and how to disable it. The author comes across as a bit of a conspiracy theorist, in my opinion. (A small sketch for inspecting the daemon yourself follows this thread.)
    Anyway, a much more in-depth read on this topic can be found here: eclecticlight.co/2023/01/18/is-apple-checking-images-we-view-in-the-finder/

    • @mapl3mage
      @mapl3mage Год назад +91

      Literally the only person who bothered to look up what the program actually does. To be fair, the program name is suspicious, and enough to ring alarm bells for anyone who believes the government is out to get them, which is literally the target audience of this channel.

    • @bcj842
      @bcj842 Год назад +7

      So if I turn that off I won’t be able to grab people out of a photo and make a clipart of them?

    • @poweron3654
      @poweron3654 Год назад +13

      Should be much higher up.

    • @Me-eb3wv
      @Me-eb3wv Год назад

      Interesting

    • @js32096
      @js32096 Год назад +74

      Sad that I had to scroll this far to see your comment. I recently subscribed to Mental Outlaw, and if he doesn't address this misinformation I'll stop trusting his channel. It takes minutes to debunk. Mental Outlaw should be doing this due diligence before disseminating to his audience.
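
For anyone who wants to look at the daemon themselves, here is a minimal macOS-only sketch (an illustration, not something from the video or the comment above): it finds any running mediaanalysisd process with the stock pgrep tool and lists the network sockets it currently holds via lsof. Inspecting a system process with lsof may require sudo.

```python
import subprocess

def pids_of(name):
    """Return PIDs of processes whose name matches exactly (pgrep -x)."""
    out = subprocess.run(["pgrep", "-x", name], capture_output=True, text=True)
    return [int(p) for p in out.stdout.split()]

pids = pids_of("mediaanalysisd")
if not pids:
    print("mediaanalysisd is not running")
for pid in pids:
    # lsof -a ANDs the filters: this PID *and* internet sockets only.
    sockets = subprocess.run(["lsof", "-a", "-p", str(pid), "-i"],
                             capture_output=True, text=True).stdout
    print(f"mediaanalysisd (pid {pid}):")
    print(sockets or "  no open network sockets at the moment")
```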

  • @NewWarrior21st
    @NewWarrior21st Год назад +176

    I actually cracked up when you were talking about Apple using the children pretext and then it cut to EDP 😆

  • @Sthrl188
    @Sthrl188 Год назад +44

    People, this is what we have been talking about. The future. We need to stop it while we still can. (Remember; you will own nothing, and you will be happy)
    I do not want to live in a world like this.

    • @totallynotsarcastic7392
      @totallynotsarcastic7392 Год назад +6

      there is no political way to stop it

    • @appalachiabrauchfrau
      @appalachiabrauchfrau Год назад +6

      I'm just going back to disposable cameras. Got a roll of film developed and it felt like opening a present.

    • @MOTH_moff
      @MOTH_moff Год назад +2

      Print all your photos and keep them in a book.
      I'm only half joking.

    • @Sthrl188
      @Sthrl188 Год назад

      @@MOTH_moff yeah thanks im gonna print out my messages and mail them mf

  • @stephaneduhamel7706
    @stephaneduhamel7706 Год назад +417

    How would they even train an AI to recognize these kinds of images? I can't imagine any ethical or legal way of getting training data.

    • @sprtwlf9314
      @sprtwlf9314 Год назад +253

      Ethical and legal aren't considerations for the global elite.

    • @sampletext9426
      @sampletext9426 Год назад +73

      @Happy Hippie Hose
      why is it ok for the government to have the largest collection of see pea, but not us?

    • @masterloquendo0
      @masterloquendo0 Год назад

      @@sampletext9426 nigga what?

    • @ali-1000
      @ali-1000 Год назад +21

      @@sampletext9426are you saying that you should have the right to store and view CSAM material 😟🤨

    • @sampletext9426
      @sampletext9426 Год назад +76

      @@ali-1000
      absolutely not, but we both can agree that our leaders can store and keep all they want

  • @beardalaxy
    @beardalaxy Год назад +116

    There was a story from, I think, a year or two ago where Google flagged a guy who had texted photos of a rash on his child's genitals to their doctor, and he got visited by the cops and everything. Had a really bad time with it. I can't imagine the stress of that happening with everyone thinking you actually had CSAM on you. That's messed up.

    • @janmaker227
      @janmaker227 Год назад +32

      This!!! Had a similar situation, and you know what the craziest thing is? As a parent, you go crazy thinking about how many people got to look at it who are not you or your doctor!

    • @beardalaxy
      @beardalaxy Год назад +3

      @@xCDF-pt8kj right, it's impossible for an AI to know intent unless you very clearly spell it out to them on a case-by-case basis, and sometimes not even then.

    • @tangentfox4677
      @tangentfox4677 Год назад +1

      The true highlight of that particular story is how even after proving innocence and getting everything sorted out, Google upheld banning him from their platforms for CP. Which means this guy can never work at any company that uses Google products, can't access any of his data that Google held, can't use Android, can't access his email, can't ever use any tool built on a Google login. He is effectively locked out of a significant portion of society simply because an automated system doesn't understand context matters.

    • @evil_radfem9162
      @evil_radfem9162 Год назад +1

      so, the scanner worked

    • @beardalaxy
      @beardalaxy Год назад +2

      @@evil_radfem9162 it did, but it lacks context and thus fucks over someone

  • @paracelsus407
    @paracelsus407 Год назад +311

    If Apple wanted to protect children, they'd build it into a Camera app, rather than search files on the device. All this does is make it easier to frame innocent people.

    • @brandonw1604
      @brandonw1604 Год назад +5

      They don’t search on device. This article is 100% wrong.

    • @xinfinity4756
      @xinfinity4756 Год назад +74

      @@brandonw1604 could you provide a reputable article demonstrating or at least explaining how/ why it doesn't?

    • @brandonw1604
      @brandonw1604 Год назад

      @@Geth270 but they’re not scanning anything so there’s that.

    • @Axman6
      @Axman6 Год назад +60

      Having it in the camera makes no sense, this was supposed to detect *known* CSAM material, which is derived from a database of image hashes of material collected by law enforcement agencies. It isn’t capable of detecting new material being generated, which is a massively more difficult problem and one significantly more prone to false positives.

    • @brandonw1604
      @brandonw1604 Год назад +1

      @@xinfinity4756 the other part is common sense. The scanning was done server side on iCloud. They’re not going to use a process calling home that you can block with an application firewall like Little Snitch.

  • @ZucchiniCzar
    @ZucchiniCzar Год назад +80

    It's always in the name of "security".

  • @turtleswithbombs
    @turtleswithbombs Год назад +428

    Brb, going to go break into people's houses to make sure they're not abusing children

    • @soda3185
      @soda3185 Год назад +28

      The hero we need but not the one we deserve /s

    • @users4007
      @users4007 Год назад +1

      @you will see it I suspect this to be rick roll

    • @whitelily2942
      @whitelily2942 Год назад +8

      @@users4007 no it’s spam

    • @idiotontheweb
      @idiotontheweb Год назад

      @TruthfulIy thx bro I needed it

    • @namesurname4666
      @namesurname4666 Год назад +1

      during your trip you want some cupcakes?

  • @PsRohrbaugh
    @PsRohrbaugh Год назад +54

    There are thousands of people who had charges dropped when they proved that a virus or hacker downloaded the illegal images, not them. There are thousands in jail right now who claim to be innocent but were unable to prove it. That should terrify you.

    • @PsRohrbaugh
      @PsRohrbaugh Год назад +14

      @@bacon222 No. I can't post links here, but search things like
      "State Worker's Child Porn Charges Dropped; Virus Blamed"
      "Child porn downloaded by mistake"
      "Computer porn hacker is making our lives a misery"
      There used to be a lot more examples easily available through Google, but I've spent 20 minutes searching and can't find any. Hmm... 🤔

    • @fatalityin1
      @fatalityin1 Год назад

      @@bacon222 It is a common "hack" first introduced by 4chan during their Scientology raids: they did drive-bys with laptops, scanned for open WLANs, and downloaded illegal material onto the Scientology routers' storage.
      The software is now a lot more sophisticated; some viruses even deliberately download illegal images from a government honeypot into a hidden folder in your system directory, making them impossible to find while you can expect an "FBI, open up" within a week.
      The shocking thing is: the more versatile you are with PCs, the more likely you are to get sentenced, because in theory you should have been able to prevent it.
      People working in IT in my country already fight this kind of legislative move; just because you work in IT doesn't mean you are omnipotent, and even the accusation is enough to make you unemployable forever

    • @Dave102693
      @Dave102693 Год назад +1

      Idk why people don't think this happens more often than not.

    • @fayenotfaye
      @fayenotfaye Год назад

      “There are thousands in jail right now who claim to be innocent but were unable to prove it.” Source: it came to them in a dream

    • @Memelord18
      @Memelord18 Год назад

      source?

  • @0xsupersane920
    @0xsupersane920 Год назад +22

    The worst part is that they advertise "Privacy. That's iPhone." and then do shxt like this. They have the money to buy the widest reach possible for that message, at major sporting events, awards shows, and all over the internet, which makes this problem even worse.

  • @matthewsjardine
    @matthewsjardine Год назад +18

    Apple wasn't happy with the old adage: "The cloud is just someone else's computer...". They decided to take it one step further, 'your computer is just someone else's computer'.

  • @Reaya
    @Reaya Год назад +33

    The people who store this type of data aren't stupid enough to put it in the cloud, and I'm sure Apple is aware of this. The only reason this was even considered is that they want to collect even more information from their users under the guise of protecting the children.

    • @YouAreStillNotablaze
      @YouAreStillNotablaze Год назад +3

      They actually really are that stupid. The thing is, so many do this out in the open that it's astronomical, and LEAs can't even keep up.

  • @ProteinFeen
    @ProteinFeen Год назад +67

    Just found your account and man you are an awesome creator. I’m surprised I haven’t seen your videos before. I know nothing about half the stuff you talk about but you got a like and sun from me!❤

    • @ProteinFeen
      @ProteinFeen Год назад +3

      Sub*

    • @MentalOutlaw
      @MentalOutlaw  Год назад +19

      Thanks, glad you enjoy the videos

    • @mattgamei5vods649
      @mattgamei5vods649 Год назад +14

      @@MentalOutlaw when do you launch your onlyfans?

    • @ProteinFeen
      @ProteinFeen Год назад +5

      @@MentalOutlaw great stuff watching this on my iPhone 14 kms😂😂

    • @ProteinFeen
      @ProteinFeen Год назад

      @TruthfulIy someone has to clean the toilets everyone can’t just play in the computer

  • @veirant5004
    @veirant5004 Год назад +26

    The joke is that you can be imprisoned for a decade just for clicking on a "download" button somewhere on the Internet. I don't even give a cr*p about the content of what is being downloaded. Jail for saving a picture from the 'net. It's the f*cking end. And yeah, hey to the freedomest state of America ❤️.

  • @JamesWilson01
    @JamesWilson01 Год назад +309

    The sad thing is that most Apple users are so brainwashed that they either don't care or think this kind of thing is a great idea 😬

    • @nitebreak
      @nitebreak Год назад +10

      @@forbidden-cyrillic-handle I have an apple phone and I like it well enough because of some of the features but this makes me nervous. I have music on my iTunes that was not all purchased and I wonder if they are gonna start dmca on private files…

    • @localvoid69420
      @localvoid69420 Год назад +4

      @@nitebreak same, hope they don't dmca me cuz I don't wanna get in trouble just because I was listening to music that's not available on itunes

    • @ArdivKmen
      @ArdivKmen Год назад

      @@forbidden-cyrillic-handle Every single phone manufacturer copies Apple in one way or another, what makes you think it will only be Apple that does stuff like this? No matter how much we complain all phones will get this eventually. You see what kind of wiretap people install in their homes, all they have to say "Think of the children" and suddenly disabling that feature makes you a suspected child molester.

    • @Skullet
      @Skullet Год назад +6

      No, the sad thing here is making gross generalisations about people based on the products they buy.

    • @baileyharrison1030
      @baileyharrison1030 Год назад +4

      @@nitebreak How would they know you didn’t just rip the music from a CD you purchased?

  • @MunyuShizumi
    @MunyuShizumi Год назад +25

    A moment of silence for all the conversations where our disapproval will immediately be rebuked with "So you're defending child predators?" followed by incoherent screeching noises.

  • @mario7501
    @mario7501 Год назад +26

    There's a reason the government always calls laws that infringe on privacy something like "kid's online protection act". It's a very sinister game

    • @raj18x
      @raj18x 6 месяцев назад

      Agreed

  • @SimGunther
    @SimGunther Год назад +52

    More reason to use the BSDs, Linux, Haiku, and TempleOS

    • @nigeltheoutlaw
      @nigeltheoutlaw Год назад +20

      RIP Terry

    • @frog7362
      @frog7362 Год назад +8

      RIP brother terry

    • @hereticanthem5652
      @hereticanthem5652 Год назад

      BDSM as well

    • @steelfox1448
      @steelfox1448 Год назад +3

      RIP Terry

    • @RandyHanley
      @RandyHanley Год назад

      So true! These crooked companies are no different than crooked politicians, trying to sell it as something for the good of the people.

  • @leznis
    @leznis Год назад +27

    I love Apple: pay three times more for a laptop that's overpriced af, and have your privacy violated, because "think of the children". I remember those classic Simpsons episodes where the Reverend's wife screams "Won't somebody, please, think of the children!"

  • @GSFigure
    @GSFigure Год назад +111

    Madness is about to flourish.

    • @notafbihoneypot8487
      @notafbihoneypot8487 Год назад

      Don't worry. Give me your Phone number and I'll send You secure link to protect your DATA.
      SPONSORED BY NORDVPN

    • @MaxDankOG
      @MaxDankOG Год назад +4

      @Vetrussuper based

    • @totallynotsarcastic7392
      @totallynotsarcastic7392 Год назад +2

      "about to"?

    • @big_red_machine3547
      @big_red_machine3547 Год назад

      And AI is about to magnify what’s already so wrong about this sick society

  • @h.e.pennypacker4728
    @h.e.pennypacker4728 Год назад +2

    This could be the best youtube channel for any topic, definitely for anything tech related

  • @Tathanic
    @Tathanic Год назад +43

    Say you don't like Apple; "your iCloud" suddenly finds 100TB of CP while scanning

    • @sampletext9426
      @sampletext9426 Год назад

      it's scary that sea pe has become a weapon that can incriminate people.
      why aren't the citizens worried?

    • @25thDaveWalker
      @25thDaveWalker Год назад +24

      I love it that there's a reply under this comment but I can't view it, because youtube hid it from me. What a time to be alive

    • @audigamer8261
      @audigamer8261 Год назад +3

      @@25thDaveWalker same

    • @warlockpaladin2261
      @warlockpaladin2261 Год назад +9

      I too have noticed that the comment count is frequently off on YT.

    • @fayenotfaye
      @fayenotfaye Год назад

      @@25thDaveWalker It's spam. There's no proof as of yet that YouTube shadow-bans people from comments; they sometimes do it from the recommended feed, but not from comments.

  • @xE92vD
    @xE92vD Год назад +220

    I wouldn't be surprised if iPhone users still continued using their spyware filled devices after witnessing this.

    • @ygx6
      @ygx6 Год назад +29

      @Vetrus cuz they isleep when people state facts

    • @fanban2926
      @fanban2926 Год назад +10

      Same with Samsungs tho

    • @digi3218
      @digi3218 Год назад +10

      witnessing what?
      Sent on iPhone

    • @ygx6
      @ygx6 Год назад +16

      @@fanban2926 duh, google devices but apple just exposed themselves for scanning local images and forwarding them to their servers, google hasn't _yet_

    • @staidey5994
      @staidey5994 Год назад +12

      @@fanban2926 It's a completely different thing when the company doesn't actively advertise privacy features. Afaik, neither Google nor Samsung nor any other Android manufacturer has ever advertised their phones as private; Apple, however, kept claiming their phones were private until they were recently exposed as not being as private as claimed this entire time.

  • @Slugbunny
    @Slugbunny Год назад +21

    Can practically see the privacy being dusted out of that ecosystem.

  • @captainsmirk9218
    @captainsmirk9218 Год назад +46

    A LOT of people put their intellectual property in iCloud: ideas, business plans, unpublished book manuscripts, trading strategies, engineering designs, personal and bank information, etc. It's only a matter of time before this is stolen by a bad actor given this level of access "for the children" (whether a hacker or a contractor paid to look through files). I've been an iPhone supporter since the iPhone 2G; I am now getting rid of ALL my Apple products. The access isn't just images and videos.

    • @Haise-san
      @Haise-san Год назад

      Me too, fuck that company

    • @bcj842
      @bcj842 Год назад +7

      Don’t throw your stuff out unless you plan on buying some next-level privacy shizz to replace it. Throwing out your Apple device over a headline just to go out and buy something from Xiaomi or Google is just trading one prison cell for another.

    • @MrTonyBarzini
      @MrTonyBarzini Год назад

      @@bcj842 is pinephone viable?

    • @bcj842
      @bcj842 Год назад +2

      @@MrTonyBarzini Sadly, this is about where my expertise ends… I know there’s more privacy-oriented devices out there, but what I don’t know is which ones are solid and which ones to avoid. I don’t have the technical background to make that call.

    • @raphaelcardoso7927
      @raphaelcardoso7927 Год назад +1

      @Kougami where in their user agreement?

  • @bradhaines3142
    @bradhaines3142 Год назад +15

    This is going to be a great way for Apple to destroy people's lives for nothing. Even worse, it could become a boy-who-cried-wolf problem, with all the false positives teaching people to ignore a legitimate positive.

  • @MillywiggZ
    @MillywiggZ Год назад +32

    Adobe is doing the same with everyone’s Photoshop, Illustrator, etc. files.
    But that’s to train their A.I. art program.

    • @Spaghetti742
      @Spaghetti742 Год назад +8

      That's actually a decent idea, if it were consent-based. I mean, think about all of the people who use Photoshop/Illustrator. But there needs to be a way to opt out.

    • @Memelord18
      @Memelord18 Год назад +6

      @Spaghetti8696 I think it should be opt in for copyright reasons

    • @users4007
      @users4007 Год назад +4

      damn, if I ever use photoshop I’m def gonna pirate an old version then

    • @Spaghetti742
      @Spaghetti742 Год назад +1

      @@Memelord18 True, didn't think about that

    • @nothing_
      @nothing_ Год назад +2

      Can't they just scrape the internet like everyone else?

  • @c-LAW
    @c-LAW Год назад +53

    Just the attorney fees for defending against an accusation run to thousands of dollars, if not tens of thousands.

    • @kevinmiller5467
      @kevinmiller5467 Год назад +1

      Don't worry investor, it doesn't cost Apple a dime.

  • @smolbirb4
    @smolbirb4 Год назад +52

    I love your content, especially the more Linux/privacy-focused videos, and even more so when you cover Gentoo stuff. Hardly anyone talks about Gentoo.

  • @bedazzledmisery6969
    @bedazzledmisery6969 Год назад +6

    My prediction is this is really gonna bring back old-school film photography and developing in a darkroom, to keep images out of any kind of cloud.

  • @hmr1122
    @hmr1122 Год назад +43

    I wouldn't be surprised if windows already has some type of file scanning in the works.

    • @anglepsycho
      @anglepsycho Год назад +2

      @B Y I forgot they do that-

    • @ra2enjoyer708
      @ra2enjoyer708 Год назад +5

      They did play with the idea of showing you ads right in the file manager, so it's not so far off.
      Especially how windows normalizes having superuser privileges at all times, therefore any program can send and receive arbitrary data from the internet.

    • @DsiakMondala
      @DsiakMondala Год назад

      @FLN 764 Top kek. There is no such thing as deleting something you uploaded to the internet, new friend. It is there forever.

    • @DsiakMondala
      @DsiakMondala Год назад

      @FLN 764 You... think you get to choose what goes on your windows installation? A-are you t-that new? OwO'

    • @SimonReeves2
      @SimonReeves2 Год назад

      @fln764 They might be a little condescending but they are correct in a sense, although difficult OneDrive can be deleted with 3rd party software *on your end only.* Anything that's been sent to the Cloud though, no. Only Microsoft can truly get rid of that, and deleting OneDrive on your system won't stop Microsoft, the Advertisers and Government Agencies they sell access to or Hackers from seeing what's been sent there.
      The best you could do is deleting OneDrive immediately after installing Windows for the first time, and then removing other forms of telemetry/data collection as well. They have also been known to reinstall deleted telemetry/data collection if you update your OS as well. Hell, they were bold enough to include telemetry/data collection in a 2019 end of service update for Windows 7, just for those who thought they were safe from Microsoft's BS on an older OS.

  • @LarsLarsen77
    @LarsLarsen77 Год назад +10

    One of MANY reasons why I've never owned an apple product.

  • @buddylee6203
    @buddylee6203 Год назад +16

    I'm kinda tired of xmrig and lominer being reported as viruses

  • @IAmAlpharius14
    @IAmAlpharius14 Год назад +28

    Whenever I use a cloud platform I treat it as if all my files were publicly visible, and just encrypt everything I upload to it using gpg (see the sketch after this thread)

    • @rejvaik00
      @rejvaik00 Год назад

      Can you teach me this?

    • @alexandervowles3518
      @alexandervowles3518 Год назад +7

      @@rejvaik00 you can just use 7zip to lock your files if you're lazy

    • @jamhamtime1878
      @jamhamtime1878 Год назад +1

      Yeah, but are there better methods that integrate more easily into any OS?
      I'm currently just using gpg: simple on Linux, easy to install and use on Windows, but very annoying to use on my phone when I do need it.
      Another comment said 7zip; maybe that would be more convenient? I never knew 7zip could even lock archives with passwords, I'll definitely try it later. Are there any more convenient methods for Linux+Windows+Android?
      And honestly, gpg on Android is not THAT inconvenient.
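
As an illustration of the encrypt-before-upload workflow, here is a minimal Python sketch using the cryptography package's Fernet recipe (authenticated symmetric encryption). The filenames are hypothetical and the key handling is deliberately simplified; in practice you would derive the key from a passphrase or keep it in a proper keystore, and gpg or a password-protected 7zip archive achieve the same end.

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and keep it OFF the cloud (that is the whole point).
key = Fernet.generate_key()
Path("vault.key").write_bytes(key)

cipher = Fernet(key)

# Encrypt before uploading: only taxes.pdf.enc ever leaves the machine.
plaintext = Path("taxes.pdf").read_bytes()           # hypothetical file
Path("taxes.pdf.enc").write_bytes(cipher.encrypt(plaintext))

# Later, after downloading the blob back from the cloud:
restored = cipher.decrypt(Path("taxes.pdf.enc").read_bytes())
assert restored == plaintext
```

The design point is the same as with gpg: the cloud only ever sees ciphertext, so whatever scanning runs there has nothing to look at.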

  • @GuideZer0
    @GuideZer0 Год назад +2

    I wish I could tell companies, "I don't care about the specificity of your advertising. The violation of privacy and commodification of personal information for the purpose of targeted advertising is creepy to me and I don't want you to try to sell me things on that basis!!"

  • @LokiScarletWasHere
    @LokiScarletWasHere Год назад +8

    Client-side scanning is exactly what they advertised they'd do, before claiming they're rolling it back.
    What's changed is they're doing it even with iCloud turned off.

  • @geeshta
    @geeshta Год назад +52

    So if Google automatically makes collections out of my photos like "Animals", "Food", "Nature" etc. does it mean it scans my photos as well?

    • @PvtAnonymous
      @PvtAnonymous Год назад +36

      is this a rhetorical question?

    • @tmacman0418
      @tmacman0418 Год назад +34

      Yes, all your photos are fed into its AI, including the faces of everyone you took a picture with, so they can track you and serve ads more effectively.

    • @totallynotsarcastic7392
      @totallynotsarcastic7392 Год назад +6

      do you really need to ask?

    • @DarthChrisB
      @DarthChrisB Год назад +15

      No, the computer just happens to know what's in the photo without looking at it...

    • @filiphabek271
      @filiphabek271 Год назад

      For me it doesn't happen. Which google product do you use?

  • @gorofujita5767
    @gorofujita5767 Год назад +7

    This has less to do with scanning for copyrighted material and more to do with spying on (or, when really needed, falsely incriminating) political or ideological opponents flagged by the NSA.
    Also, in extreme situations, think of what they could do to inconvenient counter-establishment speakers. If they can scan your files, they can also make something "bad" mysteriously appear on your phone during police custody. There's a world of possibilities here, and I wish more people thought about the implications seriously.

  • @jaytrip420
    @jaytrip420 Год назад +3

    Every day I see more and more reasons why my switch to android based devices was more than necessary

  • @ozymandias_yt
    @ozymandias_yt Год назад +4

    Two things first: 1. The following criticism isn’t supposed to defend Apple. It’s just about getting a broader perspective. 2. The approach of analysing personal data is alarming, which makes the main point of this video absolutely valid.
    BUT I think there are a few things wrong here.
    1. "mediaanalysisd" is not new. It has been around for some time and is used for many different things, one of them being Spotlight. Everything typed into Spotlight is sent to Apple servers for analysis, because there is a web search feature in it as well. Even without iCloud, I can imagine the intelligent file search is still turned on, which lets the user search for things while macOS finds local pictures with the corresponding objects in them. I see no evidence that Jeffrey Paul's pictures are sent to Apple or that they are scanned for CSAM content.
    2. Apple uses hashes for comparison with the NCMEC database. This means there is no AI analysing photos the way some people seem to imagine it. Theoretically it is even supposed to "check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures" (The Verge). That last claim is questionable, because the algorithm apparently cannot guarantee perfect avoidance of false positives. Still, this concept can't be compared with full image recognition software, even though it is scary enough.
    3. Another thing so many people got wrong. Apple never said they will stop working on their CSAM scanning software. They just stated that they will push the release further into the future because further development is needed. This means we definitely still need to keep an eye on this.
    4. There are at least a few ways of transferring photos from iOS devices to the Mac without the photos app. The Finder, which is the file manager, is indeed one of them.

    • @nousquest
      @nousquest Год назад +1

      The fact that there are closed-source media analysis tools running in the background which have to contact Apple's servers to work, given Apple's track record, makes the suspicion valid.

  • @DoublesC
    @DoublesC Год назад +10

    9:15 Apple is a honeypot for people who want privacy but don't understand technology

  • @Basieeee
    @Basieeee Год назад +73

    I bet they will just scan for known checksums of csam files, but it could obviously be extended to anything they want to track.

    • @Bloom_HD
      @Bloom_HD Год назад +38

      Yes. Give them the benefit of the doubt. Always. Apple is your friend.
      /s

    • @homuraakemi9556
      @homuraakemi9556 Год назад +18

      Scanning the checksum wouldn't work since any alteration to the photo would change the checksum and that wouldn't catch any new CSAM either

    • @pluto8404
      @pluto8404 Год назад +27

      I hear Tim Cook liked "partying" on a famous island in the US Virgin Islands. I am sure he truly cares about this cause.

    • @ygx6
      @ygx6 Год назад +2

      They check for fuzzy hashes, not 1:1 checksums

    • @ygx6
      @ygx6 Год назад +3

      @@homuraakemi9556 They check fuzzy hashes, not the exact checksum, so cropping and editing won't do much; but yes, new CSAM images would slide by the check (see the sketch after this thread).
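
A minimal sketch of what "fuzzy" matching means here (an illustration with made-up hash values; real systems like PhotoDNA or NeuralHash are far more involved): instead of requiring an exact checksum, 64-bit perceptual hashes are compared by Hamming distance, so near-duplicates of a known image still match.

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes (Python 3.10+)."""
    return (a ^ b).bit_count()

# Made-up stand-ins for database entries of known-bad perceptual hashes.
KNOWN_HASHES = {0x8F3A00C41D5E77B2, 0x13377331DEADBEEF}

def matches(candidate: int, max_distance: int = 5) -> bool:
    """A candidate matches if it is within a few bits of any known hash."""
    return any(hamming(candidate, h) <= max_distance for h in KNOWN_HASHES)

print(matches(0x8F3A00C41D5E77B0))  # True: only 1 bit away from a known hash
print(matches(0x0123456789ABCDEF))  # False: far from everything in the set
```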

  • @washboardman7435
    @washboardman7435 Год назад +6

    Apple is committed to keeping you safe from everyone but themselves and whoever pays for the access.

  • @c-LAW
    @c-LAW Год назад +6

    7:46 "Litttle Snitch" is a great little utilities. I've used it on Mac for many years.

  • @kxuydhj
    @kxuydhj Год назад +2

    when this controversy first showed up i thought "good, if there's anything i hate more than children it's child predators", but the road to hell is paved with good intentions and this really is a great example. me getting pissed off at targeted ads might also have had something to do with my switch in attitude, but whatevs.

  • @Icee47
    @Icee47 Год назад +1

    Bro I found your channel yesterday, and I’m binging all your vids man. So good! I’ll start becoming more private as well…

  • @alexander1989x
    @alexander1989x Год назад +4

    It's a very thin line between "scanning for crime" and "scanning for criticism, dissent, journalism and negativity".

    • @warlockpaladin2261
      @warlockpaladin2261 Год назад

      Apple's relationship with China adds an interesting wrinkle to this fabric.

  • @perrywood3839
    @perrywood3839 Год назад +7

    Would love a video on how to set up something like Little Snitch/Glasswire/open-source alternatives, for those of us still using OSX/Windows, to try and cull stuff like this

    • @Marty_YouTuber
      @Marty_YouTuber Год назад

      Little Snitch is a firewall application that monitors and controls outbound internet traffic. (Paid • Proprietary • Firewall • Mac)

  • @daverei1211
    @daverei1211 Год назад +9

    You've got to wonder about the potential misuse of this, where bad actors use adware or drive-by malware to drop "suspicious" images and then extort you: they show you that the file is there and will be caught by this scanning, and offer to "protect" you for 1 BTC...

  • @benni1015
    @benni1015 Год назад +2

    Since barely anybody seems to know: in the EU, the EU Commission is trying to implement something similar to CSAM scanning, but instead of files in cloud storage it targets our private communication. With client-side scanning they want an AI to read through your messages, scan your photos, etc. before they are encrypted and sent to the recipient.

  • @kotzpenner
    @kotzpenner Год назад +11

    Corporations and Governments try not to spy on their customers/citizens (IMPOSSIBLE, GONE SEXUAL)

  • @floppa9415
    @floppa9415 Год назад +5

    That explains why battery life is going down the shitter.

  • @11alekon
    @11alekon Год назад +15

    I wonder what would happen to weapon artists for games; we have tons and tons of images/videos/documents of all sorts of weapons, even of ourselves holding a gun to understand how to use it. Apple is going to go mental over it in a strict gun country like the UK

    • @ra2enjoyer708
      @ra2enjoyer708 Год назад +2

      Just get a friendly visit by a glowie every now and then. And also don't forget to pay an attorney each time.

    • @mimikyu_
      @mimikyu_ Год назад +1

      as an artist i didnt even think about this. ive been learning to draw weapons recently and i know my phone already scans the images and guesses what object is in them. like it made a folder for my cats full of all the cat photos in my phone without me doing anything. so it can easily see me having weapon images and think im some sort of criminal

  • @kek207
    @kek207 Год назад +5

    Saying you don't have anything to hide
    is like saying you don't have anything to say. ~Edward Snowden

  • @Nightcaat
    @Nightcaat Год назад +4

    Normally I find your videos informative and entertaining (sure, they have filler, but they're nice to listen to), so I'm disappointed in you for not checking this further. This article is fear-mongering, and the comments are eating it up thanks to their confirmation bias against Apple. mediaanalysisd is related to Spotlight and has been a part of macOS since long before Apple was interested in CSAM detection, used for things like face, text, and object recognition.

  • @Lysande1337
    @Lysande1337 Год назад +2

    This is why I'm so hesitant about "updating" my devices.
    Imagine a world where people had more sense about tech and actively chose not to allow malicious practices:
    - A search engine that would give you the results that you're looking for
    - A video hosting platform with:
    1. Functioning ratings and a way to display them
    2. Comment filtering - you can choose to only see comments that are actually relevant to the content of the video - like sources, analysis and description.
    3. A functioning search function (again) with filters like:
    - intervals for age of video
    - popularity
    - rating
    - Viewed/not already viewed
    - newest/oldest
    - And a way to block channels and keywords
    4. Favoring content that is real and true and not fake and gay clickbait.
    5. I can go on and on but won't.
    - Actual ownership over both software and hardware when purchased. - They cannot change it without your consent and cannot make it worse after the fact.
    - HUGE support for open source, and thus alternatives and options. Imagine software that does EXACTLY what you want and nothing else. I think this would effectively kill "big tech". You can't get away with selling your hardware below its cost, subsidized by ads/contracts, if there is a better alternative.
    I guess we can dream.

  • @Mustachioed_Mollusk
    @Mustachioed_Mollusk Год назад +6

    This HAS to be a major breach in privacy. What information is going to be stolen using the, “Think of the children” excuse?

  • @BastianInukChristensen
    @BastianInukChristensen Год назад +4

    Both Google and MS already do CSAM scanning, but for the time being I only know of it on their cloud services.

  • @apIthletIcc
    @apIthletIcc Год назад +21

    Apple: "trust us we're not spying at all when we scan local files without permission"
    FBI: "trust us you need to use adblock"
    T-Mobile: "trust us not ALL your personal info was stolen 3x in the last two years"

  • @Merrifieldsam
    @Merrifieldsam 5 месяцев назад +1

    I kind of just assumed Apple was ALWAYS doing this. I work at a school; we use primarily Apple and ChromeOS, and Apple OSes do A LOT of stupid and creepy stuff like this. My assumption was that the higher-ups were monitoring this stuff through third-party programs and management systems, but it doesn't surprise me at all that Apple is also doing this natively, considering their apparent contract with the US government and how their biggest customers would actually WANT this sort of thing. I'm sure it's in high demand!

  • @Spumoon
    @Spumoon Год назад +2

    Cleaned out my temp files on Windows the other day and my bing search before and after yielded different results. Maybe I'm just a hecking n00b, but that came as quite a surprise to me.

  • @o-hogameplay185
    @o-hogameplay185 1 year ago +5

    Louis Rossmann made a video about a man who was asked by his son's doctor to send pictures of his son's injuries so he could prescribe the correct medicine faster. Google scanned the photos and alerted the police because they mistook them for CSAM. The worst part is that, if I remember correctly, before the police were contacted, a human went through those photos, CONFIRMING they were CSAM...
    so not only will AI scan your photos (or files), but some Karens will too

  • @moki5796
    @moki5796 1 year ago +3

    You should post an update to the Jeffrey Paul story. This "issue" has been resolved now; it turns out it was a bug where the service would send empty requests whenever a media file was previewed. No information about the files was ever transmitted without consent.

  • @AnonymouslyBlack
    @AnonymouslyBlack 1 year ago +5

    Thank you for sharing this info. What a time to be alive, I swear. I'm not as well informed as you all are when it comes to secure systems, and will conduct my own research after posting this, but my question is "What is considered a secure mobile phone system these days?". Microsoft is out, Google is out, and now Apple. What are we left with? Or is it now up to the user to jailbreak their own system for some privacy?

    • @synexiasaturnds727yearsago7
      @synexiasaturnds727yearsago7 1 year ago +1

      Google Pixel, then get rid of the OS and replace it with something else.

    • @ra2enjoyer708
      @ra2enjoyer708 1 year ago +3

      There is no such thing as a secure mobile system, since the entire idea of a smartphone is a data-mining device constantly hooked to the internet.

    • @Zakkious
      @Zakkious 1 year ago +2

      Smartphones are inherently insecure, unfortunately.

  • @RegenerationOfficial
    @RegenerationOfficial 1 year ago +2

    While discussing founding a company that installs private home servers and rents out the leftover space, we ran into the problem of distinguishing legitimate childhood photos from abusive material. Like flirting, there is a fine line, and it can always be denied that it ever was what it seemed to be.

  • @transposedmatrix
    @transposedmatrix 1 year ago +2

    Regarding the false positives, Apple has stated that 1) you don't get flagged for a single positive, 2) the probability of falsely flagging an account is roughly 1 in 10^12, and 3) all flagged images are reviewed manually. So the scenario you described in the beginning is very unlikely, if not just straight-up incorrect. This is all stated in the CSAM paper Apple released, by the way, which is probably one of the first sources you'd check… (a rough sketch of the threshold math follows after this thread)
    That being said, everything else mentioned in the video is obviously a cause for concern.

    • @transposedmatrix
      @transposedmatrix 1 year ago +1

      @Sir Pendelton Me too. I’m just saying that the first half of the video is basically completely wrong.

    • @Nightcaat
      @Nightcaat 1 year ago +2

      The other half’s wrong too. There’s no evidence that mediaanalysisd is actually checking for CSAM. The daemon has been present in macOS for years and is related to Spotlight for things like searching for text in photos

    • @transposedmatrix
      @transposedmatrix 1 year ago +1

      @@Nightcaat Yes, you’re right, I don’t think it’s conclusive evidence either. That being said, I don’t feel confident enough in my knowledge about that topic to claim it myself.
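
    For readers wondering how a 1-in-10^12 account-level figure can coexist with imperfect per-image matching, the key is the match threshold from point 1 above. Below is a rough back-of-the-envelope sketch in Python; the per-image false-match rate p, library size n, and threshold t are illustrative assumptions, not figures from Apple's paper.

    ```python
    # Back-of-the-envelope sketch (illustrative numbers, NOT Apple's actual
    # parameters): if each image independently false-matches with probability p,
    # the chance an innocent account accumulates at least t matches is a
    # binomial tail, which collapses rapidly as t grows.
    from math import comb

    def flag_probability(p: float, n: int, t: int) -> float:
        """P(at least t false matches among n images) = 1 - P(fewer than t)."""
        return 1 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t))

    p = 1e-6    # assumed per-image false-match rate
    n = 10_000  # assumed photos in the library

    print(flag_probability(p, n, t=1))   # ~0.01: one stray false match is plausible
    print(flag_probability(p, n, t=30))  # ~0 (below double precision): vanishingly rare
    ```

    Manual review of anything that does cross the threshold (point 3 above) then sits on top of that math.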

  • @alifahran8033
    @alifahran8033 1 year ago +8

    Every single day this cursed world convinces me more and more that Uncle Ted is right.
    I am one step away from throwing my smartphone into the trash can and buying a "dumb" phone. The only thing stopping me as of today is the need for 2FA for my job.

    • @matthew8153
      @matthew8153 1 year ago +2

      “Dumb” phones today aren’t actually dumb. They run Android.

    • @alifahran8033
      @alifahran8033 1 year ago

      No, I meant the likes of a Nokia from the late 2000s - early 2010s. The most basic phone you can imagine. My favourite phone of all time was my Nokia 6300: metal frame body, thin and cool design for its time, an SD slot, and the ability to run games (Real Soccer 2010 and a Spider-Man game whose exact name I don't remember were my favourites). It had zero spookiness and all the functionality.

    • @matthew8153
      @matthew8153 1 year ago

      @@alifahran8033
      Sadly, none of those old phones work anymore. They took down the 3G and older networks.

    • @alifahran8033
      @alifahran8033 1 year ago +2

      The perks of not living in the USA.
      In our country (Bulgaria) the network providers still support 3G, but I am not 100% sure about 2G.

    • @JPX64Channel
      @JPX64Channel 1 year ago

      @@matthew8153 There are a lot of 3G phones from the late 00s to early 10s, and they still work.

  • @SuperTort0ise
    @SuperTort0ise 1 year ago +7

    9:24 "What happens on your iPhone, stays on your iPhone." Until your iPhone tells us what happened on it.

  • @JohnMushitu
    @JohnMushitu 1 year ago +4

    They just recently uploaded a video on privacy. That's actually funny 😂

  • @Mantikal
    @Mantikal 1 year ago +1

    Man, these guys are really helping to make UNIX & Linux distros more popular

  • @aldrickfondracul9297
    @aldrickfondracul9297 1 year ago +1

    It's times like this that I'm glad I've stuck with dumbphones for so long, even though I am pretty much fighting progress at this point. I seriously dread the day I finally get a smartphone.

  • @aguywithaytusername
    @aguywithaytusername 1 year ago +126

    Wait a minute. If it's an AI, how did they train it?

    • @upcomingweeb136
      @upcomingweeb136 1 year ago +32

      Good question

    • @Dratchev241
      @Dratchev241 1 year ago +84

      by feeding it tons of images that would get us tossed in a nice hotel with grey bars and doors.

    • @Ryfinius
      @Ryfinius 1 year ago +4

      Watch the SNL skit where Dwayne Johnson trained his evil robot.

    • @RogueA.I.
      @RogueA.I. 1 year ago +12

      You know how…

    • @IaMaPh1991
      @IaMaPh1991 1 year ago

      The glowies have literal terabytes of material in their possession, which they very likely fap to when they aren't arresting and prosecuting innocent citizens over "evidence" they likely planted in the first place.
      Of course they are willing to share it with a powerful corporation. It's certainly not illegal when THEY possess and distribute it amongst one another...

  • @andre-le-bone-aparte
    @andre-le-bone-aparte 1 year ago +4

    Question: Could you do a monthly news update on these kinds of topics? This was helpful for those of us who use Unix / Linux but have to interact with MacOS / WinOS for work.

  • @goofylookincat5028
    @goofylookincat5028 1 year ago +12

    I can't remember where I heard it, but even if full image scanning isn't involved, Apple will at least compute hashes of your files and check whether they match their database. This is still bad, because once this technology gets to governments, they can scan for ANY kind of file, such as a picture of Xi Jinping photoshopped onto Winnie the Pooh. (A minimal sketch of this kind of hash matching follows after this thread.)

    • @ra2enjoyer708
      @ra2enjoyer708 1 year ago +1

      > image scanning isn’t actually involved
      > scan the hashes of the files
      What do you think is the process of getting a hash of the file, lol?

    • @Fixer_Su3ana
      @Fixer_Su3ana 1 year ago

      What do you mean photoshopped? He already looks that way.
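
    On the exchange above: "scanning hashes" still means reading every file, and the match logic itself is trivial. Here is a minimal sketch of naive hash-list matching with a hypothetical hash database and scan folder; real deployments (e.g. Apple's NeuralHash) use perceptual hashes so that re-encoded or resized copies still match, whereas a plain SHA-256 comparison like this one is defeated by changing a single byte.

    ```python
    # Minimal sketch of hash-list matching (hypothetical database and paths,
    # not Apple's actual pipeline): hash every file and look it up in a set
    # of known-bad hashes.
    import hashlib
    from pathlib import Path

    KNOWN_HASHES = {  # hypothetical database of flagged file hashes (hex)
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):  # 64 KiB chunks
                h.update(chunk)
        return h.hexdigest()

    def scan(folder: Path) -> list[Path]:
        """Return every file whose hash appears in the database."""
        return [p for p in folder.rglob("*")
                if p.is_file() and sha256_of(p) in KNOWN_HASHES]

    print(scan(Path.home() / "Pictures"))  # hypothetical scan target
    ```

    This is also why the scope-creep worry is technically trivial to realize: repurposing such a scanner for any other content is just a matter of adding hashes to the database.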

  • @BlazeEst
    @BlazeEst 1 year ago +12

    You’re def on a watchlist for advocating privacy

  • @mcall9800
    @mcall9800 1 year ago +2

    Apparently this was a bug in previous versions of macOS, and the request was actually empty. Louis Rossmann made a video about the update that I would recommend watching.

  • @JodyBruchon
    @JodyBruchon 1 year ago +1

    It is not the place of large businesses to "protect children." That's the job of parents, teachers, chaperones, and supervising adults in person. *Companies "protecting children" is never benevolent.*

  • @tp6335
    @tp6335 1 year ago +9

    What interests me the most is that if Apple is training a closed-source AI in-house to look for CSAM, they are bound to have a training set, either in their possession or provided to them. Is the existence of such a training set not highly immoral? What if there is a bad actor somewhere involved and a leak occurs?

    • @nsfeliz7825
      @nsfeliz7825 1 year ago +2

      You're right, someone somewhere is being paid to possess child pron for the purpose of training AI. The mere possession of CP is indeed illegal.

    • @pyromcr
      @pyromcr 1 year ago

      Some engineer at Apple is just looking for an excuse to jack off all day and came up with this project.

  • @Krynos18
    @Krynos18 1 year ago +3

    Hash-based detection works for known files. The real problem comes in with hash collisions. If your graduation photo happens to generate the same hash as a previously identified CSAM image, even though the photos are completely different, you are out of luck until a real human bothers to look at the file.
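
    A toy illustration of why that worry is real: perceptual hashes are designed so that visually similar images produce identical or nearby hashes, so two genuinely different photos landing within the match distance is a built-in possibility, not a freak accident. The sketch below uses a simple "average hash" with hypothetical file names; Apple's NeuralHash is a neural-network hash rather than aHash, so treat this as an analogy only.

    ```python
    # Toy average hash (aHash) to illustrate perceptual-hash collisions.
    # Hypothetical file names; requires Pillow (pip install Pillow).
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Downscale to size x size grayscale; bit i = 1 if pixel i is above the mean."""
        pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (px > mean)
        return bits

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Two different images whose downscaled structure is similar (or an
    # adversarially crafted pair) can land within the match distance.
    h1 = average_hash("graduation_photo.jpg")
    h2 = average_hash("unrelated_image.jpg")
    print(hamming(h1, h2) <= 5)  # "match" under a typical distance threshold
    ```

    Researchers demonstrated this against NeuralHash itself in 2021, crafting visually unrelated image pairs with identical hashes, which is why the human-review step ends up carrying so much weight.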

  • @yevoidstar
    @yevoidstar 1 year ago +3

    This video either needs to be cleared up or outright deleted, since the network connections made by mediaanalysisd were never proven or shown to actually report the results of scanning images for "CSAM". And it's coming out now that this was an outright bug... Nice video though.

  • @sirflimflam
    @sirflimflam 1 year ago +2

    mediaanalysisd is related to Visual Look Up. I have my own qualms with the inability to disable VSL, but it's not inherently related to the whole CSAM thing. It's also been around forever in one form or another.

  • @glitchy_weasel
    @glitchy_weasel 1 year ago +2

    Oh wow, it's as if, when we give them an inch, they take a whole mile. Don't stop raising awareness, MO! Let's help by sharing this video.