Apple Has Begun Scanning Users' Files EVEN WITH iCloud TURNED OFF
- Published: 25 Nov 2024
- In this video I discuss how several news outlets have reported that Apple will no longer pursue scanning people's iCloud accounts for CSAM, and a blog post that appears to show that Apple is indeed scanning local filesystems on macOS without users' consent (iCloud and analytics turned off)
sneak.berlin/2...
₿💰💵💲Help Support the Channel by Donating Crypto💲💵💰₿
Monero
45F2bNHVcRzXVBsvZ5giyvKGAgm6LFhMsjUUVPTEtdgJJ5SNyxzSNUmFSBR5qCCWLpjiUjYMkmZoX9b3cChNjvxR7kvh436
Bitcoin
3MMKHXPQrGHEsmdHaAGD59FWhKFGeUsAxV
Ethereum
0xeA4DA3F9BAb091Eb86921CA6E41712438f4E5079
Litecoin
MBfrxLJMuw26hbVi2MjCVDFkkExz8rYvUF
Dash
Xh9PXPEy5RoLJgFDGYCDjrbXdjshMaYerz
Zcash
t1aWtU5SBpxuUWBSwDKy4gTkT2T1ZwtFvrr
Chainlink
0x0f7f21D267d2C9dbae17fd8c20012eFEA3678F14
Bitcoin Cash
qz2st00dtu9e79zrq5wshsgaxsjw299n7c69th8ryp
Ethereum Classic
0xeA641e59913960f578ad39A6B4d02051A5556BfC
USD Coin
0x0B045f743A693b225630862a3464B52fefE79FdB
Subscribe to my YouTube channel goo.gl/9U10Wz
and be sure to click that notification bell so you know when new videos are released.
Every time I hear "Think about the children", it makes me sick to my stomach. I'm sure a trillion-dollar company with ridiculous margins, which exploits people in third-world countries to make production costs even lower, truly cares deeply about children. The fact that they're trying to encroach on your privacy is bad enough; hiding behind children while doing so is disgusting.
"that's for using children"
- Trinity, Matrix Resurrections
Solid point there
Another thing that makes me sick to my stomach: "are you hiding something?" Yeah, put 24/7 cameras in your bedroom, you aren't hiding anything.
We’re going to need you to stop posting opposing opinions because what if some children read what you said? Be responsible.
I can totally see this being adapted (quietly) to scan and detect copyrighted material and snitch to the police whenever it is found.
Yep, as well as memes that promote "hate".
Anyone can have a different definition of freedom
that's definitely one of the primary goals, they just use the unobjectionable goal as the one to get their foot in the door.
"we're gonna make sure there's no CP on your computer, which we know you don't have anon so don't worry about it. And while we're in there we're gonna check for pirated anime, maybe see if that netflix login in your keyring was borrowed from anybody. But you're not some TOS breaking freak, so you have nothing to worry about right bro?"
I think it’s more likely that it straight up deletes those files from your hard drive and when you try to open it again you get some annoying as fucc notification like “we couldn’t find your files :(“
@@jer1776 These heckin trolls on 4chan spreading H8 sp33ch memes are currently the BIGGEST THREAT TO OUR DeMoCrAcY!!!! We HAVE to scan your personal files, it's to protect Democracy! and the children! and minorities! and immigrants!!!
Protecting children is always the excuse to impose authoritarian measures.
The same people that think a bad sequence of pixel values is the worst thing ever because they care about children so much, are the same people that support a woman’s right to kill their own child while in the womb.
We both thinking about gun rights, and statistically crazy rare events?
Or minorities
Or terrorism
@@HisCarlnessI a bit of regulation like a background check should always happen (it already does in most states anyway), but guns should definitely not be banned or even over-regulated, yes.
And yeah, "think of the children" is really such an old excuse, you would think people would've realized that farce by now but they still don't 🙄
one step closer to the day when having "offline storage" will be a suspicious, borderline criminal thing to have
Bruh 💀
Same as cash.
You can be jailed for failing to decrypt a hard-drive or file.
Time to switch to SD cards and USB drives with Veracrypt-encryption.
@@leandrogl2 well yeah no shit, if they have a warrant then they have the right to demand you show it to them. The difference is that they need a court ordered warrant to do that, it's not a private company looking at your files whenever they want.
Trusting Apple with your privacy is like having Epstein as the family babysitter
Or Lena Dunham as the homeschool teacher.
@@anglepsycho lmao
Or trusting me with running a government.
@@diablo.the.cheater that's honestly not bad enough to compare to trusting apple with privacy
What do you do to get around it?
Really miss the times when we had more hardware choices rather than between "don't be evil" trash bin and "think different" trash bin.
Monopoly is more than a kids game...
It's not really the hardware so much as the pre-packaged software. If someone took a recent Samsung, slapped a decent ROM on it and re-sold it you would have your good choice, but that person would also have a BIG FAT LAWSUIT in their lap the next day.
You forgot the where do you want to go today rubbish bin
And the first trashbin dropped the whole not being evil thing.
@@Arimodu this is why it should be as easy to build a phone as it is a PC. Seriously, fuck a lotta this hardware that's glued together.
"We know what is best for you. Think of the children". Atrocities have been committed for less.
"Think of the children" - a megacorporation that exploits child labor
Less? You can drop a lot of words: "We know what's best."
Yep, it's happened.
Soon we'll have "personalized ads" based on the types of files we have on our devices.
Google and MS already do this
You’re already getting ads based on every single thing that comes out of your mouth. 99% of your apps have baked in microphone access.
@@riftsquid7659 exactly
@@Dave102693 they aren't on my device, and it shows xD mostly I have no ads anywhere, and when one or two slip through they have nothing related to me xD it's pretty hard to get rid of those systems though (on a phone)
company: “wOnT sOmEbOdY pLeAsE tHiNk oF tHe cHiLdRen?!?”
Also company: * uses child labor in India to make $1,000 phones in horrific conditions *
It didn't happen in America so it's not real.
There is definitely no child labour making iPhones in India. Child labour exists in India, but in the unorganised sector and in the interior of the country, definitely not in iPhone production. That would be a PR nightmare. Why take the risk when there is a huge adult population who will work for very low wages?
It's china.
It’s impossible to not have human rights violations in any of the manufacturing process. Fair phone tried and failed.
China maybe. Child labor is a crime in India.
I called it the moment they “quietly” abandoned the CSAM BS last year.
We all knew it would be snuck in without our knowledge
Do you have any evidence that this has been snuck in? The process this YouTuber mentions, without going into detail while basing the entire video on it, has existed on macOS for over a decade. You can now select text in images, images are scanned for subjects, and Visual Look Up fetches related information over the web. Both you and the YouTuber earning money off this clickbait video could've done one web search and discovered everything you could possibly want to know about how the process works. Instead you're spreading fake news and FUD here.
Apple's original whitepaper described the scanning as host-side, using a form of homomorphic encryption to avoid revealing the perceptual hashes needed to craft preimage or collision attacks (think: attackers being able to craft false positive images). I don't support it in any way, but it should be no surprise, considering this is exactly how the system was originally designed to work.
Yeah, I think a lot of us had a pretty good idea of what they were really up to.
@@timewave02012 typical Apple being homomorphic, when will they learn to tolerate minorities.
I'm really grateful to guys like you and Upper Echelon calling attention to problems like this that the mainstream is in full, unhesitating support of. This is some truly frightening, dystopian crap, and it's eternally disappointing how unintelligent the average NPC is that they lack any and all pattern recognition to realize "protect the children" is never the goal.
Thanks for the shoutout to upper echelon bouta check him out❤
@@ProteinFeen That guy went from a gaming channel to a commentary channel. Worth checking out.
@@gamtax yeah I just subscribed to him, I really like the cyber security info channels especially this one. Keeps things fresh.
While I like UEG, he sometimes gets stuff wrong. Problem is, we don‘t track false predictions. Remember his discussion that Musk would never buy Twitter or that SBF would be Epsteined?
That's why we need to round up the NPCs and put them all in a prison camp.
>petite woman nudes false flagged by Apple Bot
>sent to Apple HQ
all part of the plan.
tfw 4ft11 womanlet with babyface realizing Apple has turned me into a weapon, gg.
@@appalachiabrauchfrau mfw 6'5 and own an Android. 🗿
Just deleted nsfw content of petite aged women because this video got me paranoid, in the future there are gonna be laws against being attracted to youthful petite women
@@arghpee damn, should've trolled big bro
whatever, you're putting the "big bro" in the first sentence anyways
Lmao, I'm concerned for East Asian middle-aged women now, better get that surgery if you don't want anything bad done to your chest online.
I think its safe to assume all tech companies are doing this with the full support of the governments.
Always has been *gunshot
@@Micchi- I see what you did there
Apple's relationship with China adds an interesting wrinkle to this fabric.
If you look at what the EU commission does right now i feel like many governments want it to go much much further.
Using children as human shields for surveillance technology is in line with using them in sweatshops to make the iPhones 👍
Maybe they should also detect photos of you repairing your Apple device, so they can send the secret anti-right-to-repair police
Are you under the impression that they aren't currently doing exactly that? On the remote chance they aren't yet, maybe you shouldn't give them any ideas...
at ome point apple gunma make transformers for rwal damn it
@@Hackanhacker Lol how old are you?
@@Network126i am 4
HOW THE FUCK DID APPLE TRAIN THE DETECTION MODEL
3 letter agencies have huge databases of cp. That's where it comes from whenever one of their whistleblowers wakes up to find 18TB of it uploaded onto his HDD and the local police "anonymously" tipped.
The elites' private collection. A lot of people were regulars on Epstein's island.
it is actually not that difficult. they just hash every image, and see if it matches the hashes of known csam
@@Bloom_HD the agencies r the real pedos
@@o-hogameplay185 take the image, slightly alter it by manipulating brightness by 0.1% or adding a single off-color pixel into the corner. And boom, new hash.
That's not what they do.
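The disagreement in this thread comes down to cryptographic vs. perceptual hashing. A minimal Python sketch (a toy average hash for illustration, not Apple's actual NeuralHash or Microsoft's PhotoDNA) shows why the "nudge one pixel" trick defeats an exact hash but not a perceptual one:

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if that pixel is
    brighter than the image's mean brightness."""
    mean = sum(sum(row) for row in pixels) / (len(pixels) * len(pixels[0]))
    return "".join("1" if p > mean else "0" for row in pixels for p in row)

# A tiny fake 4x4 grayscale "image" (values 0-255)
image = [[200, 200, 10, 10],
         [200, 200, 10, 10],
         [10, 10, 200, 200],
         [10, 10, 200, 200]]

# Nudge one pixel's brightness slightly (the trick described above)
altered = [row[:] for row in image]
altered[0][0] += 1

# Cryptographic hash: any change flips the digest completely
sha_a = hashlib.sha256(bytes(sum(image, []))).hexdigest()
sha_b = hashlib.sha256(bytes(sum(altered, []))).hexdigest()
print(sha_a == sha_b)                                 # False

# Toy perceptual hash: tiny change, same fingerprint
print(average_hash(image) == average_hash(altered))   # True
```

This is why scanners compare perceptual hashes rather than exact digests: the fingerprint is designed to survive re-encoding, resizing, and small brightness tweaks.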
One more reason I'm really glad I didn't buy an Apple device this Christmas. IMO if you want to protect children, start putting in real punishments for child abusers and leave good normal people alone.
It's like getting rid of encryption, it would only hurt everyone. Even if it's for a good cause. I don't trust that it wouldn't be abused on a massive level
That would be the logical thing to do. But as we have seen in the gun debate or war on drugs, it is about controlling the normal people, they dont care about the criminals.
Its always the safety excuse, always when it comes to taking away your privacy because ‘the less control you have the better’. Yeah I feel even better giving my data to the biggest data whores on this fucking planet
come on guys, who really needs privacy? Let them just look through you. All will be fine. Also, isn't it just annoying being constantly concerned about stuff like that? Enjoy life and stop thinking :-)
Children are the excuse because they know that appeals to the feelings of people
One thing about this that no one seems to have considered: whatever this software is, it will only ever be able to make suggestions about what a photo "might" be. Before any police or authorities are involved, a real human will need to look at the suggestions the AI has given. If you have private photos of yourself on your computer, that might be enough to trigger a suggestion, and next thing you know some stranger is looking at it because an AI said it "might" be something.
The technology they were proposing to use was specifically not AI based, but Mental Outlaw just made assumptions because he’s too lazy to do any research at all. There’s not a single piece of evidence that anything here has anything to do with CSAM - go read the original blog post and see if you can find anything. IIRC, they were even going to require several hits against known CSAM before any action at all would be taken.
@@Axman6 do corporate boots actually taste like apples?
@@Channel-gz9hm I'm not a fan of any of this, I'm glad Apple reversed the decision, but just making up facts without evidence and getting mad is pathetic and delusional. This is Qanon-level leaps of logic. Paid shill, I wish; I'm just interested in verifiable facts instead of sensationalism with zero evidence.
It probably will be both AI and human error in the way since, currently, AI has to mess up and blend in with brush tools to make a form of a match like DALLE.
@@Axman6 it's a Qanon-level leap of logic to think tech companies that datamine customers and collude with governments might do both at the same time, wew
TLDR: mediaanalysisd is a process that has been around for years; its purpose is to send neural hashes to apple and get information on what those hashes mean. Such as if there's a cat in a photo, a painting, etc. It can be disabled by turning off Siri suggestions by going to System Settings > Siri & Spotlight. Note disabling mediaanalysisd will turn off visual look up.
Mediaanalysisd is a part of Visual Look Up (VLU), a system Apple uses to recognize objects in photos (buildings, animals, paintings and more) and give users information about the photo.
VLU works by computing a neural hash locally that is used to distinguish photos/videos by the objects within them. During VLU, Apple takes the neural hash, computes what that hash represents, and sends the result back to the user. I'm assuming the database used to compute the hash's meaning is too massive to be used locally and wouldn't be able to stay updated with new information.
I really didn't like the article this video is based on because it makes claims without any real evidence or research. mediaanalysisd is a process that has been around for years and it's very easy to look up what it's used for and how to disable it. The author is close to some conspiracy theorist in my opinion.
Anyway a much more in depth read on this topic can be found here: eclecticlight.co/2023/01/18/is-apple-checking-images-we-view-in-the-finder/
literally the only person who bothered to look up and search what the program actually does. no one else bothered. to be fair, the program name is suspicious and enough to ring alarm bells for anyone who believes the government is out to get them, which is literally the target audience of this channel.
So if I turn that off I won’t be able to grab people out of a photo and make a clipart of them?
Should be much higher up.
Interesting
Sad that I had to scroll this far to see your comment. I recently subscribed to Mental Outlaw and if he doesn't address this misinformation, I'll stop trusting his channel. It takes minutes to debunk. Mental Outlaw should be doing this due diligence before disseminating it to his audience.
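For anyone who wants to check the TLDR comment above on their own machine, here is a small sketch (the daemon name is taken from the comments; on non-macOS systems `pgrep` finds nothing and it simply reports "not running"):

```python
import shutil
import subprocess

def mediaanalysisd_running() -> bool:
    """Return True if a process named exactly 'mediaanalysisd' is running.

    Uses pgrep -x for an exact name match; falls back to False when
    pgrep itself isn't available (e.g. minimal containers).
    """
    if shutil.which("pgrep") is None:
        return False
    result = subprocess.run(["pgrep", "-x", "mediaanalysisd"],
                            capture_output=True)
    return result.returncode == 0  # pgrep exits 0 when a match is found

print("mediaanalysisd is",
      "running" if mediaanalysisd_running() else "not running")
```

Per the comment above, disabling Siri Suggestions under System Settings > Siri & Spotlight should stop the daemon, at the cost of losing Visual Look Up.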
I actually cracked up when you were talking about Apple using the children pretext and then it cut to EDP 😆
Facts
People, this is what we have been talking about. The future. We need to stop it while we still can. (Remember; you will own nothing, and you will be happy)
I do not want to live in a world like this.
there is no political way to stop it
I'm just going back to disposable cameras. Got a roll of film developed and it felt like opening a present.
Print all your photos and keep them in a book.
I'm only half joking.
@@MOTH_moff yeah thanks im gonna print out my messages and mail them mf
How would they even train an AI to recognize this kind of image? I can't imagine any ethical or legal way of getting training data.
Ethical and legal aren't considerations for the global elite.
@Happy Hippie Hose
why is it ok for the government to have the largest collection of see pea, but not us?
@@sampletext9426 nigga what?
@@sampletext9426are you saying that you should have the right to store and view CSAM material 😟🤨
@@ali-1000
absolutely not, but we both can agree that our leaders can store and keep all they want
There was a story from I think a year or 2 ago where Google flagged a guy who had sent a photo of a rash on his child's genitals to their doctor, and he got visited by the cops and everything. Had a really bad time with it. I can't imagine the stress from that happening with everyone thinking you actually had CSAM on you. That's messed up.
This!!! Had a similar situation, and you know what's the craziest thing? As a parent you go crazy thinking about how many people get to look at it who are not you or your doctor!
@@xCDF-pt8kj right, it's impossible for an AI to know intent unless you very clearly spell it out to them on a case-by-case basis, and sometimes not even then.
The true highlight of that particular story is how even after proving innocence and getting everything sorted out, Google upheld banning him from their platforms for CP. Which means this guy can never work at any company that uses Google products, can't access any of his data that Google held, can't use Android, can't access his email, can't ever use any tool built on a Google login. He is effectively locked out of a significant portion of society simply because an automated system doesn't understand context matters.
so, the scanner worked
@@evil_radfem9162 it did, but it lacks context and thus fucks over someone
If Apple wanted to protect children, they'd build it into a Camera app, rather than search files on the device. All this does is make it easier to frame innocent people.
They don’t search on device. This article is 100% wrong.
@@brandonw1604 could you provide a reputable article demonstrating or at least explaining how/ why it doesn't?
@@Geth270 but they’re not scanning anything so there’s that.
Having it in the camera makes no sense, this was supposed to detect *known* CSAM material, which is derived from a database of image hashes of material collected by law enforcement agencies. It isn’t capable of detecting new material being generated, which is a massively more difficult problem and one significantly more prone to false positives.
@@xinfinity4756 the other part is common sense. The scanning was done server side on iCloud. They’re not going to use a process calling home that you can block with an application firewall like Little Snitch.
It's always in the name of "security".
Or "officer safety "
Just not *your* security
Brb, going to go break into people's houses to make sure they're not abusing children
The hero we need but not the one we deserve /s
@you will see it I suspect this to be rick roll
@@users4007 no it’s spam
@TruthfulIy thx bro I needed it
during your trip you want some cupcakes?
There are thousands of people who had charges dropped when they proved that a virus or hacker downloaded the illegal images, not them. There are thousands in jail right now who claim to be innocent but were unable to prove it. That should terrify you.
@@bacon222 No. I can't post links here, but search things like
"State Worker's Child Porn Charges Dropped; Virus Blamed"
"Child porn downloaded by mistake"
"Computer porn hacker is making our lives a misery"
There used to be a lot more examples easily available through Google, but I've spent 20 minutes searching and can't find any. Hmm... 🤔
@@bacon222 It is a common "hack" first introduced by 4ch during their Scientology raids: they would drive by with laptops, scan the WLAN, and then download illegal material onto the Scientology routers' storage.
Now that software is a lot more sophisticated; some viruses even deliberately download illegal images from a govt honeypot into a hidden folder in your sys directory, making them impossible to find, while you can expect an "FBI, open up" within a week.
The shocking thing is: the more versatile you are with PCs, the more likely you are to get sentenced, because in theory you should have been able to prevent it.
People workin in IT in my country already fight this kind of legislature move, just because you work in IT doesn't mean you are omnipotent and even the accusation is enough to make you unemployed forever
Idk why people don't think that this happens more often than not.
“There are thousands in jail right now who claim to be innocent but were unable to prove it” Source: it came to them in a dream
source?
The worst part is that they advertise: "Privacy, that's iPhone" and then do shxt like this. They have the money to advertise to get the most reach possible, like for major sporting events, awards shows and all over the internet, which makes this problem even worse.
Apple wasn't happy with the old adage: "The cloud is just someone else's computer...". They decided to take it one step further, 'your computer is just someone else's computer'.
The people who store this type of data aren't stupid enough to put it in the cloud, and I'm sure Apple is aware of this. The only reason this was even considered is that they want to collect even more information from their users under the guise of protecting the children.
They actually are that stupid; the thing is, so many do this out in the open that it's astronomical, and LEAs can't even keep up.
Just found your account and man you are an awesome creator. I’m surprised I haven’t seen your videos before. I know nothing about half the stuff you talk about but you got a like and sun from me!❤
Sub*
Thanks, glad you enjoy the videos
@@MentalOutlaw when do you launch your onlyfans?
@@MentalOutlaw great stuff watching this on my iPhone 14 kms😂😂
@TruthfulIy someone has to clean the toilets, everyone can't just play on the computer
The joke is that you can be imprisoned for a decade just for clicking on a "download" button somewhere on the Internet. I don't even give a cr*p about the content of what is being downloaded. Jail for saving a picture from the 'net. It's the f*cking end. And yeah, hey to the freedomest state of America ❤️.
The sad thing is that most Apple users are so brainwashed that they either don't care or think this kind of thing is a great idea 😬
@@forbidden-cyrillic-handle I have an apple phone and I like it well enough because of some of the features but this makes me nervous. I have music on my iTunes that was not all purchased and I wonder if they are gonna start dmca on private files…
@@nitebreak same, hope they don't dmca me cuz I don't wanna get in trouble just because I was listening to music that's not available on itunes
@@forbidden-cyrillic-handle Every single phone manufacturer copies Apple in one way or another, what makes you think it will only be Apple that does stuff like this? No matter how much we complain all phones will get this eventually. You see what kind of wiretap people install in their homes, all they have to say "Think of the children" and suddenly disabling that feature makes you a suspected child molester.
No, the sad thing here is making gross generalisations about people based on the products they buy.
@@nitebreak How would they know you didn’t just rip the music from a CD you purchased?
A moment of silence for all the conversations where our disapproval will immediately be rebuked with "So you're defending child predators?" followed by incoherent screeching noises.
reeeeEEEE
There's a reason the government always calls laws that infringe on privacy something like "kid's online protection act". It's a very sinister game
Agreed
More reason to use the BSDs, Linux, Haiku, and TempleOS
RIP Terry
RIP brother terry
BDSM as well
RIP Terry
So true! These crooked companies are no different than crooked politicians, trying to sell it as something for the good of the people.
I love Apple: pay 3 times more for a laptop that's overpriced af, and have your privacy violated, because "Think of the children". I remember those classic Simpsons episodes where the Reverend's wife screams "Won't somebody, please, think of the children!"
Madness is about to flourish.
Don't worry. Give me your Phone number and I'll send You secure link to protect your DATA.
SPONSORED BY NORDVPN
@Vetrussuper based
"about to"?
And AI is about to magnify what’s already so wrong about this sick society
This could be the best youtube channel for any topic, definitely for anything tech related
say you don't like apple, "your icloud" suddenly finds 100TB of CP they found while scanning
its scary that sea pe has become a weapon that can incriminate people.
why aren't the citizens worried?
I love it that there's a reply under this comment but I can't view it, because youtube hid it from me. What a time to be alive
@@25thDaveWalker same
I too have noticed that the comment count is frequently off on YT.
@@25thDaveWalker it's spam. There's no proof as of yet that YouTube shadow bans people from comments; they sometimes do it in the recommended feed but not in comments.
I wouldn't be surprised if iPhone users still continued using their spyware filled devices after witnessing this.
@Vetrus cuz they isleep when people state facts
Same with Samsungs tho
witnessing what?
Sent on iPhone
@@fanban2926 duh, google devices but apple just exposed themselves for scanning local images and forwarding them to their servers, google hasn't _yet_
@@fanban2926 it's a completely different thing when the company doesn't actively advertise their privacy features. Afaik, neither Google nor Samsung nor any other Android device manufacturer has ever advertised their phones as being private; Apple, however, had been touting the privacy features of their phones until they recently got exposed for their phones not being as private as they've claimed this entire time.
Can practically see the privacy being dusted out of that ecosystem.
A LOT of people put their intellectual property in iCloud: ideas, business plans, unpublished book scripts, trade strategies, engineering designs, personal and bank information, etc. It's only a matter of time before this is stolen by a bad actor given this level of access "for the children" (whether a hacker or a contractor paid to look through files). I've been an iPhone supporter since the iPhone 2G; I am now getting rid of ALL my Apple products. The access isn't just images and videos.
Me too, fuck that company
Don’t throw your stuff out unless you plan on buying some next-level privacy shizz to replace it. Throwing out your Apple device over a headline just to go out and buy something from Xiaomi or Google is just trading one prison cell for another.
@@bcj842 is pinephone viable?
@@MrTonyBarzini Sadly, this is about where my expertise ends… I know there’s more privacy-oriented devices out there, but what I don’t know is which ones are solid and which ones to avoid. I don’t have the technical background to make that call.
@Kougami where in their user agreement?
this is going to be a great way for apple to destroy people's lives for nothing. and even worse, this could end up a boy-who-cried-wolf issue, with all the false positives making people ignore it when there's a legit positive.
Adobe is doing the same with everyone’s Photoshop, Illustrator, etc. files.
But that’s to train their A.I. art program.
That's actually a decent idea, if it were consent-based. I mean, think about all of the people who use Photoshop/Illustrator. But there needs to be a way to opt out
@Spaghetti8696 I think it should be opt in for copyright reasons
damn, if I ever use photoshop I’m def gonna pirate an old version then
@@Memelord18 True, didn't think about that
Can't they just scrape the internet like everyone else?
Just the attorney fees defending against an accusation is thousands of dollars if not tens of thousands.
Don't worry investor, it doesn't cost Apple a dime.
I love your content, especially the more Linux/privacy-focused videos, and even more so when you cover Gentoo stuff; hardly anyone talks about Gentoo
They can't talk until they're done compiling.
Gentoo takes so long to compile.
@@Wampa842 😂😂😂
@@Wampa842 true
My prediction is this is gonna really bring back old school film photography and developing in a dark room to prevent images uploaded to any types of clouds.
I wouldn't be surprised if windows already has some type of file scanning in the works.
@B Y I forgot they do that-
They did play with the idea of showing you ads right in the file manager, so it's not so far off.
Especially how windows normalizes having superuser privileges at all times, therefore any program can send and receive arbitrary data from the internet.
@FLN 764 Trop kek. There is no such a thing as deleting something you uploaded to the internet new friend. It is there forever.
@FLN 764 You... think you get to choose what goes on your windows installation? A-are you t-that new? OwO'
@fln764 They might be a little condescending but they are correct in a sense, although difficult OneDrive can be deleted with 3rd party software *on your end only.* Anything that's been sent to the Cloud though, no. Only Microsoft can truly get rid of that, and deleting OneDrive on your system won't stop Microsoft, the Advertisers and Government Agencies they sell access to or Hackers from seeing what's been sent there.
The best you could do is deleting OneDrive immediately after installing Windows for the first time, and then removing other forms of telemetry/data collection as well. They have also been known to reinstall deleted telemetry/data collection if you update your OS as well. Hell, they were bold enough to include telemetry/data collection in a 2019 end of service update for Windows 7, just for those who thought they were safe from Microsoft's BS on an older OS.
One of MANY reasons why I've never owned an apple product.
Im kinda tired of xmrig and lominer being reported as a virus
whenever i use any cloud platform i treat it as if all my files are publicly visible and just encrypt everything i upload onto it using gpg
Can you teach me this?
@@rejvaik00 you can just use 7zip to lock your files if you're lazy
Yeah, but are there better methods that are more easily integrated into any OS?
I'm currently just using gpg: simple in linux, easy to install/use in windows, but very annoying to use on my phone when I do need it.
Another comment said 7zip, maybe that would be more convenient? I never knew 7zip could even lock files with passwords, I'll definitely try it later. Are there any more convenient methods for linux+windows+android?
And honestly, gpg on android is not THAT inconvenient.
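The encrypt-before-upload workflow this thread describes can be sketched by driving GnuPG's CLI from a script. A sketch assuming `gpg` 2.1+ is installed; the inline passphrase is for illustration only, real use should go through the gpg agent rather than the command line:

```python
import pathlib
import shutil
import subprocess
import tempfile

GPG = shutil.which("gpg")  # skip the demo cleanly if GnuPG isn't installed

def gpg_run(args):
    """Run gpg in batch mode with a loopback pinentry (GnuPG 2.1+)."""
    subprocess.run([GPG, "--batch", "--yes", "--pinentry-mode", "loopback",
                    *args], check=True)

def roundtrip_ok() -> bool:
    """Encrypt a file symmetrically, decrypt it, confirm the bytes survive."""
    if GPG is None:
        return True  # nothing to demonstrate without gpg
    with tempfile.TemporaryDirectory() as d:
        plain = pathlib.Path(d, "doc.txt")
        plain.write_text("secret notes")
        enc = pathlib.Path(d, "doc.txt.gpg")  # the only file you'd upload
        gpg_run(["--passphrase", "hunter2", "--symmetric",
                 "--cipher-algo", "AES256", "-o", str(enc), str(plain)])
        out = pathlib.Path(d, "doc.out")
        gpg_run(["--passphrase", "hunter2", "--decrypt",
                 "-o", str(out), str(enc)])
        return out.read_text() == "secret notes"

print(roundtrip_ok())
```

Since only the `.gpg` blob ever reaches the provider, any server-side scanning sees ciphertext; the trade-off, as the thread notes, is convenience on mobile.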
I wish I could tell companies, "I don't care about the specificity of your advertising. The violation of privacy and commodification of personal information for the purpose of targeted advertising is creepy to me and I don't want you to try to sell me things on that basis!!"
Client-side scanning is exactly what they advertised they'd do, before claiming they're rolling it back.
What's changed is they're doing it even with iCloud turned off.
So if Google automatically makes collections out of my photos like "Animals", "Food", "Nature" etc. does it mean it scans my photos as well?
is this a rhetorical question?
Yes, all your photos are fed into its AI, including the faces of everyone you took a picture with, so they can track you and send ads to you more effectively.
do you really need to ask?
No, the computer just happens to know what's in the photo without looking at it...
For me it doesn't happen. Which Google product do you use?
This has less to do with scanning for copyrighted material, and more to do with spying (or even falsely incriminating, when really needed) political or ideological opponents marked by the NSA.
Also, in extreme situations, think of what they could possibly do to undesirable counter-establishment speakers. If they can scan your files, they can also make something "bad" mysteriously appear on your phone during police custody. There's a world of possibilities to this, and I wish more people thought about the implications seriously.
Every day I see more and more reasons why my switch to Android-based devices was more than necessary
Two things first: 1. The following criticism isn’t supposed to defend Apple. It’s just about getting a broader perspective. 2. The approach of analysing personal data is alarming, which makes the main point of this video absolutely valid.
BUT I think there are a few things wrong here.
1. "Mediaanalysisd" is not new. It has been around for some time and is used for many different things, one of them being Spotlight. Everything typed into Spotlight is sent to Apple servers for analysis, because there is a web search feature in it as well. Even without iCloud, I can imagine the intelligent file search is still turned on, which lets the user search for stuff and macOS find local pictures with the corresponding thing in them. I see no evidence that Jeffrey Paul's pictures are sent to Apple or that they are scanned for CSAM content.
2. Apple uses hashes for comparison against the NCMEC database. This means there is no AI analysing photos the way some people seem to imagine. Theoretically it is even supposed to "check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures" (The Verge). That last claim is questionable, because the algorithm apparently cannot guarantee perfect avoidance of false positives. However, this entire concept can't be compared with full image recognition software, even though it is still scary.
3. Another thing so many people got wrong: Apple never said they would stop working on their CSAM scanning software. They just stated that they would push the release further into the future because more development is needed. This means we definitely still need to keep an eye on this.
4. There are at least a few ways of transferring photos from iOS devices to the Mac without the photos app. The Finder, which is the file manager, is indeed one of them.
The fact that there are closed-source media analysis tools running in the background that have to contact Apple's servers to work, given Apple's track record, makes the suspicions valid.
9:15 Apple is a honeypot for people who want privacy but don't understand technology
I bet they will just scan for known checksums of CSAM files, but this could obviously be extended to anything they want to track.
Yes. Give them the benefit of the doubt. Always. Apple is your friend.
/s
Scanning the checksum wouldn't work, since any alteration to the photo would change the checksum, and it wouldn't catch any new CSAM either
I hear Tim Cook liked "partying" on a famous island in the US Virgin Islands. I am sure he truly cares about this cause.
They check for fuzzy hashes, not 1:1 checksums
@@homuraakemi9556 they check for fuzzy hashes, not the exact checksum, so cropping and editing won't do much, but yes, new CSAM images would slide by the check
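To illustrate the difference these replies are pointing at, here is a toy comparison in Python: a cryptographic checksum changes completely on any edit, while a simplified "average hash" (a stand-in for real perceptual hashes, not Apple's NeuralHash) barely moves:

```python
import hashlib

def ahash(pixels):
    # Toy "average hash": one bit per pixel, set if that pixel is
    # brighter than the image's mean. Small edits flip few bits.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    # Number of differing bits between two equal-length hash strings.
    return sum(x != y for x, y in zip(a, b))

img    = [[10, 200], [30, 220]]   # 2x2 grayscale "photo"
edited = [[12, 200], [30, 220]]   # tiny brightness tweak

# Cryptographic checksum: any change gives a totally different digest.
c1 = hashlib.sha256(bytes(sum(img, []))).hexdigest()
c2 = hashlib.sha256(bytes(sum(edited, []))).hexdigest()
print(c1 == c2)                             # False
print(hamming(ahash(img), ahash(edited)))   # 0: perceptual hash unchanged
```

Real perceptual hashes work on downscaled images with many more bits, but the trade-off is the same: robustness to edits in exchange for a nonzero chance of unrelated images matching.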
Apple is committed to keeping you safe from everyone but themselves and whoever pays for access.
7:46 "Little Snitch" is a great little utility. I've used it on Mac for many years.
Explain?
@@warlockpaladin2261 It's an outbound app firewall
when this controversy first showed up i thought "good, if there's anything i hate more than children it's child predators", but the road to hell is paved with good intentions and this really is a great example. me getting pissed off at targeted ads might also have had something to do with my switch in attitude, but whatevs.
Bro I found your channel yesterday, and I’m binging all your vids man. So good! I’ll start becoming more private as well…
It's a very thin line between "scanning for crime" and "scanning for criticism, dissent, journalism and negativity".
Apple's relationship with China adds an interesting wrinkle to this fabric.
Would love a video on how to setup something like Little Snitch/Glasswire/open source alternatives for those of us still using OSX/Windows to try and cull stuff like this
Little Snitch is a firewall application that monitors and controls outbound internet traffic. Paid, proprietary, Mac only.
You've got to wonder about the potential for misuse here, where bad actors use adware or drive-by malware to drop "suspicious" images, then point to this scanning as proof the file exists and extort you into paying them 1 BTC for "protection"...
Since barely anybody seems to know: in the EU, the EU Commission is trying to implement something similar to CSAM scanning, but instead of files in cloud storage, it targets our private communication. With client-side scanning, they want an AI to read through your messages, scan your photos etc. before they are encrypted and sent to the recipient.
Corporations and Governments try not to spy on their customers/citizens (IMPOSSIBLE, GONE SEXUAL)
That explains why battery life is going down the shitter.
I wonder what would happen to weapon artists for games, because we have tons and tons of images/videos/documents of all sorts of weapons, even of ourselves holding a gun to understand how to use it. Apple is going to go mental over it in a strict gun-law country like the UK
Just get a friendly visit by a glowie every now and then. And also don't forget to pay an attorney each time.
As an artist I didn't even think about this. I've been learning to draw weapons recently, and I know my phone already scans the images and guesses what object is in them. Like, it made a folder for my cats full of all the cat photos on my phone without me doing anything. So it can easily see me having weapon images and think I'm some sort of criminal
"Saying you don't have anything to hide is like telling people you don't have anything to say." ~Edward Snowden
Normally I find your videos informative and entertaining (sure they have filler, but they're nice to listen to), so I'm disappointed in you for not checking this further. This article is fear-mongering, and the comments are eating it up thanks to their confirmation bias against Apple. mediaanalysisd is related to Spotlight and has been a part of macOS since long before Apple was interested in CSAM detection, used for things like face, text, and object recognition.
This is why I'm so hesitant about "updating" my devices.
Imagine a world where people had more sense about tech and actively chose not to allow malicious practices:
- A search engine that would give you the results that you're looking for
- A video hosting platform with:
1. Functioning ratings and a way to display them
2. Comment filtering - you can choose to only see comments that are actually relevant to the content of the video - like sources, analysis and description.
3. A functioning search function (again) with filters like:
- intervals for age of video
- popularity
- rating
- Viewed/not already viewed
- newest/oldest
- And a way to block channels and keywords
4. Favoring content that is real and true and not fake and gay clickbait.
5. I can go on and on but won't.
- Actual ownership over both software and hardware when purchased. They cannot change it without your consent and cannot make it worse after the fact.
- HUGE support for open source and thus alternatives and options - imagine software that does EXACTLY what you want and nothing else. I think this would effectively kill "big tech". You can't get away with selling your hardware at a lower price than its cost, subsidized by ads/contracts, if there is a better alternative.
I guess we can dream.
This HAS to be a major breach in privacy. What information is going to be stolen using the, “Think of the children” excuse?
Both Google and MS already do CSAM scanning, but for the time being I only know of it on their cloud services.
Apple: "trust us we're not spying at all when we scan local files without permission"
FBI: "trust us you need to use adblock"
T-Mobile: "trust us not ALL your personal info was stolen 3x in the last two years"
I kind of just assumed Apple was ALWAYS doing this. I work at a school; we use primarily Apple and ChromeOS. Apple OSes do A LOT of stupid and creepy stuff like this. My assumption was that the higher-ups were monitoring this stuff through third-party programs and management systems, but it doesn't surprise me at all that Apple's also doing this natively, considering their apparent contract with the US government and how their biggest customers would actually WANT this sort of thing. I'm sure it's in high demand!
Cleaned out my temp files on Windows the other day and my bing search before and after yielded different results. Maybe I'm just a hecking n00b, but that came as quite a surprise to me.
Louis Rossmann made a video about a man who was asked by his son's doctor to send pictures of his son's injuries so he could get the correct medicine faster. Google scanned the photos and alerted the police because they mistook them for CSAM. The worst part of this is that, if I'm correct, before talking to the police, a human went through those photos, CONFIRMING they were CSAM...
so not only will AI scan your photos (or files), but some Karens too
You should post an update to the Jeffrey Paul story. This "issue" has been resolved now; it turns out it was a bug where the service would send empty requests whenever a media file was previewed. No information about the files was ever transmitted without consent.
Thank you for sharing this info. What a time to be alive, I swear. I'm not as well informed as you all are when it comes to secure systems, and will conduct my own research after posting this, but my question is "What is considered a secure mobile phone system these days?". Microsoft is out, Google is out, and now Apple. What are we left with? Or is it now up to the user to jailbreak their own system for some privacy?
Google Pixel, then get rid of the OS and replace it with something else.
There is no such thing as a secure mobile system, since the entire idea of a smartphone is a datamining device constantly hooked to the internet.
Smartphones are inherently insecure, unfortunately.
While discussing founding a company that installs private home servers and rents out leftover space, we ran into the issue of distinguishing between legitimate childhood photos and abusive material. Like flirting, there is a fine line, and it's easy to deny it ever was what it seemed to be.
Regarding the false positives, Apple has stated that 1) you don't get flagged for a single positive, 2) the probability of false flagging is roughly 1 in 10^12, and 3) all flagged images are reviewed manually. So the scenario you described in the beginning is very unlikely, if not just straight up incorrect. This is all stated in the CSAM paper Apple has released, by the way, and it's probably one of the first sources you'd check...
That being said, everything else mentioned in the video is obviously a cause for concern.
@Sir Pendelton Me too. I’m just saying that the first half of the video is basically completely wrong.
The other half’s wrong too. There’s no evidence that mediaanalysisd is actually checking for CSAM. The daemon has been present in macOS for years and is related to Spotlight for things like searching for text in photos
@@Nightcaat Yes, you’re right, I don’t think it’s conclusive evidence either. That being said, I don’t feel confident enough in my knowledge about that topic to claim it myself.
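The threshold design discussed in this thread can be sanity-checked with a little binomial arithmetic. The numbers below are purely illustrative assumptions (the real per-image false-positive rate is not public); the point is just that requiring many independent matches drives the account-level false-flag probability down dramatically:

```python
from math import comb

def p_flagged(n_images: int, per_image_fp: float, threshold: int) -> float:
    # P(at least `threshold` false matches among n_images), assuming an
    # independent per-image false-positive probability (a simplification).
    below = sum(
        comb(n_images, k) * per_image_fp**k * (1 - per_image_fp) ** (n_images - k)
        for k in range(threshold)
    )
    return 1 - below

# 100k photos, hypothetical 1-in-a-million per-image rate:
print(p_flagged(100_000, 1e-6, 1))    # ~0.095: flagging on a single match would be noisy
print(p_flagged(100_000, 1e-6, 30))   # 30-match threshold: effectively zero
```

This is why a single-match trigger would be a disaster at Apple's scale, while a high threshold plus manual review can plausibly hit a 1-in-10^12 target, assuming the per-image rate really is that low and the matches really are independent.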
Every single day this cursed world convinces me more and more that Uncle Ted is right.
I am one step away from throwing my smartphone into the trash can and buying a "dumb" phone. The only thing stopping me as of today is the need for 2FA for my job.
"Dumb" phones today aren't actually dumb. They use Android.
No, I meant the likes of a Nokia from the late 2000s to early 2010s. The most basic phone that you can imagine. My favourite phone of all time was my Nokia 6300. Metal frame body, thin and cool design for its time, SD slot, ability to run games (Real Soccer 2010 and a Spider-Man game (I don't remember the exact name) were my favourites). It had zero spookiness and all the functionality.
@@alifahran8033 Sadly, none of those old phones work anymore. They took down the 3G and older networks.
The perks of not living in the USA.
In our country (Bulgaria) the network providers still support 3G, but I am not 100% sure about 2G.
@@matthew8153 there are a lot of 3G phones from the late 00s to early 10s, and they still work.
9:24 "What happens on your iPhone, stays on your iPhone." Until your iPhone tells us what happened on it.
They recently just uploaded a video on privacy. That's actually funny 😂
Man, these guys are really helping to make UNIX & Linux distros more popular
It's times like this that I'm glad I've stuck with dumbphones for so long, even though I am pretty much fighting progress at this point. I seriously dread the day I finally get a smartphone.
wait a minute. if it's an ai, how did they train it?
Good question
by feeding it tons of images that would get us tossed in a nice hotel with grey bars and doors.
Watch the SNL skit where Dwayne Johnson trained his evil robot.
You know how…
The glowies have literal terabytes of material in their possession, which they very likely fap to when they aren't arresting and prosecuting innocent citizens for "evidence" they likely planted in the first place.
Of course they are willing to share it with a powerful corporation. It's certainly not illegal when THEY possess and distribute it amongst one another...
Question: Can you do a monthly news update on these kinds of topics? This was helpful for those of us who use Unix/Linux but have to interact with macOS/Windows for work.
I can't remember where I heard it, but even if image scanning isn't actually involved, Apple will at least scan the hashes of the files and see if they match up with their database. This is still bad, because once this technology gets to governments, they can scan and look for ANY kind of file, such as a picture of Xi Jinping photoshopped onto Winnie the Pooh.
> image scanning isn’t actually involved
> scan the hashes of the files
What do you think is the process of getting a hash of the file, lol?
What do you mean photoshopped? He already looks that way.
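The sarcastic reply above has a real point worth spelling out: computing a file's hash requires reading every byte of that file, so "only scanning hashes" still means the scanner reads your data. A minimal sketch:

```python
import hashlib
import os
import tempfile

def file_sha256(path: str, chunk: int = 1 << 16) -> str:
    # The digest is a function of the file's entire contents:
    # there is no way to produce it without reading the whole file.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Quick demo on a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
digest = file_sha256(tmp.name)
os.unlink(tmp.name)
print(digest)
```

So "we only check hashes" is a statement about what leaves the device, not about what gets read on it.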
You’re def on a watchlist for advocating privacy
Apparently this was a bug in previous versions of macOS and the request was actually empty. Louis Rossmann made a video talking about the update; I would recommend it.
It is not the place of large businesses to "protect children." That's the job of parents, teachers, chaperones, and supervising adults in person. *Companies "protecting children" is never benevolent.*
What interests me the most is this: if Apple is training a closed-source AI in house to look for CSAM, they are bound to have a training set in their possession or provided to them. Is the existence of such a training set not highly immoral? What if there is a bad actor involved somewhere and a leak occurs?
You're right, someone somewhere is being paid to possess child porn for the purpose of training AI. The mere possession of CP is indeed illegal.
Some engineer at Apple is just looking for an excuse to jack off all day and came up with this project.
Hash-based detection works for known files. The real problem comes in with hash collisions: if your graduation photo happens to generate the same hash as a previously identified CSAM image, even though the photos are completely different, you are out of luck until a real human bothers to look at the file.
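One caveat to that comment: at full length, an accidental SHA-256 collision is essentially impossible (none has ever been found), so the graduation-photo scenario really applies to perceptual hashes, which are deliberately fuzzy and do collide. A toy demonstration, truncating the hash to one byte to shrink the space, shows why any small effective hash space forces human review:

```python
import hashlib

def tiny_hash(data: bytes) -> str:
    # Truncating SHA-256 to a single byte (2 hex chars) makes collisions
    # trivial to find: a toy stand-in for the small effective matching
    # space of a fuzzy/perceptual hash.
    return hashlib.sha256(data).hexdigest()[:2]

def find_collision():
    # Birthday search: with only 256 possible values, a repeat shows up
    # after a few dozen inputs at most.
    seen = {}
    i = 0
    while True:
        h = tiny_hash(str(i).encode())
        if h in seen:
            return seen[h], i, h
        seen[h] = i
        i += 1

a, b, h = find_collision()
print(a, b, h)  # two different inputs, same truncated hash
```

The wider the matching tolerance, the more innocent files land on a "known bad" hash, which is exactly why a human-review step is unavoidable in such a system.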
This video either needs to be cleared up or outright deleted, since it was never proven that the network connections made by mediaanalysisd actually report results of scanning images for "CSAM". And it's coming out now that this was an outright bug... Nice video though.
mediaanalysisd is related to Visual Lookup. I have my own qualms with the inability to disable it, but it's not inherently related to the whole CSAM thing. It's also been around forever in one form or another.
Oh wow, it's as if when we give them an inch, they take a whole mile. Don't stop raising awareness, MO! Let's help by sharing this video.