@@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to its customers and media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.
@@MidNiteR32 Nah, I think it was the right move. First of all, Craig is more agreeable, and second, the risk is lower. And to be honest, Craig handled it admirably, imo.
3:21 “pornography of any other sort” I’m glad Craig essentially said that Apple knows and understands that people simply have nudes on their phones
@@nicolelea615 My friend, there are people out there who are as bad as mentioned, but there are also people who just have private photos of their partners/spouses. Don’t put everyone and everything in one group; it’s not fair. Hope you understand (:
@@albinjt Probably because Apple restricts certain channels on Telegram. It’s insane that I have to go to the browser version of Telegram to view those restricted channels.
@@bigrich9654 What sort of channels are they restricting tho? And what are the restricted channels on iOS you’ve been accessing? Could ya send us the links?
If they are stored on their servers in this day and age I feel as if it’s your fault for trusting big tech. Either way, we’ll all forget about this in a couple of weeks. We basically already have
If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone. If ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved that the vax was ineffective and self-immunity has an 80% success rate at beating the virus, there’s no telling what those greedy blasters will do.
Now that this whole news has gotten out, actual pedophiles aren’t going to be storing their photos on iPhone anymore. So basically this feature is useless now.
The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news. It was always one of the easiest ways to get caught.
First of all, thank you for covering the issue. I wish you had pressed him on what type of audit he mentioned, because to me, anyone could force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening instead of taking Apple at its word.
@@zonka6598 If that’s what works for you and gives you a sense of privacy, then by all means, but just note that if you use Google, they’re already doing it, and worse, so yeah…
“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.” -Warren Buffett Apple is feeling this hard, hence the panicked response to media.
@Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize; they're doing this for margin of error.
Tim Cook is the CEO; Craig is the software person, he knows what everything does, he made it. Tim Cook does not do software. And anyone who stores things in the cloud doesn't own the servers; all they own is the main device storage. They don't scan on the device, they scan in the cloud only.
It's definitely a little more complicated. I can "own" a car, but there are a lot of restrictions on what I can do with it or to it, especially if you want to use it on a road. Ownership doesn't really imply full control most of the time; even with land, you have tons of laws limiting what you can do with it.
@@joshgribbon8510 Exactly we as consumers don't really own anything anymore and that's the world over, we don't have any rights just privileges until someone decides to take them away.
I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos it just makes me feel uncomfortable.
@@johansm97 lol I don't understand how people think everything in their iPhones aren't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds
@@7billza they actually aren't, facial recognition on iphone is done on device, apple doesn't scan anything, it's the only company that believes in privacy
@@bouzianenadhir8503 If they did, they wouldn't have destroyed end-to-end encryption to Apple's servers, aka iCloud. If they can snoop around while a photo is uploading to the cloud, it's not end-to-end encrypted. It's not private. As simple as that.
The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the abuse of children. Catching new material is the obvious next step. And there's no way to achieve that with the current hashing architecture. It has to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda ok* as long as it doesn't evolve a single step beyond what it is now. And there's virtually no chance of that happening.
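The limitation described above, that hash matching can only catch *known* material, can be sketched in a few lines. This is a toy illustration with made-up image bytes: plain SHA-256 stands in for a perceptual hash like Apple's NeuralHash, but the matching logic is the same set-membership idea.

```python
import hashlib

# Hypothetical database of hashes of *known* flagged images.
# (Plain SHA-256 stands in for a perceptual hash here.)
known_hashes = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def is_known(image_bytes: bytes) -> bool:
    """A match can only occur for content already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_known(b"known-image-1"))    # True: already in the database
print(is_known(b"brand-new-image"))  # False: newly created material never matches
```

Newly created material by definition has no entry in the database, which is the commenter's point: catching it would require a different, classifier-style mechanism.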
Yeah, this tech should evolve, because this step alone doesn’t solve the problem. Regardless, it was either going to get created to do the right thing or the wrong thing. That’s just how it works. For now, its use case is positive.
Same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by people of the groups to identify people for political persecution.
Yes. Even if we take them at their word and accept that they can't see other photos, because they can only see the ones the neural network has very tightly matched, they still haven't said anything about the possibility of them searching for other stuff.
I'm sure your ISP, phone provider, Google, Facebook (including Instagram), and any other social media or messaging platform do that. If you truly care about privacy, you have to get an open-source operating system and only use open-source apps. There's no way around it.
I’m glad that she pushed the “who owns your phone” question and the conclusion. I applaud WSJ for pushing the exec in a way that didn’t feel like a scripted Apple BS interview. Now, how do we know the pictures being provided by those associations won’t be manipulated to search for other stuff? At the end of the day, Apple has no idea what those hashes are. Who knows what the hash provided was.
As a longtime Apple customer (since 1986), I was thrilled with Tim Cook's statement about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law, and you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company, and bought an Apple Watch because of Apple's supposed commitment to privacy. You are not the government, so if you abuse my privacy I have no recourse; you can do whatever you think is right. There are only two operating systems in the world, and we just have to accept that Big Brother Apple, like Big Brother Google, knows what’s best for the unwashed. We have just about as much recourse as people in China.
Or: Apple resisted government parties such as the FBI and CIA for so long (with past features such as destroying all user data if the phone's password is typed wrong 10 times) that it could jeopardize the company. Donald Trump single-handedly managed, with an executive order, to stop Google from providing the official version of Android and its services to Huawei, and Huawei was almost ready to exit the market. Now imagine Apple being forced to show all users' iCloud data to governments over child pornography claims even though you don't have any. That would suck for them and for users' privacy. Apple (for now) found an in-between solution that still protects legitimate users' data on iCloud and protects Apple from governments by finally giving them an actual "backdoor" after many years (as it seems from Craig's tone). The only way this feature gets out of hand is if it expands to political parties or political correctness, such as someone posting an LGBTQ meme that seems insulting in Apple's eyes. Then things will not look good for Apple.
@@milantoth6246 It was extreme. My concern is the fact that the internet and media companies are becoming a necessity. Most businesses and utility companies assume you have internet access. The problem is that the tools you need to access the internet come from companies that can make arbitrary decisions that change your access, and you have no recourse. There are really only two operating systems, Apple's and Android, both from private companies.
He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now. Why are you in an uproar now? When they first said your phone was Private, why didn't you roll your eyes and say "ya, but what about the future?"
@@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists and Apple has no excuse that they “can’t comply.” They’ve already stated that they developed the technology to comply.
People are upset because they don’t understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.
This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be encrypted so that no one will know what content matches what "hash of interest," on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.
The explanation is literally a lie. "We don't process the images on your phone; here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg... scanning on your phone, yes, but... eoitgoiehgeihgo." That could be the TL;DW of the video, tbh.
The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. And that's the way to protect our privacy while trying to follow the laws of the US and EU, which want more and more supervision.
Oh, it gets worse. At some point audits (real humans) get involved, and at that level who knows what can happen. And if anything did go awry at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorism)? In fact, the timing is extraordinarily on point with recent updates on terrorism.
@@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing potentially removes that control from your hands: any data on your phone may be monitored without you ever granting that right. The only thing preventing them from doing that is their goodwill. Technology history taught us that you should never trust anyone's word to prevent technology abuse (be it knowing or not).
This was a weak interview. Craig threw some big fancy words when asked to simply describe the system. No hard questions asked and this seemed more like a PR move than an interview
"Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.
@@ssud11 Yep, they also pay through future access to things, not just through money. So if they cover this in a way that Apple likes they get future access to news first because they're seen as trusted.
"It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match." Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.
@@ifiwantyoutofeel No, it's the fact that it might be a faulty system. How can it differentiate between an image of a child posing in a sexual manner in lingerie and a baby taking a bath? Will it flag both, neither, or one of those images? Simple things like that can really impact a person's future.
It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control in PR terms, much like the 'Apple purposely slows down phones' headlines came out of the throttling-due-to-battery-age thing. This loyal base kind of sets the tone for the sentiment around Apple, and right now they are seething and the issue isn't going away. I don't think the way Craig handled this will do anything to dampen the concerns either: condescendingly dismissing the backdoor concerns while giving no details on how it will be expanded or how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR; this is making people question just what Apple has been up to before that their slick PR glossed over. Some kind of line has been crossed here that I've never felt or seen in my 25 years of using and following Apple.
I feel exactly the same way. I’ve been apple only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I’m not in love anymore…
The Apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans," I don't doubt their loyalty anymore. I buy the stock and get "rich" with the winning team. Public backlash needs to be HUGE to stop this, and Apple's hard-core fans aren't revolting against Apple. I've put my money on that.
They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well that’s not happening.
@Apple Genius Joke's on you, I don't use Samsung or Apple. Sad for u, Apple fanboy: you pay 3X the price of a smartphone just to get a small battery, low storage/RAM, an ugly notch, slow charging, and a 60Hz display in 2021. LOL, so sad. Just make sure u don't break the glass; the repair price is more expensive than an entire Android phone.
It doesn’t matter what the steps are between if A is uploading a photo and Z is them reviewing/alerting authorities. They “Review your private photos” despite the letters in between. Don’t get lost in the steps.
This question can and should be put not to Apple directly but to the government and the entities responsible for data security oversight. All companies have similar or identical technology, and unlike Apple, they've been using it for decades now.
7:05 should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, thus our backdoor. One could also try to weaken the hashes so they cover more pictures, prompting manual verification on all kinds of pictures. In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity."
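The trust problem raised here, that the client cannot tell what an opaque hash list targets, can be sketched as follows. This is a toy model with invented content strings and plain SHA-256 standing in for a blinded perceptual hash; the real deployment is cryptographically different, but the auditability issue is the same.

```python
import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The phone only ever receives opaque digests, so it cannot tell
# what the list actually targets. All content names are made up.
database = {h(b"known-csam-sample")}          # what the list is said to contain
database.add(h(b"winnie-the-pooh-meme.jpg"))  # what a state could quietly add

def flagged(photo: bytes) -> bool:
    return h(photo) in database

print(flagged(b"winnie-the-pooh-meme.jpg"))  # True: the meme now matches
print(flagged(b"vacation.jpg"))              # False: everything else passes
```

From the device's side, both versions of the database look identical: a bag of digests. Auditing what was added is exactly the part no outside party can do.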
I’m so lost on why people are upset 🤷🏻♂️ So what if a manual verification prompt occurs because we have too many Winnie the Pooh pictures? Are you saying that Disney could then advertise to us more or something? Apple isn’t going to report you to the police for having Winnie the Pooh on your phone.
@@UnkleRiceYo They will if you’re in China. That’s the point, slick. In China it’s a hidden law not to have the photo referencing their leader as Winnie the Pooh, so they arrest people who do. Apple’s software could easily be rolled out to match the picture and report people in China.
@@JamesTurfKing So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.
@@Dlawderek I believe you don‘t understand the issue. A country like China could say, „Hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g., Hong Kong freedom activism photos).“ If Apple then goes, „Nah, we promised our customers not to do that,“ China could go, „Do it or you‘re no longer allowed to sell your products in China“ (a huge market that brings a lot of revenue). It really isn‘t hard to understand. The problem is not what Apple is doing but the possibility of misuse.
Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.
I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.
But isn’t it better than having the door wide open as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.
@@Dlawderek Absolutely not. However good their intentions are, I will never agree to having my private data monitored. Something many forget is that one’s privacy is protected by law. Even if police were to illegally obtain legitimate evidence against someone (be it through illegal wiretapping or otherwise), that evidence would be rejected as unlawfully obtained. What Apple is doing here is basically rephrasing “we will hack into your storage and check if you have anything illegal” as “we will scan all your photos, and if you don’t agree then we will stop providing service to you even if you paid for it.” Barbarism.
@@Dlawderek NO. If you upload to a cloud service, it's not your hardware or a private space. Apple is now saying my hardware is actually theirs too, to do with as they please...
Even more than that, this whole thing probably started with China, because Huawei got banned. So the CCP lost its surveillance tools and turned to Apple for an answer. What else could force Apple to suddenly launch such a contrary program?
@@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.
Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.
I’ve been repairing Apple products for 2 years, and aside from battery replacements I wouldn’t recommend that people who aren’t techie, qualified, or confident do other things like replacing screens, Lightning ports, Face ID sensors, etc.
7:18 This is FACTUALLY WRONG. First, only the database of HASHES of CSAM is stored on the device. By the nature of hashing, it’s IMPOSSIBLE to recover the source image that produced a hash, meaning you CANNOT CHECK for yourself whether any given image will be classified as CSAM without trusting the US authorities’ database to include only relevant hashes (and not hashes of political or religious content). In fact, I believe even Apple has to trust the authorities to provide relevant hashes only. Second, at 7:30 Mr. Federighi didn’t mention that the system will launch in the US only, which is how he could say that the database will be THE SAME. No, there exists no universe in which the Russian or Chinese government will allow iPhones to be shipped with magic hashes that were SECRETLY produced by the US authorities. I bet the first thing they will do, perhaps justly, is DEMAND either ACCESS to the US database or (more likely) the use of their own databases for their citizens. And I can guarantee that those databases will match against WHATEVER doesn’t suit the government. Finally, here’s why all of this is an attack on our rights: there exists a way to bypass this mechanism by simply turning off iCloud Photos. In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason.
"In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason." Lolololol! Nicely put.
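The one-way property the parent comment leans on is easy to demonstrate. SHA-256 is used here; NeuralHash is a different construction, but the irreversibility argument is the same, and the photo bytes are made up.

```python
import hashlib

# A digest reveals nothing about the image that produced it.
digest = hashlib.sha256(b"some-photo-bytes").hexdigest()
print(len(digest))  # 64 hex characters, regardless of input size

# "Checking for yourself" what a hash list contains would mean
# guessing candidate images and hashing each one; without the
# source images, the list is unauditable.
candidates = [b"some-photo-bytes", b"another-photo"]
matches = [c for c in candidates if hashlib.sha256(c).hexdigest() == digest]
print(matches)  # only an exact byte-for-byte guess reproduces the digest
```

This is why everyone downstream, including Apple, has to take the database provider's word for what the hashes represent.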
@@Eugenepanels yeah it’s really strange how they try to underplay this change in a way. They should have done a comprehensive press release from the start, considering how important this change is.
Dude, the whole thing was leaked before they could properly present it. That’s why it’s causing problems: it wasn’t officially presented by Apple.
@@blackhatson13 They are big tech companies, and they follow many rules and regulations; it isn’t that simple for every employee over there to come and view our private iCloud photos…
Didn't Facebook also do this with its team of "human moderators"? I'm not saying it's not going to invade my privacy, but without those human moderators, we could be seeing terrorism, porn, and other nasty things in our messages/chats.
I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.
Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.
I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.
@@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?
@@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare), it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.
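The 30-match threshold mentioned above can be put in rough numbers. This is only a sketch: the per-photo false-match rate and library size below are made-up illustrative assumptions, not figures from Apple.

```python
def prob_at_least(k: int, n: int, p: float) -> float:
    """P(at least k matches) for a Binomial(n, p) count, computed via
    the pmf ratio recurrence so no huge binomial coefficients appear."""
    pmf = (1.0 - p) ** n              # P(exactly 0 matches)
    total = pmf if k <= 0 else 0.0
    for i in range(1, n + 1):
        pmf *= (n - i + 1) * p / (i * (1.0 - p))
        if i >= k:
            total += pmf
        if pmf == 0.0:                # underflow: remaining tail is negligible
            break
    return total

p = 1e-6      # assumed per-photo false-match probability (illustrative)
n = 10_000    # a large photo library (illustrative)
print(prob_at_least(1, n, p))   # ~0.01: one stray match is plausible
print(prob_at_least(30, n, p))  # effectively zero under these assumptions
```

Whatever the true per-photo rate is, requiring 30 independent matches drives the chance of a purely accidental account flag toward zero, which is presumably why the threshold exists.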
From apple: “Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands” So once again… you’re missing the point. Yes, a government could force this… but trust us. Why should we trust them? Who’s the next leadership team? Apple needs to stop this now. I’m honestly considering breaking up with them for the first time in 30 years.
@@hsing-kaichen5062 They also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to their iCloud data center in China, pull the physical data, and they have Chinese iPhone data. They already caved to China once. China will ask to add their own CSAM database; will you disagree then? And what's in that database? We won't know.
"Who owns this phone?" "Well, customers do, but good luck running any software other than ours on it." Answers to moot questions keep average consumers misinformed.
@@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?
@@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.
He gave a vague answer. In the future Apple is planning to scan our entire phones. People like you still don't understand and still think that Apple is god, that whatever they do is perfect. I feel bad for you, brother.
@@ShubhamKumar-xu2od Mm, nice one. It hadn’t occurred to me, but agreed. This is the worry with technology: little by little, we’re getting to that point.
They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.
@@cmtheone I’ve worked with law enforcement to put predators in jail before. It’s funny that you’re too dim to see how having your privacy tampered with in the name of the greater good isn’t concerning. Then again, you’re the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance friend.
@lol What makes you think child predators will store their photos on their phones? Same idiocy as using gun registration to stop violent criminals using guns to rob a bank.
Actually, they aren’t scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. The way hashes work, they cannot be decoded, and the only way to identify one is to have a matching hash. Thus, the only data that is “revealed” in this process is CP data, which should be banned. However, this is not to say that I agree with what they are doing, or that I don’t recognize the potential of what this may become as it relates to privacy, but the fundamental feature doesn’t actually breach privacy unless the user uploads CP.
Hashing isn’t scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash doesn’t know the contents of the file; it just produces a number (the hash) that can be used during transport to check whether errors occurred (a checksum is calculated at the source and destination) and whether the data needs to be sent again. Did you guys take a computer networking class or not?
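One nuance worth adding to the two comments above: the checksums described here (CRC or SHA-style digests) change completely when a single byte changes, which is exactly why image-matching systems use a *perceptual* hash instead, one designed to survive small edits. A toy contrast follows; the "perceptual" function is a deliberately crude stand-in, not NeuralHash, and the pixel lists are invented.

```python
import hashlib

def crypto_hash(data: bytes) -> str:
    """Cryptographic digest: any change to the input scrambles it."""
    return hashlib.sha256(data).hexdigest()

def toy_perceptual_hash(pixels, buckets=4):
    """Crude perceptual stand-in: average coarse blocks, then quantize
    hard so small pixel-level noise disappears."""
    step = len(pixels) // buckets
    return tuple(sum(pixels[i*step:(i+1)*step]) // step // 32
                 for i in range(buckets))

img  = [100, 101,  99, 100, 200, 201, 199, 200]   # "original" image
edit = [101, 100,  99, 101, 199, 200, 201, 199]   # slightly edited copy

print(crypto_hash(bytes(img)) == crypto_hash(bytes(edit)))    # False
print(toy_perceptual_hash(img) == toy_perceptual_hash(edit))  # True
```

So the matching does depend on image content, unlike a transport checksum, while still being irreversible in practice; both commenters are partly right.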
I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront. Then Apple legal, corp comm and marketing can train the two of them with exactly what to say that will answer SOME questions, but not enough to commit to anything that could lead up to being used in a courtroom or in Congress against them.
If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.
Child pornography is really bad, scan them, and alert the authorities. Stop child abuse. Authorities please take action especially when that person is very politically connected like your policy makers.
In a practical sense, Apple at least has good intentions here. It's unarguably a good thing that they plan to track down phones that have child abuse material on them. I do see why people are mad, though. Apple has a long history of keeping its customers' information secure, and this seems like a slap in the face to those who use iPhone because of its security.
Because Apple has always been so vocal about privacy and not letting other apps track your data. And also because Apple has started showing ads on its platforms and hiring people to create a targeted ad network of the sort it has opposed for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on Google Drive, but it's not a big deal because they never said they wouldn't do it, or claimed that tracking personal data is a bad thing, like Apple has.
They actually cut away to the Apple campus over any line Craig said that could be used to make memes! That means there had to be an agreement for this interview. I wonder if it includes other limitations as well, since the interviewer didn’t pressure Apple that much. This feels more like Apple marketing than journalism.
Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.
I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.
"I think our customers own their phones, huh, for sure." Too bad his thinking is not reflecting what is really happening... #righttorepair p.s. this is not an interview, merely a communication from Apple...
Apple is like a man who comes up to a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don’t believe it and don’t buy it. The point is not “a safe way to scan phones.” The point is “DON’T scan my phone.” Don’t means don’t.
I like your woman-in-the-shower analogy, but you should have elaborated more on that story. It left me wondering what happens next. When can the man open his eyes?
Though I was concerned about the privacy aspect at first, I think Craig explained it pretty well here, and I kind of get his viewpoint from a software development/iCloud as a service standpoint. 1. *Without* cloud services, the device itself is secure and encrypted. 2. When you're using iCloud, when images are uploaded, they sort of perform a comparison to the reference CSAM image database on-device (which I guess is a trained neural network to flag that part), and upload it with the actual photo. Actual photo never gets opened, only the CSAM neural net jumbled encoding gets processed in the second half of the neural network in the cloud. Now the 2nd part may sound like potential invasion of *on-device privacy*. Maybe. From my experience even "secure" cloud services like MEGA or such do routinely flag accounts for takedown when hashes of the files match copyrighted content. I think the major difference is in the way the actual cryptography is performed. As far as I understand. With most cloud providers, they scan all files in the cloud, like, the whole hashing/neural network encoding stuff is all done on your files in the cloud. But in Apple's case, as far as Craig says it's more like.. Hey we don't want our servers to know this much about the original image, we'd rather have part of the encoding performed on device, then sent over to the cloud. This benefits both Apple and people in that the cloud doesn't need to perform as much workload (maybe), and the cloud doesn't need access to the actual image in order to perform encodings on it. Put simply, I think the way they see it is more like... 1. "That on-device encoding is better, because now we still can't access user data *while also complying with the law on cloud services, without being able to give the authorities the actual data they have*". 2. Processing images on-device considered as *part of the uploading process*, rather than considered as "scanning of all photos". 
With recent iPhones sporting ever more advanced neural-network accelerators, I think it's a logical step from a development standpoint to simply perform an encoding of the images on-device (very efficient for the server, and the server never needs to go through the actual image data). So on-device security is not compromised; it's just that part of iCloud support is built into iOS, and that part isn't used at all when you don't use the cloud services (i.e. Craig's "and you don't have to" around 2:34). It's the iCloud uploading process that does the scanning. And so iCloud is still secure "data-wise": nobody can read your data, *but* your cloud account can still get flagged for suspicious activity. Pretty neat engineering, but whether it's an invasion of privacy is up for debate. On-device still sounds pretty secure to me.
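The encode-on-device / match-in-the-cloud split these two comments describe can be sketched in a few lines. This is a toy sketch only: SHA-256 stands in for the perceptual NeuralHash the real system reportedly uses, and every function and variable name here is made up for illustration.

```python
import hashlib

def device_side_encode(image_bytes: bytes) -> str:
    """On-device step: turn the photo into an opaque encoding before upload."""
    return hashlib.sha256(image_bytes).hexdigest()

def cloud_side_match(encoding: str, known_hashes: set) -> bool:
    """Server step: compare only the opaque encoding, never the photo itself."""
    return encoding in known_hashes

# The server holds encodings of known flagged images, not the images themselves:
known_hashes = {device_side_encode(b"known-flagged-image")}

print(cloud_side_match(device_side_encode(b"holiday photo"), known_hashes))        # False
print(cloud_side_match(device_side_encode(b"known-flagged-image"), known_hashes))  # True
```

The point the sketch makes is the one in the comment above: the server only ever sees the opaque encoding, so a non-matching photo reveals nothing, while a match is decided entirely by set membership.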
Completely missed the point then. It's not about CSAM, it's about putting scanning code onto devices, meaning in the future another update comes along to scan for "potential" terrorists, and so on. It's the concept of putting scanning code onto devices in the first place that's the issue here.
@@DrumToTheBassWoop No, really, I understand people's concerns over the code being there on the device. But from an engineering perspective, iCloud is just another piece of preinstalled software, which is optional to use. Your perspective is that iCloud Photos is a core part of iOS and that scanning happens on every phone. To me, it's just another service that is optional and happens to be preinstalled, and won't ever be used except for the exact action of uploading photos to iCloud. Kind of like: what's on your device is yours, but if you want to put stuff in this cloud service, then the service will scan it before uploading. Like I said, it depends on the perspective. I see it as just another cloud service that happens to be preinstalled; you see it as a major intrusion in the OS itself. The only way I would consider this an invasion of on-device privacy is if the uploads/analyses were performed without disclosure. This announcement is mostly just them saying their cloud services *will* comply with the law.
You can just opt out by deactivating iCloud Photos, and if that still isn't enough for you, switch to Android or BlackBerry, build an OS yourself or pay someone you trust to do it for you, or just use pigeons, or don't ever install iOS 15!
Another part of this issue is the idea of who owns the content, regardless of where it is stored. If the police need a warrant to search a safety deposit box at a bank, shouldn't Apple need a warrant before searching photos? It's about fiduciary duty and trust. If someone is purposely posting items to a public location, by all means, search away. But when photos are being privately stored in the cloud, it feels very invasive.
@@crusherman2001 And that is the issue. When I store paper files in a safety deposit box, (1) the bank can't nose through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy, this could be manipulated into something very Big Brother...
@@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (the smell of weed coming out, for example), they can report it to the police, who will then need reasonable suspicion or a warrant to search your stuff. This proactive searching and reporting of people to the authorities is not only a terrible invasion of privacy, it's one that could create legal issues for innocent users.
With Dropbox, Google, and MS, this is happening now. Apple wants to safeguard iCloud. That's why they came up with this (complex) solution, so they don't have to look at all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read. This solution is a counter-offer to the US's and EU's intrusive laws in the works for "child protection" (a scapegoat for sniffing through all our cloud data and communications).
@DataHearth Apple should really implement proper zero trust. The feature is OK, but it's a matter of whom they trust. As a developer, I would never touch this sensitive an area of users' data.
There's no confusion: you are using child pornography as an excuse to scan people's phones and literally spy on them. Who is Apple to decide they will police people? Drop Apple phones fast. This is not OK; they will use any wording to confuse you or convince you this is fine. Watch Edward Snowden's video on this BS.
And that's why companies hire likeable people to do their pr campaigns. And it worked. Hence why I own the stock. Apple could kill your kids, and you'd still buy the phones.
It sounds like a blind raid without probable cause or a warrant: they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.
As a tech worker, I don't think we misunderstood you, Apple. You are still using MY iPhone's computational power to do something you want to do, without my consent (generating the hashes of photos, described by Craig as the "first half" of the process). I don't want to be a part of YOUR company's sense of social responsibility. I should have a say if you're going to use MY phone for anything. I am extremely disappointed with Apple (but thanks to WSJ for this interview). And as a 10-year loyal iPhone and Mac customer, I will reconsider whether I should use Apple products if this 'feature' goes live.
Craig Federighi looked highly uncomfortable during the interview and his explanations were a concerning mess. Thumbs up for the questions made by the reporter, they usually go much softer on them.
I don't get why the hashing has to be done on my phone. That can easily be done once the image is in iCloud (if you choose to use that service). Kind of scary that my phone is generating hashes to identify the files I store on my device. You don't need to think too hard to realize this can be weaponized by the government.
@@alexkobzin557 Yes, but not to identify my files by comparing the hashes to a government database. As I said, that can be weaponized if you swap the CSAM database for any database of content the government doesn't like. And no, I don't approve of CSAM. My concern is about privacy, something Apple has been using over and over to sell their products.
Actually, hashing is a security feature meant to protect the user. It means that your data isn't in plain sight. So your photo from Utah might look like "hwhizb&37$;8€![€|*..." when stored on a server instead of the actual image, which is what you want in case their database is ever breached. On-device hashing happens for the same reasons.
@@CalienteFrijoles They have the hashes in the database; therefore they have both the hash itself and the file that generates that hash. Yes, Apple cannot see the content directly, but if the hash matches, the government can go to their database and check the file. Now apply this hashing to a database of content the government doesn't like and you have an easy way to identify people who are 'against' the system. This has the potential to become something similar to the social scoring system China already has.
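The worry in this sub-thread, that the matching logic is generic and only the supplied database decides what gets flagged, can be sketched like so. Toy code only: SHA-256 stands in for the real perceptual hash, and both "databases" and all filenames are invented for illustration.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def flag_matches(files, watchlist):
    """Indices of files whose fingerprint appears in the supplied watchlist."""
    return [i for i, f in enumerate(files) if fingerprint(f) in watchlist]

photos = [b"cat photo bytes", b"protest-flyer bytes", b"sunset bytes"]

# The matching code is identical in both cases; only the list differs:
csam_list = {fingerprint(b"known-abuse-image bytes")}
other_list = {fingerprint(b"protest-flyer bytes")}

print(flag_matches(photos, csam_list))   # nothing in this library matches
print(flag_matches(photos, other_list))  # same code, different database: photo 1 flagged
```

Which is exactly the concern: nothing in the mechanism itself constrains what the watchlist contains.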
They even made a cut when Federighi said "pornography" so the Apple campus won't be in the background if it gets memed. Sneaky little weasels... I can see your tricks!
While I am against this, that is simply not true. They don't even scan images that you took yourself (apparently), only images that came from another source and are then uploaded to the cloud.
I appreciated that she did this fiercely straightforward interview (mostly a kind of interrogation) for the good of every Apple device user. Thank You ✌🏼
The real problem is that this technology could be used to identify protesters and activists in the name of "CSAM". Any of your photos, and that photo's metadata, might be read by Apple's employees, because Apple can't guarantee the accuracy of the neural network. My real concern is that law enforcement could take advantage of that, forcing Apple to access someone's gallery to identify social activists, with Apple explaining it away as a technical defect.
The database they refer to for these images comes strictly from NCMEC, and NCMEC does not have a database of protesters and activists. Governments do not get access to the CSAM-verified images (CSAM images only, I repeat) until the manual verification process, which only happens after repeated algorithmic matches.
Apple has blocked law enforcement before from getting private information off the phone of a literal terrorist. If any precedent was set, it's that Apple values your privacy, and these what-if scenarios will not happen until they do. And even if one does happen, Apple will be held accountable for it, as Craig mentioned.
@@frappes_ That's not quite the objection at hand. It's NCMEC today; the same approach could be applied to any other database. As for the precedent, access was denied in the United States, where courts are able to protect Apple from arbitrary requests, and end-to-end encryption was also part of the argument. It's a different ballpark when it comes to authoritarian regimes.
@@justshad937 I get that, and I share that concern, but based on Apple sources this will only launch in the US for now. I know it can become a slippery slope, but I have faith in the precedent set by Apple in denying governments backdoors to their technology, and even if it does get that bad, the choice for consumers will be clear: do not buy Apple stuff anymore.
@@frappes_ That government could simply make it illegal for Apple to disclose this information, "in the name of national security", the same way US telcos didn't disclose NSA surveillance.
Here is the issue: the US government cannot just go around, go through each device, and report what is on there. So neither should Apple, or any other private company for that matter. This is a much bigger issue than it is being presented as.
Even if this feature remains harmless, the implications of future abuse of said feature are bad. Apple should stop this before it gets worse for their reputation.
WSJ: We would like to ask you a few softball questions about the new update. Apple: OK, sounds good, and we'll send 1,000 new iPhone 13s so you can "test them out".
Real title of video: Tim Cook throws Craig to the wolves.
🤣🤣🤣🤣
bro what 😂
@@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to its customers and media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.
@@MidNiteR32 nah. I think it was the right move. First of all Craig is more agreeable and second the risk is lower. And to be honest Craig managed it formidably imo
@@MidNiteR32 nonetheless your comment is quite on point ;)
3:21 “pornography of any other sort” I’m glad Craig essentially said that Apple knows and understands that people simply just have nudes on their phones
Yeah, some people are simple degenerate pigs, but not actually pedophiles.
@@nicolelea615 yes, and some people are photographers and part of that is nude photography, not porn.
@@nicolelea615 They might be photos of a spouse or partner. Or photos people took to track weight loss/gain progress.
@@billjamal4764 Like you, your dad, your uncle, etc.
@@nicolelea615 My friend there are people out there that are just as bad as mentioned but they are people who have private photos of their partners/spouses don’t put everyone and everything under one group its not fair hope you understand (:
Loved how he mentions Telegram as the message app.
No free mentions for you, Zuck.
Damn Facebook. Still don't get how their acquisition of Instagram and WhatsApp went through, oh wait...
Yet Telegram founder hates Apple
@@albinjt So what's the benefit of loving Apple?
@@albinjt Probably because Apple restricts certain channels on Telegram. It's insane that I have to go to the browser version of Telegram to view those restricted channels.
@@bigrich9654 What sort of channels are they restricting tho ? And what are the restricted channels in iOS you have been accessing ? Could ya send us the links ?
Apple: We're not scanning your images, we're just scanning your images
We're not scanning your images, we're just scanning OUR images.
If they are stored on their servers in this day and age I feel as if it’s your fault for trusting big tech. Either way, we’ll all forget about this in a couple of weeks. We basically already have
If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone.
If Ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved the vax was ineffective and self-immunity has an 80% success rate at beating the virus, there's no telling what those greedy blasters will do.
Correct me if I'm wrong, but in order to upload an image to the cloud, you need to scan the images first right?
There is plenty of information on how they "scan" the photos; it's even explained in layman's terms in this video.
Now that this whole news has gotten out, actual petafiles aren’t going to be storing their Photos on iPhone anymore. So basically this feature is useless now.
The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news.
It was always one of the easiest ways to get caught
@@hundvd_7 he says, sounding a little too informed
@@diedforurwins Sure, go ahead and call other people pedophiles. That will make you look smart.
“Petafiles” bro?
@@Dr.HouseMD down with those Petafiles!
Let’s make sure all the members of the Vatican have an iPhone
The pope just bought a Huawei phone
Or they picked the blackberry devices.
@Highground Trump and biden be sweating
Use iCloud*
@Highground We don't have to be selective. But if you want to be, the Republican party is where we should start
First of all, thank you for covering the issue. I wish you had pressed him on what type of audit he means, because to me, someone could force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening, instead of taking Apple at its word.
THIS!! I was pleased to hear about "auditability" -- but what exactly does he mean? Anyone got source / more info on that?
Before WSJ is allowed the privilege of interviewing Craig, they have to agree to terms and conditions.
He seems very shady in his explanation of what the company IS going to do.
So basically, avoid Apple's iCloud services.
@@zonka6598 If that's what works for you and gives you a sense of privacy, then by all means. Just note that if you use Google, they're already doing it, and worse, so yeah…
“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently.”
-Warren Buffett
Apple is feeling this hard, hence the panicked response to media.
Okay then purchase a Chinese phone.
It’s going to be misused like any other tool big tech and the government gets its hands on, period.
@Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize. They're doing this despite the margin of error.
Apple fanboys will just bend over and accept everything
@@The-Heart-Will-Testify They are the ones who are mad at Apple. Think before you comment.
Tim: “Hey Craig….”
Craig: “NO NO NO NO NO NO!”
Craig: “Hey everyone…😅”
Don’t get it
@@Jushwa It means Tim Cook told Craig to go do the interview.
Tim Cook is the CEO; Craig is the software person. He knows what everything does, he made it; Tim Cook does not do software. And anyone who stores things in the cloud doesn't own the servers; all they own is the device's storage. They don't scan on the device, they scan in the cloud only.
“I think the customer owns the phone”
It’s a yes or no answer
That's a big fat no.
That sounds like a yes to me?
It's definitely a little more complicated. I can "own" a car, but there are a lot of restrictions on what I can do with it or to it, especially if I want to use it on a road. Ownership doesn't really imply full control most of the time; even with land, there are tons of laws limiting what you can do with it.
@@joshgribbon8510 Exactly. We as consumers don't really own anything anymore, and that's the world over; we don't have any rights, just privileges, until someone decides to take them away.
@@mantasvilcinskas Definitely more complicated than a yes.
I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos it just makes me feel uncomfortable.
They already are. Don’t you see how your photos app can recognize faces etc? I think people pressed about this have things to hide
@@johansm97 lol I don't understand how people think everything in their iPhones aren't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds
@@7billza They actually aren't; facial recognition on iPhone is done on-device. Apple doesn't scan anything; it's the only company that believes in privacy.
@@bouzianenadhir8503 Then they wouldn't have broken end-to-end encryption to Apple servers, aka iCloud. If they can snoop on a photo while it's uploading to the cloud, it's not end-to-end encrypted. It's not private. As simple as that.
Don’t use iCloud then.
The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the ongoing abuse of children. Catching new material is the obvious next step, and there's no way to achieve that with the current hashing architecture. It would have to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda OK* as long as it doesn't evolve a single step beyond what it is now, and there's virtually no chance of that.
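The known-material limitation described above can be illustrated with a toy sketch. SHA-256 stands in for the perceptual NeuralHash here; a perceptual hash would also match near-duplicates of catalogued images, but the point stands either way: material that was never catalogued produces a fingerprint that simply isn't in the set.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The database can only ever contain fingerprints of already-catalogued material:
known_database = {fingerprint(b"previously-catalogued-image")}

# An exact copy of catalogued material matches...
print(fingerprint(b"previously-catalogued-image") in known_database)  # True

# ...but brand-new material is invisible to this kind of check:
print(fingerprint(b"newly-created-image") in known_database)          # False
```

Hence the comment's argument: catching *new* material would require a fundamentally different mechanism than database lookup.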
en.wikipedia.org/wiki/Slippery_slope
@@RHStevens1986 Sure, but also worth considering: en.wikipedia.org/wiki/Foot-in-the-door_technique
The typical American ignorance that radiates from this single comment is amazing.
iOS 16: they will start scanning your on-device photo library. Mark my words, guys 😎
Yeah, this tech will have to evolve, because this step alone doesn't solve the problem. Regardless, it was either going to get created to do the right thing or the wrong thing. That's just how it works. For now, its use case is positive.
The same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by members of those groups to identify people for political persecution.
Yes. Even if we take them at their word and accept that they can't see other photos, because they can only see the ones the neural network has very tightly matched, they still haven't said anything about the possibility of searching for other stuff.
All they need to do is change the hash and AI to look for other photos.
I'm sure your ISP, phone provider, Google, Facebook (including Instagram), and any other social media or messaging platform do that. If you truly care about privacy, you have to get an open-source operating system and only use open-source apps. There's no way around it.
THANK YOU, was looking for this. This is smoke and mirrors.
It's already on Gmail, Facebook, Instagram, Twitter and YouTube.
I'm glad that she pushed the "who owns your phone" question through to a conclusion. I applaud WSJ for pushing the exec on something that didn't feel like a scripted Apple BS interview.
Now, how do we know the pictures being provided by those organizations won't be manipulated into searching for other stuff? At the end of the day, Apple has no idea what those hashes are. Who knows what the hashes provided actually represent.
@@Karantkr Multiple photo apps do this..
This felt not scripted? The forced laughs, fake "searching for the right words", multiple camera angles and after all that this felt unscripted?
@@MrSidneycarton you expect a trillion dollar company to shoot an interview with a single camera? 😒🙄 multiple camera angles are an industry standard
Paid interview
@@nixednamode3607 Not sure whether that was intended as sarcasm or not buddy.
As a longtime Apple customer (since 1986), I was thrilled with Tim Cook's statement about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law, and you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company, and bought an Apple Watch because of Apple's supposed commitment to privacy. You are not the government, so I have no recourse if you abuse my privacy; you can do whatever you think is right and I can do nothing about it. There are only two operating systems in the world, and we just have to accept that Big Brother Apple, like Big Brother Google, knows what's best for the unwashed. We have just about as much recourse as people in China.
Or: Apple has resisted government agencies such as the FBI and CIA for so long (with past features such as destroying all user data if the phone's password is typed wrong 10 times) that continuing to do so could jeopardize the company. Donald Trump single-handedly managed to issue an executive order making Google stop providing the official version of Android and its services to Huawei, and Huawei was almost ready to exit the market.
Now imagine Apple being forced to show all users' iCloud data to governments over child pornography claims, even if you don't have any. That would hurt both Apple and users' privacy. Apple (for now) has found an in-between solution that still protects legitimate users' data on iCloud and protects Apple from governments, by finally giving them an actual "backdoor" after many years (as it seems from Craig's tone).
The only way this feature gets out of hand is if it expands to political parties or political correctness, such as someone posting an LGBTQ meme that seems insulting in Apple's eyes. Then things will not look good for Apple.
An American saying they endure anything like the tyranny in China is just ignorant.
@@milantoth6246 It was extreme. My concern is the fact that the internet and media companies are becoming a necessity; most businesses and utility companies assume you have internet access. The problem is that the tools you need to access the internet come from companies that can make arbitrary decisions changing your access, and you have no recourse. There are really only two operating systems, Apple's and Android, both from private companies.
You can de-google Android phones though, since it is open source. Check out Rob Braxman's channel on how to do it, if privacy is so important to you.
@@justinberman7386 along with Graphene and Calyx which pretty much only work on Pixels, there is also /e/OS which supports a wider range of phones.
He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now. Why are you in an uproar now? When they first said your phone was Private, why didn't you roll your eyes and say "ya, but what about the future?"
@@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists and Apple has no excuse that they “can’t comply.” They’ve already stated that they developed the technology to comply.
It feels like in China.
People are upset because they don’t understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.
@@carlosgomez-ct6ki The world has gone the same way everywhere. We want to have privacy, but every company/government wants to take it.
This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be ciphered so that no one will know what content matches which "hash of interest", on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.
The explanation is literally a lie. We don't process the images on your phone, here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg "scanning on your phone, yes, but," eoitgoiehgeihgo
That could be the TL;DW of the video tbh
@@user-hm7zn6bz4y It's literally not that hard to understand it
The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. That's the way to protect our privacy while trying to follow the US and EU laws that demand more and more supervision.
Oh, it gets worse. At some point audits (real humans) get involved, and at that level, who knows what can happen. And if anything did go afoul at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorism)? In fact, the timing is extraordinarily on point with recent updates on terrorism.
@@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing potentially removes that control from your hands: any data on your phone may be monitored without you ever granting that right, and the only thing preventing them from doing so is their good will. Technology history has taught us never to trust anyone's word to prevent technology abuse (knowing or not).
This was a weak interview. Craig threw out some big fancy words when asked to simply describe the system. No hard questions were asked, and this seemed more like a PR move than an interview.
Basically a paid interview for pr purposes.
"Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.
@@KarstenJohansson Because if it were checked on iCloud servers, people would go "oh no, they're spying on us".
@@IndexError And they wouldn't say that when the check is done on their personal device?
@@ssud11 Yep, they also pay through future access to things, not just money.
So if they cover this in a way that Apple likes, they get access to news first because they're seen as trusted.
"People have misunderstood" People are not stupid, we understand what you are doing and we have a problem with it
They are making it all up so they can look at our private images 🤬
"It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match."
Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.
Spying with extra steps
The files are on their servers
A backdoor to what? iCloud? Which Apple already controls?
Yeah... you don't need to upload your photos or use that service... or simply don't have CP.
@@ifiwantyoutofeel No, it's the fact that it might be a faulty system. How can it differentiate an image of a child posing in a sexual manner in lingerie from a baby taking a bath? Will it flag both, neither, or one of them? Simple things like that can really impact a person's future.
It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control in PR terms, much like the 'Apple purposely slows down phones' headlines came out of the throttling-due-to-battery-age thing. This loyal base kind of sets the tone for the sentiment around Apple, and right now they are seething and the issue isn't going away. I don't think the way Craig handled this will do anything to dampen the concerns either: condescendingly dismissing the backdoor worries while giving no details on how this might be expanded or how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR; it's making people question just what Apple has been up to before that their slick PR glossed over. Some kind of line has been crossed here that I've never felt or seen in my 25 years of using and following Apple.
I feel exactly the same way. I’ve been apple only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I’m not in love anymore…
Windows is in the background, nice
The apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans" I don't doubt anymore. I buy the stock and get "rich" with the winning team.
Public backlash needs to be HUGE to stop this. Apple hard-core fans aren't revolting against Apple. I've put my money on that.
We're not because we don't run our lives with pitchforks and torches.
@@ThinkyParts how old are you now?
They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well that’s not happening.
And they were about to lose sales. I thought the whole phone was the cloud; I'm not understanding.
Remember the iCloud hack in 2014? All those celebrities' pictures leaked. Yeah, Apple has some nice security there. Thank God I don't have an Apple account.
Apple always releases negative news on a Friday afternoon
@@fynkozari9271 Dude that was 2014 lol Apple has only gotten better with security since then.
@Apple Genius Joke's on you, I don't use Samsung or Apple. Sad for you, Apple fanboy: pay 3x the price of a smartphone just to get low battery, low storage/RAM, an ugly notch, slow charging and a 60Hz display in 2021. LOL, so sad. Just make sure you don't break the glass; the repair price is more expensive than a whole Android phone.
Tim checks the laptops of his engineers........
Apple engineer: I swear it's just for the image classification algorithm.
Trying to confuse the rocket detection algorithm with similar images 😏
Bruh 🤣
It doesn't matter what the steps are between A (uploading a photo) and Z (them reviewing it and alerting authorities). They "review your private photos" despite the letters in between. Don't get lost in the steps.
Apple cannot call itself the privacy company anymore.
How about Google Drive and Dropbox? They also scan for CP.
@@starbutterflygaming8881 true, but they never really were known for their privacy stance, unlike Apple.
@@romakrelian That's an Apple stan right there
...do you people not know how to interpret the English language? Why is there still confusion?
@@drinkwoter or because nobody is ever safe when buying a phone
While I applaud the CSAM implementation, the issue becomes how far reaching will this become? It's a slippery slope.
This question shouldn't be asked of Apple directly but of the government and the entities responsible for controlling data security. All companies have similar or identical technology, and unlike Apple they've been using it for decades now.
Same thought. I think this is what happens when legislation cannot keep up with how fast tech develops.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
This was built for China to spy on dissidents
@@tiagomaqz other companies scan things on their cloud.
Apple is scanning on your device AND the cloud
7:05 should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, and thus our backdoor.
One could try to weaken the hashes, too, so they cover more pictures, prompting the manual verification on all kinds of pictures.
In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity".
I'm so lost with why people are upset 🤷🏻♂️ So what if a manual verification prompt occurs when we have too many Winnie the Pooh pictures? Are you saying Disney could then advertise to us more or something? Like, Apple isn't gonna report you to the police for having Winnie the Pooh on your phone
@@UnkleRiceYo they will if you're in China. That's the point, slick. In China it's effectively forbidden to have photos referencing their leader as Winnie the Pooh, and people get arrested for it. Apple's software could easily be rolled out to match those pictures and report people in China.
@@JamesTurfKing So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.
@@Dlawderek I believe you don't understand the issue. A country like China could say "hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g. Hong Kong freedom-activism photos)." If Apple then goes "nah, we promised our customers not to do that," China could go "do it or you're no longer allowed to sell your products in China" (a huge market that brings in a lot of revenue). It really isn't hard to understand. The problem is not what Apple is doing but the possibility of misuse.
@@UnkleRiceYo did you bother trying to understand why this is an actual problem
Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.
Exactly. There was never any misunderstanding
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
And you can just....you know not upload anything to the cloud....
Exactly, those who provide the hashes can change it to look for anything.
I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.
"A thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor"
But isn’t it better than having the door wide open as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.
@@Dlawderek no
@@Dlawderek What about not building the door at all. Law enforcement is NOT the duty of private companies, and there are very good reasons for that.
@@Dlawderek absolutely not, however good their intentions are, i will never agree to having my private data monitored.
Something many forget is that one's privacy is protected by law. Even if police were to illegally obtain legitimate evidence against you (be it through illegal wiretapping or otherwise), that evidence will be rejected as unlawfully obtained. What Apple is doing here is basically rephrasing "we will hack into your storage and check if you have anything illegal" as "we will scan all your photos, and if you don't agree then we will stop providing service to you even if you paid for it." Barbarism.
@@Dlawderek NO. If you upload to a cloud service its not your hardware or a private space. Apple is now saying my hardware is actually theirs too to do as they please...
This is painful even for him to sell this…my god. This is a problem. iCloud photos are now turned off for me.
don't jinx this for me dude, I just switched to iCloud
7:45 what are the multiple levels of auditability? Will you seriously say "no" to China?
They've already said "yes" to China when they gave up their security keys to decrypt Chinese iCloud data. They're just going to fold again.
And will likely more yes to other governments
Even more than that, this whole thing probably started with China: after Huawei got banned, the CCP lost its surveillance tools and turned to Apple for an answer. What else could force Apple to suddenly launch a program so contrary to its stance?
They don't provide encryption for phones sold in China and Saudi Arabia
@@mukamuka0 lol remember Apple is an American company. If anyone is asking them to do anything it’s the CIA
If this is allowed, what's stopping them from reporting your drug pics to the police? Wake up people
If drugs are illegal where you live, then why not? People doing illegal activities should be reported.
DRUGS ARE BAD MQWAYYYY
@@gobi817 You missed the point entirely. Also, simply having a picture of drugs is not illegal.
@@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.
Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.
"customers own their phones for sure"
They can't even repair them without going to Apple!
You own it, until you want to repair it ;)
I just did it today tho
if you repair, you’ll get a warning message in settings 🥲
You don't own an iPhone, you just use it.
I've been repairing Apple products for 2 years, and aside from battery replacements I wouldn't recommend that people who aren't techie/qualified/confident do other things like replacing screens, Lightning ports, Face ID sensors, etc.
“no no no you dummies don’t understand how this works.”
We do, which is why we don’t want it.
“I think the customer owns the phone”
Right to repair: no
now you can
7:18 This is FACTUALLY WRONG. First, only the database of HASHES of CSAM is stored on the device. By the nature of hashing, it's IMPOSSIBLE to get the source image that produced a hash, meaning you CANNOT CHECK for yourself whether any given image will be classified as CSAM without trusting the US authorities' database to include only relevant hashes (and not hashes of political or religious content). In fact, I believe even Apple has to trust the authorities to provide relevant hashes only.
Second, at 7:30 Mr. Federighi didn't mention that the system will launch in the US only, which is how he could say that the database will be THE SAME. No, there exists no universe in which the Russian or Chinese government will allow iPhones to be shipped with magic hashes that were SECRETLY produced by the US authorities. I bet the first thing they will do, perhaps justly, is DEMAND either ACCESS to the US database or (more likely) use of their own databases for their citizens. And I can guarantee those databases will match against WHATEVER doesn't suit the government.
Finally, here's why all of this is an attack on our rights: there exists a way to bypass this mechanism by simply turning off iCloud Photos. In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason.
"In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason." Lolololol! Nicely put.
Still doesn’t hit on the real concerning issue
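The one-way property described in the comment above can be sketched with an ordinary cryptographic hash standing in for Apple's perceptual NeuralHash (an assumption for illustration only; NeuralHash additionally maps visually similar images to the same value). Given only a database of digests, a device can test for matches but cannot reconstruct, or even characterize, the images behind them:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 of the raw bytes.
    # Like NeuralHash, it is one-way: the digest reveals nothing
    # about the image that produced it.
    return hashlib.sha256(image_bytes).hexdigest()

# Authorities ship only digests, never source images, so the
# device (and Apple) must trust what the list actually targets.
reference_hashes = {image_hash(b"known-flagged-image-bytes")}

def matches_database(image_bytes: bytes) -> bool:
    return image_hash(image_bytes) in reference_hashes

print(matches_database(b"known-flagged-image-bytes"))  # True
print(matches_database(b"ordinary-vacation-photo"))    # False
```

This is exactly the commenter's point: to the client, a hash of political or religious content is indistinguishable from a hash of CSAM.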
Wait until China ask them to quietly scan other photo...
They should have done this interview from the start, and the worry about future changes still stands.
Right? This makes the suspicion grow even more.
@@Eugenepanels yeah it’s really strange how they try to underplay this change in a way. They should have done a comprehensive press release from the start, considering how important this change is.
Dude the whole thing was leaked before they could properly present this. That’s why it’s causing problems, because it wasn’t officially presented by Apple.
@@TomorowGames as far as I know it wasn't leaked but released by Apple themselves via their newsroom, but I'll check in case I'm wrong…
@@TomorowGames Yeah, exactly. It was leaked way before the proper launch, and as a result there were tons of false information and fearmongering.
Tim literally threw the guy to the wolves. Hilarious
It's almost as if it's Craig's job to talk about software, 5 days a week. He even makes a few mil a year for doing it.
@@_sparrowhawk Talking about software is his job, but this matter was super important and a word from Tim would have been welcome
Nice copy and paste
Craig practicing his “Good Morning” for Tim Cook’s Replacement 👀😂
Reference | 1:20
I would be happy if Craig took over for Tim.
@@triple7marc Same, he's so perfect for the role. Full of life and so enthusiastic.
5:36 "Human moderators". So basically private iCloud content can be viewed by Apple tech support moderators
It will probably be a highly specialized team that can do that, not just anyone at Apple, let alone tech support.
@@harsimranbansal5355 still a violation of privacy
@@blackhatson13 they are big tech companies… and they follow many rules and regulations; it isn't as simple as any employee over there being able to come and view our private iCloud photos…
Didn't Facebook also do this with its team of "human moderators"? I'm not saying it won't invade my privacy, but without those human moderators we could be seeing terrorism, porn and other nasty things in our messages/chats
This has always been the case
I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.
Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.
I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.
@@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?
@@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashcodes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare) it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.
Anyone who says it will never be misused or increase in scope is lying to themselves.
From Apple: "Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands." So once again… you're missing the point. Yes, a government could force this… but "trust us." Why should we trust them? Who's on the next leadership team? Apple needs to stop this now. I'm honestly considering breaking up with them for the first time in 30 years.
Do it, I know I am
Didn’t Apple refuse to unlock an iPhone to US federal government once for a crime case?
@@hsing-kaichen5062 they also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to the iCloud data center in China, pull the physical data, and they have Chinese iPhone data. They already caved to China once. When China asks to add its own CSAM database, will Apple refuse then? And what's in that database? We won't know.
And go where? Analog? Pick your evil…
if you don’t trust them don’t use icloud photos then and switch to a different cloud photo service 🤷🏼♂️
"Who owns this phone?"
"Well, customers do, but good luck running any software other than ours on it."
Answers to moot questions keep average consumers misinformed.
You’ve obviously never jailbroken an iPhone
Yeah, and that's totally intentional.
Sounds convincing. But this is still a backdoor to expand for the govt.
Literally no, Google has been doing this for the past 10 years
hashes can be made from photos but a hash cannot be converted back into the photo. Apple does not see your photos to generate the hash.
@@akhileshjayaranjan5628 exactly
@@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?
we will still continue the ‘Misunderstood’ after this video explanation
This is the exact moment "misunderstand" becomes "defame."
@@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.
He gave a vague answer. In the future Apple is planning to scan our entire phones. People like you still don't understand and still think Apple is a god and whatever they do is perfect. I feel bad for you, brother.
@@KaizenAction296 ok, conspiritard
"journalism" = regurgitating what big companies tell you
Where are Global Human Rights activists ?
Taliban is more violent than ISIS
Nothing to see here , move along pleb
she is just like some apple activist, protecting apple at all costs
The second part reminds me of that Black Mirror episode. We are getting there.
Which one?
As a parent and security expert, I get that feature. The first one is the one I'm more curious about...
Which episode?
@@LuthandoMaqondo Arkangel
@@ShubhamKumar-xu2od mm, nice one. It hadn't occurred to me, but agreed. This is the worry with technology: little by little, we're getting to that point
I think that Apple has “misunderstood” that I value my privacy more than the convenience their products and services can offer me.
They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.
@@cmtheone I’ve worked with law enforcement to put predators in jail before. It’s funny that you’re too dim to see how having your privacy tampered with in the name of the greater good isn’t concerning. Then again, you’re the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance friend.
@@jackoryan292 Switch to Samsung brother 👍 I'd recommend the S21 great phone 👍 I love my iPad but come on man switch to Samsung brother 👍
@J0p4 google has been doing this for ages. As well as Microsoft. So if you’re going to use them for image cloud storage it’s even worse.
@lol
what makes you think child predators will store their photos on their phones?
Same idiocy as using gun registration to stop violence criminals using guns to rob a bank.
Apple is the only company telling people this is happening. Thank you for the transparency
You’ve got a great point man
Wasn’t it leaked? Then they had to come out and explain...
It sounds like “you are holding it wrong”
I was looking for this comment.
Me too
Someone is old enough to remember ;)
“We’re not scanning your photos, you see, we’re scanning your photos.”
actually it's "we aren't scanning your photos on your phone, we are scanning your entire iCloud photo library." It's even worse
Actually they aren't scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. Hashes cannot be decoded, and the only way to identify one is to have a matching hash. Thus, the only data that is "revealed" in this process is CP data, which should be banned. However, this is not to say that I agree with what they are doing, or that I don't recognize what this may become as it relates to privacy, but the fundamental feature actually doesn't breach privacy unless the user uploads CP.
@@ohmyghost88 that is literally scanning
@@ohmyghost88 nice try Craig we know that’s you
Hashing isn't scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash does not know the contents of the file; it just calculates a number (hash) that is used during transport to check whether errors occurred and the file needs to be sent again (the checksum is calculated at both the source and destination). Did you guys take a computer networking class or not?
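One technical footnote for this thread: the checksums described above are cryptographic hashes, which change completely when the input changes by a single byte. That is precisely why a system like Apple's can't use them directly and instead uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding. A quick sketch of the avalanche effect, using SHA-256 as an example:

```python
import hashlib

h1 = hashlib.sha256(b"the same photo").hexdigest()
h2 = hashlib.sha256(b"the same photo.").hexdigest()  # one byte appended

# Count how many of the 64 hex digits differ: nearly all of them,
# even though the inputs differ by a single character.
differing = sum(a != b for a, b in zip(h1, h2))
print(f"{differing} of {len(h1)} hex digits differ")
```

So matching exact checksums would be trivially defeated by re-saving an image; a perceptual hash trades that brittleness for the fuzzier matching debated in this thread.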
"I THINK our customers own their phones"
What a great vote of confidence......
for sure.
I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront.
Then Apple legal, corp comm and marketing can train the two of them on exactly what to say: enough to answer SOME questions, but not enough to commit to anything that could end up being used against them in a courtroom or in Congress.
Lol exactly
If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.
THINK DIFFERENT
Child pornography is really bad, scan them, and alert the authorities. Stop child abuse. Authorities please take action especially when that person is very politically connected like your policy makers.
In a practical sense, Apple at least has good intentions in doing this. It's unarguably a good thing that they plan on tracking down phones that happen to have child abuse material on them.
I do see why people are mad tho. Apple has always had a long history of keeping information secure for its customers and this seems like a slap in the face to those who use iPhone because of its security.
Because Apple has always been so vocal about privacy and not letting other apps track your data. And also because Apple has started showing ads on its platforms and hiring people to build the kind of targeted ad network it opposed for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on Google Drive, but it's not a big deal because they never said they wouldn't, or claimed that tracking personal data is a bad thing the way Apple has.
What a timing to drop this exactly after the *Pegasus* deal 'still unaddressed'
Yeah they are very similar
What is that? Can you explain?
What? Wasn't it already patched?
@@tophan5146 watch rene ritchie’s video about it
@@kevinhernandezarango5005 never gonna be patched
They actually cut the Apple campus out from any word Craig said which could be used to make memes! That means there has to be an agreement for this interview. I wonder if that includes other limitations as well since the interviewer didn’t pressure Apple that much. This feels more like Apple marketing than journalism
I thought similar. I noticed how this was cut too....like why did they have alternate camera angles for a meeting that took place on FaceTime?
I mean, what did you expect from Apple? They are one of the strictest companies and absolutely love controlling the narrative.
Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.
Literally everything you see in news these days is just propaganda. Journalism is dead
Time to make some memes with a huge Apple watermark out of pure spite
I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.
Meanwhile Google has already been inhabiting that world for *years* now.
@@exiles_dot_tv indeed, the rest of big tech are dragging us all to that dark place.
I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.
Have you been asleep the last few decades or are you just a Microsoft/Google fan boy?
@@PedroLopezBeanEater I haven't and I'm not anyone's fan boy.
"I think our customers own their phones, huh, for sure."
Too bad his thinking doesn't reflect what is really happening...
#righttorepair
p.s. this is not an interview, merely a communication from Apple...
Pretty sure she asked questions, that makes it an interview.
And he answered those questions. No fuss about it
"how do you know this is a nude image or a rocketship?" LOL top-tier questions!
Having a picture of a Blue Origin rocket
iPhone user: *nervous sweating*
Apple is like a man who walks in on a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don't believe it and don't buy it. The point is not "a safe way to scan phones." The point is "DON'T scan my phone." Don't means don't
But Apple doesn't scan your phone. I think you mean photos on the iCloud server.
Connection to your phone
I like your woman in the shower analogy, but you should have elaborated more on that story. Left me wonder what happens next. When can the man open his eyes?
Yeah a MAN, ofc it has to be a MAN
Though I was concerned about the privacy aspect at first, I think Craig explained it pretty well here, and I kind of get his viewpoint from a software development/iCloud as a service standpoint.
1. *Without* cloud services, the device itself is secure and encrypted.
2. When you're using iCloud, when images are uploaded, they sort of perform a comparison to the reference CSAM image database on-device (which I guess is a trained neural network to flag that part), and upload it with the actual photo. Actual photo never gets opened, only the CSAM neural net jumbled encoding gets processed in the second half of the neural network in the cloud.
Now the 2nd part may sound like potential invasion of *on-device privacy*. Maybe.
From my experience even "secure" cloud services like MEGA or such do routinely flag accounts for takedown when hashes of the files match copyrighted content. I think the major difference is in the way the actual cryptography is performed.
As far as I understand. With most cloud providers, they scan all files in the cloud, like, the whole hashing/neural network encoding stuff is all done on your files in the cloud. But in Apple's case, as far as Craig says it's more like.. Hey we don't want our servers to know this much about the original image, we'd rather have part of the encoding performed on device, then sent over to the cloud. This benefits both Apple and people in that the cloud doesn't need to perform as much workload (maybe), and the cloud doesn't need access to the actual image in order to perform encodings on it.
Put simply, I think the way they see it is more like...
1. "That on-device encoding is better, because now we still can't access user data *while also complying with the law on cloud services, without being able to give the authorities the actual data they have*".
2. Processing images on-device is considered *part of the uploading process*, rather than "scanning of all photos". With recent iPhones sporting ever more advanced neural network accelerators, I think it's a logical step from a development standpoint to simply perform an encoding of the images on-device (both very efficient for the server and sparing the server from going through the actual image data).
So the on-device security is not compromised, it's just that part of iCloud support is built in to iOS, and this part is not used at all when you don't use the cloud services (i.e. Craig's "and you don't have to" around 2:34). It's the iCloud uploading process that does the scanning. And so, iCloud is still secure "data-wise". Nobody can read your data, *but* your cloud account can still get flagged for suspicious activity.
Pretty neat engineering, but to consider if it's an invasion of privacy is up to debate. But on-device still sounds pretty secure to me.
Well said!!!
Completely missed the point then, it’s not about CSAM, it’s about the putting scanning code on to devices, meaning in the future, another update comes along to scan for “potential” terrorists, and so on. It’s all about the concept of putting scanning code on to devices, in the first place, that’s the issue here.
@@DrumToTheBassWoop No really, I understand people's concerns over the code being there on the device.
But from an engineering perspective, iCloud is just another piece of preinstalled software, which is optional to use. Your perspective is that iCloud Photos is a core part of iOS and that scanning happens on every phone. To me, it's just another service that is optional and happens to be preinstalled, and it won't ever be used except during the exact action of uploading photos to iCloud.
Kind of like: what's on your device is yours, but if you want to put stuff in this cloud service, then the service will scan it before uploading.
Like I said, depends on the perspective. I see it as just another Cloud service that happens to be preinstalled, you see it as major intrusion in the OS itself.
The only way I would consider this to be invasion of On-Device privacy is if the uploads/analyses are performed without disclosure. This announcement is mostly just them saying their cloud services *will* comply with the law.
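The two-half flow this commenter describes can be sketched roughly as follows. Everything here is a simplification with hypothetical names: the real system uses NeuralHash, "safety vouchers," private set intersection and threshold secret sharing rather than plain set lookups, but the shape (hash on device only during upload, count matches server-side, human review only past a threshold) is the same:

```python
import hashlib
from typing import Iterable

# Hypothetical stand-ins for the real machinery.
REFERENCE_HASHES = {hashlib.sha256(b"known-flagged-bytes").hexdigest()}
REVIEW_THRESHOLD = 30  # Apple's stated match count before human review

def on_device_voucher(photo: bytes) -> str:
    """The 'first half': computed on-device, only as part of an iCloud upload."""
    return hashlib.sha256(photo).hexdigest()

def server_decision(vouchers: Iterable[str]) -> str:
    """The 'second half': the server counts matches without opening photos."""
    matches = sum(v in REFERENCE_HASHES for v in vouchers)
    if matches >= REVIEW_THRESHOLD:
        return "flag account for human review"
    return "no action"

def upload_to_icloud(photos: list, icloud_photos_on: bool) -> str:
    if not icloud_photos_on:
        # Craig's "and you don't have to": no upload means no hashing at all.
        return "iCloud Photos off: nothing hashed, nothing uploaded"
    return server_decision(on_device_voucher(p) for p in photos)

print(upload_to_icloud([b"dog", b"vacation"], icloud_photos_on=True))
```

Whether splitting the work this way counts as "on-device scanning" or merely "part of the upload pipeline" is exactly the disagreement running through these comments.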
I DON‘T WANT SPY SOFTWARE ON MY PHONE. PERIOD!
All companies do it already
@@nayutakani2055 still better than how Google does it
You can just opt out by deactivating iCloud Photos, and if that still isn't enough for you, switch to Android or BlackBerry, build an OS yourself or pay someone you trust to do it for you, or just use pigeons, or don't ever install iOS 15!
Apple already screwed up their image when they kicked Parler off iOS. You are a tech company and not politicians, period!
I really wonder what the testing phase for the algorithm looked like.
Oof
Another part of this issue is the idea of who owns the content. Regardless of where it is stored. If the police need a warrant to search a safety deposit box at a bank, shouldn't Apple need a warrant before searching photos? The idea of fiduciary duty and trust. If someone is purposely posting items to a public location by all means search away. But when photos are privately being stored in the cloud it feels very invasive.
I’d love to see what the new TOS are for iCloud once Apple implements this.
You're not privately storing them, though. You're storing them on Apple's iCloud servers, where Apple becomes responsible for any content you have on there.
@@crusherman2001 And that is the issue. When I store paper files in a safety deposit box (1) the bank can't nosy through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy this could be manipulated to be very big brother...
@@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (smell of weed coming out, for example), they can report it to the police who will then need reasonable suspicion or a warrant to search your stuff. This proactively searching and reporting people to the authorities is not only a terrible invasion of privacy, but it's one that could create legal issues for innocent users.
With Dropbox, Google, MS, this is happening now. Apple wants to safeguard iCloud. That's why they came up with this (complex) solution to not having to watch all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read it.
This solution is a counter-offer to the US and EU's intrusive laws in the works for "child protection" (as a scapegoat for sniffing through all our cloud data and communications).
This is the first time I've seen apple being so flustered in an interview.
Smells fishy...
I am a developer, I agree with Craig on this tech or specific software implementation, but I am worried about what privacy really should be.
@DataHearth Apple should really implement the best ZeroTrust. The feature is ok, but it's a matter of who they trust. As a developer, I would never tap on this field of sensitivity on users' data.
There's no confusion: you are using child pornography as an excuse to scan people's phones and literally spy on them. Who is Apple to decide they will police people? Drop Apple phones fast; this is not ok. They will use any wording to confuse you or convince you this is ok. Watch Edward Snowden's video on this BS.
This is so Apple. The condescending attitude and they can’t even have an ad hoc interview, it’s a two camera shoot with powerpoint pseudo-interview.
Privacy was one of the only reasons to use apple but not anymore
Did you watch the interview?
That's why I like Craig: very clear, very respectful. I'm still using external storage though
And that's why companies hire likeable people to do their pr campaigns. And it worked. Hence why I own the stock. Apple could kill your kids, and you'd still buy the phones.
And I thought apple cared about my privacy
Lol
And Google did no evil. Times change
If you’re a child then your parents can and should know when you’re about to do something unsafe
Just stop downloading illegal child images and you won't have anything to worry about.
They still do; it isn't like Tim Apple is eating popcorn and having fun looking through your private iCloud photos…
it sounds like a blind raid without probable cause or a warrant: they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.
I don’t want my phone to use AI to scan my photos
Same
As a tech worker, I don't think we misunderstood you, Apple. You are still using MY iPhone's computational power to do something you want to do, without my consent (generating the hashes of photos, described by Craig as the "first half" of the process). I don't want to be a part of YOUR company's sense of social responsibility. I have a say in whether you're going to use MY phone for anything. I am extremely disappointed with Apple (but thanks to WSJ for this interview). And as a 10-year loyal iPhone and Mac customer, I will reconsider whether I should use Apple products if this 'feature' goes live.
So, at 7:05, the senior VP of software at Apple really doesn't understand what a backdoor is?
Thank you Joanna for doing this interview
She did a terrible job. Joanna needs to stop sucking up to Apple. It's even worse that they are going through our iCloud.
@@jeycalc6877 So we should boycott Apple and move to a Chinese clone?
@@bhagathyennemajalu A brain is a terrible thing to waste; I suggest you use yours.
Post-wall "professional" woman.
The Wall Street Journal probably gets donations for this.
Craig Federighi looked highly uncomfortable during the interview and his explanations were a concerning mess. Thumbs up for the questions made by the reporter, they usually go much softer on them.
10:50 "I think our customers own their phone for sure"
*Fights against Right to Repair* 😂
"a degree of analysis done on your device" So, YES, iPhone will be scanned.
No, the HASHES (alphanumerical strings) of your photos will be scanned.
@@infinitepower6780 If they can come into my phone to hash pics, why couldn't the government compel them to hash other files?
@@keefyboy touché
I guess it's just trust at this point
Only when preparing the files for upload to iCloud.
It takes 20 years to build a reputation and yet only a few days to crumble the very foundation. Thanks Apple. SMH.
You didn't realize that long ago?
Craig is being trotted out like Colin Powell was about the Iraq war
Yes!
"I don't understand the backdoor characterization" what a weasel
9:46 How much of a phone's lifespan is lost this way? How hot will their phones get?
I don't get why the hashing has to be done on my phone. That can easily be done once the image is in iCloud (if you choose to use that service). Kind of scary that my phone is generating hashes to identify the files I store on my device. You don't need to think too hard to realize this can be weaponized by the government.
Hashes are used all the time, everywhere, at almost every step of almost any software, so don't worry about hashes.
@@alexkobzin557 Yes, but not to identify my files by comparing the hashes against a government database. As I said, that can be weaponized if you swap the CSAM database for any database of content the government doesn't like. And no, I don't approve of CSAM. My concern is more about privacy, something Apple has been using over and over to sell their products.
@@aeelinnannelie5651 Yeah. If they screw up privacy they will lose to Android completely, and I think they understand that.
Actually, hashing is a security feature meant to protect the user. It means that your data isn't in plain sight. So your photo from Utah might look like "hwhizb&37$;8€![€|*..." when stored on a server instead of the actual image, which is what you want in case their database is ever breached. On-device hashing happens too, for the same reasons.
@@CalienteFrijoles They have the hashes in the database; therefore, they have both the hash itself and the file that generates that hash. Yes, Apple cannot see the content directly, but if the hash matches, the government can go to their database and check the file. Now apply this hashing to a database of content that the government doesn't like and you have an easy way to identify people who are 'against' the system. This has the potential to become something similar to the scoring system that China already has.
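To make the hash-comparison idea this thread is arguing about concrete, here is a minimal sketch. It is not Apple's implementation (the real system uses a perceptual "NeuralHash" plus private set intersection, so near-duplicate images still match); the `blocklist` contents and function names here are hypothetical, and SHA-256 is used only to illustrate comparing fingerprints instead of raw images:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a fixed-length hex digest; the digest reveals nothing
    about the original bytes on its own."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of known-bad fingerprints, a stand-in for the
# NCMEC-derived hash database discussed in the thread above.
blocklist = {fingerprint(b"known-bad-image-bytes")}

def matches_blocklist(photo_bytes: bytes) -> bool:
    # Only digests are compared; the photo content is never inspected here.
    return fingerprint(photo_bytes) in blocklist

print(matches_blocklist(b"known-bad-image-bytes"))  # True
print(matches_blocklist(b"my-vacation-photo"))      # False
```

Note the difference this sketch glosses over: a cryptographic hash like SHA-256 changes completely if one byte of the image changes, which is why CSAM systems use perceptual hashes instead. The thread's concern applies either way, since whoever supplies the hash set decides what gets flagged.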
They even made a cut when Federighi said “pornography” so the Apple campus won’t be in the background if “memed”. Sneaky little weasels.. I can see your tricks!
Same thing on 7:08
And on 9:53
@@jimbo-dev the video is only 11:45 long
@@high63294 Oops, correct, I fixed it. YouTube is infested with ads and the mobile YouTube client doesn't allow jumping to timestamps more than once 😖
@@jimbo-dev Just buy a premium account, they're cheap.
"We've been unwilling to deploy a solution that would involve scanning all customer data"
That's exactly what this is, Craig.
While I am against this, it is simply not true. They don't even scan images that you took yourself (apparently), only images that came from another source and are then uploaded to the cloud.
For everyone upset, you know this only matters if you choose to use iCloud. You don’t have to use iCloud. Boom. Problem solved.
I appreciated that she did this fiercely straightforward interview (mostly a kind of interrogation) for the good of all Apple device users. Thank You ✌🏼
He is shaking 🤣
The real problem is that this technology could be used to identify protesters and activists in the name of "CSAM". Any of your photos, and that photo's metadata, might be read by Apple's employees, because Apple can't guarantee the accuracy of the neural network. My real concern is that law enforcement could take advantage of that, forcing Apple to access someone's gallery to identify social activists, with Apple explaining it away as a technical defect.
The database they refer to for these images is strictly from NCMEC. NCMEC does not have a database of protesters and activists. Governments do not have access to the CSAM-verified images, I repeat CSAM images only, until the manual verification process, which will not happen until after repeated algorithmic matches.
Apple has blocked law enforcement before from getting private information from the phone of a literal terrorist. If any precedent was set, it's that Apple values your privacy, and these what-if scenarios will not happen until they do. And even if one does happen, Apple will be accountable for it, as Craig mentioned.
@@frappes_ that's not quite the objection at hand. It's NCMEC today. Same approach could be applied to any other database. With regard to the precedent, access was denied to the United States. Courts are able to protect Apple corporation from arbitrary requests. End-to-end encryption was also an excuse used. It's a different ballpark when it comes to authoritarian regimes.
@@justshad937 I get that, and I share that concern, but based on Apple sources this will only launch in the US for now. I know it can become a slippery slope, but I have faith in the precedent set by Apple in denying governments backdoors to their technology. And even if it does get that bad, the choice for consumers will be clear: do not buy Apple stuff anymore.
@@frappes_ That government could simply make it illegal for Apple to disclose this information, "in the name of national security," the same way US telcos didn't disclose NSA surveillance.
Here is the issue: the US government cannot just go around going through each device and reporting what is on there. So neither should Apple, or any other private company for that matter. This is a much bigger issue than it is being presented as.
"This is not what is happening" but is actually exactly what is happening.
So he is saying data you upload to Apple is theirs to scan if they want.
Even if this feature remains harmless, the implications of future abuse of said feature are bad. Apple should stop this before it gets worse for their reputation.
Can banks scan or look at what's stored in your safe without a warrant? If not, why can tech companies scan our private files without a warrant?
You can just not upload it
Because you agree to it when you upload to cloud
How are they doing multi-camera shots from a web call?
😂😂😂 This is edited, my friend.
Yeah, but this is meant to give the impression that it's a 1:1 conversation. This feels staged.
Yes, we are scanning your photos without scanning your photos.
WSJ: We would like to ask you a few softball questions about the new update.
Apple: OK, sounds good, and we'll send 1000 new iPhone 13s so you can "test them out".