Apple’s Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive) | WSJ

  • Published: Dec 18, 2024

Comments • 2.6K

  • @MidNiteR32
    @MidNiteR32 3 years ago +1447

    Real title of video: Tim Cook throws Craig to the wolves.

    • @Chrrvn
      @Chrrvn 3 years ago +10

      🤣🤣🤣🤣

    • @cardboardpackage
      @cardboardpackage 3 years ago +6

      bro what 😂

    • @MidNiteR32
      @MidNiteR32 3 years ago +63

      @@cardboardpackage Tim Cook has been hiding while he throws Craig under the bus. I think the CEO of the company should be the one explaining this to customers and the media, not his VP of Software Engineering. Cook just threw him under the bus and started driving it.

    • @vaneakatok
      @vaneakatok 3 years ago +37

      @@MidNiteR32 Nah, I think it was the right move. First of all, Craig is more agreeable, and second, the risk is lower. And to be honest, Craig managed it formidably imo

    • @vaneakatok
      @vaneakatok 3 years ago +4

      @@MidNiteR32 Nonetheless, your comment is quite on point ;)

  • @hyypersonic
    @hyypersonic 3 years ago +1172

    3:21 "pornography of any other sort" I'm glad Craig essentially said that Apple knows and understands that people simply just have nudes on their phones

    • @nicolelea615
      @nicolelea615 3 years ago +83

      Yeah, some people are simply degenerate pigs, but not actually pedophiles.

    • @billjamal4764
      @billjamal4764 3 years ago +177

      @@nicolelea615 Yes, and some people are photographers, and part of that is nude photography, not porn.

    • @abubakrakram6208
      @abubakrakram6208 3 years ago +105

      @@nicolelea615 They might be photos of a spouse or partner. Or photos people took to track weight loss/gain progress.

    • @utubekullanicisi
      @utubekullanicisi 3 years ago +4

      @@billjamal4764 Like you, your dad, your uncle, etc.

    • @yousifwessam180
      @yousifwessam180 3 years ago +11

      @@nicolelea615 My friend, there are people out there that are just as bad as mentioned, but there are also people who have private photos of their partners/spouses. Don't put everyone and everything under one group, it's not fair. Hope you understand (:

  • @ferreiraleo
    @ferreiraleo 3 years ago +1260

    Loved how he mentions Telegram as the message app.
    No free mentions for you, Zuck.

    • @trowawayacc
      @trowawayacc 3 years ago +24

      Damn Facebook. Still don't get how their acquisition of Instagram and WhatsApp went through, oh wait...

    • @albinjt
      @albinjt 3 years ago +13

      Yet Telegram's founder hates Apple

    • @technotrack5959
      @technotrack5959 3 years ago +8

      @@albinjt So what's the benefit of loving Apple?

    • @bigrich9654
      @bigrich9654 3 years ago +1

      @@albinjt Probably because Apple restricts certain channels on Telegram. It's insane that I have to go to the browser version of Telegram to view those restricted channels.

    • @albinjt
      @albinjt 3 years ago +1

      @@bigrich9654 What sort of channels are they restricting though? And what restricted channels on iOS have you been accessing? Could ya send us the links?

  • @beku420
    @beku420 3 years ago +817

    Apple: We're not scanning your images, we're just scanning your images

    • @gmancolo
      @gmancolo 3 years ago +74

      We're not scanning your images, we're just scanning OUR images.

    • @spaceforce0
      @spaceforce0 3 years ago +22

      If they are stored on their servers in this day and age, I feel as if it's your fault for trusting Big Tech. Either way, we'll all forget about this in a couple of weeks. We basically already have

    • @FFLFFS
      @FFLFFS 3 years ago +6

      If they can install a program that tells me my battery is at 10% after 10 minutes of use, when a quick hard restart brings it back to 100%, there is no telling what they can install on your phone.
      If Ur f-ingdeau gives Apple a couple hundred million of our tax dollars because we proved that the vax was ineffective and self-immunity has an 80% success rate at beating the virus, there's no telling what those greedy blasters will do.

    • @oissukki
      @oissukki 3 years ago +5

      Correct me if I'm wrong, but in order to upload an image to the cloud, you need to scan the image first, right?

    • @xehP
      @xehP 3 years ago +1

      There is plenty of information on how they "scan" the photos; it's even explained in layman's terms in this video.

  • @bazil4146
    @bazil4146 3 years ago +500

    Now that this news has gotten out, actual petafiles aren't going to be storing their photos on iPhone anymore. So basically this feature is useless now.

    • @hundvd_7
      @hundvd_7 3 years ago +130

      The people stupid enough to store highly illegal material in cloud storage won't be stopped by this news.
      It was always one of the easiest ways to get caught

    • @diedforurwins
      @diedforurwins 3 years ago +11

      @@hundvd_7 he says, sounding a little too informed

    • @hundvd_7
      @hundvd_7 3 years ago +136

      @@diedforurwins Sure, go ahead and call other people pedophiles. That will make you look smart.

    • @Dr.HouseMD
      @Dr.HouseMD 3 years ago +13

      "Petafiles" bro?

    • @AugustaChile
      @AugustaChile 3 years ago +12

      @@Dr.HouseMD down with those Petafiles!

  • @rijs4303
    @rijs4303 3 years ago +3795

    Let's make sure all the members of the Vatican have an iPhone

    • @cobracommander.1958
      @cobracommander.1958 3 years ago +245

      The pope just bought a Huawei phone

    • @fofoqueiro5524
      @fofoqueiro5524 3 years ago +28

      Or they picked BlackBerry devices.

    • @MG-ll5nw
      @MG-ll5nw 3 years ago +18

      @Highground Trump and Biden be sweating

    • @sebastiangruenfeld141
      @sebastiangruenfeld141 3 years ago +12

      Use iCloud*

    • @RajJhaveri
      @RajJhaveri 3 years ago +10

      @Highground We don't have to be selective. But if you want to be, the Republican party is where we should start

  • @quentinlemaitre2998
    @quentinlemaitre2998 3 years ago +1066

    First of all, thank you for covering the issue. I wish you had pressed him on what type of audit he mentions, because to me anyone can force Apple to add a database via the FISA court. I want to know what is done to prevent that from happening, instead of taking Apple at its word.

    • @youtubeus3rname
      @youtubeus3rname 3 years ago +31

      THIS!! I was pleased to hear about "auditability" -- but what exactly does he mean? Anyone got a source / more info on that?

    • @joshua01
      @joshua01 3 years ago +58

      Before WSJ is allowed the privilege of interviewing Craig, they have to agree to terms and conditions

    • @Jamesytdjv
      @Jamesytdjv 3 years ago +19

      He seems to be very shady in his explanation of what the company IS going to do.

    • @zonka6598
      @zonka6598 3 years ago +10

      So basically avoid Apple's iCloud services

    • @joshua01
      @joshua01 3 years ago +6

      @@zonka6598 If that's what works for you and gives you a sense of privacy, then by all means; but just note that if you use Google, they're already doing it, and worse, so yeah…

  • @VeryBlackMirror
    @VeryBlackMirror 3 years ago +787

    "It takes 20 years to build a reputation and five minutes to ruin it. If you think about that, you'll do things differently."
    -Warren Buffett
    Apple is feeling this hard, hence the panicked response to the media.

    • @bhagathyennemajalu
      @bhagathyennemajalu 3 years ago +6

      Okay, then purchase a Chinese phone.

    • @sciencescripture
      @sciencescripture 3 years ago +15

      It's going to be misused like any other tool big tech and the government get their hands on, period.

    • @alexjasonchandler
      @alexjasonchandler 3 years ago +3

      @Carrot Cruncher I'm an Android guy through and through, but as a network engineer I can tell you programs are much more flawed than people realize; they're doing this with a margin of error

    • @The-Heart-Will-Testify
      @The-Heart-Will-Testify 3 years ago +12

      Apple fanboys will just bend over and accept everything

    • @gliderman9302
      @gliderman9302 3 years ago +6

      @@The-Heart-Will-Testify They are the ones who are mad at Apple. Think before u comment.

  • @doompod
    @doompod 3 years ago +239

    Tim: "Hey Craig…."
    Craig: "NO NO NO NO NO NO!"
    Craig: "Hey everyone…😅"

    • @Jushwa
      @Jushwa 3 years ago +1

      Don't get it

    • @EZYash5
      @EZYash5 3 years ago +3

      @@Jushwa It means Tim Cook told Craig to go do the interview

    • @JoinRibyl
      @JoinRibyl 2 years ago

      Tim Cook is the CEO; Craig is the software person, he knows what everything does, he made it. Tim Cook does not do software. And anyone that stores things in the cloud doesn't own the servers; all they own is the main device's storage. They don't scan on the device, they scan in the cloud only

  • @popgyorgybotond4741
    @popgyorgybotond4741 3 years ago +406

    "I think the customer owns the phone"
    It's a yes or no answer

    • @Lousy_Bastard
      @Lousy_Bastard 3 years ago +36

      That's a big fat no.

    • @mantasvilcinskas
      @mantasvilcinskas 3 years ago +10

      That sounds like a yes to me?

    • @joshgribbon8510
      @joshgribbon8510 3 years ago +9

      It's definitely a little more complicated. I can "own" a car, but there are a lot of restrictions on what I can do with it or to it, especially if you want to use it on a road. Ownership doesn't really imply full control most of the time; even with land you have tons of laws limiting what you can do with it

    • @Lousy_Bastard
      @Lousy_Bastard 3 years ago +3

      @@joshgribbon8510 Exactly. We as consumers don't really own anything anymore, and that's the world over; we don't have any rights, just privileges, until someone decides to take them away.

    • @bernardomejia9882
      @bernardomejia9882 3 years ago +1

      @@mantasvilcinskas Definitely more complicated than a yes.

  • @Sulfen
    @Sulfen 3 years ago +613

    I don't want my images to be scanned even if I don't engage in any illegal activities. It doesn't matter if it's AI or a human looking through my photos; it just makes me feel uncomfortable.

    • @johansm97
      @johansm97 3 years ago +115

      They already are. Don't you see how your Photos app can recognize faces etc.? I think people pressed about this have things to hide

    • @7billza
      @7billza 3 years ago +132

      @@johansm97 lol I don't understand how people think everything on their iPhones isn't already being touched by AI, especially photos. How do you think your photos look so good? Computational photography using AI. How do you think they group faces and show you memories? AI. This is just Apple using AI, in a much more careful way than other companies, to do something. That's all it is, and people are losing their minds

    • @bouzianenadhir8503
      @bouzianenadhir8503 3 years ago +35

      @@7billza They actually aren't; facial recognition on iPhone is done on device. Apple doesn't scan anything; it's the only company that believes in privacy

    • @coolbuddyshivam
      @coolbuddyshivam 3 years ago +22

      @@bouzianenadhir8503 Then they wouldn't have destroyed end-to-end encryption to Apple's servers, aka iCloud. If they can snoop around while a photo is uploading to the cloud, it's not end-to-end encrypted. It's not private. As simple as that.

    • @soniqstateofficial4490
      @soniqstateofficial4490 3 years ago +26

      Don't use iCloud then.

  • @jaredspencer3304
    @jaredspencer3304 3 years ago +209

    The reason to worry about this photo scanning is that there's no way it doesn't evolve. Currently, it only checks 1) photos being uploaded to iCloud 2) that match a database of known CSAM. Importantly, this doesn't do anything about new CSAM created in the abuse of children. Catching new material is the obvious next step. And there's no way to achieve that with the current hashing architecture. It has to be done by constantly monitoring all media on the phone, probably with some "AI moderator". And there's no way that some government doesn't demand that this monitoring be used to detect something other than CSAM, like political dissent (remember: China is Apple's biggest market). That's the worry. This new tech is only *kinda ok* as long as it doesn't evolve a single step beyond what it is now. And there's virtually no chance of that happening.

    • @RHStevens1986
      @RHStevens1986 3 years ago +13

      en.wikipedia.org/wiki/Slippery_slope

    • @hrtlsbstrd
      @hrtlsbstrd 3 years ago +16

      @@RHStevens1986 Sure, but also worth considering: en.wikipedia.org/wiki/Foot-in-the-door_technique

    • @dylanxu
      @dylanxu 3 years ago +14

      The typical American ignorance that radiates from this single comment is amazing.

    • @walterwhite1
      @walterwhite1 3 years ago +6

      iOS 16: they will start scanning your on-device photo library. Mark my words, guys 😎

    • @CalienteFrijoles
      @CalienteFrijoles 3 years ago +2

      Yeah, this tech should evolve, because this step alone doesn't solve the problem. Regardless, it was either going to get created to do the right thing or the wrong thing. That's just how it works. For now its use case is positive.
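The architecture debated in this thread (check only photos queued for cloud upload against a fixed database of known hashes) can be sketched roughly as follows. This is a toy illustration, not Apple's implementation: SHA-256 stands in for Apple's perceptual NeuralHash, which, unlike SHA-256, also matches visually similar re-encodings of an image, and the database contents here are hypothetical.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash. A real perceptual
    # hash maps visually similar images to the same value; SHA-256 (used
    # here only to keep the sketch self-contained) matches exact bytes only.
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of hashes of known flagged images, supplied by a
# third party; the device never classifies content, it only looks up hashes.
KNOWN_HASHES = {image_hash(b"known-flagged-image")}


def scan_upload_queue(photos: list[bytes]) -> list[int]:
    """Return indices of queued photos whose hash is in the database.

    Mirrors the described design: only photos being uploaded to the cloud
    are checked, and only database matches are flagged; nothing is learned
    about non-matching photos.
    """
    return [i for i, p in enumerate(photos) if image_hash(p) in KNOWN_HASHES]


queue = [b"vacation photo", b"known-flagged-image", b"cat photo"]
print(scan_upload_queue(queue))  # -> [1]
```

As the comment above argues, catching newly created material is impossible with this lookup scheme; it would require a classifier running over all media, which is the architectural escalation people worry about.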

  • @JackZeroZ
    @JackZeroZ 3 years ago +652

    The same tech can be used to identify political dissidents, protesters, and just about anybody. Imagine matching memes commonly shared by members of certain groups to identify people for political persecution.

    • @outofahat9363
      @outofahat9363 3 years ago +33

      Yes. Even if we take them at their word and accept that they can't see other photos, because they can only see the ones the neural network has very tightly matched, they still haven't said anything about the possibility of them searching for other stuff.

    • @stater3
      @stater3 3 years ago +24

      All they need to do is change the hashes and the AI to look for other photos.

    • @billjamal4764
      @billjamal4764 3 years ago +24

      I'm sure your ISP, phone provider, Google, Facebook (including Instagram), and every other social media or messaging platform do that. If you truly care about privacy, you have to get an open-source operating system and only use open-source apps. There's no way around it

    • @grifinx
      @grifinx 3 years ago +1

      THANK YOU, was looking for this. This is smoke and mirrors.

    • @fvs666
      @fvs666 3 years ago +7

      It's already on Gmail, Facebook, Instagram, Twitter and YouTube.

  • @MatuteG
    @MatuteG 3 years ago +318

    I'm glad that she pushed the "who owns your phone" question and the conclusion. I applaud WSJ for pushing the exec; this didn't feel like a scripted Apple BS interview.
    Now, how do we know the pictures provided by those associations won't be manipulated into searching for other stuff? At the end of the day, Apple has no idea what those hashes are. Who knows what the hash provided was.

    • @bobbyright2010
      @bobbyright2010 3 years ago +14

      @@Karantkr Multiple photo apps do this..

    • @MrSidneycarton
      @MrSidneycarton 3 years ago +7

      This felt not scripted? The forced laughs, the fake "searching for the right words", the multiple camera angles, and after all that this felt unscripted?

    • @nixednamode3607
      @nixednamode3607 3 years ago +5

      @@MrSidneycarton You expect a trillion-dollar company to shoot an interview with a single camera? 😒🙄 Multiple camera angles are an industry standard

    • @harish8231
      @harish8231 3 years ago +3

      Paid interview

    • @MrSidneycarton
      @MrSidneycarton 3 years ago +1

      @@nixednamode3607 Not sure whether that was intended as sarcasm or not, buddy.

  • @MrTee-de7to
    @MrTee-de7to 3 years ago +99

    As a longtime Apple customer (since 1986) I was thrilled with Tim Cook's statements about privacy and your history of resisting law enforcement and government when it comes to privacy. Now you have appointed yourself the law, and you are going to scan my phone without my permission. At least the government has to get a warrant. Just a month ago I got rid of my Fitbit watch because Google bought the company, and bought an Apple Watch because of Apple's supposed commitment to privacy. You are not the government, so I have no recourse if you abuse my privacy; you can do whatever you think is right and I have no recourse. There are only two operating systems in the world, and we just have to accept that Big Brother Apple is like Big Brother Google, who knows what's best for the unwashed. We have just about as much recourse as people in China.

    • @Γιάννης-ξ8μ
      @Γιάννης-ξ8μ 3 years ago +5

      Or Apple resisted government parties such as the FBI and CIA for so long (with past features such as destroying all user data if the phone's password is typed wrong 10 times) that it could jeopardize the company. Donald Trump single-handedly managed, with an executive order, to stop Google from providing the official version of Android and its services to Huawei, and Huawei was almost ready to exit the market.
      Now imagine Apple being forced to show all users' iCloud data to governments over child pornography claims even though you don't have any. That would suck for them and for users' privacy. Apple (for now) found an in-between solution that still protects legitimate users' data on iCloud and protects Apple from governments by giving an actual "backdoor" to them after many years (as it seems from Craig's tone).
      The only way this feature gets out of hand is if it expands to political parties or political correctness, such as someone posting a funny LGBTQ image that seems insulting in Apple's eyes. Then things will not look good for Apple.

    • @milantoth6246
      @milantoth6246 3 years ago +11

      An American saying they have to endure tyranny anything like the one in China is just ignorant.

    • @MrTee-de7to
      @MrTee-de7to 3 years ago +3

      @@milantoth6246 It was extreme. My concern is the fact that the internet and media companies are becoming a necessity. Most businesses and utility companies assume you have internet access. The problem is that the tools you need to access the internet belong to companies that can make arbitrary decisions that change your access to the internet, and you have no recourse. There are really only two operating systems, Apple's and Android, both from private companies.

    • @jackwilson5542
      @jackwilson5542 3 years ago +2

      You can de-Google Android phones though, since it is open source. Check out Rob Braxman's channel on how to do it, if privacy is so important to you.

    • @theodiscusgaming3909
      @theodiscusgaming3909 3 years ago +1

      @@justinberman7386 Along with Graphene and Calyx, which pretty much only work on Pixels, there is also /e/OS, which supports a wider range of phones.

  • @richardparker9268
    @richardparker9268 3 years ago +161

    He doesn't seem to understand the fundamental reason people are upset. The hash database is on your phone. The scanning is on your phone. This means that we have no guarantee that our phones will be private in the future.

    • @bhavinbijlani
      @bhavinbijlani 3 years ago +10

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before, and we only have their word now. Why are you in an uproar now? When they first said your phone was private, why didn't you roll your eyes and say "ya, but what about the future?"

    • @agentlatte
      @agentlatte 3 years ago +7

      @@bhavinbijlani They already built the tech to do it. That was their argument against creating a backdoor in 2015. Now it exists, and Apple has no excuse that they "can't comply." They've already stated that they developed the technology to comply.

    • @carlosgomez-ct6ki
      @carlosgomez-ct6ki 3 years ago +16

      It feels like China.

    • @chrismeller6248
      @chrismeller6248 3 years ago +8

      People are upset because they don't understand the underlying technology, the same way that a lack of education about natural forces and science leads people, still, to call someone a witch and persecute them.

    • @Benjamin-lv8zg
      @Benjamin-lv8zg 3 years ago +1

      @@carlosgomez-ct6ki The world has become the same everywhere. We want privacy, but every company and government wants to take it.

  • @vinzniv75
    @vinzniv75 3 years ago +483

    This explanation from Apple is even more worrying. They describe a technical solution where no one will be able to independently evaluate what content triggers the alert. Hash results will be ciphered so that no one will know what content matches what "hash of interest", on the device or on the backend. Any politically sensitive content could be part of the database without anyone ever knowing. Never trust anyone's word to keep you safe from technology abuse.

    • @user-hm7zn6bz4y
      @user-hm7zn6bz4y 3 years ago +41

      The explanation is literally a lie. We don't process the images on your phone, here's the misunderstanding: oeighoihzgoiehrg hieogheorighe oriheoirhgoierhg "scanning on your phone, yes, but," eoitgoiehgeihgo
      That could be the TL;DW of the video tbh

    • @honzasedlon3309
      @honzasedlon3309 3 years ago +41

      @@user-hm7zn6bz4y It's literally not that hard to understand

    • @tomboss9940
      @tomboss9940 3 years ago +23

      The alternative is, as Google and MS do, to scan the whole cloud content of all users. Apple wants to be in a position of not being able to see our data. And that's the way to protect our privacy while trying to follow the laws of the US and EU, which want more and more supervision.

    • @grimhammer00
      @grimhammer00 3 years ago +12

      Oh, it gets worse. At some point audits (real humans) get involved. At this level, who knows what can happen… and if anything did go afoul at Apple, how would you know? What happens when hackers find a way to inject foul hashes, or FISA requests force Apple to apply this tech for political reasons (under the guise of domestic terrorism)? In fact, the timing is extraordinarily on point with recent updates on terrorism.

    • @vinzniv75
      @vinzniv75 3 years ago +13

      @@tomboss9940 The alternative is better. You, as a user, can decide whether or not your content undergoes the screening. While your data rests on your computer or phone, it remains yours and under your sole control. What Apple is doing is potentially removing that control from your hands: any data on your phone may be monitored without you even granting that right. The only thing preventing them from doing that is their good will. Technology history taught us that you should never trust anyone's word to prevent technology abuse (be it knowing or not).
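The "ciphered hashes" concern raised in this thread can be illustrated with a toy blinding step. This is only a sketch of the opacity problem, not Apple's protocol: HMAC with a server-held key stands in for the elliptic-curve blinding Apple described, and the key and table contents are hypothetical. The point is simply that blinded entries are unrecognizable to anyone who lacks the key, including an auditor inspecting the on-device table; in the real design, matching is still completed server-side via a private-set-intersection construction.

```python
import hashlib
import hmac

# Hypothetical server-side blinding key; devices receive only the
# blinded table, never this key.
BLINDING_KEY = b"held only by the operator"


def blind(image_hash: bytes) -> bytes:
    # Blinding makes a database entry unrecognizable without the key.
    return hmac.new(BLINDING_KEY, image_hash, hashlib.sha256).digest()


# What ships to the device: blinded entries only.
device_table = {blind(b"hash-A"), blind(b"hash-B")}

# Without BLINDING_KEY, an inspector cannot tell which images the entries
# correspond to; raw hashes never appear in the table, so there is no way
# to independently check what content is being matched.
probe = hashlib.sha256(b"some arbitrary image").digest()
print(probe in device_table)  # -> False, and unverifiable either way
```

This is exactly why the commenter argues the database contents cannot be independently evaluated: whether an entry corresponds to CSAM or to politically sensitive material is indistinguishable from the blinded bytes alone.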

  • @kushpatel9911
    @kushpatel9911 3 years ago +136

    This was a weak interview. Craig threw out some big fancy words when asked to simply describe the system. No hard questions were asked, and this seemed more like a PR move than an interview

    • @ssud11
      @ssud11 3 years ago +19

      Basically a paid interview for PR purposes.

    • @KarstenJohansson
      @KarstenJohansson 3 years ago +6

      "Craig, tell us why it's okay to treat your customers as if they are guilty until proven innocent, and why you want to foist the system resources onto the users instead of your data centers..." That's what should have been asked.

    • @IndexError
      @IndexError 3 years ago

      @@KarstenJohansson Because if it was checked on iCloud servers, people would go "oh no, they are spying on us"

    • @KarstenJohansson
      @KarstenJohansson 3 years ago +1

      @@IndexError They wouldn't say that when the check is done on their personal device?

    • @bgill7475
      @bgill7475 3 years ago

      @@ssud11 Yep, they also pay through future access to things, not just through money.
      So if they cover this in a way that Apple likes, they get future access to news first because they're seen as trusted.

  • @modimihir
    @modimihir 3 years ago +52

    "People have misunderstood." People are not stupid; we understand what you are doing, and we have a problem with it

    • @brendenstahl7007
      @brendenstahl7007 3 years ago +4

      They are making it up just to look at our private images 🤬

  • @animeguy6877
    @animeguy6877 3 years ago +350

    "It's not a backdoor. But it can be manually verified by humans in case our algorithm finds a match."
    Hmmmmmmm 🤔 That sounds suspiciously like a backdoor to me.

    • @ralphy4813
      @ralphy4813 3 years ago +18

      Spying with extra steps

    • @Roshan_420
      @Roshan_420 3 years ago +9

      The files are on their servers

    • @powerplayer75
      @powerplayer75 3 years ago +23

      A backdoor to what? iCloud? Which Apple already controls?

    • @ifiwantyoutofeel
      @ifiwantyoutofeel 3 years ago +2

      Yeah... You don't need to upload your photos or use that service.... Or simply don't have CP

    • @joeswansonthesimphunter2612
      @joeswansonthesimphunter2612 3 years ago +3

      @@ifiwantyoutofeel No, it's the fact that it might be a faulty system. How can it differentiate an image of a child posing in a sexual manner in lingerie from a baby taking a bath? Will it flag both, neither, or one of those images? Simple things like that can really impact a person's future

  • @Mickeysternum245
    @Mickeysternum245 3 years ago +654

    It seems like Apple still doesn't understand just how strange this has made their most loyal and fervent customers feel. This has the potential to really spiral out of control in PR terms, much like the "Apple purposely slows down phones" headlines came out of the throttling-due-to-battery-age thing. This loyal base kinda sets the tone for what the sentiment around Apple is, and right now they are seething and the issue isn't going away. I don't think the way Craig handled this will do anything to dampen the concerns either: condescendingly dismissing the backdoor concerns while giving no details on how it will be expanded or how we can guarantee Apple is limiting it to child porn. I understand Apple has a new head of PR; it's making people question just what Apple was up to before that their slick PR glossed over. Some kind of line has been crossed here that I've never felt/seen in my 25 years of using and following Apple.

    • @ThinkyParts
      @ThinkyParts 3 years ago +53

      I feel exactly the same way. I've been Apple-only since I was 8… huge fan of the company… they basically bought my house… some line is being crossed here. Like maybe I'm not in love anymore…

    • @masternobody1896
      @masternobody1896 3 years ago +8

      Windows is in the background, nice

    • @Tential1
      @Tential1 3 years ago +21

      The Apple loyalists will always fall in line. As a person who used to think "this is surely the last straw for Apple fans", I don't doubt anymore. I buy the stock and get "rich" with the winning team.
      Public backlash needs to be HUGE to stop this. Apple hard-core fans aren't revolting against Apple. I've put my money on that.

    • @thevideoclub8562
      @thevideoclub8562 3 years ago +2

      We're not, because we don't run our lives with pitchforks and torches.

    • @stevenperea_5405
      @stevenperea_5405 3 years ago +4

      @@ThinkyParts How old are you now?

  • @Maxyy40
    @Maxyy40 3 years ago +290

    They also announced it on a Friday afternoon because they knew there would be blowback and they just wanted people to forget about it during the weekend. Well, that's not happening.

    • @pixelking_871
      @pixelking_871 3 years ago +1

      And they were about to lose sales. I thought the whole phone was the cloud, I'm not understanding

    • @fynkozari9271
      @fynkozari9271 3 years ago +7

      Remember the iCloud hacking in 2014? All the celebrities' pictures leaked. Yeah, Apple has some nice security there. Thank God I don't have an Apple account.

    • @MathieuLLF
      @MathieuLLF 3 years ago +3

      Apple always releases negative news on a Friday afternoon

    • @thehomiedan6378
      @thehomiedan6378 3 years ago +4

      @@fynkozari9271 Dude, that was 2014 lol. Apple has only gotten better with security since then.

    • @fynkozari9271
      @fynkozari9271 3 years ago

      @Apple Genius Jokes on you, I don't use Samsung or Apple. Sad for u, Apple fanboy: pay 3X the price of a smartphone just to get a low battery, low storage/RAM, an ugly notch, slow charging and a 60Hz display in 2021. LOL, so sad. Just make sure u don't break the glass; the repair costs more than an Android phone.

  • @nish6106
    @nish6106 3 years ago +369

    Tim checks the laptops of his engineers........
    Apple engineer: I swear, it's just for the image classification algorithm.

    • @chefnyc
      @chefnyc 3 years ago +34

      Trying to confuse the rocket detection algorithm with similar images 😏

    • @EzraMerr
      @EzraMerr 3 years ago

      Bruh 🤣

  • @yooperlite
    @yooperlite 3 years ago +10

    It doesn't matter what the steps in between are: if A is uploading a photo and Z is them reviewing/alerting the authorities, they "review your private photos" regardless of the letters in between. Don't get lost in the steps.

  • @romakrelian
    @romakrelian 3 years ago +70

    Apple cannot call itself the privacy company anymore.

    • @starbutterflygaming8881
      @starbutterflygaming8881 3 years ago +1

      What about Google Drive and Dropbox? They also scan for CP

    • @romakrelian
      @romakrelian 3 years ago +14

      @@starbutterflygaming8881 True, but they never really were known for their privacy stance, unlike Apple.

    • @drinkwoter
      @drinkwoter 3 years ago +2

      @@romakrelian That's an Apple stan right there

    • @4JUVIE
      @4JUVIE 3 years ago

      ...do you people not know how to interpret the English language? Why is there still confusion?

    • @reydanny6-792
      @reydanny6-792 3 years ago +1

      @@drinkwoter Or because nobody is ever safe when buying a phone

  • @JJs_playground
    @JJs_playground 3 years ago +280

    While I applaud the CSAM implementation, the issue becomes how far-reaching will this become? It's a slippery slope.

    • @tiagomaqz
      @tiagomaqz 3 years ago +22

      This question shouldn't be put to Apple directly but to the government and the entities responsible for overseeing data security. All companies have similar or identical technology, and unlike Apple, they've been using it for decades now.

    • @_fisheater1027
      @_fisheater1027 3 years ago +9

      Same thought. I think this is what happens when legislation cannot keep up with how fast tech develops.

    • @bhavinbijlani
      @bhavinbijlani 3 years ago +2

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before, and we only have their word now.

    • @halahmilksheikh
      @halahmilksheikh 3 years ago +6

      This was built for China to spy on dissidents

    • @Tential1
      @Tential1 3 years ago +2

      @@tiagomaqz Other companies scan things on their cloud.
      Apple is scanning on your device AND in the cloud

  • @10defo
    @10defo 3 года назад +84

7:05 should have dug deeper here. The reference hashes belong to child pornography today; tomorrow some state might force Apple to add additional reference hashes, e.g., of Winnie the Pooh pictures. If too many Winnie the Pooh pics get uploaded to the cloud, we have our manual verification prompt, and thus our backdoor.
One could also try to weaken the hashes so they cover more pictures, prompting the manual verification on all kinds of pictures.
In the end, you still need to trust Apple to only check for the hashes they tell you about. Not quite the advertised "you don't have to trust a single entity".
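The trust problem this comment describes can be made concrete with a toy sketch of threshold-based hash matching. Note the assumptions: a plain SHA-256 stands in for Apple's perceptual NeuralHash (SHA-256 only matches exact byte-for-byte copies, whereas NeuralHash matches visually similar images), all names and data are illustrative, and the threshold of 30 comes from Apple's public statements.

```python
import hashlib

# Toy sketch, NOT Apple's implementation: SHA-256 stands in for the
# perceptual NeuralHash, and all names/data below are illustrative.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The reference database is opaque: a hash reveals nothing about the image
# that produced it, which is exactly the trust problem raised above.
REFERENCE_HASHES = {
    image_hash(b"known-flagged-image-1"),
    image_hash(b"known-flagged-image-2"),
}

MATCH_THRESHOLD = 30  # Apple reportedly requires ~30 matches before review

def should_trigger_review(uploaded_images: list) -> bool:
    """Count uploads whose hash is in the reference set; flag past threshold."""
    matches = sum(1 for img in uploaded_images
                  if image_hash(img) in REFERENCE_HASHES)
    return matches >= MATCH_THRESHOLD

# One stray match never surfaces anything; only crossing the threshold does.
library = [b"vacation-photo"] * 100 + [b"known-flagged-image-1"]
print(should_trigger_review(library))  # → False (1 match, well below 30)
```

The comment's worry then maps to a one-line change: whoever controls the contents of `REFERENCE_HASHES` controls what gets flagged.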

    • @UnkleRiceYo
      @UnkleRiceYo 3 года назад +4

I’m so lost with why people are upset 🤷🏻‍♂️ So what if a manual verification prompt occurs if we have too many Winnie the Pooh pictures? Are you saying that Disney could then advertise to us more or something? Like Apple isn't gonna report you to the police for having Winnie the Pooh on your phone

    • @JamesTurfKing
      @JamesTurfKing 3 года назад +19

@@UnkleRiceYo they will if you're in China. That's the point, slick. In China it's an unwritten law not to have the photo referencing their leader as Winnie the Pooh, so they arrest people who do. Apple's software could easily be rolled out to match the picture and report people in China.

    • @Dlawderek
      @Dlawderek 3 года назад

      @@JamesTurfKing So Apple is to blame because of China’s unfair censorship laws? Also, there’s no indication whatsoever that they will be doing anything of the sort.

    • @0mn10us0wl
      @0mn10us0wl 3 года назад +12

@@Dlawderek I believe you don't understand the issue. A country like China could say "hey Apple, in addition to CSAM, also scan for the following images when uploading to iCloud (e.g. Hong Kong freedom activism photos)." If Apple then goes "nah, we promised our customers not to do that," China could go "do it or you're no longer allowed to sell your products in China" (a huge market that brings a lot of revenue). It really isn't hard to understand. The problem is not what Apple is doing but the possibility of misuse.

    • @IneffablePanther
      @IneffablePanther 3 года назад +1

      @@UnkleRiceYo did you bother trying to understand why this is an actual problem

  • @ShawnAuth
    @ShawnAuth 3 года назад +176

    Many of us understood exactly what this was from day one, this "talking down to" by Apple is gross. You don't control what's in the database and a government can change it from just CSAM to anything they want. Creating the backdoor is the problem.

    • @justshad937
      @justshad937 3 года назад +18

Exactly. There was never any misunderstanding

    • @bhavinbijlani
      @bhavinbijlani 3 года назад +6

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.

    • @luisgutierrez8047
      @luisgutierrez8047 3 года назад +3

      And you can just....you know not upload anything to the cloud....

    • @Theninjagecko
      @Theninjagecko 3 года назад

      Exactly, those who provide the hashes can change it to look for anything.

    • @albertodlh
      @albertodlh 2 года назад

      I disagree completely with calling this a "backdoor". Apple is not *entering* your phone to do anything, Apple is scanning what *you decide to send to them*. This is more of a bouncer than a backdoor.

  • @JessieS
    @JessieS 3 года назад +128

    "A thoroughly
    documented, carefully
    thought-out, and
    narrowly-scoped
    backdoor is still a
    backdoor"

    • @Dlawderek
      @Dlawderek 3 года назад +3

      But isn’t it better than having the door wide open as it is on many cloud services? I think this is the best balance they could find between not hosting CSAM on their servers and also protecting customer privacy.

    • @Manish_Kumar_Singh
      @Manish_Kumar_Singh 3 года назад +5

      @@Dlawderek no

    • @gabrielnunez3371
      @gabrielnunez3371 3 года назад +9

      @@Dlawderek What about not building the door at all. Law enforcement is NOT the duty of private companies, and there are very good reasons for that.

    • @epampoefmkfkefpeao4291
      @epampoefmkfkefpeao4291 3 года назад +1

      @@Dlawderek absolutely not, however good their intentions are, i will never agree to having my private data monitored.
      Something many forget is that one’s privacy is protected by law. Even if police were to illegally obtain even legitimate evidence against one (be it through illegal wiretapping or else) that evidence will be rejected as unlawfully obtained. What apple is doing here is basically rephrasing “we will hack into your storage and check if you have anything illegal” into “we will scan all your photos and if you don’t agree then we will stop providing service to you even if you paid for it”. Barbarism.

    • @chinogambino9375
      @chinogambino9375 3 года назад +1

      @@Dlawderek NO. If you upload to a cloud service its not your hardware or a private space. Apple is now saying my hardware is actually theirs too to do as they please...

  • @stoneroseshero
    @stoneroseshero 3 года назад +29

    This is painful even for him to sell this…my god. This is a problem. iCloud photos are now turned off for me.

    • @bruhlollmao560
      @bruhlollmao560 3 года назад +1

      dont jinx this to me dude, i just switched to icloud

  • @zainratnani
    @zainratnani 3 года назад +73

7:45 what are the multiple levels of auditability? Will you seriously say “no” to China?

    • @halahmilksheikh
      @halahmilksheikh 3 года назад +20

      They've already said "yes" to China when they gave up their security keys to decrypt Chinese iCloud data. They're just going to fold again.

    • @fofoqueiro5524
      @fofoqueiro5524 3 года назад +2

And they will likely say yes to other governments too

    • @mukamuka0
      @mukamuka0 3 года назад +8

Even more than that, this whole thing probably started from China because Huawei got banned. So the CCP lost its surveillance tools and turned to Apple for an answer. What else could have forced Apple to suddenly launch such an out-of-character program?

    • @sorryi6685
      @sorryi6685 3 года назад

      They don't provide encryption for phones sold in China and Saudi Arabia

    • @NotTubeIm
      @NotTubeIm 3 года назад +3

      @@mukamuka0 lol remember Apple is an American company. If anyone is asking them to do anything it’s the CIA

  • @telomnisi9054
    @telomnisi9054 3 года назад +16

    If this is allowed, what's stopping them from reporting your drug pics to the police? Wake up people

    • @gobi817
      @gobi817 3 года назад +6

If drugs are illegal where you live, then why not? People doing illegal activities should be reported.

    • @bradador1
      @bradador1 3 года назад +1

      DRUGS ARE BAD MQWAYYYY

    • @koloqial
      @koloqial 3 года назад +2

      @@gobi817 You missed the point entirely. Also, simply having a picture of drugs is not illegal.

    • @DebraJohnson
      @DebraJohnson 3 года назад +1

      @@gobi817 Because you have a reasonable expectation of privacy on your personal cell phone and companies don't have the right to search and report your content to the police. They shouldn't be looking at your data beyond what is necessary to provide cell phone service. iCloud was marketed as a way to store your data, not a service to scan for and prevent illegal activity.

    • @KP3droflxp
      @KP3droflxp 3 года назад

      Just don’t upload your photos to Apple then? Also, I don’t think people send well known pictures of drugs to other people. Funnily enough, if Apple has a hash for your drug photo, this proves you didn’t take it yourself.

  • @TomNook.
    @TomNook. 3 года назад +293

    "customers own their phones for sure"
They can't even repair them without going to Apple!

    • @JackieWelles
      @JackieWelles 3 года назад +16

      You own it, until you want to repair it ;)

    • @milantoth6246
      @milantoth6246 3 года назад +3

      I just did it today tho

    • @bilalkhann16
      @bilalkhann16 3 года назад +16

      if you repair, you’ll get a warning message in settings 🥲

    • @zak6093
      @zak6093 3 года назад +4

You don't own an iPhone, you just use it.

    • @costacoffee4life665
      @costacoffee4life665 3 года назад +6

I've been repairing Apple products for 2 years, and aside from battery replacements I wouldn't recommend non-techie/unqualified/unconfident people do other things like replacing screens, Lightning ports, FaceID sensors etc

  • @RickLaBanca
    @RickLaBanca 3 года назад +33

    “no no no you dummies don’t understand how this works.”
    We do, which is why we don’t want it.

  • @lenajk2004
    @lenajk2004 3 года назад +36

    “I think the customer owns the phone”
    Right to repair: no

  • @evergreen-
    @evergreen- 3 года назад +13

7:18 This is FACTUALLY WRONG. First, only the database of HASHES of CSAM is stored on the device. By the nature of hashing, it's IMPOSSIBLE to get the source image that produced a hash, meaning you CANNOT CHECK for yourself whether any given image will be classified as CSAM without trusting the US authorities' database to only include relevant hashes (and not hashes of political or religious content). In fact, I believe even Apple has to trust the authorities to provide relevant hashes only.
Second, 7:30 Mr. Federighi didn't mention that the system will launch in the US only; that's the only reason he could say that the database will be THE SAME. There exists no universe in which the Russian or Chinese government will allow iPhones to be shipped with magic hashes that were SECRETLY produced by the US authorities. I bet the first thing they will do, perhaps justly, is DEMAND either ACCESS to the US database or (more likely) use of their own databases for their citizens. And I can guarantee that those databases will match against WHATEVER doesn't suit the government.
Finally, here's why all of this is an attack on our rights: there exists a way to bypass this mechanism by simply turning off iCloud Photos. In other words, while the pervs will lose the convenience of syncing CSAM across their devices, everyone else will be surveilled for no reason.

    • @kiloton5764
      @kiloton5764 3 года назад

      "In other words, while the pervs will lose the convenience of syncing CSam across their devices, everyone else will be surveilled for no reason." Lolololol! Nicely put.

  • @GJ835
    @GJ835 3 года назад +50

    Still doesn’t hit on the real concerning issue

    • @mukamuka0
      @mukamuka0 3 года назад +4

Wait until China asks them to quietly scan other photos...

  • @KuroiPK
    @KuroiPK 3 года назад +224

They should have done this interview from the start, and the worry about future changes still stands.

    • @Eugenepanels
      @Eugenepanels 3 года назад +11

      Right? This makes the suspicion grow even more.

    • @KuroiPK
      @KuroiPK 3 года назад +5

      @@Eugenepanels yeah it’s really strange how they try to underplay this change in a way. They should have done a comprehensive press release from the start, considering how important this change is.

    • @TomorowGames
      @TomorowGames 3 года назад

      Dude the whole thing was leaked before they could properly present this. That’s why it’s causing problems, because it wasn’t officially presented by Apple.

    • @KuroiPK
      @KuroiPK 3 года назад

@@TomorowGames as far as I know it wasn't leaked but released by Apple themselves via their newsroom, but I'll check if I'm wrong…

    • @infinitepower6780
      @infinitepower6780 3 года назад

@@TomorowGames Yeah exactly. It was leaked way before the proper launch and as a result there was a ton of false information and fearmongering.

  • @SC-RGX7
    @SC-RGX7 3 года назад +37

Tim literally threw the guy to the wolves. Hilarious

    • @_sparrowhawk
      @_sparrowhawk 3 года назад +3

      It's almost as if it's Craig's job to talk about software, 5 days a week. He even makes a few mil a year for doing it.

    • @SC-RGX7
      @SC-RGX7 3 года назад +1

@@_sparrowhawk talking about software is his job, but this matter was super important and a word from Tim would have been welcome

    • @00z53
      @00z53 3 года назад +3

      Nice copy and paste

  • @Feadds
    @Feadds 3 года назад +32

    Craig practicing his “Good Morning” for Tim Cook’s Replacement 👀😂
    Reference | 1:20

    • @triple7marc
      @triple7marc 3 года назад +4

      I would be happy if Craig took over for Tim.

    • @Feadds
      @Feadds 3 года назад +2

      @@triple7marc Same, he’s so perfect for the Role . Full of Life and so Enthusiastic .

  • @artjom01
    @artjom01 3 года назад +72

    5:36 "Human moderators". So basically private icloud content can be viewed by apple tech support moderators

    • @harsimranbansal5355
      @harsimranbansal5355 3 года назад +5

      It will probably be a highly specialized team who can do that, not anyone at apple, let alone tech support.

    • @blackhatson13
      @blackhatson13 3 года назад +20

      @@harsimranbansal5355 still a violation of privacy

    • @sivamanipatnala5517
      @sivamanipatnala5517 3 года назад +4

@@blackhatson13 they are big tech companies… and they follow many rules and regulations; it isn't that simple for just any employee over there to come and view our private iCloud photos…

    • @noblesse4728
      @noblesse4728 3 года назад +5

Didn't Facebook also do this with its team of "human moderators"? I'm not saying that it won't invade my privacy, but without those human moderators we could be seeing terrorism, porn and other nasty things in our messages / chat

    • @tiagopaim3060
      @tiagopaim3060 3 года назад

      This has always been the case

  • @marks4java
    @marks4java 3 года назад +283

    I appreciate the tone and balance of this interview. Nice job. My biggest problem with these features is that Apple is assuming a moral position. Let me say I am 100% aligned on these behaviors being immoral/heinous. What concerns me is simply that they are taking a moral position. What happens when next month, it’s not child porn but “hate words” in iMessage? Hate defined however Silicon Valley defines it. Applying tech to moral subjects is a very slippery slope. To suggest they can’t or won’t misuse this kind of tech in the future is just ignorant/naive.

    • @davehugstrees
      @davehugstrees 3 года назад +8

      Legislation or court systems in other countries could easily add requirements to Apple’s scanning database. It’s hard to believe Apple executives could be this short-sighted about a technology. In order to save face Apple can simply say there are problems with the technology and shelve this for the time being.

    • @Dlawderek
      @Dlawderek 3 года назад +12

      I think CSAM and “hate words” are not even nearly in the same league. CSAM is illegal and demonstrably dangerous. The 1st amendment protects your “hate words” so I find it hard to believe that Apple would scan or flag this content. This is a “slippery slope” logical fallacy.

    • @davehugstrees
      @davehugstrees 3 года назад +6

      @@Dlawderek “Hateful content” like Nazi imagery is illegal in some European countries. What’s to stop governments from requiring to Apple to include that in the database of images they scan for?

    • @Dlawderek
      @Dlawderek 3 года назад +5

      @@davehugstrees Maybe they will. If they start censoring political speech by looking through people's images and reporting them, I would be mad. This is not that. If that day comes, we can all turn off our iCloud storage and/or get rid of our Apple products. I don't think outrage is justified in a case where they are taking very cautious steps to curb the storage of CSAM on their servers. It takes 30 instances of hashcodes matching known CSAM before there is an audit. Even if some photos are flagged mistakenly (which I understand to be very rare) it would never reach 30 by mere chance. Even if it did, I would not mind someone at Apple verifying that I have no illegal images in my iCloud. There shouldn't be anything here to worry about.

    • @brkbtjunkie
      @brkbtjunkie 3 года назад +8

      Anyone who says it will never be misused or increase in scope is lying to themselves.

  • @ThinkyParts
    @ThinkyParts 3 года назад +34

    From apple: “Could governments force Apple to add non-CSAM images to the hash list? -> Apple will refuse any such demands” So once again… you’re missing the point. Yes, a government could force this… but trust us. Why should we trust them? Who’s the next leadership team? Apple needs to stop this now. I’m honestly considering breaking up with them for the first time in 30 years.

    • @shahrilamirul4007
      @shahrilamirul4007 3 года назад +1

      Do it, I know I am

    • @hsing-kaichen5062
      @hsing-kaichen5062 3 года назад +1

      Didn’t Apple refuse to unlock an iPhone to US federal government once for a crime case?

    • @Tential1
      @Tential1 3 года назад +4

@@hsing-kaichen5062 they also gave in to China and put a data center in China for Chinese iCloud. So China just has to walk over to their iCloud data center in China, pull the physical data, and they have Chinese iPhone data. They already caved to China once. China will ask to add their own CSAM database; will you disagree then? And what's in that database? We won't know.

    • @thakillerb
      @thakillerb 3 года назад

      And go where? Analog? Pick your evil…

    • @sirfabyan
      @sirfabyan 3 года назад

      if you don’t trust them don’t use icloud photos then and switch to a different cloud photo service 🤷🏼‍♂️

  • @lost-prototype
    @lost-prototype 3 года назад +57

    "Who owns this phone?"
    "Well, customers do, but good luck running any software other than ours on it."
    Answers to moot questions keep average consumers misinformed.

    • @Black7308
      @Black7308 3 года назад +1

      You’ve obviously never jailbroken an iPhone

    • @lost-prototype
      @lost-prototype 3 года назад

      Yeah, and that's totally intentional.

  • @jellyd4889
    @jellyd4889 3 года назад +53

    Sounds convincing. But this is still a backdoor to expand for the govt.

    • @frufrufrufru1999
      @frufrufrufru1999 3 года назад +2

Literally no, Google has been doing this for the past 10 years

    • @akhileshjayaranjan5628
      @akhileshjayaranjan5628 3 года назад +1

      hashes can be made from photos but a hash cannot be converted back into the photo. Apple does not see your photos to generate the hash.
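The one-way property this reply describes can be shown in a few lines (plain SHA-256 for illustration; Apple's NeuralHash is a perceptual hash rather than a cryptographic one, but it is likewise not invertible back to the photo):

```python
import hashlib

# Hashing is one-way: trivial to compute from the photo's bytes, but there
# is no function that recovers the photo from its digest.
photo = b"pretend these bytes are a real image file"
digest = hashlib.sha256(photo).hexdigest()

print(len(digest))  # → 64 (hex chars, regardless of how large the image is)

# Even a one-byte change yields a completely unrelated digest:
digest2 = hashlib.sha256(photo + b"\x00").hexdigest()
print(digest == digest2)  # → False
```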

    • @frufrufrufru1999
      @frufrufrufru1999 3 года назад

      @@akhileshjayaranjan5628 exactly

    • @noth1ngnss921
      @noth1ngnss921 3 года назад +1

      @@akhileshjayaranjan5628 If Apple cannot see your photos then what's even the point of this system? Algorithms are faulty and Apple admits that if this system flags something, there will have to be a human to double-check. And that's the big problem right there: they *can* check your photos. Who is to say that they or a local/US government agency wouldn't just check every photo instead of only the ones that have been flagged?

  • @kadirkaynar
    @kadirkaynar 3 года назад +137

    we will still continue the ‘Misunderstood’ after this video explanation

    • @samsonsoturian6013
      @samsonsoturian6013 3 года назад +2

      This is the exact moment "misunderstand" becomes "defame."

    • @TechieBaby
      @TechieBaby 3 года назад +1

@@samsonsoturian6013 NO. DON'T TOUCH MY PHONE. DON'T USE MY IPHONE'S COMPUTATIONAL POWER TO DO THE FIRST HALF OF THE WORK. NONE OF MY BUSINESS. I DON'T WANT TO BE INVOLVED.

    • @KaizenAction296
      @KaizenAction296 3 года назад

He gave a vague answer. In the future Apple is planning to scan our entire phone. People like you still don't understand and still think that Apple is god, that whatever they do is perfect. I feel bad for you, brother.

    • @samsonsoturian6013
      @samsonsoturian6013 3 года назад

      @@KaizenAction296 ok, conspiritard

  • @Tential1
    @Tential1 3 года назад +71

    "journalism" = regurgitating what big companies tell you

    • @Bbhai24
      @Bbhai24 3 года назад +2

      Where are Global Human Rights activists ?
      Taliban is more violent than ISIS

    • @bestintentions6089
      @bestintentions6089 3 года назад +2

      Nothing to see here , move along pleb

    • @jeycalc6877
      @jeycalc6877 3 года назад

      she is just like some apple activist, protecting apple at all costs

  • @denisruskin348
    @denisruskin348 3 года назад +207

    The second part reminds me of that Black Mirror episode. We are getting there.

    • @monsters8730
      @monsters8730 3 года назад +2

      Which one?

    • @jayseb
      @jayseb 3 года назад +2

      As a parent and security expert, I get that feature. The first one is the one I'm more curious about...

    • @LuthandoMaqondo
      @LuthandoMaqondo 3 года назад

      Which episode?

    • @ShubhamKumar-xu2od
      @ShubhamKumar-xu2od 3 года назад +5

@@LuthandoMaqondo Arkangel

    • @user-ls2jg7vl2h
      @user-ls2jg7vl2h 3 года назад +1

@@ShubhamKumar-xu2od Mm, nice one. It hadn't occurred to me, but agreed. This is the worry with technology: little by little, we're getting to that point

  • @jackoryan292
    @jackoryan292 3 года назад +44

    I think that Apple has “misunderstood” that I value my privacy more than the convenience their products and services can offer me.

    • @cmtheone
      @cmtheone 3 года назад +2

      They aren’t looking at your photos. The only people that should be worried about this are child predators… which may be telling of why you care so much.

    • @jackoryan292
      @jackoryan292 3 года назад +13

      @@cmtheone I’ve worked with law enforcement to put predators in jail before. It’s funny that you’re too dim to see how having your privacy tampered with in the name of the greater good isn’t concerning. Then again, you’re the ideal complacent sheeple that big companies and governments want us all to be. Enjoy your ignorance friend.

    • @jopa7696
      @jopa7696 3 года назад +2

      @@jackoryan292 Switch to Samsung brother 👍 I'd recommend the S21 great phone 👍 I love my iPad but come on man switch to Samsung brother 👍

    • @KP3droflxp
      @KP3droflxp 3 года назад

      @J0p4 google has been doing this for ages. As well as Microsoft. So if you’re going to use them for image cloud storage it’s even worse.

    • @FFLFFS
      @FFLFFS 3 года назад

      @lol
what makes you think child predators will store their photos on their phones?
Same idiocy as using gun registration to stop violent criminals from using guns to rob a bank.

  • @goodtimes333888
    @goodtimes333888 3 года назад +14

    Apple is the only company telling people this is happening. Thank you for the transparency

    • @leonardog.2491
      @leonardog.2491 3 года назад +2

      You’ve got a great point man

    • @chawlee
      @chawlee 3 года назад +1

      Wasn’t it leaked? Then they had to come out and explain...

  • @shrteng6856
    @shrteng6856 3 года назад +30

    It sounds like “you are holding it wrong”

    • @JV-tk3nn
      @JV-tk3nn 3 года назад +1

      I was looking for this comment.

    • @giacomonki
      @giacomonki 3 года назад +1

      Me too

    • @DMINATOR
      @DMINATOR 3 года назад

      Someone is old enough to remember ;)

  • @aibochan1764
    @aibochan1764 3 года назад +72

    “We’re not scanning your photos, you see, we’re scanning your photos.”

    • @jeycalc6877
      @jeycalc6877 3 года назад +4

actually it's "we aren't scanning your photos on your phone, we are scanning your entire iCloud photo library". It's even worse

    • @ohmyghost88
      @ohmyghost88 3 года назад +10

Actually they aren’t scanning any files. They are creating an encrypted hash that is checked against their database of CP hashes. Hashes cannot be decoded; the only way to identify one is to have a matching hash. Thus, the only data that is “revealed” in this process is CP data, which should be banned. However, this is not to say that I agree with what they are doing, or that I don’t recognize the potential of what this may become as it relates to privacy, but the fundamental feature doesn’t actually breach privacy unless the user uploads CP.

    • @jeycalc6877
      @jeycalc6877 3 года назад +16

      @@ohmyghost88 that is literally scanning

    • @aibochan1764
      @aibochan1764 3 года назад +7

      @@ohmyghost88 nice try Craig we know that’s you

    • @ohmyghost88
      @ohmyghost88 3 года назад +4

      Hashing isn’t scanning. The whole point of hashing is to efficiently store and retrieve data without scanning. The hash does not know the contents of the file, it just calculates a number (hash) that is used during transport to check if errors occurred (the checksum is calculated at the source and destination) and it needs to be sent again. Did you guys take a computer networking class or not?

  • @kingkang6877
    @kingkang6877 3 года назад +131

    "I THINK our customers own their phones"
    What a great vote of confidence......

    • @khriskeane6800
      @khriskeane6800 3 года назад +4

      for sure.

    • @VeryBlackMirror
      @VeryBlackMirror 3 года назад +8

      I was shocked he used that language. I’m guessing Craig, Tim and anyone else giving media interviews are demanding the questions upfront.
      Then Apple legal, corp comm and marketing can train the two of them with exactly what to say that will answer SOME questions, but not enough to commit to anything that could lead up to being used in a courtroom or in Congress against them.

    • @joshuanolan4581
      @joshuanolan4581 3 года назад

      Lol exactly

    • @KarstenJohansson
      @KarstenJohansson 3 года назад +14

      If the customers owned their phones, they'd be able to install software from wherever they wanted to obtain it. They'd also be able to replace the battery themselves, even if it meant buying a special tool for the job.

    • @tophan5146
      @tophan5146 3 года назад +3

      THINK DIFFERENT

  • @69savpm67
    @69savpm67 3 года назад +1

    Child pornography is really bad, scan them, and alert the authorities. Stop child abuse. Authorities please take action especially when that person is very politically connected like your policy makers.

  • @OasisMusicOfficial
    @OasisMusicOfficial 3 года назад +8

In a practical sense, Apple at least has good intentions by doing this. It's unarguably a good thing that they are planning on tracking down phones that happen to have child-abuse imagery on them.
I do see why people are mad though. Apple has always had a long history of keeping information secure for its customers, and this seems like a slap in the face to those who use iPhone because of its security.

    • @mildmixchintu1717
      @mildmixchintu1717 2 года назад

Because Apple has always been so vocal about privacy and not letting other apps track your data. Also, Apple has started to show ads on their platforms and is hiring people to create the kind of targeted ad network they opposed for so long, pushing out the whole competition. Google scans your data all the time and flags illegal stuff on gDrive, but it's not a big deal because they never said they wouldn't do it, or claimed tracking personal data is a bad thing, like Apple has.

  • @avieshek
    @avieshek 3 года назад +151

    What a timing to drop this exactly after the *Pegasus* deal 'still unaddressed'

  • @jimbo-dev
    @jimbo-dev 3 года назад +61

    They actually cut the Apple campus out from any word Craig said which could be used to make memes! That means there has to be an agreement for this interview. I wonder if that includes other limitations as well since the interviewer didn’t pressure Apple that much. This feels more like Apple marketing than journalism

    • @koloqial
      @koloqial 3 года назад +5

      I thought similar. I noticed how this was cut too....like why did they have alternate camera angles for a meeting that took place on FaceTime?

    • @JackieWelles
      @JackieWelles 3 года назад +3

      I mean what did you expected from Apple. They are one of the most strictest companies who absolutely love controlling the narrative.

    • @gabrielgarcia7554
      @gabrielgarcia7554 3 года назад +2

      Given that this is an exclusive, this is most likely a way for Apple to take control of the situation. Most companies will only agree to these types of interviews if only certain questions are asked to control the narrative.

    • @roiqk
      @roiqk 3 года назад +3

      Literally everything you see in news these days is just propaganda. Journalism is dead

    • @gabrielnunez3371
      @gabrielnunez3371 3 года назад +1

      Time to make some memes with a huge Apple watermark out of pure spite

  • @jjaramos
    @jjaramos 3 года назад +87

    I wish Joanna would have asked about the future "enhancement and expansion" of this thing, as Apple announced. Dystopian world we are about to live in.

    • @exiles_dot_tv
      @exiles_dot_tv 3 года назад +8

      Meanwhile Google has already been inhabiting that world for *years* now.

    • @jjaramos
      @jjaramos 3 года назад +2

      @@exiles_dot_tv indeed, the rest of big tech are dragging us all to that dark place.

    • @bhavinbijlani
      @bhavinbijlani 3 года назад

      I believe the POSSIBILITY of future changes existed even before this announcement. We only had their word before and we only have their word now.

    • @PedroLopezBeanEater
      @PedroLopezBeanEater 3 года назад +2

      Have you been asleep the last few decades or are you just a Microsoft/Google fan boy?

    • @jjaramos
      @jjaramos 3 года назад +4

      @@PedroLopezBeanEater I haven't and I'm not anyone's fan boy.

  • @soundsof...
    @soundsof... 3 года назад +28

    "I think our customers own their phones, huh, for sure."
    Too bad his thinking is not reflecting what is really happening...
    #righttorepair
    p.s. this is not an interview, merely a communication from Apple...

    • @alespic
      @alespic 3 года назад

      Pretty sure she asked questions, that makes it an interview.
      And he answered those questions. No fuss about it

  • @crspy1075
    @crspy1075 3 года назад +26

    "how do you know this is a nude image or a rocketship?" LOL top-tier questions!

    • @baoquoc3710
      @baoquoc3710 3 года назад +5

      Having a picture of Blue Origin rocket
      Iphone user: *nervous sweating*

  • @kcocok
    @kcocok 3 года назад +24

Apple is like: a man comes up to a lady during her shower, saying he will keep his eyes closed and just scan for security. People just don't believe it and don't buy it. The point is not "a safe way to scan phones". The point is "DON'T scan my phone". Don't means don't

    • @jovimathews
      @jovimathews 3 года назад +3

But Apple doesn’t scan your phone. I think you mean the photos on the iCloud server.

    • @brendenstahl7007
      @brendenstahl7007 3 года назад +1

      Connection to your phone

    • @JV-tk3nn
      @JV-tk3nn 3 года назад

I like your woman-in-the-shower analogy, but you should have elaborated more on that story. Left me wondering what happens next. When can the man open his eyes?

    • @xehP
      @xehP 3 года назад

      Yeah a MAN, ofc it has to be a MAN

  • @SKongdachalert
    @SKongdachalert 3 года назад +17

    Though I was concerned about the privacy aspect at first, I think Craig explained it pretty well here, and I kind of get his viewpoint from a software development/iCloud as a service standpoint.
    1. *Without* cloud services, the device itself is secure and encrypted.
    2. When you're using iCloud, when images are uploaded, they sort of perform a comparison to the reference CSAM image database on-device (which I guess is a trained neural network to flag that part), and upload it with the actual photo. Actual photo never gets opened, only the CSAM neural net jumbled encoding gets processed in the second half of the neural network in the cloud.
    Now the 2nd part may sound like potential invasion of *on-device privacy*. Maybe.
    From my experience even "secure" cloud services like MEGA or such do routinely flag accounts for takedown when hashes of the files match copyrighted content. I think the major difference is in the way the actual cryptography is performed.
    As far as I understand. With most cloud providers, they scan all files in the cloud, like, the whole hashing/neural network encoding stuff is all done on your files in the cloud. But in Apple's case, as far as Craig says it's more like.. Hey we don't want our servers to know this much about the original image, we'd rather have part of the encoding performed on device, then sent over to the cloud. This benefits both Apple and people in that the cloud doesn't need to perform as much workload (maybe), and the cloud doesn't need access to the actual image in order to perform encodings on it.
    Put simply, I think the way they see it is more like...
    1. "That on-device encoding is better, because now we still can't access user data *while also complying with the law on cloud services, without being able to give the authorities the actual data they have*".
    2. Processing images on-device considered as *part of the uploading process*, rather than considered as "scanning of all photos". With the recent iPhones sporting more and more advanced neural network accelerators, I think it's a logical step from a development standpoint to simply perform an encoding of the images on-device (both very efficient for the server & makes the server not need to go through the actual image data).
    So the on-device security is not compromised, it's just that part of iCloud support is built in to iOS, and this part is not used at all when you don't use the cloud services (i.e. Craig's "and you don't have to" around 2:34). It's the iCloud uploading process that does the scanning. And so, iCloud is still secure "data-wise". Nobody can read your data, *but* your cloud account can still get flagged for suspicious activity.
    Pretty neat engineering, but whether it's an invasion of privacy is up for debate. On-device still sounds pretty secure to me.

    • @prevails85
      @prevails85 3 years ago +1

      Well said!!!

    • @DrumToTheBassWoop
      @DrumToTheBassWoop 3 years ago

      Completely missed the point then: it’s not about CSAM, it’s about putting scanning code onto devices, meaning in the future another update comes along to scan for “potential” terrorists, and so on. It’s all about the concept of putting scanning code onto devices in the first place; that’s the issue here.

    • @SKongdachalert
      @SKongdachalert 3 years ago +2

      @@DrumToTheBassWoop No, really, I understand people's concerns over the code being there on the device.
      But from an engineering perspective, iCloud is just another piece of preinstalled software, which is optional to use. Your perspective is that iCloud Photos is a core part of iOS, and that scanning happens on every phone. To me, it's just another service that is optional and happens to be preinstalled, and won't ever be used except for the exact action of uploading photos to iCloud.
      Kind of like: what's on your device is yours, but if you want to put the stuff in this cloud service then the service will scan before uploading.
      Like I said, it depends on the perspective. I see it as just another cloud service that happens to be preinstalled; you see it as a major intrusion in the OS itself.
      The only way I would consider this to be an invasion of on-device privacy is if the uploads/analyses were performed without disclosure. This announcement is mostly just them saying their cloud services *will* comply with the law.
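
The two-stage flow this commenter describes (fingerprint on-device, count matches server-side, flag only past a threshold) can be sketched roughly as below. This is an illustrative toy only: Apple's real system uses a perceptual "NeuralHash" and private set intersection, neither of which is public, so SHA-256 and a plain Python set stand in for both, and the threshold value is based on Apple's public statements.

```python
# Toy sketch of the described pipeline. SHA-256 is NOT what Apple uses;
# it stands in for the undisclosed perceptual NeuralHash. The database
# contents here are invented placeholders.
import hashlib

# Stand-in for the NCMEC-derived hash database shipped to devices.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}
THRESHOLD = 30  # Apple stated roughly 30 matches precede human review

def hash_image(data: bytes) -> str:
    """On-device step: derive a fingerprint; raw pixels never leave."""
    return hashlib.sha256(data).hexdigest()

def count_matches(photos: list) -> int:
    """Server-side step: count uploads whose fingerprint matches."""
    return sum(1 for p in photos if hash_image(p) in KNOWN_HASHES)

photos = [b"vacation.jpg bytes", b"known-bad-image-bytes"]
matches = count_matches(photos)
flagged = matches >= THRESHOLD  # a single match alone never flags an account
```

Note the design point Craig emphasizes: only fingerprints are compared, and nothing is surfaced until the match count crosses the threshold.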

  • @lucamuller1672
    @lucamuller1672 3 years ago +15

    I DON‘T WANT SPY SOFTWARE ON MY PHONE. PERIOD!

    • @DisticTV
      @DisticTV 3 years ago +1

      All companies do it already

    • @DisticTV
      @DisticTV 3 years ago

      @@nayutakani2055 Still better than how Google does it

    • @koldobika_
      @koldobika_ 3 years ago

      You can just opt out by deactivating iCloud Photos, and if that still isn’t enough for you, switch to Android or BlackBerry, build an OS yourself or pay someone you trust to do it for you, or just use pigeons, or don’t ever install iOS 15!

  • @adamchandler8834
    @adamchandler8834 3 years ago +2

    Apple already screwed up their image when they kicked Parler off iOS. You are a tech company and not politicians, period!

  • @NielsDutch1906
    @NielsDutch1906 3 years ago +15

    I really wonder what the testing phase for the algorithm looked like.

  • @38Unkown
    @38Unkown 3 years ago +13

    Another part of this issue is the idea of who owns the content. Regardless of where it is stored. If the police need a warrant to search a safety deposit box at a bank, shouldn't Apple need a warrant before searching photos? The idea of fiduciary duty and trust. If someone is purposely posting items to a public location by all means search away. But when photos are privately being stored in the cloud it feels very invasive.

    • @VeryBlackMirror
      @VeryBlackMirror 3 years ago +2

      I’d love to see what the new TOS are for iCloud once Apple implements this.

    • @crusherman2001
      @crusherman2001 3 years ago +2

      You're not privately storing them though. You're storing them on Apple's iCloud servers, where Apple becomes responsible for any content you have on there.

    • @38Unkown
      @38Unkown 3 years ago +1

      @@crusherman2001 And that is the issue. When I store paper files in a safety deposit box, (1) the bank can't nose through my stuff and (2) the bank has the responsibility of keeping my files secure. I still own the documents. For all Apple's talk of privacy, this could be manipulated to be very Big Brother...

    • @DebraJohnson
      @DebraJohnson 3 years ago

      @@crusherman2001 If you have a reasonable expectation of privacy, they can't just go through your images to report them to the police. For example, if you pay to store your physical items at a storage place, they can't go through and search your stuff and report it to the police. Now, if they have a reason to think you are doing something illegal (smell of weed coming out, for example), they can report it to the police who will then need reasonable suspicion or a warrant to search your stuff. This proactively searching and reporting people to the authorities is not only a terrible invasion of privacy, but it's one that could create legal issues for innocent users.

    • @tomboss9940
      @tomboss9940 3 years ago

      With Dropbox, Google, and MS, this is happening now. Apple wants to safeguard iCloud. That's why they came up with this (complex) solution to avoid having to look at all your photos. The plan is to encrypt all parts of iCloud in a way that Apple cannot read.
      This solution is a counter-offer to the intrusive US and EU laws in the works for "child protection" (a pretext for sniffing through all our cloud data and communications).

  • @abhigyanchakraborty5563
    @abhigyanchakraborty5563 3 years ago +33

    This is the first time I've seen apple being so flustered in an interview.
    Smells fishy...

  • @revtane9
    @revtane9 3 years ago +38

    I am a developer. I agree with Craig on this tech and the specific software implementation, but I am worried about what privacy really should be.

    • @revtane9
      @revtane9 3 years ago +1

      @DataHearth Apple should really implement the best zero-trust design. The feature is OK, but it's a matter of whom they trust. As a developer, I would never touch this kind of sensitive user data.

  • @Icybones000
    @Icybones000 3 years ago +2

    There's no confusion: you are using child pornography as an excuse to scan people's phones and literally spy on them. Who is Apple to decide they will police people? Drop Apple phones fast. This is not OK; they will use any wording to confuse you or convince you this is OK. Watch Edward Snowden's video on this BS.

  • @RickLaBanca
    @RickLaBanca 3 years ago +2

    This is so Apple. The condescending attitude, and they can’t even do an ad hoc interview; it’s a two-camera shoot with a PowerPoint pseudo-interview.

  • @prateekyadav9956
    @prateekyadav9956 3 years ago +14

    Privacy was one of the only reasons to use Apple, but not anymore

  • @knowledgeispower36
    @knowledgeispower36 3 years ago +30

    That’s why I like Craig: very clear, very respectful. I’m still using external storage though

    • @Tential1
      @Tential1 3 years ago +4

      And that's why companies hire likeable people to do their pr campaigns. And it worked. Hence why I own the stock. Apple could kill your kids, and you'd still buy the phones.

  • @DeedoDoop
    @DeedoDoop 3 years ago +70

    And I thought Apple cared about my privacy

    • @Michael-te6tb
      @Michael-te6tb 3 years ago +4

      Lol

    • @TomNook.
      @TomNook. 3 years ago +6

      And Google did no evil. Times change

    • @letrat7021
      @letrat7021 3 years ago +11

      If you’re a child then your parents can and should know when you’re about to do something unsafe

    • @JConnel
      @JConnel 3 years ago +8

      Just stop downloading illegal child images and you won't have anything to worry about.

    • @sivamanipatnala5517
      @sivamanipatnala5517 3 years ago +5

      They still do. It isn't like Tim Apple is eating popcorn and having fun watching your private iCloud photos…

  • @pjdexter168
    @pjdexter168 3 years ago +3

    It sounds like a blind raid without probable cause or a warrant: they can't see exactly what you have in your house as they rummage around, but they'll check anyway. It's either private or it's not.

  • @DARK_AMBIGUOUS
    @DARK_AMBIGUOUS 3 years ago +5

    I don’t want my phone to use AI to scan my photos

  • @TechieBaby
    @TechieBaby 3 years ago +6

    As a tech worker, I don't think we misunderstood you, Apple. You are still using MY iPhone's computational power to do something you want to do, without my consent (generating the hashes of photos, described by Craig as the "first half" of the process). I don't want to be a part of YOUR company's sense of social responsibility. I should have a say if you're going to use MY phone for anything. I am extremely disappointed with Apple (but thanks to WSJ for this interview). And as a 10-year loyal iPhone and Mac customer, I will reconsider whether I should use Apple products if this 'feature' goes live.

  • @akshaypatel6720
    @akshaypatel6720 3 years ago +17

    So, 7:05 the senior VP of software at Apple really doesn't understand what a backdoor is?

  • @MickiMinach
    @MickiMinach 3 years ago +94

    Thank you Joanna for doing this interview

    • @jeycalc6877
      @jeycalc6877 3 years ago +6

      She did a terrible job. Joanna needs to stop sucking up to Apple. It's even worse that they are going through our iCloud

    • @bhagathyennemajalu
      @bhagathyennemajalu 3 years ago

      @@jeycalc6877 So we should boycott Apple and move to a Chinese clone?

    • @jeycalc6877
      @jeycalc6877 3 years ago +3

      @@bhagathyennemajalu A brain is a terrible thing to waste; I suggest you use yours

    • @john10000ish
      @john10000ish 3 years ago

      Post-wall "professional" woman.

    • @puneetsharma1437
      @puneetsharma1437 3 years ago

      Wall Street probably gets donations for this

  • @cyb3r1
    @cyb3r1 2 years ago +1

    Craig Federighi looked highly uncomfortable during the interview and his explanations were a concerning mess. Thumbs up for the questions asked by the reporter; they usually go much softer on them.

  • @bj0urne
    @bj0urne 3 years ago +3

    10:50 "I think our customers own their phone for sure"
    *Fights against Right to Repair* 😂

  • @keefyboy
    @keefyboy 3 years ago +30

    "a degree of analysis done on your device" So, YES, iPhone will be scanned.

    • @infinitepower6780
      @infinitepower6780 3 years ago

      No, the HASHES (alphanumeric strings) of your photos will be scanned.

    • @keefyboy
      @keefyboy 3 years ago

      @@infinitepower6780 If they can come into my phone to hash pics, why couldn't the Gov't compel them to hash other files?

    • @infinitepower6780
      @infinitepower6780 3 years ago +1

      @@keefyboy touché
      I guess it's just trust at this point

    • @KP3droflxp
      @KP3droflxp 3 years ago

      Only when preparing the files for upload to iCloud.

  • @MESHvlogs
    @MESHvlogs 3 years ago +9

    It takes 20 years to build a reputation and yet only a few days to crumble the very foundation. Thanks Apple. SMH.

  • @Mickeysternum245
    @Mickeysternum245 3 years ago +52

    Craig is being trotted out like Colin Powell was about the Iraq war

  • @dukiwave
    @dukiwave 3 years ago +30

    "I don't understand the backdoor characterization" what a weasel

  • @mediamass1404
    @mediamass1404 3 years ago

    9:46 How much of the shelf life of a phone is lost this way? How hot will their phones get?

  • @aeelinnannelie5651
    @aeelinnannelie5651 3 years ago +19

    I don't get why the hashing has to be done on my phone. That can easily be done once the image is in iCloud (if you choose to use that service). Kind of scary that my phone is generating hashes to identify the files I store on my device. You don't need to think too much to realize this can be weaponized by the government

    • @alexkobzin557
      @alexkobzin557 3 years ago +3

      Hashes are used all the time, everywhere, at each step of almost any software. So don't worry about hashes

    • @aeelinnannelie5651
      @aeelinnannelie5651 3 years ago +11

      @@alexkobzin557 Yes, but not to identify my files by comparing the hashes to a government database. As I said, that can be weaponized if you swap the CSAM database for any database of content that the government doesn't like. And no, I don't approve of CSAM. My concern is more about privacy, something Apple has been using over and over to sell their products

    • @alexkobzin557
      @alexkobzin557 3 years ago +1

      @@aeelinnannelie5651 Yeah. If they screw up privacy they will lose to Android completely, and I think they understand that

    • @CalienteFrijoles
      @CalienteFrijoles 3 years ago +2

      Actually, hashing is a security feature meant to protect the user. It means that your data isn't in plain sight. So your photo from Utah might look like "hwhizb&37$;8€![€|*..." when stored on a server instead of the actual image, which is what you want in case their database is ever breached. On-device hashing happens too, for the same reasons.

    • @aeelinnannelie5651
      @aeelinnannelie5651 3 years ago +7

      @@CalienteFrijoles They have the hashes in the database; therefore, they have the hash itself and the file that generates that hash. Yes, Apple cannot see the content directly, but if the hash matches, the government can go to their database and check the file. Now apply this hashing to a database of content that the government doesn't like and you have an easy way to identify people who are 'against' the system. This has the potential to become something similar to the scoring system that China already has
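
The worry raised in this thread is that the matching machinery is agnostic to what the database contains. A minimal sketch makes that concrete: swap in a different hash list and the identical code flags entirely different material. SHA-256 stands in here for whatever fingerprinting scheme is actually used, and both databases below are invented placeholders.

```python
# Sketch of the database-swap concern: the pipeline never knows or
# cares what the hashes represent. All content and databases here
# are made-up illustrations.
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in fingerprint; real systems use perceptual hashes."""
    return hashlib.sha256(data).hexdigest()

def flag_matches(files: list, database: set) -> list:
    """Return indices of files whose fingerprint is in the database."""
    return [i for i, f in enumerate(files) if fingerprint(f) in database]

files = [b"family photo", b"protest flyer"]

csam_db = {fingerprint(b"some known illegal image")}   # intended use
dissent_db = {fingerprint(b"protest flyer")}           # hypothetical repurposed list

clean_result = flag_matches(files, csam_db)      # nothing matches
swapped_result = flag_matches(files, dissent_db) # same code, new targets
```

Apple's stated safeguard against this is procedural rather than technical: the database is limited to NCMEC-supplied CSAM hashes, which is exactly why critics call the design a policy promise rather than a guarantee.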

  • @alexlouder
    @alexlouder 3 years ago +45

    They even made a cut when Federighi said “pornography” so the Apple campus won’t be in the background if “memed”. Sneaky little weasels.. I can see your tricks!

    • @jimbo-dev
      @jimbo-dev 3 years ago

      Same thing on 7:08
      And on 9:53

    • @high63294
      @high63294 3 years ago

      @@jimbo-dev the video is only 11:45 long

    • @jimbo-dev
      @jimbo-dev 3 years ago

      @@high63294 Oops, correct, I fixed it. YouTube is infested with ads and the mobile YouTube client doesn't allow jumping to timestamps more than once 😖

    • @privacyhelp
      @privacyhelp 3 years ago

      @@jimbo-dev just buy premium acc, they are cheap

  • @AJsWorld
    @AJsWorld 3 years ago +15

    "We've been unwilling to deploy a solution that would involve scanning all customer data"
    That's exactly what this is, Craig

    • @iAmCracky
      @iAmCracky 3 years ago

      While I am against this, this is simply not true. They don't even scan images that you took yourself (apparently), only images that came from another source and are then uploaded to the cloud.

  • @rjisasavage
    @rjisasavage 3 years ago +1

    For everyone upset, you know this only matters if you choose to use iCloud. You don’t have to use iCloud. Boom. Problem solved.

  • @arinchk.9265
    @arinchk.9265 3 years ago +3

    I appreciate that she did this fiercely straightforward interview session (more of an interrogation, really) for the good of all Apple device users. Thank You ✌🏼

  • @emmanuelgoodluck9013
    @emmanuelgoodluck9013 3 years ago +16

    He is shaking 🤣

  • @cisbaovuwel3394
    @cisbaovuwel3394 3 years ago +10

    The real problem is that this technology could be used to identify protesters and activists in the name of "CSAM". Any of your photos, and the metadata of those photos, might be read by Apple's employees, because Apple can't guarantee the accuracy of the neural network. My real concern is that law enforcement could take advantage of that, forcing Apple to access someone's gallery to identify social activists, with Apple explaining it away as a technical defect.

    • @frappes_
      @frappes_ 3 years ago +2

      The database they refer to for these images comes strictly from NCMEC. NCMEC does not have a database of protesters and activists. Governments do not have access to the CSAM-verified images, I repeat CSAM images only, until the manual verification process, which will not happen without repeated algorithmic matches.

    • @frappes_
      @frappes_ 3 years ago +1

      Apple has blocked law enforcement before from getting private information off the phone of a literal terrorist. If any precedent was set, it's that Apple values your privacy, and these what-if scenarios will not happen until they do. And even if it does happen, Apple will be accountable for it, as Craig mentioned.

    • @justshad937
      @justshad937 3 years ago

      @@frappes_ That's not quite the objection at hand. It's NCMEC today; the same approach could be applied to any other database. With regard to the precedent, access was denied in the United States, where courts are able to protect the Apple corporation from arbitrary requests. End-to-end encryption was also an excuse used. It's a different ballpark when it comes to authoritarian regimes.

    • @frappes_
      @frappes_ 3 years ago

      @@justshad937 I get that, and I share that concern, but based on Apple sources this will only launch in the US for now. I know it can become a slippery slope, but I have faith in the precedent set by Apple in denying governments backdoors to their technology, and even if it does get that bad, the choice for consumers will be clear: do not buy Apple stuff anymore.

    • @justshad937
      @justshad937 3 years ago

      @@frappes_ That government could simply make it illegal for Apple to disclose this information, "in the name of national security," the same way US telcos didn't disclose NSA surveillance.

  • @ECVIDS999
    @ECVIDS999 3 years ago +10

    Here is the issue: the US government cannot just go around, go through each device, and report what is on there. So neither should Apple, or any other private company for that matter. This is a much bigger issue than it is being presented as.

  • @MrCoffis
    @MrCoffis 3 years ago +28

    "This is not what is happening", but that is exactly what is happening.

  • @DoctorPrepperMD
    @DoctorPrepperMD 3 years ago +1

    So he is saying data you upload to Apple is theirs to scan if they want.

  • @Andy1341000
    @Andy1341000 3 years ago +27

    Even if this feature remains harmless, the implications of future abuse of said feature are bad. Apple should stop this before it gets worse for their reputation.

  • @quickhug6075
    @quickhug6075 3 years ago +4

    Can banks scan/look at what's stored in your safe without a warrant? If not, why can a tech company scan our private files without a warrant?

    • @KP3droflxp
      @KP3droflxp 3 years ago

      You can just not upload it

    • @cody4916
      @cody4916 3 years ago

      Because you agree to it when you upload to the cloud

  • @mrgrumpy888
    @mrgrumpy888 3 years ago +4

    How are they doing multi-camera shots from a web call?

    • @mauricioflores3732
      @mauricioflores3732 3 years ago +2

      😂😂😂 This is edited, my friend

    • @mrgrumpy888
      @mrgrumpy888 3 years ago

      Yeah, but this is meant to give the impression that it's a 1:1 conversation. This feels staged.

  • @camcan6382
    @camcan6382 3 years ago +1

    Yes, we are scanning your photos without scanning your photos.

  • @geraltofrivia__w.w.7513
    @geraltofrivia__w.w.7513 3 years ago +1

    WSJ: We would like to ask you a few softball questions about the new update.
    Apple: OK, sounds good, and we'll send 1,000 new iPhone 13s so you can "test them out"