The most urgent threat of deepfakes isn't politics

  • Published: Sep 10, 2024
  • The real threat of deepfakes, explained with Kristen Bell.
    Join the Open Sourced Reporting Network: www.vox.com/ope...
    Actress Kristen Bell first found out there were deepfake porn videos of her online from her husband, actor Dax Shepard. In the videos, her face has been manipulated onto porn stars' bodies.
    “I was just shocked,” the actress told Vox. “It's hard to think about that I'm being exploited.”
    And this isn’t only happening to celebrities. Noelle Martin, a recent law graduate in Perth, Australia, discovered that someone took photos she’d shared on social media and used them first to photoshop her face into nude images, and then to create deepfake videos.
    Deepfakes are often portrayed as a political threat - fake videos of politicians making comments they never made. But in a recent report, the research group Deeptrace found that 96% of deepfakes found online are pornographic. Of those videos, virtually all are of women. And virtually all are made without their consent.
    Sources:
    "The State of Deepfakes" deeptracelabs....
    Defining "Deepfake" www.theverge.c...
    "Deepfakes and Cheapfakes: The Manipulation of Audio and Visual Evidence" datasociety.ne...
    "An Introduction to Neural Networks and Autoencoders" www.alanzuccon...
    "Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security" papers.ssrn.co...
    Open Sourced is a year-long reporting project from Recode by Vox that goes deep into the closed ecosystems of data, privacy, algorithms, and artificial intelligence. Learn more at www.vox.com/ope...
    This project is made possible by the Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
    Watch all episodes of Open Sourced right here on YouTube: bit.ly/2tIHftD
    Vox.com is a news website that helps you cut through the noise and understand what's really driving the events in the headlines. Check out www.vox.com.
    Watch our full video catalog: goo.gl/IZONyE
    Follow Vox on Facebook: goo.gl/U2g06o
    Or Twitter: goo.gl/XFrZ5H

Comments • 1.4K

  • @milfordjenkins6113 · 4 years ago · +2327
    Did anyone else think that Kristen Bell at the start was a deepfake?

    • @hongvicodes · 4 years ago · +25
      Glad you told me
    • @Emily-Dong · 4 years ago · +6
      Haha yea
    • @Blinkisageek · 4 years ago · +28
      That would have made it sooo much better. Like at the end it's revealed it's a deepfake.
    • @evilmokri6217 · 4 years ago · +1
      Yah
    • @timez9663 · 3 years ago
      Yeah D:::

  • @ianlack4417 · 4 years ago · +4207
    Let's start the countdown timer. Congress should be able to address this in about 100 years or so.

    • @myeyelashflewoff · 4 years ago · +117
      This is so sad and true that it's actually funny
    • @o_o3142 · 4 years ago · +117
      A hundred years? Quite the optimist, aren't you? My money's on a brief chat in 250 and legislation 60 years after that.
    • @tomwaddy9035 · 4 years ago · +20
      *Acknowledged, it will take another 200 years or so to actually do anything
    • @BSKX17 · 4 years ago · +18
      unless the people there became popular faces all of a sudden
    • @K42023 · 4 years ago · +2
      😂😂

  • @LuxuryPi · 4 years ago · +3227
    This is basically the reason why you should care about the privacy of your data. You can't imagine what will be done with it.

    • @thatsterroristsbro7855 · 4 years ago · +53
      A reason. One reason of many. But it'd happen anyway; this is contingent on a few factors, and it is a particular kind of person being targeted by another kind. So, not as neutral as you perhaps are mistaking it to be.
    • @bubrub5564 · 4 years ago · +34
      @liang2492 There is still a risk of it happening.
    • @lucidexistance1 · 4 years ago · +4
      What's the name of your first pet?
      Where were you born?
      What's your mother's maiden name?
    • @gozolino · 4 years ago · +24
      @liang2492 That's what you think.
      Then it happens.
    • @MattPratt · 4 years ago · +6
      In b4 Facebook starts selling all your photos to deepfake makers. Hey, a buck is a buck to Zuck.

  • @paulcooper8818 · 4 years ago · +925
    It is said that some aboriginal peoples, when first introduced to photographs, thought their soul was being stolen.
    In a sense, now that is possible.

    • @ryancappo · 4 years ago · +29
      The Amish have the same feeling towards pictures of their faces.
    • @illDefine1 · 4 years ago · +28
      Why is everyone on this site so dramatic? It is not soul thievery, it is a form of identity theft. If you exaggerate everything, people tend to take you less seriously after a while ... which is exactly what is happening to people who are considered progressives.
    • @LightningbrotherG · 4 years ago · +30
      It suddenly clicked for me: stealing someone's face is bad, but stealing someone's soul, manipulating it as you see fit, is horrific.
    • @deiandorinha1207 · 4 years ago · +3
      @LightningbrotherG :(
    • @unicornsprinkles3277 · 4 years ago · +5
      i mean i get what youre going for but this just sounds so pretentious

  • @agrimg · 4 years ago · +1146
    It won't be long until video, photographs, or audio recordings are no longer considered evidence in a court of law.

    • @tasibsharar7357 · 4 years ago · +5
      Agrim Gupta you can just limit the internet
    • @agrimg · 4 years ago · +77
      @tasibsharar7357 That is true, but in a lot of places even security footage is now simply moved to a cloud and not kept on physical drives (near the cameras). Limiting the internet probably won't work, but laws about the digital signatures/prints of video evidence will now have to be rethought.
    • @XavierZara · 4 years ago · +54
      The response would be a stronger focus on metadata and device identifiers
    • @beepboopbeepp · 4 years ago · +77
      Imagine how easy it will be for authoritarian governments to frame people now, with deepfakes as evidence
    • @luziealyssa5677 · 4 years ago · +2
      Where I live they are already not considered evidence, because they can easily be tampered with

  • @loof470 · 4 years ago · +3058
    Imagine if all the people they had in the video were also deepfakes

    • @samazwe · 4 years ago · +91
      That'd be soooo meta
    • @FIGHTwPRIDE · 4 years ago · +25
      Oh you took the Red Pill 👍
    • @MimixLight · 4 years ago · +9
      Duncan Kim Can someone please explain the "red pill, blue pill" thing? Thanks
    • @MortyMortyMorty · 4 years ago · +9
      Tbh if we live in a simulation we may all be deepfakes in a computer. Oh wait, I didn't want to go this deep.
    • @anneemull · 4 years ago · +3
      MimixLight it's from the storyline of the movie "The Matrix"

  • @flacadiabla3193 · 4 years ago · +2520
    This is why I've never been a fan of showing off my life online.

    • @Guest-sl2qb · 4 years ago · +64
      The suspect could've used her face from her movies..
    • @Theblueseaweedsoawesome · 4 years ago · +80
      Nobody care about ur life
    • @SR009s · 4 years ago · +59
      Who tf is going to deepfake you?
    • @FreakieFan · 4 years ago · +88
      @SR009s That's probably what Noelle thought too, but she got deepfaked anyway.
    • @RenbroNL · 4 years ago · +22
      Yeah, well, that is only going to help people for now. This technology will advance, and in the future it won't require a big social media presence; maybe even 1 photo will do.
      And even then, a big social media presence doesn't justify this at all.

  • @spooderman2616 · 4 years ago · +1777
    I think my Dad is a deepfake, I haven't seen him in 9 years.

  • @starcherry6814 · 4 years ago · +576
    Abusive exes could use this on their victims

    • @penname8441 · 4 years ago · +69
      That's the terrifying part.
    • @fleshtaffy · 4 years ago · +6
      Could be a worthy punishment
    • @fcgHenden · 4 years ago · +80
      @fleshtaffy I hope you just misread that. 😬
    • @FoxyNinetails · 4 years ago · +36
      From what I've seen, that's already happening. It should have been made illegal yesterday, but it's not. Allowing deepfakes of people like this will only feed into potential rapists and abusers, making them more likely to commit a serious crime later on with the real deal.
    • @fleshtaffy · 4 years ago · +1
      @fcgHenden Nah. It's quite interesting seeing how white people justify things.

  • @bruno3 · 4 years ago · +664
    Celebrities are the ones most at risk, but not the ones who will suffer the most. They have the means to clarify the situation and reach people through the media. Other people don't. They won't have experts analysing the video, or popular websites and media outlets clarifying the situation. Most people will likely believe it's the real deal, or at least consider the possibility of it being real. Not that you couldn't already do this to pictures with Photoshop and similar software; it's just that most people didn't know you could now do it to videos as well, and it's a matter of time until they do.

  • @princeshezy007 · 4 years ago · +697
    Sadly, the more awareness this topic gets, the more traction those websites will get. It's a lose-lose situation to fight against the internet.

    • @tc2241 · 4 years ago · +75
      Yup, the more you discuss it and try to ban it, the more it grows. We should use this time to develop tools to identify deepfakes, as its explosion is right around the corner.
    • @daniilgrankin7658 · 4 years ago · +30
      Exactly mate. It's time to think about how to live with that problem, not how to prevent it
    • @Plasticityy · 4 years ago · +3
      I agree
    • @hetmonster2 · 4 years ago · +4
      @tc2241 Already exists.
    • @tc2241 · 4 years ago · +3
      hetmonster2 ya, but it's not in heavy use, like detecting counterfeits is. It should be built into video players, uploaders, browsers, etc. We're nowhere near that

  • @JoLum_doesnt_eat_cheese_a_lot · 4 years ago · +526
    imagine victim blaming someone for having a face ☹️ taking a selfie is 'asking for it' now?

    • @hmbs1630 · 4 years ago · +162
      and you can be guaranteed men are currently jumping through the mental hoops required to both justify it and blame women for it.
    • @BaM7Up · 4 years ago · +4
      T K yes
    • @tostupidforname · 4 years ago · +17
      It's obviously not the victim's fault, but yes, everybody should be ready for stuff like this when making your data public.
    • @fleshtaffy · 4 years ago · +10
      The feminists are rampant in 2020. You guys are like little flies.
    • @crogers3602 · 4 years ago · +91
      @fleshtaffy you're a misogynist, and a poor excuse of a man

  • @vincenttjia · 4 years ago · +1435
    Tiktok: booming
    Deepfakes: it's free real estate

    • @megaraph5551 · 4 years ago · +3
      🤣
    • @marcya4428 · 4 years ago · +1
      @Lennard222 Still not ok buddy
    • @jonny__b · 4 years ago · +12
      Do people even make original jokes anymore?
    • @cshaps1212 · 4 years ago
      *social media: Booming*
    • @ccricers · 4 years ago · +1
      From social media to solipsism

  • @louispride7695 · 4 years ago · +578
    This is disgusting. Even though I'm not a woman, this makes me want to delete or at least minimise the use of my social media.

    • @IsThisRain · 4 years ago · +25
      I agree. If it makes you feel better though, @iamdeepfaker on Instagram uses the technology pretty responsibly. So this is a pretty double-edged sword, although I believe all of us can live without this technology.
    • @itdc2219 · 4 years ago · +16
      or just don't post pictures of you. i just post cats lol. do a deepfake of them
    • @hanhong2267 · 4 years ago · +20
      If you're concerned, you don't have to get off social media completely, but deleting any pictures or videos that aren't important off your social media account could minimize the chances of getting deepfaked.
    • @j.j.juggernaut9709 · 4 years ago · +6
      It won't be just women in danger because of these deepfakes.
    • @jonny__b · 4 years ago · +3
      Just delete it dude.

  • @standdbyme · 4 years ago · +834
    This is sick on so many levels.

    • @canti7951 · 4 years ago · +36
      It's disrespectful, but how is it an urgent threat though?
    • @neszt9325 · 4 years ago · +27
      welcome to the internet
    • @a2pabmb2 · 4 years ago · +36
      Someone clearly isn't familiar with Rule #34
    • @gozolino · 4 years ago · +35
      @a2pabmb2 Cartoons aren't the same thing as a deepfake.
    • @desiree7633 · 4 years ago · +70
      @canti7951 these people did not consent to this. that's the problem. it could ruin their lives

  • @Sela2125 · 4 years ago · +409

    I think this gets into the question of “just because you can, should you?” Novel technology has allowed for scenarios that we could scarcely imagine just a few years ago. We have seen this technology used by Hollywood to bring back stars that have passed away. The ethics of that are still being debated, but at least the parties involved (to my knowledge) have given consent. So, there are arguably benevolent uses for this particular technology. What mechanisms, be they legal, technological or otherwise, can we use to protect people’s privacy and self-ownership while not abridging the pursuit of knowledge upon which technology is built? How do we get people to take a step back and think, “should I do this?”

  • @goodman25 · 4 years ago · +596

    Good thing I have no face.

  • @lifevest1 · 4 years ago · +108
    Man, this has the makings of an incredible Black Mirror episode. This is some messed up sh*t!

    • @patrickb1439 · 3 years ago · +4
      They've already done it - "Be Right Back"!

  • @Yaruh · 4 years ago · +674
    The people who discovered deepfakes were obviously looking for deepfakes

    • @friggasring · 4 years ago · +47
      Actually, the same Bill Hader clip they showed in this came up in my YouTube recommendations because I was watching Bill Hader clips. I'm sure I had heard of deepfakes before, but that was the first time I was really exposed to them.
    • @sebastian-benedictflore · 4 years ago · +69
      Referring to Ashton Kutcher "stumbling upon" Kristen Bell's deepfake?
    • @ropytube · 4 years ago · +19
      Sebastian-Benedict Flore he has an organisation named Thorn, look it up
    • @crimsoncloud6352 · 4 years ago
      Yeah lol her husband...
    • 4 years ago · +32
      @sebastian-benedictflore Ashton Kutcher has a foundation called Thorn that specializes in tracing kidnapped children on the dark web to save them from child prostitution and trafficking, so it is entirely possible that he knew about Kristen Bell's deepfake because of this

  • @MrBristolian · 4 years ago · +69

    I have huge respect for Kristen and Noelle for talking so calmly and authoritatively about something that must have been so upsetting.

  • @AS-ke6co · 4 years ago · +275
    We're providing way too much data on the internet.

    • @luziealyssa5677 · 4 years ago · +29
      But that's not the problem here; the problem is that people abuse this data
    • @Sovereignless_Soul · 4 years ago · +25
      @luziealyssa5677 you can't stop people, but you can stop uploading data.
    • @bubrub5564 · 4 years ago · +5
      @Sovereignless_Soul Which just defeats the point of the internet.
    • @perisaizidanehanapi7931 · 4 years ago · +15
      @Sovereignless_Soul Interesting point there, but unfortunately some people do their job through social media presence. So I think blaming the victim is really not the final answer we want to pursue.
    • @leelee-rf6px · 4 years ago
      @Sovereignless_Soul The internet wouldn't be all that interesting then, would it

  • @williamhild1793 · 4 years ago · +31
    I'm old. I didn't grow up with computers. Never in a million years would a 25-year-old me have believed that many years in the future, something like this would come about. Just beyond belief.

    • @Chan-md2hb · 1 year ago
      have you never heard of photoshop? XD

  • @Laurence2000 · 4 years ago · +102
    "I'd be thrilled that someone found me attractive"
    I so badly want to put this claim to the test.

    • @d4vian398 · 3 years ago · +34
      The ones who say that are probably incels.
    • @stitches768 · 3 years ago · +7
      @d4vian398 100%
    • @temtem9255 · 3 years ago · +6
      @d4vian398 or someone with a different opinion?
    • @MA-yu2ss · 2 years ago · +3
      @temtem9255 who will be degraded more?
    • @bimbo6861 · 2 years ago
      it's because they're imagining an attractive woman behind that "someone", not an obese, sweaty, 45-year-old man. the latter is our reality.

  • @adityasuri4009 · 4 years ago · +114

    I feel so bad for this woman. People are so cruel

  • @zakki5630 · 4 years ago · +205
    The people who made those deepfakes are definitely going to the bad place

    • @bidbux9500 · 4 years ago · +36
      Dude, EVERYONE is going to the "bad place".
    • @MaDKrEvEdKo · 4 years ago · +13
      Yeah man, seeing how 2020 is going, we have all gone to the bad place.
    • @canti7951 · 4 years ago · +2
      This sounds like satire. I hope it is.
    • @fleshtaffy · 4 years ago
      This is sarcasm. The vast majority of us will be there.
    • @fcgHenden · 4 years ago
      @bidbux9500 He didn't know. 😨

  • @clp480 · 4 years ago · +61
    Imagine if this is used to frame someone for a crime

    • @PHlophe · 4 years ago · +9
      china is making good use of it
    • @nameq · 4 years ago · +2
      more like imagine evidence no longer being credible when all you have is some media recordings
      like... imagine people logically evolving
    • @akzebraminer · 3 years ago
      Videos and audio would need to have a certified and trusted source in the future, rather than having the video or audio prove itself on its own.

  • @duyguozkann · 4 years ago · +78
    The sad smile on Kristen Bell's face at the end got me :(

  • @therealmykag · 4 years ago · +115

    We were so excited when we got the rainbow barfs and dog ears... and now here we are.

  • @paulfernandez8820 · 4 years ago · +68
    Kristen Bell is so calm; I like how she was interviewed

  • @imo.o · 4 years ago · +53
    Of course it's mostly happening to women. I'm annoyed that I'm not surprised.

    • @deiandorinha1207 · 4 years ago · +28
      as always we are the ones being abused by whatever new thing men find to objectify us and use our bodies against our will for their pleasure with no consent as usual... not shocked, sadly 😔
    • @deiandorinha1207 · 4 years ago · +4
      @Bugler55 yeah... so an isolated case defines it all, doesn't it... think about the majority of the cases and not only about what you choose to see
    • @deiandorinha1207 · 4 years ago · +2
      @Bugler55 would you judge a guy the same way if he was doing the same thing? just asking, really
    • @nameq · 4 years ago · +3
      well, it couldn't be a surprise. it's not like with deepfakes being invented there's more evil in the world. it's all the same old evil (people) playing with its new toys, not thinking about the others.

  • @timmm8686 · 4 years ago · +326
    About time that somebody talks about it. It's been happening for years

    • @harrylane4 · 4 years ago · +11
      This has been in headlines for, like, three years.
    • @timmm8686 · 4 years ago
      @harrylane4 never saw it. Only individual cases in which stars talked about it

  • @skywriting33 · 4 years ago · +31
    Another reason to keep your children's faces off social media.
    Parents plaster their pages with pics of their kids. If they do this with adults, why not children? Very scary.

  • @mmmfuhlendorf · 4 years ago · +19
    William Gibson wrote about this 20 years ago; people thought it was exaggerated...

  • @redoktopus3047 · 4 years ago · +34
    And people made fun of me when I was super concerned about everyone sharing everything online

    • @sfullernj · 1 year ago
      We should have all listened to you

  • @daviddima6067 · 4 years ago · +108
    I thought staying inside was safe enough for me; turns out I'm wrong

  • @aryant1884 · 4 years ago · +6
    You cannot stop this. Just stop posting thousands of photos of yourself online. For a flawless deepfake you need a huge amount of data to train your model with. Celebrity photos can be found easily online; that's why it's easier to produce deepfakes of them.

    • @mended8774 · 4 years ago
      Even if it becomes illegal it's still going to happen. It is just sad
    • @kwirro · 4 years ago
      Deepfaking technology is getting better and better. From my phone, I could deepfake someone with just one photo of them. It's not that good NOW, but 2 years ago it was completely unthinkable.
    • @aryant1884 · 4 years ago
      @kwirro That's why I added the word "flawless", or you could say "indistinguishable". Not posting photos is just a prevention. There is no cure. The only possible solution I see is to design algorithms to identify deepfakes, the way we tackle viruses with antiviruses.
    • @kwirro · 4 years ago
      @aryant1884 Ok, true.

  • @crunchypastries713 · 4 years ago · +109
    I feel super bad for the victims. Imagine it going viral, and toxic people bullying them. It will deeply impact them in a negative way and may bring depression into their lives. And the people who are making these deepfakes are very inhumane. Humanity is messed up smh

    • @deiandorinha1207 · 4 years ago · +8
      these people might feel the impact of this for their entire lives... it's simply sad
    • @nameq · 4 years ago
      it won't be as impactful once everyone knows that media (previously text and photo, now video) is mostly not credible

  • @charlieliang6883 · 4 years ago · +170

    Scary.

  • @m.a.3322 · 4 years ago · +65
    Wow, what has humanity come to? This is dark and unacceptable.

    • @Terjecs · 4 years ago · +22
      this isnt close to the worst people have done lol
    • @m.a.3322 · 4 years ago · +6
      @Terjecs lol no, this is pretty fkn malicious.
    • @yellovvdoe · 4 years ago · +16
      @Terjecs still, this is horrible. Nobody should be doing this to anyone.
    • @thatsterroristsbro7855 · 4 years ago · +4
      No, this is mankind. Again.
    • @Munchausenification · 4 years ago · +4
      It's actually almost a historical thing. Back in medieval times you had pretenders (people who look and sound like heirs to the throne of a country) imitating them for personal gain of power and wealth.

  • @NasaRacer · 4 years ago · +101
    This feels just like the concern and outrage we had when Photoshop became a big thing. It turned out there wasn't much that could be done about it, and we all got used to it. Sadly, this is most likely what will happen with this technology as well. No putting the genie back in the bottle.

    • @chaseleswift252 · 4 years ago · +4
      Yeah, right now there are ups and downs to this technology that are emerging; still, we have the opportunity to learn from the past and control this. I wouldn't be surprised if something big happens again by this point. Hopefully nothing bad will happen. :/
    • @arielhernandez1638 · 2 years ago · +2
      Yeah, I agree that it's not nice, and I'm sorry that this is happening, but there is really no way to stop this completely, except in ways that would be too extreme. You can discourage it, but not stop it entirely. It's just not worth it to try and make a big deal about this, because the ONLY way to stop this is to have an extremely controlling internet that isn't free anymore. It's sad, but you need to be resilient and just deal with it.
    • @glutenfree7057 · 1 year ago · +2
      You could put regulations on it. No need to get all "we can't do anything about it", because we can say that about literally everything. "We can't do anything about stealing because people are gonna do it anyway" and the like are just weak excuses.

  • @ErraticMagics · 4 years ago · +15

    The Amish were right

  • @rinopw4262 · 4 years ago · +81
    Why am I afraid this vid actually gave ideas to some people who hadn't thought about this..

    • @canti7951 · 4 years ago · +3
      You're afraid because you know you're right.
    • @abb4538 · 4 years ago · +18
      @2NVS basically make sure there are no videos or multiple candid photographs of you online - which is almost impossible when social media is so prevalent in our society. It's frustrating that the video's conclusion was essentially "you could be next lol good luck"
    • @tostupidforname · 4 years ago · +1
      @2NVS not much is needed.
    • @tostupidforname · 4 years ago · +3
      Nah, deepfakes are well known to everyone even remotely interested in machine learning.

  • @allencraig02 · 4 years ago · +6
    It's not "the internet" that needs to be kinder or more considerate. It's people. Specific people making specific decisions to do something that's scumbaggy. Many people are scumbags, and the internet allows them to be their worst selves anonymously - but it doesn't MAKE them scumbags, it only gives them a vehicle to have an audience.

  • @Aero7SVR · 4 years ago · +10
    Privacy will become a thing of the past in our high-tech future.

  • @desmonddesjarlais2697 · 4 years ago · +9
    This is an unfortunate situation to be in. Having your identity abused like this.

    • @christhelostsoul9927 · 4 years ago · +2
      Unfortunate but unsurprising. It started with photoshop, and then it was only a matter of time before it moved to video

  • @pavarottiaardvark3431 · 4 years ago · +35
    This raises some serious and concerning questions about the nature of the digital self. Who owns images of us? Do we own the idea of our likeness?
    If so, do people have the right to take our photo in public?
    But if NOT, would we also not have the right to stop deepfakes made from those photos?

    • @deceiver123m · 4 years ago · +3
      Some celebrities took steps to patent their likeness etc. An early bid to be proactive in this new era.
    • @deceiver123m · 4 years ago
      But can a bright up-and-comer beat a celebrity to patenting their likeness? $$$
    • @RampageG4mer · 4 years ago · +5
      You absolutely do not own your likeness
    • @TengkuAmier · 4 years ago
      @RampageG4mer Religiously maybe, but in scientific terms yes

  • @Razear · 4 years ago · +11
    When you give people tools this powerful, it sheds a light on the darkest crevices of humanity.

    • @sfullernj · 1 year ago
      Crevice 🤣 great choice of wording

  • @tc2241 · 4 years ago · +50
    Seeing the comments, I'm actually surprised by the number of people who haven't known about this. It's been talked about and documented for years now. I guess some people have to wait for things to hit their 'social bubble'.

    • @fleshtaffy · 4 years ago · +6
      Most people are only aware of what's being talked about in their lunch room or what's in the headlines. I envy them though. Ignorance truly is bliss.
    • @leilanidru7506 · 4 years ago · +2
      I'm not surprised by the number of people who either aren't taking this seriously or don't really see it as an issue
    • @tostupidforname · 4 years ago · +1
      yeah me too
    • @leilanidru7506 · 4 years ago
      Bugler55 I'm not surprised by your lack of surprise. The nonchalance toward women's images being sexually exploited and degraded online without their consent is expected.

  • @Sorrys0rry · 4 years ago · +56
    The world feels like a giant dumpster fire right now; it is scary.

    • @MaryLeighLear · 4 years ago · +2
      My dream is to be off the grid within 10 years.
    • @nameq · 4 years ago · +1
      you are forgetting that you are living in the best times ever, judging by materialistic standards. before, it was much much harder.

  • @Hdidbi_3049 · 4 years ago · +10
    tiktok is literally a breeding ground for deepfake videos, all these teens. sigh.

    • @stitches768 · 3 years ago
      Christ, you're right
    • @akzebraminer · 3 years ago
      It's also a national security threat

  • @wearelegion1163 · 4 years ago · +27
    This is one of the reasons I don't ever post pics of myself online. I was even fired from my job of 15 years because I told my new boss I don't allow my picture online, when he wanted to post pics of his staff. Others declined, but I got fired.

  • @uss_04 · 4 years ago · +50
    01:20
    "This is my face, this belongs to me"
    Deepfakers:
    "All your face are belong to us"

  • @davidmelgar1197 · 4 years ago · +3
    What's happening to adult women is horrifying, but now I'm thinking of all the naive, ignorant parents uploading reams and reams of photos and videos of their young kids... :(

  • @delorbb2298
    @delorbb2298 4 years ago +7

    Why is this the first thing some men think of? And don't come for me because I said "men". We all know that's where it started.

    • @theflamethrower867
      @theflamethrower867 4 years ago

      delor b and who says it doesn’t happen to men as well

    • @delorbb2298
      @delorbb2298 4 years ago

      @@theflamethrower867 JEEBUS. Please go back and RE-READ what I posted.

    • @theflamethrower867
      @theflamethrower867 4 years ago +1

      delor b after you read what I said
      Blunt accusations/statements don’t mean anything

  • @persiadance7605
    @persiadance7605 4 years ago +26

    I wonder how they find these links

    • @derpatel9760
      @derpatel9760 4 years ago +5

      Research, probably. Don't ask what the research was.

  • @Alexander-xo5ho
    @Alexander-xo5ho 4 years ago +26

    It may be seen as very incriminating now, but I think that as the technology gets more common, people will start to distance identities from images, videos, etc., and in time nobody will regard any image or video as relevant, regardless of its realism. This will likely be very detrimental to many systems that rely on identifying people from images and videos, like the legal system.

    • @sentinel9651
      @sentinel9651 4 years ago +2

      This sounds philosophical.

    • @HiAdrian
      @HiAdrian 4 years ago +1

      @@sentinel9651 Sounds very plausible actually. Once fakes become common, people will start doubting visual material; they have to.
      It would be ideal if people could stop caring (different value system around sexuality), but that might run too much against human nature.

  • @terrainvictus1210
    @terrainvictus1210 4 years ago +73

    Good thing I only post memes.

  • @IchabodNyx
    @IchabodNyx 4 years ago +76

    I'm just impressed you got Tom Cruise's consent to having his face digitally edited into this video.

    • @chinguunerdenebadrakh7022
      @chinguunerdenebadrakh7022 4 years ago +1

      Did they really get the consent tho?

    • @blakedake19
      @blakedake19 4 years ago +11

      Exactly, only for women VIPs there is a problem, for everyone else shut up and let the internet flow

    • @EddieMachetti
      @EddieMachetti 4 years ago +7

      But Tom Cruise is a man, so he and how he feels don't matter.

    • @tostupidforname
      @tostupidforname 4 years ago +2

      good point actually.

  • @raydhanitra
    @raydhanitra 4 years ago +69

    "I found your deepfake"
    "Errr actually that was real"
    :O

    • @360stab
      @360stab 4 years ago +5

      People don't look at it from this side. It will give everyone a way to deny embarrassing videos.

    • @Toarcade
      @Toarcade 4 years ago +4

      ".....yes.... right... the deepfake..."

  • @CH-vr2dl
    @CH-vr2dl 4 years ago +8

    This has been a problem for like 4 years already...

    • @seanbelkom9094
      @seanbelkom9094 4 years ago +3

      More than 4 years. People have been photoshopping celebrities onto naked bodies ever since image editing was a thing. I was gonna say "ever since Photoshop was a thing," but I think there was image editing software before Adobe Photoshop. Deepfakes are still pretty new, though.

  • @patrik5123
    @patrik5123 4 years ago +38

    And people wonder why I have a cat as avatar...
    I still think politics is more of an URGENT threat as that could potentially threaten every person in the world, but sure, on a personal level this is incredibly damaging.

  • @himynameis3102
    @himynameis3102 4 years ago +10

    I feel like if this was happening to men the issue would be taken way more seriously. But as 99% of the victims are women, it’s just swept under the rug like all of our other issues.

    • @jaydeepbose4501
      @jaydeepbose4501 4 years ago +2

      Of course, now go to the kitchen

    • @tostupidforname
      @tostupidforname 4 years ago

      Why would that be the case?

    • @christhelostsoul9927
      @christhelostsoul9927 4 years ago

      It does happen to men, just in different ways, and honestly you can't really stop deepfakes, unfortunately

  • @pyramid_iremide
    @pyramid_iremide 4 years ago +19

    This is nowhere near the same thing, but it reminds me of when some people would put on realistic masks of black people and rob banks. Black people were actually getting arrested for crimes they didn't commit

  • @ninnikins4768
    @ninnikins4768 3 years ago +2

    That's disgusting. Such a wonderful technology and people use it to abuse women.

  • @likira111
    @likira111 4 years ago +20

    I like Cr1tikal's takes on this, "now if I ever get caught doing anything, I can just say it's a deepfake".

    • @ekaterinavalinakova2643
      @ekaterinavalinakova2643 3 years ago

      I guess there is an upside to most things. But I'm still way more concerned about this being used to frame people for heinous crimes.

  • @nutbreaker7482
    @nutbreaker7482 4 years ago +7

    I don't know what's real anymore

    • @RezValla
      @RezValla 3 years ago

      None of us do. A UFO could land in the middle of Times Square and we'd never be able to tell if it really happened.

  • @phillywilly4155
    @phillywilly4155 4 years ago +6

    That's scary because it could mess up your whole life.

    • @temtem9255
      @temtem9255 3 years ago +2

      I mean, as the problem expands the videos won't be taken seriously anymore and the problem will kinda fade away

  • @abhijitmeti1611
    @abhijitmeti1611 4 years ago +9

    6:15 I wish the internet was more responsible and kinder.. 😔

  • @pmm1767
    @pmm1767 4 years ago +60

    Don't sort by newest first, DON'T SORT BY NEWEST FIRST

  • @gasdive
    @gasdive 4 years ago +3

    Missed the impact on the actors who are having their work stolen and their identity erased. There's more than one hurt along the way here.

  • @sebastian-benedictflore
    @sebastian-benedictflore 4 years ago +5

    Why does Ashton Kutcher know about Kristen Bell's deepfakes?

  • @crazziii_
    @crazziii_ 4 years ago +3

    There's so much twisted stuff in the world that I just don't know how to live my life anymore.

  • @Crick1952
    @Crick1952 4 years ago +22

    Rule 34 is always in effect

    • @DawingmanT900
      @DawingmanT900 4 years ago +13

      @jami0070 oh you sweet summer child....

    • @Crick1952
      @Crick1952 4 years ago +9

      @jami0070 I can't destroy something so precious.
      Stay golden Ponyboy, stay golden

    • @christhelostsoul9927
      @christhelostsoul9927 4 years ago +1

      Only drawn content exists in rule34...

  • @stainlessstove4629
    @stainlessstove4629 1 year ago +2

    Anyone else think that you just need to be responsible for what you post on the internet? I.e., you don't need to post selfies on Instagram.
    But it is so scary.

  • @oli9757
    @oli9757 4 years ago +4

    As a kpop fan, this was brought to many people's attention recently through some channels I watch. It's really something everyone should know about, because it can affect every celebrity and every person with their face online, honestly. It's just disgusting that people think they can do it, and I really hope people don't think this is actually okay. Gotta give it to Vox for providing information that many people should know about; the internet's just a dark, scary place.

  • @yangsinful
    @yangsinful 4 years ago +19

    I guess women are gonna be mostly impacted by this

  • @brooklynyte
    @brooklynyte 4 years ago +5

    It's horrible that this is happening, but are you really saying that this is a bigger impact than politics?? Come on now....

  • @warbler4954
    @warbler4954 4 years ago +1

    If we have freedom of speech, privacy is the freedom of not speaking.

  • @outlinedcord7254
    @outlinedcord7254 4 years ago +3

    The title of this video is pretty click-baity. Aside from that one statistic about the percentage of pornographic vs non-pornographic deep fakes, the video doesn’t argue that the political implications of deep fakes are less important.

  • @althaf198
    @althaf198 2 years ago +2

    It feels like my mom and sister aren't safe anymore

  • @opalindigo2984
    @opalindigo2984 4 years ago +6

    This is one of the many reasons I quit social media. My privacy is more important than validation.
    I rest my case.

  • @lilyvalley5389
    @lilyvalley5389 3 years ago +4

    So sad & like disrespectful to people & wrong. Very dangerous.

  • @kamranakrami3
    @kamranakrami3 4 years ago +5

    Yeah yeah.. we all know your "friend" didn't tell you about those videos of your wife

  • @jabberwockydraco4913
    @jabberwockydraco4913 1 year ago +1

    Using deepfakes to, say, escalate a war feels a bit more serious.

  • @monday8585
    @monday8585 4 years ago +30

    me when I read the title:
    vox: The most urgent threat of deepfakes isn't politics
    me: oh I know vox, I know

  • @brandonmccoy2894
    @brandonmccoy2894 11 months ago +1

    I usually don’t care too much when celebrities complain or speak out about something that’s damaging their image or reputation, but this one I really sympathize with. How this isn’t illegal is beyond me. I understand why someone might find an actress super attractive, but this isn’t how you go about it. Just use your imagination instead of defaming them on the internet where anybody can find it. This is just wrong; it has to be defined as defamation in some form.

  • @firehot9578
    @firehot9578 4 years ago +3

    The fact that it originated as a name on Reddit is scary

  • @djvelocity
    @djvelocity 4 years ago +2

    Sorry everybody, but the genie is out of the bottle at this point. This technology has been refined over the past decade and it is now here to stay, disruptive as it may be. All of us are vulnerable to its predation because of the democratization of this technology, in terms of who has access to it and who can leverage it, and because in the future it will be as easy as opening a video and a few social media photos of somebody within a smartphone app. We need to get comfortable with the idea of this kind of illusion in videography. Trying to rail against it with social justice, moral repudiation, and legislation is pointless and futile; it's like shoveling water. The technology is here to stay, so all the public can do now is be vigilant and educated so as not to be fooled easily. This is honestly the only path forward.

    • @BBGUNGAMER110
      @BBGUNGAMER110 4 years ago

      Finally someone who understands

    • @djvelocity
      @djvelocity 4 years ago +1

      goku uzuchia ha yeah bro, I constantly read, so I'm well familiar with this technology. It's nice that people want to hashtag their way out of this, but it's a pointless effort, eh?

  • @dottod952
    @dottod952 4 years ago +11

    4:05 Yep this video is old

  • @blah2064
    @blah2064 4 years ago +1

    Making or sharing a deepfake should be a criminal offense.

  • @MUTADARES
    @MUTADARES 4 years ago +10

    How do they find out🤔

  • @andyc2518
    @andyc2518 4 years ago +3

    It's sad how naive Kristen Bell sounds at the end of the video. Sad in the sense that something like that is considered naive when she's absolutely right. I, along with countless others, have seen just how immature, cruel, and insensitive people can be online, and there's no fixing that without violating privacy, and privacy is needed to protect those most vulnerable. Like with the riots currently going on, you can't quite fix it; you just gotta ride it out, and that's what we gotta try to do online. Ride it out and hope for the best in the end... even though humans don't really work that way.

  • @Luna-fo8np
    @Luna-fo8np 4 years ago +111

    bruh moment am i right guys
    edit: this has nothing to do with the video why did people like this ._.

  • @GlowingEagle
    @GlowingEagle 4 years ago +2

    Huge respect for Vox for putting a grid over their faces so as not to allow these video clips to be used as data for deepfake software

  • @TugaThings
    @TugaThings 4 years ago +5

    No one should be entitled to sexualize others or use their faces without consent; it is wrong. But the thing is, it's the internet, and unfortunately everything gets sexualized, from objects to people... We have two options: we learn to live with this, or we try to censor the internet, which is almost impossible

  • @KarlRamstedt
    @KarlRamstedt 4 years ago +1

    The genie obviously isn't going back in the bottle; people are not gonna stop doing this, even if it is outlawed.
    So labeling is probably the only realistic way to combat the misinformation.

  • @sarveshsawant9564
    @sarveshsawant9564 4 years ago +41

    I thought there was an expert in the thumbnail

    • @commentmachine1457
      @commentmachine1457 4 years ago +9

      she is an expert in acting though

    • @xxDOTH3DEWxx
      @xxDOTH3DEWxx 4 years ago +4

      Why would you expect that from vox

    • @perisaizidanehanapi7931
      @perisaizidanehanapi7931 4 years ago

      @@xxDOTH3DEWxx There is a researcher from Deeptrace at 1:30

    • @xxDOTH3DEWxx
      @xxDOTH3DEWxx 4 years ago +1

      @@perisaizidanehanapi7931 yes, but Kristen Bell is not

    • @luisuribe5457
      @luisuribe5457 4 years ago

      Actors have huge egos and think they're experts on everything... especially politics and climate change.

  • @ZainAli-nd9ke
    @ZainAli-nd9ke 2 years ago +2

    Technology where a person's reputation can be taken or misused should be banned, or made hard for the public to use.

  • @AJX-2
    @AJX-2 3 years ago +6

    When you put personal information online, you forever lose control of what people do with that information. Photos count as information. Nobody has any reasonable expectation of privacy online.

    • @ekaterinavalinakova2643
      @ekaterinavalinakova2643 3 years ago

      Photo sites NEED to start making it very clear what rights users are sublicensing by uploading images to their sites, in a way that isn't written in legalese, and end users NEED to start reading the TOS. Most people would never have foreseen that uploading an image of themselves by default sublicenses the company to do whatever it wants with said image.

  • @FinancialShinanigan
    @FinancialShinanigan 4 years ago +3

    Don't lie Vox, we all know that's Elizabeth Banks!