CNN reporter calls his parents using an AI deepfake voice. Watch what happens next

  • Published: Sep 10, 2024
  • Can CNN correspondent Donie O'Sullivan's parents spot the difference between their son's voice and an AI version? An online tool creates convincing AI audio fakes of anyone. Here's how the technology works and what's at stake. #CNN #News

Comments • 303

  • @bunyipdragon9499
    @bunyipdragon9499 1 year ago +56

    Oh great, when my family calls to borrow dollars I'll have to ask them three security questions 😜

    • @somethingelse4878
      @somethingelse4878 1 year ago +2

      Do what I do:
      say I don't believe it's you, and stop standing in front of the TV

    • @antoinetteboswell2948
      @antoinetteboswell2948 1 year ago

      You got that right. It's sad what this country is coming to, that you would have to ask your family questions only you and they know.

    • @bunyipdragon9499
      @bunyipdragon9499 1 year ago

      @@annasalvinski4522 thank you 💜

  • @shawnnewell4541
    @shawnnewell4541 1 year ago +19

    Boy, is this scary. Why is this even necessary?

    • @kpepperl319
      @kpepperl319 1 year ago +6

      Because people are willing to do anything to get money... It's always about money

    • @veganath
      @veganath 1 year ago +1

      I lost my Mum during the pandemic; I won't let them steal her from me!! There you go, a necessity for a good outcome. Like ALL technology, it can be used for human concern or not!

  • @TH-tl6sy
    @TH-tl6sy 1 year ago +48

    With deep fakes of faces and voices it definitely puts a lot of security measures at risk. Things that use facial recognition or voice recognition as security authentication.
    Personally, so far I've always been able to tell when it's a computer-generated voice because of mispronounced words and that slightly monotone sound. Lack of inflection. But I have no doubt this will be addressed before long. And that's scary.

    • @quantumfineartsandfossils2152
      @quantumfineartsandfossils2152 1 year ago +1

      This makes all forms of surveillance extremely important but mostly in the choices you make forcing you to be a reliable observer and to be able to *prove you were ever physically around anyone*, your neighbors also. Obviously if you have no proof you were ever around anyone (requiring erasing all the people who ever knew you too) and if there's no proof as to how you treat others, using their voice in this way actually exposes you & you set off alerts against others who know where you actually are & were. This can make fighting crime less dangerous & it can pinpoint alert you that a duplicate is making noise in the system thereby giving away its location which will always be different from where you actually are. The only way this would still not work is if someone deep fakes you while standing next to you.

    • @joey.99
      @joey.99 1 year ago

      Yeah AI voices can be extremely destructive

    • @quantumfineartsandfossils2152
      @quantumfineartsandfossils2152 1 year ago

      @@joey.99 Joey, you are impossible to deep fake because everybody knows where you are. If they didn't, you would not have a Social Security number, you wouldn't have a driver's license, you wouldn't have a license plate, you wouldn't have a name, you wouldn't have a date of birth, you wouldn't have a memory and an experience of your environment and everyone around you. You yourself would not be a witness that others can benefit from; you yourself would not be a witness of other people, many of whom you know but maybe not on a name basis, whom you can witness and defend. So if anyone, no matter who they are, no matter where they are located, makes a deep fake of you, it will alert an entire complex surveillance system as to the location of that duplicate while simultaneously observing your existence, because you exist at the exact same time, all without invading your privacy, in fact enhancing your privacy on purpose, because you'll make more money the more space you can create for others. It is because you need more freedom and more safety in order to use biometrics and practice quantum science, which is why we're here and we don't know why and we're trying to find out.
      This will make it extremely easy and possible to co-opt gossipers like Alex Jones in the system, who are gossiping, perhaps about you, or spreading disinformation; that will cost them money, and soon there will be cybercrime cop shows. The people who make deep fakes of you, obviously having no idea who you are, who knows you, and what you're doing with your life, will be taken to get a mental health check-up. We don't even need prisons or punishment anymore; a lot of criminals and abusers who commit crimes need psychiatric help and mental health help, and we can cure a lot of brain diseases. We are all born with diseases; there's no person that has ever been born without diseases. In fact everything has diseases: plants have diseases, trees have diseases, paintings have diseases, literally everything.
      So if this is implemented for you, if it's something you're interested in, it will help you understand why you have free will, and why when you pick up a glass you're picking it up and when you put it down you put it down, why those interactions and entanglements happen and why they are real, and no one else will be able to do that because they are not you. And if they try, you and everyone in the system will be alerted immediately that some delusional clone is using your voice, and they will have to be held accountable, or pay a charge, or stop doing that, when they have no proof that they know who you are or why. Most criminals are always someone we know or someone who has encountered us.

    • @quantumfineartsandfossils2152
      @quantumfineartsandfossils2152 1 year ago

      When there are two locations for one person, then you can finally hold the criminal accountable, because you know where they are, because you have to know where they are; they have to pretend to be a real person in order to impersonate you. If someone tries to authenticate you, they will alert a system, and using your location, others will have the surveillance and the proof they need to act and make sure that person stops; otherwise you never would've noticed them before. This is why we're having a disinformation revolution that actually requires more freedom, not less. This is because very few people are naturally honest, truthful, factual people who corrected their mistakes; they can't help themselves, so that's why they steal your voice and try to sign into something else, alerting the FBI where they are, because the FBI knows where you are.

    • @joey.99
      @joey.99 1 year ago

      @@quantumfineartsandfossils2152 thanks for the essay

  • @antoinetteboswell2948
    @antoinetteboswell2948 1 year ago +22

    And I think it's very irresponsible and troubling to think that someone could be saying something as though I said it when I didn't even say it. It's horrifying to me.

  • @CollinSamatas
    @CollinSamatas 1 year ago +46

    Goodness, that is terrifying. If it's this good now, how will it be in 10 years? 20? Really terrifying.

    • @annasalvinski4522
      @annasalvinski4522 1 year ago

      i like your emails !

    • @theghostofalsimmons5737
      @theghostofalsimmons5737 1 year ago

      This, along with deepfake video, is possibly the most dangerous tech being developed and perfected. These folks are laughing and treating this as something fun and entertaining, but the implications of this for fraud, the spreading of misinformation and enhanced propaganda are immeasurable. Folks can literally rewrite history with this tech by subtly or completely changing historical speeches and interviews. To not be able to decipher what's real and authentic versus what is not is already problematic in western culture; this tech will amplify that confusion tenfold. But no big deal, right, because it's fun. This may be the catalyst of a complete takeover of misinformation.

    • @robdisner
      @robdisner 1 year ago +1

      It’s going to be a total hellscape.

    • @HifromBob
      @HifromBob 1 year ago

      Oh Lord, I like that you say "Goodness" :-)

  • @maddiethornhill9853
    @maddiethornhill9853 1 year ago +14

    A couple of giggling schoolboys playing with a shiny toy that has a hand grenade buried within it. What could possibly go wrong?

  • @chelittle6433
    @chelittle6433 1 year ago +21

    Some technology should just not be allowed period.

    • @dianagross8784
      @dianagross8784 1 year ago +4

      I agree. We are not good at learning from our previous and current mistakes

    • @agf1700
      @agf1700 1 year ago

      Unfortunately there is no stopping it. Technology can be, and often is, a huge asset and wonderful in some respects, but… At a minimum, social media companies should be compelled to stop any minors from being able to get access to damaging content, and also sick and twisted content that's not good for ANYONE's mentality.

  • @the_brad_wilkinson_experim1307
    @the_brad_wilkinson_experim1307 1 year ago +20

    In the very near future we will truly have no idea what is real, and what is not.

    • @tree5013
      @tree5013 1 year ago

      So we are going to become democrats that think men can get pregnant and the masks work because CNN told them so?

    • @dsddala467
      @dsddala467 1 year ago

      People who watch Faux news already live in that world, as court documents have recently shown.

  • @Robin-bk2lm
    @Robin-bk2lm 1 year ago +13

    "We have what's called the liar's dividend. Anyone can plausibly deny reality."
    Should have started with that. It's bloody serious.

  • @hypnothetical9461
    @hypnothetical9461 1 year ago +30

    Imagine having to captcha check your family/friends in a phone call.

  • @stephenconsalvo
    @stephenconsalvo 1 year ago +11

    Voice over actors may be out of a job soon. People are going to have to get their voices legally protected.

    • @SarafinaSummers
      @SarafinaSummers 11 months ago +1

      Walt Disney tried to do that with the original voice actress for Snow White. NO shit. He tried to keep her from doing interviews.

  • @poodook
    @poodook 1 year ago +5

    Humanity is screwed

  • @swingtag1041
    @swingtag1041 1 year ago +11

    As usual, they leave you with no solutions.
    Here's a solution: set up a passphrase with your family members so they know it's you. If you're ever concerned that it might not be them, challenge them with questions that only they would know the answer to.

    • @TonyLenart
      @TonyLenart 1 year ago +1

      Good idea. But what if they record those, and use them against you another time?

    • @Robin-bk2lm
      @Robin-bk2lm 1 year ago

      That's not the point at all. No one cares if you can fool your family. The entire public can be disenfranchised with this. You will become a serf again with no rights if the rich use this to dismantle democracy.

    • @davidcat1455
      @davidcat1455 1 year ago

      @@TonyLenart
      If they're recording your private conversations, then deep fakes are the least of your problems. Why would they need to fake anything when they're getting it straight from the horse's mouth in the first place?
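
  A minimal sketch of the passphrase-challenge idea discussed in this thread (hypothetical names and values; it assumes the shared secret was agreed in person and is never spoken aloud, so a recording of an old call cannot simply be replayed, which speaks to the concern raised above):

      import hmac, hashlib, secrets

      SHARED_SECRET = b"our-family-passphrase"   # assumption: exchanged offline, in person

      def make_challenge() -> str:
          """The person receiving a suspicious call invents a fresh random challenge."""
          return secrets.token_hex(8)

      def respond(challenge: str, secret: bytes = SHARED_SECRET) -> str:
          """The caller proves knowledge of the secret without ever saying it aloud."""
          return hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()[:8]

      def verify(challenge: str, response: str, secret: bytes = SHARED_SECRET) -> bool:
          """A recording of a previous call is useless because the challenge changes every time."""
          return hmac.compare_digest(respond(challenge, secret), response)

      challenge = make_challenge()              # read this out over the phone
      answer = respond(challenge)               # the real relative computes this on their side
      print(verify(challenge, answer))          # True
      print(verify(make_challenge(), answer))   # a replayed answer fails

  A memorised family question works the same way in practice; the point is that the proof changes with every call, so an eavesdropper cannot reuse it.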

  • @Stevie_B_0828
    @Stevie_B_0828 1 year ago +5

    That.
    Is.
    Disturbing.
    😦

    • @johnperrry215
      @johnperrry215 1 year ago

      Look you old Queen you're just trying to be dramatic

  • @reidhulshof3645
    @reidhulshof3645 1 year ago +5

    No way they used the unlimited bacon bit as their example

  • @MaroonedInDub
    @MaroonedInDub 1 year ago +3

    Voice recognition is used as a security measure when telephoning banks etc. That will need to stop asap.

    • @SarafinaSummers
      @SarafinaSummers 11 months ago +1

      It never was a good idea to begin with.

  • @ssc4153
    @ssc4153 1 year ago +2

    This is dangerous.......

  • @dluv_98
    @dluv_98 1 year ago +7

    I really liked this story, very well put together, well balanced, not too cynical, not too hyped, and I love Anderson's reference to hearing a deceased loved one. Love that he can see the potential for good.

  • @ogbevalentine4160
    @ogbevalentine4160 1 year ago +16

    My mind goes to those terminators in the movies. James Cameron was really ahead of his time.

    • @sprish00
      @sprish00 1 year ago +2

      Have you heard of Sophia Stewart? I recently learned she wrote this book called "The third eye". That was (purportedly) stolen and used to create two of Hollywood's biggest franchises, THE MATRIX and THE TERMINATOR.

    • @ogbevalentine4160
      @ogbevalentine4160 1 year ago

      @@sprish00 omg, are you kidding?

  • @shannonfergusson978
    @shannonfergusson978 1 year ago +16

    Tucker, FOX, etc will definitely use this to their advantage

    • @jamesestelle7260
      @jamesestelle7260 1 year ago

      ruclips.net/video/DynOlXtlYTs/видео.html
      Here you go. Tucker deep faked!!!!!

    • @anacc3257
      @anacc3257 1 year ago

      Have they ever used faked footage?

    • @maddiethornhill9853
      @maddiethornhill9853 1 year ago +1

      Never mind FOX….how about N. Korea? Or any number of other bad players?

  • @7errafirma
    @7errafirma 1 year ago +11

    This should be immediately regulated and maybe even better, banned. Can you imagine how people can scam others, give them false bad news, etc.?

    • @robdisner
      @robdisner 1 year ago +1

      Even if we banned it here, what does it matter? It’s still going to be used elsewhere in the world to perpetrate scams like these.

    • @7errafirma
      @7errafirma 1 year ago +1

      @@robdisner It should be regulated worldwide.

    • @robdisner
      @robdisner 1 year ago

      @@7errafirma Do you know anything that is easily regulated worldwide?

    • @7errafirma
      @7errafirma 1 year ago

      @@robdisner Yeah, most things, health industry, food industry, construction, etc. I am surprised you can't. xD

  • @914WOL
    @914WOL 1 year ago +4

    This is damned scary; just think of the endless trouble it will cause.

  • @ninamo3523
    @ninamo3523 1 year ago +4

    Every family has words and phrases that AI wouldn't know. Time to keep them secret.

  • @O6i
    @O6i 1 year ago +2

    I haven't answered a single phone call in years, as I found out this technology has been around for a long time. I was getting calls from family and then later they told me they never called or spoke to me. Haven't answered the phone since.

  • @armandhammer2235
    @armandhammer2235 1 year ago +2

    The media is gonna use this technology.

  • @josephsonora3787
    @josephsonora3787 1 year ago +3

    I'm sure all the criminal minded are thankful for this segment Cooper! They're glad to know it works! 👍

  • @JohnJaneson2449
    @JohnJaneson2449 1 year ago +6

    Real-life meetings will be important again. All personal relationships and business deals will require meeting in person.

    • @GotoHere
      @GotoHere 1 year ago

      Meetings like fake conferences on fake climate change, where hypocrites like Joe Biden go to fall asleep?

  • @antoinetteboswell2948
    @antoinetteboswell2948 1 year ago +2

    I think it's dangerous. This voice-matching machine could get a lot of people into trouble with the government, the police, judges, teachers, and citizens of the United States for impersonating people's voices. It's just straight-out wrong to invent this kind of thing. 👎👎👎👎👎

  • @dawnoceanside7300
    @dawnoceanside7300 1 year ago +3

    It's why you don't record your voice for Google!!

    • @emaarredondo-librarian
      @emaarredondo-librarian 1 year ago +1

      Why would Google want to do anything with your voice? Are you secretly a millionaire so that multi-billionaire company would need to impersonate you to steal from you? Or are you president of some country, so Google would want to impersonate you - for some reason?
      The people at risk of being deep-faked aren't the ones recording their voices so the Google Assistant can understand them (more or less), but the ones speaking publicly or recording themselves and publishing that on social networks *anyone* can watch, and then upload the voice clips in *any* of the multiple AI websites already available to make them say anything they didn't. The people most at risk are the rich and famous, but above all, politicians and celebrities. Imagine a deep-fake of Greta Thunberg saying she loves muscle cars, Tom Hanks insulting someone, Al Sharpton making erotic advances, Mitt Romney ordering alcohol on the phone, Mike Pence saying he loves BDSM, Trump saying he ordered McCain to be killed... those are just some of the possibilities.
      What would Google gain by deep-faking us?

    • @theterminaldave
      @theterminaldave 1 year ago +1

      @@emaarredondo-librarian Did you know that you can access all of the voice searches you've done using your android phone? Ok, now pretend that someone hacks your account and gets access to that info. Now that person then uses your voice to social engineer whatever scheme they'd like to.
      It's not about "Google" having it per se; it's the fact that that info simply exists. But of course, if you were an activist in an authoritarian country, they could use the process I just laid out in my first paragraph to crush dissent and find others who you know.
      Now do you understand?

    • @emaarredondo-librarian
      @emaarredondo-librarian 1 year ago

      @@theterminaldave For starters, you must be aware that Google Support provides the steps to delete your voice recordings, in case you happen to become the underground leader of the resistance against a totalitarian regime - which takes the very long turn of stealing your Google searches to get your voice to get other people... when they can just get you, or any of your loved ones, torture you and/or them, get all the information they want and some more, and kill everybody in a fake shooting.
      You think I'm kidding? Nope. Chilean here. I grew up in a real-life dictatorship. A thing all of them have in common, they love the direct approach to political adversaries. Ask any of Putin's ones. The ones left alive.
      And regarding that "someone" hacking my account to "social engineer schemes," the question remains what *me* or *you* would be of any usefulness to *someone* in that regard. Are you a leader of public opinion? An influencer? Certainly I'm not. The "someone" would be very disappointed, after all the work, hacking, editing, etc., and getting no social engineering results whatsoever.
      Finally, why hack Google searches to get your voice? All "someone" needs to do is call you on the phone, chat with you at some cafe, watch anything with sound you have uploaded on social media, get forwards of your voice messages on WhatsApp, etc.
      The only way you can make sure no one will use your voice in some scheme is deleting absolutely every recording you ever produced, make all your acquaintances do the same with their copies, and make a vow of silence.
      You cannot trust anyone who asks you any question or addresses you in any way. Everyone who tries to make you talk could be secretly recording you.
      Now, go on with your life.

    • @theterminaldave
      @theterminaldave 1 year ago

      @@emaarredondo-librarian not sure if I'll get around to reading your book of a comment. Remember being concise will have a better chance of getting your point across.

  • @Syx7h
    @Syx7h 1 year ago +2

    Metal Gear Solid 2 was way ahead of its time

  • @ASMR-XI-ZUI
    @ASMR-XI-ZUI 10 months ago +1

    AI deepfakes of others should be illegal unless used with permission. Only you can use your own AI for personal projects, e.g. if you are a content creator you would be allowed to use your own voice and images.

  • @peggygallagher5802
    @peggygallagher5802 1 year ago +1

    Horrific!

  • @primordial_platypus
    @primordial_platypus 1 year ago +2

    Can a voice analyzer tell the difference?

  • @monkeybusiness2204
    @monkeybusiness2204 1 year ago +4

    They should create an AI to counter abuse of AI like the methods shown in this video.

    • @fjbfhb5017
      @fjbfhb5017 1 year ago

      Go FJB

    • @veganath
      @veganath 1 year ago

      A GAN (Generative Adversarial Network) is what I think you are referring to.
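
  The reply above names GANs: a generator and a discriminator trained against each other. The discriminator half is what "an AI to counter AI" looks like in its simplest form, a classifier that flags synthesized speech. A minimal, hypothetical sketch (random placeholder data standing in for labelled clips; assumes PyTorch is available):

      import torch
      import torch.nn as nn

      # Tiny binary classifier over fixed-size audio feature vectors (e.g. spectrogram stats).
      detector = nn.Sequential(
          nn.Linear(128, 64), nn.ReLU(),
          nn.Linear(64, 1), nn.Sigmoid(),   # outputs probability that a clip is AI-generated
      )
      loss_fn = nn.BCELoss()
      opt = torch.optim.Adam(detector.parameters(), lr=1e-3)

      # Placeholder data: label 0 = genuine recording, 1 = synthetic voice.
      features = torch.randn(256, 128)
      labels = torch.randint(0, 2, (256, 1)).float()

      for epoch in range(5):
          opt.zero_grad()
          loss = loss_fn(detector(features), labels)
          loss.backward()
          opt.step()

      # Crude yes/no flag for one new clip's features.
      print(detector(torch.randn(1, 128)).item() > 0.5)

  Real detectors are trained on large corpora of genuine and synthesized audio and still make mistakes, so this is a sketch of the idea, not a working defence.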

  • @peterdixon357
    @peterdixon357 1 year ago +2

    THIS is all joking now, BUT it's going to become a very evil 😈 thing used against each other, and a lot of people are going to get hurt over IT soon

  • @mr.m4853
    @mr.m4853 1 year ago +3

    This leprechaun came on CNN to cry when Elon Musk suspended him on Twitter.

    • @Acer_Maximinus
      @Acer_Maximinus 1 year ago

      Donie came on to let people who didn't already know what a petty, vindictive A-hole Elon is. 😮
      😂😂😂

  • @klrklr116
    @klrklr116 1 year ago

    I don’t think it’s a good idea to give the general public access to this technology. We already have a problem in this country with truth. This should be addressed immediately.

  • @jimgorman7903
    @jimgorman7903 1 year ago +2

    Donie - What A Guy! :-)

  • @72mossy
    @72mossy 1 year ago +2

    Donie, if you're reading this, Tyrone beat Kerry 1-15 to 2-09; the Kingdom would want to pull their socks up 😆

  • @dorianshadesofgray
    @dorianshadesofgray 1 year ago +5

    Fun and games now … wait for the terror and lawsuits to follow

  • @slynskey333
    @slynskey333 1 year ago +2

    Ringing the mammy back home.

  • @Facetimegirl
    @Facetimegirl 1 year ago

    How will we add a tell to differentiate a real vs fake thing? This is so bad for democracy.

  • @catalinasanchez5098
    @catalinasanchez5098 1 year ago +1

    Scary 🫣

  • @stevensalazar2713
    @stevensalazar2713 1 year ago +2

    This was a good segment. Made me happy to know that in this scary future we have these two wonderful servants of the empire.

  • @erykaton170
    @erykaton170 1 year ago +3

    Everyone better start using pass phrases with their loved ones so we will know if it's legit or not.

  • @MichaelCashVEVO
    @MichaelCashVEVO 6 months ago +1

    this has happened to me so much lately :')

  • @YanPutrisuryo
    @YanPutrisuryo 1 year ago

    It is scary.... I am afraid you cannot stop this.

  • @samuelmachado6088
    @samuelmachado6088 1 year ago +1

    As Long as the Anointed Christians and the Holy Ghost is here, no demon can take down earth.....Hallelujah 🙌

  • @speed3971
    @speed3971 1 year ago +1

    It's still slightly monotone and the accents are off. But pretty good.

  • @jenny6253
    @jenny6253 1 year ago

    Don’t know why they are laughing!! This is incredibly dangerous tech!!

  • @Lottieloves287
    @Lottieloves287 1 year ago +2

    It should be illegal. It’s already ruined many lives 😔

  • @CoffeeAndColtrane
    @CoffeeAndColtrane 1 year ago

    Which online tool are you using? I'd like to check it out

  • @chriscuntlicker796
    @chriscuntlicker796 1 year ago

    LOL this is hillarious. California finally figuring out EV's + Silicon Valley is shit. Jews fleeing in mass numbers to Israel from America. Thanks for gathering in the same location for us though.

  • @texansforever6782
    @texansforever6782 1 year ago

    What's the program called?

  • @Happy_Bnzo_Puppy
    @Happy_Bnzo_Puppy 1 year ago

    Why does anybody even develop something like that? It doesn't do any good. 🙄

  • @scottd52843
    @scottd52843 1 year ago

    If this tech gets out of hand, I could easily see this as against the law if caught.

  • @mistercohaagen
    @mistercohaagen 1 year ago

    If 900 numbers start sounding like this, we're about to get gunned down by sexually frustrated Terminators.

  • @tree5013
    @tree5013 1 year ago +8

    Mitch McConnell fell in the Trump hotel tonight and is in the hospital. With Biden losing his fight with three staircases this year and McConnell a floor, at least gravity is trying to work for the American people.

    • @maddiethornhill9853
      @maddiethornhill9853 1 year ago

      If only a more youthful leader was the solution to the mess the US is in.

    • @tree5013
      @tree5013 1 year ago

      @@maddiethornhill9853 Well you aren't going to find any older ones LOL

    • @BryanScrilla
      @BryanScrilla 1 year ago

      Haha I hope he d...

    • @maddiethornhill9853
      @maddiethornhill9853 1 year ago +3

      @@tree5013 I presume you’re young. You certainly are ageist…that much is apparent. But clearly you haven’t read enough of history to glean that it is good judgement and not youthful vigour that keeps a country functioning properly and peacefully.

    • @tree5013
      @tree5013 1 year ago

      @@maddiethornhill9853 I'm in my 60s LOL. Next you're going to be calling me a racist LMFAO. I don't think that Biden, McConnell or 90% of our leaders represent the American people. They only represent their own bank accounts. Why do you liberals always have to call people names and judge them by the color of their skin or other attributes? When I was in school they called that being a bigot.

  • @Yankeeprepperasshat
    @Yankeeprepperasshat 9 months ago

    Are those glasses he’s wearing ai generated? Or did he really buy a pair of children’s glasses? The arms are so short, that the lenses are super angled. How can he even see through them?

  • @GandyBeats
    @GandyBeats 1 year ago

    What did they use?

  • @rodneyroimatawhanga525
    @rodneyroimatawhanga525 1 year ago

    This is pretty stupid; Donie was talking to his parents through the machine.

  • @timothylouly9449
    @timothylouly9449 1 year ago

    What AI is this that they're using?

  • @jimpetterson7422
    @jimpetterson7422 1 year ago +1

    Welp this is the end ….

  • @alejandroriano3246
    @alejandroriano3246 1 year ago

    This is very scary, scammers will be all over this...

  • @ericphantri96734
    @ericphantri96734 1 year ago

    Don't turn this technology to the other sides

  • @Ruthypops1
    @Ruthypops1 1 year ago

    Hey Janelle, what's wrong with Wolfie?

  • @LeechyKun
    @LeechyKun 1 year ago

    Scientists always ask if they could, but never if they should. Same goes for engineers who work on AI.

  • @anyname777
    @anyname777 1 year ago

    How'd Alex M. get convicted then?

  • @exiled_londoner
    @exiled_londoner 1 year ago

    Talking about Deep Fakes... has anybody else noticed that as Anderson Cooper gets older he looks more and more like Max Headroom?

  • @paulacoyle5685
    @paulacoyle5685 1 year ago

    I thought Donie sounded a bit serious and flat as well; he did not seem quite real, but it would be hard to tell if you weren't expecting it. Mom had the right idea, the way she described it.

  • @buzzy2587
    @buzzy2587 1 year ago

    It's not at all funny. It's sick and demented. Humans are brilliant and wonderful, and people developing this are opening Pandora's box. Can we please just stop replacing humans!

  • @ambrishpriyadarshi988
    @ambrishpriyadarshi988 1 year ago

    War should be stopped

  • @manuellubian5709
    @manuellubian5709 1 year ago

    Pranking his parents was GOLD!!

  • @thatJAWNraps
    @thatJAWNraps 1 year ago

    With this, along with the deepfake visual stuff (like Kendrick's Heart Part 5 video), we literally can't trust our eyes/ears anymore

  • @Keashane
    @Keashane 6 months ago

    How about foreign words?

  • @Nothinglefttosay
    @Nothinglefttosay 1 year ago

    Ok.. turning off Siri, camera, microphone, messages, facial recognition, voice command, emails,… and no more games..!
    Goodbye technology 😢 I will always Lov………

  • @13BGunBunny
    @13BGunBunny 1 year ago +1

    This is so cringe and unfunny

  • @blizz2795
    @blizz2795 1 year ago +1

    SCARY

  • @axxojaxx2510
    @axxojaxx2510 1 year ago

    It's not AI. It's a speech-synthesizing algorithm. I think it's about time for journalists to learn the correct definitions.

    • @veganath
      @veganath 1 year ago

      AI, i.e. a machine learning model, was used to synthesize the voice. 'AI-synthesized voice', perhaps.

  • @Z1BABOUINOS
    @Z1BABOUINOS 1 year ago +1

    Even the best AI is not able to fake Biden's senility! 😆
    Dead giveaway.
    You know... the *THING !* 🧠💤

  • @scooterbaby1
    @scooterbaby1 1 year ago

    scary stuff

  • @charlespk2008
    @charlespk2008 1 year ago

    Either we start to put weight into credible sources or we all go to hell in a handbasket.
    Any and all media sources need to be held accountable for unverified information. Phone identity needs to be treated like legal registration. We can't keep letting frauds and 'I think' people rule the sound waves.

  • @cocoatmidnight7611
    @cocoatmidnight7611 1 year ago +1

    This video is scary not funny

  • @randal_gibbons
    @randal_gibbons 1 year ago

    Definitely unlimited bacon for me.

  • @dusk1947
    @dusk1947 1 year ago

    And this is why advanced AI tools have such a disruptive potential...
    Good use case.

  • @dawnoceanside7300
    @dawnoceanside7300 1 year ago

    CNN, where's Don Lemon?

  • @jukio02
    @jukio02 1 year ago

    Do the scream voice, lol.

  • @Anjuli72
    @Anjuli72 1 year ago

    I'd say he's in real trouble with his Ma 😂😂😂😂

    • @LindaC616
      @LindaC616 1 year ago

      He said nothing wrong to his mum

  • @user-mu3iy8fq3d
    @user-mu3iy8fq3d 9 months ago

    Promoting a digital environment that values truth and authenticity over deceptive simulations is essential. By advocating for a cultural shift towards prioritizing genuine content, we can contribute to the creation of a more trustworthy and reliable online space.

  • @djc6323
    @djc6323 1 year ago

    AI voice is monotone and emotionless

  • @adammartin7007
    @adammartin7007 1 year ago

    This tech has been around for 20 odd years.

  • @Becca_TheWorm
    @Becca_TheWorm 1 year ago

    Would you rather have unlimited bacon but no games or games, unlimited games but no games

  • @Livetoeat171
    @Livetoeat171 1 year ago

    Sometimes you can tell it's AI if they use the word "about" because Canadians say the word "about" very funny like a boot. And "again" is pronounced, uh gane. Otherwise, the voice sounds normal

  • @ethzero
    @ethzero 1 year ago

    Remember the Bond film, Diamonds are Forever? Blofeld used an "electronic voicebox" to fake a kidnapped businessman Willard White. We're now here.

  • @MatthewTaylor3
    @MatthewTaylor3 1 year ago

    I'd actually be using this for most of my vlogs so I don't sound so monotonous all the time.

  • @thatJAWNraps
    @thatJAWNraps 1 year ago

    using chat gpt with deepfake for people who died is basically already a Black Mirror episode; just irl robot form...buuut considering what we can do with 3D printing...😳

  • @thatJAWNraps
    @thatJAWNraps 1 year ago

    So far the only thing im seeing it used for are those 'online gaming sessions' between 'trump/obama/biden' 🤷‍♂

  • @needinput9805
    @needinput9805 1 year ago +1

    Duh huh huh
    Don’t care

  • @sjadude3
    @sjadude3 1 year ago

    Wow such news.