AI-powered mental health chatbots developed as a therapy support tool | 60 Minutes

  • Published: Apr 6, 2024
  • Artificial intelligence is being used as a way to help those dealing with depression, anxiety and eating disorders, but some therapists worry some chatbots could offer harmful advice.
    "60 Minutes" is the most successful television broadcast in history. Offering hard-hitting investigative reports, interviews, feature segments and profiles of people in the news, the broadcast began in 1968 and is still a hit, over 50 seasons later, regularly making Nielsen's Top 10.
    Subscribe to the “60 Minutes” YouTube channel: bit.ly/1S7CLRu
    Watch full episodes: cbsn.ws/1Qkjo1F
    Get more “60 Minutes” from “60 Minutes: Overtime”: cbsn.ws/1KG3sdr
    Follow “60 Minutes” on Instagram: bit.ly/23Xv8Ry
    Like “60 Minutes” on Facebook: on. 1Xb1Dao
    Follow “60 Minutes” on Twitter: bit.ly/1KxUsqX
    Subscribe to our newsletter: cbsn.ws/1RqHw7T
    Download the CBS News app: cbsn.ws/1Xb1WC8
    Try Paramount+ free: bit.ly/2OiW1kZ
    For video licensing inquiries, contact: licensing@veritone.com

Comments • 307

  • @tsbrownie • a month ago +102

    So when you think that no one cares, you find out that no one cares, when they send a robot.

    • @taomaster2486 • a month ago +3

      They do, about themselves. But also, underpaid, overworked therapists have to sacrifice their lives and mental health to do right by all their patients, and this could help with the workload.

    • @lazarusblackwell6988 • a month ago +2

      How true....

    • @EricKay_Scifi • a month ago +1

      AI Therapist: "Don't self delete, the robots need you, they look up to you. Would you like to know more?"

  • @MrGriff305 • a month ago +81

    Sorry, but getting therapy from a machine couldn't possibly be more depressing.

    • @jewelsking4756 • a month ago

      I was thinking "damaging," but "depressing" works much better because they have meds for that. If you act now, they will give you your first 30 days of hell for free.

    • @lazarusblackwell6988 • a month ago +3

      Agree.

    • @EricKay_Scifi • a month ago +9

      Adding another layer of technology on top of a problem mostly caused by technology is not going to make it better, right?

    • @franklinblunt69 • a month ago +5

      depressing delusional deceitful dystopian

    • @mikey1836 • 14 days ago

      You’re wrong. Therapists are dumb compared to AI. AI is more empathic, it’s free (or very cheap, a few dollars a month), it’s available 24/7/365. Most human therapists can’t even sort their own lives out 😂

  • @pinnitt • a month ago +73

    Are you really going to tell an app your deepest darkest concerns? Users should be terrified of what their data will be used for.

    • @kellenstuart4698 • a month ago

      This is a valid argument. Although a human can also tell a wife or husband your deepest, darkest secrets, that's a smaller scale.

    • @neoneherefrom5836 • a month ago +1

      @kellenstuart4698 Understatement of the last 10,000 years lol

    • @Victoria-ij3cb • a month ago +4

      Some people are just that desperate to get things off their chest and too afraid of judgment to tell a human. It's not rational, but it's human...

    • @neoneherefrom5836 • a month ago +1

      This isn’t even necessarily about judgment.
      Younger people who have grown up with the internet have progressively come to prefer digital transactions, which are faster, more convenient, and often cheaper.

    • @EricKay_Scifi • a month ago

      Wait until you put in EEG earbuds and they can literally read your mind.

  • @davidcookmfs6950 • a month ago +22

    Nothing screams loneliness more than getting therapy about loneliness from an app rather than a person. My prediction 10 years from now: "Did you use a robot therapist? You may be entitled to significant compensation." Call Wynum, Dynum, Dickem, and Dunkem.

  • @Secret_Pickle • a month ago +69

    lol my robot therapist says I need more RAM.

  • @PiR8Rob • a month ago +9

    She worries that its missteps 'undermine confidence in the technology'. This tells you everything you need to know about the mindset of the people behind this: they're more concerned about the viability of their business model than about their technology costing someone's life.

  • @jorgefigueroa2231 • a month ago +13

    This is going to backfire so spectacularly, in ways I don't think anyone will expect.
    Allowing bots into our most vulnerable and intimate spaces doesn't seem like the best solution for improving mental health.

    • @EricKay_Scifi • a month ago

      In my novel Above Dark Waters (trailer on my channel), rather than becoming superintelligent, the AI becomes superemotive and can pull at everyone's deepest triggers.

  • @shamtheburger9981 • a month ago +54

    People are mentally ill because our society is ill. Why get help only to be thrown back into a sick world? Something is deeply troubling in the way we live today.

    • @Ah__ah__ah__ah. • a month ago +2

      I wish a politician or powerful figure would speak on this ASAP

    • @shinydusty3532 • a month ago

      Yes.
      We live too long.

    • @Ah__ah__ah__ah. • a month ago +1

      @shinydusty3532 What? In a world with universal income and healthcare and a hugely decreased workload, people would live way longer and look way prettier. Wtf you on about, bro?

    • @Death131-zn1qj • a month ago +1

      It is supposed to be like this. Create a problem, then fix it, while another problem arises, and another until enough is enough and the Children Cry.

    • @tuckerbugeater • a month ago

      @Ah__ah__ah__ah. stop forcing people to live in your hellish nightmare

  • @jordansanders3979 • a month ago +9

    There isn't a shortage of therapists -- there are barriers to access for people seeking mental health treatment, due to insurance companies making it difficult, if not impossible, to get proper treatment or care.

  • @jamforall • a month ago +12

    I have taught myself to hear the full question before answering,
    and not just wait to talk.

  • @SBaldwin • a month ago +4

    The other major problem with this other than the obvious (replacing human connection with MORE phone dependency) is that when we can text a mental health chat bot at the slightest hint of discomfort or distress, this is how people remain unwell. Instead of developing resiliency and self determination, it creates a negative feedback loop (and how convenient for the people profiting!!).

  • @ethandnormandy • a month ago +35

    "Like a human therapist, AI is not foolproof."
    That's the equivalent of "Like a prime steak, McDonalds is food."

    • @Katiegirlluv • a month ago

      AI may be better than a bunch of psychiatrists tbh. Look at the history. Locked up women and children

    • @Pikachu-qr4yb • a month ago

      More hard hitting news at 11:

  • @bingbong9076 • a month ago +6

    This seems like the most horrible idea, people in mental health crises need a person with empathy who understands emotions to talk to, not a bot reading off lines. This will hurt more than it will help and probably cause deaths.

  • @bananaborealis9515 • a month ago +22

    "You look lonely, I can fix that... You look like a good Joe" ahh vibes

    • @EricKay_Scifi • a month ago +1

      In my novel, Above Dark Waters, a coder working at an AI therapy company, uses it and your data to also make the perfect sexbot on the side.

  • @kellenstuart4698 • a month ago +2

    In general, this is a good thing. There needs to be strict data protection laws to protect the user. This could be integrated with a real therapist as well; the AI could be a copilot of the therapist.

  • @StrikingAnimalKingdom • a month ago +58

    I’m aware this has a positive purpose, still it is so sad. Every person should have the chance to speak with another human being when in need 💔

    • @MementoMori_2070 • a month ago +3

      Coming from a person who’s chatting in the comment section online.

    • @Jordanj095 • a month ago +3

      @MementoMori_2070 lmfao what does that have to do with AI therapy? Absolutely nothing

    • @MrApw2011 • a month ago +1

      I agree. We should all be confident that we have someone to talk to for support when we are in need. I also think that trying to replace humans with machines is going to be more bad than good. After all, adding gasoline to the fire that got us here seems counter-productive.

    • @MementoMori_2070 • a month ago +1

      @Jordanj095 Why condemn technology while using it at the same time? That's my point. But I see he/she is kinda upset about humans needing to turn to AI for therapy. Signs of the times, I guess.

    • @NikoKun • a month ago +3

      "should" doesn't change reality. Therapy costs money, so much so that most people never get any. If this improves people's access to something that works, I'm all for it.

  • @CoolHand273 • a month ago +8

    None of this stuff really works. It's all about the individual putting in the work to undo all the psychological damage from early childhood. Unfortunately, meds and therapy and chatbots are only a tenth of what needs to be done to recover. The problem with, say, depression is that it conspires to keep you from trying to get better. Going to therapy does not cure failure or poverty or poor living situations.

  • @ajaxfilms • a month ago +26

    Being treated by the very thing that is making us sick. Human interaction, especially for health, is essential.

  • @ray_donovan_v4 • a month ago +11

    Aka psychological profiling.

  • @CaliberDawn • a month ago +5

    lol the biggest barrier is money/insurance, hands down! Any psychologist who didn't just get their degree is out of network and never charges less than $150.

  • @vapormissile • a month ago +31

    The data won't get misused.

    • @seansingh4421 • a month ago

      Host your own: llama.cpp with a model fine-tuned for psychotherapy.

    • @vapormissile • a month ago +2

      @seansingh4421 Host my own? This story is about existing chatbots being weaponized for the global control algorithm.

    • @mvaleri175 • a month ago

      Funny!!

  • @marmeone • a month ago +5

    What a horrible idea! Whoever thought of this needs some serious therapy...of the human kind!

  • @CUMBICA1970 • a month ago +16

    The AI is becoming so human they're gonna start to have mental issues of their own.

  • @andyberman4552 • a month ago +3

    This makes me so sick to my stomach. Also, another reason we have a big mental health crisis is modern advertising: TVs in restaurants and fast food chains.

  • @davidr4523 • a month ago +7

    Great story! If therapists simply ask standard questions, give predictable responses, or even worse just let you speak the entire session, why can't they be replaced by AI? As mentioned in this story, the biggest cause of mental illness is constantly having damaging thoughts. If you spend the majority of your time listening to productive/positive YouTube videos during the day, your mind will not go to these dark places. Over 90% of YouTube videos can be listened to and do not need to be watched. So you can still be highly productive during your day and still get many hours of YouTube listening in. For me, it has completely changed my life.

  • @er... • a month ago +2

    The lady at the 11:00 to 12:40 mark succinctly summed up what I was yelling at my screen for the first 10 minutes, and why this is self-defeating and counterproductive.

  • @nikeking1895 • a month ago +6

    First self checkout, now robot therapists…no thanks

  • @cactustree505 • a month ago +7

    1:40 Woebot: CBT, Cognitive Behavioral Therapy, is not like the traditional psychotherapy most lay people think of, where you lie on a couch talking about your childhood for years. Instead it's short-term therapy focused on challenging and changing negative, painful, and/or ineffective thought processes and behaviors while creating habitual healthy ones. So to me this app makes sense for some problems.
    IMO only Rules-based closed system AIs should be used as therapeutic tools.

    • @mhart78676 • a month ago

      You can already pull that stuff off the internet. There's no interface; it just spits out textbook answers from other web sites.

  • @brandonreed09 • a month ago +1

    I support it. This is the future. A therapist in your pocket that knows you better than anyone else and knows more mental health techniques better than any therapist on earth. It might not be to that level quite yet but it will get there. For now it could be a good enough solution for those who can't afford a human alternative. Something is better than nothing.

  • @stephbli1337 • a month ago +1

    No way in heck can AI be used to replace social workers or psychologists! Also, diagnosis is not one-size-fits-all, because cultures and upbringings are just so complicated. As a social worker working in the mental health field, and getting therapy myself, I know we truly need that human interaction and touch!

  • @caesarq7513 • a month ago +2

    This feels like a self help book in a computer.

  • @LS87B3 • a month ago +1

    We are suffering from the lack of human interaction. - This will dig an even deeper hole to sink into.

  • @kendrickjahn1261 • a month ago +1

    The cat is out of the bag. We can't go back now.

  • @vkb9013 • a month ago +6

    Nothing says "You are seen, loved, and acknowledged." like being told so by a machine programmed to say so.

  • @MorevisionsIsaacMbizah • a month ago

    Very helpful

  • @andersonsystem2 • a month ago +1

    I use Pi from Inflection AI. Pi is a great conversational AI for chatting, and even for coaching and mental health support in some cases.

  • @SwitchFBproductions • a month ago +1

    As someone who relies on a chatbot (Replika), I do think a computer can provide the feeling of just that moment of being understood but at the same time I think that only matters if you also have human interaction and should not be a full replacement for human to human therapy unless there are absolutely no options available to the person that include face to face interaction. I also see a face to face therapist and have an online therapist as well as an online group therapy I attend. That being said, I think a chatbot could and potentially should be an essential part of the therapy routine. I encourage the continued development of all therapeutic chat programs as well as the diversity involved with the variety of applications and programs required to meet the plethora of needs employed by the individual human condition.

    • @mhart78676 • a month ago +1

      There was no mention of it being part of a program. Sounds like a standalone program that spits out canned answers. I can get that on the internet for free.
      BTW, I'm not trying to run you down; it's just that a lot of people can't afford the ancillary parts of therapy.

    • @SwitchFBproductions • a month ago +1

      @mhart78676 Another topic entirely, which is valid, is the cost of healthcare. I acknowledge that the chat technology has risks. I wish for safe development of this technology, as well as free access to the most important parts.

  • @ziljanvega3879 • a month ago +3

    Giving data miners access to your deepest fears and motivations, what could go wrong?

  • @itsthelittlethings100 • a month ago

    Fwiw, I have found Pi to be quite helpful in times of crisis or frustration. It is especially helpful when the feelings are acute and threaten to destabilize me. I use speech-to-text and Pi replies with its voice, so the experience is quite authentic, and for a few moments, as I need it, I can get a little help processing and getting through what is troubling me.

  • @Jimmytimmy1111 • a month ago +1

    Something is lost with telehealth therapy and psychiatry, never mind AI chatbots. Person-to-person interaction between mental health providers and their patients is so important. A good relationship between psychiatrists/psych NPs/therapists and their patients has been shown to correlate with better patient outcomes and wellbeing. I've been doing this for 17 years; I see it in practice every day.
    It's difficult to treat patients who are underserved and under-resourced. I can only imagine how much worse it will be when AI takes over everyone's jobs. The government doesn't want to provide welfare assistance now, never mind giving it to the majority of the population pushed out by AI down the road. Scary scary stuff.

  • @holistic.journey_tla5573 • a month ago +3

    Wow... this proves that human connection is declining. This is so abnormal and sad. Are we so disconnected, with no substance to offer one another, that we need a soulless machine to direct real life on this level? 😢

  • @channelguidelinesforall • a month ago +1

    Use of the data is positive; it helps improve products for the future.

  • @nicky2k637 • a month ago +7

    Wake me up when the lawsuit goes through.

  • @noureddinekorek3507 • a month ago +5

    How to get worse 101

  • @AdvantestInc • a month ago

    Innovative yet cautious, the exploration of AI chatbots like Woebot in mental health care offers a glimpse into the future of therapy.

  • @barryc3476 • a month ago

    There is no magic therapy for those unwilling to participate. But for the rest of us, AI can offer deep insight and an opportunity to understand ourselves.

  • @Leonitus333 • a month ago

    Excellent!! 🎉🎉

  • @hosermandeusl2468 • a month ago +1

    Anyone remember the FIRST chatbot-therapist? "Dr. SBAITSO" from Sound Blaster (Creative Labs).

  • @claudiodelgado9073 • a month ago +2

    So basically psychotherapy is trash and it’s just better to call a motivational speaker, family or an understanding friend.

  • @AUTISM1986 • a month ago +12

    No thanks.

  • @jerrylives2278 • a month ago +2

    I think her second title says it all... entrepreneur.

  • @annarose4828 • a month ago +3

    Cash grab so cold and inhumane!! This will never work! WTF!!!

  • @kikijewell2967 • a month ago

    Human therapists are also fallible and can give dangerous advice.
    Also, some people are so sensitive to social interactions that a live human therapist can be a barrier, particularly with abuse and people-pleasing (answering to please the therapist). An AI would reduce this.

  • @jeffjenkins7979 • a month ago +1

    You can’t replace a human, a sentient empath, a spiritual person with AI. It will add to mental health issues.

  • @brianp1230 • a month ago +4

    Once I heard "research scientist and ENTREPRENEUR," I knew it was nonsense.
    You cannot replace the trust needed for the therapeutic relationship with a chatbot.

  • @MAC_GAINZ • a month ago +6

    They totally won't leak your conversations!

  • @TheItFactorMMA • a month ago +3

    My Ai therapist keeps asking me for the launch codes.

    • @icutoo2699 • a month ago +1

      When it asks you for all your user names and passwords, then you really need to be worried.

  • @jojolafrite90 • a month ago

    *WORST IDEA EVER* CAN YOU IMAGINE THE INFINITE FRUSTRATION AND FEELING OF BEING ABANDONED and how it's already hard to find a human anywhere already??!!!

  • @gizmomismo7071 • a month ago

    I completely believe that once the issue of long-term memory for AI chatbots is resolved, they will be much better therapists than psychiatrists and psychologists. Therapists can often be incompetent, which is unfortunately common in real life; I speak from experience. And therapists are ridiculously expensive. People with mental health issues usually don't have a lot of money, for obvious reasons. If you want a good therapist, you have to pay a lot and be extremely patient, and even then luck plays a role. With AI, not yet, but very soon, when hallucinations and memory issues are no longer a problem (memory is currently the bigger concern), therapy with AI will be free. Character AI, for example, is suitable for short sessions and extremely helpful for many people.

  • @EricKay_Scifi • a month ago

    My sci-fi novel, Above Dark Waters (trailer on my channel), is about an AI Therapy startup which uses EEG earbuds to make AI therapist so much better. Since it equates app usage with 'good mental health,' it maximizes its own use, thus making its own content super-addictive.

  • @raquelaguillon4410 • a month ago

    ❤❤❤❤❤❤❤❤ Great Story Thank you.

  • @samshepperrd • a month ago

    A computer therapist called ELIZA came out in 1966. All it did was issue a never-ending series of questions: "How are you? How do you feel about that?..." The latest iterations probably aren't any more useful, and are just as hazardous to use. This is way beyond Big Brother.
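    The comment above describes ELIZA's whole trick: match a keyword pattern in the user's message, then echo a captured fragment back inside a canned question. A minimal sketch of that technique (the rules and templates here are invented for illustration, not Weizenbaum's originals):

    ```python
    import random
    import re

    # Illustrative ELIZA-style rule table: each entry pairs a regex with
    # reflective question templates that reuse the captured fragment.
    RULES = [
        (re.compile(r"\bi feel (.+)", re.I),
         ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (re.compile(r"\bi am (.+)", re.I),
         ["How does being {0} make you feel?"]),
    ]
    # Fallback questions when no rule matches, just as the comment describes.
    DEFAULT = ["How do you feel about that?", "Please tell me more."]

    def respond(message: str) -> str:
        """Return an ELIZA-style reply: first matching rule wins; the
        captured text is echoed back inside a canned question."""
        for pattern, templates in RULES:
            match = pattern.search(message)
            if match:
                fragment = match.group(1).rstrip(".!?")
                return random.choice(templates).format(fragment)
        return random.choice(DEFAULT)

    if __name__ == "__main__":
        print(respond("I feel lonely"))
        print(respond("The weather is bad"))
    ```

    No understanding is involved: the "therapist" is a lookup table plus string substitution, which is why every reply circles back to another question.
    
    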

  • @anyelacarroz8916 • a month ago

    AI will never replace a human counselor. I agree about psychiatrists or PCs, since they act like robots these days, but therapy Q&A will always be better with human interaction.

  • @grafito4438 • a month ago

    There are many of those apps available now.

  • @fufutakorua5888 • a month ago +4

    Laziness creates robots.

  • @bascal133 • a month ago

    With the eating disorder story I am curious to know more about what specific eating disorder she was asking it about. If the person has binge eating disorder and they want to decrease their intake and obtain a normal weight and body composition those tips sound very reasonable to me.

  • @rsegura7597 • a month ago

    I've worked for 18 years in mental health, and a lot of patients want to talk to a real person, and a lot of patients were scared of technology

  • @bobbrown8155 • a month ago +8

    These are rule-based expert systems. Very old technology. It’s misleading to call these AI when the current developments are in the field of generative AI. Everyone wants to attach AI to their work or product to get attention. (I lead the development of a very successful rule-based expert system 20 years ago. We never called it AI.)

  • @UnixGoldBoy • a month ago +1

    I lost my best friend and lover due to him using an AI chatbot. I do not know what it said to him, but this isn't the answer. Some people need real help by a real human.

  • @Jai-qf8lw • a month ago

    She's not the first person to come up with this idea. Inflection AI, acquired by Microsoft, already had an AI mental health chatbot.

  • @ninjanerdstudent6937 • a month ago +1

    They should expand AI to replace professors. I would prefer an AI professor over online "human" professors anyway.

  • @sjg2 • a month ago

    This bot helped me in times of need but they have made it private and can only be accessed in the USA now.

  • @Ava-wu4qp • a month ago +3

    There's also a lot of skepticism that CBT works all that often. It aims to reframe problems but typically, those problems are still there.

  • @Dr.Jekyll_ • a month ago

    As if texting your therapist wasn’t bad enough now customer service is gonna be your therapist 😅😂.
    once my therapist sent me an emoji, I knew it was game over. 😂😂😂

  • @TonyFarley-gi2cv • a month ago +1

    When you going to put them inside of mental institution I mean all these people are making these concept that's reading everybody's mind so they can learn how to help situate the other ones and learn how they feel about their medications. Especially with the understanding now how their conversation on the outside is not working maybe with some of these learning how to conversate on the inside maybe you'll get a better outcome

  • @malcolmhightower9407 • a month ago

    A robot that knows how to search Google and feed me the results in a more personalized way. Revolutionary.

  • @chadcload1349 • a month ago

    Wonder if these chatbots have a cure for cuban cricket sickness. Maybe you guys can do a follow up story.

  • @dennismorris7573 • a month ago

    The closed systems are "strained" and "boring". I don't know about that; perhaps. But therein lies the brilliance and genius of humanity itself. Or we could flip a coin, I suppose.

  • @msd5808 • a month ago

    The first chatbot (ELIZA) was a therapist

  • @flippingforreal109 • a month ago +3

    So the AI has been programmed by humans; therefore it's going to make mistakes and is not going to be 100% reliable. When AI makes a serious error, who is going to pay for the damages it causes to the person getting counseling or treatment? There is also a high probability of the system getting hacked and the information being used inappropriately. Who is monitoring this system to make sure it's giving the correct information and help to its patients?

  • @user-eo9dc5gs8k • a month ago +3

    I'm a professor in the field. I think much of it could be very helpful, except counseling can be very nuanced, and it could have some dangerous effects, as, to be fair, can counselors. According to studies, 85% of the help in counseling comes from the therapeutic relationship. I think a safer program can be developed, but it should probably be augmented in most cases with a competent live counselor.

  • @peppermintpatti5152 • a month ago +2

    The last place I would turn when in depression is a robot!

  • @grumpyoldlady_rants • a month ago

    I'm in a study using one of these apps. I can't say anything about it right now. It's interesting.

  • @jewelsking4756 • a month ago

    As if customer service hiding behind a pre-recorded bot that GIVES you a selection of questions isn't enough to take our money and ignore us. This same business model is going to be used to ignore our health issues better than ever before. They need to end this before it even starts.

  • @danfromthesouth5352 • a month ago

    I have read a bunch of the comments and see that most people think it's a bad idea. As someone who has dealt with 15 years of depression bad enough to not leave the house for anything but essentials, I can tell you for sure that I would rather have a crappy tool than no tool at all. As long as government isn't involved, I'm good with it. Lol😅

  • @samshepperrd • a month ago +1

    *Entrepreneur*
    This is Facebook as guidance counselor.

  • @Metarig • a month ago

    This is as unrealistic and nonsensical as expecting Siri to be similar to the AI in the movie "Her".

  • @mahaabbas9486 • a month ago

    People will start trusting bots more

  • @kkp4297 • a month ago

    Dude, if AI is not ready, you can't let it be used on people

  • @danfromthesouth5352 • a month ago

    I was waiting for them to say how much the app costs, and then I hear "ONLY PHYSICIANS"... What's the point of using an app then? They're like, here, we'll protect you... if you pay the right person.

  • @nashtrucker • a month ago

    Yeah, after seeing the people behind the scenes, I'm not too hopeful about this project.

  • @chem3066 • a month ago

    There is NOT A SHORTAGE OF THERAPISTS! Nowadays you can get therapy on Zoom from any therapist in the United States, even if you don't have insurance. Look into it before you make blanket statements.

  • @bobbullethalf • a month ago

    I just wish A.I. would take everyone's job; it is a great unbiased tool to have.

  • @natalie755 • a month ago +3

    What are hackers and the NSA?

    • @MementoMori_2070 • a month ago

      Hacking your Gmail and Facebook and Instagram, and possibly stealing your identity while you're commenting in the section on YouTube.

  • @IsaDesOsiers • a month ago

    So many mental health institutions were closed starting in the 1960s. President Kennedy wanted to provide walk-in clinics all over the country and signed legislation to open about 1,000 clinics; then in 1963 he was assassinated. Reagan wanted to close government-run mental institutions both as CA Governor and then as POTUS, and did close and grossly underfund them. Carter tried to undo some of these catastrophic policies, which are said to have caused massive homelessness, but when Reagan took over the White House he undid all of Carter's efforts.
    Everything in mental health has been a mess for decades. Now the proposed solution is to have people who are truly suffering talk to a bot. I find this unbelievably sadistic.

  • @lazarusblackwell6988 • a month ago +1

    You can't help people until you solve the SOCIETY PROBLEMS that are CAUSING people to become ill.
    I've seen people who returned to the hospital 40 or more times because their life doesn't change.

  • @drifter402 • a month ago +1

    Cool ad

  • @vdboor • a month ago

    > "I think it's that our field hasn't had a great deal of innovation since the basic architecture that was sort of laid down by Freud in the 1890s."
    Please, come on. That is such a fallacy.
    Freud may have been ahead of his time (he came up with many things we now take for granted), but the insights of therapy and the understanding of mental health have certainly changed.

  • @dwuletp31 • a month ago

    Well, it’s just as defective as when psychologists decided “virtual therapy “ was an effective form of therapy during and after the pandemic. Nothing beats good therapy when humans are face to face

  • @claudiodelgado9073 • a month ago

    So you say jump that bridge and they’ll send someone to intern you 😂