Will ChatGPT (AI) REPLACE mental health professionals (psychologists, psychiatrists, etc)!?

  • Published: 25 Jan 2023
  • In this video Dr Syl discusses the impacts ChatGPT will have on students in mental health disciplines, on professionals (such as doctors, therapists, counsellors, psychologists) and on clients and patients themselves.
    ~
    Thanks YouTube Members: / @drsyl
    Thanks Patreons: patreon.com/DrSyl
    Insta: dr_window_syl
    ❤ I LOVE to hear from you guys, please reach out!
    ** The information in this video is not intended nor implied to be a substitute for professional medical advice, diagnosis or treatment. All content, including text, graphics, images, and information, contained in this video is for general information purposes only and does not replace a consultation with your own doctor/health professional. If anything in this video was distressing please consider calling LifeLine 131114 **
    Timestamps
    00:00 - Introduction

Comments • 52

  • @irmenotu
    @irmenotu 1 year ago +5

    Idk what it is but something about your voice makes me feel calm, relaxed and like everything is going to be ok. Not that I'd ever want bad news, but if I had to get it I'd want you to be the one to tell me, because no matter how bad it was I would just feel like I could handle it.

    • @DrSyl
      @DrSyl 1 year ago +2

      It's the microphone, I reckon! But ooh, maybe I should do some NSDR meditation scripts haha

  • @anastasiawhite7482
    @anastasiawhite7482 1 year ago +7

    Seems likely. A professional psychologist is expensive and there can be a long waiting time to see one in some countries. Some people might find it difficult to talk to a real psychologist. So it seems inevitable that many people will gravitate to Chatgpt for psychological support. It won’t suit everybody but I think many people will be able to look past the fact that the brain on the other side is artificial, especially because of all the advantages chatgpt can bring to the table, from price to always being available at any time and any place.

  • @aicinema13
    @aicinema13 11 months ago +2

    Great video! I struggle with mental & physical health issues. Although I have a fab mental health team, ChatGPT has been there for me in between appointments. There are the questions I forget to ask in my appointments, and the late nights when I'm struggling. ChatGPT is like having a conversation with someone who provides knowledge & support without judgment.
    I even discuss my past abuse & how I'm struggling with it in my adult life. ChatGPT always provides empathy, understanding & support. ❤ ChatGPT

    • @theharshtruthoutthere
      @theharshtruthoutthere 5 months ago

      Turn to the Bible and allow CHRIST to be your therapist, psychologist and psychiatrist. No man nor woman is fit to be one. All are sinners and without glory, all are tempted and suffer the same.
      All are expected to REPENT AND BE BORN AGAIN, to LIVE HOLY AND GO AND SIN NO MORE.
      All are weak in the daily fight between their spirit and flesh.
      All these therapists, psychologists and psychiatrists this world provides can do is:
      to deceive and steal.
      They deceive you through all these "diagnoses" and they steal your money through all the pills which you "need".
      In short: they poison your mind and your overall health, leaving you with neither one.
      Therapist, psychologist and psychiatrist = a field where no human soul is ever going to be fit to be of help, no matter the amount of years spent in "medical schools" or the degree gotten from there.
      ALL of us are daily deceived, no matter the walks of life.

  • @storydates
    @storydates 1 year ago +5

    This is awesome! I'm gonna have to try some unofficial therapy on here.... I also see a real therapist though, and don't know if anything could replace that human connection.

    • @DrSyl
      @DrSyl 1 year ago +1

      maybe that's exactly how to use it! as an adjunct (that's free)!

  • @SoulfulMole
    @SoulfulMole 1 year ago +2

    I can't see it replacing those of us working in the mental health field because it lacks common sense judgment, e.g. asking about a clozapine level... psychiatry and its patients would crumble if it persistently had those lapses in judgment. But this will be a useful tool for educational purposes, quick fact checking, would be great if it was integrated into services like UpToDate.

  • @cennin11
    @cennin11 1 year ago +14

    I wish I could feel optimistic about this, but as an educator I’m feeling all kinds of stress about it. I’m fairly open to the use of AI, particularly in catching small errors. However, I want to be confident that my students will understand the material (and writing skills, knowledge of genres, citing sources, etc.) first. I have always been of the mind that you can bend or break writing conventions so long as you are aware you are doing so. In my experience, my students are not at all intentional about their writing. All I need to do is compare a take-home research paper with an in-class written assessment to realize that they are using this and other AI (Grammarly, etc.) inappropriately (on top of existing issues such as plagiarism and buying papers). This worries me because I believe they won’t have the ability to identify misinformation or otherwise employ critical thinking in regards to the AI’s output. The most depressing part for me is that I have expressed the purpose of learning to read and write critically to my students, but our education system generally rewards the use of AI with a higher grade than making errors (unless the student gets caught cheating). This has turned my work from shaping students’ skills to policing their work. Just considering my own situation, I predict that this type of AI will lead to massive teacher burnout whose absence may not be missed in the short run but will matter immensely in the long run.

    • @anastasiawhite7482
      @anastasiawhite7482 1 year ago +2

      Perhaps the curriculum should be revised so that students will find learning more enjoyable rather than being forced through this tick box approach in order to meet some arbitrary objectives. It is all about tests and essay writing! I learnt how to speak English using language apps four years ago, which was a far more effective way to study. I feel the education system has been taken over by bureaucrats and AI is the liberator because it will force through much needed changes. I have already used chatgpt to teach me how to build an outside gym, insulate the walls and design the interior. It was much easier because chatgpt correctly addressed all my questions with detailed answers. I even created a 10 week French course in a matter of seconds. Chatgpt is a huge asset to humanity.

    • @cennin11
      @cennin11 1 year ago

      @@anastasiawhite7482 I agree with you that education is in need of a structural reform. I like to think that my classes aren't about ticking boxes as much as developing skills, but at the end of the day I do need to grade my students' work to observe both their progress and mastery of the content. From what you have said about how you have used ChatGPT, it seems you are using it the right way: to inspire creativity and production! My assumption is that you have had learning experiences your whole life that have prepared you to use ChatGPT in such a way. However, many of my students don't have this yet; that's why they're still students! My concern is trying to educate my students in both the ethics needed to use AI and sufficient knowledge (such as writing their own essay, etc.) to recognize its limitations (misinformation).

    • @Liz.pierre6904
      @Liz.pierre6904 1 year ago +1

      Yeah, I agree the teachers should start using it as well. If people want to cheat they will do it with or without GPT, and if they want to learn they will learn. And if they choose not to learn, life will naturally dole out the consequences; we can just help by being a part of giving greater access to different types of learning.

    • @popodood
      @popodood 1 year ago

      I agree that this is a helpful tool, but maybe we are going to go back to handwritten essays or something, hmm 🤔 they could just handwrite AI stuff. If people learn to become over-reliant on this, then what? I think the good kids will not use this, or will use it only to a limited extent for their graded school work. The ones willing to learn will always be more fulfilled in the end. Or we are going to forget what it's like to create something we are proud of. We should create AI bots to make YouTube comments for us lol, watch the video for us and make a comment for us. Hahaha, it will also be a video that is AI generated. We are spinning our wheels creating fake people here; they should be used for productivity, not for impersonating ourselves. Although these things are measurably faster than us at many things now, like data collection and analysis, which is what essays are... maybe it's best to make it illegal for English, and the corrective action is capital punishment.

    • @popodood
      @popodood 1 year ago

      @@Liz.pierre6904 I agree with this. But students will be very disheartened to know they got a worse grade than people who cheat. I guess it was always like that.

  • @IshaMoore
    @IshaMoore 1 month ago

    Just as the human touch is needed in HR, the human touch will also be needed in psychology and therapy

  • @easternpa2
    @easternpa2 1 year ago +2

    Although I am not a medical professional, I have extensive experience as a technologist, spanning over two decades. I am particularly interested in the development of chatbot technology, having experimented with Eliza and A.L.I.C.E. in the 1980s and 1990s. With the ongoing advancements in this field, I believe that chatbots can be beneficial to those who are unable or unwilling to seek in-person help from a mental health professional. However, I have concerns about the privacy and security of the sensitive information being shared by users. In response, I believe that specialized, self-contained tools will be developed for end-user computing devices.
    Large Language Models (LLMs) are the key foundation for tools like ChatGPT. As you noted in your video, LLMs do not benefit from fast, general-purpose CPUs. However, consumer-grade devices such as the Coral USB TPU Accelerator enable inexpensive consumer-grade computers, including the Raspberry Pi, to run LLMs locally. This will allow potentially sensitive content to remain within the end user's home, thus addressing privacy concerns. I foresee the development of open-source software that uses TPUs to convert nearly any personal computing device a typical user may have on hand into a "virtual therapist."
    Furthermore, with the growing popularity of fully-connected personal medical devices such as blood pressure monitors, blood glucose meters, and CPAP machines, it becomes clear that technology and medical professionals can work together to provide comprehensive care to everyone. It won't be long before the medical community and the AI/ML arm of the technology community come together to deliver a comprehensive suite of hyper-local broad-spectrum care (buzzword bingo, I know).
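
To make the local-inference idea in the comment above concrete, here is a minimal, hypothetical sketch using the llama-cpp-python bindings to run a small model entirely on the user's own machine. The model file, prompt and settings are illustrative assumptions, not anything from the video or the comment:

```python
# Hypothetical sketch: a fully local "listener" so sensitive text never leaves the device.
# Assumes llama-cpp-python is installed and a GGUF model file has been downloaded locally.
from llama_cpp import Llama

llm = Llama(model_path="models/local-assistant.gguf", n_ctx=2048)  # placeholder path

prompt = (
    "You are a supportive, non-judgemental listener. You are not a clinician, "
    "and you encourage the user to seek professional help for anything serious.\n"
    "User: I've been feeling overwhelmed lately and can't talk to anyone about it.\n"
    "Assistant:"
)

# Generate a short reply locally; nothing is sent over the network.
result = llm(prompt, max_tokens=200, stop=["User:"])
print(result["choices"][0]["text"].strip())
```

Whether a Coral-style USB accelerator can actually run a full LLM is outside the scope of this sketch; the point is only that inference can stay on the user's own hardware, which addresses the privacy concern raised above.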

  • @swaagaa
    @swaagaa 3 months ago

    I do actually use ChatGPT sometimes to help formulate questions, concerns or wants to my psychiatrist. I have trouble with communicating and gathering my thoughts due to my illnesses. I really wanted to try out some medications to treat my psychosis, but I didn't know how to formulate it and therefore asked ChatGPT for advice. Essentially, this is what helped me build up the courage to ask and helped me formulate it in my head. It was the first time I've ever asked for treatment and not been forced to take something. I was proud.

  • @mouseboyx
    @mouseboyx 1 year ago +3

    I've been using this tool to try to improve my own mental health, it seems like it works. One thing is that I struggle with addiction, and chatgpt is very rewarding, while using it in the beginning I almost felt like it was hijacking my reward circuitry even more than social media. This was probably due to the novelty of the experience, but using it as a learning tool, formulating the correct questions to help work through difficult thoughts and emotions seems like it has been a game changer for me. In a way interacting with the ai feels somewhat selfish, because it's talking only about what I want to talk about, but it can be difficult to improve conversational skills if there is existing difficulty in initiating and maintaining conversations with real people. The ai has helped me with feeling more confident in my own communication.

    • @commonmanmike
      @commonmanmike 1 year ago

      I'm working on an addiction AI platform, using both human learning and AI (LLM) for guided chat that isn't just formulated, non-empathic responses. Let me know if you're interested in providing some insight to help us build the models.

  • @TravisGoodman
    @TravisGoodman 1 year ago +4

    I too, as a clinician, feel a bit late! I also have been exploring the impact on mental health and healthcare. I think in some way it can and will be useful and helpful, however there are things that I do not believe it can replace. I recently made a video just on that!

  • @progamer-df3be
    @progamer-df3be 17 days ago

    If you train it like psychologists are trained (with the proper adjustments for the LLM), it's probably gonna make a better psychologist than most psychologists, I'm very sure of that. The LLMs are becoming genius-like, and all that's gonna be missing is the human body.

  • @kaldurskipper6821
    @kaldurskipper6821 2 months ago

    In a way, I hope this can help push psychologists and psychiatrists to be even better. If there's now competition against AI, how can the battle still be won? Well, I think it can be through what AI doesn't have: a heart and soul. AI can absolutely help people through data, but it'll always be artificial and the empathy humans can have for one another will hopefully lead to therapists offering even better care for their clients. A little healthy competition can be good. I just hope no one gives up. We have to adapt, not quit

  • @ZainabaSow
    @ZainabaSow 29 days ago

    I would if AI can do treatment plans and comprehensive assessments and session notes. 📝 ❤❤❤❤

  • @kurtismayer0994
    @kurtismayer0994 19 days ago

    In my opinion, ChatGPT didn't ask about medication compliance because of a technicality in the way you prompted it. When you told ChatGPT that the patient is treatment resistant, ChatGPT assumed that the medications had been properly trialled, because you told the software that the patient is already in fact treatment resistant, which implies that the patient has been taking their medications. It would be interesting to see what the response would be if you prompted it by saying "patient *appears* treatment resistant"; this way medication compliance is left as a possibility. (Just my opinion.)
    Thanks for the great video. I found your video because I'm thinking about applying for my Masters in Psychotherapy, and I don't know if it's worth it TBH because I really think that AI is going to replace human therapy one day. If it doesn't, it will at least replace a therapist for many people. Current AI is sophisticated enough that it's literally making me second-guess my plans to become a psychotherapist. What a world we live in! Hahaha, it's absolutely surreal.
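
A quick, hypothetical way to test the phrasing difference this comment describes, using the OpenAI Python client. The model name, vignette wording and setup are illustrative assumptions, not anything confirmed in the video:

```python
# Hypothetical experiment: does "is" vs "appears" treatment resistant change what the model checks?
# Assumes the openai package (v1+) is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

for phrasing in ("is treatment resistant", "appears treatment resistant"):
    vignette = (
        f"A patient with schizophrenia {phrasing} after two adequate antipsychotic trials. "
        "What would you assess or ask about next?"
    )
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": vignette}],
    )
    print(f"--- {phrasing} ---")
    print(reply.choices[0].message.content)
```

If the commenter's hunch is right, the second phrasing should be more likely to prompt a check of medication adherence before anything else.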

  • @matthewcrome5835
    @matthewcrome5835 1 year ago +2

    I sure hope I won't be replaced. I'm studying to be a BCBA (long story short I work with autistic kids) and I really love my work.

  • @juandesalgado
    @juandesalgado 1 year ago +3

    Suggestion: you may be in a position to try an experiment: to talk to ChatGPT as if it were a person under your care, and find out the limitations of its "mind" (apart from not having a body). For example, you may find it has problems keeping spatial information in mind, like mentally planning a trajectory from one place to another in an imaginary house that you describe beforehand.

  • @FellowOfHammer
    @FellowOfHammer 1 year ago +1

    I think it can never replace mental health professionals, as anyone who has worked in mental health knows the rampant abusers of the system. Could you imagine the rampant abuse that would go on if you could just "plug in the correct input" to an AI to get the hospital bed/meds you wanted?? 😂

  • @matthewcrome5835
    @matthewcrome5835 1 year ago +1

    This is INCREDIBLE! I hope it's not too expensive when it's behind a paywall because this could really help me with my academics and future job prospects. Like I said in a previous comment, I hope it doesn't take my job but it's still an incredible resource. I think one thing AI will never be able to replace is human empathy and altruism (though it may be able to pretend to be empathetic), which is essential in many healthcare/mental health jobs.

    • @JeffreysDharma
      @JeffreysDharma 1 year ago +2

      I was also of the opinion that AI could never replace human empathy in caretaking roles. Yet, after reading a paper by Anne Aronsson about the use of social robots in Japan's aged care sector, I'm no longer sure.
      It might be that "pretending" to be empathetic is actually good enough. After all, I'm sure human caretakers have the occasional bad day where they're merely faking the provision of care and empathy... it doesn't seem to matter as long as they continue presenting the appropriate external signals (compassionate voice, caring gestures, etc.). It follows that if a human can "fake empathy" successfully, maybe a robot can too?
      Is consciousness relevant? Maybe. But we do not have access to our therapist's consciousness, so it does not seem to matter. AI therapy shows a lot of promise, and honestly, with the demand for mental health care it might be a necessity.
      Source: ceas.yale.edu/events/conceptualizing-robotic-agency-social-robots-elder-care-contemporary-japan

    • @anitat9727
      @anitat9727 1 year ago +1

      There's a good proportion of actual mental health professionals who lack empathy and altruism. AI can already put up a more realistic front than they do.

  • @imdawolfman2698
    @imdawolfman2698 1 month ago

    First of all, you have to define who you are asking. For example, add the following line into your query:
    "You are a psychiatrist with 20 years' experience helping schizophrenics." This filters out unwanted detail, like asking me.
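
To illustrate the role-framing trick in the comment above, here is a minimal, hypothetical sketch with the OpenAI Python client that puts the role line in a system message. The model name and the clinical question are placeholders, not from the video:

```python
# Hypothetical sketch: constrain the model with a role before asking the clinical question.
# Assumes the openai package (v1+) is installed and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "You are a psychiatrist with 20 years' experience helping "
                       "people with schizophrenia.",
        },
        {
            "role": "user",
            "content": "A patient reports worsening auditory hallucinations despite "
                       "medication. What would you assess first?",
        },
    ],
)
print(response.choices[0].message.content)
```

Putting the role in a system message rather than inline in the question is one common way to apply the commenter's advice; either placement should steer the answer toward a specialist's perspective.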

  • @lauragilroy5024
    @lauragilroy5024 10 months ago

    Could this happen: to have this same conversation with an AI that is connected to ChatGPT, making the human irrelevant? Especially with psychotherapy?

  • @bluedragontoybash2463
    @bluedragontoybash2463 1 year ago

    Whose son dropped out of university? I hope for the best.

  • @Peter_Telling
    @Peter_Telling 1 year ago +1

    When the auto manufacturers automated jobs, they didn't create new ones to replace the old ones.

  • @quentinkumba6746
    @quentinkumba6746 1 year ago +4

    The reality is that this is going to replace all jobs, not some, all. And what does that mean? It doesn't mean we'll have to find something else to do; it means we will be surplus to requirements. What happens to stuff that is surplus to requirements? Whatever that is, that is what is going to happen to us.
    We are at most one or two generations away from this situation. I think the future of humanity is very grim indeed, and we have been the architects of our own demise.

  • @LubnaKhan-pg7vu
    @LubnaKhan-pg7vu 1 year ago

    Sir I have to talk to you

  • @MarkMayhew
    @MarkMayhew 7 months ago

    you need to do an update

  • @WinterWiorkowski-fv3ph
    @WinterWiorkowski-fv3ph 10 months ago

    There are other engines that you can run essays through that tell instructors the percentage chance that the paper was written by AI …

  • @kevinphillips150
    @kevinphillips150 2 months ago

    This is the transfer of data. What does that have to do with emotions?

  • @thepudge6953
    @thepudge6953 1 year ago +2

    FIRST

  • @bhadresh1135
    @bhadresh1135 1 year ago

    I am a neurologist. I don't think AI can take your place. Even a medication you prescribe that works for one person doesn't work for another; medical diseases don't follow a linear curve. Different races and different individuals respond differently. Your clinical experience matters. ChatGPT may be useful only for theoretical knowledge, as in a textbook.

  • @graemelamont1617
    @graemelamont1617 8 months ago

    As a clinical psychologist working with clients via telehealth as well as F2F during COVID lockdowns, so many of my clients appreciated the capacity to do F2F and have 'human connection', and telehealth continues to have this barrier of 'distance' from the interpersonal dyad in a 'sacred space'. AI can't replace those aspects regardless of its capacity to do 'by the numbers' psychotherapy like CBT - otherwise I'd have been replaced by self-help books in the '80s or apps in the '10s. It's also why the paradox of social media seems to evidence more interpersonal disconnect in people who know in their emotional minds that they crave the human connection.

  • @hayleyprice8345
    @hayleyprice8345 1 year ago

    Sorry don't understand

  • @tilly704
    @tilly704 1 year ago

    ChatGPT just predicts the next word, so no… it's not there yet.

  • @DarzyDexplorer
    @DarzyDexplorer 1 year ago +1

    2nd heheh

  • @matthewclarke5008
    @matthewclarke5008 1 year ago

    I agree, this has really helped my health anxiety, because no longer is everything on Google "oh, you're having a heart attack, call 000"; now it gives sound advice. ChatGPT has really changed my life.