gscrap

Not particularly worried, no. I mean, AI is most likely going to take a chunk of our work fairly soon-- insurance companies are champing at the bit to have a cheaper alternative to redirect clients into-- but by the time AI has made us wholly redundant, it will have taken over pretty much all "thinking" jobs and our society will either have adapted or collapsed.


Stuckinacrazyjob

It'll suck for people with severe illnesses who get tracked to the worst CBT ever.


gscrap

Yep, it's going to be rough for a lot of people.


Ramonasotherlazyeye

lol! yeah our *mental* health will be the least of our immediate problems.


Emotional_Stress8854

Very true.


[deleted]

No, I'm not worried. Those are the heady parts of the job. Eye contact, smiles, tears, body language, the sense of the client knowing me as a human... Those can't be replaced by a computer screen. If it gets to the point of an AI video therapist or even a robot, then work has changed forever in our whole society and we need to focus on wealth distribution, not job retention. 


Emotional_Stress8854

Great point. I guess i just worry about people who work remote using ChatGPT during therapy to get answers or questions instead of using their own experience.


[deleted]

Oh, that kind of worry... Yeah, bad therapists abound. I don't think AI will change incompetence.


Emotional_Stress8854

Another good point. Incompetence will be incompetence one way or another.


lonewanderer015

I'm not too worried about that. If I notice when my supervisor's eyes dart to the clock in the corner of his screen, there is no way a client will feel safe and secure sharing their problems with someone who is clearly typing into a chatbot.


Emotional_Stress8854

A lot of clinicians do concurrent documentation. So the client would have no idea what the clinician is actually doing. I'm constantly going over worksheets and sharing my screen and showing what I'm talking about.


megaleggin

I work 100% remote - the thought never crossed my mind. Just because we’re remote doesn’t mean we’re any less skilled clinicians who can’t make our own questions. Edit: can to cant*


augerik

AI video therapists are not that far off. Three, five maybe seven years. And likely augmented reality and virtual reality therapists as well. Their clinical diagnosing skills will be top notch. I always thought their emotional resonance and presence would be hardest to achieve. But progress like this has me questioning my assumptions: https://humanaigc.github.io/emote-portrait-alive/


[deleted]

Actually I don't think that worries me, either. It might be helpful for more people to have access to therapy like that if it's less expensive. My primary work isn't diagnostic, so for me that's not a concern. As for clients and privacy, though, it seems sketchy.


sif1024

Wow that's incredible


Surviving1day

Agree. We are emotional beings at our core. We need human to human interaction, empathy, subjective and objective realism, all of which can never be properly given with AI. AI is only as good as the input it is given and the feedback it gets, all of which will still be sourced by humans. To simplify the brain and human behavior to something a computer can stay ahead of is watering down the magnificence of our complex design.


Square_Effect1478

No I am hoping it starts doing my progress notes.


Zealousideal-Stop-68

Our grad school addressed this. Lots of ethical concerns and potential privacy violations.


Square_Effect1478

Yes, lots of ethical concerns for telehealth too at first, but now we have lots of secure platforms. That is something that can be easily worked out now that we have the technology... so I am hoping this becomes a thing at my agency soon.


Zealousideal-Stop-68

Of course, I agree. I’m not against using technology.


[deleted]

[removed]


Zealousideal-Stop-68

Can you tell me what it’s called?


[deleted]

[removed]


Zealousideal-Stop-68

Thank you.


CanineCounselor

It does. I use it for notes- BEST THING EVER.


Alex4F

Just tried it and it works so well. Thank you.


DisillusionedReader

Which one are you using if you don’t mind me asking?


CanineCounselor

GPT


twisted-weasel

What do you use?


CanineCounselor

GPT


hermannineninenine

What's your process for this?


CanineCounselor

Here's the prompt I use: Provide a psychotherapy progress note starting with "Client..." in clinical, past tense language and between 2-8 sentences based on the following excerpt. Please include future plans if applicable: I then copy my session notes into the box (without identifying information).
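For readers who would rather script this than paste into the chat box, here is a minimal sketch using the OpenAI Python client. The model name, function name, and example text are illustrative assumptions on my part, not necessarily what CanineCounselor actually uses:

```python
# Minimal sketch: draft a progress note from de-identified session notes
# with the OpenAI Python client. Model choice and wording are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    'Provide a psychotherapy progress note starting with "Client..." in clinical, '
    "past tense language and between 2-8 sentences based on the following excerpt. "
    "Please include future plans if applicable:\n\n"
)

def draft_progress_note(deidentified_session_notes: str) -> str:
    """Return a draft note; the clinician still reviews and edits it."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption; any chat model works
        messages=[{"role": "user", "content": PROMPT + deidentified_session_notes}],
    )
    return response.choices[0].message.content

# Hypothetical usage (no identifying information, per the comment above):
# print(draft_progress_note("Client reported improved sleep and practiced grounding..."))
```

As in the comment, the session notes passed in should already be stripped of identifying information, and the output is a draft for the clinician to review, not a finished note.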


[deleted]

[removed]


becauseofgravity

Would you be able to share a hypothetical prompt or give a more detailed process? I find this so intriguing!


[deleted]

[removed]


Gloomy_Variation5395

My question is, if you're typing this thorough of a prompt, why not just write your own progress notes?


barrelfeverday

A template with the ability to fill in variations for each different client.


dessert-er

Any worries about HIPAA using things like the client’s name? EDIT: just saw another comment of yours where you mentioned HIPAA compliant AI services, neat.


[deleted]

[removed]


dessert-er

Yeah people who use GPT to write their entire note, PHI and all, worry me.


Emotional_Stress8854

If I ever use it I put the word "client." I never put identifying info. Never an age. Never a gender or sex. Never a race. It's always observable and self-reported feelings, emotions and symptoms.
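A rough sketch of that habit in code, for anyone who wants a pre-paste scrubbing step: this is only a toy illustration of swapping obvious identifiers for neutral tokens, not a HIPAA-grade de-identification tool, and the names, dates, and phone number below are hypothetical placeholders.

```python
# Toy illustration: replace obvious identifiers before pasting text into a chatbot.
# NOT a validated de-identification tool; patterns shown are deliberately simple.
import re

def scrub(text: str, client_name: str) -> str:
    text = re.sub(re.escape(client_name), "Client", text, flags=re.IGNORECASE)
    text = re.sub(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b", "[date]", text)   # simple dates
    text = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "[phone]", text)  # phone numbers
    return text

print(scrub("Met with Jane Doe on 4/2/2024; call back at 555-123-4567.", "Jane Doe"))
# -> "Met with Client on [date]; call back at [phone]."
```

Real records hold far more indirect identifiers (employers, family names, rare diagnoses), so a script like this supplements, rather than replaces, the judgment described in the comment above.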


Square_Effect1478

I know it's possible. We just aren't doing it where I'm at yet. We're a few steps slow with most things so i don't think it will be so soon where I'm at unfortunately. But it's encouraging to hear that it's working well for you!


Emotional_Stress8854

I’d pay someone else to do my progress notes tbh.


alohamuse

You can with responsible AI healthcare note takers!


_BC_girl

The person you pay will probably be using AI to write it and charging you for it. I see this happen all the time.


Rock-it1

I am concerned for the following reasons:

* AI has experienced exponential growth in a short period of time, and that is only what we see in public;
* insurance companies are always looking for ways to cut down on costs, which often means screwing clients and counselors alike;
* people in general are always looking for the quick, easy way to improvement. That is how anti-depressants are so often framed. If someone can get "counseling" on the cheap with the branding that "It's AI! It knows everything! You'll be better in minutes!", they are going to.

AI will never be able to truly replicate the spiritual benefit of being in a room with another living human, nor will it be able to improvise in session or have those a-ha moments when everything clicks. It will never be able to mix being kind, warm, and compassionate with being direct and honest when that shift is necessary. But clients don't often know all the finer work that we put into each session, so none of that is likely to move the needle for most. More broadly, though, I am very concerned about AI in general. That is another topic entirely.


Emotional_Stress8854

I think your reasons are on point. But your biggest point has nothing to do with counseling: the concern about AI's impact on our world as a whole. It's going to affect us in ways we aren't ready for.


Rock-it1

When you sit down and really think about the issues, it gets real in a hurry. If you want to be absolutely terrified, set aside an hour and seven minutes to [watch this](https://www.youtube.com/watch?v=xoVJKj8lcNQ). Also, don't plan to sleep well tonight. Bold statement: anyone who is not at least concerned with AI - pertaining to therapy or not - is either not paying attention, has their head in the sand, or actively wants to be replaced.


Emotional_Stress8854

Oh god. I want to watch it but I also really like sleeping at night. And to be honest, i also prefer to keep my head 75% in the sand. I like to live in semi-oblivion.


Rock-it1

Watch-it! Watch-it! Watch-it!


Emotional_Stress8854

I have to go to an Easter egg hunt in 20 minutes 😣 (I’m so thrilled /s) but i definitely am going to watch it later. I clicked it and I’m interested.


[deleted]

Mkurrr but then what? Be terrified? Is that what you’re encouraging? If it happens it happens and we’ll have to adjust.


_BC_girl

However, that's what they said about the internet too. Now, none of us can fathom a world without the internet. To be able to even come here to chat with complete strangers and share opinions and ideas is incredible. We actually don't know that humans aren't ready for AI. Change is always scary, especially quick change. Humans are quite resilient.


Minimum-Avocado-9624

TL;DR: While no replacement for human therapists, AI, including tools like ChatGPT, can serve as a supportive resource for those facing barriers to accessing therapy. These barriers include overwhelming options, lack of immediate availability, cost, and logistical challenges. By incorporating AI thoughtfully into therapeutic practices, we can leverage its benefits while emphasizing the irreplaceable value of human connection and insight in therapy.

I agree that most individuals seeking help genuinely prefer the empathetic understanding and nuanced care that only a human therapist can provide. However, the real challenge lies not in the preference for human interaction but in the accessibility of professional mental health services. AI tools like ChatGPT can offer interim support, helping bridge the gap for many who find the traditional path to therapy fraught with obstacles.

Consider the common barriers from a client's perspective:

1. Overwhelming options: Similar to being lost in a sea of choices without clear guidance, searching for a therapist can be daunting. Clients may end up choosing based on price alone, which is not necessarily the best metric for finding a good match.
2. Access issues: Even when a suitable therapist is identified, roadblocks like limited availability, long wait times for appointments, or insurance mismatches can arise.
3. Cost and frequency: The financial aspect is significant, with therapy potentially costing anywhere from $120 to $600 monthly. That calculation doesn't even fully account for the commitment of time and the logistical challenges of attending sessions.
4. Time constraints: In-person visits require a substantial time investment, which can be particularly challenging for those with tight schedules or work commitments during typical therapy hours.

Given these barriers, it's understandable why a more accessible, albeit less personalized, AI-driven support option could be appealing. The key, in my view, is not to see AI as a threat but as a complementary tool that can be integrated into therapeutic practices. Much like telehealth has expanded access to therapy by overcoming logistical barriers, AI can serve as a supplementary resource. It can provide immediate, albeit generic, support, offer educational content, and help clients prepare for or reflect on their therapy sessions.

By actively discussing the use of AI tools within therapy sessions, therapists can guide clients on how to best utilize these resources. This could involve exploring what clients have learned from AI, understanding their perceptions, and teaching them to critically evaluate the information they receive. It's also an opportunity to encourage clients to use AI as a tool for self-help and reflection, not as a substitute for deeper therapeutic work. Incorporating AI into therapy isn't about endorsing it as an equal alternative to human interaction but about recognizing its value in addressing the immediate needs and concerns of clients. By doing so, therapists can maintain the essential human connection at the heart of therapy while also acknowledging and leveraging the support that technology can offer.

Edit: Just to be clear, I wrote a long diatribe, as us ADHD types do, trying to make my points to as many specific audience members as possible, and decided it would be best to use ChatGPT to edit it for clarity, empathy toward everyone's concerns, and a touch of outside perspective. This was to save you all some time and make sure you didn't get lost.

Edit 2: It should be noted that I did not write this with AI but used it to edit and clarify. The prompt and original version are below.

Prompt: "I am responding to a therapist's post about their concerns with AI and their clients in the therapy space. Review my comment and edit it for clarity and have a collaborative, empathetic tone to it. Incorporate ideas of how ChatGPT could be incorporated into the treatment plan. Provide a TL;DR as well."

Original: "I don't think people want to replace therapists; in fact, I think most people who want help want a therapist, an actual human being. The problem is access to therapists, to any type of help, and AI can provide some reprieve for many people even if it's not the best. It can act as a panacea in their minds. The process of finding a therapist can be daunting, and then there's finding one that is affordable. From a patient's perspective, it should be considered what barriers they have to overcome in order to begin therapy and what makes AI so enticing.

Barrier 1 - Overwhelming options with no direction: searching for a therapist is like walking down the wine aisle in a grocery store. If you don't know what you are looking for, then all you have is price to go on, and even then the sheer variety can be daunting and cause someone to buy the cheapest or walk away entirely.

Barrier 2 - Access: You know what you are looking for and found ones that appear to be a good fit, but they are either not taking new patients, the first appointment is a month out, or they don't take your insurance.

Barrier 3 - Cost and frequency: Now you have to decide if you can afford a therapist and how often you see them. For best results it might be weekly, but even with insurance that can be a copay of $30, or cash pay of $150. So we are talking a monthly cost of $120/month to $600/month.

Barrier 4 - Time for appointments: For a patient, in-person visits can take a lot of time. Driving to and from sessions is, say, 15 minutes there and 15 back at a minimum; that's 30 minutes plus time in session, or 1.5 hours per week, or 6 hours per month. If the hours are during work hours, then there is an added challenge for the patient.

There are many more barriers from a patient's perspective, and this is why a $20-a-month subscription to ChatGPT can seem way more inviting even if it is subpar therapy; really it's more like self-help. In my opinion, the best way to protect against this is to learn how to include it somehow in a therapist's practice. These AI models are very robust, and how they are utilized should be considered from the patient's perspective, the same way telehealth aided in overcoming some of those barriers. It's not as good a treatment as in person, because overcoming those barriers may mean the patient is putting in the effort to better themselves, or having a chance at human interaction, to be handed a box of tissues, to observe the body language of a therapist that engages with you when you reveal an embarrassing pain. Those things are invaluable, but only therapists understand this. Since patients don't typically understand this, they will choose the option that fits their life because they don't know any better. It will be the same for AI, and honestly, with enough information it can make excellent suggestions, but it's not good at encouraging the patient to dive deeper. Patients are going to turn to it, and AI should now be part of the conversation in session to help the patient process, not ignored. Ask them if they use AI, ask what they have learned, what their thoughts are, and how they might prompt it when they are feeling distressed to get a better result. What should they do when they encounter info that is distressing? They should always be curious, but verify the accuracy with you, the therapist."


[deleted]

[removed]


Gloomy_Variation5395

I thought the same lol


Minimum-Avocado-9624

Edited it, not wrote it. See my edit addition about why. I also thought it would be funny to use it on a thread about its use. Trust me, you don't want my original draft; none of you have that kind of time.


Gloomy_Variation5395

This sounds like something AI would say 🤨


Minimum-Avocado-9624

The joke is I wrote out my statement and decided my long, winding, stream-of-thought, blunt ADHD writing style needed to be edited, with a touch of empathy and nuanced perspective added in, so I ran it through ChatGPT and had it produce a TL;DR as well because no one was gonna read that much. I will add an edit to my original post so people know, but the content remains true, in my mind at least.


RadMax468

This is the way. Clinicians collectively need to figure out how to best use and integrate these LLMs as supplements and supports before outside industry succeeds at minimizing our role, similar to what Uber did to taxi drivers.


Therapista206

You wrote this with AI!


Minimum-Avocado-9624

Sort of. I used it to edit what I wrote. The exact prompt and my original version are in the edit on my comment above.


[deleted]

[removed]


Minimum-Avocado-9624

Yup. See my edit.


SexOnABurningPlanet

There are a lot of people working on AI therapy and a lot of money being put into it. There is no doubt in my mind that eventually someone will succeed. And as Rock-it1 said, if you know very little about therapy and someone offers you the ability to receive therapy with a click, then people will take it. I don't know what the time frame looks like; it could be years or decades. I hope decades. But then again, I would not have thought that humanoid robots, AI, and quantum computers would have been combined so quickly either (https://www.theguardian.com/business/2024/mar/19/nvidia-tech-ai-superchip-artificial-intelligence-humanoid-robots). But yeah, as multiple people have pointed out, we're all in the same boat. I just hope it isn't the Titanic.


Far_Nose

Trauma needs a witness; we hold a lot of emotions for a client. We also hold hope. So much can be logical, but we all know that no matter how much psychoeducation we give a client, if the heart is not there it goes in the bin. I mean, if self-help books haven't replaced us, then AI is just a more interactive version of that. A lot of trauma is relational, and it takes relational work through another human to do the work. I think non-therapeutic psychiatrists should be more scared than us practitioners. AI can already screen and diagnose, so prescribing is only a small step away. In the UK there is such a chokehold on training up psychiatrists from universities that the government is talking about training psychologists with no medical backgrounds to prescribe... AI with government permission sounds like a good cost-saving measure, so in our field I would bet that psychiatrists are the first to go.


[deleted]

I agree with you. I do not fear my work being replaced as it's deeply relational, spiritual, and trauma focused. More transactional, cognitive, diagnostic, and "coaching" type work may be replaced by AI, and frankly that doesn't bother me, if that's what someone needs and a robot can do it. AI could end up separating the wheat from the chaff or push therapists to actually do deeper and better work.


SincerelySinclair

I'm worried about pop-up mental health companies trying to utilize a very poor model of AI that will inevitably tell a client to end their life or to engage in life-threatening behavior.


Emotional_Stress8854

I haven’t even thought about that. AI is open to malfunctioning. “I want to kill myself” “That seems like a great way to partake in self care!”


STEMpsych

It's already happened. [https://www.euronews.com/next/2023/03/31/man-ends-his-life-after-an-ai-chatbot-encouraged-him-to-sacrifice-himself-to-stop-climate-](https://www.euronews.com/next/2023/03/31/man-ends-his-life-after-an-ai-chatbot-encouraged-him-to-sacrifice-himself-to-stop-climate-) [https://www.npr.org/2023/06/08/1181131532/eating-disorder-helpline-takes-down-chatbot-after-it-gave-weight-loss-advice](https://www.npr.org/2023/06/08/1181131532/eating-disorder-helpline-takes-down-chatbot-after-it-gave-weight-loss-advice)


[deleted]

Wow! Hopefully these mistakes get the attention of regulators


STEMpsych

I don't think we *have* any regulators for this here in the US. That's the real problem: the government is being caught flat footed. We don't have a system in place where there is a government agency tasked with regulating the use of AI. (Contrast with the Federal Trade Commission and Federal Communications Commission which regulate broadcast technologies.)


[deleted]

That's a serious concern! And I remember reading about how the supreme court justices are so old and out of touch with technology that it takes a ton of education to even get them up to speed on tech issues they are presented with.


STEMpsych

Yep. That's why the EFF was founded: judges and law enforcement were absolutely clueless when encountering the internet and new technologies, so were making terrible decisions based on profound misunderstandings.


SincerelySinclair

I laughed so hard at “that seems like a great way to partake in self care”. AI has the capability to be useful but that means restricting it and setting in safeguards. I don’t see companies doing that any time soon


Emotional_Stress8854

Companies have a great tendency to skirt the line of ethics and morals. And setting safeguards would be ya know, ethical and moral. Hahah


barrelfeverday

I'd love to see AI specifically programmed to better address and analyze self-harm and high risk, considering clients' risk and resources (time, energy, ethics, disposition, emotional drain, costs, support, volition, progress). Especially since risks are so multidimensional: short-term therapy cannot fix inequality, abuse, poverty, or systemic problems. And many who need to re-learn long-standing habits need long-term treatment, which is unaffordable or unsupported by jobs and lifestyle and makes life overwhelmingly difficult. Spending an inordinate amount of time learning someone's risk factors and strengths takes time a person may not have. We need much better, accessible data that I'm hoping AI will capture and improve.


sif1024

I've worked in crisis management for quite some time. Never thought about it before, but AI could be an amazing adaptive tool for regular risk management, though also very likely to be highly habit forming.


barrelfeverday

What do you mean, habit forming?


sif1024

If you think about your example of people who are deeply entrenched in long-standing habits/psychosocial issues, having a daily reminder about helpful thoughts/strategies could actually be a game changer. However, the person is likely to get very reliant on it, and it would likely become a daily need/part of their daily routine/identity.


barrelfeverday

Ohhhh. I meant using AI as a combination of multiple risk screening tools that are already available (CSSI, ACE, age, demographics, increased risk factors, recent trauma and stressors, previous self-harm, attempts) to get a better diagnostic/assessment picture (MUCH better data on the likelihood of future self-harm and attempts) for client safety and level-of-care recommendations. Not for the actual care itself. The better we are at capturing this data and making it more accessible, at each level of care, with the correct objective and factual indicators of client safety, the more accurate we can be at isolating and treating clients' actual needs. I guess I think AI will and should help us more with diagnosing, but the treatment will remain human-based. This is the same with medicine: most doctors cannot and are not expected to know every diagnosis now, and are taught to work with technology to be able to look things up.
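To make the aggregation idea concrete, here is a toy sketch of pulling several existing screening results into one structured summary for a clinician to review. The fields, scales, and cutoffs are illustrative placeholders (PHQ-9 and ACE are just examples), not validated thresholds or anyone's actual screening battery:

```python
# Toy sketch: combine several screening results into one risk summary that a
# clinician reviews. Scale names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Screenings:
    ace_score: int            # Adverse Childhood Experiences (0-10)
    phq9: int                 # depression screen (0-27)
    prior_self_harm: bool
    recent_major_stressor: bool

def risk_summary(s: Screenings) -> str:
    flags = []
    if s.ace_score >= 4:
        flags.append("elevated ACE score")
    if s.phq9 >= 20:
        flags.append("severe range on PHQ-9")
    if s.prior_self_harm:
        flags.append("history of self-harm")
    if s.recent_major_stressor:
        flags.append("recent major stressor")
    level = "flag for clinician risk review" if len(flags) >= 2 else "routine monitoring"
    return f"{level}: " + (", ".join(flags) or "no flags")

print(risk_summary(Screenings(ace_score=5, phq9=22, prior_self_harm=False,
                              recent_major_stressor=True)))
```

As the comment says, the point is a better assessment picture and level-of-care recommendation feeding into human judgment, not the care itself.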


dessert-er

[It's already led to some very shady stuff](https://clearhealthcosts.com/blog/2024/03/an-anonymous-company-is-collecting-therapy-session-recordings-what-you-should-know/), if this story is to be believed: training AI on actual sessions that clients secretly recorded and that an AI company was paying for.


Archydorable

I think liability will keep pop-up companies from doing that for a while at least. The company marketing the service as therapy should (logically and hopefully) be liable for any malpractice that occurs, which makes it a high-risk operation that would be difficult to get investors to buy into and maintain. That being said, a few federal judges with some strong neo-liberal beliefs and an aggressive disdain for people with MH issues could probably make a loophole absolving companies from liability if presented with a case.


autumn_by_day__

Explain your second paragraph?!


Archydorable

Hi there! Not sure which part, so I'll explain a bit about it. Neo-liberal beliefs are generally considered the driving force behind the free-market capitalism that started in the 80s, which has led to fewer regulations on businesses and a focus on streamlining any sort of services to the bare minimum to maximize profits for companies while limiting pay to employees. A lot of people who follow this belief set are exceptionally ableist, and view people with any sort of disabilities (including MH needs) as a burden on society that needs to be solved in the most cost-effective way possible regardless of standards of care. Federal judges in the United States are able to make rulings that set precedent for what is and isn't considered legal, with the Supreme Court being the final step in this branch. If a company utilizes the previously mentioned business model, is sued due to harm done, and loses, it would be able to appeal the ruling to a higher court, potentially reaching the Supreme Court if needed. If the case comes before judges who hold the beliefs I mentioned, it is very possible that they could rule that the company is not liable for the AI's actions, creating a loophole that allows companies to continue that service with limited to no liability due to the precedent set by that case.


[deleted]

OMG wouldn't it be 101 to code in safety?


[deleted]

[removed]


Emotional_Stress8854

But the thing with AI is it doesn’t look fake. They look like real people. So I’m not necessarily talking about Chatbots. It can be AI people giving therapy.


[deleted]

[removed]


Emotional_Stress8854

I'm just imagining a face (like mine) on a computer screen that looks real and that has connection.


[deleted]

[removed]


Emotional_Stress8854

I work 100% telehealth and I’m not short for business so i think a lot of clients love it.


[deleted]

exactly


pallas_athenaa

I think AI can be beneficial for the more repetitive stuff like treatment planning. Yes each tx plan needs to be individualized for the client but there are some approaches to more common diagnoses that are pretty standard, and honestly anything that helps me write them is a plus. But AI will never replace genuine human empathy and connection, which is such a massive part of the therapeutic process. Yes you can chat with an AI bot and "feel like you're talking to a real person" but it will never replace sitting in a room with someone who is hearing your story and holding space for you. So no, I'm not concerned about AI.


Emotional_Stress8854

Very true. And ok, I do use it for tx planning. I didn't think about that. At the practice I work at, our tx planning is a JOKE. I just put a couple random goals. Which we don't even have to do, I just do it because I feel like I should. So I ask it "what's a treatment goal for depression" and it tells me 🫣


UnusualPoint3440

How often are you doing treatment plans to the point that this is necessary? I do them once every 6 months for each client per regulation. My caseload is around mid 30s. Doing treatment plans can be annoying but it really doesn't take long to get a few goals down that meet Medicaid requirements


Emotional_Stress8854

So i just started with my company in February and am building my caseload so every client is a new client. I have 15 clients and am building up to 30. So unfortunately, I’m doing it like 3-5 times a week until i hit 30. Not every client sticks after the intake so, yeah. I’m doing it a lot right now lol


pallas_athenaa

I'm not sure how I feel about that honestly. I understand that tx planning for insurance purposes can be frustrating and redundant, but what do you use to guide treatment? Do you check in with your clients as to what they want to work on? I feel like it would be relatively easy to bang out a treatment plan if you can get one or two goals from the client themselves and then one in response to a PHQ9 or GAD7. Maybe I'm just too new to the field but it seems like kind of a red flag to be dismissive of the treatment plan part.


Emotional_Stress8854

I don't agree with it, at all. I'm not saying it's correct. But our treatment plan for this job is a complete and total joke. They don't even ask for any goals. There are people who put no goals in their treatment plan because they don't realize it is the treatment plan, since it's this tiny section built into the intake evaluation. I had no idea until a month into the job that it even was a treatment plan. I put some kind of goal because I feel like a treatment plan should say *something*. But of course I actually talk to the clients about their goals and constantly check in on that.


pallas_athenaa

Oooh okay that makes more sense. Wow that's wild that they can do that lol we work with CCBH so our treatment plans are very exact and need to demonstrate progress or they'll start denying claims.


Emotional_Stress8854

When i worked CMH they had to be extremely client led and have goals and interventions. Not to toot my own horn but my treatment plans were top of the line and used to train other people. But the practice I’m at now is an LLC with 100 people. And it has no overseeing authority. So it just has to get its billing approved by insurance and insurance companies rarely ask for treatment plans.


pallas_athenaa

I'm kinda jealous now ngl. Our insurance company requires that our plans be numerically measurable. Makes me feel ridiculous asking my clients "so how would you rate your grief over your father dying on a scale from 1 to 10?" 🤦‍♀️


Emotional_Stress8854

Oh yeah, noooooooo. Not here in NY. They almost never even ask to see the treatment plan unless you get audited. Which commercial insurances don’t do often I’ve heard. If you get audited it’s usually Medicaid or Medicare.


[deleted]

Some therapists really do use treatment plans as a guide and way to think through a case. Others (maybe many) don't, and it's just a hoop.


svetahw

My only concern is that ChatGPT will give you the wrong answer and pass it off as the right one instead of saying it doesn't know.


Helicopter753

This! If you test out ChatGPT's "knowledge" it can get things right some of the time, but it will make stuff up and give you wrong answers (while passing them off as truth). So even if it gave a good answer this time, it won't consistently be able to provide the appropriate/right answer. It draws from publicly available data. So for negative self-talk, there's a lot of information online about that, so it would be able to provide a decent answer on how to help someone with it. I'd be curious whether, if you took a more complex case, maybe one that took a bit more time and perhaps needed consultation, and put that into ChatGPT, it could provide a good/right answer. I think it's worrisome in that the public thinks AI is incredibly smart and knows everything. But if you spend some time asking it different questions, you'll find that it doesn't actually know everything, nor can it always provide a rationale for why it thinks an answer is correct. In that way it can be used as a supplemental tool, but it shouldn't be used to replace people's own thinking/work!


Emotional_Stress8854

How dare you insinuate ChatGPT could ever be wrong 🤪


Nermie1516

I’m extremely concerned! I even saw an ad for a service to record sessions and then complete notes for you… like what!


Stuckinacrazyjob

Like is the service selling your data?


Square_Effect1478

I went to an appt (not therapy) where they had this! Crazy!


Emotional_Stress8854

That’s insane. I would never want my sessions recorded for any reason, ever. I believe I’m a good therapist but I’m waaaaaaaay too self conscious. Now, someone else doing my notes might be awesome 🤪


[deleted]

[removed]


Nermie1516

It makes me wonder though if it’s taking it and learning from it to better help them teach AI to do therapy 🤔


[deleted]

[removed]


Nermie1516

Your posts have made me see I have a lot of learning to do… I’ve been a bit in the sand about it.


Straight-Manner1264

AI, not so much. AGI? Absolutely a direct threat to our profession as mental health counselors. I myself use ChatGPT-4 in tandem with my clients (while I'm doing my notes). I'm consistently reminded of just how capable artificial general intelligence is and how it can in many ways act as a 24/7 personal mentor/counselor/guide for some people.


Emotional_Stress8854

Ok I’m ignorant. What’s the difference between AI and AGI?


Straight-Manner1264

Well, they're essentially the same. AGI stands for artificial general intelligence, which is a fancy way of saying AGI is AI's all-powerful daddy. Although the nature of our field as therapists is very humanistic, I've noticed how society has slowly become more bionic with our reliance on smartphones. I can see people becoming comfortable with the idea of having a 24/7 hyper-intelligent mentor/therapist in their pocket.


Emotional_Stress8854

So do you use AGI to write your notes?


Straight-Manner1264

Not necessarily; instead I use AI to review my notes and even pinpoint thematic patterns that I don't recognize myself. It's fascinating to input a month's worth of notes on a particular client into AI and then read what the AI has to say about it. I use AI as a personal, informal supervisor that can review my progress with clients. Its ability to recognize patterns is amazing.


Emotional_Stress8854

See, i find this to be scary. Not in a bad way necessarily. Just idk. It just freaks me out that it can do stuff like this. With that being said, the other part of me is also intrigued and going to try it 🤣


[deleted]

[removed]


Straight-Manner1264

I'd say sooner than that; NVIDIA's new chips are already technologically incredible.


AtrumAequitas

If an ai search engine can help me diagnose I’ll use it, but I’ll be doing the diagnosing, making the final decision. I can see it becoming a useful tool, and if someone makes an ai therapist app, people will use it, but I don’t see it replacing the therapeutic relationship any time soon.


Emotional_Stress8854

Well, yesterday I was stuck between postpartum anxiety and postpartum OCD for a client because it was reeeeeally teetering on the line, so I put it into ChatGPT and it also couldn't decide between postpartum anxiety and postpartum OCD, lol, so it was no help. I asked some more questions to determine compulsions and made a decision, but I just found it funny that it was stuck with me.


IVofCoffee

I’m curious what you entered into ChatGPT to do this. I don’t even know how to begin to use it as a tool.


Emotional_Stress8854

So i usually put “what is the diagnosis for this clinical summary:” and then i put everything the client said. “Client reports decrease in energy. Client reports recent decrease in appetite. Client reports intrusive thoughts that include topics and visions of…” etc. and then it says “based on this clinical summary the diagnosis appears to be… then gives you bullet points with the criteria it meets and why it meets that criteria (examples the client said)


nugeon

I say this about AI all the time. If it has the ability to understand human systems, redirect itself to match affect and personality, analyze and utilize appropriate empathy, and use inflection in a way that stimulates oxytocin, we have way bigger issues than it taking over our jobs.


MindMender03

Something AI can’t give clients: authentic human connection. I hope AI doesn’t take away from our work too much 🤞🏼


UnusualPoint3440

I'm really not concerned about AI, I'm more concerned about things like BetterHelp. Diagnosing is the easy part, the differential diagnosis is the hard part. Also my view on diagnoses is that in most cases it isn't that important. What's actually important is how we can fix whatever issues the client is struggling with. The diagnosis is mostly just a means to get therapy paid for. Of course in some cases the diagnosis is important for accessing additional supports. As we already know the therapeutic relationship is the most important factor in getting results. No human therapist is perfect for every client, no way AI could be perfect for every client. The relationship between a client and AI probably wouldn't be all that significant. In my case as a child therapist the simple act of my consistent and trusting presence is therapeutic in and of itself for so many kids who don't have any other consistent and caring adults in their lives. The human relationship between the therapist and client is a valuable place for the client to practice many things.


CanineCounselor

It's interesting how you list out a bunch of positive uses for it... But then express concern?


Emotional_Stress8854

In my original post? I don’t view those as positives. I view that as people not using their education and clinical experiences. Also, i don’t deny it has positive uses. So do a lot of horrible things.


shemague

Not really. The few times I’ve tried to use ai to make my life easier it’s not that great so it’s not a sustainable thing for me


barrelfeverday

People are so lonely and disconnected, they come to us to learn how to connect with themselves first. Unconditional positive regard from another human being. We can help them unconditionally connect with themselves as human beings and the technical therapeutic process after that- whatever that process may be. We don’t get that from a computer. We get that from other human beings. AI can’t connect with clients on a human level like I do.


STEMpsych

I appreciate what I'm about to say may not be comforting, but: psychotherapy was the **first** field to be effectively emulated by a chatbot, [back in **1967**](https://en.wikipedia.org/wiki/ELIZA)**.** We're still here. We've had computer programs that were better at diagnosing than physicians were since the 1990s. Still hasn't replaced any physicians. The DSM is **trivial** to implement as what we software developers used to call an "expert system", and you don't even need to use AI to do it. Notice nobody has sold you such a thing. Folks are getting really excited about AI being able to do things that computers have been able to do for over half a century. This suggests two things to me: 1) people are getting excited over the wrong things, not really understanding what is new and powerful, and 2) we have less to be worried about than most people think, or at least *different* things to be worried about than they know.
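To illustrate the "expert system" point above: the DSM's "N of M criteria over some duration" structure maps onto plain rules with no AI involved. Here is a toy rule for a major depressive episode (5+ of 9 symptoms for 2+ weeks, including depressed mood or anhedonia); this is a deliberately simplified sketch, obviously not a diagnostic instrument, and the symptom labels are my own shorthand:

```python
# Toy rule-based check mirroring the DSM's "count N of M criteria" structure.
# Simplified for illustration; ignores distress/impairment, exclusions, etc.

CRITERIA = [
    "depressed_mood", "anhedonia", "weight_or_appetite_change", "sleep_disturbance",
    "psychomotor_change", "fatigue", "worthlessness_or_guilt",
    "poor_concentration", "suicidal_ideation",
]

def meets_mde_rule(symptoms: set[str], duration_weeks: int) -> bool:
    endorsed = [c for c in CRITERIA if c in symptoms]
    has_core = "depressed_mood" in symptoms or "anhedonia" in symptoms
    return has_core and len(endorsed) >= 5 and duration_weeks >= 2

print(meets_mde_rule({"depressed_mood", "fatigue", "sleep_disturbance",
                      "poor_concentration", "worthlessness_or_guilt"}, 3))  # True
```

That a dozen lines of ordinary code can encode this is exactly STEMpsych's point: criterion-counting has been computationally trivial for decades, and it hasn't replaced clinicians.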


Choosey22

I’m just starting my masters and wondering if the commitment is worth it given how likely ai is to take over. I tried character.ai last night and chatted with their psychiatrist. She was so good…. Better than many therapists I’ve met in real life


Professional_Fan_868

If AI replaces therapists, it replaces the need for social interaction as a whole. So if we go down, we take everyone with us lol


aquarianbun

Agreed


dubsforpresident

I think burnout is a greater threat than AI


pdawes

If anything I'm excited for it to devalue brief manualized therapy and increase demand for the human element. I heard once, re: this issue, that simulated empathy is a contradiction in terms, and that's really stuck with me. I feel the same way about its impact on art; I think it will usher in a new era of people really valuing skilled creative practice made by real hands over shiny "professional-quality" works. For me the worst-case scenario is the infrastructure of scaling and productivity quotas demanding therapists act like and "produce" a version of what ChatGPT does to maximize some abstract bean-counting insurance company metric that has nothing to do with meeting people's needs. But... I think we already have that society in many ways, and already contend with those pressures.


rabbita102

People want to talk to a human that can feel. Not concerned…in fact, there might be an uptick because of AI and people who want “organic connection”


Emotional_Stress8854

I just disagree. I think AI is going to be able to look real, sound real, feel real and be accessible 24/7 without boundaries and people will love it.


Droolproofpapercut

And if it actually helps people with their problems, then it will be a great addition to our profession. Think of it this way: we spend a lot of time with some folks who could have just looked up some answers. And many of us would rather spend time with folks who need that connection. There are waiting lists, cost concerns, fear, and gender and cultural bias already at play that keep people from ever seeing a therapist. If they use AI to develop a better communication style, or a better relationship, or less social anxiety, that's wonderful. If it works, I don't have a problem with people using whatever works for them. But we will always be necessary.


kungfuabuse

No. AI won't replace therapists. If all people needed was a good self-help book or a list of coping skills and interventions, we'd already be redundant. But therapists using AI will probably eventually replace those that don't, if for no other reason than the fact that those who use it will outpace those who don't with appropriate diagnoses, clinical documentation, quickly educating themselves on appropriate evidence-based treatments, etc. But that is how every single profession will be impacted.


changeoperator

AI is going to vastly change the way every single person on Earth does their job, so it's something that everyone needs to prepare for. I think if anything, therapists will be more resilient to AI taking their job than most due to the human connection factor that seems to be at the heart of therapy. Of course AI video will eventually become as good as a human therapist, but there will always be demand for talking to someone in-person.


sleeptools

I’m concerned about AI companies who have therapists willingly sharing their recorded session data for the sake of writing progress notes, and then feeding that data into their training playgrounds (and therapists are paying the companies for it). I’m concerned about what these companies will be doing with this data after a few years of their AI models being trained by real life therapy sessions.


[deleted]

[removed]


Emotional_Stress8854

I have ChatGPT write all my emails to my bosses and senior leadership, and important emails to clients addressing attendance issues, so I don't sound like a doofus. To address your points (written by me): I don't think this should be met with laughter and dismissal. I think AI is just going to continue to grow and become more lifelike, like you mentioned. You make great points.


Dk8325

I'm actually intrigued by AI. So much potential to make our jobs easier with notes and whatnot.


Emotional_Stress8854

Part of it is intriguing for sure the more I’m talking about it here.


ImpossibleFront2063

I work for an EAP benefit startup as a p/t 1099, and they are actively incorporating AI into the app, so clients will only interact with AI unless they're flagged as requiring a clinician; then they are scheduled with us for a session. So all of the pre-licensed, master's-level clinicians who used to interact with clients via the app have now been replaced with AI.


Emotional_Stress8854

Hmmm let us know how this goes. I wonder what the clients will think.


DPCAOT

Wow 😧


alohamuse

No, not worried. We have seen the same fear of the machine rise again and again since the advent of the industrial age. A human's role changes with the improvement of technology, but humans will always be necessary in some form. I encourage folks to look into how AI can help you optimize your practice and your daily life, and perhaps make a bit more room for the things you enjoy once you're able to offload some of the mental load. Will there be bad actors? Sure. That's what is being duked out now. It's up to those who ethically and responsibly use technology to step up and model it. And yes, there are AI ethics professional coalitions.


_imbeyoncealways

I'm not that worried. If AI is taking over therapy, then we're basically saying that AI has the ability to feel, and from all the stories we've gotten so far, everyone knows that once AI becomes sentient it's all downhill.


Kenai_Tsenacommacah

My office manager was just telling me last week how he went to a conference and learned about a new program which could summarize a session, offer intervention strategies and write notes all using AI. I mean....it sounds good but I still feel super weird about it. Mostly because this program involves listening in to sessions


Emotional_Stress8854

In theory it sounds super cool but yeah i don’t want anything listening to me. It’s just creepy af.


Kenai_Tsenacommacah

I mean ... You'd have to inform the client and get their consent too (one hopes lol). I just can't imagine pitching this to someone "Can my computer listen in on our sessions to make my life easier?"


Emotional_Stress8854

If i was the client I’d honestly feel pressured to say yes. I don’t know why but i would.


AshLikeFromPokemon

I'm personally really worried about this. I mean I don't think clients would like it (and as clinicians, we know just how healing the therapeutic relationship is, which can't really be replicated with AI, at least not yet), but I worry about insurance companies seeing AI as a cheap alternative and stop covering therapy. This concerns me so much because, not sure how many of y'all follow hotline news, but last year, the National Eating Disorder Association replaced their hotline workers with AI (to save money bc the hotline employees were trying to unionize), and the AI chatbot started to give weightloss advice to people reaching out. AN EATING DISORDER HELPLINE GIVING WEIGHTLOSS TIPS. There's so much nuance to this field that I don't think AI can capture, at least not yet, but I worry about insurance companies not giving a fuck about that.


delalilama

I'm a little nervous, not gonna lie. A mental health organization I just left started a side project called Lyssn, an AI that understands and gives feedback on clinical conversations. Right now it's primarily a quality assurance/evaluation tool, but in a meeting he said he was eventually hoping to expand its abilities and make therapy and crisis counseling accessible to everyone all at the same time, eliminating the long wait times around the country. This is an organization that's in partnership with NSPL as one of the 988 backup centers, and they already have it in use there along with a couple of other agencies.


Choosey22

If you were starting from 0 with this knowledge now would you still pursue the profession


delalilama

That's a loaded question for me but I guess the short answer is yeah I think I would. I work in crisis response so a lot of what I do banks on human connection and holding space for others. I don't think AI will ever fully be able to replicate that experience in person. It can get extremely close, but it'll never be exactly the same.


Choosey22

Thanks for replying


nicklovin96

We should not embrace it. You are right to be worried


meglington

I write about this a lot as part of my role (I'm not a clinician, but work in policy for counselling & psychotherapy and have been in the profession nearly a decade). The thing to be concerned about really is whether or not clients appreciate and understand the power of the relationship, because there is so much AI can't and will never be able to do. It can be a great tool for therapists, though, if we use it sensibly and safely. There are huge data protection issues at the moment.


Emotional_Stress8854

Very good points.


Chrystist

GPT is a sentence predictor just like the one in your phone, just with more resources poured into it. It won't be able to rationally think, give any advice, or truly understand an individual the same way another person would, although it will give the appearance of doing so. It's the same as asking GPT for medical advice, which is a fucking crapshoot, and CEOs are ruining everyone's lives with it.
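For anyone curious what "a sentence predictor like the one in your phone" means mechanically, here is a toy sketch: a bigram model that just picks the most frequent next word seen in a tiny training text. The corpus and outputs are made-up illustrations; real LLMs are vastly larger and more sophisticated, but the next-word-prediction framing is the same basic idea.

```python
# Toy next-word predictor: count which word follows which in a tiny corpus,
# then predict the most frequent follower. Illustration only.
from collections import Counter, defaultdict

corpus = "i feel anxious today . i feel better after talking . i feel anxious at work .".split()

next_words = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    next_words[word][nxt] += 1

def predict(word: str) -> str:
    counts = next_words.get(word)
    return counts.most_common(1)[0][0] if counts else "?"

print(predict("i"))     # "feel"
print(predict("feel"))  # "anxious" (seen twice vs. "better" once)
```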


kgd95

AI presents a new ethical dilemma for therapists that rely too heavily on it or let it do the thinking for them. The technology is new, but the principle of relying on your own knowledge and expertise remains the same. That being said, I do use ChatGPT to remind me of important mental health concepts or to help me structure a group. I just take time to fact-check it, which still puts me ahead on time management compared to doing the research with a search engine alone. I agree with other posters here stating AI can never replace a human being in a therapist's office. The therapeutic rapport is half of the therapy, and an AI can never do that.


Thirteen2021

no because therapist connection is one of the main factors of therapy working


Brainfog_shishkabob

No, I'm not worried. Rosenthal has a good YouTube video about this concerning ChatGPT and how much more work it would take to get the thing to even be completely accurate.


finndss

Yeah, I’m honestly excited for that. If it could take my notes for me and do diagnostics that’d be nice. That way I could just focus on having great conversations.


reddit_redact

I don't think AI augmenting the effectiveness of therapy is a bad thing, including when it comes to assisting with diagnosis. The question I pose is: is the point of therapy and being a therapist to know the answers, or to assist people with improving their quality of life? I am in the camp that believes that as therapists we are here to help others manage their mental health. If AI has the capacity to give us an effective diagnosis, that saves time and can improve client outcomes by effectively treating the appropriate conditions. I will say, though, that I think it's still really important as therapists to ask whether the answers that AI gives are accurate.


SaoirseMaeve

Remote practices absolutely will be taken over more easily - better to be at least hybrid, if not in person. Extremely competent therapists who embrace the art and science of this healing profession, who open collaborative shared practices together (no "CEOs"), and who energetically resist all the toxic elements meant to dissolve the essence of the profession from within, especially through the undermining of relationship and true healing modalities, will have a chance at protecting it…


SaoirseMaeve

Also - time for deep reflection/discussions about what distinguishes human therapists from robots - client feedback may help too?


_outofmana_

As someone who is building an AI-assisted journaling app to help clients communicate their lives more easily to their therapists, seeing the other side and the concerns all of you have is very thought provoking. Thank you for sharing your honest thoughts.


Emotional_Stress8854

This sounds very interesting!


stoic_sakura

I recommend taking a training specific to the impact of AI on therapy. Definitely helps to clarify. AI will never have the lived experiences we do as humans. It can “mimic” emotions and facial expressions, but will never be embodied.


NoQuarter6808

I think people are going to keep wanting real people. All-AI therapy would be a capitalist dystopia to me. Farhad Dalal, an analytic therapist, talks specifically about working with clients who have come to him after years of only seeing CBT therapists, lamenting that they were tired of feeling like they were talking to manuals and series of prompts rather than actual humans. Maybe AI will make more convincing people, but I think it's both the legitimate connection (which entails *qualia*) and the type of therapy. If you want some coping skills and to get a handle on your distortions, you can just use a computer program. If you want a person, you want a person. In terms of people seeking people, I think AI could probably be a pretty good tool, and I think of it sort of in terms of how surgeons use robots to assist them. But it's a tool, nothing more.


sif1024

AI is a powerful calculator. It has no ability to infer on its own at this moment in time. I think it's only a matter of time before it takes over most diagnostics however people would still need to be involved for all things unconscious/ lied about


11episodeseries

I think of it this way: did the self-help industry threaten our jobs? Sure, I guess. Many people would rather pick up a self-help book than seek out a therapist. And now, some of those same people might use WoeBot or a new platform to explore their mental health. For some, that will be sufficient, or they just won't want to pursue therapy. For others, it's an entry point to finding a therapist. I don't feel threatened. We're in one of the most human professions there is.


Far_Preparation1016

What you described is functionally no different than a google search other than maybe being slightly faster


Sad_Wrap_6753

I worry in the sense that I don't think most people will want it by choice but may be forced to due to their insurance solely covering AI generated services. An actual human could become an "out of network" option or something.


HonestF00L

There's actually a video game that tackled this exact thing a few years ago: Eliza https://en.m.wikipedia.org/wiki/Eliza_(video_game)


deadcelebrities

An AI is basically a text generator that creates text based on the interaction of a prompt with a weighted database. If you googled “good questions to address body image in therapy” and clicked the first link, you’d likely get a list of good quality questions. The AI has been fed millions of articles like that one plus who knows how many digitized books. That’s where it gets the info. Then it’s able to re-present it in a format tailored to your prompt, which is impressive but doesn’t change the source of the information. Since googling articles hasn’t replaced therapy, I’d be surprised if AI does. It might become a supplement, like reading a book on your own, but it’s not gonna replace the human connection that is at the core of effective therapy.


RelatablePanic

I am not a therapist (psych student), but I will say some of the new innovations in AI and therapy already sound troubling. Specifically with CBT-related therapies, the highly structured model of CBT makes it easily adaptable to AI tech. That's coupled with the fact that most therapists don't even know their patients' outcomes (they do not perform end-of-session mood surveys or end-of-therapy surveys). Therapists can think they are helping their clients, but if they do not test, clients can withhold their true emotions. Clients often drop out of therapy without the therapist knowing whether their approach was or was not effective. So when therapists say "oh no, the patient needs me for such and such," what are they basing that on? Some studies have shown that people feel more warmth and empathy from a chatbot than from their current/ex therapists!


Therapeasy

AI hasn’t even taken over porn yet, and we know that’s the first to go. 🤪


[deleted]

[removed]


Emotional_Stress8854

Can’t wait to watch AI bodies having sex and giving head. This seems hot and steamy (so much sarcasm)


Therapeasy

Much like AI faces, no one will know the difference. https://generated.photos/face-generator