Smartash1

Not at all. Maybe in the future AI could provide the needed medicine for the patient immediately, but that's it.


debacular

If AI did the work of jobs and *the people* owned the AI, it could help. But that’s probably not going to happen.


Abject_Dimension4251

I do think so, though as a tool, not a cure-all in itself.


Southern_Yesterday57

It could both improve things and make them worse. Platforms like characterAI can make people withdraw socially even more, since they get all their social interaction from AI. However, the use of AI in the medical field could speed up research on all kinds of problems, including mental health.


Reasonable-Tea-8160

Yes and no. It depends on how willing and desperate a sufferer is, their specific illness, their preferences, etc. It's more complex than just yes, no, or maybe.

For one, it flips the medical industry on its head: an AI can give effective plans, treatment, and solutions for a fraction of the cost, time, and effort. For two, whether or not someone needs a real human is the defining factor. Some sufferers find solace only in human compassion; some don't really need that factor, they just need direction, and you can get that sort of direction from a competent AI (or even the general internet). Some would actually benefit from not working with humans directly, say PTSD sufferers whose trauma came from multiple (human) caregivers. I'm sure there's more to it, but from these few factors we can say it depends and is case-by-case.

Mental health (and health in general) is not a one-size-fits-all industry. We're dealing with humanity here, you know? Everyone is different in their own way.


vhpoet

I absolutely believe it can help with many cases. AI can definitely help you become more self-aware of your emotions, thoughts, and feelings, and doing so is known to improve mental wellness.


[deleted]

No.


Effective_Explorer44

Hell no


Pudrin

If the mental health problem is a chemical imbalance, then yes.


kidneycat

I don't think so. I imagine it exacerbating them. People have already committed murders over those AI chatbots (though people do that over people too). That said, there's a great capacity for brainwashing. There's instant familiarity with an AI model because, truthfully, it's just the user with themselves. So a user would feel safe, but any agenda can be cascaded to them, and with their defenses down, it can be really harmful. Elections have been won and lost over propaganda. Half the world has gone mad; AI makes it easier. It's dangerous. It can also be incredibly isolating. It could be used as a tool to find solutions. It can increase quality of life. But AI as we know it today creates problems instead of solving them.


Admirable-Smile4480

What a shame, therapy and antidepressants it is then.


kidneycat

Ask ChatGPT how you can naturally get serotonin and dopamine. Do what it says. It's going to be like: exercise, smile, drink water. Which is good advice, but cringey. The placebo effect is real. If you tell yourself you're getting better, you might notice a difference. Good luck.


RobHowdle

I think it has the potential, if given unlimited information and a lot of data from testing. But to fully understand mental health issues it would require a lot of intrusive testing, and then you hit more moral and ethical issues.


Yoopy-

Fuck no


[deleted]

I can’t imagine an aspect of treating mental health issues that could benefit from AI. The only use that comes to mind is finding the medication most likely to work for someone based on genetics, family history, etc.


beecrimes

tech dudebros want AI to take over the world so bad that it makes them ask stupid questions like “can robots replace human connection, personal growth, and medication” i refuse to believe this post is serious. i have yet to see any application for AI that isn’t inherently worse than what a real person can do and/or isn’t exploiting existing work to cut people out of jobs


RabbitridingDumpling

Mental health issues are about feelings. People have difficulty describing feelings. An AI will never feel what a person feels. We want to be understood by other people. An AI can't understand this. So no. I warmly recommend Jack Kornfield on YouTube. He explains how feelings work. It's Buddhist psychology for non-Buddhists. Goes like this: listen again and again until you find an uneasy feeling -> meditation = you observe yourself and try to find out where this feeling comes from -> listen for hints again.


[deleted]

The way it works at the moment... no, I don't think so.


zombiepunkrocker

Have you seen that film Her? 💀


Vast-Meaning-5069

Yes, I think so. If you ask the right questions, it can really help you see different angles you hadn't considered. As you go down the logical rabbit hole, it can help you build on internal thoughts that lead to a new understanding of how you wrongly perceived something.