- cross-posted to:
- secularhumanism@sh.itjust.works
- globalnews@lemmy.zip
It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?
Eventually, yes, I think it will be. Not yet though, the tech just isn’t strong enough at the moment. But an AI is immune to the emotional toll, burnout and low pay that a real-life therapist has to struggle with. The AI therapist doesn’t need a therapist.
Personally though, I think this is going to be one of the first widespread, genuinely revolutionary things LLMs are capable of. A couple more years, maybe? It won’t be able to handle complex problems; it’ll have to flag and refer those cases to a doctor. But basic mental health maintenance is simpler.
That would assume the people designing AI want what is best for the person and not what will make them the most money at the expense of the consumer.
The companies involved in AI are NOT benevolent.
You could just run your own. There are plenty of open source models that don’t answer to any company.
Why don’t I just give myself therapy? I know way more about what is going on in my head than anyone else does.
Because what’s going on in your own head would taint your treatment plan and cause it to be a self-defeating plan.
Maybe one day that’ll actually be possible.
Yes, one thing it absolutely has to be good at is referring patients to human therapists, for anyone who needs something beyond the standard strategies the AI is trained on. It has to be smart enough to know when to give up.
Edit: it would also be great if the AI would match up these difficult cases with therapists who are known to do well with whatever the patient is dealing with, as well as matching according to the patient’s personality, communication style, etc., wherever possible.
Edit 2 for clarity above
Where is the profit in sending someone to a different AI for help?
I meant referring them to human specialists.