The mental health field is increasingly looking to chatbots to relieve escalating pressure on a limited pool of licensed therapists. But practitioners and developers are entering uncharted ethical territory as they confront questions about how closely AI should be involved in such deeply sensitive support.

Researchers and developers are in the very early stages of figuring out how to safely blend artificial intelligence-driven tools like ChatGPT, or even homegrown systems, with the natural empathy offered by humans providing support — especially on peer counseling sites where visitors can ask other internet users for empathetic messages. These studies seek to answer deceptively simple questions about AI's ability to engender empathy: How do peer counselors feel about getting an assist from AI? How do visitors feel once they find out? And does knowing change how effective the support proves to be?
