Will robots replace therapists?
Sit back, relax, and tell the algorithm what's on your mind
You've heard the news: Robots are coming for our jobs. Bookkeepers, umpires, factory workers, and even legal assistants could all see their employability disappear in the next 20 years.
But what about jobs that require a more personal, human touch? Surely those are safe, right?
Actually, new innovations suggest that artificial intelligence is invading even the world of physical and mental health care. For the millions of people seeking mental health treatment from a living, breathing human, this raises a question: What role will robots and AI play in the world of therapy? Will therapists and counselors be replaced by our unfeeling robot overlords?
We're already seeing AI make some advancements here. Take, for example, a new program called Woebot from Stanford researchers. Woebot is essentially a chatbot therapist. It uses Facebook Messenger to administer a very common form of treatment called Cognitive Behavioral Therapy (CBT), which, as Megan Molteni at Wired explains it, "asks people to recast their negative thoughts in a more objective light." Once users see their negative thoughts laid out, they can start to recognize patterns and triggers, and try to stop them. Woebot checks in daily by sending messages and asking simple questions along the lines of "How do you feel today?" or "What are you doing right now and what's your general mood?" Because it's a robot, it remembers a user's responses and "gets to know" them as time passes. It recognizes changes in mood and can tailor suggestions, the same way a real therapist might.
Woebot is designed to help people overcome fears of being judged or stigmatized, letting them obtain mental health help in a format that's both familiar and anonymous. People may not admit to friends that they're struggling, but might readily confide in Woebot — especially because they know the app won't judge them. Of course, Woebot isn't a complete replacement for face-to-face therapy, but not everyone has the time (or the money) to see a real therapist. And research suggests this kind of thing can really work: One recent study shows Woebot could reduce anxiety and depression in users.
This isn't the first attempt at creating a robotic therapist. Researchers who helped create Ellie — a robot that helps veterans battling post-traumatic stress disorder — agree that robots "listen" without judgment, and that people may confide in them more honestly. Ellie analyzes tone of voice, eye gaze, facial expressions, and head gestures to look for indicators of depression and post-traumatic stress disorder in patients. However, the robot's developers stress that she's not a replacement for human therapists, because unlike Woebot she doesn't try to offer any kind of treatment; she just gathers data.
But there's another problem: Sometimes patients enter therapy but don't stick with it, or lack the motivation to change their behavior or environment. It's hard to identify who will or won't follow through, but a team of Penn State engineers is working on ways to use machine learning to create customized mental and physical health plans that help patients stay motivated. Their approach is based on a gaming technique: Users are encouraged to move through virtual environments and perform certain tasks. As they do, scenarios get progressively harder, demanding more energy and greater motivation. The patient's performance could help researchers measure their personal level of motivation, and tailor mental health treatment accordingly to keep them interested and committed.
One of the most difficult parts of treating people with mental health problems is identifying those at the highest risk of self-harm. Suicidal thoughts are not usually rooted in a single, isolated incident such as a relationship breakup, job loss, or death of a close friend, and that unpredictability is a problem for clinicians. Scientists are looking at how machine learning might help: By examining huge quantities of data and pulling out patterns that humans might miss, algorithms could help spot potentially suicidal patients.
In one study, an algorithm predicted suicide attempts with surprising accuracy. The research focused on improving clinicians' ability to predict suicide, and found that today's clinicians are no better at definitively identifying the factors that lead to suicide than mental health specialists were 50 years ago. Machine learning could be the missing link that leads to major advancements in reducing self-harm through prediction and prevention.
All of this said, while robots are already supplementing the work therapists do, they can't create genuine connections with clients — the kinds of connections needed to really help patients thrive. For now, that's something only humans can do. So take comfort, doctor: You're not out of a job just yet.
Kayla Matthews is a technology journalist and writer, contributing to The Week as well as publications like VentureBeat, Motherboard, and MakeUseOf. She is also the owner and editor of the productivity and tech blog Productivity Bytes.