Is artificial intelligence the cure for America's loneliness crisis?
AI companion apps are seen as a possible solution to isolation and anti-social behavior, but some people worry they could make the problem worse.
Last month, U.S. Surgeon General Dr. Vivek Murthy released an advisory sounding the alarm about an epidemic of loneliness and the public health crisis it could become. Some people see possible solutions in the expanding world of generative AI chatbots, but experts argue the technology could cause more harm than good.
With anti-social behavior on the rise, Murthy compared the adverse effects of loneliness to smoking 15 cigarettes a day and said the crisis deserved the same level of attention as "tobacco use, obesity and the addiction crisis." If the crisis goes unaddressed, it could cost the country millions, and "we will pay an ever-increasing price in the form of our individual and collective health and well-being," Murthy warned in the advisory.
With the advent of AI companion apps like Replika, the so-called "AI companion who cares," finding a short-term digital fix to fill the void is easier than ever. But will AI technology only drive us further apart?
AI companions can't replace real human connections
It's already clear that overdependence on technology can harm our mental health, and "now chatbots and other AI programs could further replace the critical social interactions that help us build community," Daniel Cox wrote for Insider. While AI companionship may seem like a convenient solution, "making things easier is not always an improvement," and chatbots ultimately can't replace genuine human connection. Using AI to replace human interactions "would deprive people of their mental and social benefits," Cox added.
In a separate post on the American Storylines substack, Cox noted that the most significant issue with AI companions like Replika is that "it asks nothing of us" and promises to always be on the user's side. "A relationship that requires us to make no sacrifice or accommodation, that never challenges our beliefs or admonishes our behavior, is simply an illusion," he concluded.
It could benefit some, but we should be wary of "an easy fix"
Ina Fried, Axios' chief technology correspondent, said that, like most technologies, AI has the potential both to help and to harm. "For people for whom what's really missing is purely interaction, I think there are ways that AI is gonna be able to help," she said on the "Axios Today" podcast. The difficult part is "to do it in a way that augments whatever human contact people have and that doesn't ignore the limitations." She could see it helping someone with dementia "who really just needs that talking to and might repeat the same stories over and over." Still, Fried worried about the potential to become too reliant on "an easy fix to humans that has become challenging or inconvenient."
Relying on AI could set "a dangerous precedent"
Depending on artificial intelligence to combat mental health issues and widespread loneliness "sets a dangerous precedent," Dr. Sai Balasubramanian wrote in Forbes. So far, no AI systems can effectively "replicate the intricacies of human nature, interaction, emotion and feeling." Health care industry leaders and regulators should keep that in mind and "prioritize viable and sustainable measures to resolve the mental health crisis, such as training more mental health professionals and increasing patient access to care."