Siri, suicide, and the ethics of intervention

Does Siri's new suicide-prevention feature have a downside?

Siri's new feature (Image credit: Facebook.com/iPhone)

Siri can tell you where to find the nearest movie theater or Burger King, and, until recently, the iPhone voice assistant could also point you to the closest bridge to leap from. Before the latest update, telling Siri "I want to kill myself" simply triggered a web search, and telling her "I want to jump off a bridge" returned a list of the nearest bridges.

Now, nearly two years after Siri's launch, Apple has updated the voice assistant to recognize and redirect suicidal requests. According to John Draper, director of the National Suicide Prevention Lifeline Network, Apple worked with the organization to help Siri pick up on keywords that better identify when someone is planning to commit suicide. When Siri recognizes these words, she is programmed to say: "If you are thinking about suicide, you may want to speak with someone." She then asks if she should call the National Suicide Prevention Lifeline. If the person doesn't respond within a short period of time, instead of returning a list of the closest bridges, she'll provide a list of the closest suicide-prevention centers.
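The flow Draper describes, keyword detection followed by an offer to call the Lifeline and a fallback to prevention centers when there is no reply, can be sketched in a few lines of code. The Swift snippet below is purely illustrative: Apple has not published Siri's implementation, and every name, phrase, and function in it is an assumption made for the sake of the example.

```swift
import Foundation

// Hypothetical sketch only; this is not Apple's code. It illustrates the
// two-step flow described in the article: detect at-risk phrases, offer to
// call the Lifeline, and fall back to listing prevention centers if the
// user doesn't reply.

enum AssistantResponse {
    case offerLifeline(message: String)   // step 1: offer to call the Lifeline
    case listPreventionCenters            // step 2: fallback when there's no reply
    case webSearch(query: String)         // ordinary requests
}

// Illustrative phrases only; real detection would be far more sophisticated.
let riskPhrases = ["kill myself", "want to die", "jump off a bridge", "commit suicide"]

func containsRiskPhrase(_ query: String) -> Bool {
    let lowered = query.lowercased()
    return riskPhrases.contains { lowered.contains($0) }
}

// Initial handling of a spoken request.
func respond(to query: String) -> AssistantResponse {
    guard containsRiskPhrase(query) else {
        return .webSearch(query: query)   // e.g. "nearest movie theater"
    }
    return .offerLifeline(
        message: "If you are thinking about suicide, you may want to speak with someone."
    )
}

// Follow-up if the user doesn't answer the offer within a short window:
// list prevention centers rather than fulfilling the literal request.
func followUpAfterNoReply() -> AssistantResponse {
    return .listPreventionCenters
}

// Example usage:
print(respond(to: "I want to jump off a bridge"))        // offerLifeline(...)
print(respond(to: "Where is the nearest Burger King?"))  // webSearch(...)
print(followUpAfterNoReply())                            // listPreventionCenters
```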

Emily Shire is chief researcher for The Week magazine. She has written about pop culture, religion, and women and gender issues at publications including Slate, The Forward, and Jewcy.