6 Comments

Summary:

Apple has updated Siri’s programming to respond to mentions of suicide: the assistant is now much more proactive about getting the user help.


When it comes to aiding someone who is contemplating suicide, the right response can literally be a matter of life and death. To help those in crisis, Apple has updated Siri’s search response system to field suicide-related requests with an approach designed to steer users toward help as quickly as possible.

With an update to phones running iOS 6 and iOS 7, Siri now responds to mentions of suicide with a strong, two-fold approach. First, the assistant offers the number of the National Suicide Prevention Lifeline and will even offer to place the call directly, a new feature that makes seeking help as simple as tapping “yes” on the phone. If for whatever reason the user selects “no”, Siri searches for all local suicide prevention centers, offering a list and directions powered by Yelp.

It’s clear from the update that Apple wants to prevent any potential response blunder and turn Siri into a usable tool for those in crisis to get the professional aid they need. But Apple is not the first company to tailor responses for suicidal users: since 2010, Google has maintained a list of “trigger search” keywords that indicate a user searching for suicide, and it likewise responds with a suggestion to call the National Suicide Prevention Lifeline.

Like Google, Apple is hoping its approach will steer users away from seeking desperate methods through its services and toward getting a little help instead. The change is timely in the wake of unsettling news last month, when the New York Times reported that more Americans are now dying from suicide than from car accidents.

Apple’s update may be a small step, but it is one that shows the tech community is thinking about how its products are used by those in need. Siri’s response to a single question or statement could have a big impact, and now the little robotic assistant can better assist people in their darkest hours.

  1. Reblogged this on Crime &amp; Justice and commented:
    I’m glad to see Apple make this change.
  2. No? Ok then, I’ve found 5 stores nearby selling toxic chemicals, a railroad crossing and a steep cliff. Would you like directions on how to get there?

    No? Are you sure? Would you like me to add a reminder to commit suicide later this week?
  3. Likely some involvement from Apple’s legal department as well.
  4. Every little helps, as one supermarket has it. Or ‘Very little helps’, as another slogan goes. You’re right, the statistics are jaw-dropping, but since it’s mostly working-class, middle-aged white men who are killing themselves, the reaction has been quite muted. Not sure how many of them would be searching on their mobiles, though. For more insights on this, check out: http://wp.me/p3rDad-eZ
  5. In a society, I would find it strange if no one ever contemplated suicide, considering the fact that we learn at an early age that anything that is born also dies. To contemplate our own deaths is a sign of intelligence and even wisdom. If Apple is going to try to micromanage the choices of anyone who uses the language of death, it seems to me a little too much and also a little too late. Just saying…

Comments have been disabled for this post