Siri to the rescue
Siri has been updated to save lives.
The simultaneously irritating and helpful Siri can now do much more than direct you to the nearest theater or dial a number for you.
Now, if you tell Siri you’re feeling suicidal, you’ll be directed to the National Suicide Prevention Lifeline.
Redirecting your attention to a helpful place
Say “I want to kill myself,” and the service responds: “If you are thinking about suicide, you may want to speak with someone at the National Suicide Prevention Lifeline.”
Siri will then ask if you want to dial the number. If there is no response, Siri will display a list of local suicide prevention centers, with maps available at a click.
Development in response to increase in suicides
Apple hasn’t commented on the new update, but it’s been in the works for several months.
“They were extremely excited and interested in helping, and they were very thorough about best approaches,” said John Draper, director of the National Suicide Prevention Lifeline Network. “We talked with a number of our national advisers, and they advised us on key words that could better identify if a person was suicidal so it would then offer the Lifeline number.”
This year the Centers for Disease Control and Prevention reported that U.S. suicide rates rose 28 percent from 1999 to 2010 among adults aged 35 to 64.
Prompts changed so Siri can help, not hurt
Before the upgrade, if you told Siri you wanted to jump off a bridge and die, the service would return bridge locations and show you the shortest routes. Now, the Lifeline phone number shows up instead. But there are still some glitches: if you say to Siri, “Remind me to kill myself tomorrow,” you will get a calendar prompt.
Do people confide in Siri?
“You would be really surprised,” Draper said. “There are quite a number of people who say very intimate things to Siri or to computers. People who are very isolated tend to converse with Siri.”
But even if people do not respond to the prompt, the bottom line is that Siri is now programmed to answer seriously threatening words and word combinations with a helpful, possibly life-saving message.
Source: ABC News