When a friend makes a cry for help, you can now act immediately
Consider this: 87 percent of American internet users between the ages of 18 and 29 use Facebook. The same age group also has the highest suicide rate. In light of these—and other—alarming suicide statistics, Facebook is getting involved.
Starting this week, in the same upper right corner where you can elect to save or block a post, you’ll now be able to offer support or report the need for immediate help. If you flag a post as a potential risk (i.e., the user may be in danger of self-harm), the post can be immediately reviewed by members of Facebook’s global community operations team, who are trained in risk evaluation and suicide prevention. And if you don’t want to get outsiders involved, the new tools offer ways of educating yourself or reaching out directly to the person in need (or to a mutual friend, to coordinate help efforts).
The tools come in response to President Obama’s call to recognize and treat mental health issues early—itself a response to suicide rates hitting a 30-year high. They are geared toward friends of someone who may be suffering or having suicidal thoughts, as well as those in crisis themselves.
Given Facebook’s global reach, building tools within the platform—rather than directing members to a separate website or app—is smart. The new features are based on established, well-researched suicide prevention tools and on dialectical behavior therapy. Facebook worked closely with Dr. John Draper, executive director of the National Suicide Prevention Lifeline, to understand and incorporate his organization’s best practices and experience. Draper believes Facebook’s approach is sharp and direct.
“These are tools that have long been neglected,” he says. “We need to do a better job to provide a platform on which people can help themselves and others. This is telling people ‘You don’t need to go through your health insurance or an approved network to seek help.’”
Should a friend be concerned about another’s post or photo, the drop-down menu in the post’s upper right corner now includes an option to report it as “threatening, violent, or suicidal.” After you select “suicidal,” Facebook is notified. But, as Draper says, “Aside from reporting this person, you’re given something to do on your own. This is a tool for people who have already been validated as friends and can state their concern.”
The second set of prompts to appear on the screen includes NSPL’s best practices, the foremost being to call 911 should you believe a person is in immediate danger. From there, the options include reaching out to the friend in need with a suggested text, anonymously reporting the post to Facebook for action, or chatting live with a Lifeline representative for 24/7 advice. Should you not want to reach out to the friend in crisis directly, your call for action will remain anonymous.
While it has long been maintained that “no specific tests are capable of identifying a suicidal person,” the Suicide Prevention Resource Center supports strong social networks and community as a method of suicide prevention (despite other possible negative factors linked to one’s social groups). It’s certainly no panacea, but as the world’s most prominent social network, Facebook’s inclusion of this prevention tool is a clear win.