
Facebook funds 24/7 chat to prevent suicides

February 23, 2012
by Alison Knopf
Lifeline says "many" users prefer online counseling to live calls, but no numbers are available

In December the Substance Abuse and Mental Health Services Administration (SAMHSA) announced a collaboration between Facebook and the National Suicide Prevention Lifeline (NSPL) that enables Facebook users to report suicidal content. Under the arrangement, users report such content to Facebook, which then sends an email to the person who posted it, offering that person a confidential online chat or a telephone conversation with a Lifeline counselor.

While Facebook is paying for the crisis centers contracted by the Lifeline to offer 24/7 online chats, it will not disclose any information about the number of people who have been reported as posting suicidal content or who have called or chatted with the Lifeline as a result. Facebook, through its public relations firm, declined to be interviewed for this story.

“This is an agreement between the National Suicide Prevention Lifeline and Facebook and the data all belongs to Facebook, so Lifeline can’t release it to you,” said Eileen Zeller, public health advisor with SAMHSA’s Suicide Prevention Branch. “We know that people are using it, but we don’t know how many people are using it.” 

Lifeline services under the agreement are provided by Link2Health Solutions, which is a subsidiary of the Mental Health Association of New York City. SAMHSA gives Link2Health Solutions a grant to administer the Lifeline, said Zeller, adding that SAMHSA did not provide any specific funding for the Facebook initiative. However, SAMHSA did issue a press release, and the Surgeon General is featured in the Facebook blog post announcing the initiative.

From informal services to contract provider

The Lifeline has been working informally with Facebook since 2006, said Lidia S. Bernik, associate project director with the Lifeline. “If someone posted suicidal content, Facebook would send us the person’s email, but there was not a lot of volume,” she said. Over time, as Facebook grew, the volume of referrals increased, and the process became costly for the Lifeline, which paid crisis centers to respond to each referral by email.

“It was not efficient,” said Bernik. “All we got was the email address of a Facebook user – we didn’t know what the suicidal content was.” The process was also cumbersome: someone posted to Facebook, Facebook released an email address to the Lifeline, the Lifeline emailed a crisis center, and ultimately the person who had posted the suicidal content was asked to make a phone call. “Instead, we wanted to send an email to the poster saying to call the suicide prevention center or to ‘click here’ for a staff line.” That required the crisis centers to have someone available 24/7 to chat online, not only to answer the phone.

Now Facebook pays the Lifeline for this 24/7 service, and that is why it cannot talk about the numbers, said Bernik. She would not disclose the amount of money that the Lifeline is receiving under the contract. She added that while Facebook knows the identities of the people who have been reported for posting suicidal content, it does not have any data on the chats themselves.

Since Facebook would not talk to us, we asked Bernik for her perception of why the company is paying for the Lifeline. “They had an awareness early on that people talk about troubling things, and they wanted to find a good safe place for people in their community,” she said.

Bernik added that many people seem to prefer talking online to talking on the telephone, and that the core skills for chat counseling are similar to those for telephone counseling. “You have to have active engagement and an empathic response, but instead of doing it with your voice, you do it in writing,” she said. “You type things like ‘I’m still listening.’”

What Facebook is giving is the chance for people to communicate their distress online, rather than by telephone, said SAMHSA’s Zeller. “A growing number of crisis lines are adding a chat component to their repertoire, because there are many people who would rather communicate online and not by telephone.” Not every crisis center can offer 24/7 chat services, but now, through its agreement with Facebook, the Lifeline can.

To report suicidal content, Facebook users can use the Report Suicidal Content link or the following process:

1) Click on the arrow on the right next to the poster.

2) Click on “report story or spam.”

3) Click on “If this story is abusive, please file a report.”

4) Click on “violence or harmful behavior.”

5) Click on “suicidal content.”

6) Click on “report to Facebook” or “get help for [name of user].”

A Facebook “security team” will review each report, decide whether it is genuine or an abuse of the reporting feature, and then forward genuine reports to the Lifeline.
