In September 2010, Rutgers freshman Tyler Clementi posted on his Facebook page that he was "Jumping off the gw bridge sorry" – and then did. Last Christmas, Simone Back wrote that she "Took all my pills be dead soon so bye bye every one." Several Facebook "friends" added disparaging comments, but no one stepped forward to check on her. Back's body was found the next day. And last December, Clay Duke posted a Facebook "testament," writing that "Some people (the government sponsored media) will say I was evil, a monster … no…" He then went on a shooting rampage and killed himself.
Could anything have stopped Tyler, Simone or Clay once they decided to end their lives? Perhaps not. But as the grim status update increasingly becomes the new suicide note, Facebook has taken on an active, interventionist role. Users can now anonymously flag "suicidal content" – and Facebook will directly contact the person who posted it with suicide-prevention assistance. The mechanism has quietly been in place since June, but Facebook announced Tuesday that it was expanding its efforts, offering potentially self-destructive users more resources via a partnership with the National Suicide Prevention Lifeline.
While the information Facebook previously offered was limited to phone numbers, now users have the option of connecting with a crisis counselor through a private, anonymous chat room. Lifeline's John Draper explained Tuesday that "there are many people in crisis who don't feel comfortable picking up the phone. This new service provides a way for them to get the help they need in the way they want it." It makes sense: Someone who prefers the relative safety of online communication, someone who is already online, would naturally gravitate toward chat rather than picking up the phone.
There are times when using a word on Facebook or in a Google search can make a person feel creepily observed. It's easy to feel – quite accurately – that our every move online targets us for a sponsored message. That monitoring rarely feels in our best interest, but it's nothing but terrific that since 2010, Google has made sure that the first thing that comes up when someone searches "suicide" is the image of a red phone, the message "Need help?" and the number of the National Suicide Prevention Lifeline. Likewise, Facebook's resources for users who may be suicidal – and concerned friends – are a strong step toward social responsibility. Facebook also offers specific help for LGBT members, with phone numbers and links for the Trevor Project.
Even the smallest gesture – like a suicide-prevention hotline number in the Google search of someone who may be looking for the best ways to die – affords someone with a seed of hope the chance to chart a different course. And if someone reads a friend's despairing status update and makes help available, well, that's a pretty noble use of social networking.
These innovations say a lot about our peculiar squeamishness about suicide, and how far removed we can be from our friends' deepest pain. A potentially suicidal person's first source of intervention might now be the automated results from a search engine. Or help might come from Facebook's anonymous support team as a form message.
But as Surgeon General Regina Benjamin wrote on Facebook this week, we all need to learn "how to stand with and support someone who is in crisis." That means creating online resources and safe means of reporting suicidal behavior. But if Google and Facebook can figure out how to help, surely the rest of us can come up with a few ideas as well. We can start by remembering that a voice on the phone, a supportive shoulder, or the offer to drive someone to a crisis center can be just as powerful. Sometimes, saving a human life still requires a human touch.