Facebook, We Need to Have a Serious Discussion About Suicide

Suicide: a tragic health problem in the US and worldwide

Suicide is death caused by injuring oneself with the intent to die. A suicide attempt is when someone injures themselves with the intent to die but does not die as a result of their actions.

Suicide is a tragic health problem in the United States. According to the Centers for Disease Control and Prevention (cdc.gov), suicide is the 10th leading cause of death in the US. It was responsible for nearly 45,000 deaths in 2016, approximately one death every 12 minutes.

Many more people think about or attempt suicide and survive. In 2016, 9.8 million American adults seriously thought about suicide, 2.8 million made a plan, and 1.3 million attempted suicide.

Worldwide, the numbers are even more staggering. The World Health Organization (who.int) states that nearly 800,000 people die by suicide each year. And for each death by suicide, there are many more people who attempt it.

Social Media and Suicide

The link between social media and suicide is still unclear. Preliminary research into the link between social media and suicide (nih.gov) suggests that “framing the topic of social media and suicide from a public health perspective to address the issue and guide prevention programs makes sense.”

With that in mind, let’s talk about Facebook and what they have done to help prevent suicide, specifically their reporting tools and their suicide prevention and safety resources.

Facebook has made reporting clearly suicidal content easy via their Report Suicidal Content tool. Just enter some basic information, like the poster's name, a link to their profile, and a link to the post, and Facebook's team will prioritize this report over other reports. In fact, they've made reporting even easier by adding links directly on posts, generally via a dropdown that lets you "Give feedback or report post," with further options to select your reason, one of them being suicidal content.

In addition, Facebook publishes Community Standards regarding Suicide and Self-Injury, as well as a dedicated Safety Center. The Safety Center provides some useful information on how to get help for yourself or someone else, as well as links to additional resources on the matter.

So What’s the Problem Then?

The problem is multi-pronged. First off, the community standards are too vague, and the enforcement of removing content isn't handled very wisely. In many cases, we've noted that the content was simply deleted, with no further visible action or communication from Facebook. Not exactly the help we were hoping for, especially for a mental health community like ours.

Here are some of the problems we’ve noticed during our experience running a bipolar disorder support group on the Facebook platform, especially since the introduction of the Group Quality feature for group admins.

It is absolutely unclear, to both group members and group administrators, what is and is not acceptable content regarding suicide.

Facebook’s policy rationale states that they “remove content that encourages suicide or self-injury, including certain graphic imagery and real-time depictions that experts tell us might lead others to engage in similar behavior.” Our experience has been that if the content is reported or otherwise flagged by Facebook’s bots and subsequently removed for having violated the community standards, there is absolutely no explanation.

Group admins don’t even have the opportunity to see the content that was removed because, according to Facebook, showing us the removed content would itself go against their community standards. You see the cycle here? It is a completely missed opportunity to clarify the rules, or community standards as they are referred to, but there is nothing more from Facebook.

So what type of suicidal content is not allowed? Clearly, we would not want any posts that encourage suicide or posts that graphically depict it. But what about suicidal ideation posts, where the user is having thoughts of suicide but no plan or intent to act on them, and is looking for support, encouragement, or healthy ways to deal with and combat those thoughts? Are those OK? We’re really not sure, as Facebook has not clearly communicated this to us.

Yet without a clear explanation of why a post violates the community standards, the group itself still receives a strike every time content is removed for supposedly violating them.

With the addition of the Group Quality feature, group admins can now see when content has been removed, but not what the content itself was. The only information we are left to go on is who made the post and which admin or mod approved it. Once a post that was approved by an admin is removed, you get a nice little warning that your group may be disabled if further violations continue. Again, with no knowledge of what the offending content was.

Mental health support groups like the one we maintain on Facebook are now effectively censoring and declining posts that mention suicide, because there is no clear guidance on what is acceptable and further strikes against the group may get it shut down completely. Are we really turning away someone who has a mental illness and is reaching out for help and support in perhaps their darkest hour? It almost sounds like Facebook is trying to sweep the problem under the rug instead of dealing with it head-on through effective communication.

We’ve all heard the phrase, perhaps the movement, that it is OK to not be OK. But apparently, it is not OK to openly talk about how you feel, especially if it involves the hot-button topic of suicide, at least on Facebook. Group admins are being forced to delete content that may be flagged as going against the vague community standards, or risk having their group disabled and, I would assume, eventually removed from the platform.

Since group admins cannot safely approve suicidal posts, there is no way to report the post and get help to the group member. The reporting tools might as well not exist in this scenario.

Our bipolar support group has been on Facebook for nearly five years, providing support, information, encouragement, and hope to more than 36,000 members. With those kinds of numbers, you can imagine the volume of posts we deal with on a day-to-day basis, and also how many we now need to reject. In every case where we feel the posting member may be in danger, we report the content to Facebook. But alas, a rejected post has no URL and therefore cannot be reported using the aforementioned tools.

Oh, and you might think Facebook would have thought this through and provided tools for group admins to still offer support to those in dire need. Sadly, they have not. There are no additional tools for group admins to contact the posting member, no special links to report content before it has been approved, and no way to send the member additional resources. The only option is Facebook’s Messenger platform, which not everyone has installed, and even when they do, messages often go unseen because they get trapped in the user’s other inbox or filtered messages.

What is the solution to this? How can we provide the support members badly need?

Better communication from Facebook and better tools for admins are required.

The solution to this is really simple, in theory. Facebook needs to develop and/or implement back-end tools for admins so we can more effectively communicate with members when there is a safety concern. Facebook should put a tool in place that lets a group admin report content even when they intend to decline it. Facebook also needs to clarify its community standards to better communicate, to both members and administrators, what it regards as unacceptable content.

In its effort to provide a safe and welcoming environment for all by deleting content that doesn’t sound happy enough, Facebook is effectively muting an entire group of people on its platform. A group of people impacted by mental illness. A group of people who desperately need support and perhaps have nowhere else to turn. And that is a true shame.

1 thought on “Facebook, We Need to Have a Serious Discussion About Suicide”

  1. HEY FACEBOOK, allowing a computer to choose your content based on words in the comments is one of the most ridiculous things I have ever seen.
    I lead groups on Facebook where suicide is a common topic of conversation, as it is with almost every chronic illness or condition. When you start just randomly taking out posts, without a human checking whether the content fits the context it was meant for or whether it really needs to be deleted, you are limiting us SO much on who needs help and how we can discuss something that I know is very real. My cousin ate a bullet when she couldn't get the help she needed. I don't want to see this happen again because of your cheap ways to fix things on your platform just to protect your butts.
    If we are in closed groups, BACK OFF. We have been doing this a lot longer than you have been watching us do it.