Facebook's Suicide Prevention Tools: Invasive or Essential?


If you have ever scrolled through your News Feed and stopped on a troubling, borderline suicidal post from a friend, you may have been unsure how to help or wondered if reaching out would be appropriate. Facebook understands that people share these types of negative personal thoughts on the platform and has developed tools to help you help your friends.

Facebook now offers resources for users who perceive a friend's posts as suicidal, allowing them to flag a post for review by a team at the company. Users can click a drop-down menu within the post in question that allows them to specify their concerns to Facebook’s global community operations team. These reports are directed to employees trained to evaluate suicidal content. The team may then send the reporting user some information about prevention and advice for communicating with the friend. In some cases, Facebook may intervene by contacting local law enforcement where the friend resides, according to The New York Times.

Related: Facebook Updates Its Suicide Prevention Tools

Previously, suicide prevention assistance was limited to some English-speaking Facebook users, but now it is available to everyone.

Among the tools is a page with a form for reporting sightings of suicidal content to the team. The page also offers advice for assisting friends who may be considering self-injury or who may have an eating disorder, as well as guidance tailored to members of the military, LGBT individuals and law enforcement officers whose posts indicate they may be contemplating suicide. It additionally provides direct support to at-risk users seeking help for themselves. All of the tools warn users that if a post explicitly states suicidal intent, they should take immediate action by calling law enforcement or a suicide hotline, and they direct users to that contact information.

Facebook relies on humans on both sides of the process -- users report and team members review. None of the content is detected or evaluated by artificial intelligence or algorithms.

Related: Can We Turn to Our Smartphones During Mental Health Crises?

We asked Entrepreneur's Facebook and Twitter followers whether Facebook should allow users to solicit its employees' help in preventing suicide, or whether the company should refrain from intervening in people's personal lives. Many respondents embraced Facebook's efforts, while others thought responsibility should rest solely with the users who identify the troubling posts. Some thought in terms of the company's image, and some asked how reporting someone would affect the way Facebook targets that user in the future. Read some of their comments below.

