Content Moderators for Facebook and YouTube Reveal What it's Like to Sift Through Some of the Most Disturbing Material on the Internet

The moderators are frequently tasked with watching disturbing video and images of child sexual abuse, violence, animal cruelty and more.
Image credit: mihailomilovanovic | Getty Images
Brought to you by Business Insider

A legion of temporary employees who are tasked with moderating content on platforms like Facebook and YouTube say they were unprepared for the emotional and psychological toll the job would take.

The content moderators are hired to sift through online posts, including pictures and video that were flagged as inappropriate. Several of those employees shared their experiences in a Wall Street Journal report published Wednesday night.

Moderators said they watched images of war victims who had been "gutted" and "child soldiers engaged in killings." A former moderator who worked at Facebook recalled watching video of a cat thrown into a microwave, The Journal reported.

Shaka Tafari, who moderated content on the Whisper messaging app, said he was "alarmed" by the number of messages that contained references to rape, or included images of bestiality and animal abuse.

"I was watching the content of deranged psychos in the woods somewhere who don't have a conscience for the texture or feel of human connection," Tafari told The Journal.

Tech giants oversee thousands of content moderators -- jobs that are typically staffed through temporary employment agencies and marked by high turnover. Former employees interviewed by The Journal attributed much of that turnover to the emotional stress of the role.

Some of those people claimed they had few tools to deal with the aftereffects of a job that required them to consume some of the most depraved material on the internet. Content moderators at Facebook and Microsoft are offered various avenues for psychological counseling, The Journal reported, but some of the employees said it was not enough. Moderators typically left the job within a year or less.

Content moderation became a hot-button issue for Facebook in particular this year, following the fallout from the 2016 U.S. election. It was revealed that Russia leveraged the platform to run influence campaigns that boosted then-candidate Donald Trump and disparaged his Democratic opponent, Hillary Clinton.

Though Facebook CEO Mark Zuckerberg initially balked at the notion, the social-media platform later admitted that Russia-based operatives published some 80,000 posts on the platform over a two-year period.

The matter gained heightened urgency this week when The Washington Post reported that Russia's election-influence efforts would likely continue, as officials recently warned that digital platforms are still vulnerable to such misuse.
