TikTok Concerned Over Labor Rights Violations After Landmark Court Ruling Against Meta, Leaked Documents Suggest
The case against Meta brought by moderators in Kenya could set a precedent for holding tech giants accountable for the treatment of outsourced workers.
By Madeline Garfinkle Edited by Jessica Thomas
Key Takeaways
- A Kenyan court declared Meta the "true employer" of its content moderators, meaning the company can be held liable for potential labor rights violations.
- TikTok fears similar scrutiny after a leaked memo revealed concerns over its outsourcing arrangements in Kenya.
In a landmark ruling last month, a Kenyan court declared that Meta was the "true employer" of hundreds of moderators working in Nairobi, Kenya — meaning that Meta can be held liable in Kenya for labor rights violations, even though the moderators are technically employed by a third-party contractor. Meta will appeal the decision, TechCrunch reported.
Moderators are responsible for filtering out violent, hateful and shocking content on Meta's platforms.
Meta previously contracted with a company called Sama, and it now contracts with a company called Majorel. TikTok, the short-form video app, also outsources content moderation in Kenya to Majorel, and leaked memos suggest the company is concerned it could face similar scrutiny over labor rights violations.
The initial case against Meta was brought by Daniel Motaung, a South African moderator who says he was fired in 2019 after attempting to form a union. Motaung claimed that the job exposed him to traumatic and disturbing content, resulting in post-traumatic stress disorder. He was allegedly paid as little as $2.20 an hour for the work, WIRED reported in February.
Motaung also claimed that the true nature of the work was never explicitly laid out to him before he took on the role that would ultimately leave him traumatized.
As Motaung's case progressed, in January Meta attempted to sever ties with Sama (resulting in 260 moderators losing their jobs) and move its operations to another third-party company, Majorel (TikTok's partner), per WIRED.
After 184 moderators sued Meta and Sama alleging unlawful termination of contracts, the court ruled in favor of the moderators in March, extending their contracts and preventing layoffs until the case is resolved. The court found that Meta was the primary employer, and Sama was "merely an agent" overseeing the work on its behalf.
The court also ordered Meta and Sama to provide medical, psychiatric and psychological care to the moderators, acknowledging the "inherently hazardous" nature of their work sifting through social media content to remove hate, misinformation and violence.
As for TikTok, leaked documents obtained by the NGO Foxglove Legal and viewed by WIRED suggest that the company is concerned about the potential legal repercussions it might face if the Kenyan court's decision sets a precedent.
"TikTok will likely face reputational and regulatory risks for its contractual arrangement with Majorel in Kenya," the memo says, adding that if the court rules in favor of the moderators, "TikTok and its competitors could face scrutiny for real or perceived labor rights violations."
In response, TikTok is considering an independent audit of Majorel's operations in Kenya to address potential concerns about labor practices, according to the leaked documents.
However, similar moves have been criticized as performative and unlikely to lead to substantial improvements in workers' conditions, Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, told WIRED. TikTok appears aware of that perception: the memo stated such audits "may mitigate additional scrutiny from union representatives and news media."
Although TikTok has an opportunity to address the issue proactively, some experts caution that the company may simply be trying to deflect blame rather than genuinely improve working conditions for its outsourced workers.
"I think it would be very unfortunate if TikTok said, 'We're going to try to minimize liability, minimize our responsibility, and not only outsource this work, but outsource our responsibility for making sure the work that's being done on behalf of our platform is done in an appropriate and humane way,'" Barrett told WIRED.
Entrepreneur has reached out to TikTok and Meta for comment.