Here's How Facebook Plans To Increase Transparency During 2020 US Elections
The social media firm launched 'Facebook Protect' to safeguard the accounts of elected officials, candidates and their staff, who may be more likely to be targeted by malicious activity
After Facebook was criticised for not acting responsibly during the 2016 US elections, the social media giant has been continuously updating its policies to stop fake news from spreading on its platform. With the 2020 US elections ahead, the company has announced a slew of measures to combat misinformation during the campaign.
The 59th quadrennial US presidential election is scheduled for November 3, 2020. In 2016, Facebook was accused of influencing the results through the spread of fake news and political advertising.
Fighting Foreign Interference
Combating inauthentic behaviour: Facebook said it had removed four separate networks of accounts, pages and groups on Facebook and Instagram for engaging in coordinated inauthentic behaviour. According to the company, three of the networks originated in Iran and one in Russia, and the fake accounts were targeting audiences in the US, North Africa and Latin America. “In the past year alone, we’ve taken down over 50 networks worldwide, many ahead of major democratic elections,” Facebook said in a statement.
Protecting accounts of candidates, elected officials and teams: The company launched ‘Facebook Protect’ to safeguard the accounts of elected officials, candidates and their staff, who may be more likely to be targeted by malicious activity. “Page admins can enroll their organization’s Facebook and Instagram accounts in Facebook Protect and invite members of their organization to participate in the programme as well. Participants will be required to turn on two-factor authentication, and their accounts will be monitored for hacking. If we discover an attack against one account, we can review and protect other accounts affiliated with that same organization that are enrolled in our programme,” the official statement said.
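For context on the two-factor authentication requirement: authenticator apps of the kind Facebook supports typically generate one-time codes using the TOTP algorithm standardised in RFC 6238. The sketch below is a generic illustration of that algorithm, not Facebook's own implementation, and is verified against a published RFC 6238 test vector.

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, for_time: int, digits: int = 6, period: int = 30) -> str:
    """Generate a TOTP code (RFC 6238, HMAC-SHA1 variant).

    The code is an HMAC over the number of `period`-second intervals
    elapsed since the Unix epoch, reduced via the dynamic truncation
    scheme from RFC 4226.
    """
    counter = for_time // period
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at T = 59 s
print(totp(b"12345678901234567890", 59, digits=8))  # → 94287082
```

Because the code depends only on the shared secret and the current time window, a stolen password alone is not enough to log in, which is why Facebook Protect mandates it for enrolled accounts.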
Increasing page transparency: Over the past year, the company has brought greater transparency to pages by revealing a page’s primary country location and whether it has merged with other pages. “We are adding more information about who is behind a page, including a new feature, ‘Organizations That Manage This Page’ tab that will feature the page’s ‘Confirmed Page Owner’, including the organization’s legal name and verified city, phone number or website,” the statement noted. However, this information will initially be available only for pages with large US audiences.
Labeling State-Controlled Media: From next month, Facebook will begin labeling media outlets that are wholly or partially under the editorial control of their government as state-controlled media. According to the company, this label will be visible both on their page and in the Facebook Ad Library. “We will hold these pages to a higher standard of transparency because they combine the opinion-making influence of a media organization with the strategic backing of a state,” the statement reads.
Learn More About Political Ads: Facebook has updated its Ad Library Report and Ad Library API to help journalists, lawmakers, researchers and others better understand advertisements on the platform. This includes a US presidential candidate spend tracker that shows how much candidates have spent on advertisements and whether an advertisement ran on Facebook, Instagram, Messenger or the Audience Network.
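Researchers typically reach the Ad Library through the Graph API's `ads_archive` endpoint. The sketch below only constructs a query URL; the API version, field names and access token are assumptions to be checked against Facebook's current Ad Library API documentation.

```python
import urllib.parse

# Graph API endpoint for the Ad Library (version current at the time of writing;
# verify against Facebook's developer documentation).
BASE_URL = "https://graph.facebook.com/v5.0/ads_archive"

def build_ads_archive_query(search_terms: str, access_token: str,
                            countries: tuple = ("US",)) -> str:
    """Build a query URL for political/issue ads.

    Field names (page_name, spend, impressions, publisher_platforms) follow
    the Ad Library API docs; publisher_platforms indicates whether an ad ran
    on Facebook, Instagram, Messenger or the Audience Network.
    """
    params = {
        "search_terms": search_terms,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": ",".join(countries),
        "fields": "page_name,spend,impressions,publisher_platforms",
        "access_token": access_token,  # placeholder; obtain via identity verification
    }
    return BASE_URL + "?" + urllib.parse.urlencode(params)

url = build_ads_archive_query("election", "YOUR_ACCESS_TOKEN")
print(url)
```

The `spend` and `impressions` fields underpin analyses like the candidate spend tracker mentioned above, since they expose per-ad ranges that can be aggregated by page.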
Reducing Spread of Misinformation
Combating Spread of Viral Misinformation: Facebook reduces the distribution of content confirmed to be misinformation in order to limit its reach. “On Facebook, if pages, domains or groups repeatedly share misinformation, we’ll continue to reduce their overall distribution and we’ll place restrictions on the page’s ability to advertise and monetize,” Facebook said. “Over the next month, content across Facebook and Instagram that has been rated false or partly false by a third-party fact-checker will start to be more prominently labeled so that people can better decide for themselves what to read, trust and share.”
Dealing with Voter Suppression and Intimidation: Before the 2018 midterm elections, Facebook expanded its voter suppression and intimidation policies to prohibit misrepresentation of the dates, locations, times and methods for voting or voter registration, as well as threats related to voting. “In advance of the US 2020 elections, we’re implementing additional policies and expanding our technical capabilities on Facebook and Instagram to protect the integrity of the election. We have now implemented our policy banning paid advertising that suggests voting is useless or meaningless, or advises people not to vote,” the statement read. Apart from this, Facebook is also working directly with secretaries of state and election directors to address localised voter suppression that may occur in a single state or district.
Helping People Understand Online Content Better
Facebook also announced an investment of $2 million in projects that help users determine what to read and share, and is adding a new series of media literacy lessons to its digital literacy library. “These projects range from training programmes to help ensure the largest Instagram accounts have the resources they need to reduce the spread of misinformation, to expanding a pilot programme that brings together senior citizens and high school students to learn about online safety and media literacy, to public events in local venues like bookstores, community centers and libraries in cities across the country,” the statement noted.