Facebook’s ‘tier’ system decides who gets content moderation

Facebook in 2019 began classifying countries in a “tier” system that dictated the amount of resources used to moderate content posted on the platform in those nations, according to a new report.

The prioritization system placed the United States, Brazil and India in "tier zero," meaning more resources were dedicated to moderating content and enforcing Facebook's regulations in those countries, The Verge reported Monday.

Israel, Germany, Indonesia and Iran were categorized in "tier one," where Facebook committed slightly fewer resources to enforcing its rules and election integrity safeguards, according to the technology news outlet. Facebook reportedly formed "war rooms" in those countries to moderate content, including around elections.

Twenty-two more countries were also placed in "tier one," though without dedicated "war rooms." Facebook put the rest of the globe in "tier three," meaning the platform took action on rule-violating election-related content only if moderators flagged it.

[Photo: Facebook reportedly has war rooms dedicated to content moderation amid elections.]

The tiered system is detailed in disclosures made to the Securities and Exchange Commission; a redacted version was provided to Congress by lawyers for whistleblower Frances Haugen.

The documents show huge differences in the guardrails Facebook employs to monitor content from country to country. Artificial intelligence that detects hate speech and misinformation in the United States, for example, is not available in Ethiopia, The Verge noted.

[Photo: Facebook has been criticized for meddling in the 2020 presidential election to prevent former President Donald Trump from being reelected.]

[Photo: Facebook has classified only a handful of countries as "tier zero" and "tier one" compared to the rest of the world.]

Facebook also lacks a misinformation detection system in Myanmar, where the military seized power in a coup earlier this year, and in Pakistan.

The revelation of Facebook's opaque content moderation practices comes after another whistleblower told the SEC that top Facebook staffers undermined efforts to fight misinformation and hate speech during President Trump's tenure, both because they predicted those efforts would hamper the company's growth and because they feared angering Trump and his allies.
