Facebook is in the business of making money. And it's very good at it. In the first three months of 2021, Facebook raked in over $11 billion in profits, almost entirely from displaying targeted advertising to its billions of users.
In order to keep the money flowing, Facebook also needs to moderate content. When people use Facebook to livestream a murder, incite a genocide, or plan a white supremacist rally, it is not a good look.
But content moderation is a tricky business. This is especially true on Facebook, where billions of pieces of content are posted every day. In a lot of cases, it is difficult to determine what content is truly harmful. No matter what you do, someone is unhappy. And it's a distraction from Facebook's core business of selling ads.
In 2019, Facebook came up with a solution to offload the most difficult content moderation decisions. The company created the "Oversight Board," a quasi-judicial body that Facebook claims is independent. The Board, stocked with impressive thinkers from around the world, would issue "rulings" about whether certain Facebook content moderation decisions were correct. Facebook executives could spend less time wrestling with thorny content moderation decisions, and more time figuring out how to sell more ads.
The most important decision in the Oversight Board's history was issued on Wednesday morning, when the Board reviewed Facebook's decision to indefinitely suspend Donald Trump. The Oversight Board upheld the suspension but said it was inappropriate for Facebook to make it indefinite. It demanded Facebook make a decision on the length of the suspension — or decide to make it permanent.
The Oversight Board demonstrated an awareness of how Facebook would like to use Oversight Board decisions to shield itself from controversy. "Facebook seeks to avoid its responsibilities. The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty," it wrote.
But the decision, which is nearly 12,000 words long, illustrates that whether Trump is ultimately allowed to return to Facebook is of limited significance. The more important questions are about the nature of the algorithm that gives people with views like Trump such a powerful voice on Facebook.
Those are questions, however, that Facebook is unwilling to answer.
The questions Facebook won't answer
The Oversight Board was Facebook's idea. It spent years constructing the organization, selected its chairs, and funded its endowment. But now that the Oversight Board is finally up and running and taking on high-profile cases, Facebook is choosing to ignore questions that the Oversight Board believes are essential to doing its job.
This is a key passage (emphasis added):
Facebook stated to the Board that it considered Mr. Trump’s “repeated use of Facebook and other platforms to undermine confidence in the integrity of the election (necessitating repeated application by Facebook of authoritative labels correcting the misinformation) represented an extraordinary abuse of the platform.” The Board sought clarification from Facebook about the extent to which the platform’s design decisions, including algorithms, policies, procedures and technical features, amplified Mr. Trump’s posts after the election and whether Facebook had conducted any internal analysis of whether such design decisions may have contributed to the events of January 6. Facebook declined to answer these questions. This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others.
At another point in the decision, the Oversight Board said it asked for "information about violating content from followers of Mr. Trump’s accounts" and Facebook declined to answer.
A critical issue, as the Oversight Board suggests, is not simply Trump's posts but how those kinds of posts are amplified by Facebook's algorithms. Equally important is how Facebook's algorithms amplify false, paranoid, violent, right-wing content from people other than Trump — including those who follow Trump on Facebook. Trump aside, Facebook played a key role in spreading misinformation about the election and in the organization of the January 6 attack.
But Facebook has constructed the Oversight Board to exclude these kinds of discussions. This is how the Oversight Board's charter describes its "authority to review":
People using Facebook’s services and Facebook itself may bring forward content for board review. The board will review and decide on content in accordance with Facebook’s content policies and values.
The jurisdiction of the Oversight Board excludes both the algorithm and Facebook's business practices.
The Oversight Board asked about the algorithm and related issues anyway, and Facebook refused to answer. Facebook said this information "was not reasonably required for decision-making in accordance with the intent of the Charter." The Oversight Board has no power to compel Facebook to answer. It's an important reminder that, for all the pomp and circumstance, the Oversight Board is not a court. The scope of its authority is limited by Facebook executives' willingness to play along.
Facebook's algorithm problem, in one chart
Donald Trump's Facebook page is a symptom, not the cause, of the problem. Facebook's algorithm favors low-quality, far-right content. Trump is just one of many beneficiaries.
NewsWhip is a social media analytics service that tracks which websites get the most engagement on Facebook. It just released its analysis for April, and it shows that low-quality right-wing aggregation sites dominate major news organizations.
The Daily Wire, for example, is a sexist, xenophobic, and bigoted far-right website that produces no original reporting. But, on Facebook in April, The Daily Wire received more than double the distribution of the Washington Post and the New York Times combined:
This actually understates how much better The Daily Wire's content performs on Facebook than that of the Washington Post and the New York Times. The Daily Wire published just 1,385 pieces of content in April compared to over 6,000 by the Washington Post and the New York Times. Each piece of content The Daily Wire published in April received 54,084 engagements on Facebook, compared to 2,943 for the New York Times and 1,973 for the Washington Post.
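To see just how lopsided the per-article gap is, you can divide the engagement figures cited above. This is a quick back-of-the-envelope sketch using only the NewsWhip numbers quoted in this piece (the variable names are illustrative, not from any NewsWhip product):

```python
# Average Facebook engagements per article in April 2021,
# as reported by NewsWhip and cited above.
per_article_engagement = {
    "The Daily Wire": 54084,
    "The New York Times": 2943,
    "The Washington Post": 1973,
}

daily_wire = per_article_engagement["The Daily Wire"]

# How many times more engagement a typical Daily Wire article
# draws compared to each legacy outlet's typical article.
ratios = {
    outlet: daily_wire / engagement
    for outlet, engagement in per_article_engagement.items()
    if outlet != "The Daily Wire"
}

for outlet, ratio in ratios.items():
    print(f"The Daily Wire vs. {outlet}: {ratio:.1f}x engagement per article")
# → roughly 18x the New York Times and 27x the Washington Post
```

In other words, the average Daily Wire post outperforms the average New York Times post by a factor of roughly eighteen, despite the Times publishing several times as many pieces.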
Two other sites that top the Washington Post and the New York Times are Western Journal, a far-right website that pushed Trump's false claims about voter fraud, and Rumble, a video platform that caters to Trump supporters.
It's important to note here that Facebook's algorithm is not reflecting reality — it's creating a reality that doesn't exist anywhere else. In the rest of the world, Western Journal is not more popular than the New York Times, NBC News, the BBC, and the Washington Post. That's only true on Facebook.
Facebook has made a conscious decision to surface low-quality content and recognizes its dangers. Shortly after the November election, Facebook temporarily tweaked its algorithm to emphasize "'news ecosystem quality' scores, or N.E.Q., a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism." The purpose was to attempt to cut down on election misinformation being spread on the platform by Trump and his allies. The result was "a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible."
BuzzFeed reported that some Facebook staff members wanted to make the change permanent. But that suggestion was opposed by Joel Kaplan, a top Facebook executive and Republican operative who frequently intervenes on behalf of right-wing publishers. The algorithm change was quickly rolled back. Facebook made a similar change after January 6, but only for a few days. Other proposed changes to the Facebook algorithm over the years have been rejected or altered because of their potential negative impact on right-wing sites like The Daily Wire.
One day, Trump may be allowed back on Facebook. Or he may not. But as long as the Facebook algorithm continues to favor Trumpism, nothing much will change.