Inside Facebook's dysfunctional approach to civil rights 

(Photo by Beata Zawrzel/NurPhoto via Getty Images)

On Wednesday, two respected civil rights experts — Laura Murphy and Megan Cacace — hired by Facebook to review the company's practices, issued a blistering final report. While acknowledging that Facebook had made progress in some areas, the report finds that Facebook's "approach to civil rights remains too reactive and piecemeal." It laments that Facebook's "vexing and heartbreaking decisions...represent significant setbacks for civil rights."

These decisions, the report notes, have resulted in the civil rights community becoming "disheartened, frustrated and angry after years of engagement where they implored the company to do more to advance equality and fight discrimination, while also safeguarding free expression." 

That frustration was reflected in a statement issued by the nation's leading civil rights organizations — including the NAACP, Color of Change, and The Leadership Conference on Civil and Human Rights.

This audit has laid bare what we already know — Facebook is a platform plagued by civil rights shortcomings. Facebook has an enormous impact on our civil rights — by facilitating hate speech and violence, voter and census disinformation, and algorithmic bias, and by shortchanging diversity and inclusion. This audit has exposed Facebook’s vulnerabilities and provides important recommendations that they must take up swiftly.

Facebook COO Sheryl Sandberg, who served as the company's point person on the project, was non-committal. "[W]hile we won’t make every change they call for, we will put more of their proposals into practice," Sandberg said in a statement.

Sandberg also appeared to reject one of the core premises of the report, claiming that Facebook has "clear policies against hate." But the report repeatedly finds Facebook's civil rights policies to be unclear, incomplete, and erratically enforced. 

The problem with exempting politicians from fact-checking, from a civil rights perspective

The decision that most clearly undermined Facebook's commitment to civil rights, according to the report, was announced in September 2019 by Facebook executive Nick Clegg. In a speech, Clegg said that the company would exempt politicians' speech from fact-checking and, on a case-by-case basis, would decline to enforce other Facebook standards against politicians based on "newsworthiness." 

Nick Clegg, Vice-President for Global Affairs and Communications, stated that Facebook does not subject politicians’ speech to fact-checking, based on the company’s position that it should not “prevent a politician’s speech from reaching its audience and being subject to public debate and scrutiny”...In that same speech, Clegg described Facebook’s newsworthiness policy, by which content that otherwise violates Facebook’s Community Standards is allowed to remain on the platform.

Clegg's announcement was amplified a few days later by Facebook CEO Mark Zuckerberg during a speech at Georgetown University. 

Facebook CEO, Mark Zuckerberg, in his October 2019 speech at Georgetown University began to amplify his prioritization of a definition of free expression as a governing principle of the platform. In my view as a civil liberties and civil rights expert, Mark elevated a selective view of free expression as Facebook’s most cherished value.

The report zeroes in on what's troubling about Facebook's policies, as articulated by Clegg and Zuckerberg. They do not represent a commitment to free expression. Rather, the policies privilege the expression of the powerful over that of everyone else. This is not just unfair — it makes it even more challenging to protect civil rights on the platform. 

Elevating free expression is a good thing, but it should apply to everyone. When it means that powerful politicians do not have to abide by the same rules that everyone else does, a hierarchy of speech is created that privileges certain voices over less powerful voices. The prioritization of free expression over all other values, such as equality and non-discrimination, is deeply troubling to the Auditors. Mark Zuckerberg’s speech and Nick Clegg’s announcements deeply impacted our civil rights work and added new challenges to reining in voter suppression.

The report recommends Facebook reverse these policies, but it's a recommendation that Facebook will almost certainly ignore. And with two sets of policies around speech — one for the powerful and one for everyone else — can Facebook ever effectively protect civil rights?

Civil rights experts blast Facebook for coddling Trump

The report is unsparing in its criticism of Facebook’s failure to remove a post where Trump promoted violence against "looters," which plainly violated Facebook's prohibition against incitement of violence. Facebook claimed that Trump's post advocating violence — "when the looting starts, the shooting starts" — fell into its (unwritten) exemption for the discussion of "state action." The report dismantles that argument:

The civil rights community and the Auditors were deeply troubled by Facebook’s decision, believing that it ignores how such statements, especially when made by those in power and targeted toward an identifiable, minority community, condone vigilantism and legitimize violence against that community. Civil rights advocates likewise viewed the decision as ignoring the fact that the “state action” being discussed — shooting people for stealing or looting — would amount to unlawful, extrajudicial capital punishment. In encounters with criminal conduct, police are not authorized to randomly shoot people; they are trained to intercept and arrest, so that individuals can be prosecuted by a court of law to determine their guilt or innocence. Random shooting is not a legitimate state use of force. 

As he was considering what to do, Zuckerberg talked to Trump on the phone. The civil rights auditors tried to make their views known, but said they weren't permitted to speak with Zuckerberg (or any other decision-maker) until after the decision was made.

The report says that Zuckerberg's decision to permit Trump's post was "dangerous and deeply troubling because it reflected a seeming impassivity toward racial violence in this country."

A permissive approach to White Nationalism

Another failure noted in the report is Facebook's decision to permit "more than 100 groups identified by the Southern Poverty Law Center and/or Anti-Defamation League as white supremacist organizations" to have a presence on Facebook. 

Facebook bans explicit advocacy of "white nationalism" or "white separatism" but "does not prohibit content that explicitly espouses the very same ideology without using those exact phrases." Months ago, the auditors advised Facebook to broaden its ban on white nationalism to include "content which expressly praises, supports, or represents white nationalist or separatist ideology even if it does not explicitly use those terms."  Thus far, Facebook has ignored the recommendation.

What happens when you report voter suppression on Facebook

Based on the recommendation of the auditors, Facebook now allows users to report instances of "voter interference." That's the good news. The bad news is that Facebook has decided not to have humans review these reports individually. 

[C]ontent reported by users as voter interference is only evaluated and monitored for aggregate trends. If user feedback indicates to Facebook that the same content or meme is being posted by multiple users and is receiving a high number of user reports, only then will Facebook have the content reviewed by policy and operational teams. This means that most posts reported as “voter interference” are not sent to human content reviewers to make a determination if posts should stay up or be taken down.

Facebook says that, prior to the 2018 election, "over 90% of the content Facebook removed as violating its voter suppression policy (as it existed at the time) was detected proactively by its technology before a user reported it." 

But, as the report notes, "statistics on what percentage of content Facebook removed was initially flagged by proactive detection technology, of course, do not indicate whether or how much content that actually violated Facebook’s policy was not detected and therefore allowed to stay up."

What Facebook got right

Facebook should be commended, at a minimum, for commissioning the review in the first place and releasing a final report that criticizes the company. The report also notes some areas where things have gotten better:

Facebook implemented "a new advertising system so advertisers running US housing, employment, and credit ads will no longer be allowed to target by age, gender, or zip code." 

Facebook "developed robust policies to combat census interference."

Facebook created "a new policy that prohibits content encouraging or calling for the harassment of others."

Facebook "has more consistently engaged with leaders in the civil rights community and sought their feedback, especially in the voting and census space."

The report commends these actions while noting that they are insufficient to create an environment on Facebook that protects civil rights.



Thanks for reading!