On Friday, Facebook CEO Mark Zuckerberg announced that the company will conduct a review of the policy he cited when allowing President Donald Trump’s violence-inciting post to remain on the site.
“We’re going to review our policies allowing discussion and threats of state use of force to see if there are any amendments we should adopt,” Zuckerberg wrote in a lengthy statement, days after his employees staged a virtual walkout over his response to Trump’s post.
The outrage at Zuckerberg followed an inflammatory post Trump published on the platform about anti-racism protests, in which the president warned that “when the looting starts, the shooting starts.” The phrase originated with a combative Miami police chief threatening the young, largely Black people involved in the civil rights movement in the 1960s.
Facing calls to take the post down or put a warning on it, as Twitter did, Zuckerberg initially responded to upset civil rights leaders and his own employees by saying the post did not violate any of Facebook’s policies. In a leaked call with around 25,000 employees this week, he argued that the language Trump used “has no history of being read as a dog whistle for vigilante supporters to take justice into their own hands.” He made the same argument in a statement earlier in the week.
The statement Zuckerberg issued Friday signals he’s taken some of that criticism to heart. He said Facebook will consider new rules for two specific types of posts.
The first category is posts about excessive use of police or state force. “Given the sensitive history in the US, this deserves special consideration,” he wrote. He added that in cases of ongoing unrest or conflict, “We already have precedents for imposing greater restrictions during emergencies and when countries are in ongoing states of conflict, so there may be additional policies or integrity measures to consider around discussion or threats of state use of force when a country is in this state.”
Zuckerberg also revealed that Facebook will review its policies on monitoring posts that could create confusion about voting or suppress voter turnout. As an example of the type of post the company would monitor, he cited a hypothetical newspaper article warning people that they risk contracting COVID-19 if they go to the polls.
He also hinted that Facebook may mimic how Twitter handles posts that incite violence. Twitter covered Trump’s recent inflammatory tweet with a message saying it was “glorifying violence,” requiring users to click through the warning to see the post. While Zuckerberg said he prefers Facebook’s policy of fully removing any posts that violate its guidelines, he’s open to hearing new ideas.
“In general, I worry that this approach has a risk of leading us to editorialize on content we don’t like even if it doesn’t violate our policies,” he wrote, “so I think we need to proceed very carefully.”