Mark Zuckerberg Changed Facebook Policies To Go Easy on Hatemonger Alex Jones
Zuckerberg continues to side with the alt-right and republicans, ignoring his company's own policies.
A damning new exposé by BuzzFeed News reveals that CEO Mark Zuckerberg personally ordered changes to Facebook’s content policy in order to go easy on Infowars hatemonger Alex Jones. Zuckerberg’s actions set off a chain reaction, allowing far-right hate groups to escape scrutiny on the site and making it easier for them to recruit and indoctrinate countless users in the lead-up to the January 6 insurrection attempt at the Capitol.
Facebook is bad https://t.co/eDpccvaO4U
— Molly Jong-Fast🏡 (@MollyJongFast) February 21, 2021
It’s no secret that Facebook has changed the rules for republican leaders and alt-right mouthpieces. From refusing to ban Donald Trump’s violent hate speech, to exempting his campaign ads from fact-checking, to dragging their feet on any sort of civic or moral accountability, Facebook has continued to ensure that hate speech and white supremacy can thrive on their social network.
Jones, who has called the Sandy Hook massacre a “false flag” and antagonized the victims’ grieving families, was banned from Facebook and Instagram in May 2019 for violating Facebook’s policy on “dangerous individuals and organizations,” which requires Facebook to also remove any content expressing “praise or support” for them. But one month earlier, Zuckerberg himself had intervened, contradicting established Facebook policies to be more lenient. Jones, who spouted transphobic and anti-Muslim hate speech, was given a massive loophole in his ban: while Jones and Infowars pages were banned, fans and followers remained free to post and share support for him.
A former policy employee told BuzzFeed News, “Mark personally didn’t like the punishment, so he changed the rules,” with another employee adding, “That was the first time I experienced having to create a new category of policy to fit what Zuckerberg wanted. It’s somewhat demoralizing when we have established a policy and it’s gone through rigorous cycles. Like, what the fuck is that for?”
One way of reading this story is that someone who took place in the Brooks Brothers riot in 2000 has used his position at Facebook to maintain the influence of people who took part in the Capitol Insurrection https://t.co/eEtvSWtvN3
— Don Moynihan (@donmoyn) February 21, 2021
“Zuckerberg basically took the decision that he did not want to use this policy against Jones because he did not personally think he was a hate figure,” said another former policy employee.
Facebook spokesperson Andy Stone responded to the allegations, saying, “Mark called for a more nuanced policy and enforcement strategy” around the Alex Jones situation. Zuckerberg’s response kneecapped efforts to control the spread of Jones’s hateful ideologies, but he’s not the only one culpable for Facebook’s favoritism toward republican extremists.
Many place the blame squarely at the feet of Joel Kaplan, the vice president of global public policy at Facebook. Kaplan, a DC republican who served as White House deputy chief of staff under President George W. Bush, has repeatedly blocked and delayed efforts to prevent the spread of misinformation on the site. In addition, he has frequently gone to bat for various conservative firebrands who have violated Facebook policies.
Kaplan oversees both public policy (aka government relationships and lobbying) AND content policy (what can and can’t be posted) for the site, making him the gateway through which extremists escape bans on the platform. It’s an unusual set-up, as fellow tech giants normally split those two roles. It also means Kaplan is actively undoing the work of his own employees, who craft and implement company policy. One researcher described the situation saying, “When the company has a very apparent interest in propping up actors who are fanning the flames of the very fire we are trying to put out, it makes it hard to be visibly proud of where I work.”
Kaplan was the subject of a memo sent by a data scientist which detailed his direct efforts to subvert site policy to protect republicans:
“In December, a former core data scientist wrote a memo titled, “Political Influences on Content Policy.” Seen by BuzzFeed News, the memo stated that Kaplan’s policy team “regularly protects powerful constituencies” and listed several examples, including: removing penalties for misinformation from right-wing pages, blunting attempts to improve content quality in News Feed, and briefly blocking a proposal to stop recommending political groups ahead of the US election.”
Kaplan was also criticized for attending the 2018 Supreme Court confirmation hearings for Brett Kavanaugh, which offended Facebook employees. Not only was Kaplan endorsing Kavanaugh’s nomination, but his appearance signaled to employees that he didn’t care about the sexual assault allegations against Kavanaugh.
When questioned about Kaplan’s outsized influence, Zuckerberg responded, “That basically asked whether Joel can be in this role, or can be doing this role, on the basis of the fact that he is a Republican … and I have to say that I find that line of questioning to be very troubling, … If we want to actually do a good job of serving people, [we have to take] into account that there are different views on different things.”
Ultimately, the responsibility rests on Zuckerberg’s shoulders. And while the company has since brought on an independent Oversight Board, Zuckerberg still has final say on what goes. “Joel [Kaplan] has influence for sure, but at the end of the day Mark owns this stuff,” said one former policy employee. “Mark has consolidated so much of this political decision-making power in himself.”
(via BuzzFeed News, featured image: SAUL LOEB/AFP via Getty Images)