The Wall Street Journal this week began publishing an extraordinary series of damning articles about Facebook. The five so far consistently hit one theme: that the company is often very well informed about the harms its systems cause society, but nonetheless chooses either not to act or to lie in public about what it knows, or both. What makes the series unusually authoritative is that it derives from an extensive cache of internal company documents, many created by teams at the company tasked with detecting and preventing harms.
Here’s the Journal’s summary of its first four articles: “Facebook’s own research lays out in detail how its rules favor elites; its platforms have negative effects on teen mental health; its algorithm fosters discord; and that drug cartels and human traffickers use its services openly.” The most recent piece, published Friday Sept. 17, documents how the company has been unable to prevent its systems from being overrun by anti-vaccine propaganda. Again, all this comes not from dissident ex-employees or outsiders, but from the company’s own employees, as documented in internal presentations.
Sadly, for those like me who have followed Facebook closely for years, none of this is a surprise. But the Journal series may bespeak a major shift, even for those who are jaded and expect little other than evasion and apathy from this shockingly powerful company. The articles suggest it may start to be seen widely as an outlaw enterprise, one that cares little whether its activities bring harm to society. The series demonstrates a company reflexively making choices about how to respond to challenges based not on a desire to reduce harm but rather on an obsession with user growth, user activity (which generates page views that enable more advertising), and especially, public perception.
One depressing and telling new statistic revealed in the Thursday installment of the series concerns global efforts against misinformation, which is rampant and has damaged country after country. Internal documents show that Facebook employees and contractors in 2020 spent 3.2 million hours finding and often removing false or misleading posts and ads. But a shocking 13 percent of all that work, and no more, was spent on content outside the U.S. Yet, the article too gently points out, 90 percent of Facebook usage is in other parts of the world. The reason for the discrepancy, based on the evidence presented here as well as accounts in the recent book The Ugly Truth, is that the company is primarily concerned about negative press coverage in the U.S.
The article makes the urgent point that Facebook does not even have the capacity to perform oversight in many places where egregious harms are taking place because of the platform. It gives the example of Ethiopia, where “armed groups use the site to incite violence against ethnic minorities.” The documents show that few, if any, company employees monitor such activity. The company has no speakers or reviewers for some languages used in Ethiopia, nor are the artificial intelligence tools Facebook touts as a silver bullet to address harmful speech used in those languages.
In classic obfuscatory company-speak, spokesman Andy Stone responded to such concerns by telling the Journal this week the company has safety programs in “over 50 languages.” Sounds good, right? In fact, Facebook officially supports over 110 languages, and scores more are used on its platform. Facebook’s net profits after tax will exceed $40 billion this year. Could it afford to ensure hate is monitored and managed in all those languages?
The Journal notes the documents show that employees themselves feel “embarrassment and frustration, citing decisions that allow users to post videos of murders, incitements to violence, government threats against pro-democracy campaigners and advertisements for human trafficking.” The Journal also writes that these internal reports “offer an unparalleled picture of how Facebook is acutely aware that the products and systems central to its business success routinely fail and cause harm.”
“We’re not actually doing what we say we do publicly,” a company researcher says in one document. That statement summarizes much of the entire series, and even Facebook’s standard attitude.
Other articles in the series describe equally unsavory actions. The first installment revealed a vast list of 5.8 million prominent users worldwide, many of them politicians and public figures, who are exempt from the company's normal enforcement policies regarding speech. "These people can violate our standards without any consequences," according to an internal presentation. The list was created, according to the documents, specifically to prevent what the company calls "PR fires," instances of bad publicity.
Another article that has generated tremendous reaction, including demands from U.S. senators for more information, focuses on Instagram’s effects on teen self-image, and Facebook’s disingenuous public statements about what it knows about harmful effects from its platform. The senators are especially concerned because a letter sent from Mark Zuckerberg to Congress in response to a formal request for company data on this problem omitted mention of a key company study that found 32 percent of teen girls said Instagram made them feel worse if they already felt bad about their bodies.
Though the Journal has repeatedly sought comment from Facebook on all this, executives have shown little sign of contrition or even a willingness to admit the company has done anything untoward. One tiny exception came in the first article, about the program exempting 5.8 million users from policy enforcement on speech. Spokesperson Stone conceded that criticism of the program was "fair."
Many people find it hard to explain how one company could make so many mistakes, often so willfully. Why doesn’t this company want to be a force of order and progress? The answer is pretty simple, it seems to me. Facebook has no governance. It doesn’t have a CEO. It has an emperor. What he cares about is growth, so growth priorities prevail over harm remediation.
His control over the company is total. If he doesn't like a board decision, he fires its members. But everyone needs and benefits from oversight. No one is perfect. Zuckerberg may be the richest person his age in human history. He may have unfailing confidence in his judgment and motives, but in central, fundamental, obvious ways, he is failing as a leader.