If you were editor of a newspaper outside of New York forty years ago, you’d wait until late evening to decide what to put on your front page. That was when the New York Times released its news “budget,” telling newspaper editors all over America what it planned to print on its own front page.
Influenced by the decisions of Times editors in New York, editors elsewhere decided what news deserved top billing. In this manner, the Times and a few other large daily papers set the daily agenda of national discourse. That influence percolated through to television and radio, which mimicked the news judgment of the print media. All of it was overseen by knowledgeable, experienced journalists driven to seek truth.
Today, social media—and Facebook in particular—play a comparable role. According to an October 2019 Pew Research Center study, 55% of U.S. adults get news from social media at least part of the time, and about half of Americans get news from Facebook specifically. Like the influence of the Times four decades ago, Facebook’s extends far beyond what its algorithms and content moderators choose to distribute. Now every newsroom in America games its coverage to travel widely on social media, in order to grow readership and revenue. As BuzzFeed’s publisher famously outlined in a 2017 TED talk, this means creating content that’s emotional, that makes people feel happy or sad or that feeds their self-identity.
The result, say Pew and many others, is a massive degradation of quality. Even as so many Americans rely on this flawed media, more than half of them say one-sided and inaccurate news on social media are “very big problems,” because the platforms favor clickbait or biased material.
Facebook is now cementing its role as the globe’s agenda-setter. It’s putting in place an “Oversight Board,” a sort of supreme court of content that will hear “cases” from users who object to Facebook’s decisions about whether a piece of posted content—call it news—should properly have been removed. The board is mostly designed to shift responsibility for deciding whether something is incendiary or harmful away from the company, in effect reducing the heat on Facebook itself. Only the most controversial or confusing posts will be routed to the board for determination. The vast majority of the service’s billions of daily posts will continue to receive only the most cursory, instantaneous algorithmic review.
Facebook’s board will have 40 members, selected by the company, and among other things will advise it on how to decide what’s newsworthy. In some ways, it will function like the editorial board of a newspaper or the group set up by Dow Jones in 2007 to oversee the journalistic independence of The Wall Street Journal shortly after Rupert Murdoch bought it. That latter group’s authority quickly came into question when the Journal’s managing editor was replaced shortly after the purchase—without the committee being informed.
As a private company reaching close to 2.5 billion people worldwide, Facebook can publish whatever it wants. But unlike the New York Times and its other editorial brethren, it does not face liability for publishing defamatory material (including “fake news”), thanks to Section 230 of the 1996 Communications Decency Act, which exempted interactive computer services from such lawsuits while allowing them to censor material they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
This gives Facebook and other digital platforms incredible leeway to do whatever they want with content. In addition, although Facebook also functions as a massive communications platform, like a phone company; as a broadcaster, like NBC or CBS; and as a cable network, like CNN or Fox, it faces none of the regulations that drive up costs in those businesses. Establishing a “supreme court” of Facebook content ought to have U.S. regulators challenging the right of a $550 billion company, controlled by the richest 35-year-old in history, to operate with virtually no oversight.
Europe has decided this is nuts.
In three major decisions, the European Union has created greater liability for Facebook and other social media when it comes to copyright infringement, defamation, and video on the platforms. In the summer of 2019, the E.U. adopted a directive that makes social media companies directly liable for material uploaded by users without the consent of a copyright owner, ending a previous regime that created liability only if the platform failed to take something down after being notified.
In addition, last year the E.U. changed its definition of a broadcaster, classifying social media that carry video, like Facebook and YouTube, as essentially television broadcasters. Among other things, that means they must provide more protection for minors, putting in place controls such as age-verification systems and easy ways for users to flag objectionable content like hate speech. The move follows a major tightening of online privacy laws earlier in 2018 known as the E.U. General Data Protection Regulation (GDPR).
What really has both Facebook and U.S. free speech advocates in a dither is a European Court of Justice (ECJ) decision in October 2019, after Facebook appealed an earlier Austrian ruling. The ECJ ruled against Facebook, declaring it must adhere to Austrian law and remove, across its entire global platform, content that was found to defame an Austrian politician. The politician, Eva Glawischnig-Piesczek, had sued Facebook to delete comments that called her a “corrupt oaf,” a “lousy traitor” and member of a “fascist party.” By ruling Facebook must take down the content globally, the ECJ is effectively demanding worldwide enforcement of defamation laws of individual E.U. countries.
All these European moves grow out of the E.U.’s “digital single market” strategy and are aimed, in part, at protecting traditional media players that have lost revenue to Facebook and Google. There is continent-wide outrage at the decline of local media, and experts largely lay the blame on the platforms. It will not be surprising if the E.U. comes up with yet more ways to rein in what it sees as an internet environment that’s out of control.
In a Wall Street Journal op-ed announcing his Oversight Board, Zuckerberg explained that Facebook needs help making content decisions, but certainly not from the government. Positioning himself as a free speech advocate, he wrote: “I don’t think it’s right for a private company to censor politicians or the news in a democracy.” The op-ed is part of a larger lobbying effort to persuade government to continue allowing Facebook to decide what should and should not circulate on its massively influential network. In a Georgetown Law School speech in October 2019, Zuckerberg further defended this right, invoking the Constitution, Frederick Douglass and Rev. Martin Luther King Jr. He was immediately attacked by the presidential campaign of Joe Biden, which called the speech “a feigned concern for free expression” designed to protect the company’s profits.
Facebook plans to choose the initial members of its Oversight Board later this year, aiming for it to start operating in 2020, and says its rulings will be binding. The members will be paid by a trust set up by Facebook, an effort to further distance the company from influencing the board. (At a recent Fordham law conference, the rumor was that members would make a six-figure sum for this part-time gig.) It will be interesting to see which world luminaries Zuckerberg chooses to help him regulate his own company, and who agrees to do so. Whatever happens, the Facebook supreme court is unlikely to dispel overwhelming societal concern over the consequences of social media content.
Since hate speech, violent videos and other material on social media lead directly to real-world physical harm, political persecution, and the distortion of democracy, we might ask whether it makes sense to leave the solution to the businesses that created the problem. Would we do so for any other industry? Of course not.