Here comes an understatement: Facebook’s failure to protect user data was well known before the company suspended dealings with Cambridge Analytica last week. What is not well known is that the transfer of 50 million user records to the controversial data mining and political consulting firm could have been avoided if the Federal Trade Commission had done its job. The FTC issued a 2011 consent order against Facebook to protect the privacy of user data. If it had been enforced, there would be no story. Facebook bears responsibility too, because it actively worked to avoid compliance.
Perhaps if the government and company had done their jobs, we would have seen a different outcome in the 2016 election.
Back in 2009, the Electronic Privacy Information Center (EPIC), which I head, and a coalition of consumer organizations filed a complaint with the Federal Trade Commission. It alleged that Facebook was overriding user settings and allowing third parties to obtain users’ private information without their consent. We had conducted extensive research, documented the problem of Facebook’s changing privacy settings, and turned to the FTC to seek a legal order.
The Federal Trade Commission launched an investigation, and in a comprehensive settlement with the company in 2011, made clear that it agreed with us. As the FTC said at the time, “Facebook changed its website so certain information that users may have designated as private – such as their Friends List – was made public. They didn’t warn users that this change was coming or get their approval in advance.” Also, from the 2011 settlement: “Facebook represented that third-party apps that users installed would have access only to user information that they needed to operate. In fact, the apps could access nearly all of users’ personal data – data the apps didn’t need.” Much of this was in our original complaint.
The FTC’s findings were accompanied by a series of legal requirements imposed on Facebook. Under the settlement, Facebook was “required to obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences” and “barred from making misrepresentations about the privacy or security of consumers’ personal information.”
In a companion case EPIC also pursued, the FTC put in place a similar consent order with Google, after the company tried to dragoon Gmail users into the now defunct Google Buzz social media service (which aimed in part to compete with Facebook). Among the serious privacy violations, Google launched Buzz by automatically turning email users’ private address books into public network directories. The FTC agreed with us that strong action was necessary.
As a result of these two enforcement actions, both Google and Facebook are now subject to 20-year oversight by the Commission and annual reporting requirements. At the time, we were elated. Although the United States, unlike many countries, does not have a data protection agency, we believed that the FTC could safeguard online privacy even as the tech industry was growing and innovation was proceeding.
We celebrated too soon. Almost immediately after the settlements, both Facebook and Google began to test the FTC’s willingness to stand behind its judgments. Dramatic changes in the two companies’ advertising models led to more invasive tracking of Internet users. Online and offline activities were increasingly becoming merged. New techniques, such as facial recognition, began to be deployed under the FTC’s watch, even though that identification practice is actively banned in many countries around the world.
To EPIC and many others, these changes violated the terms of the consent orders. We urged the FTC to establish a process to review these changes and publish its findings so that the public could at least evaluate whether the companies were complying with the original orders. But the Commission remained silent, even as it claimed that its model was working well for these companies.
In 2012, EPIC sued the Commission when it became clear that Google was proposing to do precisely what the FTC said it could not – consolidate user data across various services that came with diverse privacy policies in order to build detailed individual profiles. The problem was widely understood. Many members of Congress in both parties, state attorneys general, and Jon Leibowitz, the head of the FTC itself, warned about the possible outcome. Even the federal court, which ruled that it could not require the agency to enforce its order, was sympathetic. “EPIC – along with many other individuals and organizations – has advanced serious concerns that may well be legitimate, and the FTC, which has advised the Court that the matter is under review, may ultimately decide to institute an enforcement action,” wrote the judge.
But such an enforcement action never came against either Google or Facebook. The FTC has stood idly by as the threats to privacy and democracy have grown. If the FTC had enforced the Facebook consent order, Cambridge Analytica could not have accomplished its unprecedented data harvest. Instead, Facebook allowed app developers to access information on all of a user’s friends without the friends’ knowledge or consent. This meant that by getting a mere 270,000 people to download an app, Cambridge Analytica was able to obtain data on 50 million people. We understood that risk in 2009.
The companies themselves have also taken steps to avoid public protest that might have curbed their most egregious business practices. For example, Facebook has actively inhibited the ability of Facebook users to form groups to organize online against the company. Following a successful campaign in 2009 led by the 150,000 members of a Facebook group called “Facebook Users Against the New Terms of Service,” the company simply prohibited the creation of any user group that included the words “Facebook” or “FB.” (Try setting up a group called “Facebook users for privacy.”) This assault on online freedom was all the more stunning with the Arab Spring campaign in the background.
Perhaps there is an opportunity for change. With four new commissioners at the FTC, now is the time to begin enforcement of the original consent orders and to safeguard consumer privacy. The Federal Communications Commission (FCC), which has been sidelined in the recent privacy debates, also needs to get back on the field and take responsibility for the privacy and security of the nation’s communications infrastructure. And Congress needs a comprehensive approach to online privacy that includes real protection for American consumers across all Internet services, not an approach that favors one sector over another.
Now Congress is waking up to the fact that privacy on Facebook is a problem. During the past week, members in both parties have called for hearings. Senator John Thune (R-SD), chairman of the powerful Senate Commerce Committee, wrote this week that “the possibility that Facebook has either not been transparent with consumers or has not been able to verify that third party app developers are transparent with consumers is troubling.” And long-time privacy champion Senator Ed Markey (D-MA) said that in light of the “ongoing Federal Trade Commission (FTC) consent decree that requires Facebook to obtain explicit permission before sharing data about its users, the Committee should move quickly to hold a hearing on this incident, which has allegedly violated the privacy of tens of millions of Americans.”
No doubt Mark Zuckerberg will need to appear before Congress. But so too should the Commissioners at the FTC. Someone needs to ask the FTC, “Why didn’t you stop this from happening?”
Marc Rotenberg is President of the Electronic Privacy Information Center. He will be a speaker at Techonomy NYC on May 8-9, as part of our discussions of the impact of net giants. He also helped establish the .ORG domain, which enables and promotes the non-commercial use of the Internet.