Many citizens, businesses and governments expect privacy to survive in a virtual world. But with so much data about all of us out there, how it’s actually used may be as important as some abstract notion of “privacy.” The challenge lies in negotiating the complicated divergent interests of many parties.
Callaway: I’m David Callaway. I’m the editor of “USA Today.” The advent of the Internet of Things and of wearable technology has caused a gold rush in new data, new data usage, and new data abuses, ensuring it’s going to be one of the top stories of this year, if not our generation, for journalists such as myself. The pace of the news cycle has been unrelenting. Even in the last week, we’ve seen Tim Cook of Apple turn on some of his fellow tech companies, accusing them of privacy breaches. We’ve seen investigative journalism reveal that the US government has greatly expanded its surveillance program to include foreign hackers lurking, coming into the US or into corporate America. We’ve seen the government get hacked, probably by China—at least that’s what they suspected—in one of the biggest hacks of all time. And we’ve even seen bizarre speculation that our government is hacking other countries to find out information about its own citizens. So clearly the cycle has gone to the extreme.
With me today to talk about privacy issues and info sec are four distinguished panelists. To my far right is Horacio Gutierrez, corporate vice president and deputy general counsel of Microsoft Corporation. Horacio led Microsoft’s efforts in Brussels through most of the last decade, fighting—or at least trying—to educate the regulators about what was going on with big technology. Sitting to his left is Brad Burnham, managing partner of Union Square Ventures. To his left is Robert Quinn, senior vice president of Federal Regulatory and chief privacy officer of AT&T Services. And on my near right is Julie Brill, commissioner of the Federal Trade Commission. Thank you all for joining us.
Brad, I want to start with you. You had an interesting comment when we spoke last week ahead of the panel. You said data is the new oil. Everybody wants it; no one quite understands what it is or what it might do in the wrong hands. What’s happening out there and how can it be protected?
Burnham: So, actually, I said a lot of people say that data is the new oil, and that they say it’s the new oil because they think of it as this asset that sort of fuels the digital economy. But I said that what people have not yet begun to understand is that it’s not solely an asset. It is also, in many cases, a liability. And if you ask the CEO of Target whether he thought data was an asset or a liability after losing 80 million records, I think he thinks of it as a liability.
Callaway: Julie, you are leading a dramatic increase in the scope of how the government regards data and its role in protecting consumers. How do you take what’s happening, with all of its Wild West aspects, and make it less of a consumer problem without over-regulating?
Brill: Well, it’s a challenge. You know, we struggle at the Federal Trade Commission, just like every governmental agency does, to keep up with the incredible pace of technology. With technological change now, we’re probably about six months behind industry. We used to be maybe five years behind industry, but six months is the equivalent of five years now. And you know, we bring on chief technologists and whatnot to try to really understand what’s happening.
Look, you know, there was some discussion this morning about whether we have flexible enough laws to deal with technological change. The truth is I think the answer is yes and no. We have some laws that are very good. They’re sector-specific laws dealing with health information when it’s in the hands of doctors, insurance companies, hospitals and whatnot. We have financial laws protecting financial information when it’s in the hands of banks and other financial institutions, and credit reporting information, things like that. But what we’re seeing now is those laws are great—as long as the data is in the silos that were envisioned in the 1990s and early 2000s. Those silos don’t exist anymore, or better said, data does not respect those silos. Data is flowing everywhere. So you can have sensitive health information appearing on WebMD or appearing through Google search or elsewhere, or perhaps on an Apple Watch, just as you would in a doctor’s office. So the trick is to figure out how we can still have these sector-specific laws, which are deeply protective, and which I think represent a consensus that Congress has reached—and that we as a society have reached—that certain types of information need to be protected: health information, financial information, information about kids, credit reporting. We need to figure out now how we can extend those rules to the way data’s actually flowing. And that means that when health information exists outside of a doctor’s office or hospital or whatever, how are we going to protect it? And does it deserve some special type of protection, and special information given to consumers about its collection and use? And you could use that example in credit reporting or financial information too.
So what we do is we look at the sector-specific laws, but then we also apply our Federal Trade Commission Act, which is much more flexible. It basically says unfair and deceptive acts and practices in commerce will not be allowed. So it’s a very flexible tool. It was originally designed that way. It’s 80 years old, so it’s a very old law. And it’s been quite robust, and it’s given us the ability to bring cases when consumers are really being harmed. So we’re trying to define some of the outer boundaries in which companies can act and where they will be crossing the lines. We define what is impermissible, as opposed to what is permissible—and this is a big difference—so as long as companies stay out of the area of what is impermissible, there’s a lot that they can do to innovate.
Callaway: Horacio, I think we can certainly all agree that one of the main goals is to protect consumers. But in some cases, as you’ve noted and seen in Europe, regulation can be a double-edged sword. How do we reconcile that with advances in technology by private and public corporations?
Gutierrez: You know, and I think Julie was alluding to that point too—I mean, everybody recognizes that we are witnessing—I would say—a magical time in terms of technology development. The kinds of things that are possible now because of technology, because of the Internet, and also because of what we can do with data today are really fantastic, and obviously that level of innovation is something that should be supported. So you always have to try to strike that balance between ensuring that new technologies are subject to the rule of law, that there is some reasonable level of protection of consumer rights, of deterrence of deceptive types of practices, and at the same time not to overdo it. You know, I jokingly have said that there are two things that can kill the Internet. One is over-regulation and the other one is under-regulation. Because at the end of the day you also need products that people are going to perceive as trustworthy. You know, in a sense, I think we’ve come to realize at Microsoft that when it comes to our customers’ data, our business model is becoming more like a bank’s business model, because at the end of the day our reputation for how we deal with that data is really what’s going to allow us to continue to be successful in the future—to use their data in a responsible way, in a transparent way, but also in a way that can deliver all this magnificent set of benefits and features that we want to offer.
Callaway: Bob, how does that work for AT&T? I mean, there are arguably no more regulated companies than the telecom companies. Even your brethren the banks would probably agree with that. How does that work in your world, where the ISPs are under extremely tight regulation and yet there’s data going out everywhere?
Quinn: Well, I think it’s really a challenge. You know, the FCC just declared broadband Internet access to be a Title II service, and that subjects broadband Internet access to the CPNI [customer proprietary network information] rules that are in Section 222. And, you know, we already have a lot of issues with that, which I think are very akin to what Commissioner Brill was talking about in the context of health data. When it’s within the silo, we have a lot of protections around it, but when health data’s collected outside of the silo, by a watch or something else, it’s just as sensitive, but, you know, there are no protections built into it at all.
We’ve already seen that in the telecom world. To give you an example: for years and years, who you call on your phone or who you send a text message to, that’s considered CPNI, which is protected. There are all kinds of rules about that. We have to take reasonable precautions to protect the confidentiality of that data. Yet if you have a smartphone in your pocket and the phone app is a Google phone app—I don’t mean to pick on Google; it’s any app, right?—that app is capturing who you’re calling and who you’re texting as well. And that data, when it’s collected by Google, is not subject to the CPNI rules. And it doesn’t make any sense anymore. Some of these rules don’t make any sense, and our big fear in the area of Title II is that we’re going to get a whole bunch of rules that are designed to give consumers comfort that their data’s being protected, when in fact it’s a very limited set of data and a limited set of players that the, quote unquote, “protections” are actually working for.
Callaway: Closing the barn door after the horse is out.
Callaway: What do you think of that, Julie?
Brill: Well, you know, there’s a number of things Bob said I could react to, but one of the ones I want to react to is—
Quinn: That’s good. I packed it all in.
Brill: You packed a lot in there. But the one thing I do want to say is something that I often have to tell my European colleagues and the people I deal with in Europe—because I do deal with the European-US data flow conversation quite a bit—and it is not true that we have no protections outside of silos.
Quinn: Oh, and I apologize for that.
Brill: No, that’s okay. No, no, no.
Quinn: I know you’re sensitive to that, and you’re right. I’m wrong on that.
Brill: And the reason why I’m sensitive to it, and the reason why I want to underscore the point, is because I do think we have protections. They’re not sector specific. They’re not sort of the “Thou shalt do X, Y, and Z” as you might see under GLB [Gramm-Leach-Bliley] or under HIPAA [Health Insurance Portability and Accountability Act] or under FCRA [Fair Credit Reporting Act] or COPPA [Children’s Online Privacy Protection Act]. Rather, it’s a flexible law, which I think has some very interesting implications, for instance for the Internet of Things or for other issues that I hope we get into in a minute. But what it does is, it says you can’t be deceptive and you can’t be unfair. And unfairness is dissociated from statements you make and promises you make to consumers. Unfairness is: you can’t harm consumers in a way that they can’t avoid. And I think there’s a lot of activity that can happen with respect to health information, with respect to financial information—and we’ve brought these cases—where a company can wander into unfairness territory. So that’s the one point I really want to underscore.
Quinn: You know, and I’ll tell you, my experience was in the CPNI-specific world that I was talking about, because the FCC hasn’t done those types of things that you have. They haven’t used their CPNI rules, at least yet, or interpreted them to be like your unfair or deceptive practices standard. I think the FTC is very sophisticated in the way it approaches privacy, and we’ve had very good experiences working with them, and we’ve had hard experiences working with them too. We’ve paid some fines. But I think that’s a good environment, and I think you guys have a really good understanding. You say you’re six months behind, which is the equivalent of five years. I think you’re miles ahead of the agency that is now going to have rulemaking authority over privacy as it relates to broadband Internet access service. I think they’re trying to catch up to you guys, but I think we’ve got a long way to go.
Callaway: Brad, one of the people who is picking on Google is Tim Cook. He said in a widely reported speech last week that others in his industry were essentially violating privacy issues, that Apple would have none of it, that that’s not the business it wants to be in. What’s going on there? It looked to me like you’ve got these champions of market forces suddenly turning on each other.
Burnham: Well, so I think that Tim Cook is in a business that isn’t really defined by data, and Google and Facebook, as examples, are. I think he’s separating himself, and he has the opportunity to separate himself for that reason. But I think that there’s a larger conversation to be had here, which is we don’t really even have an understanding of whose data this is. You know, I’m a huge believer in markets. I’m an investor. I believe in capitalism, I believe in markets. I believe markets are wonderfully flexible and adaptive, and I think that regulation largely, in most contexts, is destructive to that kind of flexibility. But we’ve had 3,000 years to develop our notion of what physical property is, and we’ve stumbled around a little bit recently on what intellectual property means. I mean, patents and copyrights are hotly debated today. We have never started the conversation about what rights in data mean. So I personally believe that if we start that conversation it will lead us to a different conclusion than if we try to remedy each specific misuse—what I would think of as symptoms of a misunderstanding of whose data it is. If we go after each one of those, it’s kind of a whack-a-mole game, instead of doing the very heavy lifting of thinking through what it means to have a right in that data. You know, Facebook doesn’t create any data, right? Facebook creates the environment within which we interact with them and create the data. And so if you argue that consumers have an interest in that data, you begin to unpack this in an interesting way.
Callaway: Horacio, with regard to market forces versus regulation, what can you tell the FCC, based on your experience with regulation in Europe, about being careful not to go too far in interfering with market forces?
Gutierrez: I’ll make two points. First, I’ll acknowledge Brad’s point that, you know, we don’t know what data is. This is a very complicated topic. There are some aspects of data where very clearly we can say, “this is the customer data,” and then there’s data that you derive from customer data, and then it gets really complicated after that. But we’re at such a nascent point in the development of the data market as such that it is, I think, absolutely right that there’s a lot we need to understand better, and we certainly need to be careful about regulation before there is more clarity there.
The global point is a really interesting one, because so far we’ve been talking about, you know, sort of the US regulatory regime and how it’s evolving and things like that, but we have to have this discussion in the context of the whole globe—because these services and these technologies are global in nature. And the approaches that different countries and different regulators around the world take to these issues are very different, and they’re informed by their own biases related to technology, their own societal values that are legitimately different than ours. And this panel is focused on privacy, but this issue about the collision of this data technology world with other rights—human rights beyond privacy, like freedom of expression, religious freedom, freedom of political association and things like that—we actually face those issues almost on a weekly basis as we’re trying to think of creating these products and services and offering them in Asia and in Europe and in Latin America and everywhere.
So I think that while it is clear, and absolutely obvious to say, that these emerging technologies cannot exist in a bubble that’s exempt from laws and regulation, one needs to be careful that the resulting apparatus of regulation internationally is not so disjointed that it is impossible to offer products and services. I mean, it used to be we would make a product in the US—many of the technology companies are American—and that single product could then be translated, very likely localized, and sold anywhere around the world. What’s at stake right now is whether in the future we’re going to be able to continue to do that, or whether things are going to be disaggregated and broken down—because there are going to be data residency requirements in some places, and there might be, you know, censorship and content moderation requirements in others—so that it becomes really impossible to have a global service and reap some of the benefits of it. Some of it I think is going to happen; the question is how far.
Callaway: Julie, you want to respond to that?
Brill: Sure. Again, lots in there. But, you know, in my conversations, not only with European commissioners in the EU commissions and with the DPAs in Europe, but also through the International Conference of Data Protection and Privacy Commissioners, where I’m on the executive committee and deal with them quite a bit—yeah, there are some different perspectives, there’s no doubt, and one of them is whether privacy is a fundamental right versus a consumer issue. I think that’s sort of a major dividing point. Although I often remind my European friends, we do have a constitution, we do have constitutional rights to privacy, but it’s often vis-à-vis government, not vis-à-vis companies. They say, “Oh right, I forgot about that.”
But I think there are lots of issues where we are working towards a similar goal globally. A lot of this is taking place with respect to new technologies, and I think one of the best examples is the Internet of Things. Another really good example is data brokers. Now, in Europe, they’re just beginning the conversation around data brokers, which is a conversation that we’ve been having since I’ve been on the commission—it’s been four or five years, because this has been one of my number one issues. We did a very deep dive, a study of data brokers, to figure out the exact kind of profiling you were talking about: What can you derive from consumers’ data? What are the inferences you can make about consumers? How are consumers getting segmented and profiled—for commercial reasons, but sometimes these commercial reasons can be very significant for consumers, like where will they be in the customer service line when they call up? Will they be at the end, or will they go fast forward to the beginning? Are they a trustworthy consumer? Are they who they say they are, or are they going to have to go through a lot more security clearance before they can get to a particular website or get to their account? Some of this is good stuff, but some of this is problematic. And one of the things that we found with respect to data brokers is some profiles are fine—like, you know, do you like gardening, do you like dogs, do you like cats, or do you like dogs and cats, whatever, right? But other profiles were deeply troubling. Things like “ethnic second-city struggler” or “urban scramble,” both of which were profiles largely populated by African Americans and Latinos who were mostly single parents struggling to make it and living in an urban environment. Code words for certain types of folks.
Now, even there it’s a complicated conversation, because on some level you can see a bank wanting to use that information to offer an introductory product—to try to bring those consumers out of a financially disadvantaged situation and into a really good entry product. Other times you’re going to see scam artists using the exact same information to harm those individuals. So it’s a complicated conversation that we are having, and this is a conversation that is beginning in Europe now.
Callaway: How do we turn that into transparency for consumers, right? I mean, there’s murkiness even at the corporate level. What does the AT&T data broker supply chain look like, right? When I give my information to AT&T, I just assume they’ve got it, right? But there must be some supply chain, and where are the weaknesses, right?
Brill: The weaknesses are everywhere in the data broker system, unfortunately. There are the furnishers of the information, there are the data brokers, and then there are those who use the information. And one of the phrases I really like to use when I talk to companies is: don’t be an ostrich. Really figure out what is happening to your data, where it’s going, and then, if you’re using this data, how you’re using it and whether you’re using it in an appropriate way. So often when I speak, the one thing I want companies to take away is: don’t be an ostrich.
The other very important message is there are lots of tools we can give consumers to try to navigate this system. I talk about creating portals and dashboards for consumers, whether it’s in the Internet of Things, through the command centers that are going to be in everybody’s home, you know, governing the phones and the stoves and the dishwashers. That can be a great portal for informing consumers about how their data is being used and what choices they might have about it. You could do online portals with respect to data brokers—how are consumers being profiled?—but we can’t place so much of the emphasis on consumers. It is going to be way too much of a burden on them to try to navigate this system. We need to start thinking—you talk about data as a resource—I think of consumers’ attention as a precious resource, and we need to start thinking of it that way. And that means that companies need to start doing this from the get-go: privacy by design, security by design. They need to start thinking about these things and building them into their products and services.
Callaway: That brings me to my next question. You just said it. I think a recent study said most consumers are five times more worried about their security than their privacy. In a sense they’ve already given up on the privacy aspect, right?
Brill: I’ll dispute that, but keep going.
Callaway: Okay. Bob, are privacy and security quickly becoming the same thing?
Callaway: Yeah, and it’s really the companies that are targets. That’s where the money is, that’s where the data is. So, Brad, I can be, you know, the most diligent consumer and read up on everything and have all my privacy settings, but I still have information that I’ve given to companies that is at risk. With all the news of hacking that’s been coming out, really since Target, and moving faster and faster, are we going to be talking mostly about security next year at this time, or is privacy still a major bugaboo?
Burnham: Well, I think we will be talking about the security of personal information because companies will continue to collect it and it will continue to get hacked. But I think the reason for that is it will take a long time for us to begin to reassess our notion of whose data this is. In an alternative universe—which I don’t think will exist a year from now—you can imagine a scenario where users’ data is stored in a user-controlled data store and companies go to the data instead of the users going to the company. In that world, the data store wouldn’t be on a server in somebody’s basement—because that’s too much work; you know, consumer attention is very valuable and scarce—but rather it would be a third party, a company that emerges to manage that data on behalf of users, and users would choose a data store based on its values and the trust they had in its brand. And that user would probably then indemnify the data store, because the data store would be working on behalf of the user. Where it gets interesting is if companies are extracting data from users without users really having a full understanding of what’s happening, and then that company gets hacked, and it turns out that the company had data that the user didn’t even understand it had—then I see why a user is upset about that loss. Whereas if they have a consensual relationship and a defined contractual relationship, including an indemnification, it’s a much more balanced relationship.
Callaway: I want to open to questions in a minute, but did you want to respond?
Brill: Well, so I agree with everything these gentlemen just said about privacy versus security, but we’ve done a number of cases where you can see how closely these two really are related, and how they really are two sides of the same coin. So I’m just going to throw a couple out there, just as examples. Our very first Internet of Things case involved a company called TRENDnet, which had offered video feeds that you could access through an app, so you could look into your own home and watch your kid or your babysitter or whatever you wanted to do. And the service was hacked, and these video feeds, which were supposed to be highly secure, were compromised—the hacker was able to post them online. So these video feeds of people in their homes, you know, doing very private things like people normally do in their homes, suddenly went out there kind of in the clear. Privacy problem? Absolutely. Security problem? Absolutely. Very, very closely related.
Similarly with Snapchat, which is an issue that we didn’t have time to talk about. When you think about competition around privacy—you alluded to it in terms of Google versus Apple—what I see happening now is there is a tremendous amount of competition on privacy, on the attribute of privacy. Snapchat, Whisper, and a lot of other services are differentiating themselves through their ephemerality, which is really a form of privacy. But what happened with Snapchat is, although it said its snaps were going to disappear and would be private as a result, it was actually vulnerable to hacks, and the snaps were able to be downloaded and obtained. Privacy problem? Absolutely. Security problem? Absolutely. So they can be closely related, despite the fact that I also agree with you that they need to be treated somewhat differently, and security is a 24-7 job—I completely agree with that, absolutely.
Quinn: Two sides of the same coin is something that—we use that terminology all the time.
Callaway: All right, we’re going to open to questions, but real quickly, Horacio?
Gutierrez: Just one word. I, you know, as the person whose team has to advise our product groups, we don’t get to choose whether it’s privacy or security. It’s got to be both and they’re very connected.
The other thing is, we’ve talked about the protection of the data by the companies that provide the service that collects it, and the protection of the data from hackers. A topic that we haven’t covered—and maybe we won’t have time to cover today—is, you know, unauthorized access to the data by governments, either our own or some other government.
Burnham: That’s the next panel.
Gutierrez: All right.
Callaway: Let’s give some folks out there some time to ask some questions. Do we have anything? Yes, right there.
Highland: Hi, my name is Bernadette Highland. I have a question related to the education of the people who are in charge of making policy and laws, in relation to a comment Andrew made earlier. There are maybe 10 people in Congress who really understand even the generalities of the difference between the web and the Internet. And I’m not asking them to be able to define what TCP/IP stands for, just the basics. And here we have a world moving extremely quickly, and probably in this room are some of the top minds around government and law and technology, but there aren’t a lot of these people, and a lot of things are happening so quickly—and frankly big companies, some of whom have representatives in this room, are undermining international efforts to do things like create international data exchange standards. The Privacy Aware Web—a project led by Tim Berners-Lee, the inventor of the web—was sabotaged by private interests. So we talk about this, and there are some big names up here who are saying “we support this” and “privacy is key,” but there’s an awful lot for you to gain by keeping that information locked up in silos. Why aren’t we doing things in a more distributed manner, with international collaboration, to build enterprises like the web, which is the most robust, scalable information system on the planet?
Gutierrez: I’ll just say something. The point about international collaboration, I think, goes back to one of the first points I made here. We think it is critical. We think that in the absence of a more transparent and collaborative discussion with regulators around the world, not only do you have the problem of the emerging patchwork of disparate regulation, but the reality is some of these problems involve global phenomena that need to be tackled consistently—and you’re going to face the impulse by each regulator to apply their rules in an extraterritorial way. I mean, it’s one thing to think about, you know, the democratic process and some of the fundamental guarantees that one would enjoy in a country like the United States—or in some other countries around the world—versus, you know, some other country that doesn’t have the same system of checks and balances and that may be using the same argumentation in order to gain access to customer data in other parts of the world. So this notion of extraterritoriality, and the potential abuse of it, makes it really essential that there be a dialogue on an international level—perhaps even an international instrument that emerges out of this—to make sense of this potential patchwork of disparate regulations.
Burnham: I’ll pick up on another aspect of that question, which was that policymakers don’t understand the technology well enough to be able to prescriptively architect the right market structure, and I think that’s a real risk right now. So I think the first instinct should be “do no harm”—though that doesn’t mean just sitting back. The next instinct should be to try to identify principles that have a kind of universality, even potentially across borders, so that they’re defining markets that work—as opposed to prescribing specific behaviors on the part of companies. That would be a better outcome than having them try to actually describe the specific behavior that they want.
Kirkpatrick: Horacio has been admirably restrained in not talking about one of the most interesting examples of this problem of international privacy protection that he’s deeply involved in. I wanted to just encourage you to briefly describe the Microsoft Ireland situation, and then I also wanted to ask Julie what she thinks of that after you describe it.
Gutierrez: Yes. Basically, we are in litigation against the government of the United States over a warrant it issued trying to get, through us, access to the data of an Irish citizen who had used our services. Essentially, we view it as an attempt to exercise jurisdiction extraterritorially. And we think it is symptomatic of the problem I was talking about: in balancing the privacy rights of an individual against law enforcement’s interest in access, or in other cases national security or public safety considerations, governments in different parts of the world, motivated by different reasons, sometimes overstep some of the core principles that are the foundation of the legal system. This is not new. We’ve seen it throughout history, from John Adams’s Alien and Sedition Acts to Abraham Lincoln’s suspension of the writ of habeas corpus, to the internment of Americans of Japanese descent, to the Church Committee’s findings on counterintelligence abuses. There are all these episodes in the history of the United States and of other countries in which the pendulum has swung so far against privacy rights that, when we look back later, we come to regret it. In the case of the Japanese internment camps, Congress had to essentially admit that it had erred, and Fred Korematsu, who brought the case to the Supreme Court, was awarded the Presidential Medal of Freedom in 1998.
Callaway: Julie, what do you think of that?
Brill: Well, I really don’t want to comment on the particular case, but I do think the extraterritoriality issue is an important one to think about. Again, without commenting on what’s happening with respect to the particular warrant at all, there are real issues around the kind of extraterritoriality you’re identifying. But there is also extraterritoriality that can be beneficial. For instance, when we enforce Safe Harbor, we have companies that have promised to respect the principles established through Safe Harbor. We, the FTC, can protect not only American citizens but also European citizens, and this is actually a very important component of Safe Harbor. If we didn’t have some aspect of extraterritoriality with respect to Safe Harbor, we would probably not be in the discussions we’re in now with the Europeans, arguing that Safe Harbor is a data-flow mechanism that ought to be retained, because we at the FTC have the authority to enforce it. The US SAFE WEB Act is another example: it lets us go after companies in other countries that are harming US consumers, and obtain relief against US companies that are harming consumers in other countries.
So I don’t disagree with anything that you’ve said, Horacio, about some of the concerns and some of the horror stories that you have raised, but there are times when extraterritoriality—when used properly, when used with restraint, when used in appropriate cases—is actually going to be beneficial, not just to the consumers involved, but also to companies that want to operate through some sort of international mechanism. Because without enforcement, it’s not going to happen.
Callaway: We have one last minute here, so I want to get in one more question to the four of you. Just a quick answer. John Chambers says that the hacking issues plaguing Target and corporate America will get much, much worse before we get a handle on them. True or false?
Gutierrez: It’s so hard to predict. I hope not, but it is a possibility. Technology can be broken.
Burnham: Unless we figure out a way to decentralize the data and give users more control, it will continue to be a problem.
Quinn: Horacio’s answer. I hope not, but hackers are good.
Callaway: All right. Thank you folks for coming, and thank you distinguished panelists.