Session Description: Every time we engage with a connected device, we’re generating data and value. More often than not, that value does not accrue to us. As data becomes the most valuable asset we have, why don’t we have ownership or control? Could we make it happen?
Kirkpatrick: I’m just here to introduce Katie Benner, a longtime friend and colleague who worked at Fortune when she was just starting out and now is like a major star at The New York Times. We’re so pleased to have her talking about data ownership as a civil right and whether that’s the future we’re heading toward.
Benner: Thanks, David, it’s always nice to see you. Always nice to be here as well, and thank you for coming to talk today about whether or not data ownership should be a civil right. I think it’s the sort of conversation that’s been brewing for a long time, but it has started to come to a head over the past couple of years with a few key moments. Whether it was Snowden, whether it was the Apple encryption fight, here we are now and everybody’s thinking about, “What do we do with our data, who has it, and why?”
I’m just going to kick it off by introducing all of our panelists and diving into questions. Please, I encourage you, at any point during the discussion—I’m not going to do a Q&A at the end—start raising your hands early and often. I want to incorporate your questions into this discussion. Because there are so many of you here. You’re a great resource. You’re all really sharp thinkers, and I don’t think it seems fair to force you to hold everything back for the last 10 minutes of the panel. So please, as you raise your hands, I’ll call on you. Sound good? Awesome.
Okay, so I will start on the far end here, with Christiane zu Salm. She is the publisher of Nicolai Publishing and Intelligence, which is one of Germany’s oldest publishing houses. It’s so great that you could be here from very far away. We’re happy to see you.
And then, sitting next to her, is Marta Tellado. She is the CEO of Consumer Reports, which is a nonprofit dedicated to consumer safety and empowerment. And she’s also worked on the Aspen Institute’s Domestic Policy Group, and served on the Ford Foundation board, so thank you so much for coming.
Seated next to her is Dr. Eric Topol. He is the director of the Scripps Translational Science Institute, with a focus on genomics and drug development policy. He was previously the chair of cardiovascular medicine at the Cleveland Clinic and has a lot of interesting ideas about medical data and how we should own medical data and keep it more protected.
And finally, seated next to me, is Sabrina Ross. She’s currently an attorney at Uber with a focus on international privacy and security, but she also worked at Apple with a focus on privacy, a company that we all know has very strong opinions on that topic.
So thank you all for joining us and for being here. And I think that maybe—I’d love it, Christiane, if you could really set the table for us. You had some really interesting ideas earlier about what it means—why should we even begin thinking about data as a civil right? What lays the groundwork for that discussion? Because for a long time, data was just thought of as a byproduct of technology companies or their consumer enterprises—just something that was produced that those companies could use as a secondary resource. Why should it be a civil right?
zu Salm: Yes. First and foremost, thank you so much for inviting me here from Europe. That’s basically thanks to David, who’s always open-minded to Europe. He knows where Europe is, he knows that Europe exists.
[LAUGHTER]
zu Salm: I, as a European, I truly appreciate that. Also, this morning you were mentioning Macron, the president of France, and I don’t know if you know, David, that he’s a philosopher. And that’s exactly what I think we more and more need—philosophical thinking. And my background is this publishing house. Nicolai Publishing was founded in 1713 in Germany, and it stands for the Age of Enlightenment. And I truly am convinced, even if that might sound a little bit exotic to you, that the thinking of the philosophers from 300 years ago might be a useful source to look at to solve today’s problems. Because I think it’s very much not about whether we think about the challenges ahead, but how we think about them. This is why I want to share with you a little bit of the philosophers’ thinking and see what they have to say about this big question, whether the ownership of data should be a civil right.
I think we can all agree that the ownership of data has a lot to do with privacy, and privacy is one of the most fundamental rights and freedoms in our liberal societies. And we Europeans, and also the philosophers, have a certain view in that regard. The philosophers have made a big distinction between public and private for a long time. There was, for example, John Locke, the British philosopher. He tied freedom and equality very much to privacy, and he said that free and equal individuals should be able to make their own decisions and make their own life plans, and hence privacy is a fundamental right. Kant—he even said that privacy was human dignity. Nothing less than that. And privacy laws should reflect the moral imperative to treat others with respect. And John Stuart Mill was the first to claim that privacy is a social good that benefits both societies and individuals. And believe it or not, that very idea found its way into the European data protection law that was released recently, where privacy rights have been declared human rights. And I think in that context, it’s really interesting to discuss the ownership of data rights because it touches our fundamental values of privacy and of a self-determined life.
And, if you find it a bit strange here on the west coast of the U.S. to look at philosophers from 300 years ago, I’m happy to say that I met Beth Comstock. I listened to her speaking at a conference in Los Angeles on Saturday, and she was asked by somebody in the audience, “What would your advice be to your 21-year-old self?” And she said, “Study philosophy.” So that’s where I’m coming from, and my conviction is that yes, we do have to fight for the right of ownership of data for individuals, even though it will be a long road and it will be very difficult. But I think it’s worth fighting for.
Benner: That was great. Thank you so much for that.
And Sabrina, you’ve worked at companies—one that has a very strict stance on property—I’m sorry, on privacy—and on who owns the data, and then at Uber, which, like so many companies today, is using the data it collects in myriad ways. I’m wondering if you can take us into a more pragmatic realm from what Christiane has talked about. This is all wonderful, and I think most people would agree with you on balance, but with companies like Facebook and Google being so large, they’ve already mined so much of our data. The horse has already left the barn. At this point, what, pragmatically, does it mean to have ownership? Can we even have ownership? And then maybe walk us a little bit through the differing U.S. and EU points of view on this and how that will impact tech companies.
Ross: Absolutely. Thanks, Katie.
I couldn’t agree with you more. I think anyone who practices privacy law believes deeply in ownership—if we think of ownership as control, as choice around the use of data. I think the question, when I hear how these philosophers were thinking about it, is the one Katie asks, which is: how? The research is pretty conclusive that users right now are overwhelmed by the amount of choice they have. They can’t keep up with the cookie bar we all have to click on every website they go to, or the HIPAA privacy notices that we sign. So, to be honest, when I hear a call for a new privacy law right now, I’m a little bit skeptical, because the legislative proposals I hear don’t seem to me like they will actually move the needle on on-the-ground user control. Where I hear a lot of exciting action is at the intersection of law and technology.
So, to give you one example, the California Attorney General’s office collaborated with Carnegie Mellon on a machine learning analysis of about 18,000 apps in the Google Play Store and found that only about 50 percent had a privacy policy as required by law, even though about 70 percent of them seemed to be using personal data. So, to me, that points to the fact that we have privacy laws that largely aren’t being complied with. I think they’re complied with by the big players, but not by lots of other people in the ecosystem. And so there’s this issue of how we actually bring up compliance with the laws we have. And then I think there’s absolutely a place for evolution in law, as we’re seeing in Europe under the GDPR.
One other example of the law and technology interaction I wanted to mention is the National Center for Science and Technology, which has now invested millions of dollars in a privacy research agenda. Among the questions, at the nexus of law and technology: how do we automate checking whether companies are complying with their privacy policies? How do we verify that they’re complying with individual, discrete user control choices? So I think some of the best questions right now are about whether technology is really prioritizing privacy, and I think smart companies have figured out that they need to be at the leading edge. They’re not just complying with the law; they should be investing in privacy-enhancing technologies. They should be collaborating with people like Marta, who are trying to think through really innovative ways to help users understand the controls that are already available to them and maybe develop new ones. Maybe have a single setting that’s your default privacy setting, that controls all your data used by apps, and change it for companies that you particularly trust or distrust. Yes?
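To make the single default privacy setting Ross describes concrete, here is a minimal sketch in Python of one global preference with per-company overrides. The names (PrivacyProfile, Sharing, and the example companies) are hypothetical illustrations, not any company’s actual API.

    from dataclasses import dataclass, field
    from enum import Enum

    class Sharing(Enum):
        DENY = 0      # share nothing beyond what the service strictly needs
        MINIMAL = 1   # share only data required for the requested feature
        FULL = 2      # share everything the app asks for

    @dataclass
    class PrivacyProfile:
        """One global default plus per-company overrides, as described above."""
        default: Sharing = Sharing.MINIMAL
        overrides: dict = field(default_factory=dict)

        def trust(self, company: str) -> None:
            self.overrides[company] = Sharing.FULL

        def distrust(self, company: str) -> None:
            self.overrides[company] = Sharing.DENY

        def policy_for(self, company: str) -> Sharing:
            return self.overrides.get(company, self.default)

    # Usage: every app consults the same profile instead of its own settings page,
    # so the user makes one decision rather than one per app.
    profile = PrivacyProfile()
    profile.trust("ExampleMapsCo")          # hypothetical company names
    profile.distrust("ExampleAdNetwork")
    print(profile.policy_for("ExampleMapsCo"))   # Sharing.FULL
    print(profile.policy_for("SomeOtherApp"))    # Sharing.MINIMAL (the default)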
Tinianow: I have a question. I’m Andrea Tinianow, of Global Delaware. I’m a lawyer, I consult on economic development. I guess I have one question. In Europe, there’s a real difference between the way people treat privacy and a right to privacy. Here in America, our so-called right to privacy is not written down anywhere. It’s something that’s been interpreted, and even that’s been called into question. So I think in a way we sort of have to look at that first, before we start thinking about what companies should do or what legislators should do, because there’s a stark difference, I think, in the way that privacy is treated, just sort of at the very base level.
Ross: Completely agree. We have a sectoral approach—health privacy, financial privacy, the FTC just based on “don’t be misleading about privacy”—whereas the EU has this global—
Tinianow: And it leaves so many gaps. Even if you just think about HIPAA laws—I mean, we all get these forms and it’s supposed to protect us, but it’s these contracts of adhesion, that—
Ross: Yes.
Tinianow: You know, you show up and you can’t start negotiating with the people at the front desk. You sign it, you turn it over.
Ross: Well, and just to pick up on that in just one more way, I think that companies by-and-large aren’t architecting their systems for one country or one region. So everyone in this room probably knows that Europe is moving the needle on how companies store data because you now have to be able to delete data in particular ways. That could be an incredible boon to U.S. users, even though it’s not actually U.S. law right now.
Tinianow: That’s right. And someone actually said that yesterday in speaking. They said if we don’t move fast, the Europeans—they’re going to write the laws. Good. Good!
[LAUGHTER]
Benner: Eric, this might be a good time for you to step in and talk a little bit about what you call the health data conundrum, because it dovetails nicely with this conversation. How does the consumer’s right to control their own data help solve this conundrum?
Topol: Thanks, Katie. I think it’s interesting. We would already acknowledge that data is more valuable than gold or oil, and of all the data, medical data is clearly much more valuable. If you just view it from the cyberthievery market, it’s worth fivefold more than financial data or anything else today. The problem is, of course, there have been a lot of declarations of the end of privacy—there was a special Science issue and many other papers. That may fly for certain things, but it’s not going to for medicine.
So the issue here, I think, is that people don’t own their data today. The nodes of entry are now not just the hallowed electronic medical record, but also sensors, which are getting more and more medicalized, and genome sequences, and all sorts of other ways that one will be gathering data. So it’s your body, you pay for it. But the problem is we have this profound medical paternalism, and doctors believe that patients can’t handle the truth. And this has been a pervasive problem which has prevented people from owning their data. And we have these cockamamie portals through which people try to get access to their data. They don’t work. They’re very clunky, plus they’re incomplete. And the data that you do get to is full of errors. It’s all cut and paste from one record to the next, so what you’re getting is not very good to start with.
So the way it is today denotes a lack of human respect. It needs to change, and hopefully it can. And I don’t know if it’s going to change in sync with this bigger data ownership story, but sharing research—I mean, right now we’re spearheading a big part of this precision medicine initiative of a million Americans. And we’re getting them their data. And they’re going to be using sensors, getting sequenced, and microbiome, and all these other facets. And we’re going to see exactly—we already know from lots of prior data—that when people have their medical data, it improves their outcomes. And so we hope to amplify that with a lot more proof at scale.
Benner: There’s a question in the back. Yes?
Herman: Yes, I think the fundamental concept behind a lot of this is this notion of trust that I heard brought up a number of times. My company is PsyML; I’ve been a computational neuroscientist for a long time, building credit models, fraud models, marketing models, healthcare models. And part of the reason we’re all even here talking about this goal of privacy is that people are scared of what others are doing with the data. So I’ve written some articles on a Hippocratic Oath for data science, just trying to be much more transparent about what you’re doing with the models. Because even if you do have these really strict privacy rules, and then there are these big data leakages, and there are companies that aren’t being trusted with what they’re doing with your data, that trust variable is still going to be there.
So I think we also need to focus on making what we do with the data transparent—not just less paternal, but getting rid of this idea of a subconscious use of the data. Like, “Oh, I got an ad coming at me and I don’t know why.” Getting rid of that kind of paradigm, so that we have some insight into why things are being done for us, with us, or against us, I think will do a lot for this conversation on privacy. And I just want to hear your thoughts.
Benner: That’s a great segue into what Marta is doing over at Consumer Reports, because Consumer Reports, we all know, has fought for so long to arm consumers with information they need to make consumer choices, but that has now spilled over into data for these very reasons. So could you talk a little bit about that project?
Tellado: Happy to. So you’re probably thinking, “Gosh, you’re the guys who help us figure out what kind of car I should get. What is Consumer Reports doing at this conference on a panel about data privacy and technology?” And we did not have philosopher kings as founders, or queens, but we did have some pretty forward-thinking engineers and economists who founded Consumer Reports about 80 years ago. And they founded it because they saw a lot of market failures, and I think that general core principle still stands today. They saw market failures, and they understood that when the consumer is in the marketplace, there’s no such thing as a free choice if it’s not an informed choice. And so, lo and behold, we gave birth to this notion that if you arm consumers with the information they need, they can go into the marketplace as empowered consumers and take action. So we provide a lot of tools—a lot of you know us for the ranking and the testing we do—but we also break a lot of stories and do investigative stories. We just did one on car insurance and the algorithms that have inherent bias in them when you are looking for car insurance or a mortgage or whatever. And of course we also do work in Washington. Sometimes consumer power is not enough; you need some rules and regulations to even have a fighting chance. So it’s all those tools.
But you wake up into a world—and I think this was a running theme yesterday—where there is so much innovation and so much discovery and so many new technologies that the one thing we know is that it is absolutely outpacing our ability to think and learn about this world, and about what the price of entry is to seemingly free platforms. What we know is that 60 percent of consumers take no action regarding their privacy—you know, minimal passwords—even now that we know software is coursing through all of our products in our connected world. Eighty percent of consumers have no idea what kind of information of theirs is being mined and monetized that they have no control over. A majority of consumers, however, believe that their privacy and their data are not secure, and they want somebody to do something about it.
So there’s a real paradox there in terms of the education work we have to do, and we haven’t talked enough about that. This isn’t going to get out of the gate unless we take the time to bring in the consumer perspective and actually engage consumers about how you can own your data. And if there is a right to innovation and to make money, which of course we all know there is, there should also be a right to control your digital personhood. So what we’re in the middle of right now, working with a number of partners—one of which is shaking her head, Rebecca MacKinnon at Ranking Digital Rights—is figuring out how you do that in a digital world where all of the victories we were able to score with and for consumers on rules and standards—the basic standards that allow consumers to make smarter and better and healthier choices in a fair marketplace—don’t apply to this new world. Because there are no standards. So how does an organization like ours go about creating some relevancy for ourselves and some usefulness for consumers?
So I’d love for you guys to go to something that’s called thedigitalstandard.org. We have many partners. It’s an open-source platform and activity where we are iterating on what digital standards might look like. Because quality, reliability, and value are all still relevant, but we need to be able to start giving cues to consumers about how to think about data privacy and security when they’re out in the marketplace. So that’s something we’re innovating on, and we’ve got a lot of partners on it, but I think it’s an important step in the education process.
Benner: Yes, is there a question here?
Reynolds: Yes. Josh Reynolds, independent consultant. I’m curious. You said 60 percent take no action, 80 percent have no idea, but they think more should be done. My question is, is it more that they don’t have an appreciation for how this data might be used in a way that’s not good for them? Or is it more that they just don’t even think about it? I’m just trying to say—is it education or motivation?
Benner: So just that everybody hears the question, the question was are consumers not taking action because they have just no idea that this could be a problem, or is it because they don’t know how? Is it because they don’t care?
Tellado: I would say one word, transparency. There’s absolutely no transparency. There is so much work to be done for consumers to actually understand the price of entry on some of the platforms that they’re on, or on some of the connected devices that are tracking your movements. They’re tracking your sleep patterns. Where’s that data going? Who does it belong to? There’s no transparency into that whatsoever, so it’s very difficult for them.
Reynolds: So just a quick follow-up. There’s precedent in law for this. I’m also a reformed attorney. I used to be a criminal prosecutor.
Tellado: There are a lot of attorneys in this room.
Reynolds: Not anymore. Lapsed a long time ago. But California Civil Code, Section 1701D says—
Benner: I believe you.
Tellado: It’s still very fresh.
Reynolds: It says, if I take something from you, and I know you ought to know something, but I don’t tell you that information in the transaction, that material omission of fact constitutes fraud. So if I don’t tell you something that I think would change your mind about this transaction, it’s not just transparency, it’s proactive transparency for an informed choice. It applies to a car, but apparently not to information about my disease.
Benner: Sabrina, can you talk a little bit about—or Dr. Topol—talk a little bit about why we haven’t seen more fraud cases in this area? Why that is not being used to bring, you know, to bring lawsuits to bear against companies that are putting up “check the box” warnings so we can do things like rent a house on Airbnb or download the new Apple operating system?
Ross: I’m not sure, off the top of my head, why the innovative plaintiff’s bar isn’t bringing cases like that. I guess that we, fundamentally as a society, don’t necessarily agree yet about what the harms around privacy are. Like, should someone be upset if they’re on Facebook and being monetized via advertising? I just don’t think that there’s anything resembling a consensus around what people deserve to know. I think some people, particularly in this room, think, “I deserve a lot of information about how I’m being targeted with ads and how I’m being monetized.” To be blunt, and this isn’t what I wish most of the population felt, I think parts of the population don’t care about the details of targeted advertising. They might care more about more sensitive data types like health data or financial data. That’s part of why I would just say that I’m not sure there’s a concrete will yet around general data.
Benner: I’ll do two audience questions, but I just want to ask one of the entire panel. In addition to transparency, which Marta has talked about and sort of set out for us as a really great ingredient, what are the other things that we need to empower consumers and give them their control back? Let’s put aside the question of whether or not they want the control, but what would also be necessary in addition to transparency, especially in the world we live in right now where so much of our data is already in the wild and already out there?
I don’t know who wants to begin, but anybody can jump in.
Topol: Well, on the medical front, first of all we don’t really have transparency. And back to that prior question, people sign up for apps. They don’t read anything. It’s all in there, what they’re requesting. If you go to 23andMe for your genomic data, it’s in there that pharma companies will get your data. And the same for electrocardiogram apps and a lot of medical apps today. People don’t read any of that stuff. It would be transparent if anybody actually read it. It’s not so good.
But one of the problems we have today, and why medical data matters—I would venture that most people in this room have had their medical data hacked and they don’t even know it, because 125 million people—adults—have had their data hacked. And it’s a 10 to maybe 21 ratio of people who have actually ever accessed their data through an electronic record.
We don’t have a platform that brings together all the data. It’s homeless. So whether it’s data you measure yourself, which is increasing rapidly, or the other sources of data, there isn’t just one place for it. And the problem we have is that it sits on massive servers, so it’s centralized. And that is the biggest problem. That’s why we’ve had Anthem and these other massive breaches—it’s not just the things that have happened with Yahoo or with Equifax; we’re talking about things right after those two. It’s Anthem and other health systems, and hospitals have had their information systems held hostage. A lot of them we don’t even know about, because they don’t want to tell the public that all their patients’ health data have been held hostage to pay ransoms. So this is a really serious problem—data shouldn’t be held in centralized places.
So, to answer your question, we need to decentralize it. Get it down to units of one, or family units. And that, if you talk to any cybersecurity gurus, is step number one: get it out of massive servers, because they’re the targets for massive thievery and hacking.
Benner: So decentralization, transparency. Christiane?
zu Salm: I think as long as there’s such a strong economic interest in the data and an interest in increasing efficiency, and not so much an ethical interest in terms of what we discussed before—and given the transparency issues—I think there must be a law. There must be a law. To answer your question, “What does it need?”—it needs law.
I just would like to take another minute—it’s good that so many attorneys are here in the room—to come back to what’s really on people’s minds when they don’t know anything about what they’re doing when they’re giving away their data. I think, from a psychological point of view, the convenience factor and the safety factor—or the felt safety factor—are so strong, so much stronger than any fears about “What could they actually do with my data?” For example, if you are with your car in the desert, you don’t mind giving away your data if you want to know where the next gasoline station is. So there’s the safety, the predictability of your life, and also the convenience. We all, I think, check boxes immediately if we want to go somewhere or know some information. The convenience factor is so much stronger than any fears about what they possibly could do with the data. It’s only underlying. It’s a feeling. But I think there’s no incentive to really go for your privacy right or ownership right. And this is why I think there must be laws.
Benner: So, in addition, we have changing the economic incentives around what happens with data, and trying to come up with something more compelling than convenience to underpin how consumers feel about these products. I think those are both great.
Tellado: Can I just add one additional ingredient that’s already been mentioned, but that I don’t want to lose sight of, and that is the notion of building trust in a complicated world that’s not very transparent. Is that a goal we can reinforce? We’re a social enterprise, which means we do not accept advertising, we purchase all the things we test, we have secret shoppers. But that’s not the world most companies live in. So I don’t want to lose the trust element.
Benner: Yes, and Sabrina?
Ross: Yes, so briefly, I just want to kick it back a little to your philosophical realm. I wouldn’t underestimate the importance of getting meaningful privacy and ethics training into engineering curricula. I really think that, in terms of concrete, on-the-ground impact, raising literacy among the people who are working with these problems every day would do a lot.
On the law point, under Europe’s General Data Protection Regulation, there is a right to algorithmic transparency for certain automated decision-making that affects your legal rights—think credit decisions and employment, at a minimum. I hope that that will move the needle a little bit in terms of getting companies to think through, “What does it mean to give transparency about how our algorithms work? We’re not disclosing source code. What level of transparency is most useful to users in building trust and gaining literacy around algorithmic decision-making?”
Benner: There’s so many questions out here. I’ll just start taking them right down the row. Ma’am, in the green scarf or sweater.
MacKinnon: Thanks very much. I take it this is the microphone here. I just talk into it—
Benner: Just speak as loudly as you can.
MacKinnon: Thanks so much, Marta, for the shout out. It’s great to be working with Consumer Reports. I run a project called Ranking Digital Rights, where we evaluate the world’s largest internet, mobile, and telecommunications companies on their respect for users’ rights, including privacy, which we look at as a fundamental human right. And so there you are.
And, actually, it’s not true that companies would be sufficiently transparent if you were to read their privacy policies. At least among the consumer-facing companies that we looked at, we found that the major internet, telecommunications, and mobile platforms are not providing enough information, even if you do read their terms of service, for a user to genuinely understand what a profile built about them would look like. What really is being collected, shared, retained, and used, and for what purpose? That is not being clearly disclosed, even for those geeks like myself who actually read these things.
But to speak to who cares and why it matters, and this idea that the majority of people don’t care: from a civil liberties and human rights frame, I would challenge whether that’s the right question to be asking. Because it depends on who you are and what your risk profile is, right? If you are the victim of domestic abuse, if you are LGBT in a community that is very hostile to that orientation, if you have family members in the Middle East, your perception of risk around privacy might be very different from that of a 25-year-old in Menlo Park who’s programming and designing apps. So this is why it’s really important to have disclosure, and to have projects like Consumer Reports’ that read the policies so we don’t have to, but enable us to make informed choices about what our risks are. And yes, it is subject to debate whether it’s good that a company can monetize my data or not, but I should be able to make the choice whether I want to allow the company to do that, and I should have enough information to make it. And frankly, people don’t.
Benner: It’s so interesting. So what you’re saying, on the second point, this idea that that’s the wrong question to ask—it sounds almost like we need a default standard for privacy, like what Sabrina mentioned early on in the panel. Sort of like what all of these mutual fund companies have discovered: they really need to make the default that you’re going to put 15 percent of your salary into your 401(k) or savings. And then you have the option to go in and change that to less money, but they found that if you didn’t make the default the most responsible choice, people wouldn’t change it. So we need a setting that’s the most responsible with regard to privacy, and then people can go in and make whatever changes they need. It sounds a little bit like what you’re saying.
And then there’s another—going down the row, was there another question near the back? Yes. Okay, and then we’ll—
Kim: Hello, I’m Robin Kim. You’ve hinted at a lot of this. Obviously there’s a role for education, a role for governance, a role for operating principles in ensuring privacy. Given the fact that legislators have so many different views and a whole different education and engagement curve, I would love to hear what you feel should be the priority for legislators in this debate. And also, since we know that Europe has been looking at these issues in very different ways and for a lot longer, what have you been finding in conversations your organizations have had with legislators—what have they been taking from Europe as you’ve applied it here?
Benner: Sure. Maybe we just go straight down the line: what do each of you think the top priority should be for lawmakers? Who wants to start?
Tellado: I would just say that consumers should be the top priority for lawmakers. I think there’s been a lot of incredible, fast, furious, and remarkable innovation, and I think that we can have that and put consumers first simultaneously. This is a false dichotomy and a false choice, and I would want that to be first and foremost in legislators’ minds—that you don’t have to make a tradeoff there.
Benner: Absolutely. Yes, go ahead.
Topol: I agree with Marta completely. In fact, just in vetting some of the principles with respect to medical privacy, there’s bipartisan support that people should own their medical data. Nothing’s been done about it, but there’s no disagreement between the political parties that this should be a right.
Benner: And then maybe Christiane and Sabrina, when you answer the question, you could talk a little bit about whether or not U.S. lawmakers, or people thinking on this issue in the U.S., have taken cues from Europe a little bit. Because I know that you’re both looking at that situation as well.
Ross: Absolutely. I think—to your last question—that the FTC is closely following what’s happening in the EU, as are state attorneys general, and they are cuing off of it in some ways. So to give one example, there’s not a statutory right to data deletion yet in the U.S., but the FTC has started doing a lot of work in that realm and pushing companies in that direction, and I think we might see some activity there.
I think there’s something to be said, off the top of my head, for the way Europe protects sensitive data. European law defines certain data types—health, things having to do with religion, sexuality—and gives them higher protection. And because it’s so hard to have a conversation about data writ large—there are so many kinds of data, with vastly different concerns and risks to people—I might be supportive of something that really thought about what kinds of data most urgently need more protection, among them health, of course, and how we do that. I know people like to hate on the sectoral approach. But because, again, data can be so different, breaking down types might be useful.
Benner: And then Christiane.
zu Salm: Just one other point to add to what you said, Sabrina: it would be really, really difficult to police that. Let’s think about that for a moment. Even if the laws were in place, alongside a ranking of priorities following our values—how important privacy is to society in each area—the policeability is really, really difficult. And that is something I don’t know how to solve. Does somebody have an idea about how to come up with a solution for that?
Benner: That’s right. You’re right. It’s very difficult to create legislation without any sort of eye toward implementation.
Graham: Okay, yes. So, going in a little bit of a different direction, I have a question for Dr. Topol. I used to work in innovation at Stanford Health Care, and now I’m at Humana doing the same thing. So two parts. One, do patients really want their data? At Stanford, we created what I think is a pretty slick app to get patients to actually work with and interact with their data. But it was a very, very hard, uphill battle to get these Silicon Valley people to, you know, use their data. And the second question is how can payers kind of contribute to helping give patients access to their data?
Topol: Those are really good questions. Well, one thing to note—a movement is starting, because UnitedHealth has announced that, starting September 1st of next year—in fact they acquired a company, Century Health, to do this—they’re going to package the data for all 50-plus million Americans they cover and give it to them. Now, that’s only the UnitedHealth claims-related data, but it’s a start. It shows that there’s something going on here.
The point that you’re making about people—do they really want their data? I don’t know much about the Stanford experience, but in my own clinical experience, when people get their data, they clearly become much more data-centric. And we have many studies, including randomized studies, even in cancer patients, showing improved outcomes. So I don’t think there’s any question about it, but we haven’t really done it in a significant way.
One other thing—because health care costs came up earlier in the conference—is the amount of waste caused by people not having their data. More than 10 percent of medical scans, and an even larger share of lab tests, are repeated unnecessarily because people don’t have their data, which they should have. Not to mention all the other repetition and real waste across the board.
Benner: And over here, it was—yes, Marc.
Rotenberg: Hi. Is this on? Yes? So I’m Marc Rotenberg with the Electronic Privacy Information Center. There was a great moment a couple of weeks ago in a congressional hearing when the former CEO of Equifax, Richard Smith, appeared before a committee and Congresswoman Anna Eshoo leaned over the dais and said, “I wanted to thank you, Mr. Smith, for bringing Democrats and Republicans together on something we all care about, which is the protection of privacy.” And it was a great moment.
I have to say about this panel that I’m a little bit struck by how removed the discussion is from the actual problem. There’s a lot of myth-making in the privacy world. For example, that consumer education or consumer empowerment are tools to protect privacy. This is not true. Or the paradox of privacy—that people say they care about privacy but engage in a lot of behavior that suggests otherwise. That’s also not true, because in reality they have almost no choice. The reality is also that it’s not a hard problem to solve when political institutions are functional. And I’m going to talk about this a little bit later today, so I don’t want to get into it too much now, but the European approach and the U.S. approach to privacy protection—which, by the way, were virtually identical about 30 years ago—have gone off in two different directions, largely because the U.S. political institutions can no longer engage in a meaningful debate about privacy. And that has a lot to do with the money flowing from Silicon Valley to Washington. And we, as consumers, live with the consequences. The good news on the European side is that it didn’t just lead to greater privacy protection. The effort to come to terms with the consequences of new technology and engage meaningfully in a legislative debate that protects privacy and recognizes innovation and supports commerce and enables the free flow of data also strengthened their democratic institutions. In fact, the award-winning documentary film about the European privacy law was titled “Democracy.” And it is a wonderful story about how political institutions can be made to work. And I hope we will have a similar story to tell at David’s excellent conference this year. I think it can be found in privacy.
But I’m going to say, just straight up, to Consumer Reports: you are heading in the wrong direction. You are 180 degrees off on this road. It’s not about consumer empowerment or about education. It’s about holding accountable the companies that collect and use personal data. That’s what privacy has always been about.
Benner: Before I give—before I give—
[APPLAUSE]
Benner: Before I give Marta a chance to respond—thank you, everybody—I just wanted to make a couple of responses to that. I think the idea that this is easy to solve as long as our political institutions are functional is a punt, because we are so far away from our political life being functional that that could never happen. Also, I don’t know that we can really hold Europe up as the best example of functional political life right now, just based on non-tech things like the way immigrants are being treated, Brexit, rising xenophobia, and rising anti-Semitism. So I don’t think that Europe and the U.S. are exactly as different as we might—
Rotenberg: Is the U.S. any better right now?
Benner: I’m not saying that Europe is worse—I’m saying it’s not perfect, which is what you sort of implied in that “award-winning documentary” statement. And also, saying this is going to be really easy to do once our political institutions are working is another way of saying it’s not going to happen. Anyway, Marta, please go ahead.
Tellado: Look, I want to thank you for the throwdown. I never pass up an opportunity to come right back at you, you know? I think ignoring consumer sentiment is such a mistake. I think we’ve made that mistake. We have to communicate directly. No one is ignoring the immense challenge that our institutions and our leadership face right now. Nobody is expecting things to change in Washington, and I think you’re going to see some activity in statehouses across the country; people have their eye on some of that. But to ignore the impact of so many things in our history—culturally and economically—that have been moved by public sentiment is a failure to recognize the power of democracy. So I’ll just leave it at that. I don’t think it’s an either/or. You’ve got to do all those things. That’s what’s so great about where we are right now. Nobody has the last word, but we have a role to play in making sure that happens.
Kirkpatrick: Yes. Gathering a few strands together and trying to keep a thread going in the conference itself—given that the power of the net giants is such a fundamental, essential part of what we’re talking about here these two days—I wanted to repeat an interesting comment someone made to me here at the conference. He drew an analogy between Facebook and Google—and the fact that data is the new oil, in the well-used but quite reasonable cliché we hear periodically—and Saudi Aramco, which I thought was very interesting. Because the point he was making is that if you look at the behavior of these companies and of historic companies that have been in comparable positions, the way he put it was, nobody ever gets an asset of great value that they didn’t pay for and doesn’t abuse it. And, you know, we are in a moment where the quantity of data held by a small number of entities is so vast, I just don’t think we should—Uber is an interesting company that has some of its own questions to answer, but I think it is actually on a very interesting new path, organizationally, with the new CEO, etcetera, which is very promising.
So anyhow, I just wanted to throw that out because I don’t want to let the words Facebook and Google not be very concretely addressed in this discussion. And I do think the Aramco analogy is interesting.
Benner: I think that’s fair too, and I think we’ve sort of hinted at that, talking about the fact that the horse is out of the barn because of those two companies. And to Marc’s earlier point, if we want to get real about privacy, we have to get real about how much clout they have in D.C.
There’s somebody back in the back who’s had a question for a while.
Audience Member: Hi. I happen to work with Millennial and Gen Z audiences in music and social media. And I spent a long time working in livestreaming, where the idea of privacy is completely different for this generation. They’re willing to share every aspect of their life, but I think what’s different about it is that, on many of these platforms, the social media influencers, the musicians, the people who are sharing—they’re actually monetizing the content they create. They don’t just get brand deals on Periscope or YouNow. There’s tipping, so they actually can make a living by creating the content.
And so I think there’s this idea of data having value to the person who owns it, and of their being able to monetize it, just like Facebook and Google monetize the content you create for them. And to decentralize this even further, there are social media platforms now moving to cryptocurrencies. And on a cryptocurrency platform, the user who participates will make money. The creator, who’s creating the content and the programming, will make money. And Google and Apple will not be taking their distribution fee from this kind of social media platform. So I think there’s something in looking at the monetization of data, and if you can give that monetization back to the consumer, the consumer is going to be interested in that data and the ownership of it.
Benner: For the panel, have any of you seen an example where consumers have been able to make money off of their own data, and that’s had momentum outside of Instagram influencers? Not yet? It’s looking like—
Topol: There are some companies starting to crop up now where, if you want to get your genomics done inexpensively, you can sell your data to pay for it, or for a large part of it. There’s starting to be a recognition that this is indeed the case.
Benner: Okay. Other—sir, with the beard? Hello?
Anderson: Thank you. Mark Anderson, Strategic News. I’ll do my best to make everyone upset at once. David, I’ll honor your request too.
So, Facebook is not my favorite operation, and I really appreciate what you just said about owning your own data. And I think the American legal system, if I understand it properly—and I’m not a lawyer—was based on the British system of property law that came out of Blackstone. And one way for us to maybe cross the aisle in Congress on all these problems is to go back to that idea of property, which you mentioned clearly. I think that Facebook is exemplary of, and not alone in, the complete cannibalization and misuse of news. It is the home of lies and fake news. Russia used it to dominate our election last time around. In my opinion, it shouldn’t be allowed for anyone, not just Facebook, to digest somebody else’s written content, which is driving all of the reporters out of business, and all of the true publishers of news out of business. Because they get their stories stolen by Google and Facebook and others, and put up as if they own them. And then they get paid by advertisers. And they may or may not pay anybody else who created that content. And that, to me, is a gigantic fail in the news world. If we want to continue to have real news, we need to be able to pay the reporters and editors who create real news, and not allow Facebook or Google to steal that money from them—to literally take that advertising money out of their treasuries.
My guess is that if we took this approach, in the Blackstone sense that you described, so that each consumer owns his or her own created news content—or created content—and we applied it also to Facebook and Google and everybody else who’s digesting the real news created by The New York Times or someone else, we would solve many problems at once.
Benner: I thank you very much for what I’m going to now turn into an impassioned plea to subscribe to the New York Times.
[LAUGHTER]
Benner: Thank you.
[APPLAUSE]
Benner: We have time for two more questions. So, yes. Sir?
Herman: Yes, just to follow up a little bit on that, I’m going to ground some of this with an example. Disparate impact is a recognized problem in credit underwriting, for example, and this is about companies being more accountable for what they’re doing. I could build a model that would take someone’s name, map it to ethnicity, and probably give a discriminating model score for credit risk. The only thing protecting people from me doing that is a couple of manual compliance checks, and that’s in a highly regulated environment. Google published an article recently from a couple of academics developing a formalism for how to account for biases in models, where you basically describe it with, say, a binary indicator: someone is in a disparately-affected group versus not, and is your model biased against them? These are things that are very doable, but it’s really up to the companies and people like myself who work with data to be transparent about this, because it’s very hard to know that this is even available. There’s a great book called “Weapons of Math Destruction” that gets into this problem. And so I really think it has to come from the communities who are deeply involved in this to expose the solutions that we’re very aware of, to expose the biases in our systems. And then a lot of things could come from that—maybe a score: how much is a company disparately impacting people versus not? Things like that.
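One way to ground the binary-group idea Herman mentions is a simple disparate-impact check on a model’s approval decisions. This is an illustrative sketch in Python with made-up numbers, not the formalism from the paper he cites; the 0.8 threshold is the “four-fifths” rule of thumb from U.S. employment-discrimination practice, used here only as an example cutoff.

    # Illustrative disparate-impact check on a credit model's approval decisions.
    # The group labels and outcomes below are made up for the example.

    def selection_rate(decisions, groups, target):
        """Fraction of applicants in the target group who were approved (1 = approved)."""
        picked = [d for d, g in zip(decisions, groups) if g == target]
        return sum(picked) / len(picked)

    def disparate_impact_ratio(decisions, groups, protected, reference):
        """Approval rate of the protected group divided by that of the reference group.
        Values well below 0.8 are a common red flag (the 'four-fifths' rule of thumb)."""
        return (selection_rate(decisions, groups, protected)
                / selection_rate(decisions, groups, reference))

    decisions = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0]   # 1 = approved, 0 = denied
    groups = ["A"] * 6 + ["B"] * 6                     # "B" stands in for the binary group indicator

    ratio = disparate_impact_ratio(decisions, groups, protected="B", reference="A")
    print(f"disparate impact ratio: {ratio:.2f}")      # ~0.67 here, below the 0.8 rule of thumb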
Benner: Sure. So, as the panel has talked about before, it sounds a lot like you also support the idea of companies needing to take the lead on this because the regulatory environment is going to be really fragmented. Great. And then just one last question from the audience in the back.
Audience 2: Can I add this quickly?
Benner: Absolutely.
Audience 2: I think we can also add investigative journalism to that.
Tellado: And so one of the things that we do, which sounds very much like that, is that we have been looking at and purchasing big data stores, also looking for that algorithmic bias. And I’m glad you plugged “Weapons of Math Destruction.” We’ve been doing some work with her, and she’s been incredibly helpful to us. But that is exactly right. What we found—and when you tell consumers this, of course, they’re overwhelmed—is that their car insurance rates have nothing to do with their driving record and everything to do with composite credit scores that are being put together with a lot of algorithmic bias. But data journalism is something we need more of, because we can’t wait.
Benner: Sorry, Marc, I just wanted to make sure you said that the fragmentation between the U.S. and the EU is going to be resolved soon. Is that what you’re saying?
Rotenberg: No, credit scoring.
Benner: I thought you were saying that the regulatory framework, which is what I mentioned, was going to be resolved.
Benner: And then, last question.
Mailliet: Elodie Mailliet from Getty Images. I just came back from a conference with Google and Facebook about publisher revenue, and the interesting piece—to reemphasize this too—is that the reason publishers have had such a large problem is that Google and Facebook now have more data, and hence they can do much better advertising than publishers can. And I think the point there is also that the pressure point is for publishers to get that data back, and I think that’s where the pressure should be coming, so that, essentially, publishers are in a position to do better advertising too, moving forward.
Benner: I’d say publishers are trying to do that, but negotiating with those companies is like negotiating with the Mob, so it’s very hard.
So, to that, I would like to ask one last question of this panel. If we give consumers more power over their data—if in an ideal world that could happen, or in part it could happen, what do you think the positive change would be? Why are we doing this? Why are we having this conversation? Why would we be fighting the fight? What will the real practical, positive change be of a consumer having more of his or her own data and more control over it?
Topol: Well, in the medical space, it’s pretty remarkable where this will go. In the era of AI, it’s clear that if you have all your data, and it’s seamlessly being updated, that could potentially provide the guidance for preventing illness, or for far better managing illness. And that’s why it’s so essential that we nail this thing down now. We know that today AI is narrow, limited to certain things like images and patterns, skin lesions, and, you know, retinal disease, but soon it’s going to get broader in terms of its power to help prevent illness. So the consumer who has all their data, and has the appropriate validated algorithms, hopefully will be a healthier consumer in the years ahead.
Benner: Okay. Marta?
Tellado: I would add to that and say a healthier, safer, and fairer marketplace is what we’re striving for.
Benner: Christiane?
zu Salm: I would just add, on the marketplace: if data is the new oil, and if you made consumers participate in the economic value of that oil, it would eventually—even though this is maybe very utopian—contribute to lowering social inequality.
Topol: Or make it worse.
Ross: I would echo what others have said. Specifically, I think technologizing enforcement, so we could more quickly drive bad actors out, as well as raising the general bar on how much companies are investing in thinking about this. So that inadvertent discriminatory impact in algorithms becomes—you know, it’s sexy in some fields, but it should be top of mind more broadly.
Benner: Great. Thank you so much for joining us. Thank you for being here for all of your questions, and enjoy the rest of the conference.