The data you generate on- and off-line about what you watch and look at, buy, borrow, even what ails you, is tracked, quantified, packaged, and sold. Your virtual self and your reputation are being quantified, commoditized, and monetized. The dystopian critique is gaining adherents, from novelists to heads of state worldwide. If someone is making money from this info, shouldn’t you?

Fertik:  Thank you for joining us at the end of the hallway here in the most important room. And for posterity, it’s clear there are about 4,000 people in the room. Just can’t see them behind the camera. So if you’re way in the back back there, we’ll send a microphone back to you, if you want to ask a question. Otherwise, you could just get up and shout. We’ll probably ask most of the people up close to ask a question.

My name is Michael Fertik, CEO and founder of reputation.com. The topic today is “Is the Internet For You Or Against You?” It’s a very exciting and broad topic, but we’re going to try to narrow it to the concept of data and how data are collected and used. And chiefly personal data, not data about businesses, but data about human beings: how those data are collected, with or without permission, with whom they may be shared, how they might be used, how they might be passed on. Is this basically a good thing, is this basically a bad thing?

As in all things, we are capable of having a very nuanced conversation that is tepid, and instead we are going to try to have a controversial conversation that is exciting, that people will want to watch for a long time. I am very lucky to be joined by a distinguished panel, and there’s also Andrew Keen on the panel—Andrew, that was a joke for you. Thank you—and it’s a terrific group.

You may know that I have a particular point of view on this topic, but I’m going to simply stick to the role of moderator and invite what I think will be a very exciting discussion. We’re going to do a couple things differently. We won’t do intros. I will do a one-line intro for everybody and get right into it, hoping to make this as interactive as possible.

So I would welcome intervention, I would welcome questions, I would welcome jokes. And the only thing we won’t accept is an ad hominem attack, except about me, which I will probably weather.

Andrew, we’ll go to you. Andrew is a very exciting person. I think of him as the Christopher Hitchens of the Internet, and I think he thinks of that as a compliment. Andrew has written extensively on the Internet and is sort of your contrary-wise contrarian type who writes about the Internet. He’s the guy who says the Internet has produced bad art, bad culture.

He’s most recently written a book, or published a book, called “Digital Vertigo,” which is about how social media is actually bad and atomizing and fractalizing for people, places and things in the world. And he’s now writing a book called “The Internet is Not the Answer,” which is literally going to be the follow-along, in a way, to Christopher Hitchens’ “God is Not Great,” and here he is.

Andrew, could you begin by stirring it up and tell us how, with respect to data in particular, the Internet is against you?

Keen:  Thank you, Michael, and it’s nice to be here. Well, the book that just came out, “Digital Vertigo,” is a remake of Hitchcock’s movie “Vertigo,” which, as you know, is a movie about someone who fell in love with someone who didn’t exist, the blonde woman who turned into a brunette. The beautiful heiress in San Francisco didn’t really exist.

And I think the same is true of what’s happened on the Internet in terms of business models or lack of business models or lack of obvious business models. What’s happened on the Internet, with all these free services, free products, is that consumers have been tricked—consumers aren’t very smart in some ways, I guess.

Just like Jimmy Stewart in “Vertigo,” they deserve to be tricked, but they don’t have their eyes open. They don’t understand the way this world works, so they—I just heard some woman talk about MOOCs. I wrote it down, what she said. She said it’s this idea that everyone can always have everything. So she was talking about these MOOCs, which are the next big thing—in other words, the next big disaster, and she said MOOCs, this wonderful thing. Anywhere, any time, at no cost, which is of course absurd. It’s a lie.

Maybe she wasn’t consciously lying, but the same is true of all these services on the Internet, the Facebooks, the Google Pluses and all the rest of it: they’re free, and they claim to represent humanity and service everybody and enable the social and all the rest of it, but actually they’re in the business of acquiring our data and packaging us up.

So like in “Vertigo,” where Jimmy Stewart essentially became the product—and you sit around a table and you don’t know who the poor person is. It’s us. It’s all of us. So in terms of all this data, we have become—again, I wish I had been the person who said this originally, but we are the product. We have been packaged up by all these services, and I think we need to pay for our stuff. I think that the free model is a failure on every level, except for people who have made a huge fortune from these services, and we need to be much more honest—using, again, the words of Silicon Valley, transparent, in terms of making sense of this world.

Fertik:  Is there anything in particular that worries you, apart from kind of low-level anxiety or kind of a general sense that’s bad? What do you think are the outcomes that worry you?

Keen:  There was a great book called “The Information,” which you should all read.

Fertik:  “The Information”?

Keen:  Yes, “The Information.”  David, who wrote the—James Gleick. It’s a very good—

Fertik:  —his name.

Keen:  G-L-E-I-C-K. At one point he says we’ve become data. He’s got this whole kind of scientific argument, and I think what I most fear is us becoming data. What I most fear is the idea that humans can be reduced to data and that we can be bought and sold as data.

So just as I don’t like the idea that we’ve been unknowingly packaged up as the product and sold as—to these big advertising companies, I also don’t like some of these solutions—maybe some of you have come to it—the idea that we can make money through selling ourselves as data on the network.

I think it reduces us, it trivializes us. It’s a dirty kind of business. I don’t want to be reduced to data. I don’t want to be reduced to data. I don’t like the idea of getting money in exchange for data about myself.

Fertik:  So your objection is philosophical or aesthetic.

Keen:  Well, my objection is across the board. It’s philosophical, it’s aesthetic, it’s economic.

Fertik:  Political as well?

Keen:  Absolutely. It’s political in the sense that I think it encourages a culture of rights, rather than responsibilities.

Fertik:  Culture of rights.

Keen:  It’s a consequence of that. The Internet is generating this idea that we all have these rights. It’s built around the cult of the consumer, the idea that all these services exist for the purpose of the consumer, and we don’t have responsibilities.

And again, I think it’s part of the kind of dishonesty of what’s going on. People are not acknowledging the way this system works fully, and responsibility is something that I think both companies—big data companies, but also users need to understand, if we’re going to make it a really civil place. It’s one of the reasons why the Internet is lacking civility, is because people only have a sense of their own rights, rather than responsibility.

Fertik:  James, you have something to say about this? James is one of the founding partners of Bloomberg Beta, which is a new venture fund that’s been launched in and through the Bloomberg universe.

Cham:  So I think that much of what Andrew says is right, but it’s right in the same way that the king of England 300 years ago would have complained about dirty capitalists who were trying to steal money from him and take advantage of the fact that he naturally owned all the land, and that life was much nicer when he was the one able to tell everyone that he should be the main patron for Shakespeare.

And on the one hand, I agree sort of for the small number of elites who right now have access to lots of data and who really understand themselves and are super-enlightened that they have a good time; but I would argue that sort of what’s misunderstood is you have this chance to expand the pie enormously. And there’s a great deal of value being offered to people all over the world, thanks to something like a network.

And on the one hand, I would agree that it’s like the early days of journalism. Think about the rise of journalism in the 1910s with Pulitzer: these guys were muckrakers, and they were ridiculous and gaudy and quite embarrassing. That said, they dramatically increased the pie. Suddenly a lot of people had access to information; and then maybe 30, 40 years later, everyone becomes respectable, and then they start complaining about television.

Fertik:  So you think watch and wait. You think people are getting smarter all the time, they will have more sense of how their data are being collected and used, they will grow up, they will mature, they will become better consumers, contrary to Andrew’s prediction, and all of a sudden—is that your general point of view?

Cham:  So I think there’s a straightforward market argument, which is that there will be a set of innovators who create new services, some of which will cross the bounds and end up making bad decisions.

At the same time, I’d also argue that we’re at a point where we understand enough of this economy, or whatever you want to call it, that politics now needs to get involved. We have a set of people who are now well-informed enough, and understand this basic model well enough, that we need to make political decisions around where the power sits.

Fertik:  One second. Before we get to Cyriac, can you give us an example of you think a company that is treating individuals well, that is not represented by this panel, and can you give an example of a company that through its decision to over-invade, over-share, over-something, failed? You’re an investor. You see a lot of deals, see a lot of companies—

Cham:  Certainly there are guys on the edge, but I think the obvious example, if you read “The Everything Store” about the history of Amazon, is that you see something on both sides. Within the same organization, within the same department, they made decisions that on the one hand incredibly enabled consumers to have access to goods that they knew they wanted or didn’t know they wanted, in one case through a set of e-mails; and on the other hand, decisions that felt like they overstepped the boundary.

And the crazy thing about that, it’s not about someone in an industry. It’s literally the same person in an organization making on the one hand, a good decision; on the other hand, a bad decision. I think those sort of rules need to be figured out. They get figured out over time. It’s only been 20 years.

I’m reluctant to talk about specifics, but if you look on the edge right now, look at early stage start-ups, you see guys making decisions around trying very, very hard to protect the consumer. I think of guys like Euclid—you must know Euclid Elements—they monitor WiFi pings and understand who goes into stores and who goes out of stores. They try very hard to protect consumers’ interests. On the other hand, you have to ask yourself, what if someone else comes in and undercuts them, someone less savory about how they treat the consumer, and will they win—

Fertik:  That was the article about Google versus Facebook. Google was not willing to sell certain data, and Facebook was. And so the CPM battle begins. Cyriac, you are the CEO and founder of Shopkick, a very successful start-up that is—I’m going to summarize it, and you correct me—giving consumers very interesting and appealing offers when they go inside major retail stores, based on information that you may know about them, and also about the stores, and targeting offers to them live in the store to give them a discount or special access to a product. Is that right?

And as a guy who collects data and has made decisions that may or may not be pro-privacy, pro-consumer, can you tell us how you navigate this important question, and what you think the future will be?

Roeding:  First of all, yes, it’s correct what you’re saying. Shopkick is an app that allows you to see which products are hot and interesting at the stores around you, but it doesn’t only give you discounts. It also gives you rewards just for being there. So the moment you walk through the door of a Target store or a Macy’s or Best Buy, you earn rewards.

The rewards come in the form of a currency called kicks. You turn these kicks into anything you want, like gift cards or movie tickets, Coach handbags, whatever your heart desires. And the more you visit stores, the more you will collect, and the more you can reap.

Fertik:  This is the consumer culture that Andrew’s so worried about.

Roeding:  I want to say a couple things about this.

Keen:  I thought your company was cool, but—

Roeding: —the end of the world. Let me make a few statements. First of all, I love that you’re shaking up the panel. I think that’s great. Secondly, I fundamentally disagree with your premise, and I don’t disagree for the—I really don’t disagree for the sake of Shopkick or the business we’re in or whatever we’re talking about here.

I disagree fundamentally with the premise of not taking consumers seriously. I actually find it ironic that when you say you’re sort of like in the business of pointing out what’s wrong with the world, that at the same time you’re actually not taking the people seriously that you’re talking about, and that you’re saying consumers are pretty dumb—I think that’s what you said earlier—and that consumers really don’t know what they’re doing.

I’m not a friend of these lawsuits where a guy drives an RV, then decides to leave the steering wheel and goes to the back of the RV, the RV keeps driving, obviously, and then the RV crashes and the guy sues the company, because there was no sign saying he couldn’t leave the driver’s seat while the RV was moving. And then he gets awarded $1 million or $10 million or whatever the award was.

Fertik:  It’s America, Cyriac. You’re German.

Roeding:  Or he goes to somewhere and takes a hot coffee and spills it over his leg, and oh, the coffee was hot. I didn’t know that.

Fertik:  That was an old woman.

Roeding:  I fundamentally believe that consumers are not stupid, and I actually believe it is in all of our interest to take consumers seriously, because we are all consumers. So if you’re saying everyone—consumers are dumb, you are talking to yourself as well. And not just that. I think it’s a matter of respect for others, and I actually think consumers are not dumb at all.

And there are many examples you can take. Take Facebook, the company that you probably hate the most. When they launched Beacon—not iBeacon—it didn’t go down so well with consumers. They knew it was going too far. They didn’t want it, they voiced their opinion, and it was retracted. And so I think consumers know very well the game they’re playing.

Fertik:  So you think you’re creating a culture of responsibility, not just a culture of rights.

Roeding:  It’s patronizing to talk down to consumers and say, you guys don’t know what you want, you have no clue what you’re doing. In fact, I think people know very well what they’re doing, and especially the younger generation that grew up in the digital economy knows very well what it’s doing.

For example, take Shopkick. At Shopkick, we have a very strong belief that consumers are the kings and the queens, and that no one else rules. And this is not lip service. This is what we believe. This is what we’re built on. We would be stupid, by the way, if we didn’t believe that.

This is not a PR thing. You’re just plain dumb if you build a consumer company and you think the consumer is not queen or king. You are just plain stupid, because it will backfire. And if, in moments of decision-making, you please the retailers or the brands instead, it’s only a matter of time.

In most cases we’re trying to create a win-win-win, and it’s working very well. The retailers get foot traffic, the consumers get rewards and better deals, and the brands get product engagement. We heard from Jim about engagement. People hold up products, literally holding them, and they get extra rewards for that.

So it’s a win-win-win; but there are points when you have to make a decision. Do you go left or do you go right? And you can’t always please everyone. And in that moment, you have to have your true north. The true north needs to be, in a consumer company, the consumer. That means —

Fertik:  Are there other successful consumer Internet companies you can think of that don’t have this very enlightened attitude that you clearly have towards consumers? Or do you think all successful entrepreneurs in the consumer realm also think of consumers this way?

Roeding:  I think the question is answered in this way:  Short-term, there are a lot of companies that don’t respect consumers. Long-term, there are none, because they won’t survive. It’s just a matter of time. You can make a lot of money in a short time by doing bad things. It’s very easy. We all know that. There’s lots of ways to scam people. And in the long run, it will always come back to you. Life is very fair in the long run.

Fertik:  So let’s take a concrete example that may not be in the consumer’s service, in your opinion. There are data broker companies and data collection companies that have collected data for a long time—and then we’ll get to Dan in a second—and yet they have been very reluctant to show consumers what they know about them.

And so I wonder how you square this clearly very passionate and well-articulated point of view with very successful companies that have been around for 30, 40 years, that have huge amounts of data about consumers and sell products to consumers, and yet do not express a willingness to share with their consumers even what they know about them: they won’t even give them a copy of the data they have about them. How do you square that?

Roeding:  I love that statement. So I completely agree with you that that is a problem, and it does exist. And it exists particularly in those industries that need a major shakeup. For example, do you love your cable provider?

Fertik:  No.

Roeding:  Do you like—do you love—

Fertik:  No.

Roeding:  4,000 people. Nobody—

Fertik:  No one in the back?

Roeding:  Do you love your phone company?

Fertik:  No.

Roeding:  Okay, so there we go. Those are the industries that need shakeup. You love your phone company? NSA does too. So in other words, there are industries that need shakeups, where they are monopolistic or duopoly structures or oligopoly structures that are not healthy.

And over time, regulation can help break those up, anti-merger regulations and so on, but it won’t fundamentally change the industry structure if the industry tends to concentrate over time. So what you do need is innovations that fundamentally break up the structure.

For example, cable companies have their first competition ever, I think, in 20 or 30 years, through IP. First time, right? Unfortunately, the IP companies are the phone companies, so that doesn’t help too much, but there will be another—

Fertik:  But there’s other stuff that’s happening.

Roeding:  Another innovation will help break that up. Those industries need to be shaken up.

Fertik:  So I love this, because it is a pro-consumer, pro-entrepreneurship, disrupt-everything view, which is good for the entrepreneurs who are here.

Dan, you have been very, very patient. Dan is the partner in charge of innovation and strategy at Accenture, and travels all around the world and teaches people how to think about these things.

Elron:  Thank you. I’m a consultant, so I did a little homework. I’m afraid I didn’t know that much about Shopkick, so I asked our person who’s in charge of e-commerce, and he said, oh, I love the application. He says, every time I go to a mall where there’s Shopkick, I go in, I give all my kids the smartphone, and I ask them to take bar code pictures of everything out there. And they come back, and they love it. It’s a great game. So that’s how I learned about Shopkick.

Fertik:  So Shopkick, a secondary—

Elron:  So consumers are smart, yeah. Consumers are getting smarter. They know how to respond—

Fertik:  They are accumulating points with the child workforce.

Elron:  Yeah. Andrew, maybe this is a conversation for the bar tonight, but if this is what he thinks about consumers, I wonder what he thinks about voters in this country, because it’s the same people. We let them vote, which is democracy’s fundamental principle, and then we don’t trust them to be on the Internet. I’m not sure about that.

I asked a friend who runs a company that actually has data about what consumers do on the Internet. This company sees the social log-ins that people use to get into web sites, and he gave me some statistics.

They did a survey among their users. The company is called Gigiette, a Silicon Valley company. According to them, 63% of consumers believe that companies will sell their data, 52% believe that companies will send them spam, and a fairly high number believe that companies will also post to their Facebook pages if they log in with Facebook. They have also noticed, for e-commerce sites, a 3% decline in registrations for every extra permission that consumers are asked for. That tells you—

Fertik:  Permission to send you something, permission to share something.

Elron:  Yeah. Consumers are careful. Another anecdote, which is more of a single data point: companies that agreed not to do those things saw a 300% increase in conversions, because—

Fertik:  Companies who do not require permissions get higher conversion—

Elron:  Took the permissions off the web site. And they saw a huge lift in terms—

Fertik:  Comes back and says you have to put the permissions back in.

Elron:  Or, like 20 years too late, they put in this thing: this site sends cookies. It took ten years to figure that out. Who cares?

Fertik:  It’s not so much “who cares?” as “what can you do about it?” So I wonder, is it that consumers know, or do they think, I’ve got no choice? What’s the correct conclusion? If I want to read “The Financial Times,” I have to accept the cookies.

Elron:  Yeah. Well, if you look at this data, it looks like consumers are learning very quickly, and they’re pretty smart about what they want to divulge, to whom they want to divulge it and when they want to agree to a relationship that involves sharing data.

And as you said, James, we’re early, it’s early. In ten years, I think the young people, the ones who are kids now, will know exactly what’s happening with their data. So it’s a—

Fertik:  A very important question. Andrew set it up in a lot of ways, and then Cyriac—I set you up to defend yourself, maybe. So the question is, you said consumers are dumb, and—

Keen:  Did I actually say that?

Fertik:  Well, let’s not fully dismiss—

Elron:  You said they’re not smart. I wrote it down.

Fertik:  Say consumers learn fast, that the Internet moves perhaps father than consumers can learn. Say consumers are perhaps learning the lesson of five or ten years ago now, but the Internet is accelerating at two years for every one year of time in terms of innovation and data collection. Is that a fair fight? Is that a fair fight?

Even if we believe consumers are smart over time—I’m an optimist. I believe that. Is it a fair fight? Is there something else that needs to happen, whether in politics, as per James, or something else, Andrew? Or do you want to defend yourself more?

Keen:  I did want to clarify—what I meant was—I didn’t say I didn’t like people and I didn’t say people are stupid. I said that one of my problems, one of my critiques of the Internet is that it presents people exclusively as consumers. And actually I would contrast the consumer with a citizen, and it comes back to your question about politics.

I don’t think that the Internet is seasoning people for citizenship. It’s turning all of us exclusively into consumers, in the sense that all we think about, apparently, is going into stores and giving our 6-year-olds smartphones so that they can click on a dress and we get a discount. If that’s what you want, fine.

Fertik:  What’s the alternative?

Keen:  This whole issue of whether people get what’s happening on the Internet—I don’t think they do. I mean, how many people actually read the terms of service? I think LinkedIn is one of the better companies, one of the companies I trust more, actually; but on the other hand, LinkedIn knows so much about me that every suggestion of someone I should connect with is someone I had forgotten I even knew 30 years ago.

How does LinkedIn know so much about me? Probably because I haven’t read the terms of service carefully enough to click some sort of box so that they don’t access my e-mail. Facebook is another good example. Google. I mean, Cyriac talks about the Internet as if it’s this democratizing force. Look at Google. It’s much more monopolistic, much more powerful.

Fertik:  So what about LinkedIn? What’s the down-side if they know who you should connect with? Why is that bad for you?

Keen:  I don’t want a service that knows—seems to know more about me than I know myself, but let me just come to Google. Consumers use Google all the time. This is the dominant company, 80% of the search market in southern Europe and so on, much more monopolistic than telecom companies or big media companies or Hollywood or anyone like that. And Google is cleverly connecting all its services, encouraging, pushing its users to become—to have accounts on Google. So it knows more and more about us.

Google is the quintessential company here in terms of driving this new economy, and I don’t think—I really strongly believe this—I think a very small minority of consumers understand what Google is doing.

I think the same is true of Facebook. It would be interesting to see what David says about the typical Facebook user. Do they read the terms of service? Do they understand the way their data is being used? I don’t think they do. I think consumers, for the most part—and I include myself—are lazy when it comes to reading these terms of service. And these companies are really smart.

Fertik:  So it’s a question of fairness and distribution of power. It’s a very important political statement, whether we agree with it or not.

David, then Steven. Okay, there’s a mic around here. So in the front row of the 400,000 rows that we have.

Kirkpatrick:  So I don’t know whether Facebook users know that, but I think Dan’s statistics are fascinating, because they imply you don’t really need to read the terms of service to have a pretty good general idea about what’s going on. And I would say my impression is people are realistic; they know they’re getting something for free and they’re giving something away in exchange, because otherwise it wouldn’t exist. That’s just common sense.

It would be very easy to obsess over all of Andrew’s extreme statements, and it’s a great panel, so I don’t want to do that. But I have to ask you, given that you said the notion of selling your personal information seems to turn your stomach, would you prefer that Facebook, Google, Twitter, LinkedIn—I could list 15 more—don’t exist at all? Because basically, in order for them to exist, that is what you’re doing.

Keen:  I would prefer they charge for their services.

Kirkpatrick:  Well, then—I assume you use those services. You use Google, right? I mean—

Keen:  Right, but I don’t have a Google account.

Kirkpatrick:  But you know what? They’re still targeting you with advertising. That’s a sale that you’re making. If it’s not cash, it’s usage in exchange—

Keen:  I’m a typical consumer in the sense that I do use Google. I’m not on Facebook, I’m not on most of the networks. I personally would rather pay, whatever it is, $10 a year, $5 a month, to have access to a service that’s very useful, very helpful, that I use all the time, but that doesn’t have the advertising strings attached.

Fertik:  Steve?

Sprague:  Steven Sprague, Wave Systems. I’ve been chasing the aspects of DRM and trusted computing for 20 years. I think one of the huge challenges is to apply this concretely to children, and I think doing so provides a clarity of thought.

If your child goes to the hardware store and steals a hammer and is caught shoplifting and reported to the police, we have the concept of a sealed record. It is our obligation, at a very substantial level, to protect our children. It’s one of the fundamental things we should do.

Ultimately, the ideal solution would be that everything—emphasis on the word “everything”—typed into Facebook by someone under 18 should be encrypted in a manner that they can revert to a policy control that’s mom and me. And so the idea that we can replace the Kodak collection in the basement—so I happened to meet my wife when I was 16. And the other day, we were cleaning out the basement, and we discovered the most evil thing you could possibly find, which is letters we wrote back and forth to each other when we were 16. You do not want to give these to the children yet. They’re not old enough for this content.

And the point is it was safe. It was in the basement, and it was safe. And so how do we create that? How do we create an environment where my 15-year-old daughter can be on Facebook, and they’re not educated, their brains aren’t developed enough, they don’t understand the bits and pieces of it, and so regulatory-wise, one could support that.

It’s different than the right to be forgotten, which is the current European Union’s lunatic concept, which is to take that shoe box of all those mementoes, and when you go to college, have a ceremony in the backyard and burn them. Put that on Facebook, put that as a video on YouTube, burning your childhood mementoes, because that’s evil. That’s in the fundamental concept of it. And so I think we need to provide the tools. I think it’s the obligation—

Fertik:  Is Snapchat an example of that tool? Is the technology going to provide a solution to this problem—

Sprague:  Snapchat with an independent key server. You cannot combine the data store and the keys in the same company, because then it’s discoverable.

Fertik:  Snapchat is very popular with the kids, and you send a picture of yourself or something, and you send it to the other guy. And your friends look at the picture, then it expires, it locks up, so it’s not on the server anymore. So it’s kind of an ephemeral communication tool.

Sprague:  Just build Snapchat with the keys for a French child in France, so that even though the data’s stored in California, it’s not subject to the Patriot Act. It’s subject to whatever the rules and regulations are protecting a server in France, which is the country where the children reside and the parents thought it was appropriate to sign up as a service. There are tools to do that.
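[Editor’s sketch] Sprague’s split-key architecture can be sketched in a few lines. This is a toy illustration, not anything Snapchat actually does: the class and function names are hypothetical, and a one-time-pad XOR stands in for a real cipher (a production system would use authenticated encryption such as AES-GCM). It demonstrates the point made above: if ciphertext and keys live in separately governed services, neither service alone can produce the plaintext, and destroying the key is what makes a message expire.

```python
import secrets


class KeyServer:
    """Holds only keys -- e.g., operated in the child's home country."""

    def __init__(self):
        self._keys = {}

    def put(self, msg_id, key):
        self._keys[msg_id] = key

    def get(self, msg_id):
        return self._keys.get(msg_id)

    def destroy(self, msg_id):
        # Deleting the key renders the stored ciphertext unrecoverable,
        # even though the data store never deletes anything.
        self._keys.pop(msg_id, None)


class DataStore:
    """Holds only ciphertext -- e.g., hosted in California."""

    def __init__(self):
        self._blobs = {}

    def put(self, msg_id, blob):
        self._blobs[msg_id] = blob

    def get(self, msg_id):
        return self._blobs[msg_id]


def _xor(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad cipher; stands in for real authenticated encryption.
    return bytes(a ^ b for a, b in zip(data, key))


def send(msg_id, plaintext: bytes, store: DataStore, keyserver: KeyServer):
    key = secrets.token_bytes(len(plaintext))  # fresh key per message
    store.put(msg_id, _xor(plaintext, key))    # ciphertext goes one place...
    keyserver.put(msg_id, key)                 # ...the key goes another


def read(msg_id, store: DataStore, keyserver: KeyServer):
    key = keyserver.get(msg_id)
    if key is None:
        return None  # key destroyed or expired: message is gone for good
    return _xor(store.get(msg_id), key)
```

Because neither store alone can decrypt, a subpoena served on the data store alone yields only ciphertext, which is the discoverability point Sprague makes; and key destruction gives Snapchat-style expiry without ever deleting the stored blob.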

Fertik:  Quickly, back to you, Andrew. Do technologies like Snapchat make you hopeful that data will repose more in the control of the end user, the consumer, the citizen? Or is this not something that gives you hope?

Keen:  I’m hopeful about Snapchat, although you can argue it the other way as well. According to my kids, anyway, Snapchat’s mainly used for teenage pornography.

Fertik:  Okay. Question. Wait for the microphone. This is for posterity.

Westby:  Jody Westby, Global Cyber Risk. I would prefer to think that companies are dumb rather than that consumers are dumb. Take Apple, for example. Apple understood the consumer. Apple soared beyond Microsoft and blew everybody off the chart because they started giving customer service. They cared about everybody who bought their product. They cared that the consumer liked their product.

Fertik:  Could you say the same about Exxon, the biggest market cap company in the world? Do they care deeply about you at the gas pump?

Westby:  So if you look at Amazon, Amazon is also a great example of a company building its business model around the consumer. Versus—we should categorize companies: the data-mining companies are not consumer companies. So put them in the right category, so we don’t mix this up; but the consumer will learn.

And that curve, I think, is directly proportional to two things:  legal frameworks and enforcement. When consumers start understanding the laws that protect them, and start understanding that there are enforcement authorities out there, which are taking off at a galloping pace now, the learning curve for consumers is going to go up very fast, and the impact on the companies that are dumb is going to be great.

Fertik:  Is the learning curve for the consumer going to go up? Are they going to be okay, and justified in being okay, not having to think about this more often, because someone who’s an expert in enforcement will be thinking about it on their behalf?

Cham:  A darker view, maybe another way to think about it: it’s not so much that consumers are getting smarter, but rather that the political and regulatory forces are getting smarter, because they recognize this is a fundamental threat to their power. And on their side, they have to get savvy, because they now realize there’s a great deal of both economic power and attention on the part of most consumers and most citizens that’s going to go towards this set of services. They’re going to want to exert some power.

Fertik:  Are we seeing an arc we’ve already seen with financial institutions, which became very heavily regulated, and with telcos, which became very heavily regulated? Telcos used to have a lot more freedom, and then they became very heavily regulated as to what they could do with the data about your phone calls and records. Is this now the moment where we can have peace of mind, because that may happen to the Internet as well?

Westby:  Well, what we’re seeing, I think, is increased enforcement authority. We’re also seeing increased action on beefing up privacy. The European Union is still leading the charge there: the committee in the European Parliament has just approved an enhanced data protection directive that would offer a lot more protections. At the U.S. level, we have some languishing initiatives at the White House, but they’re trying to move toward the EU model.

So you’re seeing a lot more in the news about lawmakers around the world becoming a lot more savvy about privacy rights. You’re seeing people in developing countries starting to understand privacy rights, and so this evolving global framework is doing a lot to educate the consumer, and it’s also doing a lot to fund government initiatives for enforcement. And the fines for enforcement are becoming serious.

Just take HIPAA. The omnibus rule’s compliance deadline just arrived, September 23. The maximum penalty is now $1.5 million, up from about $50,000. It’s a big jump. So it’s a big jump in enforcement, and people pay attention to that.

Fertik:  Thank you very much. Thank you for that very interesting comment. We’ll go to Cyriac in a second. Dan, you have a very global perspective on this. I think you and I watched a movie over a period of three years at the World Economic Forum about how personal data may or may not get used, and it’s a slow-moving bus. But one thing I observed was that the FIs, the telcos, the ISPs, the health care companies, which have a huge amount of data but are regulated from using those data in a way that the Internet media companies are not, started to wake up to the idea that there was a huge wealth-creation opportunity if they could somehow be pro-privacy, or pro-consumer perhaps, and therefore be able to find and use those data.

Do you think that trend is correctly observed, or am I overstating it? Or what have you observed that might be in that parking lot?

Elron:  It’s clear that some companies have more trust than others. And actually in Europe, some of the telecom companies, as opposed to the U.S., have decided that they want to leverage that. It’s much more complicated, though, because the Internet is moving so much faster than they can in terms of their business model.

I think the idea that regulation lags reality in the marketplace is correct. We see that in this situation: the World Economic Forum and the regulators in Brussels have been working on this for six to eight years. And frankly, I think that’s okay, because we don’t want to slow progress. I think the regulators in Europe are coming to understand that too; we want to let the Internet grow through experimentation.

We heard yesterday a presentation about a company that has unbelievable personal data, the greeting card company, Paperless Post. They know when people get married, when they have birthdays, et cetera. We want these companies to be able to survive and succeed in creating that value. If they were regulated with the heft of the regulations being devised in Brussels, I think it would be very hard for them.

So I think the issue with the incumbents, and I think David talks about this a lot, is that they cannot execute well at the speed at which that kind of thing is required. It’s not necessarily regulation that’s slowing them down. So we need to see more innovation, and maybe the incumbent companies that have the brand and the trust will buy the innovators and create those businesses; but for the time being, it’s fairly slow.

Fertik:  Cyriac, you wanted to come in on this point or some other point?

Roeding:  Yeah, I wanted to add something to the conversation you started. Whenever you create something really new and powerful like the Internet, you create problems with it. There’s no doubt that there are issues, and you’re completely right, from my perspective, to point out the personal problems that come along with it.

And I think that’s really fair and actually good for someone to point out, because otherwise we might just blindly follow the trend. It’s helpful; but it’s true for any new thing that comes along. When you go from the horse to the car, people die in car accidents. When you go from cash only to credit cards, you get fraud.

All kinds of things, whenever they get invented, produce positive effects and negative effects. In the long run, things usually figure themselves out: whether the benefit is greater than the cost. If the benefit is greater than the cost, then over the long run, the thing will stay. If it’s not, it will go away.

The question is how long that process takes and who gets hurt in the meantime. And regulation is required for that. For example, as you pointed out, if we only produced fun for the kids at Shopkick, we would go away. So we have to produce $500 million in revenue a year for our partners now, after four years, and we have to produce 30% to 60% incremental revenue from a user who’s using Shopkick. If we don’t, we’re gone, and that’s fair.

So in other words, things have to pan out over time, and it’s good to have a warning signal from somewhere. What I don’t agree with is the conclusion about how we should change it.

Fertik:  Quickly, before we get to Andrew: without revealing anything that’s too secret about your company, is there a moment you can recall for us that very concretely puts meat on the bone of this discussion, where you decided to do X, which would give consumers control of their data, versus Y? Is there a choice you made not to do something a certain way with consumers’ data and their relationship to their own data that you can tell us about for our discussion?

Roeding:  I remember very well the very first retailers we were trying to get on board. The first one ever was Best Buy, and we were building an alliance. We were basically saying, hey, you can earn kicks everywhere at multiple retailers, and it will be great, because with an alliance everybody benefits; there’s more power because of the combined consumer power.

And then the question, of course, that came back was well, who else is in the alliance? Well, unfortunately right now it’s just you, but there will be others. There will be others.

And so in that moment, conversations obviously also began about who owns the data. When somebody walks into a store, that’s really powerful. It’s great information. And the question was not asked because they wanted to misuse any of it; it was asked because they thought it could be very useful.

And it’s true for almost any retailer we talk to. Of course that’s the first question: Do I own the consumer data? Do I know who’s walking in? And the answer is no, you won’t, because if we disclosed that, the consumer wouldn’t know it, wouldn’t understand why, and wouldn’t get an immediate benefit from it.

So there’s going to be a perception of a lack of respect, which breaks one of the fundamental rules, I think, of data usage, which is respect, the golden rule: I should treat you the way I want to be treated.

And then the other thing is there’s no immediate value, so I can’t really deliver value. So the answer’s no. However, and this is interesting because it points out when data can be used, if you were to log in with your loyalty program number from that store and link your loyalty card from that store in our Shopkick app, which we haven’t done yet but might do one day, then of course it’s totally fine to share the data, because you’ve already signed up for that program. You want them to have the data in return for whatever they offer.

So there’s a very clear choice here, and I think it’s a very clear red line. You can always ask yourself, would I want that? And if the answer is no, you can’t do it.

Fertik:  Andrew, you wanted to say something.

Keen:  One of the reasons I actually like this event, David’s event, is that it treats what’s happening in digital society as a major moment in history. Ericsson did the same thing. This is equivalent to the industrial revolution in most respects. It’s the new thing that’s changing everything, and I think the idea of, well, we’ve been through it before, we know what to expect, is wrong.

We need to—and I do this in my book, “Digital Vertigo”—we need to think historically. We need to understand that what’s happening now is as traumatic and profound and destabilizing, and in some ways as positive, as the industrial revolution.

The only way the industrial revolution worked out is because it was regulated. The only way it worked out is because government was strong, and it understood that you needed to respect labor laws and the rest of it. The data issue is the 21st-century version of environmentalism. But the problem is that bound up in the Internet, and I think Stewart Brand is a perfect example of this, is a kind of left libertarianism that rejects authority, rejects government.

And so the Internet, as Brand and Markoff and everybody else understands, was created by people who rejected the idea of authority. It’s all edge and no center. So the problem now, in a world which in some senses is spinning out of control, is that just when we need regulation, just when we need some adults to control what’s happening, government is increasingly undermined, people don’t trust authority, politicians are disrespected. So that’s the challenge: how to reinvent authority and government in a world which seems to be controlled by the edge rather than the center.

Fertik:  Very interesting statement. Patty, I’m calling on you, unsolicited and unexpected. You can say “pass” if you want to; but Patty, you run a conference for founders of companies. You see a lot of founders of a lot of companies, chiefly in North America and in Europe, as I understand it. Are you seeing any entrepreneurs, or a growing number of entrepreneurs, who are interested in giving consumers more control over their data? Or are you not seeing any growth in that set of entrepreneurs?

And there’s a microphone right behind you. You can also say I don’t want to talk about—

Audience:  Phone a friend. No. Many of these issues are sometimes binary, quite black and white, and on issues of data protection, Europe diverges quite strongly from the sentiment, or at least the philosophical views, of America. With the entrepreneurs I’ve interacted with, by and large, I’m not positive whether they respect consumers or don’t respect consumers.

I just think it’s a spectrum, and it’s a constant battle. If seizing more information from the hands of consumers helps your company grow, then you’ll optimize for that outcome. And that’s just a consequence of the type of business that you’re building.

I think traditionally, throughout history, large businesses that have had an interest in accumulating knowledge or information on their users or their customers or their consumers will continue to do so and become more aggressive in the pursuit of that.

And until they’re regulated, it’s a kind of natural evolution of any large institution that’s established, eventually, for the purpose of accumulating ever more data. If you look at Facebook and all the other competitors that were emerging at that time, they were optimized for grabbing as much information about people’s friends as possible, to increase the speed at which their graphs expanded.

Fertik:  Back to James.

Cham:  I think Patty’s exactly right; we sort of forget the desperate search in start-ups for attention, and the desperate search for a business model, for someone who actually pays you. Entrepreneurs are going to make such decisions, a lot of which will probably be illegal five years from now. I think that’s relatively clear.

Fertik:  Will that end up basically helping and calcifying the success of the incumbents?

Cham:  So I think a bunch of guys read Tim Wu’s book about empire building. A lot of folks read it and said, oh my God, this is terrible. Then a lot of entrepreneurs read it and said, oh my God, this is why I should be spending money in Washington, D.C. And it’s literally true that people increased their budgets in Washington, D.C., because they realized that regulation was a way to preserve their current status quo.

Fertik:  This is the big battle I have had with the FTC—not a battle. I’ve just been trying to communicate to the FTC that though their enforcement actions were very brave in a lot of ways, they also had the unintended consequence of sealing the door behind the incumbents and giving them a way to win.

Cham:  And the generalized observation, as far as start-ups trying to think about this seriously: I’d say right now there are very, very few. I mean, I have looked quite hard for companies that think about the world that way, and in general—

Fertik:  Which way?

Cham:  Companies creating new products that think about consumer data in a way that could actually protect it, or create new products consumers might find interesting. And in general, to be honest, the ones I have seen are typically idealistic think-tank products, or they’re very, very scummy.

Fertik:  Please.

Smolens:  Michael Smolens, Dotsub. Do any of the panelists see a difference with anonymous real-time ad buying? In terms of advertising revenue, it’s probably the largest source of revenue people are desperately looking to get. If you want to serve up an ad to someone who looked at a pair of shoes, but you don’t know she’s Mary Smith—you’re just serving an ad to an anonymous person with certain demographic characteristics—where does that fall in the spectrum of violating someone’s privacy? You don’t know their name, rank, and serial number, but you know they’re a consumer with certain characteristics. Do you see a difference in that, as opposed to other kinds of violations of privacy?

Fertik:  Can you help frame the question more broadly and say, are there any orientation points that seem to be emerging in the discussion here as to what the red lines might be?

Elron:  Yeah. We’ve talked a lot about this asymmetry: companies know a lot more about you than you know about them, or about how they use the data. So the question, the way I think about it, is where fairness intersects with this asymmetry.

It’s a question of time and education. I think younger consumers understand very well the economic exchange that’s going on. Less experienced consumers don’t understand that exchange: I browsed for a car two days ago, and now I see an ad for a car. Many kids understand that, because in many cases we’re actually trying to teach them that.

So I think it’s a timing issue. I think over time, we’ll understand that. What worries me a little bit is something we don’t talk about that much: the trade in intention. Whether you see an ad or don’t see an ad, that’s interesting; but there’s now a whole industry, which we don’t talk about that much, that sells very accurate intention data. They will say, this person is going to buy a car in three months. This person is interested, relevant to the Kaplan Group here, in secondary education. We have tracked them over the Internet. We know that they’ll probably sign up for a paid university degree or something like that.

And that is beyond, I think, what consumers know is being sold and traded. And that’s not $2 a transaction; that can be hundreds of dollars. That, I think, is something about which there’s not enough transparency.

So while I think we’ve had enough time to understand and educate on some of the basic things, these more sophisticated things are something we don’t quite understand. Billions of dollars in R&D are going towards this intention market; maybe hundreds of millions, I may be exaggerating, but a lot of money. And that’s where I would tend to agree with—

Fertik:  This is right here, the nexus of the debate right now.

Keen:  But it’s not some obscure company. Five years ago, Eric Schmidt gave an interview to the FT, and they asked him where he wanted Google to be in five years. And Schmidt said, I want us to know our users so well that we’ll know what they want to do and what jobs they want more than they know themselves. So this is not some obscure, slimy little data company somewhere. These are the big companies getting into the intentionality business.

Fertik:  Big billions of dollars of investment. And we’re going to work down the chain in a second. Sandy Pentland at MIT has shown, as others have shown, that you can tell where people are going to be in 20 minutes with a high degree of consistency and reliability, very frequently during the week.

So the question remains, can consumers keep up with the pace of change. Google Glass is now on the street. Soon you will be understood as the person who’s been viewed through Google Glass; then they will be able to track your weight over time, and this data will end up in the hands of some insurance company, which will decide whether you’re getting fatter or thinner over time and how close you are to a McDonald’s every week. That is not something I think the consumer today expects to happen. It sounds like sci-fi, but we are close to it. David, can I go to this fellow who hasn’t spoken yet?

Gross:  Hi. I’m Jim Gross, Cisco. I have been studying a lot about the Internet of Things and thinking about the data issues, and there are a couple of things. One, I tend to agree with you that the market will settle where people do the right thing. When we do the right thing and we give the consumers something back—I think of Mint. Mint is non-intrusive, because they give me this great picture of my finances and they tell me when I’m spending too much money on the credit card that I’m using.

Fertik:  Who here uses Mint? Who here used to use Mint and no longer does? That’s interesting. That’s a trend I’m seeing. I think some people just don’t use it anymore. I don’t know why.

Gross:  But one of the things I think there’s an opportunity for here: I was speaking at another conference, in a session like this one about big data, and people were talking about how, even within their own companies, they’re not sharing the data they collect, because data equals power.

I was also reading about dark pools and what’s been happening in high-frequency trading, and realizing there’s incredible, incredible value in data. And as we instrument the rest of the world that’s not instrumented yet, this is going to become even more of a problem, because we’re going to instrument everything.

So I think of myself as a consumer, and I ask, how come Ford gets the data from my car? How come I don’t get that data and get to choose who gets it? Maybe United wants to pay me to learn about my American flights. How come I don’t get to monetize that data myself?

And I think there’s a huge opportunity to build an exchange for data that companies will trade on, that people will trade on. If that happens, it could be ten times bigger than the volume of transactions in finance, in our stock markets and currency and futures and all that kind of stuff.

Fertik:  That’s great. Anyone else who hasn’t gone yet? Then we’ll go back to the people who have.

Keen:  Mark Davis from Microsoft talked about data banks, which is another really interesting area.

Clark:  So I wanted to come back to Andrew, because I want to tease out where the line is, because database marketing (think Experian, Equifax, these guys) has been going on a long time. I’m sorry? I’m sorry: Darren Clark, CTO of YP.

So where’s the line? Because you always hear, where does the spooky factor start? I’m curious, especially with Andrew, because I don’t know that it’s inherently as moral as you’re making it out to be. It’s just technology, and new behaviors arise out of it; but could you draw a line? Where does it step over into the dark realm you’re worried about it cascading into?

Keen:  Just to come back to Michael’s point about Google Glass, I think that if Google Glass and Google self-driving cars catch on, I think those both cross the line. I don’t want a company that knows everything that I’m looking at or where I’m driving. I’m already uncomfortable enough with Google. I use it, but I don’t sign in.

So I would say Google Glass and self-driving cars. That doesn’t mean you can’t have self-driving cars. We did a whole thing on future cars, a really interesting area; but again, this whole issue of data is central to it. The car company guy earlier was talking about it. They have to resolve it. You can’t build all these products and companies around the exploitation of the user’s data.

Roeding:  I think the key here is transparency. The consumer needs to know what’s going on in order to be able to make a judgment call.

Fertik:  You mean real transparency, beyond terms and conditions?

Roeding:  No, I’m talking real transparency, like simplicity. For example, when I’m using a self-driving car, I have a very clear sense that this data goes there and is being used for that, and it’s simple to understand. It’s not in the fine print.

Unidentified:  I should be able to log into Google and see what they have.

Keen:  About transparency: the more a company in Silicon Valley talks about transparency (i.e., Google), the less transparent it is. Transparency actually equals opaqueness.

Fertik:  I agree with you. Let’s stipulate for a second, Andrew, that transparency is a Silicon Valley fetish, and there are many fetishes in Silicon Valley right now; but let’s give Cyriac a chance to describe his version of transparency and what it means to him.

Roeding:  So I think the conversation here is about what is right. We’re basically talking about values, morals, what’s right and what’s wrong. You asked for the red line, where’s the line. The line is really a societal decision.

The rules we grew up with are very different from the rules that exist today. You can look at things outside of technology. Gay marriage used to be something that maybe 10% of Americans could imagine 40 years ago, and now it’s 55% or 60%.

So morals change, they shift. What used to be something nobody could imagine is now suddenly common. With the Millennials, it’s the same thing with regard to data. I think we should be cognizant of the fact that opinions have changed about what it means to be in control and what it means to give up things about yourself in return for something you’re getting back, some form of value. If you don’t know what is coming back, there’s a problem; but if you know what is coming back, I think it’s much less of an issue.

In fact, I think people should be able to choose what they want, and I do not agree with the fundamental stipulation that people don’t know what they’re doing. I think they will find out, even if they don’t know right now. Something’s going to happen—one thing’s going to blow up, and then everyone’s going to watch that. For example, nobody knew what was going on with the NSA until one year ago. One year—

Fertik:  That’s not true. Everybody knew—

Roeding:  We didn’t know the extent of it.

Fertik:  I don’t know. I think anyone who’s paying attention knew.

Roeding:  Over time, things have become very clear, and now we sort of know what’s going on, so now it’s a completely different debate.

Fertik:  I’m very hungry for a concrete example in the commercial realm. Is it a dashboard where I know my data? Is that transparency? What is meaningful transparency when it comes to my personal data? What does it look like?

Roeding:  Meaningful transparency is one sentence. When you sign up for Shopkick, there’s a clear statement:  your data doesn’t go to any retailers, period. Your individual data doesn’t go there, period. Unless you choose to, unless you click on something where you say, I want that, it will not happen. That’s it.

Fertik:  That’s a promise. We’ll go to David now.

Kirkpatrick:  It’s a great discussion, and I’m going to broaden it in a way that many will not find surprising. When I started this conference with data from Gapminder, I wasn’t just shopping around. I really think that’s a very serious factor in the whole picture here, and when we say, is the Internet for or against you, who is the “you”? The reality is, the Internet is increasingly in the billions—

Fertik:  Gapminder? What’s that?

Kirkpatrick:  Oh, you weren’t there. It was basically saying that this planet is more than the United States, and that increasingly the global economy includes a larger and larger quantity of the human race, and that is a significant baseline factor.

All these discussions have to factor that in. And this one hasn’t been, in my opinion, although I don’t think that’s particularly negative; but I want to throw it out, because it goes along with the point I made before about Keen buying his access to Google by giving it his data. It’s all a matter of perceived benefit and perceived risk.

And in the global context, one of the things that companies like Facebook have discovered unequivocally is that in Indonesia, privacy is not an issue at all, because the people coming onto Facebook on their BlackBerries in Indonesia never even had a phone before. They were living in a village. They didn’t have anything. They get so much benefit from the Internet at the big-picture level that these issues are trivial to them.

And I’m not saying that will remain the case, but I think it’s important to note that the Internet, if that’s what we’re talking about, is it for or against you—and the reason I am so convinced that it is—

Fertik:  For the Indonesians, in your view.

Kirkpatrick:  And this is the majority of people on the Internet today, and increasingly will be, so that’s —

Elron:  But David, how many people believe the Internet is really against them, except for Andrew maybe? How often do you hear that?

Kirkpatrick:  I think Evgeny Morozov has been getting extraordinary traction recently in every possible venue, and I think it’s healthy to have the Andrews and the Morozovs, as much as I find certain things they say really obnoxious and objectionable—not you—I think Morozov is fundamentally naive and really ahistorical and all kinds of things. But look at the Dave Eggers book. This is a massive, new, sudden—there’s a very big meme right now that the Internet is against you, and I don’t dispute its validity. I think Dave Eggers’ book is—

Fertik:  Just talk about that book.

Kirkpatrick:  “The Circle,” yeah. It’s a novel by one of America’s leading novelists about a fictional company, a hybrid of Google and Facebook, that basically encourages everyone to be completely transparent, to the level of video of everything they do. That’s the direction the novel fears things are going. It’s written in a very 1984-esque manner. So I don’t know. I just think that yes, people are getting worried. I don’t think it’s going to matter much.

Fertik:  Same old fear mongering crap, same old bogus stuff?

Kirkpatrick:  No, I don’t think that. I think, though, you have to look at which community you’re talking about. For a large number of people, the benefits they’re getting today from the Internet are so huge that these issues of privacy and such are trivial to them. I do think we can project ourselves into a world in five or ten years, particularly if there are one or two or three companies (The Circle’s company, or Google or Amazon in particular, or Facebook) that have so much data and are fundamentally unregulated, and that is a scary potential scenario to me. And I will say one—

Fertik: —about the Chinese as you do about the Indonesians? With the Chinese, there’s huge growth in the Internet. The access is of massive value, but they have also surrendered their privacy to the government; the government has not allowed them their privacy. Do you also think it’s trivial for the largest Internet market in the world, the one you’re accusing us of ignoring? You’re accusing us—

Kirkpatrick:  I think the average Chinese person is not hugely either surprised or concerned at the moment. I think—but anyway, that’s a whole different discussion.

Fertik:  Very easily dismissed by you, right? You said the Indonesians—

Kirkpatrick:  I always said about Zuckerberg, that the biggest risk is regulation, and he didn’t take it seriously enough. I’ve always said that—I said that at Facebook’s office in front of 400 Facebook employees three years ago, and he has taken it more seriously since then, but he still doesn’t take it seriously.

Fertik:  Let’s not make this another Facebook effect. Over there, yeah. I wish you were a shareholder, because at least you—

Westby:  Jody Westby again. Two quick thoughts. One, which I neglected to mention: a huge driver of company awareness and consumer awareness is cyber-crime, because it’s the criminal that’s winning and it’s the criminal that’s driving the regulators. A lot of the privacy breaches aren’t the result of bad companies; it’s because bad guys are getting the data.

Fertik:  The cyber criminals are actually for you. They’re raising awareness that will get you protection.

Westby:  Right. The second is that a little bit of information will equal a lot of information. With the Snowden event, I think you’re going to see a huge trajectory in public understanding of data mining. I’m so fed up with people saying, well, I don’t have anything to hide; I don’t care. But that’s because they don’t yet understand data mining. When they do, they’re going to understand that this quote, unquote, metadata that the NSA nicely wrapped around legally protected information is actually highly useful.

And so people are going to understand that a little bit of data equals identifiable data. When the public climbs that learning curve and starts to understand the power of big data, companies that are stupid are going to be in trouble.

Fertik:  James, quickly. James, do you think this Snowden moment or this moment, the zeitgeist is actually stimulating consumer demand for protection of their data through regulation or technology? Or do people just not care?

Cham:  On the consumer side, it’s a fair question. On the enterprise side, it’s a definite yes. I think every single CIO, for the first time, is thinking about security in a different way. Rather than trying to convince the CEO to pay attention, now the CEO is yelling at them, saying, hey, have we done this and this? That shift is definite. On the consumer side, these things take longer. I think we’ll see more vivid examples, and then, before you know it, five years from now, consumer sentiment will be different and the expectation—

Fertik:  And is the enterprise the place where consumers get protected? If you look at the big protections against fraud—I think Cyriac pointed out fraud, and viruses were another huge epidemic of the last couple of decades, obviously—the enterprise basically took it upon itself to start locking down the threats, and that’s where the technology and the innovation happened that ended up protecting end users against fraud and viruses.

Cham:  To the extent that consumers get protected because of all this innovation happening in the enterprise, it’s sort of a by-product, and it won’t be the companies that sell to the enterprise. Someone is going to figure out a new way to sell to consumers and a new way to communicate this, and it’s just not going to be IBM.

Fertik:  Andrew.

Keen:  So I want to respond to David, because I think he’s brought up the most important issue of all, and I strongly disagree with him. He compares America and Indonesia and says, well, Indonesians don’t care. I’ve never been to Indonesia, but he’s implying that somehow they all lived in villages and now they’ve got this wonderful new, shiny thing, their smartphone. And so for them, it’s really not a big deal to give up privacy or data protection.

I think that’s the kind of comment one could suggest is extremely patronizing. The reality of the next 15 or 20 years—Ericsson does great research on this—is that it is the Indonesias of the world and the countries of Africa that are going to be coming online, and these people, I think, are much more vulnerable on the data front.

And I think it’s unacceptable to argue, well, they’re really poor and they’re really underprivileged, so what’s a little bit of data acquisition to these people? That’s a very, very dangerous thing to say. I gave a speech in Oslo to a group of Africans where I was talking about this, and they were utterly horrified by this vision of the Internet, of transparency. The essence of their society was privacy.

So I think this is going to be an even bigger issue in the next 20 years, and I think what Dwight’s group is doing at Ericsson in terms of this digital society is really interesting. Companies like Ericsson actually understand what’s going on in the rest of the world.

Fertik:  We have to wrap. 30 seconds for each of you, then we’ll say thank you. Quickly, Dan?

Elron:  I think this discussion has been very good. I think we are at the junction point where government and regulation should come in, and I think we have to be very careful about how we do that.

I think we should remember the comment from the beginning of the conference of my colleague James, who talked about the $1 trillion surplus that was created by the Internet and keep that in mind. So this is where I agree with David.

I was in Africa myself. I saw people watching YouTube videos on how to take care of their kids. That’s worth an enormous amount of money. For many years it may be worth much more than the privacy issue, so I’m optimistic.

I think we should let entrepreneurialism flourish. I’m not as worried about big companies. In every technological revolution, big companies start it, and eventually they either break apart by themselves or governments help break them apart. That’s probably going to happen with these companies as well. And I think there’s enormous, great value that can occur through disruption, by the usage of data and by the usage of the Internet.

Fertik:  Thank you. James, quickly.

Cham:  The Internet is good for you, but it’s also not your friend. We’re at the point where we need to make a set of political decisions, and there are now interests aligned and there are people who are thinking about it in serious ways. And if you’re out there right now, you should be getting involved on the policy side, because these are the decisions that are going to last for 50 to 100 years.

Fertik:  Cyriac. 30 seconds.

Roeding:  I’m very excited about what’s possible that wasn’t possible just a few years ago, and even more excited about what’s coming, Internet of Things, the idea of sensors everywhere, the idea of the technology fading into the background, even our faces becoming interfaces and gestures and so on. So there’s a lot of great things.

At the same time, I think we have to find a way to have a debate on two levels. One is about what is right, what’s the golden rule; in other words, what do I want to happen to me, and only that should happen to others. The second is about what kind of regulation is necessary to enforce that for people who have a slightly different moral compass, who don’t understand the golden rule.

And I think the combination of those two things will help it, but I’m very excited about it, and I don’t want it to stop. In fact, I want it to move forward faster, because it’s wonderful, but it has to go along with morals and with the right regulation.

Fertik:  Andrew, bring us home. 30 seconds.

Keen:  30 seconds. The Internet is the platform for 21st Century life. If we’re going to really humanize it, we need to teach it, or force it, to learn how to forget.

Fertik:  Very powerful. Thank you very much. Thanks for our host, Techonomy. Thanks for joining us.