
NYC Conference Report #TechonomyNYC

Figuring Out Facebook’s Future

  • Jason Kint of Digital Content Next speaks during his panel on Facebook's future at Techonomy NYC in May 2018. (Photo: Rebecca Greenfield)

  • Marc Rotenberg of EPIC speaks during his panel on Facebook's future at Techonomy NYC in May 2018. (Photo: Rebecca Greenfield)

  • Moderator David Kirkpatrick (far left) converses with Marc Rotenberg of EPIC and Digital Content Next's Jason Kint during their panel on Facebook's future at Techonomy NYC in May 2018. (Photo: Rebecca Greenfield)


Speakers

Jason Kint
CEO, Digital Content Next

Marc Rotenberg
President and Executive Director, Electronic Privacy Information Center (EPIC)

Moderator

David Kirkpatrick
Founder and Editor-in-Chief, Techonomy


Session Description: A conversation with the man who brought the FTC complaint against Facebook and a veteran media leader who struggles against the company’s power.

The full transcript is below, with a PDF version accessible here.

Figuring Out Facebook’s Future

(Transcription by RA Fisher Ink)

Kirkpatrick: The two people we’re going to have as part of the conversation are real experts. Marc Rotenberg—come on, Marc.

I don’t know if you all—you all may not know that I wrote a book about Facebook, which was an extremely cheerlead-y kind of book that came out about eight years ago. It did have some cautionary content in it, including from Marc, whom I quoted and talked to a lot back in the day. I got to know him as I was reporting that book. And I wrote the book because I really felt that Facebook was possibly the most important tech company I’d ever seen, and that its role in the world had the potential to be more overarching than that of any company I’d ever seen. And I was hoping that it would be fundamentally positive. I think it has had an enormous range of positive consequences in the world.

But what we’ve really become quite aware of in the last couple years, particularly since the election—and frankly, I’d been worried about this for some time even before that, but it’s been proven that there are some very, very negative consequences of Facebook’s mode of operation. And the question of how to balance the good and the bad is what I think we’re now faced with, given the extraordinary scale of Facebook’s presence in the world. And before I finish introducing the panel and we launch into the discussion, I want to just run through some of the unique things about Facebook. For one thing, it was the fastest-growing company ever when it first came up. Its profit level now, at a 40 percent net margin, is unprecedented for pretty much any company. So it’s basically the most profitable company you have ever seen, certainly the most profitable company of its size. That is something worth keeping in mind, given the choices that confront it, many of which will necessarily involve reducing its profit.

Another thing about it that’s unusual is the scale of its usership. With 2.2 billion users, it is basically the largest aggregation of humans ever assembled for any purpose, including any religion, anything. There’s never been 2.2 billion people doing any one thing. Google has more than that on all of its services combined, but that 2.2 billion figure does not include WhatsApp, Messenger, Instagram, or any of Facebook’s other services. There is overlap. I’m sure if you were to add the incremental people that are on those services, Facebook would be getting up closer to three billion in its total impact. Certainly well past two and a half billion, because WhatsApp has many users who are not on Facebook.

Then you have the control that one person exerts in this company. Unprecedented. Mark has absolute and total control, and I think one has to use the word hubris to describe something that he did last year, or tried to do. He tried to change the rules of the company so that he could sell all but basically 1 percent of his stock and still retain absolute and total voting control of the company personally. And it was only when he was two days away from having to testify in open court in Delaware to defend this against a shareholder lawsuit that he dropped the idea. He in his own mind sees this as a good thing because he believes his values to be so positive and responsible. And I understand where he’s coming from, but unfortunately, no one person could ever be trusted to that degree.

There are other singular things about it. I think the fact that a single company is essentially the town square for a very large proportion of the planet is also bizarrely unique. And then finally, the fact that a commercial company has such extraordinary influence and role in the processes of democracy in 190 countries, at least those that are democratic, is another extremely unusual, unprecedented, and extremely unnerving reality.

So really, to summarize, the challenge Facebook presents is that it’s a commercial company that has achieved an extraordinary civic role. And that is true for other companies, but not nearly so problematically. Even Google, for all the challenges it represents, is not playing a role comparable to Facebook’s in terms of public dialogue. And it’s partly because of the way Facebook empowers individuals to have a voice, and its failure to govern the consequences of giving all those people a voice. And certainly it’s a positive thing, which is why I wrote my book and called it The Facebook Effect, because I felt the effect of giving people a voice was an extremely attractive and powerful proposition.

So forgive me for a lengthy introduction, but in this panel I’m a little bit of a co-panelist as well as moderator.

[LAUGHTER]

Now, let me just introduce the panelists. Jason Kint, at the end, is CEO of Digital Content Next, which is an aggregation of publishers and content producers and video companies that are all dependent on Facebook and Google and extremely involved in trying to navigate this extraordinarily complex relationship: on the one hand, they are dependent on Facebook; on the other hand, they have almost no power, in many cases, over how their content is distributed and consumed.

Marc Rotenberg, closer to me, is the president and executive director of the Electronic Privacy Information Center, a group that has long been one of the leading defenders and proponents of privacy in the digital realm. And he’s a lawyer and he’s done a lot of—you’re both—you’re a lawyer too, right?

Kint: I’m not.

Kirkpatrick: Oh, you’re not. Okay.

Kint: I pretend to be.

[LAUGHTER]

Kirkpatrick: Anyway, so Marc is a lawyer, but he’s done a lot of tremendously positive work in Washington, where he’s located, working to ensure that the world we’re entering is not, in effect, a lawless one controlled by companies. He’s really thought a lot about the rights of the individual and how to protect them. And if I could just quickly finish introducing him by saying: when we hear all this discussion about the FTC consent decree against Facebook and whether the company has in fact been violating it for the last several years, which we will discuss, that consent decree was really a consequence of a complaint that Marc initiated, what, in 2011?

Rotenberg: 2009.

Kirkpatrick: 2009. And then the consent decree was in 2011. And of course, the malfeasance that Facebook is accused of more or less began around 2012 and—anyway, so first of all, Marc, I don’t know if you disagree with anything I’ve said or if you want to elaborate. When you look at Facebook, whether disagreeing with me or not, what’s the basic picture you see?

Rotenberg: I see a very large company, well described by you at the opening, with very little accountability, very little democratic control, and an extraordinary asymmetry of transparency, which is something that I’ve noticed over the years. I mean, we live in a world where we are increasingly asked to give up more and more of our personal data, but the companies that collect it become increasingly secretive. How the News Feed is determined, how the algorithms are created, is entirely opaque to the end user. So I think we need to look to our political institutions to try to restore some accountability, restore some balance.

I don’t make moral judgments about Mark Zuckerberg or the company, but I am very concerned about the future of democratic institutions and I think, you know, Mark at Techonomy just the day after the 2016 election—it was an extraordinary moment. He said the whole thing about fake news had been overblown and then the company spent the last year basically trying to hide the details about how a foreign adversary manipulated a presidential election in the United States. So I think we need to take very seriously this challenge. And also, to your opening remarks, to me it’s entirely upside down that Facebook becomes a platform for democratic decision-making, rather than Facebook existing within the context of democratic institutions and being held to account by the public. That should be larger than a company.

Kirkpatrick: Yeah. One thing worth pointing out is there really is a case that manipulation of Facebook by the Russians, and other basically inappropriate behavior, could have been the primary factor that swung the election, particularly in key states. That’s not widely discussed in the press, but there’s a good data analysis I saw, given that the margin of victory for Trump in Michigan and Wisconsin was, I think, 40,000 votes.

Rotenberg: Right. Well, people—

Kirkpatrick: That wouldn’t have taken that many fake Facebook ads to have turned the opinion of that many people in those states.

Rotenberg: People who have looked at the issue have been very careful, particularly with regard to voting outcomes, to say that there’s no direct evidence of changes in vote tallies. But if you look at the reach of Russian propaganda on the Facebook platform, you have approximately 120 million people in the United States who, through the various troll farms that Russia had established, were receiving messages that were intended to polarize, that were intended to diminish, that were intended to weaken our democratic institutions.

Kirkpatrick: Okay. Let me just make one more point before I go to you, Jason. There was this great Wired article, which some of you may have read, that was 20-some pages and has gotten some attention. But it really buried its own lead, because its key conclusion was that very likely the main reason Facebook overlooked or deliberately underplayed the risk that Russian electoral interference was posing was that the company had become extremely concerned about seeming to come down hard on the right wing, after a series of controversies that I won’t go into but that led to Facebook getting a lot of criticism from the right. And they’re so concerned about being seen as partisan in any way that they overcompensated by not governing right-wing propaganda. And they continue to do that, in my opinion, by the way, because they’re so paranoid about being disapproved of by Trump and his party.

So anyway, Jason, what do you think about any of this and talk about how the content creators are relating to this situation.

Kint: Thank you. Thank you for having me. I won’t reiterate many of the great points here. I would only add to that discussion by saying, one, you kind of glossed over Google, and I know we’re here to talk about Facebook, but all these issues exist with Google, if not more so. And I think there’s going to be more and more attention to that. In terms of data collection across the web, Google dwarfs Facebook, though Facebook is by far number two. So we can’t separate the two. What’s common in their practices, and where a lot of these issues unite and affect our member businesses, and I’m guessing what Marc worries a lot about too, which I see in the important work they’re doing, is that these are free services that reach an enormous breadth of audiences across the globe, and they operate around a business model in which their number one goal is to collect as much data as possible to microtarget advertising at those users. And it’s fairly unbridled. There are no real controls on it at all outside of their own financial business interests.

And it, importantly, often operates outside of consumer expectations. In the case of Facebook, 70 percent of users, if you look at the research we’ve done, don’t expect that Facebook would track them across the web and absorb their browsing history through unclicked like buttons, which is basically how they do it. But they do it. So that’s outside what any average user would expect. And if you go down their practices, they’re not doing what a typical user would expect, and they make an enormous amount of money and control the power of their business model by being able to collect data across the web and app ecosystem.

Kirkpatrick: So this is something that I didn’t realize. So any time there’s a like button on a page, Facebook can track your passage through that page.

Kint: Absolutely.

Kirkpatrick: Even if you don’t click the like button.

Kint: Absolutely.

Kirkpatrick: Yeah. That’s not a well-known fact, but I’m curious—

Kint: I wrote about it in the Journal four years ago and it’s the source of a lot of issues.

Kirkpatrick: I should have read that at the time. My apologies.

Kint: No, no. I don’t mean to be critical but it’s—

Kirkpatrick: But do you think that the recent controversies may have increased the likelihood that consumers are aware of that, at least in the developed countries?

Kint: I think what’s happening is that the average user is becoming more aware, but more likely they’re just having a sense of distrust. And trust in the platforms is dropping significantly. And also, our institutions that are there to hold them to account are stepping up and learning about what they’re doing. So what you’re seeing is the press has been educated significantly in the last couple of years, and I think GDPR, if you follow what’s going on in Europe, is starting to educate the press so they can properly cover these companies. And that’s accelerating. And then, no disrespect to Washington, I think I’m seeing it more in Europe right now, but they’re starting to understand the business model. The U.K. Parliament—I mean, one thing we have to realize is that most of the answers about the election and what’s happened in the last year and a half or two years have not been disclosed by Facebook, and in many ways they’ve misled—

Kirkpatrick: You mean there’s information still to be disclosed?

Kint: Oh, my goodness. Yeah. Totally.

Kirkpatrick: That you believe will eventually come out?

Kint: Yeah. Parliament is expecting answers to 39 questions on Friday from Facebook and they’re also expecting Mark to come testify or he can’t land in Britain again.

Kirkpatrick: If he doesn’t show up by Friday?

Kint: Yeah. No, if he doesn’t agree to testify at a future date. But they have to answer these questions and a lot of them are very important questions that they failed to answer.

Kirkpatrick: Give an example of one of those questions.

Kint: They have to turn over their nondisclosure agreement with the guy who took the 87 million records, along with the Facebook employee who helped him take those records. That is possibly a violation of the consent decree that you mentioned with Marc.

Kirkpatrick: Yeah, explain that about the employee from—

Kint: We can go down a lot of rabbit holes.

Kirkpatrick: But no, that’s an interesting rabbit hole to go down though.

Kint: Everyone understands that a guy named Dr. Kogan built an app that ingested 87 million records, according to Facebook in the reports. His co-equal partner, according to his testimony and Facebook’s testimony, is an employee of Facebook, or was at least, as of recently. And he was hired by Facebook a month before The Guardian made the report. So that hasn’t been put out—

Kirkpatrick: The first report, which was in 2015, right?

Kint: December 11th, 2015, yeah.

Kirkpatrick: Yeah. So you’re wondering, among other things, if I recall correctly from talking to you, whether that guy, who now works for Facebook and was Kogan’s partner, knew about The Guardian investigation prior to going to Facebook. That in itself would be a salient fact.

Kint: These are all questions. There are just so many questions that they’ve failed to answer so far. And so that is a salient fact if—yeah, if that came up during the discussion. It was just confirmed by their CTO at Parliament a couple weeks ago. But back to my original point: I’m seeing the people that are asking questions of these companies get a lot smarter with their questions, and I think that’s important.

Kirkpatrick: But quickly—and I think you two would disagree on this question of regulation—I want you to talk about your view on how much you think Facebook should be regulated, as a result of what we know so far and what you know in your position as a representative of the publishers.

Kint: I have significant distrust right now that they’re able to fix these problems, based on all of their actions in the last year and a half. They’re hiding behind a free business model in which they have 40 percent net margins and $40-some billion in revenue, so—

Kirkpatrick: It’ll be $55 billion this year, as Martin said.

Kint: So there’s plenty of profit to do a better job of meeting the consumer expectations that they seem unable to take hold of. I’m generally concerned about any sort of regulation that affects the entire industry. I’d rather see self-regulation step up, but I will also say that GDPR in Europe may be the best thing we’ve seen so far in terms of raising the bar on privacy and data.

Kirkpatrick: So Marc, do you think self-regulation could be enough here?

Rotenberg: No. I mean, as you know, David, we’ve been at this for a long time and my view has always been we’re not in favor of regulation for regulation’s sake. But we are in favor of privacy protection, competition, and democratic institutions and the experience with self-regulation is that it’s not worked. Our particular frustration with Facebook is that we thought we did come up with a strategy.

Kirkpatrick: With the consent decree.

Rotenberg: Yes. Well, you know, it was a very interesting moment. I’m putting together a book on this now. I should probably interview you. But between the Beacon fiasco, which takes us back to 2007, and when we filed the complaint—it’s a very detailed complaint, by the way. It’s longer, in fact, than the consent decree that resulted. At the end of 2009, we had waged a campaign against Facebook to establish democratic accountability for the company. And if you go back to 2009, you’ll see there’s actually a moment where we pretty much stared down Mark Zuckerberg, and he agreed to a statement of rights and responsibilities for the company, and he agreed to give users the right to vote on policy changes, and we agreed not to proceed with our complaint that was pending at the Federal Trade Commission.

Kirkpatrick: Was this a—you dealt directly with him?

Rotenberg: They literally—Chris Kelly, the chief privacy officer, called me on behalf of Mark and said, “We know you’re about to file this complaint. We’ve given this a lot of thought”—and it’s remarkable, by the way, talk about the right to be forgotten. All of this history has somehow evaporated from the internet. But it’s all there and it’s a fascinating history, and I think a timely history, and will make a good book.

But I just want to briefly say that in 2009 all of this happened; we had a statement of rights and responsibilities, we had voting, we had campaigning, and as the year went on, it became clear that Facebook was not going to keep its commitments. So at the end of the year we actually do go to the FTC and we file the 30-page complaint. We spend the next two years at the FTC explaining why they need to act on the complaint. There was additional evidence, more supplements.

And then in November 2011 we get what I thought at the time was a wonderful consent order. People came to me and they said, well, you know, the United States doesn’t have a privacy agency. Don’t you think the U.S. needs a privacy agency like every other democratic country? And I said, “Oh, no, no, no, the FTC is doing such a good job. Read this consent order. You know, it hits all the key points, right?” But of course, they failed to enforce it. And we actually sued the FTC when Google, post-Buzz, integrated user data across all of its services under a single privacy policy. That was also a complaint we had filed. We said that violates the order. The FTC chose not to act. We had a sympathetic judge who said, “I kind of agree with you, but I don’t have the authority to tell the FTC what to do.” And then you see of course what happened, and this is where Washington breaks down. When people don’t do their job, companies understand, “Well, they’re going to leave us alone. We don’t have to look at the terms and conditions in the agreement with Kogan when the data gets passed to Cambridge Analytica,” for example. “We don’t have to audit the use of the data by the third-party app developers.” And very bad practices resulted. What was most interesting to me recently, and I wrote about this for you and Techonomy, was Koum’s decision to leave WhatsApp.

Kirkpatrick: Yeah, Jan Koum, the founder.

Rotenberg: Yeah.

Kirkpatrick: Co-founder, whose other co-founder had already left and was already promoting the delete Facebook hashtag.

Rotenberg: Right. So here’s the story there. This was another detailed complaint we brought to the FTC, over the acquisition of WhatsApp. We said, hey, WhatsApp’s got a great messaging service and people really like it for privacy. They’re not collecting a lot of data. It’s end-to-end encrypted. And if Facebook acquires that company, what’s going to happen to the data of the WhatsApp users? And the FTC said to us again, oh, we’re going to—you know, good policy, all that stuff. And of course, they didn’t enforce it, right? And when Koum left the board of Facebook, he was citing exactly the issue that we had raised with the FTC, the need to protect privacy. So my view now is that you need regulation, somewhat counterintuitively, to preserve innovation. You need it to protect companies and you need it to protect good practices; otherwise I think we just see a race to the bottom.

Kirkpatrick: Well, I think the audience should have their voice here. We don’t have tons of time, but the lights will flicker on and off or something. Who has a comment or question? Okay, please, Rich.

Audience 1: Oh, thank you. Thank you for your comments. That’s encouraging that you guys are on the case. Two concise, related questions. One, when Edward Snowden left the country, the public sentiment at the time was that people were more afraid of government surveillance than corporate surveillance, but now I’m wondering if you think that’s switched.

The second part of the question is, I’ve been diligent about what I share on Facebook, but I use a private Gmail account. I write articles for The New Yorker, and I researched chronic fatal diseases for one article, and before I knew it, on Dictionary.com and other websites, I was getting ads for drugs for fatal diseases. What the hell is going on? And what are the implications for my privacy, future health coverage, and how corporations can make money?

Rotenberg: Okay, so two quick answers on good questions. You’re right, Snowden did put the focus on the NSA and the practices of the U.S. intelligence community, but interestingly Snowden also helped turbocharge the European Union’s effort to update its privacy law, which is the GDPR. So the practical consequence was that privacy energy was transformed into a very comprehensive privacy rule that will impact the practices of U.S. firms.

And I guess to your second point, which is what Jason had raised earlier is, you know, Google’s kind of quietly on the sidelines right now watching Facebook take the hits, but in fact, Google’s tracking of users is much greater. And that’s also one of the issues I think we’re going to need to confront in the policy realm.

Kirkpatrick: Do you think—Jason implied they were at least as great or a greater risk at the aggregate level. Do you agree with that?

Rotenberg: Yes, I do. I mean, you know, one of my other hats: I helped establish the dot-org domain because we wanted to promote the non-commercial use of the internet, and I spent a fair amount of time in the debate over the ccTLDs and then the generic TLDs. And one of the things that struck me—we’re talking, by the way, about things like dot-music and dot-toy and, you know, dot-sex, and so on. They never really took off in the way people thought they would. But Google kind of raced ahead in that process and tried to grab virtually all the gTLDs that were out there. It was like they were future-proofing their dominance in search by anticipating how future search might occur, and I thought that was just fascinating.

Kirkpatrick: Wait, explain that as concisely as you can. I don’t quite understand what you mean by what you just said.

Rotenberg: Well, first of all, we’ve kind of lost the concept of domains in the dot-com, dot-org, dot-edu sense because we’re all now on Facebook in an odd way. I mean it’s, you know there’s been diminished—

Kirkpatrick: Well, also, if you get to a site through Google search, you don’t care what the suffix is.

Rotenberg: Right. But the point I’m making is that a few years ago in ICANN, there was a big auction process to expand the TLD space, top-level domains. And there are about 200 new generic top-level domains, things like dot-music, dot-TV, dot-pro. Google went out and basically tried to bid on and capture as many of the gTLDs as they could. I think they even created a holding company so it wouldn’t be apparent that Google was doing this.

Kirkpatrick: And the point of that was what, to prevent others from acquiring them which would somehow undermine their control?

Rotenberg: The point was to maintain the dominance that they had established and continue to expand essentially over the architecture of the Internet. I mean, now we’re talking about DNS for example, where I think Google is now resolving maybe a quarter or a third of all domain lookups across the Internet.

Kirkpatrick: Wow.

Rotenberg: So this is the stuff that operates in the deep levels of the internet architecture and Google is present there. And it’s the reason—you know, when you say for example that you do a search and then you see an ad, how did all that happen, you know, there’s a very high likelihood that it’s within the Google advertising network. It’s not paranoia. It’s a business model. I mean, that’s my point.

Kirkpatrick: Sadly, we’re short on time, but, Jason, any thought on any of that? And first of all, do you agree that we’ve shifted from being worried about government to business?

Kint: Yeah, I think we’re concerned about both. You know, having lots of important news organizations in my membership too, I think that there’s concerns on both right now, for sure. To put a data point on it, according to Princeton, nearly 80 percent of the top one million websites have a Google tracking tag on them. So that means Google knows what you’re doing across almost everything you would do on the web that matters. And why that matters to everyone in the room is, if you’re a publisher, it means they’re able to extract the data from the page to target advertising wherever they want for the highest margin possible. So they’re taking the value of the publishers and using that for their own profit margins.

If you’re a user, which we all are, you should care because the most profitable place to then target that user from data they collect for either a Facebook or a Google is always going to be the place that has the highest margin, the cheapest content, and if you want to go to the extreme case, the piece of fake content or the bot traffic is going to be the most profitable piece. Now, they’re going to avoid that because they have some ethics and concerns there, but ultimately, the piece of high value content from the New York Times or from ESPN is going to be a lower margin place for Google. And so it moves the economics of the entire industry towards lower quality content and more and more tracking and that’s generally bad for everyone here.

Kirkpatrick: Wow.

Kint: Except for the people who work for Google and Facebook.

Kirkpatrick: I think we should take a—Kirsten, can we take another question or two? Because we’re still—it looks like we’re a tiny bit ahead right now. This is too good to stop now. Okay, wait, I want to hear Michael and then you. Okay. Michael?

Audience 2: Hi, there. Going back to the fake ads, fake posts question, there’s been a theory in this country that the enemy of bad speech is more speech. Are you now suggesting that what we need to do is actually regulate what that content is, which would be very different than anything that we’ve done before?

Rotenberg: Let me say a couple of things on that issue, because we have been involved in some of the proposed legislation post-2016 around online advertising. I don’t think we should censor speech. I also don’t think we should bring back the Fairness Doctrine. Both proposals have been put forward in Congress. I think they would be a mistake. I do think that internet companies that provide political advertising should carry the same responsibility that broadcast companies and print companies do to disclose the source of a political ad. I think we made a big mistake—I mean, I didn’t personally, but the FTC made a big mistake when people said, oh, the internet wants to be free, you don’t need to disclose the source of political advertising. Every other broadcast firm does.

Kirkpatrick: If I’m not mistaken, Facebook has more or less—talk about self-regulation—said they will do that on their own now, right?

Rotenberg: Well, that’s the kicking and screaming outcome. You have to understand how they reached that point. They resisted, they resisted—

Kirkpatrick: The threat of regulation, yeah.

Rotenberg: They resisted. And Senator Klobuchar pushed very hard during the hearing—it’s her bill—and said, “Would you now agree to this?” And you know, at that point, you’re in front of 50 members of Congress, you find a way to agree. So I think that’s part of the story: at least ensuring the same level of accountability in political advertising.

Kirkpatrick: Okay. Jason, do you think Facebook is a media company? Does it matter? I mean—

Kint: I don’t know that it matters. There’s an accountability that they need to have over the content on their product and the way their algorithm is working.

Kirkpatrick: Okay, well, that’s a yes. That’s a yes.

Kint: Right now we have an issue where they have zero liability for everything that goes through their systems, but they capture all the value from the data that comes with it. That’s not a good place to be.

Kirkpatrick: Okay. Here comes the mic.

Hatcher: Thanks, David. Douglass Hatcher with communicate4Impact. We’re a storytelling training firm. My deal here is more of an observation. We’ve talked about responsibility, we’ve talked about inclusion, but I think we’ve had a couple people today, I think Marc was one of them, make the connection between regulation and innovation. So I thought maybe you guys could talk about that for just a second longer.

Kirkpatrick: I mean, it’s a very interesting thought that we need to protect the ability to innovate which is what you were saying, right?

Rotenberg: I’ve really enjoyed this conference, and several of your panelists have made similar points, you know, about new medical technology, new environmental technology, new technology for sustainable food. Regulation can create very favorable incentives, and it can basically reflect the public’s view as to what a desirable outcome is. And companies come along and say, gee, it would be really cool if we could do innovative services with less personal data. We’ll reduce the risk of data breach, we’ll reduce the risk of identity theft, and we’ll have really cool services. So I do believe that regulation can play a positive role. I mean, obviously there can be mistakes, but that’s certainly not a reason not to think about positive outcomes and to drive innovation and incentives in that direction.

Kint: I would just add that I think the bigger opportunity to inspire innovation there is a structural question. Google was allowed to buy DoubleClick, and even though they promised, I believe, that they never would do this, they merged all the data and cookies across their ad tech company, search, and Gmail back in 2016. And Facebook was allowed to buy Instagram because it was a photo app and Facebook wasn’t a photo app, which is ridiculous in hindsight. And importantly, if you believe the data from Nielsen, Facebook’s usage of the core app is down nearly 10 percent year over year since last summer.

Kirkpatrick: In the U.S.

Kint: Per user. Globally, per user.

Kirkpatrick: Globally?

Kint: On a per user basis. And trust is down, and so their growth is coming from being able to collect the data across all these other products, WhatsApp and Instagram, along with the usage growth of Instagram. So they actually have some fundamentals underneath the business that they’re protecting because of their reach.

Kirkpatrick: Okay, well, we have to wrap this discussion.

 
