Session Description: Hughes’ new book is Fair Shot: Rethinking Inequality and How We Earn. And he helped start Facebook in his dorm room.
The transcript is below.
Facebook, Tech, and Rethinking Inequality
(Transcription by RA Fisher Ink)
Kirkpatrick: Chris Hughes probably doesn’t need a whole lot of introduction, but I will tell you that he was Mark Zuckerberg’s roommate along with Dustin Moskovitz and a relatively obscure fourth person—
Hughes: Billy! Don’t forget Billy!
Kirkpatrick: What was his name again?
Hughes: Billy Olson.
Kirkpatrick: Billy Olson. And what does he do now?
Hughes: Not to mention Eduardo. That guy was around too.
Kirkpatrick: Eduardo wasn’t a roommate, though, right?
Hughes: No, he wasn’t. He would like for you to think he was, though.
Kirkpatrick: Yeah, but what is Billy Olson doing now?
Hughes: I don’t know.
Kirkpatrick: No, that’s okay. So Chris has written a book that is quite interesting, called Fair Shot: Rethinking Inequality and How We Earn. But not only that: he has said he’s willing to talk about Facebook up here, and he recently wrote an essay, published in a very prominent place, on the idea that we should have a data dividend. So explain what that is, how it might work, and how it relates to other things we’ve been hearing here.
Hughes: Sure, well why don’t I set the scene about how I came to that idea and a little bit on the book and what I’m working on, in order to speak to that.
Kirkpatrick: Yeah, please.
Hughes: So I, as David mentioned, was one of the co-founders of Facebook. But before that I grew up in a little town in North Carolina, where my mom was a public school teacher, dad was a traveling paper salesman, we had a very middle-class life. I could go into it, but whatever you think when you think small town America in the 1980s in the South, that’s what we were like. And then I got a chance to go to boarding school on financial aid, later go to Harvard on scholarship, and there I became acquaintances with Mark Zuckerberg [in] freshman year. We roomed together sophomore year. We started Facebook on February 4th, 2004. The rocket ship takes off and just about everybody in here probably knows the basics of the story, although you may not know the fourth roommate’s name.
Kirkpatrick: I had it in my book, though. I forgot, that’s all.
Hughes: You did. You did. You had everything in your book. [LAUGHTER]
Kirkpatrick: Oh thank you!
Hughes: It was great! The reason, though, I tell that is because, you know, I ended up working at Facebook for three years, doing product work, communications, marketing. I’m really proud of the work we did; I was part of the team that started News Feed and some of the early sharing functionality. However, for three years’ worth of work, I was able to earn nearly half a billion dollars. Which is nothing but a lucky break. And I think we need to start calling my experience what it was, and the experience of a lot of other people that keeps happening. Not just in tech, but there are an awful lot of us in tech, who do some work, but for whom the economic rewards are just vastly disproportionate to the amount of effort put in. And this is happening at a time in our country when median wages have been flat for 40 years. I know you guys have been talking about this a little bit today and yesterday. Cost of living is up by 30 percent. So the 0.1 percent now owns as much wealth as the bottom 90 percent combined. Not the top 1 percent. The top 0.1 percent. It hasn’t been this bad since 1929.
So the reason I tell all of that is because I think it’s an important backdrop for the conversation that we’re having today about Facebook, technology, and the concentration of power. And I think we’re beginning to ask questions about what responsibility corporations, and those of us who have done the best in the new economy, have to ensure that the American Dream doesn’t die. Because it’s close. It’s on life support. And something has to change. So one thing I talk about in the book is creating an income floor for all Americans. And one way of doing that is like what they’ve done up in Alaska. They have a dividend that every single person receives of [roughly] $1,500 a year. Man, woman, and child. So if you’ve got a family of four, you’re getting a check in October for $6,000. No strings attached. If your median income is $60,000, imagine your own take-home income being boosted by 10 percent. That’s not enough—
Kirkpatrick: That’s financed by oil taxes or whatever.
Hughes: It’s not enough to change your life, but it’s meaningful. It’s financed by oil taxes, a common wealth that, 50 years ago under a Republican governor, they agreed they all owned.
Fast-forward to today, 2018. I think there’s an argument to be made that all of the data that we are creating, everybody in this room, but everybody in this country, and billions of people across the globe, all of this data that we’re creating is a kind of common wealth. It’s not really individual, on a one-by-one basis. One person’s data isn’t particularly valuable, but collectively, as we know, it is very, very valuable. You know, Facebook, Apple, Amazon, Netflix, and Google, their market caps are now nearly a third of the entire NASDAQ composite. They account for a meaningful portion of the recent growth.
So I think we should have a conversation about a data dividend. There are a lot of questions about how this would work, but the basic premise is we’re creating all of this common wealth, and that’s a good thing for the companies. We need privacy protections; we need a whole host of controls in place. I think it’s also reasonable to ask that everyone who’s creating the data shares in a little bit of the upside, you know, a royalty of 5 percent or something like that, on revenues. That could finance a dividend of $400 a year for people. And that would almost certainly grow, as the data grows and—
Kirkpatrick: Is that figure for all citizens, basically?
Hughes: I think that’s right. I think that’s the way it should be—
Kirkpatrick: So it’s kind of really the same thing, in a way, guaranteed income. Or it’s very overlapping.
Hughes: From a different side. The idea is an income floor to provide financial security to people who need it—
Kirkpatrick: But it’s financed differently. Because you’re saying the companies should pay for it, right?
Hughes: I think so. I think so.
Kirkpatrick: That’s a big difference.
Hughes: There are so many questions about how to do this. So the reason that I wrote the essay in The Guardian was not because I’ve got this figured out. Because let me be the first to tell you, I do not. It instead was to provoke the question. A question, in particular, that highlights the concentration of wealth and power, the need to do something about it, and one potential approach. And there are all kinds of very important questions about how this would work, and we need experts on data, on privacy, and on how sovereign wealth funds, which is the term of art here, should work.
So hopefully this can be the beginning of the conversation. And I’m not saying this is the idea, today. But at the same time as we’re talking about so many other changes, I would hope that we can talk about economic insecurity as well.
Kirkpatrick: So in a sense, what that would do is shift money that is currently going to the market cap of these companies, that represent a third of the NASDAQ’s growth or market cap, and reduce it and pay it back to the citizenry, in effect. Because it would clearly reduce the profitability and the value of those companies, to some degree, right?
Hughes: It’s a way to ensure that those profits are more equitably shared. Because right now, those profits are being fueled by the participation of people on the platforms. You know, and every Facebook user comes in, clicks “Agree to the terms of service,” how many of them are actually reading it? Actually understanding what’s going on? And I think now people are starting to ask questions, which is great, but it’s a way to ensure that the people who are literally laboring to create this data, which creates these historic margins, are compensated, in some modest way for it.
Kirkpatrick: It’s interestingly relevant to a number of the debates that are underway. So congratulations for vetting that and throwing it out there. I think it’s going to lead to some very good discussions.
Hughes: It’s the beginning, so if you’re interested in it, we would love to talk more. There’s a ton of questions about how it would work.
Kirkpatrick: One thing that it’s relevant to is this 40 percent net margin that Facebook has, and you know, we didn’t even get to it in the earlier discussion just now. But you know, Facebook says that it has to have ads to finance its global growth and the sharing of this distributed platform for empowerment that is so socially valuable. That’s the way Mark thinks of it, and that ads are the way we “pay for it.”
But, you know, given that they are getting so much more than they need to pay for it, in their profits, there is an intrinsic flaw, in my opinion, in that argument. I don’t know whether you look at it that way, but I guess what I would be curious to know is have you had any discussions with your friend Mark or any of the other Facebook people—
Hughes: I have not.
Kirkpatrick: —about this idea? Have they responded in any way? Have you heard anything? They must hate that idea. [LAUGHTER]
Hughes: I don’t know. We—I haven’t talked to them about it. I think they’re—they’ve got a lot going on over there. So you know, I mean I think that they’re—I haven’t talked to anyone specifically about the data dividend idea. I have talked to many people in Facebook’s leadership and elsewhere about income inequality and the responsibility that, as individuals and as corporations, I think we all have to ensure that more people have economic opportunity than have it today. And I think that they’re, you know, all very interested in that. Mark, last year in a speech at Harvard, said that he was interested in how a basic income might work. He went up to Alaska on his 50-state tour, which now seems like a different age, when he was—you know, everyone was speculating he was running for president. But he went to Alaska and was interested in it too.
So there’s interest, not just at Facebook, but, you know, amongst the founders and leadership of a lot of other companies. I think some of that stems from a concern about artificial intelligence and the impact on work. Some of that, I think also stems from just a really direct look at what’s happening when it comes to economic opportunity and small business starts in the country.
Kirkpatrick: Okay, so you basically left Facebook to go to work for the Obama campaign, the first time he ran, and he won. And you were deeply involved in his digital efforts. You know, were you a political science major? I’m trying to remember.
Hughes: I majored in history and literature of France.
Kirkpatrick: You actually did graduate, though, right?
Hughes: I did.
Kirkpatrick: You graduated. Good for you.
Hughes: I was the first in my family to ever go to a school like Harvard, and when you’re there on scholarship or financial aid—I did not drop out. I continued to—I had over-invested in courses my first couple of years, so I moved off campus and was out in Palo Alto and back and forth. But I stayed. I stayed in school.
Kirkpatrick: The reason I mention it is you’re clearly interested in politics, but also democracy. I mean, you’re somebody who thinks about that kind of stuff, and you’ve worked hard in the political realm. Facebook is getting so much criticism for the way that it’s begun to play a negative role in democracy. I’m just curious your thoughts on that.
Hughes: Well, I mean, there’s so many issues with the entire tech sector that are coming up now. And I worry that they often all get conflated. I mean, there’s these democracy issues, so the privacy things that we just discussed—or that some of your panelists were previously discussing. There’s the question of just attention and the attention economy and the effect on news. There’s so much stuff.
Kirkpatrick: We had a session on that today. Yeah.
Hughes: So on democracy, I think we should all worry when a single company effectively owns the public square of the 21st century. And I think there is a reasonable case to be made that Facebook, in the algorithms that it chose to implement, to power News Feed in the past few years, has contributed to the growth of the extreme voices in our politics. It is certainly not the only factor. I mean, everything from our geographic self-sorting and how we live, to the filter bubbles on cable TV, there are many contributions here.
But I think we should recognize that social media is one of those. Now Mark has already engaged with that, and some of the changes that they’ve made to the News Feed algorithm to encourage conversations and commenting seem like steps in the right direction. But I do think that just having a conversation about the fact that News Feed is created by humans, that there are real people, who make real decisions, about how it works, is a really meaningful step forward. Because those of us in rooms like this have known that for a very long time, but I think culturally, there’s always just been this sense of like we just configure the rules and then, you know, the network decides what surfaces. No, actually, Facebook made decisions that enabled extreme voices, politically at the edges of the system, to get a lot of attention. The more outrageous the better.
And thus, people felt like they weren’t necessarily alone. And so the extremists became more emboldened and we all—I mean, I think wherever you are politically, I think, on the left or the right, the rise of white nationalism in particular, is a major area of concern. And that is, I think, partially because people feel like it is easier to—they feel more entitled to say what they think. Again, Facebook and many other social and cultural changes are part of that, but we have to name that responsibility if we’re going to do anything about it.
Kirkpatrick: Is that another way of talking about transparency in the algorithm, the News Feed algorithm? Which would also presumably apply to Google’s search algorithm? Is that something you’re an advocate of, that we really need to somehow create mechanisms for transparency about how these programming decisions are made?
Hughes: I think it’s the right question to ask. Of course, they would respond, whether it’s Google, Facebook, or any of these companies, that “Well if we open up the algorithms, it’s just going to make it easier for people to game.”
Kirkpatrick: Well, opening them up and transparency about how they’re contrived is not exactly the same thing.
Hughes: Right, exactly. So I would sort of ask, “How should they be transparent?” You know what I mean?
Kirkpatrick: That’s something that we have to ask and it may take years to get the answer. But we have to keep asking, don’t you think?
Hughes: Right. I think it’s the right question. I think the next question is what control should users have over it? You know, what if you were able to say, “Actually, I just want to use Facebook to see news,” or, “Actually, I don’t want any news,” or what if I just want to see photos? So that’s one, but that too is problematic, because then there’s a filter so that there’s no news in the public square. So I think ultimately we have to recognize the executives at these companies will be making some normative decisions about what people are able to share, and who shares it, and who sees it. So that is the first step.
Even if they open up and it’s transparent. Even if they provide controls. There will still be defaults. There will still be decisions made about how to surface that to users. And so with that responsibility, I think, or with that power, comes a lot of responsibility. And to just rely on self-regulation alone, to me, is concerning. I think the times have changed. We’ve turned a corner, and now we should be having a conversation about whether it’s possible to regulate. If so, how?
And by whom? And I don’t have the answers to those questions, but I know that those questions are the ones that we have to be grappling with in rooms like this.
Kirkpatrick: One of the more vexing problems, which has come up a couple of times in our discussion, is that national law is the logical way that regulation has always happened, but national law doesn’t really apply aptly to a global system. So you know, we don’t have global regulatory mechanisms. It’s a weird dichotomy that is pretty problematic.
But you know, it’s interesting, the thing you said before about if we could just pick to see all photos or all news, the way you followed that with saying but then the public square—you’re still presuming that Facebook has the right or the responsibility to be the public square. Maybe that’s something we need to just give up as an idea. Maybe that shouldn’t necessarily be protected. Maybe it’s wrong that a company should be doing that in the first place.
Hughes: Well, I mean, I think they would say that there are many places that people get news; there are many places that people discuss things. I think that Facebook is effectively the public square for political discourse. Obviously, still have conversations around—
Kirkpatrick: It is! I’m not saying it isn’t.
Hughes: And if that’s going to stay the case, then, you know, we need rules of the road, just as we have them in literal public space, we need them on the internet. Now, if there were more competition, and if there were a lot more Facebooks, if there was the world that some people envision, then you could imagine, “Well, okay. Facebook isn’t really so much the public square anymore.” One way to respond might be a kind of pro-competition policy that would say, “Well, regulation about what people can and can’t say, what Facebook can and can’t do, is going to be highly technical, it’s always going to be changing. That’s not adaptive. What instead is adaptive is just making sure that there’s appropriate competition in this space, so that no single company ever owns the public square.”
Kirkpatrick: Well also, even just the articulation of a set of principles that could be agreed on by some kind of external body, whether it’s global or public/private consortium or something, I think would be a step ahead of where we are today.
I want to ask you another thing—
Hughes: It’s hard to argue with that.
Kirkpatrick: I made a point onstage here yesterday when we were talking to Martin Sorrell: I am particularly personally offended by the way that Facebook is exacerbating ethnic discord in so many countries around the world, in large part, in many places, because it does not have local language moderation or even responsiveness when complaints are made. And yet it is the primary public square in those countries too.
And I made the point that I think one of Facebook’s failures, and one of the things that it deserves extreme criticism for, is to have allowed itself to grow to become the de facto public square in countries where it has no local language expertise. That is something it didn’t have to do. It chose to do that. And it’s been very well-documented in Sri Lanka and Myanmar, just to name two. The Philippines is another where there’s a lot of documentation of how bad this is. It has created problems alongside a lot of wonderful things, but the political problems are toxically deadly. And people are dying in numbers in some of these countries, because of the inability of Facebook to be responsive, to even be responsive when people say, “Hey! That fake news is leading to people killing each other!”
Hughes: I mean, to say they got ahead of themselves is an understatement. I mean, you’ve got 2 billion people on the platform with what—you would know the stat better than I would—20,000 total Facebook employees? And the number of employees isn’t—
Kirkpatrick: Yeah, it’s like 25,000 now.
Hughes: Right. Isn’t necessarily the right metric, but it’s symbolic of the power of the platform to connect all of these people, but how thin the administrative layer on top of it actually is.
So now they’re rushing to catch up. So I think it’s important to recognize the changes that they are making, and I think they are moving quickly. However, it is a matter of life and death in many cases. I mean, I think they would likely agree with all of us: they must be moving faster.
Kirkpatrick: But you know, they would agree, but they actually won’t address this kind of thing publicly. I’ve never heard either Mark or Sheryl, ever, once say a single sentence about the thing we’re discussing now. Which has been documented—I mean, the Businessweek piece about the Philippines is a great example. The New York Times has done a lot of reporting on Myanmar and on Sri Lanka. The excellent article on Sri Lanka I’m sure you saw not long ago, which was devastatingly critical. And you only had from Facebook the most, you know, bland, PR—they were just self-justifying and not really addressing the issue. I find that disturbing.
Time is supposedly up, but we definitely have time for a little more. I want to hear audience questions and comments for Chris. Let’s get the lights up and okay, let’s take a question right here.
Gray: Hi, my name’s Kimberly Gray. Thank you so much Chris and David. I’m the CEO of Uvii, it’s an embedded video-commentary platform. I guess my question’s also a statement. When do you think we’re going to get to the point where we’re actually having a connective dialogue through social media? I mean, if you think about social media, it’s like watching the news. It’s like a one-sided conversation, because we don’t really have the opportunity to respond with video or have a real conversation. We’re like all within these boxes within our particular pages, but we’re really not talking. Like what mechanisms do you think need to be implemented for us to really start having a responsive dialogue about these issues just beyond a room like this?
Kirkpatrick: I would say Facebook is designed to have that already, to be honest. It may not manage itself well enough so that’s apparent, but I mean, the thing that did make Facebook enormously important as a social innovation was that it gave individuals the power to broadcast. And that happened, really, starting in 2008 or ’09, more or less, and that’s still true. And it’s really a big deal and that’s why a lot of these problems arise, but it’s also why the Me Too movement has taken off, in large part. It’s why a lot of great things are happening in society. I do think there is a feedback mechanism. It maybe needs to be refined. I would think they should have more people inside Facebook, like you, asking those kinds of questions: How can it be made more clear? How should it work? I don’t know if you have anything to add to that?
Hughes: No, I think that’s well said.
Attendee 1: I just wanted to ask you what you learned from your experiment with The New Republic?
Kirkpatrick: Uh oh, good question. That’s a different topic.
Hughes: A lot. So in the book that I published a couple of months ago called Fair Shot, I talk about this extensively. So for those of you who don’t know, after working on the Obama campaign, and right around the time of Facebook’s IPO, I bought a magazine called The New Republic. It was a political magazine that had been based in DC, nearly 100 years old. And I came in guns blazing. Like I was ready to bring all the principles that I had learned, not just at Facebook, but also on the Obama campaign. A small team focused on scrappy, iterative kinds of changes, to bring the values of The New Republic, the kind of long-form journalism it had done for a very long time, to a much bigger audience. That was what motivated me. I have always been a humanities guy; I have always believed that journalism is a civic pillar of the United States. And at a moment when the business models were under threat, I saw this as an opportunity to try to do the impossible, to be totally honest.
And then after spending a lot of time, a lot of money, and a lot of effort on it, what I increasingly realized was that the idea that long-form journalism was necessarily going to be profitable was—it was pretty far-fetched. I was like the last guy, you know, to get the memo. [LAUGHTER] But it was definitely true. And you know, as we went on that road, there was a lot of great journalism that we produced, there were some good things that I’m still proud of that we did digitally, and we had a huge cultural fallout, as many of you might remember, where a dozen editorial staffers left en masse.
It really was a very public rejection of the idea I sort of symbolized. Now in my view, the whole reason that I was there in the first place was that I cared for the values of journalism and was trying to use technology to bring it to a bigger audience. I had no interest in making it into, you know, a BuzzFeed. This was a kind of institution that had never been that and never would be that. I did, however, want to make it self-sustaining, if not profitable. I did want bigger audiences. I did want a more modern, if you will, media company.
And so anyway, long story short, I think the biggest takeaway, to answer your question, the biggest lesson that I learned from it, was that sometimes having more modest ambitions can serve the institution and goals better. Like if I had come in and said, “You know what? This thing is never going to make any money. Let’s fund the journalism that it does more modestly, with a couple of million dollars a year,” instead of the $5–6 million that we were losing. “Let’s do some great journalism, but we don’t have to swing for the fences. We don’t have to reinvent the next model of journalism,” then I think the institution would have been much better served.
And that lesson directly applies in the work that I do today, as we were talking about at the top of the conversation. The Economic Security Project that I co-run is an initiative to create a guaranteed income for Americans. And a lot of people talk about UBI. I know Andrew Yang was here before. I think that idea and the values behind it are very inspiring. In some ways, that’s like the idealistic kind of extreme. And what I think a lot more about is something a little bit more modest, more like $500 a month to people who make $50,000 or less. More targeted, using the earned income tax credit, part of our tax code, as a way to do it. It’s wonky. It’s much more in the weeds. And it’s much more modest, but I do think that that is the more appropriate response in this moment in time. And part of why I feel so emboldened to make that case is because I wish that I had not come into The New Republic swinging for the fences, but instead just tried to, you know, hit a good single or double.
Kirkpatrick: Okay, that’s a good answer. Back here.
Robinson: This has been an awesome conversation, so thank you both. And I think we’ve touched on—
Kirkpatrick: Who are you? And I’m sorry.
Robinson: Dan Robinson with OS Fund. I think we’ve touched on some of the more flagrant abuses on the Facebook platform, that certainly should be curbed. And I was, you know, curious as to your thoughts kind of on the other end of the spectrum, you know, if we think of Facebook as sort of a mirror, that kind of reveals some unfortunate truths maybe about who we are, as far as our collective critical reasoning ability, or intelligence, or let’s just maybe leave it at critical reasoning ability. But if it’s that kind of mirror, you know, are there ideas that you have for Facebook, or you know, a platform that might complement or succeed Facebook, that could actually kind of elevate the national discourse? Or is social media, just the truth of social media, is the best we can do is curb the abuses, and it will bring out the lowest common denominator, that’s kind of the name of the game?
Hughes: Well I definitely don’t think that Facebook is a mirror of who people necessarily are. A cousin of that idea, at least as I’m hearing you explain it, is the idea that technology isn’t good or bad, it just, you know, helps people be who they are. That I don’t think is right.
I think technology is built with certain values in mind that can reward or disincentivize certain kinds of behaviors. So right now, Facebook rewards the kind of self-expression that uses the written word most often, or at times images, occasionally videos and live. But those are specific decisions that have been made. And not only that, but it’s been created so that people can scroll through it at warp speed, and you’re just waiting for whatever grabs your eye, and then you like it if it’s outrageous, and so the person who’s sharing gets the feedback loop that the best stuff that they can share is whatever is most extreme or perhaps most dramatically put. So I guess I really think that we’ve spent so much time talking about technology as neutral, that we’ve forgotten that a lot of these design decisions really matter. And so, for instance, my Facebook profile is not a mirror of who I am. I mean, you go there, it’s actually sort of boring. It’s a lot about guaranteed income, a little bit about data dividend, all this kind of stuff. But you know, my husband and I have a little boy. He’s four months old.
Kirkpatrick: I didn’t even know that! Congrats! Oh, that’s so cool.
Hughes: Yeah, we’re happy, but we don’t share Facebook photos of him publicly. And I think that’s a really good thing for him, for us, for a whole host of reasons, which I could talk about. But my point is my Facebook is not a mirror of what my days are like, or even necessarily what I’m most focused on in any particular moment. It’s just a different approach to the platform.
Kirkpatrick: Okay, but I thought his question was a good one. Do you have any ideas about how a better system could be designed? Or is that just not your concern right now?
Hughes: Better how?
Kirkpatrick: Well I mean, he’s saying, could we have—I don’t know—
Hughes: What are the values?
Kirkpatrick: I guess he’s sort of saying a better social media or maybe it’s just too broad of a question.
Hughes: Well no, I don’t think it’s too broad of a question. I think what you’re saying is if it were easier for new social platforms to emerge, you could imagine one that is built around different values, that doesn’t have a News Feed product at its center. I mean, you could argue that Snap is so different than Facebook that that has been one—
Kirkpatrick: It is intended to be. Yeah.
Hughes: —entrant. So I don’t think it’s—I think there are many ways to design the technology and it’s not a fait accompli that it’s necessarily going to be superficial. I mean in some ways it speaks to the first question as well.
Kirkpatrick: Okay, now wait. I saw another hand.
Kint: Chris, Jason Kint from DCN.
I was going to ask about The New Republic, but your candor on The New Republic was phenomenal, so thank you. Maybe a more difficult question: Looking at the 2016 election and all the concerns with the Facebook platform, whether it be dark posts or data and how it was used, is there anything that you would reflect on from the Obama campaign that you say, knowing the current discussion, that you think we probably shouldn’t have done that, or that—and I don’t even know if you can say what it is, but I’m just saying, in hindsight now, looking back at the tools and how you used them—
Kirkpatrick: It was actually the second Obama campaign where they supposedly did the same thing that Trump did, but got away with it, where Facebook said, “Go ahead, use the data inappropriately.” That’s been said.
Hughes: I don’t think that’s true. So I was on the ’08—
Kirkpatrick: I think that was said, that 2012 campaign. But maybe I’m wrong.
Hughes: No. So on the ’08 campaign the technology was so basic—
Kirkpatrick: You’re the one who did the wrong thing, is that what you’re saying? [LAUGHTER] Oh, it was your campaign. Okay, sorry. Jeez.
Hughes: My campaign was the first one, the 2008 campaign.
Kirkpatrick: I know that.
Hughes: Where, you know, we were Twitter only, really. We started using Twitter, I think, in September of that year. It was Myspace and Facebook. And there was the beginning of Facebook’s platform, and we did have an app on top of it, but it was very far from what they were able to do in ’12 and then later in ’16. I think what you’re referring to is, I think unfortunately, a false equivalency that has at times been drawn in the media, between what happened in 2016 with the Trump campaign and the 2012 Obama race.
Now in 2016, just to be specific about what happened, you know, Cambridge Analytica illegally and unethically effectively stole and then used the data of 87 million Facebook users to enable the Trump campaign to do more effective micro-targeting. That was possible because Facebook Connect made it easy for people to take the psychological quiz and for a third party to store that data and just sign off and say, “Oh yeah, I’ll be responsible.” Like that’s one thing.
In 2012, on the Obama campaign, they also used Facebook Connect to enable users to send messages to their friends in battleground states. So if I have a friend in Iowa, then that’s a more important vote than somebody in New York. And so users did opt in and say, “Show me the people that I should connect to.”
Now there’s a conversation about whether even that is what we want in our politics. But the campaigns were following the terms of service. They were using it in a way where the users were aware of—they were connecting their Facebook account in order to recruit other friends. So it is very different from what happened in 2016.
Across the board, though, I think we’re being asked to rethink what privacy expectations we should have around campaigns and politics in America. I mean, the reality is, both on the left and the right, in politics, just like in the commercial space, there are immense amounts of data produced on voters that are used to better target ads, to better target messages. That’s something that, you know, everybody who’s running for office in 2020 will do, and that just about everybody on all those committees who were grilling Mark Zuckerberg a few weeks ago has already done in their campaigns or with their consultants.
So again, I think this is the watershed moment that we’re living in, where people are realizing that—I mean, everybody in the room, assuming that you’re a voter, as I have a feeling most of you are, you’re in a database somewhere. The elections that you have voted in or haven’t voted in are available, and that’s probably being layered with other commercial information about what kind of car you drive and whether you have a mortgage and these kinds of things, to help campaigns target you.
Now that was done on the left and the right, and yet another conversation about is that good for our politics? Do we want that? Or do we want to regulate that? That’s something that we haven’t even, I think, skimmed the surface of.
Kirkpatrick: Well, okay, [LAUGHTER] I want to take one or two more.
Bondini: Hey, I’m Chris Bondini from Icahn. I read a study recently about billionaires, and it says that even though they believe that they’re more empathetic, and more attuned to the public good and the public interest, as they become more successful they actually, if you test them, are less empathetic, and they kind of have blinders on about all kinds of downsides, but they use rhetoric about the good of humanity and advancing causes and so forth.
But through your work on campaigns and the digital dividend, does that mean—you seem very different. And I wonder to what do you attribute that? Is it in your background, your upbringing, your studies? And could we distill it and take it out to Silicon Valley? [LAUGHTER] Distribute it? Thanks.
Hughes: Well I’m wary to make sure that—I don’t know, I fall into—I’m human like everybody else. I fall into a lot of the same behavioral patterns as others. We should find each other afterwards because I want to see the study.
I do think, though, that a lot of wealthy folks, if we’re going to paint with a broad brush (and it’s not just wealthy folks, although it’s many of them), tend to ascribe a lot of importance to work and less importance to luck or fortune. There’s a sense that once you make it, you know, you made it because you were very, very smart; you were very driven. And in many of their cases that’s true; I’d say Fortune 500 CEOs are some of the hardest working people out there. But the people who come behind them to clean their offices afterwards are also some of the hardest working people out there, I think more often than not maybe even harder working, and the rules of the road don’t reward their economic activity in the same way.
So I guess, it’s part of what I think is so important, to ask the question about work and deservedness in the United States, and who has economic opportunity, and who doesn’t. And zooming out to understand that we all have a responsibility, but also, I think the country that we want to live in is one where we all take care of one another. We want people to contribute; we want people to generally work to be of use, to find purpose. And we also want to make sure that everybody who’s doing for themselves, for their families or communities, isn’t left behind. Isn’t left in poverty.
So I do think that ultimately that is a conversation about power. I think billionaires and rich people have more power now than they have had in a very long time. And I think we do need to rebalance the playing field, and people don’t love to talk about that so much, because what’s in vogue is like, what’s the win-win? Like the company that I can start that’s going to solve poverty? And I think that’s a good and interesting conversation to have, but it often skirts the fact that inequality is at these record levels, and that the power imbalance does matter. And sometimes you might have to give up some power, you might have to give up some wealth, in order to make sure that everybody has the opportunity there. [APPLAUSE]
Kirkpatrick: Okay, so that’s a really good thing to say and that sets up what I think has to be the last question. You know Mark Zuckerberg very, very well, who is now the fifth-richest person on the planet. You know, at one point I thought he was definitely going to be the richest. At the moment, he probably still has a good shot at that, even surpassing Bezos, depending on how things go. But how would you say he has handled this recent crisis? And how would you say he’s handling his wealth and his position in society?
Hughes: Well, I mean, as I said earlier, I have not talked to him about the recent crisis. I think the jury is still out. I think there is one route to go where he and other tech leaders really embrace this moment, and I mean proactively embrace it, as an opportunity to talk about privacy and democracy and to truly welcome these conversations. And there was some evidence, in his Congressional testimony a few weeks ago in particular, that they might move in that direction. I think that is the direction that they should go. The other—
Kirkpatrick: I think he said a lot of promising things in that regard, in the Senate hearing.
Hughes: The other direction is to think that, you know, even conversations like this one, you know, are a threat to Facebook. And I fundamentally believe that we’ve got to be focused on our democracy and our society and our way of life. And what is good and healthy for all of us first, and then we have to think about how Facebook and all tech companies work for the common good. And that is the order.
And I do think that Mark and others in the tech world generally are aware of the responsibility that they have, and also want to be a force for good in the world. I mean, I think that’s very important. And the question, then, in the short term is, you know, how much do he and the others embrace the questions of how to do that? Or just hope that it dies down?
Kirkpatrick: Or think they can figure it out for themselves. Which seems to me to be the way—the modality that he’s adopted, more or less. He’s not really engaging with dialogue that’s two-way with the outside world, except for in Congress where he was forced.
Hughes: Right. No, I mean, no single individual and no single company and no single politician, for that matter, is going to figure this out. Like this is too complex a set of issues and we need civil society, political leaders, business leaders, everybody at the table, for a multi-year, deep conversation about what we can realistically expect from these companies. So it can’t be—we can’t just look to Mark Zuckerberg for the answers. Nor do I think we want to.
Kirkpatrick: No, but Mark Zuckerberg needs to have more dialogue, I would say. I mean, it’s interesting, Facebook said they were going to have some more public conversations with their critics, which we haven’t really seen.
Hughes: We should get him on your stage for Techonomy in the fall.
Kirkpatrick: It’s an open invitation but I’m not sure they’re jumping at that one. [LAUGHTER] So glad you’re here and really, congratulations. As several have pointed out, you really are grappling with your good fortune, which you acknowledge. Your public policy interests are really extremely admirable, that you’re staying so engaged. You don’t have to. I mean, you could go just chill out and that would be not unexpected in today’s world, but I hope you will stay engaged—
Hughes: Thank you.
Kirkpatrick: —because I think you have a lot of great contributions to make.
Hughes: Don’t worry.
Kirkpatrick: So thank you, Chris. [APPLAUSE]
Hughes: Thank you. Thank you, David.
Kirkpatrick: Really good to have you.