Description: No company’s influence in modern society is more fraught and unresolved. Facebook faces credible accusations of serious impacts on democracy globally, on the privacy of its users, and on digital addiction. The company is trying to do right, but its business model creates resistance. What should it do now? What should society do?
The following transcript has been lightly edited for readability.
Can Facebook Recover?
(Transcription by RA Fisher Ink)
Kirkpatrick: As you know from seeing the cover of our magazine and if you know me, this is a subject that I care a lot about, not just because I happen to have written a book about it but—I wrote a book about Facebook—because I really thought it was an unusually important company. I think what you’ll find is we have four panelists, all of whom also have always believed Facebook is an unusually important company, unusually important service, with an unusually important role in the world.
However, all five of us who will be up here have switched our views to be extremely alarmed in one way or another, each in a different way. So I’m quite excited to introduce you to them as soon as we get the unusual number of chairs on the stage. But I don’t think anybody would have much dispute with the idea that Facebook is in a kind of crisis, although I’ll say that one group that doesn’t seem to believe that is the securities analysts who were asking Mark and Sheryl and the CFO the questions on the most recent earnings call.
Panelists, please come up and I’ll continue. Wall Street is more or less sanguine. That was quite clearly evident on the earnings call. Only one question really addressed the issues that for many of us are the central issues, which is how can Facebook continue having a fundamentally positive role in society? And before we start, I should also say I very specifically and firmly asked Facebook if they would send someone to talk at this conference, not necessarily to be on this session, although they could have been if they would have—I would have had them before, after, during, whatever. All their executives were too busy. That was their response. So I think that’s predictable, unfortunately, because it fits a pattern they have established now of essentially evading the dialogue that they’ve created the need for.
But anyway, let me now introduce the panelists. We’ll start with Wael at the end. Very, very honored and pleased by every single one of these people being on the panel but you know, Wael became world famous during Arab Spring, as the activist who was a Google employee in Egypt using Facebook, [and] really did a lot of catalytic organizing of what became the movement to bring down Mubarak. Things that have subsequently happened in Egypt have not been quite as gratifying; he now lives here. He wrote a book called Revolution 2.0. He’s also been on quite a personal journey, which he and I have discussed and maybe he’ll reflect on a little bit in terms of thinking about what really is happening with technology as a tool for activism and society generally.
Next to Wael, Sanjana Hattotuwa, who’s a good friend, lives in Sri Lanka, currently in graduate school in New Zealand. I really am grateful for his traveling from New Zealand here for the conference. He is a longtime activist with an extremely important organization called ICT4Peace, which I tremendously admire and has had a lot of impact working with the United Nations and other organizations on issues of how technology can behave responsibly and positively in society. He also has been involved in Colombo, Sri Lanka, with a group called the Centre for Policy Alternatives, which he’s done a lot of research about Facebook’s role in his country that he may talk about.
Next to Sanjana, Roger McNamee, legendary investor and expert—Roger is by many people’s analysis the person who is most authoritative about what’s happening in technology and he has been for a really long time. He’s, you know, not just an investor; he was formerly a securities analyst. But he always had an ability to understand more things at the same time than anybody, going back to when he and I used to be on a show called—what was that show called on—
McNamee: Digital Jam.
Kirkpatrick: Digital Jam; thank you for remembering that. That’s when I first got to know him, in the green room at Digital Jam on CNNfn, when that existed. He has a book coming out at the beginning of next year called Zucked: Waking Up to the Facebook Catastrophe.
Kirkpatrick: Roger, who played a huge role introducing Sheryl Sandberg to Mark Zuckerberg and was an early investor in Facebook and made a lot of money on it, became extremely alarmed—he’ll tell us a little bit about that—during the runup to the election. He was one of the first people to really raise the alarm about what might be happening, or what was happening, with political interference on Facebook before the presidential election, and he was quite dismayed by what happened as a result.
Finally, Brian Wieser is a securities analyst at Pivotal Research who has had a sell rating on Facebook’s stock since July of 2017, after having previously been the biggest bull on Wall Street on Facebook stock, so his calls have been pretty darn accurate. Anyway, that’s who we have up here.
We were talking before about what we should discuss and I think the thing we should discuss is what should Facebook do now? They are clearly still making a lot of money, maybe at a lower profit rate than before, and the growth continues at a lower rate than before. That’s why their stock dropped $120 billion in value and has continued to go down from there. So the stock had, I think, been over—what, its peak was $210 or something? $220? And it’s now around $140. For this company, the only other time that happened was immediately after their IPO, when their stock dropped from, what was it, around $30 at the IPO? $31? It went to $17. But then it went steadily from $17 to $220, basically, then plummeted, and it is still in trouble as a stock. So, maybe Roger, I’ll start with you. What should Facebook do now?
McNamee: So the problem that I perceive is that the business models of Facebook and Google are essentially built on surveillance. They’re built on essentially taking the data of individuals and the content of content-creating companies, and creating an arbitrage by using addictive technology properties to get the surveillance to work more effectively. When you think about that as a business model, there is no fix. As long as the business model is based on surveillance, as long as it’s based on monopolizing people’s attention, then at best they can go and play whack-a-mole against problems that have come along, as they did with foreign interference in the midterms, where at least there’s no apparent foreign interference that’s shown up. I think the challenge for Facebook and the challenge for Google are very simple. This business model is extremely dangerous for society, and I don’t believe they can be fixed without changing the business model.
Kirkpatrick: Okay, I’m trying to decide who to go to next. I think I’m going to go to Sanjana, if you don’t mind. I mean, what Roger just made is a fairly extreme statement, right? That the business model cannot be fixed. You have chronicled the crisis that Facebook has represented for your country, but you also are a believer in its importance. So talk about that contradiction or non-contradiction and do you think Facebook can be fixed?
Hattotuwa: Not easily. But one hopes so. And the reason I say that is that, as is now commonly known, Facebook is the internet for many of our countries, particularly Myanmar, where the public imagination and awareness and understanding of the internet is actually Facebook. There is nothing beyond that. That is the glue that holds social and political cultures, religious and communal relations together in these countries.
Kirkpatrick: Is it true in Sri Lanka?
Hattotuwa: It is very much the case in Sri Lanka as well and in fact, we are in a historical moment in my small country where, after the 26th of October, political relations have been so turbulent and in an unprecedented manner, that Facebook is both the vector and vehicle and channel of negotiating that political space and also the harbinger and the tsunami of misinformation, disinformation, rumor, and instability as well.
You have this Janus-faced behemoth that people like myself now study but also have to kind of contend with. It plays an inordinately important role and I don’t think it’s understood in its complexity and its rich nuance by the very creators of the platform. And so I try to tread a middle path, where there is frustration and, may I just say on public record, anger at the company for not doing what it could have done with the evidence provided to it—and it’s not opinion, that’s what I want to stress. There was evidence; it was data-driven, evidence-based, translated into English at a time when the company didn’t have confidence or capacity to deal with the kinds of things that we were seeing.
Kirkpatrick: This was starting in 2014, right?
Hattotuwa: This is late 2013 and 2014. And then it becomes a Western—
Kirkpatrick: And you wrote formal reports—that you submitted to the company that they did not respond to?
Hattotuwa: David, the only time that they responded to me was with the very serious communal violence in March this year. There was no communication from the company around—
Kirkpatrick: They responded once Zuckerberg was in the process of testifying to Congress and the heat was on?
Hattotuwa: Well, it was coincidental that it happened at the time when we wrote open letters to Facebook; we wanted to use the scrutiny of the company and Mr. Zuckerberg to heighten the awareness of things that were outside the United States. It is ironic, but perhaps the way of the world, that the so-called third world or the global south, however you want to phrase it, deals with these issues for years before.
Every imaginable aspect and facet and conversation, argument around post-2016 United States of America has been with us and we have been living with it with the same technology for years before. And post-2016, it becomes the height of debate here and controversy, but we have been telling Facebook of the consequences of inattention to detail and c’est la vie.
Kirkpatrick: So since Mark Zuckerberg on this very stage two days after the election said it was a crazy idea that fake news on Facebook could have affected the election, if they had bothered to read your reports, he would not have thought it was a crazy idea?
Hattotuwa: I don’t have the chutzpah to suggest that, David. I don’t think that a small country like ours affects in any degree, shape, or form what happens in the United States. However, the contours of the problem, the inattention to the way that the platform is used, how hate is generated, the vectors of hate, how violent actors, inauthentic voices flourish on the platform, the fact that they didn’t have any kind of human moderation or content moderation capacity or oversight—almost every single problem that at scale became an issue here was in its embryonic form, of course very big for us but we are a small country. It is an inordinately significant problem for us, but we saw it in its embryonic state that then went on to affect things in the United States and elsewhere.
Kirkpatrick: This is a multifaceted issue but before we go off, Sanjana, WhatsApp is also critically intertwined with this issue so could you just talk about that in the Sri Lankan context? How do we understand how WhatsApp is functioning?
Hattotuwa: Well, people are moving out of Facebook and onto WhatsApp.
Kirkpatrick: Owned by Facebook, of course.
Hattotuwa: Yes but they don’t realize that.
Hattotuwa: No, it’s true.
Kirkpatrick: No, I’m just saying people in the room, make sure everyone listening knows that it’s owned.
Hattotuwa: Yes, and this is very much the case in India as well and in many parts of surrounding countries, where for a variety of reasons, it’s instant messaging—Facebook Messenger and WhatsApp in particular—that are taking the conversational domains by storm. Now, that is a problem in a sense for academics because you can’t immediately ascertain what is being communicated on these platforms easily.
It is also a problem for Facebook because, as is the case in Brazil now but also in India, the encryption works for both parties. It works to secure activists against repressive regimes; it also works to secure hurtful, harmful, and hateful conversations from regulatory and governmental oversight. So it’s again a Janus-faced kind of technology platform. But from what we can ascertain from proxy indicators, the sheer volume of conversation and content—as was made clear yesterday by the representative from Jio, the gentleman who talked about India, and that’s just one cellular service—beggars belief. And so a lot of what is happening, and how people see the country, which actually is very important—the highly partisan nature, the fractured nature, the language and the expression—it’s all very, very evident on Facebook. But there’s no easy way to deal with that.
Kirkpatrick: I want to come back to WhatsApp. There’s a lot of things to discuss. I want to go to Brian before we go to Wael because I know Wael has somewhat of a different view, a bigger view maybe in some ways, because he’s not really as much day to day on Facebook as the rest of you. You cover a lot of companies, but you’ve been very negative on Facebook for some basic financial reasons so just talk about your perspective on the company and particularly in the context of its social functionality/harm.
Wieser: Yes. So the core view around my negativity on the stock has been that the gap between what Wall Street expects and what I know is realistic in terms of revenue growth is wide or was wider prior to their reduction of guidance a quarter ago. So I used to be global head of forecasting at Interpublic. There’s like three people alive who have actually tried to understand the Sri Lankan advertising economy from New York or London, let alone the Malaysian newspaper industry. So I have more than a point of view on this stuff. And I think most of Wall Street was not really looking at that.
In addition, costs were going up, there are a lot of things they needed to invest in, margins were going down, and there was a lot of room to run but only so much. So anyway, I got more negative relative to where expectations were and the stock started running up. And I mean, that ultimately led to a sell, it was just a gap between [what] expectations were and where I was.
But at the same time, I remember the first thing I was aware of that was a problem or a hole in the story: the video metrics issue that came up in August of 2016. It was in some ways such a minor thing, but it turned out that they were providing bad data. It happened to be in Facebook’s favor. It was discovered by an agency. And then another error was discovered, and another error—just a coincidence, I’m sure, that all the errors were in Facebook’s favor.
Kirkpatrick: Errors in ad metric management that caused advertisers to pay more than they should.
Wieser: Yes, or misled them into making decisions. And so these were kind of minor, but I remember thinking and telling investors in late 2016, “Well, it’s only a few problems, nothing major; if a few other shoes drop I’d start to get a bit worried.” Obviously with the election, we started to see more of these issues. More problems emerged. And I even found one on my own. In September 2017, I read about an Australian journalist who wrote about the fact that Facebook claimed more 18- to 34-year-olds existed on Facebook than existed in Australia at all. I tried that for the U.S. and it was like, oh my God, the same thing.
Kirkpatrick: They were claiming more viewers than existed in the country.
Wieser: Any one of you connected now, if you have Ads Manager, go into it right now; you’ll see. The error is still there, by the way. And there’s a class action lawsuit. This is actively misleading, by the way, and they say oh, it’s not a big deal; it’s just an estimate. But businesses make decisions based on this data. When I was a banker, years ago, if I had a presentation deck that was intended to induce someone to make a decision, an error like that—even if it wasn’t what they actually had to pay—was a sackable offense at minimum. So the issue came to my attention at the time, and I remember talking to, I think, Brian Stelter—it really catalyzed when he was doing work on CNN on the Russian interference—I remember he asked the question, “Do you think these two things are connected?” And I realized, oh my God, they are, of course. Because sloppiness in one place represents sloppiness in so many others.
Kirkpatrick: So you see a pattern of sloppiness in understanding how the system actually works?
Wieser: Right. And I’ve come to realize the company’s simply badly run. I mean, it’s not the only thing that’s a problem but when Cambridge Analytica then happened, again my joke on that is that the first reaction every marketer had to Cambridge Analytica was “Oh my God, oh my God, I’m deleting my app.” The second reaction was “Why didn’t I get that data from my ad?”
Wieser: But the reality was that the fact that they were sloppy in checking on their data partners was a problem. The fact that they threatened to sue the Guardian was a problem. The fact that they lied to Parliament was a problem. All of these things, I realized, oh my God, this is systemic. And that’s where I wrote this taxonomy of mismanagement; I actually had to categorize like 13 different categories of problems from a managerial perspective. Now let’s be clear, that’s not why I had the sell rating. But all of these issues, I realized, represent incremental risks above and beyond.
Kirkpatrick: Now we have several human rights activists on the panel but was that even one of the categories? I mean, not that you’re opposed to human rights but you’re looking at it just on pure business terms, right?
Wieser: Yes, I mean, I was certainly increasingly aware of it. I wasn’t aware of the scale of Myanmar until I read—certainly the UK parliamentary committee has done some pretty tremendous work.
Kirkpatrick: And the U.N. has done a lot of work, too.
Wieser: True. But I thought that the argument that the UK committee made also in commercial terms, for lack of a better word—it’s sad that this is a good argument but it is—when they say the UK government spends 100 million GBP on resettling refugees every year, Facebook has contributed meaningfully to the circumstances that have caused there to be refugees, therefore they have a financial interest. I mean, you can put this into numbers. But yes, I was aware that this was a problem because if you want to bet Mark Zuckerberg versus the world governments, I’m going to guess until Facebook can acquire a nuclear weapon, world governments probably win and so it does enhance the chances of regulation, it enhances chances—and regulation in ways that investors don’t even understand.
Kirkpatrick: So you think even at $140, you still have a sell on the stock?
Wieser: Yes, I have a $125 price target right now.
Kirkpatrick: Okay. So Wael, there’s a lot to pick up on here. I mean, there are different points of view, but they’re all negative; it’s interesting. I mean, I did curate the panel but look, this is not—Facebook has very few public defenders except for other securities analysts who don’t agree with him and some people in the ad community.
Wieser: They’re looking 12 months out and there’s not huge problems in 12 months.
Kirkpatrick: I can tell you, if I had really tried to get a defender of Facebook on this—
McNamee: The rest of the leading companies in the tech industry are still mostly supportive and the venture capital community is very supportive. And there are a lot of people in government who are still supportive.
Kirkpatrick: But I will say, I don’t think it would have been that easy to get a prominent person to come on this panel and speak on their behalf. And certainly they wouldn’t do it for themselves, which is interesting. But it’s neither here nor there, I guess. It’s just my experience is how few people have been speaking out on their behalf over the last two years. I challenge you—Roger, have there been any?
McNamee: So I’ve had to debate a bunch of them but what I would suggest is that at least in my experience, there’s just a fatalism about it that’s a little bit like climate change, where people may think it’s a problem but they don’t perceive that there’s anything we can do about it. They don’t perceive government will act; they don’t perceive users will act, and I would like to believe that both users and governments are smarter than that and that if we make a good, reasoned case, we can change the calculus.
Kirkpatrick: Well, we do need to get back to specifically what they should do differently. Maybe Wael can start us on thinking what they should or could do differently but any observations you have?
Ghonim: I think one of the advantages I have had is that I sat down with ranking engineers as my job, and I worked on algorithms that distribute content that you would call mobocratic or sensational or—
Kirkpatrick: Mobocratic, that’s a word of yours, yes.
Ghonim: Yes, from the mobs; it just incentivizes mob behavior. So I kind of see the world from that angle and I see how hard it is to actually—I spent a lot of time trying to explain to my colleagues that these algorithms are partisan, that these algorithms are actually sensational. They are editors, but they are just bad editors.
Kirkpatrick: This was when you were at Google, you’re saying.
Ghonim: Well, it was throughout the time after 2011, I just spent a lot of time either working, advising, or working in companies so I don’t want to talk about which company I did that at.
Ghonim: One of the things that strikes me is that there is a lot of desire to push away the power that they have in their hand, that they just don’t want to admit it. And I understand that, I empathize with them, and I think it’s actually very important for us to be able to be useful in this game is to try as much to empathize and understand where they’re coming from because, yes, like everyone in here, I’m very disappointed at what they are doing. I think it is fairly irresponsible. At the same time, I have no idea what they are facing and the kind of struggles or tests that they are going through and one of the things I learned in 2011 is I had such a naïve perspective on what is the size of the problem that the people in power deal with and my attitude toward them was more counterproductive than productive.
Not that I’m trying to make them less accountable—they are extremely accountable—and actually I think I agree with Roger about what they are doing right now. I always give the example of the gift shop in Egypt that, before the Yelp era, used to fool every single tourist because it wanted to maximize the money out of the tourist. The tourist is never going to come back; the fact is, there was no retention in this business in the pre-Yelp era.
But in the post-Yelp era, they now understand that tourists have a tool in their hand where they could go in and see a hundred reviews and the place got one star for deceiving or cheating on people to make money. And I think the real problem we have right now is that we have decentralized publishing or it got decentralized because of the new powers and we failed at creating the right checks and balances. We, here, means the platform before anyone else because it is in the interest of the platform to create those checks and balances so that they continue to thrive for the long term. Otherwise, we would then advocate for tabloid journalism and say it has far more value creation than traditional journalism that is more authentic and respectful.
So I think initially they made the choice based on ignorance and I think that ignorance was very justified at the time because what did we know? We all—you know, there are so many things I said in 2011 that I’m very embarrassed every time I watch them. I’m like, oh my God, I was like completely naïve about this. And I should give them the benefit of the doubt because also there’s something about these companies, people assume that since they have the data, they are looking at it but that’s not true. They’re looking at completely—like, everyday life there is not to do qualitative experiments and look at what is happening and how the users’ lives are changing. They are just monitoring retention, time on site, all these metrics.
Roger, you talked about how that just kind of restricts your view of what is important. But the reality would be different if they had enough talent to help them measure long-term impact that has to do with trust, for example. I think the banking industry has lost a lot of ground because of the collapse of trust between it and the user. And I think the tech industry is actually on the verge of doing that right now. We are losing a lot of trust, and I look at the elections and the results of the elections and I’m like, this actually makes sense; that’s exactly what should happen. People would go after populists because the core algorithm that drives how people retrieve information is based on populism, so the world is going to vote for populists. Otherwise, that algorithm wouldn’t have been perceived as a good algorithm by the majority of the people, and people would not have voted for a populist. But the populist came in to exploit a system that is already working, and people are already happily exploited.
Kirkpatrick: Can we quickly just explain that in a little more detail? Maybe Roger wants to do it?
McNamee: I really strongly support what Wael just said because pushing back on you, I think the issue is not that they’re badly managed, it’s that they pick the wrong things to optimize. So Facebook is one of the most focused companies I’ve ever been around and the notion that at 20 years old, they’ve figured out how to apply graph theory to network design was one of the most astonishing intellectual breakthroughs Silicon Valley’s ever seen.
And they were 20 years old when they came up with it. But they’ve obviously taken this model that has been popularized by some venture capitalists of you pick a few metrics and you focus exclusively on those metrics and the ones that they chose allowed them to grow faster than any company before them and they achieved enormous success and after five years of everything going right, they were convinced that they were King Midas and everything that they thought about and did had to be right.
So they were blind to feedback. When I went to them in October of 2016 with my fears about the election and what was going on, they just couldn’t imagine it. And so I think failure of imagination is a better way of describing the problem than bad management, because right now you can’t convince them that the things you’re talking about are important enough to manage to, and that’s part of the problem. And to Wael’s key point here, and really to Sanjana’s as well: at their scale—3 billion people globally touching one of their properties every single day—they are the public square. And there’s enormous responsibility that comes with that. It requires that you be fluent in the languages you’re operating in, fluent in the cultures you’re operating in, because you’re like a dinosaur or Godzilla that turns around in a city and its tail inadvertently knocks a building over, right? I mean, the unintended consequences are staggering.
And so it comes to the business model. I actually believe that they can change the business model. I think the utility functions of these products are so valuable that changing the way people pay for it is actually a smaller problem. This happened with television. We went from free TV to paid cable in a very smooth transition in less than a generation and there’s literally no reason we can’t do that here. Plus Facebook has a lot of other things like Marketplace that they could be monetizing.
Kirkpatrick: So people should be paying for social media, for Facebook?
McNamee: I’m saying for their own good, we need to have a different business model, of which that is a choice, okay?
Kirkpatrick: For the company’s own good or society’s own good?
McNamee: Well, especially society, but I think the company as well.
McNamee: Look, nobody there set out to harm the world and the problem they have is, even now, you sit there—9,000 people confirmed dead in Myanmar, 42,000 missing and presumed dead. That’s roughly the same size as the U.S. deaths in the Vietnam War, but it’s six months, not ten years. And yet nobody at Facebook felt badly enough about that to leak anything. I mean, think about that for a minute. Right? This is on their head. And yet they culturally still believe in what they’re doing so intensely, they have not been able to accept that they’re responsible for things like that.
Ghonim: Can I just make a quick addition to this point? Actually, I think one of the problems of software in general is that you can build it and deliver it and people use it without you having to see them or even know that they exist. And for you, they’re just numbers at the end of the day. And that is one of the things that we could change—that’s why I think engaging with Facebook in a dialogue matters, even as it’s so unfortunate that they didn’t send someone here. Again, honestly, it’s very irresponsible not to engage with more people and be more public about it, because it is true that people are being tough on them right now, but at the same time people are willing to change their positions.
A lot of reasonable people want to see them do better, because we could all create a happy story. So, one way to talk about what we should do: I had a conversation the other day with someone who’s prominent in tech and I asked, why aren’t there psychologists, artists, sociologists, philosophers in the product development cycle? Why is it that someone wakes up in the morning and decides to build this feature to mark yourself as safe, which is a great feature I like a lot, especially in natural disasters. But when you send 5 million notifications to people who live in London because 20 people died and get them to proactively publish to 100 million of their friends, there is a side effect to that, which is that if I’m part of the ISIS strategy team, I’d be really happy that this feature exists, because that is exactly what they’re looking for. That’s why they kill people in the Underground. They pick the most public places; they want to create the most damage to frighten people.
Kirkpatrick: Because they want fear.
Ghonim: Yes. And actually, in this case, I understand, maybe there’s something I’m personally missing about how important this feature is in such events. But I wonder whether a dialogue happened, whether that dialogue was critical and came early enough in the conversation, and whether that risk was assessed—and I highly doubt it.
Kirkpatrick: I want to hear Sanjana’s prescription for what should happen, if possible. But I want to just clarify a couple of quick things to just make sure it’s on the table for the audience and the video because Roger’s talked about things they did. They prioritized growth pretty much as their primary goal for almost their entire history, probably to this minute, and they did not prioritize governance. And when you talk about the language and knowledge that they lacked, the cultural knowledge they lacked, this was extreme.
Because they really pushed into literally every country on the planet except North Korea—and China tried to keep them out—but they did not have local speakers. They did not have any knowledge, or any interest in knowledge, of the local culture, and the system was abused in many of those places. And in an article in our magazine that we distribute here, I pointed out that in Myanmar—just to name one country, and this is true in many—they didn’t even have their rules translated into Burmese until 2015. So in many of these countries there were literally no rules. So how can you blame people for misusing the system? And I also wanted to just clarify Wael’s point on the populism. What you’re really saying is that because the system algorithmically prioritizes fear and anger—because attention is the goal, in order to show ads next to the attention that people have given to whatever content—that is intrinsically beneficial to somebody with a populist message. Am I right on that?
Ghonim: Yes, not only populist but actually on the right because the way the left uses negative messaging is mainly through shaming. The way the right uses negative messaging is through fear and fear is far more effective as a tool for engaging people than shame.
Kirkpatrick: Okay, so Sanjana, I know that you still really, really do hope for Facebook to succeed and to solve its problems. What would you like to see happen next?
Hattotuwa: Well, three very quick points. I know Burma is an emotive issue and everybody latches onto Myanmar, but genocide is not monocausal. And just to place on the record for the consideration of this forum: we're talking about Facebook, but Facebook didn't cause the genocide. There are complex social, political, communal dynamics that have played out for decades in these contexts, within which Facebook is sometimes an accelerant of the worst of who we can be instead of giving voice to and strengthening the best of who we should be. And so it's unfair, I think, in a way, to project onto Facebook per se everything that's going wrong in the world. These are complex, complicated problems.
Kirkpatrick: I hope nobody here is saying everything going wrong in the world is caused by Facebook.
Hattotuwa: I mean, there is that dominant suggestion that Facebook created the genocide in Burma and we need to push—
McNamee: Well, that would not have been my position. My point here was simply that you went into a country that had no media, no telecom infrastructure, and you took them from nothing straight to Facebook. They have no antibodies against fake news and they believed in authority and authoritative people used Facebook.
Hattotuwa: On that, we are on the same page.
McNamee: And the point was Facebook’s role in this [was] essential.
Hattotuwa: And the point, David, in Sri Lanka, as you well know, is that until March 2018, they didn't have proper community standards translated into my language.
Kirkpatrick: Until this year?
Hattotuwa: We did it for Facebook.
Hattotuwa: So, you know, let’s start from there. The point is also, I mean, to compliment Wael, that there are people in the company who are deeply and acutely sensitive about these issues and I work with them as a consequence of what happened in Sri Lanka and the non-recurrence of it. I am willing to be convinced of their ability to convince management around course correction. That is not something as an outsider I have any indication of, all I have is the bona fides of the people within Facebook who are looking at these issues very, very seriously and they are quite concerned about it.
Ghonim: On a positive note: in the past, things have always worked well for them, so they never got their moment of what I call corporate depression, when they just fall down and start realizing, oh my God, we're doing something wrong, and maybe at that point they correct their path. I feel they are realizing that now. But a lot of people like you guys, you were close to them; there is something they lost by losing you, and they know that.
Hattotuwa: Well, they never had us to lose us.
Ghonim: Well, I mean, I would say for me, they had me because I was a believer, I was someone that believed that this is a liberating tool.
McNamee: The three of us were definitely—
Kirkpatrick: You were a believer, you set up a media page in 2007, right?
Hattotuwa: Yes, I was the first one to set up, as far as I know, a Facebook page for media in South Asia and certainly in Sri Lanka. And I have a screenshot of it, which is now of some archival value, of what Facebook looked like 11, 12 years ago. I knew what potential it had in a repressive, authoritarian country with a democratic deficit to be the vehicle or the platform for what I thought, naively at the time, would be prosocial, democratic conversations that opened up a space for debate that simply wasn't there in the commons or in the mainstream media. But lo and behold, it's now become a kind of hydra-headed beast that activists have to manage, can't do without, but very often get up and wonder what on earth I'm going to do with—
Kirkpatrick: And to give them credit, they are spending a lot of money, which is why their results are worse at the moment. They’re putting tons of people onto it—maybe not tons enough—but they believe and Zuckerberg personally believes that AI will be almost a magic bullet and he’s investing more in AI that will be directed towards hate speech and politically subversive speech and terrorist content. So it’s not like they’re doing nothing.
McNamee: But you have to be careful to recognize what they’re actually doing. So it appears to me that they’ve made a list of all the things that went wrong in 2016 and they’re just running down the list in series. They’re not making any effort to anticipate the issues with artificial intelligence; they’re not making any effort to get out in front of the problems at WhatsApp or Instagram, you know, because those weren’t issues creating political crisis in 2016.
Kirkpatrick: What I would say is we don’t know they are because they refuse to have a dialogue with people like us in anything except a private context where you have to sign an NDA.
McNamee: We know that they aren’t because they’re very, very, very good at publicizing every good move that they make. So we know that if they were making substantive changes to WhatsApp or Instagram that were likely to reduce this damage, we would know about it. What we know instead is that the founders and most of the senior executives at those places have left recently for one of two reasons. Either they’re uncomfortable with the future direction of the business or they think there are land mines in the history that are going to go public that they don’t want to have to deal with.
Kirkpatrick: I still want Sanjana to say what they should do, but I want to make one more point, you know, since I wrote a book about it. I'm the moderator, but let me just say one thing. You know, WhatsApp has proven to be an extremely toxic tool in elections when authoritarians are seeking power. That is a known fact.
And in India, because it has had such a negative impact, they have reduced the number of people to whom a WhatsApp message can be forwarded to five, down from what I think is normally something like 200. But in Brazil, where we just had an election that led to a neo-fascist being elected as president, who's openly hateful towards many minorities, gays, et cetera, they did nothing that we know of. And I'm sure, as Roger says, we would know of it. They knew they could have restricted the number of times a WhatsApp message can be forwarded; they chose not to do that. And that's something that can be done regardless of whether you know the content of the message.
I heard an interview on NPR where a Brazilian political observer said pretty much all the political messages they saw on WhatsApp during the election were either fake or incendiary, almost literally all. And Facebook as far as we know did nothing to stop that. So this is still the context that we’re fighting against. Sanjana, I know you still want them to be a successful medium in Sri Lanka. What should they do?
Hattotuwa: I don’t have the ability to speak on their behalf, David, because I’m a small man from a small country.
Hattotuwa: And we try to deal with our problems as best we can. Let me give you a very short example. In the space of one week, through my doctoral research, just by analyzing data on the conversational domains on Facebook and Twitter, but primarily Facebook, as that's the driver of everything good and great and bad in my country, I saw that there was a point in time when something needed to happen politically in the real world, because there was nothing happening on Facebook.
And you were able to kind of determine and say that something’s going to happen as a consequence of what you didn’t see on Facebook. And as it turned out, though I didn’t take credit for it, I mean certainly I wasn’t the architect of it, something big did happen. And then Facebook kicked in and the propaganda machine kicked in as well. So they’re able to almost divine in a sense what’s going to happen as a consequence of looking at the data purposefully, with intent.
I think they’re sitting on a lot of data that can be good for the world and good for societies but I think they need guidance on how they can look at it better. And from a very small perspective, they need to work with civil society, who understand the context of the cultures and the communities and the countries that they’re working and operating in and then support and strengthen their capacity to help Facebook deal with the problems at scale.
And that involves learning, scaling up, retooling, and then spreading that globally. So even though we are a small country, perhaps we punch above our geographic size in helping the company determine and ascertain what Roger said could be the problems of the future, because we were the problems of the future in 2013 and 2014. They just didn't have the capacity to listen. So I hope that senior management is more responsive and attentive, meaningfully so, in the future.
Kirkpatrick: Hold that thought for 10 seconds. We did say we would end at 12:30, if anybody has to leave, please do. We’re not going to go too much longer but we are going to continue for a little bit longer. Roger.
McNamee: The point these guys are making is about something deeply baked into the culture. One of the reasons Facebook was so successful (and by the way, a point I do need to make: we could just as easily be having this conversation about Google; what's going on with YouTube and what's going on with search results are enormously problematic) is that the culture was all about eliminating friction. Everything was done in order to grow maximally fast to maximum scale. You eliminate all forms of friction.
And if you think about it, regulation is a form of friction, as is dealing with problems of any kind. So they tended to take the MacArthur strategy of the Pacific: they would isolate problems on a little island and then skip past them. So when they got the consent decree, they put a thing in place that provided legal cover without requiring them to actually honor the consent decree. When they get situations around the world, they do the same thing; they play whack-a-mole on old problems, treating them as though they're isolated when in fact they are systemic.
Kirkpatrick: Right. They’ve done that with Myanmar tremendously.
McNamee: They’ve done it with literally everything and I understand how they go there. And in the early days, when we were on the good side of this, that lack of friction seemed like a good idea. But if you sit there and think about it, friction is really important in life. It’s the thing that gives you a chance to think about stuff and make choices as to whether something’s a good or bad idea. I mean, things as basic as procreation are entirely dependent on friction.
McNamee: So I would simply point out that the elimination of friction from life, which has been a design goal of these internet platforms, in fact runs contrary, I think, to what common sense tells you is the best way to approach the problems that they're solving.
Kirkpatrick: That’s a much more systemic critique than anything we’ve said thus far.
McNamee: Well, then I’m glad I’m here.
Kirkpatrick: Well, it applies to far more companies than just Facebook, clearly.
McNamee: And I would argue that Facebook, Google, Amazon are really good places to start, Facebook and Google because of the information thing and Amazon for other reasons but yes, I think we start there. If we could do something about those three, that will buy us time to create alternative visions.
Wieser: Well, I can give you what passes for a prescription from my vantage point. I would only push back a bit on that (and I've got a hold rating on Google). But keep in mind that as a securities analyst, I can only observe as an outsider; I can't know things that are happening on the inside. We are not allowed to, and I stay away from asking people I've known who work there, how are things? How's this, how's that?
So the only things that I can really respond to are what their customers say (and their customers are not consumers, remember, they're advertisers) and what their suppliers say, but I can't go into the company. I observe that Google seems to be more buttoned down. If there are ads running against beheading videos, I'm guessing they know about it. But they're not taking those ads down for a reason. And I think DMCA is usually the reason, or was the reason.
But I get the sense, and I may be wrong in this, that they're just more buttoned down on all these little things, what I call the sloppiness of Facebook. Facebook is sloppy. I see it frequently. So from that point, and I try not to advocate, for lack of a better word, I try to just observe, I could ask: if they did certain things, would they have positive or negative effects? So if they did something like get rid of Free Basics, it would be unfortunate for internet access in many countries. But if you argue that the first rule of thumb is do no harm, maybe that's the right answer. Take WhatsApp. If they jettisoned WhatsApp today, there'd be no difference in the revenue or cost structure of the business. I'm not saying they should do it—
Kirkpatrick: Because there’s really no revenue coming from it at the moment.
Wieser: No. Maybe in Brazil there is some—which, by the way, Brazil being such an important market for Facebook might be why they haven’t done anything—but as far as we know, there’s no tangible revenue. That’s not to say it’s not important strategically.
Kirkpatrick: But sometimes they’ve actually done that just during an election season, by the way, just reduced the rebroadcasting of WhatsApp.
Wieser: Sure, but the other thing is, they should start by acknowledging that AI will not solve their problems. There's a basic concept of know your customer that exists in my business and in many other people's businesses; how on earth is it that they think they cannot know who their customers are? Remember, advertisers. There's no way a TV station would have sold ads in rubles to Russian operatives, because they have to know their customer. Automating that process is a problem.
Kirkpatrick: So that’s another example of sloppiness, the degree to which they’ve automated their interaction with their advertising customers.
Wieser: And everything. Too much automation and dependence on automation.
Kirkpatrick: Which they’re proud of, in fact. I want to hear from the audience, questions and comments. Can we get the house lights up and—oh boy, there’s a very fast hand. Okay, let’s get this person here. Okay and then we’ll go over here. Yes, okay, good lights, okay, please.
Kocherlakota: Yes, my name is Swamy Kocherlakota, CIO for S&P Global. Great conversation. I have two questions. We talked about India, Sri Lanka, and other countries. What is the role of fake news in China? That’s question number one and number two, what do you see as the role of the government and regulation to solve this issue?
Kirkpatrick: Right, that’s a great thing we were in need of getting to. China, anybody have any insight on that?
Wieser: It’s not relevant to Facebook, I mean.
Kirkpatrick: Yes. Facebook’s not really functioning there.
Wieser: They take $5 billion to $7 billion of advertising dollars out of China that go into other countries, by the way, but that’s another story.
Kirkpatrick: The thing in China is that the government essentially runs the communications systems so the fake news that’s there would be chosen by the government.
Ghonim: I can just say one thing I thought about here. When it comes to fake news, I think what people need to look for is actually emotional exploitation. For example, I'm always aware, when I watch news videos, of the dramatized music in the background. The whole purpose of that dramatized music is to emotionally exploit you; it's to drive a certain type of fear or hope or whatever it is that allows the message to get through. We call it engagement, but in reality it's manipulation, because you want the person to be in a certain emotional state so that they just take whatever you tell them. And I feel like a lot of the time we focus too narrowly on what is fake news.
So I don't know the state of fake news in China, but it's not hard for me to guess that the same tools of emotional manipulation are being used, because it is already in the business model. That's why it's not a coincidence that Trump won the election and Obama won his elections; and Hillary Clinton, you know, she could have won, but it does make sense that she lost the election. It's just this idea that the best way to move forward is to know how to emotionally exploit people and strip them of their agency to decide based on the facts. And I think this is actually a great problem right now, because people like us should be fighting that in our own camp before the other camp.
I look at a lot of politicians on the left represented around the world and they are just as much emotional exploiters as the others. Their message is of a different essence. It’s a better message, like it actually gets us in a better place, but they are training people to be emotionally exploited. So in that sense, I am actually hopeful because I think people are discovering and understanding that more and more and they will build some sort of mechanism to defend themselves from that. But at the same time, I wouldn’t look at it like China’s doing better than us. I’m not sure.
McNamee: Yes, we’re running a massive evolutionary experiment. And the question is can humanity adapt well enough, quickly enough, to protect the institutions that are there to support the less well advantaged?
Hattotuwa: So I have a brief note from the academic side, because geopolitically, Sri Lanka is under the shadow of China, as are many other countries, including on the African continent. And the Chinese model, according to the academic literature, is the three Fs: fear, friction, and flooding. Fear is when you self-censor around what could happen to you, based on the consequences, often very violent, of things that happened to somebody you know or heard about. So you tone down your content output as well.
Friction is making it hard to get onto critical sites. I think the statistical data and analysis suggest that even a small amount of additional time spent getting to a site reduces interaction with that site quite dramatically. And flooding is when you flood the conversational domain with whatever you want to flood it with, cute kittens, for example, that take public attention away from the more critical issues. So in combination, that's actually a very effective digital propaganda model, that kind of control. So there's an interesting monopoly or duopoly, or maybe a tightly controlled fake news market, I believe, from the stuff that I've read, in China, which is atypical. But also, I suppose, extraneous to what we've been talking about on this panel.
Kirkpatrick: Okay, we shouldn’t let this government question go, although I want to hear from other people. Is there anyone up here who doesn’t think that there’s a strong likelihood that government is going to start regulating Facebook in particular and internet companies generally more stringently in many, many countries? Does anyone disagree with that? I mean, I think that’s just happening because of the things we’re talking about here—
Wieser: The question is how meaningful it is, right? So if you end up with laws that say you can no longer target fewer than 10,000 people with an ad, for example, that's huge.
Kirkpatrick: Is that a possibility?
Wieser: I’ve heard it floated in the UK in a political advertising context.
Wieser: You couldn’t do that in the United States.
Kirkpatrick: Political advertising.
Wieser: Yes, but I mean, just think about that. If small business is really driving most of the spending growth, imagine that applied to all addressable advertising; it would be pretty impactful if something like that came to pass. Custom Audiences is an example of a product that is, my guess, a third of their advertising. There is no way that most advertisers have appropriate consent to use that product. But imagine if that—
Kirkpatrick: Because the data’s coming from outside Facebook as well as inside?
Wieser: It's matching outside data with inside data. So restrictions like that would be more meaningful than GDPR. The European data protection law, in its original intention, if that is how the law plays out, will probably be damaging, but it might be damaging to everybody. And the least bad alternative wins, so they could still be the least bad alternative.
Hattotuwa: David, really quickly, I mean, I have a concern around government regulation from my context. I know that's a Damoclean sword hanging over the companies in the domestic context here. But outside a liberal context where rights are sacrosanct and can't be taken for granted, I think what you get is turnkey legislation, where governments say they're bringing it in to hold social media companies to account, but overnight these laws turn into beasts and monstrosities that clamp down on democratic dissent.
Ghonim: I agree with that, but they are doing that regardless. Like in Egypt right now, there are laws that will put you in jail for writing a post if you happen to have more than 5,000 followers and you insult the judiciary. One of my friends disappeared 45 days ago; we don't know where he is now, and there's a good chance that he was killed. And he was an activist who wrote online.
So I actually would rather have governments in the West regulate and figure out what's the best way to regulate, because at the end of the day, the absence of checks and balances has made the interests completely misaligned. As a company, I make a huge amount of money by milking the cow, and you are the cow in this case, and you're just going to eventually become an addicted person who loses a lot of your well-being. And I think that regulation needs to happen.
The one thing, however, that I think is very critical for regulation is having certain angles. One angle I think about is transparency, especially when it comes to personalized chat platforms like WhatsApp. Don't these companies owe it to society to disclose exactly what kind of links are being distributed on these networks? Without exposing people, just saying, these are the links. Because right now, a lot of us are blind because of personalization. Your feed is different from mine. Our perceptions of reality are not aligned, and there needs to be a collective way to see that. That's one thing. The other thing, which is an idea I was thinking about: why does the CIA disclose cases after 20 years? It's to create checks and balances for CIA executives.
Why aren't we doing the same? I know these companies argue a lot about IP, but a lot of their A/B tests are pretty important for the public to know about. When Facebook made the one mistake of running the emotion experiment and making it public, we were all furious, and a lot of people—but that happens every day. This is actually their everyday business. They're just not running it as studies; they're optimizing for growth, and the public should know. Maybe it should know after one year or two years or three years. But we should eventually get exposure to all that code, those experiments, all that data, and the public should be in a position to judge them. And that will create the right checks and balances.
Kirkpatrick: Okay, we're going to have to wrap fairly soon, but I did say I'd get to you. So, quickly, and then we'll go to one or two more. Maybe we'll go to a few questions all at once and then wrap up. Go ahead.
Sherman: Strat Sherman. I’d like to bring this down to a level of simplicity. I think the first point made by Roger is absolutely correct, that the big fix here is to replace a surveillance advertising business model with one in which users pay for the service that Facebook and Google and Twitter and others are providing. Do that and you eliminate most of the incentives for most of what’s gone wrong.
The second point, which I haven't heard any mention of at all, though, is that in the case of Facebook, the entire company is in the hands of one person, Mark Zuckerberg, who is not answerable to a board of directors because he personally and individually controls the board of directors, which means that he has the autonomous power, by himself, to fix the problem. He could do that in a moment, and there would be a blizzard of shareholder lawsuits, no doubt, but as one of the world's richest men, he could afford them. Similarly with Google: Larry and Sergey are in exactly the same position vis-à-vis the Google board of directors. They control it; they have autonomous power.
Kirkpatrick: We don’t really have time to talk about the board. Unfortunately, we really do have to wrap but I would like to invite any one of the four of you to make any comments that you didn’t get a chance to make that you want to conclude with because, I mean, this is a conversation that has to continue. I really do strongly wish that the company would engage with people like us and the many other critics they have—there are so many, it’s almost uncountable. But there’s a lot of goodwill to help them if they would just engage in dialogue.
Wieser: I would only say this, I don’t actually consider them as not engaging with me any differently pre/post selloff, for whatever that’s worth. But again, that’s more of a reference to how they’re dealing with investor relations. They’re far better than Google, who I think just hates all analysts.
Kirkpatrick: Well, I do want to thank all four of you. Thank you for traveling so far, Sanjana. Thank you for a great discussion. And thank you so much. And we’ll end the conference momentarily.