Description: Technology could help solve problems that have plagued humankind for millennia. “If we really collaborated together we could do amazing things,” says Justin Rosenstein, a longtime innovator. We need an ethical and global mindset. The alternative is a world run amok.

The following transcript has been lightly edited and condensed for ease of reading.


Speaker: Justin Rosenstein, Asana

Interviewer: Josh Constine, TechCrunch

Introduction: David Kirkpatrick, Techonomy

(Transcription by RA Fisher Ink)

Kirkpatrick: Justin Rosenstein, who is coming on next, is a cofounder of a company called Asana who he cofounded with Dustin Moskovitz, who was a cofounder of Facebook. But the reason he is here is because he is a really deep thinker about what technology can and should do. And having a conversation with him about that is a wonderful journalist named Josh Constine from TechCrunch. Please, take it away Josh and Justin.

Constine: So we spend a lot of time on social media liking things, but we don’t actually always like social media. It’s become this fascinating paradox: something that was supposed to give us validation about who we are or what we’re doing has actually become this sort of bottomless well, this hole that we can fall into. And a very recent study out of the University of Pennsylvania, soon to be published, shows that a reduction in social media use can actually have a causal effect, leading to less depression and less loneliness.

And why this is so important right now is that we have one of the inventors of the Like button here. Justin, you were one of the Like button creators at Facebook. Tell us the story of how you were inspired to build it, but also how the outcome differed from its intentions.

Rosenstein: I’ve always been a believer that if you do technology right, it can be this huge boon to civilization, it can create really positive effects. And so the Like button, which we originally called the Awesome button, came out of some conversations between me and Leah Pearlman at Facebook. We were asking ourselves, what if we could make it one-click easy to inject little bits of positivity into the world, into the social network. We were also looking at newsfeed and asking, how can we make newsfeed this kind of online democracy that surfaces to people the most important information they could see at any given time? And we felt like, oh, this will allow anyone on the internet to be able to—a jury of your peers will decide what are the things that are most worth your attention.

And in some ways, I think that it’s been beautiful to see when that has worked out. Something like the Me Too movement obviously predates social media, and yet seeing how social media was able to spread this really important idea to millions of people in a matter of weeks was incredibly inspiring. At the same time, there have been all of these unintended consequences, and to see the ways that social media has contributed to addiction, to distraction, to polarization, to people defining their self-worth in terms of the number of Likes they’re getting, those have been much more disturbing. And as we’ve been paying attention, we’ve seen that even with the best intentions, the things we build have unintended consequences.

Constine: So since leaving Facebook, you’ve been contributing to this conversation about businesses needing to prioritize society’s well-being as well as their own. How did you develop this philosophy throughout your lifetime?

Rosenstein: Actually, I just care deeply about the world. Every day that I wake up, I am kind of just amazed that we’re alive. Amazed that there’s a universe. I’m amazed that we are these sentient beings that are the result of billions of years of evolution, that we have control over our destinies, and that we’ve inherited this incredibly beautiful earth. And it’s an incredible time to have been born into this reality, where on the one hand we’re finally developing the technological capabilities, the coordination capabilities, to be able to create—certainly not a utopia, I don’t think anything like that exists, but a protopia, a world where things get better and better every year. You know, we have enough food to feed everyone; if we got our act together, we could actually distribute it to all the right people. And yet at the same time, there are so many ways in which it seems like civilization is heading in dystopian directions. We were having this conversation just outside right now: it’s hard to breathe the air because of disasters that are natural but pretty clearly exacerbated by the climate crisis that humanity has been creating.

And so I think that when I look at civilization as a whole, and at the technology industry in particular, I see so much promise and so much potential, and I’m so sad at some of the particular things that we’re doing today. And my core thesis is that today we look at organizations almost like sports teams, where everyone is vying to get the highest score, make the most money, have the highest valuation. If instead of seeing ourselves in competition with each other, we see our organizations as part of one team, as part of one project that has a single shared common goal to create a thriving, sustainable world for all people—if we could get our act together and collaborate at that kind of global scale—I think we could take the right fork in the road between this dystopia and this protopia.

Constine: I think it’s a really fascinating analogy, because with sports teams, even if you lose over and over again you keep rooting for your team; it’s really hard to actually change your perception of what you should be rooting for. And I think that’s a dangerous prospect when it’s not just watching somebody else play a sport but something that really affects our day-to-day lives. We often hear a lot about the issues, but I want to hear a little more specifically: what do you see as the most dire risks and consequences of this approach to building without caution or foresight?

Rosenstein: Yes, in some ways I think that the problems we’re seeing with addiction to mobile phones and problems with social media are almost blessings in disguise, in the sense that they are a wake-up call, a wake-up call that we desperately need to answer now, to figure out how to handle the way we deal with technology before we make too much progress on things like artificial intelligence, virtual reality, biotech, nanotech, and 3-D printing. All of these are things where, if we don’t do them very carefully, we may be inadvertently consumerizing weapons of mass destruction; the kinds of things people could do with biotech and 3-D printers are staggering to think about. And so as long as we continue the rivalrous dynamics we’re in, where companies are in competition to see who can develop gene editing the fastest, or America and China are in competition to see who can develop artificial intelligence the fastest, that’s going to lead us not to take the time we need to be mindful and do these things the right way. These are things that could be huge boons to civilization but, if not done carefully, could lead to our destruction.

Constine: I think we’re in the middle of this democratization of violence, and of a lot of the problems that we see with technology. Every technological field seems to go through this reckoning moment. We had it in chemistry with the invention of TNT, and eventually in physics with the nuclear bomb, when scientists suddenly realized that the work they were doing, which they thought might be really positive, could actually have such negative effects. But what does that mean for what we do next? I really want to hear: how do we actually change some of these issues, and what else do you see beyond these social issues? How do we get away from that growth-at-all-costs mentality? How do we make sure that we’re actually thinking about the humanitarian outcomes, and how do we realign our incentives toward a more holistic view of progress?

Rosenstein: Yes, we need a radical shift in our basic definition of what it means to succeed as a technologist, as a company. Because right now it is: try to get the highest score, try to get the most attention from people, try to extract the most money out of people. And we can see how, of course, if that’s what people are optimizing for, that’s going to lead to more and more destruction.

But the fact that we are operating within this existing economic order is very challenging. I mean, isn’t it crazy how often succeeding economically flies in the face of doing the right thing? A great example of this is Facebook. There’s a lot of criticism of Facebook that’s well deserved, but one thing I was inspired by last year was when they said, okay, we recognize a lot of the problems with polarization and with distraction, so we’re going to make changes. We’re going to do the right thing that benefits users, and they forecast that it would mean a hit to their bottom line, because the things they were doing before were more in service of making profit. Then a few months later they did an earnings report, and profit had indeed gone down in the way they had suggested. And in a lot of cases, people in our community, in the larger technology community, the very same people who had been criticizing them and telling them, “You need to change your ways,” were the same people criticizing them with, “Ha, ha, now your stock price has fallen.”

Constine: It seems like that’s actually part of this problem with the media; it falls into the same trap that affects Facebook, which is that the things that are the most engaging are often not the best for humanity. And so yes, you could say, “Actually, Facebook did something right here,” but the headlines of, “Oh, Facebook stock went down 20%. Oh, they’re falling apart,” those just get more clicks. And meanwhile, on Facebook, it’s those sensationalized tropes, those often fake news bits, those clickbait pieces that get so much more attention. So it seems like both sides are falling into that problem.

Rosenstein: Absolutely. And a lot of the problems that people attribute to technology are actually problems of the media. Because whether it’s social media or traditional media, the business model is still that attracting as many eyeballs as possible is how you make more revenue. And so media also has to have a new definition of success.

I think when you’re operating in the existing economic order in which we find ourselves, it has all these flaws built in, and I think we should be looking really squarely at the axioms of our economic order and questioning those. But in the meantime, there are a lot of important projects that need to happen, and often the best way to achieve those projects is by creating for-profit businesses.

That’s certainly the way we see Asana: we have a mission where we want to enable people to work together more easily, and we think the best way to achieve that is not treating profit as the end in itself but treating money more like rocket fuel. You don’t want to maximize rocket fuel when you’re building something to go to the moon; you use it as a means to an end. So we want to use money as a means toward the end of being able to build this large business. And when we measure success, we’re more interested in things like the fact that customers report they’re 45% faster in achieving their goals when they use Asana, or that more than 80% of customers say they have more clarity or accountability or transparency. Those are the metrics we look at to see, okay, we’re succeeding in our mission.

Constine: I think that ties into an issue we’ve heard a little about from Alexis Ohanian, the cofounder of Reddit: this idea of hustle porn, where basically in Silicon Valley everyone wants to be crushing it at all times, and if you are not crushing it and every metric is not up and to the right at all times, everyone thinks you’re going directly into a death spiral. That not only pushes people to grow as fast as they can without thinking about the consequences, but it also leads to this issue of mental health among founders, where people aren’t willing to actually talk about their problems because nobody wants to be seen as weak. And so when there’s that personal affliction as well, where it’s not just about society but it can really hurt the founders too, how do we change that culture of crushing it?

Rosenstein: Yes, a lot of it is just about changing the basis from which we derive self-worth: ideally making it independent of our professions altogether, but at a minimum moving from “I’m getting those headlines and seeing those big valuations” to “how can I be a force for good in the world, and how can I have as much positive effect as possible?” And redirecting toward seeing the things we’re building as requiring deep, philosophical thinking.

Constine: So many are calling for action from tech leaders to improve society or prioritize time well spent. When I’ve asked these leaders, sometimes they just don’t actually suggest any real solutions; they give me broad platitudes or very unnuanced positions like, “Facebook should just shut down everything until it can have 100% certainty that no link is fake news,” and that to me seems almost totalitarian in a way. It also completely ignores the nuance that there are good points and bad points in these tech platforms. There are positive repercussions as well as negative ones, and if we just throw the baby out with the bathwater, I don’t think that’s the best way to move forward. Yet, oftentimes, the best way for these figures to gain attention is just to say, “It’s all bad, it all needs to be brought down. Just burn the tech industry to the ground.” And that doesn’t seem to work. So I’d love to hear from you, since you’ve worked inside these companies, inside Google, inside Facebook: what would you specifically like to see tech giants and the industry at large do to create products that don’t jeopardize our humanity?

Rosenstein: Yes, you’re pointing out this meta-problem: the tools that we’re using to have conversations don’t encourage nuance, and we need nuance in order to improve the tools about which we have these conversations. I’ll give some concrete suggestions.

One is we need to be extremely mindful when we’re developing things. I think in general, people in our industry tend to think that if it’s new, it must be better. That’s just not true. We need to think ahead, and sometimes we can do that ourselves, but sometimes it’s going to require more. I think all these big tech companies should be hiring psychologists, sociologists, even philosophers and ethicists, to think deeply about these things.

Constine: I hope that includes pessimists too, and not just ethicists. We need skeptics, people who, when they see something, don’t just say, “Oh, imagine all the money we’re going to make,” or, “Imagine all the ways we’re going to bring people together,” but rather, “I know that humans are not always good-hearted. I know humans are sometimes greedy, we’re sometimes selfish, and we sometimes do things that aren’t to the benefit of the rest of society.” And let’s predict those ahead of time and get out in front of those problems.

Rosenstein: Exactly. So part of it is trying to predict. But then there’s monitoring, because no matter how smart you are, it’s going to be really hard to predict all the consequences of your creations. So monitoring and seeing, okay, is the effect the way that we want? And if you notice that something like the Like button is creating polarization, you start to ask—and this is a metaphor—but maybe instead of a Like button we need a This Changed My Mind button. Or we should be measuring things so that instead of just maximizing for engagement, we’re maximizing for more subtle success metrics.

And more broadly, I think that as an industry we need to develop a code of ethics, or in our industry’s case it would be an ethics of code. So for example, we need to be dedicated to always aligning people’s attention with their intention. In the cases where technology distracts someone, it’s removing their attention from their intention. And so to get very concrete, I think one of the biggest culprits of that is notifications, whether that’s push notifications or the little red dot in the upper right corner of products. Today, by default, if your photo on Instagram gets a Like, it physically vibrates your leg. If I had to give a candidate for the meaning of life, I think presence would be one of the top things on the list. And if technology is constantly distracting you away from the present moment, it’s extremely difficult to get deep into your own life, but also to do the kind of deep thinking required to be a responsible citizen in a democracy. And so until these companies restrict themselves to using those notifications for things that are timely and important, it’s hard to take seriously their sincerity or dedication to really doing the right thing on these causes.

Constine: Yes, I just feel like some of the notifications have gotten completely out of hand. You might say that, okay, a Like is on your piece of content, something you made; it’s somebody trying to express a direct sentiment to you. There are arguments that that could be good, but I’d probably prefer to see those batched into groups.

Rosenstein: Yes, once a day.

Constine: Exactly, or at least once every few hours. The idea of getting these every minute is somewhat ludicrous. But Facebook has really gone much farther than that: I get notifications when somebody I follow comments on somebody else’s fundraiser, or when a friend of mine posts a story that is already visible in several places on Facebook, and yet I get a whole notification about it. That seems really superfluous.

Rosenstein: Yes, the very architecture of a lot of these systems shows that we’re not thinking about them through the lens of a set of principles and rules. Another great example: you have sites like YouTube that are just trying to maximize getting you to watch the next video, or the newsfeeds that are trying to show you what’s new and most interesting to you. But what if we turned that model on its head and said, okay, the real goal of technology when it’s guiding your attention is something closer to a best feed: given your goals in life, what is the content that will most help you accomplish those goals? Or what’s the content that would be most helpful to you in becoming a citizen of the world, in understanding the importance of problems like the climate crisis and the real solutions that can help us solve them?

Constine: Yes, and everything doesn’t have to be newest-is-best. We’ve lost this desire to read the classics of literature because we read so many tweets. I know that I could read tomes of incredible poetry, biography, nonfiction, history, but instead I just read the latest tweets, because that’s what’s easiest and that’s what Twitter serves me.

Rosenstein: Yes, and the workplace is the same way, where we’re constantly bombarded by emails and chat messages and all these things that distract you, to the point where it takes 20 minutes after a distraction to get back into a zone of deep thinking. And today it’s very rare that people get those 20 minutes before they’re distracted again. So Asana’s purpose is to take all of the information about a company and figure out, okay, of all the things you could be working on right now, what’s the most important thing you could be doing, trying to give people back their sanity and clarity and focus in the workplace.

Constine: And what do we do about this from like the lowest levels of computer science? Can we get this built into the curriculums of people being taught these kinds of skills?

Rosenstein: Yes, actually I just cosigned a letter that the [INDISCERNIBLE 0:17:54.2] network put together about getting this right into the curriculum. Sometimes computer science courses have an elective on ethics, but we can’t tolerate seeing it as some sideshow anymore. Ethics needs to be core from the very first day that you start realizing you want to create technology, because every choice you make as a technologist really affects things.

Constine: So regulators are too slow and out of touch to make significant change, or at least not at the urgent speed we need, and users are so beholden to these products that even when people say they don’t like Facebook, they don’t leave. Facebook is still growing by millions of users, even in North America, despite Cambridge Analytica, security breaches, and every other problem they’ve had. What I’ve seen is that technology employees are discovering that, due to the talent shortage, they have massive leverage over the tech industry to actually make change. So what do you think they should do?

Rosenstein: Yes, I think people, even people with computer science degrees coming straight out of college, really underestimate the extent of the power that they have. Because there’s a very finite supply of people working in the technology industry, and so many different, important projects and problems going on. And so essentially, when you choose who to work for, you’re voting: this is a mission that I want to help manifest in the world. And that’s an important choice. Similarly for entrepreneurs, I think it’s common when entrepreneurs are starting companies to ask themselves, “Well, what is a problem that affects me? I hate doing my laundry, so I’ll go start a laundry automation company,” instead of doing the deep research around what the world needs and where their skills could be most leveraged in creating that better world. The United Nations has the Sustainable Development Goals that outline the important progress we need to make as a civilization, and yet there’s effectively no real global, sustainable plan corresponding to them; we’re not on track to hit most of those goals.

And so the potential as an entrepreneur is to really look at those deep questions: “Where can I help?” And often the answer is quite counterintuitive. I helped fund this project called Drawdown, where a bunch of climate scientists took all the different possible interventions we could apply to help resolve the climate crisis and produced a prioritized list, based on the actual numbers, of what would remove the most CO2 from the atmosphere. And the number one thing they came out with was refrigerant management, which is not something you’re going to intuitively think of.

Constine: No, you don’t think about it.

Rosenstein: And yet one of the most important things entrepreneurs starting companies today could potentially do is go work on improvements to refrigerant management. And so I really just see this huge opportunity for us as an industry to change from seeing ourselves as disruptors to seeing ourselves as collaborators, and to go work with leaders in fields like education, mental health and physical health, equality, sustainability, and clean energy, and to be saying, “How can we show up in those conversations and collaborate with people in order to move the world in a positive direction?”

Constine: So I think there are maybe some skeptics out there who wonder, hey, you actually made a good fortune from Google and from Facebook, and, yes, these solutions that you’re suggesting address some of the repercussions of the system we’ve built, but they don’t necessarily address some of the underlying problems. We don’t have the consensus or the time in our final few minutes to resolve these massive topics, but I want to discuss some of these intractable issues with how the economy works. How do we make a fundamental shift there, or at least what questions should we be asking ourselves as a society about how our economy works?

Rosenstein: Yes, first of all, I come at this with a lot of humility and a lot more questions than answers. But you’re absolutely right, and I’ve referred to this a few times: we find ourselves in this existing economic order that has created a lot of progress and a lot of good things for humanity but is also incentivizing a lot of terrible things. And so indeed, when I do a kind of 5 Whys analysis of all of these different problems in the world, whether that’s the climate crisis or the attention economy crisis, so often if you just keep asking, what’s the root cause of that, and the root cause of that, you get back to the existing economic order.

And so I spoke positively about the fact that there are certain problems we can solve within existing capitalism, and when we find those opportunities, we can use capitalism as an engine to drive the success of those missions. But so many important things that need to happen in the world don’t have business models. All of the models I’ve seen for enabling us to stay within 1.5 degrees Celsius of warming require huge investments in technology that just takes carbon out of the air and puts it back into the earth. There’s no business model for that, and yet we still have to make those major investments.

So how do we handle it when there’s no business model? How do we handle it when there is a business model but you get conflicts between that model and the mission? And as you’re pointing out, how do we do all this in a way that is equitable and fair and just, and doesn’t lead to the kind of radical consolidation of wealth that we’re seeing in the industry? Even those of us, or I’d say especially those of us, who are currently the beneficiaries of that system need to be taking a really hard look at the axioms of our existing economic order and questioning those things. And I don’t know entirely what that looks like yet. For my own self, I’ve taken the Giving Pledge, dedicating myself to giving half the wealth I create off these systems back to philanthropic causes; every cent that I make off of Asana, I’m dedicated to giving to philanthropic causes and not consuming myself. But I really think those are small things, and just the beginning of taking an even more fundamental look at the systems we’re currently the beneficiaries of and questioning them, even if it’s to our detriment.

At the same time that we’re doing that kind of high-level systemic thinking, we don’t have time to wait. We have to be able, within the existing system, to figure out how we can collaborate, how we can work together, how we can make sure that everything we’re working on is, as much as possible, in service to the one team, the one project, the one earth that we all share and find ourselves on. I think that’s absolutely necessary if we’re going to survive and thrive over the next 100 years.

Constine: That’s great. So to recap some of the insights we heard from Justin: there’s room for positivity. That was the whole reason for creating the Like button, to give people a little bit of positivity through a jury of their peers. And it spread some really important ideas, like the Me Too movement. And really, life is kind of impressive overall, and we do hope we can create this protopia, but we need to be on one big team before we can start addressing some of these underlying problems. Otherwise we’re going to end up consumerizing weapons of mass destruction. We’re in that moment of reckoning for computer science right now, and that means we have to finally redefine success. That means both figuring out new business models and addressing the similarities between things like social media and traditional media, which exploit that search for attention. We don’t want to just maximize rocket fuel; we have to really change where we find our self-worth, from an external stimulus to something internal, where we know we’re doing good even if we don’t get all those Likes or all those news headlines.

And so Justin’s slate for actually improving this stuff includes being more mindful and knowing that new isn’t always better; hiring psychologists, ethicists, and skeptics to think about these problems before they manifest; maximizing for more subtle metrics that aren’t always obvious but are really important to what makes us human; finding an ethics of code, developing that kind of bill of rights for how we deal with code; and aligning people’s attention with their intention. And I think the most important thing any tech company can do, if they really want to take this seriously, rather than burying a little counter of how many minutes we’re spending somewhere in their app, is to start batching push notifications and send us fewer notifications for things that don’t really matter. Give us a best feed, not always just the most recent feed, and let’s get ethics built back into the computer science core. And if you’re an entrepreneur out there, you can find bigger problems to solve than the things that maybe piss you off in your everyday life; look at things that are impacting people on a global scale. And if you’re a tech worker, vote with your feet. If you don’t like the way your big corporation is handling what they’re doing, move somewhere where you can feel good about waking up every morning and going to work.

We need new business models if we’re going to change this, because some of the problems are in the underlying incentives of capitalism and in what consolidates wealth for individuals in the technology industry. But if we can all get on one team and treat this not as your company versus my company but as all one society, I think there’s still a lot of progress we can make to improve this world, and to use these technologies not just as problems and not just as things that earn us money, but as things that bring the world closer together, not just as a cliché but in truth and in reality. So thank you all for watching, and thank you, Justin, for talking to us.