This year, Facebook distracted the world with its stances on political ads and law enforcement access, drawing attention away from a significant problem: the destructive impact of organic content. Around the world, enormous harm is being done using online platforms, all without spending a single dollar on ads. No platform illustrates this better than WhatsApp, the messaging app that Facebook acquired in 2014.
With an estimated 1.5 billion users sending more than 75 billion messages per day, WhatsApp is the world’s biggest messaging service (ad-free, for now). Users can call or message their friends and family, but large groups (of up to 256 people) are also a compelling feature of the app. Convenient and light on bandwidth, WhatsApp makes it easy for users to push text, images, and video to thousands of other users within seconds.
WhatsApp’s most defining feature—end-to-end encryption—means such user-generated or forwarded content can’t be read, let alone moderated, by outsiders. Mark Zuckerberg, when announcing that “the future is private,” compared encrypted messaging to the “digital living room,” a place for private conversations with just a few people. In reality, though, WhatsApp groups can also resemble a “town square” filled with haters, racists, liars, and abusers. Enabling privacy is virtuous and essential. Enabling the rapid, large-scale spread of dangerous, distorted, and deceitful content is irresponsible and harmful.
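For readers who want the mechanics, here is a toy Python sketch, using the open-source PyNaCl library, of why end-to-end encryption puts content beyond anyone’s reach but the sender’s and recipient’s: the relay server only ever handles ciphertext. This is a deliberate simplification, not WhatsApp’s implementation; WhatsApp actually runs the far more sophisticated Signal protocol.

```python
# Toy sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustrative only: WhatsApp uses the Signal protocol, not a single Box.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()  # sender's keypair, held only on her device
bob_key = PrivateKey.generate()    # recipient's keypair, held only on his device

# Alice encrypts directly to Bob's public key on her own device.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the town square at noon")

# The relay server sees only this ciphertext: it cannot read,
# scan, or moderate the content in transit.
print(ciphertext.hex()[:32], "...")

# Only Bob's device, holding his private key, can decrypt the message.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))
```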
WhatsApp’s scale and speed have quickly turned the platform into a delivery system for dangerous ammunition. Messages that promote hate, distort the truth, call for violence, and seek to manipulate recipients move at unprecedented velocity and with extraordinary reach.
This is not theoretical: WhatsApp messages have incited a series of lynchings in India, swayed an election in Brazil, and sparked riots in Indonesia. The platform has also become a breeding ground for white supremacy. And the provenance of such messages is not only unverifiable; in many cases they have been traced to organized purveyors with dark campaign agendas.
WhatsApp and its parent company, Facebook, have been slow to respond to these and other abuses. In desperation, governments are resorting to the most amateur of IT fixes: flicking the switch off. But censorship is not justice, and the platform is still on the hook. WhatsApp can and must do more to address the dangerous behavior that its platform is enabling.
Here’s how:
First, to be clear, WhatsApp needs to stay strong on encryption. We live in an age of mass state-led and corporate surveillance. Encryption is critical to protect our human rights online. WhatsApp needs to stay the course, especially in the face of law enforcement demanding “backdoor” access to these services and spyware vendors like the NSO Group thwarting encryption to target vulnerable groups.
Second, WhatsApp needs to make more product changes to stop its service from being so easily exploited. Like Facebook, WhatsApp was designed with minimal “friction” to maximize growth, creating a bonanza for bad actors. A recent study covered in the MIT Technology Review suggests that modest limits and small inconveniences that slow transmission can significantly curb the spread of disinformation. Here WhatsApp can go much further than its belated tweaks to group settings and forwarding limits: for example, it can cap spam and make all groups opt-in by default. Spamming the masses and adding hundreds of users to groups without their permission are antithetical to its privacy principles.
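To make the “friction” idea concrete, here is a minimal Python sketch of one such mechanism: a cap on how many chats a single message can be forwarded to. The cap of five mirrors the limit WhatsApp introduced in 2019; the class name and bookkeeping are hypothetical illustrations, not WhatsApp’s code.

```python
# Hypothetical sketch of a forwarding cap, one form of product "friction".
from collections import defaultdict

FORWARD_CAP = 5  # maximum chats a message may be forwarded to (WhatsApp's 2019 limit)

class ForwardLimiter:
    def __init__(self, cap: int = FORWARD_CAP):
        self.cap = cap
        self.forward_counts = defaultdict(int)  # message_id -> forwards so far

    def try_forward(self, message_id: str, target_chats: list) -> list:
        """Return the subset of target chats this forward is allowed to reach."""
        remaining = self.cap - self.forward_counts[message_id]
        allowed = target_chats[:max(remaining, 0)]
        self.forward_counts[message_id] += len(allowed)
        return allowed

limiter = ForwardLimiter()
print(limiter.try_forward("msg-42", [f"chat-{i}" for i in range(8)]))  # only first 5 allowed
print(limiter.try_forward("msg-42", ["chat-9"]))                       # cap reached -> []
```

Even a crude brake like this changes the economics of mass manipulation: reaching thousands of users now takes many deliberate actions instead of one tap.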
Third, WhatsApp needs genuine innovation to address the most dangerous content and behavior on its platform. Facebook has developed various “trust and safety” systems to try to manage these issues, including automatic detection of pornography, child abuse, and inauthentic behavior, and initiatives to debunk and downrate false stories. Citing its end-to-end encryption, WhatsApp likes to say, “What content?” But as experts at a recent roundtable discussion at Stanford’s Internet Observatory suggested, there may be ways to develop safety tools to run automatically on users’ devices while preserving their privacy. Innovation along these lines might enable encryption to be maintained while tackling some of the most problematic content.
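As one illustration of what such an on-device tool could look like, the sketch below checks outgoing media against a locally stored list of fingerprints of known abusive content before the message is encrypted, so no plaintext ever leaves the device. It uses exact SHA-256 matching for simplicity; production systems would rely on perceptual hashes that survive re-encoding, and the blocklist and function names here are hypothetical.

```python
# Hypothetical sketch of an on-device, pre-encryption safety check.
import hashlib

# Hypothetical locally stored blocklist of fingerprints of known abusive media.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(media_bytes: bytes) -> str:
    """Compute a fingerprint of outgoing media, locally on the device."""
    return hashlib.sha256(media_bytes).hexdigest()

def safe_to_send(media_bytes: bytes) -> bool:
    """Runs before encryption; blocks only exact matches to known content,
    so the server never sees plaintext and privacy is preserved."""
    return fingerprint(media_bytes) not in KNOWN_BAD_HASHES

print(safe_to_send(b"test"))         # False: b"test" hashes to the listed value
print(safe_to_send(b"holiday.jpg"))  # True: not on the blocklist
```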
Fourth, WhatsApp needs to dramatically increase its engagement with stakeholders. It’s not possible to understand and solve these problems from inside the Silicon Valley bubble. WhatsApp needs to connect and collaborate with child protection organizations, human rights defenders, minority groups, and other affected users as well as researchers and academics to understand the dynamics of dangerous content and test better ways to address it. This will require much more openness, transparency, and humility. In return, WhatsApp will gain a deeper understanding of harms, access better research, and source innovative solutions from a wider pool.
Despite the damage that WhatsApp is enabling, so far the platform has largely escaped scrutiny. Mark Zuckerberg hasn’t had a “WhatsApp Myanmar moment” yet; he hasn’t been grilled by Congress about the trail of deaths and democratic distortions that WhatsApp enables. But it’s only a matter of time; last week’s Congressional hearings on encryption, while focused on the wrong solution, point to real risks. There’s growing concern in the US that white nationalists are increasingly using encrypted messaging to spread hate speech, and that the millions of pieces of child abuse content that Facebook reports every year would become invisible if Zuckerberg extends end-to-end encryption across all of Facebook’s messaging services. We predict that WhatsApp will be a powerful, targeted, and unscrutinized misinformation tool in the 2020 elections.
Legislators and regulators are paying much closer attention now—as they must. WhatsApp would be well-advised to spend less time on its new advertising and payment systems and more time fixing its current product.
David Madden is a senior advisor at Omidyar Network, founder of Phandeeyar, an innovation lab in Myanmar (Burma), and co-founder of Purpose, a digital strategy agency, and of the social movements Avaaz.org and GetUp.org.
Anamitra Deb is managing director of the beneficial tech initiative at Omidyar Network, the philanthropic investment firm established by Pam and Pierre Omidyar, the founder of eBay.