WhatsApp CEO Will Cathcart on a rocky year for the app

A candid Q&A on encryption, privacy, and ProPublica

It has been a contentious year for WhatsApp.

In January, a simple effort to update its terms of service to enable some commerce features triggered a massive backlash in India, helping its rival Signal to double its user base in a month. In May, the Facebook-owned messaging app sued India over new rules issued by the country’s IT ministry that could break end-to-end encryption around the globe. And just last week, a widely read report in ProPublica drew attention to the service’s use of human reviewers to investigate potential violations of WhatsApp’s terms of service, in part by reading the five most recent messages in any reported exchange.

Writing about ProPublica’s story, I took exception to the idea that allowing users to report bad actors is necessarily bad for privacy. (After publication, ProPublica added an editor’s note saying it had altered some language in the story to clarify that a user reporting feature does not break encryption.)

A few days later, the company announced the introduction of a way to let you encrypt a backup of your WhatsApp messages, preventing anyone who doesn’t have your encryption key (or, alternatively, a password that you set) from reading the contents of any of your messages. (The Verge’s Alex Heath has a nice technical overview of how this works.)

All these issues are the purview of Will Cathcart, who took over WhatsApp in March 2019. Cathcart, who joined parent company Facebook in 2010 after a stint at Google, previously oversaw Facebook’s almighty News Feed.

The jobs are very different, but both have involved high-stakes global political battles about speech and democracy. On Monday morning, I caught up with Cathcart over Zoom about privacy, encryption, and the future of messaging. I also asked him if he ever wished he had an easier job. “I love my job,” he assured me.

Highlights of our conversation follow.

This interview has been lightly edited for clarity and length.

Casey Newton: On Friday you announced that WhatsApp is introducing encrypted backups for its users on Android and iOS. Why?

Will Cathcart: We’re always focused on what we can do to protect the privacy of people’s messages. People’s messages are very sensitive. And the reality is that over time there are growing threats to people’s privacy — hackers, hostile governments, criminals. And so we’re always looking at how we can add more privacy, especially around the theme of protecting what you say.

We’ve had end-to-end encryption for five years, which means if you send a message to someone on WhatsApp, we can’t see what you sent as it passes through all of our servers. It’s completely secure. We think that’s really important. But the reality is there are other things we can do to protect people’s messages. One is to actually help people’s messages not live forever. We launched disappearing messages late last year, because when I talk to you in person, we don’t usually keep a transcript of the conversation.

Another area we’ve been looking at for a while is backups. Many people don’t back up their messages, but a lot of people do. And you can opt into backups on Google Drive or iCloud if you have an Android or an iPhone. We wanted to see if we could find a way to add the same level of end-to-end encrypted security that you get when you send a message across WhatsApp to those backups.

How do you do it?

That is a hard problem, because if you back up your messages in a truly end-to-end encrypted way, you have to have a password for it, and if you lose your password or your phone, you really have no way to get your messages back. We don’t have a way to help you if you lose them.

So we spent a long time figuring out how to do this in a way that we felt would be accessible, such that a lot of people would be able to use it. And what we’ve come up with is two options you can choose. One is you can keep a 64-digit key, and you can keep that yourself — you can print it out, you can write it down, or you can try to remember it, but I wouldn’t recommend it.

Or if that’s too intimidating or too hard, which we think it will be for a lot of people, we’ve come up with a system where we’ll store the key for you using hardware security modules, which means we don’t have a way to access it. And you can come up with a shorter, easier-to-remember passphrase to get access to it. And that I think will help make this more accessible for a lot of people.
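To make those two options concrete, here is a minimal sketch in Python. This is not WhatsApp’s actual protocol: the names, the vault stub, and the use of a 256-bit AES-GCM key (comparable in strength to a 64-digit hex key) are illustrative assumptions.

```python
# Hypothetical sketch of the two backup-key options Cathcart describes.
# NOT WhatsApp's real implementation; all names are invented.
# Requires the third-party "cryptography" package (pip install cryptography).
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def make_backup_key() -> bytes:
    # Option 1: a random 256-bit key the user prints out, writes down,
    # and keeps entirely themselves.
    return AESGCM.generate_key(bit_length=256)


def encrypt_backup(backup_key: bytes, plaintext: bytes) -> bytes:
    # The backup is encrypted on the device; the storage provider
    # only ever sees ciphertext.
    nonce = secrets.token_bytes(12)
    return nonce + AESGCM(backup_key).encrypt(nonce, plaintext, None)


def decrypt_backup(backup_key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(backup_key).decrypt(nonce, ciphertext, None)


# Option 2 (conceptual): instead of keeping the raw key, the client hands
# it to a hardware security module that releases it only when the user
# presents the right passphrase. A real HSM enforces this in tamper-resistant
# hardware and caps failed attempts; this stub only mimics the interface.
class HSMVaultStub:
    def __init__(self):
        self._store = {}

    def register(self, account: str, passphrase: str, backup_key: bytes):
        self._store[account] = (passphrase, backup_key)

    def retrieve(self, account: str, passphrase: str) -> bytes:
        stored_passphrase, key = self._store[account]
        if passphrase != stored_passphrase:  # real HSMs also rate-limit guesses
            raise PermissionError("wrong passphrase")
        return key
```

The design point in both options is that the operator only ever stores ciphertext: in the first, the user holds the key outright; in the second, the key lives in hardware the operator cannot read, unlocked by a passphrase only the user knows.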

As you mentioned, in recent years we’ve seen stories about state-sponsored hackers attempting to access the WhatsApp messages of government officials, diplomats, journalists, and human rights activists, among others. Are backups part of that story? Are they in the threat model?

Yes, absolutely.

In some of the stories around spyware companies, the most worrying version of this is where they get full access to your phone. But it’s absolutely a threat that people could try to get access to your backups. There was just a story in the LA Times a few weeks ago about a predator who was using social engineering to get access to women’s backups just to try to look through their photos. A horrifying number of people were affected by that.

The reality is, people have really sensitive stuff in what they say and what they send. We think we’ve got to look at all the ways that there could be a threat to them, and if in any case we can find an interesting or novel way to protect it, add it.

So on one hand, WhatsApp now offers a stronger degree of protection to users here than some other encrypted messengers, like Apple’s iMessage, which doesn’t end-to-end encrypt its backups. But WhatsApp came in for criticism last week over the fact that it allows users to report each other, and to include recent messages in the reports they submit. And those reports — and messages — are reviewed by humans. How did that system come about?

We’ve had the ability for people to report for a long time. And look, we just disagree with the criticism here. If you and I have a private conversation, that’s private — [but] that doesn’t mean you don’t have the right to go complain to someone if I say something harassing, offensive, or dangerous.

“[Encryption] doesn’t mean you don’t have the right to go complain to someone”

That’s how the real world works: in the real world, two people can have a private conversation and then one of them can go ask for help and relay what they were told if they need to. I just think that matches how normal people communicate.

I feel like here we’ve really hit on how “privacy” seems to be a word that is understood differently by every person using it.

For what it’s worth, in this area, I haven’t heard people who use WhatsApp tell me they think it’s a problem that we let people report. I do think there are some really hard questions around privacy and technology and where the lines are, and things like that. But this one isn’t something I’ve seen actual people who use WhatsApp have a lot of concern about.

What are some of the ways that you feel that user reporting benefits WhatsApp?

The clearest high-level way it benefits us is that it helps us run the service with reduced amounts of spam. This is a service with 2 billion users. That is a big global system. Unfortunately, there are going to be some people who are going to try to abuse it — send out spam, send out phishing messages, send out things that are trying to make the experience for people less safe. And the fact that people can report is one of the most powerful techniques we have to catch that stuff. We’re able to ban millions of accounts a month based on [those reports].

Again, we can’t see the messages people send, but we can see when someone reports to us. We think it’s okay for you to report a spammer. And then we can use that to ban people and help keep the service more safe.

And then there’s other, more rare but very meaningful problems to try to work on — for example, the sharing of child exploitative imagery. We think we’ve found a way to have an end-to-end encrypted system that has the level of security people need for their private messages — but uses things like reports, and some of the metadata we have, to ban people who appear to be sharing child exploitative imagery. And in some cases, make actual referrals to the National Center for Missing and Exploited Children. We made something like 400,000 referrals last year, and that’s powered by reports. I think that’s very consistent with people’s model of privacy: if I send you something and you think it’s a problem and you want to ask for help, you should be able to.
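It may help to make concrete how reporting can coexist with end-to-end encryption, since that is the crux of the ProPublica dispute. Here is a minimal, hypothetical sketch (the field names are invented): the reporting client already holds the decrypted conversation, so it simply attaches the five most recent messages (the window ProPublica described) to the report it sends.

```python
# Hypothetical sketch of client-side reporting alongside end-to-end
# encryption. All names are invented for illustration; this is not
# WhatsApp's real report format.
from dataclasses import dataclass, field


@dataclass
class AbuseReport:
    reporter_id: str
    reported_id: str
    reason: str
    # Plaintext the reporting client chooses to attach. The server never
    # decrypted these messages in transit; the reporter's device did,
    # as the legitimate recipient.
    recent_messages: list[str] = field(default_factory=list)


def build_report(reporter_id: str, reported_id: str,
                 reason: str, conversation: list[str]) -> AbuseReport:
    # Attach only the last five messages the reporter already has locally.
    return AbuseReport(reporter_id, reported_id, reason, conversation[-5:])
```

Nothing about the transport encryption changes here; the plaintext comes from one of the two endpoints that could always read it.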

When I talked to ProPublica’s president about all this, he said, look: at the end of the day, this company is saying that WhatsApp messages are totally private, when in fact in some cases they’re reviewed by humans. Do you think most of your users understand that dynamic, or could you do more there?

I think people get this. There’s not confusion or concern from the people who actually use WhatsApp. Anyone who uses WhatsApp can go in and hit the report button, and it gets used a lot. When you do that, it’s really transparent that it’s going to send messages to us. So this whole particular criticism did surprise me.

I wrote last week that WhatsApp’s encryption-plus-reporting approach seemed to be trying to find a workable middle ground in a world where encryption is under threat. The services that provide it are probably going to have to make some sort of concessions to governments. And so how do you maximally protect encryption while also enabling at least some kind of interfacing with law enforcement to catch the worst actors? Is this how you see it?

I think about it a little differently. End-to-end encryption protects all of our users by keeping their messages secure, and on top of that, letting people tell us if someone’s spamming protects them too. It’s usually framed as, “Are you picking privacy or are you picking safety?” I see them as the same thing — end-to-end encryption is one of the most powerful technologies we have to protect people’s safety all around the world.

What makes you comfortable that, on balance, the benefits of the encryption you provide outweigh any harms that may be caused by bad actors having access to these protected systems?

I would say two things. One is, I just see the trends on all the threats going on around the world. And I think through, years from now, what are the consequences if we don’t have secure protection for our data? Especially in liberal societies, in a world where hostile governments have a very different worldview about privacy and information?

And two, one thing I find helpful is thinking through real-world analogs. A lot of stuff feels so new that the debates feel very new, but the real-world equivalents, they’re not new.

“Most people have an instinctive horrified reaction. I think that’s telling.”

People have been able to meet in private in person and talk privately for hundreds and hundreds of years, and there’s no automatic system keeping a backup. There’s no automatic system relaying it to a company. And I think that’s been a good thing. Sometimes when you look at some of the proposals on breaking encryption, or traceability in India, or scanning every private photo against a database, just apply it to: “Hey, how would you feel about doing this in people’s living rooms?” Most people have an instinctive, horrified reaction. I think that’s telling.

Let’s talk about the current global situation around end-to-end encryption. You’re currently suing the Indian government over new regulations that would require you to trace the originator of individual messages, and to filter messages based on what they contain. Presumably this would apply to encrypted backups as well. What’s at stake here?

With the IT rules in India, the specific thing those rules would require us to do is build some system [to comply] if someone comes to us and says, “Hey, someone said the words ‘XYZ.’ Tell us who the first person is who said the words XYZ.” That’s not private. And it undermines the security that end-to-end encryption provides.

I think 10 years from now, even more of our lives will be online. Even more of our sensitive data will be online. There will be even more sophisticated hackers, spyware companies, hostile governments, criminals trying to get access to it. And not having the best security means that information is stolen. I think that has real consequences for free society. If journalists’ information is being stolen, which we saw in some of the reporting around NSO Group, I think that undermines the free press. If people who want to organize can’t communicate in private, I think that undermines their ability to advocate for change.

I think there’s a lot of core tenets of democracy and liberalism that actually rely on people being able to have private information.

Say you lose in India. Does that break encryption in WhatsApp globally, or can you contain the fallout to India somehow — and maybe eventually to other countries that might adopt similar rules?

You know, I don’t have a crystal ball. My hope is that over the next few years, increasingly governments will realize that on balance, the more important thing for them to do is protect their citizens’ data. That the threats are growing, and so their interest in protecting people’s security is higher, and therefore they’ll be dismissive of what some other countries are asking for. But I don’t know.

I want to try to ask it again, though. If India says, “Sorry, Will, you lose on this one, you have to build this awful system,” can the damage be contained to India?

I think that there’s a political question and there’s a technical question. The way they wrote the rules, and what they’ve said, is that they only want it to apply to people in India. But I think there’s a broader political question.

The more some countries see other countries do it, or push for it, the more they want to push for it, too.

Do you ever long for the days when you had an easier job, like running the Facebook News Feed?

(Laughs) I love my job. I get that there are going to be questions. I get that when we launch things like end-to-end encrypted backups, there are going to be some people who criticize it. But at the end of the day I just feel so lucky to get to work on something that so many people love and use for stuff that’s so important.


This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.