Dispatches from founder Julia Angwin. Hello World is a weekly newsletter—delivered every Saturday morning—that goes deep into our original reporting and the questions we put to big thinkers in the field.
Hello, friends,
Happy New Year. In the final days of 2022, the long-running debate resurfaced over whether too much privacy hampers the ability to catch criminals.
In a New York Times op-ed titled “The Signal App and the Danger of Privacy at All Costs,” digital ethicist Reid Blackman argued that encrypted messaging apps such as Signal dangerously promote “a rather extreme conception of privacy” that can be exploited by criminals and other bad actors.
His op-ed was immediately flamed on Twitter and prompted rebuttals from Signal president Meredith Whittaker and technologist Tim Bray, among others.
But Blackman is not the first, nor is he likely to be the last, to argue against public access to powerful privacy tools.
The so-called “Crypto Wars” have been raging ever since the ’90s, when robust, mathematically advanced encryption moved from being a tool of state intelligence agencies to being available to the public.
In 1991, Phil Zimmermann released his Pretty Good Privacy encryption tool, giving regular people access to what had previously been military-grade encryption. The U.S. government initiated a criminal investigation of him for illegal arms trafficking.
Eventually, the U.S. lifted its ban on exporting encryption products, and the case against him was dropped.
Since then, the debate has raged on: governments regularly argue that “backdoors” should be built into encryption to help them surveil criminals, while technologists have convincingly argued that there is no responsible way to build a backdoor without weakening cybersecurity for everyone.
The latest salvo in the Crypto Wars is the Online Safety Bill being considered by the U.K. Parliament.
The bill would require tech companies to scan their services for terrorism or child sexual exploitation content, but technologists say that requirement would effectively “erode end-to-end encryption in private messaging.”
Last year, Apple dropped a similar plan to scan photos on people’s phones for child sexual abuse content after a wave of criticism from technologists (including that of Alex Stamos, director of the Stanford Internet Observatory, published in this newsletter).
As it happens, I recently interviewed Signal president Meredith Whittaker about the challenges of protecting privacy at the Web Summit conference last fall. So I transcribed our live discussion and asked her to revise and extend her remarks a bit.
This is Whittaker’s second appearance in this newsletter: I last interviewed her in April 2020 about gig workers fighting to win safety protections during the height of the pandemic.
Whittaker is a veteran tech activist. While at Google, she founded Google’s Open Research Group and co-founded M-Lab, a global internet measurement platform. She was also a core organizer of internal protests against military use of Google technology and of the Google Walkout in 2018.
She co-founded the AI Now Institute and was Minderoo Research Professor at NYU. Before joining Signal as president, she completed a term as a senior adviser on AI to the chair of the Federal Trade Commission.
Our conversation, edited and expanded, is below.
Angwin: When I first met you, you were a Google executive, but you recently joined Signal as president of the encrypted messaging app. Can you describe your journey?
Whittaker: I joined tech not because I love gadgets or had any desire to be a programmer but because I graduated from college and needed a job. Google found my résumé on Monster in 2006, and I responded to the recruiter.
I was at Google for over 13 years, and I moved from my entry-level job in customer support to founding a research group. I was interested in hard political and social problems, so I sought them out.
At the time, net neutrality was the big issue, and that’s where I started. I also worked on issues related to privacy and the social consequences of AI.
By the end of my stint at Google, I was a go-to expert on these topics, but I arrived at what is maybe an old-school conclusion: that it’s fine to have a good analysis, and it’s great to be right, but if you don’t actually have decision-making power, then you’re likely not to change much.
This led me to other tactics, namely labor organizing—building countervailing power—against unethical military contracts and the surveillance practices of Google and other tech companies.
Fast-forward through a stint in academia that I’m still recovering from and a taste of government, and I ended up at Signal, where I am today.
Signal is an organization I care about deeply. I don’t think we have much of a chance for a livable future if we don’t have a truly private means to communicate with each other.
A world that doesn’t have privacy is a world where the power structures that exist now are cemented, almost impossible to perturb, and authoritarianism thrives in these conditions.
Angwin: Tell us about Signal. Why should we use Signal and not WhatsApp or other messaging services?
Whittaker: Oh, I love this question! There are a lot of differences between Signal and WhatsApp and the others. I’ll start with the business model, which matters. Signal is a nonprofit.
This structure is a first line of defense to ensure that we’re laser-focused on providing people with a truly private way to communicate.
We don’t have shareholders or equity, so we’re not being pushed to prioritize profits and growth over our core mission.
There’s no billion-dollar exit coming for executives, so even if I turned into a terrible person tomorrow and decided to sell Signal to a private equity firm, I wouldn’t get anything out of it. This is in contrast to WhatsApp, which is owned and underwritten by Facebook.
So that’s already a world of difference in terms of the incentive models and the pressures that could be applied to loosen privacy and encryption.
At the level of the technology, Signal messages are encrypted end-to-end. This is similar to WhatsApp, which uses the Signal Protocol to encrypt many of its messages. But unlike WhatsApp and others, Signal goes way beyond this.
We encrypt metadata as well, so we don’t know who you are, we don’t know your name or profile information, and we don’t know which groups you’re in or who’s talking to whom. Metadata is very important and revealing.
All other platforms currently collect it, and potentially join it with other data sources they have access to (looking at you, Facebook), or hand it over to law enforcement.
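For readers who want to see the underlying idea in miniature, here is a hedged sketch in Python using the open-source `cryptography` library. It is emphatically not the actual Signal Protocol, which adds ratcheting, prekeys, and metadata protections such as sealed sender on top of basic key agreement, and the names and message here are invented for illustration. The point is simply that a server relaying the message never holds a key that can decrypt it.

```python
# A toy end-to-end encryption sketch (illustration only, not Signal's protocol).
# Requires: pip install cryptography
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair; only public keys ever cross the wire.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides compute the same shared secret via Diffie-Hellman agreement,
# without the secret itself ever being transmitted.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Stretch the raw shared secret into a symmetric message key.
message_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo-message-key"
).derive(alice_shared)

# Alice encrypts; a relaying server sees only random-looking ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(message_key).encrypt(nonce, b"meet at noon", None)

# Bob, holding the same derived key, decrypts the message.
assert AESGCM(message_key).decrypt(nonce, ciphertext, None) == b"meet at noon"
```

Because the keys live only on the two endpoints, a subpoena to the server in this sketch could yield ciphertext at most; Signal’s design goes further by also minimizing the metadata that exists to hand over.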
Angwin: Law enforcement is known for wanting access to encrypted messages, as in the San Bernardino case, where Apple refused to help the FBI unlock a shooter’s iPhone. If law enforcement asks Signal for information about its users’ communications, what do you do?
Whittaker: We don’t have the information. Knowing nothing about you is simply how Signal works. By way of analogy, say law enforcement goes to the headquarters of a company that produces pens.
They bring a specific pen to the company, and they ask the company to tell them everything that’s ever been written with that specific pen.
Of course the company would look at them like, WTF? That’s not how pens work; we can’t tell you that! Everyone would understand that and let law enforcement go on their way.
Well, Signal doesn’t work that way either. We built it from the ground up in a way that preserves privacy.
Sitting here in the wake of data breach after data breach, terms of service change after terms of service change, I’m confident in saying that building systems that work to not collect data, like Signal, is the only meaningful way to protect privacy.
Angwin: Europe has passed new regulations that will require big messaging apps like WhatsApp to interoperate with other messaging apps, such as Signal.
I personally would love interoperability because I have some relatives who only use WhatsApp, and I prefer to use Signal because of its privacy promise.
Does Signal plan to work toward interoperability with other messaging platforms?
Whittaker: In theory, interoperability is great. But in Signal’s case, we are not willing to lower our privacy bar in the name of interoperating with apps like WhatsApp or iMessage that don’t adhere to our standards.
If these apps wanted to adopt our strict privacy provisions, interop is certainly a conversation we’d welcome.
We would also need to be able to validate other apps’ implementations, guaranteeing that there were no tricks on the back end compromising the promises we make to the people who rely on Signal.
All of this, in addition to the fact that the companies are already unhappy about the prospect, means that, practically speaking, it’s very unlikely that Signal will be interoperating with other apps in the foreseeable future. But we are open to it, under the conditions I outlined.
As always, thanks for reading.
Best,
Julia Angwin
The Markup
(Additional Hello World research by Eve Zelickson.)