Published on October 9, 2023

Not much of a chance for a livable future without private communication: Signal President

Meredith Whittaker, the President of the Signal Foundation – which runs the eponymous messaging app – said that privacy is about power asymmetries, with governments and corporations trying to leverage intimate information they have about people.

Meredith Whittaker, the President of the Signal Foundation.

End-to-end encryption, the technology that keeps communication over the Internet secure, is under threat – in the UK, the US, and India. Meredith Whittaker, the President of the Signal Foundation – which runs the eponymous messaging app – said that privacy is about power asymmetries, with governments and corporations trying to leverage the intimate information they hold about people. The messaging app is supported by the Signal Foundation, which relies on user donations for funding. In an interview with Soumyarendra Barik, she also spoke about how Signal responds to law enforcement agencies, including in India, the hype around current AI platforms, attempts by big businesses to launch open source AI models, and how Signal could help during Internet shutdowns. Edited excerpts:

We finally got a data protection law some weeks ago, but despite that, the notion of privacy has varying interpretations in our society. Some say it is perhaps an elitist right, in that only well-to-do people care about it. The reality is also that, despite India being the biggest democracy online, there are households where one digital device is shared by adults of different genders and by kids. How do you view India as a market for an app that’s solely focused on offering private communications?

I’ll need to be careful because I don’t have a lot of experience in India and don’t want to speak for a heterogeneous culture. But my reflexive reaction is that, of course, privacy isn’t an elitist value.

We have to ask the question: privacy from whom? Privacy doesn’t mean we’re not sharing with each other; it doesn’t mean we don’t have people we love or have relationships with. Even Signal – the world’s most widely used truly private messaging app – is still a messaging app: it’s about sharing information with the people you want to know you, about being safe to expose yourself to people in ways you might not want to be perceived by others.

Privacy is fundamentally about power asymmetries – about those who have power over you because they hold information about you that they’re able to leverage. It’s your boss surveilling you through a gig-work app, able to see where you are and how many miles you’ve driven, and then calibrating the algorithm to make sure you don’t get that bonus you might want – able to have full control of, and full insight into, your operation and your labour process, while you have no insight into theirs.

We need to actually understand the values that privacy is protecting – it’s the values of intimacy, of love, and of dissent. It’s about being able to express ourselves honestly to each other, and we can’t have a democracy without those.

WhatsApp and Facebook have challenged the traceability requirement here, saying it threatens encryption. How has your relationship with the Indian government been? Have you received requests for user data from them?

Let’s step back a bit and establish the vast differences between a Meta and a Signal. We’re not Meta. We don’t have billions of dollars and the huge resources to keep a policy team in every capital city in the world or to finance things like lawsuits. We are a non-profit with a little over 40 people – very small given how geopolitically significant our infrastructure is.

We don’t have direct lines of communication with governments, in part because there’s very little we can provide to governments… And it keeps those conversations, if they happen at all, very short, because our answer is clear: it’s not that we’re refusing to give data to them. We very literally cannot, because we don’t have it.

WhatsApp keeps a lot of metadata. Signal encrypts all of it. We publish all the requests that we have been forced to comply with, which is not all the requests we get.

When we get a request, we process it, and for any request that we are forced to comply with, we work to unseal it and then post it on Signal.org/BigBrother. I would need to go back to that site to confirm my recollection, but I don’t believe there’s anything from the Indian government there. That doesn’t mean we haven’t received a request – I don’t have a log of every single request we have received.

WhatsApp is the most popular application in India, and it has also managed to market the fact that it is a secure platform for communications. What would your pitch be for people to join Signal in a similar way?

I want to be careful here because I recognise that if one person shifts to Signal but their family doesn’t, they still can’t really use Signal. It takes a village to switch communications apps.

WhatsApp uses the Signal protocol to encrypt the contents of its messages, which is very important. But as James Comey of the FBI has said, metadata will get you 80 per cent of what you need – and WhatsApp collects a lot of metadata. That includes who you are, who you speak to, your contact list, who’s in the groups that you message, and so on. They’re also owned by Meta, and we need to be very real that Meta is a surveillance company.

It’s better communication hygiene not to be passing all of your intimate thoughts to a US-based company that has been known to turn that data over. They process law enforcement requests all day. There’s also the fact that you simply don’t know what will happen down the road to the data you share today.

Companies that have built the most popular AI models are today going around the world calling for a global regulator for the sector. What do you think is the motivation behind that? Is it actually meaningful, or is it perhaps an attempt to stall regulatory conversations, given that a global framework could take a long time…

Since around 2015, there has been an emergent regulatory interest in AI. There have been many congressional testimonies, conversations with the European Commission, and so on. We have been through this. But what happened recently – with ChatGPT serving, in effect, as an advertisement for Microsoft’s Azure APIs – shifted the narrative and allowed a bunch of these companies to claim that there was a paradigm shift in AI technology… That is being done to effectively erase the last 10 years of policy discussions.

The idea of a global regulator exists in the world of speculation. Forming a global regulator in less than 15 years, especially as the climate starts to deteriorate, is very unlikely. But what does that give them? It gives them a 15-year runway to entrench a business model that will be very difficult to rip out.

Meta is trying to pioneer an open source approach to AI with its Llama models. Do you find it ironic that companies, which have traditionally benefited from large amounts of user data, are now attempting to create an open source model?

We just live in a hall of irony. Meta in particular has stretched the term ‘open’ in the context of AI to the point where even the Open Source Initiative says they are abusing it. Llama is not an open source model. They offer scant documentation, and the licence they created on their own is neither endorsed by the Open Source Initiative nor recognised by the open source community. The lack of a definition has allowed them to just make up their own version of it, to get the halo of open source and its implied virtues.

It is very expensive to train AI models – it costs millions of dollars. It’s also very expensive to run them. One estimate said that running ChatGPT, the advertisement for Microsoft’s API business, costs as much as a training run. This is not just like running some software; the inference required to run these large models is really computationally expensive. So that also limits, in practical terms, who can actually use these models.

When the unrest in Iran happened and Signal was blocked by the government there, you ensured that users in Iran could access the app via proxies and other means. Data shows that India has the most Internet shutdowns in the world. Do you reckon Signal can replicate here in India what it managed in Iran?

Our commitment everywhere, regardless of jurisdiction, is to make sure that anyone, anywhere, can pick up their device and contact anyone else privately and securely. That doesn’t change anywhere. We will never undermine the privacy promises we make to the people who rely on Signal. And we will do everything we can – like we did in Iran, like we would do in the UK, like we would do anywhere – to get the people who need Signal access to Signal.

Soumyarendra Barik is Special Correspondent with The Indian Express and reports on the intersection of technology, policy and society. With over five years of newsroom experience, he has reported on issues of gig workers’ rights, privacy, India’s prevalent digital divide and a range of other policy interventions that impact big tech companies. He once also tailed a food delivery worker for over 12 hours to quantify the amount of money they make, and the pain they go through while doing so. In his free time, he likes to nerd out about watches, Formula 1 and football.
