Meredith Whittaker: Signal’s job is ‘to preserve private communications’

Our digital activity, including our most private interactions and conversations, is constantly monitored and tracked. For Meredith Whittaker, president of Signal, this is a break from how humans have communicated throughout history, and one that is neither needed nor wanted. Recorded live at the 2024 Masters of Scale Summit in San Francisco, Whittaker tackles digital surveillance, trust, and privacy, and what a Bic pen has to do with it all.

This is an abridged transcript of an interview from Rapid Response, hosted by Robert Safian, former editor-in-chief of Fast Company. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

One of the biggest questions facing our global society, and a key factor for tech platforms, is the interplay between trust, privacy, and surveillance, with governments and companies tracking our digital activity. Can you explain what Signal is, its purpose, and philosophy?

For hundreds of thousands of years of human history, the norm for communicating with each other, with the people we loved, with the people we dealt with, with our world, was privacy. You know, we walk down the street, we’re having a conversation. We don’t assume that’s going into some database owned by a company in Mountain View.

We assume that it’s ephemeral, and then, you know, if I change my mind, maybe you don’t have a record of it that can be brought up in 12 years, right? Or 14 years. Now, the internet has obviously put networked computation at the center of human communications. And it is now the norm that almost everything we do, where we are, who we talk to, what we buy is not private.

Signal is there to preserve that norm of private, intimate communications against a trend that has really crept up in the last 20 or 30 years without, I believe, clear social consent: that a handful of private companies somehow have access to more intimate data and dossiers about all of us than has ever existed in human history.

So I think of us as maintaining a status quo that we really should have checked in on over the past few decades, not as being a heterodox actor who’s bucking a normative trend.

We all want to avoid surveillance, right? We want to have the freedom to do and be and act the way we want. On the other hand, that same freedom can sometimes support bad actors and dangerous activities. Signal is a great resource for people who are persecuted in many parts of the world. But it was also used by the organizers of January 6th. How do you think about the trade-off between freedom and bad actors?

Well, I mean, let’s break that down. The roads were also used by all of those actors, right? The goods and the bads. 

There’s an analogy I like. Say I’m in law enforcement and there’s a crime that was committed. I go into the house of this purported criminal, and I find a pen, like a Bic pen they used, and I’m like, oh, this is the tool with which they wrote down the scheming plans for the crime.

I go to Bic Incorporated. I knock on the door and I say, excuse me, this pen was used to communicate about a crime. I need you to reverse engineer this pen to tell me everything that was ever written with it. And, like, the CEO would be like, are you out of your mind? That’s not how pens work, right? Go try many of the other surveillance tools in your toolbox, the massive budget that you just got from Eric Adams, whatever it is, do that.

But, like, obviously, we’re not going to put a gyroscope in a pen and make it something you have to charge, where the OS doesn’t update every year and it doesn’t work, because that’s not how pens work. So what are we actually asking here? Are we asking that every single artifact, every single tool we touch, somehow record our presence?

And then who’s watching? Because I just saw a president on stage talking about some really grim futures. So what are we actually talking about? Are we just buying into this “going dark” narrative, because any corner of the world in which we have the ability to truly communicate privately, as we have for hundreds of thousands of years, is somehow unacceptable to a state or to a corporate apparatus that feels that surveillance is now the norm, even though we haven’t had social consent for any of that, in my view?

What you’re saying is that we designed our digital activity in a way where it could be tracked.

If we look at post-World War II investment in computational infrastructure and technology, it was command-and-control infrastructure built to try to win the Cold War, right?

And then in the ’90s, this networked computation infrastructure was privatized. And even though there were a lot of warnings around privacy, and there’s no fault here, there was an understanding back in the ’70s and before that, hey, this stuff isn’t private.

There was an industry that grew up around that to monetize surveillance. The advertising-supported surveillance industry came out of the ’90s. There were two decisions made in the Clinton-era framework. One, that there would be no restrictions on surveillance, so there was no privacy law.

We still don’t have a federal privacy law in the U.S. And two, that advertising would be the economic engine of the tech industry. And why are those two things important together? Advertising requires that you know your customer, and how do you know them? Well, you collect data on them. So there was an impetus for this surveillance baked into the paradigm we’re talking about that’s in no way natural.

This is in no way the way that tech works. Look, Signal’s rebuilding the stack to show you we can do it differently. And by the way, that’s all open source. If you want to use it, we can raise that bar, but we need to change these incentives and we need to change the articles of faith around surveillance and privacy and the way safety gets deployed to kind of demonize private activity while valorizing centralized surveillance in a way that’s often not critical.

How much is Signal as a platform something to sort of raise these issues, versus being the platform for how we actually operate globally?

I think we need to do both. 

In the middle of the night, if something goes off, I have to call someone, right? And I have to make sure they call Amazon so that we figure out why the API just went dark. And then, we’re worried about the people who rely on Signal in some part of the world where we’ve been shut down, right?

In a sense, the theory needs to be drawn from the practice, and the practice needs to be informed by the theory, and that’s praxis. That’s part of what Signal is doing, and showing, as kind of a keystone species in the tech ecosystem: setting that bar for privacy and saying, again, this is not natural.

This is a product of a certain economic system, a certain set of incentives, a certain narrative and historiography. You know, that’s the flow of the world, but we can change it. We can build it differently.

Signal’s a nonprofit, and we’re a nonprofit because we don’t want to be pushed on by those incentives, which so highly prioritize surveillance as the way of garnering revenue. We don’t think that’s safe in this ecosystem.

We’re also trying to change the ecosystem. We don’t want to be a single pine tree in the desert. We want to, to use my friends Maria and Robin’s term, rewild that desert so a lot of pine trees can grow.

