WIRED:
You're in the online privacy business. To start super broadly, how do you define privacy?
Andy Yen:
These days, all Google and Apple and Big Tech talk about is privacy, so the best way to give our definition is by contrast. The way Google defines privacy is, "Nobody can exploit your data, except for us." Our definition is cleaner, simpler, and more authentic: nobody can exploit your data, period.

We literally want to build things that give us access to as little data as possible. End-to-end encryption and zero-access encryption allow that, because fundamentally, we believe the best way to protect user data is to not have it in the first place.
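The zero-access principle can be sketched in a few lines: the client encrypts data before it ever leaves the device, so the server only stores opaque ciphertext it cannot read. This is a hypothetical toy illustration of the idea, not Proton's actual implementation; the hand-rolled SHA-256 keystream below is for demonstration only, where a real system would use a vetted cipher such as AES or OpenPGP.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key+nonce (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def client_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on the client; the server only ever sees the returned blob."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity check
    return nonce + ct + tag

def client_decrypt(key: bytes, blob: bytes) -> bytes:
    """Verify integrity, then recover the plaintext with the client-held key."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("ciphertext tampered with")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The key lives only on the client; the "server" stores only the blob.
key = secrets.token_bytes(32)
blob = client_encrypt(key, b"meeting notes")
assert client_decrypt(key, blob) == b"meeting notes"
```

Because the provider never holds the key, there is nothing for it to hand over, leak, or monetize: "not having the data" falls out of the design rather than the privacy policy.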
WIRED:
If you ask someone, "Would you like more privacy or less?" they always say more. But if you watch how people actually behave, for most people, data privacy is not a very high priority. Why do you think that is?
Andy Yen:
Privacy is inherent to being human. We have curtains on the windows, we have locks on our doors. But we tend to disconnect the digital world from the physical world.

So if you take the analogy of Google, it's someone that's following you around every single day, recording everything that you say and every place you visit. In real life, we would never tolerate that. On the internet, somehow, because it's not visible, we tend to think that it's not there. But the surveillance that you don't notice tends to be far more insidious than the one that you do.
WIRED:
Your company has come out in support of reforms to strengthen antitrust enforcement. But a lot of people argue that privacy and competition are in conflict. Apple will say, "If you force us to allow more competition on the platform that we run, then that will reduce our control over the security and the privacy of the user. So if you make us increase competition, that will bring privacy down." And then you see the flip side of the argument, which is when Apple or Google implements some new privacy feature that may hurt competitors. How do you think about these potential conflicts?
Andy Yen:
What Apple is basically claiming is, you need to let us continue our App Store practices because we're the only company in the world that can get privacy right. It's an attempt to monopolize privacy, which I don't think makes any sense.

If you look at the FTC lawsuit against Facebook, the theory is that privacy and competition are two sides of the same coin. If you're not happy with Facebook's privacy practices, what is the alternative social media that you can go to other than Facebook and Instagram? You don't really have that many options. We need more players out there. If they had to compete, then competition would force privacy to be a selling point.

The same is true for other services that we offer. Today, Google controls the Android operating system, which the majority of people use, and they can preload all their applications as defaults on users' devices. So they have a massive advantage already, because users don't change the defaults. Even though their privacy practices are quite terrible for most people, there's no real pressure to change them, because the alternatives don't really exist. And if they do exist, Google is able to hide them, because it sets the defaults on its devices.

So if you want to fix the privacy issue, the best way to do it is to have more competition, because then there will be user choice, and users tend to choose what is more private, because, as you said, everybody wants more privacy.
WIRED:
Europe's new Digital Markets Act has a controversial section requiring the biggest messaging platforms to let competitors interoperate with them while still preserving end-to-end encryption. But quite a lot of people argue that you can't actually do both of those things. So here's a place where privacy and competition really do seem to conflict technologically.
Andy Yen:
I have to say, this has been around since the early '90s. PGP is basically interoperable encryption, built on the email standard. So it may not be technologically the easiest thing to do, but to say it's technologically impossible is also not correct.
WIRED:
Another EU proposal would require companies to implement methods of detecting child sex abuse material, or CSAM, on their platforms. People are very alarmed about the implications of that for encryption.
Andy Yen:
We're still in the process of analyzing it, so I can't comment specifically on the details of the proposal. But these proposals are not new. In fact, they've been coming up in various forms over the past decade. What is novel and different this time is that these proposals used to be packaged under "terrorism." Now, they're packaged under CSAM.

It was very clever to repackage this idea into a topic that is even more toxic, which makes informed debate difficult. Obviously, CSAM is a horrible problem, something that the world is better without. But a wholesale attack on encryption can have unforeseeable consequences that are not always completely understood or considered by the drafters of these proposals.
WIRED:
Do you think that this is a wholesale attack on encryption, though? The people making these proposals say, "We're not attacking encryption. You just have to figure out a way to monitor for CSAM."
Andy Yen:
Well, with today's technology, there is really no practical way to do that without weakening encryption.
WIRED:
The analogy to terrorism is interesting because, during the Bush-era War on Terror, there was a sense of literally anything being justified in the name of stopping terrorism. The US government was secretly spying on its own citizens. It was really hard to argue that we have to accept that some terrorism is going to happen. It's even harder to say, look, we've got to accept that some amount of child exploitation is going to happen and people are going to use digital tools to spread it. But at some point, I think you do have to defend the principle that we have to tolerate a certain amount of even the very worst things if we want to have meaningful civil liberties.
Andy Yen:
If there was no privacy in the world, that world would be more "secure." But that world does exist; it's called North Korea. And the people that live there probably don't feel very secure. As a democracy, you have to strike the right balance.

The balance is not total mass surveillance of everybody, because we know that there are serious consequences to democracy and freedom as a result of that. It's not easy to find the right balance. But during the Bush years, with terrorism, I think they went to an extreme that really was a backsliding on democracy. And this is something that we need to avoid.
WIRED:
But then on the flip side, you read stories about law enforcement catching really bad criminals, and in a lot of those stories, if that person had used a Tor browser and a ProtonMail account and a VPN, and so on, they might not have been caught. Do you ever worry about a future where all the bad guys get smart enough to use the best privacy tools, and it becomes too easy to evade the legal system entirely?
Andy Yen:
Well, encryption and privacy technologies are what I would call dual-use. What is law enforcement also concerned about these days? People's information being stolen, sensitive communications being hacked, emails of political campaigns being stolen by state actors and disseminated to shift the political balance. In order to prevent all of those potential ills, you need privacy, encryption, and good security.

So the same tools that people in law enforcement criticize are the very things shielding a lot of the internet ecosystem and the economy from a disastrous outcome. If you were to weaken or prohibit all of these security and privacy tools, then you would open the floodgates to a massive amount of cybercrime and data breaches.
WIRED:
Proton has grown a lot over the years, but it's still basically a rounding error compared to something like Google. We've talked about competition from a regulatory perspective, but on a practical level, how do you even try to compete with your massive rivals?
Andy Yen:
The current plan is the launch of the Proton ecosystem. It's one account that gives you access to four privacy services:

- Proton Mail
- Proton Calendar
- Proton Drive
- Proton VPN

One subscription gives you access to all those services. It's the first time anybody has taken a series of privacy services and combined them to form a consolidated ecosystem. That doesn't match all of Big Tech's offerings, of course. But I think it provides, for the first time, a viable alternative that lets people say, "If I really want to get off of Google, I can now do it, because I have enough components to live a lot of my daily life."

For the first time, you'll have a privacy option that's not fully competitive with Google, but reasonably competitive, and that will start to break the dam. I don't know how it will go, but I think this is the future of privacy, and that's why we're doing it.
WIRED:
This is probably the first time I have ever thought about having an encrypted calendar.
Andy Yen:
A calendar is essentially a record of your life: everybody you've met, everywhere you've been, everything that you have done. It's extremely sensitive. So you don't intuitively think about protecting that, but actually, it's essential.
WIRED:
And by making that encrypted, who am I protecting that information from?
Andy Yen:
Maybe it's the government requesting information on you. Maybe it's a data leak. Maybe your cloud provider changes its business model at some point in the future and decides to monetize user data in a different way. Your data is just one acquisition away from going across the border to a country that you didn't expect when you signed up for the service.
WIRED:
Right. Elon Musk is about to own all my Twitter DMs.
Andy Yen:
Exactly right. And with end-to-end encryption, no matter what happens, it's your data; you control it. It's just a mathematical guarantee.
WIRED:
But what if I move all my stuff to the Proton ecosystem, and then like four years from now, you go out of business? What happens to my stuff?
Andy Yen:
Proton has been around for eight years now. In the tech space, that's a long time. I think an indicator of what is sustainable in the long term is alignment between the business and the customers. Our business model is simple: premium users pay us to keep their data private, and our only incentive is to keep it private. Sometimes the easiest and simplest models are the ones that are the most durable. I strongly believe that Proton will be a company that outlives us.