JAY: Hi, I'm Jay from
the security team. We have a really
great talk today. Bruce is here to
talk about big data. "Data and Goliath"
is his new book. And I had a hard
time figuring out what to talk about,
how to introduce him. There's the stuff you read. And you're a Harvard
fellow, EFF, 13 books, well known in the
security community. But last night I stayed
up, and I read his book. I was up most of the night,
and then I got up at 5:30 this morning. I was really tired, and with the
time change it was like 4:30. And I wanted to take some notes,
and I had an event this morning that made me appreciate
the book, what was in it. And I got into my car. I started to drive away,
and I realized that I left his book in my house. So I left my car running,
went inside, got the book, left my door open. My car rolled down the
driveway, took out my door, crashed into an oak tree,
pretty much totaled my car. This is at 5:30 this morning. AUDIENCE: Aw. JAY: So 5:45 I'm on
the phone with Geico filing an insurance claim. And it dawned on me, I am now
even more involved in big data. I'm involved with
Geico and big data. And in reading his
book, there was a company that looks at
car websites, right Bruce? Hey, you're looking
for new cars, and when you go in to buy a
new car, you read about this. They already know you're
looking for a new car. They can sort of take
advantage of that. Was it $300 or $400 that they can take advantage of you in that way? And I was like, oh my god. I was looking at new cars
a couple of weeks ago. Is Geico going to
say, hey, I'm not going to deal with your
claim because you just pushed your car down the driveway
because you wanted a new car? Or there's companies that
look at where you drive. They're collecting
your license plates. There's all this big data
that they're generating. And I just realized that a lot
of stuff that was in his book was completely relevant not
only to what I do at work here at Google but in my daily life
of, hey, I just crashed my car. What's going to happen? So just keep that in
mind that I didn't realize that I would have this big data event right before this talk. But Bruce, why don't you come up
here and talk about your book? It actually is a
really good read, and there's a lot to go over. So thanks for coming out. [APPLAUSE] BRUCE SCHNEIER: Hey, thank you. Maybe we should talk about the
value of the parking brake. [LAUGHTER] I want to talk about data. The book I wrote is
"Data and Goliath." I appreciate the
proper pronunciation. I didn't realize this, but there are lots of people out there who will pronounce it "DAH-ta and Goliath" and then not get the joke. "Data and Goliath," I'm
happy with the title. The title was a collaboration
between my editor and me. We were going around
with different titles, and we came up with
"Data and Goliath." I immediately loved it
because it's so evocative, but the problem with "Data
and Goliath" as a title is that Malcolm
Gladwell just came out with a book called
"David and Goliath." And that would be OK, except
that Malcolm Gladwell's previous book was
called "Outliers" and my previous book was
called "Liars and Outliers." It came out after his. And aping him twice
seemed like too much. So I wrote on my blog this
story of the title that's great but cannot be, and I got an
email out of the blue from Malcolm Gladwell, who
I don't know, saying, you should use the title. And I said, thanks. Will you blurb the book? And he said, sure. And I said, you
know the publisher will put it on the front cover
in a font bigger than my name. So it is on the front cover. It's actually not in a
font bigger than my name, but I appreciate the permission
to use the title and the kind words. What I'm writing about is data. I'm really writing about data
in society and how it's used. You all know that all
computers produce data about things that are happening,
about what's going on, transaction records. So I'm writing about how
these transaction records are generated, how they
are increasingly stored and increasingly
used, saved, bought, and sold by different companies. I look at Google. I look at license plate capture. I look at different systems
that follow you around on the internet, cameras,
all of these technologies of collection, which I
think have been well talked about in the media. What's talked about less
are systems of analysis. The media likes to
focus on collecting this, collecting that,
and spends less time on analysis. And I think one of the
common misconceptions I find talking to
people is they have very human views of analysis,
that people are looking at it or people-like entities. So you can hide in
a sea of big data. If you've got thousands
or millions of records, they'll never find you. There isn't this
conception that computers are really good at incredibly
boring, time-consuming, repetitive tasks. And the intuition
of how this data can be used, how things
can be correlated doesn't apply anymore. And I'm reminded, though this is only tangentially related, of the programs that would take shredded paper and reassemble the documents basically by brute force: laying the pieces flat, taking a photograph, and moving the parts around until you get a match. It's basically a puzzle, but it's a puzzle that a human being could never solve.
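A minimal sketch of that brute-force idea, on synthetic data (no real unshredding program is this simple): score every pair of strips by how well their touching edges line up, then try every starting strip and keep the cheapest chain. Exactly the kind of boring, exhaustive comparison computers excel at.

```python
import numpy as np

def edge_cost(a, b):
    # Mismatch between strip a's right edge and strip b's left edge.
    return float(np.abs(a[:, -1] - b[:, 0]).sum())

def chain_from(start, strips):
    # Greedily extend to the right, always picking the best-matching strip.
    remaining = set(range(len(strips))) - {start}
    order, total = [start], 0.0
    while remaining:
        nxt = min(remaining, key=lambda i: edge_cost(strips[order[-1]], strips[i]))
        total += edge_cost(strips[order[-1]], strips[nxt])
        order.append(nxt)
        remaining.remove(nxt)
    return order, total

def reassemble(strips):
    # Brute force: try every strip as the leftmost one, keep the cheapest chain.
    return min((chain_from(s, strips) for s in range(len(strips))),
               key=lambda t: t[1])[0]

rng = np.random.default_rng(1)
page = np.cumsum(rng.normal(size=(80, 40)), axis=1)   # smooth synthetic "page"
strips = [page[:, i:i + 4] for i in range(0, 40, 4)]  # shred into 10 strips
perm = list(rng.permutation(len(strips)))             # shuffle them
recovered = reassemble([strips[i] for i in perm])
print([perm[i] for i in recovered])  # ideally 0..9, the original order
```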
So just like paper shredders are designed for a human adversary, not really a computer adversary, our notions of data and what happens to our data are conceptualized mostly based on human adversaries. So I talk a lot about this. I talk about the data
being surveillance data. And I think this is a problem. Another problem in the
popular conception is the notion that
it's only metadata. We heard President Obama say
that a year and change ago. And metadata is fundamentally
surveillance data. To me the way to think of it is
just do a thought experiment. Imagine I hired a private
detective to spy on that guy. The detective would put a bug in his home, his office, his car. And I'd get a report of
the conversations he had. That's the data. That's the kind of
thing President Obama says is not being captured
for all Americans. Now imagine I take the same detective and say, put this guy
under surveillance. I would get a different report. I get a report of where he went,
who he spoke to, what he did. That's all the metadata.
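As a concrete illustration, with entirely made-up records, here is the difference between the two reports for a single phone call:

```python
# Illustrative only: content vs. metadata for one hypothetical call.
content_record = {
    "transcript": "hi, it's me. did you talk to the lawyer yet? ..."
}
metadata_record = {
    "caller": "+1-617-555-0100",        # who
    "callee": "+1-202-555-0199",        # talked to whom
    "start": "2015-03-05T23:41:00",     # when
    "duration_s": 1240,                 # for how long
    "cell_tower": "bos-cambridge-017",  # roughly where
}
# No transcript needed: the pattern of calls (a late-night call to a
# lawyer, say) is itself revealing, and it's trivially machine-searchable.
```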
The metadata is the surveillance data, and it's surprisingly intimate. Metadata speaks to
our relationships, our associations, what
we're interested in, what's important to us. It's really who we are. And it's much easier to
store, search, and analyze. An obvious example of metadata is the location data produced by our cell phones. We carry them
around all the time. Not out of malice. It's how the phones operate. They can't deliver phone calls
unless they know where you are. But a pretty accurate picture
of where you are reveals a lot. To an entity trying to control us, whoever that might be, the fact that we are all in the same room together matters much more than what I happen to be saying from the podium. So we are in the golden
age of surveillance. And there's a couple of
characteristics of it. It's incidental. It's generally a side-effect
of the things we want to do. We don't pick up our phone
in the morning and say, I'm going to put my surveillance
device in my pocket today. But it has to be that. Otherwise it can't
operate as a cellphone. It's covert. We don't see it. If there were 50 people
standing over your shoulder as you surfed the web,
you would notice that. You'd say, hey, get away. I'm doing something. But if it's 50 cookies tracking
you, you don't notice it. It's hard to opt out of. I'm always asked,
what can people do to avoid surveillance? And the advice like
don't carry a cell phone and don't have an email
address is kind of dumb advice. Don't have a credit card. Don't have a Facebook account. These are the things you need
to be a fully functioning member of society. And those aren't things
we can easily do without. And it's also ubiquitous. It's happening everywhere. Simply because more and more
of our life involves computers. Our commerce, our socialization,
our research, our reading are all intermediated by computers,
so the data is collected. And ubiquitous surveillance
is fundamentally different. It's not follow that car. It's follow every car. You can do more things. You can follow people
backwards in time. You can do what the
NSA calls hop searches. Who am I talking to? Who are they talking to? Who are they talking to? When you hear three hops, that's
what they're talking about. You can do what are called "about" searches: don't search this person, but tell me who has spoken these words. Tell me who writes about this topic. Find me somebody who meets these particular surveillance characteristics. Maybe three time-and-location points, which is probably enough to identify a unique person. Or flag something based on interest.
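A minimal sketch of what a three-hop query looks like, assuming a hypothetical contact graph (it's just a breadth-first search):

```python
from collections import deque

def hops(graph, target, max_hops=3):
    """Return everyone within `max_hops` of `target` in a contact graph.
    `graph` maps a person to the set of people they communicated with."""
    seen, frontier = {target}, deque([(target, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth == max_hops:
            continue  # don't expand past the hop limit
        for contact in graph.get(person, ()):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, depth + 1))
    seen.discard(target)
    return seen

# Tiny hypothetical call records; real graphs fan out explosively per hop.
calls = {
    "alice": {"bob"},
    "bob": {"carol", "alice"},
    "carol": {"dave"},
    "dave": {"erin"},
}
print(hops(calls, "alice"))  # {'bob', 'carol', 'dave'}: three hops out
```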
This data is being collected and used primarily by corporations. Surveillance is the business model of the internet. You guys know that. And we build systems
that spy on people in exchange for services. There are a lot of reasons why this is so. When the internet began, there wasn't any way to charge for anything. Then it began to be used for commercial reasons. And people came to expect the
internet to be free just based on its pre-commerce history. So advertising was
the obvious way to make money on the internet,
and personalized advertising was the obvious way to
extend that and make that more profitable. So I think of this as
free and convenient as the drivers of surveillance. And corporations
know a lot about us, and that's sort of
an amazing amount. My cell phone knows where
I live, where I work, when I go to sleep,
when I wake up. We all carry cell phones. My cell phone knows
who I sleep with. I used to say that
Google knows more about me than my wife does. And that's true, but it doesn't even go far enough. I think Google knows more about me than I do, because Google remembers
things that I don't. And I think all of us, when
we're interested in something, we search for it. You guys know what kind of
porn every American likes, and that's sort of creepy. Now, do you remember
last year we saw that Uber post where
Uber was looking at rides. It was looking at
rides to a location at night and then rides from the same location the next morning. So it basically found people
using Uber to go have sex. And they published
stats about this, what cities, what neighborhoods
were the best for this. And they were all aggregates. They all hid the
individual people, but Uber knows those
individual people. Uber could produce the list, if they wanted to, of people who use Uber to have sex. It's probably in the license agreement that they're allowed to. And this is done as a side-effect of using a very useful service.
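A hedged sketch of that kind of query, on made-up ride records. Note that the published aggregates are computed from an individual-level list that necessarily exists as an intermediate step:

```python
from collections import Counter

rides = [  # hypothetical (rider, neighborhood, hour_of_day, direction) records
    ("r1", "soma", 23, "dropoff"), ("r1", "soma", 7, "pickup"),
    ("r2", "mission", 22, "dropoff"), ("r2", "mission", 9, "pickup"),
    ("r3", "soma", 14, "dropoff"),
]

def overnight_stays(rides):
    """Riders dropped off late at night and picked up at the same place
    the next morning."""
    late = {(r, n) for r, n, h, d in rides if d == "dropoff" and h >= 21}
    morning = {(r, n) for r, n, h, d in rides if d == "pickup" and h <= 10}
    return late & morning

stays = overnight_stays(rides)
print(Counter(n for _, n in stays))  # the aggregate that gets published
print(stays)                         # the individual list that exists anyway
```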
Government surveillance largely piggybacks on these capabilities. We've learned a little of that from the Snowden documents. We know it from China. It's not like the NSA woke
up one morning and said, let's spy on everybody. They woke up one
morning and said, look, these companies are
spying on everybody. Let's get ourselves a copy. They do it a lot
of different ways. They do it through legal
compulsion, national security letters. They do it through subversion,
when they hacked your data center links outside the US. They do it all different ways. And this really allows
government to get away with a level of surveillance
we would never allow otherwise. We would never agree
that we would all put our tracking device in
our pockets every morning. But we carry a cell phone. Or if the FBI said, whenever
you make a new friend, you must alert the police. You laugh, but you
all alert Facebook. Or give the police a copy
of all our correspondence. No, we just put it in Gmail. I mean, I don't use Gmail,
but last time I checked, about a third of my
email is stored by Google because everyone else does. And government surveillance
is largely driven by fear. On the good side, you can
say it's fear of terrorists and fear of criminals. On the bad side, you can
say it's fear of dissidents and fear of new ideas and
fear of political organizing, depending on which country
we're talking about. But this is how it happens,
and this is how it extends. So, I spend the first
part of the book on that, on all this complex
discussion of surveillance. Corporate and
government, what I think of as the public-private
surveillance partnership, how things are working together. I use a lot of the
Snowden documents. I use a lot of things that
happen in the commercial world. It's important to understand
that this isn't just the US. The Snowden documents have
given us an extraordinary window into US NSA
surveillance, but there's no reason to believe that
other countries don't do the same thing to the
extent of their ability, certainly China,
Russia, other countries with big military budgets. And a lot of what
we see the NSA doing are straightforward
extensions of hacker tools. To me, the most surprising thing
about the Snowden documents is the lack of surprising
things in the Snowden documents, that the NSA is
not made of magic. You'd think with their
budget there'd be some magic. But it seems to turn
out that it's just putting more formal
process and bigger budget and more people on
the attack tools, and you get
straightforward extensions of commercial and hacker tools. And technology democratizes these capabilities. I make this point
again and again. Today's top secret NSA programs
are tomorrow's PhD theses and the next day's hacker tools. So when you see a really
impressive NSA trick, it's a preview of
what the hackers are going to do two years from now. The latest one was the ability--
we saw this from Kaspersky-- the ability to hide malware in
the boot areas of hard drives, a truly impressive
technique that even if you take your computer,
erase all the memory, reinstall the operating
system, it's still infected. So this, I guess in 2008,
was a secret NSA program. After Kaspersky
talked about it, I started looking at
the academic research, and I found three papers
over the past few years talking about the
same techniques. So do we know that the
government of China doesn't do this? I wouldn't trust
that they didn't. I still believe the best way,
when you come back from a trip to China, to scrub your computer
is to throw it away and get a new one. So when we're looking
at these techniques, we need to keep in mind
that they're not NSA only. I spend the second
part of the book on, it's a section called
What's at Stake where I talk about why this
matters, why privacy matters, why data matters. And I head on attack
the two tropes you'll hear in common
discussion about this. And the first one is
security versus privacy. And the second is, if
you have nothing to hide, you have nothing to fear. Those are the two
main talking points you'll hear out in
the common world. Security versus privacy, I
think that's obviously not true. Whenever someone says security
versus privacy, look at them and say door lock,
burglar alarm, tall fence. There's a lot of security that
has nothing to do with privacy. And also you don't
actually feel secure when your privacy is violated. This notion that they don't
go hand in hand I think just doesn't make any sense, that
privacy is a part of security. To be sure, there are
aspects of security that require a privacy
violation, police investigating crimes. And we have lots
of systems in place to ensure that that
process happens fairly with minimal abuse. And I talk a lot about them. So the nothing to hide
nothing to fear argument, that mischaracterizes privacy
as something to hide. Privacy is much more about
autonomy and power and control. Privacy is my ability to
decide how I present myself to the world. And stripping someone of
that is very dehumanizing. And lots of scientific studies
on surveillance bear that out. And more importantly, privacy
allows society to move forward. Right now in the
United States, we're at the brink of two
amazing social changes. Gay marriage will be legal
in all 50 states soon, and marijuana will be legal
in all 50 states soon, two things which right now feel
inevitable but three years ago would have felt impossible. Now, the process to get from one
to the other requires privacy. In order for pot
to become legal, it has to be that sometime
in the past, someone tried pot and said, you know? That wasn't that bad. And then a couple of
generations occur, and more and more
people think, you know? That's not that bad. But if you could imagine a
world with perfect surveillance, where every gay relationship is stopped and prosecuted, you'd absolutely never get to
a world where a lot of people are saying, why should I care? This is fine. It's a really
interesting process from something being illegal,
to illegal and tolerated, to illegal and really
tolerated, to legal. And that process only works
through imperfect enforcement. So I spent a lot of time
on the value of privacy. I talk about the business
reasons and the fact that we have a lot of
trouble in the United States now that US products and
services are not trusted. I remember sometime last year. It was after the
MUSCULAR revelations, and Google did a bunch of things
to secure their data centers. And it was Eric Schmidt
who said that now he's confident that the NSA
can't penetrate his systems. One, I don't think that's true. But even the best he
actually could say is, the NSA can't
penetrate my systems except for the ways
I don't know about and the ways I've
been legally compelled not to tell you about. We know in the United States
companies have gotten orders to deliberately
break their security and not tell anybody about it. And as long as we're living
in a country where those sorts of secret orders are possible,
we cannot get to a point where we can trust any company's
attestations about its security. This is very bad. This is very dangerous. I'm amazed we are at this point. So I talk about all that. The third part of the book
is entitled How to Fix It, and there I spend
time on solutions. This is very hard. In the first chapter I just
talk about the principles. And one of the principles
I want to mention here is the notion that we have just
one network and one answer. The NSA traditionally
has a dual mission. You look back at their
Cold War origins, they had two different
complementary missions. One was to protect
US communications. The other was to attack
Soviet communications. And those two missions
were able to coexist because they were separate. Think of radios. The US and Soviet Union
had different radios on different frequencies,
different hardware, different systems. And you could attack theirs
while defending ours. If you were eavesdropping on
an undersea cable out of Moscow to Vladivostok, you would
never get conversations from Peoria in it. The physical object allowed
you to separate the defensive and the offensive mission. That doesn't work
on the internet. Now everybody uses TCP/IP and
Cisco routers and iPhones, Chrome browsers. We're all using the same stuff. And it cannot be that you can
defend ours and attack theirs at the same time. You have to make choices. If you find a vulnerability,
you can either use it to defend ours. You can fix it, at the same
time making them more secure. Or we can use it to attack
them, at the same time leaving us less secure. And you have to make that
choice again and again. And I think from
everything we've learned that security is more
important than surveillance. Take StingRays as an example. A StingRay is basically
a fake cell phone tower. That's a product name. It's probably a
series of products from Harris Corporation
sold to the FBI that allow the FBI without
a warrant to figure out who's in a location
and get a bunch of data from their phones. The FBI's been very secretive
about this to the extent that they will
instruct prosecutors to lie about it in court. Even though a lot of
information is public, they're still very,
very secretive. So last year, some website,
I forget which one, started looking around DC
and found these StingRays all over the city run
by who knows who for who knows what reason. So here's our choice. Either we can exploit this
technology, leaving all of us vulnerable to whatever other
country or organization wants to exploit the technology. Or we can fix it, add some
authentication to our air-to-ground, air-to-cell-phone traffic, depriving the FBI of a tool
but making us all secure. I talk about a bunch
of these principles. I talk about things I think
governments should do, things I think
corporations should do, which often are government
[INAUDIBLE] corporations. Things that I
think people should do as individuals,
technologies we can employ. This is very hard. Everyone wants to know what can
they do to avoid surveillance. And a lot of the
answer is not much, because so much of our data is
in the hands of third parties. If Anthem Health
gets hacked, there's nothing I can do about it. I can't even decide whether
or not they get my data. It probably is mandated
by my employer. And again, the opting out tools
are just not viable answers. So we can do things like use
encrypted email or SSL or OTR. But they tend to work
around the edges. They don't affect the
metadata, and they rarely affect third-party data. So we're living in
an interesting world where the solutions are
a combination of law and technology. Even worse, we're
living in a world where law can
undermine technology. Right? Google gets a secret court
order to break their security and fights it in court, and it's two years out, and they lose. And we also live
in a world where two caffeine-fueled
undergrads at Stanford could write an app that
undermines the law, which means that both have
to work in order for us to get security again. And this is a thorny problem. What I want people to do, when
people ask what should I do, the thing I say is, we need to
start observing surveillance and talking about surveillance. Pew Research did an
international survey on the effects of the
Snowden documents. And one of the things
they asked is, have you taken any steps to protect
your privacy since the Snowden revelations? And they produced
numbers by country. I did the calculation by population and percentage and came up with a figure of roughly 700 million people on the planet who have done something to protect their privacy in the wake of the Snowden documents and the NSA's activities.
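The arithmetic behind that figure is simple. A sketch with hypothetical populations and percentages, not Pew's actual numbers:

```python
# Back-of-the-envelope estimate: for each surveyed country, multiply the
# population by the share who said they took steps, then sum. The country
# names and figures below are placeholders.
countries = {
    # country: (population, fraction who said they took steps)
    "A": (320e6, 0.30),
    "B": (1250e6, 0.25),
    "C": (80e6, 0.40),
}

total = sum(pop * frac for pop, frac in countries.values())
print(f"{total / 1e6:.0f} million people")  # an order-of-magnitude estimate
```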
Now, probably most of the stuff people did wasn't effective. Probably some of the people
who said they did something didn't actually do something. But what that is is a measure of people changing their perceptions of data and security based
on these documents. I can't think of
another issue that moved 700 million people on this
planet in the course of a year. That is truly amazing. What Snowden said is he wanted
to start the conversation. I think this proves he did it. And I think we have to continue
the conversation, so observing surveillance and discussing it. And it's not a matter of
all surveillance is bad. I think this is a complex issue. This is an issue of
designing systems to extract group
value from our data while protecting
people individually. And I actually think this
is a fundamental issue of the information age. Our data has enormous
value to us collectively, and our data has enormous
value to us each individually. How do we reconcile this? What law enforcement
will say is, we need to get your
data to prevent crime. NSA will say, we need your
data to prevent terrorism. Behavioral data is
valuable for advertising. Medical data, I think
there's huge value in taking all of
our medical data, putting it in one big database,
and letting researchers at it. Yet it's incredibly personal. Or something as easy
as movement data, I like it when
Google Maps tells me real-time traffic
information based on real-time surveillance. That is a valuable service. How do we extract these valuable
group benefits of our data while protecting us
each individually? And I think this is a core problem of big data, and one we really need to address.
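The talk doesn't name a mechanism, but differential privacy is one well-studied approach to exactly this trade-off: publish group statistics while bounding what can be learned about any individual. A minimal sketch of a noisy count, over made-up records:

```python
import random

def noisy_count(records, predicate, epsilon=0.5):
    """Count matching records, plus Laplace noise calibrated so that any
    one person's presence changes the answer's distribution only slightly."""
    true_count = sum(1 for r in records if predicate(r))
    # Difference of two Exp(epsilon) draws is Laplace with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Hypothetical medical records: researchers get the aggregate, with noise.
patients = [{"condition": "flu"}, {"condition": "flu"}, {"condition": "cold"}]
print(noisy_count(patients, lambda p: p["condition"] == "flu"))
```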
And I've said this before, but I think data is the pollution problem
of the information age. I think it's a reasonably
robust analogy. All processes produce it. It stays around. We're discussing secondary uses,
recycling, storage, disposal. And I really think that in
the same way that we look back at the titans of industry
100 years ago, 150 years ago ignoring pollution as they built
the industrial age, that we're going to be judged
by our grandchildren and great grandchildren
on the decisions we make about data here
in the early decades of the information age. So that's the book I wrote,
and that's why I wrote it. I'm happy to take questions. [APPLAUSE] AUDIENCE: I'd like to
ask you a little bit more about Edward Snowden. Last year Governor Bill
Richardson was where you were, and he said, I don't think
Snowden is a patriot; I think what he did was wrong. Do you think Snowden
is a patriot? Do you think we need to keep
relying on people like Snowden to keep exposing these things? And if so, is that any way
to run a society, to really just keep our fingers crossed that somebody
puts themselves on the line and has to go and
escape to Russia and hope the CIA
doesn't get them? BRUCE SCHNEIER: Well, that
was a pretty extreme story. I think whistleblowers are
extremely valuable in society. And I think they act
somewhat as a random audit, and they do provide
a great service. And yes, it's not something
you want to rely on. But they are a safety valve. And Yochai Benkler has
written a paper on this. I think it's called "Leaky
Leviathan" talking about how good systems, robust
systems are leaky, and the leaks are valuable. I personally think what Snowden
did was moral and sound. But the discussion
of patriot or traitor is really a history discussion. I tend not to like it because it
focuses the story on the person rather than the documents. And I think the real story
is the documents and the NSA, and not the method
by which we learned about the documents and the NSA. AUDIENCE: Thank you. AUDIENCE: So in hindsight when
the NSA proposed the Clipper chip, it was obvious that they
wanted to do surveillance, and collectively we rejected it. And now we have the NSA
hacking all our systems. Do you think some sort of key
escrow or voluntary key escrow perhaps might be the way to go? BRUCE SCHNEIER: I don't. There are a whole
lot of reasons why that was a dumb idea in the
'90s and is an equally dumb idea today. The basic reason is I
can't make it secure. Basically I can't
build any back door into a system that
somehow regulates for the morality of
the person using it. That as soon as I build a method
for access into the system, I have to assume that the bad
guys will use it just as much as the good guys,
or possibly more. And I have a much more secure
system if nobody has access. And that's why key escrow
didn't make sense then, and that's why it
doesn't make sense now. I tend to be OK
with NSA hacking. It's interesting. One of the things we learned
from the NSA documents is that cryptography
works, that cryptography properly implemented gives the NSA trouble, at least at scale. And as I was saying earlier,
the NSA is not made of magic. And they are subject
to the same laws of mathematics and physics and
economics as everybody else is. And what good cryptography
does is leverage the economics. I actually have no
doubt that if the NSA wants to be in your network, they are in your network. Period. Done. If they are not, it's
for one of two reasons. One, it is illegal under their
very aggressive interpretations of the law. And two, you are
not that important in the scheme of
budgetary allocation. Breaking crypto, being able
to read encrypted traffic en masse, is a much
more cost-effective way of getting everybody's data. If the NSA has to target
companies or individuals one by one, it's going to force
them to target just the bad guys. And maybe if we're lucky
the Belgian phone company won't make the cut. AUDIENCE: I had a follow-up
question about ways of securely aggregating data. I know the biggest recent
thing I've heard of is fully homomorphic
encryption, which if it ever becomes feasible,
could in theory aggregate data. Do you know of any other
technical solutions for doing the
hidden aggregation? BRUCE SCHNEIER:
There really aren't. Homomorphic encryption is still theoretical. I'm not convinced it'll ever be practical. I mean, I'd like to be wrong, but it's going to require a lot more advances.
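For intuition about what "computing on encrypted data" means, textbook RSA already has a multiplicative homomorphic property. A toy demonstration with deliberately tiny, insecure parameters, purely to show the idea; real homomorphic schemes are far more involved:

```python
# Textbook RSA with p=61, q=53: never use parameters like these in practice.
n, e, d = 3233, 17, 2753

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 6, 7
product_of_ciphertexts = (enc(a) * enc(b)) % n
# The multiplication happened on ciphertexts only, yet it decrypts to a*b.
assert dec(product_of_ciphertexts) == a * b
print(dec(product_of_ciphertexts))  # 42
```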
Right now, I don't know if homomorphic encryption would change things, because of so much hardware hacking. We have to trust the
they can subvert our security, not that they are trustworthy. And because we are
seeing so much hacking underneath the
software layer, I don't know if homomorphic
software-- how much is that going to help when you
have all this hardware hacking? So we really need to
rethink our trust models. They seem to be failing in
this world of everybody hacking everything. AUDIENCE: Thanks. AUDIENCE: Thank you
for coming here. I know you said everybody
asks you, what can I do? But this is a particularly--
we are part of the surveillance system. And it was described
to me when I first got here that the company only
works because users trust us. If we lose user
trust, then that's an existential threat
to the company. And because of
these revelations, we're kind of seen
as the bad guy. Now, I know you
don't particularly have a stake in the
company, but people here tend to be pretty well
meaning about this. What can we do to lean towards
the good uses of surveillance as opposed to the evil ones? And how can we move the
system towards that? BRUCE SCHNEIER: So, I
have a few suggestions. The first one is
transparency, that the more that your systems
are transparent, the more they're trusted. And I'd like there to be some
way to make the search results, the system that produces search
results, more transparent. I know that's hard. I know there's proprietary
data all through that. But the more
transparency the better. And that's my first
suggestion in all systems. The second is we need to fight
this notion of secret law, that as long as you can
be legally compelled to lie to all of us about
how secure your stuff is, there can be no final trust. Now, this isn't your fault. This is something
you've been thrust into. But you could help
us solve this. So fighting these secret orders
in every way possible I think would be a huge thing. And I think doing that,
doing that in public is another way to
engender trust. Microsoft is getting
huge PR value out of fighting this court
order to turn over data that's in their servers in Ireland. On a purely self-serving
point of view, it's a great
decision of Microsoft to fight, win or lose. The third thing is to think
about encrypting Gmail. I wonder what the
marginal value is of being able to get people's interests out of their Gmail. And my hope, and my thought, is that it's low enough that you can offer encryption more commonly. I mean, yes, if someone's
using Google Now, we need to figure out
how to make that work. Or I guess you can't use Google
Now if your stuff is encrypted, but some way to give
more users easier access to email encryption is
something Google can do that would make an enormous difference. So those are my
three suggestions. There's probably more
if I thought about it. Those are three
that come to mind. AUDIENCE: Thank you. And just as a note, there's
a public project called End to End for doing client-side
email encryption for Gmail. It's not widely rolled out,
but we are working on that.
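For illustration, the client-side model reduces to the pattern below: encrypt before the message ever reaches the provider, so the server stores only ciphertext. Fernet symmetric encryption from the `cryptography` package stands in here for OpenPGP purely for brevity; a system like End to End uses recipients' public keys instead.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in OpenPGP this would be handled by key exchange
cipher = Fernet(key)

draft = b"meet me at the usual place"
stored_on_server = cipher.encrypt(draft)          # all the provider ever sees
read_by_recipient = cipher.decrypt(stored_on_server)
assert read_by_recipient == draft
```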
BRUCE SCHNEIER: Yeah, but I want it almost to be the default. I want it to be not
just rolled out. I want it to be a thing
that an average Gmail user without any
technical knowledge gets. The reason SSL works
to the extent it does is you're not thinking about it. It just works. And what you're doing
on trying to make more SSL everywhere is great. And those are the
things that make a difference, because Google
can move so many users just by changing a default. AUDIENCE: Thank you. AUDIENCE: I get the sense
that in this country people are more worried about
government surveillance, and in Europe people
are more worried about corporate surveillance. What do you think
is the more damaging and risky to our society? BRUCE SCHNEIER: So, I
think your generalization is largely correct. There are certainly exceptions. There are a lot of people
in both places worried
is the two together, that separating
doesn't make sense, that it's the public-private
surveillance partnership that I worry about. It's governments
using corporate data. It's corporations getting
government contracts and lobbying for
government surveillance. Now, these things
working together is really what's
causing the problem, and separating them
doesn't make sense anymore. Because it's all
about power using data against the less powerful. You're going to be my
last two questions. AUDIENCE: I'm wondering
what you think of David Brin's thesis in
"The Transparent Society" where he says that,
essentially, a world of privacy is not something that we
can achieve going forward, that loss of privacy
is inevitable. But the choice we
have is symmetric versus asymmetric
loss of privacy, and that the worst outcome is
the one we have right now where the powerful, the NSA, get to
do their surveillance in secret, and that the way forward is
in the direction that supports democratic
accountability, so that there are real limits on their
ability to do things in secret. BRUCE SCHNEIER: So, I think what
Brin misses in the analysis is how power factors into
it, that you're really talking about the powerful
state and corporations versus the less powerful individuals.
surveillance in this direction doesn't even the score. When a policeman
asks for your ID, asking for the
policeman's ID also doesn't make that
an even exchange. So I'm all in favor
of transparency, and I like the idea of
sousveillance and surveillance from below. But I don't think
that changes things. I disagree with his thesis that
loss of privacy is inevitable. I think that that's not true. That's too fatalistic,
and we haven't lost. And there are ways to get
privacy, maybe not technically but certainly legally,
because that's the kind of society we are. AUDIENCE: Several
years ago you wrote about how you maintain an open Wi-Fi network for guests to your home. In 2015, and particularly
in high-density areas like the Bay Area,
would you still recommend maintaining
an open Wi-Fi hotspot? BRUCE SCHNEIER: So,
I'll tell you what I do. I still have an
open Wi-Fi hotspot in my house in Minneapolis. I recently got an
apartment in Cambridge because I'm spending a
lot of time at Harvard, and there I have a password. So I guess your
notion of high density is what made a difference to me. I still think it's fine
to have open wireless. I think it's easy and polite. But in an apartment
building, I decided that putting a password on
it was the right thing to do. It's an easy to
guess password, so-- [APPLAUSE] [MUSIC PLAYING]