Debates of the Century @NYU Wagner: National Security (Featuring Edward Snowden and Fareed Zakaria)

Good evening. I'm Sherry Glied, the Dean of NYU Wagner. On behalf of NYU Wagner, it's my distinct honor and pleasure to welcome you to tonight's debate. This is the third debate in an exciting new series we've launched with our partners at The Century Foundation. And we are thrilled to have such a large and eager audience join us. We held our first debate in early March on the topic of free public college, and the second, just a few weeks later on immigration reform. Both debates proved to be informative, engaging, and even fun discussions, and I know tonight's debate will be the same. We at NYU Wagner have a long history of making an impact on public service, both here in New York City and around the globe. For over 75 years, we've been preparing the world's public service leaders to have an effective and lasting impact on the public good. Through a curriculum that blends theory and practice, our students gain skills and experiences that enable them to address significant public problems, and actually get things done. Our alumni all over the world apply their education to make real improvement in people's lives through healthcare, international development, urban planning, non-profit management, finance, and many other fields. Part of making real improvements is understanding all perspectives on an issue. Conversations such as tonight's are vital to encouraging public discourse and developing the best possible policy solutions. These debates are an opportunity for thought leaders, practitioners, students, decision makers and any interested citizen to hear varied perspectives on relevant and timely issues. Tonight's topic on the intersection of security and personal data is relevant and important to all of us as citizens of the world. As has become far too evident with the attacks we witnessed in many places, Paris, Brussels, closer to home in San Bernardino, our safety is not guaranteed. As citizens, we often take for granted that our government will keep us out of harm's way... But at what cost? Technology is ruling our lives, continues to grow, and we increasingly rely on our devices, please turn them off, to communicate, work, and manage our day-to-day routines. That raises the question of, "What access the government should have to our information?" Will giving the government that access make us safer or is the infringement on our privacy too high a price to pay? We are so pleased to welcome tonight's debaters to delve deeply into this important and challenging subject. I want to offer a warm welcome to Fareed, Bart, and Edward, and thank them for being with us tonight whether in person or from thousands of miles away. Fareed, you bring a unique perspective grounded in the history of American foreign policy and I'm eager to hear what you have to say. Bart, you're one of this country's premier and celebrated journalists, and I know you'll do an excellent job tonight keeping Fareed and Edward in line. We're also very excited to have Edward Snowden join us via live video. While you may not be able to feel it personally, Edward, the energy in this room tonight is palpable. I understand that while you participated in many interviews and conversations, this is the first debate of this nature that you've taken part in, and we're thrilled to have you. Edward, Fareed, and Bart will all receive proper introductions shortly. But on behalf of NYU Wagner and the Century Foundation, we thank you all for being a part of this debate, and sharing your insights. 
We're also live streaming tonight's debate, so I want to welcome those of you joining online. We hope you can join us virtually. I know one person who is most certainly watching, Century Foundation president, Mark Zuckerman, wherever you are. He's been vital to the creation and success of this series. He's been a wonderful partner and colleague. He couldn't be here with us tonight as he is overseas in Paris. But his team assures me that he's taking time from his vacation to participate via the live stream. Hi. I'm sure you've noticed the cameras from CNN in the audience. We're happy to say that this conversation will continue after tonight's event, when Fareed, airs a segment on his show, "Fareed Zakaria GPS." Be sure to tune in. When we first began discussing this debate series with The Century Foundation, we all agreed that a vital part of the program would be hearing from you, the audience. There are several ways you can take part in tonight's conversation. First, we encourage you to use the hashtag, DOTC 2016, which you will see in your programs and on the screens to tweet about tonight's discussion. You'll also have a chance to ask the debaters questions. You could do this by tweeting questions during the debate using the DOTC 2016 hashtag. You could also ask questions through the poll we'll be conducting. Before we start, we want to take your pulse about tonight's topic. We're gonna conduct a poll now, and then again after the debate. Instructions are in your program and on the screens beside me, but I'm gonna walk you through it. There are two ways to participate in this poll. First, from your phone's browser or from your computer for those of you at home, go to the website, pollev.com/dotc2016, and follow the prompts. There's no need to register or log in. The second way is via text. Simply text dotc2016 to 22333. 22333 to join the session. I'll give you all a second to text or join through your browser. Okay, ready? Let's poll. Tonight's resolution is, "Government should have lawful access to any encrypted message or device." Do you agree with this statement? How are we doing? Okay, now see the results? You can see them, and they'll be changing in real time. We'll do another poll after the debate to see if our experts were able to sway any opinions. It's now my pleasure to turn the stage over to the Chairman of the Board of the Century Foundation, Bradley Abelow.[applause] I'm pleased to welcome everyone to tonight's event on behalf of the Century Foundation and NYU Wagner. The Century Foundation is incredibly proud to be able to co-host tonight's debate. I'd like to extend a big thank you to Century Foundation President Mark Zuckerman and NYU Dean Sherry Glied for their work in creating Debates of the Century events series. Thank you also to the teams of the Century Foundation and NYU Wagner for the time they've put in to organize tonight's program. As one of the oldest public policy institutes in the country, the Century Foundation has dedicated itself to evidence-based research and analysis with the aim of informing citizens and policy-makers. Tonight's debate on encryption and national security is a topic area that we research in order to encourage democracy and ensure civil liberties are protected in today's technological age. I would also like to recognize some of the Century Foundation's Trustees who are in the audience with us tonight including Trustee Emeritus Richard Ravitch, Trustee John Alter, and our newest Trustee, Anne Milgram. 
It is my pleasure to now introduce our moderator for tonight's debate, Barton Gellman. Bart is a much-honored author, journalist and blogger. Mr. Gellman is a Senior Fellow at the Century Foundation where he researches surveillance and privacy. In 2013, while working at the Washington Post, Bart was one of three journalists worldwide to receive leaked documents revealing previously undisclosed surveillance programs from former NSA contractor Edward Snowden. After 21 years at the Washington Post writing about many of the most important events of our times, Gellman joined Time Magazine in 2010 as contributing editor. He's also been a lecturer and author-in-residence at Princeton. His professional distinctions include three Pulitzer Prizes, two George Polk Awards, Harvard's Goldsmith Prize for Investigative Reporting, and the New York Times' Best Book of 2008 award for his bestseller Angler: The Cheney Vice Presidency. Please join me in welcoming Bart to the stage. [applause] Right to it if I can, I'm here to introduce our speakers. You all see the motion, the proposition before us. Fareed Zakaria will argue in the affirmative. Since 2008, he has hosted CNN's Peabody Award-winning program on international affairs, Fareed Zakaria GPS. His notable interviews have included Barack Obama, Narendra Modi, King Abdullah II, Muammar Gaddafi, and David Cameron. I would like you to imagine that green room for just a moment. Before his turn to cable news, Fareed was managing editor of Foreign Affairs, editor of Newsweek's International Editions and a contributing editor for Time. Alongside his CNN show now, he writes columns for the Washington Post and The Atlantic. His most recent book is In Defense of a Liberal Education. And finally, based on my personal observations, Fareed is also the king of Davos, which is not a reference to Game of Thrones, but it should be. [laughter] Ed Snowden is here to oppose the motion. For those of you with earpieces and sleeve microphones here, I mean Moscow, at 4,680 miles or so away. By the miracles of Google Hangouts and the speed of light, not necessarily in that order, he should bridge the distance to the stage in a fraction of a second. I'm told the lag is a little bit longer today. Ed is a former CIA technical officer, DIA cybersecurity trainer and NSA contractor assigned to the regional operations center in Hawaii. We all know about the disclosures he made about the NSA in 2013. Since then, he has attracted a global following which ranges from a US government grand jury... [laughter] to a shelf full of honors from NGOs, international organizations and public service foundations. Ed is not here as that NSA guy. He comes to us here for his expertise in device and communications security. Finally, in 2014, Ed joined the Board of the Freedom of the Press Foundation, where he works principally on the technology of human rights. Fareed, will you please join us here onstage? [applause] And fingers crossed that Ed's link is working and we can put him up on the screen now. Let's wait for it. So we're about to begin. I need to explain the rules and the format briefly. Our topic, as you know, is government access to encrypted messages and devices. We are not here to debate surveillance, or the NSA, or the political climate in Moscow, or me, for that matter. Both speakers have agreed to those boundaries. Each speaker begins with a five-minute opening statement. Fareed will go first, then Ed. Next, Fareed will offer a three-minute rebuttal and Ed will follow suit.
After that comes about half an hour of free-form debate where I will toss in provocations as necessary. [chuckle] Each speaker will also have a chance to ask one question of the other. Sometime around 7:30, I will open the floor, and there are two ways to ask a question, similar to the poll business: you may either text your question to 22333 or tweet it with the hashtag DOTC, for Debates Of The Century, DOTC 2016. I expect a lot of questions, and so I'll ask Ed and Fareed to respond succinctly at that point. To conclude the debate, I'll invite the speakers to make three-minute closing statements; Fareed will close first and then Ed. Finally, we'll take another live poll. Fareed, I invite you to begin. [FZ:] Thank you so much Bart, thank you all. It's a pleasure to be here. I have to confess, looking at that initial poll, I feel like I have been the underdog before in my life but never quite to this capacity. I thought these would be retirees but it turns out to be students. [laughter] Anyway. Imagine tomorrow, the Bank of America announced that it had a new product, let's call it an iVault, and Bank of America said this is a vault, a virtual vault in which you can put all your bank information, any financial information you have, any other kind of information you want. Remember, all information is now digital, so it could be your tax receipts, it could be your will, it could be receipts for travel, it could be whatever it is you want to keep secure and safe. Now, imagine that there was a guy, let's call him Bernie Madoff, who embezzled billions of dollars from poor workers' pension funds and it turned out he had one of these iVaults, and the government is trying to figure out exactly the extent and the scale of the crime. They need evidence for it, they need to find out what else he might have embezzled, and they go to a court and ask for a search warrant. The court provides it, but Bank of America says, "No! This is encrypted digital information. In fact, our whole sales pitch to our customers says this is encrypted, so you can't have access to it." How would you get around that problem? Because after all, if Apple says you cannot have access to the information in an iPhone because it is encrypted, why does Bank of America not have the same right? Why does any institution frankly, any company in the United States, not have the right to encrypt the information it has, this is relatively routine software at this point, and then argue that it has in a sense created a zone of immunity in which no laws can reach, no courts can reach, no government can reach? That's really... It seems to me the heart of the question here. I could bring up the issue of terrorism and scare you all with the ticking time bomb, and I might do that a little later... [laughter] But right now, I'm gonna argue that the case is very simple, which is: ours is a society of laws. Is there some process of law by which a government, the democratically elected government with independent courts, has the authority to access information? Now I know what you're thinking, you don't want people to see what's on your iPhone, neither do I, but I understand that within a democracy, if you have rules and a lot of laws, you have to sacrifice liberty for security at some point. This is not an absolute disposition, I believe in strong protections for those liberties. I do not want the government abusing its authority, I believe it has, but you cannot have an absolute zone of privacy. First of all remember, you actually don't have an absolute zone of privacy.
Your emails? Your employer has access to all your emails. You know all the websites you look at? Your employer has access to all your websites, as do all the technology companies that you use when you look around the web. Facebook says that it can provide you with targeted ads appropriate for you with 90% accuracy; how does it do that? All these companies are collecting the data based on what you were doing in your digital life. They have it. I understand it's not the same as the government, but there isn't such a zone of privacy quite as you imagine. Now, you're gonna hear a lot, or you probably have already heard a lot, about the dangers, technologically, that this now produces: that it might mean a master key that unlocks all information everywhere, that endangers all kinds of encryption everywhere. I'm not a technology guy, but I thought it'd be worth listening to what a technology guy has to say about this, somebody who ran the largest technology company specializing in software for two decades, Bill Gates. So here's what Bill Gates says about Apple's request... The Federal Government's request of Apple that it unlock an iPhone. Bill Gates: "Apple has access to this information, they're just refusing to provide the access, and the courts will tell them whether to provide the access or not. You shouldn't call the access some special thing. It's no different than asking the phone company to get information or bank records. There is no difference between this information. The government comes asking for a specific set of information and the bank can say it's tied a ribbon around the disk drive and says, 'Don't make me cut this ribbon because if I cut it this one time, I'll have to cut it many times.'" As I say, I worry a great deal about what the government might do with all this information, which is why I believe you need laws that clearly demarcate when the government may have access to information, when it may not, what it can do with that information. But you cannot have liberty in the absence of law; that is the rule of the jungle. That's welcome to Haiti, welcome to Somalia. If you want to live in a democratic society that has rules and laws, the authorities have to have some recourse to lawful court orders. Look, I love this phone. It is the coolest thing that I have, but there's something even cooler: the United States Constitution. And it has to be possible for a government of laws to operate in a way that legal authority has the ability to access this kind of information. Look. In 1974, we had a test of this basic idea. The president of the United States argued that a court could not force him to provide information. At the time, the information was, in his view, "encrypted" information: communications between the president and his top advisers. And his argument was, executive privilege meant the president did not have to provide the court, even with a court order, this kind of information. [BG:] Sorry. Can you... [FZ:] I will wrap up. Thank you. He was arguing in a sense the information was encrypted and if he revealed it, it would set a dangerous precedent for the future. In a unanimous ruling in United States versus Nixon, the Supreme Court ruled, "No person, not even the president of the United States is completely above the law. And the president cannot use executive privilege as an excuse to withhold evidence that is demonstrably relevant in a criminal trial." That is the issue. No one in America can withhold evidence that is relevant to a court.
Not the president, not the world's most powerful company, not any individual, not even the most shining and alluring product, not even the iPhone, is above the law. [BG:] Thank you, Fareed. [applause] Ed? [ES:] Okay. I'll do my best. I am, like Bill Gates, a technologist. I am not... I will point out this little bit of a technical problem for the people working with the audio: I'm hearing a little bit of echo on myself. But if I could continue... Well, let me just get right into it. Let's start with what tonight is not about. Fundamentally, tonight is not about politics, nor is it really about the law. It's about science, and for that reason, it doesn't really matter whether you're for or against surveillance, because by the end of this debate, we'll have established that the proposition is not really a choice between privacy and security. It's rather about more security or less security. Here's the problem. We're in the midst of the greatest crisis in computer security in history. One of my greatest critics personally, Director of National Intelligence, General James Clapper, said just months ago, "A lot of people find this surprising in our post-9/11 world, but computer security bumped terrorism out of the top spot on our list of national security threats." Now, let me underline that. Our intelligence agencies say computer security is a bigger problem than terrorism, than crime, than anything else. The backbone of computer security today is encryption. Encryption is the thing that keeps your money in your bank account rather than in a criminal's. It is the thing that keeps our dams closed and our roads open. Encryption is the only thing that determines whether the medical devices in our hospitals or the ones in your body deliver a therapeutic dose or a fatal one. Encryption saves lives, encryption protects property. Without it, our economy stops, our government stops, everything stops. Now, the hope is that somebody could perhaps find a way to make encryption work only for the good guys. But encryption is a field of mathematics, and no matter how much we might hope otherwise, math is math. It works the same for Mother Teresa as it does for Osama Bin Laden. And the scientific consensus on this next point is absolute: lawful access to any device or communication cannot be provided to anybody without fatally compromising the security of everybody. And that's not my opinion either; that's the formal conclusion from a gathering of the world's top computer scientists and security experts at MIT to study precisely this issue. And these weren't just a couple of grad students either, but names like Whitfield Diffie, who literally invented the principles of modern cryptography. Now, the fundamental problem of the science in this space is that for the government to unlock everything, there has to be a key to everything. Now, we can pass a law to require a key under every doormat in order to make things easier for police, but the problem is that every other person in the world can find that key, too. And they can use it. You might be saying, "Oh well. That's all well and good. But what about national security?" This is a legitimate interest. Now, this debate is unusual on that point. Recently, when all of the top intelligence officials who are free to talk about these things join in any conversation with which I'm attached, they're always on the other side. If I was for kittens, they would be completely against them. But tonight, they're actually on my side.
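[A side note for readers following the technical thread: the "key under every doormat" idea Snowden describes can be made concrete in a few lines of Python. This is a minimal, hypothetical sketch using the third-party cryptography package, not any vendor's actual design; the names and the escrow scheme are assumptions for illustration only.]

    # Minimal key-escrow sketch (hypothetical names), assuming the third-party
    # 'cryptography' package is installed: pip install cryptography
    from cryptography.fernet import Fernet

    escrow_master_key = Fernet.generate_key()      # the one "golden key"
    escrow = Fernet(escrow_master_key)

    def provision_device(secret: bytes):
        """Each device encrypts with its own key; a copy of that key is
        wrapped under the escrow key, the 'key under the doormat'."""
        device_key = Fernet.generate_key()
        ciphertext = Fernet(device_key).encrypt(secret)
        wrapped_key = escrow.encrypt(device_key)
        return ciphertext, wrapped_key

    devices = [provision_device(b"alice's messages"),
               provision_device(b"bob's bank records")]

    # A lawful investigator and a thief who steals the escrow key do the same math:
    for ciphertext, wrapped_key in devices:
        device_key = escrow.decrypt(wrapped_key)   # one key opens every doormat
        print(Fernet(device_key).decrypt(ciphertext))

[The escrow key is mathematically indifferent to who holds it, which is the point of the MIT report as Snowden summarizes it.]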
The former director of National Intelligence, two directors of the CIA, the director of the National Security Agency, the nation's former top counter-terrorism official have all said that despite their sympathy for the FBI, our nation's computer security is simply more important than yet another surveillance tool. In fact, that former NSA director that I just cited, Michael Hayden, said this: "The FBI director, Jim Comey, is wrong. America is simply more secure, America is safer, with unbreakable end-to-end encryption." And I look forward to exploring the details of all of this tonight with you and Mr. Zakaria, but I can promise you, ladies and gentlemen, one thing: If I am standing shoulder to shoulder with a director of the National Security Agency on something, there's a damn good reason for that. Thank you very much. [applause] [BG:] Thank you, Ed. Our format now gives Fareed three minutes to rebut. [FZ:] So, part of the issue here, a very substantial part of the issue, is whether it is possible to provide the... Let me put it differently. A large part of the issue at stake here is whether it is possible to open a single device without compromising all encryption of that device, of other devices. And as I said, I'm not a technologist, but I quoted to you from possibly the world's most famous technologist, who says it is possible. The evidence for this is very plain. Apple had done it 70 times. It only stopped... It had opened 70 iPhones without, in its view, compromising the end-to-end encrypted security it provided. It only decided to file suit when the FBI, perhaps foolishly, decided to disclose that Apple was cooperating in this regard. Apple had been selling its product in a way that suggested it had this bulletproof end-to-end encryption. This was compromised by the FBI's announcement, and so, it decided it needed to fight this or contest it. Not just Apple, but technology companies throughout the country have been providing this kind of single-key unlocking routinely. Gates' statement again points out that this has happened with phone companies, this has happened with banks. If it isn't the case, as I say, we have to explain how we would create a system of laws in the United States where banks would be required to provide information. Otherwise, every bank in America would become like a Swiss bank from 30 or 40 years ago, where everyone was able to keep secret accounts, secret transactions, secret information. We would look like the Republic of Panama, with large illegal transactions, furtive bank accounts, and laundered money. If we don't do that, we have to have some system in place that allows for a process by which you determine when it is lawful for the government to access information, when it is not. If you move to a system where you say it is an absolute disposition and the government will never have lawful entry, you are simply encouraging the government itself to become part of the lawbreaking scheme and use the kind of methods that I'm sure Ed Snowden and I would agree are not ones that the government should be engaged in. My final point would be about that panoply of extraordinary officials, former officials, that he pointed out surprisingly agree with him. It is really very stunning, and what's particularly stunning is that all these people, only two years ago, or four years ago, or six years ago, when in government, had entirely diametrically opposed positions on this issue. Remember, this is not an issue that came up yesterday.
They all had public positions that said essentially the opposite of what they're saying now. I will point out that almost all of them now work in consulting firms that have, as clients, all the major technology companies that have a huge vested interest in arguing the other way. So to answer your puzzle, Ed Snowden, the reason that Michael Hayden and others agree with you might not be so complicated.[laughter] BG: Thank you. Ed, you've got your own three minutes now. ES: I do appreciate that suggestion. However, I would point out one important distinction. Not all officials leave government under the same circumstances, for the same reason, or change their ideas for the same reason. I myself once worked for the National Security Agency, I myself once worked for the government, and now I work for the public. I do have a different position, but that doesn't mean I'm motivated by anything other than the same allegiance to the same society. The difference is that I don't have a boss telling me what to say. Now, there's one point that you can't really dodge on here, which is that saying that the government should have lawful access to any device, whether it's a single device, whether it's all devices, is no different than saying that we should weaken the digital security of every device and product and service that is produced in the United States. That's the science of it. That's simply how it works. That's the choice we're faced with today. Ignoring that, otherwise, we might as well be discussing well, what if the earth is flat? Right? But you do raise an important point there, a key distinction which is that there are some types of data that, despite encryption, despite the fact that it is out there are still available, that can still be shared with the government and in fact, are. Now, this is the magic of communications. This is the magic of how even encrypted communications can be followed and tracked and in fact, I did this personally at the NSA. My target was not terrorists, which are actually seen as the easy target at the NSA. They were Chinese hackers working for the Chinese government. These were extraordinarily sophisticated individuals who used encryption for everything they did. And yet, we followed them back to their homes, their bases of operation to their home computers, we broke in, we stole everything, and we got around encryption, seven days out of the week, twice on Sundays. This is how the world works. Unlike the example that you said in your introduction, which is that, there's an iVault that can't be broken into. It can't be broken into when it's perfectly sealed, when no one sees it. But it is then a brick. You can never go in and enjoy the fruits of what you've stolen. It's locked away from yourself, just as it is for your adversary. If you ever try to enter it, if you ever try to use it, if you ever try to go in there, we can follow you, and we will. We will steal the key and then, we will own your vault. And this is the way that law enforcement has, will, and will continue to deal with the problem of encryption. Thank you [BG:] Thank you both. [applause] Ed, FBI Director Jim Comey has said that he fears a moment when a child is kidnapped, and there's evidence on a phone. And parents with tears in their eyes look at him and say, "What do you mean you can't get at it?" What would you tell those parents? [ES:] Well, the idea here is that there can always be theoretical cases. There can always be a worst case where something could go wrong. 
But fortunately, we actually have a track record on this. We have evidence. We don't have to rely on hypotheticals, we don't have to rely on potentialities, we can look at the facts. And the fact is, encryption has been around for a long time. We had this debate previously, in the 1990s; people said the sky was gonna fall, the atmosphere was gonna boil off, the world was over. And yet despite that, law enforcement is sitting in a better position today than they were before. Now, that's not to say that this couldn't happen. That's not to say that some crime could not potentially occur in which encryption is the factor. But we should focus on the world as it is, not the world as it necessarily could be. And the FBI has in no case put forth a real smoking gun where, but for the presence of encryption, criminals went free. However, there are many, many, many cases, and we have statistics that I can talk about all night if you would like, that show encryption is actually quite a rare factor in investigations, and even where it is, it doesn't stop investigations, and ultimately, it doesn't stop prosecutions. [BG:] We have a question now for Fareed with a little preamble. The language of this debate can be very controversial, and there are various parties who don't like the terms "backdoor," "side door," "golden key," and so on. And as the neutral moderator, I felt obliged to make up my own term for it. So my term is "fenestra," which is a noun from the Latin, which means any government-mandated bypass of encryption used to secure communications or device usage. Apple refuses adamantly to provide a fenestra to the FBI. So, there is now... This is the part of the argument that Snowden has been making, a consensus that is akin to the consensus on climate science, that it is not possible for cryptographers to build a fenestra without weakening encryption against potential outsider adversaries. Now, do you doubt the science of that, or are you prepared to accept the costs to security of that weakness, or do you have a third choice? [FZ:] Well, I think firstly, if I could just speak to the last point: in that same testimony, the FBI Director did point out that there were several cases where they genuinely were... These were not hypothetical cases. There was the case of a woman, eight months pregnant, who was murdered; they had no clue as to why it happened. The only thing they had was a phone. They wanted to open the phone, they couldn't find a way. There are many such cases, the New York prosecutors have many others, so this is not entirely a hypothetical situation. On your point, look, as I said, what I can do when I talk about these issues... I'm not a technologist myself, and I would respectfully say that neither Mr. Snowden nor I are technologists on the scale of Bill Gates. He has pointed out that he... has said pretty clearly what the situation is, right? And as I pointed out again, the history of it is that these devices have been unlocked repeatedly, routinely. There are a number of arguments, which strike me as highly self-serving, now made by technology companies saying that they cannot do it without compromising the security. The way I would put it is, as far as I can see as a consumer of this stuff, any time you write code, you know how to unwrite it. It is not so complicated. When you encrypt something in software, there are obviously, logically, ways to un-encrypt it.
For example, if you take the case of the iPhone, the San Bernardino iPhone, the issue was, can you figure out a way to tell the iPhone not to auto-erase when it hits the tenth password? Well, since it is their software and they're telling it to auto-erase after the tenth attempt, obviously, logically, it's possible to write software that can tell it how not to auto-erase. Inherent in the writing of encryption is the possibility of unencryption. The person who wrote the encryption has that in his head; it would take a few hours to do it. But the idea that you let it out there and somehow this becomes some dangerous virus that would then contaminate the entire digital universe is not borne out by the evidence, that these phones and these devices have often been unlocked in the past; is not borne out by the testimony of the CEO of the largest technology company in the world for two decades, Bill Gates; and it doesn't stand up to common sense. But what do I know? I'm not a technologist. [laughter] [BG:] Actually, Ed, I'm gonna let you respond to that. If you could try to clarify what the technology does and does not mean in this debate. [ES:] Right. So the challenge is, when you create a way that a system can be broken by an outside party, by a third party, that can be used by any third party. It could be the FBI, right? It could be the police going in for a lawful purpose, who want to go in, they want to investigate a case, which none of us would dispute was a good thing in that case. However, it can also be used by anybody else, and there's a problem once you create these keys, these sorts of universal unlocking mechanisms. For example, in the case of San Bernardino, the FBI said Apple had the exclusive technical means to get into this phone; it simply wasn't possible to get in unless they did it. Now, six weeks later they said, "Oops, we were actually wrong about that, there was another way in." Which is often the case. In fact, in all of the cases that we've seen where this has been a significant technical controversy, they've found a way in, because again, devices today are insecure. This is really the challenge. You mentioned, for example, that these phones didn't used to be encrypted; they used to be able to be broken into much more easily, and now it's a little bit harder. Many people, particularly surveillance authorities, once they become accustomed to this sort of status quo, begin to feel entitled to it, and they don't have the same balancing of equities that the broader society does, because they don't have to deal with the risks of having things stolen from their bank accounts. They don't have to deal with their identity being stolen. The victims have to do that. Now, that thing that I mentioned in my introduction, where General James Clapper, the Director of National Intelligence, said that the US Intelligence Community has said, "Computer security is the biggest threat that we're facing right now," that happened in 2013, right? These devices didn't have that level of encryption prior to 2013. Because the threats that we're facing are increasing, our countermeasures, our defenses, must also increase, or else our lunch will be eaten not just by every other criminal group, but by every other foreign adversary around the world. And just one thing that I would actually like to pass back to Mr.
Zakaria for comment is, let's presume for a moment, hypothetically, I'll give away what we really have here, which is the fact that by the science, this is going to make everything insecure, but let's say it didn't. Let's say there was a golden key that could only be used by governments. Now, you have a key that anybody with a court order can get and then get into these devices, get into all of these services; get into Microsoft, get in Google, get into Facebook, get into your phone. Now, they're only used for instance, court orders, but who else has courts? It's not just the United States. Kim Jong-un, in North Korea, he also has courts. Syria, they have courts. Russia's Vladimir Putin, he has courts. China, they also have courts. What will you do when suddenly, we have everyone in the world, every government that has access, as long as they can get anybody to stamp a sheet of paper that says court order, they have access to every device including our own, including those held by our own officials. [FZ:] Can I get in on? Because I feel... [BG:] Yeah, good. Do that. First, I wanna point out, if the FBI now says, it can open that phone, I don't know. Is Apple howling that, "Oh my God the security of every iPhone is now in jeopardy," clearly it means that this so-called master key is not so master after all. But more importantly, to just answer your central point, I think it's very important for us to be honest in this debate and point out that when large technology companies operate in countries like Russia, where you have found safe harbor, or China, those governments already have considerable access to all information; virtual, digital, real. They use every mechanism possible. And I think the idea that by allowing a legitimate court order to be pursued in the United States, somehow we are empowering the North Korean intelligence service, or the Chinese intelligence service, I think they're doing fine without that help. [ES:] So that's a fair point... Go ahead. So, we've got this time lag and I might have to raise a flag here. I wanna ask you to address Fareed's point of principle and I would put it historically. Has there ever been a time, in American history, when a citizen could store evidence, not just have a whispered conversation, but store evidence in such a way that authorities could not possibly reach it? And we haven't built a safe that no one can ever crack. So, in that sense, are you not asking for something that's without precedent? [ES:] No, I would argue actually that's been the status quo throughout American history. In fact, it's still happening today. When you think about banks for example, the ones we see quite recently in the Panama Papers, they store documents and records on their premises all the time. They have very little fear of anyone getting in, and when the police cars pull up, they've got shredders running. And many times, they don't wait for the police to show up. At the same time, we've also had people who store drugs in their home, and when they're concerned, when they have any fear, they place themselves, you might argue, above the law, by flushing it down the toilet. But that doesn't mean that we pass a law to re-order the whole of our society for the convenience of our investigators and, as I believe, the Director of Homeland Security previously argued. Because look, we're not gonna say these things that are necessary, that are critical to the operation of society. For example, our economy. 
For example, all of our communications, the internet as it is, have to be restructured for the benefit of the police. Now, this isn't to say that we intentionally make their life harder; this isn't to say that we do this, that, or the other. But we recognize that, look, yes, the needs of law enforcement are legitimate. Yes, we want to enable them as best we can, but we do not stop innovation in this country. We do not require the production of a new service, we do not require the next version of the iPhone, to first go to the FBI and get a stamp of approval before we say that it's lawful. [BG:] So I wanna address maybe a somewhat comparable point of principle to you, Fareed, which is this. We, in this country, accept all kinds of inefficiencies in law enforcement: probable cause, proof beyond a reasonable doubt, unanimity in juries, suppression of evidence. Honestly, looking around this crowd, the FBI could probably find something fishy if it turned out everybody's pockets... [chuckle] But it can't do that. Sometimes, it can't see inside encrypted containers. Why is that different? If there is any... If you accept what the technologists say, which is that there is a large risk in giving the FBI mandatory access, then why is that any different from the other examples I mentioned? [FZ:] It's a very good question, and it gets to the heart, I think, of what the resolution argues, and what I'm arguing for. It is simply not the case that banks were allowed to keep vaults with secret information beyond the reach of the law in American history. Of course, they're able to keep secret information. The question that this resolution asks is, if there is a legitimate court order requiring you to open that safe deposit box, requiring you to disclose information in that account, does the bank have to comply? That's the issue, not, "Are you able to keep whatever you want in your house, whatever you want in your bank?" The presumption is that you don't have anything illegal going on, but if the law enforcement authorities believe there is reasonable suspicion, if they can convince a court to issue a search warrant, if that warrant is then upheld, do you have the obligation to provide that? That is a much higher standard. I would argue that what I'm trying to persuade people of is to take a middle position here. The government at times seems to want this All Writs petition argument, which says effectively they can do whatever they want, whenever they want. Mr. Snowden is arguing for an absolute zone of privacy. What I'm saying is, we need to clarify this. And we need to clarify it now, not in the wake of the next terrorist attack. Because when that happens, just as with the Patriot Act, we will overreact, and whatever you saw in the poll of this audience, it ain't gonna be that way with the American people. It's gonna look a whole lot different, and we'd be much better off coming up with sensible constraints and guidelines now. So... [ES:] I'd like to follow up on that, because there's a little bit of nuance there actually, that I think is quite interesting, which is: I'm not arguing for an absolute zone of privacy. I'm not saying that people should place themselves above the law, or beyond the reach of court orders, or that companies should do the same. We're talking about who is being compelled here, and how our investigators are going about their work.
Now, Apple quite recently gave sworn testimony, I believe in a court filing, in response, of course, to the court controversy over San Bernardino, where they said that in fact even China, the China that you said previously is doing quite well on their own, has not asked them for this capability. They have not asked them to go so far as to re-engineer their products and services to make them able to comply with all these kinds of things. Now, however, intelligence services everywhere, whether they're in China, or whether they're in the United States, whether they're in France, whether they're in Germany, whether they're in Brazil, are doing quite well. And again, this is why I don't say that I'm arguing for an absolute zone of privacy. I sat at that desk, right? I've read actual terrorist communications, I've read the communications of hackers, I've read the communications of all kinds of malicious sort of forces, you might call them, from adversary nations. Now, the thing is, encryption is not an unbreakable wall, or rather, if it is an unbreakable wall, it is one that we can get around if we are patient, if we are careful, if we think and plan about how to go about our investigations... And this really works. We have seen this, for example, in the United States quite recently, for someone who is really, sort of, Mr. Zakaria, your worst-case scenario. This would be the case quite recently of the Silk Road, sort of a dark-net drug market as it's alleged to be, where this is a space online, it's all perfectly encrypted. There's all anonymous communications, you're not supposed to be able to see who's even involved in it. And yet the kingpin, the one who's actually operating this site, turned out to be an American citizen. Now, because he made mistakes, because he wasn't perfect every day of every year for the entire time he was operating this site, we saw where he was operating from, we followed him, and then, even though his devices were perfectly encrypted, sort of the iVault from your introductory statement, we created a pretext. The FBI did this, the same FBI now complaining about unbreakable encryption: when he goes into the library (he was actually operating out of public libraries, just from what we see, so he wasn't using his home connection), he logs into his dark-net sort of drug market as it is, and at that moment, his computer is unencrypted, because remember, if it's perfectly encrypted, it's a brick to him as well. The FBI agents standing to his left create a sort of appearance of a domestic dispute, a man and a woman arguing. As soon as his head turns to the left, an agent on the right physically grabs his open, unencrypted laptop, and now they're reading his diaries in court, or at least they did before he was convicted and given a life sentence. [BG:] So I wanna follow up here, because you said before that we do quite well. And I wanna think about who "we" are, when you talk about using sophisticated means of getting around an unbreakable vault. As long as you support the idea that there ought to be... that they build it to do targeted surveillance for law enforcement or intelligence: if companies sell strong crypto and they won't give you a fenestra, authorities need something like a seven-figure budget, or maybe even an eight-figure budget, to conduct the kinds of operations you're talking about, or to buy hacks from private companies or grey-market actors, as the FBI did in San Bernardino. How does the average police department get in and get this evidence?
[ES:] Well, I think the Silk Road case again shows quite clearly that when you think carefully enough, the actual budget you need is just the police officers who are already on your force; it's really thinking about how you use them. But there are cases... We can imagine cases where these are people very far away, a teenager in a basement in Moldova somewhere, and you have to actually do some kind of remote attack, right? In theory, you shouldn't have to do this. And in practice, we actually very rarely need to. We have what are called mutual legal assistance treaties with many nations around the world, where we can go through established processes, through court orders, which again Mr. Zakaria prefers, and then we can rely on the services, the police forces, that are available in that country to act as an arm of our own investigation. And again, we do the same in turn for them. So it's mutually beneficial and it's tightly controlled. But let's say we couldn't do that, and let's say there were an extraordinarily rare situation: you've got an arms dealer, you've got a human trafficker, a truly terrible villain out there who is incredibly sophisticated in terms of the level of the defenses they're using. They don't get lazy, they don't get sloppy like the Silk Road guy, they never make a mistake, so you have to hack them. And hacking, as you implied, can be quite difficult, particularly as we eventually, some day, not in the next ten years but maybe 30 years from now, finally solve our computer security crisis. What do you do? Well, the FBI faced this, they would argue, in the San Bernardino case, where they said, "We can't get in this iPhone," and then someone calls them and says, "If you sprinkle a pot of money over top of us, we'll solve your problem." The FBI director said they spent something like $1.3 million (based on his salary, it was sort of extrapolating here), which is a lot of money. But the point is, these are exceptional cases with exceptional needs. They should be rare, and one could argue that actually, the best process constraint that we can impose on the police is a cost constraint. It ensures that they only use the most exceptional, most intrusive, most expansive capabilities, the biggest guns if you will, when they're absolutely necessary to achieve their goals, rather than when they wanna snoop in the pockets of everyone in the room, as you suggested earlier. [BG:] Thank you, Ed. Fareed, I wanna ask you a different question. The key driver of this debate, where it started, was that law enforcement, particularly the FBI, say they've got a "going dark" problem: that more and more of the world is opaque to them and they can't do their jobs. So I wanna test that proposition in a kind of thought experiment. So by the power vested in me as moderator, you are now the FBI director. Presto. And you have to choose between two investigative packages. One of them is the 1990 package. You can send agents anywhere, seize anything you like, get business records; anything you get a hold of you can read, 'cause it's not encrypted. The 2016 package lets you do those things with those kinds of old analogue records, but also lets you track anybody's movements 24/7, now or in the past, with GPS and license tag readers and cell tower records. You can tap into exabytes of data at Google and Facebook. You can identify conspirators and their networks through metadata, the trails they left of who they talked to and when, and the price of all that is that there are sometimes gonna be containers you can't open.
If you really had to choose one or the other, would you go back to 1990? [FZ:] No, of course not. I'd much rather be in today's world; I'd much rather have all the tools that you're describing. I would also like to have the ability to convince the court, when I thought it was appropriate, to have a bank give me the digital information, or a technology company give me digital information, or whoever. The question you posed, though, gets at an issue that Mr. Snowden just highlighted, which I think is important to understand. The world that he's describing is one in which American law enforcement and/or intelligence (and I think his background with the NSA is perhaps biasing him in this respect) has enormous power, resources, very talented people like him, and they can throw this at these problems and they will find a way around the law and the dark zones themselves. They can play this cat and mouse game, which is fine and great. And I suppose that that's all well and good. But what about the local police department in a poor precinct, in a poor neighborhood? We have enough inequality in America. The picture that Mr. Snowden was painting is, frankly, one of frightening inequality of law enforcement, where what you're saying is that the Beverly Hills police department will be able to hire hackers to get into whatever iPhones they want. But up in Harlem and the Bronx, "Well, there you're on your own and those murders will go unsolved." I don't think that's equal justice under the law. I don't think that's how it should operate; there have to be rules. [BG:] I've got time for... [ES:] I think this is your... [BG:] To each... I'm gonna ask you each one last question. You have about a minute each to answer them before we go into closing. Before we go into the audience questions. For Ed, I guess I'd like to ask you this. Let's assume there's a 100% probability that authorities are going to do one of the following things: ban strong encryption altogether, require encryption bypasses by law, secretly weaken encryption standards, hack into encrypted devices, or buy fenestras from hackers. What's the least bad outcome? How would you want them to do it, given that they have to try to find evidence? [ES:] I'm not sure I caught the distinction. It was between sort of getting around it and banning it, that was the distinction there? [BG:] Ban it, hack it, weaken it, build your own hacks, buy hacks from others. [ES:] Right. So I don't agree that buying hacks, building hacks and everything like that is wonderful. That's not what we want, and that's not something that I particularly support. What I'm saying is that that's the least bad means we have available. The world, if they ban encryption totally, looks like this. American jurisdiction only goes so far. If you were for the proposal tonight, right, and you think the government should have lawful access, you think they should mandate insecurity for every device, every product, every service here, the problem is that that stops at our own borders. Every competing product, every competing service from every other country will still be able to provide perfect security. We will be the only ones without it. So clearly, given that context, banning encryption is simply the worst move. Arguing that we must create a lawful access mechanism hurts only us, because we won't be safe, right? Terrorists won't use our phones, but we will be less free. [BG:] Alright. And one last point of philosophy maybe for you, Fareed.
How much does your argument rely on a belief in the fundamental goodness of government authorities, that they're well-intentioned and sufficiently bound by oversight to prevent serious abuse? And in that case, how would you answer the Nixon problem or the J. Edgar Hoover problem? [FZ:] Yeah. So, two very quick points. One, we don't make the argument that Mr. Snowden just made with regard to foreign companies for anything else. We don't say we're not gonna have laws for our banks because people could just go and use foreign banks. We have laws for companies that operate in the United States, and foreign companies that operate in the United States are bound by those same laws. The same is gonna be true of technology companies. Do I have faith in the goodness of people, the American people? No. I think I believe as James Madison did, "If men were angels, no government would be necessary." But I do have faith in the Constitution and in America's democratic system. I've grown up in another country, I've travelled the world; this country has lots of flaws, but at the end of the day, the checks and balances do provide greater protections for liberty than almost anywhere else in the world. I will remind you that Richard Nixon was impeached... Well, he had to resign for fear of impeachment. Which is, which is surely a symbol of the system working. [laughter] [BG:] I'm going now to audience questions from our potentially global audience here. There's a question for Ed, from Juan Pablo Geraldo, an employee of UNICEF, who asks, "Are we, as citizens, less safe because hackers unlocked the San Bernardino phone for one or more millions of dollars?" [ES:] Yeah. One of the points actually that was raised by Mr. Zakaria earlier was that he said, "Well, Apple hasn't really made any response, they haven't made any stink," in response to this. Now, in fact they have: they've challenged the FBI, I believe in court, where they tried to compel them (it may have simply been through internal processes before they get to court) to get the FBI to disclose the vulnerability that was used to get into this phone so that they could close it. So they could protect the millions of Americans who are using these kinds of devices. And I think that is proper: when the FBI finds a case that is so exceptional that they have to break the security of the device to get in, right, it merits these kinds of exceptional circumstances, they should try to do that. At the same time, they should make sure they close the door behind them, so that the rest of us, whether we work at UNICEF or whether we work at Starbucks, are safe and don't face the same threats tomorrow. [BG:] Okay. Thank you for a concise answer. I'll ask you both to keep that in mind. For Fareed, there's a question from Sajid Mahmood, a software engineer, and I'm gonna translate it a little bit. He asks, "Should it be legal for WhatsApp, a messaging app, to use end-to-end encryption?" And I'll give you a little context for the question. Apple's security is very good, but in the past, it has had side doors; Apple has had the ability to decrypt certain portions of it. There are services and products being offered right now that, as far as the best auditing can tell, do not have those flaws. So WhatsApp uses a protocol that the company that makes it cannot read, period. Should it be legal or not legal to sell encryption that you cannot break yourself? [FZ:] Look, I think that... This gets to the kind of level of the hard cases that are more difficult to solve.
What I would say is, if a court requires... asks for information, and if the company has the ability to access that information with some reasonable effort, it should provide it. Apple estimated that it would've taken six people two weeks to write the code to overwrite the auto-erase function. If WhatsApp says, "We literally do not know how to write this code"... I am myself skeptical of this, but on that issue I want to defer to people who are more expert than I. It seems to me very difficult to understand how you could be able to write encryption that you cannot unencrypt. As I say, I have talked to many technologists who agree with that. But if it is the consensus, the scientific consensus, that that is in fact the case, then I think that WhatsApp could demonstrate to a court that they don't have to do it. And that, I suppose, is where it would stand. Strikes me as a pretty hard case. And hard cases make bad law. [BG:] Well, it may be a hard case, but it's going to be a very common case, and it is increasingly a very common case. And for that reason, the people who are most on the side of the FBI... In Congress, Feinstein and Burr have proposed a law that would outlaw strong encryption, that would outlaw unbreakable encryption, that would outlaw encryption that the company that provides it cannot crack. So do I take it that you would not be in favor of that? [FZ:] Yeah, I don't think so. I think that there are even precedents for this. If you have a safe that the manufacturer cannot open, I believe I'm right that the case history is that the government does not somehow hold the safe-maker liable if, genuinely, the safe-maker cannot open the safe. [BG:] Ed, there's a question for you saying... Looking, again, for sort of a least-bad solution: would it be better simply to say that if you're a suspect and they have your phone and they can't open it, you are compelled to provide access, you are compelled to decrypt it, and that therefore doesn't compromise the technology itself? [ES:] No, we can't do that, because it's prohibited by our Constitution. Of course, the Fifth Amendment gives us a right against self-incrimination. And if it was to basically voluntarily provide something that's going to be used against us in an incriminatory form, obviously that... I would argue at least, and there has been no court that has actually held that that is the case, at least certainly not the Supreme Court, that you can be compelled to testify against yourself, whether it's in court saying "I did it," or whether you're sort of unlocking your diary that says, "I did it." [BG:] Yeah. And just for the record, as a factual intervention: right now, the Supreme Court has not made a final holding on either of these, but the state of the law in various districts and circuits seems to be that you can be compelled to put your thumbprint on the reader and open a phone that way, but you cannot be compelled to supply a passphrase. The reason being that one is like the compulsory provision of fingerprints in any other context, and the other is testimonial evidence. So if you really care... turn off Touch ID, yeah. [laughter] Question for you, Fareed, from Chris Soghoian at the ACLU... He says you told him in a CNN makeup room once that you don't encrypt your communications, and wonders if you do that now. [laughter] Or plan to after this evening? [FZ:] I am the lamest person with regard to this stuff. I don't encrypt anything, I... Perhaps CNN does what it does.
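[To make the earlier WhatsApp question concrete: "end-to-end" means the message keys are derived only on the two endpoints, so the provider relays ciphertext it holds no key for. Below is a minimal, simplified sketch under assumed names, using the third-party cryptography package; real protocols such as Signal's add far more machinery, so this is an illustration rather than the actual WhatsApp design.]

    # Minimal end-to-end sketch (assumed, simplified): private keys never leave
    # the endpoints, so a relay server and its operator have nothing to decrypt.
    import base64
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_private = X25519PrivateKey.generate()    # generated on Alice's phone
    bob_private = X25519PrivateKey.generate()      # generated on Bob's phone

    def session_key(my_private_key, their_public_key):
        """Both endpoints derive the same symmetric key from the key exchange."""
        shared = my_private_key.exchange(their_public_key)
        derived = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"e2e sketch").derive(shared)
        return base64.urlsafe_b64encode(derived)

    # The provider forwards only public keys and this ciphertext, never a private key.
    ciphertext = Fernet(session_key(alice_private, bob_private.public_key())).encrypt(b"hi Bob")
    print(Fernet(session_key(bob_private, alice_private.public_key())).decrypt(ciphertext))

[If the provider never holds alice_private or bob_private, a court order served on the provider yields only ciphertext, which is the design question being put to Fareed.]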
But I'm the kind of guy who, when the fire alarm goes off in a hotel, I just keep sleeping. I assume it's a false alarm, and I would have died in the towers on 9/11 for sure for that reason. I always assume that these things... that the odds are gonna stay with me. But I've been lucky so far. [BG:] I guess... just a quick follow-up on that that would be sort of more illustrative: do you have an iPhone? [FZ:] I do have an iPhone. Here it is. [laughter] [ES:] Do you use encryption? [BG:] Ed is reading it right now. [laughter] [FZ:] Yeah, I imagine... Oh, you have a standard operating procedure, Ed; it's slightly different from mine. [BG:] Ed, you... if I understand you correctly, tend to believe that if the FBI has found the security hole in the iPhone, for example, it got into the San Bernardino phone that way with its million-dollar exploit, it ought to disclose what the flaw was to Apple so that Apple can fix it, so that Apple can close the security hole against attack by foreign governments or hackers or criminals. If that's the case, though, if Apple won't help and the FBI needs the information and spends a million bucks to find a way in, can you really expect them to hand over that recipe to the company that refused to help? [ES:] Absolutely, because they're not doing it to help the company. They're doing it to help the country, right? They're doing it to help everybody in America who uses those products, who uses those services. And when we think about this, if they're buying exceptional exploits that have exceptional impact, those could be used against any senator, any federal judge, anybody who uses one of these devices. They're not really wasting this money or subsidizing Apple. What they're doing is investing in an infrastructural cybersecurity improvement plan for every US product and service company. Now, you could make a distinction here where they go, "Well, what about this or what about that?" But I think even in the case of foreign companies, if it is a product or service that is used by a majority of American customers, a majority of American citizens, there's a clear, actually, one might even argue prevailing, public interest in closing that vulnerability. Now, where it would get tricky, where we would get into a nuance, and perhaps we could ask Mr. Zakaria his opinion here, is: what if it were a service that was not used by Americans, right? If it were only used extensively overseas, in a hostile regime, should they close a vulnerability in a Chinese product, in a Russian product, maybe an Indian product? [FZ:] I have to confess, here I am much more cautious about what strikes me as collusion between big business and government. I don't think it's the FBI's job to improve and refine the products of the world's most valuable corporation. I think that if Apple is not able to provide the encryption that its customers like, well, maybe somebody else will. I haven't thought this through completely, but it doesn't strike me that it is the FBI's job to perfect the security systems of Apple or Microsoft or Google or Facebook. That's up to them, and if they can do it, great. If not, somebody else will provide a product that people will use. [ES:] Just to press on that... I wanna add a point of fact. There is an arm of the FBI whose job is to try to secure US computer networks and devices. There's a big chunk of the Department of Homeland Security, half of the NSA has a defensive mission, and there's the National Institute of Standards and Technology.
I mean, the government typically says, and there's a White House statement to this effect, that when it finds a vulnerability, its default option will be to disclose it to improve everyone's security, but that there can be exceptions in cases of exceptional need for law enforcement and intelligence gathering. So I guess I'm wondering sort of where you see the equities in this particular case, in the San Bernardino case. [FZ:] Well, as I said, I think they're slightly different, because this is not... What you're describing is foreign agents, etcetera, hacking into a large digital network, perhaps corporate, perhaps non-governmental, perhaps the nuclear energy administration. The NSA detects it, tries to, in some way, fill the gap. What we're dealing with here is a single phone; the FBI wanted access to information, got access to the information, and moves on. It strikes me that in one you're talking about systemic breaches and systemic problems, and in the other you're not. As I pointed out, this has happened 70 times before. And I don't recall massive repair operations going on. [BG:] Oh, there were. Apple is... [FZ:] No, by the FBI. [BG:] Ah, right. [FZ:] The FBI. The FBI, they're not rushing to tell Tim Cook, "Please let us help you improve your product." [ES:] I would, if I could just point out here. [BG:] Sure. Go ahead. [ES:] Let's remember, when we talk about, yes, corporate products, the president of the United States was famous for using a BlackBerry. [BG:] Yeah. [FZ:] Yeah. [laughter] [FZ:] And he uses Colgate toothpaste also. I don't think the government should be refining that product either. I'm guessing he uses Colgate. [ES:] Even if there's a known threat in the Colgate toothpaste, you think they shouldn't warn him, even if they know the president's going to use it and it could cause harm to him or any other American? [FZ:] Well, if there's poison in the toothpaste, yes, he should take it out, but that's... [laughter] [FZ:] You see, I don't think it should be the job of the FBI to improve the products of American corporations. I guess, let me put it that way. That strikes... [ES:] Fair enough. I would simply point out that vulnerabilities in digital devices are poison on the internet. [BG:] Is there a sense in which this debate is weirdly hypothetical and pointless? [laughter] [BG:] And I purposely didn't ask this question at the start. [FZ:] You set it up. [laughter] [BG:] In the following sense. We're talking about using legal means to regulate science or mathematics or technology, and even if Congress wants to pass a law saying, "Thou shalt decrypt upon command," I can tell you, as a heavy user of security products myself, that something like four out of five encryption programs that exist in the world, 400-some of them, are made overseas, beyond the reach of US law. And so what you have, theoretically, is the ability for anybody who's a bad guy and knows that authorities will be looking for them to adopt some of those tools, and the legislative idea will be irrelevant. So, is there a point to having this debate on the law with all that in mind? [FZ:] Look, you raise a very important problem, which is that we do not live in the world alone. There are other countries that operate on very different standards than the United States does. That's been true for a long time. It's true in many other areas as well. As I pointed out, we don't say that American banks don't have to follow the laws and regulations because Swiss banks don't, or because Hong Kong banks don't, or because banks in the Cayman Islands don't.
We have an expectation that the operations that take place within the United States will follow American law. I will grant you this: we are unusual and have an unusual advantage in that we have a very large market. We are a very important country, and as a result we have more bargaining power than a small, developing country with a small population and a small market, and that does mean the United States can make certain demands which are not possible for other countries. So I'm grateful that, in that context, the United States is a liberal democracy and not a dictatorship. [BG:] So we're out of time, but I wanted to ask Ed if you wanna take a crack at that in 30 seconds. That question about whether... is it even rational to have a legal debate here? [ES:] I do think it's rational, because importantly this is a problem of public education. As we saw at the beginning of this debate, right? You heard a lot about politics, about iVault, about things like that, which sound convincing and they sound threatening, but then when you actually start drilling down on the science, you start drilling into the math, and you start drilling down into the facts, things look very different, right? And this also applies to Congress. These are generalists who have to focus on many different issues in many different areas all of the time, right? Same thing with everyone in this room. You guys don't have time. You shouldn't have to study and become experts in encryption just to relate and interact with your world. But when we come together as a community, when we have people with different opinions, such as Mr. Zakaria and myself, come together and actually debate these topics in a reasoned and respectful way, we can produce a public good that serves the wider body and hopefully can actually result in better policies. [BG:] Toward that end, in a reasoned and respectful way, we're gonna have closing statements, beginning with Fareed for three minutes. [FZ:] So, this is the 30th anniversary of a terrible, terrible accident. This week is the 30th anniversary of the Chernobyl nuclear accident. That accident spewed more radiation into its region than all the radiation that emanated from Hiroshima and Nagasaki, the two bombs that the United States dropped on Japan. The reason I bring it up is because this is the kind of problem we might face in the future. It is not a hypothetical or speculative point I'm making. There is now ample evidence that the perpetrators of the Brussels terrorist attacks were initially, or at some point, planning to try to cause an explosion at a Belgian nuclear power plant. If they had done that, you would've had absolutely catastrophic fallout, both in terms of the radiation, of course, the thousands and thousands of lives lost, the tens of thousands of people displaced, but also politically. There would have been a dramatic shift in the attitude of publics in the western world everywhere toward this whole debate that we're having. And I think that's the point I really feel is important to understand. [FZ:] We do face real threats out there. This is not a figment of somebody's imagination. There are people out there trying to do bad things. It is much better that we figure out what the government is allowed to do, what it is not allowed to do, what information it can have access to, and what information it cannot have access to before we face one of these terrible events, because once they happen, the public will react with fury.
The government will be given carte blanche, and it will be able to do many, many more things than Mr. Snowden or I would want governments to do. If you imagine the world we are talking about without any kind of framework of law, we are talking about a world in which there is a huge zone of privacy, safety, and immunity for terrorists, drug dealers, money launderers, criminals of all kinds to operate, and then you have these furtive attempts by law enforcement authorities, by the NSA, by the FBI, to play in that dark zone, all of this taking place in a kind of law of the digital jungle. Is that really the kind of world we want to end up in? What I'm suggesting is a more reasoned path, which is that we try to figure out: What are the laws that we can agree on democratically? What are the systems that we think courts should enforce, so that we can end up in a situation where we balance security and liberty in the way, frankly, that the United States and western democracies have had to do since their founding? Technology always sounds new and shiny, and it changes everything. Yeah, but it doesn't change this age-old debate between security and liberty. [BG:] Thank you, Fareed. [applause] Ed, your three-minute closing, please. [ES:] First of all, I'd like to say those are important thoughts, and I think everyone in the National Security Agency and the other areas of the US intelligence community that I mentioned previously also considered those thoughts when they said that America is safer, America is more secure, with unbreakable end-to-end data encryption. But more generally, those thoughts, important though they were, did not address the proposition, which is not whether we should consider the powers that government could have, but what powers the government should have. Should the government have access, lawful access, to any communication or device, even though we know it would cause fatal harm to the actual security that we have? Now, the FBI director spoke on this, saying things very similar to what Mr. Zakaria said: unbreakable encryption will allow drug lords, spies, terrorists, even violent gangs to communicate about their crimes and conspiracies with impunity. We will lose one of the few remaining vulnerabilities of the worst criminals and terrorists upon which law enforcement depends to successfully investigate and, often, prevent the worst crimes. The FBI also said that if they didn't get the kind of lawful access we're discussing right now, in three years wiretaps worked by the FBI would be useless: only 40% would provide anything, and a few years later they would provide nothing at all. The problem is, that's not from 2016. That's not from 2015. Those numbers are from 1992. And the laws did not provide an encryption backdoor. Despite the fact that we did not change our laws in 1992, nor in '95, to fatally compromise the security of every American product and device, law enforcement is in a better place today than it has ever been before in terms of means of investigation. The NSA's own classified documents, though they don't say it in public, say that we are in the golden age of surveillance. And they are right. Computer insecurity is a real threat. And I must thank Mr. Zakaria for joining me in this very important conversation tonight. And I wanna thank you all for spending the evening with us. Hopefully... I hope that he learned from me as I learned from him. He is a master debater, and this was my first.
So, it was very helpful. [laughter] [ES:] But I would say, let's remember ultimately that saying the government should have lawful access to any encrypted communication is identical to saying that the government should mandate weak security for all of us. Mandatory insecurity might be convenient for investigators here or there, no argument. And let's not forget, also for China. But the cost of doing so would be fatal. Thank you very much. [applause] [An illustrative sketch of this point, mandated access via key escrow, follows the transcript, alongside the end-to-end encryption sketch referenced earlier.] [BG:] Thank you both very much, and all that's left for us now is to do our second and final poll of the evening. So, you know what the proposition is. Again, text DOTC2016 to the number 22333, follow the prompt, and cast your vote: yes for the proposition, no, or undecided. [pause] [BG:] I think we might have fewer undecided. That's good; people have formed conclusions. I'm not sure when we call an end to this, who could tell me? Are we... are we good? Alright. Thank you, everyone. Thanks for coming. Thank you, Ed. Thank you, Fareed. And goodnight. [applause]
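[Editor's note: The exchange about WhatsApp asked how a company could sell encryption it cannot itself break. Below is a minimal sketch of end-to-end public-key messaging, assuming the PyNaCl library; it is not the actual Signal protocol WhatsApp uses, and the names and the "directory" dictionary are purely illustrative.]

# Minimal sketch of end-to-end encryption (assumed library: PyNaCl, "pip install pynacl").
# This is an illustration only, not the Signal protocol that WhatsApp actually uses.
from nacl.public import PrivateKey, Box

# Each device generates its own key pair locally; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# The messaging service only ever stores the public halves in its user directory.
directory = {"alice": alice_private.public_key, "bob": bob_private.public_key}

# Alice encrypts on her device with her private key and Bob's public key.
ciphertext = Box(alice_private, directory["bob"]).encrypt(b"meet at six")

# The service relays only ciphertext; holding no private key, it cannot decrypt it.
# Bob decrypts on his own device with his private key and Alice's public key.
assert Box(bob_private, directory["alice"]).decrypt(ciphertext) == b"meet at six"

[Because the provider holds only public keys and ciphertext, complying with a decryption order would require shipping weakened software to users' devices, which is the crux of the legal question debated above.]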
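[Editor's note: A companion sketch to the one above, illustrating the closing claim that mandated "lawful access to any encrypted communication" amounts to mandated weak security. It again assumes the PyNaCl library; the escrow_db dictionary and provision_user function are hypothetical, not any real provider's design.]

# Minimal sketch of "exceptional access" via key escrow (assumed library: PyNaCl).
import nacl.secret
import nacl.utils

escrow_db = {}  # provider-held copies of user keys, kept so access orders can be honored

def provision_user(user_id):
    # The key is generated for the user, but a copy is retained in escrow.
    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    escrow_db[user_id] = key
    return nacl.secret.SecretBox(key)

alice_box = provision_user("alice")
ciphertext = alice_box.encrypt(b"meet at six")

# Lawful access works: the provider can decrypt with the escrowed key...
assert nacl.secret.SecretBox(escrow_db["alice"]).decrypt(ciphertext) == b"meet at six"

# ...but so can anyone who steals, subpoenas, or coerces access to escrow_db.

[The contrast with the first sketch is the point: once any copy of the key exists outside the user's device, the security of every message depends on how well that single escrow store is guarded.]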
Info
Channel: The Century Foundation
Views: 55,636
Rating: 4.9424462 out of 5
Keywords: privacy, encryption, fareed zakaria, edward snowden, barton gellman, nsa, cia, fbi, iphone encryption, PGP, end-to-end encryption, terrorism, san bernadino, cnn, snowden debate, prism, debates of the century
Id: -yoyX6sNEqs
Length: 82min 55sec (4975 seconds)
Published: Sun May 01 2016