Why The Government Shouldn't Break WhatsApp

Video Statistics and Information

Reddit Comments

"Tom Scott wonderfully explains..." is true for just about every topic he makes a video on. :)

👍︎︎ 214 👤︎︎ u/JDGumby 📅︎︎ Jul 03 2017 🗫︎ replies

I thought it was weird that he puts so much trust in people and the possibility of finding illicit code in proprietary software. Always assume all keys are being stolen if you cannot verify they aren't. This is the problem with WhatsApp/iMessage.

Which segues into his other point: criminals are never going to use such services if they are smart and/or somewhat coordinated with a terrorist cell. I get that the UK parliament is on this personal crusade against modern technology, but targeting these services makes no sense. But, meh, it's government.

👍︎︎ 45 👤︎︎ u/TheeEmperor 📅︎︎ Jul 03 2017 🗫︎ replies

I'm saving this for the next time I'm talking about UK government surveillance, GCHQ, NSA, etc., to family and friends. Sometimes I don't come across well and sound like a bit of a conspiracy nut. I'll just send them a link to this.

👍︎︎ 80 👤︎︎ u/huddie71 📅︎︎ Jul 03 2017 🗫︎ replies

If you emailed this to your local MP, do you think they would watch it? Some of them really need an education on this topic.

👍︎︎ 25 👤︎︎ u/joshp18 📅︎︎ Jul 03 2017 🗫︎ replies

So what I never understood, and these videos never explain, is how does a public key encrypt a message that only your private key can open? They must have access to your private key if they can do that, no?

👍︎︎ 15 👤︎︎ u/fakeittilyoumakeit 📅︎︎ Jul 03 2017 🗫︎ replies

Nice video, arguments well put.

👍︎︎ 21 👤︎︎ u/SE193SB 📅︎︎ Jul 03 2017 🗫︎ replies

I thought WhatsApp was considered compromised after it got sold last year.
http://www.makeuseof.com/tag/4-security-threats-whatsapp-users-need-know/

👍︎︎ 10 👤︎︎ u/stonecats 📅︎︎ Jul 03 2017 🗫︎ replies

Go to 6:36 for the actual answer, first half is just an explanation of encryption

👍︎︎ 8 👤︎︎ u/Funky_Beets 📅︎︎ Jul 03 2017 🗫︎ replies

Came here to post the same video. Great explanation!

👍︎︎ 4 👤︎︎ u/[deleted] 📅︎︎ Jul 03 2017 🗫︎ replies
Captions
There’s been a lot of talk about whether the British government will force companies like WhatsApp to introduce a backdoor into their encryption, so that the police and government can read your messages if they need to. As I record this, they haven't done it yet, but the laws that could let them do so in the future are already in place. And here's something you might not expect me to say: that sounds like a reasonable idea. After all, backdoors have been allowed for old-school phone conversations for decades. They’re called wiretaps. And if a criminal investigation has enough evidence that they can get a legal warrant, then they can look inside your postal mail, they can listen to your phone calls, and they can intercept your text messages. And it's called a wiretap because, many years ago, the police would literally be attaching a device to a physical phone wire. So for anyone who grew up knowing that, anyone who grew up with computers like this, like pretty much every politician in government, well, it seems reasonable that that should also extend to, for example, WhatsApp. So why not? Well, first, let's look at the technical detail. It all depends on who is holding the keys. Modern encryption uses complicated math that is easy for a computer to calculate one way, but almost impossible to work out in reverse. A really simple example: if I ask you to multiply two prime numbers together, like 13×17, you can do that by just hitting a few keys on your calculator. And because those were prime numbers, we know that's the only way to make 221 by multiplying two whole numbers together. Other than 221 times 1, and that's not really helpful. But if I ask you: what two prime numbers were multiplied together to make 161? There is no way to work that out quickly. There are a few shortcuts that you can take, but it's still basically a brute-force method. Now imagine that you're not trying to work out 161, but instead something like this... 
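The one-way asymmetry described above can be sketched in a few lines of Python. This is a toy illustration only: the numbers match the transcript's examples, but real keys use primes hundreds of digits long, where trial division becomes hopeless.

```python
# Toy illustration of the transcript's point: multiplying two primes is
# one calculator keystroke, but recovering them is brute force.

def multiply(p, q):
    # The "easy" direction: a single multiplication.
    return p * q

def factor(n):
    # The "hard" direction: trial division, which scales terribly as n grows.
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    return None  # n is prime

print(multiply(13, 17))  # 221, as in the video
print(factor(221))       # (13, 17)
print(factor(161))       # (7, 23) -- the answer the video leaves as an exercise
```

With tiny numbers the loop finishes instantly; the point is that the work to factor grows astronomically with the size of the number, while the work to multiply barely grows at all.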
and you start to see the scale of the problem. And that's just a simple example, modern cryptography uses way more complicated one-way operations. The important part is that you can have a computer do math that’s simple one way, but could take longer than the lifetime of the universe to brute-force back. The result is that you can have two keys: two massive numbers. One public, one private. You send your public key out to the world. Anyone can encrypt a message with it: the message gets converted to what looks like random noise. Even that same public key can’t convert it back. But you can take that noise and use your private key -- and only your private key -- to decrypt it. When you want to send a message back, you use their public key, and they use their private key to decrypt it. And the beautiful part of this: there's no need to exchange keys in advance, you don't have to work out old-school one-time pads, or anything like that. You can post your public key out on the internet for all to see. As long as you keep that private key secret, no-one else can read your messages. This is a system that has been tested under incredibly harsh conditions for decades. It works. The catch is, it's really unfriendly to use. It's difficult enough to get someone to join a new messaging service as it is, let alone bring their friends along. Now you have to generate these weird key things as well? And if you lose your phone or somehow forget that key, or your hard drive crashes and you haven't got a backup, all your messages are gone, lost as random noise forever. Email that works this way has been around for decades but it’s too complicated and it's too unfriendly for most people. The security wasn't worth the effort. So instead, web mail services, along with Facebook, Twitter, and everyone else, didn't worry about that. 
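The public/private key mechanism described above can be shown concretely with a minimal RSA-style sketch. The tiny textbook numbers below are stand-ins; real systems use enormous primes plus padding schemes, and none of this is how WhatsApp's actual protocol is implemented.

```python
# Minimal RSA-style sketch of the public/private key idea.
# Toy numbers only -- never use anything like this for real security.

p, q = 61, 53            # two secret primes
n = p * q                # 3233, shared part of both keys
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

public_key = (e, n)      # safe to post on the internet for all to see
private_key = (d, n)     # must stay secret

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
decrypted = pow(ciphertext, d, n)  # only the private key turns it back

print(ciphertext)  # 2790 -- looks like noise without d
print(decrypted)   # 65 -- the original message
```

Note that encryption uses only the public pair `(e, n)`, which answers the commenter's question above: the sender never needs, and never sees, the private key.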
Early on, they were mostly unencrypted, but rapidly realised that was a bad idea -- so now, they use regular web encryption, that padlock in your browser, to make sure that no-one on your network can see your password or your messages when they’re in transit. And that's the threat that most people have to worry about. But they do have the content of those messages in plain text, or something close to it, and those companies can give that back to you whenever you want. Which means that when a government comes along with a legal warrant, the companies can also give the messages to them. And this was fine, right? This was reasonable. This was an acceptable compromise between security and usability. Or at least it was, until it was revealed that -- in short -- every major government was keeping a copy of pretty much everything everyone ever wrote, at which point a few companies decided that, actually, they didn't want to take the risk of anyone -- not even their own employees -- being able to even theoretically access the messages that people were sending. The result is WhatsApp, and iMessage, and the many smaller apps like them. They have "end-to-end encryption". Your phone generates a public and private key for you, automatically. It exchanges public keys behind the scenes, while you're writing your first message to someone, and everything after that is encrypted. And it's all automatic! Now, WhatsApp and iMessage aren’t open source, so in theory they could steal your private key as well, or quietly issue a fake one to someone and sit in the middle listening, but in practice people would notice. Sure, there are small loopholes that could work in particular circumstances, but the odds are remote, and security researchers are already decompiling and tearing apart every version of every messenger program just to see if someone's put a backdoor into it.
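How would people notice a quietly issued fake key? One hypothetical simplification of what real messengers expose as "safety numbers": each side hashes the public key it received into a short fingerprint, and the two users compare fingerprints over some other channel. A man-in-the-middle key shows up as a mismatch. The key values and helper below are illustrative, not any app's real scheme.

```python
# Sketch of spotting a swapped key via fingerprint comparison.
# (Hypothetical simplification of "safety number" checks in real apps.)

import hashlib

def fingerprint(public_key_bytes):
    # Short, human-comparable digest of a public key.
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

alice_real_key = b"alice-public-key"
key_bob_received = b"alice-public-key"  # honest server relays it unchanged
key_bob_mitm = b"attacker-public-key"   # a quietly issued fake key

print(fingerprint(alice_real_key) == fingerprint(key_bob_received))  # True
print(fingerprint(alice_real_key) == fingerprint(key_bob_mitm))      # False
```

The comparison has to happen out-of-band (in person, over a call), because a middleman who controls the message channel could forge that too.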
The short version is: if any of these apps get served with a government warrant right now, the most they could do is say how much two people have been talking, and maybe roughly where they were: but never what they were talking about. More than that is literally, mathematically impossible. But it's impossible only because of the way they've designed their systems. And that is the vulnerability. A government could make it a legal requirement for Apple and Facebook to quietly add a backdoor in all their encryption if they want to sell anything in their country. I've heard this phrased as "outlawing maths", but that's a bit like saying that making punching a stranger in the face illegal is "outlawing hands". And if Apple and Facebook refuse to add a backdoor, a government could... well, theoretically they could ban their phones or ban their apps from sale, or prosecute the people in charge, or block Facebook, who own WhatsApp, or they could tell internet providers to block their services, or they could... Look, in practice they're going to fine the company. Apple and Facebook have local addresses, they pay... some tax. Sitting on the sidelines, I would love to see the British government go up against Apple and see who blinked first. But companies have bowed to foreign countries loads of times in the past. BlackBerry let the Indian government have full access to users’ chats and web history back in 2013. The only reason WhatsApp can't read your messages is because they have deliberately chosen to design their systems that way. They were just as popular without encryption: it was an afterthought, they'd been going for years before they switched encryption on. This was a human decision, not an inevitable fact of technology. So why is an encryption backdoor such a bad idea? Well, if there's a backdoor, it can and will be abused. 
Local British authorities already used our surveillance laws, the ones that were brought in to stop terrorism, to monitor loud dogs barking, crack down on illegal feeding of pigeons, and to spy on some parents to see if they actually lived near enough to a particular school they wanted to get their kids into. Now, is this useful for preventing crime? Sure. And there's the argument that "if you have nothing to hide, you have nothing to fear": maybe they shouldn't have illegally fed those pigeons. And yes, you, watching this, you probably have nothing to hide and nothing to fear from the current government in your country. But laws and governments change, and besides that: the internet, and the apps that we use on our phones, are global. If you allow a backdoor here, you're also allowing it for another country's government to spy on its opponents, and another to spy on people they suspect might be gay, or who use marijuana, or who are Christian, or whichever thing is illegal in that country. In fifty years, maybe you'll be part of a country where eating meat has been outlawed, and the government will want to come after you for tracking down the illegal bacon-trading ring that your friends are part of. "Nothing to hide" only works if the folks in power share the values of you and everyone you know, entirely, and always will. To make it worse, on the surface this seems like it's equivalent to a regular, old-school wiretap, but it's not: depending on how the backdoor’s set up, a government might not just be able to get what someone’s sending now. They could get the whole message history. Perhaps years of messages, back and forth with hundreds or thousands of other people. It's not just a look into what a person’s saying: it's an overreaching look into the thoughts of many, many people. It's that long-forgotten naked picture that someone sent five years ago. It's that angry essay they wrote in school and which they completely disagree with now.
It's not just "what are they saying", it's "what have they ever said". That's all assuming the backdoor doesn't get abused by folks with more personal grievances. All it takes is one rogue employee, in the government or at a messaging app, and we've got a huge amount of personal information being leaked, either of the public at large or of specific people that someone would like to take revenge on. It fails the "bitter ex test": can someone with an agenda use this to ruin a life? An AP investigation found hundreds of cases where police officers and civilian staff in the US looked up private information for personal reasons. And let’s not start on what would happen if a hacker, or even some other government’s intelligence service, got access to the backdoor. Or how it’d make it much more risky to report abuses of government power, on any scale. There is an argument that it would all be worth it, that all those drawbacks would be a small price to pay for stopping very rare Bad Things. I disagree, but that’s an opinion, not a fact. But an encryption backdoor wouldn’t stop bad things happening. The problem with stopping terrorism right now is not a lack of information. The Manchester bomber was reported to the authorities five times, including by his own friends and family. One anonymous source inside the UK security services told Reuters that at any time there are 500 people being investigated, and about 3,000 people "of interest". For scale, just to reassure you, that's only about 0.005% of the UK population. But the way to solve this is not more data, it's having enough police officers and security staff with enough time to do their jobs and investigate. And let's be clear: anyone who wanted secure communication for evil purposes would just use something else, any of thousands of smaller services that the government hasn’t noticed yet or that they couldn’t possibly have jurisdiction over.
Or if even that is not an option, they can come up with a code themselves, even just in-jokes and references that no-one else understands. So when I say that an encryption backdoor sounds like a reasonable idea, I mean it. It sounds reasonable. Like a lot of ideas sound reasonable when you express them in one or two sentences. But the devil is in the detail. If we could replicate the way wiretaps used to work, limited in scope and time, requiring a warrant and some physical effort, not including the history of everything that someone's ever said, and not open to repressive governments elsewhere in the world, then sure, I would absolutely be in favour of it. Building an encryption backdoor isn't impossible: but building a reasonable one is. Thank you to everyone who helped proofread my script, and to everyone here at the Cambridge Centre for Computing History, who let me film with this wonderful old equipment.
Info
Channel: Tom Scott
Views: 3,389,171
Keywords: tom scott, tomscott, the basics, encryption, whatsapp, imessage, public key, private key, encryption backdoor, backdoor, end-to-end encryption, investigatory powers act, computer science
Id: CINVwWHlzTY
Length: 11min 10sec (670 seconds)
Published: Mon Jul 03 2017