Secrets in Your Data | Full Documentary | NOVA | PBS

Video Statistics and Information

Captions
(bright music) - A man checks his smartwatch. - Our modern, digital lives offer limitless convenience, community, and easy to access everything. - [Speaker] The internet opens unbelievable opportunities. - [Host] But there's a trade-off. Your personal data. - [Speaker] Pretty much everything you do on the web is being tracked and logged. - Who you are, what you're doing, and where you're going. - This is the mothership. Can you show me what I've inadvertently given up? And that personal data is worth a fortune. - People expect it to grow to $400 billion. - To completely avoid data collection requires you to go live in a cave. - [Host] Thankfully, there are tools to help maintain your privacy and security online. - Let's create some strong and unique passwords. - Everything has privacy settings. We can dial to whatever level we feel comfortable. - [Host] And some coders are rewriting the rules of the web. - The decentralized web is our antidote to the web as we now think of it. - New technologies can proliferate without centralized control. - This is about technological self-determination. - [Host] Secrets in your data right now on "NOVA." (triumphant upbeat music) ANNOUNCER: As an American-based supplier to the construction industry, Carlisle is committed to developing a diverse workplace that supports our employees' advancement into the next generation of leaders, from the manufacturing floor to the front office. Learn more at Carlisle.com. (phone camera clicking) PATEL: Our world runs on data. When you're shopping online, streaming your favorite shows... (mouse clicking) posting family pics-- that's all your personal data leaving your device. (on phone): What's up, everyone? I'm about to head to the hospital for a night shift... (voiceover): I'm Alok Patel, a pediatrician, medical journalist, and all-around social media enthusiast. (on phone): Just got here at the children's hospital... (voiceover): I kinda love our connected lives, and I haven't been shy at all about sharing mine pretty publicly. (on phone): 30 million children... (voiceover): And I know that sharing personal data can have major upsides. The media's reported on some of the most dramatic examples. NARRATOR (archival): ...is conducting several studies to see how far wearables can go in detecting disease. A California man, rescued yesterday after going missing credits a social media post for helping save his life. Six out of ten people with Alzheimer's or dementia will wander away from home, and that's where this little box comes in. PATEL: There's no doubt, our data is out there, but did you ever wonder where it goes and how it's used? I do. I want to know: what are the benefits of sharing and the risks? And what can we do to make our connected world safer and just generally better for everyone? First, the basics. I want to know: how much personal data have I been sharing? Meet Hayley Tsukayama, tech journalist and data privacy advocate. (chuckles) PATEL: You must be Hayley. Hi, nice to meet you. What is going on? (voiceover): Okay, hold on, wasn't expecting this. She's already got my whole life up on the screen. So this is a data report gathered about you from social media posts, from the images of you that may be on the internet, from publicly available information. This is like a family business. That's my dad's cell phone number. Why is this available? (groaning): Oh, this is my address! This is just available? Just available online. Okay, you guys are going to blur all this, right? Please? 
There's-- oh, okay, cool, there's my cell phone number. Oh, this is concerning. As a medical doctor, we all have license numbers and issue dates and expiration dates. (chuckling): And it's literally right here. This is so weird, these are the names of my neighbors from my childhood home. (whispering): Why do you have this information? Like, I haven't seen these names in... years. (voiceover): Not gonna lie, I expected some of my data to be out there, but this is... this is just creepy. How is there a report about who my neighbors were in the '80s? LEVY: You can be assured that pretty much everything you do on the web and with a browser is being tracked and logged. Where you go, what you look at, what apps you use. SAFIYA NOBLE: Modern life has been arranged so that every aspect of what we do and how we live can be captured. MEREDITH WHITTAKER: They pick up our faces as we walk down the street. They track our keystrokes while we're at work on behalf of our employer. (keys clacking) GALPERIN: And ad networks use a bunch of creepy and nefarious ways of figuring out who you are, what you're doing, and where you're going. PATEL: What happened to the web I grew up with? ♪ ♪ I came of age when the internet was a fun and weird party, and everyone was invited. Remember GeoCities, AOL Instant Messenger, or the dancing "ooga chacka, ooga chacka" baby? ♪ ♪ Today, it's clear that innocence is gone. Is the party over? To understand the present, I want a refresher on the past. (machinery whirring) So I'm visiting the Computer History Museum in Silicon Valley. (video game music, Patel chuckling) MARC WEBER: Okay, I'm headed straight for the sun... PATEL: I'm meeting computer historian Marc Weber. What a maneuver! There's a straight line of bullets coming right at you-- (exploding sound effect, Patel groans) (voiceover): ...Who just wasted me in the cosmic battlefield of "Space War," one of the first video games. (laughing) This is actually really fun. ♪ ♪ This is cool. ICBM computer from a nuclear missile. Woah. Human-computer interaction. PATEL: Now, this looks fascinating. I can't tell if this is like, this universal clock apparatus, or what's happening here? This is the first punch card machine. And this was an invention to make the U.S. census faster. PATEL (voiceover): It's kind of wild, but the dawn of automated personal data collection looks like this-- census info poked into humble little punch cards. Marc, can you explain punch cards to me? Because when I think punch cards, I'm thinking lottery tickets, or "buy nine cups of coffee, and your tenth is free." Well, it was the first way to record machine-readable data. PATEL: Okay, let's make sense of the census. This kind of data collection has been going on for centuries. But starting in 1890, data was recorded in a pattern of holes "punched" into a census card. So the cards were used to collect census data. What exact questions were on there? So things like age, gender, number of children. PATEL: When a card passes through the machine, the holes allow pins to make electrical connections through the card. A counter keeps a running tally in each census category as more cards pass through. Census data is how the government allocates resources for schools and hospitals; draws the lines for districts; and decides how many representatives each state will send to congress. It's a reminder of the value of collecting data, but it's a far cry from having my every move tracked on the internet. 
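The tabulating machine Weber describes is, at heart, a counting device: each hole position on a card stands for one census category, and every card that passes through advances the counter for each category punched into it. A rough sketch of that logic, with invented category names standing in for the real 1890 card layout:

```python
from collections import Counter

# Each "card" is just the set of hole positions punched into it.
# These category names are made up for illustration; the real 1890
# census cards encoded many more fields.
cards = [
    {"age_20_29", "female", "children_2"},
    {"age_30_39", "male", "children_0"},
    {"age_20_29", "female", "children_1"},
]

tally = Counter()           # one running total per category, like the machine's dials
for card in cards:          # each card passing through the tabulator...
    tally.update(card)      # ...advances the counter for every hole it contains

print(tally)                # e.g. Counter({'age_20_29': 2, 'female': 2, ...})
```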
To help connect the dots, Marc shows me the next surprising step in the evolution of data collection. ♪ ♪ I look at this and I'm almost like, "Oh, this looks like a modern office building to me." WEBER: Exactly. Each one of those is one of these giant terminals here. (static hissing) PATEL: During the Cold War, IBM figured out how to hook up computers from all across the country to monitor U.S. airspace in real-time. (rocket blasting off) So, this is a terminal from SAGE. (beeping, static) PATEL: It was the first real-time networked data system. Real-time is just that: instantaneous. And "networked" means connected. Computers from all across the country were hooked up to each other, so they could share data in real time, Sort of like the internet. WEBER: The whole point was to track incoming Soviet bombers, and they built, arguably, the first computer network in the world. PATEL: IBM soon realized that their aircraft-tracking computers could be used commercially. ♪ ♪ So In 1960, along came SABRE. (beeping, static) A real-time networked data system that shared personal data. WEBER: The story goes that the head of American Airlines met with IBM, and they decided to do a partnership, the key idea being real time. They could look at your age, your name, financial information, what flights you'd been on before, your dietary preferences for your meal, all of this right then in real time. PATEL: We went from tracking bombs to tracking butts on planes. WEBER: Exactly. PATEL (voiceover): Between "Pong," punch cards, and planes... Whoa, whoa, whoa! PATEL (voiceover): I could see how data started connecting the world-- the military to our airspace and companies to their customers. But what if those customers all over the world were linked up to each other? What would we even call that? Oh yeah, the World Wide Web. PRESENTER: The World Wide Web is a part of our lives and getting bigger every day. But what about the man who dreamt it all up? PATEL: News reports opened our eyes to this innovation and its creator, Tim Berners-Lee. PRESENTER: He simply wanted to give it away for the benefit of others. TIM BERNERS-LEE: When the web started, anybody could make a website. You could make your own style, your own poems, your own blogs, and you could link to other people. The feeling and the sense was of tremendous empowerment. (modem dialing up) NOBLE: Some of us remember the internet when it was managed and curated by people. Where it was, you know, bulletin board systems and chat rooms. ♪ ♪ PATEL: In the late '90s, small start-ups were maturing into big tech companies-- with lots of users and user data. KAHLE: As the web grew, we ended up with a very few companies going and doing all of the hosting. It wasn't your actual personal computer doing it. NOBLE: The large platforms that have emerged are dependent upon user-generated content and user engagement. The modern internet is a wholly commercial kind of internet. PATEL: And as web traffic started to flow more and more through just a few companies, they began to realize the value of all this personal data piling up on their servers. ALEKSANDRA KOROLOVA: Google started to understand the power of individual's data. They have information about not just how the websites are interconnected, but also information about the individuals. HILL: The big technology companies. They're looking at where you are, They're wanting to know what your gender is, what your age is, you know, how much you have to spend, what your political beliefs are. 
They have a lot of power to determine what our privacy is. RASKAR: That's trillions of dollars of economic value that's out there, and there's a constant struggle between who owns the data. We create the data, but somebody else monetizes it. PATEL: As big tech was maturing, early data collection on websites was pretty limited. So what happened to supercharge it? To create the vast tracking infrastructure that allowed Hayley to find out so much about me? It turns out things really got going when someone invented... Cookies? A cookie is not actually a tasty snack. When you go to a website, the website sends a small file to your browser. (keys clacking) PATEL: That small file allows the websites to collect information about your visit, and update it every time you come back. You might have heard of cookies because a lot of websites pop up with messages prompting you to accept theirs. But what happens after you hit "accept"? GALPERIN: It then uses cookies to track who you are and what you're doing, including in many cases, all of the other websites that you go to. RUMMAN CHOWDHURY: Cookies have the cutest name, but they're actually a really important piece of information about you. Every time you click on a menu, every time you look at an item or put it in your cart, every time you want to buy something and click out of the website and come back in. A cookie is what's enabling you to do that. It's a piece of tracking information. You can impute data and information about you from the seemingly innocent behavior online that's tracked via cookies. Do you have certain lifestyle conditions? What age are you? What gender or race are you? Do you have children? PATEL: Okay, so I'm starting to get that these tech companies use things like cookies to keep track of what sites I'm on, what I'm shopping for, and what I'm watching. ♪ ♪ (doorbell chimes) But do they have other ways of learning the secrets in my data? To learn more, I'm tracking down a data security expert in his hacking high-tech superhero lair... naturally. (door rolling open) Are you Patrick Jackson? My name's Alok. Yes, I am. Awesome, I'm in the right place. (door rolls shut) Your phone is sending this extra data to these people that you don't know exist. PATEL: Patrick Jackson has been all over the news for his work as a former NSA research scientist, and is now chief tech officer for the internet security firm Disconnect. And if anyone can help me understand how my personal data is leaking, it's him. Patrick, when I think about the main control of my work, life, play, what I'm doing at home, when I'm on the road, this is the mothership. If I were to relinquish control of this precious device for a few moments, can you show me, like, what I've inadvertently given up? Yes, yeah. I can show you what these data companies know about you and what they're collecting. PATEL (voiceover): Using a hacking technique called a "man-in-the-middle attack," Patrick can intercept the personal data that's leaving my device, essentially eavesdropping on a conversation between me and the websites and applications I use. JACKSON: Contact information, name, phone number, email address. When you agree to allow them to have your location, I can see not only where you're at on a map, but also where you're at in your house, whether you're at the back of the house or the front of the house. When you look at the data leaving the phone, every time you open an app, you send about, maybe, 500 kilobytes of data to their servers. 
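The cookie that Galperin and Chowdhury describe is just an HTTP header exchange: the site's server sends a Set-Cookie header, the browser stores the value and returns it in a Cookie header on every later request, and that returned value is what lets the site recognize the same visitor and keep updating its record. A minimal illustration using only Python's standard library (the cookie name, port, and visit counter are invented for the example, not any real site's setup):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.cookies import SimpleCookie
import uuid

VISITS = {}  # server-side memory keyed by visitor ID, like a site's tracking database

class CookieDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        # Returning visitor? Reuse the ID the browser sends back. New visitor? Mint one.
        visitor = cookies["visitor_id"].value if "visitor_id" in cookies else str(uuid.uuid4())
        VISITS[visitor] = VISITS.get(visitor, 0) + 1   # update the profile on every return visit

        self.send_response(200)
        # The "small file": the browser stores this and sends it back automatically from now on.
        self.send_header("Set-Cookie", f"visitor_id={visitor}; Path=/")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"Visitor {visitor} has loaded this page {VISITS[visitor]} time(s)\n".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CookieDemo).serve_forever()
```

As for the volume Patrick measures in the man-in-the-middle demo: roughly 500 kilobytes of data leave the phone each time an app is opened.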
That is equivalent to about 125 pages of text that you would print in a printer. PATEL (voiceover): It's one thing if the apps on my phone are collecting data, I chose to download them. But Patrick says that other companies have even sneakier ways to get my data. Like something as harmless as a marketing email. To demonstrate, Patrick sends me a mock promotional email. I have an email that says "Shoe Sale." It says "open to view the shoe sale," and I want to open it, so I'm going to. "Hi, Alok, get ready for upcoming shoe sale." I like this graphic, and there's like, social media icons. This looks legit. (voiceover): It turns out, emails can include a tracking pixel-- also known as a spy or invisible pixel. JACKSON: An invisible pixel is a type of image you're never intended to see, but it still initiates a handshake of data from your device to these data companies. (keys clacking) Most of the emails that you receive likely have these tracking pixels in them. Most? Most. This invisible pixel is actually disguised as the banner in that email. In other cases, the tracking pixel will be a one-pixel, very, very tiny image that you would never see with your eyes, and it's not meant to be seen. (zooming in) PATEL: That little pixel, that little spy. See it? Right there. It can hide in images embedded in an email, like a banner, or even an "unsubscribe" button. And when you open the email, it contacts a tracking website to collect data about you. It can suck up your device model, location, and even some browsing history. JACKSON: This is essentially a digital fingerprint of your device. PATEL: No! JACKSON: That fingerprint would identify you as who you are, and somebody else in the future could look at that fingerprint, make you do a new fingerprint, and they would know that this is the same person. PATEL: Companies are snatching copies of my digital fingerprints wherever I go online. And by following that trail... JACKSON: Companies are, over time, collecting everything about you. That's how they build up this, this digital profile about you. HILL: The internet is this data collection machine, and as we're moving through the internet, there are these companies whose whole job is to reassemble that trail. PATEL: So now I know how my digital profile is out there, but why? Who really cares that I take pictures doing handstands and own way too many sneakers? Turns out, there's a whole industry devoted to selling my data. It's the data brokers. ♪ ♪ Data brokers are companies that are set up to collect information, repackage it, and resell it or re-share it to other companies that may want to know information about you. So what do data brokers want with all our personal data? Why do they care that I love chicken mole and that "Cool Runnings" is in my top five movies of all-time? TSUKAYAMA: They collect all this information, and then they're really trying to slice and dice it in ways that are appealing to customers. You know, it could be an employer who's doing research on you, it could be a retailer who wants to know what kind of customers they can target with ads. That kind of information is really valuable for advertisers, because they want to target advertising to a certain type of person. PATEL: Our data ends up eventually getting compiled into reports like these, that are then sold to... whoever's interested? TSUKAYAMA: Data brokers are able to say, "Look, we have a group of people "that will fit the audience of your product. 
"We really are happy to serve this list of people to you," or to make a score about how likely they might be to purchase your product. Okay, I know people say "Oh, your phones aren't listening to you," but we were just talking about retro video games, and here is an ad for home mini-arcade machines-- which looks kind of cool-- but how did it know we were just talking about that? Is my phone listening? Is it a psychic? What's happening? People always say, "I was talking about this product, and then I saw an ad for it. "Matt, are they listening through my phone?" And I'm like, "Well, they didn't hear you, they didn't listen to anything." The truth is actually more frightening. You talked about that thing because they influenced you into having this conversation. It's because the algorithm took you to this place. And that's the situation with our data. PATEL: There have been some outlier examples of ad tech companies listening without consent, but that's against the law. For the most part, advertising algorithms know you so well that they can predict what you would find interesting. They are so good because they've studied all the data gathered about you-- from the data brokers. There's this kind of dossier that's being created about you as you're wandering around the internet, and it's being used to, you know, decide what ad you see. PATEL: Advertising fuels the economics of the internet. And advertising is fueled by you, me, us-- our personal data. So how do the algorithms actually work? You land on a webpage, and that webpage gets your I.P. address. A unique identifier is returned to a data broker. That data broker looks up all the facts that were ever gleaned about you. A dossier that describes you. PATEL: And then advertisers literally bid, they bid on you, in a real-time auction based on the only lesson I remember from Econ 101: supply and demand. DOCTOROW: The demand-side platform that the advertiser is on says, "I have here "an 18- to 34-year-old man-child "with an Xbox in Southern California. "Who wants to pay to cram something nonconsensually into his eyeballs?" And that takes place in a marketplace, and you have advertisers on the other side who have standing orders. They're like, "I want to advertise to "18- to 34-year-old man-children in Southern California who own Xboxes," and the marketplace conducts an automated high-speed auction where it's just like, "I'll pay so many fractions of a cent," "I'll pay that plus ten percent." (auction bell rings) The high bidder gets to show you an ad. PATEL: All of this happens nearly instantly. And it's possible because data brokers sort and cull personal data into super detailed consumer groups. This is the process that delivers relevant personalized ads for cute shoes or that new tea club-- if you're into that sort of thing-- which all the ads seem to know about you. TSUKAYAMA: Here we have, for example, Soccer Mom. PATEL: Okay. I might be able to identify with that. A sporting goods retailer could then go and check out a file on what a Soccer Mom's online activity is like. TSUKAYAMA: What are the things that they tend to spend money on, and do they like to save? Are there certain times of year where they might spend more or less? PATEL: With all this data about Soccer Mom's consumer habits, retailers can send personalized ads with uncanny timing. Some people like the results-- soccer moms get deals. Advertisers keep websites we love in business. But... some consumer categories can be more concerning. 
HILL: Sometimes they'll have categories that are like, "This person has erectile dysfunction, "this person has a gambling problem. "This is a list of, kind of, people that are likely to fall for frauds." So these can be really powerful and damaging lists. TSUKAYAMA: So this one, for example, diabetes. PATEL: Wow. Okay, healthcare demographic... The thing about that, right, is like, when you're thinking particularly about health data, that indicates that you're probably in a pretty sensitive situation. PATEL: I'm now thinking about all the patients I take care of. And I shudder to think at the targeted ads that are deliberately targeting these individuals. (keys clacking) TSUKAYAMA: Data brokers have lists about sexual assault victims, they have lists about dementia sufferers. Those are some really concerning types of categories, and you could definitely see how an advertising company could take advantage of people on those lists. So I think a lot of people's first reaction on seeing these data broker reports is like, where has all of this come from? There are a lot of places where data brokers get information. Most commonly apps that you download that have deals with data brokers to share information. PATEL: All right, I knew that apps were getting my data, but I had no idea some were sharing it with data brokers. How is that even legal? TSUKAYAMA: How often have you seen a terms and conditions screen just like this? All the time. Anytime I download anything, basically. (both chuckle) I download apps all the time for work, play, life, convenience. I just scroll all the way down and hit accept. because I'm over it. Right. And I think that's how most people feel about it. PATEL: "By accepting these terms, you allow the platform to freely..." "...content for demonstration, promotion, and..." TSUKAYAMA: Terms and conditions are really long. Cool, I'm in a commercial and don't even know about it. TSUKAYAMA: Carnegie Mellon once did a study that said it would take days for somebody to get through all of the terms and conditions. They actually use the word exploiting. "You represent and warrant..." TSUKAYAMA: That puts people in a really difficult position when we're supposed to manage our own privacy, but we're also supposed to use all these things that are products that will make our lives better. (papers rustling, teacup clatters) Data brokers are a big industry. It's about a $200 billion industry right now. I think a lot of people expect it to grow to, you know, $400 billion in the next few years. NOBLE: That level of data from our past, all the things we've done being used and put into systems to help predict our futures, that's unprecedented, and that's rife with the potential for discrimination, for harm, for exclusion. PATEL: Okay, I know I said that I'm not the type to log in to... (voiceover): Look, I love being online. I like sharing what I'm up to on social media, and I'm not afraid to share my thoughts. Okay, bugs, I don't bite you, you don't bite me. It's that easy. But some of this personal data is, well, personal. So, now I need to know: What can I do to make my digital life more private and secure? GALPERIN: Privacy and security are not the same thing. For example, uh, Facebook is extremely interested in protecting your security. They want to make sure that it is always you logging in to your account. They will go through a great deal of trouble to keep your account secure. But you enter all kinds of data into that account. You tell it where you are located. 
You send it all of your pictures. You send messages and Facebook collects all of that data. They don't want you to keep it private. They want you to hand it to them so that they can use it in order to serve you targeted ads and make them money. PATEL (voiceover): My accounts are mostly secure when I control access to them. But that doesn't mean the data I put in them stays private, far from it. Privacy and security are not the same. But they are two sides of the same coin, and I have to understand both if I'm gonna protect my personal data. MITCHELL: When your privacy is taken from you, your agency is taken from you. Privacy is that whisper. When you think you're whispering to your friend, but you're shouting in a crowded elevator, you're robbed of something. And that's why privacy is so important. PATEL: What can I do right now to protect my privacy and my security? To learn tips and tools to preserve my data on both fronts, I'm chatting with hacker and educator Matt Mitchell and cybersecurity expert Eva Galperin. First up: privacy. Sorry. (whispering): Privacy. Matt is a privacy advocate at CryptoHarlem, as in cryptography, the process of hiding or coding information. Hey. I'm Matt. Thanks for coming to CryptoHarlem. MITCHELL (voiceover): CryptoHarlem is anti-surveillance workshops for the Black community and all marginalized communities around the world. And our mission is to develop people's digital skills so they can actually help us in the fight to push back on digital harms. MITCHELL: Privacy is not secrecy. Privacy is just a door. A door in your home. There's a sense of, like, I want to control what can be seen, what can't be seen just for me. PATEL (voiceover): And I want to close that door. So how do I do this? Even just a little? PATEL: Okay, what do I do to make all this safer for me? What do I do, who do I talk to, how do I start? You have to ask yourself, "Is this a problem that needs to be fixed?" Privacy isn't a switch. It's a dial. You get to control how much you share with who and with what. PATEL (voiceover): Let's start with a big one, something I use all the time, every day, and I bet you do, too. Have you ever used this website called Google? I've heard of it. Yeah. Well, let's check this out. Well, if we go here to myactivity.google.com... ...it'll show us all the things that you've been doing. So for example, when we go here, we see all the different Google services that you use. I don't think they make a service you don't use. PATEL (voiceover): These platforms are so deeply embedded in many of our lives and for good reason. They make products that can be really useful. It's hard for me to imagine going a single day without searching the web or using a navigation app. But they also suck up a lot of our data. It's a trade-off that I'm comfortable making, within reason. I've used literally everything Google has ever offered. And it knows what you've used. And it records everything you've ever used and how you've used it. Every search term you've used, every, uh, you know, shopping item you've used. But this is the dashboard to delete it. PATEL (voiceover): I didn't know this, but I can literally just delete huge amounts of data that Google is storing about me. And the same is true for a lot of other services. We can dial to whatever level we feel comfortable. For example, on LinkedIn, you would just click on me, and then you would go to your settings and privacy. 
Here, when we go to manage your activity, it tells you that, you know, you started sharing your LinkedIn data with a permitted application. PATEL (voiceover): Treating privacy like a dial means it's not all or nothing. You can have your data cake and eat it, too. Like now I'm thinking I want to log into everything-- all my social media, my email, my LinkedIn, everything-- regularly and look to see who is using these. Exactly, it is about awareness. Furthermore, the companies, they know how many people actually use the privacy controls. And by you even peeking in it, you're saying I believe privacy matters. At the Electronic Frontier Foundation, Eva shows me how I can take my privacy game to the next level... learning some Surveillance Self-Defense. Although, apparently not that kind of self-defense. GALPERIN: So, the next step in your privacy journey is fighting back against types of corporate surveillance. I-- and one of the things that websites really like to do when, uh, is not just to track what you are doing on their website, but to track all the other websites that you go to. And they do this using cookies. There's some companies I trust, and I'm like, fine, you have these cookies. They're chocolate chip. I know where they were made. I know what you're doing with them. But then there's these third party companies, I don't want them around me. You can use a browser extension to eat these cookies, uh, and fight back against this kind of tracking and keep those websites from seeing where else you're going. PATEL (voiceover): Browser extensions are add-ons for your web browser that give it extra features and functionality, like eating cookies. I'm imagining this, like, digital Cookie Monster that's eating up all these pieces of my online activity so that companies don't know what I'm doing online. And it reduces the amount of privacy tracking. Am I understanding that? What these browser extensions do is they get rid of, uh, the tracking cookies that these websites use to see all the other sites that you're going to, which is none of their business. (knuckles crack) PATEL (voiceover): My personal digital world is starting to feel comfy, controlled and a lot more private. I've learned how to close the doors and blinds. But still, someone could open them right back up. I could get hacked. How do I lock my doors? Time to talk about security. MITCHELL: The tragedy of this recent pandemic teaches us that we're all pretty vulnerable. And without that herd immunity, without that community response, you can't just say, "I'm okay, and therefore I don't have to worry about this." This is the first time that I've ever heard someone compare data privacy to epidemiology and herd immunity. All you need is one person to make a mistake and it's game over. That's what happens to so many other corporations, businesses, even hospitals during a ransomware attack that'll freeze all machines and stop them from working until someone pays a bunch of hackers. My one error could compromise the security of my entire institution. MITCHELL: Human beings will make mistakes. Folks won't wash their hands sometimes, right? But we try to teach best practices to prevent the worst. Okay, Matt, you have my wheels spinning. I do not want to be patient zero in this, like, massive infectious data leak. What can I do to start pulling it back and better protecting my information? Well, I got something for that, okay? Oh, you have, like, a bag of tricks. I've got a bag. 
I thought you were just gonna say, like, change your password. We got to go all the way. This is a privacy screen. And this will keep people from being able to look over your shoulder. Shoulder surfers beware. Can't look through that. This is for like, using my computer in a public space, the airplane. Everywhere. This is a Faraday bag. You got to keep your phone in here. This blocks RF signals. It's basically aluminum foil... PATEL (voiceover): Yo, Matt? Can we slow this down a little bit? You can't get hacked when you're not attached to anything. Is this really necessary, a Faraday bag? PATEL (voiceover): Okay, this looks a little more complicated than I expected. I guess I just have to become Dr. 007. We also have a Wi-Fi pineapple. This is a portable access point. And we also could use it to detect and stop access points from going bad. PATEL (voiceover): Okay, Matt, now you've just gone rogue. MITCHELL: Let's say you have a hard drive with some important files from work in it. What happens if you lose it? Someone can plug that in and have access to all your stuff. Not if it's an encrypted hard drive. I understand-- There's more. How do you know that you're not being bugged? This is nuts. This is something I could use to find out if there is a audio bug in the room. (beeping) I can find Wi-Fi signals... (device beeps) Oops, something's here, right? PATEL (voiceover): Honestly, I thought being a spy would be more fun. But I'm starting to feel shaken, not stirred. How am I supposed to keep track of all this stuff? I'm no hacker genius like Matt. What if I just left all of this behind and just quit. Goodbye, digital world. (clock ticking) (yawning) Do you know if there's breaking news? (cat meows) Look in my eyes and tell me. What are we doing here? What are we doing? Like, I don't even know how many steps I've taken today. Because my step counter is not on my hand. I'm technically disconnected, this isn't cheating, but can someone tell me what the Suns score is? CREW MEMBER: Uh, down by 11 at the half. (groans) (cat purring) Oh, I am supposed to pick up my daughter. How long do I have to do this for? (voiceover): All right, maybe it's not that easy. But there's got to be some middle ground between Inspector Gadget and Fred Flintstone that's right for me. Instead of going completely off the grid, I'm checking back in with Eva for some more surveillance self-defense. And the best place to start is with the basics: passwords. I'll be honest, my current password is basically a combination of my first pet's name plus my birthday or a favorite video game or song. I don't think any hacker's really gonna guess that. But you're telling me that there is still a vulnerability there. Yes. Hackers can go find out information about you from data brokers, including, you know, the name of the street you grew up on or the city where you went to college. So you wouldn't want a password like the name of your pet. Or password123. Well, I did have a pet fish, and his name was Password123. So, I guess that kind of, like, hits both boxes and makes me more vulnerable. Let's start by, uh, creating some strong and unique passwords for all of your, uh, your different accounts. And that will make them much more hacker-proof. I'm all ears. Fantastic. Well, we're going to start by rolling these five dice. GALPERIN (voiceover): There are easy ways to create passwords that are long and strong and easy to remember that are not based on information that attackers can easily find about you. 
And the way that we do that is we use word lists and dice. The word list is essentially just a long list of dictionary words with numbers attached. So what we're going to do is we're going to write down the numbers on this dice. 45263, my new lucky numbers. Okay. And now we are going to look up 45263 in this book. So there are words that correlate to the numbers that I just randomly rolled. Yes. Okay. I want a cool word like "tiger." The word is "presoak." "Presoak?" "Presoak." 34115. "Henna." Okay, I like this. PATEL (voiceover): The idea behind this dice game is that it's both random and long. Hackers using fast, powerful computers and sophisticated password-cracking software are able to try many, many passwords a second. A recent study by a security firm showed that a seven-character password can be cracked in just a few seconds. But a 12-character password with uppercase letters and symbols is much more difficult to break. Studies show it could take hackers well over 100 years to crack. That's safe. But also more difficult to type or remember. Eva's six-word random passphrase is easier to remember than a random string of 12 characters and practically impossible to break. "Stinging." "Ignition." Six is "Clutch." "Clutch," okay. "Presoak Henna Stinging Ignition Clutch Handbrake." (bell dings) Fantastic. Now you have a passphrase. These six words. It is long. It's unique. It's difficult to guess. And it's relatively easy to remember, considering how long it is. It is a very good way of creating random passwords, but a lot of password managers will automatically do this for you. PATEL (voiceover): Once you have a passphrase, you can use this to unlock a password manager that generates strong random passwords and stores them securely. Now I have this beautiful random password or passphrase, but what happens if someone steals this? Is it just game over, efforts are gone? No, it is not. Uh, that leads us to the next step on your surveillance self-defense journey, and that is two-factor authentication. PATEL (voiceover): This may sound like a hacker ninja term, but it's just an added layer of security, requiring an additional method of confirming your identity. Most of the time, it's just a simple text message. GALPERIN: Because it requires that you have two things in order to log into your account. You will need both your password and also this code, so the site will send you a text message. There is also an authenticator app. The site will ask you to take a photo of and then go to the site in a QR code. The best way to be a security superhero is to find the stuff that works for you and implement it as sort of smoothly and seamlessly as you can. PATEL (voiceover): These data self-defense tweaks are great for me and my own data. But what about everyone else? I'm wondering, is there a way for all of us to share data and get all the sweet benefits, without sacrificing privacy and security? At M.I.T.'s Media Lab, I'm meeting up with Ramesh Raskar, who is working on a way to access the benefits of personal health data-- finally, a topic I understand-- without compromising our private information. At Ramesh's Camera Culture Lab, they're building some data collecting tools that seem straight out of a science fiction movie. PATEL: Love when a sign says "Danger: Laser." You know important things are happening in here. (voiceover): They've got cameras that can literally see around corners. PATEL: Get out! (both laughing) RASKAR: We are not violating the laws of physics. 
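Eva's dice-and-wordlist method is commonly known as Diceware: every word in the list is keyed by five digits between 1 and 6, so five dice rolls select one word uniformly at random, and stringing six words together gives a passphrase that is long, memorable, and not derived from anything a data broker knows about you. A small software version of the same procedure, assuming a Diceware-style wordlist file such as the one the EFF publishes (one "key word" pair per line; the filename below is an assumption):

```python
import secrets

def roll_index() -> str:
    """Simulate five fair dice: a key like '45263'."""
    return "".join(str(secrets.randbelow(6) + 1) for _ in range(5))

def load_wordlist(path="eff_large_wordlist.txt") -> dict:
    """Each line maps a five-digit dice key to a word, e.g. '45263 presoak'."""
    words = {}
    with open(path) as f:
        for line in f:
            key, word = line.split()
            words[key] = word
    return words

def passphrase(n_words=6) -> str:
    words = load_wordlist()
    return " ".join(words[roll_index()] for _ in range(n_words))

if __name__ == "__main__":
    print(passphrase())   # e.g. "presoak henna stinging ignition clutch handbrake"
```

The authenticator-app form of two-factor authentication that Eva mentions usually follows the TOTP standard (RFC 6238). The QR code shown at setup carries a shared secret; afterwards the app and the website each compute the same six-digit code from that secret and the current time, so typing the code proves you hold the second factor. A compact sketch of that calculation with the standard library (the secret value below is a common documentation example, not a real credential):

```python
import base64, hashlib, hmac, struct, time

def totp(secret_base32: str, digits: int = 6, period: int = 30) -> str:
    """RFC 6238: HMAC-SHA1 over the current 30-second time step, dynamically truncated."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time()) // period
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# The same secret the site embedded in the QR code when you set up the app (example value).
SHARED_SECRET = "JBSWY3DPEHPK3PXP"
print("current code:", totp(SHARED_SECRET))   # matches what an authenticator app would show
```

Back at M.I.T., the cameras that see around corners are only the start of what Ramesh's Camera Culture Lab builds.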
PATEL (voiceover): And cameras that can see inside our own bodies. That's about as personal as it gets. For Ramesh, protecting patient privacy is paramount. But he's found a drawback in how we do that. Since data is locked up for privacy, advancement in medical science is not as fast as it could be. Because, think about it, access to tons of personal health data could help researchers make major medical breakthroughs. But today, researchers have to ask for your consent to peek at your data in order to learn from it. RASKAR: When we talk about consent, someone's still peeking into your data. A no-peek privacy, on the other hand, is where nobody can peek at your data. PATEL (voiceover): No-peek privacy? But how could researchers learn anything? RASKAR: There are many ways to create no-peek privacy. First one is the notion of smashing information. PATEL (voiceover): No, not literally smashing your phone. But I'm not gonna lie... ...that was really fun. Smashing is basically the idea of taking raw data and smashing it into just the wisdom. PATEL (voiceover): According to Ramesh, smashing data is simply the process of extracting the useful information, which he calls the "wisdom," while obscuring the private data. In other words, a collection of private health records contains two kinds of data. One kind is the personal data-- the names, conditions, and histories, the stuff we absolutely want to protect. But collectively, the records may contain valuable information about patterns in health care. The wisdom. I'm thinking about examples in health care, and the individual data, the patient data is protected, and you can't reverse engineer it. Let's take a concrete example of COVID. Imagine during the early days of the pandemic, we could have sent medical experts from here in Boston to China, to Korea, to Italy, and embedded them in those hospitals to understand what COVID pneumonia looks like on a chest X-ray. And they could have come back to Boston, all those medical experts, and said, "Hey, together, we can figure out what this is." PATEL (voiceover): So, the experts would return back with just the wisdom, an understanding of COVID derived from being exposed to large data sets. But they wouldn't come back with specific patient info. None of the raw, sensitive, and private data. But of course, that would require initially sharing patient data; a non-starter. Because of privacy, because of regulations, because of national security issues, they couldn't. PATEL (voiceover): This is where A.I., Artificial Intelligence, comes in. Instead of sending experts to each hospital, Ramesh is working on a way to send A.I. to learn instead. An A.I. model could be trained on patients' lung X-rays to learn what signs of COVID look like. The private health data would never be copied and would never leave the hospital or the patient's file. The A.I. would only transmit its conclusions, the wisdom, the smashed data. RASKAR: It's not enough to just remove the name from the chest X-ray, but make sure that you don't send any of the pixels of the chest X-ray at all. PATEL (voiceover): The A.I. can learn patterns and gain knowledge without having to keep hold of everyone's data. So the "wisdom" from one hospital can be combined with the "wisdom" of others. RASKAR: Achieving privacy and benefits of the data simultaneously, it's like having a cake and eating it, too; it's available. And it's just a matter of convincing large companies to play along those rules. 
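The program doesn't spell out the algorithm behind "smashing," but the arrangement it describes, where the model travels to the data and only what was learned travels back, matches what is usually called federated learning; Raskar's group also works on the related split-learning approach, in which the intermediate network activations are literally called "smashed data." A loose illustration under the federated-learning reading, with random toy numbers standing in for hospitals' private records:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hospitals, each with private data that never leaves the building.
hospitals = [
    {"X": rng.normal(size=(50, 4)), "y": rng.integers(0, 2, size=50)},
    {"X": rng.normal(size=(80, 4)), "y": rng.integers(0, 2, size=80)},
    {"X": rng.normal(size=(30, 4)), "y": rng.integers(0, 2, size=30)},
]

def local_update(weights, data, lr=0.1):
    """One step of logistic-regression training, computed where the data lives."""
    X, y = data["X"], data["y"]
    preds = 1 / (1 + np.exp(-X @ weights))
    grad = X.T @ (preds - y) / len(y)
    return weights - lr * grad            # only these four numbers are ever shared

weights = np.zeros(4)                     # the shared global model: the "wisdom"
for round_ in range(20):
    # Each site trains locally; a coordinator averages the resulting weights.
    local_weights = [local_update(weights, h) for h in hospitals]
    weights = np.mean(local_weights, axis=0)

print("model learned without any records leaving a hospital:", weights)
```

Each hospital contributes only its updated model weights; the X-rays and patient records themselves stay where they are.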
PATEL: Smashed data is one way we can learn to stop worrying and love sharing our personal data. But every corporate network would have to agree to smash our raw data. I'm not convinced we should wait around for big tech companies to do this. HILL: The big technology companies, they have built the infrastructure of the whole internet. You're not just interacting with these companies when you know you're interacting with them. PATEL: Even if it seems like you're on a totally independent website, big tech still sees you. For example, tons of websites, like some of the ones owned by NASA, Pfizer, BMW, even PBS, use Amazon Web Services, which means Amazon gets some data when you visit them. Or when you visit any app on your iPhone, Apple knows. Also, every time you've been to a webpage that has a Facebook "like" button, Facebook gets to gobble up your personal data. KAHLE: So it may seem like the web is decentralized because it comes from many different places. But in fact, there's centralized points of control. CHOWDHURY: Right now, for example, when you are in a social media app, you're sort of locked in, right? All your information is on there. It's hard to leave a social media app because you're like, "All my friends are there, all my photos are there. "If I move to another app, I have to rebuild that community." GALPERIN: Those social media websites are controlled by a very small number of companies. And the rules about using those websites, who has access to them and what kind of behavior is acceptable on them, are set by the websites themselves. PATEL: This centralized data monopoly, how do we start to dismantle it? That's exactly what proponents of an idea called the decentralized web want to do. GALPERIN: The decentralized web is sort of our antidote, the antithesis of the web as we now think of it. CHRISTINE LEMMER-WEBBER: The decentralized web can be seen in contrast to the centralized web, of course. Instead of what has become, on the internet, of a few big players kind of controlling a lot of the web and the internet space, we're really kind of rolling things back to kind of what the vision of the internet was. PATEL: I'm wondering what a decentralized web would look and feel like? So I'm meeting up with coder Christine Lemmer-Webber, who is working on writing the standards and rules for how social media could work on a decentralized web. LEMMER-WEBBER: Most people are familiar with social networks: you know, they've used, you know, X-slash-Twitter; they've used Facebook; they've used Instagram. The decentralized social web is like that, but no one company, no one gatekeeper controls the thing. We want to be able to have all of our different social media sites, all of our different online communication tools be able to talk to each other. DOCTOROW: A decentralized web is one that is focused on the self-determination of users. You're not locked in. So you find a service, and you like it. (duck quacks) And then you and they start to part ways. The art you made, the stories you told are there. In a decentralized web, you go somewhere else. You go somewhere else, and you don't lose any of that. PATEL (voiceover): But imagine, instead of having one Facebook-type company, we each had our own data live on our own devices, and it was easy to join or merge with others, or disconnect from them. Leaving one group and joining another wouldn't mean rebuilding everything from scratch because your data moves with you. It sounds like a tech fantasy. 
But it's actually something we've already proven works online. Consider something you may use every day, or if you're like me, every minute: email. LEMMER-WEBBER: So what a lot of people don't realize is that they use decentralized networks every day. It doesn't matter if somebody's on Gmail, or if they're on Hotmail, they're on their university email. They don't even have to think about it. They type an email to their friend, and they send it off, and it gets there, right? At the heart of it, that's kind of the basics of what we're doing. PATEL: In other words, you can get an email sent from Hotmail even if you have a Gmail account. Or you can create your own email server, for that matter. It doesn't matter what email service you use because the protocol, the rules, are written to allow the email servers to talk to one another. LEMMER-WEBBER: That's because there's a shared protocol that says, here's how we get the messages from place to place. Social networks could just work like email. PATEL: And in fact, there are plenty of decentralized projects out there like Mastodon and Blue Sky putting the user's profile and data back into the users' hands. LEMMER-WEBBER: In the systems we're moving towards that will be much more the case where private communication is much more private. They're, by default, secure in that type of way. We're building new foundations for the internet that allow for healthier, safer, more decentralized systems, gatekeeper-free. That's the vision and the future we're trying to build. PATEL: No doubt, there are secrets in your data. NOBLE: Everything we touch is increasingly datafied, every dimension of our lives. CHOWDHURY: All of this data goes into feed algorithms. And algorithms make predictions about how long you will live, or your health, or your lifestyle. PATEL (voiceover): And while I learned how to protect my personal privacy and security, there's only so much I can do myself. HILL: There are things that individuals can do to protect their privacy. Ultimately, in order to avoid a huge crisis, it requires systemic change. GALPERIN: There are limits to what we can do as individuals. There are also things that need to be addressed through regulation, through litigation, and through legislation in order to make all of us safer. PATEL: It's comforting to know there are techies out there trying to protect us and our data, designing new ways to enjoy the fun and convenience of the digital world without being exploited. KAHLE: New technologies, new games can proliferate without centralized points of control. That's the excitement of the decentralized web. LEMMER-WEBBER: I can't wait for people to see this vision we're building for collaborative, more consensual, decentralized social networks. That's going to be really exciting. PATEL: The future we're building might not look too different from the internet of today. But it could be much more private and secure, if we get it right. DOCTOROW: This is about how technological self-determination and regulation work together to make a world that is more safe for human habitation. ♪ ♪ ♪ ♪ ♪ ♪ ♪ ♪ ♪ ♪ ♪ ♪
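One last sketch, for the shared-protocol idea Lemmer-Webber uses email to explain: servers run by different operators can deliver messages to one another because they all implement the same rules, and decentralized social networks such as Mastodon do the same thing over a protocol called ActivityPub. A toy version with made-up domains and a deliberately simplified message format:

```python
# Each server is run by a different operator, but they all speak the same protocol:
# deliver(message), where a message has a sender, a recipient address, and a body.

class Server:
    def __init__(self, domain):
        self.domain = domain
        self.inboxes = {}                      # user -> list of received messages

    def deliver(self, message):
        """The shared protocol: any server can hand a message to any other."""
        user = message["to"].split("@")[0]
        self.inboxes.setdefault(user, []).append(message)

NETWORK = {}                                   # domain -> server, a stand-in for DNS lookup

def send(frm, to, body):
    domain = to.split("@")[1]
    NETWORK[domain].deliver({"from": frm, "to": to, "body": body})

# Two independently run servers; neither controls the other.
NETWORK["gmail.example"] = Server("gmail.example")
NETWORK["university.example"] = Server("university.example")

send("alok@gmail.example", "friend@university.example", "Switching servers, and my posts come with me!")
print(NETWORK["university.example"].inboxes["friend"])
```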
Info
Channel: NOVA PBS Official
Views: 203,440
Keywords: nova, pbs, novapbs
Id: ih_GGQX_zmM
Length: 53min 57sec (3237 seconds)
Published: Thu May 16 2024