Facial Recognition: Last Week Tonight with John Oliver (HBO)

Video Statistics and Information

Reddit Comments

The fact that the Clearview guy has a history of running phishing scams is quite concerning. How do you as a company look at that resume, and think "He seems like a trustworthy individual to work with!". And yes I know: $$$, but wouldn't they be opening themselves up to lawsuits if Clearview ended up doing suspicious things with the data they've collected? (As in more suspicious than what they're publicly doing)

Edit: Spelling

πŸ‘οΈŽ︎ 186 πŸ‘€οΈŽ︎ u/HoLYxNoAH πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

Covering your face outside to avoid being videoed is deemed suspicious now; this is awfully disturbing.

πŸ‘οΈŽ︎ 235 πŸ‘€οΈŽ︎ u/Rond3rd πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

Anyone with a mirror?

πŸ‘οΈŽ︎ 28 πŸ‘€οΈŽ︎ u/ZenithPeverell πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

The most effective form of political action proposed herein is to hold up signs expressing displeasure in front of random cameras.

That's not a great plan.

Anyone have any other ideas?

πŸ‘οΈŽ︎ 28 πŸ‘€οΈŽ︎ u/slip-7 πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

The very fact that the police in the UK stop you, harass you, and then force you to have your picture taken after giving you a fine, all because you decided to cover up in front of the camera, shows you the state of policing in the world today.

πŸ‘οΈŽ︎ 70 πŸ‘€οΈŽ︎ u/Dreams-in-Data πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

We're officially living in a Black Mirror episode.

πŸ‘οΈŽ︎ 89 πŸ‘€οΈŽ︎ u/Driew27 πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

Black people not being recognised reminds me of a great episode of Better Off Ted. https://www.youtube.com/watch?v=lMy5YpJysy4

πŸ‘οΈŽ︎ 18 πŸ‘€οΈŽ︎ u/erkanan πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

If you’re an EU citizen/resident, or a California or Illinois resident, you can request to be deleted and opt out of Clearview’s system altogether.

Edit: For them to process your request, you need to upload a clear photo of your face and a photo of a government-issued ID.

https://clearview.ai/privacy/requests

πŸ‘οΈŽ︎ 82 πŸ‘€οΈŽ︎ u/Darabo πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies

Go fuck yourself, Hoan Ton-That.

πŸ‘οΈŽ︎ 56 πŸ‘€οΈŽ︎ u/BoogsterSU2 πŸ“…οΈŽ︎ Jun 15 2020 πŸ—«︎ replies
Captions
Our main story tonight concerns facial recognition, the thing that makes sure my iPhone won't open unless it sees my face, or the face of any toucan, but that is it. Facial recognition technology has been showcased in TV shows and movies for years. Denzel Washington even discovered a creative use for it in the 2006 action movie Deja Vu: "We have facial recognition software." "Yeah, let's use it on the bag. Cross-match it to all the bags on the south side of the city in the 48 hours leading up to the explosion." "Right. Don't think that's ever been used this way." "Same day. Bingo." Bingo indeed, Denzel. With smart, believable plot development like that, it is frankly no wonder that Deja Vu receives such glowing IMDb reviews as "an insult to anybody who finished elementary school," "worst movie of all time," and my personal favorite, a one-star review that reads, "Bruce Greenwood, as always, is great and so sexy, and there's a cat who survives" — a review that was clearly written either by Bruce Greenwood or that cat.

Now, the technology behind facial recognition has been around for years, but recently, as it's grown more sophisticated, its applications have expanded greatly. For instance, it's no longer just humans who can be the targets: "The iFarm sensor scans each fish and uses automatic image processing to uniquely identify each individual. A number of symptoms are recognized, including 'loser fish.'" Yes, "loser fish," which, by the way, is an actual industry term. Now, that company says it can detect which fish are losers by facial scan, which is important, because: can you tell which one of these fish is a loser and which one is a winner? Are you sure about that? Because they're the same fish. This is why you need a computer.

But the growth of facial recognition, and what it's capable of, brings with it a host of privacy and civil liberties issues, because if you want a sense of just how terrifying this technology could be if it becomes part of everyday life, just watch as a Russian TV presenter demonstrates an app called FindFace: "If you find yourself in a cafe with an attractive girl and you don't have the guts to approach her, no problem. All you need is a smartphone and the application FindFace. Find new friends! Take a picture and wait for the result. Now you're already looking at her profile page." Burn it all down. Burn everything down. I realize that that is a sentence no one involved in creating the app ever once thought, but just imagine that from a woman's perspective: you're going about your day when suddenly you get a random message from a guy you don't know that says, "Hello, I saw you in cafe earlier and used FindFace app to learn your name and contact information. I'll pick you up from your place at 8:00. Don't worry, I already know where you live."

But one of the biggest users of facial recognition is, perhaps unsurprisingly, law enforcement. Since 2011, the FBI has logged more than 390,000 facial recognition searches, and the databases law enforcement pull from include over 117 million American adults and incorporate, among other things, driver's license photos from residents of all of these states. So roughly one in two of us have had our photo searched this way. And the police will argue that this is all for the best. In fact, here is an official with the London police explaining why they use it: "Here in London, we've had the London Bridge attack, the Westminster Bridge attack. The people who were guilty of those offences were often known by the authorities. Had they been on some database, had they been picked up by cameras beforehand, we may have been able to prevent those atrocities, and that would definitely be a price worth paying." Okay, it's hard to come out against the prevention of atrocities — this show is, and always has been, anti-atrocity — but the key question there is: what's the trade-off? If the police said they could prevent all robberies, but the only way to do that was by having an officer stationed in every bathroom watching you every time you take a shit, I'm not sure everyone would agree that it's worth it, and the people who do might want that for reasons other than preventing crime.

And now is actually a very good time to be looking at this issue, because there are currently serious concerns that facial recognition is being used to identify Black Lives Matter protesters, and if that's true, it wouldn't actually be the first time, as this senior scientist at Google, Timnit Gebru, will tell you: "There was an example with Baltimore police and the Freddie Gray marches, where they used face recognition to identify protesters, and then they tried to link them up with their social media profiles, and then target them for arrests. So right now a lot of people are actually urging people not to put images of protesters on social media, because there are people out there whose job is just to look up these people and target them." It's true: during the Freddie Gray protests, police officers used facial recognition technology to look for people with outstanding warrants and arrest them, which is a pretty sinister way to undermine the right to assemble.

So tonight, let's take a look at facial recognition. Let's start with the fact that even as big companies like Microsoft, Amazon, and IBM have been developing it, and governments all over the world have been happily rolling it out, there haven't been many rules or a framework in place for how it is used. In Britain, they've actually been experimenting with facial recognition zones, even putting up signs alerting you to the fact that you're about to enter one, which seems polite. But watch what happens when one man decided he didn't actually want his face scanned: "This man didn't want to be caught by the police cameras, so he covered his face. But they stopped him, they photographed him anyway, and an argument followed." "What's your suspicion?" "The fact that he walked past —" "I would do the same. It's a cold day as well." [unintelligible] "£90 fine. There you go, look at that. £90. Well done." Yeah, that Guy Ritchie character was rightly mad about that. And incidentally, if you are not British, and you're looking at that man, then at me, wondering how we both came from the same island, let me quickly explain: British people come in two variations — so emotionally stunted that they're practically comatose, and cheerfully telling large groups of policemen to fuck off and do one if they're gonna take a photo of me face. There is absolutely nothing in between the two.

And the UK is by no means alone in building out a system. Australia is investing heavily in a national facial biometric system called "the Capability," which sounds like the name of a Netflix original movie, although that's actually perfect if you want people to notice it, think "that seems interesting," and then forget it ever existed. And you don't have to imagine what this technology would look like in the hands of an authoritarian government, because China is, unsurprisingly, embracing it in a big way: [unintelligible] "We can match every face with an ID card and trace all your movements back one week in time. We can match your face with your car, match you with your relatives and the people you're in touch with. With enough cameras, we can know who you frequently meet." That is a terrifying level of surveillance. Imagine the Eye of Sauron, but instead of scouring Middle-earth for the One Ring, he was just really into knowing where all his orcs like to go to dinner. And some state-funded developers in China seem weirdly oblivious to just how sinister their projects sound: "The Terminator is a favorite film of our founder, so they gave the system the same name, but they want to prove something good [unintelligible]. Okay, in The Terminator, Skynet is evil, rains down death from the sky. But in China, Skynet is good." "Yeah, that's the difference." Oh, that's the difference, is it? You know, it's not exactly reassuring that you called your massive, all-encompassing AI network "Skynet, but a good version." It'd be like if the Today Show built a robot journalist and called it "Matt Lauer, but good." "Oh yeah, this one's completely different. Sure, he does also have a button under his office desk, but all it does is release lilac air freshener. This is the good version."

The point is, this technology raises troubling philosophical questions about personal freedom, and right now there are also some very immediate practical issues, because even though it is currently being used, this technology is still very much a work in progress, and its error rate is particularly high when it comes to matching faces in real time. In fact, in the UK, when human rights researchers watched police put one such system to the test, they found that only 8 out of 42 matches were verifiably correct. And that's even before we get into the fact that these systems can have some worrying blind spots, as one MIT researcher found out when testing numerous algorithms, including Amazon's own Rekognition system: "At first glance, MIT researcher Joy Buolamwini says, the overall accuracy rate was high, even though all companies better detected and identified men's faces than women's. But the error rate grew as she dug deeper. Lighter male faces were the easiest to guess the gender on, and darker female faces were the hardest. One system couldn't even detect if she had a face, and the others misidentified her gender." "White guy? No problem." Yeah, "white guy, no problem," which, yes, is the unofficial motto of history, but it's not like what we needed right now was for computers to somehow find a way to exacerbate the problem. And it gets worse: in one test, Amazon's system even failed on the face of Oprah Winfrey, someone so recognizable her magazine only had to type the first letter of her name and your brain auto-completed the rest. And that's not all: a federal study of more than a hundred facial recognition algorithms found that Asian and African American people were up to a hundred times more likely to be misidentified than white men.

So that is clearly concerning, and on top of all of this, some law enforcement agencies have been using these systems in ways they weren't exactly designed to be used: "In 2017, police were looking for this beer thief. The surveillance image wasn't clear enough for facial recognition software to identify him, so instead police used a picture of a look-alike, which happened to be actor Woody Harrelson. That produced names of several possible suspects and led to an arrest." Yeah, they used a photo of Woody Harrelson to catch a beer thief. And how dare you drag Woody Harrelson into this. This is the man who once got drunk at Wimbledon in this magnificent hat, made this facial expression in the stands, and in doing so accidentally made tennis interesting for a day. He doesn't deserve prison for that — he deserves the Wimbledon trophy.

And there have been multiple instances where investigators have had such confidence in a match, they've made disastrous mistakes. A few years back, Sri Lankan authorities mistakenly targeted this Brown University student as a suspect in a heinous crime, which made for a pretty awful finals week: "On the morning of April 25th, in the midst of final season, I woke up in my dorm room to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks in my beloved motherland, Sri Lanka." That's terrible. Finals week is already bad enough, what with staying up all night alternating shots of 5-hour Energy and Java Monster Mean Bean while trying to push your brain to remember the differences between Baroque and Rococo architecture, without waking up to find out you've been accused of terrorism because a computer sucks at faces.

Now, on the one hand, these technical issues could get smoothed out over time. But even if this technology eventually becomes perfect, we should really be asking ourselves how much we're comfortable with it being used — by police, by governments, by companies, or indeed by anyone. And we should be asking that right now, because we're about to cross a major line. For years, many tech companies approached facial recognition with caution. In fact, in 2011, the then-chairman of Google said it was the one technology the company had held back, because it could be used "in a very bad way." And think about that: it was too Pandora's-box-y for Silicon Valley, the world's most enthusiastic Pandora's box openers. And even some of the big companies that have developed facial recognition algorithms have designed it for use on limited data sets, like mug shots or driver's license photos. But now, something important has changed, and it is because of this guy, Hoan Ton-That, and his company, Clearview AI. And I'll let him describe what it does: "Quite simply, Clearview is basically a search engine for faces. So anyone in law enforcement can upload a face to the system, and it finds any other publicly available material that matches that particular face." Okay, so the key phrase there is "publicly available material," because Clearview says it's collected a database of three billion images — larger than any other facial recognition database in the world — and it's done that by scraping them from public-facing social media like Facebook, LinkedIn, Twitter, and Instagram. So, for instance, Clearview's system would theoretically include this publicly available photo of Ton-That at what appears to be Burning Man, or this one of him wearing a suit from the exclusive "Santa Claus After Dark" collection at Men's Wearhouse, and this very real photo of him shirtless and lighting a cigarette with blood-covered hands, which, by the way, is his musician profile photo — because yes, of course, he's also a musician. I can only assume that that's the cover of an album called "Automatic Skip if This Ever Comes Up on a Pandora Station."

And Ton-That's willingness to do what others have not been willing to do — that is, scrape the whole internet for photos — has made this company a genuine game-changer, in the worst possible way. Just watch as he impresses a journalist by running a sample search: "So here's the photo you uploaded to me." "Mm-hmm, a headshot from CNN." "So, the first few images — it's found a few different versions of that same picture. But now, as we scroll down, we're starting to see pictures of me that are not from that original image." "Oh, wow. Oh my god. So this photograph is from my local newspaper, where I lived in Ireland, and this photo would have been taken when I was, like, 16." "Wow. That's crazy." "Yeah, it is." Well, look, here is some advice: if there is an embarrassing photo of you from when you were a teenager, don't run away from it. Make it the center of your television show's promotional campaign and own it. Use the fact that your teenage years were a hormonal Stalingrad. Harness the pain.

But the notion that someone can take your picture and immediately find out everything about you is alarming enough, even before you discover that over 600 law enforcement agencies have been using Clearview's service. And you're probably in that database, even if you don't know it. If a photo of you has been uploaded to the internet, there is a decent chance that Clearview has it — even if someone uploaded it without your consent, even if you untagged yourself or later set your account to private. And if you're thinking, "Hold on, isn't this against the terms of service for internet companies?" — you should know Clearview actually received cease-and-desist orders from Twitter, YouTube, and Facebook earlier this year, but it has refused to stop, arguing that it has a First Amendment right to harvest data from social media, which is just not at all how the First Amendment works. You might as well argue that you have an Eighth Amendment right to dress up rabbits like John Lennon. That amendment does not cover what I think you think it does.

And yet Ton-That insists that this was all inevitable, so we should all frankly be glad that he's the one who did it: "I think the choice now is not between, like, no facial recognition and facial recognition. It's between, you know, bad facial recognition and responsible facial recognition, and we want to be in the responsible category." Well, sure, you want to be — but are you? Because there are a lot of red flags here. For starters, apps he developed before this included one called Trump Hair, which would just add Trump's hair to a user's photo, and another called ViddyHo, which phished its own users, tricking them into sharing access to their Gmail accounts and then spamming all their contacts. So I'm not sure that I would want to trust my privacy to this guy. If, however, I was looking for someone to build an app that let me put Ron Swanson's mustache on my face as my checking account was quietly drained — sure, then he'd be at the top of my list. And despite Clearview's repeated reassurances that its product is intended only for law enforcement — as if that were inherently a good thing — he's already put it in a lot of other people's hands, because in addition to users like the DEA and the FBI, he's also made it available to employees at Kohl's, Walmart, and Macy's, which alone has completed more than 6,000 facial searches. And it gets worse, because they've also reportedly tried to pitch their service to congressional candidate and white supremacist Paul Nehlen, suggesting that they could help him use "unconventional databases" for "extreme opposition research," which is a terrifying series of words to share a sentence with "white supremacist." Now, Clearview says that that offer was unauthorized, but when questioned about who else he might be willing to work with, Ton-That's answer hasn't been reassuring: "There's some countries that we'd never sell to, that are very adverse to the US — for example, like, China and Russia, Iran, North Korea. So those are the things that are definitely off the table." "What about countries that think that being gay should be illegal?" "It's a crime? So, like I said, you know, we want to make sure that we do everything correctly. Mainly focus on the US and Canada. And the interest has been overwhelming, to be honest. Just so much interest that, you know, we're taking it one day at a time." Yeah, that's not terribly comforting. When you ask a farmer if he'd let foxes into the henhouse, the answer you hope for is "no," not "the interest from foxes has been overwhelming, to be honest, just so much interest, so, you know, we're taking it one day at a time." And unsurprisingly, reporters for BuzzFeed have found that Clearview has quietly offered its services to entities in Saudi Arabia and the United Arab Emirates, countries that view human rights laws with the same level of respect that Clearview seems to have for Facebook's terms of service.

So, facial recognition technology is already here. The question is: what can we do about it? Well, some are trying to find ways to thwart the cameras themselves: "Hi guys, it's me, Jillian, again, with a new makeup tutorial. Today's topic is how to hide from cameras." Okay, first, that's probably not a scalable solution, and second, I'm not sure if that makes you less identifiable or the most identifiable person on earth: "Officers are on the lookout for a young woman, dark hair, medium build, looks like a mime who went through a shredder." Look, clearly, what we really need to do is put limits on how this technology can be used, and some locations have laws in place already. San Francisco banned facial recognition last year, but the scope of that is limited to city law enforcement — it doesn't affect state and federal use, or private companies. Meanwhile, Illinois has a law requiring companies to obtain written permission before collecting a person's fingerprints, facial scans, or other identifying biological characteristics. And that is good, but we also need a comprehensive nationwide policy, and we need it right now, because, again, there are worries that it is being used in the protests that we are seeing now.

And the good news is that just this week, thanks to those protests and two years of work by activists, some companies did pull back from facial recognition. For instance, IBM says they'll no longer develop facial recognition. Meanwhile, Amazon said it was putting a one-year hold on working with law enforcement, and Microsoft said it wouldn't sell its technology to police without federal regulation. But there is nothing to stop those companies from changing their minds if people's outrage dies down. And for the record, while Clearview says it's canceling its private contracts, it's also said it will keep working with the police, just as it will keep harvesting your photos from the internet. So if Clearview is gonna keep grabbing our photos, at the very least there may be a way to let them know what you think about that. So the next time you feel the need to upload a photo, maybe throw in an extra one for them to collect. Maybe hold up a sign that says, "These photos were taken unwillingly, and I'd rather you not be looking at them." Or, if that feels too complicated, just "Fuck Clearview" — that really does get the message across. And remember, these photos are often being searched by law enforcement, so you may want to take this opportunity to talk to the investigators looking through your photos. Maybe something like, "I don't look like Woody Harrelson, but while I have your attention: defund the police." Really, whatever you feel is most important to tell them, you should put on a sign.

That's our show. Thank you so much for watching. We'll see you next week. Good night.
Info
Channel: LastWeekTonight
Views: 7,346,633
Rating: 4.8644781 out of 5
Keywords:
Id: jZjmlJPJgug
Length: 21min 11sec (1271 seconds)
Published: Mon Jun 15 2020