How to Disappear Completely

Captions
Hello... no, it is working. It must be because everyone stopped talking, so they must have heard me. Hi everybody, welcome back again to the Privacy, Security, Identity miniconf, the one with the longest name; we're winning that. Up next we've got Lilly Ryan, who is a pen tester, Python wrangler, and recovering historian from Melbourne. More importantly, she has two adorable greyhounds and is one of the conference's many knitters. Lilly does an amazing talk where she wears a wizard robe and has a wand. She's not doing that today; instead she's going to tell us how to disappear completely.

Hi folks. Yeah, this one... we're all good? Everyone can hear me? Everyone at the back? Yes, good. I've got a thumbs down and a thumbs up and now I'm confused; I'm going to assume that because you laughed, it was fine. My name is Lilly Ryan, and today I am going to teach you how to disappear completely. You can find me online as attacus, or on Twitter as attacus_au, because someone has been squatting on the other name for a few years now. In a previous life I was a medieval historian, a linguist, and a graphic designer, but these days, as they said, I'm a pen tester, and my job is to push the boundaries of systems. The systems that I like to interrogate the most are the social systems that govern how and when technology is used. I'm especially interested in the non-consensual use of technology in public spaces, in how we can make these uses more visible to the public, and in how we can challenge them when we find them.

So today we're diving into one of the more thorny and increasingly prevalent examples of non-consensual public tech use: facial recognition technology. Over the next half an hour I'll show you how this tech works, where it gets used, and how it can be broken. We'll also talk about why it's important to keep being critical of facial recognition technology, even though legislation like the GDPR promises to make it easier for this tech not to be used without your informed consent.

Let's start with the science-fiction version of facial recognition. This is the intro for the American television series Person of Interest. There's a lot of high-tech sci-fi face detection going on: connecting faces with identities, interpreting moods, linking phone call metadata, all of that stuff. It's really great fiction, but reality has already stepped over this threshold.

You might have seen this clip online before. In September 2017, this footage of traffic surveillance cameras at a Chinese intersection surfaced online. Because we live in the era of fake news, it's easy to be skeptical about short clips that claim to unveil near-futuristic tech, but this is actually real software running on a live system. Here's a better screen capture of that same software. The labels are in English this time instead of in Mandarin, but they are still probably too small for you to read, so I'll tell you what it's doing. It's detecting the basics of the things in the video stream in front of it: is this thing a car, is it a bicycle, is it a pedestrian, is it a truck? If it's a car, what color is it? You can also see it highlighting the path that everybody is taking, and if it thinks it sees a pedestrian, it will start describing what gender it perceives that person to be, the approximate age of the person, what they might be wearing as they pass, that kind of thing. Having access to this kind of capability makes it really easy to track somebody across a busy city, even in China, where there's a population of over a billion people.
You can see, at the side of the screen, that the software is pulling out a lot of still images; it stores those to compare against images later on for reference. SenseTime, the company that makes this software, has another product called SenseFace, which does exactly what it says on the tin. SenseFace picks up live streams from cameras at places where people pass through with their faces clearly visible, places like escalators and train station ticket barriers, and its software matches the features of the faces it sees against the faces it already has on file. The Chinese government requires an ID card for every citizen over the age of 18, which is also what this technology was trained on. So there's not only a ready-made government database to compare against; the system has almost certainly seen every single one of these faces before. Its results are extremely accurate because this is its training set.

It can be really easy to look at this kind of stuff and dismiss it out of hand, because we hear a lot that the general attitude of the Chinese public to things like privacy and surveillance is fundamentally different to the attitudes in other countries. But systems like this absolutely get used in other places. The United States in particular gives us a number of egregious examples, and this isn't just because Silicon Valley keeps coming up with new things to embed facial recognition technology into: the diversity of state-by-state laws in that country has also led to some really extreme use cases. One among the many things that Amazon has been criticized for recently is selling cheap access and consulting services to local US law enforcement agencies that want to use its Rekognition API. At a time when people are unjustly targeted by law enforcement in the US and many other places every day, just because of the color of their skin, it's really troubling to learn that yet another closed-source proprietary algorithm is being mixed into law enforcement processes without any kind of apparent oversight.

It's partly because of things like this that a lot of the loudest critical and ethical research into facial recognition technology also comes from the United States. This is Joy Buolamwini. She is a researcher at the MIT Media Lab, she is one hell of a poet, and she is campaigning really hard for more transparency around how accurate facial recognition tech actually is. In 2015, Buolamwini founded the Algorithmic Justice League to raise awareness of the social biases that are built into facial recognition tech. She started it because her own research projects into facial recognition and augmented reality were extremely difficult for her to finish: everything she tried was full of proprietary algorithms that had been trained on heavily biased datasets and that failed to recognize her dark skin as a human face. In order to complete her work she had to resort to holding a white mask over her face, or to getting one of her lighter-skinned colleagues to perform the experiment for her, because the systems weren't picking her up.

So in December 2017, under the umbrella of the Algorithmic Justice League, Buolamwini published a comparative study of the accuracy of three commercially available facial recognition APIs: Face++, Microsoft's Azure Face API, and IBM's Watson.
She found that there was an overall trend towards accuracy, but only if you were a lighter-skinned, male-presenting person. There were many, many more false matches if you were a darker-skinned or a female-presenting person, and especially if you were both of those things. Her work also highlighted just how opaque a lot of these algorithms are. While everyone is busy copyrighting and closing off the algorithms and the datasets that make up their secret-sauce facial recognition systems, almost nobody is independently auditing any of these things. Most of the statistics we have about how accurate any given facial recognition API is come from the companies that sell them. This is not science; this is marketing. And when this kind of tech gets sold to third parties and then used to make decisions about who gets arrested and who should be detained, there is really no recourse for any kind of justice.

It's not just bias based on skin color, either. Some of you may have seen this study: late last year, a pair of researchers from Stanford University published a study that claimed to be able to use facial recognition to detect whether somebody was homosexual or heterosexual. I yelled so loudly when I read this paper that one of my neighbors came over to see if I was okay. While the algorithm absolutely does take pictures of human faces and assign them a rating of gay or straight, the way the algorithm underneath was set up completely ignored the idea that anybody might be bisexual or asexual, or that gender exists on a spectrum instead of as a strict binary. And yet all over the news, people were talking about this as some kind of magic scientific facial recognition gaydar, instead of questioning what really is pseudo-scientific phrenological nonsense. Nobody really questioned it, mostly because it used phrases like "facial recognition" and "machine learning", and so of course it must be cutting-edge and accurate.

But it isn't just law enforcement agencies and headline-grabbing researchers that use this kind of tech. Facial recognition is used in marketing and advertising contexts far more often than anything else, and often with even less public scrutiny, awareness, and oversight. This, for example, is the maintenance interface of a digital advertising screen from a shopping mall in Canberra, Australia, taken in October 2017. You can see it detecting the faces of the two people in front of it. In Australia it is very common for digital advertising screens to do this. A lot of them are equipped with cameras, right up the top of the little monolith-looking thing, and they come pre-installed with tracking software. This one is running software by a French company called Quividi, who specialize in commercial applications of facial recognition. The Westfield corporation, which operates the shopping mall where that particular screen was found, runs many other shopping malls across Australia, New Zealand, Europe, and North and South America. Using advertising screens like that one, they detect what they think is your age, your gender, and your mood, and then they correlate this information with data from your smartphone pinging off the complimentary Wi-Fi routers all around the malls. That gives them a real-time map of where people are, what advertising they should be showing to those people, which suspicious-looking people to track for security purposes, and how they might sell everybody everything else.
When this made news in Australia in 2017, Westfield was contacted for comment, and a spokeswoman for Westfield said that they do this to "connect customers with retailers in a more meaningful way". I'm not actually sure what that sentence means, because as someone who walks around that kind of mall, I don't feel a special relationship with anybody, especially if I don't know that this kind of tracking is happening. Connection is a two-way thing, and if I'm not participating, it's not a connection.

Westfield isn't the only retail operator doing this kind of thing inside shopping environments. In May last year, Foodstuffs, the parent company that runs the New World and Four Square supermarkets in New Zealand, got a lot of press when it came out that they were using facial recognition technology in their stores to try to detect shoplifters. It only came out because they used this tech to apprehend a visitor to one of their supermarkets who turned out not to be the person they were after. So this cutting-edge technology got a lot of bad press for being used to persecute people for crimes, which clearly didn't work very well. But even more than that, people were angry that this kind of technology was in use every time they stopped by to get a loaf of bread, and they hadn't been informed about it.

Even churches have been getting in on the facial recognition action. In the last few years, a whole subset of the industry has become dedicated to making it easy to know just who has and hasn't been to chat with God this week. It is wild; look it up.

At its core, facial recognition tech is just trying to answer one question: who is this particular person? The problems come when we overlay different meanings on that answer. The question of "who is this?" can be either benign or threatening when we look at it through the lens of law enforcement, or marketing, or religion, or sexuality, or any number of other applications. Sometimes it makes life convenient, but often that comes at the expense of being invisibly surveilled without being told about it. And depending on who you are, where you are in the world, where you are in life, and how political factors work for or against you at any given point in time, the thing that was convenient today can be threatening tomorrow. Without questioning the convenience, and without pushing at the boundaries and the assumptions we make about this technology, the infrastructure will remain. Facial recognition tech is often marketed as something that helps us trust that a person is who they say they are. But given what we know about how many different facial recognition products there are, and how few of them are independently audited at all, how much can we trust that facial recognition itself is who it says it is? How much can we trust that the facial recognition tech used in the spaces we inhabit won't one day have its algorithmic biases turned around on us?

The good news, for a given value of "good", is that plenty of people, from artists to academics to fashion designers, have given thought to how to circumvent these technologies, and we can do this ourselves as well. To do it properly, we first have to understand how facial recognition works. As we've just established, "facial recognition technology" is a broad umbrella term that covers a lot of proprietary implementations, algorithms, and approaches, as well as a handful of open source solutions. But although every company's product looks a little different under the hood, they're all operating using roughly the same kinds of steps: detection, faceprint creation, and recognition.

Before a system can recognize a face, it needs to know that it is looking at a face. This is what detection does. Traditional computer-vision face detection is all about patterns: maps of light and dark parts of an image, and the pattern formed by the eyes, the nose, and the mouth, which is quite a distinctive thing. This is a representation of the Viola-Jones object detection framework, which is a fairly commonly used system for detecting a face in an image. It detects patterns of dark and light relative to the patterns next to them, and it refines this model over a few iterations. It's pretty crude pattern matching, and it needs a full-frontal image of a face in order to work effectively, but within those limitations it does the job fairly well. This is what most phone cameras will use to detect faces and put a little box around them. It's cheap; it really doesn't take much processing power to do.
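If you want to poke at the detection step yourself, OpenCV ships with pre-trained Haar cascades that implement the Viola-Jones approach. A minimal sketch in Python (the image file names here are placeholders):

```python
# Viola-Jones face detection via OpenCV's bundled Haar cascade.
import cv2

# Load the pre-trained frontal-face cascade that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("street_scene.jpg")          # placeholder input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # the cascade works on light/dark patterns

# Each detection is a bounding box: the "little box" a phone camera draws.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detected.jpg", image)
```

As the talk notes, this is cheap pattern matching: it runs happily on low-powered hardware, but turn your head too far from the camera and the detections disappear.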
The next step is faceprint creation, which helps the system get a more accurate map of a human face from a two-dimensional image. Traditional facial recognition tech often does this by creating a mesh. This measures things like the distance between your eyes, the width of your nose, the depth of the eye sockets, the shape of the cheekbones, and the length of the jawline. Often it begins with a generic model of a human face, trained on lots of different photographs of people and then averaged out. That model is placed over your face, which has already been detected by the system, and the dot points are adjusted to fit the edges of the dark and light patterns of your face. The model is then refined by relating the dots to one another, which creates a mesh of your face that can track you as you tilt and move, even when you aren't looking directly at the camera.
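To make the mesh idea concrete, here is a minimal sketch using dlib's 68-point facial landmark predictor, one common open source implementation of this kind of dot-point fitting. It assumes you have separately downloaded the shape_predictor_68_face_landmarks.dat model from the dlib model zoo; the image file name is a placeholder:

```python
# Landmark-based faceprint sketch using dlib's 68-point predictor.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

image = dlib.load_rgb_image("portrait.jpg")  # placeholder input
for rect in detector(image):
    # Fit the averaged face model to the detected face and read off the dots.
    landmarks = predictor(image, rect)
    points = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(68)]

    # A crude faceprint-style measurement: horizontal distance between eye centers.
    # In dlib's scheme, points 36-41 outline one eye and 42-47 the other.
    eye_a = sum(p[0] for p in points[36:42]) / 6
    eye_b = sum(p[0] for p in points[42:48]) / 6
    print("approximate distance between eyes (pixels):", abs(eye_a - eye_b))
```

Real systems relate many such measurements to one another and normalize for pose and scale, but the principle is the same: turn the dark/light edges of a face into a set of comparable numbers.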
But as I said, this isn't the only way to do it; newer methods take different approaches. This is what the iPhone's TrueDepth camera sees. The iPhone projects an array of 30,000 infrared dots over your face to get a real-time sense of depth, and this is just one frame of data from those sensors. When it takes a capture, it takes many, many frames of data in real time and compares them to each other to get a fairly accurate 3D map of your whole face.

Once the system has a map of the face, it can try to recognize it. Recognition connects all of this information with the other data the system already has about you. In the case of the iPhone's Face ID, you register your face first, and the phone stores that information to compare all new data against. Alternatively, a numerical representation of a mesh pattern can be compared against other mesh patterns in a different database, like a government database.

Some recognition approaches are a little different, though: some of these earlier steps might be skipped in favor of a machine learning approach that happens after a face is detected. For example, Facebook's facial recognition algorithm is called DeepFace, which I thought was ill-advised, but there you have it. This is what Facebook uses, in the jurisdictions where it's allowed to at any rate, to auto-tag you and your friends in images when you upload them to the site. This method is apparently 97% accurate, although again, that is according to Facebook themselves, so that's marketing, not science.

Another approach, used by things like the Python face_recognition library, uses a pre-trained deep convolutional neural network to do the detecting and the recognizing. It takes still photographs of people's faces, does a bit of mesh creation, and then skews them until the system has some kind of average, a bit like what the face_recognition library has done to this picture of Will Ferrell up here. Then the neural network takes 128 separate measurements of the face. What do these measurements correspond to? We don't actually know, because the neural network decides for itself what it wants to measure; however it does that, it comes up with these mysterious 128 numbers. These measurements are unique enough to identify people when the camera sees them again, because the network can run the same calculations and cross-correlate. I've tested this library out by training it on a picture of myself, and it was really spooky how quickly it would recognize me, even when I was wearing a mask or a fake beard or pulling a silly face.
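That experiment is easy to reproduce. A minimal sketch with the face_recognition library, assuming two placeholder image files:

```python
# Embedding-based recognition with the face_recognition library
# (a wrapper around dlib's deep CNN face model).
import face_recognition

# "Register" a face: compute the 128-number measurement vector for a known photo.
known_image = face_recognition.load_image_file("me.jpg")              # placeholder
known_encoding = face_recognition.face_encodings(known_image)[0]

# Later, check whether a new photo contains the same person.
unknown_image = face_recognition.load_image_file("camera_frame.jpg")  # placeholder
for encoding in face_recognition.face_encodings(unknown_image):
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"match={match}, distance={distance:.3f}")  # smaller distance = more similar
```

The comparison is just a distance between two 128-dimensional vectors, which is why a fake beard or a silly face doesn't move the needle much: it barely changes the numbers the network has decided to care about.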
Which leads me to my next point: how do we mess with these things? It depends on whether you want to mess with facial detection, stopping things at the source so the system doesn't know there's a person there at all; with faceprint creation, so that the record of your face created now doesn't match whatever your face looks like in a few seconds or a few minutes; or with facial recognition, where you try to stop a system from recognizing that this is you specifically. The focus of the adversarial methods I'm about to discuss is the physical layer, that is, how facial recognition systems can be attacked by a person standing in front of a camera. If you do further reading on this topic, you'll find there are also attacks on the digital layer, for example by altering recognition information in a database so that it presents false information when it's consulted. I find that part just as fascinating, but I only have half an hour, and you're all going to want your dinner at some point. If you want to go off on a tangent, there are also whole fields of study about attacks on other biometric recognition: gait analysis, iris scanning, fingerprint spoofing, et cetera. I'd encourage you to look those up later.

With those caveats, let's get invisible. Here are some of the things people have tried over many years to avoid detection. This is one of the most traditional facial recognition attack techniques of all: obscuring the whole face, or the majority of it, using a mask or balaclava. This is a look mostly favored by hackers in stock photos, but some facial recognition systems can absolutely be fooled by masks. Even in 2019 you can still trick some of them with masks made out of printed pictures of your target's face. The iPhone seems to be the gold standard for facial recognition security, but Android, which is still the most popular smartphone operating system in the world, has a variety of problems here. Last month Forbes conducted an experiment in which they were able to bypass the facial recognition authentication feature on several common Android phones with what's called a presentation attack, which is where a device unlocks when it's shown a representation, such as an image, of the person it's expecting to see. They did this to an LG G7 ThinQ (I'm not sure if anyone has one; you can correct me later), a Samsung S9, a Samsung Note 8, and a OnePlus 6, using a 3D-printed head. Their article on these findings prompted every single one of these companies to release statements saying that biometric authentication wasn't really recommended as a strong smartphone security measure, and that they encourage the use of PINs or passphrases for better security. In fact, in May last year, Mashable proved that the OnePlus 6 face unlock function could be fooled by someone just holding up a printed-out photograph of the target. And if that's not enough: in late 2017, Microsoft's Windows 10 Hello face authentication system was discovered to be vulnerable to the exact same attack. Microsoft has patched this, so of course none of us have machines that are vulnerable to it now, because we all run software updates as soon as they're available, right? Yep. Good.

Another traditional technique for avoiding recognition, favored by many celebrities over the years, is obscuring the eyes. Your eyes and the bridge of your nose provide a really rich collection of unique data points that feature heavily in the most traditional face maps. Sunglasses cover these things, therefore making it kind of hard to identify you, but this doesn't always fool the facial detection step; again, it really depends on the software running under the hood. Making weird faces is another way facial recognition is sometimes fooled: some software can't recognize you if you aren't using a neutral expression. Airport smart gates fall into this category, and I get tripped up by this a lot, because being from this part of the world means that I, like a lot of you, have to travel a very long way to get anywhere else, and I definitely do not look the same as my passport picture after I've been in transit for 28 hours. The number of airport smart gates that have failed to recognize me after that has been non-zero.

In the realm of specialized anonymity fashion, we have the paparazzi scarf, created by a guy called Saif Siddiqui. The scarf was inspired by a bicycle reflector ruining a flash photograph he was taking, and him realizing it would sometimes be kind of convenient to want to ruin a particular photograph. Apparently Paris Hilton has been seen with one, but in case you are thinking of getting one, be warned that the scarf in this picture will cost you 680 New Zealand dollars, which is why Paris Hilton has one and I don't. For everyday people who don't have 680 bucks to throw away on one scarf, you can get yourself one of these: the paparazzi visor. These became briefly popular in late 2017. It's advertised as a way to protect your face from the sun and also from annoying cameras, and much like a balaclava it covers most of your face, although from all the reports I've heard it is also very annoying to wear. But don't worry, there is slightly less annoying adversarial fashion: Reflectacles. These work along the same lines as the paparazzi scarf, but for your face. Reflectacles are sunglasses with reflective decoration on them. They work with visible light and with infrared light, and they have the added benefit of being a little less fashionably jarring to wear than the paparazzi visor; I'd even wear these once in a while.

But any survey of facial recognition attacks would be incomplete if I didn't mention Adam Harvey's work. Harvey is an artist based in Berlin, mostly known for his CV Dazzle work, which is a computer-vision attempt at the old warfare tactic of dazzle camouflage. So, time for a quick history lesson, because this is a talk being given by me:
dazzle camouflage was an experimental form of painting used on ships in World War One. It wasn't intended to conceal the ship at all; the ship is very obviously there. But it made it more difficult to estimate things about the ship, like how fast it was going, its distance from you, and which way it was facing, and in that way it made the ships harder for enemies to fire on or predict things about. The dazzle camouflage method achieved this, as you can see here, by using huge blocks of color in irregular patterns to break up a ship's outline and all of its features. In just the same way, CV Dazzle will not make you invisible to other human beings; it certainly will not, unless you're at a sci-fi convention or a fashion show. But to computers, it will make it a lot harder for some of them, particularly the older ones, to recognize your face. It does this more or less the same way they did it in World War One: it breaks up the outlines of the things the algorithms are designed to detect, mostly by messing with the patterns of light and dark.

Harvey is also responsible for the HyperFace project, which is focused on making the parts of the image around a face, your atmospheric stuff, more confusing for computers, instead of focusing on the face itself. Patterns like this one aim to saturate the facial recognition algorithm, the facial detection algorithm in particular, with a bunch of patterns that it really wants to interpret as faces, and thereby stop it from recognizing the actual faces present in the same image. The idea is that you could get a shirt printed up like this, wear it down the street, and cause a bunch of computers to really freak out.

Outside the realms of art and fashion, though, university researchers have also been trying hard to work around facial recognition tech. Most of these methods are still pretty experimental, but some of them seem to be effective, and more are being developed all the time. Something that I haven't seen done for facial recognition quite so much yet, but which is really having an impact on other forms of machine-learning-based computer vision, is stickers. A research team at Google discovered that these trippy-looking stickers will completely mess with a neural network trying to detect things in a photo; in this case, the presence of the sticker makes the neural network think that this banana is actually a toaster. I have not seen this tried with human faces yet, and it's definitely not going to stop facial recognition techniques built on more traditional ideas about patterns of light and dark, or on mesh creation. But if anyone is in the mood to conduct some research, there's a good chance it would work on some kinds of experimental new neural-network-driven facial recognition tech, and, like several of the other options we've discussed, it is clearly something that will challenge ideas about fashion further, and this can only be a good thing.
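For the curious: the core of that sticker result is gradient descent on the patch's pixels instead of on the model's weights. The sketch below is a rough, hypothetical illustration of the idea against a pretrained ImageNet classifier; the patch placement and training loop are my own simplifications, not the Google team's actual method:

```python
# Toy adversarial patch: optimize a small patch so a pretrained classifier
# sees "toaster" (ImageNet class 859) wherever the patch is pasted.
import torch
import torchvision.models as models

model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT).eval()
for p in model.parameters():
    p.requires_grad_(False)  # we train the patch, not the network

patch_param = torch.zeros(1, 3, 64, 64, requires_grad=True)
optimizer = torch.optim.Adam([patch_param], lr=0.05)
target = torch.tensor([859])  # ImageNet class index for "toaster"

for step in range(200):
    patch = torch.sigmoid(patch_param)   # keep pixel values in [0, 1]
    scene = torch.rand(1, 3, 224, 224)   # stand-in for a real photo
    scene[:, :, 80:144, 80:144] = patch  # paste the sticker into the scene
    loss = torch.nn.functional.cross_entropy(model(scene), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

A real attack also randomizes the patch's location, scale, and rotation during training (and normalizes inputs the way the classifier expects) so the printed sticker keeps working in the physical world.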
This is the one I'm most excited about. In a paper published in March last year, some researchers from Fudan University in Shanghai experimented with the effects of infrared light on facial recognition and detection systems. This team was able to demonstrate that by beaming patches of infrared light onto your face in different ways, you can make some facial recognition systems think that you are someone else. They successfully got one system to recognize a member of their team as four different people using this method, including Moby, for some reason. They did this with a small array of infrared lights mounted on the brim of a cap, as you can see here. The lights were programmed to shine on the face of the person wearing the hat in different patterns, and these patterns were enough to convince the machine looking at the image that this person was a different person. I really like this solution, particularly because the cap can be programmed to do whatever you want, the infrared light won't bother you that much, and when you're done messing with the facial recognition tech you can just take it off and put it away, instead of removing a lot of makeup or looking super suspicious with a visor down over your face.

Academic research and fashion and art are all working together to come up with newer and stranger ways to defeat the newer and stranger algorithms being deployed. But honestly, the most effective way to combat facial recognition systems has nothing to do with facing the cameras at all, and everything to do with facing other people. The fact that the GDPR is now in effect means that a lot of companies in Europe in particular, as well as companies that do business in Europe, are now forced to consider transparency of usage and informed consent when it comes to deploying facial recognition tech. But even though the GDPR might make it more obvious to you that you are being surveilled, it still does nothing to guarantee that the underlying algorithms are based on anything unbiased, or that the accuracy of any of this tech has been independently audited. It does not do that. There's also the fact that, in a global economy, non-GDPR-compliant tech will absolutely still be produced and sold and used everywhere, including in Europe, by people who think they're not going to get caught.

There's also the fact that the Australian government, in between bouts of infighting, is still considering things like the Identity-matching Services Bill, which has been sitting before the House of Representatives since early last year; it's been busy, apparently. This bill aims to draw together a database of all the faces, all the pictures of people, from passports, from driver's licenses, from criminal records, immigration data, and other documents, and then to match that with the information that the government has about our identities, which is usually quite a lot. It then wants to give Peter Dutton's Department of Home Affairs access to this whole capability. They literally call this tech "the Capability", with a capital C, which does not sound cartoonishly evil in the slightest.

Then there's the Assistance and Access Bill, which Ben spoke about earlier. This legislation was rushed through, in a completely reprehensible and unconsidered manner, last month. It means that any Australian, or anyone living in Australia, can be asked by the Australian government to create backdoors in almost any software, or face jail time. We really can't expect realistic algorithmic transparency, or any kind of provable accuracy in facial recognition technology, when any line of code can arbitrarily become a state secret, and when rogue versions of facial recognition technology can be deployed to random devices all over the place. But even though algorithmic transparency is still a complete pipe dream, striving for it matters now more than ever. In a world where we have the Assistance and Access Bill, and Donald Trump is President of the United States, everyday things just seem to get weirder and worse for basically everybody.
We need to continue to ask questions, to talk to each other, and to be alert to the uses and abuses of biased tech. We need to challenge the fancy statistics we see when it comes to facial recognition accuracy, and ask ourselves whether what we're hearing is marketing or independently verified fact, because right now most of it definitely isn't unbiased, and most of it definitely isn't accurate for everybody. It's also almost impossible to know which facial recognition systems can be trusted to be socially and ethically responsible; my guess would be probably zero of them.

As a final note: I am not here to make the case that facial recognition technology has no place in the world. There are cases, like international border crossings, where it can sometimes be useful, and there are cases, like unlocking an iPhone, where it can be really convenient. But there are also cases where recognition is performed on us and used to draw conclusions about us that we currently don't have any real way to meaningfully interrogate, and that we can't necessarily trust. I'm here to urge caution in where these things get deployed, and to urge the creation of more skepticism, more openness, better datasets, and clearer ways to know whose technology is being used under the hood. And if we're all aware that facial recognition is going to be invisibly performed on us, then in the absence of change from the top down, I would encourage you all to think about how you might become invisible to it in response. Thank you.
Info
Channel: linux.conf.au 2019 — Christchurch, New Zealand
Views: 45,722
Rating: 4.0345993 out of 5
Keywords: lca, lca2019, linux.conf.au, linux, foss, opensource, LillyRyan
Id: LOulCAz4S0M
Length: 33min 56sec (2036 seconds)
Published: Tue Jan 22 2019