This video is sponsored by Brilliant. So remember when we were all worried about Clearview AI, that shady facial recognition company that was scraping social media websites for our images, in violation of their privacy policies and without anyone's permission? It actually got sued by the state of Illinois for that. And it was selling access to law enforcement, who were using it to potentially identify criminals, except there were several cases of people, in particular people of color, being misidentified and arrested because of it. Yeah. So it turns out that you can actually do that yourself for free. PimEyes is a facial recognition search website that allows you to upload an image of, in theory, yourself, but in reality anyone, and search for any facial recognition match to the person in that image. It's actually been around since 2017, although it was recently acquired by some other entity in 2020, and has since gone through a rather significant rebrand. Originally, PimEyes was a website where
you could upload any photo. In fact, a screenshot from 2018 shows it saying that you can upload anyone you want. But when they were acquired by this mysterious company, which we'll talk about in a sec, they rebranded as a company designed to help us protect our personal privacy by being able to track when our faces show up on the internet. They seem to be based out of Poland,
but CNN Business actually did a whole in-depth article on this, which I'd highly recommend checking out if you want to read more details on the company. One of the things they did was reach out to the company to see who they were, and the company declined to say. So that's reassuring. But unlike Clearview AI, PimEyes does not scrape social media, or at least it claims not to scrape social media, and instead uses publicly available images that are hosted on websites that allow people to index their images. And importantly, after a lot of media
coverage, due to concerns around people using it for stalking, they've also come out and said that it doesn't actually reveal the identity of the person in the image, which is kind of BS, and you'll see why in a minute. So, you can use PimEyes for free, but if you purchase an individual subscription, which is about $30 a month, you can search for more things and you can also set alerts for specific faces. So of course, I had to try a
subscription to get the full package. So here is their landing page. You can either take a photo using the webcam on your computer, or you can upload your own photo, which is what I did at first. I tried a photo of me wearing my glasses; I'll go on to try one without my glasses, just to see how well they do. And to use it, you have to accept the terms and conditions, where you'll see that it says you're claiming to use your own face, or an image of your own face, in this. But a lot of this is based on the honor system. And as you can see, there are a
lot of results that match my face. And I think it's important to say here that, as a person who exists publicly on the internet, it doesn't surprise me that there are a lot of images of myself. In fact, I expected to have seen, and likely uploaded myself, most of the images that I would find through PimEyes. However, there were a couple of images that
I didn't know existed on the internet. One of them is from the tutoring website that I apparently tutor for. I don't, though I would believe that I signed up to do so at one point. But they're still using my photo on their website, even though I definitely don't work for them. So that's cool. There are also some weirder examples, like this Instagram thing. I don't know exactly what this is, or whether it was an image that used to be on there and isn't anymore. I don't know what's going on there. And here's a photo that I didn't actually
know was taken in the first place. So I both did not upload this and also didn't know that it existed, and that was definitely the interesting part of this. This is a photo from a gala that I went to a few years ago for a nonprofit my dad is involved in. I had no idea this picture was being taken, but apparently someone uploaded it to the internet, and it exists. Now, importantly, one of the claims of PimEyes is that by using their service, you can't actually associate anyone's identity with the image that you're uploading or
the images that you find. But clearly that's not necessarily true, because you can go to the website associated with any of these images and pretty easily figure out who the person is, especially if they're someone who exists on the internet in the form of social media or their own corporate websites. Here's another example: a photo that I think I do know exists, but don't know why it's here. This is from college, when we went to Niagara Falls. Also in this photo is Rediet Abebe, who's a new CS professor at UC Berkeley and an amazing researcher in algorithmic fairness. So check her out, she's great. And then as you get further into the
results, they show you a separate section of results that they are less confident match your face. And in my experience, these generally weren't me. They tended to be random people who look vaguely like me. Um, also a lot of porn. But it was nice to see that there was an indication of how confident they were that the face in the image actually matched me, and that they separated the results into faces they were pretty sure about and faces they weren't. Having said that, I did actually still find a couple
of examples of people who were in the high confidence section who were not me. So this is a great example of how, even if you have an algorithm that's fairly accurate (in fact, in the interview with CNN Business, they say that their algorithm is about 90% accurate in terms of facial recognition), you can still get misidentification pretty easily. So next, I tried a photo of
me without my glasses, just to see whether or not it made a difference. And from what I can tell, it really didn't. I think the order of images tended to trend more with the hairstyle I had in an image than with whether I was wearing glasses: images with my hair in ponytails were higher up, and images with my hair out, like this, were lower down, but the same set of images showed up. All right. So after going through this,
my next question was how this compares to something like Google reverse image search. If you didn't already know, you can actually upload pictures to Google Image Search, and Google will show you results of pictures that are similar to that one. And importantly, reverse image search isn't a facial recognition system. It's not looking for facial features and matching them; it's just looking at the photo overall and categorizing it with photos that are similar, based on the categorization that it uses via its computer vision API. And so, as you can see, it's looking for images of people with curly hair like my own, uh, but isn't
necessarily looking at images of me. So none of the people in any of these pictures are me. And this is all to say that when it comes to identifying people correctly, sites like PimEyes are a huge step up from anything like reverse image search, and can do a much better job of linking random photos of people, whether it be a suspect that you're looking for in a criminal case or a random person that you took a photo of on the street, to their actual identity. And this circles back to the concerns
about stalking that came predominantly from women in the media, and PimEyes's response: that they (a) have a privacy policy that says you can't use it for that, and (b) have designed it as a tool so that women can actually have more ownership and more autonomy over how their images are posted on the internet. And I think this is an interesting claim, because (a) no one reads the privacy policy, and (b) even people who do read the privacy policy probably violate it. So the entire thing kind of
runs on the honor system. In fact, with the individual subscription, you can track up to 25 different faces and get alerts whenever a new image with one of those faces is posted. So clearly, it's designed to track more than just you. At the same time, though, I think it can be useful if you are concerned about someone
posting an image of you without your permission. Things like revenge porn come to mind, because you can upload an image of yourself and see whether or not it exists on the internet, and then do your best to get it removed, or at least know that it exists instead of having it surprise you during something like a job interview. In fact, PimEyes was actually a resource used by a lot of people, including investigative journalists, to track down people who were present at the January 6th Capitol riot. But at the end of the day, at least
for me, PimEyes does feel like one of those tools that is well-intentioned, or is marketed as well-intentioned, but doesn't necessarily acknowledge the fact that it can be used for malicious purposes. Things like Apple AirTags are another example of that, where it turns out that you might be able to stalk people via their AirTags, just like you could stalk them via PimEyes. And while it is nice that it gives people the ability to track their online presence and know if something bad has been posted about them, it would also be nice to see some ways to prevent those things from happening in the first place, and to install some safeguards that make sure that systems like this aren't abused. But these are just ideas. And based on some of the comments that
you guys have left on other videos, you might be wondering how you can develop real-world machine learning systems that solve useful problems without compromising things like personal privacy. So what should you do? Well, you're already watching this video on YouTube, which is a great start, but in order to really learn something, you have to do it. Brilliant is a website and app built on this very principle: you learn best while doing and solving in real time. Jump right into solving problems and be coached bit by bit until, before you realize it, you've learned a new subject in STEM. You won't have to memorize long,
messy formulas and endless facts. Just pick a course you're interested in and get started. Feeling stuck, or made a mistake? You can read the explanations to find out more and learn at your own pace. Brilliant has something for everybody, whether you want to start with the basics of math, science, and computer science, or dive into cutting-edge topics like cryptocurrency or neural networks. Personally, I've been using Brilliant to learn a little more about cryptocurrency so that I can better understand what is happening with Dogecoin and NFTs. So if you'd like to join me in a community
of 8 million learners and educators today, sign up for free at brilliant.org/jordan, or click on the link in the description. In fact, the first 200 people to go to that link will also get 20% off the annual premium subscription. Otherwise, if you liked this video, let me know by smashing the like button and subscribing to my channel. You can also check out the video that I did on Clearview AI, if you haven't seen it already, to get a background on what's going on in the realm of facial recognition. Otherwise, if you want to follow my PhD life, you can do so on Twitter, Instagram, TikTok, and via my Substack newsletter. And I will see you on Monday. Bye.