How police manipulate facial recognition

So, this is gonna sound weird, but a few years ago, the NYPD used facial recognition to catch a shoplifter. And they didn't even have a clear picture of his face. The clerk said the guy kind of looked like the actor Woody Harrelson. So they just pulled up a photo of Harrelson and put it in the system. It worked, they caught the guy, and it turns out he really did look like Woody Harrelson. But facial recognition systems were never built to be used this way. And incidents like this raise the question of whether police should have this technology at all.

It's not the only time something like this has happened. This past year, Georgetown University conducted a study of some of the use cases, and they're pretty wild. Some police departments have actually pasted in different facial features in an effort to get the system to produce a match. If the left eye is blocked in your picture, just paste in a new left eye. If the suspect's mouth is open, paste in a new mouth that's closed. These are delicate algorithms, but in most cases there are no strict rules for how police use them. And whatever the machine produces can be used as grounds for a police stop.
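To be clear about how low-tech that editing is: what the Georgetown researchers describe amounts to ordinary image compositing, something a few lines of Python can do with a library like Pillow. The sketch below is purely illustrative; the filenames and pixel coordinates are made-up placeholders.

    # Illustrative sketch of pasting a replacement feature into a probe photo.
    # Filenames and coordinates are hypothetical placeholders.
    from PIL import Image

    probe = Image.open("probe_photo.jpg")         # blurry surveillance still
    donor_eye = Image.open("donor_left_eye.png")  # an eye cropped from another photo

    # Drop the donor patch over the occluded region, then save the edited probe.
    probe.paste(donor_eye, (140, 95))             # (x, y) of the patch's top-left corner
    probe.save("edited_probe.jpg")                # this edited image is what gets searched
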
Okay, but before we get into that, let's talk about how facial recognition really works. At its core, facial recognition is about tracking key facial landmarks from photo to photo: the distance between your pupils, the angle of your nose, the shape of your cheekbones, basically all the details of your face that make it distinctive. That works best from a straight-on photo with at least 80 pixels between the pupils, think a passport photo or a driver's license. But once you've got that basic pattern, sophisticated programs can recognize the same features at an angle. It can even work if part of your face is blocked, as long as there are two pupils and enough features to be sure.
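Real systems learn their features from huge sets of photos rather than following hand-written rules, but the basic idea, boiling a face down to measurements and comparing them, can be sketched in a few lines. The landmark coordinates below are invented and the scoring is deliberately crude.

    import math

    # Toy landmark comparison: normalize a handful of facial landmarks by the
    # distance between the pupils, then score how closely two faces line up.
    # Real systems use hundreds of learned features; these values are invented.
    def normalize(landmarks):
        ipd = math.dist(landmarks["left_pupil"], landmarks["right_pupil"])
        ax, ay = landmarks["left_pupil"]
        return {name: ((x - ax) / ipd, (y - ay) / ipd)
                for name, (x, y) in landmarks.items()}

    def similarity(a, b):
        """Rough similarity in (0, 1]; 1.0 means the measurements match exactly."""
        a, b = normalize(a), normalize(b)
        avg_error = sum(math.dist(a[k], b[k]) for k in a) / len(a)
        return 1.0 / (1.0 + avg_error)

    probe   = {"left_pupil": (100, 120), "right_pupil": (182, 121),
               "nose_tip": (141, 170), "left_cheekbone": (92, 168)}
    mugshot = {"left_pupil": (210, 240), "right_pupil": (372, 243),
               "nose_tip": (290, 338), "left_cheekbone": (195, 334)}

    print(round(similarity(probe, mugshot), 3))
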
Vendors like NEC, Morpho, and Cognitec pioneered these systems, selling their software to local and federal police forces. But in the past few years, Amazon and Google have been building it into their computing clouds too, which makes it a lot easier to get. With a couple hundred bucks and some coding skills, almost anyone can create a facial recognition system.
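As one hedged example of how little code that takes: Amazon's cloud service exposes face comparison as a single API call, CompareFaces in Rekognition. The snippet below assumes boto3 is installed, AWS credentials are configured, and the two image files exist locally; the filenames and threshold value are placeholders.

    # Sketch of a cloud face comparison with AWS Rekognition's CompareFaces API.
    # Assumes valid AWS credentials and local image files; names are placeholders.
    import boto3

    client = boto3.client("rekognition")

    with open("probe_photo.jpg", "rb") as probe, open("mugshot.jpg", "rb") as mugshot:
        response = client.compare_faces(
            SourceImage={"Bytes": probe.read()},
            TargetImage={"Bytes": mugshot.read()},
            SimilarityThreshold=90,  # only return candidates scoring at least 90
        )

    for match in response["FaceMatches"]:
        print("match similarity:", match["Similarity"])
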
These programs work off accuracy thresholds: the tighter the match, the higher the number. But there's no firm rule about how high the number needs to be. Which means that at the same time police are playing with the photos they upload to the system, they're also playing with the standard for what counts as a match. So if you look like Woody Harrelson even a little bit, an officer could adjust the accuracy threshold until it registers as a hit. And then if you ask why you're being stopped, they can just say, the machine said it was you.
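To see why that matters, here's a toy illustration with invented scores: the exact same candidate list produces zero matches at one cutoff and a "hit" at a slightly lower one, and nothing in the software dictates which cutoff an operator has to use.

    # Toy example of threshold-shopping: identical scores, different "results".
    # The scores are invented; real systems report vendor-specific confidence values.
    candidates = {"person_a": 0.81, "person_b": 0.74, "person_c": 0.62}

    def hits(scores, threshold):
        return [name for name, score in scores.items() if score >= threshold]

    print(hits(candidates, 0.90))  # []             strict cutoff: no match
    print(hits(candidates, 0.80))  # ['person_a']   relaxed cutoff: a "hit"
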
If you talk to the people making these tools, you're really not supposed to do any of this. It's like steering a car with your feet: you can make it work, but it's bizarre and dangerous. And with police, the end result of all of that is stopping someone, maybe for no reason.

That's even worse because the algorithms are less accurate for women and people of color. It's not totally clear why that's true; a lot of people think it's just a result of algorithms that are mostly trained on white men. But government testing shows it really consistently across the industry. You can see it on this chart: the red lines are the error rates for black people, and the green lines are the error rates for white people. The red lines are almost always higher, which means the person getting stopped for no reason is more likely to be from a community at risk.

Now, the NYPD says that no one has been arrested on the basis of facial recognition alone. And that's true, but facial recognition has been involved somehow in more than 2,800 arrests in the five and a half years the program has been running. Even when there are no arrests at all, a false match can still lead to a police stop, which has dangers of its own. There's supposed to be a clear legal bar for making those stops, but facial recognition is short-circuiting that.

Now, defenders of facial recognition will say that despite the problems, it's still an effective tool for police to protect their communities. Detroit's Project Green Light is a network of connected surveillance cameras recently upgraded with facial recognition, and it's credited with a 23% drop in crime in the city. But it's still controversial: some community members say there's no transparent oversight, and the flood of new tips is overwhelming the police force. The fight's gotten really heated, so heated that one of the city's police commissioners was actually arrested at a hearing while trying to speak out against the system.

Other cities have passed local laws banning the use of facial recognition by police for just that reason. San Francisco, home to some of the largest tech firms in the world, banned it last year. San Francisco supervisor Aaron Peskin was particularly critical, calling it Big Brother technology. But this isn't just a tech problem. Fundamentally, San Francisco is saying the government just can't be trusted with this technology, not because it's so bad, but because we don't have enough oversight over how police departments will actually use it. That's a problem that goes much deeper than just recognizing faces. And as we find more powerful ways to peer into the average person's life, it's a problem that's not going away.

Thanks for watching. Like and subscribe if you want some more. And if you're looking for another video, my colleague Casey Newton has an incredible report about Facebook moderators and just what a difficult and disturbing job it is. So, check that out.

About the Author: Earl Hamill

20 Comments

  1. Check out Verge reporter Casey Newton's report about the working conditions of Facebook moderators: https://youtu.be/bDnjiNCtFk4

  2. When you have for-profit prisons, you can no longer work under the presumption that the law enforcement and judicial system is impartial and fair. Cops, judges, everyone in the law enforcement and judicial machine is tasked with feeding the capitalist beast. Cops will have stop/search and arrest quotas, judges will have sentencing quotas, parole boards will have approval/denial targets. All aspects of the system will be gamed to ensure a steady flow of inmates to keep the cellblocks full. For-profit prisons are the reason the US incarcerates more people per head of population than any other nation on earth.

  3. Everything has its advantages and disadvantages. In my opinion, technology is not bad; the problem is how it is used.

  4. I'm all for it, as a person who doesn't break the law.
    If we knew who did what and where they were, crime would drop dramatically.
    We are already on camera when we shop for food, clothing, furniture, electronics, etc.
    If we had the option to know whether a dangerous person was near you, your spouse, or your kids, we'd take it.

  5. You talked about algorithms, and to be honest, to make even one algorithm you have to do so much work… algorithms are not as easy to make as you just made it sound.

  6. That's why America is so degenerate! In Europe, citizens take it easy when the police stop them. Usually it's just small talk, showing your ID, and some blabla. Totally harmless, you don't even care. But hosts like you spread fear of events like this, making them out to be the worst thing that can happen to you. It's sad. Why promote resistance to police stops, which leads to people getting killed?

  7. Fitting everybody with an arrest anklet would also increase safety. Hey, while we're at it, let's just hold everyone guilty until proven otherwise. It's interesting how, years back, when the idea of microchipping people came up, the huge majority was shocked and refused it. But now, after years of our brains being slowly trained to accept privacy intrusions, bought off by pretty selfies and the promise of a glittering lifestyle, this unprecedented theft of freedom and essential liberty will probably go as smoothly as any slave master's dream.

  8. Hong Kong has shown you how surveillance technology will turn into a tool that the police force abuses.
    It is not worth it for a few percentage points' drop in crime.

  9. Don't let thoughts of the police state, the loss of privacy or the fact that people are now a commodity for the govt to sell to advertisers get you down. Don't want to risk having a mental breakdown now do we? Those can be dangerous these days with the police shooting people in their homes and what not. Remember, only criminals have anything to hide. You're not a criminal are you? Obey and pay your taxes so we can work on future technologies that will deter crime and keep you safe by reading the minds of anyone within range. But don't worry… only criminals have anything to hide. Besides, mind reading sensors would only be placed in public spaces and by leaving the privacy of your dwelling that we ultimately own, you would therefore physically consent to our public safety monitoring systems. If you don't wish to consent to our metadata collection, mind readers, advertisements, trackers, digital fingerprinters, speech recognition, motion detection and anything else we do, you could always just stay indoors. But make sure you regularly check the terms and conditions of every electronic device you own as we frequently update TOCs unbeknownst to the end user because we care for your well-being and the safety of our citizens. Only criminals have anything to hide.
