Racial Profiling Goes High-Tech: Facial Recognition Gone Wrong

By A.J. Firstman | Last updated on

Six Detroit city police officers showed up at Porcha Woodruff's door one morning to place her under arrest. They had reason to believe she was behind a robbery and carjacking that had been caught on camera, and they were convinced she was the perp they were looking for. Ms. Woodruff could either come with them willingly or be taken by force.

All of this was something of a surprise to Ms. Woodruff. She didn't recall committing any robberies, carjackings, or other felonies, for one. She was also usually too busy with work, her nursing school classwork, and her two young daughters to find the time to go into the city and rob strangers, and that sort of behavior wouldn't set a great example for her kids.

Oh, and she was eight months pregnant at the time of her arrest. While cravings are very common for people who are in the process of making another person in their bodies, most of those cravings are for certain foods, not violent crime.

None of this mattered to the Detroit PD, of course. They had evidence. They'd run the suspect's face through their facial recognition software, you see, and that software identified Ms. Woodruff as the culprit. And it's not like the software could have been wrong, right?

The Software Was Wrong

Ms. Woodruff was taken to the Detroit Detention Center, detained for 11 hours, questioned repeatedly about her supposed crimes, and finally released on a $100,000 bond after being officially charged with robbery and carjacking. She was taken to the hospital immediately afterward. Sitting on concrete benches for so long had left her in pain, and the stress of being wrongfully accused and held had triggered panic attacks that caused spasms and false contractions. She was treated for dehydration at the hospital, and thankfully she and her baby made a full recovery.

Once Ms. Woodruff was released from custody and the hospital, she was left with a burning question: What in the heck was that all about?

The answer was as stupid as it was distressing. The Detroit PD's automated facial recognition search for the real suspect had identified Ms. Woodruff as a match, and the police decided to arrest the pregnant mother of two based on that ID alone.

Ms. Woodruff's is the third lawsuit brought against the Detroit PD over its facial recognition software, and the sixth known case overall of someone being charged with a crime thanks to faulty facial recognition technology. That doesn't sound like many, but the Detroit PD runs an average of only 125 facial recognition searches a year.

What's more concerning is the single trait shared by Ms. Woodruff, the other plaintiffs in the cases against the Detroit PD, every one of the misidentified people charged with crimes, and the subjects of almost all of the Detroit PD's 125 yearly searches. You can probably guess what it is if you know the history of American policing or the problems with current facial recognition technology. For everyone else, here's a hint: Ms. Woodruff is a woman of color.

Technological Prejudice

This isn't the first time people have raised the alarm about law enforcement's use of facial recognition technology. There are obvious privacy concerns, of course, but the most tangible problem is one of what you could call technological prejudice.

Every kind of tech that uses facial recognition software has the same problem, whether we're talking about cell phones, security cameras, or law enforcement equipment: it has an extremely hard time identifying people with darker skin tones.

You can cite any number of different reasons why facial recognition technology doesn't work as well on people of color, but none of them are good or even defensible.

There are more innocuous explanations, like arguing that current camera technology has a hard time with dark colors and less-reflective surfaces in general. That's not offensive at first blush, but it raises the question of why that technology was used in the first place.

You could argue that part of the problem is bias in the images used to train the algorithms. It's fairly well known that tech companies' datasets skew toward images of people with lighter skin, so it stands to reason that the resulting technologies would be better at identifying lighter-skinned people. That makes some sense, but it raises a very similar question: if it's a known issue, why in the name of Steve Jobs hasn't it been rectified?

Finally, we get to the least defensible and most upsetting reason of all. Law enforcement facial recognition tech relies on those same algorithms, but it checks against mugshot databases and other law enforcement image repositories. Black faces are underrepresented in the training data, yet vastly overrepresented in the images law enforcement checks against.

In other words, the algorithms are choosing matches from databases full of faces they haven't been adequately trained to recognize, leading to false positives, mistaken identities, and pregnant mothers being locked up for things they didn't do.
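The mechanism described above can be illustrated with a toy simulation. This is not real face-recognition code, and every number in it is hypothetical; it only sketches the statistical point that when a model separates one group's faces less cleanly than another's, the same match threshold produces far more false positives for the poorly-separated group.

```python
import random

random.seed(42)

def false_positive_rate(separation, threshold, trials=10_000):
    """Estimate how often two *different* people score as a 'match'.

    `separation` models how well the system spreads a group's faces
    apart in its internal representation: higher separation means
    non-matching pairs get lower similarity scores. These are made-up
    numbers for illustration, not measurements of any real system.
    """
    hits = 0
    for _ in range(trials):
        # Similarity score for a pair of DIFFERENT people: noise
        # centered below 1.0, pushed down by the group's separation.
        score = random.gauss(mu=1.0 - separation, sigma=0.15)
        if score >= threshold:
            hits += 1  # a false "match" against an innocent person
    return hits / trials

# Hypothetical: the well-represented group gets a wider safety margin.
fpr_majority = false_positive_rate(separation=0.6, threshold=0.8)
fpr_underrepresented = false_positive_rate(separation=0.3, threshold=0.8)
```

Under these toy assumptions the underrepresented group's false-positive rate comes out dramatically higher, even though both groups face the exact same threshold. That's the shape of the problem: the unfairness is baked in before any individual search is ever run.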

Intersectional Injustice

Ms. Woodruff's baby was born healthy and happy. The charges against her were dropped about a month after being filed. She has filed a lawsuit against the city of Detroit for wrongful arrest, and it seems like a clear-cut case.

It's anyone's guess whether things will actually change, but one thing's for sure: Ms. Woodruff will have her day in court, and she will be heard.
