Yes, yes, we've all heard the jokes before -- the federal government is full of crooks and liars. But Amazon's software is mistaking congresspeople for actual criminals. The American Civil Liberties Union ran a little test on Amazon's facial recognition system, Rekognition, using it to compare public photos of every current member of the House and Senate to a database of 25,000 publicly available arrest photos. The results were not good.
The software ID'd 28 current congressmembers as criminals. And if you're cynically thinking, "I bet they were all people of color," well ... you're not entirely wrong.
Amazon's Rekognition system is available to the public, and the ACLU says it cost them less than $13 to run the test. "Nearly 40 percent of Rekognition's false matches in our test were of people of color," the ACLU asserted, "even though they make up only 20 percent of Congress":
The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance.
(To be fair, Lewis has been arrested for peaceful protests 45 times, according to his own count, most recently in 2013, but his mugshots weren't included in the comparison set.)
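To put the ACLU's numbers in perspective, here's a quick back-of-the-envelope calculation. The exact 11-of-28 split is an assumption chosen to be consistent with the "nearly 40 percent" figure quoted above, not a number reported by the ACLU:

```python
# Rough disparity check using the figures from the ACLU's Rekognition test.
# NOTE: the 11-of-28 breakdown below is an assumed split consistent with
# the "nearly 40 percent" figure; the ACLU reported only the percentage.
total_false_matches = 28        # members of Congress falsely matched
poc_false_matches = 11          # assumed count of people of color among them
poc_share_of_congress = 0.20    # people of color as a share of Congress, per the ACLU

poc_share_of_matches = poc_false_matches / total_false_matches
overrepresentation = poc_share_of_matches / poc_share_of_congress

print(f"Share of false matches: {poc_share_of_matches:.0%}")   # ~39%
print(f"Share of Congress:      {poc_share_of_congress:.0%}")  # 20%
print(f"Overrepresentation:     {overrepresentation:.1f}x")    # ~2.0x
```

In other words, under these figures, members of color were roughly twice as likely as their share of Congress would predict to be falsely matched to a mugshot.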
The ACLU claims Amazon is "aggressively marketing its face surveillance technology to police," and some departments have already started using it. The civil liberties organization and Microsoft recently requested guidance from Congress on using facial recognition systems.
The Negatives of False Positives
The ACLU's findings come a month after the Congressional Black Caucus sent a letter to Amazon CEO Jeff Bezos, warning of the "profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protestors":
It is quite clear that communities of color are more heavily and aggressively policed than white communities. This status quo results in an oversampling of data which, once used as inputs to an analytical framework leveraging artificial intelligence, could negatively impact outcomes in those oversampled communities. Even body cameras, which were originally intended to strengthen police accountability, could be used as a tool to surveil law-abiding Americans and potentially violate their fourth amendment rights. We are seriously concerned that wrong decisions will be made due to the skewed data set produced by what we view as unfair and, at times, unconstitutional policing practices.
While the federal government thus far hasn't been eager to rein in local police departments when it comes to using new tech, the results of the ACLU's experiment may spur more action from those "crooks" in D.C.