Did the ACLU discover that 28 members of Congress were criminals? Not exactly.

The future is now! And it’s kind of scary.

Amazon’s come a long way from being the little online bookstore that could. Now, in addition to delivering your packages, running your smart home features, and telling you what to wear, it may also soon be helping the government track every move you make.

A few items on that list are a little creepy, but it’s really that last one that’s raising red flags for civil liberties groups like the ACLU.

In 2016, the company launched Amazon Rekognition, its flagship image recognition software. The basic premise was that you could take a picture, run it through the software, and it would respond by telling you what was in the picture. The example used in the rollout was a photo of a dog. Awww!
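That send-an-image, get-back-labels flow looks roughly like this with boto3, the AWS SDK for Python. This is a hedged sketch, not Amazon's demo code: it assumes you have AWS credentials configured, and the wrapper function name is ours.

```python
def describe_image(image_bytes, max_labels=10, min_confidence=80.0):
    """Ask Rekognition what's in an image; returns label names like 'Dog'.

    A minimal sketch of the real boto3 `detect_labels` call. The import is
    deferred so the helper can be defined without boto3 installed.
    """
    import boto3  # AWS SDK for Python; requires configured credentials

    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"Bytes": image_bytes},   # raw JPEG/PNG bytes
        MaxLabels=max_labels,
        MinConfidence=min_confidence,   # drop low-confidence guesses
    )
    return [label["Name"] for label in response["Labels"]]
```

For the rollout's dog photo, a call like `describe_image(open("dog.jpg", "rb").read())` would come back with labels along the lines of `["Dog", "Pet", "Animal"]` — which is the whole pitch: you send pixels, it sends back nouns.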

Fast forward to 2018, and Rekognition has gotten a few upgrades. It’s even being tested out by a handful of police departments. Amazon boasts about the technology’s ability to detect, track, and analyze people in photos and videos, calling it “high-quality person tracking” and “activity detection.”

To test the software, the ACLU ran photos of every member of Congress against a database of 25,000 publicly available mugshots. The analysis incorrectly matched the faces of 28 members of Congress with mugshots.

In other words, not only might this new software be used as the backbone of a new surveillance state, but it also might flag you as a criminal. That’s not ideal! Thankfully, it caught congresspeople’s attention, with a number of senators and representatives issuing statements about the experiment.

The ACLU’s study also revealed another issue with the technology: People of color are disproportionately likely to get a false match.

Six members of the Congressional Black Caucus were falsely matched to mugshots. And although people of color make up just 20% of Congress, they accounted for 39% of the false matches.
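A quick back-of-the-envelope check, using only the figures above, shows how lopsided that is:

```python
# Figures from the ACLU study as reported above.
total_false_matches = 28
false_match_poc_share = 0.39   # 39% of false matches were people of color
congress_poc_share = 0.20      # ~20% of Congress members are people of color

# If false matches simply tracked Congress's demographics, we'd expect
# about 20% of the 28 to be people of color.
expected_poc_matches = total_false_matches * congress_poc_share    # ~5.6
observed_poc_matches = total_false_matches * false_match_poc_share  # ~10.9

# People of color were misidentified at nearly double the expected rate.
overrepresentation = false_match_poc_share / congress_poc_share  # 1.95x
```

In other words, roughly eleven of the 28 false matches were people of color when about five or six would be expected by headcount alone — close to a 2x overrepresentation.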

Rep. Luis Gutiérrez (D-Illinois), who was one of the politicians wrongly matched by Rekognition, signed a letter with other congresspeople to Amazon CEO Jeff Bezos expressing concerns. Photo by Scott Olson/Getty Images.

“It’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins,” the ACLU’s Jacob Snow wrote on the group’s blog. “Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.”

Snow continued:

“An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.”

But there are some simple things you can do to push back against facial recognition software being misused.

For one, you can join the ACLU’s efforts to petition Amazon to do the right thing and stop selling surveillance technology to the government. You can also donate to the ACLU to help fund its efforts to fight back against government overreach and threats to our privacy.

The most important thing you can do is to call up your representatives at the federal, state, and local levels. Let them know that this is something that concerns you and that you’d like to see action taken to make sure this technology doesn’t get misused.
