Police Body Camera Maker Debates Using Advanced Facial Recognition AI

(Image: Axon police body camera)

Axon has become the largest supplier of police body cameras in the country. As its market share grows, the company is debating how to improve its products, and one of the things it's exploring is advanced AI, including facial recognition.

On April 26, 2018, Axon announced the formation of an AI Ethics Board to responsibly guide the development of its AI-powered devices and services. According to the press release:

"The AI Ethics Board is made up of experts from varying fields including AI, computer science, privacy, law enforcement, civil liberties, and public policy."

The board will meet twice a year to discuss the ethical implications of using AI-powered devices. What's the big issue here? Why would anyone have a problem with cameras doing facial recognition or automatically uploading your image to a database?

Well, what if you're not relevant to any current investigation? What if a police officer's camera simply sees you and your friends walking down the street, the AI software uploads your image to a server, and the algorithms there recognize you, not from a mugshot, but from an interconnected web of social media apps and sites, as well as your own photo storage on Google? Then it tracks where you are and what you do.

I think it should be used like the new mobile fingerprinting solutions that officers are currently trying out at police stations across the country. If officers are making an arrest and are having a hard time identifying someone who lacks identification, they should be able to check whether the person is who they think they are. Facial recognition shouldn't be the sole basis for identifying someone in order to make an arrest, but it could easily be applied to evidence footage to find witnesses or to corroborate someone's statement.

AI and facial recognition would be most useful AFTER the footage has been taken. Too many variables can go wrong when trying to identify someone on the fly during an arrest, but afterward, when investigators analyze the crime footage, it may make sense to run the advanced software on everyone who appears in the video and try to identify them.

Now, if you have nothing to hide, what's the harm in having cameras all over the city running constant facial recognition? Well, what if you're at a peaceful protest and your image is flagged and you're targeted because you were standing up for something you believed in? What if they sold that data?

Large companies sell anything and everything, and the number one thing they can sell is information: who does what, when, where, and why. Axon could promise that the data from live face scanning is stored in an encrypted file until there's been an actual crime and investigators need to pull up certain dates, but can they keep that promise?

Our privacy is something many Americans give away freely on Facebook, Instagram, and Snapchat. But that doesn't mean it's the same when someone takes it from us.

Source: AXON