Friday, June 19, 2020

Amazon bans police from using its facial recognition technology for the next year

Amazon is placing a one-year moratorium on allowing law enforcement to use its controversial Rekognition facial recognition platform, the e-commerce giant said on Wednesday.


The news comes just two days after IBM said it would no longer offer, develop, or research facial recognition technology, citing potential human rights and privacy abuses and research indicating facial recognition tech, despite the advances provided by artificial intelligence, remains biased along lines of age, gender, race, and ethnicity.

Much of the foundational work showing the flaws of modern facial recognition tech with regard to racial bias is thanks to Joy Buolamwini, a researcher at the MIT Media Lab, and Timnit Gebru, a member of Microsoft Research. Buolamwini and Gebru co-authored a widely cited 2018 paper that found error rates for facial recognition systems from major tech companies, including IBM and Microsoft, were dozens of percentage points higher when identifying darker-skinned individuals than when identifying lighter-skinned individuals.

The issues lie in part with the data sets used to train the systems, which can be overwhelmingly male and white, according to a report from The New York Times.
In a separate 2019 study, Buolamwini and coauthor Deborah Raji analyzed Rekognition and found that Amazon’s system too had significant issues identifying the gender of darker-skinned individuals, as well as mistaking darker-skinned women for men. The system worked with a near-zero error rate when analyzing images of lighter-skinned people, the study found.

Amazon tried to undermine the findings, but Buolamwini posted a lengthy and detailed response on Medium, in which she says, “Amazon’s approach thus far has been one of denial, deflection, and delay. We cannot rely on Amazon to police itself or provide unregulated and unproven technology to police or government agencies.” Buolamwini and Raji’s findings were later backed up by a group of dozens of AI researchers who penned an open letter saying Rekognition was flawed and should not be in the hands of law enforcement.

Amazon did not give a concrete reason for the decision beyond calling for federal regulation of the tech, although the company says it will continue providing the software to rights organizations dedicated to finding missing and exploited children and combating human trafficking. The unspoken context here, of course, is the death of George Floyd, a Black man killed by former Minneapolis police officers, and the ongoing protests across the US and around the globe against racism and systemic police brutality.

Here’s Amazon’s full note on the one-year ban:

We’re implementing a one-year moratorium on police use of Amazon’s facial recognition technology. We will continue to allow organizations like Thorn, the International Center for Missing and Exploited Children, and Marinus Analytics to use Amazon Rekognition to help rescue human trafficking victims and reunite missing children with their families.

We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.

It seems as if Amazon decided police cannot be trusted to use the technology responsibly, although the company has never disclosed just how many police departments actually use the tech. As of last summer, it appeared that only two departments — one in Oregon and one in Florida — were actively using Rekognition, and Orlando has since stopped. A much more widely used facial recognition system is that of Clearview AI, a secretive company now facing down a number of privacy lawsuits after scraping social media sites for photos and building a database of more than 3 billion images that it sells to law enforcement.

