Despite concerns from investors, civil rights groups and others, Amazon thinks its facial-recognition technology can be helpful for law enforcement.

At the same time, the company agrees that there should be national legislation to regulate how this type of technology should — and shouldn’t — be used. In a blog post Thursday, Michael Punke, Amazon Web Services’ vice president of global public policy, defended Rekognition, which the company has offered since 2016.

Punke highlighted some of the benefits of facial-recognition technology, such as helping identify criminals and spotting missing people. He said that in the more than two years that Rekognition has been available, Amazon hasn’t gotten “a single report of misuse by law enforcement.”

The post comes after numerous groups — from investors to hundreds of Amazon employees to an ACLU-led coalition of civil rights groups — urged the company to stop selling Rekognition to government agencies, due to fears that it could be used to violate people’s rights.

It is unclear how many law-enforcement agencies currently use Amazon’s technology, though it has been used by police departments in Florida and Oregon. An Amazon spokesperson said the company doesn’t share customers’ names or use cases without their permission.

The company also said it supports “calls for an appropriate national legislative framework that protects individual civil rights and ensures that governments are transparent in their use of facial recognition technology.”

Amazon is the latest major tech company to indicate its support for such legislation. Microsoft has also said it is in favor of laws that regulate how facial-recognition technology can be used, and it is reportedly backing a related privacy bill in Washington state, where both Microsoft and Amazon are based (Amazon is reportedly considering whether to support it). Currently, there is no national legislation or set of laws that specifically regulates the technology.

Punke, a former US ambassador to the World Trade Organization who also wrote the book that was the basis for the 2015 Oscar-winning thriller “The Revenant,” outlined five guidelines Amazon hopes lawmakers will consider. These include stating that law enforcement agencies should manually review facial-recognition matches “before making any decision to interview or detain” a person, and that these agencies should be open with the public about how they’re using such technology.

Rekognition uses deep learning — an artificial-intelligence technique for finding patterns in data — to identify objects, faces and scenes in videos and images. For example, it could be used to scan the faces of people entering a courthouse in real time to see if they are in a criminal database.

The technology can be useful, but it’s also prone to errors. Numerous studies have shown it can be inaccurate when it comes to identifying people in certain groups. For instance, an MIT Media Lab study in January indicated that Rekognition was worse than similar technology from Microsoft and IBM at determining the gender of female faces and darker-skinned faces in photos. (Amazon has pointed out that this study in particular used Rekognition’s facial analysis function, which detects features like frowns as well as gender, rather than facial recognition, which is meant to match an image to a specific person’s face.)
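The distinction Amazon draws here maps onto two separate operations in the Rekognition API: facial *analysis* (the DetectFaces operation, which estimates attributes such as gender and emotion without identifying anyone) and facial *recognition* (the SearchFacesByImage operation, which matches a face against a stored collection). A minimal sketch of that split, with the service client passed in — in practice it would be created with `boto3.client("rekognition")`, and the similarity threshold shown is an illustrative value (Amazon has publicly recommended a high threshold such as 99% for law-enforcement use):

```python
def analyze_face(client, image_bytes):
    """Facial ANALYSIS: estimates attributes (e.g. gender, emotion).

    This is the function the MIT study exercised; it does not try to
    say *who* the person is, only what attributes the face appears to have.
    """
    resp = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request the full attribute set, not just bounding boxes
    )
    # Each entry carries a value plus a confidence score.
    return [face.get("Gender") for face in resp["FaceDetails"]]


def identify_face(client, collection_id, image_bytes, threshold=99.0):
    """Facial RECOGNITION: matches the largest face in the image against
    a previously indexed collection of known faces.
    """
    resp = client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,  # discard matches below this similarity
    )
    return resp["FaceMatches"]
```

Because the client is injected rather than constructed inside the functions, the sketch can be exercised against a stub without AWS credentials; the real calls require an AWS account and, for recognition, a collection populated via IndexFaces.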

Amazon’s post referred to the MIT study and other outside tests of Rekognition, saying that they did not use the service “properly.”

Inioluwa Deborah Raji, a coauthor of the MIT paper and a student at the University of Toronto, said Thursday that Amazon’s “defensive language” is “disappointing.” However, she sees the discussion of ethical usage and legislation guidelines as a step in the right direction. Even after minimizing any possible biases in facial-recognition systems, she said, “There are still different ways this technology can be weaponized.”


