The Federal Bureau of Investigation (FBI) is piloting Amazon’s controversial facial matching software, Rekognition, as a means to quickly sift through and analyze mountains of video surveillance footage routinely collected by the agency during investigations.
However, the FBI's use of Amazon's software has sparked widespread protests from investors and employees over the company's marketing of the unproven technology to police departments for mere pennies on the dollar, amid growing public opposition driven by concerns about racial bias and false matches. Last July, a study conducted by the American Civil Liberties Union (ACLU) found that the Rekognition software incorrectly matched 28 members of the U.S. Congress against a mugshot database, falsely identifying them as other people who had been arrested for a crime. The false positives disproportionately affected people of color, who were misidentified at roughly twice the expected rate, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-GA).
The news has since prompted dozens of corporate shareholders and senior engineers, along with hundreds of Amazon employees, to lodge formal complaints with CEO Jeff Bezos, urging the company to cease its cooperation with police departments for fear the technology will be misused. In October, a letter signed by 450 employees also demanded the removal of the data firm Palantir from AWS because it had helped US Immigration and Customs Enforcement (ICE) track and deport illegal immigrants.
The pilot kicked off in early 2018 following a string of high-profile counterterrorism investigations that tested the limits of the FBI’s technological capabilities, according to FBI officials.
For example, in the aftermath of the October 2017 massacre in Las Vegas — the deadliest mass shooting in modern US history — which authorities have attributed to a single shooter, Stephen Paddock, law enforcement collected a petabyte's worth of data, much of it video from cellphones and surveillance cameras.
At the Amazon Web Services re:Invent conference in Las Vegas last November, FBI Deputy Assistant Director for Counterterrorism Christine Halvorsen lamented that it took FBI agents three weeks of round-the-clock work to comb through that petabyte of surveillance data and video to find every instance of Paddock's face; had they had access to Amazon's Rekognition software, she said, the job could have been done "in 24 hours."
“The cases don’t stop, the threats keep going. Being able to not pull people off that and have computers do it is very important,” Halvorsen told conference attendees.
Last May, Sputnik reported that the artificial intelligence behind Rekognition, which can identify, track, and analyse people and recognise up to 100 faces in a single image, was being marketed by Amazon to US police departments for as little as $6 a month. That tiny fee gave law enforcement agencies access to Amazon’s cloud computing platform, Amazon Web Services (AWS). In turn, Amazon requested that those agencies recommend the brand to their partners, including body camera manufacturers, according to documents obtained by the ACLU.
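To make the article's concerns about false matches concrete, here is a minimal sketch of how a client application might query Rekognition's face-comparison API through boto3, the official AWS SDK for Python. The `compare_faces` call and the `Similarity` field are part of the real AWS API, but the `filter_matches` helper, the file names, and the 90-percent threshold are illustrative assumptions, not anything attributed to the FBI or Amazon — and, as the ACLU study above shows, a high similarity score is no guarantee against a false positive.

```python
# Illustrative sketch: post-filtering Rekognition face matches by similarity.
# The helper and threshold below are assumptions for illustration only.

def filter_matches(face_matches, min_similarity=90.0):
    """Keep only matches at or above a similarity threshold (in percent)."""
    return [m for m in face_matches if m.get("Similarity", 0.0) >= min_similarity]


# A real call would look roughly like this (requires AWS credentials):
#
# import boto3
# client = boto3.client("rekognition")
# response = client.compare_faces(
#     SourceImage={"Bytes": open("suspect.jpg", "rb").read()},
#     TargetImage={"Bytes": open("frame.jpg", "rb").read()},
#     SimilarityThreshold=80.0,
# )
# hits = filter_matches(response["FaceMatches"], min_similarity=90.0)

if __name__ == "__main__":
    # Simulated API output: two of three candidate matches clear the bar.
    sample = [{"Similarity": 99.1}, {"Similarity": 72.4}, {"Similarity": 95.0}]
    print(filter_matches(sample))
```

The key design point is that the similarity threshold is chosen by the operator: the ACLU ran its Congress test at Amazon's default of 80 percent, and raising or lowering that single parameter directly trades missed matches against false positives.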
Amazon has also provided US intelligence services with their own private corner of AWS, the so-called “AWS Secret Region,” for 17 intelligence agencies to host, analyse, and run applications on government data classified as secret via a $500 million contract with the Central Intelligence Agency (CIA).
However, it later emerged that Amazon had been courting ICE directly since at least that summer. Last October, The Daily Beast reported on a June pitch of Rekognition by Amazon to ICE at the Silicon Valley offices of the consulting firm McKinsey & Company, which had previously partnered with ICE. The pitch came to light through a Freedom of Information Act (FOIA) request by the Project on Government Oversight.
Further, the Washington Examiner reported last week that the FBI was also pursuing the National Institute of Standards and Technology's (NIST) tattoo image-matching system, a project it had supported for four years, despite the system having only a 67.9 percent accuracy rate and producing false positives. The system, dubbed Tatt-E (Tattoo Recognition Technology Evaluation), works similarly to Rekognition, drawing on a large database of images and matching them via AI.
One of the major problems with these systems, web developer and technologist Chris Garaffa told Sputnik Friday, is that “there is no guarantee for innocent individuals who will inevitably be caught up in these videos.” Indeed, Tatt-E’s algorithm doesn’t even consider the possibility of a false positive.
Further, “Rekognition technology is going to allow automated, very fast review of video, with similarly automated cataloging of the faces at various times and places,” Garaffa continued, “effectively creating a database of where you were and when.”
Garaffa further noted that breaches of the AWS storage system happen on a regular basis “because of poorly-configured control over who has access, creating real security concerns over such a sensitive database.”
*Expanded from original article published at Sputnik News.