This marks another major setback for Rekognition, which has been at the center of controversies over biased results, unlawful surveillance, racial discrimination, and Amazon's sale of the technology to police.
Last May, documents obtained by the American Civil Liberties Union of Northern California first exposed the use of Rekognition by law enforcement agencies. At the time, Amazon was selling the cloud-based face recognition platform to police in Orlando and in Washington County, Oregon, but the company did not publicly disclose these law enforcement deals. In fact, it took measures such as confidentiality agreements to keep them secret.
Amazon's approach has been strongly opposed by AI researchers, activists, and human rights organizations, who fear that the technology's inherent flaws will lead to unlawful surveillance and other rights violations. Research shows that Amazon's system can return large numbers of false matches and has particular difficulty accurately identifying darker-skinned individuals and correctly classifying the gender of women.
Amazon's argument is that Rekognition is meant to be used only as an aid to police, and that officers should act on a match only when the system reports it at a 99% confidence threshold. It is not clear, however, how actively Amazon monitors whether the agencies involved violate its terms of service. Amazon claims those terms allow it to suspend or ban organizations and individuals that use Rekognition illegally or unethically.
Despite strong criticism from both inside and outside the company, Amazon said it would continue to sell the software to U.S. law enforcement.
Under pressure, Orlando let its contract with Amazon lapse at the end of June last year. According to Orlando Weekly, however, the pilot was relaunched in October, with police running the system on four cameras around the police headquarters downtown and one outside a community recreation center.
Now, the project has been terminated again. According to local police, the city abandoned the effort because the software was too expensive and too cumbersome to set up, and Amazon employees could not even help the department establish a reliable video stream so the software could run in real time.
In response, Matt Cagle, a technology and civil liberties attorney at the ACLU and a longtime Rekognition critic, congratulated the Orlando police for finally recognizing what critics had long warned: that Amazon's surveillance technology did not work and threatened the public's privacy and safety. "This failed pilot project demonstrates precisely why surveillance decisions should be made by the public through their elected leaders, rather than by companies secretly lobbying police to deploy dangerous systems against the public."