‘Booyaaa’: Australian Federal Police use of Clearview AI detailed
Earlier this year, the Australian Federal Police (AFP) admitted to using a facial recognition tool to help counter child exploitation, despite not having an appropriate legislative framework in place.
The tool was Clearview AI, a controversial New York-based startup that has scraped social media networks for people’s photos and created one of the biggest facial recognition databases in the world. It provides facial recognition software, marketed primarily at law enforcement.
The AFP previously said that while it did not adopt Clearview AI as an enterprise product and had not entered into any formal procurement arrangements with the company, it did use a trial version of the facial recognition platform.
Documents published by the AFP under the Freedom of Information Act 1982 confirmed that the AFP-led Australian Centre to Counter Child Exploitation (ACCCE) registered for a free trial of the Clearview AI facial recognition tool and conducted a pilot of the system from 2 November 2019 to 22 January 2020.
The ACCCE's Covert Online Engagement (COE) team and the Child Protection Triage Unit (CPTU) used Clearview in an attempt to identify an offender, without result. The AFP said, however, that staff used the facial recognition tool to check the accuracy and effectiveness of its algorithm.
“Clearview is like Google Search for faces. Just upload a photo to the app and instantly get results from mug shots, social media, and other publicly available sources,” an email to an AFP staff member from “Team Clearview” says.
Another email encourages the user to “search a lot” as the account has unlimited searches. Team Clearview tells the user not to stop at one search, rather to “see if you can reach 100 searches” as “it’s a numbers game”.
The email continues to tell the user to refer their colleagues, as “the more people that search, the more success”.
This approach from Clearview mirrors the one it used on Victoria Police.
Clearview AI founder, Australian entrepreneur Hoan Ton-That, also reached out directly to one AFP staff member, who in response said, “We’ve only just started using it and so far it has been valuable”.
The AFP previously said that during its trial, Clearview AI sent nine invitations to AFP officers to register for a free trial, with seven officers activating an account and conducting searches.
Documents show an AFP officer telling colleagues that they ran someone’s mugshot through the Clearview system and “got a hit from his Instagram account”.
Responses included “Nice work” and “Booyaaa! Luv it!”.
Further email exchanges among AFP staff reveal an awareness that the tool was potentially not approved for use. One staff member said she was running the app off her personal phone, while another asked whether the team responsible for infosec had raised any concerns.
After Clearview AI suffered a data breach in February that exposed its customer list, the number of accounts each customer holds, and the number of searches those customers have made, the AFP sent an email asking those in receipt of the memo to change their AFPNET password immediately.
After media reports emerged that the AFP was using the software, one staff member suggested that they cease using it “since everyone is raising the issue of approval”.
Last week, the UK Information Commissioner’s Office and Office of the Australian Information Commissioner (OAIC) announced they would be teaming up to conduct a joint investigation into Clearview AI.
Prior to this, the OAIC in April asked questions of the company and issued a notice to produce under section 44 of the Australian Privacy Act. The OAIC also reached out to the AFP in May, asking the agency what it had used the platform for and directing it to cease use of the product.
Following the Clearview ban, emails between staff in the agency's Victim Identification Team indicate that not having access to the tool makes "things difficult".