13/10/2021
This is why we advocated for independent auditing of the data collection.
"....With South Australia, New South Wales and Victoria conducting trials of home quarantine apps using facial recognition, and Western Australia already doing it, new research by the Centre for Responsible Technology shows strict limits and controls are needed to protect the public.
It follows reports that in several states police have accessed Covid check-in data to undertake routine law enforcement activities. The privacy breaches from check-in apps already show that state governments must regain the public’s trust before trialling even more complex surveillance technologies, like facial recognition.
Director of the Centre for Responsible Technology Peter Lewis said the report highlighted several issues with using facial recognition technology, including systemic biases against women and minorities, a lack of transparency, and the normalisation of surveillance as an acceptable activity.
He said that, at a minimum, governments should constrain facial recognition to a single use with strict limits, including data expiry and proper consent; update state privacy legislation to cover facial recognition as protected information, in line with the Federal Privacy Act; develop strong human rights protections in law to guard against abuses; and establish an Artificial Intelligence ethics advisory board to properly scrutinise the effects of facial recognition technology on an ongoing basis...."
State governments trialling home quarantine need to take active steps to ensure they are not crossing a new frontier in the surveillance of citizens by using facial recognition technology, privacy experts have warned.