Australian federal police using AI to analyse data obtained under surveillance warrants

The Australian federal police has said it uses AI to analyse data obtained under telecommunications and surveillance warrants, as the agency promises full transparency over the use of the technology.

In a submission to the federal government’s discussion paper on the responsible use of AI, the AFP said its use of the technology had been limited so far, including using AI to translate foreign materials into English.

But it noted that AI tools – including large language models (LLMs) – gave the AFP an opportunity to find useful information in large, lawfully collected datasets.

“By speeding the discovery task, members can make decisions earlier and execute the necessary actions accordingly,” the AFP said.

The AFP indicated AI could also help analyse transactional data to identify irregular patterns such as money laundering and potential fraud.

In 2021, the AFP came under fire from the privacy commissioner after employees used the controversial Clearview AI facial recognition technology, which built its dataset from photos of people taken from social media without their permission. The AFP said it had ceased using the technology, but later met secretly with Clearview AI after claiming to have stopped.

The AFP said in its submission that it would be transparent and “proactively undertake due diligence into technologies before deployment”, taking into account ethical considerations and robust governance and oversight.

“Policing is deeply connected to society and must reflect the values, norms and expectations of the community it serves and critically requires human oversight and accountability.”

A spokesperson for the AFP confirmed that sensitive information obtained from warrants would be fed into LLMs or neural networks. But the agency said it ensures the data is protected, whether in an in-house tool or a commercial product, so it would not feed into public datasets.

The lawfully collected data used could include data collected under a warrant, including telecommunications interception data and surveillance data.

The spokesperson also said all language translations are checked by a human.

The inquiry, established by the Albanese government earlier this year, received 510 submissions from a wide variety of people and organisations. Submissions came from groups at the forefront of the technology, including Meta, Google, Amazon Web Services, Open AI, and Microsoft, and industries likely to be affected, including legal firms, healthcare organisations, business groups, banks, supermarkets, and film, music and television companies.

Many of the submissions raised concerns over AI, particularly large language models, being trained on their content without permission or payment.

The Australian Recording Industry Association (ARIA) said AI that creates deepfakes or vocal clones without authorisation should be severely restricted.

“Such use of AI technology robs artists of control over their own voices and image, and can confuse and mislead fans who may be unaware they are not listening to genuine music created by their favourite artist. This can have a detrimental impact on an artist’s career.”

Getty Images said that AI developers needed to be transparent about the datasets their technology is built on to ensure that intellectual property and privacy rights are not being violated.

“One way to mandate transparency requirements is to require both private and public sector organisations to keep auditable records of all training datasets used including how the data was sourced,” Getty Images said in its submission.

Free TV Australia said that content owners should be paid for the use of their content, while also having the option to refuse AI tools access to their content.

Guardian Australia previously reported that Google's submission to the inquiry called for flexible copyright laws that would allow AI to mine content from websites unless those websites opt out.

Since then, a number of news publications, including the Guardian, have blocked ChatGPT owner OpenAI from mining their sites.

In a speech on Thursday, the industry minister, Ed Husic, said most tech industry submissions called for using existing laws, while others had noted gaps in the legislation. He said the government wanted to get it right and would consider the responses over the next few months.

“I also know that we can design regulation providing a platform for innovation while protecting Australians – our communities and our national wellbeing,” he said.

“The root of this debate isn’t, should we regulate AI? It is, in what circumstances should we expect people and organisations developing and using AI to have appropriate safeguards in place?”

The Law Council of Australia said the federal government should consider, in the short term, regulating high-risk AI technology including biometric tech such as facial recognition and social scoring, as well as protections against AI-generated fakes and scams.

The Shopping Centre Council of Australia raised concerns that its existing technology such as CCTV and “facial detection” screens might get caught up in any regulation of AI, while banks and financial institutions such as Visa, CBA and NAB said they had long deployed AI to detect potential fraud and argued that much of what they do is already covered by existing law.

The Business Council of Australia said the problem wasn’t that there were no laws covering AI, just that there needed to be a better understanding of how existing laws apply before any new laws are made.