Under legislation proposed by the federal government, Australia’s media watchdog may soon be able to force big tech companies to hand over data, a small step towards combating misinformation.
Media law and ethics experts warn that while the Australian Communications and Media Authority (ACMA) legislation is a step in the right direction, it lacks the clarity to fully respond to the significant challenges social media misinformation presents.
University of Wollongong senior journalism lecturer Dr Steinar Ellingsen said the proposed legislative changes were inadequate.
“The code is awfully complex and full of technical jargon. The definitions are also vague, which will make the code challenging to enforce because it is not fully developed,” Dr Ellingsen said.
The proposed legislation would expand ACMA’s information-gathering powers to incentivise transparency and improve access to Australia-specific data on the effectiveness of measures addressing online misinformation.
Additionally, the legislation would empower ACMA to hold big tech companies to account should voluntary efforts to remove harmful content prove inadequate.
Dr Ellingsen said the expanded ACMA powers raise potential concerns.
“Privacy concerns are always a risk where data is being handed over,” Dr Ellingsen said.
“I also have concerns surrounding the accuracy of companies’ reporting.
“If companies are being urged to be transparent under the code, more independent oversight would be welcomed.”
UTS Centre for Media Transition (Faculty of Law) senior lecturer Sacha Molitorisz said that while concerns about the sufficiency of the code were valid, the legislative changes were positive.
“Obviously we need to consider ethical and legal concerns very carefully. But here the common goal should be clear: we need to do better at curbing misinformation and disinformation, which are doing real harm,” Dr Molitorisz said.
Minister for Communications, Urban Infrastructure, Cities and the Arts Paul Fletcher welcomed all five recommendations contained in the report.
“Digital platforms must take responsibility for what is on their sites and take action when harmful or misleading content appears,” Mr Fletcher said.
A Misinformation and Disinformation Action Group would also be established under the legislation, bringing together key stakeholders to discuss emerging issues and best practice.