Fact or Fiction? Unpacking Australia’s New Proposed Misinformation and Disinformation Laws

Legislation designed to combat the spread of misinformation and disinformation on online platforms has been introduced into the Australian Parliament.

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (Cth) (the Bill) was introduced into the House of Representatives on 12 September 2024. The Bill is not intended to cover the dissemination of all types of false content, but rather the dissemination of content that is verifiably false, misleading or deceptive and which causes or contributes to serious harm.

The Bill has three main objectives:

  • to empower the federal media regulator, the Australian Communications and Media Authority (ACMA), to require digital communications platform providers to take steps to manage the risks posed by misinformation and disinformation online;
  • to increase transparency regarding the way in which digital communications platform providers manage misinformation and disinformation; and
  • to empower users of digital communications platforms to identify and respond to misinformation and disinformation online.

The Bill does not empower the ACMA to remove individual items of content or to take down user accounts; rather, it is targeted at the development and enforcement of industry codes and standards.

What is the reason for the changes?

In February 2021, the Australian Code of Practice on Disinformation and Misinformation (the Code) was released by DIGI, a not-for-profit industry association representing the digital platform industry. The Code, which is opt-in, is overseen by the ACMA and requires signatories to commit to a number of measures to address the spread of misinformation and disinformation on their platforms.

There are currently only nine signatories to the Code: Adobe, Apple, Google, Meta, Microsoft, Redbubble, TikTok, Twitch and Legitimate. X (formerly Twitter) was a signatory to the Code until its signatory status was withdrawn in November 2023.

In its June 2021 and June 2023 reports to the Australian Government regarding the operation of the Code, the ACMA noted that there were a number of shortcomings in the self-regulatory arrangement. The Bill has therefore been designed to enhance and complement the framework under the Code and to incentivise digital communications platform providers to have robust systems and measures in place to address misinformation and disinformation online.

A draft version of the Bill was released last year for public consultation.

Who does the Bill apply to?

Under the Bill, the proposed laws will apply to “digital communications platform providers”. “Digital communications platforms” are defined to include:

  • connective media services, being services that enable online interaction between two or more end-users;
  • content aggregation services, being services that collate and present content from a range of online sources to end-users;
  • media sharing services, being services that provide audio, visual (animated or otherwise) or audio-visual content to end-users;
  • internet search engine services; and
  • other kinds of digital services determined by the Minister for Communications from time to time.

Internet carriage services, SMS services and MMS services are excluded from the definition of “digital communications platform”.

What does the Bill require digital communications platform providers to do?

The Bill requires digital communications platform providers to increase their transparency with Australian users about how misinformation and disinformation is handled on their services. Specifically, the Bill requires that digital communications platform providers:

  • assess risks relating to misinformation and disinformation on their platform and publish the results of that assessment;
  • publish policies in relation to the management of misinformation and disinformation; and
  • publish media literacy plans setting out the measures that they will take to ensure that end-users are better able to identify misinformation and disinformation.

The Bill enables the ACMA to approve and register enforceable misinformation codes that have been developed by the digital platform industry (Misinformation Codes). If the ACMA considers that the industry misinformation codes are not adequate, the ACMA may determine misinformation standards for sections of the digital platforms industry (Misinformation Standards).1

The Bill also enables the ACMA to make rules requiring digital communications platform providers to implement a complaints and dispute resolution process regarding online misinformation and disinformation.

Failure to publish required information or to make information available to the ACMA when requested, as well as non-compliance with a Misinformation Code or Misinformation Standard more generally, may attract enforcement action by the ACMA. The enforcement mechanisms available to the ACMA include issuing formal warnings, remedial directions or infringement notices, as well as commencing proceedings seeking the imposition of civil penalties and/or injunctions. Civil penalties are severe and, for some contraventions, can range up to the greater of 25,000 penalty units (currently $7,825,000) or 5% of annual turnover.

What types of content does the Bill apply to?

The Bill sets a high threshold for the types of content that would be considered to be misinformation or disinformation. Broadly speaking, in order for the prohibitions under the Bill to apply, the content must be:

  • reasonably verifiable as false, misleading or deceptive; and
  • reasonably likely to cause or contribute to “serious harm”.

The types of “serious harms” covered by the Bill include:

  • harm to the operation or integrity of an electoral or referendum process in Australia;
  • harm to public health in Australia including to the efficacy of preventive health measures;
  • vilification of a group in Australian society on the grounds of race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin, or an individual because of a belief that the individual is a member of such a group;
  • intentionally inflicted physical injury to an individual in Australia;
  • imminent damage to critical infrastructure or disruption of emergency services in Australia; and
  • imminent harm to the Australian economy.

Whether content would be considered to be misinformation or disinformation depends on the intention of the person who disseminates the content. If there are grounds to suspect that the person disseminated the content with the intention to deceive others, the content would be considered disinformation; if there is no such identifiable intention, the content would be misinformation.

Importantly, the Bill does not apply to the dissemination of:

  • content that would reasonably be regarded as parody or satire;
  • professional news content (i.e. content disseminated by persons or organisations who produce and publish online news content and who are subject to professional rules and editorial standards); or
  • content for any academic, artistic, scientific or religious purpose (provided that the dissemination is reasonable).

What’s next?

The Bill is at an early stage: it is currently before the House of Representatives and is yet to be voted on.

The most vocal criticism of the Bill, and of the consultation draft released last year, concerned the potential for the proposed changes to limit free speech. It is expected that there will be significant debate about whether the Bill has struck the correct balance.

We will be monitoring the progress of the Bill closely.

For regular insights, follow Addisons on LinkedIn and subscribe to our updates.


  1. The approach under the Bill relating to the development of disallowable enforceable industry codes is similar to the position taken under Australia’s online safety regime. For more information, see our previous Insight Online Content Regulation in Australia.

Liability limited by a scheme approved under Professional Standards Legislation.
© ADDISONS. No part of this document may in any form or by any means be reproduced, stored in a retrieval system or transmitted without prior written consent. This document is for general information only and cannot be relied upon as legal advice.