Overview

On 12 September 2024 the Federal Government introduced the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (Bill) into Parliament with the stated aim of protecting Australians from the threats to democracy and public safety posed by the spread of harmful misinformation and disinformation on “digital communications platforms”.

The Bill proposes to add a new Schedule 9 to the Broadcasting Services Act 1992 (Cth) under which:

  • obligations will be imposed on digital communications platform providers in relation to the management of misinformation and disinformation on their platforms; 
  • the Australian Communications and Media Authority (ACMA) will be empowered to create Digital Platform Rules requiring digital communications platforms to keep records and provide certain reports to ACMA;
  • bodies or associations representing sections of the digital platform industry will be permitted to develop codes to prevent or respond to misinformation and disinformation on digital communications platforms, compliance with which will become mandatory once approved by ACMA; and
  • if codes are not created by industry (or ACMA considers those codes are deficient), ACMA will have the power to create mandatory standards to provide Australians with adequate protection from serious harm caused or contributed to by misinformation or disinformation on digital communications platforms.

This is the second attempt by the Federal Government to introduce legislative reform in relation to misinformation and disinformation. When announcing the new Bill, the Minister for Communications, the Hon Michelle Rowland MP, was quoted as saying: “following public consultation on the draft Bill last year, revisions have been made that carefully balance the public interest in combatting seriously harmful misinformation and disinformation with the freedom of expression that is so fundamental to our democracy.”

Key changes in this Bill include:

  • There is no longer an express exclusion for content authorised by government.
  • The definitions of misinformation and disinformation now capture information that is “reasonably verifiable as false, misleading or deceptive”, where previously they captured information that is “false, misleading or deceptive”.
  • A narrowing of the meaning of “serious harm”.

In this article, we provide an overview of the proposed regime including who it will apply to, the new obligations imposed, and the consequences for non-compliance.

Who will this apply to?

The new regime will apply to providers of “digital communications platforms”, defined as being:

  • a “digital service”, being a service that:
    • delivers content to persons by, or allows end-users to access content using, an internet carriage service;
    • is provided to the public; and
    • is offered in Australia,
      (but not including a broadcasting or datacasting service); and
  • which satisfies the conditions of one or more of the platform types described below. “Internet carriage services”, “SMS services” and “MMS services” are expressly excluded.

Connective Media Services

  • the primary function* of the digital service is to enable online interaction between 2 or more end-users;
  • the digital service allows end-users to link to, or interact with, some or all of the other end-users; and
  • the digital service has an “interactive feature”.

Content Aggregation Services

  • the primary function* of the digital service is to collate and present to end-users content from a range of online sources, including sources other than the digital service; and
  • the digital service is not an internet search engine service. 

Internet Search Engine Services

  • the digital service collects, indexes or ranks content from a range of online sources, including sources other than the digital service; and
  • the primary function* of the digital service is to enable an end-user to search the digital service’s collection, index or ranking.

Media Sharing Services

  • the primary function* of the digital service is to provide audio, visual (animated or otherwise) or audio-visual content to end-users.

NB: media sharing services that do not have an “interactive feature” are exempt.

*In assessing primary function, the provision of advertising material and the collection of data (or the generation of revenue from either) are to be disregarded.

Additional conditions applying to each platform type may also be included in the Digital Platform Rules.

What is misinformation and disinformation?

The Bill creates two distinct concepts of “misinformation” and “disinformation”:

Misinformation

  1. the content contains information that is reasonably verifiable as false, misleading or deceptive;
  2. the content is provided on the digital service to one or more end-users in Australia;
  3. the provision of the content on the digital service is reasonably likely to cause or contribute to serious harm; and
  4. the dissemination is not “excluded dissemination” (defined to mean dissemination of content that would reasonably be regarded as parody or satire, dissemination of “professional news content”, or reasonable dissemination of content for any academic, artistic, scientific or religious purpose).

Disinformation

  1. the same requirements as “misinformation”; and
  2. either:
    1. there are grounds to suspect that the person disseminating, or causing the dissemination of, the content intends that the content deceive another person; or 
    2. the dissemination involves “inauthentic behaviour”. 

While the Bill requires digital communications platform providers to manage both “misinformation” and “disinformation”, it leaves open the possibility that applicable misinformation codes or standards will impose different obligations in respect of each.

New obligations on digital communication platform providers

Obligation to publish information 

Digital communications platform providers will be required to ensure the following information is both available to end-users on its platform(s) and publicly accessible on its website:

  1. A report on the outcomes of the provider’s assessment of risks relating to misinformation and disinformation on their platform.
  2. The platform’s policy or information on its current policy approach in relation to misinformation and disinformation.
  3. The platform’s current “media literacy plan”, being a plan setting out measures the provider of the platform will take to enable end-users to identify the source of content disseminated on the platform.

Obligation to comply with the Digital Platform Rules, codes and standards 

Digital communications platform providers will be required to comply with:

  1. the Digital Platform Rules made by ACMA;
  2. any applicable misinformation codes prepared by bodies or associations that represent a particular section of the digital platform industry and approved by ACMA; and
  3. any misinformation standards prepared by ACMA.

The Bill does not prescribe minimum requirements which must be included in codes or standards. However, the Bill does provide examples of matters that could be dealt with by the codes and standards, including:

  • Using technology to prevent or respond to misinformation or disinformation.
  • Preventing advertising involving misinformation or disinformation.
  • Preventing monetisation of misinformation or disinformation.
  • Allowing end-users to detect and report misinformation or disinformation.
  • Supporting fact checking.
  • Giving information to end-users about the source of political or issues-based advertisements.

Consequences for non-compliance

If a provider fails to comply with certain provisions of the Bill, or with applicable mandatory codes or standards approved or made by ACMA, ACMA may issue formal warnings, written directions and infringement notices. A number of these provisions are also civil penalty provisions. We have summarised some of the maximum penalties for bodies corporate below:

  • Contravention of transparency, risk management, media literacy and complaints / dispute handling obligations: 5,000 penalty units.
  • Non-compliance with requests from ACMA to provide information and documents: 40 penalty units.
  • Contravention of a misinformation code, or non-compliance with an ACMA remedial direction relating to such a contravention: 10,000 penalty units or 2% of the annual turnover of the body corporate during the turnover period.
  • Contravention of a misinformation standard, or non-compliance with an ACMA remedial direction relating to such a contravention: 25,000 penalty units or 5% of the annual turnover of the body corporate during the turnover period.
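By way of rough illustration only, the penalty-unit figures can be converted to indicative dollar amounts. The sketch below assumes a Commonwealth penalty unit value of AUD 313 (the value at the time the Bill was introduced; penalty units are periodically indexed) and assumes that, where a turnover-based alternative applies, the maximum is the greater of the two amounts. Both assumptions should be verified against the Bill and current regulations before being relied upon.

```python
# Illustrative conversion of penalty units to dollar maxima.
# ASSUMPTION: penalty unit of AUD 313 (the value when the Bill was
# introduced in September 2024; the figure is indexed and may change).
PENALTY_UNIT_AUD = 313

def max_penalty_aud(penalty_units, turnover_pct=None, annual_turnover_aud=None):
    """Return an indicative maximum penalty for a body corporate.

    ASSUMPTION: where a turnover-based alternative applies, the maximum
    is treated here as the greater of the two amounts.
    """
    unit_based = penalty_units * PENALTY_UNIT_AUD
    if turnover_pct is not None and annual_turnover_aud is not None:
        return max(unit_based, turnover_pct * annual_turnover_aud)
    return unit_based

# Transparency-type contravention: 5,000 penalty units.
print(max_penalty_aud(5_000))
# Misinformation standard contravention by a platform with AUD 1bn turnover:
# 25,000 penalty units or 5% of annual turnover.
print(max_penalty_aud(25_000, 0.05, 1_000_000_000))
```

For a large platform, the turnover-based limb will typically dwarf the unit-based figure: 5% of AUD 1 billion far exceeds 25,000 penalty units at the assumed unit value.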

Related developments

September has already been a busy month for the Government. The Bill was released in the same week as the Privacy and Other Legislation Amendment Bill 2024 (Cth) was introduced into Parliament, and shortly after the Government launched a public consultation on its proposal to introduce mandatory guardrails around high-risk AI systems and models.

See our dedicated page for the latest developments in relation to technology, privacy and data.


Key contacts

Kristin Stammer, Executive Partner, Asia and Australia, Sydney
Kwok Tang, Partner, Sydney
Tess Mierendorff, Partner, Sydney
George Psaltis, Senior Associate, Sydney
