Online Safety Act 2021 – application to global online business

Does your online business need to comply?

The Online Safety Act 2021 (Cth) (Act) was passed on 23 June 2021 and came into effect on 23 January 2022. The Act creates the world’s first cyber-abuse take-down scheme for adults, in addition to strengthening online safety protections. If your business is an online service provider with Australian end-users, the Act will apply to your service regardless of where in the world your business is located.

What does the Act do?

The Act implements five “schemes” to deal with the removal of harmful online material relating to:

  1. children (cyber-bullying scheme);
  2. adults (adult cyber-abuse scheme);
  3. the sharing of intimate images without consent (image-based scheme);
  4. abhorrent violent material (abhorrent violent material blocking scheme); and
  5. harmful online content (online content scheme).

The Act affects a variety of online service providers (OSPs) with Australian end-users, including (but not limited to) providers of:

  • electronic services, which enable end-users to:
    • communicate via email, instant messaging, SMS, MMS or a chat service; or
    • play online games with other end-users;
  • social media services (where the sole or primary purpose of the service is to allow social interaction between end-users and end-users are permitted to post material to the service);
  • internet service providers;
  • app distribution services (such as Apple’s App Store or Google’s Play Store); and
  • search engines.

Overseas-based OSPs will need to comply with the Act, which has extraterritorial application.

The eSafety Commissioner has the power to issue “removal notices” to OSPs, which include providers of online games, electronic messaging and chat services, and social media services, amongst others. Any OSP issued with a removal notice must remove the offending content within 24 hours.
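By way of illustration only, the sketch below (in TypeScript) shows one way an OSP might track the 24-hour compliance window attached to a removal notice. The notice structure, the takeDown hook and the handling flow are hypothetical assumptions for illustration, not requirements drawn from the Act.

    // Hypothetical sketch: tracking the 24-hour deadline on a removal notice.
    // The RemovalNotice shape and takeDown() hook are illustrative assumptions.

    interface RemovalNotice {
      noticeId: string;
      contentUrl: string;
      receivedAt: Date;
    }

    const COMPLIANCE_WINDOW_MS = 24 * 60 * 60 * 1000; // 24 hours under the Act

    function deadlineFor(notice: RemovalNotice): Date {
      return new Date(notice.receivedAt.getTime() + COMPLIANCE_WINDOW_MS);
    }

    async function handleRemovalNotice(
      notice: RemovalNotice,
      takeDown: (url: string) => Promise<void>,
    ): Promise<void> {
      const deadline = deadlineFor(notice);
      // Remove the material immediately rather than waiting out the window.
      await takeDown(notice.contentUrl);
      if (new Date() > deadline) {
        // Past the statutory window: flag for legal review and record the lapse.
        console.warn(`Notice ${notice.noticeId} actioned after its 24-hour deadline.`);
      }
    }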

Industry expectations

The Act also establishes a “Basic Online Safety Expectations” (BOSE) framework.1 This obliges OSPs to take reasonable steps to ensure the safety of users and minimise the extent to which harmful material is provided on their service. Broadly, the BOSE require an OSP to:

  • take reasonable steps to ensure that end-users can access online services in a safe manner;
  • consult with the eSafety Commissioner to determine what “reasonable steps” mean for that OSP (this may be satisfied by following guidance issued by the eSafety Commissioner, rather than by a positive consultation obligation);
  • take reasonable steps to minimise the extent to which certain materials are provided (including cyber-bullying material, cyber-abuse material, non-consensual intimate images and material that promotes or depicts abhorrent violent conduct); and
  • ensure that online services have clear and readily identifiable mechanisms to enable users to report and make complaints about objectionable material (a minimal illustration of such a mechanism follows this list).
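As a minimal sketch only, the following TypeScript example shows one shape such a reporting mechanism could take, assuming a hypothetical /report endpoint and payload; neither the endpoint nor the payload structure is prescribed by the BOSE.

    // Hypothetical sketch of a user-facing report mechanism (Node.js http).
    // Endpoint path, payload shape and storage are illustrative assumptions only.
    import { createServer } from "node:http";

    interface Report {
      reporterEmail: string;
      contentUrl: string;
      reason: string; // e.g. "cyber-abuse", "non-consensual intimate image"
    }

    const reports: Report[] = []; // in practice this would be durable storage

    createServer((req, res) => {
      if (req.method === "POST" && req.url === "/report") {
        let body = "";
        req.on("data", (chunk) => (body += chunk));
        req.on("end", () => {
          reports.push(JSON.parse(body) as Report); // validation omitted for brevity
          res.writeHead(202).end("Report received");
        });
      } else {
        res.writeHead(404).end();
      }
    }).listen(8080);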

To ensure compliance with the BOSE, the eSafety Commissioner may require OSPs – either individually or as a class – to provide specific information (such as their response to terrorism or abhorrent violent material) and report back on the steps which they have taken to comply with the BOSE.

Online Content Scheme

The online content scheme regulates offensive content2 through a complaints-based mechanism and requires OSPs to develop industry codes and standards. Once registered, the codes and standards are enforceable by fines and injunctions.

The eSafety Commissioner has also made it mandatory for OSPs to implement a “restricted access system”3 that requires users to declare that they are 18 years or older before accessing R 18+ or Category 1 restricted content (classifications which apply to material such as films and computer games). Such restricted content typically includes depictions of sex, high-impact nudity, drug use or violence.

The restricted access system must also warn the user about the nature of the content.
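The sketch below (TypeScript) illustrates one possible form of such a gate, assuming a hypothetical gateRestrictedContent function and warning text; the Declaration does not prescribe any particular implementation.

    // Hypothetical sketch of a restricted access gate for R 18+ / Category 1 content.
    // The flow and wording are illustrative assumptions only.

    interface AccessDecision {
      granted: boolean;
      warningShown: boolean;
    }

    function gateRestrictedContent(declaredAdult: boolean): AccessDecision {
      // Warn the user about the nature of the content before access is granted.
      const warning =
        "This content is restricted (R 18+ / Category 1) and may include " +
        "depictions of sex, high-impact nudity, drug use or violence.";
      console.log(warning);

      // Access is granted only where the user declares they are 18 or older.
      return { granted: declaredAdult, warningShown: true };
    }

    // Usage: a user who declines the declaration is refused access.
    console.log(gateRestrictedContent(false)); // { granted: false, warningShown: true }
    console.log(gateRestrictedContent(true));  // { granted: true,  warningShown: true }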

Adult cyber-abuse scheme

The adult cyber-abuse scheme is the first of its kind, created to assist adult victims of cyber-abuse. It provides for the removal of material that seriously harms Australian adults. The eSafety Commissioner has the power to issue removal notices to “end-users” and OSPs, requiring them to remove abusive material within 24 hours. A removal notice may be issued to an OSP or end-user anywhere in the world, so long as the alleged victim is an Australian resident.

The test for what constitutes abuse directed at adults requires that the material be intended to have the effect of causing “serious harm”, which is defined as serious physical harm or serious harm to a person’s mental health, and includes serious psychological harm and serious distress. The threshold for adults is higher than that for children, in recognition of adults’ greater resilience and the need to balance protection against freedom of speech.

Individuals must make a complaint to the relevant service provider in the first instance. If the OSP fails to assist them, they may request that the eSafety Commissioner issue a removal notice.

Penalties

Failure to comply with obligations under the Act can result in fines of up to $111,000 for individuals or $555,000 for companies. The eSafety Commissioner may also seek enforceable undertakings and issue formal warnings.

What else has changed?

The Act has updated and expanded existing laws in relation to harmful online content. Some key points include:

  • the time for complying with a removal notice (previously only in place in relation to cyber-bullying of children) has been reduced from 48 hours to 24 hours; and
  • take-down notices may be issued not only to social media platforms and websites, but also to online gaming platforms and messaging services. This expands the scope of previous cyber-bullying laws concerning children.

The eSafety Commissioner now also has specific powers to:

  • compel information about the identity of “trolls”, or persons who use fake accounts or anonymity to abuse others or exchange illegal content;
  • block websites in response to online crises (such as violent events) by requesting internet service providers to block access to content for a limited time. This mirrors the abhorrent violent material provisions in the Criminal Code, which were enacted as a response to the Christchurch terrorist attack; and
  • force search engines and app stores to remove access to websites, or apps from the app store, that have failed to comply with previous removal orders.

Key takeaways

The Act applies not only to social media platforms and search engines, but also to, for example, providers of online games and electronic messaging services. Although the eSafety Commissioner has expressed the view that she will prioritise investigation of complaints about the most harmful material (such as child sexual exploitation material, material that advocates the doing of a terrorist act, and material that promotes, incites or instructs in matters of crime or violence),4 we recommend that OSPs with Australian end-users review and update their online safety procedures to ensure compliance with the BOSE, the restricted access system regime and the Act in general. OSPs will also need mechanisms in place that enable them to respond promptly to removal notices.

If your organisation requires assistance in understanding its obligations under the Act, please contact us.


1 Online Safety (Basic Online Safety Expectations) Determination 2022.
2 This may include material that has been, or would likely be, classified by the Classification Board as Refused Classification, X 18+, Category 2 restricted, R 18+ or Category 1 restricted.
3 Online Safety (Restricted Access Systems) Declaration 2022.
4 See the Online Content Scheme Regulatory Guidance (eSC RG 4) published by the eSafety Commissioner.

Liability limited by a scheme approved under Professional Standards Legislation.
© ADDISONS. No part of this document may in any form or by any means be reproduced, stored in a retrieval system or transmitted without prior written consent. This document is for general information only and cannot be relied upon as legal advice.