Australia’s online safety regulator has released new draft industry standards under the Online Content Scheme, while the government has commenced consultation on proposed changes to the Basic Online Safety Expectations.

In brief

On 20 November 2023, Australia’s eSafety Commissioner (“eSafety”) released drafts of two new industry standards under the Online Safety Act 2021 (Cth) (OSA) for public consultation (“Draft Standards”).

The Draft Standards cover providers of:

  • Relevant electronic services (RES), which cover a wide category of services that allow end-users to communicate online (including email, SMS, MMS, chat, instant messaging, various online games and dating services) (“RES Standard”).
  • Designated internet services (DIS), which cover a broad range of websites and apps not otherwise captured by the RES Standard or the industry codes that have been registered by eSafety under the OSA (“DIS Standard”).

eSafety also released a discussion paper on the Draft Standards (“Discussion Paper”) as well as fact sheets for both Draft Standards (available here and here). Public consultation on the Draft Standards is open until 21 December 2023, although time extensions to make a submission can be requested from eSafety.

Separately, on 22 November 2023, the Australian Government commenced consultation on proposed amendments to the Basic Online Safety Expectations (BOSE) made under the OSA, and released a draft amendment determination (“Amendment Determination”) and corresponding consultation paper. A summary of the proposed changes prepared by the government is available here. This consultation will remain open until 16 February 2024.

Both of these developments reflect an overall increase in the level of specificity in Australia’s online safety regulatory regime. This will be followed by a wholesale review of the OSA, which the government has announced will be brought forward to early 2024.

Key takeaways

Draft Standards 

The Draft Standards contain a set of mandatory compliance measures requiring RES and DIS providers to take proactive steps and implement transparency and accountability mechanisms to minimise access and exposure to the most harmful material on their services (see Background below for further detail). Both Draft Standards have their roots in the draft industry codes for RES and DIS providers that were previously rejected by eSafety (“Draft Codes”), but differ from the Draft Codes in many significant respects. Some of the most notable changes include:

  • Significantly increased proactive detection obligations for known child sexual abuse material and pro-terror material across a range of service types (although these obligations do not require providers to use systems, processes or technologies to detect material where to do so is not “technically feasible” – eSafety has indicated this is in recognition of the limitations that may be faced by some closed communication services in detecting material on their services).
  • Enhanced obligations for certain services to disrupt and deter end-users from using their services to create, access, store or distribute child sexual abuse material or pro-terror material.
  • New targeted obligations for DIS with generative AI capabilities, which apply depending on whether a service falls within certain pre-assessed categories. eSafety has stated that these proposed obligations aim to address the evolving risks associated with generative AI, particularly with respect to its potential to create synthetic material and deepfake images or videos.
  • Requirements for certain services to establish development programs, which must include investments and activities designed to develop systems, processes and technologies that enhance the ability to detect and disrupt the distribution of certain online material.
  • Significantly increased reporting requirements across a range of service types.

It should also be noted that, unlike the Draft Codes, guidance on the mandatory compliance measures is not embedded in the Draft Standards themselves, but will instead be contained in explanatory statements and other regulatory guidance from eSafety. The Draft Standards also do not contain any “optional” compliance measures, but make clear that the mandatory compliance measures are not exhaustive and do not prevent providers from taking additional measures to improve and promote online safety.

The Discussion Paper sets out a number of questions designed to guide submissions on the Draft Standards. These include questions relating to:

  • Whether the technical feasibility exception to the proactive detection obligations is appropriate, and whether there are any other limitations that would prevent service providers from being able to comply.
  • The appropriateness of the threshold to determine which services are required to establish development programs.
  • The effectiveness and appropriateness of the proposals to include targeted requirements in the DIS Standard for certain services with generative AI capabilities.
  • Likely compliance costs for service providers and potential new industry entrants.

BOSE Amendment Determination

The draft Amendment Determination released by the Minister for Communications proposes to amend the existing Basic Online Safety Expectations Determination 2022 (“BOSE Determination”) to include a range of additional expectations for providers of social media services, RES and DIS, including that such service providers:

  • Consider user safety and incorporate safety measures in the design, implementation and maintenance of any generative AI capabilities.
  • Ensure the best interests of a child are a primary consideration in the design and operation of any service that is used by, or accessible to, children.
  • Prepare and publish regular transparency reports on the measures they are taking to keep Australians safe online.

The Amendment Determination also proposes to amend and add to the examples in the BOSE of reasonable steps expected to be taken by service providers to ensure end-users can use their service in a safe manner. This includes strengthening the existing example relating to assessments of safety risks, and adding an example of having processes for detecting and addressing “hate speech” – a new concept under the regime. 

Next steps

The closing date for submissions on the Draft Standards is 21 December 2023, although time extensions to make a submission can be requested by emailing eSafety at codes@eSafety.gov.au. Information on how to make a submission is available on this page.

eSafety has stated that it will review the public submissions received and amend the Draft Standards as required in January and February 2024, before lodging the Draft Standards and explanatory statements with the Office of Parliamentary Counsel in late March 2024. The Standards are set to have a six-month transition period once they are registered to allow providers to come into compliance.

Consultation on the proposed amendments to the BOSE will remain open until 16 February 2024. Submissions can be made here.

Further details of the wholesale review of the OSA are yet to be published, but are expected to be released in early 2024.

Background

As detailed in our prior alerts, the OSA was passed on 23 June 2021 and commenced on 23 January 2022. Please view our previous articles (here, here, and here) for further details regarding the OSA.

In short, the OSA was a significant overhaul of Australia’s online safety laws. Among other things, it allows for the Minister for Communications to determine core basic online safety expectations for social media services, RES and DIS (i.e., the BOSE), and sets out an Online Content Scheme that provides for the development of codes by industry associations or, alternatively, the development of standards by eSafety, for eight sections of the online industry.

The BOSE Determination, which sets out both broad expectations as to the overall safety of social media services, RES and DIS, as well as more specific expectations (for example, to consult with eSafety and to keep records of certain complaints), came into effect on 23 January 2022.

As detailed in our earlier article (here), industry codes for each of the eight sections of the online industry were developed by industry and submitted by representative associations to eSafety for consideration. These draft codes required providers to adopt various compliance measures in relation to child sexual exploitation material, pro-terror material, extreme crime and violence material, crime and violence material, and drug-related material (together, class 1A and class 1B materials).

Five of the codes were approved and registered by eSafety under the OSA on 16 June 2023, and will come into force on 16 December 2023. These codes cover providers of social media services, app distribution services, hosting services, and internet carriage services, as well as persons who manufacture, supply, maintain or install equipment used by end-users in connection with online services or internet carriage services.

A sixth industry code, for providers of search engine services, was approved and registered by eSafety on 12 September 2023, and will come into force on 12 March 2024.

* * * * *

If you have any questions on how these developments may impact you, please contact a Baker McKenzie representative. 

Author

Adrian Lawrence is the head of the Firm's Asia Pacific Technology, Media & Telecommunications Group. He is a partner in the Sydney office of Baker McKenzie, where he advises on media, intellectual property and information technology, including on major issues relating to online and offline media interests. He is recognised as a leading Australian media and telecommunications lawyer.

Author

Andrew Stewart leads the Intellectual Property & Technology Practice Group in Australia and the Firm's Global Digital Media & Copyright Content Business Unit. He is also a member of the Firm's Asia Pacific Intellectual Property & Technology Steering Committee. Andrew has significant in-house experience at one of Australia's most successful television networks, giving him insight into the Australian media environment, and is an advisory board member of the Melbourne University Centre for Media and Communications in the Law.

Author

Allison Manvell is a special counsel in the Technology, Communications and Commercial and Media & Content teams at Baker McKenzie, working across the Firm's Sydney and Brisbane offices. Allison has more than ten years' experience advising on commercial and regulatory matters across a range of industries, with a particular focus on digital media, technology, broadcasting, and content licensing and regulation. She has also spent time on client secondment within the media industry. She is a member of the Communications and Media Law Association and speaks and presents regularly on legal issues relevant to convergence and digital media.
