A new era for Australian online safety regulation


Online safety regulation has been, and continues to be, a priority for regulators globally. Indeed, the Online Safety Bill 2021 (“Bill”) and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021 received royal assent on 23 July 2021, having only been introduced to Parliament earlier this year.

Online service providers, social media service providers and other designated internet service providers have the next six months to ensure their policies and procedures are up to date and compliant with Australian laws.

Overview

The Bill, passed as the Online Safety Act 2021 (Cth) (the “Act”), builds upon the existing online regulatory framework established by the Enhancing Online Safety Act 2015 (Cth) and introduces additional compliance obligations for companies operating online.

The main purpose of the Act is to create a modernised framework for online safety. This is achieved in a number of ways, primarily through the expanded role of the eSafety Commissioner (the “Commissioner”), an independent statutory office holder supported by the Australian Communications and Media Authority.

The Commissioner’s key functions are to enforce the Act and to administer a complaints system for:

  • cyber-bullying material targeted at an Australian child;
  • non-consensual sharing of intimate images;
  • cyber-abuse material targeted at an Australian adult; and
  • material regulated under the online content scheme.

Who is impacted by the new regime?

The Act will not only influence the way in which Australian end-users engage with online platforms, but will also impact the way online service providers deliver their services on those platforms.

Specifically, the Act will impact:

  • designated internet service providers: entities that allow Australian end-users to access material online (for example, through search engines and browsers such as Google, Safari and Internet Explorer), and internet service providers, being entities that supply internet carriage services;
  • social media service providers: entities that provide services connecting two or more Australian end-users online, including, among others, Facebook, LinkedIn and Instagram;
  • relevant electronic service providers: entities that allow end-users to communicate with one another (e.g. Outlook and gaming chat services);
  • app distribution service providers: entities that enable end-users to download apps, including, among others, Google (through the Google Play Store) and Apple (through the iOS App Store); and
  • hosting service providers: entities that host material provided on social media services, relevant electronic services or designated internet services, including, among others, Apple and Microsoft through their respective cloud services.

Importantly, companies captured by the Act must proactively protect Australian end-users and have the capacity to respond promptly to notices from the Commissioner.

What are the key changes?

The Act introduces a number of key changes to enhance online safety for Australians. We set out some of these changes below.

Basic Online Safety Expectations (“BOSE”)

The Act addresses gaps in Australian legislation by introducing a set of basic online safety expectations (“BOSE”).

Providers of social media services, relevant electronic services or designated internet services may now be required to give the Commissioner reports on their compliance with the BOSE. This significantly expands the previous framework of online safety requirements for online service providers.

The BOSE include, among other things, expectations that providers will:

  • maintain sufficient standards for the safe use of the service;
  • control the type of content made available on the service;
  • control access to certain material by certain demographics;
  • implement the technologies necessary to protect minors’ activity on the service; and
  • on notice, provide the Commissioner with timeframes for compliance with removal or blocking notices.

The Act also includes a non-exhaustive power for the Minister to determine any other BOSE the Minister considers necessary.

Online content scheme

The Act establishes an online content scheme for the take-down of contravening material (for example, material falling within the refused classification, X18+ and R18+ classifications under the Classification (Publications, Films and Computer Games) Act 1995 (Cth)). This scheme largely replicates the online content scheme in Schedule 5 (online services) and Schedule 7 (content services) of the Broadcasting Services Act 1992 (Cth), albeit with strengthened powers to target material provided from, or hosted, outside Australia.

Under the Act, the Commissioner may issue:

  • removal notices to providers of social media services, relevant electronic services, designated internet services or hosting services, requiring them to remove, or take all reasonable steps to remove, the content specified in the notice;
  • remedial notices to providers of social media services, relevant electronic services or designated internet services, requiring the provider to take all reasonable steps to ensure the contravening material is removed from the service or that access to it is subject to a restricted access system;
  • link deletion notices requiring the provider of an internet search engine service to cease providing a link to certain material; and/or
  • app removal notices requiring the provider of an app distribution service to cease enabling end-users to download an app that facilitates the posting of certain material on a social media service, relevant electronic service or designated internet service.

Depending on the contravening material, online service providers must comply with a take-down notice within 24 hours of receipt. This is half the current 48-hour period.

Blocking request powers

The Act enables the Commissioner to request or require an internet service provider to block access to domain names, URLs or IP addresses that provide access to material depicting, promoting, inciting or instructing in ‘abhorrent violent conduct’. ‘Abhorrent violent conduct’ includes terrorist acts, murder, attempted murder, torture, rape and kidnapping. To issue a blocking request or a blocking notice, the Commissioner must be satisfied that the availability of the material online is likely to cause significant harm to the Australian community.

Each of the above regimes comes into effect in six months (or on a date to be fixed by proclamation).

Enforcement

The Act grants the Commissioner a range of enforcement powers, including civil penalties, formal warnings, infringement notices, enforceable undertakings and injunctions.

In addition to the removal and blocking powers set out above, the Commissioner also has powers to:

  • obtain contact or identifying information from online service providers about individuals using anonymous accounts for conduct that the Commissioner reasonably determines to be contrary to the principles of the Act, including abuse, bullying or the non-consensual sharing of intimate images of others; and
  • do any other thing the Commissioner considers necessary in connection with the performance of the Commissioner’s functions.

Significance

While the Act builds on the existing Australian framework, it also introduces a number of additional obligations on social media service providers, relevant electronic service providers, designated internet service providers and hosting service providers. It is therefore important for companies operating in the online sector to familiarise themselves with their obligations under the Act and to ensure their policies and procedures are up to date to deal with any urgent take-down notices.

Please contact us for more information.

Authored by Mandi Jacobson, Angell Zhang and Zachary Forrai.
