The next evolution of the internet is set to be driven in part by immersive technologies. As the take-up of these technologies accelerates, online interactions will change. Existing laws and regulations will need to evolve to address the new types of harms that will emerge.

The way we interact with people and digital content through the internet and via online platforms is constantly evolving.

The last few years have seen big tech companies including Meta, Microsoft, Google and ByteDance delve into the possibilities of immersive technologies such as virtual, augmented and mixed reality (captured by the umbrella term ‘extended reality’). Most recently, Apple has propelled the technology forward by announcing that it is stepping into the era of ‘spatial computing’ with similar immersive products.

Immersion is fundamental to the concept of the ‘metaverse’, which, despite mixed responses from the public, continues to prompt broader discussion about the future of connectedness in the digital landscape.

Online service providers and developers looking to leverage the opportunities presented by immersive technologies need to understand the current regulatory landscape and be prepared for change as development and adoption of these technologies accelerates.

To learn more about how online safety regulations apply broadly in Australia, see the first article in our Online Safety Series.

What are the risks and implications of immersive technologies?

Immersive technologies allow individuals to embody online experiences and feel present with others in virtual spaces, particularly in the context of online social platforms.

A heightened level of immersion enables greater connectivity and interaction between platform users. Accordingly, the impact of interactions and experiences can be far more intense, and the consequences more severe. Certain harmful behaviours, such as harassment and assault, translate into immersive environments in a way that is not possible through traditional browser-based technology.

Immersive social platforms and virtual worlds also have the potential to facilitate a range of harmful activities online, including the proliferation of harmful content, image-based abuse, grooming, cyberbullying and other types of abuse. Already, many of these platforms (1) offer public virtual spaces where users have little control over deliberate interactions with others, and (2) are often populated by a significant proportion of children and other vulnerable people. These factors increase the risk of harm in immersive worlds.

Such harms can, of course, arise in any context in which immersive technologies are deployed, including in the workplace or in private settings.

Regulation of immersive interactions in Australia

As noted in Article 1, online companies have an obligation to ensure the safe use of their platforms and compliance with Australia’s Online Safety regime, which broadly consists of the Online Safety Act 2021 (Cth), the Basic Online Safety Expectations and the registered Industry Codes.

The operation of the Online Safety Act is broad and would capture online service providers hosting or operating immersive social platforms.

Currently, the Online Safety Act deals primarily with individual pieces of harmful material or content posted online and provides mechanisms to ensure that content can be efficiently taken down. This would apply in the same way to immersive platforms caught by the Act that facilitate the sharing of harmful online content. However, in its current form, the Act does not yet specifically address real-time, synchronous interactions between users.

Immersive platform providers are also held accountable for providing a safe online environment by meeting the expectations set out in the Online Safety (Basic Online Safety Expectations) Determination 2022. Companies need to take proactive steps to ensure they are meeting these expectations, particularly in the context of immersive social platforms where interactions between users are possible. Achieving this will likely require novel solutions and considerations.

In addition, the recently registered Industry Codes (which will come into force on 16 December 2023) include prescribed risk assessment and compliance measures that service providers must take in respect of class 1 material (including child sexual exploitation material and terrorist material), or else be subject to hefty financial penalties. Most relevant to companies in the immersive technology space would be the Industry Codes in respect of social media services, app distribution services, and equipment providers.

The eSafety Commissioner is aware of the serious abuse that could be facilitated by immersive technologies. The Commissioner has previously released a position statement on immersive technologies, which sets out the online safety risks it has identified with these technologies and how it is championing a ‘proactive harm-prevention approach’ in Australia as part of its Safety by Design initiative. The Safety by Design approach is embedded in the compliance measures set by the Industry Codes.

Did you know?

The operation of the Online Safety Act must be independently reviewed three years after its commencement, which gives the regime scope to adapt to novel harms and issues arising from emerging technologies such as immersive tech and the metaverse.

What next?

As the cross-over between social platforms and immersive technologies continues to evolve, compliance with Australia’s Online Safety regime and scrutiny of how platforms ensure a safe online environment will come to the fore.

The application, adequacy and suitability of existing laws and regulations such as the Online Safety Act will be tested as complex issues surrounding immersive technologies, novel forms of online interaction and the metaverse come to light.

Service providers should be prepared to future-proof their products and services to adapt to the evolving landscape of both law and technology.

Key contacts

Kwok Tang photo

Kwok Tang

Partner, Sydney

Kwok Tang
Tania Gray photo

Tania Gray

Partner, Sydney

Tania Gray
Christine Wong photo

Christine Wong

Partner, Sydney

Christine Wong
Susannah Wilkinson photo

Susannah Wilkinson

Director, Generative AI (Digital Change), Brisbane

Susannah Wilkinson
Rachel Holland photo

Rachel Holland

Solicitor, Sydney

Rachel Holland
Eric Kong photo

Eric Kong

Solicitor, Sydney

Eric Kong

Stay in the know

We’ll send you the latest insights and briefings tailored to your needs

Australia Data Protection and Privacy Technology, Media and Telecommunications AI and Emerging Technologies Technology, Media and Telecoms Kwok Tang Tania Gray Christine Wong Susannah Wilkinson Rachel Holland Eric Kong