Online Safety Act (Australia)
Overview
The Online Safety Act 2021 is a comprehensive Australian law designed to improve online safety for all Australians, with particular protections for children. It establishes a centralized regulator, the eSafety Commissioner, with broad powers to compel content removal, enforce industry standards, and coordinate safety efforts across digital platforms. The Act received Royal Assent on 23 July 2021, commenced on 23 January 2022, and is periodically amended.
Regulation Summary
Timeline
- 23 July 2021 – Online Safety Act received Royal Assent
- 23 January 2022 – Online Safety Act commenced
- 2022–2024 – Industry codes and standards registered; the Online Safety Amendment (Social Media Minimum Age) Act 2024 introduces an under-16 minimum age for social media accounts
- Ongoing – eSafety Commissioner develops and enforces industry codes
What Businesses Are Affected
- Social media platforms and age-restricted platforms
- Relevant electronic services (e.g. messaging, gaming, chat)
- Designated internet services (e.g. forums, websites)
- Hosting service providers and app distributors
- Internet service providers
Exemptions
- Services that are not accessible by Australian users
- Exempt services specified by legislative rule
- Content related to court, parliamentary or official inquiries
Responsibilities for Businesses
- Remove harmful content upon notice (cyberbullying, abuse, violent material)
- Cease hosting non-consensual intimate images upon notice
- Cooperate with complaints investigations
- Adhere to Basic Online Safety Expectations (BOSE) if applicable
- Provide clear reporting mechanisms for users
- Prevent access by children to Class 2 material
- Design and maintain systems for timely takedown of harmful content
Specific Responsibilities for Website Owners
- Implement and display a reporting channel for users to flag:
  - Cyberbullying or cyber-abuse
  - Non-consensual intimate images
  - Class 1 and Class 2 material
  - Abhorrent violent material
- Comply with removal notices within 24 hours, unless the Commissioner allows a longer period
- Notify users of action taken and offer appeal processes
- Display terms of use that address harmful content
Additional Requirements
- Social media platforms must take reasonable steps to prevent underage users (under 16) from holding accounts.
- Platforms must not require users to provide government-issued identification as the sole means of demonstrating their age when enforcing age restrictions.
- Providers must cooperate with the Commissioner’s requests for reporting and data.
- Platforms must comply with industry codes or standards developed under the Act.
Individual Rights
- Users may file complaints or objections about harmful content.
- Individuals can object to the posting of intimate images without consent.
- Right to appeal or counter-notify following takedown.
- Privacy protections apply to data collected for compliance.
Enforcement
- Enforcing authority: eSafety Commissioner
- Regulatory Mechanism:
- Civil penalties for non-compliance with takedown notices, safety expectations, or minimum-age requirements
- Public warnings, infringement notices, enforceable undertakings, and injunctions
- Penalties:
- Civil penalties of up to 500 penalty units per contravention (the dollar value of a penalty unit is indexed; higher multipliers apply to bodies corporate)
- Up to 30,000 penalty units for breaches of the under-16 minimum-age requirements (approx. AUD $9.9 million at current rates; up to five times that amount for bodies corporate)
Questions?
If you would like to learn more, our compliance experts are happy to support you.
Leave us a message: support@clym.io
+1 980 446 8535 +1 866 275 2596