Regulatory Map

Online Safety Act 2021 (Cth)

Compilation No. 3 · As at 11 December 2024 · Federal Register of Legislation C2021A00076

Objects & Scope of the Act

Part 1 · ss 3–24
The Act's objects are to improve and promote online safety for Australians (s 3). It applies to a broad range of online services with an Australian connection — services accessible to Australians, provided from Australia, or where relevant conduct has an Australian nexus. The Crown is bound (s 22). The Act requires regard to the UN Convention on the Rights of the Child (s 24).
Objects of the Act s 3
Core objects: to improve and promote online safety for Australians, including by protecting Australians online, promoting responsible online services, addressing harmful content efficiently, and ensuring adequate complaints and reporting mechanisms.
Definition of "Material" s 5
Defined broadly: text, data, speech, music, other sounds, visual images, moving images, or any other form, or any combination of forms.
Social Media Service s 13
Service whose sole or primary purpose is to enable online social interaction between two or more end-users, via posts, sharing, or real-time communication.
Relevant Electronic Service s 13A
Service delivered via internet carriage service, whether or not it also delivers via other means. Captures messaging, email, gaming, dating, and other interactive services.
Designated Internet Service s 14
Internet content hosting and websites. Residual category capturing services that are not already a social media service (SMS) or relevant electronic service (RES) — e.g. websites, file-sharing, and cloud storage services.
Hosting Service Provider s 17
Provider of a service hosting internet content in Australia. Subject to removal notices and other obligations.
Internet Service Provider (ISP) s 19
Provider supplying internet carriage service to the public. Subject to blocking notices for abhorrent violent material.
Consent s 21
For intimate images: must be free, voluntary, informed, and specific. Consent may be withdrawn at any time; consent for one purpose does not equal consent for another.

eSafety Commissioner

Part 2 · ss 25–28
The Act establishes the eSafety Commissioner as Australia's independent online safety regulator. The Commissioner has broad functions spanning education, investigation, enforcement, and policy development.
Commissioner — Role & Appointment s 26
Statutory office holder appointed by the Minister. Operates independently within the framework of the Australian Communications and Media Authority (ACMA).
Functions of the Commissioner s 27
16+ enumerated functions including online safety promotion, complaint handling, industry engagement, research, and advising the Minister.
Powers of the Commissioner s 28
Power to do all things necessary or convenient for performing functions. Includes issuing notices, conducting investigations, and making determinations.

Regulated Harms & Material Categories

Parts 5–9
The Act addresses six core categories of harmful online content. Each has its own regulatory scheme with tailored enforcement tools. The categories are not mutually exclusive — material may fall into more than one category.
Cyberbullying — Australian Child Part 5 · ss 6, 64–73
Material targeted at an Australian child that is likely, having regard to its content, to have the effect of seriously threatening, seriously intimidating, seriously harassing, or seriously humiliating the child.
Enforceable via: Removal notices · End-user notices · Formal warnings
Non-Consensual Intimate Images Part 6 · ss 15–16, 74–86
Intimate images posted without the consent (or purported consent) of the depicted person. Covers real images, altered images (deepfakes), and digitally created imagery.
Enforceable via: Civil penalties · Removal notices · Remedial directions
Cyber Abuse — Australian Adult Part 7 · ss 7, 87–93
Material targeted at an Australian adult that was intended to cause serious harm, and that an ordinary reasonable person would regard as menacing, harassing, or offensive in all the circumstances.
Enforceable via: Removal notices · Formal warnings
Abhorrent Violent Conduct Material Part 8 · ss 9, 94–104
Material that depicts abhorrent violent conduct: terrorist acts, murder or attempted murder, torture, rape, or kidnapping. Subject to rapid website blocking during crisis events (e.g. Christchurch-style attacks).
Enforceable via: Blocking requests · Blocking notices to ISPs
Class 1 Material (Illegal Content) Part 9 Div 2 · s 106
Material classified or likely to be classified RC (Refused Classification) under the National Classification Scheme. Includes child sexual abuse material (CSAM), pro-terror content, extreme violence, and crime instruction.
Enforceable via: Removal notices · AFP referral (if hosted in AU)
Class 2 Material (Restricted Content) Part 9 Divs 3–4 · s 107
Material classified or likely to be classified X 18+, R 18+, or MA 15+ under the Classification Scheme. Not illegal per se but harmful to children if unrestricted. Must be behind a restricted access system or removed.
Enforceable via: Removal notices · Remedial notices · Restricted access system
Classification linkage: Class 1 and class 2 are defined by reference to the National Classification Code and Classification Guidelines under the Classification (Publications, Films and Computer Games) Act 1995 (Cth). The Phase 2 Industry Codes build upon these base classifications, defining subcategories such as class 1C, class 2A, class 2B, self-harm material, high-impact violence, and others.

Regulatory & Enforcement Tools

Parts 5–14
1 · Removal & Takedown
Removal notices — cyberbullying SMS · RES · DIS · HOST
Part 5 · ss 65–67
24-hour compliance. May be issued to the service provider or the hosting service provider.
Removal notices — intimate images SMS · RES · DIS · HOST · END USER
Part 6 · ss 77–79
24-hour compliance. Can target providers, hosts, and the posting end-user.
Removal notices — adult cyber abuse SMS · RES · DIS · HOST · END USER
Part 7 · ss 88–90
24-hour compliance. Subject to the additional, higher "serious harm" threshold.
Removal notices — class 1 material SMS · RES · DIS · HOST
Part 9 Div 2 · ss 109–113A
24-hour compliance. Material classified RC or likely to be classified RC. AFP referral if hosted in Australia.
Removal/remedial notices — class 2 material SMS · RES · DIS · HOST
Part 9 Divs 3–4 · ss 114–123A
Remove or place behind restricted access system. 24-hour compliance for removal; remedial notices allow RAS alternative.
Link deletion notices SEARCH ENGINES
Part 9 Div 5 · ss 124–127
Commissioner can require search engine providers to stop providing links to material found to be class 1 or class 2.
App removal notices APP STORES
Part 9 Div 6 · ss 128–131
Commissioner can require app distribution services to remove an app that facilitates access to class 1 material.
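The 24-hour compliance window recurs across the removal-notice schemes above, and most notices permit the Commissioner to allow a longer period. As a purely illustrative sketch (the function and variable names are invented, not drawn from the Act), the deadline calculation looks like this:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical model: most removal notices require compliance within
# 24 hours, or such longer period as the Commissioner allows.
COMPLIANCE_WINDOW = timedelta(hours=24)

def compliance_deadline(issued_at: datetime,
                        extension: timedelta = timedelta(0)) -> datetime:
    """Latest time for complying with a removal notice (illustrative)."""
    return issued_at + COMPLIANCE_WINDOW + extension

# A notice issued 11 Dec 2024 at 09:30 UTC must be complied with by
# 12 Dec 2024 at 09:30 UTC, absent an extension.
issued = datetime(2024, 12, 11, 9, 30, tzinfo=timezone.utc)
print(compliance_deadline(issued))  # 2024-12-12 09:30:00+00:00
```

Timezone-aware datetimes are used so the window is unambiguous regardless of where the provider is located.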
2 · Website Blocking
Blocking requests — abhorrent violent material ISPs
Part 8 Div 2 · ss 95–98
Commissioner requests ISPs block domain/URL providing abhorrent violent material. Voluntary compliance. Max 3 months.
Blocking notices — abhorrent violent material ISPs
Part 8 Div 3 · ss 99–103
Mandatory blocking notice. Civil penalty for non-compliance. Max 3 months, revocable.
3 · End-User Notices & Individual Liability
End-user notice — cyberbullying END USER
Part 5 · ss 70–72
Notice to the person who posted cyberbullying material, requiring cessation. Formal warning for non-compliance.
Civil penalty — posting intimate images END USER
Part 6 · s 75
Civil penalty of up to 500 penalty units for posting non-consensual intimate image. Extended to deepfakes and digitally created imagery.
Remedial directions — intimate images END USER
Part 6 · s 83
Commissioner can direct a person to take reasonable steps to remove intimate image from other locations, destroy copies, or apologise.
4 · Basic Online Safety Expectations
Basic Online Safety Expectations (BOSE) SMS · RES · DIS
Part 4 · ss 45–48
Minister determines expectations by legislative instrument. Sets baseline behavioural standards for all regulated services.
Core expectations ALL SERVICES
Part 4 · s 46
Mandatory core expectations that must be included: taking reasonable steps to minimise class 1 material, ensuring safe use by children, and reporting to relevant agencies.
BOSE reporting — periodic & non-periodic NOTIFIED PROVIDERS
Part 4 Div 3 · ss 49–63
Commissioner can require transparency reports on compliance. Formal warnings and civil penalties for non-compliance.
5 · Social Media Minimum Age
Minimum age — civil penalty AGE-RESTRICTED SMS
Part 4A · s 63D
Provider must take reasonable steps to prevent age-restricted users (under 16) from holding accounts. Civil penalty for non-compliance.
Data restrictions & privacy protections
Part 4A · ss 63DA–63F
Restrictions on data collection for age assurance. Must not collect government ID numbers or biometric data. Privacy protections for information collected.
Commissioner information-gathering powers
Part 4A Div 4 · ss 63G–63H
Commissioner may require platform providers to give information about compliance with minimum age requirements.
6 · Industry Codes & Standards
Industry codes — registration & compliance INDUSTRY SECTIONS
Part 9 Div 7 Subdiv C · ss 140–144
Industry bodies develop codes addressing class 1/2 material. The Commissioner registers a code if satisfied it provides "appropriate community safeguards"; once registered, a code is enforceable.
Industry standards — Commissioner-determined INDUSTRY SECTIONS
Part 9 Div 7 Subdiv D · ss 145–148
Commissioner can determine standards if codes fail/are inadequate. Higher regulatory intervention. Enforceable with civil penalties.
Service provider rules SMS · RES · DIS
Part 10 · ss 150–162
Minister can make rules requiring service providers to comply with specific standards regarding class 1/2 material. Legislative instruments.
7 · Enforcement & Investigation
Formal warnings
Various · throughout Act
Commissioner may issue formal warnings for breaches. Published on the Register. Preliminary step before escalation.
Civil penalties
Various · throughout Act
Ranging from 500 to 500,000 penalty units (individuals to bodies corporate). Applies to: failure to comply with notices, non-consensual intimate image (NCII) posting, industry code/standard breaches, and social media minimum age (SMMA) non-compliance.
Injunctions FEDERAL COURT
Part 12 · ss 180–189
Commissioner can seek Federal Court injunctions restraining conduct, requiring action, or preventing anticipated breaches.
Enforceable undertakings
Part 13 · ss 190–194
Commissioner can accept written undertakings. Breach enforceable in Federal Court. Alternative to litigation.
Investigation powers
Part 14 · ss 195–205
Search warrants, information-gathering notices, examination on oath, production of documents. Backed by penalties for non-compliance.
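Civil penalties in the Act are expressed in penalty units rather than dollars. The dollar value of a Commonwealth penalty unit is set by s 4AA of the Crimes Act 1914 (Cth) and is periodically indexed; the conversion below assumes A$330 per unit (the figure current around the compilation date), which should be verified before use:

```python
# Sketch of penalty-unit arithmetic. A$330 per Commonwealth penalty
# unit is an assumption: the value is set by Crimes Act 1914 (Cth)
# s 4AA and is periodically indexed. Names are illustrative.
PENALTY_UNIT_AUD = 330

def penalty_amount(units: int, unit_value: int = PENALTY_UNIT_AUD) -> int:
    """Maximum pecuniary penalty in AUD for the given penalty units."""
    return units * unit_value

print(penalty_amount(500))      # 165000 (e.g. the s 75 NCII civil penalty)
print(penalty_amount(500_000))  # 165000000
```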
8 · Complaints, Disclosure & Review
Complaints framework
Part 3 · ss 29–43
Divisions for: cyberbullying (Div 2), intimate images (Div 3), cyber abuse (Div 4), and online content scheme (Div 5). Complaints may trigger investigations and enforcement.
Disclosure of information
Part 15 · ss 206–218
Commissioner may disclose information to: Minister, ACMA, law enforcement, Royal Commissions, teachers/principals, parents/guardians. Subject to statutory protections.
Review of decisions
Part 16 · ss 220–220A
Decisions reviewable by Administrative Review Tribunal. Internal review available for certain decisions. Statutory protection from civil/criminal proceedings for Commissioner.

Regulatory Architecture — Escalation Framework

Design Principle
The Act establishes a graduated, escalating regulatory framework. This reflects the co-regulatory model: industry self-regulation is the preferred first step, with increasing levels of government intervention available where self-regulation fails.
Level 1 — Basic Online Safety Expectations BASELINE
Non-binding expectations set by the Minister. Service providers can be required to report on compliance. Formal warnings for non-reporting.
Part 4 · Transparency pressure
Level 2 — Industry Codes CO-REGULATION
Industry-developed, Commissioner-registered codes. Mandatory and enforceable once registered. Commissioner can request codes be developed.
Part 9 Div 7 Subdiv C · Including Phase 2 SMS Codes
Level 3 — Industry Standards DIRECT REGULATION
Commissioner-determined standards where codes fail or are inadequate. Imposed without industry agreement. Civil penalties for non-compliance.
Part 9 Div 7 Subdiv D · E.g. RES/DIS Class 1A/1B Standards
Level 4 — Service Provider Rules MINISTERIAL RULES
Minister can make rules as legislative instruments imposing direct obligations. Highest level of systemic regulation under the Act.
Part 10 · Legislative instruments
Level 5 — Direct Enforcement ENFORCEMENT
Removal notices, blocking notices, civil penalties, injunctions, enforceable undertakings. Commissioner exercises powers directly against providers, hosts, and individuals.
Parts 5–9, 11–13 · Complaint-triggered or own-motion
Where the Phase 2 Codes sit: The Phase 2 SMS Codes (Schedules 4 & 4A) are Level 2 — Industry Codes registered under Part 9, Division 7, Subdivision C of this Act. Once registered by the Commissioner, they become mandatory for all participants in the relevant section of the online industry. The Phase 2 Codes specifically address class 1C and class 2 material on social media services.
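The graduated design above can also be sketched as a simple ordered structure. This is a hypothetical model for illustration only; the labels paraphrase the level headings and are not statutory terms:

```python
# Illustrative data model of the five-level escalation framework.
# Level numbers and labels mirror the headings above; they are
# descriptive, not terms used in the Act itself.
ESCALATION = [
    (1, "Basic Online Safety Expectations", "Part 4"),
    (2, "Industry codes", "Part 9 Div 7 Subdiv C"),
    (3, "Industry standards", "Part 9 Div 7 Subdiv D"),
    (4, "Service provider rules", "Part 10"),
    (5, "Direct enforcement", "Parts 5-9, 11-13"),
]

def instruments_at_or_above(level: int) -> list[str]:
    """Regulatory instruments in play once escalation reaches `level`."""
    return [name for lvl, name, _ in ESCALATION if lvl >= level]

print(instruments_at_or_above(4))  # ['Service provider rules', 'Direct enforcement']
```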