Digital platforms like X Corp (formerly Twitter), which facilitate social interaction and public discourse around the world, face increasing regulatory scrutiny from governments seeking to address harmful content, ensure user safety, and hold tech giants accountable. Yet efforts to regulate these platforms are fraught with complexity. The global nature of their operations raises questions about the limits of jurisdiction, the reach of enforcement powers and the right to freedom of expression, and has generated widespread and sometimes heated controversy over government officials restricting access to certain material. Australia, like many jurisdictions, must grapple with how to hold platforms responsible for the content they host whilst balancing these considerations.
Australia’s Online Safety Act 2021 (Cth) (the OS Act) establishes a set of Basic Online Safety Expectations (BOSE) for online service providers. These expectations require providers to take reasonable steps to ensure user safety, reduce harmful content and report on compliance. The OS Act empowers the eSafety Commissioner to issue a removal notice for certain material, to require internet service providers to restrict access to this material, and to issue civil penalties for non-compliance. The OS Act has extraterritorial reach, applying to platforms accessible within Australia, even if they are not based in the country.
The Australian Federal Court considered the legal meaning and scope of a removal notice to X Corp last year in eSafety Commissioner v X Corp [2024] FCA 499, a matter which garnered significant public interest and led the Court to maintain a publicly available online file. The content that was the subject of the removal notice was a video, accessible via URLs hosted on X, depicting a violent stabbing attack on Bishop Mar Mari Emmanuel, leader of the Assyrian Orthodox Christ the Good Shepherd Church, during a livestreamed church service in Wakeley, Sydney, on 15 April 2024. The footage showed a teenage male rushing at the Bishop and attacking him, raising his arm and striking the Bishop several times with a downward motion, and the Bishop falling backwards. It is not clear from the video that a knife is being used, although that can be inferred. The reactions of witnesses can be heard.
On 16 April 2024, a delegate of the eSafety Commissioner issued a removal notice under section 109 of the OS Act to X Corp, requiring it to take all reasonable steps to ensure the removal of the material from its platform within 24 hours. X Corp responded by geo-blocking the URLs in Australia, thereby preventing access to the material by users with Australian IP addresses. However, the company did not take steps to prevent access by Australian users employing virtual private networks (VPNs) or other circumvention tools, nor did it remove the content entirely from its platform or limit its visibility through other technical means. The Commissioner raised concerns regarding the platform’s compliance under the OS Act, given the nature of the material – which the New South Wales Police Commissioner had described as a terrorist act.
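To illustrate the technical point underlying the compliance dispute, the sketch below shows, in very simplified form, how IP-based geo-blocking of the kind X Corp applied typically works: the service looks up the country associated with the requesting IP address and withholds the content if that country is Australia. This is a minimal illustrative sketch, not X Corp’s actual implementation; the lookup table, URL list and function names are assumptions for demonstration only. The key point is that a user whose traffic exits a VPN endpoint outside Australia presents a non-Australian IP address and therefore passes the check, which is why the Commissioner argued that geo-blocking alone did not amount to “removal”.

```python
# Minimal illustrative sketch of IP-based geo-blocking (not X Corp's real system).
# Real services resolve the client IP against a commercial geo-IP database; a small
# hard-coded table is used here purely for demonstration.

BLOCKED_COUNTRIES = {"AU"}            # where the removal notice requires inaccessibility
BLOCKED_URLS = {"/status/example"}    # placeholder standing in for the 65 notified URLs

IP_TO_COUNTRY = {                     # hypothetical geo-IP lookup table
    "1.128.0.10": "AU",               # a direct connection from an Australian ISP
    "203.0.113.7": "US",              # a VPN exit node located outside Australia
}

def is_withheld(url: str, client_ip: str) -> bool:
    """Return True if the request should receive a 'content withheld in your country' response."""
    country = IP_TO_COUNTRY.get(client_ip, "UNKNOWN")
    return url in BLOCKED_URLS and country in BLOCKED_COUNTRIES

# A user in Australia connecting directly is blocked...
assert is_withheld("/status/example", "1.128.0.10") is True
# ...but the same user routed through an overseas VPN exit node is not,
# because the platform only ever sees the VPN's non-Australian IP address.
assert is_withheld("/status/example", "203.0.113.7") is False
```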
The eSafety Commissioner applied to the Court for a declaration that X Corp had not complied with the notice, a pecuniary penalty, and an injunction that would, in effect, require X Corp to carry out certain removal steps.
The arguments raised by the eSafety Commissioner included that X Corp’s actions did not satisfy the statutory requirement to remove the material, as defined in section 12 of the OS Act, which provides that material is considered removed only when it is neither accessible to, nor delivered to, any end-users in Australia. It was submitted that X Corp was capable of taking additional technical measures, such as removing or restricting the material entirely, obscuring it with a warning notice, or reducing its discoverability on the platform. It was also submitted that geo-blocking was insufficient in light of the ease with which Australian users could circumvent such restrictions.
The arguments raised by X Corp included that its actions—geo-blocking the URLs in Australia—were reasonable and sufficient under the OS Act, and that global removal or further restrictions were not reasonable. It emphasised the Bishop’s consent to, and support for, the publication of the video, freedom of speech, the video’s relevance to public discourse and its availability on other platforms.
The case raised interesting issues, notably the extraterritorial reach of Australian online safety laws and the implications for global internet governance and freedom of expression. However, on this occasion the Court’s role was confined to construing the legal meaning and scope of the removal notice issued under section 109 of the OS Act, which authorised the eSafety Commissioner to issue a notice requiring a service provider to take “all reasonable steps” to ensure specified material is “removed” from the service—defined under section 12 to mean that it is no longer accessible to or delivered to “any end-user in Australia”:
[40] The policy questions underlying the parties’ dispute are large. They have generated widespread and sometimes heated controversy. Apart from questions concerning freedom of expression in Australia, there is widespread alarm at the prospect of a decision by an official of a national government restricting access to controversial material on the internet by people all over the world. It has been said that if such capacity existed it might be used by a variety of regimes for a variety of purposes, not all of which would be benign. The task of the Court, at least at this stage of the analysis, is only to determine the legal meaning and effect of the removal notice. That is done by construing its language and the language of the Act under which it was issued. It is ultimately the words used by Parliament that determine how far the notice reaches.
[41] Section 109(1), which is set out above, determines what a removal notice is and does. The only notice that may be given is a notice “requiring the provider” to “take all reasonable steps to ensure the removal of the material from the service”. The Commissioner chooses the material to which the notice is to apply (based on whether it is “class 1 material”) but does not have a discretion concerning how stringent or widespread the restrictions on access to that material are to be. The notice necessarily requires “all reasonable steps” to “ensure the removal” of the material.
[42] “Removed”, as noted above, is defined by s 12 of the OS Act. Section 18A of the Acts Interpretation Act requires (as common sense would suggest) that other grammatical forms of the same word be given corresponding meanings. “Removal” of material from a social media platform is a process that results in the material being “removed” in the defined sense: that is, a state of affairs where “the material is neither accessible to, nor delivered to, any of the end-users in Australia using the service”.
[43] The phrase “any of the end-users in Australia” must be read in context.
- One aspect of the context is s 23, which provides that the OS Act extends to acts, omissions, matters and things outside Australia.
- A second aspect of the context is the objects of the OS Act, set out in s 3, which are to promote and improve “online safety for Australians”. The reference to “Australians” suggests that the Act directs its attention to all Australian residents, not only those who use Australian service providers to connect to the internet.
- A third aspect of the context is the Explanatory Memorandum to the Bill for the OS Act (the Online Safety Bill 2021 (Cth)). The Explanatory Memorandum does not cast any direct light on the intended scope of a removal notice under s 109 (other than by observing that the section was intended to apply whether or not the relevant service is provided from within Australia). It notes that the provisions in what became Part 9 of the OS Act were substantially a re-enactment of earlier provisions in Schedules 5 and 7 to the Broadcasting Services Act 1992 (Cth) (the BS Act). Within the time frame of an urgent interlocutory decision, the extent to which I have been able to do my own research on the legislative history is limited. With the parties (both represented by competent counsel) not having submitted that any part of the legislative history would assist me in resolving the constructional issues as to what a removal notice requires to be done, I have proceeded on the basis that analysis of the former provisions of the BS Act would not be illuminating.
[44] The breadth with which the objects of the OS Act are expressed indicates that “any of the end-users in Australia” in s 12 should not be read narrowly. I was not taken to anything in the Act suggesting that the location of the IP address through which a person physically located in Australia connects with the internet was intended to make a difference as to whether they were to be denied access to class 1 material by operation of a removal notice. The Act does not use concepts derived from the structure of the internet, in lieu of ordinary geographical or territorial notions, to describe where people are. I have concluded that the phrase was intended to have its ordinary meaning and that “removal” therefore means making the material inaccessible to all users physically located in Australia. The original location of the relevant provisions in the BS Act, which regulates traditional broadcast media, tends (albeit not very strongly) to confirm this conclusion.
The Court considered that the context of the OS Act supported construing “any end-user in Australia” by reference to users’ physical location, rather than IP routing or service origin. Accordingly, “removal” required the material to be inaccessible to all users physically located in Australia.
In relation to what constituted “reasonable steps”, the Court considered that although the global removal of URLs may be a reasonable course of action for X Corp as a matter of business discretion, it did not follow that it was a required step under section 109. The eSafety Commissioner’s construction—that reasonable steps included global removal to ensure local inaccessibility—would confer upon the Commissioner powers with extraterritorial consequences incompatible with the comity of nations. Such a reading would require clear legislative language, which was absent. Ultimately, it was found that “reasonable steps” required by a removal notice issued under section 109 did not include the steps which the Commissioner sought to compel X Corp to take, such as a global removal:
[45] What the removal notice requires, therefore, is “all reasonable steps to ensure” that the 65 URLs are not accessible to any users physically in Australia. What is meant by “reasonable” steps is therefore critical.
[46] I have no doubt that removing the 65 URLs from its platform altogether would be a reasonable step for X Corp to take, in the sense that a decision by X to take that step could readily be justified. There is uncontroversial evidence that this is what other social media platforms have done, and that X Corp would not be in breach of any United States law if it took this step. However, this is not the test. The OS Act pursues a policy. It is not bounded by the policies of service providers or their contractual relationships with their users. Section 109 imposes its requirements regardless of the wishes of providers and of individual users.
[47] The qualifier “reasonable” should therefore be understood as limiting what must be done in response to a notice to the steps that it is reasonable to expect or require the provider to undertake. That understanding is consistent with how duties arising under the general law to take “reasonable” steps commonly work. Identification of the steps that are “reasonable” in this sense may involve consideration of expense, technical difficulty, the time permitted for compliance (which may be short: see s 109(2)) and the other interests that are affected. It is the last of these factors that is the focus of the parties’ disagreement.
[48] The argument that making the 65 URLs inaccessible to all users of X Corp’s platform everywhere in the world is not a step that it is “reasonable” to require X Corp to perform in order to ensure that the URLs are inaccessible to Australian users (and therefore is not a step required by the removal notice) is powerful.
[49] If s 109 of the OS Act provided for a notice imposing such a requirement, it would clash with what is sometimes described as the “comity of nations” in a fundamental manner. That concept, and the principle of statutory construction that arises from it, were recently discussed by reference to earlier cases in BHP Group Ltd v Impiombato [2022] HCA 33; 96 ALJR 956 at [23]-[32] (Kiefel CJ and Gageler J). It is not limited to the familiar presumption against the extraterritorial operation of statutes and is therefore not excluded here by the express provision for extraterritorial operation in s 23 of the OS Act. It is useful to set out their Honours’ recitation of the authorities at [27]-[31].
Exposition of the common law presumption in play in Morgan v White and in Meyer Heine can be traced in Australia to Jumbunna Coal Mine, No Liability v Victorian Coal Miners’ Association. There O’Connor J said:
Most Statutes, if their general words were to be taken literally in their widest sense, would apply to the whole world, but they are always read as being prima facie restricted in their operation within territorial limits. Under the same general presumption every Statute is to be so interpreted and applied as far as its language admits as not to be inconsistent with the comity of nations or with the established rules of international law: Maxwell on Statutes, 3rd ed, p 200.
Plainly, O’Connor J did not see the implied restriction on the territorial operation of a statute to which he referred in the first sentence as freestanding but rather as a reflection of the “general presumption” which he expressed in the second sentence with reference to Maxwell on Statutes. There, the presumption appeared in the precise terms adopted by O’Connor J under the heading “Presumption against a Violation of International Law”.
In Barcelo v Electrolytic Zinc Co of Australasia Ltd, Dixon J expressed the presumption in the same language drawn from Maxwell on Statutes as had been adopted by O’Connor J in Jumbunna. His Honour did so interchangeably with language drawn from 19th century English authority to the effect that “[i]t is always to be understood and implied that the legislature of a country is not intending to deal with persons or matters over which, according to the comity of nations, the jurisdiction properly belongs to some other sovereign or State”.
Dixon J returned to the presumption in Wanganui-Rangitikei Electric Power Board v Australian Mutual Provident Society. The “well settled rule of construction”, his Honour there explained, is that “an enactment describing acts, matters or things in general words, so that, if restrained by no consideration lying outside its expressed meaning, its intended application would be universal, is to be read as confined to what, according to the rules of international law administered or recognized in our Courts, it is within the province of our law to affect or control”.
In R v Foster; Ex parte Eastern and Australian Steamship Co Ltd, Dixon CJ expressed the presumption yet again. He did so, more pithily, in terms which he said were appropriate to be applied to a Commonwealth statute after the Statute of Westminster Adoption Act. He described it as “a presumption which assumes that the legislature is expressing itself only with respect to things which internationally considered are subject to its own sovereign powers”.
(Footnotes omitted.)
[50] If given the reach contended for by the Commissioner, the removal notice would govern (and subject to punitive consequences under Australian law) the activities of a foreign corporation in the United States (where X Corp’s corporate decision-making occurs) and every country where its servers are located; and it would likewise govern the relationships between that corporation and its users everywhere in the world. The Commissioner, exercising her power under s 109, would be deciding what users of social media services throughout the world were allowed to see on those services. The content to which access may be denied by a removal notice is not limited to Australian content. Insofar as the notice prevented content being available to users in other parts of the world, at least in the circumstances of the present case, it would be a clear case of a national law purporting to apply to “persons or matters over which, according to the comity of nations, the jurisdiction properly belongs to some other sovereign or State”. Those “persons or matters” can be described as the relationships of a foreign corporation with users of its services who are outside (and have no connection with) Australia. What X Corp is to be permitted to show to users in a particular country is something that the “comity of nations” would ordinarily regard as the province of that country’s government.
[51] The potential consequences for orderly and amicable relations between nations, if a notice with the breadth contended for were enforced, are obvious. Most likely, the notice would be ignored or disparaged in other countries. (The parties on this application tendered reports by experts on US law, who were agreed that a US court would not enforce any injunction granted in this case to require X Corp to take down the 65 URLs.)
[52] Section 23(2) of the OS Act extends the operation of its provisions to “acts, omissions, matters and things outside Australia”. It confirms that X Corp is in breach of the removal notice if it fails to take some “reasonable step” notwithstanding that the act or omission constituting that failure occurs overseas. However, s 23(2) does not control the meaning of “all reasonable steps”. A clear expression of intention would be necessary to support a conclusion that Parliament intended to empower the Commissioner to issue removal notices with the effect for which she contends.
[53] The result is that, read in context and in the light of normal principles of statutory construction, the “reasonable steps” required by a removal notice issued under s 109 do not include the steps which the Commissioner seeks to compel X Corp to take in the present case.
[54] For these reasons I have come to the view, based on the arguments advanced at this interlocutory stage, that the Commissioner will not succeed in establishing that compliance with the removal notice entails blocking access to the 65 URLs by all users of X Corp. It follows that there is not a prima facie case for the grant of a final injunction in the terms sought.
The Court declined to grant the interlocutory injunction. Having regard to the principle of the comity of nations and the fact that the relief sought could affect millions of users unconnected with the litigation, such an order was not justified: ultimately, the eSafety Commissioner had failed to establish strong prospects of success, or good reason to think the injunction would be effective.
As to the potential effectiveness of the injunction, the Court considered, based on expert legal evidence from both sides, that a U.S. court would be unlikely to enforce an injunction requiring X Corp to take down all of the URLs. The expert U.S. lawyers agreed that the removal notice would be contrary to the First Amendment if it were imposed by a government actor in the U.S. and restricted the ability of users in the United States to access the video. They also agreed that courts in the U.S. would very likely decline to enforce an Australian court order giving effect to the removal notice, either because they would view such an order as repugnant to U.S. public policy or because they might view it as penal in character (and U.S. courts do not enforce foreign penal orders).
It was also suggested that the injunction, even if unenforceable in the U.S., could have an educative or deterrent effect; however, the Court doubted that such considerations had a proper role in the making of interlocutory orders.
The application for an injunction was refused:
[56] If the considerations relating to the comity of nations (discussed at [48]–[51] above) had not led me to the view that the Commissioner has not made out a prima facie case, the same considerations would have led me to conclude that the balance of convenience does not favour extending the interlocutory injunction in its current (or any similar) form.
[57] On the one hand the injunction, if complied with or enforced, has a literally global effect on the operations of X Corp, including operations that have no real connection with Australia or Australia’s interests. The interests of millions of people unconnected with the litigation would be affected. Justifying an interlocutory order with such a broad effect would in my view require strong prospects of success, strong evidence of a real likelihood of harm if the order is not made, and good reason to think it would be effective. At least the first and the third of these circumstances seem to be largely absent. The first is discussed above. As to the third, it is not in dispute that the stabbing video can currently be viewed on internet platforms other than X. I was informed that the video is harder to find on these platforms. The interim injunction is therefore not wholly pointless. However, removal of the stabbing video from X would not prevent people who want to see the video and have access to the internet from watching it.
[58] On the other hand, there is uncontroversial expert evidence that a court in the US (where X Corp is based) would be highly unlikely to enforce a final injunction of the kind sought by the Commissioner; and it would seem to follow that the same is true of any interim injunction to similar effect. This is not in itself a reason why X Corp should not be held to account, but it suggests that an injunction is not a sensible way of doing that. Courts rightly hesitate to make orders that cannot be enforced, as it has the potential to bring the administration of justice into disrepute.
[59] It was suggested that an injunction, even if not enforceable, could have an educative or deterrent effect. X Corp’s amenability to education and deterrence might be thought to be open to doubt. In any event, while these are sometimes important considerations in the framing of final relief, I doubt whether they have a proper role in the making of interlocutory orders.
The judgment was delivered on 13 May 2024.
Shortly before the delivery of this judgment, on 6 May 2024, X Corp challenged the removal notice in the Administrative Appeals Tribunal (AAT) on the basis that the video did not constitute “class 1 material” under the Australian classification regime (which encompasses material depicting extreme violence) and that the removal notice was therefore invalid.
On 5 June 2024, the eSafety Commissioner filed a notice of discontinuance of the whole of the Federal Court proceeding, for the stated reason that the office intended to focus on the AAT matter.[1] At the time, the eSafety Commissioner told Guardian Australia that eSafety was engaged in several legal fights with X: “Litigation across multiple locations, multiple cases, prudent use of public funds,” she said. “[X] had a phalanx of lawyers plus the most expensive barrister in Australia [Bret Walker SC].”[2]
Ultimately, the AAT proceeding was resolved by agreement. The office of the eSafety Commissioner said in a statement published on its website on 11 October 2024 that “eSafety believes that rather than test the interaction of the National Classification Scheme and the Online Safety Act in the context of this particular case, it is more appropriate to await the Federal Government’s consideration of a pending review of Australia’s statutory online safety framework”.[3]
The 2024 Statutory Review of the Online Safety Act 2021, released on 4 February 2025, proposes significant obligations on online platforms such as Facebook, Instagram, TikTok, and X. The recommendations include the introduction of a duty of care by service providers to take reasonable steps to address and prevent foreseeable harms arising from the content on their platforms—such as harm to people’s mental and physical wellbeing and threats to national security and social cohesion—with penalties for a breach of the duty of care of up to 5% of global turnover or $50 million (whichever is greater).
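As a simple illustration of the proposed penalty cap (on the assumption that it operates as the greater of the two amounts, as the Report describes), the hypothetical helper below computes the maximum exposure for a given global turnover; the function name and figures are illustrative only.

```python
# Illustrative arithmetic only: the Report proposes a maximum penalty of the greater of
# 5% of global annual turnover or AUD 50 million for a breach of the duty of care.

def proposed_max_penalty(global_turnover_aud: int) -> int:
    """Return the proposed maximum penalty (in AUD) for a breach of the duty of care."""
    return max(global_turnover_aud * 5 // 100, 50_000_000)

# For a platform with AUD 2 billion in global turnover, 5% (AUD 100 million) exceeds the floor:
assert proposed_max_penalty(2_000_000_000) == 100_000_000
# For a smaller platform with AUD 400 million turnover, the AUD 50 million floor applies:
assert proposed_max_penalty(400_000_000) == 50_000_000
```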
The Report also proposes the creation of a new Online Safety Commission, with powers to impose higher penalties for non-compliance with removal notices—up to $10 million. Platforms with significant reach would be subject to stricter compliance obligations, such as mandatory annual risk assessments, mitigation of risks, measurement of success (or otherwise) and strong transparency reporting. Further, the Report recommends requiring overseas platforms to establish a local presence in Australia.
Elon Musk, the owner and executive chairman of X Corp, has previously publicly criticised Australia’s online safety regulations. In a post on X dated 20 April 2024, for example, he referred to the eSafety Commissioner as the “Australian censorship commissar”[4], and he continues to challenge Australian online safety regulations. More recently, in May 2025, X Corp initiated legal action seeking a declaration that a new safety standard for harmful online content (the Relevant Electronic Services (RES) Standard) does not apply to it. The RES Standard covers two categories of Class 1 material linked to serious harm: Class 1A material, including child sexual exploitation and pro-terror content, and Class 1B material, such as crime, violence and drug-related content.
The action is ongoing.
In conclusion, digital platform regulation will continue to require courts and regulators to grapple with the balance between national interests, the rights of individuals and platform responsibility. While the First Amendment robustly protects free speech in the U.S., other countries, such as Australia, appear to place greater emphasis on curbing harmful content to protect the public. This tension becomes apparent when digital platforms such as X Corp argue that complying with removal notices infringes the free speech rights of users in jurisdictions with more permissive content regulations.
The decision in eSafety Commissioner v X Corp [2024] FCA 499 and the publicly available Court file can be found here.
[1] https://www.esafety.gov.au/newsroom/media-releases/statement-from-the-esafety-commissioner-re-federal-court-proceedings.
[2] Guardian Australia article entitled “X says ‘free speech has prevailed’ after eSafety commissioner drops case over Wakeley church attack posts”, published on 5 June 2024.
[3] https://www.esafety.gov.au/newsroom/media-releases/esafety-statement-administrative-appeals-tribunal-orders#:~:text=eSafety%20believes%20that%20rather%20than,Australia’s%20statutory%20online%20safety%20framework.
[4] https://www.theguardian.com/australia-news/2024/apr/23/elon-musks-x-v-australias-online-safety-regulator-untangling-the-tweet-takedown-orders.