The Australian eSafety Commissioner vs X: Testing the Effectiveness of Enforcement Powers on Platforms

Tanvi Nair, Research Fellow, ANU's Tech Policy Design Centre

2024-05-20

This article first appeared on the Australian Institute of International Affairs 'Australian Outlook'

The Australian news cycle has been dominated by the fight between the Australian eSafety Commissioner and X Corp (formerly known as Twitter).

As their battle continues to play out, the question of who controls the internet is thrust into the public debate once again.

The clash between the eSafety Commissioner and X began when the Commissioner, using her powers under the Online Safety Act 2021, issued a notice requiring the platform to take all reasonable steps to remove the extreme violent video material of the alleged terrorist act at Wakeley in Sydney on 15 April. The notice identified specific URLs where the material was located.

In response to the removal notice, X geoblocked the video content for Australians. Geoblocking, or blocking access to content for users whose IP address is from a specific location, meant that users located in Australia would be unable to access the content. The Commissioner argued that geoblocking was not sufficient, as the URLs ordered to be removed could still be accessed by Australians through VPNs. On this basis, the Commissioner issued X with a civil penalty for non-compliance on 16 April 2024.
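The mechanics of geoblocking, and why a VPN defeats it, can be sketched in a few lines. The prefix table and flagged URL below are invented for illustration; real platforms resolve locations with commercial GeoIP databases, not a toy lookup like this.

```python
# Minimal sketch of IP-based geoblocking. The IP prefixes and the blocked
# URL are hypothetical examples, not real allocation or takedown data.
IP_PREFIX_TO_COUNTRY = {
    "1.128.": "AU",    # assumed Australian range (illustrative)
    "93.184.": "US",   # assumed US range (illustrative)
}

# URLs a platform has agreed to withhold, keyed by requester country.
BLOCKED_URLS_FOR = {
    "AU": {"https://x.com/example/status/123"},  # hypothetical flagged URL
}

def country_of(ip: str) -> str:
    """Resolve an IP address to a country code by first matching prefix."""
    for prefix, country in IP_PREFIX_TO_COUNTRY.items():
        if ip.startswith(prefix):
            return country
    return "UNKNOWN"

def is_blocked(ip: str, url: str) -> bool:
    """True if this URL is withheld for the requester's apparent country."""
    return url in BLOCKED_URLS_FOR.get(country_of(ip), set())
```

The check only ever sees the connecting IP address. A user in Australia routing traffic through a VPN exit node overseas presents a non-Australian IP, so the block does not apply, which is the gap the Commissioner pointed to.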

X accused the Commissioner of censorship and said it would fight the penalties. The Commissioner, however, maintained that the video must be taken down because of its potential to cause serious harm to Australians, and referred the case to the Federal Court of Australia, which granted the Commissioner an interim injunction against X.

Almost a month after the stabbing was livestreamed, the Federal Court rejected the Commissioner’s request for an extension on the injunction on 13 May 2024. This means that while X will still have to remove the content for Australians, the Commissioner’s request for content removal to include global measures (beyond geoblocking for Australians) has been denied. The hearing between X and the eSafety Commissioner is ongoing.

Who is Liable for Content on the Internet?

The battle between X and the eSafety Commissioner is another instance of governments and platforms vying for control over each other. The current hearing against X exposes three areas of tension over internet regulation: 1) the injunctions against X raise questions about the effectiveness of the government's enforcement powers; 2) X's position as a social media platform gives it a unique vantage point from which to influence the Australian public narrative on this debate; and 3) whatever the ruling, it will shape how governments and platforms worldwide approach internet regulation.

Testing Enforcement Powers on Platforms

The Australian Federal Court ruled that X is liable to pay the fines for refusing to take down the content. In continuing to argue against the fine, X is testing the effectiveness of the enforcement powers of the Online Safety Act and is bringing the debate over what “compliance” looks like under the legislation into the spotlight.

Self-regulation by platforms is proving ineffective. X's own content policies ban the dissemination of violent material (referred to as sensitive media), so it should have removed the violent video flagged by the Commissioner regardless of government policy. X's failure to remove this content, or rather, its ability to decide unilaterally what counts as violent content, legitimises the need for government intervention on platforms, as it is evident that X can contradict its own content policies.

As the Act is being reviewed this year, a lack of compliance from X provides convincing evidence for governments to create harsher consequences for platforms and stronger powers for regulators.

Failure to enforce penalties for platforms who do not comply with government regulation sets a dangerous precedent. It shows that some platforms consider themselves to be above domestic regulation and can operate solely based on their interests. This would significantly hinder the government’s ability to hold platforms accountable for their practices, and most importantly, to be able to protect citizens in online spaces.

Weaponising Narratives in the Public Debate

Since his acquisition of X (then known as Twitter) in 2022, Musk has publicly called himself a "free speech absolutist." He has built a persona around championing free speech and fighting against censorship. In his battle with the eSafety Commissioner, Musk is reshaping the narrative as one of himself protecting free speech from the reign of the Commissioner, whom he has referred to as the "Australian censorship commissar."

Musk has successfully muddied the narrative about the Commissioner's takedown request by equating the removal of violent material to an attack on free speech. This is despite a spokesperson for the Commissioner clarifying that the removal notice does not relate to commentary about the event; it concerns only the video depicting the attack.

Global Consequences

Both governments and platforms internationally will be watching the outcomes of the battle between X and the Australian Government closely. The effectiveness of enforcement powers will set a precedent for how other platforms will engage with Australia.

For governments considering their own domestic online safety regulation, such as Canada's Online Harms Bill 2024 and the United Kingdom's recently passed Online Safety Act 2023, the battle between X and the Australian government could influence their conceptions of what compliance for regulated platforms looks like. These governments may consider harsher penalties or stronger enforcement powers to effectively rein in platforms.

If the Australian eSafety Commissioner can effectively order global content takedowns on X, it may also have the unintended consequence of emboldening other governments to assert greater control over content on the internet. Civil society groups have argued that the eSafety Commissioner's powers over violent content are overbroad and oversimplified, and could hide human rights violations or allow violence to continue behind closed doors. Strong enforcement powers over platforms without proper oversight could encourage other governments around the world to censor online content, which would have a chilling effect on free speech and risk legitimising government control over the media.

Global platforms operating in Australia will also be watching to see how enforcement powers could affect them. Historically, platforms have tried to avoid liability for hosted content by invoking Section 230 of the USA's Communications Decency Act (1996). While platforms such as Meta and Google regularly cooperate with the eSafety Commissioner, they will certainly be assessing the effectiveness of enforcement to inform their own compliance strategies. Platforms may judge compliance under stronger enforcement powers too onerous to maintain, which could lead them to pull their services out of Australia, much as Meta removed news content in Australia earlier this year.

The current debate has greater impacts on the future of internet regulation, the jurisdiction of governments, and the effectiveness of regulation on big tech platforms. Whatever the outcome, there will be changes to government and platform approaches to internet regulation.
