US Supreme Court justice Clarence Thomas has suggested that social media platforms be regulated in a similar way to phone companies © FT Montage/Alamy

Big Tech companies such as Facebook, YouTube and Twitter exercise near-unlimited power to monitor and moderate speech on their platforms. They have become the modern public square.

Justice Clarence Thomas of the US Supreme Court took the highly unusual step last month of addressing the power of platforms, free speech and regulation — all while dismissing as moot a lawsuit against Donald Trump, the former US president. It was the clearest signal yet that both the courts and legislature may move to rein in these companies’ absolute discretion on how they operate.

The court vacated an earlier ruling by a lower federal court that Trump had violated the First Amendment rights of the people he blocked on Twitter. But at the same time, Thomas took the opportunity to make the point that “digital platforms provide avenues for historically unprecedented amounts of speech”. He suggested that these platforms should be viewed as common carriers, such as telephone companies, and subject to similar regulation.

Ethical and legal dilemmas over our modern-day public squares arise because of their dominance, combined with ownership by a few big companies. They have assumed essential social and community functions and now set norms in society. From Trump’s screeds on Twitter, to Greta Thunberg’s warnings on Facebook, to YouTube performances by the poet Amanda Gorman, social media is the primary platform for information, free speech and artistic expression.

Frederick Mostert and Alex Urbelis

At the same time, the protection of public discourse and free speech and the prevention of global online crime are under extreme pressure. This is due — ironically — to both over-blocking and under-blocking by social media platforms.

At one end, user-generated content is too often subject to algorithmic, robo-takedowns. On Facebook or Twitter, even a small error rate can result in significant deprivation of speech. A recent Twitter debacle is a case in point: in March, the platform suspended for 12 hours any account that tweeted the word “Memphis”. The ban remains unexplained but was probably the result of Twitter trying to prevent the dissemination of personal information.

At the other extreme, criminal activities such as online child sexual abuse run rampant and often go undetected, as the UK’s Online Harms white paper pointed out in 2019. The UN has even identified an increase in social media-based human trafficking during the coronavirus pandemic.

These extremes are, to some extent, caused by inadequate content moderation and the opaque internal legal systems developed by the digital gatekeepers.

There is a clear and present need for timely due process, to enable the balancing of fundamental rights within the international digital community.

Not enough justice is meted out to criminals, whose malicious content can spread rapidly. Meanwhile, artistic expressions are removed robotically. Facebook and YouTube’s overzealous enforcement of copyright has recently led to the automated and baseless muting of classical musicians performing works by Beethoven, Mozart and other composers whose copyrights ended centuries ago.

As Roger McNamee, an early investor in Facebook who became disenchanted by its practices, argued in his 2019 book Zucked, “the hands-off attitude of platforms have altered the normal balance of free speech, often giving advantage to extreme voices over reasonable ones”.

The virality, volume and velocity of online activity has caused an unprecedented rise in crime. Reports of online sexual abuse have grown from fewer than 1m to more than 18.4m in less than 10 years.

We are witnessing a new type of digital space with a new legal infrastructure, in effect a parallel judicial universe. This new legal order, often unwritten and private, regulates our digital public squares and sets new norms, affecting billions of people.

Government regulation may seem to be the obvious next step. Yet regulation goes against the internet’s ethos, as envisioned by digital pioneers such as Sir Tim Berners-Lee, who invented the world wide web, and Vint Cerf, one of the so-called fathers of the internet. They postulated a free and unfettered cyberworld in which information flowed freely, the right to know was a given, scientific collaboration would flourish and opinions were expressed without censure.

Governments, the “weary giants of flesh and steel”, as digital libertarian John Perry Barlow called them, were neither welcome nor competent to regulate cyberspace. How, then, to reconcile that which is potion with that which is poison? The general consensus is that the early visions of the internet need rethinking, and in recent years some pioneers have called for a revision. This is why fundamental principles of due process and procedural fairness are required.

Government regulation without precise direction will not address the complex problems around content moderation. Rather, online platforms must apply principles of due process, procedural fairness and natural justice in a standardised and universal manner. As our colleague Professor Eleonora Rosati of the Digital Scholarship Institute has said, “Online due process may very well form the basis of an online infrastructure to restore fundamental freedoms and support the rule of law in the digital world.”

Thomas wrote in his opinion that both the legislature and the Supreme Court “will soon have no choice but to address how our legal doctrines apply to highly concentrated, privately owned information infrastructure such as digital platforms”.

Committing to sound principles of online due process would limit the amount of government intervention necessary. At the same time, platform users would be given the opportunity to have their day in court. Due process principles would mitigate both under- and over-blocking, and platforms would be required to disclose how they moderate content and curate data, and to publish transparency reports and other public-facing accounts of their practices.

As Berners-Lee has argued, what the online world needs now is a Magna Carta for the web. There is a clear need for online justice in relation to social media and other platforms. “Online due process” could serve as a cornerstone in our digital architecture to reconstitute fundamental rights and the rule of law in the online world.

Frederick Mostert is a professor of practice in intellectual property law at King’s College London and a member of the Digital Scholarship Institute.

Alex Urbelis is a partner at the Blackstone Law Group LLP

Copyright The Financial Times Limited 2024. All rights reserved.