IT is humbly submitted that to suggest social media platforms should not be regulated, or should be only minimally regulated, because of concerns that regulation may impact the right to freedom of expression, is to fundamentally mischaracterise that right and its interaction with other fundamental human rights.
The right to free speech should be understood in relation to other fundamental rights, such as the right to freedom of thought and conscience, the right to information, and the right to life, liberty and security of person, among others.
It must be appreciated that online harm has the potential to impact the enjoyment of the above rights.
What is online harm?
The World Economic Forum’s “Toolkit for Digital Safety Design Interventions and Innovations: Typology of Online Harms” report is frequently cited as helping to define the general nature of online harms.
The report acknowledges that the internet has heightened various social harms, such as bullying and harassment, hate speech, disinformation and radicalisation, and that the amplification of these harms has far-reaching consequences, affecting individuals, communities and societies.
Developed by a working group of the Global Coalition for Digital Safety, comprising representatives from industry, governments, civil society and academia, the Typology of Online Harms is intended to serve as a “foundation for facilitating multistakeholder discussions and cross-jurisdictional dialogues to find a common terminology and shared understanding of online safety”.
It is useful in providing a framework with which to identify and categorise online harms.
The Typology of Online Harms recognises the complex and interconnected nature of online safety, encompassing content, contact and conduct risks.
Content harms are those sustained in the production, distribution and consumption of content. Contact harms are those that can occur as a result of online interactions with others, whereas conduct harms are those incurred through an individual user’s behaviour, facilitated by technology and digital platforms.
The typology not only categorises online harms but also gives examples. Examples of online harms that threaten personal and community safety include child sexual exploitation material, pro-terror material and extremist content, among others.
For harm to health and wellbeing, some examples are online content that promotes suicide, self-harm and eating disorders.
Harms that constitute a violation of dignity include bullying and harassment, doxxing and image-based abuse.
Deception and manipulation are exemplified by impersonation (posing as an existing person, group or organisation in a confusing or deceptive manner); scams (dishonest schemes that seek to manipulate and take advantage of people to gain benefits such as money or access to personal details); phishing (sending fraudulent messages, purportedly from organisations or people the receiver trusts, to try to steal details such as online banking logins, credit card details and passwords); and catfishing (using social media to create a false identity, usually to defraud or scam someone).
It is therefore easy to see that online harms are now rampant, with victims experiencing a range of significant and lasting impacts.
In acknowledging the role that users play in the production, distribution, and consumption of content, the typology is also cognisant of how technology facilitates behaviour that is conducive to harm.
It is therefore imperative that responsibility for content, contact and conduct risks and harms extend to social media platforms.
It is a misguided interpretation of the right to free speech to argue that social media platforms should neither be regulated nor held accountable for the harms caused by abuses of that right.
No government should be dissuaded from pursuing a comprehensive regulatory regime.
The World Economic Forum has already called for urgent action “to minimise the potential harm to all people, with an emphasis on society’s most vulnerable groups, including children”.
Urgent and comprehensive – not hasty and knee-jerk action. – July 31, 2024
Hafiz Hassan reads Scoop