Regulation of Social Media Content

In an article published in its issue of March 10, 2018, the UK magazine The Economist points out that ‘on Twitter at least, false stories travel faster and farther than true ones.’ This conclusion was based on a study carried out by ‘MIT’s Laboratories for Social Machines’, which statistically examined every one of 4.5 million tweets sent between 2006 and 2017 (126,000 stories in all).

Further, the researchers found that the reason things spread faster through social networks was that ‘they are appealing, not because they are true.’ Untrue stories, it appears, are more appealing because ‘they inspire emotions such as fear, disgust and surprise,’ and moreover, ‘people prefer to share stories that generate strong negative reactions.’

The government has released a ‘Consultation Paper on proposed amendments to the ICT Act for regulating the use and addressing the abuse and misuse of Social Media in Mauritius’ (14 April 2021), with a deadline for submission of comments by 05 May 2021, and has added a list of questions as a guide (to which others may be added by those making submissions).

The paper refers to 2051 incidents reported on the Mauritian Cybercrime Online Reporting System (MAUCORS) from January 2020 to January 2021, of several types: Hacking, Online Harassment, Offensive Contents, Sextortion, Identity Theft, Cyberbullying, Cyber Stalking, Online Scams and Frauds, Phishing, Malware. Along with apparent lacunae in the ICTA which limit its scope, the MAUCORS data are deemed to further reinforce ‘the need for appropriate corrective measures to be undertaken in Mauritius.’

The paper indicates that ‘The issue at hand is when these abuses, even though perpetrated by few individuals/groups, go viral, the damage created is very far reaching. In the early 2000s, social media firms argued that they simply created tools that enable distribution of information. They did not regulate the content on their platforms.’

However, with the increasing number of complaints that have been made across the world, the social media platforms (Google, Facebook, etc) have had to introduce self-regulating measures. ‘But legal provisions prove to be relatively effective only in countries where social media platforms have regional offices. Such is not the case for Mauritius. The only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.’

These amendments purport ‘to come up with operational measures in an autonomous and independent manner without the need to solely rely on social media administrators for actions. It is also imperative to do due diligence by building appropriate safeguards in this operational framework so as to avoid infringing the constitutional rights of the Mauritian citizens as to their freedom of expression and fundamental democratic values.’

This will be done by the setting up of a two-pronged operational framework:

▪ a National Digital Ethics Committee (NDEC) as the decision-making body on the contents; and
▪ a Technical Enforcement Unit to enforce the technical measures as directed by the NDEC.

There are two major points that arise in respect of this framework:

  1. The composition of the NDEC, the proposal being that ‘the Chairperson and members of the NDEC be independent, and persons of high calibre and good repute.’

The key issue here is HOW these persons are going to be selected. Are they going to be political nominees? In which case, bye-bye to independence, etc. More information is definitely required about this selection process, and about the definitive criteria, which the general terms ‘high calibre and good repute’ leave too open.

  2. At the outset, it should be made clear and spelt out that ‘enforcement’ will not mean ‘force’ – such as police squads with blaring sirens and blinding lights forcing their way into residences at odd hours, as happened in no less than the ‘case’ of the DPP, and recently in the case of Ms Aruna Gangoosingh.

There will have to be a minimum of decency and respect for the ‘suspect’ or ‘accused’, and all his/her legal rights guaranteed, including access to legal adviser/advice before any action on the spot is taken.

In principle, there is definitely a need to exercise oversight over social media content, as several jurisdictions cited as examples in the Consultation Paper are undertaking, but there must be delivery on the due diligence pledged in the paper, as well as all the other guarantees that are to be expected in the democracy that we pretend to be.

The current atmosphere of trust deficit in the country’s establishment should not be further aggravated by manu militari legislation or action – if need be, more time must be given for thorough consultations to be completed, as this is such a serious matter.


* Published in print edition on 20 April 2021
