Saturday, September 10, 2022

Online Content Regulation v Freedom of Expression

The livestreaming of the Christchurch terrorist attack in New Zealand in 2019 demonstrated the threat posed by terrorists spreading illegal and harmful content on social media to amplify their malicious intentions to ordinary internet users. As a result, online content regulation gained strong political momentum around the world after the live video broadcast of the massacre of 51 Muslim worshippers at the Al Noor Mosque by the terrorist Brenton Tarrant. Until then, the global social media companies had governed content voluntarily through their own policies, relying mostly on AI technology and user reports to remove illegal or harmful content. This incident, however, shifted the burden from voluntary action to mandatory legislative measures.

More than 40 new social media content regulation laws have been adopted worldwide in the last couple of years, while another 30 are under active consideration. These laws aim to prevent social media platforms from being weaponised to spread extremism and propaganda by forcing online providers to police content on their platforms more carefully. The legislation attempts to moderate both illegal content and content that is harmful but not directly illegal. Illegal content encompasses a wide variety of material that directly contravenes the law, such as hate speech, incitement to violence, child abuse and revenge porn. Harmful content, by contrast, refers to material that does not strictly fall under legal prohibitions but may nevertheless have damaging effects, such as the portrayal of self-harm and suicide attempts, cyberbullying, and mis- or disinformation.

Soon after the Christchurch incident, the Parliament of the Commonwealth of Australia enacted the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act in April 2019. This new law created fresh offences and liabilities for providers of online content and hosting services in order to make their platforms safe and responsible. It compels tech companies to remove illegal and harmful content expeditiously, and failure to get rid of such content within the stipulated time can bring imprisonment and fines of up to 10 per cent of annual turnover. In 2021, Australia went further and enacted the Online Safety Act, which includes a world-first adult cyber abuse prevention scheme.

Likewise, Germany, France, the European Union, Turkey, Brazil, Russia, the United Kingdom, the United States and India have either already passed similar legal frameworks to govern social media content or are in the process of formulating them. However, there is an interesting discourse around the globe over whether the judiciary or private tech companies should determine the legality of content, as the recent trend in online safety laws worldwide is to place the obligation to assess legality on the private tech companies themselves. Nonetheless, these laws provide for complaint-handling systems to make the tech giants accountable and require them to produce transparent annual reports on their actions against illegal and harmful content.

These laws, however, face a common allegation of restricting freedom of expression and imposing censorship, as there is a genuine fear that many hosting providers and platforms will remove content to avoid liability without judiciously assessing its merit. Additionally, outsourcing the assessment of legality to private tech companies, or assigning it to a government body instead of the independent judiciary, creates apprehension that dissenting voices on social media will be silenced.

Bangladesh, too, has faced many consequences arising out of illegal and harmful content on social media, ranging from the livestreaming of suicides, revenge porn, cyberbullying and harassment to hate speech, abuse and communal unrest. At present, there is no specific law governing illegal and harmful social media content, although there are some controversial laws, such as the Digital Security Act and the ICT Act, which are aimed mainly at preventing cybercrime and are not comprehensive enough to handle illegal and harmful content.

Bangladesh currently follows a command-and-control approach to regulating toxic content on social media, in which both the government and the courts order the telecommunication regulatory authority and the digital security agency to take down controversial content or block access to particular links from Bangladesh. Hence, in the absence of a clear-cut standard for determining what is illegal and harmful, there is a real risk of limiting freedom of expression and plurality of opinion and of restricting dissenting voices.

Against this backdrop, the Bangladesh Telecommunication Regulatory Commission (BTRC) released a draft regulation on digital, social media and OTT platforms in 2021, in compliance with a court order to formulate a policy for OTT platforms only. The draft bears great similarity to the much-criticised Indian Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Immediately after its publication, the draft regulation drew massive criticism from rights groups and civil society for limiting freedom of expression. It contains many incomplete and vague terms that have no concrete definition, e.g. sovereignty, integrity or security of the country, decency or morality, friendly relations with foreign countries, and defamation (clause 6.01(d)). The lack of proper definitions, and of the exact elements that constitute an offence, will create a climate of fear around expressing opinions, as there have already been several allegations of these grounds being used irrationally to jail people.

Again, clause 7.03 of the draft obliges intermediaries such as messaging service providers to break the privacy of users' correspondence in order to trace the first originator of a message and reveal his or her identity upon receipt of an order from a court or the BTRC, which is a clear violation of Article 43 of the Constitution. Moreover, vesting this power in the BTRC through the draft regulation raises a genuine apprehension of possible arbitrary use. Several other provisions in parts two and three of the regulation can likewise violate citizens' rights to freedom of expression and privacy guaranteed under the Constitution. Hence, the draft regulation should be prepared anew, respecting international human rights standards and established best practices, with the aim of creating a conducive, safe online environment without contravening anyone's rights.

Published in New Age as a sub-editorial on 10 September 2022, at page 8.

Published in the Daily Sangbad as a sub-editorial on 15 September 2022, at page 6.

Published on Sarabangla as an opinion piece on 1 October 2022.

Published on Drik News as an opinion piece on 23 October 2022.

