Wednesday, July 31, 2024

Govt must halt social media licensing plan

Article 19 and the Centre for Independent Journalism (CIJ) are deeply concerned about the recent announcement by the government that a new regulatory framework will be introduced on Aug 1, with enforcement effective Jan 1, 2025, for social media companies to obtain licences under the Communications and Multimedia Act (CMA) 1998.

This development is seen as a direct attempt to exert control over social media platforms, which could have far-reaching implications for the freedom of expression guaranteed in the Federal Constitution.

Furthermore, there is a growing apprehension that such regulatory measures could pose a significant threat to the fundamental democratic values that underpin the nation’s governance and the underlying principle of CMA Section 3(3), which states that “nothing in the CMA shall be construed as permitting the censorship of the internet”.

Civil society organisations (CSOs), including Article 19 and CIJ, have previously been engaged for consultation and have expressed our concerns to the government regarding the potential imposition of licensing on social media platforms to moderate harmful content.

We have advised the government against hasty decision-making and emphasised the need for thorough consideration of the implications and stakeholders involved.

Additionally, on June 27, 2024, the CSOs issued a letter to the prime minister urging the government to prioritise increased collaboration and consultation with CSOs and other relevant stakeholders.

The letter highlighted the importance of inclusive and transparent processes in shaping policies related to social media regulation.

Overreach of licensing framework

The licensing system for network and application services faces two significant challenges: the difficulty of anticipating future needs and developments, and a notable lack of independent oversight, which can undermine the fairness and transparency of the licensing process.

This lack of clear guidelines and oversight has created uncertainties for social media platforms.

Consequently, these platforms may need to meet specific regulatory requirements and adhere to standards set by regulatory authorities as part of the licence renewal process.

This would involve a closer working relationship between the platforms and the regulatory bodies, ensuring that platforms operate in accordance with the requirements outlined in the licensing framework.

As a result, platforms could become more compliant, consenting to more removal requests from the government instead of focusing on effective and timely content moderation.

It is important to note that the lack of transparency in the compliance process gives large platforms even more power to police what we see, say, and share online, with disastrous consequences for public debate, the free flow of information, and democracy.

Social media networks are a vital space for us to connect, share and access information.

Article 19, in a legal analysis of the CMA, repeatedly warned that some of the provisions under the CMA are problematic and not in line with international human rights standards.

The more regulations are in place, the more power the Malaysian Communications and Multimedia Commission (MCMC) has to regulate content and social media companies.

We have repeatedly raised the issue of Sections 211 and 233 of the CMA being used to define harmful content, even as these provisions have been abused over the years to restrict freedom of expression.

In principle, we reiterate that Sections 211 and 233 of the CMA should be repealed, as they have an overly broad scope and are open to vague interpretation.

The provisions also fail to meet international freedom of expression standards, particularly the three-part test: any restriction must be provided by law, pursue a legitimate aim, and be necessary and proportionate.

Platform accountability

We understand the government’s intent to hold social media platforms and messaging applications accountable as a means of tackling online abuse, hate speech and other problematic content, including scams and fraud that target children or other online users.

An important step is to get social media platforms to enhance their community standards and guidelines to meet international human rights standards, including on data protection, privacy, and transparency in the use of artificial intelligence (AI).

They must also ensure that their content moderation and removal policies and actions are effective and timely, done in transparent and systematic ways, without personal, political, or business biases.

Social media platforms will have to invest in adequate human moderation and language resources to go beyond automated flagging or the use of AI to detect harmful content.

Thus, the government will have to adopt innovative and alternative means of holding these platforms accountable.

Attempts to incorporate these platforms into a more traditional regulatory regime are unlikely to be effective and may have unforeseen implications given the rapidly growing nature of technology and the global reach of these platforms.

Any measures to hold the platforms accountable must ensure that there is meaningful protection of the rights of the public, including not infringing on the users’ freedom of expression.

Way forward

It is essential to address the lack of transparency regarding specific requests made by the MCMC or other government entities to the platforms and their responses to these requests.

The government should avoid unnecessarily regulating online content moderation and licensing social media platforms.

Any regulatory framework for social media platforms must be based on principles of transparency, accountability and the protection of human rights.

This should include requirements to enhance transparency in content moderation decisions and improve systems for resolving disputes arising from these decisions.

It is recommended that the government adopt the following:

  • Establish a social media council which would promote a multi-stakeholder independent regulatory framework;

  • Set up an independent committee to review the root causes of hate speech and cyberbullying and relatedly develop a comprehensive plan of action using the Rabat Plan of Action as the framework; and

  • Enhance its education and awareness programmes aimed at building a resilient society guided by ethical and responsible content-creating standards, with adequate digital literacy to combat the dangers of harmful content.

In conclusion, to achieve better results in countering harmful social media content and protecting users, the government must reconsider its current plan and consult more comprehensively with CSOs.

This is necessary because effectively addressing harmful content goes beyond just content moderation; it also entails addressing the root causes of issues such as hate speech, cyberbullying, and gender-based violence.

Engaging with CSOs can provide insights into the broader societal and systemic problems that contribute to harmful content and help develop more holistic and effective strategies for mitigating these challenges. - Mkini


The joint statement was issued by Article 19 and the Centre for Independent Journalism.

The views expressed here are those of the author/contributor and do not necessarily represent the views of MMKtT.
