We, the undersigned civil society organisations and individuals, urge the government to immediately withdraw plans to ban social media for children under 16.
We understand that public concerns are at a peak, and there is no denying that social media companies have increasingly posed challenges to the well-being of children and all social media users in recent years.
We also understand that the government’s aim to protect children and young people from online harm and the negative impact of social media on their wellbeing is consistent with Article 19 of the United Nations Convention on the Rights of the Child (UNCRC), which requires states to take measures to protect children from violence (both physical and mental), abuse, and maltreatment.
While framed as a protective measure, this proposed blanket ban on social media for children under 16, expected to be implemented by June 2026, is misguided and disproportionate, and it risks undermining the privacy and freedom of expression of all social media users, both adults and children.
It does not address the systemic and structural drivers of harm in digital spaces and may ultimately prove ineffective. Far from solving the problem, it threatens to entrench it.
It is critical that we adopt evidence-based, rights-respecting regulations grounded in nuance, rather than a total prohibition on children under 16 from participating in the digital world.

Children do not need to be excluded from digital spaces; instead, they need protection within them, along with the skills and safeguards to participate safely and meaningfully.
Malaysia stands at a critical juncture for reform.
The government must reject simplistic, punitive restrictions and instead pursue a rights-based, evidence-driven agenda that confronts the root causes of online harm: platform design, exploitative business models, invasive data practices, and weak regulatory accountability.
It should be noted that the UN Committee on the Rights of the Child, in its recent concluding observations on Malaysia (February 2026), expressed concern that the government’s steps to ensure inclusive digital access for children remain inadequate.
The committee raised that there is “a lack of digital literacy and a significant digital divide; age-based prohibition and control over access to social media limiting children’s access to age-appropriate information and online child-friendly platforms; and limited guarantee of privacy protection”.
The committee also recommended that Malaysia “adopt a child rights-based approach to the implementation of the Online Safety Act 2025”.

We recommend the following reform agenda for Malaysia and urge the government to immediately:
Withdraw the proposed blanket ban on social media for children under 16 and ensure all reforms undergo full parliamentary scrutiny and meaningful and inclusive public consultation.
Introduce comprehensive and robust platform regulation aligned with constitutional and international human rights standards, including the Federal Constitution (Articles 8 and 10), the Child Act 2001 (Preamble), and Malaysia’s commitments under the Universal Declaration of Human Rights (UDHR), the UNCRC, and the International Covenant on Civil and Political Rights (ICCPR), among other international instruments.
Mandate human rights due diligence and child rights impact assessments for digital services to identify and mitigate any risks, including specifically on children’s rights, and consider which regulatory mechanisms can be used to enforce action based on identified risks.
Adopt a whole-of-society approach that does not look at social media in isolation but proactively engages with civil society, children, children with disabilities, parents, women’s rights groups, children’s rights groups, disability rights groups, health services, educational settings, and other pertinent stakeholders.
By collaborating with these groups, the government can gain valuable insights, develop comprehensive strategies, and implement impactful measures to safeguard children and individuals from online harm.
Our main concerns are as follows:
1. Subject ban to legislative process, parliamentary scrutiny
The government is looking to introduce the social media ban through a Child Protection Code under Section 80 of the Online Safety Act (Onsa).
However, as we have stated previously, the Onsa lacks clarity and strong safeguards for fundamental rights, including freedom of expression. We believe that any legislation aimed at regulating social media companies must prioritise transparency, human rights, independent oversight, and accountability.
Crucially, the Onsa does not provide any legal basis for a blanket ban on users under 16. On the contrary, Section 18 of the Onsa presumes children’s access to digital platforms and focuses on making those environments safer.
Imposing a ban via subsidiary legislation that appears to contradict an Act raises serious concerns of executive overreach.
(a) Bypassing Parliament’s scrutiny: Any measure that significantly restricts human rights must be enacted through clear primary legislation, subject to parliamentary scrutiny and democratic debate. Circumventing this process undermines legitimacy and public trust.
(b) Tokenistic consultation: The consultation period conducted by the MCMC on the proposed regulatory framework for the Online Safety Plan, the Draft Risk Mitigation Code, and the Draft Child Protection Code, from Feb 12 to March 31, 2026, is manifestly insufficient.
It fails to meet basic standards of inclusivity, transparency, and evidence-based policymaking, particularly given the scale of impact on millions of users, including children.
2. The ban undermines children’s human rights
The proposed ban represents a profound regression in the protection of children’s rights.
Children have rights and protections under the UNCRC, including to safety, privacy, protection from exploitation, and the freedoms of expression and information, all of which apply fully in the digital environment, as affirmed by General Comment 25 (2021).
States are obligated to uphold these rights online, and businesses must respect them. These protections are further reflected in the ICCPR and the UDHR, which guide Malaysia’s human rights framework, including under the Human Rights Commission of Malaysia Act 1999.

Children face real risks online, including through the dissemination of inciting content, disinformation, addictive features, and exploitative data practices.
These risks are driven by the architecture, design, as well as business and operating models of the platforms, not simply by children’s presence on them.
A blanket ban would:
(a) Shift responsibility away from the government and digital platforms onto users, likely violating international human rights standards of legality, necessity, and proportionality.
(b) Suppress children’s rights to learn, communicate, express themselves, and participate meaningfully in society.
Digital platforms are now integral to education, social connection, as well as political and civic engagement. Excluding children risks deepening inequality and social isolation, and would inherently impact their autonomy, agency, and self-development.
(c) Likely drive children to migrate to less regulated and potentially more dangerous online spaces, including the “dark web”.
(d) Erode privacy through intrusive age-verification mechanisms, which would eliminate online anonymity and increase the processing and potential exploitation of children’s personal data.
(e) Fail to address tech companies’ harmful business models and practices, while doing nothing to create better or safer spaces for children.
Instead, such an approach may disincentivise tech companies, both within and beyond the restriction’s scope, from providing age-appropriate and rights-respecting digital experiences for children, as required under the Onsa.
(f) Create a dangerous “cliff-edge”, where children are suddenly exposed to high-risk environments at 16 with insufficient preparation.
Consequently, removing opportunities for gradual, supported engagement would not cultivate resilience; rather, it would merely defer the onset of risk.
3. Age verification undermines rights of all
Enforcing a blanket ban would require intrusive, large-scale age-verification or age assurance systems, likely based on sensitive personal data.
The government’s proposal that all social media platforms operating in Malaysia be required to adopt mandatory electronic Know Your Customer (e-KYC) verification using government-issued documents, such as MyKad, passports, and MyDigital ID, inherently involves expanding surveillance technology and personal data collection processes by proprietary actors.

It is likely to be exploited by governments, private corporations, and malicious actors alike. Such measures risk normalising mass data collection, invading privacy, and eroding online anonymity.
Linking age verification to government-issued documents and overly broad identity verification systems is likely to cause exclusion and discrimination, and to reinforce existing barriers to access, particularly for individuals and at-risk communities who lack identity documents and already face disproportionate levels of structural discrimination.
Age assurance should be used to provide children with age-appropriate digital experiences and must be lawful, rights-respecting, privacy-preserving, risk-based, and proportionate.
This means the government should adopt the least restrictive approach capable of achieving the legitimate aim of protecting children from harm, preferring such measures over a blanket prohibition.
It also must be designed to minimise data collection and prevent misuse, and not be a blunt instrument for exclusion.
Notably, in March 2026, over 400 security and privacy scientists and researchers from 32 countries called for a moratorium on the use of age-assurance technologies until there is substantial evidence of their effectiveness and societal implications.
While protecting minors is crucial, implementing blanket identity checks across the internet is too dangerous and counterproductive. Malaysia should heed this warning rather than rush into high-risk, unproven systems.
4. Building resilience through a whole-of-society approach
The government must shift from a “move fast and break things” mentality to one that prioritises empowering young people with the skills, knowledge, and support to navigate digital environments safely and with resilience.
We urge the government to invest in sustainable solutions that empower children, particularly via:
Digital literacy and critical thinking education;
Accessible and affordable mental health and support services;
Ongoing engagement with children, parents, educators, and civil society.
Addressing root issues
We reiterate that the proposed blanket social media ban does not address the root issues of the business models and services of social media companies.
Children should not be prohibited from accessing the digital world; they should be able to do so safely and in ways that protect their rights.
It is the platforms and their business models that exploit children, and which should be regulated and held accountable.
We stand ready to engage constructively with the government to advance a reform agenda that is rights-based, evidence-driven, and fit for the evolving digital age. - Mkini
The statement was endorsed by over 80 groups and four individuals. Among them are Article 19, Centre for Independent Journalism, Borneo Komrad, North South Initiative, and Justice for Sisters.
