Funmilayo Odude, Partner, Commercial and Energy Law Practice (CANDELP)
Balancing online safety and child rights
20 Apr 2026
The Federal Government’s decision to initiate public consultations on age restrictions and social media regulation for children warrants careful, sustained engagement. The objective of protecting children from digital harm is not in doubt, but the design of such protection, and the manner of its enforcement, will determine whether the resulting framework advances child welfare or merely signals regulatory intent without practical effect.
On this subject, the Federal Ministry of Communications, Innovation and Digital Economy has launched a survey inviting input from parents, educators, and stakeholders, with the stated goal of developing evidence-based policy. The proposed measures include setting minimum ages for social media accounts, mandatory age-verification systems, and enhanced platform accountability. The initiative is anchored in the Nigeria Data Protection Act 2023, and government statements suggest potential measures such as requiring a National Identification Number (NIN) to operate social media accounts. These are not trivial proposals, and they require scrutiny at the implementation level.
There is little room for disagreement on the underlying problem. Nigerian children are participating in digital spaces in increasing numbers, and the risks associated with that participation are both real and, in certain instances, severe. Cyberbullying, sexual exploitation, exposure to harmful content, and the less immediately visible but well-documented psychological effects of sustained social media use are not speculative concerns. They are evidenced harms, and they justify regulatory attention.
However, the existence of harm does not, in itself, resolve the question of constitutional limitations and regulatory method. Other jurisdictions have confronted similar challenges and have adopted a range of responses. Australia’s Online Safety Act and its subsequent decision to ban social media access for children under 16 from December 2025 represent one of the most sweeping national responses anywhere in the world. Indonesia has taken similar action. The European Union’s Digital Services Act imposes significant obligations on platforms regarding minor users. These examples are instructive but not directly transferable.
The structural conditions within which any regulation must operate, including the state of digital infrastructure, the reliability of national identity systems, the maturity of regulatory institutions, and the depth of public digital literacy, differ materially across jurisdictions. A policy architecture designed for a context with robust enforcement capacity, high smartphone penetration in formally registered names, and relatively functional identity infrastructure will not translate seamlessly to one where those conditions are largely absent.
The effectiveness of age-based restrictions, for example, depends in part on the reliability of identity systems, the reach of digital infrastructure, and the capacity of regulatory institutions to enforce compliance. In Nigeria, where many children access the internet through shared or borrowed devices, unregistered SIM cards, or accounts created with third-party credentials, a model that assumes individualised, verifiable digital identities may prove difficult to operationalise.
This is most evident in the question of age verification, often presented as the central mechanism for enforcing child online safety. In practice, however, it raises a series of secondary concerns. A system that relies on NIN verification presupposes both widespread NIN coverage among minors and users' willingness to submit identity data to private platforms as a condition of access. While the National Identity Management Commission (NIMC) has expanded registration, coverage gaps persist, particularly in underserved communities.
More fundamentally, such an approach introduces data protection risks. Requiring identity verification at scale entails the collection and processing of sensitive personal data by entities that may not be domiciled in Nigeria. Although the Nigeria Data Protection Act 2023 establishes a legal framework for data protection, enforcement against large, multinational platforms remains uneven. A regulatory approach that mitigates one category of risk while creating another of comparable magnitude cannot be regarded as effective.
An alternative approach, reflected in emerging international practice, is age assurance. This model does not seek definitive proof of age but instead applies probabilistic methods to assess whether a user falls within a particular age range. Its objective is not perfect exclusion but the introduction of friction sufficient to reduce underage access, without establishing a permanent identity-linked access regime. In a context such as Nigeria’s, this may represent a more balanced response.
It is also necessary to identify the primary locus of responsibility correctly. For the purposes of designing policy, the burden of protecting children online cannot rest principally on children themselves, nor on parents acting without institutional support, nor on the state as a gatekeeper of access. Platforms occupy a central position in the digital ecosystem. This is not merely a rights-based argument; it is also a practical one. Platforms are the entities with the technical capacity, the data, and the commercial incentive structures that determine how children experience social media. They design the algorithms that serve content to minor users, determine default privacy settings, and control the mechanisms through which users interact. A regulatory framework that focuses predominantly on restricting access, rather than shaping platform behaviour, risks addressing the symptom rather than the source.
Accordingly, greater emphasis should be placed on enforceable platform obligations. These may include transparency requirements regarding minor-user safety, restrictions on the algorithmic promotion of harmful content to younger users, default privacy protections for accounts identified as belonging to minors, and financial penalties for non-compliance. Such measures are more likely to produce sustained improvements than access restrictions that are readily circumvented.
Any regulatory framework must also consider the role of digital access in education. For many Nigerian children, the internet functions not only as a social space but as a primary channel for learning, mentorship, and opportunity. A blanket restriction, unaccompanied by carefully designed exceptions or graduated access models, risks exacerbating existing inequalities. Those with access to resources and digital literacy will find alternatives; those without may simply be excluded.
The constitutional context is equally relevant. The right to freedom of expression under Section 39 of the Nigerian Constitution is not confined to adults. While children’s rights to expression, information, and association may legitimately be subject to greater restriction than those of adults, restrictions must still be justified – they must be necessary, proportionate, and carefully targeted at the harm they purport to address.
At the same time, Section 17 of the Constitution establishes a directive obligation on the state to protect children from exploitation and neglect. This provision, read alongside Nigeria’s obligations under the United Nations Convention on the Rights of the Child, creates an affirmative duty to act. But the duty to protect does not collapse into a duty to exclude.
The Convention equally guarantees children’s rights to access information, to freedom of expression, and to participation in decisions that affect their lives. A policy that treats children as passive objects of protection rather than rights-bearing citizens with developing agency will struggle to withstand principled legal scrutiny – and will, in any event, fail to secure the buy-in of the young people whose behaviour it is trying to shape.
A coherent policy framework would therefore proceed on multiple fronts. It would impose clear, enforceable obligations on platforms operating within Nigeria’s jurisdiction. It would adopt age assurance mechanisms that are effective without being unnecessarily intrusive. It would mandate default safety protections for minors and ensure that parental control tools are accessible across linguistic and socioeconomic contexts. It would incorporate digital literacy into formal education, equipping children to navigate online environments with critical awareness. And it would establish accessible enforcement and complaint mechanisms for affected families.
The present consultation process creates an opportunity to develop such a framework. Whether that opportunity is realised will depend on the extent to which policy design engages with the realities of implementation and assigns responsibility to those actors best positioned to effect change.
The question is not whether children should be protected online. It is whether the regulatory response recognises that protection, properly understood, requires more than restriction. It requires a system that is legally sound, practically enforceable, and attentive to the balance between safety, access, and rights.
Funmilayo Odude is a Partner at Commercial and Energy Law Practice (CANDELP).