Kinjal Ahuja and Ashish Rawat
“We cannot have a system, or even the appearance of system, where surveillance is secret, or where decisions are made about individuals by a Kafkaesque system of opaque and unreviewable decision-makers”
– Reyna and Farley, 2006, p.43
Introduction
India’s first dedicated data privacy law, the Digital Personal Data Protection Act, 2023 (“DPDP Act”), was enacted on August 11, 2023. The Ministry of Electronics and Information Technology recently published the draft Digital Personal Data Protection Rules, 2025, on January 3, 2025 (“Draft Rules”), with an aim to operationalise this Act.
Rule 10 of the Draft Rules outlines the requirements for obtaining verifiable consent from parents or lawful guardians before processing the personal data of children or persons with disabilities. The rule is thus focused on protecting the data of these two groups. A large majority of Indian youth, comprising 85% of non-adult users, have access to smartphones; most of them are online for around five hours a day, and 80% admit to using social media. These figures reflect the pressing need to safeguard children's data on the internet. However, that protection should not come at the cost of user privacy or stifled voices. There is an unstated conflict between this Rule and user privacy: while the Rule aims at preserving children's data, it could lead to significant unintended consequences.
The implicit need for mandatory age verification of all users
On a plain reading of this Rule, it can be inferred that Data Fiduciaries might be required to verify the age of all users in order to comply with it in its entirety. The intent of the Rule is to protect children's data, and to that end it mandates that Data Fiduciaries obtain the verifiable consent of a parent or lawful guardian before processing a child's data.
To comply with this requirement, Data Fiduciaries must know which of their users are children, implying that they will have to verify the age of every user. However, placing the responsibility of choosing an appropriate age verification method solely on Data Fiduciaries may create a significant compliance gap, given the lack of clear guidelines.
Age Assurance techniques: No Perfect Solution
The umbrella term for such verification is 'age assurance': a range of techniques for estimating or verifying users' ages, including self-declaration, AI- and biometric-based systems, technical design measures, tokenised age checks through third parties, and hard identifiers such as passports. Regulators in Australia and France are of the view that none of the existing age assurance technologies simultaneously meets the parameters of sufficiently reliable verification, complete coverage of the population, and protection of citizens' privacy.
In our context, Rule 10(1)(b) specifies two techniques for age assurance and identity verification: first, 'voluntarily provided details of identity and age' by the user; and second, "a virtual token mapped to the same, which is issued by an entity entrusted by law or the Central Government or a State Government with the maintenance of such details or a person appointed or permitted by such entity for such issuance, and includes such details or token verified and made available by a Digital Locker service provider."
At present, a large number of Data Fiduciaries use self-declaration techniques for age and identity verification, such as tick boxes or details voluntarily entered by users. Relying on such methods would negate the intent of the Rules, since they can be easily bypassed by entering incorrect details. This method therefore offers only a low level of assurance.
Turning to the second technique proposed by Rule 10: virtual tokens mapped to a user's identity and age, issued by the Central or a State Government or an entity entrusted by law, can raise serious privacy concerns such as data profiling and centralisation. This is paradoxical, since the intent of the Act is to ensure data privacy. Let us see why this is problematic.
If every user wishing to create an account with a Data Fiduciary must undergo age verification in the way proposed by Rule 10, it can have a detrimental effect on anonymity, especially for whistleblowers, journalists, and indeed anyone who wants to remain anonymous. Take, for instance, employees exposing corporate fraud or unethical practices, who rely heavily on anonymous channels to report violations without fear of retaliation, or victims of domestic violence, sexual assault, or other abuse, who often use anonymous platforms to seek support or share their stories.
Additionally, this technique of age and identity verification contradicts the principle of data minimisation as embodied in Section 6 of the DPDP Act and in Article 5(1)(c) of the European Union's General Data Protection Regulation ("GDPR"). The principle was also underscored by the Srikrishna Committee in its 2017 White Paper, which stated that the data processed ought to be minimal and necessary for the purposes for which it is sought and for other compatible purposes beneficial to the data subject.
Making Data Fiduciaries rely on tools like Digital Locker or other government-approved platforms to verify a user's age and identity can cause them to collect and store more personal data than is truly necessary for that limited purpose. For example, linking virtual tokens to Aadhaar or similar identification systems can inadvertently reveal a range of sensitive information, such as family details and residential address.
Using this age assurance technique is risky: it could lead to data breaches and enable Data Fiduciaries to build comprehensive profiles of their users, which can be misused for targeted advertising or unauthorised surveillance. Such profiling runs against the principle of data minimisation, and it poses risks that are neither necessary nor proportionate to the objective of safeguarding children. The choice of technique for verifying users' age and identity thus remains an open dilemma, and there is a pressing need for clearer, more precise guidance on the point.
Proposed Alternative in contrast to methods mandated by Rule 10
A viable path to reconciling the tension between safeguarding children online and protecting user privacy lies in the adoption of Decentralized Identifiers (DIDs) coupled with Zero-Knowledge Proofs (ZKPs). Unlike conventional verification methods that require disclosure and storage of extensive personal data, this approach allows individuals to prove their age without revealing their full identity or sensitive documents.
DIDs are cryptographically verifiable digital identities created and controlled by the user rather than by a central authority. When combined with zero-knowledge proofs, they provide a mechanism by which a user can cryptographically confirm that they meet a specific criterion, such as being above a certain age threshold, without disclosing any other personal information. Zero-knowledge proofs by their nature can enhance privacy in identity-sharing systems, which are otherwise vulnerable to data breaches and identity theft. When integrated into a decentralised identifier system, ZKPs allow the claims associated with a DID to be proven without the underlying attributes ever being disclosed.
For example, instead of uploading an Aadhaar card or school certificate, a child or parent could use a DID wallet that interacts with an authorized age verification service. The service issues a verifiable credential stating simply that the user is over a given age. When the user seeks to access an online platform, a zero-knowledge proof can be generated to confirm the age claim without revealing any underlying data such as name, address, or document identifiers.
This model aligns with the principle of data minimization by ensuring that only the essential fact of age eligibility is disclosed to the platform, without requiring the collection or storage of supporting identity documents each time verification occurs. Unlike the approach envisioned under Rule 10(1)(b), where a virtual token remains mapped to an individual’s identity and may be queried repeatedly through a central authority, the DID-ZKP framework avoids creating a persistent trail linking the user’s activities across different platforms. Each proof generated is unique and un-linkable, meaning no party can aggregate a user’s verification events to build a profile or track their online behaviour.
Moreover, while Rule 10(1)(b) requires data fiduciaries to rely on government-authorised repositories or directly collect identifying information, this decentralised system shifts control back to the user. The issuer is involved only at the point of initial credential issuance and has no visibility into where or how often the credential is presented thereafter. This significantly reduces the risks of profiling, data breaches, and surveillance, since no centralised logs of verification activity are maintained. In practice, this means that a child or parent can demonstrate compliance with age-gating requirements without disclosing any other personal data or allowing continuous oversight by the verifying authority.
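The issuance-and-presentation flow described above can be sketched in code. The following is a simplified, hypothetical illustration using only Python's standard library: an HMAC tag stands in for the issuer's signature, and no actual zero-knowledge scheme is implemented (a real deployment would use DID methods with ZKP-friendly signatures such as BBS+ or a zk-SNARK circuit). All function and identifier names here are illustrative assumptions, not part of any standard.

```python
import hashlib
import hmac
import json
import secrets

# NOTE: hypothetical sketch, not a real ZKP. An HMAC over the minimal claim
# stands in for the issuer's signature; it shows how the platform can learn
# ONLY the boolean age claim, never the underlying documents.

ISSUER_KEY = secrets.token_bytes(32)  # held by the age-verification service


def issue_credential(holder_did: str, over_18: bool) -> dict:
    """Issuer checks the holder's documents once, then attests ONLY to the
    minimal claim -- no name, birth date, or document identifiers."""
    claim = {"holder": holder_did, "over_18": over_18}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": tag}


def present(credential: dict, nonce: str) -> dict:
    """Holder binds the credential to a fresh verifier nonce so each
    presentation is tied to one session and cannot be replayed."""
    return {"credential": credential, "nonce": nonce}


def verify(presentation: dict, nonce: str) -> bool:
    """Platform checks the issuer's attestation; it sees only the DID,
    the boolean claim, and its own nonce."""
    cred = presentation["credential"]
    payload = json.dumps(cred["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (presentation["nonce"] == nonce
            and hmac.compare_digest(cred["sig"], expected)
            and cred["claim"]["over_18"])


# One-time issuance; afterwards the holder presents the credential
# repeatedly without the issuer ever being contacted again.
cred = issue_credential("did:example:abc123", over_18=True)
challenge = secrets.token_hex(16)  # fresh nonce chosen by the platform
assert verify(present(cred, challenge), challenge)
```

Note the simplifications: because HMAC is symmetric, the verifier in this sketch must share the issuer's key, and the holder's DID is visible in every presentation, so presentations remain linkable. Genuine ZKP schemes remove both limitations, which is precisely what makes each proof unique and unlinkable as described above.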
Other Concerns regarding Implementation of Rule 10
Beyond the fundamental privacy challenges and the lack of clarity around age verification mechanisms, there are several other practical and contextual factors that could hinder the effective implementation of Rule 10. These issues go beyond purely technical concerns and reflect deeper socio-economic, infrastructural, and cultural realities unique to India. Two of the most prominent challenges in this regard are the widespread digital illiteracy that affects how consent is understood and exercised, and the rigid age threshold that may not align with the evolving capacities of adolescents.
Digital Illiteracy and the Risk of Superficial Consent
Even if we assume that Data Fiduciaries are somehow able to identify the age and identity of their users accurately, and are further able to verify their parents' or lawful guardians' identities, the reliability of the consent given by those parents or guardians remains questionable. In India's digital landscape, where digital literacy levels remain uneven, particularly in rural and economically weaker sections of society, many parents may not fully understand the implications of providing such consent. In most cases, consent is given without understanding the risks of data sharing, making it more of a checkbox exercise than an informed decision. Moreover, many children use devices registered in their parents' names, meaning that even where age verification or parental consent mechanisms are in place, they can be easily bypassed.
An Overly Restrictive Age Threshold
Another important consideration is that the age threshold in Rule 10 is too high. Section 2(f) of the DPDP Act defines a child as a person who has not completed eighteen years of age. This is inconsistent with other Indian laws as well as international standards. Statutes such as the Child Labour (Prohibition and Regulation) Act, 1986, which permits adolescents aged 14-18 to work in non-hazardous occupations; the Right of Children to Free and Compulsory Education Act, 2009, which protects children only up to 14 years; and the Motor Vehicles Act, 1988, which permits individuals aged 16-18 to obtain licences for gearless two-wheelers, all conflict with the DPDP Act's 18-year threshold. These laws recognise that children under eighteen have varying levels of maturity and decision-making ability.
The USA's Children's Online Privacy Protection Act, 1998 ("COPPA") mandates verifiable parental consent only for children under the age of 13, while Article 8 of the GDPR sets the age threshold for parental consent between 13 and 16 years, allowing Member States to fix the limit within this range. These frameworks are consistent with the 'evolving capacities' principle embodied in the United Nations Convention on the Rights of the Child ("UNCRC"), which acknowledges that children's capacities evolve as they grow and develop, and that they must be enabled to exercise greater agency over their lives. The 18-year threshold in our context is inconsistent with this principle.
Conclusion
Rule 10 of the Draft Rules, while introduced with the objective of safeguarding children's data online, raises major implementation challenges. The Rule effectively mandates age verification, yet no ideal age assurance technique exists, and the methods it proposes raise concerns about efficacy, privacy, and anonymity. Moreover, the 18-year threshold reflects a rigid framework that ignores the principle of 'evolving capacities' and is inconsistent with Indian and global standards. Digital illiteracy and the prevalence of shared devices pose further obstacles to implementation. To address these challenges, a more balanced approach is essential, one that integrates privacy-preserving verification mechanisms such as Decentralized Identifiers and Zero-Knowledge Proofs, which allow users to prove age eligibility without disclosing unnecessary personal data or creating persistent records of their online activity. Incorporating such solutions, alongside clearer guidelines and flexible standards, would help achieve the dual objectives of protecting children in the digital realm and upholding the fundamental right to privacy in a proportionate manner.
Kinjal Ahuja (B.A., LL.B Hons) and Ashish Rawat (B.B.A., LL.B Hons) are third-year students at Chanakya National Law University, Patna.
