
Effective Governance for Children’s Privacy Online

Discussions around children’s privacy protection will remain on regulatory agendas in the coming years.
Burak Haylamaz | Issue 152 (Mar - Apr 2023)



In This Article

  • Children and teens represent one-third of all internet users and experience the digital world at a very early stage of their lives.
  • There are legitimate concerns about the secondary use of collected data by third parties, e.g., in the form of targeted advertising, that may attach to children and follow them into adulthood.
  • A risk-based approach is seen as one of the most promising data governance models for protecting children’s privacy because it holds companies, rather than parents, accountable.

Let’s close our eyes for a second and imagine a food court in a shopping mall during peak hours. It would not be difficult to picture toddlers holding tablets or teenagers checking social media feeds. While such a scene prompts protective instincts toward these technologies, we are also aware that without them younger generations would not have been able to continue their education during the pandemic or, more generally, access valuable information and socialize within a couple of clicks. According to UNICEF, children and teens represent one-third of all internet users and experience the digital world at a very early stage of their lives [1]. However, as children spend more time online than ever before, they also deserve protection so that they can interact, learn, and play online safely.

The Internet does not know whether a user is a minor unless there is input data that enables service providers to make a determination about the user’s age. As with adults, a child’s ability to participate online requires the collection, processing, and use of personal data. While such processing optimizes and personalizes the online experience and supports content moderation appropriate to minors, it also raises concerns about children’s privacy and the potential manipulation of children’s vulnerabilities. Parents are also worried about the creation of profiles of their children that will attach to them and follow them into their adult years. To that end, reconciling both ends of the spectrum, i.e., the benefits and risks of technologies available to children, requires effective regulation and policy-making to contextualize the “best interest of the child” online. As UNICEF envisioned, this could be achieved by empowering children to exercise their rights online, including the rights to privacy and to freedom of expression and information [2]. This article focuses on the underlying privacy considerations discussed in the academic literature aimed at strengthening children’s digital safety.

Most data protection legal frameworks require organizations to obtain consent from minors, or from parents or responsible adults on their behalf, for the lawful processing of children’s data. However, the age of consent differs significantly across countries and largely reflects cultural differences. For instance, in the United States children can provide valid consent for the collection and processing of their data at 13, while the threshold is 16 in the European Union and 14 in South Korea and Argentina; any processing below those prescribed ages is subject to parental consent. Setting a threshold for valid consent is not a problem-free solution. First, it is difficult to gauge the maturity of a child’s understanding of what his or her consent means and to capture it with a single age threshold. This creates significant compliance challenges for organizations, which eventually results in inconsistent protection for children. Besides, while certain content or experiences may be inappropriate for a four-year-old, the same may not be true for a ten-year-old, yet both would be restricted without parental consent. The sketch below illustrates how such jurisdiction-specific thresholds translate into a parental-consent check.
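As a minimal, purely illustrative sketch of that logic in Python: the jurisdiction codes, the conservative default, and the helper name needs_parental_consent are assumptions made here for illustration, and the thresholds are taken from the examples above rather than from a legal analysis.

```python
from datetime import date
from typing import Optional

# Illustrative thresholds taken from the examples in the text; actual rules
# vary (e.g., EU member states may lower the GDPR default) and change over time.
CONSENT_AGE_BY_JURISDICTION = {
    "US": 13,  # COPPA
    "EU": 16,  # GDPR default
    "KR": 14,  # South Korea
    "AR": 14,  # Argentina
}

def needs_parental_consent(birth_date: date, jurisdiction: str,
                           today: Optional[date] = None) -> bool:
    """Return True if processing this user's data would require parental consent."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    threshold = CONSENT_AGE_BY_JURISDICTION.get(jurisdiction, 16)  # conservative default
    return age < threshold

# A 14-year-old can consent on their own in the US but not in the EU.
print(needs_parental_consent(date(2011, 5, 1), "US", today=date(2025, 6, 1)))  # False
print(needs_parental_consent(date(2011, 5, 1), "EU", today=date(2025, 6, 1)))  # True
```

Even this toy version shows the compliance difficulty the paragraph describes: the same user triggers different obligations depending on where the service is offered.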

On the other hand, parents (or responsible adults) may not necessarily understand the technologies used by their children or how parental consent works. Such a mechanism may also lead to unwanted consequences, such as discrimination against children who do not live with their parents. Also, seeking parental consent results in further data collection about a child and a parent, for instance, to authenticate the relationship between them. These practical challenges should be reflected in how issues surrounding children’s privacy are addressed. For instance, some jurisdictions, such as the United Kingdom and the European Union, allow organizations to rely on an alternative legal basis in lieu of consent for lawful data collection and processing, e.g., when there is a legitimate business interest. This reduces the burden imposed on companies, children, and parents because companies do not need to repeat consent requests for each type of processing. Importantly, such an alternative legal basis does not immunize companies from ensuring a sufficient level of safeguards, as regulations still hold them accountable in the absence of the desired level of protection. In doing so, this risk-based approach potentially creates effective governance for children’s privacy by shifting the burden of protecting children’s data away from parents (via consent) to companies, which must carry out risk and impact assessments and accordingly mitigate potential harms that may arise from the collection and processing of children’s data [3]. A simple screening sketch of that assessment step follows below.
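To make the risk-based idea concrete, here is a minimal sketch assuming a hypothetical ProcessingActivity record and a screening rule invented for illustration; it is not any regulator’s actual test, only a stand-in for the kind of gate an impact-assessment workflow might automate before processing begins.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingActivity:
    """Hypothetical description of one data-processing activity."""
    purpose: str
    involves_children: bool
    profiling: bool = False
    shared_with_third_parties: bool = False
    mitigations: List[str] = field(default_factory=list)

def requires_impact_assessment(activity: ProcessingActivity) -> bool:
    """Flag child-data activities with higher-risk features for a full assessment."""
    higher_risk = activity.profiling or activity.shared_with_third_parties
    return activity.involves_children and higher_risk

ads = ProcessingActivity(
    purpose="ad personalization",
    involves_children=True,
    profiling=True,
    shared_with_third_parties=True,
)

if requires_impact_assessment(ads) and not ads.mitigations:
    print(f"Pause '{ads.purpose}' until mitigations are documented and reviewed.")
```

The point of the sketch is the shift in accountability: the gate is applied by the company before processing starts, rather than by a parent clicking through a consent dialog.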

Another consideration is the profiling of children through data collection and processing. Creating a profile is inevitable and enhances the online experience through personalization. For instance, companies must use profiling to make sure children are exposed to content, activities, and products appropriate for their age. However, there are legitimate concerns about the secondary use of collected data by third parties, e.g., in the form of targeted advertising, that may attach to children and follow them into adulthood. Indeed, children may not have the cognitive capacity to understand the nature or purpose of advertisements, particularly deceitful or manipulative ones. In practice, some companies have already embedded privacy-enhancing technologies that verify age in real time and on a one-time-only basis, subsequently disposing of the collected data without storing it in their data repositories; a minimal sketch of this idea appears below. Besides, it is important for companies to provide transparency that is understandable by children and families: which personal information is used for which purpose, by whom, what safeguards are in place, and what individual rights can be invoked. Beyond self-regulation by companies, some jurisdictions require organizations not to engage in targeted advertising for users below 18 unless they opt in to such a feature [4], while others take a more radical approach with a blanket prohibition of targeted advertising to minors [5].
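A purely illustrative sketch of the “verify once, store nothing” approach follows; the function names, the threshold, and the form fields are assumptions, not any vendor’s actual API. The point is simply that only a boolean signal, not the underlying birth data, is persisted.

```python
def verify_age_once(claimed_birth_year: int, current_year: int = 2025) -> bool:
    """Check the age claim in memory and return only a yes/no signal."""
    return (current_year - claimed_birth_year) >= 13

def handle_signup(form: dict) -> dict:
    # The raw birth year is used once, in memory, for the check...
    is_old_enough = verify_age_once(form["birth_year"])
    # ...and deliberately excluded from the profile that gets persisted.
    return {"username": form["username"], "age_verified": is_old_enough}

print(handle_signup({"username": "demo_user", "birth_year": 2015}))
# {'username': 'demo_user', 'age_verified': False}
```

Real deployments rely on verification methods far more robust than a self-declared birth year, but the design principle is the same: the signal is retained, the sensitive input is not.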

In conclusion, discussions around children’s privacy protection will remain on regulatory agendas in the coming years. That said, a risk-based approach is seen as one of the most promising data governance models for protecting children’s privacy because it holds companies, rather than parents, accountable. It requires companies to carry out impact assessments and mitigate risks to rights and freedoms arising from data processing while preserving the benefits of that processing. Nevertheless, the effectiveness of this approach depends on clear regulatory guidance on how to implement the measures necessary for children’s privacy protection and on ongoing oversight by regulators. Finally, any harmonization between regulators, or mutual recognition, would enable companies to streamline their global compliance.

References and Notes

  1. UNICEF, The State of the World’s Children 2017: Children in a Digital World, December 2017, available at https://www.unicef.org/media/48601/file
  2. UNICEF, Industry Toolkit: Children’s Online Privacy and Freedom of Expression, May 2018, available at https://sites.unicef.org/csr/files/UNICEF_Childrens_Online_Privacy_and_Freedom_of_Expression(1).pdf
  3. This approach is followed, in part or in its entirety, by several frameworks such as the European Union’s General Data Protection Regulation, the proposed American Data Privacy and Protection Act, the Asia-Pacific Economic Cooperation Privacy Framework, and the Canadian Consumer Privacy Protection Act.
  4. The United Kingdom Information Commissioner’s Office’s Age-Appropriate Design Code and the California Age-Appropriate Design Code Act.
  5. European Union Digital Services Act.
