
In 2020, approximately 1.4 million children under the age of thirteen used TikTok in the United Kingdom.

This was not a failure of technology. It was a failure of priority. TikTok’s own terms of service prohibited account creation by children under thirteen. Senior employees had raised concerns internally about underage users and the company’s failure to remove them. Those concerns were documented. They were not acted upon (ICO, 2023).

In April 2023, the Information Commissioner’s Office imposed a £12.7 million fine on TikTok for breaching UK data protection laws. The penalty was not for a novel or complex regulatory requirement. It was for failing to obtain parental consent before processing children’s data—a requirement that has been clear since the Data Protection Act 2018 came into force.

This article examines what happened, why it matters, and what every brand operating in the UK must understand about child data protection.

The Legal Framework: What the Law Actually Requires

Under UK data protection law, when organisations use personal data to provide “information society services” to children under the age of thirteen, they must obtain consent from a parent or guardian (ICO, 2023).

This is not a recommendation. It is not best practice. It is a statutory requirement.

The Data Protection Act 2018, which sits alongside and supplements the UK GDPR, establishes clear principles for all data processing: lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, and confidentiality (Government Digital Service, 2015). For children, these principles carry additional weight because children may be less aware of the risks involved in data processing.

The UK GDPR explicitly recognises that children merit specific protection regarding their personal data. Recital 38 states that children may be less aware of the risks, consequences, and safeguards concerned with data processing (ICO, n.d.).

TikTok’s failure was not subtle. The company allowed 1.4 million children to create accounts, process their data, and be exposed to targeted content—all without the legally required parental consent. It also failed to conduct sufficient checks to detect and remove underage users, despite knowing they were present (ICO, 2023).


Why Age Verification Matters: The Developmental Reality

The significance of privacy for child development cannot be overstated.

Stoilova, Nandagiri, and Livingstone (2019) conducted a systematic evidence mapping of children’s understanding of personal data and privacy online. Their findings should inform every marketing decision that could affect minors. Privacy-related media literacy is closely intertwined with other areas of child development, and as children age, their privacy awareness, skills, and needs evolve.

But even the most mature children struggle to fully comprehend certain aspects of the digital environment:

  1. The complexity of internet data flows

  2. What data commercialisation actually means in practice

  3. The permanence of their digital footprint

  4. Who benefits from their attention and information

  5. How their data may be used in ways they did not anticipate

The research is clear: a customised approach is necessary—one that considers diverse developmental trajectories and individual differences among children. One-size-fits-all privacy policies are, in practice, one-size-fits-none (Stoilova et al., 2019).

This is not about intelligence. It is about developmental capacity. The cognitive ability to recognise manipulation, understand commercial motives, and evaluate long-term consequences develops gradually throughout childhood and adolescence. The law recognises this by requiring parental involvement.


The ICO’s Investigation: What the Regulator Found

The ICO’s investigation into TikTok revealed systemic failures that went beyond isolated errors.

The company was aware that underage children were using its platform. Senior employees had raised concerns. Those concerns were documented internally. Yet the company did not take adequate steps to detect and remove underage users or to obtain the legally required parental consent (ICO, 2023).

The ICO’s findings point to several specific failures:

Inadequate age verification. TikTok’s age verification measures were insufficient to prevent underage users from creating accounts. The company relied on self-declaration of age—users simply entered a date of birth—without effective checks to verify accuracy (see the sketch after these findings).

Failure to remove underage users once identified. Even when TikTok became aware of underage users, it did not consistently remove them from the platform. The systems for detection and removal were not fit for purpose.

Insufficient parental consent mechanisms. The company did not implement adequate processes to obtain verifiable parental consent before processing children’s data, as required by law.

Ignored internal warnings. Concerns raised by senior employees were not acted upon. The company had knowledge of the problem and chose not to fix it.
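
To make the first of these failures concrete, the sketch below shows roughly what a self-declaration age gate amounts to. It is a minimal TypeScript illustration under assumed names, not TikTok’s actual code.

```typescript
// Hypothetical self-declaration age gate -- an illustration, not TikTok's code.
// The user supplies a date of birth; nothing verifies that it is true.

const UK_DIGITAL_AGE_OF_CONSENT = 13; // DPA 2018 / UK GDPR threshold

function ageFromDateOfBirth(dob: Date, now: Date = new Date()): number {
  let age = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  if (!hadBirthdayThisYear) age -= 1;
  return age;
}

function canCreateAccount(selfDeclaredDob: Date): boolean {
  // The whole check rests on one unverified input. A child who enters an
  // earlier year of birth passes instantly -- the gap the ICO identified.
  return ageFromDateOfBirth(selfDeclaredDob) >= UK_DIGITAL_AGE_OF_CONSENT;
}
```

The point is not that the arithmetic is wrong; it is that the input is worthless without corroborating signals or verification.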

The ICO’s £12.7 million fine reflects the seriousness of these failures. But the monetary penalty is only part of the consequence. The reputational damage, regulatory scrutiny, and loss of trust are ongoing.

The Age-Appropriate Design Code: A New Standard

The TikTok case unfolded against the backdrop of a significant regulatory development in the UK. In 2020, the Information Commissioner’s Office issued the Age-Appropriate Design Code, a statutory code of practice for online services likely to be accessed by children; it came fully into force in September 2021 (Wood, 2023).

The code establishes fifteen standards that online services must meet to comply with data protection law when processing children’s data. These include:

  • Best interests of the child. The best interests of the child should be a primary consideration when designing and developing online services.
  • Age-appropriate application. Children of different ages have different capacities. The level of protection should be appropriate to the age of the child.
  • Transparency. Privacy information must be concise, prominent, and in language children can understand.
  • Detrimental use of data. Children’s data must not be used in ways that have been shown to be detrimental to their wellbeing.
  • Policies and community standards. Terms, policies, and community standards must be upheld, and children must be informed of how they will be enforced.
  • Default settings. Settings must be “high privacy” by default unless there are compelling reasons for a different default (illustrated in the sketch after this list).
  • Data minimisation. Only the minimum amount of personal data necessary to provide the service should be collected and retained.
  • Data sharing. Children’s data must not be disclosed unless a compelling justification exists.
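
To illustrate the default-settings standard, here is a hedged TypeScript sketch of what “high privacy” defaults might look like for a hypothetical service. The setting names are assumptions for illustration, not drawn from the code or from any real platform.

```typescript
// Hypothetical default settings for a service likely to be accessed by
// children. Every field starts at its most protective value; relaxing any
// of them should be an explicit, recorded choice, never the starting state.

interface PrivacySettings {
  profileVisibility: "private" | "friends" | "public";
  geolocationEnabled: boolean;
  personalisedAdsEnabled: boolean;
  shareDataWithThirdParties: boolean;
  directMessagesFrom: "nobody" | "friends" | "anyone";
}

const HIGH_PRIVACY_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",       // not discoverable by default
  geolocationEnabled: false,          // no location collection by default
  personalisedAdsEnabled: false,      // no profiling by default
  shareDataWithThirdParties: false,   // no disclosure without justification
  directMessagesFrom: "nobody",       // no unsolicited contact by default
};
```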

The code represents a fundamental shift. It places the burden of protection on the platform, not the child. It requires services to assume children may be present and design accordingly, rather than waiting for problems to emerge and reacting.

As Wood (2023) notes, the attention economy has been harvesting young people’s personal data for years. The Age-Appropriate Design Code aims to change that. It requires platforms to consider child development at every stage of design and to build protections from the start, not bolt them on after problems surface.

The Persuasion Problem: Why Marketing to Children Is Different

The TikTok case raises a broader question that every marketer should consider: is marketing to children fundamentally different from marketing to adults?

The answer, supported by developmental research, is yes.

Jenkin and colleagues (2014) conducted a systematic review of persuasive marketing techniques to promote food to children on television. Their findings apply equally to digital spaces. Marketers deploy strategies that can undermine autonomy—and children’s autonomy is already developmentally limited.

Tactics such as limited-time offers, emotional appeals, and influencer endorsements are not necessarily unethical when aimed at adults, who have fully developed cognitive capacities for evaluating persuasive intent. With children, the calculation changes. The capacity to recognise manipulation, understand commercial motives, and resist persuasion is not fully developed.

This does not mean marketing to children is impossible. It means the burden of proof shifts. Marketers must demonstrate that their techniques do not exploit developmental vulnerabilities. The default assumption should be caution, not permission.

The CAP Code, enforced by the Advertising Standards Authority, reflects this principle. It prioritises child protection and prohibits targeting children with advertising for products such as gambling or alcohol (ASA, n.d.). The Code requires that marketing communications be prepared with a sense of responsibility to consumers and to society, with particular care taken to avoid causing harm to children.


What Brands Must Do: Practical Steps for Compliance and Ethics

If your brand’s marketing could reach children—and in the digital environment, most marketing can—these steps are essential.

  • Conduct genuine age assessments. Not performative ones. If your platform or marketing could reach under-thirteens, you need systems that actually detect them. Relying on self-declaration of age is not sufficient. The ICO expects services to use appropriate technical measures to establish age with reasonable certainty.
  • Obtain verifiable parental consent. If you are collecting data from under-thirteens, you need consent from a parent or guardian. Not a checkbox a child can tick. Not a terms-of-service agreement a child clicked through without reading. Real, verifiable consent that meets the standards of the UK GDPR (a minimal gating sketch follows this list).
  • Design for safety first. The Age-Appropriate Design Code is not just compliance—it is a framework. Build privacy and safety into products from the start. Don’t add them after problems emerge. Default settings should protect children, not expose them.
  • Train your team. The TikTok case revealed that senior employees raised concerns internally. Those concerns were ignored. Your team needs to know that raising ethical questions about child protection is not career-limiting—it is part of the job. Create channels for concerns to be heard and acted upon.
  • Audit regularly. Data practices drift. What was compliant six months ago might not be today. Regular, documented audits that specifically examine child protection catch problems before regulators do.
  • Consider the full user journey. Children may encounter your marketing even if they are not your target audience. Consider how your content appears on platforms children use, how your ads are targeted, and whether your age-gating measures are effective.
  • Apply the precautionary principle. When in doubt about whether a practice could harm children, err on the side of protection. The commercial cost of excluding some users is far less than the regulatory, reputational, and human cost of causing harm.
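
As a worked example of the first two steps, here is a minimal TypeScript sketch of a gate that blocks processing of an under-thirteen’s data until verifiable parental consent is recorded. The types, threshold handling, and audit log are illustrative assumptions, not a real compliance product.

```typescript
// Hypothetical consent gate: personal data of a user under thirteen is not
// processed until verifiable parental consent has been recorded.

type ConsentStatus = "none" | "requested" | "verified";

interface UserRecord {
  id: string;
  age: number; // established by the service's age-assurance measures
  parentalConsent: ConsentStatus;
}

class ConsentGate {
  constructor(private auditLog: (entry: string) => void = console.log) {}

  canProcessPersonalData(user: UserRecord): boolean {
    if (user.age >= 13) return true; // over-thirteens: normal lawful bases apply
    const allowed = user.parentalConsent === "verified";
    // Record every decision: the regular, documented audits recommended
    // above need a trail, and regulators expect compliance to be demonstrable.
    this.auditLog(
      `user=${user.id} age=${user.age} consent=${user.parentalConsent} allowed=${allowed}`
    );
    return allowed;
  }
}
```

The design choice worth noting is that the default answer for an under-thirteen is no; processing becomes possible only after consent is verified, never before.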

The Cost of Getting It Wrong

The TikTok fine was £12.7 million. That is substantial. But it is not the real cost.

The real cost is harder to calculate:

Parents who will never trust that platform again. Children whose data will circulate indefinitely because it was collected without proper consent. Regulators who now scrutinise everything that company does. Competitors who use the case study in their pitch decks. Employees who raised concerns and were ignored, now disengaged or departed. The years of regulatory oversight that will follow.

Herring (2023) notes that corporate social responsibility helps attract and retain customers, while social irresponsibility damages reputation and revenue. The reputational damage from a child privacy breach is severe and lasting. It signals that an organisation cannot be trusted with the most vulnerable members of society.

Orlitzky (2008) raises a harder question: if no one finds out about a breach, and there is no reputational damage, do companies still have an obligation to act ethically?

The answer, in UK law and in ethical practice, is yes. Because ethics is not damage limitation. It is not risk management. It is the baseline requirement for doing business in a society that includes children.

A Way Forward

The attention economy is not going away. AI-driven personalisation is not going away. Children using digital platforms is not going away.

What can change is how brands approach all three.

Trengove and colleagues (2022), in their critical review of the Online Safety Bill, argue for an ethics-by-design approach—focusing on ethical features in the design process rather than relying solely on content-specific regulation. This approach gives services space to adopt flexible, innovative solutions to social problems.

For child protection, this means:

  1. Building privacy controls that children can actually understand

  2. Designing interfaces that do not exploit developmental vulnerabilities

  3. Creating systems that flag potential underage users for review (see the sketch after this list)

  4. Being transparent about what data is collected and why

  5. Making it easy for parents to exercise their rights on behalf of children
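
For the third item, a hedged sketch of what flagging might look like. The signals below are hypothetical examples; a real system would combine many behavioural and reported signals, and flags should always route to human review rather than automatic action.

```typescript
// Hypothetical signals for flagging a potentially underage account for review.

interface AccountSignals {
  selfDeclaredAge: number;
  underageReportsFromOtherUsers: number; // count of user reports
  schoolYearMentionedInProfile: boolean; // e.g. "year 7" in a UK bio
}

/** Returns true when an account should be queued for human review. */
function flagForAgeReview(s: AccountSignals): boolean {
  if (s.selfDeclaredAge < 13) return true; // should never have passed the gate
  if (s.underageReportsFromOtherUsers >= 2) return true;
  if (s.schoolYearMentionedInProfile) return true;
  return false;
}
```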

The law will keep evolving. The ICO will keep enforcing. But ethics-by-design does not wait for enforcement. It builds responsibility into the product from the beginning.

The Question Every Brand Should Ask

Not “is this legal?” The legal floor is too low.

Not “will we get caught?” That is the wrong framework entirely.

The question is: if a child encounters our marketing, can we defend every decision we made in designing it?

If the answer requires caveats, exceptions, or “buts,” start over. Design again. Because 1.4 million children on a platform that knew they should not be there—that is what happens when “buts” become business as usual.

The data proves children are present online. Now brands must prove they are ready for them.


References

ASA. (n.d.). The CAP Code (non-broadcast advertising code). https://www.asa.org.uk/codes-and-rulings/advertising-codes/non-broadcast-code.html

Government Digital Service. (2015). Data protection. GOV.UK. https://www.gov.uk/data-protection

Herring, J. (2023). Legal ethics. Oxford University Press.

ICO. (2023). ICO fines TikTok £12.7 million for misusing children’s data. https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/04/ico-fines-tiktok-127-million-for-misusing-children-s-data/

ICO. (n.d.). The benefits of data protection laws. https://ico.org.uk/for-organisations/advice-for-small-organisations/the-benefits-of-data-protection-laws/

Jenkin, G., Madhvani, N., Signal, L., & Bowers, S. G. (2014). A systematic review of persuasive marketing techniques to promote food to children on television. Obesity Reviews, *15*(4), 281–293.

Orlitzky, M. (2008). Corporate Social Performance and Financial Performance: A Research Synthesis. In The Oxford Handbook of Corporate Social Responsibility. Oxford University Press.

Stoilova, M., Nandagiri, R., & Livingstone, S. (2019). Children’s understanding of personal data and privacy online – a systematic evidence mapping. Information, Communication & Society, *24*(4), 557–575.

Trengove, M., Kazim, E., Almeida, D., Hilliard, A., Zannone, S., & Lomas, E. (2022). A critical review of the Online Safety Bill. Patterns, *3*(8), 100544.

Wood, E. D. (2023). Children’s privacy laws and freedom of expression: Lessons from the UK Age-Appropriate Design Code. International Association of Privacy Professionals. https://iapp.org/news/a/childrens-privacy-laws-and-freedom-of-expression-lessons-from-the-uk-age-appropriate-design-code/