
Let me start with a number that stopped me mid-research: 1.4 million.
That’s how many children under the age of 13 TikTok allowed on its UK platform in 2020. Not despite its rules prohibiting under-13 accounts. But while those rules existed.
The Information Commissioner’s Office fined TikTok £12.7 million for this. But here’s what keeps me awake: somewhere in a marketing department, someone decided that compliance wasn’t urgent. Someone weighed the risk against the reach. Someone forgot that privacy isn’t abstract when it involves an eight-year-old.
I wrote my dissertation on ethical issues in consumer-based digital marketing. This chapter—protecting vulnerable populations—was the one I couldn’t shake. Because the data tells a story we don’t want to hear: we’re failing children online, and the law is only just catching up.
The Developmental Reality Marketers Ignore
Here’s what every marketing textbook should say but doesn’t: children’s brains are not miniature adult brains.
Stoilova and colleagues (2019) conducted systematic research into children’s understanding of personal data and privacy online. Their findings should be required reading for anyone in digital marketing. Even the most “mature” children—whatever that means—struggle to fully comprehend:
- How internet data flows work
- What data commercialisation actually means
- The permanence of their digital footprint
- Who benefits from their attention and information
This isn’t about intelligence. It’s about developmental capacity. There’s a reason we don’t let seven-year-olds sign contracts. There should be a reason we don’t let them sign away their data either.
Key insight: “Even the most mature children struggle to fully comprehend the intricacies of internet data flows and certain aspects of data commercialization.” — Stoilova et al., 2019
The Law: What the Data Protection Act 2018 Actually Requires
I trained in law before marketing, and if there’s one thing I’ve learned, it’s that legal language hides hard obligations in plain sight.
Under UK data protection law, when organisations use personal data to provide “information society services” to children under 13, they must obtain parental or guardian consent.
Not “should.” Not “it’s good practice.” Must.
The Data Protection Act 2018 and UK GDPR don’t leave room for interpretation here. Children under 13 cannot consent to data processing themselves. Period.
But here’s where theory meets uncomfortable reality: platforms know underage children are using their services. The TikTok case proves it. Senior employees raised concerns. Those concerns were documented. And still, 1.4 million children remained.
[Image: ICO fine against TikTok for failing to protect underage children’s data.]
Beyond Compliance: The Age-Appropriate Design Code
The United Kingdom did something genuinely innovative here: the Age-Appropriate Design Code, developed by the Information Commissioner’s Office, came into full force in 2021.
This statutory code isn’t just another regulation to file away. It fundamentally reframes how online services should treat children. The code requires that online platforms provide age-appropriate services that protect young individuals’ privacy and safety.
Wood (2023), writing for the International Association of Privacy Professionals, noted that the UK led the way on this. The attention economy has been harvesting young people’s personal data for years, and as artificial intelligence plays a larger role in shaping online experiences, the need for this kind of structural protection only grows more urgent.
What the Statistics Actually Tell Us
Let me share a figure that complicates the narrative: 78% of UK children using social media know how to change their privacy settings. And 84% of those have actually done so at some point (Statista, 2022b).
On the surface, this looks good. Digital literacy is improving. Kids understand privacy controls.
But here’s what that statistic doesn’t tell you:
It doesn’t tell you whether those children understand why they’re changing those settings. It doesn’t tell you if they know what data is still being collected after those changes. It doesn’t tell you if they comprehend that “private” on a platform isn’t the same as “private” in law.
The Ethical Framework: Beyond What’s Legal
Here’s where my dual training in law and marketing becomes impossible to ignore. The law sets a floor. Ethics asks about the ceiling.
Corporate Social Responsibility (CSR), as defined by the European Commission, requires companies to understand their positive and negative impacts on society (Matten & Moon, 2007). This isn’t an optional extra; it’s foundational. Companies exist because society permits them to. That permission comes with obligations.
When applied to children in digital marketing, this means:
- Anticipating harm before it happens. Not waiting for complaints. Not waiting for fines.
- Auditing all data practices involving children. Regularly. Rigorously. With external accountability.
- Reporting concerns and incidents. Transparency even when it hurts.
- Taking appropriate action when breaches occur. Not just when regulators force it.
Eagle and colleagues (2020) argue that implementing risk assessments and proper procedures demonstrates an organisation’s commitment to child safety, and helps prevent the unintended negative effects that consumer-based digital marketing can create.
The Persuasion Problem
Let’s address something uncomfortable: marketing is persuasion. That’s its function. But persuasion directed at children raises distinct ethical questions.
Jenkin and colleagues (2014) conducted a systematic review of persuasive marketing techniques to promote food to children on television. The findings apply equally to digital spaces. Marketers deploy strategies that can undermine autonomy—and children’s autonomy is already developmentally limited.
Tactics like “limited-time offers” or emotional appeals may not be inherently unethical when aimed at adults. With children, the calculation changes: the cognitive capacity to recognise manipulation isn’t fully developed.
Practical Steps for Ethical Marketing Around Children
If you’re reading this and thinking, “But my brand doesn’t target children”—good. But ask yourself this: could children still encounter your marketing?
If the answer is yes, these steps apply:
- Conduct genuine age assessments. Not performative ones. If your platform or marketing could reach under-13s, you need systems that actually detect and remove them (see the sketch after this list).
- Obtain proper consent. Verifiable parental consent. Not a checkbox.
- Train your team. Ensure concerns are heard and acted upon.
- Audit regularly. Documented, regular audits catch problems early.
- Design for safety first. Build privacy and safety into products from the start.
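To make the first two steps concrete, here is a minimal sketch of a default-deny consent gate in TypeScript. Everything in it is illustrative: the `User` shape, the `canProcessData` function, and the `CONSENT_AGE_UK` constant are assumptions made for this example, not any platform’s real API. The one fixed point taken from the law is the under-13 threshold in the Data Protection Act 2018.

```typescript
// Illustrative sketch only: names and shapes are assumptions, not a real API.

const CONSENT_AGE_UK = 13; // DPA 2018 / UK GDPR: under-13s cannot consent themselves

interface User {
  declaredAge: number;              // self-reported, so treat it as a claim, not proof
  verifiedParentalConsent: boolean; // verifiable parent/guardian consent, not a checkbox
}

type Decision =
  | { allowed: true }
  | { allowed: false; reason: string; flagForReview: boolean };

function canProcessData(user: User): Decision {
  // Users at or above the threshold can consent for themselves.
  if (user.declaredAge >= CONSENT_AGE_UK) {
    return { allowed: true };
  }
  // Under-13s can be served only with verified parental consent.
  if (user.verifiedParentalConsent) {
    return { allowed: true };
  }
  // Default-deny: no verified consent means no data processing,
  // and the account is queued for human review rather than ignored.
  return {
    allowed: false,
    reason: "Under-13 user without verifiable parental consent",
    flagForReview: true,
  };
}

// An eight-year-old with no parental consent is blocked, not quietly served.
console.log(canProcessData({ declaredAge: 8, verifiedParentalConsent: false }));
```

The design choice that matters is the default: when age or consent is uncertain, the system refuses to process and escalates, rather than processing and hoping nobody asks.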
The Cost of Getting It Wrong
The TikTok fine was £12.7 million. That’s real money. But it’s not the real cost.
The real cost is harder to calculate:
- Parents who will never trust that platform again
- Children whose data will circulate indefinitely because it was collected without proper consent
- Regulators who now scrutinise everything that company does
- Competitors who use the case study in their pitch decks
Orlitzky (2008) raises a harder question: if no one finds out about a breach, and there’s no reputational damage, do companies still have an obligation to act ethically?
My answer, and Techy Copy’s answer, is yes. Because ethics isn’t damage limitation. It’s the baseline requirement for doing business in a society that includes children.
A Way Forward
The attention economy isn’t going away. AI-driven personalisation isn’t going away. And children aren’t going to stop using digital platforms.
What can change is how we approach all three.
Trengove and colleagues (2022) argue for an “ethics-by-design” approach—focusing on ethical features in the design process rather than relying solely on content-specific regulation.
For child protection, this means:
- Building privacy controls that children can actually understand
- Designing interfaces that don’t exploit developmental vulnerabilities
- Creating systems that flag potential underage users for review (sketched after this list)
- Being transparent about what data is collected and why
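To show what that flagging might look like, here is a sketch in the same spirit as the one above. Every signal name here (`selfDisclosedUnder13InContent`, `reportedAsUnderageByOthers`) is an assumption invented for illustration; real systems would use their own telemetry. The point is the shape of the logic: treat signals as grounds for a human look, never as grounds for an automated penalty.

```typescript
// Illustrative heuristics only: these signal names are assumptions for this
// sketch, not any platform's real telemetry.

interface AccountSignals {
  declaredAge: number;                    // self-reported at sign-up
  selfDisclosedUnder13InContent: boolean; // e.g. a bio or comment stating an age under 13
  reportedAsUnderageByOthers: boolean;    // other users have reported the account as underage
}

function shouldFlagForAgeReview(s: AccountSignals): boolean {
  // Defence in depth: an under-13 declared age should already be caught at
  // sign-up, but if one reaches this point, flag it immediately.
  if (s.declaredAge < 13) {
    return true;
  }
  // Explicit self-disclosure or a user report always warrants a human look.
  return s.selfDisclosedUnder13InContent || s.reportedAsUnderageByOthers;
}

// Example: a user report sends the account to a reviewer, not to a ban queue.
console.log(
  shouldFlagForAgeReview({
    declaredAge: 14,
    selfDisclosedUnder13InContent: false,
    reportedAsUnderageByOthers: true,
  })
); // true
```

A flag starts a review, nothing more: the system’s job is to surface potential under-13 accounts early, long before they number 1.4 million.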
The law will keep evolving. The ICO will keep enforcing. But ethics-by-design doesn’t wait for enforcement. It builds responsibility into the product.
The Question You Should Be Asking
Not “is this legal?” The legal floor is too low.
Not “will we get caught?” That’s the wrong framework entirely.
The question is: if a child encounters our marketing, can we defend every decision we made in designing it?
If the answer requires caveats, exceptions, or “buts,” start over. Design again. Because 1.4 million children on a platform that knew they shouldn’t be there—that’s what happens when “buts” become business as usual.
At Techy Copy, we believe marketing can work without manipulation. We believe persuasion doesn’t require exploitation. And we believe children deserve better than “we didn’t think they’d be here.”
The data proves they are. Now we have to prove we’re ready for them.