The recent fine issued to Reddit, reported by the BBC, is another example of regulators taking children’s privacy and online safety seriously. But beyond the headline number, this case isn’t really about technical nuance or grey areas in the law. It’s about fundamentals.
At its core, the case appears to turn on two failures:
- A failure to carry out an appropriate Data Protection Impact Assessment (DPIA) for processing that was likely to pose high risk.
- A lack of effective controls to prevent under-13s from accessing the platform, despite terms and conditions stating they shouldn’t be there.
And when children are involved, those aren’t minor oversights.
You Can’t Rely on Your Terms and Conditions to Protect Children
It’s common for online platforms to state in their terms that under-13s (or under-16s, depending on the jurisdiction) are not permitted to use the service. But a clause in the terms is not, by itself, a safeguard.
If a platform knows – or should reasonably expect – that children are likely to access its service, then it needs to think carefully about how that risk is managed in practice. If there are no meaningful technical or operational safeguards in place, then simply pointing to the small print isn’t enough.
DPIAs Are Not Paperwork Exercises
The other key issue is the apparent absence of a robust DPIA.
Where processing is likely to result in a high risk to individuals – and the involvement of children will almost always elevate that risk – a DPIA is a legal requirement under Article 35 of the UK GDPR. More importantly, it’s a structured way of asking the right questions:
- Who might be affected?
- What could go wrong?
- How serious would the impact be?
- What practical steps are we taking to reduce the risk?
When done properly, a DPIA isn’t just compliance documentation. It’s a mechanism for forcing organisations to confront uncomfortable realities about how their services are used in the real world.
If children form a meaningful part of your user base – whether intentionally or not – that risk needs to be identified, documented and addressed.
When Children’s Data Is Involved, the Stakes Are Higher
The consequences of getting this wrong aren’t limited to unlawful processing.
Children’s data may be handled without a clear lawful basis. Safeguards may not be appropriately designed. And most importantly, children themselves may be exposed to content or interactions that are inappropriate or harmful.
In that context, data protection risk is not abstract. It sits alongside – and often directly connects to – real-world safety risk.
Privacy and Age Assurance Aren’t Opposites
There is an ongoing and important discussion in the industry about how to balance data minimisation with the need to understand who is using a service. Platforms are rightly cautious about collecting more data than necessary. At the same time, they are increasingly expected to apply age-appropriate safeguards.
Those goals don’t have to be in conflict, but they do require thoughtful design and careful risk assessment.
This case is a reminder that you can’t avoid that complexity by doing less. If children are likely to be on your platform, you have to engage with the risk head-on.
For organisations operating in similar spaces, the message is clear. Go back to basics:
- Identify where children may be present.
- Carry out meaningful DPIAs.
- Establish and document your lawful basis.
- Put effective controls in place.
When it comes to children’s data, good intentions and well-drafted terms are not enough. Regulators expect to see evidence that you have understood the risk and acted on it.