MediaLab fined over children’s data protection failings: what can organisations learn from this?

read time: 2 min
11.03.26

The Information Commissioner’s Office (ICO) has fined MediaLab, the owner of the image sharing platform Imgur, £247,590 for failing to lawfully process children’s personal data and for breaching fundamental requirements of UK data protection law. 

The enforcement action follows an ICO investigation which found that MediaLab allowed children to use Imgur for several years without implementing basic safeguards designed to protect children’s personal information.

This article examines the ICO’s decision to fine MediaLab and highlights the regulator’s expectations for protecting children online under UK data protection law.

ICO findings

The ICO found that MediaLab breached the UK General Data Protection Regulation by:

  • failing to implement any age checking or age assurance measures,
  • processing the personal data of children under 13 without parental consent or another lawful basis and
  • failing to carry out a data protection impact assessment to identify and mitigate risks to children’s privacy. 

Risk of harm to children

Because Imgur had no effective way of determining users’ ages, the ICO concluded that children were exposed to potentially harmful and inappropriate content, including material relating to eating disorders, antisemitism, homophobia, and sexual or violent imagery. 

The ICO emphasised that personal data often drives content recommendations, meaning that inadequate safeguards can significantly increase risks to children when platforms process their data unlawfully.

ICO enforcement message

UK Information Commissioner John Edwards stated that MediaLab failed in its legal duties to protect children, allowing them to use Imgur without effective age checks while collecting and processing their personal data. 

The ICO made clear that age assurance measures play an important role in preventing children’s data from being used in ways that may cause harm, including through age-inappropriate content recommendations. The fine forms part of the ICO’s wider regulatory work to improve children’s online data protection standards, and organisations that ignore the fact that children use their services can expect enforcement action. 

How the fine was calculated

In setting the penalty, the ICO took into account the number of children affected, the level of potential harm, the duration of the breaches, and MediaLab’s global turnover. The ICO also noted MediaLab’s acceptance of the provisional findings and its commitment to address the failings if Imgur becomes accessible again in the UK. Further regulatory action may follow if those commitments are not met. 

What does this mean for organisations?

This decision reinforces the ICO’s continued focus on children’s data and the Children’s Code (Age Appropriate Design Code). Organisations operating online services that are likely to be accessed by children should ensure that they:

  • understand whether children are using their services,
  • implement proportionate age assurance or apply the Children’s Code protections to all users,
  • identify an appropriate lawful basis for processing children’s personal data and
  • carry out data protection impact assessments where processing is likely to present a high risk to children’s rights and freedoms. 

Failure to do so may expose organisations to enforcement action, even where terms and conditions purport to restrict access by age.

For further information, please contact our data protection team.
