The Information Commissioner’s Office (ICO) has fined Medialab, the owner of the image-sharing platform Imgur, £247,590 for unlawfully processing children’s personal data and breaching fundamental requirements of UK data protection law.
The enforcement action follows an ICO investigation which found that Medialab allowed children to use Imgur for several years without implementing basic safeguards designed to protect children’s personal information.
This article examines the ICO’s decision to fine Medialab and highlights the regulator’s expectations for protecting children online under UK data protection law.
The ICO found that Medialab had breached the UK General Data Protection Regulation by processing children’s personal data without implementing effective age assurance or other basic safeguards.
Because Imgur had no effective way of determining users’ ages, the ICO concluded that children were exposed to potentially harmful and inappropriate content, including material relating to eating disorders, antisemitism, homophobia, and sexual or violent imagery.
The ICO emphasised that personal data often drives content recommendations, meaning that inadequate safeguards can significantly increase risks to children when platforms process their data unlawfully.
UK Information Commissioner John Edwards stated that Medialab failed in its legal duties to protect children, allowing them to use Imgur without effective age checks while collecting and processing their personal data.
The ICO made clear that age assurance measures play an important role in preventing children’s data from being used in ways that may cause harm, including through age-inappropriate content recommendations. The fine forms part of the ICO’s wider regulatory work to raise children’s online data protection standards, and the regulator warned that organisations which ignore the fact that children use their services can expect enforcement action.
In setting the penalty, the ICO took into account the number of children affected, the level of potential harm, the duration of the breaches, and Medialab’s global turnover. The ICO also noted Medialab’s acceptance of its provisional findings and its commitment to address the failings if Imgur becomes accessible again in the UK. Further regulatory action may follow if those commitments are not met.
This decision reinforces the ICO’s continued focus on children’s data and the Children’s Code (Age Appropriate Design Code). Organisations operating online services that are likely to be accessed by children should ensure that they implement effective age assurance measures and apply appropriate safeguards to children’s personal data in line with the Code.
Failure to do so may expose organisations to enforcement action, even where terms and conditions purport to restrict access by age.
For further information, please contact our data protection team.