Guidance on responsible AI in recruitment: helping to navigate data protection compliance

read time: 4 mins

The Department for Science, Innovation and Technology (DSIT) has issued new guidance titled ‘Responsible AI in Recruitment’ (the guidance). The guidance aims to mitigate the potential ethical risks associated with the use of artificial intelligence (AI) in recruitment and hiring processes, and discusses data protection compliance issues. 

Whilst the guidance is not legally binding, it is intended to help organisations align their AI systems with the UK Government's AI regulatory principles. These principles include ‘fairness’, ‘accountability and governance’ and ‘appropriate transparency and explainability’, and are broadly consistent with the data protection principles under the UK GDPR. 

This article focuses on the key data protection takeaways from the ‘Responsible AI in Recruitment’ guidance. 

Examples of AI recruitment technologies 

AI can be utilised in various ways as part of the recruitment process: from ‘sourcing tools’ such as software which recommends candidates for particular roles or helps generate job descriptions, to ‘screening tools’ such as those that screen and score applicant CVs or provide AI-powered psychometric testing. 

It is also possible to use AI during the interview itself, to transcribe the interview or even to detect candidate behaviours and qualities.  

AI-powered chatbots are another example. These may be used to support candidates throughout the recruitment process and to answer frequently asked questions.

So what can we learn from the guidance about using these tools in a data protection compliant way?

Impact assessments 

The guidance recommends that organisations consider implementing assurance mechanisms such as impact assessments. Completing an impact assessment allows employers to properly identify the associated risks and to explore whether these can be satisfactorily mitigated. The guidance identifies different kinds of impact assessment, including:

  • Algorithmic impact assessments: which consider the potential short- and long-term impacts of an AI system. 
  • Equality impact assessments: which assess equalities outcomes.

The guidance also flags the potential requirement for employers to carry out a data protection impact assessment (DPIA) under UK data protection law. A DPIA is required for any data processing activity that is likely to result in a high risk to the rights and freedoms of individuals. Therefore, it is easy to see why many AI recruitment tools will also require a DPIA. 

An algorithmic impact assessment should assess various impacts, including accessibility, bias and data protection. Creating an algorithmic impact assessment which assesses data processing risks in line with DPIA requirements will therefore be a sensible way for organisations to streamline the impact assessment process and ensure that they properly assess and mitigate all relevant risks.

Solely automated decision-making 

The guidance also notes that organisations must consider whether AI technologies result in solely automated decision-making which has a legal or similarly significant effect on individuals. This is because under the UK GDPR, individuals have the right not to be subject to solely automated decision-making of this kind unless an exception applies. 

‘A legal or similarly significant effect’ is not defined in the UK GDPR. However, an AI tool used without human intervention which discriminates against particular categories of individuals and prevents them from progressing their applications will have a ‘similarly significant effect’ on those individuals, and will be caught by the UK GDPR restriction. 

The exceptions to this restriction are where the decision is:

  1. Necessary for the entry into or performance of a contract.
  2. Authorised by domestic law.
  3. Based on the individual’s explicit consent.

However, these exceptions apply narrowly. Using automated decision-making to make the recruitment process more efficient would not be considered necessary for entry into an employment contract. Additionally, explicit consent is unlikely to be appropriate in these circumstances, due to the imbalance of bargaining power between a prospective employer and an applicant. 

In any event, even where an exception does apply, UK data protection laws require organisations to take steps to prevent errors, bias and discrimination. Additionally, individuals must be properly informed about the automated decision-making and be able to request human intervention or challenge an automated decision.

Third party processor due diligence 

The guidance also advises on thorough evaluation of AI tools, including verifying supplier claims and understanding the tool's intended function and integration with existing systems. 

This evaluation should form a fundamental part of the due diligence process when engaging any new third party data processor to provide AI recruitment tools, alongside assessing items such as their privacy practices, international data transfers and security measures. 

Utilising the guidance in recruitment

Protecting the privacy rights of job applicants is a central part of using AI responsibly within recruitment. As well as discussing data protection risks, the guidance also covers a wide range of ethical, equality and accessibility risks, together with the importance of assessing and mitigating these. As a result, the guidance will be a useful resource for any employer considering new AI technologies to modernise its recruitment process. 

The guidance should be reviewed in conjunction with the various pieces of AI guidance published by the Information Commissioner’s Office (ICO), as well as the ICO’s draft guidance for employers and recruiters on recruitment and selection. Consultation on that draft guidance closed on 5 March 2024, and the input received will inform the final text of the ICO’s online resources.

For more information, please contact the data protection team.
