Thinking of using AI in your business?

Here are 5 practical considerations when it comes to data protection

Most commentary on AI focuses on the big picture or the future of work. In this article we look instead at some practical data protection implications for businesses that are considering implementing AI systems.

1. Automated Decision-Making

There are specific rules in both the EU and UK GDPR (referred to together in this article) covering individuals’ rights where processing involves solely automated decision-making, including profiling. These rules specify what information businesses must provide to individuals about the automated decision-making (often via a privacy notice) and individuals’ rights in relation to any automated decisions made about them (e.g. the right to challenge a decision, or the right not to be subject to automated decisions which have a significant impact on them).

The GDPR requires businesses to provide individuals with “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.” This can be challenging given the complexity of AI algorithms, and a technically exhaustive explanation may not be intelligible to individuals in any event. However, regulators do not expect businesses to provide a complex explanation of how the AI system works. Instead, the information provided should be sufficiently comprehensive for the individual to understand the reasoning behind the automated decision.

In addition, the GDPR requires businesses to implement suitable safeguards when processing personal data to make solely automated decisions. The most important of these is meaningful human intervention. Businesses must therefore ensure that they have appropriate procedures in place to support meaningful human review of any AI-generated automated decisions. This includes designing a system, and developing a training programme, that allow employees to address, escalate and, if required, override an automated decision.
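For readers building such a system, the escalate-and-override safeguard can be sketched in code. This is a minimal, hypothetical illustration of the pattern only; the names (Decision, route_decision, human_override, the 0.9 threshold) are our own assumptions, not drawn from any regulation or guidance.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str             # e.g. "approve" / "decline"
    confidence: float        # model's confidence in its own outcome
    reviewed_by_human: bool = False

def route_decision(decision: Decision, review_queue: list, threshold: float = 0.9) -> Decision:
    """Escalate low-confidence automated decisions to a human review queue.

    Decisions below the confidence threshold must be confirmed or
    overridden by a trained employee before they take effect.
    """
    if decision.confidence < threshold:
        review_queue.append(decision)
    return decision

def human_override(decision: Decision, new_outcome: str, reviewer: str) -> Decision:
    """Record a human reviewer's override of an automated outcome."""
    decision.outcome = new_outcome
    decision.reviewed_by_human = True
    # In practice the reviewer's identity and reasons would be logged
    # to evidence meaningful human intervention.
    return decision
```

In a real deployment, the review queue and override log would also serve as evidence for the accountability obligations discussed later in this article.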

2. Data Protection Impact Assessment

In many cases, the use of an AI system by a business will trigger the need for a data protection impact assessment (DPIA) to help identify and minimise the data protection risks associated with the project. A DPIA is mandatory if the type of processing is likely to result in a high risk to the rights and freedoms of individuals, a threshold which many AI systems are likely to meet. Indeed, guidance published by both the UK and EU regulators states that in the vast majority of cases the implementation of AI technology will trigger the need for a DPIA.

It is also worth remembering that businesses may need to consult a data protection supervisory authority prior to any processing where the DPIA indicates that the processing undertaken by the AI system would result in a high risk to individuals, and those risks cannot be mitigated.

3. Data Minimisation

The use of large amounts of data is central to the development and use of AI systems, putting the GDPR’s data minimisation principle to the test where that data includes personal data. Businesses will need to consider how to ensure they only process personal data that is ‘adequate, relevant and limited’ to what is necessary to develop and operate the AI system.

To achieve this, the business should implement data minimisation practices and procedures from the very outset of the AI system design phase. Additionally, to the extent that an AI system is provided or operated by a third party, the business should factor in data minimisation as part of the procurement process (see Vendor Due Diligence below).

As the training of an AI system requires a large amount of data, the French supervisory authority has commented that it may be possible for a business to use fictitious data which has the same value as real data but is not linked to any individual person. Such synthetic data would not constitute personal data. However, there are limitations to the use of synthetic data, given that it may produce self-reinforcing results. There are also potential copyright risks in using synthetic data to train AI systems (which are likely to increase if the draft EU AI Act comes into force).
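To make the synthetic-data idea concrete, here is a deliberately simplified toy sketch: fictitious values are drawn from the statistical shape (mean and spread) of a real column, so no individual's record is copied. This is our own illustration of the concept, not a recommended generation method; production synthetic data uses far more sophisticated techniques and still requires careful re-identification risk analysis.

```python
import random

def synthesize_ages(real_ages: list, n: int, seed: int = 0) -> list:
    """Generate n fictitious ages mimicking the mean and spread of the
    real column, without reproducing any individual's value directly."""
    rng = random.Random(seed)  # seeded for reproducibility
    mean = sum(real_ages) / len(real_ages)
    variance = sum((a - mean) ** 2 for a in real_ages) / len(real_ages)
    std_dev = variance ** 0.5
    # Draw from a normal distribution fitted to the real data,
    # clamping at zero since negative ages are meaningless.
    return [max(0, round(rng.gauss(mean, std_dev))) for _ in range(n)]
```

The limitation noted above is visible even in this toy: the synthetic column can only reflect patterns already present in the source data, so biases in the original are reproduced and may be reinforced.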

4. Transparency

The transparency requirements of the GDPR create a number of overarching legal obligations governing how a business makes public its collection and use of personal data. To meet these obligations, the GDPR sets out the information that an individual must be provided with, depending on whether the personal data has been collected directly from that individual or by other means (e.g. using data lists from third parties). A privacy notice is the most common way for businesses to provide clear and detailed information to individuals about what they are doing with personal data.

As the design, training and implementation of an AI system will result in a new form of processing, businesses will need to update their privacy notices to reflect this new processing activity, together with information about its purpose and the lawful basis relied on.

5. Vendor Due Diligence

For most businesses, an AI system is likely to be provided by a third party, which means vendor due diligence will play a crucial role in determining whether implementing, developing and maintaining a third-party AI system will comply with the business’s GDPR obligations. If during the due diligence process the third-party vendor fails to satisfy queries raised about its compliance with the GDPR, then the business may have to opt for a different solution or offset the risk in its terms of business. 

Additionally, as new legislation (such as the upcoming EU AI Act and the UK Data Protection and Digital Information (No. 2) Bill) comes into force, and as new compliance considerations arise during the course of an AI system’s deployment, the business should undertake regular reviews of the third party’s service and, where necessary, modify the terms of the service or switch to another provider if the AI system is no longer compliant with the GDPR. Continuous diligence is crucial for businesses to demonstrate that they are complying with their accountability and governance obligations under the GDPR.

 

Useful Regulatory Guidance:

New guidance is published regularly, so make sure you keep an eye out for it!

Disclaimer

This information is for educational purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. © Shoosmiths LLP 2024.
