
Considering artificial intelligence – what does it mean for data protection?

As artificial intelligence (AI) becomes increasingly embedded in business practices, there are more reasons than ever to build data protection firmly into the planning process.

The Information Commissioner’s Office (ICO) states that artificial intelligence is a ‘high risk’ practice, and it is currently classed as a priority area. As the use of AI continues to grow, so does the risk to data privacy.

In recent years, the ICO has published thought pieces related to AI, from the use of facial recognition through to the impact on the Children’s Code, and guidance is available for organisations wishing to develop their own use of AI.

How can AI be used safely under data protection legislation? 

The ICO recommends following its guidance when incorporating AI, including the points below.

  • AI is considered high risk – ensure appropriate consultation with user groups in the at-risk categories, as well as experts in the field, to properly ascertain whether this is the right route for your organisation. If artificial intelligence is the most suitable option, undertaking a data protection impact assessment (DPIA) will help identify and minimise any risks of non-compliance with data protection legislation. In some cases, a DPIA is a legal requirement.
  • Be clear, open and honest about the use of AI and ensure it is well explained, particularly to those whose data may be used by the system. It is difficult to explain the decisions made about the use of data – a requirement under the legislation – when those decisions are made by machine learning.
  • Only use the amount of data you need for the system to run and ensure all data used is adequate, accurate, relevant and limited. Consider incorporating recognised techniques that protect data, such as pseudonymisation, into the process, and carefully select those that are best practice for your system development (an illustrative sketch follows this list).
  • Address the risks of bias and discrimination. When developing your AI processes, manage the potential for data protection issues or non-compliance with legislation, and consider how these might affect the individuals whose data you hold as well as your organisation as a whole.
  • AI development is a time-consuming process – dedicate appropriate resources to putting the data protection element firmly in place. Clear lines of accountability, together with regular reviews and process auditing, will help keep individuals’ data safe.
  • Secure your system appropriately to protect your data. Every organisation is legally required to implement appropriate technical and organisational security measures throughout any process, and this applies equally to your AI development.
  • Consider the decision-making process and whether a decision is most appropriately made by a human or automated. Individuals have the right to request human review of decisions with legal or similarly significant effects, so ensure your team is appropriately trained and sits at the right level of the organisation for such decision-making.
  • Choose third-party suppliers carefully when building your AI processes. As the data controller, your organisation remains responsible for ensuring data protection is securely in place and compliant with legislation.
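
As an illustration of the data minimisation point above, the sketch below shows one way a record might be reduced to only the fields a system actually needs, with its direct identifier pseudonymised, before it reaches an AI pipeline. It is a minimal Python example – the field names, key handling and pipeline shape are assumptions for illustration, not an approach prescribed by the ICO guidance.

```python
# Illustrative sketch of data minimisation and pseudonymisation before
# records reach an AI pipeline. Field names and key handling are
# hypothetical, not taken from the ICO guidance.
import hashlib
import hmac

# Only the fields the system genuinely needs (adequate, relevant, limited).
FIELDS_NEEDED = {"age_band", "postcode_area", "product_interest"}


def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash so records can be
    linked internally without exposing the raw identifier to the model."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


def minimise_record(record: dict, secret_key: bytes) -> dict:
    """Keep only the fields the system needs and pseudonymise the ID."""
    reduced = {k: v for k, v in record.items() if k in FIELDS_NEEDED}
    reduced["subject_ref"] = pseudonymise(record["customer_id"], secret_key)
    return reduced


if __name__ == "__main__":
    raw = {
        "customer_id": "CUST-0042",
        "full_name": "Jane Doe",        # not needed by the system: dropped
        "email": "jane@example.com",    # not needed by the system: dropped
        "age_band": "35-44",
        "postcode_area": "LS1",
        "product_interest": "home insurance",
    }
    # The key should be generated, stored and rotated securely in practice.
    print(minimise_record(raw, secret_key=b"replace-with-a-securely-stored-key"))
```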

Whilst the uptake of AI continues to grow and there is broad recognition that such technology is intended to benefit society, many steps must be taken to ensure any data it holds or uses is well protected. Following the available guidance, and making sure all staff are aware of it, is a straightforward starting point for staying current with AI.

BLS Stay Compliant has over 50 years of combined experience with data protection legislation. If we can assist you with training or bespoke work as you develop AI for your organisation, please do get in touch.
