February 24, 2020
The guidance explains how data protection law applies to artificial intelligence (AI) and recommends organisational and technical measures to mitigate the risks AI poses to individuals. It also sets out a methodology for auditing AI applications and ensuring they process personal data fairly.
Aimed at both technology specialists developing AI systems and risk specialists whose organisations use AI systems, this guidance will help them assess the risks to rights and freedoms that AI can cause, and the appropriate measures organisations can implement to mitigate them.
The ICO says that it supports innovation and understands the benefits AI can bring as well as the risks. It wants to engage, educate and influence those innovating, to ensure data protection can be built into AI systems in practice.
This is the first piece of guidance published by the ICO that has a broad focus on the management of several different risks arising from AI systems, as well as governance and accountability measures. The ICO says that it is essential for the guidance to be both conceptually sound and applicable to real-life situations, as it will shape how the ICO regulates in this space. It says that feedback from those developing and implementing these systems is essential.
The ICO is seeking feedback both from those with a compliance focus, such as data protection officers (DPOs), general counsel and risk managers, and from technology specialists, including machine learning experts, data scientists, software developers and engineers, and cyber security and IT risk managers.
In March 2019, the ICO launched a call for views about its initial thinking in relation to auditing AI. Since then, its thinking has developed and it has established a more practical approach to the guidance. Organisations that have already engaged with the ICO should therefore feel free to provide feedback once again.
The consultation closes at 5pm on Wednesday 1 April 2020.