ICO launches the third chapter of its AI consultation series

On 12 April 2024 the ICO launched the third chapter of its AI consultation, this time in relation to the accuracy of generative AI models. The consultation focuses on the application of the accuracy principle to the outputs produced by generative AI models and the impact that the accuracy of training data has on the outputs.

The ICO states that developers of AI models should:

  • Know whether the training data is made up of accurate, factual and up-to-date information, historical information, inferences, opinions, or even AI-generated information relating to individuals;
  • Understand and document the impact that the accuracy of the training data has on the generative AI model outputs;
  • Consider whether the statistical accuracy of the generative AI model output is sufficient for the purpose for which the model is used, and how that impacts data protection accuracy; and
  • Clearly, transparently and concisely communicate the matters set out above to deployers and end users, to ensure that a lack of accuracy at the training stage does not result in negative impacts on individuals at the deployment phase.

In relation to training data, the ICO expects those who deploy generative AI to:

  • Consider how a potential lack of accurate training data and outputs could impact individuals and mitigate those risks ahead of deployment (e.g., restrictions on user queries, output filters);
  • Provide clear information about the statistical accuracy of the application and its intended use; and
  • Monitor how the application is used to inform and, if necessary, improve the information provided to the public and restrictions on the use of the application.

It is encouraging to see the ICO address this issue, challenging the indiscriminate use of information to develop models while acknowledging that accuracy is a flexible concept that depends on the purpose of the AI model.

The call for evidence can be found here.
