AI: UK Parliament Private Member’s Bill proposed


On 22 November 2023, the Artificial Intelligence (Regulation) Bill was introduced to the House of Lords. Amongst other things, it proposes that the Government bring in legislation to create an AI Authority. The AI Authority would oversee how UK regulators are addressing AI, review current legislation to ensure its suitability and effectiveness in addressing AI, and support testbed and sandbox initiatives to help AI innovators get their technology to market. That review would include whether the legislation implements the AI principles listed in the Bill, which cover security, transparency and explainability, fairness, accountability, contestability and redress (provisions for which must have been implemented by regulation), testing and issues relating to bias. The Bill also proposes legislation requiring any business which develops, deploys or uses AI to have a designated AI officer to ensure that AI is used responsibly, as well as legislation designed to protect IP rights in AI training data, to require AI suppliers to provide health warnings to consumers, and to give the AI Authority the right to audit any business which develops, deploys or uses AI.

The Bill also offers a new definition of “AI”, namely, technology enabling the programming or training of a device or software to:

(a) perceive environments through the use of data;

(b) interpret data using automated processing designed to approximate cognitive abilities; and

(c) make recommendations, predictions or decisions;

with a view to achieving a specific objective.

This includes generative AI, defined as “deep or large language models able to generate text and other content based on the data on which they were trained.”

The Bill reflects some of the provisions of the Government’s March 2023 policy paper, “A pro-innovation approach to AI regulation,” which set out a framework, underpinned by five cross-sector principles (safety, transparency, fairness, accountability, contestability and redress), aimed at guiding UK regulators’ responses to AI risks and opportunities. UK regulators would be expected to address these key issues using their existing powers. Like the paper, the Bill requires regard to be had to certain AI principles, in this case by the AI Authority. However, the Bill also seeks to address some of the concerns that have been raised by Parliamentary Committees since publication of the Government’s paper.

For example, some consider that AI risks need to be addressed in new UK legislation as a matter of urgency, as has been the approach in the US and the EU. In the March 2023 policy paper, the Government did commit to introducing a statutory obligation on UK regulators to have due regard to the cross-sector principles “when parliamentary time allows”. In its interim report published in August 2023, Parliament’s Science, Innovation and Technology Committee called on the Government to introduce that proposed legislation in the King’s Speech in November, it being the last chance to do so before the General Election. The Committee highlighted concerns that other jurisdictions such as the EU and the US would otherwise steal a march, and that their laws and frameworks would become the default even if they are less effective. The King’s Speech did not introduce such legislation and, in its response to the Committee’s report published on 16 November (previously reported by Wiggin), the Government explained why it chose not to legislate at this time. The publication of this Private Member’s Bill would appear to be a message to the Government that its delay is unacceptable to the House of Lords.

However, Private Members’ Bills rarely make it onto the statute book, and this is even less likely where a Bill addresses an area on which the Government is already focused.
