Artificial Intelligence and the music industry: APPG on Music publishes report


The All-Party Parliamentary Group on Music (“APPG”) has published a report on artificial intelligence and the music industry, making a series of recommendations aimed both at taking advantage of the opportunities offered by AI and at confronting “the danger that unfettered developments in AI could pose to the UK’s musicians and music businesses”.

The report addresses four themes: (1) consumer protection; (2) fair market access; (3) voice and image likeness; and (4) international action. Taking each in turn, the APPG cites research conducted by UK Music which found that the vast majority of adults polled were in favour of music created by AI being labelled as such, and that there was concern about unknowingly listening to AI-generated music. In response, the APPG has recommended that all AI-generated music be labelled – whether within the metadata or by other means – as a matter of consumer protection, and that the Consumer Rights Act 2015 be amended to impose such a requirement. To facilitate adequate labelling, the APPG has also recommended that the Government introduce “a standalone obligation for AI developers and those using large language models to comply with record keeping requirements for all data sets used for ingestion, not solely limited to personal data”.

Under the heading of ‘fair market access’, the APPG expresses concern (reflected in UK Music’s polling) about AI applications generating music using others’ works without permission. Notwithstanding the cases currently before the courts on the applicability of copyright law to generative AI, the APPG recommends that the Government promote compliance with copyright law, and that anyone wishing to use music protected by copyright be required to obtain express permission to do so. Alongside this, the report recommends additional initiatives to educate music creators and artists on their rights.

Deepfakes are another area of concern for the APPG. We have previously written about the limits of existing legislation in protecting against the abuse of others’ likenesses through deepfakes here. The APPG echoes these concerns and states that “unambiguous legislation that protects creators and artists from misappropriation and false endorsement would provide clarity and certainty for all involved”. It cites action taken in this regard in the United States, including the ELVIS Act in Tennessee (on which we reported here), and goes so far as to suggest that the Government should introduce a ‘specific personality right’ which would protect the voice, image, name and likeness of creators and artists.

The APPG recognises the difficulty of addressing any of these matters at a purely domestic level, given that AI developers and their operations may well be based in other jurisdictions. Invoking the analogy of food or pharmaceutical standards, it recommends that the Government introduce specific standards for large language models (“LLMs”) “which operate and generate revenue in the UK as a condition for market access. As part of such standards, LLMs would need to comply with UK copyright provisions, notwithstanding whether their services or goods would have been created in compliance with the local rules of a third-party jurisdiction”. Finally, underpinning all these recommendations, the APPG concludes that the Government should pass a ‘pro-creative industries AI Bill’.

To read the report in full, click here.