Introduction

The UK government is navigating complex challenges in balancing the interests of AI developers and the creative industries, particularly concerning AI training on copyrighted works. As legislative deadlines approach, the government faces pressure to address AI and copyright issues, with significant implications for the country’s cultural and economic landscape.

Description

Peter Kyle, who oversees tech policy in the UK government [4], faces increasing pressure to legislate on AI and copyright as a statutory timeframe for action approaches [4]. In late 2024 [1], the government proposed a significant policy shift that would allow AI developers to train models on copyrighted works by default unless rights holders opt out [1]. This proposal has sparked criticism from creative professionals [1], including over 1,000 musicians [1], such as Annie Lennox and Damon Albarn, who recorded a silent protest album against what they perceive as an erosion of artistic rights. Prominent figures in the industry, including Sir Paul McCartney and Sir Elton John [8], have also voiced their concerns, warning that these changes could undermine artists’ rights and threaten the future of the creative sector [8], which is vital to the UK’s cultural and economic landscape [8]. The Creative Rights in AI Coalition [2], which includes the Society of Authors and the Publishers Association [2], has criticized the proposal, arguing that AI developers should use copyright-protected works only with explicit permission from rights holders [2].

The recently enacted Data (Use and Access) Act modernizes the country’s data governance and infrastructure [7], impacting various sectors [7], including healthcare and AI [7]. While the Act aims to streamline data sharing and update the UK’s GDPR [7], it has faced criticism for weakening restrictions on automated decision-making and neglecting copyright issues related to AI training [7]. Under current UK law [1], copyright protection is automatic upon creation [1], granting creators exclusive rights to their works [1]. AI developers currently need permission from copyright holders to use their works for training [1], with unauthorized use constituting infringement [1]. The proposed opt-out scheme raises concerns about the practicality of tracking unauthorized use in a digital landscape [1].

Content creators [4] [5], including publishers [4], authors [2] [4], and musicians [4], are concerned that their copyrighted works are being used without permission to train AI models [4]. They seek greater transparency and control over the use of their content [4], along with fair remuneration [4]. The government supports a model allowing AI developers to use copyrighted content by default unless rights holders opt out [7], similar to the EU’s text and data mining exception [7]. This has raised alarms among creators [7], who fear losing control over their works [7]. In contrast [4], AI developers argue that their activities do not infringe copyright and advocate for reduced restrictions on data access [4], emphasizing the potential benefits of AI for economic and social outcomes [4].

The UK government aims to balance the interests of both AI developers and the creative industries [4]. It launched a consultation on AI and copyright [4], signalling a preference for legislative changes that would allow AI training on copyrighted material unless copyright owners opt out, supported by new transparency obligations [4]. However, repeated attempts to introduce transparency requirements for AI developers regarding the datasets used for training were rejected [7], with the government arguing that such measures could hinder innovation [7]. The consultation received over 11,500 responses [4], and the government has yet to respond formally [4].

Parliamentary discussions on the Data (Use and Access) Bill [4] [7] [8] revealed a divide between the House of Commons and the House of Lords, with peers proposing amendments for enhanced copyright protections [4]. These amendments were resisted by MPs, who preferred a comprehensive approach to AI-related copyright law reform [4]. Although the Bill did not include significant copyright reforms [4], it mandates the government to advance discussions on the topic [4].

Following the passage of the Data (Use and Access) Act on 19 June 2025, the government has nine months to publish an economic impact assessment of various copyright reform options and an AI copyright report [4], considering the implications for both copyright owners and AI developers [4]. The Act also establishes enforcement provisions for copyright protection in AI development, involving regulatory oversight [6], and expands its scope to encompass AI systems developed outside the UK [6]. Two industry working groups will be established to address transparency in AI training and technical solutions for rights holders [4]. A parliamentary group will also be formed to aid in developing UK AI copyright policy [4].

Government ministers emphasize the need for thorough consultation and working group input before making decisions on copyright reform [4]. Legislative changes are not guaranteed [4], with a focus on ensuring practical solutions that enhance rights holder control and provide legal clarity for AI developers [4]. Potential new laws may include transparency measures regarding copyright works used in AI training [4], though concerns exist about delays affecting rights holders’ ability to track their works [4].

On enforcement [4], the government will not take an active role [4], leaving it to copyright owners and prosecuting authorities [4]. Current UK copyright law permits text and data mining for non-commercial research [7], but the Data Act does not enhance protections or clarify enforcement mechanisms [7]. Discussions on remuneration aim to facilitate licensing of copyrighted material by AI companies [4], particularly for those not part of collecting societies [4]. The Copyright Licensing Agency is developing a new gen-AI training license to ensure fair compensation for creators while providing legal certainty for AI developers [4].

Evidence has emerged showing that commercial AI systems have been trained on unlicensed music [7], with instances of AI-generated content closely resembling copyrighted works [7]. This raises significant legal questions about compliance with existing copyright laws [7], as AI developers may not be adhering to licensing requirements [7]. A major legal case is set to begin in June 2025 [1], with Getty Images suing Stability AI for allegedly using over 12 million copyrighted images without permission for its AI model [1]. This case will test the application of UK copyright law to AI training [1]. External factors, such as ongoing legal disputes [1] [2] [3] [4] [7] and global developments in the political, legal, technical, and commercial spheres [2] [4] [7], are likely to shape UK AI copyright policy [4]. Reform may occur in two distinct phases [4].

In addition, there is a strong push for the UK to revise its laws to enhance the participation of its creative and tech sectors in the global market. A broad exception for AI and text and data mining (TDM) is recommended [3], allowing temporary copies of copyrighted works for AI training [3]. It is emphasized that the extraction of uncopyrightable elements [3], such as facts and ideas [3], should remain lawful [3]. There is also significant advocacy for the protection of noncommercial AI research [3], asserting that academic institutions and researchers should not encounter legal obstacles when utilizing copyrighted works for training AI models in research contexts [3]. Additional licensing requirements could impose substantial burdens on these institutions [3], which already incur significant costs to access research materials [3]. The government has indicated plans for a comprehensive AI bill by May 2026 [7], which will revisit transparency and opt-out mechanisms [7], highlighting the ongoing tension between fostering AI innovation and protecting the rights of creators in a rapidly evolving technological landscape [7].

The UK Intellectual Property Office (IPO) has proposed that rights holders be able to reserve their rights [2], allowing authors to control the use of their works by AI model developers during training [2]. If an AI developer disregards an author’s decision [2], the rights holder can enforce their copyright [2]. If the UK proposal’s option 4 is approved [2], it would align more closely with Article 4 of the EU’s Digital Single Market (DSM) Copyright Directive [2], which provides a text and data mining exception, including for commercial uses, covering databases and copyright-protected works [2]. However, the EU’s exception has limitations [2], and the criteria for a valid reservation of rights to opt out remain unclear [2]. The European Union Intellectual Property Office (EUIPO) has acknowledged the lack of a universal solution for copyright holders in its study on GenAI and copyright [2]. A survey indicated that only a small percentage of signed artists find their streaming revenue satisfactory [2], highlighting challenges in rights recovery [2]. The UK government aims to learn from the EU’s implementation challenges [2].

The UK IPO suggests that works made available online should be reserved in effective, machine-readable formats [2], recognizing the technical challenges involved [2]. Clear labeling of AI-generated material is crucial for distinguishing synthetic content from human-created works [2], ensuring transparency and accountability [2]. The rise of deepfake technology raises concerns about misinformation and identity fraud [2], necessitating stricter regulations and ethical guidelines [2]. A structured licensing approach for AI-generated works could facilitate fair compensation for creators whose original content informs AI outputs [2], promoting a more equitable digital ecosystem [2]. Pending the consultation outcome [2], significant legislative reform may focus on expanding the ‘text and data mining’ exception to copyright infringement [2]. A formal decision date has not been announced as the UK IPO reviews consultation responses [2].
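To make the idea of a machine-readable reservation concrete, the following Python sketch shows how an AI crawler could check for an opt-out signal before collecting a page for training. It is a minimal illustration only: it assumes a TDMRep-style well-known file ("/.well-known/tdmrep.json", as proposed by the W3C Text and Data Mining Reservation Protocol community group) alongside an ordinary robots.txt check; the UK consultation has not settled on any particular mechanism, and the file location, field names, and bot name used here are assumptions for illustration.

import json
import urllib.robotparser
from urllib.parse import urljoin
from urllib.request import urlopen

# Hypothetical crawler identity used for the robots.txt check (illustrative only).
BOT_NAME = "example-ai-training-bot"

def tdm_reservation_declared(site_root):
    """Return True if the site publishes a machine-readable TDM reservation.

    Assumes a TDMRep-style JSON file at /.well-known/tdmrep.json listing
    path-scoped policies; absence of the file is treated as no reservation.
    """
    try:
        with urlopen(urljoin(site_root, "/.well-known/tdmrep.json"), timeout=10) as resp:
            policies = json.load(resp)
    except Exception:
        return False
    # In TDMRep, "tdm-reservation": 1 means rights in that location are reserved.
    return any(entry.get("tdm-reservation") == 1 for entry in policies)

def may_collect_for_training(site_root, page_url):
    """Combine the conventional robots.txt check with the reservation signal."""
    robots = urllib.robotparser.RobotFileParser(urljoin(site_root, "/robots.txt"))
    robots.read()
    return robots.can_fetch(BOT_NAME, page_url) and not tdm_reservation_declared(site_root)

if __name__ == "__main__":
    print(may_collect_for_training("https://example.com", "https://example.com/article"))

A site-level signal of this kind only covers works accessed at their original location; reservations that survive redistribution of the work itself (for example, embedded metadata) remain part of the technical challenge the IPO acknowledges.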

Conclusion

The ongoing debate over AI and copyright in the UK underscores the tension between fostering technological innovation and safeguarding the rights of creators. The government’s approach, including consultations and legislative proposals, aims to strike a balance that supports both AI development and the creative sector. The outcomes of these efforts will have significant implications for the UK’s cultural and economic landscape, influencing the future of both industries in a rapidly evolving digital world.

References

[1] https://taylorhampton.co.uk/ai-copyright-uk-opt-out-proposal/
[2] https://www.lexology.com/library/detail.aspx?g=e3394b01-f6f8-477e-8020-80dab7e92bc8
[3] https://www.authorsalliance.org/2025/03/07/updates-on-ai-copyright-law-and-policy-section-1202-of-the-dmca-doe-v-github-and-the-uk-copyright-and-ai-consultation/
[4] https://www.pinsentmasons.com/out-law/analysis/ai-and-copyright-post-data-bill-uk-timeline-2026
[5] https://www.alt.ac.uk/news/all_news/alt%E2%80%99s-response-government-copyright-and-ai-consultation
[6] https://digitalpolicyalert.org/event/31126-data-use-and-access-bill-requiring-inquiry-into-use-of-copyright-works-in-development-of-ai-systems-received-royal-assent
[7] https://www.forbes.com/sites/virginieberger/2025/06/25/transparency-deferred-what-the-uks-data-bill-means-for-music-ai-and-copyright/
[8] https://londondaily.com/uk-government-faces-backlash-from-artists-over-ai-and-copyright-proposals