Introduction

The intersection of artificial intelligence (AI) and the creative industry has become a focal point of discussion, particularly in the context of copyright law and the implications of AI-generated content. This debate has been highlighted by recent events in the film industry and ongoing legislative discussions in the UK, where concerns about the balance between technological advancement and the protection of creative rights are being actively addressed.

Description

AI has emerged as a pivotal topic in the film industry, most visibly at the Oscars on 2 March 2025, where filmmakers discussed the implications of AI-generated content [1]. Notable examples include the use of the AI voice tool Respeecher to refine accents in “The Brutalist” and AI techniques used to enhance stunt performances in “A Complete Unknown” [1]. These cases have ignited a broader conversation within the industry about the role of AI and its impact on traditional filmmaking processes [1].

In the UK, the creative sector, valued at over £120 billion annually and employing millions of people, has voiced significant concerns over the government’s proposed changes to copyright law, particularly in relation to the Copyright and AI consultation [1] [2] [3] [5] [7] [8] [9]. The Labour government’s suggestion of an opt-out mechanism for rights holders has been met with skepticism, as industry leaders argue it could lead to widespread content theft and favor AI developers at the expense of human creators. More than 1,000 musicians, including Elton John, Dua Lipa, and Sir Paul McCartney, have protested against these proposals, asserting that they threaten their livelihoods and undermine the growth of the creative sector [1] [2] [5] [7] [8]. Polling indicates that a substantial majority of the public believes AI companies should pay royalties to the creators of content used to train AI models [7]. Critics contend that the rationale behind the government’s approach is flawed, as the AI industry relies heavily on access to copyrighted works for training, often without proper licensing [2].

The proposed changes include a “rights reservation” system, which would require creators to actively opt out of having their work used for AI training, placing an unreasonable burden on individual creators to manage their rights against numerous AI service providers [8]. This system raises concerns about the imbalance of resources between independent publishers and AI/tech companies, potentially destabilizing the business model of publishing [6]. The Institute of Practitioners in Advertising (IPA) has argued that expanding the text and data mining (TDM) exception under UK copyright law could unfairly shift the burden of copyright protection onto rights holders and favor AI developers, and it advocates for technology that makes it easier for rights holders to protect their works [4].

Currently, the legal framework allows a narrow TDM exception for non-commercial research, provided researchers have lawful access to the works [1]. The creative industry contends, however, that the existing requirements for commercial AI model training are overly restrictive and lack transparency, complicating the enforcement of copyright protections [1]. The notion that artists can “opt out” of AI training is criticized as impractical, given the vast amounts of data AI models scrape from the internet [2]. The government’s emphasis on the ability to opt out is viewed as inadequate, prompting calls for a mandatory opt-in licensing system that would require AI developers to obtain explicit permission from creators [2]. The IPA emphasizes the need for robust, independently auditable measures to ensure compliance, along with transparency from AI developers about the works used in their systems [4].

In response to these growing concerns, government ministers are seeking feedback on a proposed text and data mining exception for generative AI developers that could bypass existing copyright laws [9]. The consultation comes amid a surge of AI products and tools that give users access to publishers’ content, including subscription-based material, without authorization or compensation [9]. The IPA has cautioned against overly simplistic measures that could lead to unintended consequences, stressing the challenge of balancing support for both the AI and creative industries [4].

The UK government is contemplating legislative changes aimed at balancing the interests of rights holders and AI developers, with a focus on transparency about the data used to train AI models [1]. Proposed amendments would require AI developers to disclose information about their training datasets, helping to ensure compliance with copyright law [1]. There remains significant skepticism within the creative community about the effectiveness of these measures, particularly the potential for a broad data mining exception that could undermine rights holders’ control over their works [1]. Concerns have also been raised about the financial sustainability of artists, especially emerging musicians who rely on royalties, with critics arguing that AI-generated music could devalue human creativity and limit artists’ revenue streams [2].

The artistic community has mobilized against the government’s plans, with campaigns such as ‘Make It Fair’ urging public support for stronger intellectual property protections [2]. The campaign, backed by prominent UK writers and musicians, seeks to prevent tech companies from using creative content without compensation [5]. The ongoing debate over copyright in the age of AI emphasizes the integrity of human artistry and the need for fair compensation for creators, challenging the narrative that copyright laws are hindering technological advancement in the AI sector. The IPA further asserts that digital likenesses of individuals should not be used without their consent or in a misleading manner, and that clarity is needed on copyright in computer-generated works so that advertising agencies can confidently navigate these new challenges [4].

The consultation also explores whether AI-generated outputs should receive copyright protection, amid existing uncertainty about whether human authorship is required for such protection [7]. Most jurisdictions, including the US and EU member states, do not extend copyright to works generated without human authorship [6] [7]. The government has indicated a willingness to remove the UK’s existing provision protecting computer-generated works if the consultation does not yield sufficient evidence of its necessity [7]. Upcoming parliamentary votes in March will address AI regulation provisions aimed at reinforcing copyright protections [9].

The Professional Publishers Association (PPA), as part of the Creative Rights in AI Coalition, has collaborated with other sectors to promote fair treatment of creative content in AI development, arguing that generative AI companies should not exploit publishers’ content for commercial gain without fair compensation [4] [9]. The PPA calls for mandatory transparency in AI development, requiring firms to disclose data sources and comply with copyright law [9]. It also calls for a statutory regulator to enforce compliance and address violations, ensuring that publishers retain the right to pursue legal action against AI companies that misuse their content [9]. It further argues that AI and search engine companies should not impose unfair conditions on publishers by tying content visibility to AI training, and it urges competition regulators to prevent anti-competitive practices by ensuring that AI firms obtain explicit permission and fair licensing agreements for content use [9]. Without adequate regulation, the PPA warns, the quality of AI-generated content may decline, since it relies on high-quality original material for training [9]. The government will review all feedback received and publish a formal response outlining next steps, although no timeline for this announcement has been given [9].

Conclusion

The ongoing discourse surrounding AI and copyright law underscores the critical need to balance innovation with the protection of creative rights. The potential for AI to transform the creative industry is immense, yet it also poses significant challenges to traditional business models and the livelihoods of creators. As the UK government considers legislative changes, an emphasis on transparency, fair compensation, and robust regulatory frameworks will be crucial to ensuring that the interests of both AI developers and rights holders are addressed equitably [1] [2] [3] [7] [8] [9]. The outcome of this debate will have far-reaching implications for the future of creativity and technology.

References

[1] https://www.jdsupra.com/legalnews/ai-takes-centre-stage-at-the-oscars-as-3904636/
[2] https://www.forbes.com/sites/virginieberger/2025/02/28/how-the-uks-ai-copyright-exception-hands-creators-work-to-big-tech-for-free/
[3] https://www.musicweek.com/labels/read/uk-creative-industries-launch-make-it-fair-campaign-to-highlight-risks-from-government-s-ai-policy/091481
[4] https://ipa.co.uk/news/government-consultation-on-copyright-and-ai/
[5] https://inews.co.uk/news/new-laws-ai-uk-artists-writers-work-stolen-mps-warn-3554088
[6] https://www.publishingscotland.org/2025/03/ai-copyright-consultation-response/
[7] https://www.lexology.com/library/detail.aspx?g=cdb078ee-d069-4d1e-9713-158099fa5b4c
[8] https://theweek.com/tech/ai-freedom-vs-copyright-law-the-uks-creative-controversy
[9] https://ppa.co.uk/ppa-response-to-the-government-consultation-on-ai-and-copyright