Clegg Warns Opt-in Copyright Could 'Kill' UK AI Industry
Is permission for AI data use a death knell for UK tech? Clegg's stark warning ignites copyright war.
May 26, 2025

Former Meta executive Nick Clegg has issued a stark warning: compelling technology companies to obtain explicit permission before using copyrighted material to train artificial intelligence models could prove fatal to the burgeoning UK AI industry.[1][2][3] Clegg, who recently stepped down as Meta's president of global affairs, argued that such a mandatory opt-in system is "unworkable" and "implausible" given the immense datasets already used to develop AI.[1][2] His intervention has intensified the already heated debate over the intersection of AI development and intellectual property rights in the United Kingdom, a nation aspiring to be a global AI superpower.[4]
Clegg's core argument, delivered at the Charleston Festival, centers on the practical and competitive implications of a strict opt-in regime.[2] He contended that AI systems are, by their nature, trained on vast quantities of publicly available data.[2] Seeking individual permission for every piece of content, whether retroactively or prospectively, would be a technologically and logistically insurmountable task.[2][5] "I just don't know how you go around, asking everyone first. I just don't see how that would work," Clegg stated.[2][5] Furthermore, he warned that if the UK were to impose such a requirement unilaterally while other nations did not, it would "basically kill the AI industry in this country overnight."[2][5] This, he suggested, would stifle innovation and drive AI development to jurisdictions with more permissive regulatory environments.[1][6] While acknowledging the "natural justice" in allowing creators to opt out of having their work used, he argued that requiring prior consent "collides with the physics of the technology itself."[2][5]
The debate is taking place against a backdrop of significant government consultation and parliamentary discussion regarding copyright law in the age of AI.[7][8][9] The UK government has acknowledged that the current legal framework is disputed and fails to meet the needs of either the creative industries or the AI sector.[7][8] This uncertainty, it is argued, is already undermining investment and innovation.[7][4] In December 2024, the government launched a consultation on proposals to change how UK copyright applies to AI development, which included considerations for an exception to copyright for text and data mining (TDM), potentially with an opt-out mechanism for rights holders.[10][8] This approach would be broadly similar to the EU's Digital Single Market Directive, which allows TDM but permits rights holders to reserve their rights.[7][8] However, the UK's current regime, which permits data mining only for non-commercial research, is considered more restrictive than both the US (which has a "fair use" doctrine) and the EU.[4][10] techUK, a trade association, has argued that a broad TDM exception would provide the strongest competitive advantage for the UK's AI sector, projecting significant economic benefits.[11][12] Conversely, it estimates that an overly restrictive licensing-only approach could cost the UK economy billions.[11]
The creative industries, however, have reacted with considerable alarm to proposals that might weaken their control over copyrighted material.[8][13] Prominent figures and organizations, including Sir Elton John, Sir Paul McCartney, the Society of Authors, and the Publishers Association, have voiced strong opposition to measures they fear would legitimize the unauthorized use of their work and jeopardize the livelihoods of millions employed in the UK's £120 billion creative sector.[1][2][13] They argue that AI developers should be required to obtain explicit permission and provide fair remuneration for the use of copyrighted content.[14][13][15] The "Make it Fair" campaign was launched to raise public awareness of what they describe as an existential threat from generative AI models scraping content without permission or payment.[13] Concerns have also been raised that an opt-out system places an unfair administrative burden on creators, particularly individual artists and small businesses, to constantly monitor and register their works.[16][17] Academics from the University of Cambridge have warned that an opt-out model tells British artists their creations are less valuable than tech industry profitability and that it is not in the "spirit of copyright law."[16][18] They also highlight the difficulty in enforcing opt-outs, especially for content already widely distributed or when metadata is stripped.[16][17]
The UK government finds itself in a challenging position, attempting to balance its ambition to be a world leader in AI with the need to protect its economically significant creative industries.[7][4][9] The government's consultation paper acknowledges the goal of ensuring rights holders can control and be remunerated for the use of their content, while also enabling AI developers to access high-quality data.[7] Technology Secretary Peter Kyle recently admitted "regret" over the government's initial indication of a preferred "opt-out" option, acknowledging it was not the way to bring both sides together and that the UK's copyright regime is "not fit for purpose" in the digital age.[19] He emphasized that a "workable solution" on transparency around training data will be foundational to the government's approach.[19] Ongoing legal disputes, such as the Getty Images v Stability AI case, further underscore the current legal uncertainty.[7][4] Meta itself has faced lawsuits from authors alleging their work was used without permission to train its AI models and has argued for "fair use" in US courts, a defense that UK legal experts suggest would not carry the same weight under current UK "fair dealing" provisions.[20][21][15] Meta has also engaged with the UK's Information Commissioner's Office regarding the use of public user posts from Facebook and Instagram for AI training, stating that this will help its models reflect British culture and that it is not using private messages or data from users under 18.[22][23]
Ultimately, Nick Clegg's intervention highlights the critical economic and innovation stakes involved in the AI copyright debate. His assertion that a mandatory opt-in would "kill" the UK AI industry frames the discussion in stark terms of international competitiveness.[1][2][5] As the UK government continues to navigate this complex issue, striving for a framework that both fosters AI development and safeguards creative rights, the path forward remains contentious.[7][11][9] The government has stated its intention to deliver a solution that supports both sectors, recognizing that the status quo is unsustainable, but achieving this balance will require careful policymaking and potentially new technical solutions for managing rights and transparency.[7][9] The outcome will significantly shape the future of both the UK's AI ambitions and its vibrant creative economy.