Last week, when the Government announced its expert working groups on AI and copyright, the Secretary of State for Science, Innovation and Technology, Peter Kyle, confirmed he was “determined to harness expert insights across the debate”. Many artists and creators were quick to voice their concerns, seeing the groups as designed to legitimise what they’ve long argued amounts to state-sanctioned intellectual property theft: allowing tech companies to freely harvest decades of creative work to train AI models.
The news yesterday that OpenAI has signed a memorandum of understanding with the Government to identify opportunities to deploy AI across the civil service, justice, security and education probably won’t do much to quieten the fears of the creative industry.
The Government has positioned creative industries at the heart of its ten-year Industrial Strategy, promising to make Britain “the number one destination worldwide for investment in creativity and innovation”, but its approach to AI and copyright has led critics to suggest that the real priority is keeping Big Tech happy.
The deal with OpenAI follows a similar agreement with Google signed earlier in the month. Whitehall called that arrangement a way for “advanced tech to shake off decades-old ‘ball and chain’ legacy contracts which leave essential services vulnerable to cyber-attack”. Campaigners for fairer use of technology called it “dangerously naïve”.
As always, the truth probably lies somewhere in the middle, but for all the £75 million pledged to screen content and £100 million for creative R&D clusters, the Industrial Strategy stumbles on the one issue that matters most to creatives, investors and buyers of media businesses: legal certainty around who owns what when AI meets IP.
The Copyright Conundrum
The Government published proposals in December 2024 to establish copyright exemptions for AI model training, effectively allowing tech companies to use vast datasets of creative works without permission or payment unless the rights holder had opted out of the regime. Those proposals were quickly abandoned following backlash from the creative industries. The new working groups are tasked with finding “workable solutions” to resolve the differences between AI developers and content creators. But when one side wants unrestricted access to decades of creative output and the other wants to be paid for it, what exactly is “workable” about that conversation?
While the UK forms committees, the US is embedding AI exceptions into fair use doctrine through court precedent. The EU has codified its position in the AI Act. Meanwhile, UK businesses face an investment paradox: massive government support for AI development, but no clear legal framework to commercialise it.
Take music production. AI can now compose, arrange and master tracks that are commercially indistinguishable from human-created content. But who owns the rights when an AI model trained on existing works creates something genuinely new? The current consultation-heavy approach offers no answers, leaving creators, tech companies, and investors in limbo.
Where the Strategy Gets it Right
Credit where due: the infrastructure commitments in the Industrial Strategy are substantial. The Creative Content Exchange, CoSTAR labs and enhanced IP marketplaces represent genuine progress. The Government understands that digital-first creative businesses need digital-first support structures.
The regional focus is also to be applauded. By backing creative clusters outside London – from Manchester’s gaming hub to Bristol’s animation sector – the strategy acknowledges that talent is distributed but opportunity has been centralised. This geographic rebalancing could unlock significant economic potential.
The Skills Gap Reality Check
Arguably, though, the Industrial Strategy’s skills agenda – while well-intentioned – misses the urgency of the AI transition. The skills necessary to grow UK creative industries are currently in short supply, and traditional apprenticeship models won’t bridge the gap fast enough.
Consider this: prompt engineering – the art of communicating effectively with AI systems – didn’t exist as a job category three years ago. Today, skilled prompt engineers command six-figure salaries. Yet the strategy’s skills framework remains anchored in conventional creative disciplines.
The UK needs to be preparing for creative jobs that don’t yet exist: algorithmic curators, virtual art directors, narrative AI engineers. Not just the roles of today, but the hybrid roles of tomorrow. That means radical curriculum reform, not just incremental tweaks. It means championing cross-disciplinary fluency – design, data, ethics, code – from GCSE to post-grad.
Timing Issues
There’s also palpable frustration across the industry at how long the process has dragged on without producing a workable solution. Once impact assessments and draft bills are factored in, even bullish forecasts don’t see new AI copyright laws being put to Parliament before mid-2026.
Singapore has recently launched a national AI governance framework specifically for creative industries. South Korea is subsidising AI-human collaboration projects in entertainment. Canada has created tax incentives for AI-enhanced creative production.
The UK’s response? More consultation.
The Industrial Strategy correctly identifies creative industries as a growth engine and acknowledges AI’s transformative potential. But strategy without implementation is just expensive optimism.
The UK has approximately 18 months before the next wave of AI capabilities fundamentally reshapes creative production. Models capable of generating feature-length films, entire video games or interactive virtual experiences aren’t theoretical: they’re in development now.
The question isn’t whether AI will transform creative industries. It’s whether the UK will shape that transformation or simply respond to it. The Government’s Industrial Strategy provides the framework. Now it needs the legal backbone to make it work.