Can Labour balance AI innovation and creators’ rights?

3 Oct 2024

How far will the Party go in protecting intellectual property rights-holders as it seeks to fuel growth with AI, ask Samuel Pina and Freddie Eltringham.

With the delivery of the UK Government’s “AI Opportunities Action Plan” imminent, tech giant Google has wasted no time in applying pressure on the Labour Government to avoid what it would see as excessive regulation.

Google warns that if the UK fails to relax its copyright laws on text and data mining (TDM) practices – used to train AI models – the nation risks falling behind in the global AI race.

This is the latest development in the already heated debate surrounding intellectual property law and generative AI, which has presented a complex policy challenge for governments around the world. As the new Labour administration prioritises growth, there is little margin for error when setting the balance between the interests of AI companies, which advocate for less restrictive copyright rules, and the creative sector, whose revenue streams rely heavily on intellectual property rights.

A thorny reception from industry

The appointment of Feryal Clark MP as the Parliamentary Under-Secretary of State for AI and Digital Government, with additional responsibility for IP, has raised questions about the direction of the UK’s future IP framework, given the potential for competing priorities. The Publishers’ Licensing Service has already expressed concerns about the Government’s lack of clarity on addressing the creative industry’s fears of copyright infringement by AI developers. Nick Kounoupias, Chief Counsel of Anti-Copying in Design, has even gone so far as to say that Clark’s department is “promoting [unlicensed] use of IP and clearly skewed towards users rather than rights owners.”

As the Labour Government tries to find the right mix in its upcoming AI Action Plan, the Party may look for inspiration to its European neighbours, where creators are remunerated for the private storing and copying of their creative works – a model the UK could potentially adopt. The European Commission has been advocating for stronger AI regulation, and in a significant development last week, over 100 tech companies and organisations voluntarily signed the EU’s AI Pact, pledging to implement safety and reliability measures in AI development and deployment ahead of formal legislation.

A fairer system for the UK’s creators

Implementing a statutory private copying scheme in the UK would not only provide an additional revenue stream for domestic creators, but would also safeguard the income they already receive from similar schemes in countries such as France, Germany and Spain. To ensure transparency and fairness, Labour is also under pressure from creators to establish robust licensing regimes and enforcement mechanisms. This should be driven by a more activist stance from the UK Intellectual Property Office (IPO) that calls out copyright infringement.

EU Member States have, for their part, been cautious not to stifle their domestic AI industries and have pushed to water down the EU’s landmark AI Act. Speaking at the UK AI Safety Summit in November 2023, the then French Finance Minister Bruno Le Maire said “before regulating, we must innovate,” suggesting that the EU AI Act should regulate the uses of AI rather than the underlying models. This contrasts with concerns raised by the Competition and Markets Authority earlier this year that “powerful incumbents” in the AI Foundation Model market could exploit their positions to shield themselves from competition.

Some policymakers are wary that over-regulation could create an ecosystem in which larger companies can absorb compliance costs while smaller ones struggle. By their reasoning, a truly successful regulatory environment should foster innovation and competition among companies of all sizes, including domestic ones, rather than inadvertently creating barriers to entry or growth.

"To ensure transparency and fairness, Labour also is under pressure from creators to establish robust licensing regimes and enforcement mechanisms."

Which way does the regulator go?

The UK appears poised to take a similar approach, with the UK IPO predicted to maintain a light touch on AI regulation, rather than the activist stance hoped for by copyright holders. In February 2024, the Government shelved a consultation-led code that would have set rules for training AI models using copyrighted material. While this decision was made by the previous government, it reflects ongoing discussions and considerations within the civil service.

The IPO’s 2024-2025 Corporate Plan, while acknowledging the need to protect IP rights holders in the face of AI advancements, prioritises keeping pace with technological developments. This plan, developed by civil servants, likely indicates the type of advice being presented to the new Labour administration.

This approach has raised concerns among creators, who fear that the current status quo favours AI development at the expense of their intellectual property rights. Many worry that their IP is being used to train AI models without proper licensing or practical means of recourse.

As the Labour Government navigates this complex landscape, it must find a way to support both the burgeoning AI industry and the creators whose work fuels its growth. Many will be hoping that the AI Action Plan establishes a framework that encourages innovation, while ensuring fair compensation and protection for those whose intellectual property is at stake.