DRD and Fountain Court Chambers roundtable: Can the UK cope with AI change?

5 Dec 2024

Turning AI risk into opportunity: a recent DRD/Fountain Court Chambers roundtable explored whether the UK must don a regulatory straitjacket, or whether the common law can cope with the technology’s rise.

DRD teamed up with Fountain Court Chambers to co-host a breakfast roundtable, where attendees considered whether the UK’s legal and regulatory landscape could keep pace with the rate of AI innovation and ultimately establish a bridge across this chasm.

The event featured leading experts from the fields of AI development, policy, and litigation. Their discussion examined the role of Government, Parliament, companies, and developers in the evolving UK legal framework governing AI, while raising examples of best practice.

The roundtable was co-chaired by DRD Senior Associate Michael Rose and Fountain Court’s Jacob Turner, one of the nation’s leading AI barristers, who has advised the UK Government on various AI policy matters.

In what was a wide-ranging discussion, topics included:

The IP of creative sector workers: The first ‘hot topic’ was how potential rollbacks for creative sector rightsholders under the EU’s “opt-out” model for web-scraping, a version of which the Government is reportedly considering adopting, may affect commercial exploitation litigation. Smaller rightsholders will struggle to bring claims under the existing UK copyright framework, which all attendees agreed was insufficiently comprehensive to deal with new challenges posed by generative AI and web-scraping. The table was aligned in its view that an opt-out system for companies and creators would impose an undue burden on the UK creative sector.

A shift in Labour’s approach: Attendees then considered more generally whether these rollbacks and the current lack of AI regulation were indicative of a broader policy shift towards appeasing ‘Big Tech’ to encourage the innovation that will drive Labour’s pro-growth agenda. Attendees who work with AI developers were eager to emphasise, more positively, that the Government’s “hands-off” approach to regulation was successfully facilitating innovation within the sector, and the table agreed that Labour’s opposition to sweeping interventionism would continue.

Will developers be able to regulate themselves? However, with some new policy and regulation emerging, the table considered whether the leading players in the AI sector are willing to ‘mark their own homework,’ and whether the existing legal framework can enforce it. The prevailing view was that UK common law is well placed to deal with harm emerging from the misuse of digital assets. There was greater concern that claims and class actions concerning AI liability would pose far more difficult challenges, for which the common law may be insufficient: there is little precedent for autonomous technology, and final court decisions may take several years to be reached, by which time the technology will likely have moved on.

How will the government legislate against more complex harms and crises? Addressing the details of proposed legislation, the roundtable heard several arguments that broader macro-regulation of AI is best left to industry experts, whereas regulators should be entrusted to address more specific instances of harm or exploitation. Given the absence of any clear remit for a prospective regulator, the group could not fully align its thinking. However, there was consensus that, given the speed at which this technology is progressing, simply “watching this space” may cause the Government more problems down the road.

How to bridge the chasm: It was not lost on the group that, perhaps unsurprisingly, a room largely full of lawyers and policy experts had dwelled on the question of how emerging technologies can be regulated and their threats nullified. The table reflected that rooms of developers would currently be concentrating on the horizons of AI rather than considering its conceivable ramifications. The chasm that new law and regulation must bridge was ever clearer.

A Westminster skills deficit: Against the backdrop of detailed policy conversation, the group considered whether UK legislators are truly up to speed with the rollout and capability of new AI. There was recognition that, as this technology is deployed into the public sector, each Government department will have a different AI agenda, receive different legal counsel, and be subject to different forms of lobbying from the sector. The prevailing view was that it remains to be seen whether a legislature perhaps lacking in technological literacy can agree a sensible route through uncharted waters.