The UK government shouldn’t allow AI firms to take copyrighted work without a licence, committee recommends


The government should “abandon” a proposed exception to UK law which would allow AI firms to take copyrighted material, a new report recommends.


Last December, the UK government put forward a proposal that tech firms like OpenAI and Google should be free to train their AI chatbots on copyrighted material. A new report from the Commons Culture, Media and Sport (CMS) Committee, however, recommends the opposite: it argues that the government should “abandon” its proposed exception, which it says would “undermine the growth” of the UK’s film and TV industries.

The recommendation forms part of a wider survey conducted by the House of Commons committee, designed to provide insight into what it calls ‘British film and high-end television’. Published today (10 April), the same report also argues in favour of a 5 percent levy on US streaming companies like Netflix and Prime Video to help fund home-grown production. You can find the full CMS report on the UK Parliament website.

For its section headlined ‘Impact of Artificial Intelligence’, the CMS spoke to a number of experts and creatives across the UK film and TV industry, and concludes that the government’s current plan to make copyrighted material available to AI companies would negatively impact jobs and incomes.

“…the rapid growth of generative AI technologies threatens their earnings and future employment opportunities,” the report concludes. “This is not just an issue for one part of the industry: it [is] about real lives and livelihoods, and the impact will be felt by the most vulnerable.”

The report also recommends against the government’s suggested ‘opt-out’ process. In other words, if a company or artist didn’t want OpenAI to make money from their work, they’d have to tick a box or fill out a form expressing that preference. This, the CMS argues, “stands to damage the UK’s reputation among inward investors for our previously gold-standard copyright and IP framework.”

Instead, companies should be required to “license any copyrighted works” before using them for their AI models.


The conclusion is doubly ironic given that it comes just a week after AI companies also rejected the UK government’s opt-out proposal. In a response published by Politico, OpenAI said, “We believe training on the open web must be free,” adding that “excessive transparency requirements… could hinder AI development and impact the UK’s competitiveness in this space.”

In essence: these firms want unfettered access to everything, no questions asked.

Generative AI promises to usher in a new era of instant writing and imagery conjured up within seconds. The stuff churned out by these pieces of software is a synthesis of existing creative material, however. On one side of the argument, there’s director James Cameron and former prime minister Tony Blair, who argue that the use of existing work for ‘training’ is akin to the human brain soaking up information. Blair’s Institute for Global Change has similarly argued that copyright law should be ‘rebooted’ to help tech companies pursue their AI-powered visions.

On the other side of the divide are the thousands of people in the UK’s creative industries whose livelihoods could be permanently affected by having their own work taken out from beneath them. Earlier in April, a group of authors gathered outside Meta’s offices to protest over the company’s use of pirated books to train its software.

The CMS’s report is careful to distinguish generative AI from other sorts of AI, such as those used to cure diseases, or the technology deployed by a company like Metaphysic, which uses AI for visual effects, such as the ‘deepfake’ de-ageing seen in Robert Zemeckis’ time-spanning drama, Here. Where that form of the technology can make ethical use of existing data – such as a collection of photographs of an actor’s face – generative AI is trained on vast tranches of writing and art scraped from the web, often without their creators’ knowledge or consent.

As Ed Newton-Rex, the CEO of Fairly Trained, is quoted as saying in the report: “You train on creative work you haven’t licensed if you want to replace the creative industries with AI without paying them – not if you want to cure cancer with AI.”

