With its AI action plan, the UK government risks selling us out to chatbots

AI UK action plan, illustrated by a shot from Wallace & Gromit: Vengeance Most Fowl

The UK government’s AI action plan is high on credulity, low on clear-eyed thinking about how it’ll affect our creative industries. A few thoughts.


“Uncoked kermelss of corn, thalt stay stay undigitated,” warns a medical drawing. “Uncoked kardes of corn, abenen of part inite broken bown into grose and absosed.”

No, this isn’t a bit of Dadaist poetry, but rather an AI-generated piece of ‘content’ dreamed up in response to the search query, “Does corn get digested?” 

As reported by Futurism, the picture containing the words above is the first result to come up when you type the question into Google Image Search. Other images at the top of the results are also clearly AI-generated. 

It’s an example of how generative AI can conjure up imagery that looks convincing at first glance but falls apart under scrutiny. It’s also an example of how quickly the internet is filling up with computer-made bilge, and how ineffective Google is at discerning between what’s useful and what quite obviously isn’t.

We mention this because the UK government has announced its intention to ‘mainline AI into the veins’ of our wind-battered country – plans that will, Prime Minister Sir Keir Starmer says, bring £47bn into the economy each year. (The use of drug-related terminology like ‘mainline into the veins’ seems spectacularly ill-advised messaging, but we’ll set that aside for now.)

Those plans were further laid out in a surprisingly brief 28-page action plan, published on 13th January on the government’s website. Written by tech investor Matt Clifford CBE, its essential argument is that, with the world’s most powerful nations embarking on an artificial intelligence race, the UK would be foolish not to get involved. As one bullet point puts it, the aim is to “Position the UK to be an AI maker, not an AI taker.”

The government isn’t hanging about, either, if the action plan is followed to the letter: a long-term commitment to beefing up the UK’s AI infrastructure will be set out within six months, it suggests, with considerable investment pledged over the next decade.

In many respects, the action plan isn’t wide of the mark. Machine learning and expert systems have been a key part of academic research for decades, and we’ll surely see all kinds of invaluable scientific and medical breakthroughs as the technology improves in years to come. But the government’s action plan doesn’t distinguish between these established and maturing fields of AI research and what has essentially become a breathlessly hyped, catch-all marketing term used by such companies as OpenAI and Meta.

As we’ve previously mentioned on this website, such platforms as ChatGPT and Midjourney rely on the theft of writers’ and artists’ work to create their responses. Those mangled images described at the start of this piece, showing an X-ray view of a human torso filled with blatantly incorrect internal organs, will have originated from a more accurate medical illustration scraped from somewhere on the internet. AI-generated images and text are now emerging at such speed that they threaten to swamp the internet; and as the web fills up with synthesised bilge, tech companies need fresh sources of unpolluted data on which to train their algorithms.

Read more: The UK government mulls letting AI firms steal copyrighted work

The government certainly knows that generative AI requires huge amounts of data to dine on, and that much of it is subject to copyright. A section in its paper talks about establishing a “copyright-cleared British media asset training data set,” which could contain work found in the National Archives, the Natural History Museum, the British Library and the BBC. In short, work spanning hundreds if not thousands of years of human history. That work, the paper says, could be “licensed internationally at scale.”

Will creatives whose work appears in that dataset be compensated for their stolen work? The paper doesn’t say.

Nor does the paper even lightly address the inevitable impact generative AI will have on the UK’s creative industries. In December 2024, Keir Starmer said that “AI is going to be transformative for the better in art, culture…” but failed to concede that any innovation that thrives on generating video, imagery or writing will inevitably have some effect on people who work in those fields. 

Investors are pumping billions into the AI sector because they’re gambling that companies will soon be able to replace untold numbers of expensive workers with apps. One recent estimate suggests that as many as eight million jobs could go across the country over the next decade – that’s roughly a quarter of the workforce. How will the UK government mitigate all these job losses?

The paper also talks about setting up special ‘AI Growth Zones’ – parts of the UK where AI data centres can be built, presumably with taxpayer subsidies. What it only glancingly mentions is just how environmentally damaging AI is as a sector; it requires vast amounts of energy and water to cool its systems. Within hours of the government’s action plan being published, there were already warnings that one planned AI Growth Zone – located in Culham, Oxfordshire – could put further strain on the local reservoir, which is already at risk of running out of water.

(There’s also the thorny question of data privacy, particularly when it comes to the ‘anonymised’ data of NHS patients.)

Rather than explore these concerns, the action plan seems worryingly credulous about an area of tech that has a tendency to make grand claims that, like those AI-generated medical diagrams, fail to stand up to scrutiny. Google’s AI search results are meant to be at the cutting edge of a new era of tailored information, yet one AI overview recently claimed that ‘Carry on up the cyber’ is a quote from 1999’s The Matrix. 

OpenAI boss Sam Altman has recently claimed that his company is now “confident we know how to build AGI” – artificial general intelligence; that is, a computer better than human beings at a range of tasks. He adds that such systems could start replacing employees as soon as this year. A quick glance at one of the firm’s internal documents, however, reveals that “AGI will have been achieved at OpenAI when one of its AI models generates at least $100 billion in profits.”

In other words, it isn’t about whether a scientific breakthrough has truly been achieved, or whether what it’s selling is what it purports to be, but rather how much money something branded ‘AGI’ can make for its investors.

To make matters worse, tech companies like Meta and OpenAI have spent untold billions on their products, and to date, they haven’t figured out a way of making them profitable. Only a week ago, Sam Altman admitted that his company loses money even on subscribers paying $200 a month – each query costs the firm so much to process that heavy users push the plan into the red. A recent report suggested that OpenAI could post a loss of around $5bn for 2024.

There are already fears that many of the world’s biggest tech stocks – including Nvidia, whose GPUs power the sector’s AI systems – could be vastly overvalued. With all the expense that goes into AI, and the lingering question over whether it can deliver on its increasingly outlandish promises of free labour, there are some suggestions that, far from changing the world, the sector is a bubble that could burst at any minute.

Quite understandably, the UK government is keen to find some good economic news to broadcast to the nation. But in rushing through a set of plans that could essentially give the world’s biggest tech companies everything they want – from state-subsidised data centres to tax breaks to the wholesale theft of our broadcasters’ and libraries’ archives – the government could also be about to pump billions into a sector liable to promise far more than it can deliver.

There are clear opportunities to be found in AI – if it can save lives by catching certain types of cancer early, then that can only be a good thing. But again, the UK government’s seeming rush to join the AI race leaves it at risk of credulously accepting the latest hype; that Keir Starmer believes the country’s productivity will somehow double in less than five years is not a good sign.

In the face of an uncertain future, with public services disintegrating and streets lying cracked and broken, a monorail salesman has arrived in the UK. Our government, it seems, is only too happy to invest.

