Politics

Friday, 2 January 2026

Deepfakes are undermining faith in politics

AI videos of MPs in porn scenes or spouting misleading speeches are on the rise. As May polls loom, the race to take them down is on

In October, the Tory MP George Freeman found himself bombarded by messages from friends and colleagues accusing him of betrayal. “They were calling me unrepeatable names, saying, ‘I never thought you’d do that to us.’ I didn’t understand what had happened,” he said.

“Then I saw the video. There was me in my office in parliament, wearing my tank top, waving my arms around, speaking like me, and saying that after 20 years I’d had enough of the Conservative party, it was a busted flush, a failing organisation and I was joining Reform.”

The video, posted on Facebook and X, was a deepfake that had used artificial intelligence to make the former technology minister appear to announce his defection to Nigel Farage’s party. “It was going viral,” the MP for Mid Norfolk said.

‘You work so hard to build a reputation and in 15 seconds somebody can destroy your career’

George Freeman, MP

“I was once mugged by three teenagers in an underpass in White City and this felt similar. In politics you work so hard to try and build a reputation, to speak your mind and choose your words carefully. And in 15 seconds, somebody who you don’t know, from the comfort of their own anonymity, can destroy your career with no comeback.”

Freeman contacted Meta, which owns Facebook. “They said, ‘It doesn’t breach any of our guidelines.’ I understand it’s not sexual grooming, it’s not violent terrorism or extremism, but it’s pretty serious when an elected representative is wilfully and mischievously completely misrepresented.”

He also reported the bogus video to the police and found “there’s no law against using somebody’s identity in this way”.

Now Freeman is talking to Lindsay Hoyle, the House of Commons speaker, and other MPs from all parties about how to outlaw political deepfakes. “This isn’t a Conservative point. All of us have a shared interest in protecting the integrity of the democratic process,” he said. “It’s really important that in politics there’s a long and proud tradition of satire. I welcome that but this is different.

“You just have to imagine a video coming out the day before an election misrepresenting somebody and the voters saying, ‘I’m not voting for them, that’s a disgrace,’ and your democracy is then completely corrupted.”

Around the world, political deepfakes are proliferating. A bogus news report last month that claimed there had been a coup in France went viral. It was so convincing that Emmanuel Macron was contacted by an African head of state asking him what had happened to his government.

In October, just days before the Irish presidential election, a video appeared online purporting to show one of the candidates, Catherine Connolly, announcing her withdrawal from the contest. Connolly went on to win but described the deepfake as a “disgraceful attempt to mislead voters and undermine our democracy”.

The first round of the Romanian presidential elections, held in November 2024, was annulled following credible reports of Russian interference involving deepfake videos. Moldova was flooded with deepfakes ahead of crucial elections last year that were seen as a choice between east and west.

The prime minister, Keir Starmer, the London mayor, Sadiq Khan, and the health secretary, Wes Streeting, have all been targeted.

There is an added dimension for female politicians whose images are manipulated with AI to create deepfake porn. In the run-up to last year’s UK general election, digitally altered pictures of more than 30 female candidates including Angela Rayner, the former Labour deputy leader, and Priti Patel, the shadow foreign secretary, appeared on a sexually explicit website.

Jess Asato, the Labour MP for Lowestoft, said: “There’s a conversation to be had about where do we draw the line on freedom of speech, but sexualised depictions of politicians are just absolutely wrong. I worry about how quick and easy it is to produce something that’s completely untrue.”

‘It’s clearly an avenue for foreign interference, but it could just be somebody sitting in their garage making deepfakes of someone they don’t like’

Vijay Rangarajan, Electoral Commission chief executive

The Electoral Commission, the independent body that oversees elections, is so concerned about the potential threat that it is setting up a new unit ahead of May’s crucial local, Scottish and Welsh elections to identify and take down political deepfakes.

The system, which is being developed in collaboration with the Home Office, will scan and monitor social media sites including Facebook, X, TikTok and Instagram. It will be able to spot AI-generated bogus content in Welsh as well as in English.

When a deepfake is flagged, the commission will immediately inform the platforms and try to get the post removed. Candidates and parties will also be warned. Politicians will be able to contact the commission for support if they find a deepfake circulating online.

Vijay Rangarajan, chief executive of the commission, said the “barrier to entry” for making the videos had dropped dramatically because of the widespread availability of AI-enabled technology. “It’s clearly an avenue for foreign interference, but it could just be somebody sitting in their garage making deepfakes of someone they don’t like,” he said. “The law needs to keep up with all these developing threats.”

An Electoral Commission survey found that a quarter of voters said they had encountered a deepfake video, audio or photo at the last general election. “Trust in politics is very low,” Rangarajan said. “What this can do if it is left unchecked is feed a sense of ‘I can’t believe anything I see’ and if people just don’t know they may not want to vote.”

He insisted the commission was determined to be proactive in tackling a growing problem. “We’ve seen it in use in elections around the world to mislead voters, so what we are trying to do is to get ahead of this by detecting it in the UK. Hopefully just by telling people we’re detecting it and looking really hard for it we will dissuade people from using it.

“We will be trying to make sure that this doesn’t become a threat to democracy. We don’t think it’s affected any UK election results and we want to keep it that way.”

Illustration by Observer Design
