No evidence Sadiq Khan ‘Remembrance weekend’ audio clip is real

12 April 2024
What was claimed

Sadiq Khan said he doesn’t “give a flying shit about Remembrance weekend” in a leaked recording.

Our verdict

There is no evidence that Mr Khan made any such comment, or that this is a real recording.

An audio clip purporting to be a recording of the Mayor of London, Sadiq Khan, saying he doesn’t care about “Remembrance weekend” is circulating online. But it comes from a longer recording which Full Fact has found no evidence is real, and which is likely to be a deepfake.

In a TikTok video shared on 2 April 2024, a voice strongly resembling Mr Khan’s can be heard saying: “I don't give a flying shit about the Remembrance weekend, and even more so don't care about next Saturday. What's important and paramount is the one million man Palestinian march takes place on Saturday.

“I control the Met Police. They will do as the Mayor of London tells them and obey orders. The British public need to get a grip of I run London [sic], and once they understand and accept that, we'll all get on a lot better.”

The video features a photo of Mr Khan and has overlaid text saying: “This is a sackable offence for the mayor of London, so why is he still here? Share to spread the word!” 

Multiple Facebook posts have shared a link to this video in recent days with captions suggesting the recording is real, including: “This is shocking. People need to hear what the mayor of our capital city has to say”, “Get this guy out of office in May” and “Corruption at its finest”. Other videos on TikTok also feature the audio.

But there’s no evidence that this recording is real or that Mr Khan said any such thing. Both Mr Khan and the Metropolitan Police have previously confirmed the longer recording—where this clip comes from—is not genuine.

The longer version first appeared in November 2023 amid tensions over a pro-Palestinian march in London that was scheduled for Armistice Day. In the clip, Mr Khan can supposedly be heard calling for the Armistice Day events to be postponed in favour of the march, and says “this is a private conversation”, giving the impression the recording was secretly taken and leaked. 

These supposed comments, and those in the TikTok videos circulating more recently, contradicted posts Mr Khan shared on social media at the time, in which he described the Remembrance commemorations as “a hugely important part of our national calendar”.

We’ve previously written about other audio snippets from this longer recording that went viral on social media. A spokesperson for the Mayor of London told us at the time that the “fake video” was being investigated by both the Metropolitan Police and counter-terror experts. A spokesperson for the Metropolitan Police Service said it had been made aware of a video “featuring artificial audio of the Mayor”.

We’ve not been able to confirm the source of the alleged recording. The BBC reported that it was seemingly first shared by a different TikTok account on 9 November 2023, which denied creating the video and said it shares “news that could be real with a sense of humour”. The TikTok account appears to have since been deleted.

What is a deepfake?

As we’ve written before, a ‘deepfake’ refers to audio or video that has been created using artificial intelligence (AI) and can be used to mimic the face or voice of a public figure.

Dr Dominic Lees, an Associate Professor in Filmmaking and convenor of the University of Reading’s Synthetic Media Research Network, previously told us that as little as 15 seconds of real speech can be used as “training data” by AI to identify speech patterns for cloning. 

One clue that a recording could be a deepfake is “unnatural cadences”, according to Dr Lees. For example, in the alleged recording there is a long pause between “one million man” and “Palestinian march”. There is also a strange turn of phrase, where the recording says “the British public need to get a grip of” and then, immediately after, “I run London”, which may further point towards the use of AI.

The Metropolitan Police has said that the “artificial” clip “does not constitute a criminal offence”.

Why can't we say it is definitely a deepfake?

It is very difficult to prove definitively that an audio clip is a deepfake. There are several tools that claim to be able to tell you, some with a specific percentage of confidence, whether an audio clip was generated using AI. However, at the time of writing, we don’t quote these tools in our articles because we’ve found they don’t work consistently. 

Moreover, there are other ways audio can be faked that don’t necessarily involve AI, for example by using an impersonator, or by using editing techniques in which individual words from genuine audio are stitched together to form artificial sentences.

Mr Khan has referred to the audio as a deepfake and told the BBC that it “was deliberately made to give the impression that I’d said what I hadn’t said but it looked and sounded so authentic we did get concerned very quickly about the impression it would create”.

Full Fact has previously written about an audio clip supposedly capturing Labour leader Sir Keir Starmer swearing at his staff, as well as deepfake videos of Mr Starmer and BBC presenters appearing to promote an investment scheme. We’ve also written a guide to spotting deepfakes.

Image courtesy of U.S. Embassy London
