Israel-Iran misinformation is circulating online - what to watch out for

AI-generated and miscaptioned footage and images are circulating widely on social media as the Israel-Iran conflict continues.
Both countries have launched multiple strikes against each other following Israel’s attack on Iranian nuclear and military sites on 13 June 2025.
Here are some of the most viral claims we’ve fact checked as of Wednesday 18 June, and some tips on what to watch out for. We’ll be adding to this article as and when more examples emerge.
AI-generated misinformation
We increasingly see AI-generated footage and images shared on social media in the wake of big breaking news events, and that’s been the case in recent days.
While we can’t always definitively say where a video clip comes from, we’ve seen a number which were almost certainly created with artificial intelligence (AI). For example:
- ‘Doomsday in Tel Aviv’ footage. One video of a bombed city is being shared on social media with claims it shows “doomsday in Tel Aviv” in Israel. However, it appears to have previously been shared on 28 May, before the latest set of strikes between Israel and Iran. There are also clear signs that it was almost certainly made using AI. For example, in the first clip, two cars approaching each other at a T-junction in the top left corner appear to merge into one. Other vehicles in the video also become glitchy and blurry as they move. Read our full fact check here.
- Destroyed passenger planes. An image is being shared with claims it shows damage caused by Iranian strikes on Tel Aviv’s airport. But, using reverse image search tools, Full Fact traced the image to a (since deleted) video posted on TikTok on 15 June, which appears to have been generated using AI tools. There are visual glitches in the rendering of the plane in the foreground of the image, with cabin windows appearing in a gap where a section of the fuselage is missing.
Miscaptioned footage
When there’s a lot of interest in a global news story it’s also very common for us to see old or unrelated video or photos passed off as something they’re not. Here are some examples we’ve seen in relation to the current conflict:
- Iranian missile launches. One video we spotted online was shared with the claim it showed Iranian missiles being fired at Israel. But this footage actually dates back to at least October 2024, and so isn’t related to the current conflict. Full Fact was not able to verify the location, although it does appear to show Iranian missiles targeting Israel.
- Drone explosion. Another clip, of what appears to be a drone causing an explosion in a built-up area, has been shared with claims it shows an Iranian drone strike on Tel Aviv. However, Full Fact traced the video back to footage of drone attacks on Kyiv, Ukraine, in October 2022. The version being shared recently has been horizontally flipped.
- Fire footage from China. A clip of a large fire in a built-up area has been shared with a caption claiming it shows “an Iranian attack like Tel Aviv has never seen before”. But the landmarks in this video match other clips of a fire at a motorcycle parking lot in China on 11 June 2025.
- Video of protests is several years old. Footage being shared with claims it shows recent protests against the regime in Iran is also old, and has been online since at least 2017.
- Iraq airstrikes footage. We’ve seen what appears to be footage from US airstrikes on Iraq in 2003 being shared with claims it shows recent missile strikes between Iran and Israel.
What to watch out for
Misleading information can spread quickly during breaking news events, especially during periods of crisis and conflict.
Before sharing content that you see online, it’s important to consider whether it comes from a trustworthy and verifiable source.
If you’re wondering whether a video clip is AI-generated, one tip worth noting is that some social media posts share versions of footage that are much grainier and blurrier than the original, making it difficult to spot signs of AI. So it’s always worth looking for clearer versions by searching key frames of the footage using tools like TinEye or Google Lens.
We’ve written a toolkit with practical tools anyone can use to identify bad information, and also have specific guides on: