Fake AI image of Bondi Beach victim having blood applied circulates online
17 December 2025
What was claimed
An image shows a man resembling Arsen Ostrovsky, who was injured in the Bondi Beach mass shooting, smiling as a woman applies fake blood to his face.
Our verdict
This image is fake. It contains an invisible watermark which flags it as having been made or edited with Google AI, and there are a number of discrepancies and visual glitches that are hallmarks of AI generation. Arsen Ostrovsky did suffer a head wound in the attack in Sydney.
An AI image apparently depicting one of the victims of the Bondi Beach attack having fake blood applied by a makeup artist has been widely shared online. But it’s not real.
The picture (warning: distressing content), which has been liked over 41,000 times on X, appears to depict a smiling man sitting on the floor while a woman applies fake blood to his face; the blood also covers his t-shirt and arms. A set of makeup brushes and sauce bottles full of fake blood are on a nearby table.
Many posts sharing the image include captions claiming it proves the attack in Australia was a ‘false flag’ (an act carried out with the intention of blaming a political or military opponent for it), or that the man pictured was a ‘crisis actor’, someone who is paid to feign injury or death outside of a training situation to further an agenda. Beliefs that events like the Bondi Beach shootings involve crisis actors are a common feature of conspiracy theories.
But the image isn’t real and was created with artificial intelligence (AI). When Full Fact put the picture through Google reverse image search, it was flagged as having been “made with Google AI”.
Images are flagged in this way when they contain a SynthID digital watermark, which is invisible to the human eye but is embedded into content made with several Google AI products. The watermark remains detectable even after changes to the picture’s quality or size.
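To illustrate the general principle of an imperceptible mark that survives resizing, here is a minimal toy sketch in Python using a simple spread-spectrum pattern. This is not SynthID: Google has not published its image-watermarking algorithm, and the seed, strength and scoring function below are all hypothetical choices made for this sketch.

```python
# Toy spread-spectrum watermark -- purely illustrative, NOT SynthID.
# The SEED, STRENGTH and scoring approach are hypothetical assumptions.
import numpy as np
from PIL import Image

SEED = 42        # hypothetical secret shared by embedder and detector
STRENGTH = 2.0   # amplitude kept low enough to be invisible to the eye

def secret_pattern(shape):
    # The same seed reproduces the same pseudo-random +/-1 field every time.
    return np.random.default_rng(SEED).choice([-1.0, 1.0], size=shape)

def embed(img):
    # Add the faint pattern to the greyscale pixel values.
    arr = np.asarray(img.convert("L"), dtype=np.float64)
    marked = np.clip(arr + STRENGTH * secret_pattern(arr.shape), 0, 255)
    return Image.fromarray(marked.astype(np.uint8))

def score(img, original_size):
    # Resize back to the embedding resolution, then correlate with the
    # secret pattern; a clearly positive score suggests the mark survived.
    w, h = original_size
    arr = np.asarray(img.convert("L").resize((w, h)), dtype=np.float64)
    residual = arr - arr.mean()
    return float((residual * secret_pattern((h, w))).mean())

photo = Image.new("L", (256, 256), 128)  # flat stand-in for a real photo
marked = embed(photo)
shared = marked.resize((128, 128))       # simulate a lossy re-share

print(score(shared, photo.size))  # clearly positive: watermark detected
print(score(photo, photo.size))   # roughly zero: no watermark present
```

The point the sketch demonstrates is that detection relies on a statistical signal spread across the whole image rather than on any single pixel, which is why moderate resizing or recompression does not erase this kind of mark.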
Google previously told us the presence of a watermark can’t tell us whether AI was used to completely generate a brand-new image or to modify an existing one. But there are plenty of other clues that the image was completely generated with AI.
The man’s likeness and outfit resemble those of Arsen Ostrovsky, an Israeli lawyer who was among those injured in the attack on Jewish people attending a Hanukkah event at the beach on Sunday 14 December, which killed 15 people and injured dozens more.
In the immediate aftermath of the attack he shared a selfie (warning: graphic content) of his head and one arm covered in blood, and he was interviewed on the beach by an Australian broadcaster while still bloodied and wearing a bandage.
The day after the attack, Mr Ostrovsky shared an image of himself lighting a Hanukkah candle in hospital with a dressing on his head.
On 16 December he also posted on X about the image supposedly depicting him, saying he was aware of the “twisted fake AI campaign” that was “suggesting my injuries from Bondi Massacre were fake”.
“I saw these images as I was being prepped to go into surgery today and will not dignify this sick campaign of lies and hate with a response,” he said.
Despite the image being entirely fake, Grok (X’s in-house AI chatbot) told users on the platform that it was “consistent with a real photo”.
One user shared a screenshot of the picture being run through an AI detector site which mistakenly said there was a 0% “AI generated probability”. We do not find these types of AI detection websites reliable and do not use them in our fact checking of suspected AI content.
How can we tell this image is AI-generated?
The outfit worn by Mr Ostrovsky in the genuine pictures and video from the aftermath of the shooting resembles that shown in the viral image, suggesting that they were used as a prompt to generate the AI picture.
But there are a number of errors in the image that are also hallmarks of AI generation.
Although at first glance the grey t-shirt in the image looks similar to the one worn by Mr Ostrovsky, which bore the words ‘United States’ and an American eagle graphic, the text rendered on the viral image is garbled and illegible, and the graphic is also completely different.
His left ear, which is visible in the viral picture, is also malformed and distinctly different from genuine images of him that show this side of his head.
As we explain in our blog about spotting AI-generated images, ears are a part of the body that AI often fails to render accurately, although AI content is becoming increasingly sophisticated and contains fewer obvious errors.
The pattern of blood stains on his t-shirt in the image being shared also does not match the pattern clearly visible in genuine footage of him being interviewed, bandaged, in the immediate aftermath of the shooting.
An uncropped version of the image features more obvious errors: members of the surrounding film crew have mangled, distorted hands with too few fingers, and a car in the background appears to have glitched into two vehicles.
Before sharing content you see on social media, first take a step back and consider whether it comes from a trustworthy and verifiable source. Misinformation can spread widely during or in the immediate aftermath of crisis events such as terror attacks. Our toolkit can help you identify bad information you may see online.
This article is part of our work fact checking potentially false pictures, videos and stories on Facebook. You can read more about this—and find out how to report Facebook content—here.
For the purposes of that scheme, we’ve rated this claim as false because this image was created with artificial intelligence, and Arsen Ostrovsky really was injured in the mass shooting at Bondi Beach.