Missing children, lost dogs and escaped snakes: how hoax posts are swamping local Facebook groups
Shortly before 3pm on 2 July 2023, a Facebook page in the name of “Ella Fisher” posted an urgent appeal on the Doncaster New and Used Items For Sale community group.
Alongside a picture of a young child holding the leashes of two small dachshunds, the post read: "Help!!! MISSING Child!! (Doncaster) Sofia only 5years old went out on her bike earlier today and she still hasn't returned.She doesn’t know where she’s going, new surroundings.There is a silver alert activated on her. Please help bump this post so we can get her home safely!! [sic]"
Comments on the post had been disabled. This meant that although people could share it, and react with ‘likes’ or emojis, no one was able to point out that, in the space of just a few days, the same Facebook page had posted almost identical appeals in more than 30 community groups across the country, some hundreds of miles apart, with each post claiming the child had last been seen locally.
Ella Fisher’s page wasn’t the only one posting these alerts. In just a few days, other Facebook accounts posted similar appeals hundreds of times to different groups across the country, from Bristol and Ramsgate to Llanelli and Aberdeen.
In some versions, the poster claimed to be the grandmother of the missing child or said she was six rather than five. But the rest of the text was virtually identical, with the same image every time.
The posts, many of which were shared hundreds of times, were all bogus. As we wrote in our fact check, the girl was not actually missing at all—and was not called ‘Sofia’, either. Her picture had been featured on BBC News more than two years ago after the two dogs, Pippin and Purdey, were reportedly stolen from a home in North Yorkshire.
Such hoax posts are part of a worldwide phenomenon, which a Full Fact investigation has found is inundating community Facebook groups across the UK with highly emotive and entirely false stories about alarming events supposedly taking place in the local area.
Some appear designed to terrify communities, such as reports of a “serial killer” supposedly “hunting” in Dundee, Telford and Newbury, or claims a dangerous man with a knife was going “door to door” in Chesterfield, Bicester and the small town of Magherafelt in Northern Ireland.
Many posts have focused on missing children, or pensioners—particularly elderly people with dementia. Others have been about unidentified victims of muggings or road accidents, or fake appeals to find someone’s birth parents. And animals have frequently featured—most commonly lost or injured dogs, but also deadly rattlesnakes supposedly found in unlikely locations—including inside a toilet rim in Princes Risborough.
Quantifying the scale of the problem is difficult, especially because many hoax posts are never fact checked or reported. But as part of our work fact checking online misinformation over the past year, we have counted over 1,200 examples of such hoax posts. This is likely to be the tip of the iceberg.
Our investigation found at least 115 different communities across the UK have been the victims of hoaxes, including big cities like Birmingham and Bristol and smaller places such as Bewdley in Worcestershire and Burntisland in Fife. Posts have also appeared in many groups overseas.
The sheer scale of the hoaxes has made them much more than an occasional nuisance.
Genuine appeals for help on Facebook—for example, to identify a dog which has really been found—receive less interaction, or in some cases have been wrongly dismissed as a hoax. And some local community Facebook groups have been overwhelmed with false information and become difficult to use as a result.
Why are the hoaxes happening?
Whenever the hoax posts are discussed online, one question keeps coming up: why?
The answer lies not in the hoax posts themselves, but in what they become.
In many cases, once a hoax post has generated engagement, the author will use Facebook’s editing function to change it into something completely different, such as a survey, property listing or an advert for a cashback site.
The aim appears to be for the edited posts to benefit from the engagement the original post received, if only because users may take the fact a post has been liked by many others—and perhaps one of their friends—as some kind of endorsement.
A wide range of experts have warned these edited posts are some kind of scam, but there’s been little consensus on what exactly is happening. The consumer website Which?, for example, suggested posts may be edited into a “straightforward investment scam”, while the Better Business Bureau in the US claimed the aim may be to get users’ personal information. Others have speculated the posts are an attempt to identify people who may be vulnerable to other scams or get users to begin a direct message conversation which could in turn lead to them being scammed.
While it’s possible some people have been affected in this way, we’ve not seen reliable evidence of it, or in fact any proven examples of anyone having money taken or their identity compromised. None of the fraud investigators or police forces we have spoken to have told us they are aware of anyone suffering a financial loss.
However, we have found evidence of some of the hoax posters attempting to make money in a different way.
Shortly before 6.30pm on 3 July, the day after the post about ‘Sofia’ having gone missing in Doncaster went live, it was edited. The original words and pictures were removed, and replaced with an entirely unrelated post addressed to “chocolate lovers”, offering them the chance to win chocolate hampers “worth £500”. By then the original post had been shared more than 100 times.
We have seen dozens of other hoax posts edited in the same way to plug everything from cashback sites and homes for rent to nappy giveaways. Having analysed multiple posts in detail, we’ve found these apparently different offers often follow a similar pattern.
The edited posts contain an active link which users are asked to click, which leads to a landing page on one of many different websites outside Facebook.
These landing pages are often crudely branded according to the offer in question—for example, using the logo of a genuine cashback site, or a popular chocolate brand. But the websites are not affiliated with the companies in question, and appear to have been created with a free website tool.
These websites then have further links, for example in a ‘Continue’ button, which do often lead to the genuine websites of genuine companies which are unconnected to the hoax posters. But crucially, they do so via a hidden affiliate link—a special web link which allows someone to earn a small commission for promoting a product or service.

In other words, Facebook users clicking on links in edited hoax posts are eventually taken to the real website of a legitimate company or organisation, but arrive there via an unconnected third party website and an affiliate link which earns a small fee.
To make matters worse, some of the edited posts make exaggerated claims about the legitimate company they go on to link to. For example, edited posts we’ve seen promoting the legitimate cashback site Cashback UK claim you can earn hundreds of pounds for completing a single task or £150 as a “sign up bonus”.
Submission Technology, owners of Cashback UK, told us: “The information given in the posts is incorrect, and has been purposely exaggerated by the publisher in order to trick people into signing up. Our real welcome bonus is £5 not £150. You can make £200+. However this is more likely to be over a period of weeks or months, not from completing just a few surveys.
“Activity like this is damaging for our brands, as users who follow those links will quickly see that they have not been awarded such bonuses, therefore leave the site, brand us as a scam and not trust us if they see our genuine ads elsewhere.”
The company, which also owns the OhMyDosh cashback site which has been the subject of some edited posts, added: “When we discover publishers promoting our sites in this way, they are immediately paused our side and any leads they have generated will also be removed so we will not pay them for their traffic.”
A spokesperson for Toluna, a survey site we’ve also seen promoted in edited posts, told us an interim website which we spotted being used to link Facebook and its site was “fraudulent” because it was “impersonating our brand without our knowledge” and it would be taking action. That interim website now appears to have been shut down.
Full Fact asked SEON Technologies, a firm which specialises in fraud prevention, to examine a number of interim websites linked to from edited posts, and it also confirmed they used affiliate links.
It’s by no means certain that all the edited posts are attempts to profit from affiliate links—we’ve only been able to investigate a small sample. But it does appear to be at least one way that the hoax posters are making the alarming claims they’re sharing pay.
‘I’m just tearing my hair out with these fake posts’
While the edited posts can mislead Facebook users and damage legitimate companies’ brands, the impact of the original hoax posts may be even more damaging—not least because a large proportion of the original posts never go on to be edited and their false claims remain littered across local Facebook groups.
Lynne Parker, a terrier owner from Somerset, helped set up a Facebook group dedicated to exposing hoax posts involving lost or injured animals which now has over 9,000 members. She told us: “I'm just tearing my hair out with these fake posts. They are beginning to damage genuine owners of missing dogs or finders of lost ones who are now falsely being accused of posting fake posts, which they don’t need.”
In a post last September, Ms Parker gave an example of the impact.

“I shared a [genuine] post for a missing 17-year old, partially deaf partially blind missing dog,” she wrote. The genuine post had had 226 shares, she noted, compared to 552 for a fake missing dog alert published at the same time.
Lisa Loops is the director of the Muddy Paws Crime Facebook group, which helps locate stolen dogs and has been working closely with the owners of Pippin and Purdey, who are still trying to find them. She told us her team had reported more than 200 separate Facebook accounts that had uploaded the fake ‘Sofia’ missing girl posts.
Ms Loops said the mother of the girl pictured in the posts was “horrified”.
She added: “It’s bad enough that they are still grieving for their dogs and wondering if they are even alive, but to then have a picture of your daughter all over the UK and the USA with the claim that she is missing—it’s just horrifying.”
Ms Loops and Ms Parker both told us Facebook has not always shut down hoax posts after they are reported, in some cases claiming they are not against its community guidelines.
(When we put this complaint to Meta, it did not provide a specific response.)
Meanwhile for users of local Facebook groups, the hoax posts are a constant headache. Groups with no active admin appear to be particularly vulnerable, while elsewhere admins have come in for fierce criticism from users for allowing hoax posts to appear—or have even been accused of collaborating with the hoaxers.
We’ve seen many complaints from affected users. One who earlier this month threatened to leave her local group because there was “way too much spam” from hoax posts summed up her frustration simply: “It is wrong and this site is riddled with it.”
Hoax posts first began appearing in large numbers last summer.
Initially, the posts were mostly uploaded by accounts from Zimbabwe, though it’s unclear whether the account owners themselves were publishing the hoax posts, or their accounts had somehow been hijacked.
Prosper Tatenda of ZimTracker, a Zimbabwean fact checking company, told Full Fact back in August 2022: “Our investigations suggest these may not be Zimbabweans per se.” We attempted to contact several accounts behind the hoax posts. Only one account responded, with the owner telling us he had been hacked, though we weren’t able to corroborate this.
Hoax posters now typically use accounts with English-sounding names and Caucasian faces. They also often make use of business pages instead of individual profiles, providing an extra layer of anonymity for those behind the posts.
Tackling the hoax posts
While social media networks have long grappled with the problem of misinformation, specific policy changes at Facebook appear to have inadvertently helped hoax posts in local community groups become a global phenomenon.
The big one came back in 2013, when Facebook began allowing users to edit posts. Edited posts retain likes or shares, even if the words and pictures are completely changed. This means new users can be led to believe an edited post is hugely popular—or an offer is genuine—because other users seem to have endorsed it.
Users who have liked, commented or otherwise interacted with an edited post aren’t notified if the original content is changed. Concerns about potential misuse of the feature were raised at the time.
And then in 2021, there was a shakeup of the way public Facebook groups worked. The new system made it possible for members to join without admin approval, in a move which potentially made some local groups more accessible to those from outside the area (though admins were still able to restrict who posted and commented).
We contacted Meta, Facebook’s parent company, with the findings of our investigation. (Full disclosure—Full Fact is part of Meta’s Third-Party Fact-Checking programme, through which we receive funding to identify, review and rate viral misinformation across Facebook, Instagram and WhatsApp. Meta does not determine which checks Full Fact publishes or have any editorial control over our content.)

We provided Meta with an example of a post that had been edited from a lost dog report to a cashback promotion. It quickly deleted the post and the account behind it.
A Meta spokesperson said: “Fraudulent activity is not allowed on our platforms and we’ve removed the violating posts and account brought to our attention. While no enforcement is perfect, we continue to invest in new technologies and methods to stop scams and the people behind them.”
We also separately flagged the Ella Fisher page, and while Meta did not provide a specific response, the page was subsequently deleted.
In early April, Full Fact wrote to the head of UK content regulation policy at Meta to formally raise concerns.
We warned that “it is clear that reactive removal of posts identified by third parties is not enough to mitigate the damage that is being done”, and stressed that “the risks posed by these posts are pernicious and frequent enough to merit stronger action from Meta in terms of proactively identifying and tackling this growing trend”.
The letter called on Meta to set out the steps it intends to take to tackle the problem. More than three months on, we’ve yet to receive any reply.
In the meantime, the hoax posts keep on coming, and show no sign of abating. A boy reported missing in West Dunbartonshire. A grandma with dementia reported missing in Middlesbrough. And in South Shields, using a different photo—this time of a kidnap victim in the US—another missing five-year-old girl called ‘Sofia’.
For practical help identifying hoax posts, and how to check if a Facebook post has been edited, see our guide with seven ways to spot if a Facebook post is a hoax.
Funding from Meta. Since January 2019, Full Fact has checked images, videos and articles on Facebook, Instagram and WhatsApp as part of the company’s Third-Party Fact Checking programme. This work is funded by Meta. The amount of money that Full Fact is entitled to depends on the amount of fact checking done under the programme. We are fully independent and Meta does not determine which checks Full Fact publishes or have any editorial control over our content.
Update 24 August 2023
This article was updated to include a link.