How our fact checking work with Meta makes a real-world difference
News that Meta, the world’s largest social media company, is scrapping its partnership with fact checkers in the US has reverberated around the world.
We’ve strongly rejected much of what Meta has said about their reasons for the change, particularly the suggestion that fact checkers are politically biased or that we encourage censorship. Our chief executive, Chris Morris, had this to say:
Meta’s decision to end its partnership with fact checkers in the US is disappointing and a backwards step that risks a chilling effect around the world.
From safeguarding elections to protecting public health to defusing potential unrest on the streets, fact checkers are first responders in the information environment. Our specialists are trained to work in a way that promotes credible evidence and prioritises tackling harmful information - we believe the public has a right to access our expertise. We absolutely reject Meta's charge of bias - we are strictly impartial, we fact check claims from across the political spectrum with equal rigour, and we hold those in power to account through our commitment to truth.
Like Meta, fact checkers are committed to promoting free speech based on good information without resorting to censorship. But locking fact checkers out of the conversation won’t help society to turn the tide on rapidly rising misinformation.
Misinformation doesn’t respect borders, so European fact checkers will be closely examining this development to understand what it means for our shared information environment.
The news was a surprise, particularly as Full Fact undertook a rigorous accreditation process to join Meta’s third party fact checking programme in the first place, and the company has always applauded us for the work we do to highlight misinformation on its platforms.
As we get to grips with what Meta’s changes mean for fact checkers in the UK, Europe, and around the world, we also wanted to share what difference our existing work with Meta has been able to achieve.
Full Fact has been a proud contributor to the Third Party Fact Checking (TPFC) initiative since January 2019, and in those six years we’ve checked 2,596 cases, including misleading, faked, or potentially harmful posts on Facebook and other platforms. We’ve added context and provided credible information directly on thousands of posts - and to anyone who saw them.
Many of these cases relate to some of the biggest national and international stories: elections, major conflicts, viral conspiracy theories, and public health. We’re incredibly proud that this work plays a part in making sure people get access to the good information they need to engage with complex, high-profile issues.
It’s also important to emphasise that our collaboration with Meta has always been to protect and promote free speech on their platforms. Everything we’ve done has been in keeping with our belief that it’s possible to strike a balance by promoting the facts without resorting to censorship.
With that in mind, here are some of the fact checks we’ve published through our partnership with Meta:
Deepfakes and faked images
Recent years have seen the technology needed to produce deepfakes explode in popularity and accessibility - and, with it, incidents of this kind of misinformation spreading on social media have become more common.
This kind of content often targets politicians and other prominent figures. This likely deepfake audio of London mayor Sadiq Khan disparaging Remembrance Sunday and this video of Labour leader Sir Keir Starmer plugging an investment scheme both spread widely on Facebook, despite there being no evidence to support what they depicted. Our checks advised users that these posts were likely deepfakes and so shouldn’t be trusted.
We also debunked claims that this screenshot showed protesters on an anti-immigration march by revealing that it really showed people celebrating a Hindu festival. This particular image surfaced soon after the Southport stabbings - a challenging time for the UK, and a sobering reminder to our team of why our work matters.
Health misinformation
Conspiracies and false information about vaccines, health treatments and therapies have been a major focus of our third party fact checking on Meta platforms. Claims that vaccines are poisonous, or that lemons can treat cancer, have no basis in science but pose a real risk to people making critical health decisions - our work with Meta helps to reduce how many people are at risk of seeing and believing such claims.
Conflict footage
We’ve highlighted numerous examples of footage spreading on social media that claims to show a conflict but turns out to have been lifted from somewhere else - or even from a video game. Clips of missiles supposedly hitting ships in the Red Sea were lifted from gameplay, and images said to show the helicopter crash that killed the Iranian president actually came from an incident four years earlier - yet both were still shared widely. Our fact checking enabled us and Meta to advise users that neither was genuine.
We’ve also checked content emerging directly from conflict zones, such as this image claiming to show tents on fire in Gaza, which we revealed to be digital artwork, and this AI-generated image of prisoners allegedly found deep underground in Syria.
Hoax posts
Thousands of people shared this appeal for a missing child called Sofia, and tens of thousands probably felt they were doing the right thing by sharing a photo of a child rescued by a police officer in Hartlepool. But our checks revealed that neither was a genuine appeal, helping to warn people against sharing them any further.
Elections
Whatever critics may allege, we pride ourselves on our political impartiality. This becomes even more important at election time, when we work hard to scrutinise all parties fairly and equally. But voters also have the right to know when what they are seeing about politicians is untrue.
That’s why, during the 2024 General Election, we debunked false claims that Suella Braverman used parliamentary expenses to pay her parents’ energy bills, and pointed out that a photo of Sir Keir Starmer sitting with Jimmy Savile was an edited version of a photo of Mr Starmer and Gordon Brown. Far from acting on political bias (as Meta now alleges), we think this shows that the TPFC initiative has helped to facilitate free and fair discussion at election time.
These examples are a snapshot of why we think the TPFC initiative has helped Meta users to trust what they see, and has raised the standard of honesty on Meta’s platforms. We think Meta users have a right to access our expertise - directly and reliably. And we think that’s worth preserving.