One year on, has the Government learned the lessons from the Southport Riots?
As warnings grow about the potential for another summer of unrest, this time driven by anti-migrant protests, Full Fact is raising serious questions about how the government can stop false narratives from spreading unchecked.
One year after the widespread disorder of summer 2024, sparked by the tragic killing of three young girls during a Taylor Swift-themed dance class in Southport, there is a renewed urgency to prevent a similar crisis from unfolding.
Over the past year, Full Fact has closely tracked how misinformation spread rapidly in the aftermath of the Southport attack, how online platforms failed to detect and address emerging harms, and what more authorities and platforms can do to prevent violence driven by unchecked misinformation.
Content moderation and Community Notes
In collaboration with Demos, Full Fact recently co-hosted an event marking one year since the Southport riots. During the event, Demos unveiled important new research on the effectiveness of Community Notes during the crisis, and we shared our recommendations on how platforms could adapt the Notes model to respond better to misinformation in times of crisis.
Demos’ research highlighted a key issue: Community Notes were largely ineffective during the riots. It found that notes were “largely invisible to users”, limiting their ability to stop the spread of false or harmful content: only 4.6% of the Community Notes in its dataset were publicly visible during the Southport riots.
At the event, we presented our proposed solution: an adapted notes model that fast-tracks notes from external experts and uses “super notes” to ensure helpful contributions to important notes are published quickly, addressing critical threats to the information ecosystem. These proposals are detailed in our joint briefing with Demos, and we will be bringing them to tech companies over the coming months.
With platforms like Meta shifting toward a notes-based system as a supposed replacement for independent fact checking, improving the current framework has become even more urgent. This presents an opportunity for the government to work directly with these platforms and encourage them to strengthen their existing systems.
The Online Safety Act
In the year since the Southport riots, we have repeatedly expressed our concerns about the Online Safety Act. In the 2025 Full Fact Report, we reiterated our view that the Act is “not fit for purpose” in addressing harmful misinformation. The lessons of the Southport riots have yet to be learned.
This position is shared by several key stakeholders, including the Science, Innovation and Technology Committee. In their most recent report, the Committee made it clear that the Online Safety Act “was not designed to tackle misinformation” and, even if it had been fully implemented, would not have prevented the spread of misinformation surrounding the riots.
So, what more can the government do to fix the Act?
Over the past year, we have made a series of recommendations identifying specific areas where the Online Safety Act can and should be strengthened:
- Review the Online Safety Act:
Now that the Act has been substantially implemented, an urgent review should be conducted to determine whether it is capable of addressing the scale of harmful content circulating on social media. As Full Fact has outlined, both in inquiry submissions and in direct engagement with the government, the misinformation that followed the Southport attack represents only the tip of the iceberg and is not in scope of the Act.
- Upgrade the Act to address systemic risks:
The Act should be amended to target the broader, systemic harms that misinformation poses to society and democracy, in line with its original ambitions. We recommend that very large online platforms and search engines be required to take more responsibility for identifying and managing the risks of democratically harmful material on their platforms, beyond what is currently in scope of the Act. This should include issues such as negative effects on civic discourse, electoral processes and public security.
While senior Labour officials, including the Prime Minister himself, have pledged to revisit the Act, further delays only increase the risk that unchecked misinformation will continue to erode the public's trust in government and in credible sources.
Crisis Response Protocols
We have long called for Ofcom to lead a coordinated system for managing online misinformation during crises. This would bring together social media platforms, trusted organisations and public officials to respond quickly and effectively to rapidly spreading false information. This kind of coordinated approach was urgently needed during the Southport riots, and it remains essential should the anti-migrant protests continue.
Yet, a year later, progress has been minimal. Ofcom has launched a consultation, but only on a narrower issue: how platforms should respond to crises involving specific forms of illegal content. This limited scope would not have covered much of the misinformation that circulated during Southport, leaving a major gap unaddressed.
This kind of coordination must be paired with clear, consistent guidance on how authorities should respond to and dispel misinformation. After the Liverpool attack, where a driver ploughed into crowds during the Liverpool FC victory parade, officials acted swiftly to disclose the race and ethnicity of the attacker, which helped to prevent further false narratives from spreading. But what proves effective in one incident may not be easily replicated in another: decisions on how to counter misinformation must be underpinned by transparent, well-defined policies, based on research into effective debunking.
One year on from the Southport riots, the government’s response to misinformation remains dangerously slow. The events of summer 2024 showed just how quickly false narratives can incite real-world harm, yet the systems designed to counter misinformation remain fragmented and underpowered. With the threat of renewed unrest looming, there is no excuse. The government must treat this issue with the urgency it demands: by strengthening the Online Safety Act, pushing platforms to improve their content moderation tools, and requiring Ofcom to lead a coordinated crisis response protocol. It must act quickly, before history repeats itself.