An Urgent Call for Action: Committee Report on Social Media, Misinformation and Harmful Algorithms

11 July 2025

The latest report from the Science, Innovation and Technology Committee confirms what is now an undeniable reality: the Online Safety Act, in its current form, does not go far enough to address the spread of harmful misinformation. Full Fact welcomes the Committee’s findings and, like the Committee, calls on the government to take urgent action.

The Committee’s findings

The report follows a detailed inquiry into the role social media played in the violent unrest that shook the UK in the summer of 2024. The catalyst was a horrific knife attack in Southport that claimed the lives of three young girls during a Taylor Swift-themed dance class. In the days and weeks that followed, misinformation spread unchecked across social media, inflaming tensions and fuelling violence.

The inquiry set out to uncover what role social media played in these riots and whether there are enough safeguards in place to prevent this happening again. The Committee’s conclusion is clear: 

“The Online Safety Act was not designed to tackle misinformation—we heard that even if it had been fully implemented, it would have made little difference to the spread of misleading content that drove violence and hate in summer 2024.”

Science, Innovation and Technology Committee

The Committee places a heavy duty on platforms to take more responsibility. It recommends that companies work closely with independent fact checkers to identify and deprioritise misleading content, and that they put in place clear crisis protocols to prevent a repeat of scenarios like Southport.

These are steps Full Fact has long advocated for. We echo the Committee’s call for platforms to continue working with fact checkers as a vital way to combat misinformation. 

The report also takes a firm stance on the risks posed by generative AI. It urges the government to bring forward legislation to tackle AI-generated misinformation. This aligns with Full Fact’s own warnings in our 2025 annual report: unless the UK sets out clear rules for AI now, it risks falling behind, reacting to harms after the fact rather than anticipating them.

Full Fact’s recommendations

Full Fact submitted two pieces of written evidence to the inquiry, including insights from our work following the Southport tragedy, where our fact checkers played a crucial role in addressing false claims that spread rapidly online.

We made a number of recommendations in our evidence and have identified additional areas where the Online Safety Act could be enhanced: 

  1. Review the Online Safety Act: Once the Online Safety Act has been implemented as drafted, an urgent review should take place to assess whether the Act can effectively combat the level of harm present on social media. As Full Fact has outlined, both in the inquiry submissions and to the government directly, the harmful misinformation we’ve seen following the Southport attack is only the tip of the iceberg.
  2. Establish a crisis coordination framework: Ofcom should lead the development of a coordinated system for managing online misinformation during crises. This would bring together social media platforms, trusted organisations, and public officials to respond quickly and effectively to rapidly spreading false information. Since our submission, Ofcom has opened a consultation on this topic—focused on the narrower issue of how platforms should respond to crises involving certain illegal content—which we intend to fully engage with.
  3. Upgrade the Online Safety Act to tackle systemic risks: The Act should be amended to tackle the collective harms that misinformation can cause to society and democracy, in line with its original ambition. The European Union’s Digital Services Act provides a model: very large online platforms and search engines are required to assess and mitigate systemic risks stemming from their services. This includes negative effects on civic discourse, electoral processes and public security.

Many of these recommendations were echoed by the Chair of the Science, Innovation and Technology Committee, Chi Onwurah MP, in her essay for the 2025 Full Fact Report, published in May 2025.

How far online misinformation spreads largely comes down to how quickly platforms respond to the rapidly emerging falsehoods they host. The 2024 riots exposed an urgent need both for effective crisis response protocols and for a rethink of the Online Safety Act.

The government needs to step up and hold large online platforms accountable for their slowness to act, but it also needs to take bolder steps to prioritise the fight against misinformation.

The government must learn the lessons of Southport and act on the recommendations of this report in order to build an online information environment that is safe, trustworthy, and fit for the future.

