What is the Online Safety Act? Here's what you need to know

The Online Safety Act 2023 (OSA) introduced new rules requiring internet companies to protect users from harm on their platforms, including tackling harms to individuals from illegal material online, and to better protect children.

The OSA made Ofcom the UK regulator for online safety.

What is the duty of online platforms?

The OSA created a new duty for online platforms (and a similar duty for search engines) to assess the risks and mitigate the harms stemming from two types of material: illegal content, including fraud, terrorism and other offences; and content that is harmful to children. The OSA does not prescribe how to mitigate these risks, but Ofcom has recommended measures in codes of practice. Platforms failing in this duty could be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher.

Ofcom has new powers

Ofcom is empowered to block access to websites that fall foul of the regulations. The OSA also obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content. Once these measures - which have been delayed - take effect, content posted on social media by news organisations, or content intended to contribute to democratic political debate in the UK, cannot be removed.

Ofcom has been consulting on and developing the regulatory framework, and opened an enforcement programme to monitor whether online companies are meeting their requirements under the OSA.

Should the OSA be scrapped?

When new duties on platforms took effect in summer 2025, there was significant criticism of the OSA and misunderstanding about how platforms should comply with the rules. Much of that criticism focused on age verification measures.

Full Fact did not play a role in advocating for age verification in the OSA because our specific focus was on how misinformation is handled in what is a very broad piece of legislation.

Full Fact has long criticised the failure of the OSA specifically to tackle harmful misinformation online. But despite calls to repeal the Act, scrapping it is not the answer. The government should take the opportunity to improve the OSA whilst ensuring that free speech and freedom of expression are protected.

Does the OSA protect us from harm?

The OSA should have been a pivotal moment in the way we tackle the harms caused by misinformation. However, the final Act falls short of the former government’s original aim to make the UK “the safest place to be online.” The proposed duty on platforms to tackle content that is legal but harmful to adults was removed before the OSA became law, and the previous government backtracked on its plan to tackle collective and social harms, leaving the OSA focused on harms to individuals. In the 2025 Full Fact Report we reiterated our view that the OSA is not fit for purpose in tackling harmful misinformation.

'Misinformation presents a risk to national security'

The current government’s Statement of Strategic Priorities for Online Safety notes its focus on “the vast amount of misinformation and disinformation presenting a risk to national security or public safety that can be encountered by users online”, while its strategy for elections emphasises that “Our own democracy is being threatened by misinformation”. Having recognised the problem, the government now needs to turn this into action with a more effective regulatory framework.

The government has set out its view that misinformation will be caught by the OSA where it is a relevant offence in the Act (for example, misinformation that also stirs up racial hatred); and that harmful misinformation for children will be caught where it intersects with priority content (for example, content which is hateful or encourages the consumption of harmful substances and contains misinformation).

No credible plan

There is currently no credible plan to tackle the harms from online misinformation in the OSA beyond the narrow framework noted above, and this continues to leave the public and our democracy vulnerable and exposed. The only references to misinformation in the OSA are about setting up a committee to advise Ofcom, and changes to Ofcom’s media literacy policy. The words ‘misinformation’ and ‘disinformation’ were later removed from the name of that committee and the scope of its work was narrowed.

False communications offence

The OSA created a new ‘false communications offence’ for sending a message conveying information that is known to be false, with intent to cause “non-trivial psychological or physical harm” to a likely audience. It does not apply to people who share content without knowing that it is untrue, which means the offence is not suited to tackling viral online misinformation. When this offence was originally proposed, Full Fact expressed significant concerns about it from a freedom of speech and a burden of proof perspective.

Health misinformation

The Act does not address health misinformation, which the Covid-19 pandemic demonstrated can cause serious harm. Health misinformation should be included as a defined harm in the OSA. Without a legal requirement for online platforms to conduct adult risk assessments, there is no clear way to know whether or how they are tackling harmful health misinformation.

Election disinformation

The OSA also does not set out any new provisions to tackle election disinformation, unless it is a foreign interference offence, which requires various elements to be met that may be difficult to identify in practice and hard to prove. The government’s Elections Bill provides an opportunity to improve the OSA in line with its original ambition, by tackling the collective harms that misinformation can cause to society and democracy.

Southport

Nor does the OSA cover misinformation that spreads rapidly online during information incidents, such as terror attacks or the August 2024 riots following the Southport murders. As the Science, Innovation and Technology Committee concluded in a July 2025 report, following an inquiry into the role of social media in those riots: “The Online Safety Act was not designed to tackle misinformation—we heard that even if it had been fully implemented, it would have made little difference to the spread of misleading content that drove violence and hate in summer 2024.”

Generative AI

The OSA also does not extend to most harms from generative AI misinformation. Full Fact has long called for strong regulation to tackle AI-generated misinformation, to help ensure that the risks of AI are treated with the same urgency as its rewards.

What about the role of fact checkers?

Finally, the OSA does not ensure that researchers and fact checkers have timely access to data from online platforms and search engines about misinformation and disinformation circulating on their platforms.

Fact checkers can do a much better job when they have better access to data about what the most harmful content is, who is seeing it, and how it is spreading. However, right now the platforms are moving in the opposite direction and shutting down services designed to help fact checkers.

The Data (Use and Access) Act 2025 will require online companies to provide information for independent research into online safety issues. When the government implements the measures, they should ensure that fact checkers get the data they need to help improve the information environment.

Image: a person using a laptop. Courtesy of Christin Hume.

Does the Online Safety Act protect freedom of expression online?

In addition to the duties to protect news publisher material and content of democratic importance, the OSA imposes duties on platforms to protect freedom of expression online. Under the OSA’s approach, internet companies must decide what content is not allowed on their platforms, set this out in their terms of service, and apply those terms consistently when managing content that breaches them.

Platforms which fail to comply with their obligation to protect freedom of expression can face significant penalties. However, there is a lack of regulatory oversight of what companies include in their terms of service and how they enforce them. This approach will neither prevent misinformation from spreading nor protect freedom of expression.

How did Full Fact campaign to change the Online Safety Bill?

Throughout 2022 and 2023, Full Fact worked with MPs and Peers from across the political spectrum to improve the legislation. We helped to table amendments that would better protect us all from harmful health misinformation, to improve freedom of speech, and to end the ability of internet companies to make unaccountable decisions for UK internet users from offices on the other side of the world. Some of these made it into law; some didn’t.

Will Full Fact continue to campaign to tackle misinformation?

Full Fact has long campaigned for regulation that tackles misinformation and protects freedom of expression online, and will continue to do so. Finding solutions to the spread of misinformation is core to our mission: we proactively raise the issue with decision-makers at every opportunity and run public campaigns when appropriate.

When the OSA became law, our first priority was to work with Ofcom as it consulted on and developed the regulations. With implementation of the OSA now well under way, Full Fact is calling for a review to determine whether it is capable of addressing the scale of harmful content circulating online.

Our policy calls

This should be accompanied by a wider reform agenda to tackle misinformation and disinformation and to improve the information environment, including:

  1. The Elections Bill should improve the OSA, introduce rules to deal with political deepfakes, increase transparency and tackle misinformation in political ads, and enhance the system for dealing with electoral information incidents.
  2. Clear protocols are needed to ensure that the government, regulators and other trusted voices can come together quickly to give accurate information during critical information incidents. Full Fact previously developed a framework for managing information incidents following a consultation.
  3. The government should revisit the media literacy duties that platforms have under the OSA - and the largest platforms should be given a duty to provide effective media literacy programmes which meet users’ needs. Media literacy and critical thinking skills should also be incorporated across the national curriculum.
  4. The government should also mandate platforms to extend their risk assessments beyond illegal content. This would allow the government, Ofcom and researchers to evaluate emerging threats and adapt legislation accordingly.