How Ofcom’s transparency reporting guidance is an opportunity to fight misinformation

16 October 2024 | Robert Cann

Fact checkers such as Full Fact are on the frontline in the battle against misinformation spreading on internet platforms. But we could be far more effective if platforms provided better information to guide us to the most harmful claims being made, so that our fact checks address the most important content. At the moment, access to such data across the major platforms is patchy at best.

This access is vital in helping us decide what is most important to check each day, and would allow us to assess trends, topics and narratives as they form. In practice, this means content flagged by platform users, spotted by trust and safety teams, or identified by algorithms.

So, where does Ofcom come into this? Under Part 4 Chapter 5 of the Online Safety Act, major platforms (known as ‘categorised services’ in the Act) must produce annual ‘transparency reports’ for the regulator about certain aspects of their service. These aspects are set out in Schedule 8 of the Act and include things like the service’s method of performing age verification, or tackling illegal content, to name but two.

Ofcom has recently been consulting on its guidance for this duty, and Full Fact has responded to that consultation. We called for platforms to be required to report on:

  1. The prevalence and spread of misinformation
  2. How the service enables users to flag and report suspected misinformation
  3. The extent to which the service uses independent fact checkers to identify and flag misinformation
  4. The service’s policies and protocols for providing access for independent fact checkers and researchers to data that would enable monitoring the spread of potential misinformation. 

Currently, the above provisions are not spelt out in the Act. However, the Act does introduce new responsibilities for Ofcom to: ‘understand the nature and impact of disinformation and misinformation, and reduce their [the public’s] and others’ exposure to it’.

Furthermore, Ofcom is required by the Act to set up the Advisory Committee on Disinformation and Misinformation, and it would undoubtedly be of great benefit to the functioning of this Committee if data related to the spread of, and measures to mitigate, misinformation on categorised services were readily available via transparency reports. 

Taking both of these responsibilities together, we recommended that Ofcom encourage platforms to report on all matters related to misinformation. That way, the regulator will have a full picture of misinformation mitigation measures across all the major services. Furthermore, point (1) above places the onus on services to identify in transparency reports what they consider to be ‘misinformation’ on their platforms: those without independent fact checkers may find it harder to provide this information, which further demonstrates why access for fact checkers is so important.

Paragraph 20 of Schedule 8 (‘any other measures taken or in use by a provider which relate to online safety matters’) does provide a route to incorporate reporting on misinformation - albeit a weaker one than if it were spelt out in the Act - and we feel that this could achieve some level of reporting on misinformation. Ideally, however, the law itself would be strengthened, and we will continue to campaign for that change.


Why access for fact checkers must be safeguarded

At the moment, platforms are unfortunately moving in the opposite direction and shutting down services designed to help fact checkers. Meta’s CrowdTangle is a case in point: it was closed down in August and replaced by a new product called the Meta Content Library. Meta is continuing to work with the fact checking community to make this new product fit our needs, but in the short term at least, it has left us and our community with a weaker product than before. If Ofcom required Meta to report on the kind of access it provides for fact checkers, this information would be ‘on the record’, and Meta would need to explain why it has moved to a lower standard of access for fact checkers - and by extension a system likely to be less resilient to misinformation.

Meanwhile other services, such as X, are actively hindering transparency: a previously free API has recently been placed behind commercial terms. Ideally this would be made available for free to civic organisations and fact checkers, but it currently carries a huge licence fee (more than $5,000 per month) that substantially diminishes our ability to monitor the platform at scale.

We know that the impact of misinformation on society is huge. Whether it is vaccine hesitancy fuelled by conspiracy theories about side effects, or the false information that circulated in August 2024 in relation to violence across England, Full Fact uncovers misinformation at scale on a daily basis. We, and organisations like us, need to be enabled to do our job to the best of our ability.

If Ofcom moves to require misinformation mitigation measures to be reported by the major platforms, it will be a step in the right direction.

