We’ve seen many people sharing a video on social media that seems to show a kiwi fruit testing positive for Covid-19.
It follows a similar video, in which a glass of Coca-Cola also appeared to test positive.
The people in the videos, and many of those sharing them, have claimed that this proves that rapid tests for Covid are not reliable because they often give false positive results. This is not correct.
Research shows that approved lateral flow tests, like the ones in these videos, are highly unlikely to give a false positive result when used to test people in the correct way.
Coca-Cola and fruit juice are much more acidic than a sample taken from a human body during a Covid test. It seems likely that this acidity disrupts the chemistry of the test, making it appear to give a positive result.
What happened in the video?
The footage seems to come from a longer video posted on 23 December on the YouTube channel of Massimo Mazzucco, an Italian film-maker whose films include many conspiracy theories.
In the video, a man identified as Domenico D’Angelo applies what appears to be a Joysbio lateral flow Covid test to a kiwi and several other fruits and vegetables. The test gives what appears to be a “positive” result in the case of the kiwi, the orange and the berry fruit juice.
One of the doctors identified in the video, Dr Mariano Amici, concludes by claiming that the experiment shows that the Covid tests are “absurdly unreliable”. But this is incorrect, because the experiment did not use the tests properly.
The instructions for the Joysbio test kit say that “This kit has been evaluated for use with human specimen material only… Correct specimen collection and preparation methods must be followed… Failure to follow the test procedure may adversely affect test performance and/or invalidate the test result.”
What is special about Coke and kiwi fruit?
The Coke “test” was conducted by the Austrian politician Michael Schnedlitz, producing a supposedly “positive” result—but again, this is not the correct use of the test.
The manufacturer, Dialab, has also pointed out that Mr Schnedlitz put a sample of the drink directly into the test device instead of mixing it first with the buffer solution, an important part of the test which is designed to stabilise the sample’s acidity. The company said this “destroys the buffer layer and makes the positive marker visible. This result would also be expected if any other manufacturer had such an application.”
According to Annette Beck-Sickinger, a biochemistry professor quoted by the international fact checking site AFP Fact Check, the buffer in a rapid test would in any case not be able “to neutralize large amounts of acid (like apple or mangoes)”.
However, the fact that these rapid tests can effectively be broken with acidic drinks or fruit does not make them unreliable for use in the general population. “If you completely ignore the manufacturer’s instructions or in fact use the test for something completely different, then you shouldn’t really be surprised if you get a silly result,” says Dr Alexander Edwards, an Associate Professor in Biomedical Technology at the University of Reading, who spoke to us about the fruit and Coke tests.
“It’s a bit like saying your fire alarm is not very accurate because when I hold a lighter under it, it goes off—but there isn’t a fire in the house!”
Are these tests reliable?
Diagnostic tests are generally evaluated in formal studies before governments will approve them for use in their country.
Not all of the regulatory evaluations are complete yet, but the evidence so far shows that these rapid Covid tests very rarely give false positives in the real world.
For example, the Dialab test correctly identified 130 out of 130 negative samples in one evaluation. Joysbio says that an independent evaluation in Italy found that its test correctly identified negative samples in 382 out of 384 (or 385) cases. (It is listed as undergoing evaluation in one country, as of 11 December at the time of writing.)
Not every test from every manufacturer is perfect, of course. The European Centre for Disease Prevention and Control conducted a general review of the accuracy of a range of rapid tests up to the end of August. It found studies reporting a specificity—the rate at which genuinely negative samples are correctly identified—of between 80.2% and 100%.
That is why most governments insist that tests meet minimum standards before approving them. In the UK, the government has begun using large numbers of Innova rapid tests. When trialled in labs and in real-world settings such as hospitals, schools and testing centres, they correctly identified negative samples 99.68% of the time. They also proved extremely good at identifying negative samples in a pilot in Liverpool.
What about false negatives?
Although rapid tests are generally good at spotting negative samples, they are more likely to miss positive ones. In other words, they are very unlikely to make healthy people think that they have Covid—but they might make a fair number of infected people believe that they aren’t.
In its UK evaluation, the Innova test correctly identified 76.8% of positive samples, missing 75 out of 323.
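To show how figures like these are derived, here is a minimal sketch that computes specificity and sensitivity from the raw counts quoted above. The function and variable names are our own illustration, not taken from any evaluation report.

```python
# Specificity = true negatives / all genuinely negative samples tested.
# Sensitivity = true positives / all genuinely positive samples tested.

def specificity(true_negatives: int, total_negatives: int) -> float:
    """Share of genuinely negative samples the test correctly flags negative."""
    return true_negatives / total_negatives

def sensitivity(true_positives: int, total_positives: int) -> float:
    """Share of genuinely positive samples the test correctly flags positive."""
    return true_positives / total_positives

# Dialab evaluation: 130 of 130 negative samples correctly identified.
print(f"Dialab specificity: {specificity(130, 130):.1%}")

# Innova UK evaluation: 323 positive samples, of which 75 were missed.
print(f"Innova sensitivity: {sensitivity(323 - 75, 323):.1%}")
```

Running this reproduces the article's figures: a 100% specificity for the Dialab evaluation and a sensitivity of roughly 76.8% for the Innova test, which is why false negatives, not false positives, are the main concern.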
When a positive sample is missed, it is called a “false negative”. This makes rapid Covid tests controversial among experts, who disagree about whether the benefits of quickly identifying lots of infected people will outweigh the risk of falsely reassuring others.
However, because false positives are so rare, these rapid tests are unlikely to significantly inflate the number of people recorded as infected with Covid.