A news article about a doctor who died after receiving a Covid-19 vaccination was Facebook’s most viewed link in the US in the first quarter of 2021, a previously shelved report shows.
The piece – later updated after a medical examiner's report found no proven link to the vaccine – was popular with vaccine sceptics.
The New York Times claimed that Facebook initially held back its report because it would “look bad”.
Facebook said the delay was to make "key fixes".
The company had already published its “Widely Viewed Content” report for the second quarter of 2021, in which it found a word search promising to reveal “your reality” was the most popular post.
Similarly frivolous “question posts” formed most of the top 20.
But the New York Times revealed on Friday that the company had held back the earlier report covering January to March 2021.
NEW w/ @rmac18: Facebook released its first-ever quarterly report on Widely Viewed Content this week. But an earlier report from Q1 existed and was shelved, bc execs were scared it would make the company look bad. https://t.co/SYcqhb7MQK
— Davey Alba (@daveyalba) August 20, 2021
The paper alleged the report had not been shared because of fears that it would “look bad for the company”.
The most-viewed link was an article published by a mainstream US newspaper reporting that a doctor had died two weeks after getting a Covid-19 vaccine. The link attracted nearly 54 million views.
The article was subsequently updated to reflect the medical examiner's finding that there was insufficient evidence to conclude whether the vaccine was responsible for the death.
Health bodies around the world have deemed the vaccine to be both safe and highly effective.
The first quarter report also revealed that the 19th most popular page on the platform belonged to the Epoch Times, which has been accused of spreading right-wing conspiracy theories.
The widespread circulation of this story of a doctor who died two weeks after receiving a Covid-19 jab exposes just how fertile a breeding ground Facebook can be for anti-vaccination content.
This can be partly explained by a committed network of activists who, under a variety of guises, oppose coronavirus vaccines.
Promoting emotive, personal stories like this one on Facebook has been one of their primary tactics for scaring others away from getting jabbed – even when, as with this story, no link between the death and a Covid-19 vaccine was ever established.
Throughout the pandemic, these activists have muddled together real – and rare – stories of potential adverse side effects from vaccines with extreme online conspiracies, exploiting medical debates, genuine grief, and legitimate questions.
This also demonstrates the complexity of the disinformation ecosystem on social media – where users seize on a grain of truth, in this case an accurate news story, and spin it into a misleading narrative, without the facts to back it up.
I previously reported on how activists misappropriated the image of one woman’s foot on Facebook, after she took part in the Pfizer vaccine trials.
After the New York Times story was published, Facebook released the report.
A spokesperson for the company said: “We considered making the report public earlier but since we knew the attention it would garner, exactly as we saw this week, there were fixes to the system we wanted to make.”
According to Facebook, these fixes included dealing with bugs in some of the queries on which the report was based.
The firm's Andy Stone added more detail in a Twitter thread.
We’re guilty of cleaning up our house a bit before we invited company. We’ve been criticized for that; and again, that’s not unfair.
— Andy Stone (@andymstone) August 21, 2021
Both quarterly reports focus on what is most viewed in the US, rather than what is engaged with through likes, comments, and shares.
They paint a different picture to data gathered by researchers and journalists with CrowdTangle, Facebook's engagement-measuring tool, which suggests that right-leaning political content dominates the platform.
Facebook has fiercely pushed back against that idea, saying that only 6% of content seen by users is political.
But some misinformation researchers worry that Facebook is going cold on CrowdTangle.
The company did not answer a BBC question about whether the tool was under threat.