Facebook has said it will remove false claims about new coronavirus vaccines after it was revealed that jabs could be rolled out next week.
The social media site will remove disinformation – including claims that vaccines contain microchips or anything else not on the official ingredients list – but warned it “will not be able to enforce this policy overnight.”
In October, the company announced it would ban ads that discourage people from getting vaccines.
This now also applies to new Covid-19 vaccines.
A Facebook spokesperson said, “We apply our policy to remove misinformation about the virus that could lead to imminent bodily harm.
“These could be false claims about the safety, efficacy, ingredients or side effects of the vaccines.
“For example, we will remove false claims that Covid-19 vaccines contain microchips or anything else that is not on the official vaccine ingredients list.
“We will also remove conspiracy theories about Covid-19 vaccines that we know today to be false, such as using specific populations without their consent to test the safety of the vaccine.”
Following Wednesday's announcement of the Pfizer/BioNTech vaccine, a wave of misinformation spread online, with vaccine opponents posting widespread allegations across various social media platforms.
Full Fact, an independent fact-checking charity, has partnered with Facebook to tackle disinformation.
On Wednesday, its editor Tom Phillips told the PA news agency: “We’ve seen many internet platforms take tougher measures against vaccine misinformation, and I think that’s the right approach. Could some of them go further? Yes, possibly.
“But at the same time it is important to remember the importance of freedom of expression. It is not illegal to have questions or concerns about the vaccine, and it is important that we do not respond solely by trying to suppress those questions. We should enable people to ask the questions, get good-quality answers and make decisions based on good-quality information.”
The site said it will continue to regularly update the list of claims it removes based on current guidance from public health authorities.
Between March and October, Facebook and Instagram removed 12 million pieces of misinformation related to Covid-19.
In April alone, it placed warning labels on approximately 50 million pieces of content, with 95% of people who saw the label not clicking further to view the content.
Between March and October, it placed warning labels on 167 million pieces of content.
The spokesperson added: “We have referred more than two billion people worldwide to authoritative information from public health authorities such as the WHO (World Health Organization) and, in the UK, the NHS, and we will continue to help people stay informed about these vaccines by promoting authoritative information resources through Facebook’s Covid-19 information centre.”