Since the start of the pandemic, information about the coronavirus, both true and false, has circulated across social media platforms, but misinformation has spread substantially more on Facebook than on any other social media app.
Last month, Facebook announced that it would apply tougher measures to tackle the mass of misinformation circulating across the app. The recent surge in coronavirus content coincides with the UK's rollout of the vaccine.
After nearly a year of false information about the pandemic, the platform pledged last month to take down falsehoods circulating about the virus and the vaccine. The pledge listed a range of clear-cut false statements that would be removed if posted, including claims that the vaccine is untested or that it contains dangerous chemicals that kill people.

However, many of these posts gain thousands of likes, comments, and shares before being taken down. Many also include conspiracy theories and shared media, which has heightened concern that such content is leading people to believe the entire pandemic is not real.
Since Facebook updated its policies on covid misinformation last month, at least 3,200 posts containing false claims about the vaccine have appeared, most of which received a large amount of interaction while also being shared in forums and communal Facebook groups.
Data from the insights tool CrowdTangle shows that posts containing banned information receive over 12,000 interactions before action is taken. Vaccine minister Nadhim Zahawi told Sky News that the government was battling a "tsunami of disinformation" around the vaccine and the pandemic as a whole.

Although Facebook pledged to take down false narratives about everything coronavirus-related, it acknowledged that it would not be able to do so overnight. In the meantime, Facebook and the fact-checking organisation Full Fact will work together to run a series of online ads over the next couple of months to help users identify false information about covid-19 and other topics using a three-step checklist: check the source; check how it makes you feel; check the context.
With the assistance of the three-step checklist, users should be better able to distinguish accurate information from false information. However, many Facebook users who knowingly spread false information about the virus often change the settings on their posts to make it more difficult to report them to Facebook.
The peak in Facebook misinformation comes as the UK has administered more than 20 million first-dose vaccines. Authorities are concerned that misleading posts attracting such high levels of interaction might deter people from accepting the jabs.
Some users believe the vaccine changes your DNA or can cause infertility in women, neither of which has been proven to be true. However, it can be argued that after seeing a claim a certain number of times, one may begin to believe it.

Ultimately, researchers acknowledge that Facebook is not fully aware of the reach of its own platform, especially when misinformation spreads at such a rate. Some believe that Facebook did not anticipate false and potentially harmful information spreading on such a scale.
Others argue that even if Facebook had the means to stop the spread of misinformation, why and how it spreads still needs to be investigated. Facebook is still working out how it will deal with misinformation, and it is uncertain what policies it may implement in the future.