Social media of all kinds played a key role in the battle against COVID-19, given the volume of its users, the amount of news and information circulating on its platforms, its ability to employ new artificial intelligence techniques to track false information, and its desire to support global health institutions by promoting their policies and goals. Social media platforms have been updating their policies to combat false and misleading news about the COVID-19 pandemic, employing tools and mechanisms ranging from banning to prevention and restriction, amid questions about the nature and effectiveness of these policies and their most prominent implications.
Social media platforms, specifically Facebook, Twitter and YouTube, have adopted a set of policies to combat the spread of falsehoods and misleading news related to COVID-19, which can be summarized as follows:
1. Deleting misleading information and videos:
On August 18, 2021, Facebook announced that it had removed 20 million posts from its network and from Instagram for violating its COVID-19 rules since the outbreak of the virus. It also added labels to more than 190 million pandemic-related posts on Facebook that third-party fact-checking partners had rated false or incomplete. On September 19, the company revealed a new strategy to limit real accounts that publish content that may cause digital harm, by reducing how widely content from these accounts reaches other users.
Moreover, near the end of September, YouTube announced that it would remove content claiming that approved vaccines are dangerous or questioning their health effects or efficacy, in a clear expansion of its efforts against vaccine misinformation. That included videos linking vaccines to autism, cancer, infertility and other conditions.
2. Banning counter hashtags:
Despite being criticized by the White House for its role in spreading anti-vaccine misinformation, Facebook did not move to ban the hashtag #VaccinesKill, under which users claimed that vaccines eat away at people's brains and that mysterious forces are pursuing a plan to limit the population, until CNN asked the company last July why a page full of anti-vaccination misinformation was so easy to find on its platform. The so-called ‘disinformation dozen’, criticized by President Biden and identified in a report by the Center for Countering Digital Hate as ‘super spreaders’ of anti-vaccine propaganda on Facebook's platforms, were banned.
3. Blocking channels on YouTube:
In late September 2021, YouTube blocked several channels that spread misleading news about vaccines, including video channels linked to prominent vaccine opponents such as Joseph Mercola and Robert F. Kennedy Jr., whom experts hold responsible for slowing down vaccination rates. In line with the guidelines of local and international health authorities, YouTube also deleted 15 videos from Brazilian President Jair Bolsonaro's channel in July 2021 for spreading false news that cast doubt on the COVID-19 epidemic and recommended drugs with unproven efficacy.
On September 28, YouTube permanently deleted the German-language RT and DFP channels of the Russian RT network, citing violations of its posting rules. The Russian Foreign Ministry responded on Twitter that the measure amounted to blatant censorship and suppression of freedom of expression, adding that Moscow was considering retaliatory measures up to blocking YouTube. In a statement, Russia's media regulator likewise accused YouTube of practicing censorship and demanded that the channels be restored. Meanwhile, in a statement published on Twitter, the Permanent Mission of Russia to UNESCO called on the organization to respond immediately to YouTube's actions, saying it firmly condemned the "information aggression" and reserved the right to raise the issue at the meeting of UNESCO's Executive Committee.
In August, YouTube suspended Sky News Australia from its platform for a week for spreading false information about the COVID-19 pandemic. The decision came after a review of videos uploaded by Rupert Murdoch's television channel, which has 1.86 million subscribers and a conservative following outside Australia. Some of its posts, which question the existence of the pandemic and the effectiveness of vaccines, had spread widely around the world.
4. Promoting public debate:
YouTube confirmed that it will continue to allow videos about COVID-19 vaccines, new trials, and historical successes and failures in general, as well as related personal testimonies, with the aim of promoting public discussion. This includes testimonies that may be given after receiving any of the vaccines, discussion of approved policies and historical records, and other content that does not contain misinformation or question the usefulness of vaccines.
5. Content review:
In August 2021, YouTube defended the techniques it uses to moderate content, stressing that priority is given to known and trusted sources such as the WHO. Meanwhile, many have argued that YouTube lets misinformation spread for financial profit, since such content attracts users. Conversely, YouTube officials emphasized that detecting misleading content is not always easy, stating that they rely on the opinions of experts at health organizations. However, detecting fake news may be much more difficult in other cases.
6. Responding to systematic campaigns:
Last August, Facebook shut down a disinformation operation that sought to spread false information about COVID-19 vaccines by deceiving influencers on platforms in India, Latin America and the US into supporting false claims, run by the British marketing firm Fazze from Russia. Facebook described the operation as ‘disinformation laundering’ that sought to lend credibility to false allegations by having them disseminated by reputable figures. “Be careful when someone tries to tell you a story. Do your own research”, Ben Nimmo, head of Facebook's global threat intelligence division, said at a press conference. Facebook said that in July it removed 65 Facebook accounts and 243 Instagram accounts linked to the campaign and banned any dealings with Fazze.
Social media policies in dealing with false news related to COVID-19 point to several conclusions, which may be detailed as follows:
1. Responding to the accusations:
YouTube's measures come at a time when social media platforms face accusations of leniency toward promoters of misinformation about COVID-19. President Biden had said that misinformation about COVID-19 and vaccination on social media is killing people, at a time when the Delta variant was driving up infections, and he called on these platforms to make an extra effort to combat misinformation. Meanwhile, on October 6, 2021, Maria Van Kerkhove, a technical official at the WHO, warned that we are not out of the woods in the fight against the pandemic, even if many people thought it was nearly over. She criticized the false and misleading information about COVID-19 circulating on the Internet: "It is resulting in people dying. There is no way to sugar-coat that", she said. On the other hand, Matt Halprin, Vice President and Global Head of Trust & Safety at YouTube, stressed that vaccine misinformation was a global problem that had spilled over beyond falsehoods about COVID-19 jabs, saying, “Vaccine misinformation appears globally, it appears in all countries and cultures".
2. ‘A scapegoat’:
Social media platforms have been emphasizing their deletion of misinformation about COVID-19, including claims about the origin of the virus, fake prevention methods, the efficacy of vaccines, and so on. Nevertheless, Facebook remains the subject of US accusations, to the point of being accused of involvement in the killing of citizens through the misinformation and false news it spreads. Some attribute this to the decline in Biden's popularity since the US withdrawal from Afghanistan, along with worsening economic conditions at home and the rising number of citizens infected with COVID-19. This prompted Biden to turn social networks into a ‘scapegoat’ through which he justifies his declining popularity and the rising rates of the pandemic.
3. Facing pressure:
Social media companies are facing great pressure to remove content opposing COVID-19 vaccination. The issue of misinformation about vaccines has gained so much momentum that Facebook and Twitter have in recent months expanded their policies to combat misinformation not just about COVID-19 vaccines, but about vaccines as a whole. However, unlike Facebook, which has announced plans to reduce misinformation about all vaccines, YouTube's policies address only a specific subset of claims about COVID-19 vaccines that run counter to official guidance from the WHO and other authorities.
4. Imperfect effectiveness:
Last August, YouTube announced that it had intensified its efforts, deleting more than a million videos containing misleading and dangerous information about COVID-19 since February 2020, in light of the accusations leveled at social networks of contributing to the spread of misleading news about COVID-19 and its vaccines. Despite this, the platform has facilitated the spread of a great deal of falsehood about the pandemic, most notably in May 2020, when a controversial anti-vaccination video titled “Plandemic” was viewed more than 7 million times before it was removed.
5. An existing clash:
YouTube's policies caused an inevitable clash with Russia. On September 29, Margarita Simonyan, Editor-in-Chief of the RT network, revealed the material that YouTube had classified as false: essentially four videos containing calls for more rationality in fighting COVID-19, criticism of the ways protective masks are used, expressions of hesitancy about vaccines on the grounds of their genetic safety, and criticism of Germany's policies to combat the virus. This shows that social media's policies on misinformation are not necessarily welcomed internationally; critics invoke freedom of expression, the need to present multiple points of view on the same topic, and the principle that channels are not responsible for claims made by individual speakers. The result was Russian threats to ban YouTube.
In conclusion, social media's continued dissemination of misinformation arguably contributes to spreading a ‘conspiracy culture’, promotes extremist views, and circulates false information. The core problem is that social networks, by their very nature, have limited power to prevent the spread of misinformation and harmful content, meaning such false news circulates thousands or even millions of times before it is banned. This reflects a major deficiency in the combating efforts, which are certainly not limited to “COVID-19 falsehoods”.