The social media giant has had enough: updates will soon reach Estonia that direct people looking to join anti-vaccination groups to a page with scientifically proven facts.
Time will tell whether this will have an effect on the thousands of people who make up the largest Estonian anti-vaccination group on Facebook, where world-famous rumors are spread using classic tricks.
The group and the misinformation it spreads are by no means a singular phenomenon of Facebook. There are hundreds of such groups all over the world.
Facebook has declared war on vaccination misinformation at a time when immunization is down everywhere in Europe. The World Health Organization finds that at least 95 percent of people should be vaccinated to ensure so-called herd immunity.
That would mean a large enough part of the population is vaccinated to also protect those who are not. The level of immunization in Estonia ranges between 90 and 95 percent, depending on the vaccine.
Misinformation as folklore
How does misinformation come about and reach people’s consciousness through social media?
Postimees and Anastasiya Astapova, who holds a PhD in folklore, analyzed the 8,300-member Facebook group made up of people doubting the necessity of vaccines as well as zealous proponents of the medical practice.
Astapova, a member of the Estonian Young Academy of Sciences, has studied fake truths and conspiracy theories and written on the subject of the anti-vaccination movement. She is also a member of the managing committee of an international COST research project, Comparative Analysis of Conspiracy Theories.
“Fake news can be a product of information warfare as politicians often use it to further their goals. From there, it becomes folklore: people gladly share such things when fake news corroborates their view of the world,” Astapova explains and gives an example.
“For example, there is an idea going around that the only reason we’re expected to immunize ourselves is that migrants are coming to Europe, bringing disease. Those who are anti-immigration are happy to share it.”
Astapova said that rumors spread primarily through people who interpret information selectively, favoring pieces of information that support their existing convictions. Other possible explanations are ignored.
Such convictions deepen in groups where like-minded people concentrate only on what fits their persuasion.
A closed system, such as a Facebook group of like-minded individuals, only echoes the same "truths" back and forth.
"The more you subscribe to anti-vaccination topics, the more convinced you become with every piece of fake news you see," Astapova explains. "And the more you talk to people who are anti-vaccination, the less likely you are to disagree."
Working with Astapova, we came across several posts in Estonia's largest anti-vaccination Facebook group that included misinformation as well as ordinary fake news. We will highlight a few such posts and show how fake truths are made to look credible.
First method: citing “authority”
A classic way to render your arguments more convincing is to attach a seemingly expert source, such as a scientist, to your information. The expert might really be a scientist, just not in the field or position they are said to hold in relation to the facts presented.
Estonia's largest anti-vaxxers' group contains eight posts whose text or comment sections cite a study by Harvard researcher Tetyana Obukhanych. This piece of fake news first got started in 2017 and has been shared in the Estonian group that year, last year, and several times this year.
In short, the study concludes that unvaccinated children do not pose a threat to those who have been immunized.
"Such a pleasant read ;)," writes Katrin after sharing the post. "'Harvard immunologist's study reveals unvaccinated kids no danger to vaccinated ones.' Great translation :)," Anne posts, having previously shared it. The "news" is false, however, according to Snopes, one of the world's leading fact-checking websites.
While Obukhanych does have a PhD in immunology, her piece is no study, much less a Harvard one. It was an open letter she wrote in 2015, published on the anti-vaccination site Thinking Mom's Revolution.
Second option: citing “research”
Another way to lend your talking points more credibility is to present them as scientific discoveries when they are in fact fake news.
In 2018, a particular post on a medical conspiracy theories blog took off. The title reads: “60 lab studies confirm vaccine you probably got as kid linked to cancer.” The post was also shared on the Estonian Facebook group.
"An article on how vaccines administered when you were a child can lead to cancer. It has now been discovered that certain vaccines were contaminated with carcinogenic viruses in the past, which was previously hushed up," Britta wrote in the Estonian group after sharing the post. This particular piece of "news," which includes its fair share of misinformation, was making the rounds for the second time in 2018. It first appeared seven years earlier and included no new discoveries even then, Snopes suggests.
The eye-catching headline hides common knowledge of how the US and a few other countries administered contaminated polio vaccines to children in the 1950s and 60s. The post in question, however, concentrated on the HPV vaccine, which has never been contaminated. The decades-old polio vaccine has nothing to do with its modern counterparts.
Method number three: series of “facts”
Astapova said that fake news is also made more credible by presenting piles of so-called facts that, when checked one by one, turn out to be false.
For example, five years ago news went around that a tetanus vaccine contaminated with chemicals used for sterilization had caused half a million women in Africa to become infertile. The story has been shared in the Estonian group at various times.
"I'm left speechless. I thought we were living in the 21st century. I must have the timeline wrong. Our mainstream media has never heard of it, which is horrifying to be honest," wrote Ott, who shared the post. "I suppose the HPV vaccine is used for the same reason here. It doesn't immediately show when you immunize teenagers," Ene added.
This news is also false, according to fact-check site Snopes.
Kenyan priests claimed to have analyzed a vaccine and come across a substance used in contraception drugs still in development. UNICEF refuted the claim, saying that Kenya does not even have laboratories where such analyses could be carried out. There is no proof that the vaccine was contaminated in any way or that anyone became infertile as a result of it. Nevertheless, rumors of links between vaccines and infertility are widespread around the world.
Game of cat and mouse with misinformation agents
It is virtually impossible to precisely gauge the effect of social media misinformation on declining vaccination figures. Distrust of vaccines also arises from other factors, not least the fact that most people under 60 have never experienced any of the diseases they are being vaccinated against.
Suspicions are easily created in this situation, experts say.
“I do not believe Facebook is to blame [for the spread of misinformation], while we do play a part. That said, I do not want to wait to find out whether Facebook is the reason or not,” Hirsch says. “That is why we have decided to combat the spread of misinformation.”
That is the reason Facebook pushed an update in early September. When a user searches for something to do with vaccines, they are first shown an information window that directs them to a WHO website with the latest info and facts on immunization.
If a user is invited to join a page or group on vaccination-related topics, the information window is displayed before one can join.
The social media giant first took on anti-vaccination rumors in January, after having spent two years combatting fake news in general.
"We became increasingly aware that anti-vaxxers were using our platform to spread misinformation. At the same time, the World Health Organization named vaccine hesitancy among the ten most serious healthcare problems in the world. Those two things pushed us to action," Hirsch said.
A number of other changes have been made over the past nine months. Any group that consistently posts misinformation can be pushed to the back of search results, removed from recommendations or have its posts removed from users' news feeds. Facebook can also strip groups and pages that disseminate false information of the option to raise funds.
It is likely Facebook’s recent update won’t be the last.
“We are in a constant game of cat and mouse with users who want to break our rules,” Hirsch says. “We will see how these updates work and take it from there.”