Many of us spent more time online in the past year after public health officials encouraged us to distance ourselves from each other and stay home during the coronavirus pandemic.
It was the safer way to live. But the online world has its dangers related to the virus, too.
The internet has been a breeding ground for misinformation about the pandemic, so much so that the World Health Organization declared the flood of false and misleading claims surrounding COVID-19 the first “infodemic.”
At Virginia Tech, researchers have been studying the phenomenon. With a $25,000 grant from the university’s Fralin Life Sciences Institute, research assistant professor Michelle Rockwell formed an interdisciplinary team focused on learning more about how misinformation on social media influences people’s plans to receive a COVID-19 vaccine.
Rockwell said she was first interested in social media messaging around health care more generally, “but how could we be interested in anything else (but the vaccine) right now?”
Misinformation is different from disinformation. While the latter is spread deliberately to deceive or sow confusion — think Russian hackers — the former is simply false information that gets circulated, often by people who believe it to be true.
The Virginia Tech team decided to focus its study on how misinformation is perceived in the Appalachian region, believing that community to be particularly vulnerable due in part to historically worse access to quality health care.
The researchers designed a social media simulation to see if a post warning people to look out for misinformation would affect people’s perceptions of stories they saw afterward. Rather than a flag — a type of warning label that Facebook, for example, has started adding to posts notifying users to possible misinformation — the warning they used was a social media post itself, discussing misinformation about the vaccine.
The study participants — 1,048 people in the 13-state Appalachian region, half in rural areas and half not — saw either a neutral post or a warning post in their social media feed. It was designed to look like it was coming from different sources, including Dr. Anthony Fauci, the chief medical adviser to President Joe Biden; a local hospital; the Centers for Disease Control and Prevention; or the individual’s health provider.
After seeing such a warning, people were 40% less likely to rate an erroneous story about the coronavirus vaccine as accurate, and 60% less likely to share it, Rockwell said. The researchers call that a “nudge” from a trusted health influencer.
The Virginia researchers based the misinformation test posts on four COVID-19 vaccine myths they’ve seen persist online:
- That the vaccine can cause infertility. Almost half of respondents to the Virginia study said they believed this. The myth likely originated in a letter sent to the European equivalent of the FDA, according to health news organization STAT. The letter erroneously suggested that the coronavirus spike protein targeted by the vaccine resembles syncytin-1, a protein vital for forming the placenta, and that the vaccine could therefore prompt an immune attack on it. In fact, the two proteins are not similar enough for that to occur, and coronavirus vaccines neither contain syncytin-1 nor instruct the body to produce it. There has been valid debate over whether pregnant women should opt for a vaccine. Medical experts now say they should, because the risk of complications from a COVID-19 infection outweighs possible vaccine side effects.
- That there is a high risk of serious side effects, including paralysis. About 40% of respondents believed a story about a high percentage of vaccine recipients experiencing such side effects. Here’s the truth: the most common side effects are swelling, redness and pain around the site of injection, as well as headache, muscle pain, nausea, fever and chills for a short time following the shot. Women have been reporting more side effects than men, possibly because of their stronger immune response. But extreme effects, like paralysis, are not on the list. There’s also an inherent perception problem, experts say: If someone receives a vaccine and then experiences a health problem that might have happened anyway, such as a heart attack, people could draw conclusions that they were related.
- That Microsoft founder and billionaire Bill Gates created the vaccine to install microchipped tracking devices in people. About 40% believed this. It’s simply not true. The rumors may have been propelled, as the BBC reported, when Gates said in an interview last year that eventually we “will have some digital certificates” showing who had recovered or been tested for the virus. He was referring to the idea of an open-source digital platform to share information, the Bill and Melinda Gates Foundation later clarified. There was never any mention of microchips. The foundation has pledged millions of dollars to efforts fighting the coronavirus, particularly in low-income countries, but was not involved in developing the vaccines currently on the market.
- That vaccines aren’t real because the coronavirus and pandemic aren’t real, either. Less than 20% believed this. To date in Virginia, more than 626,000 people have been infected with the coronavirus, more than 26,000 have been hospitalized and more than 10,000 have died.
The myths “are sure hard to budge,” Rockwell said.
So how best to combat the infodemic?
“There are gobs of information, and it’s evolving so quickly,” she said. “A very subtle pause and reminder that this (misinformation) is out there, could make such a powerful difference.”
That’s especially true, they found, when the source is your own doctor.
In a survey, the Virginia Tech team found that trust in science and trust in health care in general were the biggest predictors of a person’s readiness to get a COVID-19 vaccine. Younger people and those in more rural areas tended to be less likely to say they’d get a shot.
But when asked who was the most trusted health messenger, or where they’d prefer to get a vaccine if they did, primary care physicians “far and away” topped the list.
As part of the grant, the team is now working with business analytics researchers to track language about the vaccine across Twitter and Reddit through a process called context mapping.
In the meantime, researchers hope their early findings can help inform public health officials on how to best get people accurate information from sources that they trust.