Stop me if you’ve heard this one: over the last 15 months, your social media presence has become overrun with “COVID denialists,” anti-vaxxers, and other outspoken skeptics of accepted science.
They co-opt your comment threads to spread conspiracy theories and harass people who support vaccines, wear masks, and believe SARS-CoV-2 exists.
Maybe you don’t even work with vaccines or public health, and yet your Facebook comments section has become a makeshift court of scientific veracity, saddling you with a real (and unsolicited) responsibility to prevent your platform from being used to spread false and potentially harmful information.
This is vital work, but it’s costly. You don’t want to ignore organic activity on your platforms, especially if a little education could help change minds and reinforce the trustworthiness of your online presence. But what use is a Facebook page if you have to periodically purge all the comments?
Talk Science to Me has a few clients facing this dilemma.
To better understand how misinformation and disinformation propagate on social media, we spoke with Tara Haelle, a science and multimedia journalist whose TED Talk, “Why Parents Fear Vaccines,” encourages honest and compassionate conversations with people who are skeptical.
“Social media revolutionized how we understand and interact with our world,” says Haelle. “There’s this quote… ‘a lie can travel halfway around the world while the truth is still putting on its shoes.’ Social media enables that in a way that hadn’t [really been seen] before.”
(The quote is often attributed to Mark Twain, but likely originates from Jonathan Swift.)
Haelle acknowledges that much of the current wave of online anti-science activity is a reaction to the circumstances of the pandemic.
“A lot of what people are seeing now is not new—there is a ramping up, it’s more broadly disseminated, because we’re in a pandemic, so there’s more of an audience for it,” she explains. “It’s important to recognize the weaponization of [social media] has been going on for years.”
Yet despite the broader, more systemic factors that lead people to engage in anti-science behavior online, the stakes remain personal. Haelle believes that understanding “nerd nodes”—a term borrowed from science journalist Emily Willingham to describe the people in our personal networks we trust about certain forms of knowledge—is key to confronting anti-science propaganda.
“If your nerd node is someone who isn’t pro-science, then you will be taken more easily, because you have trust in that person,” Haelle says. “Combine that with the reality of the number of people who are looking for that information—you can see why the number of people being taken in by the misinformation increases.”
Haelle also makes a point of distinguishing between misinformation, which may or may not be spread maliciously, and disinformation, which is intentionally spread. Both types of information piggyback on the trust bonds we form within our communities, although not necessarily toward the same goal.
Folks spreading misinformation might be doing so out of genuine concern for themselves and those around them. And someone spreading bad information in good faith has the capacity to be compelled by better, more accurate information.
Here’s the tricky part, especially for those of us who have grown accustomed to the ease and leisure of dismissing people we feel are in the wrong with sarcasm, mockery, or “dunking.”
“In your everyday life, everyday people have a lot of power by sharing information, non-judgmentally, on social media. And if someone challenges you on it, respond with accurate information. Don’t let it become a shouting contest,” says Haelle. “Continually sharing accurate information without snark, without making snide remarks, without making presumptions, that will win people over more than any other thing you can do.”
The good news is that anyone can do that work, even without formal scientific training, though a science background certainly helps when explaining the mechanics of an mRNA vaccine or how PCR technology has advanced in the years since its inventor, Kary Mullis, supposedly claimed it couldn’t be used to make medical diagnoses.
Mullis died shortly before the COVID-19 pandemic took hold. A skeptical scientific pioneer dying shortly before his supposed theories about his own technology would be put to the test is a compelling narrative—one that, when shared by “nerd nodes,” can have a profound impact on people trying to protect themselves and their families.
This is where the better news comes in: confronting scientifically inaccurate discourse effectively might be a matter of telling equally compelling stories rooted in accurate science and current data.
“Storytelling is the dominant method of understanding our world and education,” says Haelle, so it makes sense to tell better stories to combat inaccurate ones. “The more trusted the source, and the more emotionally compelling [the] stories, the more successful.”
We all have the capacity, as individuals and organizations, to tell effective stories.
If you’ve ever told a story at a party, you have the power—and the social responsibility—to speak out in the face of misinformation. Turn your evidence-based knowledge about vaccines or variants into a compelling narrative that keeps your social media platforms focused on accurate information—and might even win over some skeptics.
Talk Science to Me provides science writing, editing, and social media moderation services to help you connect with current and potential audiences. If you’ve got a story we can help you tell, get in touch with us through our website, Facebook, or Twitter.
Learn more about Tara Haelle’s work at her website.