Spring 2023
THIS ISSUE

Misinformation/disinformation’s impact on public health

article summary

Is it really possible to "do your own research" when the internet's data voids and search engine algorithms contribute to the spread of misinformation about public health?

By now, either you or someone you know may have said something like, “I’m doing my own research.” Of course, we hear this most recently around COVID-19 vaccines. But does the internet actually allow for that? Is it even possible to “do your own research?”

Francesca Tripodi, PhD, is an assistant professor at the UNC School of Information and Library Science and a media scholar whose research examines the relationship between social media, political partisanship and democratic participation, revealing how Google and Wikipedia are manipulated for political gain. “A lot of my research thinks about the ways in which the way we see the world shapes the kinds of keywords that we put into search bars.”

“For example, in the Google search bar, your inputs are your geographic location or your search history, but your results are primarily driven by the keywords that you put in — what is your query? They then match this query with what information scientists refer to as ‘relevance,’ and this relevance is highly connected to the keywords that you start with.”

The problem, however, is that these keywords can be quite polarized, and even the most enlightened among us might not be aware of our own implicit biases.

A simple but helpful example is what happens when you search “illegal alien” versus “undocumented worker.” You will get dramatically different returns that likely confirm your existing beliefs because they’re driven by relevance. Tripodi explains, “Google is taking those words and attempting to match them with all the information that they have stored in their database. So those keywords are going to largely drive the results that are returned. We tend to think of Google as a giant library, but they’re not really a helpful librarian. They’re a multi-million-dollar industry driven by stakeholder interests.”

Author Kurt Andersen won a Peabody Award for his work as host of public radio’s “Studio 360.” His 2017 book Fantasyland: How America Went Haywire — A 500-Year History explores the theory that America has always been a place where people were attracted to impossible “land of milk and honey” dreams.

“The large problem is when we, as a society, can’t agree on facts,” he said. Clearly, that presents a problem when dealing with public health crises. Opinions can trump facts, sometimes with deadly results.

Cynthia Miller-Idriss, PhD, is a professor at the School of Public Affairs at American University in Washington, D.C., where she is the director of the Polarization and Extremism Research and Innovation Lab, or PERIL.

“Conspiracy theories don’t just operate by one person going into YouTube and getting down a rabbit hole of bad information. The notion that communities believed bad information has been around for as long as time.”

— Francesca Tripodi, PhD

She points to massive disinformation campaigns that are undermining faith in elections and confidence in the electoral system. In terms of public health, the undermining of scientific expertise also contributes to threats against health care workers and county officials who set health mandates.

Of course, one of the biggest accelerants of disinformation and conspiracies is the rise of the internet, specifically Google. As Andersen observes, “Google search came along in September 1998. What else happened in 1998? The false medical study arguing that vaccines cause autism. That was a perfect first case study of how falsehood and panicky viral belief gets out of control thanks to this new mechanism we have, which is to say, the internet. And here we find ourselves 25 years on, and we still don’t know how to drive this car.” Disruptive artificial intelligence chatbots like ChatGPT further blur the lines between fact and fiction in ways that are not yet well understood but pose potential threats to getting at the truth.

Noel Brewer, PhD, is the Gillings Distinguished Professor in the Department of Health Behavior at the Gillings School. His research focuses on three main areas: ways to increase vaccine uptake; communication about the harms of vaping and smoking; and risk perception and the appropriate use of cancer screening tests. While he agrees that the impact of social media and the internet is undeniable, he calls its effects more “lumpy.” Brewer says, “Vaccination rates in the United States are still very high. The CDC has been very effective in that regard. Other countries have not fared so well.”

A data void is when little to no good information about a subject exists online, and these voids can be manipulated very easily.

In terms of social media, however, Brewer takes a nuanced view. The target of social media is not always an individual; it can often be policymakers. Social media can give them the impression that there is broad support for something, or eroding support for something else, leading them to vote against public health legislation. “The tempo has definitely increased around bills being introduced to undo vaccination requirements for school children.”

It’s been said that nature abhors a vacuum. The same might be true of the internet, where these vacuums are known as “data voids.” A data void is when little to no good information about a subject exists online, and these pockets of returns can be manipulated very easily. Tripodi explains, “Conspiracy theorists are really good at maximizing and taking advantage of data voids. This relates back to my concept of how you see the world shapes your keywords to begin with. In other words, if you Google, ‘are vaccines safe?’ with a question mark, the CDC has preemptively filled that void with a lot of good information that tells you they’re pretty safe. But what if I belong to a mom group on Facebook like Parents Against Vaccination? I’m not going to Google ‘are vaccines safe?’ because I’m part of a community that says they’re not.”

“Vaccination rates in the US are still very high. The CDC has been very effective in that regard. Other countries have not fared so well.”

Noel Brewer, PhD

Tripodi continues, “The way Google orders information and its desire to help drive search and best match your query is not an environment that’s going to expose you to good information if your starting point is a place where good information doesn’t exist. That’s the fundamental relationship to public health crises.”

Having a more nuanced understanding of how search works is important because all of us are so reliant on it. We have the ability to change how we search, not just what we search. But we also have the ability to approach these difficult conversations with greater kindness, empathy and understanding. The foundation for belief in any conspiracy theory or misinformation campaign springs from a desire for agency and community, and a need to make sense of a confusing world.

Andersen is optimistic. “I am somewhat hopeful that the younger people, digital natives, are less easily hoodwinked. As opposed to us older folks who may think, ‘It’s on my computer along with ABC News, so it must be true.’ That’s part of the problem and that’s where we are right now.”

— — —

The Pivot Podcast: Confronting Misinformation in Public Health

To learn more about the impact of conspiracy theories on public health, check out our Pivot episode “How Are Conspiracy Theories Public Health Crises?” Host Matthew Chamberlin speaks with experts in the study of the effects of disinformation. Just search “Pivot Gillings” on your favorite podcast app.
