
Published March 22nd 2019

Fake News Week 2019 Interview: Ania Korsunska on Scientific Misinformation and the Structures That Spread It

Ania shares her research into the structural issues that allow scientific findings to be distorted.

“It got me so mad that I had to go get a PhD to figure it out,” says Ania Korsunska.

She’s talking about articles she saw on Facebook with ludicrous titles like “farts cure cancer,” or “cheese causes cancer.”

Ania Korsunska is currently pursuing her PhD at Temple University, but is transferring to Syracuse University’s Information Science and Technology PhD program in summer 2019.

She’s studying the spread of misinformation, specifically on medicine and health topics.

It’s fascinating stuff, and what better time to discuss it than during the Brandwatch blog’s Fake News Week!

Back to Ania.

Why misinformation around health and medicine?

“Everybody’s looking at political misinformation, but for many reasons I’m not so interested in that,” Ania explains. “For one, I feel like with political misinformation there’s a sense of malicious intent – information that is meant to deceive – and that makes me lose all faith in humanity. I study things that, I think, are unintentionally becoming misinformation, because there is a real chance to design interventions to prevent that from happening.”

“In academic literature, people make a distinction between ‘disinformation’ and ‘misinformation’. ‘Disinformation’ is intentionally meant to be deceptive, but ‘misinformation’ is where something happened to the information, the content became distorted in some way.”

Ania believes in a structural approach to the problem. She’s looking at the process information goes through from the original source through all the different steps to widespread distribution in the media, and how and why it’s sometimes distorted along the way.

Ideally, by targeting the points in the structure that cause the most distortion, action can be taken and the spread of misinformation can be prevented or at least minimized.

Here’s a simple example of the process of distribution:

A published academic article will often be summarized in a press release. That press release is then distributed among journalists. Journalists will then interpret the release (and perhaps the paper itself) and choose whether or not to publish a story around it, or a particular aspect of it. That story will often then get picked up by bloggers who are interested in the subject, and articles around it will be commented on and shared by people online through social media. The story might later be translated into different languages and shared across the world.

Every step of the way, there is a new opportunity for the information to go through some level of filtering or distortion.

“The people who are taking the information from step to step are under certain constraints and incentives, and at each step something can happen to the information as it gets funneled down, like a game of ‘telephone’, and it can end up becoming something ridiculous.”
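To make the ‘telephone’ analogy concrete, here’s a minimal sketch (not drawn from Ania’s research) that models the dissemination chain as a series of hand-offs, each with some chance of distorting the claim. The pipeline stages and the per-hop probability are illustrative assumptions only.

```python
# Minimal sketch of the "telephone" effect in a dissemination chain.
# The stages and the per-hop distortion probability are illustrative
# assumptions, not figures from Ania Korsunska's research.
import random

PIPELINE = [
    "journal article",
    "press release",
    "news story",
    "blog post",
    "social media share",
]

def simulate_spread(p_distort_per_hop=0.2, seed=None):
    """Walk the chain once, flagging whether the claim is distorted at each hop."""
    rng = random.Random(seed)
    distorted = False
    trace = []
    for stage in PIPELINE:
        # Once a claim is distorted, every downstream copy inherits the distortion.
        distorted = distorted or (rng.random() < p_distort_per_hop)
        trace.append((stage, distorted))
    return trace

if __name__ == "__main__":
    for stage, distorted in simulate_spread(seed=1):
        print(f"{stage:20s} distorted={distorted}")
```

Even with a modest error rate at each hand-off, the chance that a claim survives the whole chain intact shrinks with every extra step, which is the structural point being made here.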

The headlines that come out of this process can be humorous, as we’ll see in her case study below. But Ania is clear that these processes don’t always have funny outcomes.

“It can sometimes be funny, like in the ‘farts cure cancer’ story, but sometimes it can really impact people’s lives long term. When you look at things that are happening now, with the anti-vaccination movement, for example,” she says. “It’s the same process that’s happening, the distortion of scientific research, and I just really want to understand why. Once you understand why, and when you’re making a structural argument, structures can be changed, and we can try to fix it.”

‘Farts cure cancer’: A case study

Ania has shown how the process can work with a case study around an article published in 2014 in the journal Medicinal Chemistry Communications (MedChemComm), titled: “The synthesis and functional evaluation of a mitochondria-targeted hydrogen sulfide donor, (10-oxo-10-(4-(3-thioxo-3H-1,2-dithiol-5-yl) phenoxy)decyl) triphenylphosphonium bromide (AP39)”.

Through its dissemination in the media, the study ended up being reported, totally erroneously, as having concluded that farts can help cure cancer.

As she points out, the study had nothing to do with curing cancer or flatulence.

The research began as a published paper. A press release was written by the University of Exeter and distributed. But not everyone who covered the research interpreted it as intended.

It was when the research hit the mainstream press that, seemingly from nowhere, farts and cancer became the central point of the story. Those elements were then picked up by other popular media outlets, and the stories around the research, which no longer represented the original piece, spread far and wide.

A diagram in her case study maps the spread of the story from the original paper out through press releases, news outlets, and social media.

And, Ania says, she still sees versions of this story popping up today, five years later. Friends will send her links when they see it being shared.

Despite some attempts at correcting things, it’s pretty much a lost cause.

“Recent research has shown that misinformation spreads far and wide, and corrections kind of limp behind, but never catch up. People never spread corrections – they’re never going to go as far as the original story,” Ania says.

Structures and individuals

“There’s something about the network of academia and the network of the media – how they interact – there’s a structural issue that allows distortion to happen and misinformation to spread.”

So, when it comes to “fixing” the fake news problem, it’s these structural issues that we should be looking to target. When Ania asks people why fake news has become such a big thing, she’ll often get the “easy answers.” Common arguments she hears are ‘people are stupid’, ‘journalists are lazy’, and ‘media outlets need clicks’.

“When they want to fix this problem, people say that we need to sit people down and tell them they’re wrong. But changing people’s minds is one of the hardest things to do, and we’re not very good at it. People will ultimately believe what they want to believe. In that way, I feel like that’s not the way to solve this problem. But if we look at it from a structural point of view, I think we have a chance to prevent scientific information from becoming misinformation.”

“And having a high level of both science and media literacy could go a long way,” Ania explains. “Understanding how academic publishing works – for example, what peer review is, which journals are reliable sources of information, how to read an academic article and understand things like sample size. How science works, why scientists never say things with utter confidence. Things like that could go a long way to helping the general public better discern truth from fiction.”

It’s important to be critical of what we read around scientific findings, Ania says. But it’s easy to fall down a rabbit hole and begin to doubt everything. It’s a kind of terrifying prospect.

When two studies contradict each other, what’s a doubting mind to believe? If the revered sphere of science can’t decide on what the truth is, then what can you believe in? This is where education can come in, she says.

“We fear what we don’t understand. If people don’t understand science, then we’re going back a few centuries to anecdotal evidence – we end up trusting things we heard from friends more than the scientific consensus.”

Even if we do believe in scientific rigor, fact checking can be a nightmare for any reader. It’s actually how Ania got into her area of study.

“I’d read something like ‘cheese causes cancer’ and say great, where are the citations? I’ll click all the hyperlinks to see where they go and they’ll go to other blogs. Or sometimes you’ll actually get to the research paper but it’s behind a paywall, so you can’t access it,” Ania says. “If you’re stuck behind the paywall, the only thing you can read is the press release. It’s intended to spread the information to the public, so it can be oversimplified, maybe even sensationalized.”

Science and certainty

By visualizing the spread of information as a network, you can easily see how it can hop from source to source, and the rapid ways it can spread via non-traditional means like YouTube videos or Facebook posts.
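As a rough illustration of what that network view looks like in practice, here’s a minimal sketch assuming the networkx and matplotlib Python libraries; the nodes and edges are invented placeholders, not data from Ania’s study.

```python
# Minimal sketch of a dissemination network, in the spirit of the
# case-study diagram described above. Nodes and edges are invented
# placeholders, not data from Ania Korsunska's research.
import matplotlib.pyplot as plt
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("journal article", "press release"),
    ("press release", "news outlet A"),
    ("press release", "news outlet B"),
    ("news outlet B", "blog post"),
    ("blog post", "Facebook share"),
    ("news outlet A", "YouTube video"),
])

# A force-directed layout makes the hops from the original source to
# non-traditional outlets easy to see at a glance.
pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, with_labels=True, node_size=1600,
        node_color="lightblue", font_size=8, arrows=True)
plt.show()
```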

“People are getting so much information from non-traditional sources. We still have the traditional newspapers like the New York Times, but a lot of people get their news from things like Instagram or Snapchat,” Ania laughs. She doesn’t get Snapchat.

We’ve got access to all this information, and it’s all competing for our attention. Certainty is attractive when we’re looking for news and information, which is probably why “farts cure cancer” is such a compelling, clickable headline.

The problem is, science is based on doubt.

“When you trace information, you can see that as it goes from scientists through to more public facing outlets the levels of certainty in the language get stronger and stronger. Scientific publications will say ‘based on these constraints, this is sort of what we found, but we need to do more research’, and that’ll turn into ‘farts cure cancer’.”

Ania says that scientists aren’t trained to convince the public – that’s not their job. But the result is that the scientific community’s voice can seem weak in the face of more confident, but less qualified voices who are disparaging scientific work.

“We need to get more people who are scientists to sound confident enough to have a voice in this debate. More scientists are now coming on to podcasts and TV shows. We need more of that, and we need them to learn how to bring people back to the side of science. Compared to the other loud opinions out there, they’re not doing a good job of convincing people, because that’s not what scientists are trained to do.”

There is an incredible danger in the lack of trust in science, especially when you think about the myths around the dangers of vaccination. Ania says it will take a mix of education around the scientific process, fostering belief in it, and strengthening scientists’ voices to start tackling the structural problems that are letting misinformation spread.

Thanks so much to Ania for speaking with us. Her website includes a freely available summary of the case study above; the full research paper, ‘The Spread and Mutation of Science Misinformation’, is also available.

And special thanks to Alex Jones for conducting this interview.
