by Elitsa Dermendzhiyska
Lies and distortions don't just afflict the ignorant.
The more you know, the more vulnerable you can be to infection...
You can catch it through face-to-face contact or digitally - that is, via a human or bot. Few of us possess immunity; some are even willing hosts. And, despite all we've learned about it, this virus is proving more cunning and harder to eradicate than anyone could have expected.
Fake news was around even before the invention of the printing press, although the first large-scale journalistic sham occurred in 1835, when the New York Sun published six articles announcing the discovery of life on the Moon (specifically, unicorns, bat men and bipedal beavers).
Consider, too, early modern witch hunts, or those colonial myths that depicted slaves as a different species; the back-and-forth volleys of anti-Jewish and anti-German propaganda during the world wars, McCarthyism's Red Scare, even communism's utopian narratives.
History teems with deceit. What's different today is the scale and speed at which falsehoods can spread.
Online media has given voice to previously marginalized groups, including peddlers of untruth, and has supercharged the tools of deception at their disposal.
The transmission of falsehoods now spans a viral cycle in which AI, professional trolls and our own content-sharing activities help to proliferate and amplify misleading claims.
These new developments have come on the heels of rising inequality, falling civic engagement and fraying social cohesion - trends that render us more susceptible to demagoguery. Just as alarming, a growing body of research over the past decade is casting doubt on our ability - even our willingness - to resist misinformation in the face of corrective evidence.
The classic experiments on correcting misinformation date to the late 1980s. Subjects read a series of news briefs from the scene of a fictional warehouse fire. One brief mentions a closet with volatile materials - cans of oil paint and gas cylinders - while others track the fire's progress. A further brief cites the police investigator on the case stating that the closet was, in fact, empty, before the final report notes that the fire has been put out.
Having read the briefs, subjects had to answer a series of questions meant to probe their grasp of the correction made by the police investigator. It seems a simple test, yet across a multitude of studies people repeatedly fail it.
In one experiment, as many as 90 per cent of the subjects linked the fire's toxic nature or intensity to the cans of oil paint and gas cylinders, despite none being found in the closet.
More surprisingly, when asked directly, most of these participants readily acknowledged the empty closet.
What's remarkable is that people appear to cling to the falsehood while knowing it to be false.
This suggests that, even if successfully debunked, myths can still creep into our judgments and color our decisions - an outcome referred to in the literature as the 'continued influence effect'.
Why does this happen?
According to Jason Reifler, professor of political science at the University of Exeter, we tend to take incoming information at face value.
Moreover, myths can take on subtle, crafty forms that feign legitimacy, making them hard to expose without careful analysis or fact checks.
This means that those of us too dazed by the job of living to exert an extra mental effort can easily succumb to deception. And once a falsehood has slipped in and become encoded in memory - even weakly - it can prove remarkably sticky and resistant to correction.
One of the most common explanations for the continued influence effect puts it down to the mental models we build to make sense of events.
If the myth fits the 'logic' of events, its retraction leaves a hole, and the cogs of the story no longer click into place.
We need the cans of oil paint and the gas cylinders: remove the volatile materials from the closet, and the causal chain of events in our head unravels.
If we aren't to lose coherence, it makes sense to hold on to both the actual fact and the fitting falsehood - but keep them separate, compartmentalized, so that they don't clash.
This might be why, as studies show, people can accept a correction and still fall back on the myth it retracts.
Another reason why misinformation resists correction is repetition.
Once something gets repeated often enough - sensational claims on social media; urban legends passed from one bored timewaster to another - it can trick us into taking it as true merely because of its familiarity.
The illusory truth effect, as it's known, suggests that the easier to process and more familiar something is, the more likely we are to believe it. Which is exactly what repeating a misleading claim does - getting it to go down smooth by strengthening the neural pathways linked to it.
This can pose a challenge for corrections that work by repeating the original misinformation.
Consider this retraction to a myth prone to ensnare hopeful new mothers: 'Listening to Mozart will not make your baby smarter.'
The tiny 'not' mid-sentence is all that sets the myth and its correction apart - and it's easy to imagine that as time passes and memory fades, that 'not' will wash away, leaving Mozart's symphonies and smarter babies linked together in memory, and making the myth more familiar.
Could this cause the correction to fail, or even backfire?
In 2017, Stephan Lewandowsky, a cognitive scientist at the University of Bristol, and two colleagues from the University of Western Australia set out to investigate this possibility. Reassuringly, they found that repeating a myth in the course of correcting it did not entrench it.
Because memory declines with age, older people might be particularly vulnerable to misinformation that's repeated when retracted. Indeed, in a similar study with older adults, Lewandowsky's team found that, after three weeks, subjects aged over 65 ended up re-remembering most of the successfully corrected myths as facts.
Again, though, no backfire effects occurred - that is, where the correction actually increases belief in the myth - and, despite some contrary earlier evidence, researchers now believe such effects to be rare, if they exist at all.
And although repeated mentions of a myth can strengthen it, one repetition during correction seems safe and even desirable as it makes the myth more pliant by activating it in memory.
In recent years, as misinformation has wormed its way into large swathes of society, scientists have been looking for the most effective methods to counter it.
Recently, Lewandowsky spearheaded The Debunking Handbook 2020, an online collection of best practice by 22 of the most active researchers in the field. The contributors nominated more than 50 relevant findings and more than 30 practical recommendations, rating them on their importance and the strength of the available evidence.
To successfully debunk a myth, the authors conclude, it helps to provide an alternative causal explanation to fill the mental gap that retracting the myth could leave.
Counterarguments work too, as they point out the inconsistencies contained in the myth, allowing people to resolve the clash between the true and the false statement.
Another strategy is to evoke suspicion about the source of the misinformation.
For example, you might be more critical of government officials who reject human-caused global warming if you suspect vested business interests behind the denialist claims.
Some researchers, however, question the practical significance of debunking strategies devised in a lab.
When we spoke, Reifler raised this very doubt: can corrections honed in the lab hold up in the wild?
In a world where both media and online platforms have turned into hotbeds of misinformation, Reifler's question sounds especially urgent.
John Cook, a climate change communication researcher at George Mason University in Virginia, shares this worry. And it can get worse.
Suppose the perfect message does find a person in need of disabusing, and even succeeds in fixing their false beliefs: that still might not change what they do.
If you tell people that 97 per cent of climate scientists agree about the reality of global warming, studies show that you'll likely increase their perception of expert consensus on the subject.
But whether this greater awareness translates into action - say, support for carbon-reduction policies - remains unclear.
The evidence is mixed, and the question has sparked 'substantial debate and disagreement' among researchers, says James Druckman, professor of political science at Northwestern University in Illinois.
One worrying demonstration of this possibility comes from the realm of vaccines.
In a 2016 study, Reifler worked with the political scientist Brendan Nyhan at Dartmouth College in New Hampshire, testing two approaches to debunk the myth that flu vaccines actually cause the flu - a myth partly responsible for low vaccination rates and thousands of preventable deaths from seasonal influenza in the US.
One subject group saw official corrective materials from the US Centers for Disease Control and Prevention, while another group received information about the risks of not vaccinating.
This latter group showed no change in myth beliefs, whereas in the correction group the myth beliefs substantially declined, even among the most skeptical subjects. It seemed that the correction had worked - and brilliantly.
But what ultimately interested Reifler was less the participants' beliefs and more their intentions to vaccinate - and, across the sample, these didn't budge at all. Indeed, the most vaccine-hesitant subjects ended up even less willing to vaccinate than they were before the study.
When I talked to Reifler, he couldn't name any research that showed that communicating the safety of vaccines (or highlighting the dangers of refusing them) had a positive effect on people's intentions to vaccinate.
At this point in our interview, I faltered.
It just seemed too absurd that in a matter of life and death, information potentially key to survival could still be ignored. I asked Reifler if he found this disappointing.
He said he was used to it.
To fully grasp the pernicious nature of the misinformation virus, we need to reconsider the innocence of the host. It's easy to see ourselves as victims of deception by malicious actors.
It's also tempting to think of being misinformed as something that happens to other people - some unnamed masses, easily swayed by demagoguery and scandal.
I've heard this sentiment echoed time and again by others, the implication always being that they and I were not like those other, misinformed people.
But, as it turns out, misinformation doesn't prey only on the ignorant: the more you know, the more vulnerable you can be. Startling evidence for this possibility comes from Dan M. Kahan, professor of law and psychology at Yale University, who has been studying how ordinary people evaluate complex societal risks.
One strand of his research is trying to shed light on the sometimes dramatic disparity between public opinion and scientific evidence.
Together with a small group of researchers, in 2010 Kahan set out to demystify this disparity in relation to global warming.
At the time, despite widespread consensus among climate scientists, only 57 per cent of Americans believed that there was solid evidence for global warming, and just 35 per cent saw climate change as a serious problem.
One standard explanation, which Kahan calls the 'science comprehension thesis', holds that people have insufficient grasp of science, and are unlikely to engage in the deliberate, rational thinking needed to digest these often complex issues.
It's a plausible explanation, yet Kahan suspected that it doesn't tell the whole story.
Asking for people's take on climate change is also to ask them who they are and what they value...
In the 2010 study, published in Nature Climate Change in 2012, Kahan and his collaborators measured subjects' science literacy and numeracy, and plotted these against the participants' perceived risk of global warming.
If the science comprehension thesis was right, then the more knowledgeable the subjects, the more they'd converge towards the scientific consensus.
Surprisingly, however, the data revealed that those who scored high on hierarchy and individualism - the hallmark values of a conservative outlook - exhibited the opposite pattern: the more science-literate and numerate they were, the less risky they judged global warming to be.
What explains this seeming paradox?
Kahan argues that rather than being a simple matter of intelligence or critical thinking, the question of global warming triggers deeply held personal beliefs.
In a way, asking for people's take on climate change is also to ask them who they are and what they value.
For conservatives to accept the risk of global warming means to also accept the need for drastic cuts to carbon emissions - an idea utterly at odds with the hierarchical, individualistic values at the core of their identity, which, by rejecting climate change, they seek to protect.
Kahan found similar polarization over social issues that impinge on identity, such as gun control, nuclear energy and fracking, but not over more identity-neutral subjects such as GMO foods and artificial sweeteners.
In cases where identity-protective motivations play a key role, people tend to seek and process information in biased ways that conform to their prior beliefs.
This hints at a vexing conclusion: more knowledge and sharper reasoning don't necessarily bring us closer to the truth; they can simply make us better at defending what we already believe.
And though most available research points to a conservative bias, liberals are by no means immune.
In a 2003 study, Geoffrey Cohen, then a professor of psychology at Yale, now at Stanford University, asked subjects to evaluate a government-funded job-training program to help the poor.
All subjects were liberal, so naturally the vast majority (76 per cent) favored the policy.
However, if subjects were told that Democrats didn't support the program, the results completely reversed: this time, 71 per cent opposed it. Cohen replicated this outcome in a series of influential studies, with both liberal and conservative participants.
He showed that subjects would support policies that strongly contradicted their own political beliefs if they thought that others like them backed those policies. The social influence, obvious to an outsider, remained invisible to the participants themselves, who attributed their preferences to objective criteria and personal ideology.
This would come as no surprise to social psychologists, who have long attested to the power of the group over the individual, yet most of us would doubtless flinch at the whiff of conformity and the suggestion that our thoughts and actions might not be entirely our own.
For Kahan, though, conformity to group beliefs makes sense.
Since each individual has only negligible impact on collective decisions, it's sensible to focus on optimizing one's social ties instead.
Seen from this perspective, then, the impulse to fit our beliefs and behaviors to those of our social groups, even when they clash with our own, is, Kahan argues, 'exceedingly rational'.
Ironically, however, what is rational for the individual can prove disastrous for the collective.
As tribal attachments prevail, emotions trump evidence, and the ensuing disagreement chokes off action on important social issues.
Recently, public disagreement has spilled over to the idea of truth itself.
The term 'post-truth' became the Oxford Dictionaries Word of the Year in 2016, and came to characterize that year's US presidential election and the Brexit referendum.
In a 2017 paper, Lewandowsky argued that we've gone 'beyond misinformation' into a kind of alternative reality.
In this other reality, marked by the global rise of populism, lies have morphed into an expression of identity, a form of group membership.
In the US, the UK, Germany, Austria, Italy, Poland, Brazil and India, populists have captured a growing disenchantment with the status quo by pitting 'the people' against 'the elites', and attacking so-called elitist values - education, evidence, expertise.
In the populist story, lying takes on the trappings of anti-establishmentarianism, undermining truth as a social norm.
This is the misinformation virus at its most diabolical: a point where health (in this case, of the body politic) ceases to matter - as was so graphically demonstrated during the storming of the US Capitol this January - and the host consents to being infected.
(The one good thing to come out of that 'insurrection' is that tough action was swiftly taken against the peddlers of misinformation, with Twitter banning the then-president Donald Trump and suspending thousands of QAnon-related accounts.)
It's easy to despair over all the cognitive quirks, personal biases and herd instincts that can strip our defenses against the ever-evolving misinformation machinery. I certainly did...
Then, I found Elizabeth Levy Paluck.
She is a psychologist at Princeton University who studies prejudice reduction - a field in which a century of research appears to have produced many theories but few practical results.
In 2006, she led an ambitious project to reduce ethnic hostilities in the Democratic Republic of Congo.
She blended a number of prominent theories to create a 'cocktail of treatments'.
Nothing worked...
For Paluck, this was 'an empirical and theoretical puzzle', prompting her to wonder if beliefs might be the wrong variable to target. So she turned to social norms, reasoning that it's probably easier to change what we think others think than what we ourselves do.
In 2012, Paluck tested a new approach to reducing student conflict in 56 middle schools in New Jersey.
Some evidence suggests that, far from being the product of a few aggressive kids, harassment is a school-wide social norm, perpetuated through action and inaction by bullies, victims and onlookers alike.
Bullying persists because it's considered typical and even desirable, while speaking up is seen as wrong.
So how do you shift a culture of conflict?
Through social influence, Paluck hypothesized. In some schools, she had a group of students publicly endorse and model anti-bullying behaviors, and those schools saw a significant decline in reported conflicts - 30 per cent on average, and as much as 60 per cent when the groups included a higher share of well-connected model students.
I've wondered recently if, like school violence, misinformation is becoming part of the culture, if it persists because some of us actively partake in it, and some merely stand by and allow it to continue.
If that's the case, then perhaps we ought to worry less about fixing people's false beliefs and focus more on shifting those social norms that make it OK to create, spread, share and tolerate misinformation.
Paluck shows one way to do this in practice - highly visible individual action reaching critical mass; another way could entail tighter regulation of social media platforms. And our own actions matter, too...
As the Scottish biologist D'Arcy Wentworth Thompson observed in 1917, everything is the way it is because it got that way.
We are, each and every one of us, precariously perched between our complicity in the world as it is and our capacity to make it what it can be...