It’s not a new thing for people to try to mislead you when it comes to science. But in the age of COVID-19 — when we’re being bombarded with even more information than usual, when there’s increased uncertainty, and when we may be feeling overwhelmed and fearful — we’re perhaps even more susceptible to being deceived.
The challenge is being able to identify when this is happening. Sometimes it's easy: often even the most basic fact-checking and logic are potent weapons against misinformation.
But often, it can be hard. People who are trying either to make you believe something that isn’t true, or to doubt something that is true, use a variety of strategies that can manipulate you very effectively.
Here are five to look out for.
1. The ‘us versus them’ narrative
This is one of the most common tactics used to mislead. It taps into our intrinsic distrust of authority and paints those with evidence-based views as part of some other group that's not to be trusted. This other group — whether people or an institution — is supposedly working together against the common good, and may even want to harm us.
Recently we’ve seen federal MP Craig Kelly use this device. He has repeatedly referred to “big government” being behind a conspiracy to withhold hydroxychloroquine and ivermectin from the public (these drugs currently don’t have proven benefits against COVID-19). Kelly is suggesting there are forces working to prevent doctors from prescribing these drugs to treat COVID-19, and that he’s on our side.
His assertion is designed to distract from, or completely dismiss, what the scientific evidence is telling us. It’s targeted at people who feel disenfranchised and are predisposed to believing these types of claims.
Although this is one of the least sophisticated strategies used to mislead, and easy to spot, it can be very effective.
2. ‘I’m not a scientist, but…’
People tend to use the phrase “I’m not a scientist, but…” as a sort of universal disclaimer which they feel allows them to say whatever they want, regardless of scientific accuracy.
A phrase with similar intent is “I know what the science says, but I’m keeping an open mind”. People who want to disregard what the evidence is showing, but at the same time want to appear reasonable and credible, often use these phrases.
Politicians are among the most frequent offenders. On an episode of Q&A in 2020, Senator Jim Molan indicated he was not “relying on the evidence” to form his conclusions about whether climate change was caused by humans. He was keeping an open mind, he said.
If you hear any statements that sound faintly like these ones, particularly from a politician, alarm bells should ring very loudly.
3. Reference to ‘the science not being settled’
This is perhaps one of the most powerful strategies used to mislead.
There are of course times when the science is not settled, and when this is the case, scientists openly argue different points of view based on the evidence available.
Currently, experts are having an important debate around the role of tiny airborne particles called aerosols in the transmission of COVID-19. As for most things COVID-related, we’re working with limited and uncertain evidence, and the landscape is in constant flux. This type of debate is healthy.
But people might suggest the science isn’t settled in a mischievous way, to overstate the degree of uncertainty in an area. This strategy exploits the broader community’s limited understanding of the scientific process, including the fact all scientific findings are associated with a degree of uncertainty.
It’s well documented the tobacco industry designed the playbook on this to dismiss the evidence that smoking causes lung cancer.
The goal here is to raise doubt, create confusion and undermine the science. The power in this strategy lies in the fact it’s relatively easy to employ — particularly in today’s digital age.
4. Overly simplistic explanations
Oversimplifications and generalisations are where many conspiracy theories are born.
Science is often messy, complex and full of nuance. The truth can be much harder to explain, and can sometimes sound less plausible, than a simple but incorrect explanation.
We’re naturally drawn to simple explanations. And if they tap into our fears and exploit our cognitive biases — systematic errors we make when we interpret information — they can be extremely seductive.
Conspiracy theories, such as the one suggesting 5G is the cause of COVID-19, take off because they offer a simple explanation for something frightening and complex. This particular claim also feeds into concerns some people may have about new technologies.
As a general rule, when something appears too good or too bad to be true, it usually is.
5. Cherry-picking
People who use this approach treat scientific studies like individual chocolates in a gift box, where you can choose the ones you like and disregard the ones you don’t. Of course, this isn’t how science works.
It’s important to understand not all studies are equal; some provide much stronger evidence than others. You can’t just conveniently put all your faith in the studies that align with your views, and ignore those that don’t.
When scientists evaluate evidence, they go through a systematic process to assess the whole body of evidence. This is a crucial task that requires expertise.
The cherry-picking tactic can be hard to counter because unless you’re across all the evidence, you’re not likely to know whether the studies being presented have been deliberately curated to mislead you.
This is yet another reason to rely on the experts who understand the full breadth of the evidence and can interpret it sensibly.
The pandemic has highlighted the speed at which misinformation can travel, and how dangerous this can be. Regardless of how sensible or educated we think we are, we can all be taken in by people trying to mislead us.
The key to preventing this is to understand some of the common tactics used to mislead. Knowing them means we'll be better placed to spot them, and it may prompt us to seek out more reliable sources of information.
Hassan Vally, Associate Professor, La Trobe University
This article is republished from The Conversation under a Creative Commons license.