We’re still suffering the consequences of misleading information that was presented to us as fact in the past. Think of the £350 million a week we were told the UK was paying to belong to the EU, money that could supposedly have gone to the NHS, or of being told that Saddam Hussein had weapons of mass destruction.
What is fake news? It is stories that are false, fabricated, and contain no verifiable facts, sources or quotes.
The issue is complex, because some stories are half-true, or present opinions as if they were evidence. But what is clear is that fake news is used to spread misinformation and influence public opinion.
When fear and blame taint the message and the intention is to blame others or create mistrust, this is when we should hear the warning bell.
But what can we do about it?
How can we stop being manipulated now that AI can create fake videos, images and audio that tell a more-than-credible story of something that never happened?
And even more importantly, how do we have these conversations with the people around us? Or even with ourselves?
I’ve been talking on LinkedIn about “pre-bunking”, an approach based on inoculation theory from social psychology research.
There is growing evidence that pre-bunking (as opposed to debunking) might be the strategy to follow. Debunking means trying to convince people after false information has reached them, whereas pre-bunking helps them discern in advance what is true from what is not.
Google has been working with academic psychologists to create videos like this to explain common misinformation tactics.
These are the most common misinformation tactics:
Impersonation: Spreading information attributed to another entity, normally an organisation with credibility. For example, “NASA advises that more research is needed on whether climate change is real.”
Emotional manipulation: Using language that evokes strong emotional responses. For example, “This is the heartbreaking story of what a child did to save the life of his brother.”
Polarisation: Exaggerating differences to create opposition. For example, “‘Immigrants are responsible for the increase in crime in the city,’ said the police, whereas the immigrants’ associations say that they are the victims here.”
Conspiracy theories: Hijacking news from traditional media and turning it into a story about a secret elite or society conspiring against the majority. For example, “The government wants to immunise all our children to steal their DNA.”
Personal attack: Attacking a person to distract people from the main area of concern. For example, “What those who work with the prime ministerial candidate don’t tell us: she’s manipulative and unprincipled.”
False dichotomy: Presenting reality as two irreconcilable choices. For example, “It’s good to help refugees, but in the current climate we have to choose between prioritising our own people or foreigners.”
False balance: Presenting two positions as equal in terms of evidence, morality or principles when they are not. For example, “It’s hard to know who to believe: the scientists who warn us about climate change, or those who deny that it exists.”
Recognise these techniques.
Talk about them with other people.
Show them the videos. Explain to them how misinformation works.
There is evidence that when people are aware of these techniques, they are less likely to share fake news or allow themselves to be manipulated.
Critical thinking and self-awareness are the best tools we have to fight against social and political manipulation.
Spread the word.