Would you say you have critical thinking skills?
I’m sure you do.
The problem is that sometimes we’re unable to use them.
I spent many years in academia trying to help students develop this skill, but I hate to admit that I have limitations when it comes to my own critical thinking.
Cognitive bias affects all of us.
Even in courtrooms.
In 1999, Sally Clark was found guilty of murdering two of her sons, who had died suddenly as babies. One of the main arguments supporting her conviction was the paediatrician Roy Meadow’s claim that the chances of two children dying of SIDS (Sudden Infant Death Syndrome) in one family were just 1 in 73 million.
Peter Green, on behalf of the Royal Statistical Society, wrote later in 2002:
“The jury needs to weigh up two competing explanations for the babies' deaths: SIDS or murder. The fact that two deaths by SIDS is quite unlikely is, taken alone, of little value. Two deaths by murder may well be even more unlikely. What matters is the relative likelihood of the deaths under each explanation, not just how unlikely they are under one explanation.”
I hadn’t thought about this until I read Peter Green’s statement.
Initially I had thought about misogyny and sexism in law (the usual suspects), then I wondered whether the figure cited by Roy Meadow was even right, but it hadn’t occurred to me that bringing in data about an alternative explanation for why the babies died could be valuable here.
The mathematician Ray Hill later calculated that a double SIDS death would occur in roughly 1 in 297,000 families, whereas two children being murdered by a parent would occur in roughly 1 in 2.7 million families.
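Green’s point can be made concrete with Hill’s figures. Here is a minimal sketch in Python; the two probabilities come from the text above, and everything else (variable names, the framing as a likelihood ratio) is illustrative:

```python
# Compare the two competing explanations by their relative likelihood,
# as Peter Green suggests, using Ray Hill's published figures.
p_double_sids = 1 / 297_000      # two SIDS deaths in one family
p_double_murder = 1 / 2_700_000  # two children murdered by a parent

# Likelihood ratio: how much more likely is double SIDS than double murder?
likelihood_ratio = p_double_sids / p_double_murder

print(f"Double SIDS is about {likelihood_ratio:.1f}x more likely "
      f"than double murder")
# prints: Double SIDS is about 9.1x more likely than double murder
```

Far from pointing to guilt, the numbers, once both explanations are on the table, favour SIDS by roughly nine to one.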
Now we have a totally new picture.
And yes, that reflects one of the main principles of scientific knowledge: objectivity and impartiality. Sally Clark served three years of a life sentence. Her conviction was overturned in a second appeal in 2003. Sadly, she died four years later. Her family said she never recovered from such an appalling miscarriage of justice.
What went wrong?
Yes, there are patriarchal and misogynistic misconceptions rooted in the legal system (am I repeating myself?), but there is something else at play, and that is hypothesis myopia.
Hypothesis myopia occurs when researchers and experts are so fixated on collecting evidence to support a single hypothesis that they don’t look for evidence against it, or fail to consider other explanations.
As you can see, nobody is immune to unconscious bias.
Richard Feynman, known for his work on quantum mechanics, among other things, said:
“…a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty… if you’re doing an experiment, you should report everything that you think might make it invalid – not only what you think is right about it.”
How can we then overcome hypothesis myopia?
When presenting a solution or an idea, look for alternatives, search for competing narratives, and seek out evidence that might refute your approach. Present all the information, so that others can judge the value of the contribution.
Accept that this approach might initially diminish your ability to influence others, as you’re playing against unconscious bias, but in the long term you and your team are building skills to help you make better and more robust decisions.
Never, ever be afraid of asking the hard questions.