Unconscious bias

1 10 2011

We all like to think that we are unbiased. We are logical thinkers, clearly evaluating all of the information and forming sound judgements based purely on the facts.

Unlike others, who form opinions based on little information, informed by prior opinions, unshaken by subsequent logical argument or overwhelming evidence to the contrary.

Despite the deep unfairness of it all, logic and psychology do recognise a number of different types of cognitive bias. While we cannot change others, perhaps we can be a little more aware of how these biases might (just might) affect our own judgement.

Confirmation bias: the tendency to believe (remember, give additional importance to) information that confirms our own opinions. The flip side of this is choice-supportive bias, where you remember the choices you made as being compelling and the choices you rejected are trivialised or discounted.

Primacy bias : the tendency to believe (and remember) the first thing you hear. This might be because of its novelty, or because of the amount of effort required to process the first event or piece of information.

Recency bias : the tendency to believe (and remember) the last thing you hear. It may be particularly strong in those with poor memory. Primacy and recency are worth considering when you are thinking of job interviews, or marking papers. The first time you hear or read something it might seem clever; by the time a number of people have said it, it becomes discounted.

Egocentric bias : of course none of US have this one….the tendency to recall memories in ways that are self-serving…

And similarly, hindsight bias : the tendency to remember past events as having been predictable…I knew it was going to happen that way all along.

Correlation effect : the tendency to believe that two events that coincide have a causal link. Sometimes it is just a coincidence, and probability tells us that coincidences will occur, and more than once. (Also called illusory correlation.)
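That last point is easy to check with a little arithmetic. The sketch below (the numbers are purely illustrative, not from any study) simulates two completely unrelated events that each happen on about 10% of days, and counts how often they land on the same day at least once in a year:

```python
import random

random.seed(42)

DAYS = 365          # one year of observations
P_A = P_B = 0.10    # each unrelated event occurs on ~10% of days
TRIALS = 10_000     # number of simulated years

coincided = 0
for _ in range(TRIALS):
    # Did the two independent events ever fall on the same day this year?
    if any(random.random() < P_A and random.random() < P_B for _ in range(DAYS)):
        coincided += 1

print(f"Years with at least one coincidence: {coincided / TRIALS:.1%}")
```

Analytically, the chance of at least one same-day co-occurrence is 1 − (1 − 0.01)^365 ≈ 97%, so in this toy setup a "meaningful" coincidence shows up almost every single year purely by chance.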

Framing effect : where the bias is caused either by a too-narrow or incorrect scoping of the relevant “surrounding” information, or by making different decisions based on the same information when it is presented in different contexts.

Belief bias : where the assessment of the strength of an argument is swayed by whether its conclusion is acceptable.

Fundamental attribution bias : the tendency to attribute blame or credit to personality-based reasons rather than situational ones.

There are probably hundreds of other biases that we are subject to – and yes, despite my sarcastic and somewhat flippant introduction, we are all prone to making these errors in our judgement.

A wise person realises that there is more than one side to the story and tries to understand the others’ viewpoint as well as their own. Only then can we (maybe) illuminate the biases in our own, and in others’ thinking.


Seeing and believing

7 08 2011

We’ve all heard the old saying “seeing is believing”, referring to wanting to see the evidence in order to believe in something. Its corollary “you have to believe it to see it” is popular in positive thinking circles and the basis of visualisation as a technique, the idea being if you can trick your brain into believing in a possibility, the brain will make it come true.

But there is a third version of this. Sometimes, you only see the things you believe in. This is confirmation bias.

A simple example of these three: The first would be a parent saying that they don’t believe their child has good marks until the report card comes home. The second would be the child needing to believe it is possible to get good marks in order to actually achieve it. The third would be the teacher marking students according to what they expect they will get – Mary always gets high marks so her essay is read more thoroughly and favourably.

Wikipedia lists a number of biases, many of which have a similar basis to confirmation bias – we only believe, hear, see, test, understand, and remember the information that confirms our own opinions or hypotheses, rather than starting with a level playing field and examining the evidence impartially and in full.

In general, we have a high regard for our own opinions and tend to believe that our opinion has been formed using all the available evidence and logical thought. If only we were such rational beings! My mother, being a Libra, says that her opinions are balanced, she has considered all sides of the argument. If you disagree with her then you are not thinking about the problem correctly. We agree to differ on this point.

The danger is of course that these biases are generally invisible to us as we make decisions that affect ourselves, our work and others. To quote Francis Bacon (Novum Organum, 1620):

The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate. . . . And such is the way of all superstitions, whether in astrology, dreams, omens, divine judgments, or the like; wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, although this happened much oftener, neglect and pass them by.

So, amusingly, we use this to confirm our beliefs in astrology (my mother is a Libra, therefore she behaves in this way….ignoring examples where she behaves otherwise.) More dangerously, we also confirm our own beliefs when the stakes are higher. Do we want scientists testing medicines that they already believe will work? Of course not, we want them to look at all the evidence and identify the positives and negatives. Do we want our teachers, bosses, co-workers seeking to confirm their established opinions? No, we want to be judged on unbiased evidence – and all of the evidence.