Thinking about Thinking: Conflict and Cognitive Bias
Next time you are experiencing a difficult conflict, try thinking about how you and the other person are thinking. When I read a recent post by Buster Benson, I was struck by how much cognitive bias contributes to my day-to-day work of resolving conflict. Understanding more about cognitive bias can certainly improve our conflict resolution skills.
Recently, a learner in one of my courses expressed surprise when I said that most people I deal with in mediation do not lie. Often, however, they have very different perceptions of the same situation, and frequently those perceptions develop as a result of cognitive bias.
Let’s consider an example of employees in a workplace. One feels that having their reports corrected by a colleague is harassment; the other feels the behaviour is helpful. Or consider the joke that one member of the team does not find funny, and feels is intended to mock her.
According to the definition in Wikipedia, a cognitive bias is a pattern of deviation from rationality in judgment, in which inferences about other people and situations may be drawn in an illogical fashion. For example, when we choose to rely on details that support our beliefs and ignore those that do not, we are demonstrating cognitive biases such as confirmation bias, the ostrich effect, or post-purchase rationalization.
It takes a lot of energy to think, and even more to think about how we think. Being efficient humans, we rely, for good reason, on the shortcuts of cognitive bias. In his post, Buster Benson said:
Every cognitive bias is there for a reason — primarily to save our brains time or energy. If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.
Here are four problems that cognitive biases help us address, along with examples of the ways they can make conflict situations more difficult.
1. Too much information. There is so much information in the world that we need some way to filter out the majority of it. Conflict situations often involve relying on details that support our beliefs and ignoring those that do not, as in the example above, which gives rise to several common cognitive biases, including confirmation bias, the ostrich effect, and post-purchase rationalization.
2. Not enough meaning. How do we make sense of all that information? In conflict situations it is common to use our cognitive biases to fill in characteristics from generalities and prior histories (for example, stereotyping and the bandwagon effect) and to imagine things and people we’re familiar with as better than those we aren’t (for example, the halo effect and in-group bias). Another common factor in conflict situations is our tendency to think we know what others are thinking; examples are the illusion of transparency, the illusion of asymmetric insight, and the spotlight effect.
3. Need to act fast. We have too much information and not enough time to figure it out, yet we need to act without being certain. Ever since our cave-dwelling days, standing still has invited danger. A factor in many conflict situations is our need to feel confident in our ability to make an impact and to do what is important (for example, the overconfidence effect and the fundamental attribution error). Another common area of cognitive bias that contributes to conflict is the tendency to choose what we know and preserve the way things are: better the devil you know than the devil you don’t. Examples are the decoy effect and status quo bias.
4. Not enough memory. There’s too much information for us to remember much of it. What we choose to remember helps us create the filters we need for #1 above and fill in the missing information for #2 above; it’s a self-reinforcing circle. Our tendency to edit memories after the fact contributes to conflict (for example, source confusion and false memory), as does our tendency to reduce facts and events to a few key elements (for example, the misinformation effect and the primacy effect).
Back to our examples of employees from the beginning. Of course, the cognitive biases in action depend on the specific circumstances. The employees in a dispute about whether correcting a colleague’s reports is harassment might benefit from considering how the illusion of asymmetric insight and the illusion of transparency are affecting their perceptions of the situation. The team whose joke is not shared by all might be experiencing perceptions framed by the bandwagon effect and in-group bias; that group, along with the member who does not find the joke funny, may also be experiencing the illusion of transparency.
We need to use more logic when we think about our thinking. Simple to say, and definitely not simple to do. Understanding more about how we form our perceptions, the illogical shortcuts we take, and the errors those cognitive biases cause can go a long way toward helping us unravel the tangled mess of a conflict.
Read Buster Benson’s article here.
Build your conflict resolution skills by registering for Fundamentals of Mediation. The next course starts March 29, 2017.