Much of the popular literature on neuroscience focuses on decision making. In particular, researchers have discovered many different ways in which humans make poor decisions. "Neuroscience" is the vogue term, but many will remember similar lessons about decision biases like the Halo Effect from Psych 101. (In case you missed Psych 101, the Halo Effect is the tendency for one good thing about a person to create a golden halo so that we think everything about them is good.)
For reward professionals, decision making bias is a big concern when managers evaluate performance. For example, the Recency Effect leads managers to place far too much weight on what an employee did in the last few weeks (good or bad) and far too little weight on their accomplishments earlier in the year. (That's one of the reasons why performance management software now enables and encourages tracking accomplishments throughout the year.)
While bias in evaluating performance is a well-recognized problem, we really ought to be concerned with all our reward decisions: from job evaluation to the choice of comparators in a salary survey to the selection of compensation software. If we want to improve decision making we need to confront our innate cognitive weaknesses.
How reward professionals can avoid bias
It is natural to start by learning about the various types of decision making biases: Halo Effect, Recency Effect, Backfire Effect and so on. This is useful, but we soon run into another cognitive constraint: the limits of our memory. The Wikipedia page on Cognitive Biases lists over 100 different biases. Trying to remember them all, let alone guard against them one-by-one, is impossible.
The alternative to tackling decision biases one-by-one is to adopt some general strategies that lead to better decisions. Here's a short list of practical steps to improve decision making:
1. ACCOUNTABILITY

If people know they will have to defend a decision, they take more care in making it, and the more care they take, the less likely they are to fall prey to decision biases. So when leaders occasionally make the effort to scrutinize an employee's decision seriously, and ask them to defend it publicly, they create a better decision making culture.
2. DATA

Numbers are by no means the perfect road to truth, but metrics do offer a kind of reality check that forces people to go beyond their gut feeling. Decisions informed by data tend to be better decisions. It is almost always appropriate to ask "What kind of data do you have to support that decision?" Again, it is not that data gives you the answer; it just leads to wiser judgements.
3. DELAY

The old advice to "sleep on it" is a good tactic for countering cognitive biases. Building in a delay between reviewing the data and reaching a final conclusion gives the unconscious brain time to process the facts. Leaders should encourage employees to give themselves time to mull over a situation before committing to a decision.
More profoundly, Steven Sample, who was president of USC for almost 20 years, suggests never coming to a firm position before you have to. This makes sense because once you reach a conclusion you tend to stick with it; if you do not need to make a decision now, do not lock your mind into a particular choice.
4. CONVERSATIONS

Once people have made up their minds they rarely change them on their own. Asking someone to rethink an issue usually just leads to more elaborate rationalization of their existing views. However, people do change their minds in conversation. If you suspect that cognitive biases have led to a poor decision, use conversations to gently shift people away from their existing views and open them up to different conclusions.
5. MANY MINDS
Dr. Mandy Wintink of the Centre for Applied Neuroscience says her single favourite rule of thumb is 'many brains are better than one'. Brains are designed so that they often sacrifice accuracy for speed; that's why we have so many cognitive biases. It takes a bit of extra time to gather multiple perspectives, but decision quality goes up. Sometimes simply explaining your idea out loud will help you detect shortcomings in your reasoning.
6. HUMILITY

Consultant Carl Spetzler has written that the biggest decision bias of all is the belief that humans are good at making decisions. We're not, or at least not nearly as good as we think we are. To the extent that we can create a culture that recognizes our shortcomings, a culture of humility, we'll get better decisions.
In HR we make decisions all the time; perhaps we should spend more time thinking about how we make those decisions and what practices will lead to better ones.
Your Turn: How do you avoid decision making bias in your organization?