What Inbuilt Biases Do We Have That Impede Good Decision Making?

Most of us are aware, or have at least heard of the idea, that our brains sometimes operate in ways that lead us to make poor decisions, which in turn makes us poorer managers and knowledge workers. Our work shouldn't be so hard: it is all about figuring out what the customer wants, and then figuring out how to get it to them in the fastest possible time, preferably without killing ourselves in the process. Or, if you are a “Lean” person, the aim is to deliver value with “the shortest sustainable lead time”.

Sometimes the problem is that your brain is getting in the way. Not only is it hard to learn new tricks (see Why Is Agile So Hard?), but our brains are also wired in ways that actively hinder our success.

For example, most of us believe that we apply logic to the problems we face. If I were to ask a group of people about their main approach to business problems, most would say “we use logic to make decisions, especially key decisions”. It turns out that this belief is itself an example of an inbuilt bias we all have - in this case “hindsight bias” (see below).

In reality we are hard-wired with inbuilt biases that get in the way of making good decisions. These biases cause us to misinterpret information and often push us toward decisions that do not make sense when looked at objectively, and which we may even come to regret.

Since these biases are inbuilt, they are hard to overcome. Step 1, therefore, and the reason for this page, is to become aware of them. With awareness we can look to mitigate the biases by changing how we think about our decisions, putting processes in place aimed at reducing their effect, and collaborating with people to increase the chance that the biases are surfaced.

Exactly two things determine how our decisions turn out: the quality of our decisions and luck. Luck is just another way of saying there is uncertainty, and this uncertainty can lead to a good or bad outcome. With this more accurate representation of the world, and once we accept that luck will play a role, we can focus on improving what we can control, the decision-making process, while acknowledging that even with the best decisions things will still sometimes go wrong.

Even with the best decision making process, things can still go wrong
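
To make this concrete, here is a minimal Python sketch (mine, not part of the original argument; the success probabilities are invented for illustration) of outcomes as decision quality plus luck:

  import random

  random.seed(42)

  def outcome(p_success: float) -> bool:
      """A decision process with quality p_success still fails (1 - p_success) of the time."""
      return random.random() < p_success

  trials = 10_000
  good = sum(outcome(0.80) for _ in range(trials))  # strong decision process (assumed 80%)
  poor = sum(outcome(0.55) for _ in range(trials))  # weak decision process (assumed 55%)

  print(f"good process succeeded {good / trials:.0%} of the time")
  print(f"poor process succeeded {poor / trials:.0%} of the time")
  # Even the good process fails roughly 20% of the time:
  # luck guarantees a steady stream of bad outcomes.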

The following list covers some of the biases that affect our decision making as we do knowledge or management work. Note that this is not an exhaustive list, but rather a list of biases that I have seen over and over again.

Hindsight Bias

Hindsight bias is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it. This is the “I knew it all along …” feeling.

Our reaction to project plans that don't work out as expected is an example of this. In these situations our reaction is to ask “what is wrong with our plan?” or “what was wrong with our estimates?” and, looking back, we can probably find something to work on. The problem is that the next time we do the same type of work we are looking forward, not backward, and so we still cannot predict what might happen. Yet we keep doing this, despite the evidence to the contrary.

Interestingly this is related to another bias we all have: confabulation.

Confabulation

The misconception is that you know when you are lying to yourself.

The truth is that you are often ignorant of your motivations and create fictional narratives to explain your decisions, emotions, and history without realizing it. It starts with your brain's desire to fill in gaps. This is called “motivated reasoning” and leads to “confirmation bias” - see below.

Confirmation Bias

Confirmation bias is when you start with an answer, and then search for evidence to back it up.

If you believe that big upfront plans are important for success, you'll probably read lots of blogs and follow Twitter feeds by those who share the same view, and read books on how to improve your plans and planning process. If you are convinced that all plans and planning are evil, you'll probably search for others with similar opinions. Neither helps you separate emotions from reality.

A better approach is to adopt the intellectual strategy of Charles Darwin, who regularly tried to disprove his own theories, and was especially skeptical of his own ideas that seemed most compelling. The same logic should apply to management and knowledge work ideas.

Confirmation bias is related to “motivated reasoning”.

Motivated Reasoning

Motivated reasoning is similar to confirmation bias, where evidence that confirms a belief (which might be a logical belief, rather than an emotional one) is either actively searched for or given more credibility than evidence that disconfirms a belief. It stands in contrast to critical thinking where beliefs are approached in a skeptical and unbiased fashion.

Motivated reasoning is the tendency to find arguments in favor of conclusions we want to believe to be stronger than arguments for conclusions we do not want to believe. It can lead to forming and clinging to false beliefs despite substantial evidence to the contrary. The desired outcome acts as a filter that affects evaluation of scientific evidence and of other people.

The effect is particularly insidious when it's tied to our perception of ourselves and the stories we tell ourselves. It is hard for people to really say “well, that is a problem I caused” or “that was my mistake”. We want to see ourselves in a good light, and so we tell ourselves stories in which we are successful. When this desire drives our reasoning, we are not dealing with reality.

Surprisingly, being smart can actually make this bias worse. The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view. Corollary: the better you are with numbers, the more convincingly you can “explain” your story.

This bias is sometimes called the “self-serving bias”: the common human tendency to take credit for success but deny responsibility for failure. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way that is beneficial to their interests. (See http://ices.gmu.edu/wp-content/uploads/2010/07/Spring-08_Sandra-Ludwig.pdf for example experiments.)

One way to address this is to reframe the issue (see framing bias below) so that you can see yourself in a good light while doing something that makes objective sense. For example, it is often hard to stop a project because completing and closing projects is a “good thing.” If we reframe this so that stopping projects early is also a “good thing”, then we will be more inclined to do it.

Another way to address this issue is to ensure you have a group of unbiased folks to discuss your viewpoint with, and then listen to them.

Backfiring Effect

At its worst, motivated reasoning leads to the “backfiring effect”. This is when, presented with information that goes against your viewpoint, you not only reject it, but double down on your original view.

Voters often view the candidate they support more favorably after the candidate is attacked by the other party.

But this effect applies to ideas as well as people. For example, many technical people get very attached to a technical product or solution, to the point where that solution is the “hammer” for every “nail” (as in “if all you have is a hammer, everything looks like a nail”). Mounting evidence against the solution just makes these people double down on it.

This is sometimes called the “ostrich effect”: ignoring reality when it's screaming in your face, usually in an attempt to rationalize a certain viewpoint.

Framing Bias

Framing bias is when you react differently to the same information depending on how it's presented; how it is framed for you.

The classic example is where an experiment is set up as follows:

  Framing    Treatment A              Treatment B
  Positive   “Saves 200 lives”        “A 33% chance of saving all 600 people, a 66% chance of saving no one.”
  Negative   “400 people will die”    “A 33% chance that no one will die, a 66% chance that all 600 will die.”

Treatment A was chosen by 72% of participants when it was presented with positive framing (“saves 200 lives”) dropping to only 22% when the same choice was presented with negative framing (“400 people will die”).
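
Part of what makes this result striking is that the two treatments are numerically identical; only the wording differs. A quick check in Python, using the numbers from the experiment above:

  people = 600

  # Treatment A: a certain outcome. "Saves 200 lives" and "400 people will die"
  # describe the same result.
  expected_a = 200

  # Treatment B: a gamble. The experiment's 33%/66% are 1/3 and 2/3 rounded.
  expected_b = (1 / 3) * people + (2 / 3) * 0

  print(expected_a, expected_b)  # 200 200.0 -- the same expected number of lives saved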

You can see from the example that framing things negatively or positively changes how people react. Now apply this to organizations. A lot of organizations are risk averse, and so framing things in a negative way will lead to a disproportionately negative response.

A simple example of this for organizations is how “failure” is treated. Are messengers who bring bad news blamed and is failure treated by scapegoating the people seen as responsible, or are messengers celebrated and is failure treated as a learning opportunity? How you treat a problem will be based on this organizational framing.

Recency Bias

Recency bias happens when you let recent events skew your perception of what happened and of what to expect in the future. It occurs because people remember something that happened recently much more easily than something that happened a while back.

Why is this a problem? Think about the last Retrospective you ran. With recency bias you have a tendency to remember the most recent events, the end of the Iteration (or Sprint), and will work to improve those. But what if the real issue happened at the beginning of the Iteration, or the issue you are working on was actually caused by something that occurred a while ago? In these situations we end up working on issues that are either not important, or trying to fix something that is not the real root cause.

Recency bias also occurs when a project starts to have problems. Because you see the problem now, you think it'll last forever and that you'll never recover. Rarely is that actually the case (it's usually the other way around), but it's what feels right when the memories are fresh in our minds.
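
One way to see the Retrospective problem is to model recall as decaying with time. This is a hypothetical Python sketch (the half-life and the issues are invented), not a claim about how memory actually works in detail:

  def recall_weight(days_ago: float, half_life: float = 3.0) -> float:
      """Exponential memory decay: recall weight halves every half_life days."""
      return 0.5 ** (days_ago / half_life)

  # (issue, day it occurred) in a 10-day Iteration; the Retrospective is on day 10
  issues = [("unclear requirements", 1), ("flaky test", 5), ("merge conflict", 10)]

  for name, day in issues:
      print(f"{name:22s} recall weight: {recall_weight(10 - day):.2f}")
  # The day-1 issue carries about 1/8 the weight of the day-10 issue,
  # even if it was the real root cause.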

Anchoring

Anchoring is when you let one piece of often irrelevant information govern your thought-process.

Estimating offers a lot of examples of this. If you present a requirement and someone says “it'll be about 5 days”, most of the estimates will come in around 5 days, even if the person who said it (for example, the person who wants the capability) has no actual information about how long the work will take.
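
Here is a toy model of that dynamic. The parameters are invented (in particular the anchor “pull”, which I chose only to illustrate the shape of the effect):

  import random

  random.seed(1)
  true_effort = 12.0  # days the work would actually take (assumed)
  anchor = 5.0        # "it'll be about 5 days", said with no real information
  pull = 0.6          # weight estimators give the anchor (invented)

  def estimate(anchored: bool) -> float:
      private = random.gauss(true_effort, 2.0)  # an estimator's own noisy judgment
      return pull * anchor + (1 - pull) * private if anchored else private

  team = [estimate(anchored=True) for _ in range(8)]
  print([round(e, 1) for e in team])  # estimates cluster near 8, dragged toward the 5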

Pessimism Bias

Pessimism bias is when you underestimate the odds of something going right which often results in excessive focus on things that might go wrong.

Many organizations try to reduce the downside of something going wrong. For example, they make decisions early in an effort to eliminate risk from the plan. The problem with this approach is that it leaves you unable to take advantage of something positive that might happen. Leaving the decision to a later time, when you have more information, could reveal totally new ways of solving the problem.
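
Here is a minimal expected-value sketch of why deferring can pay off. All the numbers are invented; the point is only that resolving uncertainty before committing changes the arithmetic:

  # Two candidate designs; a short spike will reveal which one fits.
  payoff = {"design_a": 100, "design_b": 100}  # value if the chosen design fits
  misfit = 20                                  # value if it doesn't
  p_a_fits = 0.5                               # today we genuinely don't know

  # Decide now: commit to design A blind.
  decide_now = p_a_fits * payoff["design_a"] + (1 - p_a_fits) * misfit

  # Decide later: run the spike, then pick whichever design the evidence supports.
  decide_later = p_a_fits * payoff["design_a"] + (1 - p_a_fits) * payoff["design_b"]

  print(decide_now, decide_later)  # 60.0 vs 100.0 -- waiting for information has value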

Halo Effect

“If we see a person first in a good light, it is difficult subsequently to darken that light,” writes The Economist.

And somehow we always attribute “success” to one identifiable person, the hero, ignoring everyone else's contribution to the result. And once someone is seen that way, future success will automatically be attributed to them.

This is especially interesting in relation to most of the work we do today. In a modern, reasonably sized organization, no one person delivers value to the customer. It takes a village. But we still tend to attribute positive (and negative) results to one person.

Escalation of Commitment

The escalation of commitment bias is basically another name for the “sunk cost fallacy”. This is the idea that “we have to spend more on this project because we have already spent so much.” You double down on a failing project, not because you believe in its future, but because you feel the need to make back your losses.
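
The fallacy is easy to see when you write the arithmetic down. A minimal worked example (the amounts are invented):

  sunk = 500_000             # already spent; gone whether we continue or not
  cost_to_finish = 300_000
  value_if_finished = 200_000

  finish = value_if_finished - cost_to_finish  # -100,000: finishing loses money
  stop = 0                                     # stopping costs nothing more

  # The sunk 500k feels like a reason to continue, but it appears in neither
  # option: both futures start from the same place.
  print("finish" if finish > stop else "stop")  # stop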

Survivorship Bias

Survivorship Bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways.

XKCD on Survivorship Bias. To quote:

“Every inspirational talk should start with a disclaimer on survivorship bias”.

Inspirational talks, books, etc. often take a similar approach: “I have been successful. I attribute my success to the actions I now recall, but at the time it was not clear they would have any effect on success or failure.”

Side note: beware of consultants with these stories. Context matters.
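
A small simulation shows how survivorship bias manufactures “success stories” out of pure luck (the setup is invented for illustration):

  import random

  random.seed(7)
  founders = 1024
  survivors = 0
  for _ in range(founders):
      # A founder "survives" only with 10 lucky years in a row, each a coin flip.
      if all(random.random() < 0.5 for _ in range(10)):
          survivors += 1

  print(survivors)  # expect about 1, since 1024 * 0.5**10 == 1
  # That lone survivor will happily explain the "habits" behind their success,
  # while the ~1023 who did the same things and failed are invisible.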

Other Biases

Here are some other biases that seem relevant to good decision making, but which I do not come across as often:

Skill Bias

When education and training causes confidence to increase faster than ability.

There is an old saying: “In theory, theory and practice are the same thing. In practice …” Many people believe they know more about what is happening because of their training and education, but in reality it's the people working the day-to-day issues who really know what is going on.

In fact, it was Peter Drucker who defined a knowledge worker as “a person who knows more about the job than their boss”. And the Japanese say “no important invention happens in the office - you have to go and see”. But most of our approaches to knowledge work are focused on thinking rather than doing (e.g. the big upfront plan), and on assuming that the manager knows more than the people doing the work.

Negativity Bias

Assuming perpetual doom, that problems will never be fixed, and that all hope is lost.

How many times do we not take the time to do the right thing because something else is more urgent? Many organizations fall into this trap, and then wonder why their people are depressed.

Risk Perception Bias

Attempting to eliminate one risk, but exposing yourself to another, potentially more harmful, risk.

Illusion of Control

Thinking that your decisions and skill led to a desired outcome, when luck was likely a big factor.

Most systems we work with today are “complex adaptive systems”. Complex adaptive systems have the following characteristics:

  1. Emergence: The system manifests new behaviors that none of its components have
  2. Non-determinism: Exact outcomes cannot be predicted
  3. Non-linearity: Very small changes lead to dramatic outcomes

Today, both the product we develop and the system we develop it in are complex adaptive systems. And since exact outcomes cannot be predicted, how many of the upfront decisions we make actually cause the successful outcome? In reality, a lot of success comes down to luck …
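
The non-linearity point can be demonstrated with the logistic map, a standard toy example of chaotic behavior (the sketch is illustrative only; real products and organizations are far messier):

  def logistic(x: float, r: float = 4.0, steps: int = 50) -> float:
      """Iterate the logistic map x -> r * x * (1 - x)."""
      for _ in range(steps):
          x = r * x * (1 - x)
      return x

  print(logistic(0.300000))  # one trajectory
  print(logistic(0.300001))  # a "very small change" in the starting point ...
  # ... produces a completely different outcome after 50 steps:
  # exact prediction is impossible in practice.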

Want to Know More

  • The original idea for this article (and some of the original list) came from a Motley Fool article, “15 Biases That Make You a Bad Investor”
    • Hyperbolic Discounting (going for an immediate payoff instead of a larger delayed one). E.g. not writing that automated test you know you should.
    • IKEA Effect (overvaluing your own solutions to a problem, and so undervaluing other solutions). The IKEA effect is a cognitive bias in which consumers place a disproportionately high value on products they partially created. If you’ve ever worked for a company that used a dumb internal tool rather than a better out-of-the-box solution, you know what I’m talking about.
    • Premature Optimization (Optimizing before you know that you need to). For example, the perfectly coded prototype.
    • Planning Fallacy (Optimistically underestimating the time required to complete a task). We think “ideal” and this comes apart in the real world. Also the old aphorism, “The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.”
    • Recency Bias (Placing higher value on recent events than ones that occurred further in the past). “Do you find yourself using the same design patterns over and over again? If yes, you’re probably looking at different problems from the same lens.”
  • Poster of biases to potentially use in workshop