The brain is an amazing and complex operating system. We use it to make hundreds of decisions every day, from what clothes to wear and what to eat for lunch to how to pitch your big idea to your team. The human brain carries out an enormous amount of neural processing every second. And because it is an information-processing machine, we sometimes construct subjective reality from unconscious bias rather than objective data. We don’t mean to do it. But we all do it, and it often leads us away from rational judgment and sound conclusions and toward inaccurate, incomplete, or counterproductive decisions.
Unconscious bias has been identified, observed, and validated in brain studies using Magnetic Resonance Imaging (MRI) technology, and it is now acknowledged by psychologists and neuroscientists as real and measurable. Studies conducted at Harvard University using the Harvard Implicit Association Test have revealed that decision-making automatically triggers specific regions of the brain responsible for unconscious processing. The scans showed that activity increases in the amygdala (the region responsible for processing emotions, specifically fear) when the brain responds to perceived threats. Your brain automatically creates biases, or mental shortcuts, as a survival mechanism.
Biases arise from a variety of mental processes including:
· information-processing overload
· mental “noise” (memory deficiencies)
· emotional and moral motivations
· social influence
Cognitive biases also impact the way we make business decisions and collaborate with colleagues in group-work settings. Research shows that in the workplace we often inaccurately evaluate our colleagues and make critical missteps based upon proxies of expertise instead of actual skills and experience.
Princeton professor and author Daniel Kahneman, a psychologist awarded the Nobel Memorial Prize in Economic Sciences, maintains that quirks, logical inconsistencies, and flaws in human decision making are the rule rather than the exception in cognitive processing. In fact, he has spent his career explaining human mistakes and miscalculations. Research Kahneman introduced decades ago continues to influence hundreds of billions of dollars in corporate investments worldwide today. His insights into the nature of human judgment and cognitive biases have led us to a fundamental reevaluation of human thought processes and how they influence group decision-making and risk assessment.
Kahneman starts at the individual level and looks at cognitive biases that often result in errors. Then he looks at the group to determine how individual biases affect it. Kahneman maintains that we underestimate not only the power of our biases but also the extent to which they influence our decision-making. (A substantial body of research shows that people tend to believe they have always held their current position, even after they have been persuaded to a different one.) Also, groups tend to be more optimistic than the individuals who make them up, which frequently leads to a consensus opinion without proper evaluation of alternative viewpoints. This underlies the phenomenon of groupthink.
Here are a few common biases that may be crippling the creativity and collaboration power of your team:
Illusory Superiority Bias
The illusory superiority bias is often at the core of conflicts in team tasks. Our natural inclination is to overestimate our own knowledge and skills. One of the first studies to document illusory superiority was conducted by the College Board in 1976. More than a million students taking the SAT that year were given a survey asking how they rated relative to the median of the sample (rather than the average peer) on a number of characteristics. In ratings of leadership ability, 70% of the students placed themselves above the median. In ability to get on well with others, 85% put themselves above the median, and 25% rated themselves in the top 1%.
Research shows that our overall inability to assess competence and abilities leads to a phenomenon called the Dunning-Kruger Effect. Psychologists David Dunning and Justin Kruger found that persons of low ability suffer from illusory superiority, mistakenly assessing their cognitive ability as greater than it is. Furthermore, the least competent performers inflate their abilities the most.
Interestingly, Dunning and Kruger’s research on this bias originated with the criminal case of a bank robber named McArthur Wheeler. Do you remember using lemon juice as invisible ink when you were a kid? Apparently so did Wheeler. He covered his face with lemon juice, believing he would be invisible to the surveillance cameras. (Seriously, that’s what Wheeler was going with.) Dunning and Kruger set out to understand why, and how common that behavior is. Their research, and the aptly named Dunning-Kruger Effect, concludes that the less skilled you are at something, the less likely you are to recognize how unskilled you truly are, and thus you overestimate your abilities. You don’t know what you don’t know, which makes you feel smarter than you are. Conversely, the better we are at a particular skill, the more likely we are to underestimate our ability.
Similarity Bias
The similarity bias is also known as the “people like me” bias. It’s the natural tendency to like people who seem more like you. We tend to agree with people with whom we share opinions outside of the given topic. For example, if you enjoy talking football with John in the break room, the similarity principle says that you are more likely to subconsciously support his ideas in a meeting than those of Dan – with whom you share nothing in common.
Research shows that similarity is one of the biggest factors influencing hiring practices.
Interviewers naturally have a stronger affinity toward people they like – people like them – rather than people who may have a much stronger skill set or better qualifications for the position. This also creates a diversity problem: diverse teams have a wide range of strengths that enable them to overcome a wide spectrum of challenges.
From the interviewee perspective, the best way to gain points is to find a point of commonality with the interviewer. Same hometown, a love for yoga, children, or even a vase or piece of artwork in the office – anything that says “you and I share this thing and it makes us more alike than different” – subconsciously influences the interviewer and gives you an advantage.
Halo Effect
The halo effect was named by psychologist Edward Thorndike in 1920. His landmark study, "A Constant Error in Psychological Ratings", was designed to determine how the perception of one characteristic affected our perceptions of unrelated characteristics. For example, Thorndike asked officers to evaluate soldiers in terms of physical qualities, intellect, leadership skills, and personal qualities. He discovered that we have a natural tendency to form a general impression based upon a single characteristic.
The effect works in both positive and negative directions (and is sometimes referred to as the horns and halo effect). If we dislike one aspect of someone, there is a subconscious natural tendency to have a negative predisposition toward unrelated characteristics. A classic example of the horn effect is making the judgment that an unattractive person is less qualified for a position than an attractive person with the same qualifications. (It's also called discrimination, but it happens all the time.)
The halo effect is especially prevalent in annual performance reviews. If an employee is competent in one area of his job, the natural tendency is to assume proficiency in other areas even without supporting data. Likewise, an employee who is incompetent in one or two aspects of his role may be labeled as incompetent across the board - again, without supporting data.
The horns and halo effect can play a significant role in group decision-making and team dynamics. For example, if a member of the team consistently makes minor mistakes on day-to-day tasks, the group may disregard any ideas offered by that person in brainstorming sessions. The converse also applies: a person who generates amazing PowerPoint presentations may be judged as equally competent in project management.
Confirmation Bias
The brain loves to be right. Confirmation bias is the tendency to seek out evidence that confirms what we already believe to be true while ignoring contradicting evidence, even if it is factual and valid. It explains why it seems impossible to persuade someone arguing for or against a hot-topic issue such as pro-life/pro-choice or gun control to consider the merits of the opposite stance.
This is one of the most common biases that subconsciously influences everyday decisions in the workplace. For example, suppose you are assigned to a team tasked with researching the next fitness product to develop. You think it should be yoga, but your data and research say otherwise. Confirmation bias keeps you from accepting the conflicting data, conducting more extensive market research, or even running your own tests – any of which might reveal that your preconception is not the best move for your organization. Combined with the optimism bias, confirmation bias reinforces the sense that your position must be the right one and underestimates any probability that it isn't.
Warren Buffett has been known to actively fight confirmation bias by inviting people with opposing opinions to the table when discussing investment strategies.
"Charles Darwin used to say that whenever he ran into something that contradicted a conclusion he cherished, he was obliged to write the new finding down within 30 minutes. Otherwise his mind would work to reject the discordant information, much as the body rejects transplants. Man's natural inclination is to cling to his beliefs, particularly if they are reinforced by recent experience--a flaw in our makeup that bears on what happens during secular bull markets and extended periods of stagnation." - Warren Buffett
To ensure you're making decisions based on facts and analysis and you're not influenced by confirmation bias, consider three simple rules:
· Understand bias and acknowledge that it influences our judgment.
· Be on the lookout for potential examples of biases and acknowledge them when they occur.
· Aggressively seek out and consider information that opposes our position.
Or, consider the words often attributed to Albert Einstein:
“If the facts don’t fit the theory, throw out the facts.”