Are You Blind to Bias?

Joe Burroughs

Many people are frustrated by uncertainty and complexity in their jobs. I use Agile principles to provide clear and simple strategies so that you can win at work!

Many professionals are blind to their own biases

Bias is a disproportionate weight in favor of or against an idea or thing, usually in a way that is closed-minded, prejudicial, or unfair. Biases can be innate or learned. People may develop biases for or against an individual, a group, or a belief. In science and engineering, a bias is a systematic error.

There are many types of bias, but among professionals there are three main categories to be aware of:

Cognitive Bias

These are things we think to be true that are not.

Cultural Bias

Personal truths based on upbringing and affiliations that are not objectively true.

Unconscious Bias

Prejudices we are not even aware that we have.


Cognitive bias at work

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own “subjective reality” from their perception of the input. A cognitive bias is a strong, preconceived notion of someone or something, based on information we have, perceive to have, or lack. These preconceptions are mental shortcuts the human brain produces to expedite information processing—to quickly help it make sense of what it is seeing. In terms of the work environment, cognitive bias appears in many ways:

  • Confirmation bias: This type of bias refers to the tendency to seek out information that supports something you already believe. It is a particularly pernicious subset of cognitive bias: you remember the hits and forget the misses. People cue into things that matter to them and dismiss the things that don’t, often leading to the “ostrich effect,” where a subject buries their head in the sand to avoid information that may disprove their original point.
  • Availability bias: Also known as the availability heuristic, this bias refers to the tendency to use the information we can quickly recall when evaluating a topic or idea—even if this information is not the best representation of the topic or idea. Using this mental shortcut, we deem the information we can most easily recall as valid and ignore alternative solutions or opinions.
  • Fundamental attribution error: This bias refers to the tendency to attribute someone else’s behavior to their character or disposition while attributing our own similar behavior to external circumstances. For instance, when someone on your team is late to an important meeting, you may assume that they are lazy or lacking motivation without considering situational factors like an illness or traffic accident that led to the tardiness. However, when you are running late because of a flat tire, you expect others to attribute the error to the external factor (flat tire) rather than your personal character.
  • Hindsight bias: Hindsight bias, also known as the knew-it-all-along effect, is when people perceive events to be more predictable after they happen. With this bias, people overestimate their ability to predict an outcome beforehand, even though the information they had at the time would not have led them to the correct outcome. This type of bias happens often in sports and world affairs. Hindsight bias can lead to overconfidence in one’s ability to predict future outcomes.
  • Anchoring bias: The anchoring bias, also known as the anchoring effect, pertains to those who rely too heavily on the first piece of information they receive—an “anchoring” fact— and base all subsequent judgments or opinions on this fact.
  • Optimism bias: This bias refers to our tendency as humans to overestimate the likelihood of a positive outcome when we are in a good mood.
  • Pessimism bias: This bias refers to our tendency as humans to overestimate the likelihood of a negative outcome when we are in a bad mood.
  • Status quo bias: The status quo bias refers to the preference to keep things in their current state, while regarding any type of change as a loss. This bias makes change difficult to process or accept.

Cultural bias plays a role as well

Cultural bias is a prejudice, expressed or implied, that favors one culture over another. It can be discriminatory because it fails to integrate another group’s social values, beliefs, and rules of conduct.

Cultural bias is the interpretation of situations, actions, or data based on the standards of one’s own culture. Cultural biases are grounded in the assumptions one might have due to the culture in which they are raised. Some examples of cultural influences that may lead to bias include:

  • Linguistic interpretation: producing a first and final translation of an expression based on a single exposure to the source language, which lets the interpreter’s own assumptions shape the rendering.
  • Ethical concepts of right and wrong: Interpretation of what is moral is influenced by cultural norms, and different cultures can have different beliefs about what is right and wrong.
  • Understanding of facts or evidence-based proof: When forming personal convictions, we often interpret factual evidence through the filter of our learned values, shared feelings, tastes, and past experiences.
  • Intentional or unintentional ethnic or racial bias based on cultural norms: unintentional biases are systemic, existing in the advantages and disadvantages imprinted in cultural artifacts, ideological discourse, and institutional realities that work together with individual biases.
  • Religious beliefs or understanding: this is the tendency to judge the strength of arguments based on the plausibility of the conclusion as it relates to religious or cultural beliefs rather than how strongly evidence supports a given conclusion.

Unconscious bias is always a concern

Unconscious biases, also known as implicit biases, are the underlying attitudes and stereotypes that people unconsciously attribute to other people, situations, ideas, artifacts, processes, and concepts that affect how they understand and engage with them going forward. There are a number of types of unconscious bias in most workplaces but here are some of the most common:

  • The Halo Effect: a single positive trait, or association with a “good” person or outcome, creates a positive initial impression of someone overall
  • The Horns Effect: a single negative trait, or association with a “bad” person or outcome, creates a negative initial impression of someone overall
  • Contrast Effect: differences draw attention and often that attention is negative
  • Gender Bias: attributing roles, characteristics, expected behaviors, etc. to individuals based on their gender
  • Racism: at its most basic, is thinking that one color or race is better than another and treating or mistreating people based on that belief
  • Ageism: believing that a person can be too old or young to contribute meaningful value to a conversation, project, or effort
  • Name Bias: attributing positive or negative value based solely on a person or product’s name
  • Beauty Bias: belief that attractiveness and intelligence or other positive qualities are related

How does bias show up at work?

Forbes estimates that bias in the workplace is costing American companies north of $60 billion annually. In their article entitled “New Data Reveals The Hard Costs Of Bias And How To Disrupt It,” they share that:

“Employees at large companies who perceive bias are nearly three times as likely to be disengaged at work. That kind of clock-punching is costly. Gallup estimates that active disengagement costs U.S. companies $450 billion to $550 billion per year.”

Bias shows up in all stages of the employee lifecycle from hiring practices, team assignments, peer and mentor relationships, promotions, raises, partnerships, to leadership considerations and beyond. 

Recognizing that we all have biases and remaining vigilant in our efforts to negate their influence is the first step, but we need to do more. It is our responsibility to address systemic biases as we encounter them. Simply put, we need to look at the processes we interact with on a routine basis with fresh eyes. Look for biases built into the process or the system itself. For example, consider access requests. Are there biases built into the granting or distribution of access to applications, information, reports, etc. that prevent or inhibit employees from succeeding in their jobs? What about gender bias in roles within your organization? Are there roles that seem to be filled by men far more than women, or vice versa? Ask yourself why, and then what would change in the organization if the role were gender-balanced.

The realization that bias exists in others, ourselves, and within the systems and processes we interact with is the “ah-ha” moment, but nothing will change if we don’t decide to act.

It is our responsibility to address systemic biases as we encounter them.

Please share!

Share your thoughts, suggestions, comments, and experiences in the comments below. We can learn from one another and grow as a community to help build the organizations we want to work for. Also, consider sharing this article with someone who has experienced bias or encountered it at work. Their insight and commentary would be invaluable!

Cheers,

Joe Burroughs

Sharing is Caring!

We are working hard to grow our audience and you can help! Please share any content you find valuable or interesting.

