Cognitive Bias

Overcoming the Bystander Effect


Research and personal experience both demonstrate that people are less likely to intervene (offer help) when other people are around than when they are the only person observing the incident. This phenomenon has come to be known as the Bystander Effect, and understanding it is crucial to increasing intervention into unsafe actions in the workplace. It came to light following an incident on March 13, 1964, when a young woman named Kitty Genovese was attacked by a knife-wielding rapist outside of her apartment complex in Queens, New York. According to the initial news reports, many people watched and listened from their windows for the 35 minutes during which she attempted to escape while screaming that he was trying to kill her, yet no one called the police or attempted to help. In fact, her attacker left her on two occasions only to return and continue the attack; intervention during either of those intervals might have saved her life. The incident made national news, and the “experts” of the day seemed to agree that it was “heartless indifference” on the part of the onlookers that kept anyone from coming to her aid. Following this, two social psychologists, John Darley and Bibb Latane, began conducting research into why people fail to intervene. Their research became the foundation for understanding the Bystander Effect, and in 1970 they proposed a five-step model of helping in which failure at any step can prevent intervention (Latane & Darley, 1970).

Step 1: Notice That Something Is Happening. Latane & Darley (1968) conducted an experiment in which male college students were placed in a room either alone or with two strangers. Smoke was introduced into the room through a wall vent, and the researchers measured how long it took participants to notice it. Students who were alone noticed the smoke almost immediately (within 5 seconds), but those who were not alone took four times as long (20 seconds). Simply being with others, such as when working in teams in the workplace, can increase the amount of time it takes to notice danger.

Step 2: Interpret the Meaning of the Event. This involves understanding what is a risk and what isn’t. Even if you notice that something is happening (e.g., a person not wearing PPE), you still have to determine that it is creating a risk. Knowledge of risk factors is obviously important, but when you are with others and no one else is saying anything, you might think they know something about the riskiness of the situation that you don’t. In fact, they may be thinking the same thing about you (pluralistic ignorance), so no one says anything and everyone simply assumes that nothing is wrong.

Step 3: Take Responsibility for Providing Help. In another study, Darley and Latane (1968) demonstrated what is called diffusion of responsibility: as more people are present, each person assumes less of the responsibility, and therefore any one person is less likely to intervene. When someone is the only person observing an event, they carry 100% of the responsibility; with two people, each carries 50%, and so forth.
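To make that arithmetic concrete, here is a minimal sketch of the even-split idea in Python. The 1/n rule and the function name are our illustration of the dilution, not a formula taken from Darley and Latane’s study.

```python
# Minimal sketch of diffusion of responsibility, assuming responsibility
# splits evenly across everyone present. The 1/n rule is our illustration,
# not a formula from Darley and Latane (1968).

def responsibility_share(observers: int) -> float:
    """Fraction of responsibility each person feels under an even split."""
    if observers < 1:
        raise ValueError("need at least one observer")
    return 1.0 / observers

for n in (1, 2, 5, 10):
    print(f"{n:>2} observer(s): {responsibility_share(n):.0%} each")
# 1 -> 100%, 2 -> 50%, 5 -> 20%, 10 -> 10%
```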

Step 4: Know How to Help. When people feel competent to intervene, they are much more likely to do so than when they don’t; competence engenders confidence. Cramer et al. (1988) demonstrated that nurses were significantly more likely to intervene in a medical emergency than were participants without medical training. Our own research (Ragain et al., 2011) also found that participants reported being reluctant to intervene when observing unsafe actions because they feared the other person would become defensive and they would not be able to deal with that defensiveness. In other words, they didn’t feel competent to intervene successfully, so they didn’t intervene.

Step 5: Provide Help. Failure at any of the previous four steps will obviously prevent Step 5 from occurring, but even if a person notices that something is happening, interprets it correctly, takes responsibility for providing help, and knows how to do so successfully, they may still fail to act, especially in groups. Why? People don’t like to look foolish in front of others (audience inhibition) and may decide not to act when there is a chance of failure. A person may also fail to act when they think the potential costs are too high. Have you ever known someone (perhaps yourself) who decided not to point out that the boss wasn’t wearing proper PPE for fear of losing their job?
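Because failure at any single step blocks intervention, the model behaves like a chain of gates. Here is a minimal sketch of that structure; the step names paraphrase Latane & Darley (1970), but the code itself is purely our illustration.

```python
# Minimal sketch of the five-step helping model as a chain of gates:
# intervention happens only if every step succeeds, in order.
# Step names paraphrase Latane & Darley (1970); the code is illustrative.

STEPS = (
    "notice that something is happening",
    "interpret the event as a risk",
    "take responsibility for helping",
    "know how to help",
    "act despite audience inhibition and perceived costs",
)

def will_intervene(outcomes: dict) -> bool:
    """Return True only if the observer passes every step in order."""
    for step in STEPS:
        if not outcomes.get(step, False):
            print(f"Intervention blocked at: {step}")
            return False
    return True

# Example: everything goes right except feeling competent to help.
outcomes = {step: True for step in STEPS}
outcomes["know how to help"] = False
will_intervene(outcomes)  # prints "Intervention blocked at: know how to help"
```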

The bottom line is that, for a variety of reasons, we are much less likely to intervene when in groups. The key to overcoming the Bystander Effect is twofold: 1) awareness and 2) competency. First, simply knowing about the Bystander Effect, and how any of us can fall victim to it, makes us less likely to do so; we are wired to be bystanders, but awareness of that wiring weakens its hold. Second, training our employees in risk awareness and intervention skills makes them more likely to identify risks and to actually intervene when they recognize them.

Hardwired to Jump to Conclusions


Have you ever misinterpreted what someone said, or why they said it, responded defensively, and ended up needing to apologize for your response? Or have you ever been driving down the freeway, minding your own business, driving the speed limit, and been cut off by someone? If you have, and you are like me, then you probably shouted something like “jerk” or “idiot”. (By the way, as my 6-year-old grandson reminded me from the back seat the other day, the other driver can’t hear you!) As it turns out, we are cognitively hardwired to respond quickly with an attributional interpretation of what we see and hear. It is how we attempt to make sense of our fast-paced, complex world. In his 2011 book, “Thinking, Fast and Slow”, Daniel Kahneman proposes that we have two different cognitive systems: one designed for automatic, rapid interpretation of input with little or no effort or voluntary control (System 1), and the other designed for conscious, effortful, and rational interpretation of information (System 2). We spend most of our daily lives using System 1 because it requires much less effort and energy as it helps us make sense of our busy world.

The problem is that System 1 analysis is based on limited data and relies on past experience and easily accessible knowledge to make interpretations, and thus it is often wrong. When I interpreted the actions of the driver who cut me off as a reflection of his intellect (“idiot”), it was System 1 processing that led to that interpretation. I “jumped to a conclusion” without sufficient processing; I didn’t allow System 2 to do its work. If I stay with my System 1 interpretation, then the next time I get cut off I am even more likely to see an “idiot”, because that interpretation is the most easily accessible one after the previous experience. But if I allow System 2 to operate, I can change the way I perceive future events of this kind. System 2 allocates attention and effortful processing to alternative interpretations of data and events. It requires more time, but it also increases the probability that our interpretation is right. Asking myself whether there could be other reasons why the driver cut me off is a System 2 function. Identifying and evaluating those possibilities is also a System 2 function. Engaging System 2 can alter the information stored in my brain and thus change the way I perceive and respond to similar events in the future.

So how can we stop jumping to conclusions?

It would be great if we could override our brain’s wiring and skip System 1 processing, but we can’t. Actually, without System 1 we would not be very efficient, because we would over-analyze just about everything. What we can do is recognize when we are jumping to conclusions (guessing about intent, for example) and force ourselves to focus our attention on other possible explanations, i.e., activate System 2. You need to find your “guessing trigger” to signal you to call up System 2. When you realize that you are thinking negatively about someone (“idiot”) or feeling a negative emotion like anger or frustration, simply ask yourself: “Is there something I am missing here?” “Is there another possible explanation for this?” Asking this will activate System 2 processing (and also calm you down) and lead to a more accurate interpretation of the event. It will help override your natural tendency to jump to conclusions. It might even keep you from looking like an “idiot” when you would otherwise have to apologize for a wrong interpretation and action.

Avoid Cognitive Bias to Create Workplace Accountability


As we discussed in our January Newsletter, the first step to Accountability involves an examination of the facts/reasons underlying a specific event/result (accounting). For this process to bear fruit, we must accurately and fairly evaluate the causes of the poor performance, and doing so requires that we understand how our biases could affect that evaluation. This is where Cognitive Biases come into play. You may be saying to yourself, “I don’t have any biases. What are they talking about?”

Well, the truth is that we are all affected by biases, and much of the time, for that matter.

What is a Cognitive Bias?

A Cognitive Bias is anything in our thought process that can distort the way we view things, including the actions of another person.

Psychologists have identified and studied a multitude of cognitive biases, but two of them directly impact how we account for the actions/results of another person.

Confirmation Bias

One of these is Confirmation Bias: the tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions. In other words, we are predisposed to look for causes that confirm what we expect.

This means, for example, that if we are predisposed to view another person as competent, a hard worker, and motivated, then we will tend to look for these types of behaviors in that person and overlook behaviors that conflict with our preconception. Additionally, we would be more likely to account for poor performance on the basis of external factors such as lack of resources, lack of support, etc., rather than internal factors such as knowledge, ability, or motivation. In other words, we would be likely to conclude that the failure was out of the person’s control.

On the other hand, if we are predisposed to view another person as incompetent, lazy, and unmotivated, then we will tend to look for support of this preconception when accounting for a failure, and perhaps blame the person for it.

The Confirmation Bias is the underlying driver of a phenomenon commonly referred to as the Self-Fulfilling Prophecy. This phenomenon has been demonstrated through research and personal experience in various environments and is notably reflected in the positive correlation between a supervisor’s expectations of a subordinate and that subordinate’s performance.

Low, negative expectations tend to result in poor performance, whereas high, positive expectations tend to result in good performance.

Therefore, how we view an individual can not only color how we evaluate performance but also determine how the individual actually performs. To fairly hold others accountable for failure, we must be aware of our predispositions/biases regarding the individual and of how we may have contributed to the failure in the first place.

Fundamental Attribution Error

The second Cognitive Bias related to Accountability is called the Fundamental Attribution Error.

Have you ever been driving on a three-lane highway, going the speed limit in the right-hand lane (left-hand lane if you are from the UK), approaching an exit that you are not taking, only to have someone cut dangerously close in front of you to take the exit? What were your thoughts about the person doing the cutting? If you are like most of us, you called the person a “jerk” or something worse and honked your horn or gestured “politely”.

You just attributed the other person’s actions to an internal attribute: carelessness or some other bad motive. In other words, you viewed the other person as “bad” in some way.

Now, have you ever cut someone off in a similar circumstance when you needed to get to an exit? If you are like us, and everyone else we have asked this question, then the answer is “yes”!

So why did you do it?

Probably because that “jerk” in the right-hand lane wouldn’t get out of the way and let you exit. In other words, your poor performance was due to external causes, not to your carelessness or bad motive.

This is the Fundamental Attribution Error: we tend to attribute internal/motivational causes to the poor performance of others but not to our own poor performance. This cognitive bias can cause us to “jump to the conclusion” that the poor performance was due to motivation and thus interfere with a complete evaluation of other causes. Failure to accurately evaluate the “real” causes will most likely produce consequences or corrections that do not lead to success in the future.

What's the Point?

Simply being aware that these two Cognitive Biases exist will help reduce, and hopefully eliminate, their impact on the accountability process.

As we will discuss in a future newsletter, starting your accounting of poor performance without “guesses” as to the cause(s) will almost always lead to a more accurate evaluation.