‘Confirmation bias’, ‘groupthink’ & ‘normalisation of deviance’: The invisible enemies of a socially safe organisation

In the modern business world, we strive for innovation, efficiency and a healthy work environment in which everyone feels safe. Yet these goals are often undermined by subtle but powerful psychological phenomena. Confirmation bias and groupthink can cloud the critical thinking and open communication on which social safety depends.

Obviously, when information is not gathered, weighed and acted upon in a neutral and objective way, the resulting decisions can be wrong or clumsy. Recognising and breaking these patterns is essential for anyone striving for a socially safe environment and a healthy work culture.

What are confirmation bias and groupthink?

Confirmation bias is the tendency to seek, interpret or remember information in a way that confirms our pre-existing beliefs. It can cause us to ignore or minimise information that conflicts with those beliefs, opinions and views. The result is a distorted view of reality. When individuals or teams are not open to new or different perspectives, this can lead to dangerous processes such as tunnel vision and the failure to notice, or take seriously, critical information.

In groups, too, bias is our enemy. Groupthink occurs in groups with a strong desire for consensus and harmony. Alternative ideas that do not fit the group's norms or culture are suppressed, and divergent views of group members are subtly dismissed, simply ignored or, in some cases, even met with hostility.

Group members whose opinions, information or knowledge differ from what most of the group thinks or wants are known to be less inclined to stand up and share their view. On the contrary, they tend to put their own opinions on the back burner and keep them to themselves. They are even likely to accept and follow the group's predominant ideas despite privately disagreeing with them. In other words, they conform, behaving in line with the rest of the group. Several psychological experiments confirm this interesting and dangerous phenomenon: people do not want to be left out of the group, and they certainly do not like enduring criticism from the majority. As with confirmation bias, the risk of wrong decisions increases, and this certainly applies to teams investigating incidents.

Many readers, perhaps even you, will claim to be resilient to these mechanisms: familiar with the phenomena and therefore less susceptible to their dangers. But be warned, this is an invisible enemy that can target anyone, including the person writing this piece, experienced investigators and those who consider themselves self-confident, independent thinkers.

Research shows that groups led by a dominant person who handles criticism poorly are more susceptible to groupthink, as are groups working under high pressure or confronted with a lot of stress. Groupthink can lead to risky decisions and a culture in which errors, misconduct and abuses are identified too late, or not at all.

Another relevant experiment is the Wason Selection Task. It showed that people are far more likely to seek evidence that confirms their beliefs than evidence that contradicts them. Participants were shown four cards and given a rule such as: 'If there is a vowel on one side, then there is an even number on the other side.' Most people chose the cards that could confirm the rule rather than the ones that could disprove it, a classic example of confirmation bias. In organisations, the same habit leads to tunnel vision and risky decision-making when people do not actively seek out counter-evidence.
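The underlying logic can be sketched in a few lines of Python. The card faces below are illustrative, not the exact materials of the original study: only a vowel (whose hidden side might be odd) or an odd number (whose hidden side might be a vowel) can falsify the rule, yet most participants instead flip the vowel and the even number.

```python
def is_vowel(face: str) -> bool:
    return face.isalpha() and face.lower() in "aeiou"

def is_odd_number(face: str) -> bool:
    return face.isdigit() and int(face) % 2 == 1

def cards_to_flip(faces: list[str]) -> list[str]:
    """Return the visible faces that could falsify the rule
    'if a card has a vowel on one side, it has an even number on the other'.

    Only a vowel or an odd number can reveal a counterexample;
    flipping a consonant or an even number can never disprove the rule.
    """
    return [f for f in faces if is_vowel(f) or is_odd_number(f)]

print(cards_to_flip(["A", "K", "4", "7"]))  # prints ['A', '7']
```

Note that the '4' card, which most participants choose, is logically irrelevant: whatever is on its other side, the rule survives.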

The impact on organisations

At DANTES we have often seen how these phenomena can influence teams working on social safety, researchers and investigators. Bias can also cause organisations to stick with outdated strategies despite clear signals that change is necessary.

A classic example of the dangers of bias is the 1986 Space Shuttle Challenger disaster. There was peer pressure and a strong desire from the dominant leadership to stay on schedule. Many engineers were well aware of a specific defect, the O-ring seals in the solid rocket boosters, but had already accepted that deviation as a group. After all, there had been several previous successful launches despite the flaw. The engineers had slowly come to view the technical error as 'normal' and no longer put in the effort, or took the time, to fix the obvious problem. It was this defect that caused the Challenger disaster.

This phenomenon is also known as 'normalisation of deviance'. In the context of social safety, think of socially unsafe behaviour in the workplace that is slowly but surely accepted (normalised) by the dominant group: a gradual shift in standards. An outsider or new employee joining the team will initially be surprised and may condemn the unsafe behaviour, or even bring it up for discussion. The group, however, barely reacts to the newcomer's warnings and concerns, because it normalised the behaviour long ago. To the group, what the colleague does is no longer strange, weird or harmful; after all, he has behaved this way for a long time and nothing serious has ever come of it. There is a good chance that the new employee will gradually conform to this prevailing view. Groupthink sets in. The socially unsafe behaviour continues, and victims do not dare to come forward or convince themselves that they are being oversensitive. Until it escalates and a serious, violent incident occurs.

The Psychology of Totalitarianism

In his book The Psychology of Totalitarianism, Dr. Mattias Desmet discusses how mass formation and totalitarian tendencies can arise from mechanisms such as confirmation bias and groupthink. He states that in times of uncertainty and fear, people tend to conform to dominant narratives, even if they are harmful. This process can result in a collective loss of critical thinking and individual autonomy.

Desmet describes how certain ideas or policy choices in society are accepted unchallenged because differing opinions are discouraged or even punished. This happens not only at the political level but also within organisations, where employees may be afraid to go against the prevailing opinion. This creates blind spots in policies and strategies, which can ultimately harm both employees and the organisation.

The importance of breaking patterns

Breaking through the phenomena discussed above is crucial for promoting a culture of openness and critical reflection. This is essential not only for innovation and growth but also for ensuring social safety within an organisation. When people feel free to express concerns and share differing opinions, the general sense of social safety increases, as does the willingness to report and the flexibility to change. Potential problems and events that could jeopardise the safe culture are addressed faster and more effectively.

For example, a lack of critical reflection can lead to employees not speaking out about unsafe situations or ethical misconduct. This poses a significant risk, especially in sectors where safety and integrity are crucial. Problems stay under the radar for longer and can eventually escalate. Internal investigations into incidents become less objective, and problematic employees get off scot-free. Outsiders wonder how someone who has displayed long-term, serious and repeated misconduct can still work in the department they have terrorised for years. An extreme example, but one we do encounter in practice.

Strategies for Organisations

To break these destructive patterns, organisations can consider the following steps:

Encourage diversity: Build teams with diverse backgrounds and perspectives to explore a broader range of ideas and solutions. Different insights help minimise blind spots in decision-making.

Create a culture of open dialogue: Encourage employees to express their opinions without fear of repercussions, for example by implementing anonymous feedback systems or by training leadership in open communication.

Conduct independent evaluations: Involve external experts in decision-making processes to minimise bias and gain objective insights.

Offer awareness training: Educate employees about the dangers of confirmation bias and groupthink and how to recognise and avoid them. This can help them to be more aware and critical of information and decision-making.

Appoint a 'devil's advocate': Give someone in the team the role of critical questioner to surface diverse perspectives and reduce peer pressure.

Conclusion

By actively working to recognise and counteract confirmation bias and groupthink, organisations can create an environment in which critical thinking flourishes and both individuals and teams reach their full potential. This not only contributes to better decision-making and innovation but also to a culture that promotes social safety and willingness to report.

For more information on how DANTES can support your organisation in fostering a culture of critical thinking and openness, visit www.dantespsychology.com

Inge Nijenhuis
