I will look at any additional evidence to confirm the opinion to which I have already come.
Arthur Hugh Elsdale Molson, British Conservative politician
Do you think you are right and it is others who are misguided? They clearly cannot see what is going on. Their opinions on these matters are fundamentally incorrect. They need to be brought round to the ‘right’ way of thinking. Any argument they make just reinforces your own view that you are right.
Or do you have an open mind? Are you ready to listen to, and accept, someone else’s opinion? Do you ever consider that you might be wrong, and do you have the mental capacity to accept it? During a project, if a review of lessons learned shows that changes in behaviour are needed, those changes will take place, even though they might be hard to accept.
So which are you?
If you are the latter then you will be open to the three ideas below.
- Cognitive Dissonance
- Confirmation Bias
- Rationalisation
They may help explain why some individuals will not change: why, despite evidence that their approach is fundamentally failing, they continue along the path of failure.
If you are the former, there is no point in reading on, you will not listen with an open mind anyway.
Cognitive dissonance theory is founded on the assumption that individuals seek consistency between their expectations and their reality. Because of this, people engage in a process called dissonance reduction to bring their cognitions and actions in line with one another.
Dissonance is felt when people are confronted with information that is inconsistent with their beliefs. If the dissonance is not reduced by changing one's belief, the individual may try to restore consonance through misperception, rejection or refutation of the information, by seeking support from others who share the belief, or by attempting to persuade others.
Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favour, and recall information in a way that confirms one's pre-existing beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position.
Biased search, interpretation and memory have been invoked to explain:
- attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
- belief perseverance (when beliefs persist after the evidence for them is shown to be false)
- the irrational primacy effect (a greater reliance on information encountered early in a series)
- illusory correlation (when people falsely perceive an association between two events or situations)
In psychology and logic, rationalisation (also known as making excuses) is a defence mechanism in which controversial behaviours or feelings are justified and explained in a seemingly rational or logical manner to avoid the true explanation, making them consciously tolerable – or even admirable and superior – by plausible means. It is also an informal fallacy of reasoning.
It can be frustrating when you are working with individuals who will not change. Yet this is not a new phenomenon. ‘Ignaz Semmelweis discovered that the incidence of puerperal fever (also known as "childbed fever") could be drastically cut by the use of hand disinfection in obstetrical clinics. Despite various publications of results where hand washing reduced mortality to below 1%, Semmelweis's observations conflicted with the established scientific and medical opinions of the time and his ideas were rejected by the medical community. Some doctors were offended at the suggestion that they should wash their hands.’
So you are not alone. Your ideas may conflict with established opinions, but that does not mean they are incorrect.