
I am particularly struck by the subtitle of your blog, Bill (“Better to be right than happy?”), and by how difficult it is to alter our beliefs, even when they are limiting us. In my work over the years, it has become clear that people cling tenaciously to belief systems and mental models that do not serve them (or those around them) well, even when those beliefs have little or no basis in fact.

In a recent New York Times article (July 2, 2015) based on the work of Barry Nalebuff, a Yale professor and expert in game theory, David Leonhardt presented a simple “puzzle” that offers an elegant opportunity to experience how easily, yet erroneously, we develop and then strongly hold on to beliefs without ever “testing” them for accuracy. In fact, people tend to actively seek additional evidence that supports their beliefs, rather than challenge them by seeking information or evidence that might disconfirm them.

Here is the puzzle as presented in the New York Times article:

“We’ve chosen a rule that some sequences of three numbers obey — and some do not. Your job is to guess what the rule is. We’ll start by telling you that the sequence 2, 4, 8 obeys the rule.” The reader is then asked to submit a number sequence and told they will be informed “whether it satisfies the rule or not,” as well as that they “can test as many sequences as (they) want.” Finally, the reader is told, “When you think you know the rule, describe it … and then submit your answer. Make sure you’re right; you won’t get a second chance.”

Based on the example provided, most people test one of several theories (e.g., that each number is double the previous one – 1, 2, 4; 5, 10, 20 – or that each number is the square of the previous one – 3, 9, 81). When told that their number sequence satisfies the rule, they conclude that they understand the rule. That is, once they believe they are correct, they make no further effort to disconfirm their theory or belief!

In this particular puzzle, the actual rule is simply that “Each number must be larger than the one before it.” Thus, the actual number of sequences that satisfy the rule is literally infinite (1, 2, 3; 10, 30, 35; 4, 17, 231; etc.). Yet, most people quickly conclude that their belief (or rule) is correct, exclude other possibilities and behave accordingly.
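To make the dynamic concrete, here is a minimal Python sketch of the puzzle (my illustration, not from the article). The hidden rule is the one Leonhardt describes; the “confirming” doubling sequences are hypothetical examples of the tests people typically submit. Because every confirming test passes, the tester never learns that the doubling theory is far too narrow.

```python
# The hidden rule from the puzzle: each number must be larger than the one before it.
def satisfies_hidden_rule(seq):
    return all(a < b for a, b in zip(seq, seq[1:]))

# A solver who believes the rule is "each number doubles the previous one"
# and tests only sequences that fit that belief:
confirming_tests = [(1, 2, 4), (5, 10, 20), (3, 6, 12)]
print([satisfies_hidden_rule(s) for s in confirming_tests])  # [True, True, True]

# Every test "confirms" the doubling theory, yet the theory is wrong.
# Only a sequence that violates the doubling belief can expose that:
disconfirming_test = (1, 2, 3)   # not doubling, but still increasing
print(satisfies_hidden_rule(disconfirming_test))  # True -- the belief was too narrow
```

The design point mirrors the article’s lesson: the informative move is to submit a sequence your theory says should fail, because only a “no” (or an unexpected “yes”) can revise the belief.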

This tendency is known as “confirmation bias,” and it sets us up to maintain beliefs that are often self-limiting at best and outright detrimental to ourselves and others at worst. It also interferes with our ability (or willingness) to effectively challenge and alter our beliefs and behaviors in ways that could free us from self-imposed (and often inaccurate) assumptions. In this puzzle, for instance, people select and adhere to one rule they believe to be accurate, which closes off an infinite number of alternatives that may be more efficient or lead to better outcomes.

One conclusion of the article’s author is: “When you want to test a theory, don’t just look for examples that prove it. When you’re considering a plan, think in detail about how it might go wrong.” Whether the issue is loneliness, anxiety, relationships, setting and achieving our goals, or leading an organization, it is easier to cling to long-held and over-practiced beliefs and behaviors, even when they are limiting, detrimental, and inaccurate, than to challenge them. Yet only challenging them allows for self-growth and movement toward our greater potential.

Leaders who fail to “test” their belief systems and mental models (about themselves, their employees, their theories about effective leadership) do so at their own (and their organization’s) peril! At CEO Effectiveness, we recognize this vitally important human challenge and seek to facilitate personal and professional growth and achievement through self-knowledge and understanding self-imposed limitations.
