“The first principle is that you must not fool yourself—and you are the easiest person to fool.” RICHARD FEYNMAN
Rule-breaking thinking works because the solution to persistently stubborn problems can usually be found outside our own universes of truth. Each of us dwells in the center of his or her own private universe of truth, and within are ideas, facts, and principles that we are absolutely certain of. That certainty is well founded. The truth of our universe is regularly validated by what we read, by our experiences, and by those we associate with.
Of course, we know our universes aren’t perfect. We know that we don’t understand many things, such as supersymmetry, Peruvian cooking, or how to appreciate a good cricket match. However, we are sure we have the facts for the important things.
We maintain great confidence that the ideas, facts, and principles of our universe are all true—in spite of strong evidence to the contrary. For example, my universe of truth is probably very different from yours. That should tip both of us off that neither of us has a complete picture. Instead, we both rationalize that the other simply hasn’t seen the light yet.
In addition, we know that our universes of truth have changed through the years. We have abandoned some ideas and adopted others. This should also make it clear that our current universes of truth are probably not perfect. However, we generally rationalize that we have outgrown our foolish thinking, and that said foolishness is now behind us.
Intellectually, we may be able to admit that all of our opinions, beliefs, and ideas about the world are not perfect. Yet most people cannot identify any of their core beliefs that might be wrong. Minor ideas may be missing a few facts, but when it comes to important thinking, we are certain that we have the answers. All of them. Only an idiot would suggest otherwise.
I would suggest otherwise. Our universes of truth are far from perfect, and because they’re imperfect, they hide many important ideas. When we attempt to solve problems, we naturally limit ourselves to solutions that conform to the rules of our universe of truth, cutting ourselves off from other universes of superior solutions.
Many problems, including most of the unsolved ones, can’t be solved within your current universe of truth. You probably gave up on many of them years ago because there seemed to be no solution. However, the real problem is in your head.
Trained to Obey the Rules
“Unthinking respect for authority is the greatest enemy of truth.” ALBERT EINSTEIN
Our talent for breaking rules atrophies because we are trained and socialized to obey a myriad of rules. Education, social norms, and standardization work together to make staying in our rule ruts habitual.
This training starts when we are children and are exposed to the opinions, standards, and norms of our parents and families. By elementary school, we share many of the rules of those around us. In high school we learn more rules, facts, and acceptable solutions, including career goals, religious views, political orientation, and leisure preferences.
Perhaps most damaging, at every level of education we learn that when we think like everyone else, tackle problems in the acceptable way, and follow authorities, life goes on rather smoothly. We also learn that when we think differently, try new solutions, or challenge the authorities, life can be unpleasant.
Einstein provides an excellent example of breaking rules because he was never a conformist. We read about the quiet professor, but the Einstein who gave us relativity had an attitude problem. He rarely attended classes, preferring to spend his time in the laboratory or the cafés, and eventually his professors withheld the recommendation that would have allowed him to secure the university position he wanted. It was a difficult education and Einstein suffered much for his independence. But Einstein managed to acquire the knowledge of his day without becoming its slavish acolyte. That was a tremendous advantage.
Once out of school, we continue to go with the crowd. Even organizations that boast of innovation discourage new thinking. If someone makes a “crazy” suggestion in a meeting, no one says, “Wow, that kind of original thinking may lead to a novel solution.” Instead, someone usually turns on the heretic. We have been taught to learn the rules, use the rules, and revere the rules.
Einstein did much of his best thinking when he was completely isolated from the rest of the scientific community. While he worked at the patent office, no one directed his physics research. There was no tenure committee to intimidate him and no department head to rein in his wild ideas. He didn’t attend conventions to learn what everyone else was thinking. Einstein was free to create great solutions. And he did.
Precedent has a powerful influence on our thinking. For example, the most modern, state-of-the-art train still runs on a standard gauge, or track width. The gauge became standard on American railroads because they were built by British engineers who had used the same gauge on their railroads. British railroads originally adopted the standard because the carriage tooling was available to make axles that size. All carriages used that dimension of axle to fit in the ruts of British roads. British roads started as Roman roads with ruts made by Roman chariots. The axles of Roman chariots were built to accommodate two Roman horses.
A modern transportation system cannot escape what was perfect for Roman horses, just as our thoughts are still shaped by generations of old thinking. We continue down millennia-old ruts without recognizing that the reason for the rule has disappeared.
We Become Experts
“To punish me for my contempt for authority, fate made me an authority myself.” ALBERT EINSTEIN
It is not surprising to discover that Einstein the great rule breaker was also Einstein the novice. Novices often conceive the breakthroughs that win Nobel prizes. They receive the awards and recognition once they are famous experts, but the ideas were conceived while they were novices.
Novices are the best rule breakers. It is easier to break a rule that one has just learned. Novices know the concepts, but can still ignore them. It is like learning the customs of another culture. An outsider can learn a new custom and follow it, but he can also violate it without anxiety because the rule is not ingrained. A native, on the other hand, would never consider a violation because the rule rut is too deep.
We all develop expertise in one field or another. As we do, our novice’s talent for breaking rules fades. Ideas become inviolable rules. We would no more break our rules than defy gravity.
PHYSIOLOGICAL ROOTS OF OUR UNIVERSES OF TRUTH
“Logic: the art of thinking and reasoning in strict accordance with the limitations of human misunderstanding.” AMBROSE BIERCE
Our irrational confidence in our own universe of truth isn’t just the result of training. There are also physiological reasons for our deep confidence in the indisputable validity of all of our ideas.
Every day we are exposed to a torrent of information. All around us family, friends, and colleagues are engaged in constantly changing activity. We receive numerous messages and feeds from various news sources, with more information than anyone can possibly sort through.
And so we don’t pay attention to most of it. Our brains filter out all but a fraction of what we perceive. The rest never even makes it into our universe of truth. It is as though it never happened.
We think we are clued in to all the important things going on around us, but really we are just observing what is vital and interesting to us. The other information isn’t considered because it is not to our advantage or doesn’t agree with our rules. What’s more, we are unaware that the distinction was ever made.
The widely viewed invisible gorilla experiment performed by Christopher Chabris and Daniel Simons is a classic example of only perceiving what we are interested in. In the experiment, six student actors, three dressed in white and three in black, pass two basketballs. Subjects are asked to count how many times the actors wearing white pass a basketball. It is a demanding task because the actors weave in and out while passing the balls. In the middle of the brief action, an actor wearing a gorilla suit strolls into the middle of the other actors, pounds its chest, and walks out of the room. The gorilla is visible for at least ten seconds, almost half the length of the video clip. Yet usually only about half of the subjects notice the gorilla.
The information that makes it through our filters is typically information that conforms to our rules for the universe. This is exactly what happened to Einstein later in his career. Although the evidence for the quantum nature of the universe was steadily mounting, his rule ruts filtered that evidence out. Instead, he focused on the unresolved aspects of quantum theory. As a result, he simply didn’t see how strongly the rules of his universe of truth conflicted with the observable universe.
It’s bad enough that we perceive so little of our world. But in addition, after we perceive the thin stream of information that makes it through our brain’s filters, we promptly misremember or forget most of it.
Our brains can remember a lot, just not as much as we think they can. Many individuals can perform amazing feats of recall. Those with hyperthymesia have very detailed autobiographical memories and can recall past experiences vividly. Others memorize books or numerous poems. You may even know a sports fan who can reel off years of very detailed sporting statistics.
Our memories are largely shaped by our rules. We remember the experiences that conform to the rules of our universe of truth and forget those that don’t. However, our brains are very good at fooling us into believing our recollections are accurate. When the brain doesn’t actually remember something that we are interested in, it will often make something up.
In fact, it’s quite easy to instill vivid memories in someone else or yourself by painting a descriptive image of the fake memory. It’s also possible to replace a weak memory with another stronger one that fits with your perceived rules. Thus, when we look back on the experiences of our life, we feel that everything we’ve experienced validates our rules and our views of the universe. Worse, our memories of the limited set of things we do remember are often far from accurate.
Test subjects have been asked to record their memories immediately after a significant event such as 9/11 or the Kennedy assassination. Years later, they are asked to record their memories again for the same significant event. Although most of the subjects believe what they remember is sharp and accurate, a comparison of each of the recorded memories shows otherwise. Memories degrade rapidly, both in accuracy and in detail.
When Einstein considered quantum mechanics, he was most influenced by his very strong memories of solving problems using classical ideas. As a result, all of his experience told him that quantum mechanics was the wrong direction to pursue.
Our perceptions are often significantly skewed by environmental influences that we don’t even notice. For example, subjects are more likely to rate something as important when they are holding a heavy clipboard while rating it than when holding a light clipboard. Attractive people are viewed as more authoritative and intellectually competent. Body temperature skews how we view the warmth of an interpersonal relationship.
As a result, our universes of truth are constantly influenced and modified in ways we are unaware of. And yet we convince ourselves that our thoughts and decisions are always perfectly rational and only based upon the relevant facts.
We love to be right and can’t imagine being wrong. Yet intelli- gent people can hold very different opinions from us and also be certain they are right. The History of the Peloponnesian War, written by Thucydides about the war between Athens and Sparta, shows this very clearly. Thucydides was a very thoughtful, intelligent man who saw no wrong in slaughtering every inhabitant of a city when it broke a treaty. I may disagree with him, but we both would still think we are right.
Our brains are addicted to being right. Our brains crave certainty because uncertainty makes us uncomfortable. When aspects of our environment threaten our certainty, the amygdalae in our brains become very active. The amygdalae are key to memory, decision-making, and emotional reactions, and use these functions to help us deal with threats. Our brains become highly motivated to resolve the uncertainty.
Sometimes this certainty bias drives the Einsteins and the Isaac Newtons of the world to uncover the universe’s mysteries when observations conflict with theory. However, often our bias for certainty impels our brains to stick with wrong conclusions to resolve the uncertainty.
Many times our brain simply declares that it is right and the universe is wrong, and that is that. You have often seen people deal with facts that conflict with their thinking in just this way. You and I have done the same thing, but probably didn’t notice.
Your brain may also search your current perceptions and your memories, looking for patterns that settle the uncertainties that torment it. And when it finds a pattern, it creates an answer. The answer doesn’t need to be a very good one. It just needs to make you feel that your universe is certain again. Regardless of how, your brain will find a way to be right while hanging on to its current rules.
Rejecting Contrary Evidence
Our brains reject evidence that contradicts our rules, even very compelling evidence. It is like having an immune system for foreign ideas—anything that doesn’t fit is expelled. This rejection is closely related to our certainty bias. When something contradicts our beliefs, it creates cognitive dissonance in our mind because we are forced to consider too many contradictory conclusions. Just like our dislike of uncertainty, we hate cognitive dissonance and try to resolve it as quickly as possible.
Our brains are very good at this. We simply ignore the contradictory evidence. If possible, we pretend it didn’t happen. If it is too persistent to ignore, we declare it an anomaly and unworthy of further consideration.
When we are forced to consider evidence that contradicts our own ideas and beliefs, we don’t do so objectively. Instead, our brains immediately search for a reason to reject the offending idea.
It doesn’t have to be a good reason. Frequently the first reason we think of is good enough. The most ready reason is often to attack the proponent. “He’s an idiot,” is usually enough to dispose of any idea we don’t like.
But our brains don’t stop there in stamping out uncomfortable evidence. After we have created a reason for rejecting the new evidence, our brain rewards us with a shot of dopamine. We actually feel good when we reject contrary evidence and resolve the cognitive dissonance.
Of course, there are good reasons for rejecting evidence and the ideas it inspires when they oppose our current beliefs. New ideas are costly to implement, something we will discuss more later. In the past, one could easily die before mastering a new hunting technique or perfecting the cultivation of a new crop. But while there are often serious consequences to new thinking, those consequences are less likely to be fatal today.
New ideas simply aren’t as personally dangerous as they used to be. However, the machinery for rejecting new evidence and the ideas it inspires is still active in our brains, just like all of our inherited or learned blocks to breaking our rules and escaping our narrow universes of truth. This is a terrible handicap because contradictory evidence and the cognitive dissonance it creates are early clues of a rule that needs to be broken.
The rules of our universe of truth are also kept in place by our aversion to loss. We hate to lose more than we love to win. For example, it is not uncommon for someone to take greater pains to avoid losing five dollars than to try to win twenty dollars, even if the probability of each happening is the same. Our brains associate a loss with pain that is far disproportionate to the actual consequences of the loss. As a result, when we evaluate opportunities, particularly opportunities that require a change, relatively small potential losses can eclipse the value of much larger opportunities.
It becomes harder to view novel solutions with optimism. After all, we may try something new and it won’t work. We will lose, and we hate that. Yet loss aversion also causes us to hang on to existing solutions that yield predictably poor results. We don’t feel like we are losing because we get what we expected, even if we didn’t expect much.
When we evaluate new ideas, our loss aversion bias discourages us from considering novel solutions by inflating their risks and diminishing their rewards. As a result, we often reject ideas with small potential losses and big potential gains.
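The lopsided arithmetic of loss aversion can be sketched in a few lines of Python. This is an illustrative model only: the 2.25 loss-weighting factor is an assumed value in the spirit of prospect-theory estimates, not a figure from this chapter.

```python
# Illustrative sketch of loss aversion: losses "feel" larger than
# equally sized gains. LOSS_WEIGHT = 2.25 is an assumed parameter.
LOSS_WEIGHT = 2.25

def felt_value(outcome):
    """Subjective value of a dollar outcome: losses are amplified."""
    return outcome if outcome >= 0 else LOSS_WEIGHT * outcome

def felt_expected_value(gamble):
    """Expected felt value of a list of (probability, outcome) pairs."""
    return sum(p * felt_value(x) for p, x in gamble)

# A coin flip: lose $10 or win $20. Objectively worth +$5 on average,
# yet the amplified sting of the loss makes it feel like a losing bet.
bet = [(0.5, -10), (0.5, 20)]
objective = sum(p * x for p, x in bet)   # +5.0
subjective = felt_expected_value(bet)    # 0.5 * (-22.5) + 0.5 * 20 = -1.25
```

A gamble with a clearly positive average payoff thus registers as a loss once the sting of losing is weighted in, which is exactly how small potential losses come to eclipse larger opportunities.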
Closely associated with our loss aversion is fear. We are particularly fearful of the unknown. We believe there’s no end to what can go wrong when we leave the safety of our universes of truth. And so we protect ourselves with our instinctive responses to fear: fight or flight.
Neither fight nor flight is a good response to new ideas, but they are often our first reactions. You see it all the time. A new idea is presented, and predictably it is attacked. Often the attackers don’t even pause to consider if the idea has any redeeming value. It is new and unknown, anything can go wrong, and so the idea must be attacked.
Others simply flee new ideas. You’ve seen this before too. “If that changes, I’m leaving.” Many simply fear being part of something new and opt out by fleeing. As a result, our fears often keep us safely cocooned in our universes of truth, too fearful to explore the myriad of new solutions within our grasp because they are as yet unknown and unexperienced.
Our pattern-seeking minds create rules out of random successes. Our brains are designed to give more attention to these random rewards than to other predictable ones. This is a useful trait for helping us identify opportunities such as new sources of food or a new marketing strategy. However, this skewing of what we pay attention to can also cause problems. We often become hooked on things such as gambling, romantic partners with dramatic mood swings, and solutions that only worked under a unique set of circumstances that happened once.
For example, the random successes of gambling can be incredibly addictive, even when one knows that on average he will lose money, simply because the brain pays more attention to the occasional random win than to the regular losses. As a result, gamblers continue to gamble, convinced that making regular donations at casinos is perfectly rational.
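A quick expected-value calculation shows why the “regular donations” quip holds. The numbers below are an assumption for illustration: a simplified single-number bet that pays $35 with probability 1/38 and otherwise loses the $1 stake, roughly the shape of an American roulette bet.

```python
# Expected value of a simplified single-number bet (assumed odds):
# win $35 with probability 1/38, lose the $1 stake otherwise.
# The occasional big win is what the brain remembers; the average
# outcome per round is a small, steady loss.
P_WIN = 1 / 38
ev_per_round = P_WIN * 35 + (1 - P_WIN) * (-1)   # about -$0.053 per $1 bet

# Over many rounds the small negative edge compounds into a steady drain.
expected_loss_1000_rounds = 1000 * ev_per_round   # about -$52.63
```

The win is thirty-five times the stake, so each one stands out vividly, while the thirty-seven quiet losses that fund it fade from memory.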
Random successes can create rule ruts that we cling to even when the rules fail us more often than they succeed. These rules can exert a powerful influence on our thinking, even when we know they are silly. A friend may win big at poker while wearing an old hat and then always wear the same hat for every poker game thereafter, although he subsequently has just average luck. The rules in his head overpower reason.
We have similar rules that we don’t recognize as superstitions. Something you did worked. Maybe being obnoxious or being compliant got you your way when you were three. You remembered the success and used it again and again thereafter, even if it didn’t always work well. It became your rule simply because it worked once.
Wanting to Belong
We all want to be part of the tribe. That’s where all the fun is. That’s where it’s safe. To make sure we stay members in good standing, our brains are very adept at bending our thinking to fit in. We quickly, naturally, effortlessly change our thinking to go along with the crowd.
There are good reasons for this behavior. In the past, the holders of nonconforming ideas were often in great danger. Even today, it is much easier to avoid conflict by getting along.
However, our desire to conform blinds us to the ideas and solutions that everyone else is also blind to. In addition, it often dissuades us from suggesting or trying new things when our group isn’t interested in them. Since your rule ruts have many similarities to the rules of those you associate with, going along applies extra pressure to stick to your rules.
We must recognize that our current mindset could be greatly improved. Unfortunately, the wiring of our own brain conspires against us to keep us from seeing beyond our own universe of truth. We don’t break the rules because we are programmed to keep them.
We need to perceive what we have been ignoring, recognize that we don’t have all the answers, overcome our fears, and go exploring in the vast universes outside our own truth and in risk territory. Only then can we discover the mind-blowing ideas that exist just beyond the limits of our own brain. Like Einstein did.