Incompleteness and the Imagination


June 2015
Ivan Obolensky

Humans perceive the world through a small window of senses, and compared to what is available to be observed, it is a very small window indeed.

Our vision, for example, is restricted to a microscopic portion of the broad spectrum of electromagnetic radiation, which extends up through the high frequencies of gamma rays and down through the low frequencies of radio waves. Our hearing allows us to perceive the pressure changes that create sound, but low-frequency vibrations such as those used by elephants and high-frequency chirps emitted by bats fall outside our ability to hear. We do have an infrared sensor, in the form of skin that is sensitive to heat, and we can certainly taste molecules and smell some of them when they become airborne, but far less acutely than other animals such as dogs. In addition, we are aware of changes in location and direction.

If we were to take the amount of raw information around us that is available to be taken in at any given time and compare that to what actually gets through our senses to our brains, the percentage would be very small. Of course, we can enhance our perceptions with devices such as infrared sensors or thermal monitors, but regardless, the outputs of these devices still have to be adjusted to conform to what our senses can take in. We operate on incomplete information, and we always will. This is not a mental fault, but a systemic one that all humans have in common.

To compensate for this always-incomplete view of our surroundings, our brain has a built-in mechanism to supplement the data we receive with information that is manufactured internally. (The punctum caecum is an example. It is the name for the blind spot caused by the lack of photoreceptors where the optic nerve passes through the retina. See Memories.) The result is a worldview that is continuous and seamless, but not necessarily a true and accurate representation of that world. So much of our perception is augmented that it is no wonder humans have excellent imaginations. Our ability to imagine is an extraordinary compensation for our sensory blindness. We are experts at envisioning consequences, possibilities, and things that don’t exist.

The facilities and mechanisms the brain uses to generate this supplemental data are not fully understood.

There are several models of how the mind functions. One of my favorites is that of a CEO running an organization made up of a multitude of voting members, over whom he has little control other than to veto what the consensus recommends. The members operate at a level below the awareness of the CEO.

The CEO could be likened to the active part of the mind, the part we think of as ourselves, while the voting members make up the automatic functions.

Both the active part and the automatic part have positive and negative attributes, which derive indirectly from the mechanism that compensates for our lack of complete sensory information. The active part has its ability to imagine, but just like a typical CEO, it can get lazy, or overloaded, and rely on the automatic parts to handle the day-to-day drudgery, such as commuting, while the active part thinks of other things. The automatic part has several positive qualities. It allows the CEO to concentrate on the big picture while it supplies a seamless view of the world. On the negative side of the ledger, it too operates on incomplete data, and if the CEO is busy with other things, it can jump to conclusions that the CEO accepts. These we know as cognitive biases.1

There are other models.

According to Daniel Kahneman, who along with Amos Tversky introduced the idea of cognitive bias in 1974, and who received the 2002 Nobel Prize in economics, the mind can be modelled as having two basic parts, called System 1 and System 2. System 1 is intuitive, emotional, and automatic. System 2 is slower, deliberate, and logical.

If we decide to learn a language, or fly a plane, the slower part, System 2, is initially used. We think about what we are doing and how to do it. Gradually the learned behavior becomes automated, and the intuitive side takes over. System 1 performs tasks, such as speaking fluently, without our needing to think about them. System 1 is quick.

According to Kahneman, System 2 can also get complacent and fail to inspect the System 1 information. We then become prey to cognitive biases.

Cognitive biases are tendencies to think automatically in certain ways that lead to errors in judgement.2

We look at the world within the context of what we know and what we have experienced.

This is called a framing bias. An approaching rainstorm is viewed very differently by a farmer with a dry field and by the farmer’s wife, who is organizing an outdoor cake sale.

Then there is the survivorship bias: we tend to think that those who have survived some process or event are in some way better than those who didn’t.

One always hears how success is predicated on hard work, persistence, and having a written plan, but how many hard-working, persistent people who wrote out a plan failed to become successful? That figure is never printed.

We look at what is, rarely at what isn’t.
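To make that missing figure concrete, here is a small simulation of the survivorship effect. It is purely illustrative: the population size, the share of hard workers, and the success probabilities below are invented assumptions, not numbers from the text.

```python
import random

random.seed(0)

N = 100_000
successes = hard_and_successful = hard_and_failed = 0

for _ in range(N):
    works_hard = random.random() < 0.5          # assumption: half the population works hard
    p_success = 0.02 if works_hard else 0.01    # assumption: hard work doubles a still-small chance
    succeeded = random.random() < p_success

    successes += succeeded
    hard_and_successful += works_hard and succeeded
    hard_and_failed += works_hard and not succeeded

print(f"Share of successes who worked hard: {hard_and_successful / successes:.0%}")
print(f"Hard workers who failed anyway:     {hard_and_failed:,}")
```

With these made-up numbers, roughly two-thirds of the successes worked hard, so hard work looks like the recipe; meanwhile tens of thousands of equally hard workers failed, and only the first figure ever gets printed.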

If an outcome is framed as negative (being eaten by a shark), we will not go near the water, even though the probability of such an event is remote.

We also expect small-probability events to happen: we play the Lotto expecting to increase our chances if we choose the numbers ourselves.

On the other hand, when we are presented with a 70% probability of gaining a 40% return if we invest $10,000, we would rather not, because there is a 30% chance of a negative outcome. In that sense we are risk-averse. We tend to favor no-risk options.

This tendency can be turned around if we frame it differently. If we reframe the shark scenario by saying that 99.999999% of people who swim in the ocean survive, or point out that you could have $14,000 instead of $10,000 if you follow this investment plan, we will tend to go along.
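The investment numbers can be made concrete with a quick expected-value calculation. The sketch below uses the stated 70% chance of a 40% gain on $10,000; the size of the possible loss is not given in the text, so it is treated as a hypothetical parameter and varied across a few scenarios.

```python
def expected_value(stake: float, p_gain: float, gain_rate: float, loss_rate: float) -> float:
    """Expected ending value of a one-shot gamble: gain with probability p_gain, lose otherwise."""
    win = stake * (1 + gain_rate)        # the $14,000 outcome mentioned in the text
    lose = stake * (1 - loss_rate)       # downside size is an assumption, not from the text
    return p_gain * win + (1 - p_gain) * lose

stake = 10_000
for loss_rate in (0.10, 0.40, 1.00):     # hypothetical downside scenarios
    ev = expected_value(stake, p_gain=0.70, gain_rate=0.40, loss_rate=loss_rate)
    print(f"assumed loss of {loss_rate:.0%}: expected value ${ev:,.0f}")
```

Even if the 30% downside meant losing the entire stake, the expected value would be $9,800, barely below the $10,000 kept in hand; with any milder loss the gamble is favorable in expectation. Yet it is usually the framing of the 30%, not the arithmetic, that decides whether we take it.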

Framing events to take advantage of our biases is how advertising works.

Suppose we know that we have a bias that makes us believe we can finish a job more rapidly than we actually can (the optimism bias). We get clever and decide to always add an extra amount of time to compensate. This is called a heuristic. It is a practical shortcut: knowing there is a tendency to underestimate the time it takes to complete a job, we add a fudge factor on important job schedules. But we are not always on guard. Do we remember to do that when we estimate the amount of time it will take to mow the lawn? Chances are we will forget to make that allowance. If we were to tell our spouse that we’ll be done in half an hour, she will look at us and shake her head. She knows it will be at least an hour.3
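As a minimal sketch of that fudge-factor heuristic (an illustration, not anything prescribed in the text): the factor of two below is inferred from the lawn example, where a quoted half hour really means at least an hour, and the task list is hypothetical.

```python
FUDGE_FACTOR = 2.0  # inferred from the lawn example: a quoted half hour really means an hour

def padded_estimate(raw_minutes: float, fudge: float = FUDGE_FACTOR) -> float:
    """Counter the optimism bias by inflating a raw time estimate before quoting it."""
    return raw_minutes * fudge

# Hypothetical tasks and first guesses, for illustration only.
for task, raw_guess in [("mow the lawn", 30), ("paint the fence", 90)]:
    print(f"{task}: first guess {raw_guess} min, quoted estimate {padded_estimate(raw_guess):.0f} min")
```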

Cognitive biases lead to faulty thinking and less-than-optimum decision-making. Additionally, they may be impossible to eradicate completely. Familiarity is no guarantee, and, perversely, cognitive biases are much easier to detect in others than in ourselves.

We all aspire to think logically and rationally. If we hold positions of responsibility, it is even more important: errors in judgement can affect far more people than just ourselves.

The Greeks relied on logic to pursue their goal of living well and to divine truths about the world around them. Mathematics embraced logic for a similar reason: to form valid conclusions.

Starting at the turn of the 20th century, there was a concerted effort to prove that mathematics as a subject was consistent (it contained no logical contradictions) and complete (it encompassed all available truths). If mathematics could be proved to be true, then physics, which uses mathematics, could also be proved to be true, thus ensuring that we had a correct picture of the universe in which we operated.4

Kurt Gödel was an Austrian-born mathematician (born in Brno, then part of Austria-Hungary) who eventually immigrated to the US and was a friend of Albert Einstein. In 1931 he published a paper that established the limitations of all but the most trivial axiomatic systems. An axiomatic system is one that starts from a set of premises that are taken to be true and builds on these, using logical operations, to derive other truths.

In a mathematical tour de force he proved that even in an axiomatic system as simple as arithmetic, either there exist arithmetic truths that are unprovable within the system (the system is incomplete) or there are arithmetic falsehoods that can be proved to be true (the system is inconsistent).5
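For readers who want the claim in symbols, here is a standard modern paraphrase of the first incompleteness theorem (following textbook presentations such as Gensler’s rather than Gödel’s original 1931 wording). F stands for any consistent, effectively axiomatized system containing basic arithmetic.

```latex
% First incompleteness theorem, standard modern paraphrase (requires amssymb for \nvdash and corner quotes).
\[
  \text{If } F \text{ is consistent, then } F \nvdash G_F ,
\]
% where the Goedel sentence G_F is constructed so that it asserts its own unprovability:
\[
  G_F \;\leftrightarrow\; \neg\,\mathrm{Prov}_F\!\bigl(\ulcorner G_F \urcorner\bigr)
\]
% Since F cannot prove G_F, and G_F says exactly that, G_F is (on the standard reading)
% a true arithmetic statement the system cannot prove: the "incomplete" horn above.
```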

The implications of Gödel’s proof had a profound effect on mathematics going forward, in that it showed that establishing an all-encompassing truth using mathematics was impossible.

Although this proof was specifically about axiomatic systems, many processes likewise start with a series of inputs and follow a sequential series of steps that are extraordinarily involved. Complex systems, whether a computer program or the mind, are subject to incompleteness and inconsistencies and are thus vulnerable to errors.

Kahneman, in a 2011 article, pointed out the challenges of avoiding biases as an individual, but argued that there is hope for better decision-making if we utilize our collective ability to spot biases in others. He advocated a team approach to decision-making.

By spotting the biases in others’ thinking and then hammering out a consensus, better decisions can be generated. Projects might still fail, but not because of irrationally optimistic expectations, nor will they go unstarted because of a tendency to play it safe and maintain the status quo.6

A single decision-maker is prone to act irrationally, not because he or she is mentally deficient, but because of the systemic errors that are inherent in all of us.

Dictators are particularly vulnerable in that no one who values his or her life is willing to point out the flaws in their thinking. Usually they are surrounded by those who simply agree.

Leaders can certainly inspire and lead a group, but wise leaders utilize a diverse group of advisers who are not afraid to speak their minds.

Handling our systemic tendencies toward irrational behavior as individuals starts with understanding and accepting that these tendencies will always exist, no matter how superbly competent we may be. We need to surround ourselves with those who can point out our biases and help us achieve sound planning that will ultimately lead to a higher quality of life, because that is what we all, as individuals, desire. It is doubtful we can do it alone.

 


 

  1. Kahneman, D. (2011). Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.
  2. Eagleman, D. M. (2011). Incognito: The Secret Lives of the Brain. New York, NY: Pantheon Books.
  3. Bevelin, P. (2007). Seeking Wisdom: From Darwin to Munger. Malmö, Sweden: Post Scriptum AB.
  4. Ferrell, E. (1994). Mathopedia. Greensboro, NC: OMNI Books.
  5. Gensler, H. J. (1984). Gödel’s Theorem Simplified. Lanham, MD: University Press of America.
  6. Kahneman, D., Lovallo, D., & Sibony, O. (2011). The Big Idea: Before You Make That Big Decision. Harvard Business Review. Retrieved June 2, 2015, from https://hbr.org/2011/06/the-big-idea-before-you-make-that-big-decision.

 


 


© 2015 Ivan Obolensky. All rights reserved. No part of this publication can be reproduced without the written permission from the author.

 
