October 2014
By Andreas Günter

Andreas Günter is a technical writer who has been part of the industry since 1998 and is particularly interested in the scientific principles of technical communication. He works as a reviewer for the tekom Dokupreis and as a qualification advisor.


andreasguenter09[at]googlemail.com

Mind (the) trap!

How do we think? What traps does our brain set for us? And what does this mean for our daily work? Neurobiology and cognitive psychology provide surprising insights into a system that does not function as rationally as we would like to think.

The popularity of neurobiology has also thrown the media spotlight on cognitive psychology. The science of thinking and cognition sheds light above all on how we run into mind traps. Errors in reasoning are regularly presented to an astonished public in newspaper articles and television documentaries, and in the business world, coaches are trying to change the way managers think. Technical writers, too, can learn from the errors their brains lead them into.

Several new books have appeared on this topic in recent years. Nobel Prize winner Daniel Kahneman has consolidated the scientific findings since the 1960s in his weighty volume “Thinking, Fast and Slow”. Financial mathematician Nassim Taleb shows in his bestseller “The Black Swan” how badly we evaluate probabilities, and psychologist Gerd Gigerenzer explains how to assess risks correctly and make sound decisions.

Success through prevention

It is well recognized that analyzing and avoiding mistakes is the most effective approach. Mistakes are easy to discern; it is much harder to say why something went well. The important point is that this is a problem of the quality, not the quantity, of thinking (“he did not think”). The three key mistakes described by cognitive research are:

  • Cognitive fallacies
  • Faulty heuristics
  • Wrong conclusions

We fall into a cognitive fallacy when we do not perceive a problem correctly; with faulty heuristics, we are on the wrong track and apply solutions that are too simple.

Competition between the systems

Sometimes our thinking seems to get confused even during simple tasks. Here is a classic warm-up task: a table tennis racket and ball cost 11 euros as a set. The racket costs 10 euros more than the ball. How much does the ball cost? Many people quickly answer 1 euro, although in that case the racket would cost 11 euros and the set 12 euros. Only after noticing this do we try 50 cents and reach the correct solution.
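
Written out, the calculation that system 2 has to perform is a single linear equation, with x as the price of the ball:

\[
x + (x + 10) = 11 \quad\Rightarrow\quad 2x = 1 \quad\Rightarrow\quad x = 0.50
\]

The ball costs 50 cents, the racket 10.50 euros, and the set adds up to the required 11 euros.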

Kahneman describes this hasty settling on the easy answer at length as the result of a competition between two thinking systems in our brain:

  • System 1 works automatically, is fast and cannot be controlled by will
  • System 2 requires attention and concentration, but also allows freedom of decision

System 1 could be called intuitive thinking, while system 2 represents conscious thinking. Note that both systems are metaphors and do not actually exist as separate structures in the brain. In reality, different regions work together, and Kahneman tries to describe their interaction. System 1 is very helpful in most situations. With some problems, however, it gets in the way when it would do better to leave the field to system 2.

Influence of associations

As with optical illusions, we often make involuntary mistakes in thinking – something that could be called cognitive illusions. Even after measuring the lines, one line in the Müller-Lyer illusion continues to appear longer than the other. Judgments work similarly: they are often made in spite of our knowing better.

Figure 1: The Müller-Lyer illusion – although we know that both lines are the same length, they appear to be different. Cognitive illusions behave in a similar manner: we know what the right decision would be, yet we decide differently.


Associations are among these cognitive illusions. For psychologists, however, the term covers more than its everyday meaning. One example of associative activation is disgust: when we see something disgusting, or even just imagine it, the corresponding emotions are reinforced by the imagination. Our thoughts affect the entire body; queasiness and a gag reflex set in. Less drastic associations and activated images rarely register consciously, although they influence us just the same.

Influence of context

Another phenomenon is called priming. Our thinking is channeled by context, which in turn shapes our actions and emotions. If you watch a film about a retirement home, you walk more slowly afterwards. If you attend a business seminar, you are less willing to share afterwards. Such studies have been conducted dozens of times and show how our thoughts are shaped unconsciously.

For all that, we are particularly proud of our judgment, although scientific observation of how judgments are formed shows that system 1 fires them off with a mental shotgun. As a result, we have an intuitive feeling and an opinion about almost everything.

Simplification is of no help

When we really do not know what to do next, we replace the complicated question with a simpler one whose answer we know. These heuristics are standard methods we apply continuously without being aware of it. Of course, we do not consciously decide to follow an easier schema; rather, the simple schema appears to fit perfectly and to deliver a good solution.

Many simplifications end in classic reasoning errors, for instance with statistical problems. Intuition does not really help here, only calculation does, yet we ignore the size of a sample and prefer to think in causal terms. That is why we believe the birth sequence BBGBGB (B = boy, G = girl) in a delivery ward is more probable than the sequence BBBGGG or GGGGGG, although all three have the same probability. Likewise, we believe a goalkeeper is a particularly talented penalty killer because he has stopped three penalties in a row. You may well admit that you have thought along similar lines in such cases, which means you stepped into a mind trap without realizing it.
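
A short calculation shows why intuition misleads us here. Assuming each birth is independent, with a probability of one half for either sex, every specific sequence of six births is equally likely:

\[
P(\mathrm{BBGBGB}) = P(\mathrm{BBBGGG}) = P(\mathrm{GGGGGG}) = \left(\tfrac{1}{2}\right)^{6} = \tfrac{1}{64}
\]

The mixed sequence merely looks more random, and therefore more representative, to system 1.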

Pitfalls of anchors

However, mind traps lurk in far more concealed places. The so-called anchor effect is very well documented: students who had to estimate the price of a bottle of wine after writing down the last two digits of their telephone number clearly oriented themselves towards those digits, although one obviously has nothing to do with the other.

The availability heuristic likewise leads us to misjudge things, simply because certain examples come to mind at that moment and are readily accessible. If you think of your share of the housework, several activities come to mind: taking out the garbage, cutting the hedge, vacuuming the carpet, washing the car. Does that not put your share well beyond 50 percent? You should ask the other people who share the home with you.

We are generally at odds with calculating probability: apparent representativeness seems far more plausible to us than logical probability. The story behind it, the context, has to feel right. This leads us directly to the mind traps of everyday professional life, where we supposedly arrive only at rational decisions.

Thinking errors in management

With the narrative fallacy, we give too much weight to explanatory stories that let us make sense of the world. Ability, skill, talent and hard work are stressed as the basis of success, while luck and coincidence are underestimated. Ignoring circumstances and coincidences leads us to identify a clear development where there is none.

With hindsight bias, decision makers assess the quality of a decision by its result alone. Newly revealed information is integrated, which means the memory itself is revised: those bearing responsibility systematically misremember their earlier predictions. Decision makers who orient themselves only by results therefore tend towards bureaucracy and risk aversion. From this arises an illusion of skill, which fills us with subjective conviction: because we can make sense of things in hindsight, we believe the future can be foretold. Experts in their own specializations are therefore often even worse than laymen at such assessments, and good algorithms and statistical analyses do much better. Expertise is possible only in a regular environment, such as chess or medicine, but not in politics, the stock market or the economy.

Mistakes while planning

Planning appears to be the cure-all against the unpredictable. Far from it: the planning fallacy leads us to misjudge project schedules time and again, because we use only one of two possible forecasting perspectives: the inside view. We consider our own circumstances and issues, extrapolate the time required and add a safety buffer on top. The unknown is necessarily left out, and the probability of failure goes unrecognized: the prospects of success are viewed too optimistically. The outside view would work better.

Look at other, similar projects, consider statistical data and use distributional information. Create a reference class forecast: other projects of this kind needed this much time, and there is no reason to assume we will be faster. Unfortunately, the inside view usually dominates the outside view, because it is literally closer to us and because we are optimists by nature. Everyone considers themselves superior (the above-average effect), whether at driving a car or managing a project. This is especially pronounced among experts, since social and economic pressures demand that they overestimate themselves: an expert must be confident; uncertainty is a weakness.
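
As a minimal sketch of what such a reference class forecast could look like in practice, here is a small Python example. The helper function and the project durations are invented for illustration; real figures would come from your own project records.

import statistics

def reference_class_forecast(past_durations_weeks, percentile=0.8):
    """Outside-view estimate: read a percentile off the distribution of
    actual durations of comparable past projects, instead of extrapolating
    our own plan (the inside view)."""
    ordered = sorted(past_durations_weeks)
    # Position of the requested percentile within the ordered sample.
    index = min(len(ordered) - 1, int(percentile * len(ordered)))
    return ordered[index]

# Hypothetical actual durations (in weeks) of ten comparable
# documentation projects.
past_projects = [12, 14, 15, 16, 18, 19, 22, 25, 30, 41]

print("Median of the reference class:", statistics.median(past_projects))
print("80th-percentile forecast:", reference_class_forecast(past_projects), "weeks")

The exact percentile matters less than the discipline behind it: the estimate is anchored in data about other projects rather than in our own optimism.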

Mental losses

Reasoning errors can often be demonstrated with simple examples. Ask people what winnings would convince them to take part in a bet decided by the toss of a coin, a 50 percent chance of losing their stake. Because of so-called loss aversion, we feel a loss more strongly than a gain, usually about twice as strongly, which is why we take part in such a game only when the potential winnings are about double the potential loss. We simply do not want to lose, and cannot bear to. Here too it is system 1 that limits us, by defining a neutral reference point: our status quo.
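
In the language of prospect theory, this can be written as a simple acceptance condition; the loss aversion coefficient λ is an empirical estimate of roughly 2, not an exact constant. A coin-toss bet with gain G and loss L feels acceptable only when

\[
\tfrac{1}{2}\,G - \tfrac{1}{2}\,\lambda L > 0 \quad\Rightarrow\quad G > \lambda L \approx 2L,
\]

which matches the observation that most people demand winnings of roughly twice the potential loss before they take the bet.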

The same holds true for the endowment effect. An example: you are a fan of a band and manage to snatch one of the last concert tickets for 100 euros. You would not part with the ticket even for 1,000 euros. Possessing the ticket is now your reference point, and selling it would be a mental loss. Our preferences fluctuate with the reference point, but we have a clear bias against change. Losses are in fact accompanied by feelings of pain and disgust. Admittedly, this phenomenon is also shaped culturally and is distinctly more pronounced in Europe than in the USA.

Sunk costs

We keep mental accounts, and different ones at that. If you bought your concert ticket while your friend received hers as a gift, and a snowstorm is raging on the day of the concert, who do you think will travel to the concert anyway? Your mental account as a buyer is deeper in the red than that of the gift recipient, although, considered rationally, both tickets are paid for and the money is gone either way: sunk costs.

This does not only affect private individuals. Managers, too, fail to deal correctly with sunk costs and pour more and more money into a project that is going badly. Instead of calling it a day, a manager tries to avoid a mental loss: the dilemma of many a manager. Company targets suddenly no longer coincide with an individual career that cannot absorb a failed project. It is therefore often helpful to rotate managers: a new manager starts mentally from zero, an effect that can be observed with every coaching change in the soccer league.

Influence of emotions

We weigh everything within an emotional framework. Emotions form the basis of all our decisions; we simply refuse to recognize this and sit on our high horse: the illusion of rational decision-making. Decision makers in particular worship figures, data and facts, following a nonsensical management cult. Although the mind traps described here have been proven as scientifically as any finding of natural science, they are ignored, naturally, because acknowledging them would affect our self-image. Moreover, it takes courage to recognize that one is influenced by emotions, which in turn sometimes depend on banalities such as hormone or blood sugar levels. Neuroeconomics has already achieved a great deal in this area and has done away with the rationally thinking “homo oeconomicus”. The brain regions responsible for conflict resolution and self-control become active only when we decide against our emotions. We can do it, but we rarely do, because we do not realize that our preferences depend on an emotional frame.

Some recommendations

The image of man that has prevailed from Plato through Descartes and Freud to the present sees reason at war with the emotions: humans act rationally unless their emotions hold them back from doing so. Unfortunately, this view is wrong. Emotion and reason work together. There are patients who can no longer feel any emotions, and at the same time they are incapable of making even the smallest decision. Emotions are part of the process of making good decisions.

There are indeed problems where the unknown deflects us from correct thinking. The art lies in knowing when this is the case and when it is better to trust intuition. A few tips help here: do not rely too heavily on your own assumptions, and stay flexible about your basic beliefs and convictions. It is also advisable to observe your own thoughts and feelings and notice how they influence each other. Play alternatives through to the end in your mind, and do not rely blindly on methods and strategies. This also means allowing for complexity: not everything can always be simple and clear. Wide framing, that is, broadening your horizon and forming holistic mental accounts, helps to avoid mind traps and to seize opportunities.

 

Literature for further reading

  • Dan Ariely: Predictably Irrational: The Hidden Forces That Shape Our Decisions. 2008
  • Rolf Dobelli: The Art of Thinking Clearly. 2011
  • Rolf Dobelli: The Art of Acting Clearly. 2012
  • Gerd Gigerenzer: Reckoning with Risk: Learning to Live with Uncertainty. 2002
  • Daniel Kahneman: Thinking, Fast and Slow. 2011
  • Gary Klein: Sources of Power: How People Make Decisions. 1999
  • Jonah Lehrer: How We Decide. 2009
  • David McRaney: You Are Not So Smart. 2012
  • Phil Rosenzweig: The Halo Effect. 2007
  • Nassim Nicholas Taleb: The Black Swan. 2007
  • Nassim Nicholas Taleb: Antifragile: Things That Gain from Disorder. 2012