THE HUMAN BRAIN is a fascinating organ, but it's an absolute mess. Because it has evolved over millions of years, there are all sorts of processes jumbled together rather than logically organized. Some of the processes are optimized for only certain kinds of situations, while others don't work as well as they could. There's some duplication of effort, and even some conflicting brain processes.
Assessing and reacting to risk is one of the most important things a living creature has to deal with, and there's a very primitive part of the brain that has that job. It's the amygdala, and it sits right above the brainstem, in what's called the medial temporal lobe. The amygdala is responsible for processing base emotions that come from sensory inputs, like anger, avoidance, defensiveness and fear. It's an old part of the brain, and seems to have originated in early fishes.
...
We humans have a completely different [and additional - db] pathway to cope with analyzing risk. It's the neocortex, a more advanced part of the brain that developed very recently, evolutionarily speaking, and only appears in mammals. It's intelligent and analytic. It can reason. It can make more nuanced trade-offs. It's also much slower.
So here's the first fundamental problem: We have two systems for reacting to risk -- a primitive intuitive system and a more advanced analytic system -- and they're operating in parallel. It's hard for the neocortex to contradict the amygdala.
...
And it's not just risks. People are not computers. We don't evaluate security trade-offs mathematically, by examining the relative probabilities of different events. Instead, we have shortcuts, rules of thumb, stereotypes and biases -- generally known as "heuristics." These heuristics affect how we think about risks, how we evaluate the probability of future events, how we consider costs, and how we make trade-offs. We have ways of generating close-to-optimal answers quickly with limited cognitive capabilities. Don Norman's wonderful essay, Being Analog, provides a great background for all this.
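To make concrete what evaluating a security trade-off "mathematically" would look like, here is a minimal sketch comparing two options by expected loss. The probabilities and dollar figures are invented purely for illustration; they are not drawn from the essay.

```python
# Hypothetical example: comparing two security options by expected loss.
# All numbers below are made up for illustration only.

def expected_loss(p_incident: float, cost_if_incident: float, upfront_cost: float) -> float:
    """Expected total cost = upfront cost + probability-weighted loss."""
    return upfront_cost + p_incident * cost_if_incident

# Option A: buy an alarm system (lower chance of burglary, but an upfront cost).
option_a = expected_loss(p_incident=0.01, cost_if_incident=10_000, upfront_cost=300)

# Option B: do nothing (higher chance of burglary, no upfront cost).
option_b = expected_loss(p_incident=0.05, cost_if_incident=10_000, upfront_cost=0)

print(f"Alarm system: expected cost ${option_a:.0f}")  # $400
print(f"Do nothing:   expected cost ${option_b:.0f}")  # $500
```

A person relying on heuristics rarely does this arithmetic; the gap between that calculation and the gut reaction is the point the rest of the passage develops.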
Daniel Kahneman, who won a Nobel Prize in Economics for some of this work, talks (.pdf) about humans having two separate cognitive systems, one that intuits and one that reasons:
The operations of System 1 are typically fast, automatic, effortless, associative, implicit (not available to introspection) and often emotionally charged; they are also governed by habit and therefore difficult to control or modify. The operations of System 2 are slower, serial, effortful, more likely to be consciously monitored and deliberately controlled; they are also relatively flexible and potentially rule governed.
When you examine the brain heuristics about risk, security and trade-offs, you can find evolutionary reasons for why they exist. And most of them are still very useful. The problem is that they can fail us, especially in the context of a modern society. Our social and technological evolution has vastly outpaced our evolution as a species, and our brains are stuck with heuristics that are better suited to living in primitive and small family groups.