Some years ago, I had a conversation with a work colleague who told me we don’t place sufficient trust in our instincts when it comes to assessing threats. I’m not so sure. I think we may trust them more than we should.
As Steven Pinker suggests, “the mind is a system of organs of computation, designed by natural selection to solve the kinds of problems our ancestors faced in their foraging way of life, in particular, understanding and outmaneuvering objects, animals, plants, and other people.” We do a certain amount of “understanding and outmaneuvering objects, animals, plants, and other people” in our contemporary daily lives -- while driving, for instance. Even in those situations, though, an instinctive response is not necessarily the right response. We can kill ourselves by braking at the wrong time, out of instinct. We can create a disaster by speeding after a driver who has enraged us, chasing down this adversary as our ancestors might have chased a raider from another tribe.
So while my conversation partner was right in observing that we have evolved a potent mechanism for perceiving and responding to danger, it’s probably best suited to a kind of environment that humans knew in the distant past. In a modern, urbanized setting it frequently sends out false alarms.
It’s bad enough when instinct misleads us in everyday life; it’s even worse when it distorts our view of social or political problems. For example, our perception of crime differs from the reality: over the past quarter century, crime has dropped steadily. Yet most Americans believe crime levels are going up, not down. Newt Gingrich, on CNN recently, said that while the data "theoretically may be right...it’s not where human beings are.” Indeed, when perception differs from fact, most people will go with perception. Multiple studies, moreover, have found that presenting people with contradictory evidence rarely changes their opinions. In one such study, researchers at the University of Michigan found that "when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs."
Another example: we dread terrorism, but overlook other threats that are equally capable of killing us. We've evolved to be somewhat indifferent to risks arising from nature, while being extremely sensitive to risks posed by an identifiable human enemy -- militant jihadists, for instance. Because emotion is thought to be a major driver of risk perception, it's no surprise that we have little to no tolerance for the risk posed by Al-Qaeda and ISIS, even though, on average, we’re more likely to be crushed by furniture than killed by a terrorist. The 9/11 attacks were a national trauma that most Americans of a certain age experienced, either directly or indirectly (that is, via the media), and that we associate with powerful emotions of horror, grief and fear.
As a result, we make calculation errors that inadvertently sustain terrorism, by demonstrating its effectiveness in playing on the human psyche. Al-Qaeda goaded the United States into a costly and inconclusive war. ISIS and its adherents could, if we let them, sway an election (Hillary Clinton's declining poll numbers in June and July correlate with attacks in Orlando and Nice, though other factors were also in play). Terrorism is a method of influencing policy without the benefit of a large military or geopolitical clout: it works by exploiting known idiosyncrasies in the way we perceive risk.
We're stuck, it seems, with a flawed toolkit, and we can't simply exchange it for another; it's not easily removable. Almost everyone will, in certain circumstances, go on gut impulse even when presented with contradictory evidence. I doubt that many of us are able to be consistently rational in all situations, and that probably wouldn't be desirable anyway: for one thing, it would make movies and theme parks a lot less fun. We can and should, however, check our intuitive responses against the available data and learn to recognize false alarms.
In more situations than we’d probably care to admit, “trust your instincts” is really bad advice.