Previously I've discussed biases that affect problem-solving and decision-making, such as the availability bias, the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind. Another is the representativeness bias, the assumption that an event is more likely because it conforms to a known category, or that a given situation is typical and will be consistent with expectations derived from past experience. [1]
I've also written about the "ladder of inference," a concept developed by Chris Argyris that illustrates the (sometimes quite large) gulf between our perceptions and reality. [2] The foundation of the ladder--its first rung, so to speak--is the recognition that in seeking to understand the world around us we focus on certain data and ignore other data. And it's essential to bear in mind that this selection process is not objective but is itself riddled with errors and biases that have a tremendous influence on our perception of the world and our experience in it. [3]
Much of our understanding of cognitive biases comes from the work of psychologist Daniel Kahneman, who won the Nobel Prize in economics for explaining apparent paradoxes in our decision-making. As Kahneman notes, when information is scarce, we rapidly craft an explanatory narrative on the basis of existing information and jump to a conclusion. In a sense, our brains are making a bet that it's better to act quickly on the basis of limited information than to wait while we fill in the gaps. [4]
This process allows us to immediately interpret a situation and "make sense" of it in a way that will guide future action, and most of the time it works well; we're generally successful at navigating the world of experiences and interactions. But as effective as they are, the mental systems that evolved to do this work are as subject to errors and biases as any other aspect of our thought processes.
One of the most significant cognitive errors is what Kahneman calls What You See Is All There Is, or WYSIATI, which is shorthand for the fact that we find it very difficult to envision missing data. Our brains, Kahneman says, are "radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions." [5] Even when faced with massive gaps in information, we tend to focus on the information at our disposal and rely on it to construct a narrative, however flimsy it may be. A corollary to Kahneman's WYSIATI might be called WYDSDE: What You Don't See Doesn't Exist. As a result, we're typically overconfident in the validity and coherence of our explanatory narrative. Kahneman writes,
The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing--what we see is all there is. Furthermore, [our brain] suppresses doubt and ambiguity. [6]
Bear in mind that our cognitive abilities are often diminished when we're under emotional stress, such as when we're resolving a conflict or in the midst of a tough conversation. At the very moments when we most need to make accurate judgments about a situation and about others' intentions, we're most likely to be confidently wrong.
So what can we do? First, merely being aware of our tendency to ignore missing data and rely instead on readily available information to construct explanatory narratives can be helpful. We can't turn off this mental mechanism (nor would we want to--it usually serves us well), but we can look for the errors it tends to generate. Applying the framework of the ladder of inference, we can drop down a few rungs and go from taking action to assessing the data that serves as the basis for our actions. We can ask ourselves: What don't I know here? What assumptions am I making to fill in the gaps? How might I challenge or test those assumptions?
The ability to ask these questions in the first place rests on a foundation of self-awareness and the practice of self-inquiry. We need to be able to step outside our immediate experience and understand our habitual responses and tendencies. We need to understand and explore the mental models that we're bringing to the situation and determine whether they apply here (or are still accurate at all). One way to do this is through journaling--even 30 seconds can be useful when done consistently.
Finally, in order to do any of this work we need to cultivate the ability to slow down, both in the moment when we're under stress and on a regular basis to make time for reflection. I recommend meditation to my clients and students, not as a source of stress relief but as a workout for attention management. I also recommend exercise, both for its impact on our mental effectiveness and as a reflective experience. And, of course, good sleep habits are essential. Note that pursuing these activities will require us to set boundaries and say no in order to invest time in our own growth and development--if we don't make it a priority, it will never happen.
Footnotes
[2] Racing Up the Ladder of Inference
[3] Cognitive bias cheat sheet (Buster Benson, 2016)
[4] Thinking, Fast and Slow, page 85 (Daniel Kahneman, 2011)
[5] Thinking, Fast and Slow, page 86
[6] Thinking, Fast and Slow, pages 87-88
Photo by David Kingham.