What You See Is All There Is

The world doesn’t hand us decisions like multiple-choice problems, though we may sometimes believe that’s the case. Daniel Kahneman, a Nobel Prize winner in economics, and his collaborator Amos Tversky tackled the blind side of decision-making in their pioneering work on prospect theory. Their 1979 paper, “Prospect Theory: An Analysis of Decision under Risk,” illustrated humanity’s collective divergence from neoclassical economic decision-making. The two conducted a series of psychological experiments that unpacked the heuristics and biases influencing how we perceive the costs and benefits of a given decision, and how our judgments are often incongruent with those decisions’ objective values.
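To make that incongruence concrete, here is a minimal Python sketch of prospect theory’s value function. The functional form and the median parameter estimates (alpha = beta = 0.88, lambda = 2.25) come from Kahneman and Tversky’s later cumulative formulation of the theory; the code itself is my illustration, not theirs.

```python
# A sketch of the prospect theory value function, using the median parameter
# estimates reported by Tversky and Kahneman in 1992: alpha = beta = 0.88,
# lambda = 2.25. Value is subjective, measured against a reference point.

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain (x >= 0) or loss (x < 0) of size x."""
    if x >= 0:
        return x ** alpha            # gains: concave, diminishing sensitivity
    return -lam * (-x) ** beta       # losses: steeper, reflecting loss aversion

# A $100 loss hurts roughly 2.25 times as much as a $100 gain pleases:
print(prospect_value(100))    # ~57.5
print(prospect_value(-100))   # ~-129.5
```

The asymmetry is the point: losses loom larger than equivalent gains, one concrete way our judgments diverge from a decision’s objective value.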

I was talking with a friend recently about some choices he’ll soon have to make in his career. We spent a good deal of time on the phone and went through several scenarios, and permutations thereof, on how things might play out. Though we swarmed his dilemma with aggressive and penetrating intellectual consideration, when I got off the phone I couldn’t shake the feeling that we had missed something important.

A few hours later it hit me. In his book Thinking, Fast and Slow, Kahneman describes the notion of “What You See Is All There Is.” We tend to make decisions independent of outside considerations unless we go through the taxing mental exercise of developing counterfactuals. We simply fail to account for the incredible complexity of our world when making decisions, and we tend to rely on a small and not necessarily representative sample of observations when formulating conclusions or courses of action. Our brains can deal with the known knowns but have a difficult time conceptualizing the known unknowns: phenomena that are relevant to the problem at hand but about which our mental reserves hold no information. And it goes without saying that we tend to immediately discount the unknown unknowns, the proverbial “black swans” that Nassim Taleb has brought into so many contemporary intellectual conversations.

I ended up sending my friend a well-thought-out counterfactual argument to basically everything I had described during our phone call. I had been relying too much on intuitive judgment at the expense of exploring outside possibilities, the known unknowns and the unknown unknowns. The result turned an otherwise myopic plan of action into one that was much broader and more thoroughly considered. We turned a yes/no question into a multiple-choice question, and in the process gained a far better sense of what was at stake.

So zoom out when something feels off, and remember that your brain believes what you see is all there is, even though the truth is anything but.

Everyday Negotiation

Swathes of literature discuss the nuances and intricacies of successful negotiation. There is plenty of excellent advice out there, but today I want to discuss one technique that comes out of the research on cognitive biases.

In his new book, Thinking, Fast and Slow, Daniel Kahneman, a Nobel Prize winner in economics, writes among other things about how cognitive biases and heuristics shape our behavior in even the most mundane of circumstances. In the excerpt below, he discusses the powerful role of anchoring effects in negotiation and how to fight their influence.

We see the same strategy at work in the negotiation over the price of a home, when the seller makes the first move by setting the list price. As in many other games, moving first is an advantage in single-issue negotiations—for example, when price is the only issue to be settled between a buyer and a seller. As you may have experienced when negotiating for the first time in a bazaar, the initial anchor has a powerful effect. My advice to students when I taught negotiations was that if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear—to yourself as well as to the other side—that you will not continue the negotiation with that number on the table.

Depending on the nature of our work or extracurricular pursuits, most of us don’t encounter such overt negotiation with much regularity. But if we think about the myriad interactions of day-to-day living, we are often looking to get something from someone in exchange for something we possess, or, conversely, someone is looking to get something from us.

If we approach conversations as negotiations (where appropriate, of course), this knowledge may give us pause should we find ourselves working toward an agreement that began from an unfavorable starting point. In such cases we may have the presence of mind to simply walk away. I can’t help but think how contrary this is to our nature: when confronted with a challenge, the natural instinct is to fight back rather than disengage, even when disengaging is our best course of action.

Behavioral science seems to bear this out: we shouldn’t always trust our gut.

Trial and Error

Economist Tim Harford has catapulted to the top of my “must-read” list. His new book, Adapt: Why Success Always Starts with Failure, espouses the virtues of the trial-and-error process in our increasingly complex world. Complex systems play an indescribably critical role in our daily lives, yet we often fail to perceive the ones that surround us. In this video, Harford takes a task that, at face value, seems reasonably doable (building a toaster from scratch) and uses it to illustrate how decentralized and impenetrable our market economies can be.

This got me thinking about all of the systems we help construct, become a part of, or benefit from. Even the writing of this blog post brings together innumerable systems of wide-ranging complexity, from the nascent thought processes that inspire these words to their delivery and publication across the internet. It behooves us to better understand the systems that surround us, so we may help improve them in an effort to make our world a better place.

Unimpossibility

I’ve of late been putting an inordinate amount of time into philosophical texts. My current literary victim is Nassim Nicholas Taleb and his book, The Black Swan. The Wall Street Journal noted that “he writes in a style that owes as much to Stephen Colbert as it does to Michel de Montaigne.” I would have to agree.

Taleb’s black swan, known in academia as a fat-tail event, is the outlier that reintroduces luck and serendipity into the social sciences. Taleb seeks to explain the high-impact, “impossible” events that defy our expectations of the material world and follow a power-law distribution. He also examines humanity’s psychological blindness toward such rare events, seeking to answer why we’re so susceptible to them.
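To get a rough feel for why these events blindside us, here is a small Python comparison of tail probabilities, my own illustration rather than anything from Taleb’s book; the Pareto tail index of 2 and minimum value of 1 are arbitrary choices made for the sake of the example.

```python
import math

# How likely is a 6-sigma event? Under a Gaussian, essentially never;
# under a power law, merely uncommon. (Illustrative parameters only.)

def gaussian_tail(x):
    """P(X > x) for a standard normal distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pareto_tail(x, alpha=2.0, x_min=1.0):
    """P(X > x) for a Pareto distribution with tail index alpha."""
    return (x_min / x) ** alpha if x > x_min else 1.0

print(f"Gaussian:  {gaussian_tail(6):.2e}")   # ~9.9e-10, about one in a billion
print(f"Power law: {pareto_tail(6):.2e}")     # ~2.8e-02, about one in thirty-six
```

The gap between those two numbers is the whole story: a mind (or a model) that assumes thin tails will dismiss as impossible what a fat-tailed world serves up routinely.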

Cognitive biases broadly, and the confirmation bias in particular, prove problematic as we try to understand the world we live in. By means of naïve empiricism, we bipedal thinking things have a tendency to look for instances that confirm the narratives, stories, and Platonic understandings of our world. The problem, of course, is that if you look for confirmation you can find it almost anywhere. Taleb argues that instead of deluding ourselves into thinking we’ve uncovered evidence of our correctness, we should scrape and claw and unearth those instances where our method or theory or course of action fails. It’s at that juncture that we actually learn something.

I’m going to try something different this time around and throw the question out to you, my readers. When have you found yourself susceptible to confirmation bias? Or when have you taken the empiricist’s path, avoiding confirmation bias and seeking to falsify your theory rather than confirm it?