
Exploring Heuristics and Biases in Decision Making

While shortcuts to success, achievement, and even decision-making may seem appealing, mental processes don’t always work in the decision-maker’s favor. Heuristics, the mental shortcuts used to overcome everyday cognitive and environmental constraints, operate by reducing a decision-maker’s cognitive load in a variety of scenarios. Although the modern fast-and-frugal and adaptive-toolbox approaches to heuristics focus on the accuracy and benefits of mental shortcuts, the negative impact of the predictable biases they occasionally produce can hardly be ignored. Used as proof of irrationality in the normative model, biases can be defined as the incorrect application of otherwise well-founded heuristics to decision-making scenarios. Because biases are derived from typically reliable decision-making strategies, the erroneous results they produce can create confusion and even embarrassment. Although critically thinking through every decision would be impractical, a general understanding of the heuristics that underlie biases may reduce some common decision-making errors, like the false consensus effect or the gambler’s fallacy.

The false consensus effect is a product of the bias of egocentrism, which, in turn, is a misapplication of the anchoring and adjustment heuristic. Anchoring and adjusting refers to the cognitive strategy wherein a decision maker focuses, or ‘anchors,’ on a given criterion or piece of information when evaluating choices, estimating subjective probability, or choosing an alternative. The decision maker then ‘adjusts’ his or her prediction or evaluation away from the anchor as new information is presented and considered. Although the anchoring and adjustment strategy has its merits when applied appropriately (i.e., when a predictive anchor is used), it suffers from numerous shortcomings in the form of primacy effects, conservatism, under-adjustment, and egocentrism. Used primarily when making judgements of others, egocentric thought relies on the process of projection: the human tendency to “‘anchor’ on our own attitude[s] or behavior[s]” and “adjust for aspects of the other person that are different from us” (Hastie and Dawes, p. 81). Although this process can lead to valid conclusions when the target of judgement is similar to the decision maker, people frequently fall prey to the false consensus effect by overestimating how common their own beliefs and opinions are. For instance, a few weeks ago my mom visited town and decided to take me and several of my friends out for dinner. At some point in the meal the discussion drifted to current events and, inevitably, politics. Knowing my position on the political spectrum, my mother made a partisan joke that, on the whole, was received with laughter, but caused two of my friends to remain awkwardly silent. In choosing to make a politically one-sided joke, my mother banked on the assumption that people naturally gravitate toward like-minded company; because she understands my political ideology, she assumed my friends would likely share the same sentiment.
Because the assumption was likely substantiated by base rates of my friends’ Democratic-leaning ideologies prior to college, I believe the mental shortcut was warranted. That said, in today’s hyper-polarized political environment, the joke could have had more serious implications for my friendships than a mere awkward silence.

The gambler’s fallacy, on the other hand, is a misconception of chance derived from the representativeness heuristic. This decision-making shortcut operates by determining to what degree a sample represents a population of interest; decision validity increases as the decision maker uses more predictive and accurate characteristics, like base rates and probabilities, to classify the sample in question. The gambler’s fallacy misapplies a decision maker’s basic understanding of chance by assuming that independent events occur in a balanced way, such that their underlying probability pattern is clearly represented even in a short run. For example, my cousins and I were taught to play blackjack at a young age to sharpen our math skills. My grandfather, ever the gambling pro, would often win numerous hands in a row, causing us to increase our bets under the assumption that we must be due a winning hand soon; however, because the deck was shuffled in its entirety after each hand, we were never truly any more or less likely to be dealt a winning hand. Since our premonition of an imminent win was based on nothing more than a belief in cosmic balance, our decisions to bet more were unfounded, especially considering, in hindsight, that our grandfather won far more often than we did. Although we only suffered the loss of a few quarters (functioning as our ‘chips’), an experienced gambler risking thousands would face much harsher consequences employing the same decision-making strategy.
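The independence at the heart of the gambler’s fallacy is easy to check empirically. The sketch below (a hypothetical helper, not something from the essay or from Hastie and Dawes) simulates a fair coin and measures how often heads follows a run of three consecutive heads. Under independence, the streak carries no information, so the rate stays near 50% no matter how long the streak is.

```python
import random

def next_flip_after_streak(n_flips=100_000, streak_len=3, seed=0):
    """Simulate fair coin flips and return the empirical probability of
    heads on flips that immediately follow `streak_len` heads in a row.

    Hypothetical illustration: if flips are independent, this should
    hover near 0.5, contrary to the gambler's-fallacy intuition that a
    streak makes the opposite outcome "due."
    """
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads

    # Collect every flip that comes right after a full streak of heads.
    followers = [
        flips[i]
        for i in range(streak_len, n_flips)
        if all(flips[i - streak_len:i])
    ]
    return sum(followers) / len(followers)

print(round(next_flip_after_streak(), 3))  # stays close to 0.5
```

Raising `streak_len` shrinks the sample of qualifying flips but does not push the estimate away from one half, which is exactly the regularity the fallacy violates.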

Though heuristics are undoubtedly useful in everyday decision-making, they are accompanied by a whole host of associated biases and consequences of varying severity. Ironically, the best way to avoid the pitfalls of shortcuts that strive to enhance decision-making performance under cognitive or environmental constraints is often to engage in more effortful thinking, the very strategy heuristics are designed to avoid. Since heuristics don’t come with ‘user beware’ signage, perhaps the best way to guard against embarrassing consequences and the “dominance of the given” is to attain a better understanding of how we arrive at predictably erroneous conclusions (Hastie and Dawes, p. 114).


Works Cited:

Hastie, R., & Dawes, R. M. (2010). Rational choice in an uncertain world: The psychology of judgement and decision making (2nd ed.). SAGE Publications, Inc.




© 2021 by Sydney B. Brashears Proudly created with Wix.com
