If this is your first time reading, please check out the overview for Poor Ash’s Almanack, a free, vertically-integrated resource including a latticework of mental models, reviews/notes/analysis on books, guided learning journeys, and more.
Contrast Bias Mental Model: Executive Summary
If you only have three minutes, this introductory section will get you up to speed on the contrast bias mental model.
The concept in one quote: Anomaly appears only against the background provided by the paradigm. - Thomas Kuhn
The concept in one sentence: things that don’t change tend not to be very interesting, because they’re likely the same as what they used to be and thus contain no new information; humans are thus very attuned to changes above a certain threshold (and fail to notice very slow/modest ones, like a sneaky predator creeping through the grass).
Key takeaways/applications: understanding how contrast bias shapes our perceptions and beliefs allows us to make more accurate judgments and counteract its influence.
Three brief examples of contrast bias:
Hit the town with your loser friends, not your cool ones. If you bring a wingman, make sure they’re worse than you in every way. As Jordan Ellenberg explains in “How Not To Be Wrong” ( HNW review + notes), it turns out that if you give people option A, then give people option B, which is exactly like option A except worse in some clear and obvious way, people then like option A more.
I saved $20 on those shoes so I could spend an extra $2,000 on that car. Consumers often make purchase decisions based on contrast bias; Richard Thaler explores the concept of “just noticeable differences” in Misbehaving ( M review + notes), which explains, among other things, why we’ll drive across town – or wait half a year for a sale – to save a modest amount of money on running shoes, but later, when purchasing a new vehicle, blithely drop a few thousand dollars on upsells like fancy floormats that we’ll never notice or use.
A quick and easy trick to be happier. One useful application of contrast bias, which I go deeper into in the mindfulness mental model, is using it to put situations in a new framing: if you compare your life to some perfect and nonexistent ideal, you’ll be miserable; if you find a clearly bad counterfactual to compare yourself against, you’ll feel much better.
This tendency to judge situations by contrast is a biological construct, it turns out – Jennifer Ackerman’s “ The Genius of Birds” ( Bird review + notes) includes a hilarious and thought-provoking section discussing how male bowerbirds use visual effects to make themselves look bigger, and thus more attractive to females.
If this sounds interesting/applicable in your life, keep reading for unexpected applications and a deeper understanding of how this interacts with other mental models in the latticework.
However, if this doesn’t sound like something you need to learn right now, no worries! There’s plenty of other content on Poor Ash’s Almanack that might suit your needs. Instead, consider checking out our learning journeys, our discussion of the inversion, schema, or Bayesian reasoning mental models, or our reviews of great books like “ Uncontainable” ( UCT review + notes), “ Deadly Choices” ( VAX review + notes), or “ The Landscape of History” ( LandH review + notes).
Contrast Bias Mental Model: A Deeper Look At “Just-Noticeable” Differences
We’re hardwired to notice changes, but only those that are above a certain threshold.
This is why many parents rarely notice their own kids getting taller, but Aunt Bethel always says “oh my! How they’ve grown” come Thanksgiving.
If you watch long-running TV shows, there’s often the same phenomenon.
If you watched a show for a long time (as I used to with NCIS – of which the covers of the second and fourteenth seasons are pictured at left), you never really noticed the protagonists like Gibbs, McGee, DiNozzo, and Abby getting older.
Yet if you watch a recent episode – then go back and watch a rerun – you’ll be shocked by how much older all of them look now.
In the frequently-hilarious, always-thought-provoking “ Misbehaving” ( M review + notes), Richard Thaler invokes the concept of “just noticeable differences” to explain a wide range of human behavior. For example, we tend to focus a lot on transactional utility – discussed more in the product vs. packaging mental model – and are lured by “discounts” to list price that take advantage of our contrast bias because they’re noticeable differences.
There’s nothing wrong with that, of course, but it can lead to some silly behavior if we’re not focused on evaluating the utility of our actions – it’s nonsensical to spend lots of time and effort to save small amounts of money, while in other situations blithely frittering it away.
It’s also something that automakers have learned to utilize: consumers don’t think much of saving $300 on a $15,000 car – because $300 compared to $15,000 is pretty small – but if you charge them $15,000 and give them a “free $300 rebate,” that $300 is now being contrasted against a different number – zero.
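The arithmetic behind that reframing can be sketched in a few lines of Python. This is a toy model, not anything from Thaler – it assumes a simple proportional (“Weber-style”) rule in which a difference only registers if it exceeds some fixed fraction of the reference amount, and the 5% threshold is purely illustrative:

```python
# Toy model: the same $300 feels different depending on the reference point.
# Assumes a Weber-style rule -- a difference is "noticeable" only if it exceeds
# some fixed fraction of the reference amount. The 5% threshold is illustrative.

def is_noticeable(difference: float, reference: float, weber_fraction: float = 0.05) -> bool:
    """A difference registers only if it exceeds a fraction of the reference."""
    if reference == 0:
        return difference > 0  # against a reference of zero, anything registers
    return difference / reference > weber_fraction

# $300 off a $15,000 sticker price: 2% of the reference -- below threshold.
print(is_noticeable(300, 15_000))  # False

# The same $300 framed as a "free rebate" is contrasted against zero.
print(is_noticeable(300, 0))       # True
```

The dollar amount is identical in both cases; only the reference point it gets contrasted against changes.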
If I were to relay all the examples of contrast bias and just-noticeable differences from Misbehaving, we’d be here all day. So just go read the book. And then start looking for places where this pops up.
Starbucks had begun to fail itself. No single bad decision or tactic or person was to blame. The damage was slow, quiet, incremental: like a single loose thread that unravels a sweater inch by inch. - Howard Schultz, ex-Starbucks CEO
We’ll return to this application of contrast bias a little later when we talk about the idea of self-justification.
Meanwhile, here’s another great quote from Thaler: When we have adapted to our environment, we tend to ignore it. - Richard Thaler
In “ The Happiness Advantage” ( THA review + notes), Shawn Achor explores this phenomenon via “ hedonic adaptation” – which explains why we’re not actually happier after we win the lottery. Again, I delve into this in the mindfulness mental model.
Contrast bias, this environmental adaptation, and just-noticeable differences can make us process information in inappropriate ways. For example, in “ Scale” ( Scale review + notes), theoretical physicist and former Santa Fe Institute President Geoffrey West explores how we inappropriately focus on discrete (high-contrast) rather than continuous (not-noticeable-difference) risks:
“We are surprisingly tolerant of death and destruction arising from ‘unnatural, man-made’ causes when they occur on a continual and regular basis, but are extremely intolerant when they occur suddenly as discrete events even though the numbers involved are much smaller.”
For now, we’ll stick with one of the unfortunate downsides of “just noticeable differences” – many people like big “effect size” solutions, like get-rich-quick schemes or “one easy trick that’ll cut your waistline by five inches.”
The truth is that much of the world isn’t like this: it’s a process of compounding many small advantages.
“A 1 or 2% change in some outcome […] should not be a reason to scoff, especially if the intervention is essentially costless […] when the stakes are in billions of dollars, small percentage changes add up.
As one United States senator famously remarked, ‘a billion here, a billion there, pretty soon you’re talking about real money.’”
Human felicity is produced not so much by great pieces of good fortune that seldom happen, as by little advantages that occur every day. - Benjamin Franklin
Don’t forget to rack up (and notice) those advantages, which are often below the “just noticeable difference” threshold.
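Franklin’s point about little daily advantages is just compounding, and it’s easy to make concrete. Here’s a minimal sketch assuming a hypothetical 1% per-period edge (all numbers illustrative, not from any of the books cited):

```python
# Sketch: small, individually sub-noticeable advantages compound into a big one.
# The 1% edge is a hypothetical, illustrative figure.

def compound(base: float, edge: float, periods: int) -> float:
    """Apply a small per-period advantage repeatedly."""
    return base * (1 + edge) ** periods

# One period of a 1% edge is barely distinguishable from the starting point...
print(round(compound(100, 0.01, 1), 2))    # 101.0

# ...but after 100 periods, the same edge has nearly tripled the starting value.
print(round(compound(100, 0.01, 100), 2))  # 270.48
```

Each individual step falls below the “just noticeable difference” threshold; only the accumulated result is dramatic.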
Application / impact: contrast bias can distort our perceptions, making us focus too much – or too little – on important details depending on whether changes happen gradually or as sudden, discrete events.
Contrast Bias x Social Proof x Culture x Framing ( Schema) = “Anchoring”
One cognitive quirk associated with contrast bias is what’s known as “anchoring” – the idea that we “anchor” our judgments on something, rather than coming to them independently. Even completely random, arbitrary inputs have been demonstrated to meaningfully influence our decisions.
For example, we tend to evaluate things relative to other products – so retailers will often include products they don’t actually intend for you to buy, but that serve as a reference point to make another product seem cheaper or better in comparison. (See the Ellenberg bit referenced in the beginning in “ How Not To Be Wrong” ( HNW review + notes) for how even a clearly worse option makes us like another one better.)
Unfortunately, short of wearing earplugs and an eye-mask everywhere, it is difficult to avoid this phenomenon in many circumstances (beyond simply being aware of it). One place we can try to avoid it, however, is when those inputs are not random, but shaped by social proof and culture.
Take this example from physicist Richard Feynman’s “ The Pleasure of Finding Things Out” ( PFTO review + notes), where he’s discussing how and why it took scientists a while to arrive at the correct charge of an electron. It turns out that an early scientist – Millikan – didn’t have the right constant for the viscosity of air, so his measurement was wrong. What happened with subsequent scientists?
“It’s interesting to look at the history of measurements of the charge of the electron after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that… until finally they settle down to a number which is higher.
Why didn’t they discover that the new number was higher right away? … When [scientists] got a number that was too high above Millikan’s, they thought something must be wrong – and they would look for and find a reason […]
when they got a number closer to Millikan’s value, they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that.”
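Feynman’s story can be turned into a toy simulation. This is my own sketch, not real data: it assumes each lab’s raw measurement is unbiased noise around the true value, but any result “too far” from the previously accepted number gets extra scrutiny and is pulled back toward it. The accepted value then creeps toward the truth rather than jumping there, just as Feynman describes:

```python
import random

# Toy simulation (not real data) of the anchoring dynamic Feynman describes:
# raw measurements are unbiased around the true value, but results far from the
# previously accepted number get "corrected" halfway back toward it.
# All constants below are illustrative stand-ins, not physical values.

random.seed(42)

TRUE_VALUE = 1.602   # stand-in for the true charge (arbitrary units)
accepted = 1.550     # stand-in for Millikan's too-low initial estimate

def report(raw: float, anchor: float, tolerance: float = 0.02) -> float:
    """Results far from the anchor get pulled halfway back toward it."""
    if abs(raw - anchor) > tolerance:
        return (raw + anchor) / 2
    return raw

history = [accepted]
for _ in range(20):
    raw = random.gauss(TRUE_VALUE, 0.01)  # unbiased measurement noise
    accepted = report(raw, accepted)
    history.append(accepted)

# Each published value is a little higher than the last, slowly settling near
# the true value -- instead of jumping there immediately.
print([round(v, 3) for v in history])
```

Remove the anchoring step (return `raw` unconditionally) and the very first measurement lands near the true value, which is the whole point.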
Examples of this sort of anchoring abound everywhere. When ideas are embedded in our culture, we tend to anchor off them and use them as a starting point (even if just to frame our disagreements). The antidote, as Peter Thiel explains in “ Zero to One” ( Z21 review + notes), is thinking from first principles: The most contrarian thing of all is not to oppose the crowd but to think for yourself. - Peter Thiel
Accordingly, many professional investors make use of this technique and avoid anchoring on ideas from other people: I don’t pay attention to other investors’ valuation work, and I also try to spend a very limited amount of time looking at a stock’s price while conducting my analysis.
Does this always work? Of course not – just recently, I went back and analyzed a company I’d looked at a few years prior, and realized that I’d understated how working-capital-intensive the business was because of the “technology company” framing another investor had used when discussing it.
Application / impact: being aware of this tendency and actively counteracting it – or using structural problem solving techniques to minimize our exposure to potentially biasing “anchors” – can help. For example, if you want an unbiased view from a third party, don’t tell them what you think, so they can come up with their own insights that aren’t colored by yours. You may pick up on factors you completely missed – that they might have, as well, if starting with your “anchor.”
Contrast Bias x Incentives x Memory x Schema x Overconfidence x Sunk Costs x Local vs. Global Optimization: Self-Justification
“Hey, if you really wanna fight with me?
Then drop the act and take accountability.
And own your wrongs, and keep them near…
Don’t know what you’re really scared of? I can tell.
The circles that you’re running in your mind…
Someone who’s certain is certain you’re falling.” - Set Your Goals, “Certain”
In fact, “drop the act and take accountability, and own your wrongs and keep them near” is pretty much exactly the point of the extremely important book “ Mistakes Were Made (But Not by Me)” – MwM review + notes – by psychologists Carol Tavris and Elliot Aronson.
The book asks a powerful question: why is it so easy for us to see other people’s mistakes… and yet disregard our own, continuing to make them and double down on them?
The answer, as usual when it comes to psychological phenomena, turns out to be multicausal. We’re overconfident and yet our memory is leaky – a really bad one-two combination. Of course, we also have incentives to protect our own ego, we usually find it difficult to step outside of our own schema (worldview), commitment bias makes us want to stick to our course of action, and last of all, local vs. global optimization: we tend to make decisions that are best/easiest locally, often to the detriment of the ones that are best globally.
We’re so scared of admitting we’re wrong that, as Set Your Goals say, we’re “running circles in [our] mind” to justify our own bad behavior, actions that serve “only to harm,” leaving people – whether colleagues, our friends, or loved ones – “feeling used.”
The piece that ties it all together for me is contrast bias: specifically, the “just noticeable differences” from Thaler that I referenced in previous sections. It’s hard to go from “kind” to “cruel” overnight, but it’s easy to get there in tiny little steps, each of which is justified by fundamental attribution error: today I was mean because I had a tough day at work.
We may come to believe our own lies, little by little. - Carol Tavris + Elliot Aronson
Tavris/Aronson present a compelling “pyramid” model that works as well for bad habits as it does for bad behavior.
I believe there’s an analogy between their observations there and this topic. We start by telling one lie… or watching one TV episode to take our mind off things. But after the first one, the second one is easy to do, and so is the third… and eventually we look up and not only is our afternoon gone, but so is our self-integrity, and we’re farther away than ever from solving the problem that’s the root cause of our issue. (See causality.)
I visualize this below, and discuss it in the local vs. global optimization model as well: at any given moment, it’s easier to go down the pyramid (toward where we don’t want to be) than back up the pyramid (to where we should be).
Once you start thinking about this pyramid of self-justification, where step by step we come to believe irrational things and justify our own bad behavior, you start to see it everywhere.
Take reciprocity bias: it’s been well-known since Cialdini’s classic “Influence” that gifts influence our behavior. Gifts are thus used as a form of advertising, and as Peter Thiel points out in “ Zero to One” ( Z21 review + notes), if we don’t think advertising affects us, we’re doubly deceived.
So most of us probably have the good sense to not start by accepting lavish gifts and assuming they won’t affect our decisions. But we can work our way up there: Charles Duhigg’s “ The Power of Habit” ( PoH review + notes) explores how a compulsive gambler – who already got herself into trouble once – was lured back into it by ever-more-extravagant gifts from casinos.
Jerome Groopman discusses a similar phenomenon in “ How Doctors Think” ( HDT review + notes): he cites the example of several doctors going on lavish, all-expenses-paid trips to tony destinations (with, of course, a “conference” thrown in for justification) and claiming that it didn’t influence their judgment.
Groopman doesn’t go too deep into it, but doctors probably didn’t go from zero to “fancy ski trip” all in one go. In fact, he highlights how one doctor – endocrinologist Karen Delgado – tackles the issue:
“[The pharmaceutical salesman] brought boxes of candy to her office three times, and when this ploy failed, he left invitations to ‘educational dinners’ at the most expensive restaurants in town.
Delgado ignored the invitations, telling herself that if she wanted a good meal, she would have it with her husband on her own tab.”
Delgado is a model of good decision-making and structural problem solving. Tavris/Aronson recommend the exact same approach for avoiding self-justification: just don’t start.
Let’s finish with an extreme example of contrast bias and self-justification that is related to the above tendencies (although working in a different direction).
How do you turn a group of ordinary men into mass murderers?
(Kids, please do not try this at home.)
Christopher Browning’s chilling but fascinating “ Ordinary Men” ( OrdM review + notes) changed how I thought about history by exploring that question through the lens of Reserve Police Battalion 101, composed of working-class Germans who, as Browning notes, were just about the least likely group of men to become Holocaust perpetrators.
Again, this was multicausal and involved a lot of psychological phenomena, but contrast bias and self-justification played a role. Many readers may be familiar with the classic Mark Twain quote: Eat a live frog first thing in the morning and nothing worse will happen to you the rest of the day. - Mark Twain
A similar phenomenon befell Reserve Police Battalion 101.
Indeed, one sergeant reported that his men were “overjoyed” about their subsequent non-shooting participation (which, of course, didn’t change the fact that they were still playing a vital role). It’s easy to see how merely escorting people to their death doesn’t seem so bad after you’ve pulled the trigger while staring into their eyes.
One man even managed to convince himself he was doing Jewish children a favor by ending their lives, because they wouldn’t be able to live without their parents. He subsequently only shot children…
Application / impact: intellectual honesty is critical; lying to ourselves is easy, but the only sensible policy is “just don’t do it.” Once you start, you set in motion a process which makes it easier to go farther and farther down the pyramid, and harder and harder to climb back up. The structural problem solving solution? Drop the act and take accountability, and own your wrongs, and keep them near. - Set Your Goals, ‘Certain’