Sunk Costs / Commitment + Consistency Bias Mental Model (Incl Thesis Drift)

If this is your first time reading, please check out the overview for Poor Ash’s Almanack, a free, vertically-integrated resource including a latticework of mental models, reviews/notes/analysis on books, guided learning journeys, and more.

Sunk Costs / Commitment + Consistency Bias Mental Model: Executive Summary

If you only have three minutes, this introductory section will get you up to speed on the sunk costs / commitment + consistency bias mental model.

The concept in one sentence: money and time that have already been spent can’t be gotten back – they’re “sunk costs” – so instead of wasting effort trying to “get them back,” we should focus on the marginal utility of our future actions.

Key takeaways/applications: sunk costs can lead us down a path of misery and even death.  In less pressing circumstances, they can lead to bad business and financial outcomes.  Training ourselves to take a very unintuitive “econ” – rather than “human” – view of sunk costs helps us make dramatically better decisions.

Two brief examples of sunk costs / commitment + consistency bias:

What do family rifts and geopolitical conflicts have in common?  Books like Tavris/Aronson’s “ Mistakes were Made (but not by me)” explore how sunk costs can lead to relationships fraying: a vicious autocatalytic feedback process of parties litigating past wrongs.  People at the end of their lives tend to regret this, as explored in Karl Pillemer’s “30 Lessons for Living” (30L review).

It’s a trap that professional negotiators have to learn to sidestep – in the landmark “ Getting to Yes” ( GTY review + notes), authors Fisher, Patton, and Ury explore various practical techniques for focusing on interests (forward-looking marginal utility) rather than positions (which are often based on historical sunk costs, or commitment bias).

Allowing people to “save face” – empathy – and making people feel like the ideas are their own – agency – can defuse sunk cost thinking, to everyone’s benefit.

Vietnam wasn’t the first time people died for sunk costs.  The Vietnam War is widely acknowledged as a classic example of the sunk cost fallacy – needlessly throwing good money after bad, sacrificing more soldiers to justify the lives that had been lost.  It was hardly the first, last, or only time that politicians made decisions based on justifying sunk costs, however.

Richard Rhodes’ phenomenal “ The Making of the Atomic Bomb” ( TMAB review + notes) explores not only the science and engineering of the atom bomb, but the human side as well.

At least in part, the scientists’ original justification for building the bomb was a totally reasonable and defensible fear of Hitler ruling the world.  Rhodes explains:

“Patriotism contributed to many decisions, but a deeper motive among the physicists, by the measure of their statements, was fear […] of a thousand-year Reich made invulnerable with atomic bombs.”

And yet the bombs were dropped – on Japan – a full three months after Germany officially surrendered, and after Japan had already been half burnt to the ground by firebombing raids.  Why? There were a lot of reasons, but one was political pressure ( incentives) to justify sunk costs, leading to commitment bias:

“… the bomb was also to be used to pay for itself, to justify to Congress the investment of $2 billion, to keep Groves and Stimson out of prison.”

If this sounds interesting/applicable in your life, keep reading for unexpected applications and a deeper understanding of how this interacts with other mental models in the latticework.

However, if this doesn’t sound like something you need to learn right now, no worries!  There’s plenty of other content on Poor Ash’s Almanack that might suit your needs. Instead, consider checking out our learning journeys, our discussion of the Bayesian reasoning, luck vs. skill, or mindfulness / cognitive behavioral therapy mental models, or our reviews of great books like “ Poor Charlie’s Almanack” ( PCA review + notes), “ The Vaccine Race” ( TVR review + notes), or “ Onward” ( O review + notes).

Sunk Costs: A Deeper Look

In Richard Thaler’s amazing “ Misbehaving” ( M review + notes) – my favorite book of all time – he quips that he became the first clinical behavioral economist by explaining to a friend, whose five-year-old daughter refused to wear dresses they’d paid good money for, that the disutility incurred by both mom and daughter through this fighting would not, in fact, recoup the money paid for the dresses.

Thaler provides a great exploration of sunk costs in various contexts, and I’m not going to replicate his discussion here, other than to point out a few things.

First, like all other mental models, sunk costs shouldn’t be thought of as universally “good” or “bad” – they’re simply a tendency that we can use adaptively in some circumstances, but that’s maladaptive and needs to be avoided in others.

Second, Thaler’s discussion includes a lot of interactions between sunk costs and other related models like opportunity cost and marginal utility.

Third and finally, one of the more hilarious elements of “ Misbehaving” is how economists – who, of all people, should know to avoid sunk costs – fell prey to sunk costs when defending the completely wrong, irrational, unhelpful “rational actor” premise.  (See humans vs. econs.)

Thaler discusses how “factors” were added to the Fama-French efficient market hypothesis model, for example, even to the point where those factors became completely bizarre:

“it is difficult to tell a plausible story in which highly profitable firms are riskier than firms losing money.”  

And yet this is not an uncommon problem among scientists and other researchers, who often fail to apply scientific thinking.  Indeed, Thomas Kuhn’s famous “ The Structure of Scientific Revolutions” ( Kuhn review + notes) mentions, at one point:

“normal science, the activity in which most scientists inevitably spend almost all their time, is predicated on the assumption that the scientific community knows what the world is like…

normal science… often suppresses fundamental novelties because they are necessarily subversive of its basic commitments.”

Kuhn goes on to note:

Phenomena that will not fit in the box are often not seen at all... [scientists] are often intolerant of [new theories] invented by others. - Thomas Kuhn Click To Tweet

Which can be viewed as an example of sunk costs – scientists have invested a lot of time and effort in building a paradigm, and if it turns out to be incorrect, what do they do?

In the case of humans vs. econs, one unusually candid economist literally asked Thaler: if your newfangled theory is correct, what do I do?  I’ve spent my entire career figuring out how to do it the old way… “ Misbehaving” ( M review + notes) provides lots of great examples of this sort of activity.  See if you can find them all.

One important phenomenon worth noting is that sunk costs display a clear recency bias – a function of the way our memory works, as explored in great books like Daniel Schacter’s “ The Seven Sins of Memory” ( 7SOM review + notes).

We’re unlikely to throw out a nice shirt that we just bought a few months ago but that unfortunately doesn’t fit us as well as we thought it would…

… but if it’s been sitting in the closet for a few years, it’s easier to let go of.

Sunk Costs x Status Quo Bias x Overconfidence x Storytelling x Hindsight Bias = Consistency Bias / Thesis Drift

When the facts change, I change my mind. - Apocryphal Click To Tweet

Given our natural overconfidence ( overconfidence) and our tendency to stick with the default option, one challenge for many people – whether in the context of stock portfolios or personal relationships – is “consistency bias.”  This is a little different from the stronger “commitment bias” (which we’ll discuss in the next section).

Consistency bias, which is also sometimes known as “thesis drift,” basically amounts to the following: we started doing something for a good reason, but then the reason changed, and now we’re still doing the thing.

Various psychological mechanisms in our memory – as well as our natural tendency to tell stories – contribute to this phenomenon.

And it’s important to note that even the most brilliant and thoughtful among us are susceptible.  To go back to the Manhattan Project, which I referenced in the introduction, here’s what Richard Feynman has to say about his role in the project in his wonderful “ The Pleasure of Finding Things Out” ( PFTO review + notes):

“The original reason to start the project, which was that the Germans were a danger, started me off on a process of action[…] at Los Alamos, to try to make the bomb work.  […]

It was a project on which we all worked very, very hard, all cooperating together.  And with any project like that you continue to work trying to get success, having decided to do it.  

But what I did – immorally I would say – was to not remember the reason that I said I was doing it, so that when the reason changed, because Germany was defeated, not the singlest thought came to my mind at all about that, that that meant now that I have to reconsider why I am continuing to do this.  I simply didn’t think, okay?”

Feynman goes on, in fact, to note that everyone was happy when the bomb went off; the mood was “a very considerable elation and excitement, and there were parties and people got drunk.”  

Feynman himself was:

“drinking and drunk and playing drums sitting on the hood of a Jeep and playing drums with excitement running all over Los Alamos at the same time as people were dying and struggling in Hiroshima.”  

It’s easy to be judgmental from a distance, I think, but it’s a good exercise in empathy to put yourself in both sets of shoes – those of the scientists and those on the other side.  ( The Making of the Atomic Bomb – TMAB review + notes – takes good care of the Japanese side of that equation.)

Application / impact: it’s easy to get caught up in keepin’ on keepin’ on with what we’re doing… but if the reasons for our actions meaningfully change, then so should our actions.

Sunk Costs x Inversion x Stress x Loss Aversion x Incentives x Social Proof = Commitment Bias / Escalation of Commitment

The First Rule of Holes is: Stop Digging. The Second Rule of Holes is: Don’t Forget Rule 1. Click To Tweet

A stronger version of consistency bias is commitment bias, which invokes stress and loss aversion – two powerful models.

Simply put, pain is painful (wow, insightful, aren’t I?).  We don’t like losing things, and given that we (errantly) treat sunk costs as if they’re something we can “get back,” we’re loath to sell a dog in our stock portfolio and turn a “paper loss” into a “real loss” – and we’re similarly loath to give up a path into which we’ve put a lot of time, effort, and stress.

Because if we give up now, then all that work – all that stress – all that pain – was for nothing.

It’s an understandable but completely wrong way to think: all that matters going forward is our marginal utility – we should be taking actions, today, that make us happier and better off today, tomorrow, and for the rest of our lives.  Instead, since we’re humans, not econs, we often take actions today that hurt us – providing negative utility – today, tomorrow, and for the rest of our lives.
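To make that “econ” view concrete, here’s a minimal sketch: my own toy illustration, not anything from Thaler’s book (the numbers and the best_choice helper are hypothetical).  It shows why a sunk cost can never change which forward-looking option is best: it subtracts equally from every alternative, so only the marginal utilities matter.

```python
# Toy illustration (hypothetical numbers): the "econ" decision rule ignores sunk costs.

def best_choice(options, sunk_cost=0.0):
    """Pick the option with the highest expected *future* utility.

    The sunk cost is subtracted from every option equally, so it can
    never change which option wins -- that's the whole point.
    """
    return max(options, key=lambda name: options[name] - sunk_cost)

# We paid $120 for non-refundable concert tickets (sunk), but a blizzard hit.
future_utility = {
    "drive through the blizzard to the concert": -30.0,  # risky, stressful
    "stay home and watch a movie": 20.0,                 # pleasant evening
}

print(best_choice(future_utility, sunk_cost=120.0))
# -> "stay home and watch a movie"
```

Whatever scores you assign, the $120 already spent drops out of the comparison; the “human” mistake is to act as if going to the concert somehow “recovers” the tickets.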

This often interacts with incentives.  We discussed how the leaders of the Manhattan Project needed to justify, to their bosses, all the time and money they’d spent.

Continuing the Feynman arc, a similar story actually played out with the ill-fated launch of the Challenger space shuttle.

While several of Feynman’s books touch on this, so does Megan McArdle’s wonderful “ The Up Side of Down” ( UpD review + notes), which I like to cite every chance I get because it’s a phenomenal book that too few people are talking about.  McArdle notes that some NASA engineers were actually aware, prior to launch, of the famous O-ring erosion issue.

Some engineers from Morton-Thiokol, sensibly, recommended delaying the launch due to the cold weather.  Why did the launch proceed anyway? McArdle notes that a manager of the project was “appalled by [the recommendation]” – another engineer asked “my God, when do you want me to launch – next April?”

McArdle explains:

“The shuttle launch had already been delayed almost a week.  Each delay was massively expensive, and of course, it didn’t look good.”

You can see how incentives are causing commitment bias here.  There’s also an element of time-induced stress (deadlines can warp our thinking).

And it’s not just scientists, to be fair: McArdle’s book cites other examples.  Laurence Gonzales notes, in “ Deep Survival” ( DpSv review + notes), that a similar phenomenon underlies many fatal incidents in outdoor recreation.

People have often traveled a long way, spent a meaningful amount of money, and probably taken precious days off work to climb a specific mountain.  Those are all sunk costs that need to be justified, not to mention the incentive of the long-awaited excitement from the activity.

Gonzales explores many instances of how commitment bias to a preset “plan” – even when conditions change so dramatically that the plan no longer makes any sense – can lead to danger, or death.

Among his examples: some rafters went rafting on a river that was clearly dangerous, amidst a massive flood.  A group of climbers, whose start time was delayed meaningfully, Lemony Snicket style, by a series of unfortunate events, still tried to climb the mountain and ended up caught in a dangerous thunderstorm (with one climber struck by lightning).

In all of these cases, we’re talking about quite large-scale expenditures of time and effort, so it’s easy to see why sunk costs and commitment bias might show up.  

What’s perhaps scarier is that even relatively modest “sunk costs” can invoke commitment bias.

For example, Tavris/Aronson cite research in “ Mistakes were Made (but not by me)” ( MwM review + notes) about lab experiments in which participants, after reading embarrassing and explicit material out loud as an “initiation rite” for joining a group, were thereafter far more committed to that group (and liked it more).

Of course, this was the 1950s, so reading some explicit material out loud was a lot more embarrassing than it is today – but it’s still not the equivalent of driving hundreds of miles, and investing hundreds of dollars.

T/A note that the more extreme the initiation rite, the stronger the commitment, which of course explains college fraternity hazing.  

It also explains, as Shawn Achor explores in “ Before Happiness” ( BH review + notes), why such strong relationships and a sense of duty are formed in the military – Achor (who cites Elliot Aronson as part of his exploration) discusses his own experience, and also notes that:

“Research indicates that stress, even at high levels, creates greater mental toughness, deeper relationships.”

So commitment bias isn’t all bad, and there are situations where intense, effortful activities – conducted as a group – can create lasting friendships and social connection.

Of course, commitment bias is bad most of the time.  Take entrepreneurship and business.  Many readers are probably familiar with the “pivot” and “fail fast” mantras of Silicon Valley.  

While these mantras can be taken too far, they are, in some senses, validated.  In Clayton Christensen’s classic analysis of disruption – “ The Innovator’s Dilemma” ( InD review + notes) – I was particularly fascinated by this comment, which I think few people really notice and think about deeply:

“guessing the right strategy at the outset isn’t nearly as important to success as conserving enough resources… so that new business initiatives get a second or third stab at getting it right.  

Those that run out of resources or credibility before they can iterate toward a viable strategy are the ones that fail.”  

There’s a margin of safety angle there, too.

Here’s a real-world example to bring it home: Brad Stone’s “ The Everything Store” ( TES review + notes) – a great exploration of the history of Amazon, and the thought processes of Jeff Bezos – drives home the reality that Bezos, like anyone else, wasn’t immune to bad ideas and overconfidence.

For example: his attempt at competing with eBay failed miserably; the expansion into toys and electronics (categories that were very different from books) was not necessarily a failure, but it was certainly very challenging at first; jewelry flopped; the hiring of Joe Galli as COO was a disaster; and Bezos’s obsession with the “Noah’s Ark” idea (having one or two of every product in an Amazon fulfillment center) was no bueno.

But as Rick Dalzell, Bezos’s long-time right-hand man, reflected:

“Jeff does a couple of things better than anyone I’ve ever worked for.  He embraces the truth. A lot of people talk about the truth, but they don’t engage their decision-making around the best truth at the time.”

In other words, Bezos appears to have been bold in going after new opportunities, but willing to cut his losses when things weren’t going well; i.e. ignoring sunk costs and avoiding commitment bias – which prevented Amazon from squandering all of its resources on ideas that didn’t work.

This is a phenomenon often seen in successful, disruptive, entrepreneurial companies; anyone who’s read Sam Walton’s “ Made in America” ( WMT review + notes) will see parallels between Bezos and Walton.

Walton also didn’t let commitment bias keep Wal-Mart on a path that no longer made sense.  He actively fought against status quo bias – which, as we’ve seen, is one of the key contributors to consistency and commitment bias.  As Walton puts it in his own words:

As good as business was, I could never leave well enough alone, and, in fact, I think my constant fiddling and messing with the status quo may have been one of my biggest contributions to the later success of Wal-Mart. - Sam Walton Click To Tweet

Application / impact: commitment bias usually distorts our decision-making; in some contexts, we can use it to promote group unity, but we should also be aware that even small actions can create “sunk costs” that we feel forced to justify later.