The Ratchet Problem: On Asymmetric Risk

There’s a pattern I keep noticing, and once you see it, it’s hard to unsee.

Losing weight is hard. Gaining it is easy. An exercise injury happens in a moment; recovery can take months. Building a savings cushion takes months of discipline. Spending it can happen in a weekend. Earning someone’s trust is slow and cumulative. You can lose it in a single sentence. Changing your mind about something you’ve believed for years (really changing it, not just updating a surface position) is one of the hardest things a person can do. Arriving at that belief in the first place probably didn’t feel effortful at all.

This type of structural asymmetry is everywhere. And if you want to design resilient systems, you are wise to anticipate it.

There’s a concept in engineering called a ratchet, a mechanism that allows motion in one direction and resists it in the other. The cost to advance is low. The cost to reverse is high. And crucially, the resistance isn’t always proportional to how far you’ve gone. Sometimes a single click is nearly impossible to undo.

Economists have a related idea: path dependency. The sequence of events matters, not just the sum. Ten good days followed by one catastrophic one isn’t equivalent to a flat average, because the catastrophic day may reset or compound in ways the arithmetic doesn’t capture. A single bad night of eating doesn’t just add calories. It can trigger a guilt spiral that extends the damage for days. The ratchet clicks, and then the response to the click becomes a second-order ratchet. The story we tell about what happened can be harder to escape than the event itself.
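To make the arithmetic concrete, here is a minimal sketch (numbers invented for illustration) of why a "flat average" of returns is not the same as a flat outcome once effects compound:

```python
# Two days whose returns average to zero: +50%, then -50%.
# The arithmetic mean says "flat"; the compounded sequence says "down 25%".
returns = [0.50, -0.50]

mean_return = sum(returns) / len(returns)   # 0.0 -> looks harmless

wealth = 100.0
for r in returns:
    wealth *= 1 + r                          # effects multiply, they don't add

print(f"mean daily return: {mean_return:+.0%}")   # +0%
print(f"wealth after both days: {wealth:.1f}")    # 75.0
```

The order of the two days doesn't change this particular outcome; what the average misses is the multiplicative nature of the damage. Layer in a reaction to the loss, the guilt spiral, and the sequence itself starts to matter too.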

How this maps onto belief is subtler, and more interesting. When we hold a position publicly, defend it in conversation, build identity around it, signal tribal membership with it, we raise the cost of reversal. Each defense is another click. At some point the ratchet starts functioning as a kind of sunk cost fallacy: the prior investment in the belief becomes the reason to keep believing it, independent of whether the belief still holds up. Changing your mind stops being an intellectual act and becomes a social and psychological one. The people most invested in a belief are often structurally the last to update it, not because they’re less intelligent, but because the ratchet has more clicks to undo.

The same pattern shows up constantly in engineering management, and often with higher stakes because the feedback loops are slower and the damage is less visible. Trust and psychological safety accumulate through dozens of small interactions and can be set back significantly by a single public misstep. The second-order ratchet is insidious here: teams that stop surfacing problems don’t announce it, they just quietly stop, and you lose the signal before you know it’s gone.

Technical debt is almost a textbook ratchet. Shortcuts taken under pressure are easy to make and nearly impossible to reverse once the code becomes load-bearing, and the sunk cost dimension shows up when teams defend architectural decisions simply because so much has been built on top of them.

Hiring has a brutal asymmetry too: a bad hire can take six to eighteen months to fully recognize and resolve, and the damage to team dynamics during that period compounds quietly. Teams often adjust around a low performer rather than address it, normalizing the drag until it becomes the baseline.

There’s a related idea from statistics: ergodicity, the assumption that the average across a population matches what a single member experiences over time. Most of life isn’t ergodic. The average outcome across a population doesn’t tell you what happens to one person over time, because one catastrophic event (a binge, a bankruptcy, a broken relationship) can permanently alter a trajectory in a way the average obscures. You can’t diversify across your own life the way a portfolio diversifies across assets. You live sequentially, and some losses never compound back.
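A standard toy model of this gap (the numbers are mine, purely illustrative): a repeated bet whose expected value is positive for the population as a whole, but whose typical individual trajectory still decays.

```python
# A bet that multiplies wealth by 1.5 on a win and 0.6 on a loss,
# each with probability 1/2.

# Ensemble view: the expected multiplier per round is greater than 1.
expected_factor = 0.5 * 1.5 + 0.5 * 0.6      # 1.05 -> +5% "on average"

# Individual view: one life plays the rounds in sequence. A typical
# run has roughly as many wins as losses, and each win/loss pair
# multiplies wealth by 1.5 * 0.6 = 0.9 -> the trajectory shrinks.
wealth = 100.0
for _ in range(10):                           # 10 pairs = 20 typical rounds
    wealth *= 1.5 * 0.6

print(f"expected multiplier per round: {expected_factor:.2f}")  # 1.05
print(f"wealth after 20 typical rounds: {wealth:.2f}")          # ~34.87
```

The expectation is real: averaged across many parallel players, total wealth grows. But no single player gets to live the average, and almost every individual path ends up poorer.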

I’m not sure there’s a clean solution to any of this. But you can think about it from two directions simultaneously, both aimed at increasing net margin on each side of the asymmetry.

The first is minimizing the damage of the rare bad event. Not preventing it. I’m skeptical that’s really the goal, or even possible at scale. A production incident is more survivable if the system is designed to limit blast radius before it happens. The impulsive thing said in an argument lands differently depending on the trust accumulated before it. The unexpected home repair hurts less if there was a buffer built in. You can’t eliminate the catastrophic click, but you can sometimes cap how far it travels.

And maybe more importantly: the response to the bad event matters as much as the event itself. A guilt spiral is its own ratchet. A quick, undramatic return to baseline, treating the bad day as noise rather than verdict, is itself a skill, and probably an underrated one. This is part of what makes tools like cognitive behavioral therapy and dialectical behavior therapy (CBT and DBT) valuable: they’re essentially frameworks for interrupting second-order ratchets, for catching the narrative before it clicks further than the event warranted. It’s also why I’ve come to prefer tools that show me trends over time rather than daily verdicts, systems designed to make it structurally harder to catastrophize a single bad data point into a reason to abandon the whole project.

But the ratchet is structurally neutral. The same mechanism that makes bad outcomes sticky can make good ones sticky too, if you can find the right ones.

The second direction is almost the inverse: finding the ratchets that run the right way and leaning into them deliberately. Small consistent actions that are individually trivial but structurally hard to reverse. James Clear calls these identity votes. Each small action is a cast vote for the kind of person you’re becoming, and at some point the accumulated votes shift the underlying identity. The compounding effect of showing up regularly, to a workout, a relationship, a practice, isn’t just additive. At some point it changes the underlying terrain. The behavior stops being something you do and starts being something you are, which makes the next instance easier, not harder. Something automatic. This is the ratchet working for you. It clicks forward just as stubbornly as it clicks back.

The asymmetry doesn’t disappear. But maybe the move is to stop fighting it symmetrically, trying equally hard to prevent bad events and create good ones, and instead play to the structure. Absorb the rare catastrophe as cheaply as possible. Let the small consistent wins accumulate interest.
