The Cobra Effect

“Show me the incentive and I will show you the outcome.” — Charlie Munger

Well-designed incentives have the power to create great outcomes; poorly designed incentives have the power to…well…create terrible outcomes.

Goodhart’s Law says that when a measure becomes a target, it ceases to be a good measure. Simply put, if a measure of performance becomes a stated goal, humans tend to optimize for it, regardless of any associated consequences. The measure often loses its value as a measure.

One of the most prominent examples of this in action comes from the story of the British colonists’ cobra eradication efforts in India.


Source: sahilbloom.substack.com
