Our iatrogenics problem
We tend to think action always trumps inaction. But sometimes intervening causes more harm than good.
If we are to intervene in what would otherwise happen, we need a sense not only of the benefits of our interventions but also of the harms. Otherwise, how will we know when, despite our best intentions, we cause more harm than good?
Intervening when we have no idea of the break-even point is "naive interventionism," a phrase first brought to my attention by Nassim Taleb.
He explores the idea at length in Antifragile.
Why do people keep intervening even when the evidence shows their actions are causing more harm than good?
I can think of a few reasons.
The first thing that comes to mind is incentive-caused bias. What is the incentive for action? Is there an agency gap, where the outcomes for the person doing the intervening are disconnected from the outcomes for the person experiencing the intervention?
Another big reason: a lack of clear feedback loops between action and outcome. It's hard to know you're causing harm if you can't trace an action to its outcome. This allows, even encourages, a degree of self-delusion. Given that we are prone to confirming our beliefs (and presumably we took action because we believed it would help), we're unlikely to notice evidence that contradicts those beliefs. We should be seeking out disconfirming evidence about our actions. But we don't, because if we did, we'd realize we are a lot less smart than we think we are.
And the third major contributor, I'd say, is our bias for action (especially what we consider positive action), what Charlie Munger calls "do-something syndrome." If you're a policy advisor or politician, or even a modern office worker, social norms make it hard for you to say, "I don't know." You're expected to have an answer for everything.
Think about how a typical meeting starts. In response to a new product from a competitor, for example, the first question people usually ask is "What are we going to do about this?" The hidden assumption that goes unexplored is that you need to do something. It could be that the cost of doing something outweighs the benefits.
The concept applies to domains well beyond medicine: iatrogenics, originally a term for harm caused by the healer, covers any situation where we cause more harm than good under the guise of knowledge.