October 5, 2018
Faulty Math Behind Over-Treatment
by Al Lewis
A recent recommendation to start colonoscopies at age 45 shows why the U.S. suffers from over-treatment and worse outcomes.
The American Cancer Society (ACS) recently decided that, because the rate of colon cancer has increased by an “alarming” 22% in the 45- to 49-year-old cohort this century, colon screenings should start at age 45.
This is a very instructive decision, though not because it is a good idea. Rather, it’s instructive because it’s a “teachable moment” about why Americans suffer from over-diagnosis and over-treatment so much more than people in other modern economies, with worse outcomes to show for it.
One reason is the failure of our medical and public health community to understand the difference between relative risk and absolute risk.
On one hand, a smoker might have a relative risk of heart attacks that is only a few times greater than the risk for nonsmokers. However, with Americans suffering 400,000 heart attacks a year, many of us do whatever we can to avoid small increases in relative risk of a heart attack because the absolute risk is so great.
On the other hand, suppose the relative risk of an unsafe airline is 10 times the risk of a safe airline. But with about three crashes a year (in a very bad year) over the course of 30 million flights, even a whopping tenfold increase in relative risk would bring your absolute risk of crashing up to a trivial 1-in-a-million. Here at Quizzify, we’d still opt for the unsafe airline if it has more legroom and a better mileage program.
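The airline arithmetic above is easy to check. Here is a quick sketch using the article's illustrative figures (these are round numbers for the sake of the example, not real aviation-safety statistics):

```python
# Relative vs. absolute risk: the airline example from the article.
crashes_per_year = 3            # a very bad year (article's illustrative figure)
flights_per_year = 30_000_000   # article's illustrative figure

# Baseline absolute risk of being on a crashed flight.
baseline_risk = crashes_per_year / flights_per_year   # 1 in 10 million

# An "unsafe" airline with 10x the relative risk of a safe one.
relative_risk = 10
unsafe_risk = baseline_risk * relative_risk           # still only 1 in 1 million

print(f"Safe airline:   1 in {1 / baseline_risk:,.0f}")
print(f"Unsafe airline: 1 in {1 / unsafe_risk:,.0f}")
```

Even a tenfold jump in relative risk leaves the absolute risk at one in a million, which is the whole point: a scary-sounding multiplier means little when the baseline is tiny.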
And that brings us to exactly what the American Cancer Society miscommunicated…with a disturbing twist, as you’ll see below.
This “alarming” 22% increase in relative risk over the course of this century translates into an increase in the absolute rate of colon cancer in the <50 cohort from 0.006% to 0.007%. Yes, 0.001 percentage points more of the <50 population in this country will get colon cancer now than 18 years ago.
Further, suppose half of that 0.007% had a family history (or some other major risk factor) and would be advised by their doctor to get screened regardless of guidelines for the average person. That leaves roughly 0.0035% of the 45- to 49-year-old population who could possibly benefit from a random screen. That’s not much different from your lifetime odds of getting struck by lightning.
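Percentages that small are hard to keep straight, so here is the same arithmetic spelled out (a sketch using the article's figures; the 50% family-history share is the article's supposition, not a measured statistic):

```python
# The ACS's "alarming" 22% relative increase, restated in absolute terms.
rate_then = 0.006 / 100        # ~0.006% of the <50 cohort (article's figure)
rate_now = rate_then * 1.22    # a 22% relative increase -> ~0.007%

increase = rate_now - rate_then  # ~0.001 percentage points of the cohort
print(f"Then: {rate_then:.4%}  Now: {rate_now:.4%}  Increase: {increase:.4%}")

# Suppose half would be screened anyway because of family history or
# another major risk factor (the article's supposition). That leaves:
could_benefit = (0.007 / 100) / 2  # ~0.0035% who might gain from a random screen
print(f"Could benefit from a random screen: ~{could_benefit:.4%}")
```

The 22% relative increase, which sounds dramatic, works out to roughly one additional case per 100,000 people in the cohort.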
And a screen is far from a lifesaver, in general. Statistically speaking, it is likely to find the slow-growing tumors while missing the more aggressive, faster-growing tumors that arise between screens. Screening is not a surefire way to detect cancer, by any means.
The Disturbing Twist: The Hazards of Screening
That trivial benefit must be weighed against the nontrivial harms. The risk of a complication, such as a perforation, is estimated at between 1.6% and 1.8%. In all fairness to the ACS, it isn’t insisting that the screen be done via a colonoscopy, though the non-invasive screens have such high positive/inconclusive test rates that they often lead to colonoscopies. That makes the rate of complications about 3,000 times the odds of having your life saved by early diagnosis of colon cancer.
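The rough shape of that comparison can be reconstructed (a sketch: the complication rate and the 3,000x figure are the article's, and the lives-saved rate below is simply back-solved from them rather than taken from any study):

```python
# Back-of-the-envelope reconstruction using the article's own numbers.
complication_rate = 0.017        # midpoint of the 1.6%-1.8% range in the article
screen_amenable = 0.0035 / 100   # ~0.0035% of the cohort (from the article)

# The article puts complications at ~3,000x the odds of a life saved,
# which implies a lives-saved rate per screened person of roughly:
implied_lives_saved = complication_rate / 3000
print(f"Implied lives-saved rate: {implied_lives_saved:.6%}")

# Note this is several times smaller than even the 0.0035% who could
# benefit -- i.e., the 3,000x figure assumes that only a fraction of
# randomly detected cancers translate into a life actually saved.
ratio = screen_amenable / implied_lives_saved
print(f"Screen-amenable incidence vs. implied lives saved: ~{ratio:.0f}x")
```

This is not a clinical estimate, just a way to see what arithmetic the 3,000x claim must rest on.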
Of course, the worst complication is death, and the mortality rate from colonoscopies (0.02%) appears, on its face, to be much higher than the rate of lives that would be saved. However, in all fairness, the mortality rate, like the complication rate, generally increases with age. Hence the mortality rate in the 45- to 49-year-old cohort may be no higher than — and might even be slightly lower than — the rate at which early detection will save lives.
So what’s an employer to do? We’d say, stay on the sidelines. Let employees work this one out for themselves, with their doctors. Or show them this post. But don’t encourage them to run out and get screened on the basis of a recommendation that is at best controversial and at worst harmful.