Preventing a disease before it occurs seems intuitively obvious. But when it comes to taking medicine to prevent a disease before it occurs, people tend to be much less comfortable. Not only are there concerns about the “medicalization” of healthy people, there are good questions about benefits, risks (like side effects), and costs. Cardiovascular disease will kill many of us, so there have been decades of research studying how to prevent that first heart attack or stroke. But even if you’re born with good genes and do everything possible to prevent heart disease (e.g., don’t smoke, exercise regularly, eat a healthy diet, moderate your alcohol intake, and keep your weight down), you’re still at risk of heart disease. And if you have one or more risk factors for disease, your lifetime risk goes up dramatically. Once you’ve had your first heart attack or stroke, the effectiveness of medical therapy is well established. Drug therapy with medications like the “statin” class of cholesterol-lowering drugs reduces subsequent deaths from cardiovascular disease. Given their unambiguous effectiveness, and the high likelihood that many of us will eventually develop cardiovascular disease of some sort, the idea of “pre-treating” otherwise-healthy people with drug therapy to possibly prevent that first event has been held out as a potential public health strategy. There’s new evidence that tests this hypothesis, and the results are surprising.
Anti-inflammatory drugs are among the best-loved products in the modern medicine cabinet. They can provide good pain control, reduce inflammation, and eliminate fever. We give non-steroidal anti-inflammatory drugs (NSAIDs) in infancy, continuing through childhood and adulthood for the aches and pains of modern living. It’s in the later stages of life that NSAIDs are used most frequently, usually in the treatment of joint disease like osteoarthritis, which eventually affects pretty much everyone. Over 17 million Americans use NSAIDs on a daily basis, and this number will grow as the population ages. While they’re widely used, they also have a long list of side effects. Not only can they cause stomach ulcers and bleeding by damaging the lining of the gastrointestinal tract, the cardiovascular risks are real and significant.
It was the arrival (and withdrawal) of the drugs Bextra (valdecoxib) and Vioxx (rofecoxib) that led to a much better understanding of the potential for these drugs to increase the risks of heart attacks and strokes. And it’s now well documented that these effects are not limited to the “COX-2” drugs – almost all NSAIDs, including the old standbys we have used for years, raise the risk of heart attacks and strokes. Given how frequently these products are used, it’s essential that pharmacists and their patients understand the risks in order to make informed decisions, weighing expected benefits against known risks.
Among the many forms of supplementation that I’m asked about, multivitamins are the source of the most confusion among consumers. Multivitamins are marketed with a veneer of science, but much of that image is a mirage – rigorous testing doesn’t support most of the outlandish health claims made for them. Yet not all vitamin and mineral supplementation is useless. These products can be used appropriately, when our decisions are informed by scientific evidence: Folic acid prevents neural tube defects in the developing fetus. Vitamin B12 can reverse anemia. Vitamin D is recommended for breastfeeding babies to prevent deficiency. Vitamin K injections in newborns prevent potentially catastrophic bleeding events. But the most common reason for taking vitamins isn’t a clear need, but rather our desire to “improve overall health”. This is deemed “primary prevention” – the belief that we’re just filling in the gaps in our diet. Others may believe that if vitamins are good, then more vitamins must be better. And there is no debate that we need dietary vitamins to live. The case for indiscriminate supplementation, however, has not been well established. We’ve been led to believe, through very effective marketing, that taking vitamins is beneficial to our overall health – even if our health status is reasonably good. So if supplements truly provide real benefits, then we should be able to verify this claim by studying health effects in populations of people who consume vitamins for years at a time. Those studies have been done, with different endpoints, different study populations, and different combinations of vitamins. The evidence is clear: routine multivitamin supplementation has not been shown to offer substantial health benefits to most people.
It is a triumph of marketing over evidence that millions take supplements every day. There is no question we need vitamins in our diet to live. But do we need vitamin supplements? It’s not so clear. There is evidence that our diets, even in developed countries, can be deficient in some micronutrients. But there’s also a lack of evidence to demonstrate that routine supplementation is beneficial. And there’s no convincing evidence that supplementing vitamins in the absence of deficiency is beneficial. Studies of supplements suggest that most vitamins are useless at best and harmful at worst. Yet the sales of vitamins seem completely immune to negative publicity. One negative clinical trial can kill a drug, but vitamins retain an aura of wellness, even as the evidence accumulates that they may not offer any meaningful health benefits. So why do so many buy supplements? As I’ve said before, vitamins are magic. Or more accurately, we believe this to be the case.
There can be many reasons for taking vitamins, but one of the most popular I hear is “insurance”, which is effectively primary prevention – taking a supplement in the absence of a confirmed deficiency or medical need, with the belief that we’re better off for taking it. A survey backs this up – 48% of respondents reported “to improve overall health” as the primary reason for taking vitamins. Yes, there is some vitamin and supplement use that is appropriate and science-based: Vitamin D deficiencies can occur, particularly in northern climates. Folic acid supplements during pregnancy can reduce the risk of neural tube defects. Vitamin B12 supplementation is often justified in the elderly. But what about in the absence of any clear medical need?
If science-based medicine reflects the application of the best evidence, then we should expect practices to change when new data emerges. In the long run that’s generally true, and the progressive gains we’ve seen in the management of disease reflect this. But in the short run, change can be maddeningly slow, and there are many areas of medicine where we could be doing a better job of applying what we already know to improve outcomes and reduce harms. One area where this is obvious is drug treatments, which can provide remarkable benefits but are also sources of significant harms.
Somewhat problematically, the real world is often the setting where the full extent of harms from treatments is identified. Bringing new drugs to market means tradeoffs: Do you demand larger and longer clinical trials to get as much information as possible about a drug before it’s sold? Or do you approve based on more preliminary, potentially weaker evidence, to meet a (potentially) important patient need? There is no set formula or right answer to this question – it’s ultimately a value judgement exercised by regulators like the FDA, who decide which drugs are allowed for sale (when the benefits are judged, overall, to exceed the harms) or removed from sale (when the opposite is felt to be the case).
Judging by the recent press reports, the latest Cochrane review reveals that everything we’ve been told about eating salt and cardiovascular disease is wrong:
The New York Times: Nostrums: Cutting Salt Has Little Effect on Heart Risk
Scientific American: It’s Time to End the War on Salt
Sometimes it’s possible to completely miss the point. And that’s what has happened here.