Real World Evidence – and Rat Poison
Why drugs hampered by practical hurdles perform better in clinical trials than they do in real life
I’ve been thinking a lot about “real world evidence” recently, and I’ve come to the conclusion that anti-VEGF drugs are essentially the warfarin of retinal disease.
Why? Warfarin is one of the classic examples of a drug that does well in clinical trials – and far less well in real life. It’s a dirty drug: it affects multiple factors in the coagulation cascade, it interacts with alcohol, many foods, and other drugs, and there are pharmacogenomic issues on top. But for many years, warfarin (like its sister vitamin K antagonists) has been the centerpiece of many thromboembolic prophylaxis strategies – principally, stroke prevention in patients with atrial fibrillation. Warfarin does stop those deadly fibrin clots from forming, after all!
But people on warfarin need regular assessment. If the INR (international normalized ratio) falls outside the narrow therapeutic range, the dose must be adjusted there and then. Insufficient anticoagulation risks thromboembolic stroke; too much risks hemorrhagic stroke. Quite a motivation to visit the anticoagulation clinic, right? Well, real life gets in the way. Appointments are missed. Dose adjustments don’t happen. And people bleed, clot, suffer and often die because of it. At least the drug is orally administered...
So why did warfarin perform so well in clinical trials against the newer, cleaner, single-target oral anticoagulants (which lack almost all of warfarin’s practical problems)? Well, every aspect of treatment is closely monitored in the trial setting (bringing it far closer to the ideal standard of care), and trial populations are carefully selected under stringent inclusion and exclusion criteria. The real world? Not so much.
When it comes to anti-VEGF drugs, the same real-world issues exist, but the monitoring is via optical coherence tomography (OCT), and dose adjustment is essentially temporal – extending or shrinking the intervals between injections. Under-dosing has obvious consequences, but over-dosing with anti-VEGF agents has cost implications, and may also have adverse effects on the retina (particularly the photoreceptors). What might it take to beat the current generation of anti-VEGF drugs (in terms of efficacy) when they are administered monthly in a clinical trial? The anti-PDGF approach looks like it’s fading away; current approaches are all about extending dosing intervals (which is great), but will these simply aim for non-inferiority to the current crop? Given the clinical trial paradigm, I wonder if that’s really the best that can be achieved.
Mark Hillen
Editor
I spent seven years as a medical writer, writing primary and review manuscripts, congress presentations and marketing materials for numerous – and mostly German – pharmaceutical companies. Prior to my adventures in medical communications, I was a Wellcome Trust PhD student at the University of Edinburgh.