The information technology industry has lived by Moore’s Law ever since 1965, when Intel co-founder Gordon Moore formulated the rule of thumb that the number of transistors that can be placed inexpensively on an integrated circuit doubles every 18 months to two years. Contrast this with pharmaceuticals. In a paper published in a recent issue of Nature Reviews Drug Discovery, the authors posited a wholly different development trajectory, which they named “Eroom’s Law” (Moore’s Law spelled backwards): the cost of developing a new drug roughly doubles every nine years.
The authors came up with Eroom’s Law by charting the ever-dwindling number of new drugs to emerge from pharmaceutical labs against the ever-increasing sums spent on discovering them. It’s a dismal ratio. Drug approvals by the US Food and Drug Administration (FDA) have plunged 40 percent since 2005, while spending on research and development almost doubled during the same period, according to a December 2011 report by consultancy Oliver Wyman (“Beyond the Shadow of a Drought”). The number of new drug approvals averaged 22 per year between 2005 and 2010, compared with 36 per year from 1996 to 2004. That’s Eroom’s Law in action.
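The divergence between the two laws is easy to see with a little arithmetic. The sketch below uses the doubling periods named above (roughly two years for Moore’s Law, nine for Eroom’s Law); the time horizon and starting point are hypothetical round numbers chosen for illustration, not figures from the paper:

```python
# Compare Moore's Law (doubles every ~2 years) with Eroom's Law
# (drug-development cost doubles every ~9 years) over the same horizon.

def growth_multiplier(years: float, doubling_period: float) -> float:
    """Multiplier after `years` of growth with a fixed doubling period."""
    return 2 ** (years / doubling_period)

HORIZON = 18  # hypothetical 18-year window, chosen for round numbers

transistors = growth_multiplier(HORIZON, 2)  # Moore's Law
drug_cost = growth_multiplier(HORIZON, 9)    # Eroom's Law

print(f"Transistors per chip over {HORIZON} years: x{transistors:.0f}")  # x512
print(f"Cost per new drug over {HORIZON} years:    x{drug_cost:.0f}")    # x4
```

Over the same 18 years, chips improve 512-fold while the cost of each new drug quadruples; run long enough, even a gentle exponential becomes an industry-defining burden.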
The pharma industry keeps casting about for solutions, some of which represent a fundamental and painful change in its business model. Big diseases like cancer, Alzheimer’s, and malaria require big science, so some companies are forming unprecedented collaborations, with each other and with public agencies. The FDA also recognizes that new drug technologies might require new methods of evaluation, and is working with industry to make the regulatory process more efficient. Meanwhile, pharma companies keep trying to reinvent themselves, or at least their research labs, with varying degrees of success.
These efforts are driven by a growing panic over the status quo. Second-quarter sales and earnings for the largest pharmaceutical companies plunged for the second consecutive quarter, with revenues down 4.5 percent compared with the same period in 2011 and earnings down 2.6 percent. In August alone, two of the leading experimental drug candidates targeting Alzheimer’s failed their clinical trials, after more than 10 years of development and hundreds of millions of R&D dollars each. The dwindling productivity isn’t just bad news for the pharma industry, however; it’s bad news for human health. Every year the US pharmaceutical industry increases its spending on R&D. In 2010 it reached $60 billion—20 percent of total R&D spending by all US companies and the largest single share of any industry. Yet the past 10 years have seen few of the major industry breakthroughs we saw in the final decades of the 20th century, such as statins for heart disease, AZT for AIDS, and Herceptin and Gleevec for cancer. Of the 35 new drugs approved by the FDA in 2011, 10 were for rare diseases, another seven added only months of survival to patients with various forms of cancer, and one was a treatment for scorpion stings. Consultancy Oliver Wyman calculated that since 2005 the value generated by a dollar invested in pharmaceutical R&D has plunged more than 70 percent.
The new drug drought was highlighted in January by Derek Lowe, a pharmaceutical scientist who writes the influential blog In the Pipeline. He asked his readers to name the most worthwhile new drug that had been introduced since 1990. Of the many candidates nominated, the vast majority were brought to market in the first half of that 20-year span.
One reason for the industry’s meager R&D productivity is the sheer complexity of the human body, argue four analysts at Sanford C. Bernstein, led by Jack W. Scannell. In their article in Nature Reviews Drug Discovery, “Diagnosing the Decline in Pharmaceutical R&D Efficiency,” they examined R&D projects for more than 28,000 compounds investigated since 1990. During that 20-year period the pharma industry increasingly concentrated its R&D investments on drugs that address unmet therapeutic needs and untargeted biological mechanisms—areas where the need is great but the risk of failure highest. This is the widely held “low-hanging fruit” theory of the drug drought: the easier disease targets, such as high cholesterol, asthmatic airway passages, migraines, and ulcerous digestive systems, have already been addressed. Complex diseases such as cancer and neurodegenerative conditions are much harder to solve.
But Scannell and his colleagues also laid out four additional, interlocking arguments that may explain the decline in R&D output:
- The ‘better than the Beatles’ problem: Imagine how hard it would be to come up with a successful pop song if any new song had to be better than the Beatles. Unlike cars or electronics, with drugs there’s no interest in novelty for its own sake. And there’s no point in creating something that’s only just as good as what’s already available, especially since today’s hit drug is tomorrow’s inexpensive generic.
- The ‘cautious regulator’ problem: The progressive lowering of risk tolerance, particularly after the pain treatment Vioxx was removed from the market in 2004 for safety reasons, raises the bar on safety for new drugs, which makes R&D both costlier and harder.
- The ‘throw money at it’ tendency: The tendency to keep pouring more money and resources into a research project or a widely held theory until something sticks. It could also be called throwing good money after bad.
- The ‘basic research–brute force’ bias: The industry’s tendency to overestimate the probability that advances in basic research and large-scale screening processes will show a molecule to be safe and effective in clinical trials.
The last of those issues has garnered a lot of discussion among pharma scientists, because it calls into question the focus over the past 15 years or so on targeted drugs and personalized medicine. Pharmaceutical scientists have long believed that a single molecule can be tailored to exquisitely target the cellular flaw behind an individual’s disease, knocking it out without harming healthy cells, a theory backed up by numerous test-tube and animal experiments. Consequently, considerable R&D resources have been devoted to screening innumerable compounds to find one that might block just one disease-causing cellular protein or genetic flaw. Cancer research is almost totally devoted to the search for such targeted therapies, with scientists positing that by screening a slice of tumor tissue or a patient’s blood, a drug can be deployed that is most effective against the cancer’s specific genetic fingerprint. But targeted therapies that look promising in the lab often fail in clinical trials, stymied by complex human biology.
Scannell and his fellow authors throw cold water on the personalized medicine theory by pointing out that despite the shift to targeted drugs and high-tech screening tools, the probability that a small-molecule drug will successfully complete clinical trials has remained almost constant for the past 50 years. And those treatments that do succeed can cost patients and insurers hundreds of thousands of dollars per year, because by definition they only work on the small number of patients who have the cellular target. Physicians who prescribe drugs and the scientists who invent them are increasingly embracing a more nuanced view of drug discovery: the idea that most diseases require a combination of targeted drugs, often called a cocktail, to be held in check. The cocktail approach proved effective against AIDS, and medical experts believe the same approach may be necessary for cancer, Alzheimer’s, and a range of other diseases.
The problem with cocktails, however, is that it can be difficult if not impossible for two different companies to test experimental drugs in concert, for both competitive and safety reasons. Companies are beginning to overcome those competitive challenges, though, and to collaborate on some of the most difficult problems in medicine, most notably Alzheimer’s disease, the only one of the top 10 causes of death in the US with no known cause, cure, or even a way of slowing its progression. In 2004 the National Institutes of Health, the FDA, and 20 drug companies joined forces to start the Alzheimer’s Disease Neuroimaging Initiative (ADNI), a landmark public-private partnership tasked with mapping all the biological markers connected to Alzheimer’s. The ADNI’s defining principle is to publicly share and relinquish ownership of all data and findings as soon as possible. More than 57 sites are collecting data from thousands of patients, and the results to date have already been incorporated into research and trials by pharmaceutical companies.
In 2010 a second consortium of companies and academic researchers, organized by the Banner Alzheimer’s Institute in Arizona, formed the Alzheimer’s Disease Initiative to study an extended family in Colombia, some 600 of whose members carry a rare genetic mutation that makes them highly susceptible to developing early Alzheimer’s. The first prevention clinical trials are due to start in 2013. Alzheimer’s drug development got an added boost on May 15, when President Obama announced a $100 million increase in funding to research the disease—the kind of big science needed to fight big diseases.
Similar collaborations are emerging in other disease areas. GlaxoSmithKline, AstraZeneca, and the Innovative Medicines Initiative recently announced a partnership to develop new antibiotics, named NewDrugs4BadBugs. Although new antibiotics are desperately needed because of the emergence of drug-resistant superbugs in hospitals and elsewhere in the world, these drugs are rarely lucrative, making it difficult for any one company to earn a return on the huge investment required.
Scannell and his co-writers suggest that each drug company should create a Chief Dead Drug Officer (CDDO) responsible for figuring out the reasons behind a drug failure at every stage of the R&D process, and for publishing the results in a scientific journal. Today, companies rarely publish the results of failed clinical trials or experiments. Consequently, scientists keep trying to reinvent the same broken wheel.
It seems unlikely a drug company will appoint a dead drug officer anytime soon, but the idea does highlight an important point: from failure can come valuable lessons. Pharmaceutical companies know their R&D process needs to change, but it is hard for billion-dollar enterprises with proud histories of life-saving drug discoveries to change their ways, and to admit failure.
Collaboration and greater sharing of information in the early stages of drug development might make a difference—as might new thinking altogether. For example, biophysicists and cancer researchers at Rice University, Tel Aviv University, and Johns Hopkins University just announced a new joint development strategy for cancer drugs that they labeled a “cyber-war.” Their plan is to decipher and then destroy the biological “social networks” that cancer cells use to communicate with each other, signaling one another to mount a coordinated defense against chemotherapies, much as bacteria act as a team to resist antibiotics. It may sound unorthodox, but such new thinking is what this field needs.
It is unlikely drug development will ever see the equivalent of Moore’s Law—the human body is far too complex to lend itself to rapid solutions. But by applying fresh thinking and borrowing concepts from other fields such as social networking, pharmaceutical companies may at least begin to reverse the gloomy math of Eroom’s Law. Humankind’s well-being demands that they try.