Can the gene and cell therapy revolution scale up?

May 31, 2018

As innovative gene and cell therapies continue to make the transition from the laboratory to the clinic, they are bringing with them the promise of truly personalised medicine. The last few years have seen the regulatory approval of the first gene therapies that take a patient’s own immune cells and genetically engineer them to target cancer cells more effectively.

These chimeric antigen receptor T-cell (CAR-T) therapies now represent a rapidly growing field. Novartis’s Kymriah, the first CAR-T therapy approved by the US Food and Drug Administration (FDA), in August 2017 for the treatment of a rare blood cancer, is seen as the tip of the iceberg for the class’s potential. Approval of Kite Pharma’s Yescarta, a CAR-T treatment for certain forms of non-Hodgkin lymphoma, followed just a few months later.

Transformative potential

“This has been utterly transformative in blood cancers,” Dr Stephan Grupp, director of cancer immunotherapy at the Children’s Hospital of Philadelphia, which collaborated with Novartis on Kymriah’s development, told the New York Times. “If it can start to work in solid tumours, it will be utterly transformative for the whole field.”

CAR-T, as well as other cell and gene therapies – such as Spark Therapeutics’ Luxturna, a gene therapy for inherited vision loss that was approved by the FDA in December – are offering the prospect of step changes in the treatment of genetic diseases and some of the deadliest forms of cancer.

“The cellular immunotherapies tend to be marketed for various types of cancer; these cause fewer side effects than traditional chemotherapies and as a result can be used in combination with other treatments in typically older patients, who can struggle to cope with drug-associated toxicity,” says PharmSource healthcare analyst Adam Bradbury. “Cellular immunotherapies will also be used in refractory cancers, which have become resistant to initial therapies.”

The regulatory landscape is also encouraging for gene and cell therapies; last year the FDA issued new guidelines to accelerate the assessment and approval of cell treatment and gene therapy, and the European Medicines Agency continues to focus on the area, publishing an action plan to foster the development of advanced treatments including gene therapy and somatic cell therapy.

“I believe gene therapy will become a mainstay in treating, and maybe curing, many of our most devastating and intractable illnesses,” said FDA commissioner Dr Scott Gottlieb after Luxturna’s approval.

The viral backlog

While the long-term transformative potential of gene and cell therapies is becoming increasingly clear, it is equally obvious that bringing the cutting-edge of personalised medicine to patients comes with no shortage of roadblocks. While traditional small molecule drugs and even complex biologics can be produced at large scales, cell and gene therapies require a new level of customisability and manufacturing expertise.

Although the cell and gene therapies that have so far been introduced to the market are indicated for rare diseases with small patient populations, and thus only require relatively small-scale manufacturing, the early successes of CAR-T therapies and the exploding pharma and biotech interest in cell and gene therapies stress the need for a rapid capacity expansion to support clinical research and commercial-scale production.

Viral vectors of various kinds – the most common being lentiviral and adenoviral vectors – are used in the production of many cell and gene therapies. These disabled viruses encase the genetic material to be introduced to the target cells in the patient; the harmless viral vectors essentially infect the relevant cells to deliver the therapy. Worryingly, a significant backlog in viral vector supply is already building up for gene and cell therapy developers.

“As more related biologics have been approved and researched in recent years, the demand for viral vectors has increased,” says Bradbury. “Particularly following on from the clinical trial success of CAR-T cell therapies, more pharma and biotech companies are seeking to enter the market. The manufacturing process to produce viral vectors is complex, costly and highly regulated. There is a shortage of both related manufacturing facilities and appropriately qualified staff, which has meant that demand has outstripped supply and will continue to do so.”

As contract manufacturing organisations (CMOs) struggle to build capacity and expertise in the viral vector production that forms the basis for many gene and cell therapies, Bradbury notes that there is currently an average wait time of 16 months for CMOs to start new projects, even at the smaller clinical scale. Scaling up capacity is incredibly difficult and costly; the need for Good Manufacturing Practice (GMP) facilities to grow cells while ensuring vector sterility and purity means that the regulatory burden is high.

“The cost of constructing the viral vector manufacturing facilities is prohibitively expensive, in the range of hundreds of millions of dollars,” Bradbury says. “On the regulatory front, there is difficulty establishing all aspects of GMP at early phases of clinical trials. Virus manufacture can be considered as more problematic than that of mAbs [monoclonal antibodies] and requires cryopreservation at a far lower temperature than most biologics.”

Boosting capacity, cutting costs

Almost a year and a half is a long time to wait to kick off production for a clinical trial or research project, let alone commercial-scale manufacturing, and Bradbury says the backlog is likely to increase in the short term. While many large pharma firms will have the financial clout to build or acquire their own production facilities to support gene and cell therapy programmes, those that rely on external contractors will be hit hardest.

“Smaller and medium-sized companies will be affected most by a lack of CMOs involved with cell and gene therapy manufacture,” says Bradbury. “I expect both clinical and commercial manufacture to be affected; the demand is likely to drive up prices for CMO services, which in turn will affect institutions conducting clinical research that may not have the budget to run trials.”

With the capacity crunch in full effect, developers large and small have been scrambling to secure their viral supply chain. For the production of Kymriah, Novartis partnered with UK-based gene and cell therapy specialist Oxford BioMedica back in 2013, giving Novartis access to the company’s LentiVector delivery platform, as well as its facilities and expertise.

Smaller biotechs have been spreading their bets to help ensure a steady flow of viral vectors. Bluebird Bio has adopted this kitchen-sink strategy to fuel its ambitious pipeline of experimental gene therapies for genetic diseases and cancers, led by Lenti-D for cerebral adrenoleukodystrophy and LentiGlobin for the blood disorder beta thalassaemia. On one hand, Bluebird has struck multi-year manufacturing agreements with Brammer Bio, MilliporeSigma and Belgium-based Novasep. On the other, in late 2017 the company announced that it had spent $11.5m to acquire a facility of its own in Durham, North Carolina, which it will convert into a production site for lentiviral vectors.

CMOs are also working to increase capacity, with Brammer Bio doubling its capacity in recent years, investing $50m last year alone. Shanghai-headquartered CMO WuXi AppTec opened a 150,000ft² cell and gene therapy manufacturing centre in Philadelphia, while Swiss biopharma giant Lonza is making a play to lead the space. In April, the company opened a 300,000ft² cell and gene therapy plant near Houston, Texas, the largest facility of its kind in the world. The plant will complement the company’s existing cell and gene therapy hubs in New Hampshire, the Netherlands and Singapore.

As well as increasing capacity, sustained investment in these facilities, and their underlying processes, will help address the manufacturing challenges that make these one-time treatments so expensive. Novartis’s Kymriah currently costs an eye-watering $475,000 per treatment, and as these therapies begin to target organs with larger surface areas, necessitating larger cell batches, costs at the current rate would rise to as much as $3m per patient, Oxford BioMedica chief executive John Dawson told the New York Times last year. As production processes mature and manufacturers start embracing automation, these costs will come down, making treatments affordable for health systems and commercially viable for developers.
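
A quick back-of-envelope check on those figures (our own sketch, not from the article, and it assumes per-patient cost scales roughly linearly with the required cell batch size):

    # Back-of-envelope only: the article gives no cost model, so linear
    # scaling of cost with required cell batch size is an assumption.
    current_cost = 475_000       # Kymriah price per treatment (USD)
    projected_cost = 3_000_000   # Dawson's upper estimate per patient (USD)

    implied_multiple = projected_cost / current_cost
    print(f"Implied scale-up in batch size/cost: {implied_multiple:.1f}x")
    # -> about 6.3x, which is the gap that automation and process
    #    maturity would need to close for larger-organ therapies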

“There is substantial scope to improve the manufacturing process,” Bradbury comments. “As a relatively novel treatment and one which is complex and costly to manufacture, there are significant issues to resolve to improve the commercial viability of a cell therapy. Quality control testing still has plenty of scope for optimisation. Cell therapy production must become automated, which should also increase manufacturing scale for commercial production. Viral vectors must also be more readily manufactured and available.”

This article was originally published by: https://www.pharmaceutical-technology.com/features/can-gene-cell-therapy-revolution-scale/

Cancer ‘vaccine’ eliminates tumors in mice

May 17, 2018

Ronald Levy (left) and Idit Sagiv-Barfi led the work on a possible cancer treatment that involves injecting two immune-stimulating agents directly into solid tumors. Steve Fisch

Injecting minute amounts of two immune-stimulating agents directly into solid tumors in mice can eliminate all traces of cancer in the animals, including distant, untreated metastases, according to a study by researchers at the Stanford University School of Medicine.

The approach works for many different types of cancers, including those that arise spontaneously, the study found.

The researchers believe the local application of very small amounts of the agents could serve as a rapid and relatively inexpensive cancer therapy that is unlikely to cause the adverse side effects often seen with bodywide immune stimulation.

“When we use these two agents together, we see the elimination of tumors all over the body,” said Ronald Levy, MD, professor of oncology. “This approach bypasses the need to identify tumor-specific immune targets and doesn’t require wholesale activation of the immune system or customization of a patient’s immune cells.”

One agent is already approved for use in humans; the other has been tested for human use in several unrelated clinical trials. A clinical trial was launched in January to test the effect of the treatment in patients with lymphoma. (Information about the trial is available online.)

Levy, who holds the Robert K. and Helen K. Summy Professorship in the School of Medicine, is the senior author of the study, which was published Jan. 31 in Science Translational Medicine. Instructor of medicine Idit Sagiv-Barfi, PhD, is the lead author.

‘Amazing, bodywide effects’

Levy is a pioneer in the field of cancer immunotherapy, in which researchers try to harness the immune system to combat cancer. Research in his laboratory led to the development of rituximab, one of the first monoclonal antibodies approved for use as an anti-cancer treatment in humans.

Some immunotherapy approaches rely on stimulating the immune system throughout the body. Others target naturally occurring checkpoints that limit the anti-cancer activity of immune cells. Still others, like the CAR T-cell therapy recently approved to treat some types of leukemia and lymphomas, require a patient’s immune cells to be removed from the body and genetically engineered to attack the tumor cells. Many of these approaches have been successful, but they each have downsides — from difficult-to-handle side effects to high-cost and lengthy preparation or treatment times.

“All of these immunotherapy advances are changing medical practice,” Levy said. “Our approach uses a one-time application of very small amounts of two agents to stimulate the immune cells only within the tumor itself. In the mice, we saw amazing, bodywide effects, including the elimination of tumors all over the animal.”

Cancers often exist in a strange kind of limbo with regard to the immune system. Immune cells like T cells recognize the abnormal proteins often present on cancer cells and infiltrate to attack the tumor. However, as the tumor grows, it often devises ways to suppress the activity of the T cells.

Levy’s method works to reactivate the cancer-specific T cells by injecting microgram amounts of two agents directly into the tumor site. (A microgram is one-millionth of a gram). One, a short stretch of DNA called a CpG oligonucleotide, works with other nearby immune cells to amplify the expression of an activating receptor called OX40 on the surface of the T cells. The other, an antibody that binds to OX40, activates the T cells to lead the charge against the cancer cells. Because the two agents are injected directly into the tumor, only T cells that have infiltrated it are activated. In effect, these T cells are “prescreened” by the body to recognize only cancer-specific proteins.

Cancer-destroying rangers

Some of these tumor-specific, activated T cells then leave the original tumor to find and destroy other identical tumors throughout the body.

The approach worked startlingly well in laboratory mice with transplanted mouse lymphoma tumors in two sites on their bodies. Injecting one tumor site with the two agents caused the regression not just of the treated tumor, but also of the second, untreated tumor. In this way, 87 of 90 mice were cured of the cancer. Although the cancer recurred in three of the mice, the tumors again regressed after a second treatment. The researchers saw similar results in mice bearing breast, colon and melanoma tumors.

Mice genetically engineered to spontaneously develop breast cancers in all 10 of their mammary pads also responded to the treatment. Treating the first tumor that arose often prevented the occurrence of future tumors and significantly increased the animals’ life span, the researchers found.

Finally, Sagiv-Barfi explored the specificity of the T cells by transplanting two types of tumors into the mice. She transplanted the same lymphoma cancer cells in two locations, and she transplanted a colon cancer cell line in a third location. Treatment of one of the lymphoma sites caused the regression of both lymphoma tumors but did not affect the growth of the colon cancer cells.

“This is a very targeted approach,” Levy said. “Only the tumor that shares the protein targets displayed by the treated site is affected. We’re attacking specific targets without having to identify exactly what proteins the T cells are recognizing.”

The current clinical trial is expected to recruit about 15 patients with low-grade lymphoma. If successful, Levy believes the treatment could be useful for many tumor types. He envisions a future in which clinicians inject the two agents into solid tumors in humans prior to surgical removal of the cancer as a way to prevent recurrence due to unidentified metastases or lingering cancer cells, or even to head off the development of future tumors that arise due to mutations in genes such as BRCA1 and BRCA2.

“I don’t think there’s a limit to the type of tumor we could potentially treat, as long as it has been infiltrated by the immune system,” Levy said.

The work is an example of Stanford Medicine’s focus on precision health, the goal of which is to anticipate and prevent disease in the healthy and precisely diagnose and treat disease in the ill.

The study’s other Stanford co-authors are senior research assistant and lab manager Debra Czerwinski; professor of medicine Shoshana Levy, PhD; postdoctoral scholar Israt Alam, PhD; graduate student Aaron Mayer; and professor of radiology Sanjiv Gambhir, MD, PhD.

Levy is a member of the Stanford Cancer Institute and Stanford Bio-X.

Gambhir is the founder and equity holder in CellSight Inc., which develops and translates multimodality strategies to image cell trafficking and transplantation.

The research was supported by the National Institutes of Health (grant CA188005), the Leukemia and Lymphoma Society, the Boaz and Varda Dotan Foundation and the Phil N. Allen Foundation.

Stanford’s Department of Medicine also supported the work.

This article was originally published by: http://med.stanford.edu/news/all-news/2018/01/cancer-vaccine-eliminates-tumors-in-mice.html

Longevity industry systematized for first time

April 02, 2018

UK aging research foundation produces roadmap for the emerging longevity industry in a series of reports to be published throughout the year

The Biogerontology Research Foundation has embarked on a year-long mission to summarise in a single document the various emerging technologies and industries which can be brought to bear on aging, healthy longevity, and everything in between, as part of a joint project by the Global Longevity Consortium, consisting of the Biogerontology Research Foundation, Deep Knowledge Life Sciences, the Aging Analytics Agency and the Longevity.International platform.

GLOBAL LONGEVITY SCIENCE LANDSCAPE 2017

For scientists, policy makers, regulators, government officials, investors and other stakeholders, a consensus understanding of the field of human longevity remains fragmented: it has yet to be systematized by any coherent framework, and no analytical agency has yet profiled the field and industry as a whole in a comprehensive report. The consortium behind this series of reports hopes that they will come to be used as a sort of Encyclopedia Britannica and specialized Wikipedia of the emerging longevity industry, serving as the foundation upon which the first global framework of the industry will be built, given the significant industry growth projected over the coming years.

Experts on the subject of human longevity, who tend to arrive at the subject from disparate fields, have failed even to agree on a likely order of magnitude for future human lifespan. Those who foresee a 100-year average in the near future are considered extreme optimists by some, while others have even mooted the possibility of indefinite life extension through comprehensive repair and maintenance. As such, the longevity industry has often defied real understanding and has proved a complex and abstract topic in the minds of many, investors and governments in particular.

The first of these landmark reports, entitled ‘The Science of Longevity’, standing at almost 800 pages in length, seeks to rectify this.

Part 1 of the report ties together the progress threads of the constituent industries into a coherent narrative, mapping the intersection of biomedical gerontology, regenerative medicine, precision medicine and artificial intelligence, and offering a brief history and snapshot of each. Part 2 lists and individually profiles 650 longevity-focused entities, including research hubs, non-profit organizations, leading scientists, conferences, databases, books and journals. Infographics are used to illustrate where research institutions stand in relation to each other with regard to their disruptive potential: companies and institutions specialising in palliative technologies are placed at the periphery of circular diagrams, whereas those involved with more comprehensive, preventative interventions, such as rejuvenation biotechnologies and gene therapies, are depicted as central.

In this report great care was taken to visualize the complex and interconnected landscape of this field via state-of-the-art infographics so as to distill the many players, scientific subsectors and technologies within the field of geroscience into common understanding. The hope was to create a comprehensive yet readily understandable view of the entire field and its many players, serving a function similar to that of Mendeleev’s periodic table for the field of chemistry. While these are static infographics in the reports, their creators plan to create complementary online versions that are interactive and filterable, and to convene a series of experts to analyze these infographics and continually update them as the geroscience landscape shifts. Similar strategies are employed in Volume II to illustrate the many companies and investors within the longevity industry.

These reports currently profile the top 100 entities in each of the categories, but in producing them, analysts found that the majority of these categories have significantly more than 100 entities associated with them. One of their main conclusions upon finishing the report is that the longevity industry is indeed of substantial size, with many industry and academic players, but that it remains relatively fragmented, lacking a sufficient degree of inter-organization collaboration and industry-academic partnerships. The group plans to expand these lists in follow-up volumes so as to give a more comprehensive overview of the individual companies, investors, books, journals, conferences and scientists that serve as the foundation of this emerging industry.

Since these reports are being spearheaded by the UK’s oldest biomedical charity focused on healthspan extension, the Biogerontology Research Foundation is publishing them online, freely available to the public. While the main focus of this series of reports is an analytical report on the emerging longevity industry, the reports still delve deeply into the science of longevity, and Volume I is dedicated exclusively to an overview of the history, present and future state of ageing research from a scientific perspective.

The consortium of organizations behind these reports anticipates them to be the first comprehensive analytical reports on the emerging longevity industry to date. The aim is to increase awareness and interest in both the longevity industry and geroscience proper among investors, scientists, medical personnel, regulators, policy makers, government officials and the public at large. To that end, the reports distill the complex network of knowledge underlying the industry and field into easily and intuitively comprehensible infographics, while at the same time providing a comprehensive backbone of chapters and profiles on the various companies, investors, organizations, labs, institutions, books, journals and conferences for those inclined to take a deeper dive into the vast foundation of the longevity industry and the field of geroscience.

It is hoped that this report will assist others in visualising the present longevity landscape and elucidate the various industry players and components. Volume 2, ‘The Business of Longevity’, which at approximately 500 pages aims to be as comprehensive as Volume 1, is set to be published shortly thereafter. It will focus on the companies and investors working in the field of precision preventive medicine with a focus on healthy longevity, a sector that will need to grow quickly if the impending crisis of global aging demographics is to be averted.

These reports will be followed up throughout the coming year with Volume 3 (“Special Case Studies”), featuring 10 special case studies on specific longevity industry sectors, such as cell therapies, gene therapies, AI for biomarkers of aging, and more, Volume 4 (“Novel Longevity Financial System”), profiling how various corporations, pension funds, investment funds and governments will cooperate within the next decade to avoid the crisis of demographic aging, and Volume 5 (“Region Case Studies”), profiling the longevity industry in specific geographic regions.

These reports are, however, only the beginning, and will ultimately serve as a launching pad for an even more ambitious project: Longevity.International, an online platform that will house these reports and serve as a virtual ecosystem for uniting and incentivizing the many fragmented stakeholders of the longevity industry, including scientists, entrepreneurs, investors, policy makers, regulators and government officials, around the common goals of healthspan extension and aversion of the looming demographic aging and ‘Silver Tsunami’ crisis. The platform will use knowledge crowdsourcing among top-tier experts to connect scientists with entrepreneurs, entrepreneurs with investors, and investors with policy makers and regulators, so that all stakeholders can aggregate and integrate one another’s intelligence and expertise using modern IT technologies for knowledge platforms, and be rewarded for their services.

THE SCIENCE OF PROGRESSIVE MEDICINE 2017 LANDSCAPE

The consortium behind these reports is interested in collaboration with contributors, institutional partners and scientific reviewers who can assist with the ongoing production of these reports, enhance their outreach capabilities and ultimately increase their overall impact upon the scientific and business communities operating within the longevity industry. It can be reached at info@longevity.international

This article was originally published by:
http://bg-rf.org.uk/press/longevity-industry-systematized-for-first-time

What Does It Cost to Create a Cancer Drug? Less Than You’d Think

October 18, 2017

What does it really cost to bring a drug to market?

The question is central to the debate over rising health care costs and appropriate drug pricing. President Trump campaigned on promises to lower the costs of drugs.

But numbers have been hard to come by. For years, the standard figure has been supplied by researchers at the Tufts Center for the Study of Drug Development: $2.7 billion each, in 2017 dollars.

Yet a new study looking at 10 cancer medications, among the most expensive of new drugs, has arrived at a much lower figure: a median cost of $757 million per drug. (Half cost less, and half more.)

Following approval, the 10 drugs together brought in $67 billion, the researchers also concluded — a more than sevenfold return on investment. Nine out of 10 companies made money, but revenues varied enormously. One drug had not yet earned back its development costs.

The study, published Monday in JAMA Internal Medicine, relied on company filings with the Securities and Exchange Commission to determine research and development costs.

“It seems like they have done a thoughtful and rigorous job,” said Dr. Aaron Kesselheim, director of the program on regulation, therapeutics and the law at Brigham and Women’s Hospital.

“It provides at least something of a reality check,” he added.

The figures were met with swift criticism, however, by other experts and by representatives of the biotech industry, who said that the research did not adequately take into account the costs of the many experimental drugs that fail.

“It’s a bit like saying it’s a good business to go out and buy winning lottery tickets,” Daniel Seaton, a spokesman for the Biotechnology Innovation Organization, said in an email.

Dr. Jerry Avorn, chief of the division of pharmacoepidemiology and pharmacoeconomics at Brigham and Women’s Hospital, predicted that the paper would help fuel the debate over the prices of cancer drugs, which have soared so high “that we are getting into areas that are almost unimaginable economically,” he said.

A leukemia treatment approved recently by the Food and Drug Administration, for example, will cost $475,000 for a single treatment. It is the first of a wave of gene therapy treatments likely to carry staggering price tags.

“This is an important brick in the wall of this developing concern,” he said.

Dr. Vinay Prasad, an oncologist at Oregon Health and Science University, and Dr. Sham Mailankody, of Memorial Sloan Kettering Cancer Center, arrived at their figures after reviewing data on 10 companies that brought a cancer drug to market in the past decade.

Since the companies also were developing other drugs that did not receive approval from the F.D.A., the researchers were able to include the companies’ total spending on research and development, not just what they spent on the drugs that succeeded.

One striking example was ibrutinib, made by Pharmacyclics. It was approved in 2013 for patients with certain blood cancers who did not respond to conventional therapy.

Ibrutinib was the only one of the four drugs the company was developing to receive F.D.A. approval. The company’s research and development costs for its four drugs were $388 million, its S.E.C. filings indicated.

The drug ibrutinib was developed to treat chronic lymphocytic leukemia, shown here in a CT reconstruction of a patient’s neck. The manufacturer’s return on investment was quite high, according to a new study. Credit LLC/Science Source

After the drug was approved, AbbVie acquired its manufacturer, Pharmacyclics, for $21 billion. “That is a 50-fold difference between revenue post-approval and cost to develop,” Dr. Prasad said.
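
Both headline ratios can be checked from the figures quoted in this article alone (a rough sketch; the study’s exact totals are not reproduced here, so the second number is only an implied bound):

    # Sanity-checking the article's ratios against its own figures.
    abbvie_deal = 21_000_000_000     # AbbVie's acquisition of Pharmacyclics (USD)
    pharmacyclics_rnd = 388_000_000  # R&D across its four drugs (USD)
    print(f"Ibrutinib multiple: {abbvie_deal / pharmacyclics_rnd:.0f}x")
    # -> ~54x, in line with Dr. Prasad's "50-fold difference"

    revenue = 67_000_000_000  # combined post-approval revenue of the 10 drugs
    # "More than sevenfold return" implies total R&D spending below:
    print(f"Implied total R&D spend: under ${revenue / 7 / 1e9:.1f}bn")
    # -> under roughly $9.6bn across the ten companies combined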

Accurate figures on drug development are difficult to find and often disputed. Although it is widely cited, the Tufts study also was fiercely criticized.

One objection was that the researchers, led by Joseph A. DiMasi, did not disclose the companies’ data on development costs. The study involved ten large companies, which were not named, and 106 investigational drugs, also not named.

But Dr. DiMasi found the new study “irredeemably flawed at a fundamental level.”

“The sample consists of relatively small companies that have gotten only one drug approved, with few other drugs of any type in development,” he said. The result is “substantial selection bias,” meaning that the estimates do not accurately reflect the industry as a whole.

Ninety-five percent of cancer drugs that enter clinical trials fail, said Mr. Seaton, of the biotech industry group. “The small handful of successful drugs — those looked at by this paper — must be profitable enough to finance all of the many failures this analysis leaves unexamined.”

“When the rare event occurs that a company does win approval,” he added, “the reward must be commensurate with taking on the multiple levels of risk not seen in any other industry if drug development is to remain economically viable for prospective investors.”

Cancer drugs remain among the most expensive medications, with prices reaching the hundreds of thousands of dollars per patient.

Although the new study was small, its estimates are so much lower than previous figures, and the return on investment so great, that experts say they raise questions about whether soaring drug prices really are needed to encourage investment.

“That seems hard to swallow when they make seven times what they invested in the first four years,” Dr. Prasad said.

The new study has limitations, noted Patricia Danzon, an economist at the University of Pennsylvania’s Wharton School.

It involved just ten small biotech companies whose cancer drugs were aimed at limited groups of patients with less common diseases.

For such drugs, the F.D.A. often permits clinical trials to be very small and sometimes without control groups. Therefore development costs may have been lower for this group than for drugs that require longer and larger studies.

But, Dr. Danzon said, most new cancer drugs today are developed this way: by small companies and for small groups of patients. The companies often license or sell successful drugs to the larger companies.

The new study, she said, “is shining a light on a sector of the industry that is becoming important now.” The evidence, she added, is “irrefutable” that the cost of research and development “is small relative to the revenues.”

When it comes to drug prices, it does not matter what companies spend on research and development, Dr. Kesselheim said.

“They are based on what the market will bear.”

Correction: September 14, 2017
An earlier version of this article incorrectly identified the company that acquired a drug maker. It was AbbVie, not Janssen Biotech (which jointly develops the drug). Additionally, the article incorrectly described what AbbVie acquired. It was the company Pharmacyclics, which developed the drug Imbruvica, not the drug itself.

Original source: https://www.nytimes.com/2017/09/11/health/cancer-drug-costs.html

3D ‘body-on-a-chip’ project aims to accelerate drug testing, reduce costs

October 18, 2017

A team of scientists at Wake Forest Institute for Regenerative Medicine and nine other institutions has engineered miniature 3D human hearts, lungs, and livers to achieve more realistic testing of how the human body responds to new drugs.

The “body-on-a-chip” project, funded by the Defense Threat Reduction Agency, aims to help reduce the estimated $2 billion cost and 90 percent failure rate that pharmaceutical companies face when developing new medications. The research is described in an open-access paper in Scientific Reports, published by Nature.

Using the same expertise they’ve employed to build new organs for patients, the researchers connected together micro-sized 3D liver, heart, and lung organs-on-a-chip (or “organoids”) on a single platform to monitor their function. They selected heart and liver for the system because toxicity to these organs is a major reason for drug candidate failures and drug recalls. And lungs were selected because they’re the point of entry for toxic particles and for aerosol drugs such as asthma inhalers.

The integrated three-tissue organ-on-a-chip platform combines liver, heart, and lung organoids. (Top) Liver and cardiac modules are created by bioprinting spherical organoids using customized bioinks, resulting in 3D hydrogel constructs (upper left) that are placed into the microreactor devices. (Bottom) Lung modules are formed by creating layers of cells over porous membranes within microfluidic devices. TEER (trans-endothelial [or epithelial] electrical resistance) sensors allow for monitoring tissue barrier integrity over time. The three organoids are placed in a sealed, monitored system with a real-time camera. A nutrient-filled liquid that circulates through the system keeps the organoids alive and is used to introduce potential drug therapies into the system. (credit: Aleksander Skardal et al./Scientific Reports)

Why current drug testing fails

Drug compounds are currently screened in the lab using human cells and then tested in animals. But these methods don’t adequately replicate how drugs affect human organs. “If you screen a drug in livers only, for example, you’re never going to see a potential side effect to other organs,” said Aleks Skardal, Ph.D., assistant professor at Wake Forest Institute for Regenerative Medicine and lead author of the paper.

In many cases during testing of new drug candidates — and sometimes even after the drugs have been approved for use — drugs also have unexpected toxic effects in tissues not directly targeted by the drugs themselves, he explained. “By using a multi-tissue organ-on-a-chip system, you can hopefully identify toxic side effects early in the drug development process, which could save lives as well as millions of dollars.”

“There is an urgent need for improved systems to accurately predict the effects of drugs, chemicals and biological agents on the human body,” said Anthony Atala, M.D., director of the institute and senior researcher on the multi-institution study. “The data show a significant toxic response to the drug as well as mitigation by the treatment, accurately reflecting the responses seen in human patients.”

Advanced drug screening, personalized medicine

The scientists conducted multiple scenarios to ensure that the body-on-a-chip system mimics a multi-organ response.

For example, they introduced a drug used to treat cancer into the system. Known to cause scarring of the lungs, the drug also unexpectedly affected the system’s heart. (A control experiment using only the heart failed to show a response.) The scientists theorize that the drug caused inflammatory proteins from the lung to be circulated throughout the system. As a result, the heart increased beats and then later stopped altogether, indicating a toxic side effect.

“This was completely unexpected, but it’s the type of side effect that can be discovered with this system in the drug development pipeline,” Skardal noted.
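
The causal chain Skardal describes (drug circulating in the shared medium, inflammatory proteins released by the lung, a downstream cardiac response) is exactly what a multi-tissue platform can expose and a single-organ test cannot. The toy simulation below is our own illustration of such coupling, with entirely invented parameters; it is not the model used in the study:

    # Toy two-compartment sketch of inter-organ signalling on a chip.
    # All parameters are invented; this only illustrates the coupling idea.
    dt = 0.1           # hours per simulation step
    drug = 1.0         # drug level in the shared circulating medium
    cytokine = 0.0     # inflammatory signal released by the lung module
    heart_rate = 60.0  # cardiac organoid beat rate (bpm)
    failed = False     # latches once a toxicity threshold is crossed

    for _ in range(int(48 / dt)):                        # 48 simulated hours
        drug *= 1 - 0.02 * dt                            # slow drug clearance
        cytokine += (0.5 * drug - 0.1 * cytokine) * dt   # lung source, decay
        if cytokine >= 2.5:                              # made-up threshold
            failed = True                                # damage not reversible
        heart_rate = 0.0 if failed else 60 + 40 * cytokine

    print(f"after 48 h: cytokine={cytokine:.2f}, heart rate={heart_rate:.0f} bpm")
    # -> the heart first races as cytokine rises, then stops for good,
    #    mirroring the "increased beats, then stopped" observation above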

Test of “liver on a chip” response to two drugs to demonstrate clinical relevance. Liver construct toxicity response was assessed following exposure to acetaminophen (APAP) and the clinically-used APAP countermeasure N-acetyl-L-cysteine (NAC). Liver constructs in the fluidic system (left) were treated with no drug (b), 1 mM APAP (c), and 10 mM APAP (d) — showing progressive loss of function and cell death, compared to 10 mM APAP + 20 mM NAC (e), which mitigated those negative effects. The data show both a significant cytotoxic (cell-damage) response to APAP and its mitigation by NAC treatment — accurately reflecting the clinical responses seen in human patients. (credit: Aleksander Skardal et al./Scientific Reports)

The scientists are now working to increase the speed of the system for large scale screening and add additional organs.

“Eventually, we expect to demonstrate the utility of a body-on-a-chip system containing many of the key functional organs in the human body,” said Atala. “This system has the potential for advanced drug screening and also to be used in personalized medicine — to help predict an individual patient’s response to treatment.”

Several patent applications comprising the technology described in the paper have been filed.

The international collaboration included researchers at Wake Forest Institute for Regenerative Medicine at the Wake Forest School of Medicine, Harvard-MIT Division of Health Sciences and Technology, Wyss Institute for Biologically Inspired Engineering at Harvard University, Biomaterials Innovation Research Center at Harvard Medical School, Bloomberg School of Public Health at Johns Hopkins University, Virginia Tech-Wake Forest School of Biomedical Engineering and Sciences, Brigham and Women’s Hospital, University of Konstanz, Konkuk University (Seoul), and King Abdulaziz University.


Abstract of Multi-tissue interactions in an integrated three-tissue organ-on-a-chip platform

Many drugs have progressed through preclinical and clinical trials and have been available – for years in some cases – before being recalled by the FDA for unanticipated toxicity in humans. One reason for such poor translation from drug candidate to successful use is a lack of model systems that accurately recapitulate normal tissue function of human organs and their response to drug compounds. Moreover, tissues in the body do not exist in isolation, but reside in a highly integrated and dynamically interactive environment, in which actions in one tissue can affect other downstream tissues. Few engineered model systems, including the growing variety of organoid and organ-on-a-chip platforms, have so far reflected the interactive nature of the human body. To address this challenge, we have developed an assortment of bioengineered tissue organoids and tissue constructs that are integrated in a closed circulatory perfusion system, facilitating inter-organ responses. We describe a three-tissue organ-on-a-chip system, comprised of liver, heart, and lung, and highlight examples of inter-organ responses to drug administration. We observe drug responses that depend on inter-tissue interaction, illustrating the value of multiple tissue integration for in vitro study of both the efficacy of and side effects associated with candidate drugs.

From flying warehouses to robot toilets – five technologies that could shape the future

August 06, 2017

Flying warehouses, robot receptionists, smart toilets… do such innovations sound like science fiction or part of a possible reality? Technology has been evolving at such a rapid pace that, in the near future, our world may well resemble that portrayed in futuristic movies, such as Blade Runner, with intelligent robots and technologies all around us.

But what technologies will actually make a difference? Based on recent advancements and current trends, here are five innovations that really could shape the future.

1. Smart homes

Many typical household items can already connect to the internet and provide data. But much smart home technology isn’t currently that smart. A smart meter just lets people see how energy is being used, while a smart TV simply combines television with internet access. Similarly, smart lighting, remote door locks or smart heating controls allow for programming via a mobile device, simply moving the point of control from a wall panel to the palm of your hand.

But technology is rapidly moving towards a point where it can use the data and connectivity to act on the user’s behalf. To really make a difference, technology needs to fade more into the background – imagine a washing machine that recognises what clothes you have put into it, for example, and automatically selects the right programme, or even warns you that you have put in items that you don’t want to wash together. Here it is important to better understand people’s everyday activities, motivations and interactions with smart objects to avoid them becoming uninvited guests at home.

Such technologies could even work for the benefit of all. The BBC reports, for example, that energy providers will “reduce costs for someone who allows their washing machine to be turned on by the internet to maximise use of cheap solar power on a sunny afternoon” or “to have their freezers switched off for a few minutes to smooth demand at peak times”.
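
The scheduling behind that example is simple enough to sketch: given a day-ahead price forecast, a controller picks the cheapest contiguous window for a deferrable load such as a washing cycle. A minimal sketch (the hourly prices are invented):

    # Minimal demand-response scheduling: run a 2-hour washing-machine
    # cycle in the cheapest contiguous window of a (made-up) forecast.
    prices = [0.18, 0.17, 0.16, 0.15, 0.14, 0.13, 0.15, 0.18,  # 00:00-07:00
              0.20, 0.16, 0.10, 0.07, 0.05, 0.05, 0.06, 0.09,  # 08:00-15:00
              0.14, 0.22, 0.25, 0.24, 0.21, 0.19, 0.18, 0.17]  # 16:00-23:00

    duration = 2  # hours the cycle runs
    start = min(range(len(prices) - duration + 1),
                key=lambda h: sum(prices[h:h + duration]))
    print(f"Cheapest start: {start:02d}:00 "
          f"(avg {sum(prices[start:start + duration]) / duration:.2f} per kWh)")
    # -> 12:00, the middle of the solar-driven midday price dip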

A major concern in this area is security. Internet-connected devices can be hacked, and are being – just recall the recent ransomware attack. Our home is, after all, the place where we should feel most secure. For these technologies to become widespread, they will have to keep it that way.

2. Virtual secretaries

While secretaries play a very crucial role in businesses, they often spend large parts of their working day with time-consuming but relatively trivial tasks that could be automated. Consider the organisation of a “simple” meeting – you have to find the right people to take part (likely across business boundaries) and then identify when they are all available. It’s no mean feat.

Tools such as doodle.com, which compare people’s availability to find the best meeting time, can help. But they ultimately rely on those involved actively participating. They also only become useful once the right people have already been identified.

By using context information (charts of organisations, location awareness from mobile devices and calendars), identifying the right people and the right time for a given event became a technical optimisation problem that was explored by the EU-funded inContext project a decade ago. At that stage, technology for gathering context information was far less advanced – smart phones were still an oddity and data mining and processing was not where it is today. Over the coming years, however, we could see machines doing far more of the day-to-day planning in businesses.
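
Stripped of the context-gathering, the core task reduces to interval intersection: given each attendee’s busy periods, find the first slot where all are free. A minimal sketch along those lines (the names and calendars are hypothetical):

    # Find the first hour-long slot within working hours (9-17) that is
    # free for every attendee. Busy times are (start_hour, end_hour) pairs.
    busy = {
        "alice": [(9, 11), (13, 14)],
        "bob":   [(10, 12)],
        "carol": [(9, 10), (15, 17)],
    }

    def free(person, hour):
        """True if `person` has no meeting overlapping [hour, hour+1)."""
        return all(not (s <= hour < e) for s, e in busy[person])

    slot = next((h for h in range(9, 17)
                 if all(free(p, h) for p in busy)), None)
    print(f"First common free slot: {slot}:00")  # -> 12:00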

Indeed, the role of virtual assistants may go well beyond scheduling meetings and organising people’s diaries – they may help project managers to assemble the right team and allocate them to the right tasks, so that every job is conducted efficiently.

‘She is expecting you in the main boardroom …’ Shutterstock

On the downside, much of the required context information is relatively privacy-invasive – but then the younger generation is already happily sharing their every minute on Twitter and Snapchat and such concerns may become less significant over time. And where should we draw the line? Do we fully embrace the “rise of the machines” and automate as much as possible, or retain real people in their daily roles and only use robots to perform the really trivial tasks that no one wants to do? This question will need to be answered – and soon.

3. AI doctors

We are living in exciting times, with advancements in medicine and AI technology shaping the future of healthcare delivery around the world.

But how would you feel about receiving a diagnosis from an artificial intelligence? A private company called Babylon Health is already running a trial with five London boroughs that encourages consultations with a chatbot for non-emergency calls. The artificial intelligence was trained using massive amounts of patient data in order to advise users to go to the emergency department of a hospital, visit a pharmacy or stay at home.
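
Babylon has not published how its model works, so the following is only a toy, rule-based stand-in that shows the shape of the three-way decision the article describes; the symptom lists and rules are invented:

    # Toy three-way triage: illustrates the decision's shape only, not
    # Babylon Health's actual machine-learned system. Rules invented.
    RED_FLAGS = {"chest pain", "difficulty breathing", "severe bleeding"}
    PHARMACY = {"sore throat", "mild cough", "hay fever"}

    def triage(symptoms):
        """Map a set of reported symptoms to one of three dispositions."""
        if symptoms & RED_FLAGS:
            return "go to the emergency department"
        if symptoms & PHARMACY:
            return "visit a pharmacy"
        return "stay at home and monitor"

    print(triage({"mild cough"}))            # -> visit a pharmacy
    print(triage({"chest pain", "nausea"}))  # -> go to the emergency department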

The company claims that it will soon be able to develop a system that could potentially outperform doctors and nurses in making diagnoses. In countries where there is a shortage of medical staff, this could significantly improve health provision, enabling doctors to concentrate on providing treatment rather than spending too much time on making a diagnosis. This could significantly redefine their clinical role and work practices.

Elsewhere, IBM Watson, the CloudMedx platform and Deep Genomics technology can provide clinicians with insights into patients’ data and existing treatments, help them to make more informed decisions, and assist in developing new treatments.

An increasing number of mobile apps and self-tracking technologies, such as Fitbit, Jawbone Up and Withings, can now facilitate the collection of patients’ behaviours, treatment status and activities. It is not hard to imagine that even our toilets will soon become smarter and be used to examine people’s urine and faeces, providing real-time risk assessment for certain diseases.

Your robodoctor will see you now. Shutterstock

Nevertheless, to enable the widespread adoption of AI technology in healthcare, many legitimate concerns must be addressed. Already, usability, health literacy, privacy, security, content quality and trust issues have been reported with many of these applications.

There is also a lack of adherence to clinical guidelines, ethical concerns, and mismatched expectations regarding the collection, communication, use, and storage of patients’ data. In addition, the limitations of the technology need to be made clear in order to avoid misinterpretations that could potentially harm patients.

If AI systems can address these challenges and focus on understanding and enhancing existing care practices and the doctor-patient relationship, we can expect to see more and more success stories of data-driven healthcare initiatives.

4. Care robots

Will we have robots answering the door in homes? Possibly. At most people’s homes? Even if they are reasonably priced, probably not. What distinguishes successful smart technologies from unsuccessful ones is how useful they are. And how useful they are depends on the context. For most, it’s probably not that useful to have a robot answering the door. But imagine how helpful a robot receptionist could be in places where there is shortage of staff – in care homes for the elderly, for example.

Robots equipped with AI such as voice and face recognition could interact with visitors to check who they wish to visit and whether they are allowed access to the care home. After verifying that, robots with routing algorithms could guide the visitor towards the person they wish to visit. This could potentially enable staff to spend more quality time with the elderly, improving their standard of living.

The AI required still needs further advancement in order to operate in completely uncontrolled environments. But recent results are positive. Facebook‘s DeepFace software was able to match faces with 97.25% accuracy when tested on a standard database used by researchers to study the problem of unconstrained face recognition. The software is based on Deep Learning, an artificial neural network composed of millions of neuronal connections able to automatically acquire knowledge from data.
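
DeepFace’s heavy lifting happens inside the trained network, but the final matching step common to systems of this kind is easy to show: reduce each face to a fixed-length embedding vector and accept a match when similarity clears a threshold. A minimal sketch (the vectors here are random stand-ins for real network outputs, and the threshold is a placeholder):

    import numpy as np

    def cosine_similarity(a, b):
        """Similarity of two face embeddings, in [-1, 1]."""
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # In a real system these 128-d vectors would come from the trained
    # network applied to two face images; here they are random stand-ins.
    rng = np.random.default_rng(0)
    visitor = rng.normal(size=128)
    enrolled = visitor + rng.normal(scale=0.1, size=128)  # same face, new photo

    THRESHOLD = 0.8  # would be tuned on validation data in practice
    same_person = cosine_similarity(visitor, enrolled) > THRESHOLD
    print(f"match: {same_person}")  # -> True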

5. Flying warehouses and self-driving cars

The new postman. Shutterstock

Self-driving vehicles are arguably one of the most astonishing technologies currently being investigated. Despite the fact that they can make mistakes, they may actually be safer than human drivers. That is partly because they can use a multitude of sensors to gather data about the world, including 360-degree views around the car.

Moreover, they could potentially communicate with each other to avoid accidents and traffic jams. More than being an asset to the general public, self-driving cars are likely to become particularly useful for delivery companies, enabling them to save costs and make faster, more efficient deliveries.

Advances are still needed in order to enable the widespread use of such vehicles, not only to improve their ability to drive completely autonomously on busy roads, but also to ensure a proper legal framework is in place. Nevertheless, car manufacturers are engaging in a race against time to see who will be the first to provide a self-driving car to the masses. It is believed that the first fully autonomous car could become available as early as the next decade.

The advances in this area are unlikely to stop at self-driving cars or trucks. Amazon has recently filed a patent for flying warehouses which could visit places where the demand for certain products is expected to boom. The flying warehouses would then send out autonomous drones to make deliveries. It is unknown whether Amazon will really go ahead with developing such projects, but tests with autonomous drones are already successfully being carried out.

Thanks to technology, the future is here – we just need to think hard about how best to shape it.

This article was originally published by:
https://theconversation.com/from-flying-warehouses-to-robot-toilets-five-technologies-that-could-shape-the-future-81519

King cancer: The top 10 therapeutic areas in biopharma R&D

July 23, 2017

It’s not going to come as a surprise to anyone who’s been paying attention to drug R&D trends that cancer is the number 1 disease in terms of new drug development projects. But it is striking to see just how thoroughly oncology now dominates the industry.

At a time when the first CAR-T looks to be on the threshold of a pioneering approval and the first wave of PD-(L)1 drugs is spurring hundreds of combination studies, cancer accounted for 8,651 of the pipeline projects counted by the Analysis Group, which crunched the numbers in a new report commissioned by PhRMA. That’s more than a third of the 24,389 preclinical through Phase III programs tracked by EvaluatePharma, which provided the database for this review.

That’s also more than the next 5 disease fields combined, starting with number 2, neurology — a field that includes Parkinson’s and Alzheimer’s. Psychiatry, once a major focus for pharma R&D, didn’t even make the top 10, with 468 projects.

Moving downstream, cancer studies remain overwhelmingly in the lead. Singling out Phase I projects, cancer accounted for 1,757 of a total of 3,723 initiatives, close to half. In Phase II it is the focus of 1,920 of 4,424 projects. Only in late-stage studies does cancer's dominance fade, falling to 329 of 1,257 projects.
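Those proportions are easy to verify; a quick calculation, using only the figures quoted above, shows how cancer's share shrinks as projects move downstream:

    # Pipeline counts quoted above: (cancer projects, all projects).
    phases = {
        "Phase I": (1757, 3723),
        "Phase II": (1920, 4424),
        "Phase III": (329, 1257),
    }
    for phase, (cancer, total) in phases.items():
        print(f"{phase}: {cancer}/{total} = {cancer / total:.0%}")
    # Phase I: 47%, Phase II: 43%, Phase III: 26%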

PhRMA commissioned this report to underscore just how much the industry is committed to R&D and significant new drug development, a subject that routinely comes into question as analysts evaluate how much money is devoted to developing new drugs instead of, say, marketing or share buybacks.

The report makes a few other points to underscore the nature of the work these days.

— Three out of four projects in the clinic were angling for first-in-class status, spotlighting the emphasis on advancing new medicines that can make a difference for patients. Me-too drugs are completely out of fashion, unlikely to carry much weight with payers.

— Of all the projects in clinical development, 822 were orphan drugs looking to serve a market of 200,000 patients or fewer. Orphan drugs have performed well, commanding high prices and benefiting from incentives under federal law.

— There were 731 cell and gene therapy projects in the clinic. Biopharma is looking at pioneering CAR-T approvals from Novartis and Kite, as well as the first US OK for a gene therapy: an application from Spark Therapeutics was accepted this week for priority review.


[Chart: Distribution of products and projects by therapeutic area and phase. Source: Analysis Group, using EvaluatePharma data]

[Chart: Unique NMEs in development by stage (August 2016)]

CRISPR kills HIV and eats Zika ‘like Pac-man’. Its next target? Cancer

June 29, 2017

There’s a biological revolution underway and its name is CRISPR.

Pronounced ‘crisper’, the name stands for Clustered Regularly Interspaced Short Palindromic Repeats and refers to the way short, repeated DNA sequences in the genomes of bacteria and other microorganisms are organised.

The technique is inspired by the way these organisms defend themselves from viral attack: they capture strips of an invading virus’s DNA and store them between these repeats. RNA copies of the stored sequences then guide enzymes called Cas to recognise matching viral DNA and cut it, preventing future invasions.

This mechanism was transformed into a gene-editing tool in 2012 and was named Science magazine’s 2015 Breakthrough of the Year. While it is not the first DNA-editing tool, it has piqued the interest of many scientists, research and health groups because of its accuracy, relative affordability and far-reaching uses. The latest target? Eradicating HIV.
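As a toy illustration of the recognition step described above (a string-matching exercise, not a model of the real biochemistry), the sketch below scans a DNA sequence for a site matching a 20-letter guide followed by the ‘NGG’ motif that Cas9 requires next to its target. Both sequences are invented:

    def find_cas9_target(genome, guide):
        """Return the index of the first site where `guide` is followed
        by an NGG PAM (any base, then 'GG'); -1 if no site exists."""
        site_len = len(guide) + 3                 # protospacer + PAM
        for i in range(len(genome) - site_len + 1):
            window = genome[i:i + len(guide)]
            pam = genome[i + len(guide):i + site_len]
            if window == guide and pam[1:] == "GG":
                return i
        return -1

    # Invented 20-letter guide and a toy 'viral' sequence containing it.
    guide = "GACCTGAAAGCGTTCACGTT"
    genome = "TTTT" + guide + "TGG" + "ACGTACGT"
    print(find_cas9_target(genome, guide))  # 4: where the cut would land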

At the start of May, researchers at the Lewis Katz School of Medicine at Temple University (LKSOM) and the University of Pittsburgh demonstrated how they can remove HIV DNA from the genomes of living animals – in this case, mice – to curb the spread of infection. The breakthrough, which built on a 2016 proof-of-concept study, was the first time HIV-1 replication had been eliminated from infected cells using CRISPR.


In particular, the team genetically inactivated HIV-1 in transgenic mice, reducing the RNA expression of viral genes by roughly 60 to 95 per cent, before trialling the method on infected mice.

“During acute infection, HIV actively replicates,” explained Dr. Kamel Khalili, who led the study at LKSOM. “With EcoHIV mice, we were able to investigate the ability of the CRISPR/Cas9 strategy to block viral replication and potentially prevent systemic infection.”

Since the HIV research was published, a team of biologists at the University of California, Berkeley, has described 10 new CRISPR enzymes that, once activated, are said to “behave like Pac-Man”, chewing through RNA in a way that could be used as a sensitive detector of infectious viruses.

These new enzymes are variants of the CRISPR protein Cas13a, which the UC Berkeley researchers reported last September in Nature could be used to detect specific sequences of RNA, such as those from a virus. The team showed that once CRISPR-Cas13a binds to its target RNA, it begins to cut up all nearby RNA indiscriminately; by adding a fluorescent reporter RNA to the mix, this collateral cleavage can be made to produce a “glow” that allows signal detection.


Two teams of researchers at the Broad Institute subsequently paired CRISPR-Cas13a with RNA amplification to show that the resulting system, dubbed Sherlock, could detect viral RNA at extremely low concentrations, such as dengue and Zika viral RNA. Such a system could be used to detect any type of RNA, including RNA distinctive of cancer cells.
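Why the amplification step matters can be seen in a toy signal model, with entirely invented numbers: each recognised target RNA activates an enzyme that shreds many fluorescent reporters, and amplifying the target first multiplies that signal above a detector's floor.

    def fluorescence_signal(target_copies, amplification=1.0,
                            reporters_cleaved_per_hit=500):
        """Toy model: each (amplified) target activates one Cas13a
        complex, which then cleaves many fluorescent reporter RNAs.
        All numbers are invented for illustration."""
        return target_copies * amplification * reporters_cleaved_per_hit

    DETECTION_FLOOR = 1e6  # arbitrary instrument threshold

    for amp in (1, 1e4):   # without and with pre-amplification
        signal = fluorescence_signal(target_copies=10, amplification=amp)
        print(f"amplification x{amp:g}: signal={signal:g}, "
              f"detected={signal >= DETECTION_FLOOR}")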

This piece has been updated to remove copy taken from WIRED US.

http://www.wired.co.uk/article/crispr-disease-rna-hiv

QuintilesIMS Institute Study: U.S. Drug Spending Growth of 4.8 Percent in 2016

June 29, 2017

DANBURY, Conn. & RESEARCH TRIANGLE PARK, N.C.

(Business Wire) Growth in U.S. spending on prescription medicines fell in 2016 as competition intensified among manufacturers and payers ramped up efforts to limit price increases, according to research released today by the QuintilesIMS Institute. New medicines introduced in the past two years continue to drive at least half of total spending growth, as clusters of innovative treatments for cancer, autoimmune diseases, HIV, multiple sclerosis, and diabetes become accessible to patients. The prospects for innovative treatments over the next five years are bright, fueled by a robust late-phase pipeline of more than 2,300 novel products that includes more than 600 cancer treatments. U.S. net total spending is expected to increase 2-5 percent on average through 2021, reaching $375-405 billion.

Drug spending grew at a 4.8 percent pace in 2016 to $323 billion, less than half the rate of the previous two years, after adjusting for off-invoice discounts and rebates. The surge of innovative medicine introductions paused in 2016, with fewer than half as many new drugs launched as in 2014 and 2015. While the total use of medicines continued to climb, with total prescriptions dispensed reaching 6.1 billion, up 3.3 percent over 2015 levels, the spike in new patients being treated for hepatitis C ebbed, which contributed to the decline in spending. Net price increases, reflecting rebates and other price breaks from manufacturers, averaged 3.5 percent last year, up from 2.5 percent in 2015.
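The 2016 base and the 2021 forecast are mutually consistent: compounding $323 billion at the forecast 2-5 percent range for five years gives an envelope that contains the projected $375-405 billion (a rough check that ignores year-to-year variation):

    # Compound the 2016 net spending base at the forecast growth range.
    base_2016 = 323  # $bn
    for rate in (0.02, 0.05):
        spend_2021 = base_2016 * (1 + rate) ** 5
        print(f"{rate:.0%} per year -> ${spend_2021:.0f}bn in 2021")
    # 2% -> $357bn, 5% -> $412bn, enclosing the $375-405bn forecast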

“After a year of heated discussion about the cost and affordability of drugs, the reality is that after adjusting for population and economic growth, total spending on all medicines increased just 1.1 percent annually over the past decade,” said Murray Aitken, senior vice president and executive director of the QuintilesIMS Institute. “Understanding how the dynamics of today’s healthcare landscape impact key stakeholders is more important than ever, as efforts to pass far-reaching healthcare legislative reforms remain on the political agenda.”

In its report, Medicine Use and Spending in the U.S. – A Review of 2016 and Outlook to 2021, the QuintilesIMS Institute highlights the following findings:

  • Patients age 65 years and over have accounted for 41 percent of total prescription growth since 2011. While the population of seniors in the U.S. has increased 19 percent since 2011, their average per capita use of medicines declined slightly—from 50 prescriptions per person in 2011 to 49 prescriptions per person last year. In the age 50-64 year population, total prescriptions filled increased 21 percent over the past five years, primarily due to higher per capita use, which reached 29 prescriptions per person. The largest drivers of prescription growth were in large chronic therapy areas including hypertension and mental health, while the largest decline was in pain management.
  • Average patient out-of-pocket costs continued to decline in 2016, reaching $8.47 compared to $9.66 in 2013. Nearly 30 percent of prescriptions filled in 2016 required no patient payment, due in part to preventive treatment provisions under the Affordable Care Act, up from 24 percent in 2013. The use of co-pay assistance coupons by patients covered by commercial plans also contributed to the decline in average out-of-pocket costs; such coupons were used to fill 19 percent of all brand prescriptions last year, compared with 13 percent in 2013. Patients filling brand prescriptions while in the deductible phase of their commercial health plan accounted for 14 percent of prescriptions and 39 percent of total out-of-pocket costs. Patients in the deductible phases of their health plan abandoned about one in four of their brand prescriptions.
  • The late-phase R&D pipeline remains robust and will yield an average of 40-45 new brand launches per year through 2021. At the end of 2016, the late-phase pipeline included 2,346 novel products, a level similar to the prior year, with specialty medicines making up 37 percent of the total. More than 630 distinct research programs are underway in oncology, which account for one-quarter of the pipeline and where one in four molecules are focused on blood cancers. While the number of new drug approvals and launches fell by more than half in 2016, the size and quality of the late-phase pipeline is expected to drive historically high numbers of new medicines.
  • Moderating price increases for branded products, and the larger impact of patent expiries, will drive net growth in total U.S. spending of 2-5 percent through 2021, reaching $375-405 billion. Net price increases for protected brands are forecast to average 2-5 percent over the next five years, even as invoice price growth is expected to moderate to the 7-10 percent range. This reflects additional pressure and influence by payers on the pricing and prescribing of medicines, as well as changes in the mix of branded products on the market. Lower spending on brands following the loss of patent protection is forecast to total $140 billion, including the impact of biosimilar competition, through 2021.

The full version of the report, including a detailed description of the methodology, is available at www.quintilesimsinstitute.org. The study was produced independently as a public service, without industry or government funding.

In this release, “spending on medicines” is an estimate of the amount received by pharmaceutical manufacturers after rebates, off-invoice discounts and other price concessions have been made by manufacturers to distributors, health plans and intermediaries. It does not relate directly to the out-of-pocket costs paid by a patient, except where noted, and does not include mark-ups and additional costs associated with dispensing or other services associated with medicines reaching patients. For a fuller explanation of the methods used to estimate net spending, see the Methodology section of the report.

https://www.quintilesims.com/press-releases/quintilesims-institute-study-us-drug-spending-growth-of-48-percent-in-2016