Mitigate The Potential For Immunogenicity To De-Risk Your Early Drug Development
By Noel Smith, Ph.D., Head of Immunology, Lonza

Despite exciting scientific and technological breakthroughs over the last two decades, the pharmaceutical industry continues to struggle with high drug-attrition rates: only an estimated 1 in 1,000 preclinical drug candidates ever reaches the market.1 With drug development averaging 12.8 years and approximately $2.6 billion per approved therapy, manufacturers must adopt risk-mitigation strategies that can carry their candidates across pharma’s “valley of death,” both to reduce the time and resources invested and to deliver innovative new therapies to the patients who need them.2
While the reasons for clinical failure vary, a large share stems from problems with safety and efficacy.3 Historically, a lack of translatable testing models has limited preclinical insight into how the human body will react to a drug candidate, but new assays that more accurately predict immunogenicity potential are now available. Their success, however, depends on several factors, including cell quality, culture conditions, and assay readout parameters. Understanding what these factors are, and how and when these tools should be used, will help you design the most effective testing strategy for your drug development program, ultimately increasing its likelihood of success and improving its speed to market.