Guest Column | October 15, 2019

The Need For Enhanced Control Strategies In Biopharma And Biosimilar Production

By Emil W. Ciurczak, Doramaxx Consulting


For many years, the pharmaceutical industry dealt in “small (usually synthetic) molecules” mixed with various non-active materials and put into capsules or, in the old days, rolled into pills or pressed into tablets. While synthesizing the APIs (active pharmaceutical ingredients), formulating the dosage forms, and analyzing the materials at every step of the life cycle were not always trivial tasks, they were relatively straightforward.

The tools used for analyzing/controlling each step were, in many cases, already in labs across the world. Since the early commercial production tools were, by today’s standards, very slow, in-process tests didn’t need to be fast or sophisticated. Indeed, the vast majority of solid dosage forms were “immediate-release” tablets or capsules that depended on simple dissolution (of the tablet matrix or the capsule’s gelatin shell) to release the API. Later, time-release dosage forms were subjected to the same in-process tests as immediate-release forms: hardness, friability, disintegration, and weight variation.

All this was fine when a single-punch press (and later, somewhat larger units) was used, producing hundreds of tablets per hour. Since final testing was “sufficient” for safety and efficacy, the 20 to 30 final doses tested (or an assay of a composite) were considered fine. After all, the batch style of production took weeks for a single lot to be made, so who cared if it took (several) days to analyze it? So, as production methods grew faster and faster, the industry was saddled with 1950s-style in-process and final-lot analysis techniques. The best impetus for modernizing the way we monitor, analyze, and, more importantly, control our production was the FDA’s PAT (process analytical technology) Guidance of 2004. (If you aren’t familiar with this guidance, please Google it.)

Technology Advancements Enhance Process Control

The PAT guidance (and successive guidances from the FDA, EMA, and ICH) supported better, more extensive control of a process through modern technology. Extending analysis/control to process applications meant simply waiting for faster and better computers to become available, sufficiently complex software to be written, and smaller, faster, and more accurate measurement devices to be engineered. Beginning in (roughly) 1990, several companies began developing the tools needed (one example was the cooperative effort between Pfizer [U.K.] and Zeiss [Switzerland] to develop the first wireless, in-place near-infrared [NIR] spectrometer for blend uniformity measurements in real time). The acceptance of this tool by the FDA opened the floodgates for new equipment and peripherals.
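To make the “real time” part concrete: a common way to judge blend homogeneity from such an in-place NIR probe is to track how much consecutive spectra differ from one another and to call the blend uniform once that variation settles out. The sketch below (Python with NumPy; the block size and threshold are illustrative assumptions of mine, not values from any guidance or vendor software) shows the moving block standard deviation idea in its simplest form.

```python
import numpy as np

def moving_block_std(spectra: np.ndarray, block_size: int = 10) -> np.ndarray:
    """For each sliding block of consecutive NIR spectra, compute the standard
    deviation at every wavelength and average it into one number. Falling,
    stable values suggest the blend has stopped changing.

    spectra: 2-D array of shape (n_spectra, n_wavelengths), one row per
             spectrum collected during blending (illustrative layout).
    """
    n_spectra = spectra.shape[0]
    mbsd = []
    for start in range(n_spectra - block_size + 1):
        block = spectra[start:start + block_size]   # consecutive spectra
        sd_per_wavelength = block.std(axis=0)       # variation at each wavelength
        mbsd.append(sd_per_wavelength.mean())       # collapse to a single value
    return np.array(mbsd)

def blend_endpoint_reached(mbsd: np.ndarray, threshold: float = 0.002,
                           consecutive: int = 3) -> bool:
    """Declare the endpoint once the MBSD stays below an (assumed) threshold
    for several consecutive blocks, rather than on a single low reading."""
    run = 0
    for value in mbsd:
        run = run + 1 if value < threshold else 0
        if run >= consecutive:
            return True
    return False
```

In practice, the threshold and block size would be justified during development and the spectra preprocessed (e.g., derivatives or scatter correction) before any such calculation, but the logic above is essentially what “stop the blender when the spectra stop changing” means.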

Traditionally, the making of a small molecule dosage form has two distinct segments: synthesizing the API and generating the solid dosage form. The former is essentially organic chemistry, while the latter is (or should be) based on materials science, i.e., mixing and tableting. With the development of wireless spectrometers (largely near-infrared) came continuous monitoring and feedback (control) under PAT. A decade or so after the introduction of PAT/QbD (quality by design), we see more and more real-time release of final dosage forms… not to mention the growing presence of continuous manufacturing (CM). So, it would appear that solid dosage forms are well on their way to QbD and, eventually, where warranted, continuous manufacturing.

These and similar tools have been in existence for organic (API) synthesis reactions for longer than for tablet production, simply because the organic synthesis reactions take place in non-aqueous solutions, amenable to spectroscopic (IR, NIR, Raman) controls. Parameters such as viscosity, temperature, and refractive index, along with other physical properties, were easy to measure in an organic solution.

However, expecting us to simply apply these same control technologies to biopharma products would be naïve, as there are fundamental differences between the two paradigms. Instead of a controlled synthetic organic reaction in a chemical reactor, the manufacturing of biologics (monoclonal antibodies, recombinant proteins and DNA, vaccines, etc.) relies on complex cellular biosystems, highly sensitive to their environment and feeding regimen, operating in an aqueous matrix that is not simply governed by the well-established principles of organic chemistry.

Biologics Production Adds Complexities

The production of large molecules by microbes and mammalian cells requires the control of numerous processing parameters such as nutrient concentration, temperature, pH, gases, agitation, and so on. The host cells, the product(s), the byproducts (e.g., lactate, ammonium, and CO2), and the growth medium constitute a complex mixture, with many of the chemical species present in a bioreactor at levels undetectable by many analytical tools, including NIR spectroscopy. (Many materials, e.g., ammonium ions, are not strong absorbers of IR or NIR light, so their effects on other molecules and on water are followed by chemometric methods.)
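As a rough illustration of what “chemometric methods” means here: a multivariate model (most often partial least squares, PLS) is calibrated against off-line reference measurements and then used to infer the concentration of a weakly absorbing species from each new spectrum. The sketch below uses Python with scikit-learn on stand-in data; the sample counts, wavelength grid, and number of latent variables are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Stand-in calibration set: each row is one (preprocessed) NIR spectrum from the
# bioreactor, paired with an off-line reference glucose value in g/L.
rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 256
X = rng.normal(size=(n_samples, n_wavelengths))   # placeholder spectra
y = rng.uniform(0.5, 6.0, size=n_samples)         # placeholder reference glucose, g/L

# PLS extracts a few latent variables that capture the covariance between the
# spectra and the reference values; the component count would normally be
# chosen by cross-validation on real calibration data.
pls = PLSRegression(n_components=5)
pls.fit(X, y)

# Cross-validated predictions give an honest estimate of calibration error
# (RMSECV), the usual figure of merit for such models.
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV on illustrative data: {rmsecv:.3f} g/L")

# In routine use, each new in-line spectrum (preprocessed identically) would be
# passed to pls.predict() to return a real-time concentration estimate.
```

The point is not the particular algorithm but the indirect route: the analyte itself may barely absorb, yet its influence on the water and matrix bands is reproducible enough for a well-built calibration to track it.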

The production of large molecules typically follows a two-step process. First, the microorganisms (or mammalian cells) produce the molecule of interest. Then, the molecule is purified from the growth medium, cells, viruses, and other impurities. Most of the published work involving NIRS and other popular process controls has addressed the production step, so I will not address the cleanup (purification) process here.

The biopharma manufacturing process routinely relies on in-line, real-time measurement and control of parameters that impact cell viability, such as pH, dissolved oxygen, and CO2 (both dissolved and in the headspace). Nutrients (e.g., glucose) need to be measured and controlled throughout the batch production, and byproducts (e.g., lactate, ammonia) need to be monitored. Until recently, manual sampling and offline measurements with fundamental (primary) analytical methods were the predominant control procedures. However, the use of in-line spectroscopy as a process analytical tool to monitor and control these bioreactors has increased significantly over the last decade.
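To show what the “control” half can look like behind such measurements, here is a minimal sketch of a proportional glucose-feed adjustment driven by a soft-sensor (e.g., NIR-predicted) glucose value. The setpoint, gain, and pump limits are illustrative assumptions, not recommendations; a real bioreactor controller would be considerably more sophisticated (and validated).

```python
def glucose_feed_rate(measured_g_per_l: float,
                      setpoint_g_per_l: float = 2.0,
                      gain_l_per_h_per_g: float = 0.05,
                      min_rate_l_per_h: float = 0.0,
                      max_rate_l_per_h: float = 0.5) -> float:
    """Proportional feed control: feed faster when measured glucose falls below
    the setpoint, slower when it rises above it. All numbers are illustrative.
    """
    error = setpoint_g_per_l - measured_g_per_l   # positive when glucose is low
    rate = gain_l_per_h_per_g * error             # simple proportional response
    return max(min_rate_l_per_h, min(max_rate_l_per_h, rate))

# Example: the in-line soft sensor reports 1.4 g/L against a 2.0 g/L setpoint,
# so the controller asks for a modest increase in feed rate (0.03 L/h here).
print(glucose_feed_rate(1.4))
```

Even this toy loop makes the underlying point: the value of in-line spectroscopy is not the measurement itself but the ability to act on it while the batch can still be steered.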

All impurities in APIs are critical, but with biological impurities (often proteins not seen previously), the stakes are potentially higher. Not only are there potential long-term carcinogenicity and mutagenicity dangers, but, with unknown proteins, there are also potential immediate allergic reactions. Even assuming there are no immediate reactions, there remain potential long-term harmful effects, depending on the therapy for which the biological is being used. If the drug is used in a one-time application, such as heparin for a cardiac event, the minor impurities would not likely have a chance to do much harm. On the other hand, long-term use of a bio-drug such as insulin, taken for decades, would allow even the smallest impurity the time to harm the patient.

One might assume that a company that develops an NDE (new drug entity) based on a bioprocess will spend years assessing potential harmful effects. Between the time involved in development of the API (protein, etc.), all the clinical trials, and subsequent stability studies while in production, it would be expected that the initiator company would have accumulated a large portfolio on all the potential byproducts and, later, the breakdown products of the drug substance and its synthesis route. However, as with generic competition for small molecules, there has arisen competition from secondary companies, producing the “same” active molecule, but from a different synthesis/bio-expression route.

Now, a biosimilar would have, by definition, less time for any potential side products to be evaluated before marketing, often with abbreviated clinical trials before release. While the major active ingredient may be identical to the patented one, any biological process expresses numerous proteins, each particular to the mode of expression. When all is said and done (excluding potential lawsuits for patent infringement, etc.), the most problematic feature of any biosimilar will be the exotic side products and their potential side effects. Again, excluding patent infringement possibilities, several FDA guidances and policies also add to the complexity of making and selling biosimilars.

Regulatory Expectations Impose New Responsibilities

When you include the provisions of the FDA’s Question-Based Review guidance, for example, it becomes more arduous. The Question-Based Review (QbR) for Generic Drugs: An Enhanced Pharmaceutical Quality Assessment System has as one of its main thrusts requiring ANDAs (Abbreviated New Drug Applications, which, unfortunately, would include biosimilars) from disparate companies to follow a common format. Previously, when each of the large number of generic companies submitted its documents, each used its own internal style. This resulted in reviewers at the FDA having to navigate dozens of different types of applications, causing long wait times for the generics to get a yes/no answer on their new product’s fate. The formatting requirement alone made the guidance an excellent idea: like a class receiving a term paper assignment, the applicants all understood what was needed and in what order it should be presented. This did, indeed, speed up review times.

Unfortunately, the guidance also included some new responsibilities for the generic company. The responsibility for the purity of the product was extended both earlier and later in the product’s life cycle than had previously been the case. The existing responsibility was to “simply” produce a product (often covered by a monograph in the USP) that met the requirements for purity, assay, disintegration or dissolution times, and so on. Prior to QbR, it was sufficient to depend on the CoA (certificate of analysis) for the purity, potency, etc. of an API. With a biosimilar, a mere CoA would never have been a good idea.

But, under new guidances (both FDA and ICH), the generic drug companies (including CMOs) now need to be familiar with the synthesis route for the API, such that they can prove (validate) that their incoming raw material testing and stability-indicating assays can identify and quantify any breakdown product or impurity from the synthesis of the API, no matter the route by which it was produced. This also extends to stability programs: each analysis method MUST be capable of finding and quantifying materials from the breakdown of the dosage form’s APIs, however they are produced.

This means a constant feedback loop between suppliers and the company’s labs, such that any analytical method can separate all potential byproducts (from synthesis) and all breakdown products from stability samples. Now, in a “normal” or traditional generic company or contract manufacturing facility, there are a number of trained analytical chemists, allowing the methods to evolve toward the specificity needed. When small molecules are involved, this adds only a small amount of labor and time to the existing workload.

However, when it comes to biological or biosimilar production and sales, all bets are off. Whether the CMO is producing a biological product that was the “original” (under contract to the patent-holder) or generating a product that is “similar,” the process is far more complicated than merely mixing powders and compressing a tablet or encapsulating the mix into a capsule. Understanding the effects of an API on the final dosage form is even covered in ICH Q11:

The identification of CQAs (critical quality attributes) for complex products can be challenging. Biotechnological/biological products, for example, typically possess such a large number of quality attributes that it might not be possible to fully evaluate the impact on safety and efficacy of each one. Risk assessments can be performed to rank or prioritize quality attributes. Prior knowledge can be used at the beginning of development and assessments can be iteratively updated with development data (including data from nonclinical and clinical studies) during the lifecycle. Knowledge regarding mechanism of action and biological characterization, such as studies evaluating structure-function relationships, can contribute to the assessment of risk for some product attributes.

This control/understanding of biologicals is difficult enough for the companies that developed the drug, even with a large number of biochemists, molecular biologists, and analytical and QC chemists. For smaller companies (both producers of the bioproducts and the generics that package them as dosage forms), largely used to performing small molecule analyses, the task is even more difficult. Clearly, any company producing a biosimilar would need facilities comparable to those of the major company that originally discovered and produced the first bioproduct.

So, in short, biologicals are the next great step for the pharmaceutical industry. The double-edged sword is that, as the molecules become more and more complex, our need for control and understanding becomes greater. The potential for curing exotic diseases and helping humans has become greater, but (as they say in Marvel movies) “with great power comes great responsibility.” Our quality programs will need to become many times more stringent and more carefully designed.

But, the future with biopharmaceuticals is far brighter than without them.

This article has been adapted from the 2019 CPhI Annual Report.

About The Author:

Emil Ciurczak has advanced degrees in chemistry from Rutgers University and Seton Hall University. He has worked in the pharmaceutical industry since 1970. In 1983, he introduced near-infrared (NIR) spectroscopy consulting for Technicon (Bran & Leubbe), NIRSystems (FOSS), CDI Pharma, Infrared Fiber Systems, Brimrose, and Buchi. Most of his research was on pharmaceutical applications of NIR, about which he has published over 40 articles in refereed journals and over 200 magazine columns and has presented over 200 technical papers.