Making The Leap To Smart Bioprocessing
A conversation with Patrick Statham, Cell and Gene Therapy Catapult

Variable inputs can lead to consistent outputs, but they need an adaptive process in between. That’s the challenge an EU-funded consortium ran up against in developing a smart bioprocess for manufacturing tumor-infiltrating lymphocytes (TILs), starting from a first-generation static process and industrializing it into a stirred-tank bioreactor process integrated with sensing technologies.
Patrick Statham and his colleagues at the Cell and Gene Therapy Catapult in the U.K. collaborated with partners, including the therapy developer Achilles Therapeutics, to integrate sensors and real-time controls to make the autologous cell therapy.
The effort — which aimed to improve the consistency and efficiency of a complex T cell expansion process in co-culture with dendritic cells — involved several players in the SMARTER consortium, including:
- Achilles, which provided the patient-derived material — TIL-like cells and dendritic cells
- Leibniz University of Hannover, which provided spectroscopy expertise
- The Biomarkers and Precision Medicine Unit at the Health Research Institute of the Hospital La Fe, which provided metabolomics expertise
Following an initial proof-of-concept run, which Statham described in August at Cambridge Healthtech Institute’s Bioprocessing Summit conference, the consortium observed greater T cell expansion and better metabolite utilization compared to the baseline process conditions, strengthening the notion that real-time monitoring and control of critical process parameters do, in fact, help improve productivity and quality.
After his talk, we followed up with Statham to understand what it takes to adopt a smarter bioprocessing platform with real-time monitoring and responsive process controls. Here’s what he told us. The transcript has been edited for clarity.
Can you describe the shift between static and dynamic processes?
Statham: For some processes, maybe a static system works well. But for the one we were discussing, which was an autologous cell therapy process using the G-Rex platform, there’s inherent batch-to-batch variation from donors and no means of monitoring what is happening in the culture.
If you've got no means of in-line or online monitoring of what's happening in the culture, there's no way to intervene. Even if you could know what is happening in there, there's no means of controlling it. On the other hand, a lot of dynamic systems allow for sampling and for hooking up to controllers or external pumps. Having that dynamic system, with that functionality, you can then look to control and monitor your process. That was the motivation for us.
It seems like a massive undertaking, first changing our thinking and also the physical infrastructure. What are some of the challenges associated with that?
Statham: Changing the mindset is really challenging, and in the context of my whole talk, it seems like a small thing, but that is a bioprocessing challenge in itself. In our specific situation, the prior art suggested it needed to be static because it was TILs and a dendritic cell model. That's a co-culture, typically. And the prior art said this has to be static so biological interaction can occur. That was a big challenge for us — changing the way of thinking, then taking a risk by putting it into an agitated system and understanding that it could actually work in agitation.
We actually published a patent based on that. So that’s a testament to challenging the prior art. Other challenges in moving from static to dynamic include choosing parameters. In a static process, of course, most of these instances will be in an incubator with 5% CO2 and atmospheric oxygen. Then, moving into a dynamic system, we have all these options to try different things – different agitation rates, pH, dissolved oxygen, gas flow rates, and gassing strategy are just some.
You talked about mechanistic and chemometric modeling. Can you break down the difference between those two approaches?
Statham: I mainly presented on chemometric modeling. In this instance, we had to match our offline data, some of which we had from the FLEX2 and some from LC-MS. So, it's matching that offline data with spectral data. In our instance, it was Raman spectroscopy, and our collaborator was working on the 2D fluorescence spectroscopy technique.
The process of chemometric modeling is matching those spectral data with your offline metabolite data, generating accurate predictions from your spectral signals, and then converting those in real time into manageable metabolite measurement outputs.
In mechanistic modeling, which we are looking to finish next, it is basically using equations and mathematical descriptions of cell behavior to make predictions about your culture and how your parameters can affect that.
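A minimal sketch of what a mechanistic model might look like, assuming simple Monod growth kinetics with a single limiting substrate; the parameter values are invented for illustration and are not from the SMARTER study:

```python
# Toy mechanistic model: cell growth limited by one substrate (Monod).
# Parameters are illustrative placeholders, not fitted values.
from scipy.integrate import solve_ivp

MU_MAX = 0.04   # 1/h, maximum specific growth rate (assumed)
K_S = 0.5       # g/L, half-saturation constant (assumed)
Y_XS = 2e8      # cells produced per g substrate consumed (assumed)

def monod(t, y):
    x, s = y                       # cell density, substrate concentration
    mu = MU_MAX * s / (K_S + s)    # Monod specific growth rate
    return [mu * x, -mu * x / Y_XS]

# Simulate 240 h from 1e6 cells and 4 g/L substrate.
sol = solve_ivp(monod, (0, 240), [1e6, 4.0])
x_final, s_final = sol.y[:, -1]
print(f"final cells: {x_final:.2e}, substrate left: {s_final:.2f} g/L")
```

Solving the equations forward like this is what lets a mechanistic model predict when the culture will run short of substrate, so a controller can act before it happens.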
Do you see this as a blueprint for other autologous cell therapies or even allogeneic cell therapies beyond TILs?
Statham: I can take a step back on how we work at the Catapult. We have functionality and capabilities with these platform technologies. So, looking at stirred-tank systems and upstream process development dynamic systems, we assembled a project with a therapy provider to move to a smarter bioprocessing platform.
In this instance it was TILs, but that's not to say that this is the only thing that it's compatible with. In a way, this is probably one of the more difficult things we could have trialed because it’s a co-culture.
Certainly, we would be compatible with other autologous modalities. Our approach, in which we were looking at which metabolites were being depleted and adding metabolites back using our online chemometric models, could definitely be applicable to allogeneic cell therapies — I think it's applicable across modalities.
How can patients benefit from this approach? Are we talking about faster time to dosing, lower costs, higher quality?
Statham: We saw intensification and a big increase in fold expansion in our study. What this could mean for different therapy modalities is if we’re getting a larger yield, perhaps for a therapy that requires a really high dose, that could be really beneficial. In instances where we're far exceeding the required fold expansion, it could mean you're shortening the process, i.e., reducing the cost.
But I think the most important thing from our smart adapted platform is a model and a means of controlling the process. Whatever your input starting material is, you can have a process that adapts to your starting material and generates more consistent product.
I think it's consistency and intensification — be that in yield or time and therefore cost. With improved consistency, there is less process failure.
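The adaptive idea Statham describes, feeding based on what a soft sensor predicts rather than on a fixed schedule, can be caricatured as a simple proportional rule. The function name, setpoint, and gain below are all invented for illustration:

```python
# Toy feed decision driven by a soft-sensor estimate instead of a fixed
# schedule. Setpoint and gain are made-up values, not process parameters
# from the SMARTER project.
def feed_volume_ml(predicted_glucose_g_per_l: float,
                   setpoint: float = 2.0,
                   ml_per_g_deficit: float = 5.0) -> float:
    """Return a proportional glucose feed; zero if at or above setpoint."""
    deficit = max(0.0, setpoint - predicted_glucose_g_per_l)
    return deficit * ml_per_g_deficit

# A depleted culture triggers a feed; a replete one does not.
print(feed_volume_ml(0.8))  # → 6.0
print(feed_volume_ml(3.1))  # → 0.0
```

Because the feed responds to the measured state rather than a fixed recipe, two donors with very different starting material can still converge toward the same culture conditions, which is the consistency argument made above.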
Looking at some of the technologies you used, it seems like it would require many different functional groups. Can you talk about the different skillsets you needed in the same room together?
Statham: It is a massive undertaking. There's bioprocess knowledge, which isn't trivial; PAT; soft sensors; modeling; and then, above all, IT integration, which I think there were some questions on today. So, I think the real strong point of this SMARTER consortium was being able to assemble these partners within the U.K. and the EU that have all these distinct skills. And that was the real benefit and the strength of this project. But I think that's really a lesson companies and therapy providers could think about going forward – how to assemble teams with these skills. So yes, your data scientists, as we've heard a lot at this conference, are really important. It takes process engineers, biostatisticians, and IT experts to really pull off something like this.
About The Expert:
Patrick Statham, Ph.D., is a senior bioprocessing scientist at the Cell and Gene Therapy Catapult and a member of its Scale Enabling Technology (SET) team, where he explores upstream process development for several modalities and platforms. He received his Ph.D. from the University of Leeds.