Quality by Design, or QbD, is an important tool for ensuring that the pharmaceuticals taken by patients are consistently safe. It gives confidence that the process used to make them is both robust and reproducible. It emphasizes the importance of a thorough understanding of the product and of process control, based on sound science and risk management. It also provides a basis for identifying any risks that might affect product quality, enabling them to be mitigated.
The critical quality attributes, or CQAs, for the product are first defined, along with the acceptable ranges for each one. These are then linked back to all the quality-relevant process parameters (QRPPs) that can be controlled, and a model is established that permits the prediction of combinations of QRPPs that will give acceptable product quality.
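To make that bookkeeping step concrete, here is a minimal sketch in Python. The CQA names, acceptance ranges, and predicted values are all invented for illustration, not those of any real product; the point is simply that model predictions are checked against each acceptable range.

```python
# Minimal sketch of QbD bookkeeping: hypothetical CQAs with
# acceptance ranges, checked against model-predicted values.
# (All names and numbers are illustrative.)
cqa_limits = {
    "assay_%": (98.0, 102.0),
    "impurity_X_%": (0.0, 0.15),
    "yield_%": (80.0, 100.0),
}

# Values a process model might predict for one set of QRPPs.
predicted = {"assay_%": 99.4, "impurity_X_%": 0.09, "yield_%": 86.0}

in_spec = all(lo <= predicted[name] <= hi
              for name, (lo, hi) in cqa_limits.items())
print("All CQAs within acceptable ranges:", in_spec)
```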
When carrying out process development, typically only one process parameter is changed at a time when looking for proven acceptable ranges. However, this is slow, and there is a significant risk of creating ‘blind spots’ for interacting combinations of parameters. QbD gives a much deeper understanding of the process, allowing wider parameter ranges to be defined and providing greater operational flexibility. With a broad and well-defined design space, adjustments to the reaction conditions can even be made without having to file a post-approval regulatory change.
Kinetic or mechanistic computer modelling can be used to make the QbD process more effective. Mechanistic models are built from physical and chemical equations that describe the underlying reactions; empirical models, by contrast, require a huge amount of data to derive equations that describe the process.
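As a toy illustration of what a mechanistic kinetic model looks like in practice, the sketch below integrates the rate equations for a hypothetical two-step reaction A → B → C with SciPy. The species, rate constants, and concentrations are invented for illustration and are not taken from our process.

```python
# A minimal mechanistic kinetic model: hypothetical two-step
# reaction A -> B -> C with rate constants k1 and k2.
from scipy.integrate import solve_ivp

def rates(t, y, k1, k2):
    """Rate equations d[A]/dt, d[B]/dt, d[C]/dt."""
    A, B, C = y
    return [-k1 * A,           # A consumed in step 1
            k1 * A - k2 * B,   # B formed in step 1, consumed in step 2
            k2 * B]            # C is the final product

# Integrate from t = 0 to 10 h, starting from 1.0 mol/L of A.
sol = solve_ivp(rates, (0.0, 10.0), [1.0, 0.0, 0.0], args=(0.8, 0.3))
print("Final [A], [B], [C]:", sol.y[:, -1])
```

The appeal of this form is that each term maps onto a chemical step, so parameters keep a physical meaning, unlike a purely empirical fit.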
Here at Siegfried, we’ve been putting a lot of effort into developing an improved model-based approach that enables us to complete process development more quickly. It combines quality risk assessment with mechanistic kinetic modelling in an iterative workflow. First, we collect all the available data about the process, and make a preliminary quality risk assessment to identify knowledge gaps. This systematic approach allows us to define experiments that will help fill those gaps.
At the same time, we develop a mechanistic kinetic model based on existing process understanding and our working hypothesis of the reaction mechanisms. Results from the experiments are then fed back into the model, allowing us to adjust and improve it.
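In code, that adjustment step often amounts to regressing the model’s kinetic parameters against measured concentrations. The sketch below fits the two rate constants of the toy model above to invented measurement data using a least-squares routine; the data points are illustrative, not real experimental results.

```python
# Sketch: fitting rate constants of the toy A -> B -> C model
# to (invented) measured product concentrations.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def rates(t, y, k1, k2):
    # Same toy model as in the previous sketch.
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

t_meas = np.array([1.0, 2.0, 4.0, 8.0])      # sampling times (h)
c_meas = np.array([0.05, 0.15, 0.38, 0.66])  # illustrative measured [C] (mol/L)

def residuals(k):
    """Difference between model-predicted and measured [C]."""
    sol = solve_ivp(rates, (0.0, t_meas[-1]), [1.0, 0.0, 0.0],
                    args=tuple(k), t_eval=t_meas)
    return sol.y[2] - c_meas

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(0.0, np.inf))
print("Fitted k1, k2:", fit.x)
```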
We then re-evaluate the quality risk assessment and the model, and repeat the process until we determine which of the QRPPs might affect product quality. This allows us to create a model that can predict process responses effectively.
We recently published a paper in the journal Chimia that showed how we applied this QbD approach to a telescoped two-step synthesis. In this example, we created a kinetic model comprising 12 individual reactions to describe the synthesis itself and the side reactions. This allowed us to predict the effects that changes to process parameters would have on the overall yield and quality of the product.
It worked really well. For the process, we identified 32 potential QRPPs, and experiments allowed us to reduce this to just the six that were most likely to affect yield and quality. This significantly reduced the complexity of the model.
We then created an initial mechanistic model, taking into account all the available knowledge about the process and our working hypothesis of the reaction mechanisms. This included 36 reactions in total, but pinning down the kinetic parameters for such a comprehensive model is clearly impractical. In parallel with the quality risk assessment, therefore, we used our extensive process understanding to simplify the model. Further experiments allowed us to reduce the number of reactions in the model to just 12.
The next step was to use the model to identify the combinations of variable QRPPs that would allow the CQA limits to be met – known as the design space. The systematic approach and mechanistic model gave us significantly increased process understanding, and in this specific example allowed us to spot that the presence of water led to an increase in a decomposition side-reaction. A more traditional one-factor-at-a-time approach would not have revealed the very complex interactions and relationships between the many critical process parameters.
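A design-space search can be sketched with the same toy model. Below, we add a hypothetical water-promoted decomposition B → D (echoing the water effect mentioned above, though with invented kinetics), scan two stand-in QRPPs, and flag which combinations keep the stand-in CQAs within limits; none of the numbers come from the actual process.

```python
# Sketch of a design-space scan: vary two stand-in QRPPs
# (reaction time, water content) and check stand-in CQA limits
# (yield of C, level of impurity D). All kinetics are invented.
from scipy.integrate import solve_ivp

def rates_w(t, y, k1, k2, k3, water):
    # Toy model extended with water-promoted decomposition B -> D.
    A, B, C, D = y
    decomp = k3 * water * B
    return [-k1 * A, k1 * A - k2 * B - decomp, k2 * B, decomp]

for t_end in (4.0, 8.0, 12.0):      # stand-in QRPP 1: reaction time (h)
    for water in (0.0, 0.5, 1.0):   # stand-in QRPP 2: water content (arb.)
        y_end = solve_ivp(rates_w, (0.0, t_end), [1.0, 0.0, 0.0, 0.0],
                          args=(0.8, 0.3, 0.2, water)).y[:, -1]
        in_spec = y_end[2] >= 0.80 and y_end[3] <= 0.05  # CQA limits
        print(f"time={t_end:4.1f} h  water={water:.1f}  "
              f"yield={y_end[2]:.2f}  impurity={y_end[3]:.2f}  "
              f"in spec: {in_spec}")
```

In practice the scan would run over all variable QRPPs at once, and the region where every CQA stays within its limits is what gets filed as the design space.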
The model proved invaluable in improving the process. And with more data in hand from the process at production scale, it should be possible to make further improvements. Overall, the model allows for fast process optimization and better control over quality.
If you’re interested in finding out more about how it works, please get in touch. You can also check out the full paper.