Evaluating future drug candidates and their manufacture together

By Professor Peter Dunnill

Classically, evaluating a candidate drug has preceded evaluation of the process for making it. This late rush to process development worked reasonably well for simple chemical pharmaceuticals.

With complex biopharmaceuticals, a new approach is needed. One now being developed has implications for increasingly complex small molecule pharmaceuticals and can cut time, costs and risk.

Consider the sort of post-genomic scenario that is becoming common. A pharmaceutical company is seeking a fast route to a therapeutic agent where it has been difficult to locate a chemical candidate. A thousand antibody molecules have been selected, for example by phage display methods, with closely related capacities to bind to a particular receptor. A subset is chosen and evaluated further until one is prepared for detailed trials. The antibody looks very promising in terms of efficacy and toxicity, but as the scale goes up the yield drops dramatically.

The time for a decision to proceed with manufacturing trials looms, but the capacity to make the antibody does not improve. It is abandoned. Why? The basis of selection of the antibody has been its capacity to bind, not its capacity to remain soluble in a process environment. As scale increases, the processing period available for time-dependent aggregation grows, and more and more product is lost irreversibly. There is no time to go back.
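A minimal numerical sketch makes the mechanism plain, assuming first-order irreversible aggregation and a processing time that grows with batch volume; the rate constant and time scaling below are hypothetical, chosen only to show the trend.

```python
# Minimal sketch: why yield can collapse on scale-up when a protein
# aggregates irreversibly over time. Assumes first-order aggregation
# (rate constant K_AGG) and a processing time that grows with batch
# volume; both values are hypothetical, chosen only to show the trend.
import math

K_AGG = 0.15             # per hour, assumed aggregation rate constant
HOURS_PER_LITRE = 0.002  # assumed growth of processing time with volume
BASE_HOURS = 2.0         # assumed fixed processing time at any scale

for volume_l in (1, 100, 2_000, 20_000):
    process_time_h = BASE_HOURS + HOURS_PER_LITRE * volume_l
    soluble_fraction = math.exp(-K_AGG * process_time_h)
    print(f"{volume_l:>7} L  ~{process_time_h:5.1f} h  "
          f"soluble yield ~{soluble_fraction:6.1%}")
```

At bench scale the loss is modest; at the largest scale in this toy calculation almost nothing survives, which is exactly the pattern in the scenario above.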

With biopharmaceuticals and complex chemical pharmaceuticals, the classical approach of delaying process studies is increasingly leading to problems. The reason for not working on process design early is logical: when so many candidates will fail in the progression to Phase II clinical trials, it is not realistic to put candidates into expensive pilot plant programmes early. With simple chemical pharmaceuticals this was not too serious because, with decades of process chemistry experience, a capacity to characterise molecules fully by analysis and a well-established chemical engineering theoretical basis, rapid scale-up was usually achievable.

For biopharmaceuticals such as protein-based medicines there is no such history. The molecular genetics and relevant biochemical engineering are barely three decades old. The tendency of cell synthesis to vary, the difficulty of analysis and the lack of process experience represent a quite different situation. Companies are often finding that they cannot create efficient processes in the short time available. If they proceed with an inefficient process, and thereby a high cost of goods, the drug may fall foul of government efforts to contain healthcare costs.

A non-robust process will also pose problems with regulatory authorities. If companies delay improving the process, there will be a large irreversible loss of revenue because the period of exclusivity, already shortening generally, will be further reduced. With real progress in increasing the speed of other trials, slow process development risks being the reason a company fails to be first to market.

The logic remains that large-scale process trials cannot be conducted before clinical promise is established. What is needed urgently is a means of acquiring early information about the likely performance of a process for a new drug candidate. The scale of tests need not be quite as small as that now being pursued in initial discovery, where millions of candidates must be examined and most will be eliminated. However, the cost of process assessment must be sufficiently low that the abandonment of as many as 95% of the results following candidate failure will be financially acceptable.
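A back-of-envelope calculation shows what that constraint means in practice; the per-test cost used below is a hypothetical placeholder.

```python
# If 95% of early process assessments are discarded when their candidates
# fail, each surviving candidate effectively carries the cost of ~20
# assessments. The per-test cost is a hypothetical placeholder.
n_assessed = 100
survival_rate = 0.05           # 95% of assessments abandoned
cost_per_assessment = 25_000   # assumed cost of one miniature process study

cost_per_survivor = (n_assessed * cost_per_assessment
                     / (n_assessed * survival_rate))
print(f"Each surviving candidate carries ~{cost_per_survivor:,.0f} "
      f"({1 / survival_rate:.0f}x the per-test cost) in assessment costs")
```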

That cost constraint rules out the use of smaller pilot plants, which are still expensive to build and operate, and implies tests at a scale of no more than tens to hundreds of millilitres. Engineers learned early that working with conventional laboratory equipment and expecting industrial systems to match the results is doomed to failure. Instead, the approach is to start from the constraints of known large-scale systems and mimic them: so-called scale-down.

Even with engineering fields based on long established physical sciences, such as aerospace, scale-down alone is not enough. With a new aircraft, once the results of examining a tiny model of a wing or fuselage have been collected, the prediction of actual behaviour of a real plane demands computer-based modelling. Similar approaches are applied in some chemical fields, including simpler chemical pharmaceuticals.

The challenge with biopharmaceuticals is much greater. Knowledge of their genetically modified cell sources is still relatively empirical. Human proteins, genes (in plasmids or disabled viruses) and recombinant vaccines are highly labile materials and therefore very sensitive to the process environment. In addition, industry cannot move to a new paradigm until any new scale-down approach is workable across the whole bioprocess: to have only a very good scale-down for cell culture and a late purification step such as chromatography would be like having a high throughput screening instrument but no means of generating drug candidates.

An interdisciplinary research team at University College London has been addressing this challenge during the 1990s. It has been necessary to dissect the scaling problem into separate elements. First, the sensitivity of materials such as proteins and genes to the engineering environment is measured in very small devices. Then the corresponding level of such effects in large equipment is assessed using modelling. This information is combined for groups of interacting operations. What emerges is a set of miniature tests and modelling tools, which allow prediction of large-scale performance (1).

Figure 1 shows one ‘ultra scale-down’ tool used in predicting full-scale centrifuge performance with delicate biological materials and illustrates its size relative to the industrial equipment.

Figure 1 Researcher holding an ultra scale-down measuring device
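To give a concrete flavour of the centrifuge case: the article does not set out the UCL method itself, but the long-established sigma (equivalent settling area) criterion illustrates how a bench spin can be matched to an industrial disc-stack machine. In the sketch below, clarification is taken as comparable when the loading Q/(c·sigma) is matched; the sigma values and the efficiency factor c are hypothetical placeholders, which in practice come from machine geometry and calibration against the kind of pilot data discussed next.

```python
# Generic sigma-theory sketch: match a lab centrifuge spin to an industrial
# disc-stack machine by equating the loading Q / (c * sigma). All sigma
# values and the efficiency factor are hypothetical placeholders.

def equivalent_lab_spin_time(sample_volume_ml: float,
                             sigma_lab_m2: float,
                             q_industrial_l_h: float,
                             sigma_industrial_m2: float,
                             c_industrial: float = 0.4) -> float:
    """Lab spin time (min) matching the industrial machine's Q/(c*sigma)."""
    # Industrial loading: flow per unit effective settling area, L/h per m2.
    loading = q_industrial_l_h / (c_industrial * sigma_industrial_m2)
    # The lab 'flow rate' is the sample volume per spin time (c taken as ~1
    # for a bottle centrifuge), so solve V / t = loading * sigma_lab for t.
    q_lab_l_h = loading * sigma_lab_m2
    return 60.0 * (sample_volume_ml / 1000.0) / q_lab_l_h

t_min = equivalent_lab_spin_time(sample_volume_ml=50,
                                 sigma_lab_m2=10,             # assumed
                                 q_industrial_l_h=3_000,
                                 sigma_industrial_m2=60_000)  # assumed
print(f"Equivalent lab spin time: ~{t_min:.1f} min")
```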

For the industry to have faith in the approach, the results from the miniature device must be compared with large pilot-scale results to verify or refine the method. Even then the companies have to examine the new approach in relation to their own procedures. In an industry which is already forced to take great risks, the introduction of a new paradigm is an incremental process. Therefore, ultra scale-down must be run in parallel with conventional methods in the first phase.

Fortunately, ultra scale-down is by nature a laboratory-based technique and so does not demand additional expensive facilities. What it does require is an emphasis in the laboratory on mathematical modelling skills and biochemical engineering insight. However, this is already being demanded by the requirements of automated high-throughput screening technology and massive data processing. The new approach is summarised in Figure 2.

Figure 2 Relationship between conventional bioprocess development and the ultra scale-down approach

It stresses that large-scale process trials cannot be conducted until clinical promise and safety are substantially confirmed. In contrast, ultra scale-down can be applied much earlier. Then, subsequent large-scale trials, which begin at the same point in time as in the classical approach, can proceed with much greater insight, occupying less time and yielding a more robust process (2). Such an ultra scale-down approach can also contribute to the creation of earlier 'killing fields' to eliminate drugs that are likely to be a problem for a variety of reasons. Finally, it can inform the preparation of early trials material.

At present the ultra scale-down methods are manual, yet even so they can achieve a major step change. However, in the future it will be possible to apply automation. At first sight it is hard to see how this can be achieved, because it can take as many as 10 distinctly different operations to purify and formulate a biopharmaceutical once it has been produced from the living cell. Automated drug screening provides a possible clue, but the microwell systems employed there need to be seen through the eyes of a biochemical engineer.

In microwell technology a drug candidate is added to a well containing the biological test material; the dispensing causes mixing, after which it is possible to judge the outcome of the interaction, typically by a fluorescence-based technique. To conduct process research in microwells it is necessary to go a good deal further. If, for example, the growth of the source cell is to be examined, it must be supplied with oxygen over time. That means inducing a level of sustained and defined mixing. If purification operations such as centrifugation, membrane separation and adsorption are to be explored, then the conditions must be carefully defined so that predictions of scale-up are meaningful.
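To illustrate the oxygen-supply point, a minimal sketch of the standard transfer-rate check follows; the kLa, saturation and demand figures are assumed, typical-order values rather than measurements from the article.

```python
# Oxygen-supply check behind 'sustained and defined mixing' in a microwell
# culture: the transfer rate kLa * (C_sat - C_min) must at least match the
# cells' uptake rate. All values are assumed, typical-order numbers.
K_LA = 100.0   # per hour, assumed microwell volumetric transfer coefficient
C_SAT = 0.21   # mmol O2 per litre, approx. saturation of medium under air
C_MIN = 0.03   # mmol O2 per litre, assumed minimum dissolved O2 for cells
OUR = 12.0     # mmol O2 per litre per hour, assumed culture oxygen demand

max_otr = K_LA * (C_SAT - C_MIN)  # best deliverable oxygen transfer rate
verdict = "sufficient" if max_otr >= OUR else "oxygen-limited"
print(f"Deliverable OTR ~{max_otr:.0f} vs demand {OUR:.0f} mmol/(L*h): "
      f"{verdict}")
```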

Existing drug discovery automation of microwell technology means that some of the instrumental systems for addressing bioprocessing studies are already in place. The software linkage of these instruments also means that they are well suited to experimental design and data analysis. The UK Government has funded a new Micro Biochemical Engineering Centre at UCL to take the automated micro-processing research forward. As the pressure for evaluating ever more drug candidates with high precision grows, engineering issues of what occurs in microwells will also become important in discovery studies.

Ultra scale-down methods are potentially of importance to start-up companies in a different way. For them the financial risk in building pilot plants is very high. If several potential drug candidates fail in succession, the cost of an unused plant will reduce the chance of company survival. One alternative, of licensing out very early, reduces the proportion of the value which can be retained by the start-up.

Ultra scale-down techniques could allow the start-ups to collect process-related information and thereby retain more of the value at an affordable cost. They could reduce the expense and duration of subsequent contract manufacture or allow a stronger position when licensing out the drug. There is one crucial caveat. Those who use ultra scale-down and modelling must have a good grasp of the biochemical engineering issues involved and that does demand careful training.

The capacity to reach back towards discovery is being taken even further in new process research studies at UCL. For protein therapeutics, the genome of the host organism from which a recombinant biopharmaceutical molecule is produced gives information about processing. It defines the size of all host protein molecules, their charge characteristics and potentially the exposure of hydrophobic regions. Following the sequencing of the human genome, the same information is available for many recombinant human proteins representing the potential biopharmaceutical.

Given that separation techniques are based on such properties, it is potentially possible to define a preferred separation process sequence. The genome also contains information about the potential of the cell as a source. In practice the situation is more complicated, and bioinformatics is needed to apply genomic and protein structural data. The approach will address the kind of problem in antibody processing raised at the beginning of this article. It will also open up the possibility of designing proteins for bioprocessing as well as for enhanced clinical performance, though the latter will always be critical.
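As an indication of how sequence data can feed such reasoning, the sketch below derives the properties listed above (size, charge, hydrophobicity) with Biopython's ProtParam module; the sequences are short hypothetical stand-ins rather than real product or host proteins.

```python
# Deriving separation-relevant properties from sequence with Biopython.
# The sequences are short hypothetical stand-ins, not real proteins.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

product = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"       # hypothetical product
host_proteins = {
    "host protein A": "MSTNPKPQRKTKRNTNRRPQDVKFPGG",
    "host protein B": "MAHQVGTGSNDEDKSPDLGTSEEAR",
}

def properties(seq: str) -> dict:
    pa = ProteinAnalysis(seq)
    return {"mw_kda": pa.molecular_weight() / 1000.0,  # size
            "pI": pa.isoelectric_point(),              # charge behaviour
            "gravy": pa.gravy()}                       # hydrophobicity

prod = properties(product)
for name, seq in host_proteins.items():
    host = properties(seq)
    # A large difference in a property suggests the matching separation
    # principle: size -> gel filtration, pI -> ion exchange, GRAVY -> HIC.
    print(name,
          f"d(size) {abs(prod['mw_kda'] - host['mw_kda']):.1f} kDa,",
          f"d(pI) {abs(prod['pI'] - host['pI']):.1f},",
          f"d(GRAVY) {abs(prod['gravy'] - host['gravy']):.2f}")
```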

As indicated earlier, the issues of scale translation are perceived to be more straightforward in chemical processes. However, there are changes that have required a reassessment. For example, the wide use of combinatorial chemistry to generate candidate compounds can put a bigger step between discovery and large-scale synthesis, and the rise in structural complexity to achieve selectivity tends to push up the intricacy of synthetic processes.

For these reasons microwell-based approaches are now being applied, though mostly to predict performance of reaction steps. As cost and environmental pressures bear down on the industry there is a growing use of combined chemical and biochemical syntheses, and this too will make it valuable to have more early insight into the key process issues.

A final advantage of ultra scale-down if it can be fully automated is that it will bring closer the time when translating a drug from discovery to market can adopt a complete systems approach. Already discovery is increasingly dominated by bioinformatics and computer-linked automated screening. The data from trials for safety, efficacy and regulation are becoming fully electronic in form. The judgement of value to patients will be similarly encoded.

The complex task of making choices between the options presented by these data is increasingly being assisted by computer-based decisional tools (3).
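As a flavour of such a tool, the following minimal sketch compares two manufacturing routes by risk-adjusted expected cost; the options, probabilities and costs are hypothetical, and tools such as that cited in (3) model far more detail.

```python
# Minimal decisional-tool style sketch: compare two manufacturing routes by
# risk-adjusted expected cost. All figures are hypothetical.
options = {
    "in-house pilot plant": {"capital": 20e6, "cost_per_run": 0.5e6},
    "contract manufacture": {"capital": 0.0,  "cost_per_run": 2.0e6},
}
runs = 5            # assumed number of process runs needed
p_success = 0.30    # assumed probability the candidate survives trials

for name, o in options.items():
    spend = o["capital"] + runs * o["cost_per_run"]
    # Expected spend per successful candidate: the simplest risk adjustment.
    per_success = spend / p_success
    print(f"{name:22s} spend {spend / 1e6:5.1f}M -> "
          f"~{per_success / 1e6:5.1f}M per expected success")
```

None of these approaches replaces human skills and judgement. However, being able to bring together all the issues of drug development, including early assessment of manufacturing issues, will give a much better prospect of creating the new generation of post-genomic drugs at a price that can be afforded. As Arlington (4) has noted, "sticking to the present course of action is simply not an option". DDW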

This article originally featured in the DDW Fall 2001 Issue

Peter Dunnill was Director of the Advanced Centre for Biochemical Engineering at University College London for 10 years and is now its Chairman. He was a member of the Biotechnology Joint Advisory Board and has consulted for major pharmaceutical companies and for biotechnology investors such as Abingworth.

References
1 Boychyn, M et al. Characterisation of flow intensity in continuous centrifuges for the development of laboratory mimics. Chem. Eng. Science 2000 56:1-12.

2 Dunnill, P. Creating a UK genome valley: the role of biochemical engineering. Ingenia 2001 51-54.

3 Farid, SS et al. A tool for modelling strategic decisions in cell culture manufacturing. Biotechnol. Prog. 2000 16:829-836.

4 Arlington, S. Pharma 2005: an industrial revolution in R&D. PricewaterhouseCoopers, Nov 1998.
