The Industrial Evolution of Screening Infrastructure
Industrialisation of screening depends upon the use of a stable, mature infrastructure. According to a recent Boston Consulting Group report, a pharmaceutical company needs to spend 900 million ($810 million) on R&D to bring a product to market.
This is almost three times the cost of developing a medicine some 10 or 15 years ago, and yet the numbers of products reaching the market have changed little. It is therefore widely accepted that pharmaceutical companies will have to get more value out of that investment if they are to prosper. Unlocking R&D productivity is a universal challenge in the industry.
GlaxoSmithKline (GSK) has invested significant sums in the development of massively increased screening capacity. The business has moved rapidly since the merger in 2000 to develop a worldwide structure of automation, IT and facilities to enable the co-ordinated prosecution of high throughput screening, compound profiling and lead optimisation.
In doing this, it has had to face the challenge of moving scientists, who have an in-built desire for constant optimisation and innovation, into the more restricted, documented and de-optimised world of routine manufacture. This ‘industrialisation’ is a trend discussed widely within the screening community. This article outlines some of the key issues involved in achieving it.
There are many texts on manufacturing models (1,2) and some very valuable academic research on which to draw (3,7). However, much of the effort to industrialise screening is in challenging the paradigm that drug discovery is fundamentally different from many other hi-tech, process-driven manufacturing industries with short product lifecycles. Many of the requirements of mobile phone, PC and chip manufacturing have similar difficulties of fragile raw materials, particular working environments and regular modifications to the design of their end product.
Yet their processes, automation, IT systems and costs rarely resemble those of most screening environments. Where differences certainly exist is in the very diffuse range of solutions for each task type used across the drug discovery industry, the very high price/quality ratio it has historically borne, and the readiness of scientists to innovate at the production end of data manufacture. This expression of scientific credibility has often been accepted without true consideration of the holistic costs involved and the impact on overall effectiveness.
The model for success in these industries may not be the Henry Ford model of manufacture – readily conjured up at the mention of the term – but relies heavily on Japanese and more recently developed cellular techniques (1,4,8). These deal with the utilisation of segmented production lines, the organisation of multidisciplinary teams responsible for the complete delivery of specified products and the reduction in rework and wastage.
These more recent developments have come from many studies conducted on the effectiveness of ‘traditional’ production models. While productive by their very scale, long linear production systems, where products are passed between many different groups as they are assembled, have drawbacks in that a large number of interfaces are required between groups of different skills. This leads to issues of ineffective communication and understanding that then may develop into wasted time, effort and materials.
The concept of throwing a finished assay, screen or reagent ‘over-the-wall’ to another group to carry on with has many parallels with this approach. The building and empowerment of multidisciplinary teams around designated automation platforms has generated business improvement in many industries and may represent a better ‘fit’ for a screening organisation looking to improve its processes.
What then is meant by the term ‘industrial’? Most certainly this term can mean different things to different people. However, for a modern screening infrastructure, it is a four-fold approach: standardisation, system consolidation based upon reliability, strategic management of the production line and process change.
Assay design, uHTS and lead optimisation within GSK are global entities. Assays designed in one group in one country can as easily become screens and lead optimisation projects on that site or on a separate continent. This philosophy defines the approaches that must be taken in building an effective infrastructure of hardware and software. Implicit in this is the hard work of standardisation and change management. This task has a still keener edge in a merging company, where former competitors become colleagues. For the clear good of the new organisation, this diversity must be harnessed to mould a common, agreed approach.
Standardisation can be seen by many as adopting the lowest common denominator to achieve consensus: taking the smallest steps along the road. Additionally it is quite natural for merging groups to see the adoption of a solution used by only one of the former companies to be a ‘victory’ or ‘loss’ in the exchange of ideas. It does not have to be that way. With a clear vision and the support of the organisation to make a sea change in approach to these processes, standardisation offers the opportunity to do far more than define a workable and robust onward solution.
It can be, and is, a tool of merger and a method of managing the inevitable change. It provides both a focus for discussion and, at times, a helpful whipping boy on which both sides may jointly vent their frustration. Developing a new company standard is a genuine deliverable for any merged organisation. Such standards, once they have permeated through the business and imprinted themselves upon working practices and data, make organisations look forward eagerly rather than backward fondly. Standardisation needs to occur for hardware systems and, importantly, for how biological data are collected, analysed, stored and retrieved.
The global manufacture of biological data, which is the true mission of modern screening for international drug discovery companies, is greatly aided by the common understanding of how these data are generated. The calculation, curve-fitting, quality assurance and contextual storage of data are emotive topics in any company, as is even the nomenclature which surrounds them.
Scientists generate data using mechanisms they believe to be optimal for their area and are very ready to enter energetic debate with colleagues who, from a different experience, hold a different view. This is traditional scientific debate by individuals with a commitment to their work. Effort is required to bring these groups together and develop mechanisms whereby all scientists can understand and interpret data generated by other groups. This avoids the traditional issues of misunderstanding, misinterpretation and resources wasted by needless rework.
The choice and implementation of such solutions are themselves a major challenge, but not the subject of this piece. However, once data standards are agreed, the use of globalised IT solutions for the generation and storage of data is facilitated. Third-party IT systems for screening (eg ActivityBase, ID Business Solutions) are becoming globally accepted. Such solutions can also be effective merging tools.
The in-built flexibility of research scientists soon has disparate groups discussing their work in the language and framework of a common tool, sharing common problems and able to evolve coherently. In-house IT resources are thereby freed to provide truly value-adding components for analysing data, automating the application of business rules and developing ‘expert’ components.
The equipment market surrounding drug screening contains a potent mixture of novelty and high technology. The scientist’s desire to push against boundaries or perceived bottlenecks, and the vendor’s requirement for rapid return on investment, has led to the endemic use within customer groups of ‘beta’ prototypes with extensive resource utilisation in the evaluation and evolution of these potential new solutions. New technology areas are avidly pursued with a desire to exploit them rapidly for potential competitor advantage.
This leads to prototype instruments being used in production screening. There are no reliable industry data to support the benefits of this approach. Indeed, a large number of these industry-wide activities have resulted in obsolete or unused equipment. Neither vendors nor customers are clear as to who should bear the cost of development. A ‘beta’ instrument may offer early access to a technology at a reduced price, but this discount may not truly reflect the internal costs of instruments potentially under-performing under production conditions.
There is a great deal of literature on successful approaches to manufacture, which concludes that on-line experimentation with any part of the production process leads to inefficiencies and expense (3). With cost being a major driver across all screening-related activities, a more risk-averse and resource-sparing approach is necessary, as well as being appropriate in process terms.
The use of discrete event modelling, and other established methods, can help in defining the real requirement for, and process impact of, new equipment, along with characteristics of acceptable machine failure and utilisation rates. Using this software is a skill but can be enlightening. Often, counter-intuitive approaches modelled in this way can prove to be highly cost-effective.
The validity of the model is dependent upon the quality and richness of the data used to generate it but this only serves to push organisations into the generation of quality metrics, which are vital to continued efficient running. Prototype experimentation should, and will, be allowed to continue within these boundaries, but successful candidates should be brought strategically, rather than rapidly, into production use.
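To illustrate the kind of analysis discrete event modelling supports, the sketch below (in Python, with entirely hypothetical process times, failure and repair rates – not figures from any real screening line) estimates elapsed time and utilisation for a single workstation subject to random breakdowns:

```python
import random

def simulate_cell(n_plates, process_min=3.0, mtbf_min=480.0,
                  mttr_min=45.0, seed=7):
    """Toy discrete-event model of one screening workstation.

    All figures are hypothetical: process_min is minutes per plate,
    mtbf_min the mean time between failures, mttr_min the mean time
    to repair. Returns (elapsed_hours, utilisation).
    """
    rng = random.Random(seed)
    clock = busy = 0.0
    next_failure = rng.expovariate(1.0 / mtbf_min)
    for _ in range(n_plates):
        # If a failure falls due before this plate completes,
        # insert a repair delay and schedule the next failure.
        if clock + process_min > next_failure:
            clock += rng.expovariate(1.0 / mttr_min)
            next_failure = clock + rng.expovariate(1.0 / mtbf_min)
        clock += process_min
        busy += process_min
    return clock / 60.0, busy / clock
```

Even a toy model of this sort makes explicit how sensitive real capacity is to failure and repair rates, and why counter-intuitive choices – a slower but more dependable instrument, for example – can prove the more cost-effective.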
The term ‘industrial’ when applied to equipment brings to mind many things, but perhaps the most constant feature of them all is dependability. The Formula One motor car is a highly impressive, blisteringly fast and fascinating piece of technology. However, it may have only a one in two probability of completing a 200-mile race at its designated pace, and even less of scoring competitive points for its owners. Vast sums are pumped into the constant development and support of these systems.
The laboratory equipment and screening businesses, as they have developed, have been jointly creating and redeveloping Formula One equivalents, supported by a large number of competing vendors and a voracious conference circuit. While this challenge for speed has been highly engaging it clearly does not fit the model for future guarantee of robust, high quality output.
New technology can offer genuine progress and efficiency benefits. However, these genuine step-changes in productivity are, in reality, few and far between. The scientist will always be attracted by the lure of ultimate optimisation, but this native desire must be balanced by traditional – if unfashionable – management accountancy techniques. These give genuine insight into the holistic costs associated with the replacement of existing production systems with new approaches.
Where the balance lies between novelty, reliability and cost is a judgement for each screening company, large or small. However, for a business where high volume data manufacture is the prime directive, a risk management approach must be taken. A valuable offering to such an organisation will therefore be one analogous to a luxury sports car: tested, documented, pre-crashed and well supported by a trusted manufacturer. These more mature products may carry a higher list price than the early access models, but will offer reduced downtime and hidden cost, making them the greater value proposition.
Large-scale screening groups in lead optimisation or HTS/uHTS place significant strain on equipment. The strategically important elements of the screening production line must be capable of exceeding these demands on their engineering. Most automation solutions are marketed on the basis of being able to meet accuracy and reproducibility measures. Robustness characteristics, however, are generally only available through customer evaluations or, at worst, experience in production screening.
A reason for this lack of real-world data may be the novelty of the equipment. This in itself represents a risk to dependability – although if engineering reliability has been demonstrated under basic conditions by the manufacturer, this risk can begin to be understood. Customers require greater amounts of engineering robustness data prior to system selection, both to minimise the time in-house groups must invest in this activity and in recognition that reliability data are as potent a marketing point as accuracy.
Laboratory automation to date has striven for high flexibility in single solutions to match the variability in requirements and, ironically, the constant change in technology that the area encourages. However, this approach has not necessarily delivered lasting quality or robustness to drug discovery. One of the lessons learned from the success outside drug discovery of cellular manufacturing techniques is the platform-based approach to automation. Here, a robust automation system is tasked with doing a small number of limited operations.
Production cells may be large integrations (eg The Automation Partnership, Evotec, RTS Life Sciences) or workstations (eg Zymark, CyBio, Velocity 11). The principle is that they bed down to perform limited tasks with high dependability and without the retooling and rescheduling often demanded of systems in the past.
As new technologies become accepted, new cells are generated for them – as with production lines for machined parts or assemblies of mobile telephones. Building specific production cells and multidisciplinary teams of people around them provides a route to sharing the successes derived by others through the adoption of this approach.
Another facet of providing reliable, high-capacity screening facilities is the maintenance of a strategic portfolio of automation, much as a pharmaceutical firm maintains its portfolio of therapeutic products. In this way, GSK maintains a balanced, stable infrastructure for production screening across the world and works with vendors in a regulated and well-managed environment. In each segment of equipment (eg arms, reagent dispensers, washers, etc) one to three solutions are proposed for worldwide application. This can also be the case for reader types (eg ViewLuxTM, PerkinElmer; FLIPR® and AcquestTM from Molecular Devices Corporation).
These systems are selected through a defined process, relying on significant input from vendor companies for the generation of engineering robustness and reliability information, followed by confirmatory testing of successful candidates at the most appropriate of six available sites. Novel solutions are compared against existing portfolio members and are only considered for further interest if gaps are identified or portfolio systems under-perform or become obsolete.
Successful vendors must be prepared and able to enter global service level agreements on system and consumables supply and the provision of high-level after-sales support. This is the point where stronger customer-vendor partnerships must be developed, broadening from the focus upon the development end of the product cycle. Clear service level agreements are required to ensure smooth running (5,7). Regular preventative maintenance of key equipment and rapid times to resolution, should unforeseen errors occur, need to be negotiated and kept under regular review.
In the modern screening facility, preventative maintenance becomes an area where concentrated internal effort is required to ensure that all elements of the facility are covered. As is the case for cars and aeroplane engines, preventative work should be scheduled, not by calendar time, as has traditionally been the case in the laboratory, but by duty cycle of the instrumentation. This requires an understanding of the likely duty cycle of each instrument within a production environment, then a constant monitoring of the use of that instrument. This is not simple to capture, or maintain, across a large instrument inventory, but is the metric required to assess whether parts and consumables require replacement.
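The bookkeeping behind duty-cycle-based maintenance can be simple, even if capturing the usage data is not. The following Python sketch (with hypothetical instrument names and service thresholds) records cumulative cycles per instrument and flags those due for preventative work:

```python
from collections import defaultdict

class DutyCycleLog:
    """Track cumulative use per instrument and flag service-due items.

    Thresholds (cycles between services) are hypothetical examples;
    in practice they would come from vendor engineering data.
    """
    def __init__(self, thresholds):
        self.thresholds = thresholds      # instrument -> cycles per service
        self.cycles = defaultdict(int)    # cycles since last service

    def record_run(self, instrument, cycles):
        self.cycles[instrument] += cycles

    def service(self, instrument):
        self.cycles[instrument] = 0       # reset after preventative work

    def due(self):
        return [i for i, c in self.cycles.items()
                if c >= self.thresholds.get(i, float("inf"))]
```

The point of the design is that maintenance falls due when the counter crosses its threshold, however much or little calendar time has elapsed – exactly the shift from calendar-based to duty-cycle-based scheduling described above.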
For GSK, vendors must be able to offer support services on a worldwide basis and recognise that the holding of spares is a shared responsibility between customer and supplier. In this way the vendor develops a close relationship with the company and maintains an ongoing advertisement for the quality of their product, ie successful and continued use in production – ‘the best ad is a good product’.
The generation and maintenance of such a portfolio does, of course, require a centralised approach, but this does not prevent local groups from expending some of their energy, where available, on automation R&D. By ringfencing resources for this activity, systems development may be supported, yet maintained at an appropriate level where core production activities are not compromised. Successful solutions can then be scheduled for future implementation at single, or multiple, sites.
A process of standardisation and consolidation of infrastructure will not, in itself, usher in a new age of industrialised screening. There is a requirement for scientists and support staff to develop a new way of working. This, again, challenges the scientist to adapt to rigorous, repetitive, sometimes de-optimised processes in order to improve dependability. Process developments help to improve the robustness and reliability of most automation systems. For example, a management focus upon, and facilitation of, instrument QC and the centralisation of QC data has driven improvements across the business and allowed automation support groups to target problems early.
Also, the robustness of automation in production has improved due to the introduction of standard documentation which scientists and support staff complete before and during each run to check set-ups and ensure agreed procedure has been followed. Such documentation limits human error and provides audit trails for both metrics gathering and troubleshooting.
Business improvement techniques such as lean sigma and six sigma (6,7) have been proven to benefit many industries seeking to maximise output and reduce inefficiency. The application of such data-driven process improvement techniques to screening provides a further challenge in the metamorphosis into a modern manufacturing business. Statistical process control tools show potential here for ‘real-time’ information gathering and offer another point of control.
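At its simplest, statistical process control means deriving limits from an in-control baseline and flagging runs that breach them. The Python sketch below does this in the Shewhart style for a stream of QC values (the figures in the example are hypothetical, loosely in the range of an assay quality score such as Z′):

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Shewhart-style individual-value limits (mean +/- 3 sigma)
    derived from an in-control baseline sample."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(values, lo, hi):
    """Indices of points breaching the control limits."""
    return [i for i, v in enumerate(values) if not lo <= v <= hi]
```

Fed with each run's QC metric as it arrives, even this minimal rule gives the ‘real-time’ point of control described above; richer rule sets (trends, runs above the centre line) can be layered on in the same way.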
The generation of appropriate metrics around automation and overall process performance is a requirement if we are to learn about the true performance of systems under strain or to improve how we interact with them. Headline metrics should be concentrated upon those areas where most improvement is required (eg run failure rates or QC data) (2,6,7). The collection and constant review of these values concentrates the mind of all concerned in the screening enterprise on the real issues.
In order for this effort to bear fruit, Continuous Business Improvement (CBI) must be seen as an accepted culture rather than a single, timebound project. This culture embraces the avid collection and use of metrics to identify and understand process problems. Scientific groups are comfortable with data interpretation, and this approach suits the environment so long as the collection of metrics is seen as a positive step towards making things better rather than as a threat. An example of this is the study of rework cycles.
All screening capacities are scaled to meet the requirements of the customer group served by that facility. However, the holistic impact of rework cycles must also be genuinely factored into such calculations. Rework may be caused by a number of factors, both within and outside the control of scientists, and is implicit in an environment where quality of output is paramount. This is basic practice on major production lines in other industries, and such an accurate and realistic approach helps to ensure that predictions of true capacity and throughput are met.
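The arithmetic of factoring rework into capacity is worth making explicit. If a fraction r of runs must be repeated, and repeats can themselves fail at the same rate, the expected number of runs per good output is the geometric series 1 + r + r² + … = 1/(1 − r). A minimal Python rendering (the rework rate in the example is hypothetical):

```python
def required_runs(good_outputs, rework_rate):
    """Expected total runs to deliver `good_outputs` successes when a
    fraction `rework_rate` of runs must be repeated, and repeats can
    themselves fail (geometric series: 1 / (1 - r))."""
    if not 0 <= rework_rate < 1:
        raise ValueError("rework_rate must be in [0, 1)")
    return good_outputs / (1 - rework_rate)
```

So a facility quoting capacity for 1,000 screens at a 10% rework rate must actually plan for roughly 1,111 runs – the gap between nominal and true capacity that realistic planning has to close.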
Industrialised screening, when dissected into its component parts, may appear a sterile affair, driven by the unglamorous graft of process and standardisation. However, this is overlooking the many stimulating scientific and business challenges which are implicit within it. The benefits of the changes required, however tough they may be, are manifest. The lean organisations that emerge from these changes are undoubtedly of an unprecedented capacity, but are also fundamentally stronger and more robust.
There will continue to be an evolution of technology, process and quality control – as for all manufacturing businesses – but the strong foundations being laid down as the ‘industrial model’ consolidates will allow for that. Changes in how the activity of screening is perceived, from scientists in industry to industrial science, open up opportunities for understanding and further learning from other industries, and for adopting many of their successful experiments in managing cost, quality and delivery. It is an exciting time for screening and for those in support of it. DDW
This article originally featured in the DDW Summer 2003 Issue
Chris Molloy is Manager of the Screening Support Group, Cheminformatics, based at Stevenage, UK and has been actively involved in lead optimisation and high throughput screening for 13 years. He chairs the Automation Platforms Committee within GSK which has responsibility for co-ordinating automation across screening at all sites. He is also responsible for the implementation of data analysis software to screening groups worldwide.
1 Factory Physics, Wallace Hopp and Mark Spearman. ISBN 0256247951.
2 The Goal, Eli Goldratt and Jeff Cox, 1993. MPG Books. ISBN 0566074184.
3 Institute for Manufacturing, http://www.ifm.eng.cam.ac.uk.
4 Tackling Industrial Complexity, Gerry Frizelle and Hugh Richard. ISBN 1902546245.
5 Logistics and Supply Chain Management, Martin Christopher, 1998. Prentice Hall. ISBN 0273630490.
6 The Six Sigma Way Team Fieldbook: An Implementation Guide for Project Improvement Teams, Peter S. Pande, Robert P. Neuman and Roland Cavanagh. ISBN 0071373144.
7 The Lean Toolbox, John Bicheno. ISBN 0951382993.
8 Value Stream Management, Peter Hines, Richard Lamming, Dan Jones, Paul Cousins and Nick Rich. ISBN 0273642022.