Changing the Model for Lab Equipment Maintenance and Asset Management
Many organisations in the pharmaceutical, biotech and supporting industries are currently driving aggressive cost-saving initiatives throughout their operations to remain competitive and deliver shareholder value.
One such area is the cost associated with maintaining laboratory instrumentation and apparatus, ensuring it is performing to specification, and meeting regulatory requirements. Historically, equipment maintenance was mainly provided by individual equipment manufacturers, but consolidated service provision has become increasingly popular in recent years.
Reducing maintenance expenditure is a business imperative for these organisations, yet many fail to recognise that, in parallel, significant improvements can be realised in equipment uptime and scientist productivity as part of the edict for change.
This article highlights best practices that make the difference between a pure cost-savings initiative that could potentially adversely impact science, and a well-developed maintenance programme that achieves cost savings and supports other important business goals.
Although many of these principles can be applied to any size of laboratory, the focus here is on environments with more than 100 instrument systems that spend more than $1 million a year on equipment maintenance.
Changing the way maintenance is delivered affects multiple stakeholders in an organisation, so it makes sense to pull representatives from each business function into a cross functional team. The most important stakeholder group is the scientific community, which it is vital to engage early in the process to understand its needs and address its concerns, both real and perceived. This promotes trust and good communication, and prevents planned changes from stumbling when they are rolled out.
Even with a cross functional team in place, the team should expect the unexpected. For example, one cross functional team began implementing a consolidated service model before realising that one of its laboratories had its own team performing Preventative Maintenance. Needless to say, communicating to this laboratory that a new service provider was now delivering maintenance services caused great distress to that scientific community.
Many consolidated maintenance models provide software-based asset management capabilities as part of the programme, in which case it is beneficial to have IT representation on the cross functional team. IT will be able to integrate the resource requirements of the consolidated maintenance programme into its overall project timeline, and provide guidance on high-risk issues such as network and internet security.
A third key group of stakeholders in GxP environments is the QA or Metrology function, which will want to understand and eliminate any risk to compliance resulting from a change to the organisation’s maintenance model. In some cases these groups may be resistant to any change in process or documentation; in others change may be positively encouraged. Regardless, the team’s understanding of approved processes is vital to the success of the overall programme.
A summary of how different stakeholder groups contribute to effective change is shown in Figure 1.
Framing the task
One of the first tasks of the cross functional team is to understand and then detail the top level objectives and desired outcomes passed down by senior management, which usually fall into one of the following categories:
1) Provide better control and reporting of maintenance expense and asset effectiveness.
2) Achieve cost savings targets usually expressed in terms of a percentage.
3) Improve laboratory productivity and efficiency, often articulated as getting scientists back to science.
4) Reduce cost of compliance (in those areas subject to regulatory control).
Once the team reaches agreement, it can start researching the types of consolidated maintenance programme available in the marketplace, and those being used, successfully or otherwise, by industry peers. Some teams will define a formal Request for Information (RFI) process to solicit information, while others will leverage their industry contacts to understand what is available and what works.
Every potential solution will meet the objectives to varying degrees, so the team will find itself answering the following key questions to narrow down the range of potential solutions.
– How important is it to retain the original equipment manufacturer as the service provider?
– Do the benefits of using an alternative service provider (cost, uptime) outweigh the potential pitfalls (quality, scientist acceptance)?
– If we stay with the original manufacturer for service, do the benefits of cancelling full-coverage contracts in favour of limited-coverage contracts or time and materials (lower cost) outweigh the potential pitfalls (service response)?
– Do the benefits of a centralised maintenance management centre (service event management, reduced scientist involvement with non-core activities, asset and service provider performance reporting) outweigh the potential pitfalls (change, reduced scientist contact with the equipment provider for service)?
The answers to these questions will define the boundaries of the consolidated maintenance programme requirement and will vary from company to company based on culture, history and current internal economic climate.
Teams should be prepared to change their perspective and opinions as they learn more about their current asset base and maintenance system, and about the capabilities and limitations of consolidated maintenance programmes. Fear of change will also bring emotions to the surface, and teams may have to dedicate considerable time to overcoming these hurdles where they feel the gain is worth the time investment.
Defining the process
The majority of companies in this industry embrace a formal Request for Proposal (RFP) process, and by now the team should have collated enough information to define its requirements accurately and pull together a short list of potential suppliers. In addition to asking formal questions, the team should encourage potential suppliers to proffer additional information about their consolidated maintenance model – it is always possible the team has overlooked a very attractive business benefit.
It is recommended the team categorises questions to focus vendors on how they deliver their solution rather than what the solution comprises:
– Description of consolidated maintenance model proposed.
– Description of the commercial model proposed – fixed price, shared risk, management fee, open book, etc.
– How the model will be resourced by the provider – implementation team, service engineers, quality engineers, administration, management support, back up resources, direct employees versus subcontractors, etc.
– Breakdown of the proposed service provision by line item – provider’s own engineer, original manufacturer, independent.
– Descriptions of the skills of people allocated to the programme and how training is secured for multi-vendor technologies.
– Description of the parts supply chain, logistics and inventory management system.
– Description of how scheduled and unscheduled maintenance events are managed through to conclusion with supporting process maps, SOPs and documents. Examples include PM checklists, service reports, calibration/qualification and quality management system documentation.
– Definition of the performance promises or guarantees of the programme.
– Description of the capabilities of the asset management software proposed.
– Description of how the programme will be implemented – timeline, critical paths, provider expectations.
– Proposed Scope of Work (SOW) for the programme.
– List of potential references for similar programmes in the same geographic location.
Generating an accurate equipment list
One of the biggest challenges faced by the team will be its ability to provide an accurate equipment list and service coverage requirement to prospective bidders for quotation. Many companies have not done a good job of keeping track of what assets they have and where they are located, let alone which systems have what level of service coverage and which agreements deliver value for money.
Indeed this issue ties back to one of the key objectives of moving to a consolidated maintenance programme – the ability to understand and control maintenance spend. The team has two choices at this stage. It can commission an asset inventory and contract review sub-project or it can ask potential providers to bid on a representative market basket.
The advantage of the former is that it allows a more accurate comparison of proposed spend to current spend, and it allows bidders to maximise the efficiency, and therefore minimise the cost, of their programme proposal. The biggest disadvantage is keeping the inventory current pending the award of a contract. The advantage of the market basket approach is that it allows quick cost comparisons to be made between vendors.
One disadvantage is that the total programme cost will not be known; moreover, some consolidated providers’ price-modelling systems analyse the risk profile of the entire asset base to generate the lowest cost. However, the biggest drawback of the market basket approach is usually the paucity of information available to define the equipment list.
A lack of product descriptions, equipment configurations and coverage requirements makes it very difficult for potential providers to price on an equal footing, producing large variances in cost proposals, which help neither the client nor the potential providers. An asset inventory and contract review will always have to be done at some point to finalise the contractual cost, so it is recommended that the asset inventory is done before the RFP process commences.
Conducting an equipment inventory provides invaluable insight to help the team build the programme specification. In-service equipment assets will be found that are not on the asset register, and assets on the register will no longer be present at the site. The former could be because the asset is fully depreciated; the latter may require adjustments to the balance sheet.
As well as establishing an accurate asset count, the inventory provides basic information such as serial number, description, configuration, location and equipment owner. It should also provide insight into the age, usage and condition of equipment, which will influence how frequently service is required and therefore cost. Inventory personnel will need access to laboratories and equipment, so the cross functional team will need to balance the degree of engagement with scientists.
On the one hand, the team will want minimal disruption to science; on the other, the inventory is a perfect opportunity to understand specific needs and allay concerns. Many companies do not have the resources to perform the inventory in-house, so the team may decide to engage an external provider to perform this service.
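Where the inventory is captured electronically, it helps to agree the record structure up front. The following is a minimal sketch, in Python, of the kind of per-asset record described above; every field name is an assumption for illustration and would need to match the organisation’s own register conventions.

```python
from dataclasses import dataclass
from typing import Optional
from datetime import date

@dataclass
class AssetRecord:
    """One row in the asset register; all field names are illustrative."""
    asset_id: str                        # internal tag applied during the inventory
    serial_number: str                   # manufacturer serial number
    description: str                     # eg "HPLC binary pump"
    configuration: str                   # modules and options fitted
    location: str                        # site / building / laboratory
    owner: str                           # responsible scientist or group
    install_date: Optional[date] = None  # feeds age-based service-frequency estimates
    condition: str = "unknown"           # influences how often service is required
    coverage: str = "none"               # current contract level, eg "full", "PM-only", "T&M"

# Example record created during a walk-through (values invented):
print(AssetRecord("A-0001", "SN12345", "HPLC binary pump",
                  "standard", "Building 2 / Lab 210", "J. Smith"))
```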
The purpose of the contract review is to baseline the coverage of each asset. As a starting point most scientists will want to retain their existing coverage, so this would be the level defined in the RFP equipment list. However, at the aggregate level patterns will begin to emerge that allow the team to optimise the coverage for a particular type of equipment in a particular lab, uncovering potential cost savings.
This could be built into the RFP or parked as an improvement initiative for a later date. In addition, the contract review reveals performance guarantees, contract exclusions and value-added services which may need to be matched, and termination terms which may affect the way the programme is initially rolled out and its timeline.
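Where the contract review data is held electronically, the aggregate-level patterns mentioned above can be surfaced with a simple grouping. A minimal sketch follows, assuming the pandas library and wholly hypothetical file and column names.

```python
import pandas as pd

# Hypothetical extract of the contract review, one row per asset, with
# columns: asset_id, equipment_type, lab, coverage, annual_cost.
contracts = pd.read_csv("contract_review.csv")

# Count assets and sum spend per equipment type, lab and coverage level;
# outliers (eg one full-coverage balance among twenty on PM-only in the
# same lab) stand out as candidate cost savings.
pattern = (contracts
           .groupby(["equipment_type", "lab", "coverage"])
           .agg(assets=("asset_id", "count"),
                spend=("annual_cost", "sum"))
           .reset_index())

print(pattern.sort_values("spend", ascending=False).head(20))
```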
Stating assumptions
Even if an inventory and contract review is commissioned pre-RFP, the team should state all coverage assumptions in its RFP documentation to ensure all bidders price on the same basis, allowing an apples-to-apples cost comparison. Bidders should be asked to confirm they have adhered to these assumptions as part of the bid process (one way to capture them as structured data is sketched after this list). Examples of the types of assumptions that need to be tightly specified include:
– Whether the proposal should be priced as an aggregate of all assets or at a line-item level.
– Whether parts must be original or whether alternatives are permitted.
– Whether refurbished parts are permitted or not allowed.
– The definition of parts versus consumables.
– Whether the bid should include or exclude LC detector lamps.
– Other exclusions, for example equipment liquid paths.
– The number of Preventative Maintenance (PM) calls required per asset per year.
– The percentage of PMs that must be completed on time.
– Whether all parts should be changed at PM or whether it is permissible to just change worn parts. If the latter, what should be done with unused PM parts?
– Which protocols should be used for calibration/qualification – original provider, client protocol, custom protocol – how many hours do protocols take to complete?
– What additional metrology/compliance-related activities and processes are required that will incur additional time: out-of-tolerance reporting, corrective action, secondary review, etc?
– What instrument PC and software support is expected from the provider?
– What other maintenance related services should be priced into the programme, for example, supplier meet and greet, equipment moves etc?
– What unscheduled maintenance response times and fix rates are expected?
– How the programme will be resourced – dedicated on-site resources or on-hand resources?
– The standard hours of work, eg Mon-Fri, 8am to 5pm and the cost of out-of-hours service.
– What start-up costs, if any, the provider will levy that are not included within the base programme.
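The sketch below, referenced above, shows one possible way to issue such assumptions as structured data alongside the RFP so that each bidder can confirm adherence item by item. It is an illustration only: the keys and values are invented, and a real list would carry every assumption above.

```python
# Illustrative only: RFP pricing assumptions as structured data so each
# bidder can confirm adherence item by item. All values are invented.
rfp_assumptions = {
    "pricing_basis": "line-item",            # versus "aggregate"
    "refurbished_parts_permitted": False,
    "lc_detector_lamps_included": False,
    "pm_calls_per_asset_per_year": 1,
    "pm_on_time_completion_pct": 95,
    "standard_hours": "Mon-Fri 08:00-17:00",
    "unscheduled_response_hours": 8,
}

def confirmation_checklist(assumptions: dict) -> str:
    """Render a confirm-or-deviate checklist for bidders to return."""
    return "\n".join(f"[ ] Confirm: {key} = {value}"
                     for key, value in assumptions.items())

print(confirmation_checklist(rfp_assumptions))
```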
Failure to define assumptions accurately, and to hold bidders to them, can easily stall a project and often forces teams to restart the RFP process, resulting in additional and unnecessary internal cost. The team may feel that one or more of its sites is sufficiently different from the rest to warrant special attention, particularly if the facility is subject to stringent regulatory control. In these situations it is recommended the team takes representatives from each short-listed supplier to each facility to make sure they understand the special requirements.
If a company is considering a consolidated maintenance model, it is likely the equipment list extends to 1,000, 5,000 or even 10,000 pieces of equipment. Each bidder will have its own pricing process, which may force it to reorder the equipment list, making subsequent cross-bid line-item cost comparisons difficult for the team. As part of the RFP bidding instructions, the team should make sure bidders return equipment lists in the original order and format.
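Keeping a stable order helps, but an even more robust approach is to key every line on a unique asset identifier so that comparisons survive any reordering. A minimal sketch, assuming pandas and hypothetical file and column names:

```python
import pandas as pd

# Hypothetical returned bid files, each with columns: asset_id, line_price.
bid_a = pd.read_csv("bidder_a.csv")
bid_b = pd.read_csv("bidder_b.csv")

# Merge on the stable asset identifier rather than on row order.
comparison = bid_a.merge(bid_b, on="asset_id", suffixes=("_a", "_b"))
comparison["delta"] = comparison["line_price_a"] - comparison["line_price_b"]

# The largest per-line variances often flag an assumption one bidder has
# not followed, and are the first items to query before comparing totals.
print(comparison.sort_values("delta", key=abs, ascending=False).head(10))
```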
A consolidated maintenance contract of this size and nature usually requires contractual agreement via a Master Service Agreement (MSA) and Scope of Work (SOW), usually driven by the company’s legal group or contracts team. Given that negotiations can take time, it is important to initiate the review process as soon as possible. Many companies now issue a template MSA as part of the RFP process, asking bidders to mark it up as part of their response.
The balanced scorecard
It is also best practice to define a balanced scorecard against which each bid is judged. The scorecard should support the RFP, which in turn supports the overall goals of the programme. An example of a scorecard used by one company is shown in Figure 2.
The focus of this particular scorecard was to ensure that potential providers had the tangible capabilities the team was looking for, such as technical skills and processes, but also to assess whether providers would make a good cultural fit for the organisation.
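Once the team’s ratings are collated, the weighted total per bidder is a simple calculation. The sketch below uses invented weights, criteria and scores purely for illustration; a real scorecard would mirror the categories agreed for Figure 2.

```python
# Invented weights summing to 1.0; a real scorecard would use the
# categories and weightings the cross functional team agreed.
weights = {
    "technical_skills": 0.25,
    "processes": 0.20,
    "asset_software": 0.15,
    "implementation_plan": 0.15,
    "cultural_fit": 0.15,
    "price": 0.10,
}

# 1-5 ratings collated from the cross functional team (values invented).
scores = {
    "bidder_a": {"technical_skills": 4, "processes": 5, "asset_software": 3,
                 "implementation_plan": 4, "cultural_fit": 5, "price": 3},
    "bidder_b": {"technical_skills": 5, "processes": 3, "asset_software": 4,
                 "implementation_plan": 3, "cultural_fit": 3, "price": 5},
}

for bidder, rating in scores.items():
    total = sum(weights[criterion] * rating[criterion] for criterion in weights)
    print(f"{bidder}: {total:.2f}")
```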
Once an RFP is opened, most companies allow questions to be submitted electronically, with the answers made available to all suppliers. In many cases the answers introduce more confusion into the process, either because the question was not clear or because the team had not considered that particular issue. Best practice is for questions to be submitted by a specified date, with potential providers brought together on a joint teleconference to discuss both the questions and the answers.
Reviewing proposals
Once the RFP is closed the team will want to review the responses and agree on next steps. The review process will focus on pricing, making sure assumptions have been adhered to, and the thoroughness and quality of the response. The next step will usually involve meeting with two or three of the most suitably qualified providers to more fully understand their proposal.
The key to success in these review meetings is to make sure all stakeholders are represented, proposals have been reviewed, questions collated, and sufficient time set aside for the review. A minimum of three hours should be allowed to understand the nuances of the programme and the capabilities of each potential provider. The team should communicate its review expectations to each bidder, making sure they allocate a significant portion of time to how the programme works rather than just an overview of what it comprises: the resources, processes and systems described earlier in this article and embedded in the RFP questions.
Asking for a project implementation timeline is a great way of understanding each bidder’s thought processes and whether it has the expertise and experience to successfully manage the programme. The team should also expect to see profiles of the people who will staff the programme, and preferably meet them in person. Examples of protocols and process flow maps should also be presented for discussion. At the end of each presentation, scorecards should be completed, collated and reviewed by the cross functional team.
It is likely there will be attractive elements in each of the proposals, so it is important to ascertain whether the preferred provider can build elements from other proposals into its model. Considering that a high proportion of maintenance delivery derives from intangibles such as processes, the preferred provider should be both receptive and capable.
References
Once the team has agreed on a preferred provider, or a short list of two alternatives, it is important to take up references. Each reference should currently be operating a programme similar to the one under consideration, and this should be verified at the outset. It is important to get a cross functional reference, not just a reference from the individual who championed the programme: the team should reach out to cross functional stakeholders, in particular the scientists benefiting from the programme, to get a comprehensive picture of the provider’s performance.
They should devise a set of questions that focus on the key areas that make a programme successful:
– What is the programme structure, how does the provider interact with the client and what is each party’s responsibility?
– How well did they communicate to the cross functional team and scientific community during implementation, and now that the programme is live?
– Was the programme implemented to the agreed timeline, what were the issues and how did they overcome them?
– How do they rate the quality and timeliness of service delivery?
– Is the computer system delivering on the asset management promise?
– Is all the documentation in place and being controlled as per approved SOPs?
– What are the performance metrics assigned to the provider and how do they rate against them?
Recommendations and next steps
The team’s decision-making process may require further consultation with stakeholders, with some compromise called for from team members. As a result, the preferred provider(s) may be given new direction or be asked to provide further information. Team members may be asked to re-evaluate their scorecards before consensus is reached on what solution to recommend to senior management.
Once the recommendation is made and approved, the project moves into the implementation phase, where the planned changes will begin to be felt by the organisation. The implementation phase introduces a whole new set of critical success factors, deliverables and best practices that will be covered in a follow-up article.
The recommendations provided here do require a degree of commitment and time from the cross functional team, but they do pave the way to ensure a successful implementation, one that meets or even exceeds the objectives of the programme. DDW
—
This article originally featured in the DDW Summer 2007 Issue
—
Martin Long is the Business Director of OneSource® Professional Services, PerkinElmer’s global asset management and managed maintenance resource. Martin is responsible for project management and business planning for OneSource, which offers personalised lab equipment maintenance and asset management solutions for customers. Prior to this role, Martin spent 22 years in the industry, holding positions in global product marketing, product development, sales, service and marketing, and export sales management. Martin’s career includes the development of qualification protocols and validation documentation, the set-up of an ISO 17025 permanent calibration laboratory, and direct involvement in bringing more than 75 new product innovations to market. Martin joined PerkinElmer in 2001 as Molecular Spectroscopy Marketing Manager. In 2005, he joined the PerkinElmer laboratory services business. Martin holds degrees in chemistry and strategic marketing from Cambridge Marketing College.