Quantitative real-time PCR is becoming a mature technology for the quantification of nucleic acids. It is spreading far beyond its original use in research laboratories, becoming the preferred technology for a range of applications, many of which require specialised solutions and adaptations. Integration with pre-analytical steps and post-processing operations is becoming a key challenge.
The idea of the polymerase chain reaction (PCR) was born in 1983, when Kary Mullis was driving through the California mountains1,2. The idea is as simple as it is brilliant. Building on the natural ability of polymerase enzymes to copy nucleic acids, Kary Mullis reasoned that with a heat-stable polymerase it should be possible to automate the reaction and perform multiple copying events by cycling the temperature.
The double-stranded DNA molecule is separated into single strands by heating to 95°C; the temperature is then lowered to allow short synthetic DNA oligonucleotide primers to anneal to complementary sequences in the DNA template; and finally the temperature is set to 72°C, at which the heat-stable Thermus aquaticus (Taq) polymerase extends the primers into full-length copies. Since both strands are copied, the number of DNA molecules doubles in each cycle. Using PCR, virtually any DNA can be amplified, starting from a single copy, to a number of molecules large enough to be readily analysed or used for engineering. The development of PCR was a major breakthrough as a qualitative analytical tool, but it was not quantitative: the amount of PCR product produced depended on the amount of reagents added rather than on the amount of starting material.
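The arithmetic of this doubling is worth making concrete. A minimal sketch (idealised; the `efficiency` parameter is an illustrative addition, since real reactions rarely achieve perfect doubling):

```python
# Idealised PCR amplification: each cycle copies a fraction `efficiency`
# of the molecules, so n cycles turn n0 templates into n0 * (1 + E)**n.
def pcr_copies(n0, cycles, efficiency=1.0):
    """Number of copies after `cycles` thermal cycles.

    efficiency=1.0 means perfect doubling each cycle."""
    return n0 * (1.0 + efficiency) ** cycles

print(pcr_copies(1, 30))       # a single molecule -> ~1.07e9 copies
print(pcr_copies(1, 30, 0.9))  # at 90% efficiency -> ~2.3e8 copies
```

Thirty cycles thus amplify a single template roughly a billion-fold, which is why PCR products are easy to detect, yet end-point yields say little about the starting amount.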
In the beginning of the 1990s, Russell Higuchi discovered that PCR can be performed in the presence of nucleic acid stains that become fluorescent upon binding DNA. The fluorescence from the dyes could be measured throughout the reaction, making it possible to monitor the accumulation of PCR product in real time. By registering the number of amplification cycles required to obtain a particular amount of product, characterised by a certain dye fluorescence, it was possible to calculate the number of target molecules the sample contained initially. The approach was named quantitative real-time PCR, or qPCR for short. The analytical sensitivity of qPCR is limited only by sampling effects, since a single molecule is sufficient to generate product, and its dynamic range is virtually unlimited. Reproducibility is also impressive, considering that the technique gives an exponential response.
A year ago I summarised in DDW the emerging qPCR applications, and in this news story I follow up on those and other important happenings in the qPCR field during the past year.
Closed, automated systems for infectious disease testing
2009 was the year of the swine flu outbreak. First detected in April in Veracruz, Mexico, the new virus, with its combination of genes from swine, avian and human influenza viruses, spread quickly around the world and was declared a pandemic in June by the World Health Organization (WHO). The US Centers for Disease Control and Prevention (CDC) recommended qPCR for the new virus, as other tests were unable to differentiate between pandemic H1N1/09 and regular seasonal flu3. Rapid influenza diagnostic tests (RIDTs) based on detection of the influenza viral nucleoprotein antigen, for example, show only 10-70% sensitivity compared with the qPCR test for the novel virus4. So far, only qPCR-based diagnostic tests have gained FDA approval. This total dominance of qPCR as the primary test for the new pathogen reflects its emerging status as the gold standard for pathogen detection in diagnostics.
Although influenza testing has dominated the news in the past year, the most common molecular diagnostic tests are for HCV, HBV and HIV, which account for some 85% of all testing. In the US, more than two million quantitative HIV tests are performed annually. Currently a handful of large companies compete for this market. All use qPCR under licence from Roche, apart from bioMérieux, which uses NASBA, and Chiron/Bayer, which uses branched DNA technology – these are the only other FDA-approved quantitative tests. This picture is expected to change rapidly as the qPCR patents expire within the next few years and many more kit suppliers are able to enter. Currently, only a few qPCR instruments are licensed for diagnostics and approved for in vitro diagnostics (IVD). This too will change when the patents expire, making it easier for kit manufacturers to sell their products, since many of the new instruments will be open platforms compatible with qPCR kits from most suppliers. These open platforms will mainly be attractive to smaller hospitals and laboratories, where cost savings and flexibility are important. For high-throughput laboratories that perform large numbers of routine tests, fully automated systems such as the COBAS® AmpliPrep/COBAS® TaqMan® from Roche5, the m2000 RealTime System from Abbott6 and the Rotor-Gene Q-based system from Qiagen7 are the most attractive. These systems are almost fully automated, but they are voluminous, occupying large bench space. The next generation of integrated systems, based on microtechnology, will be exciting. These primarily target small laboratories and the doctor's office, and may ultimately be available for point-of-care testing. The first system on the market is the qPCR instrument from Enigma Diagnostics8. The Enigma FL is completely self-contained: the entire process, from collection of the raw sample to delivery of an end result, takes less than 30 minutes.
The system operates with ambient-stored reagents in a single disposable cartridge and meets the need for diagnostic systems that are portable and easy to use with minimal operator training and expertise. The Enigma ML is suited to settings where usage is lower and space is at a premium, eg the doctor's office, pharmacy or intensive care unit. It incorporates a disposable cartridge that accommodates either liquid or swab samples without any requirement for manual processing. All reagents and sample preparation tools are held on the self-contained cartridge, and all steps are automated. Another exciting system is the GeneDisc Cycler from Gene Systems, part of Pall Life Sciences9. It is an automated, miniaturised qPCR system that performs gene amplification in a disposable GeneDisc preloaded with reagents. It will be combined with the Genextract HD for the standardised extraction of 48 parallel samples. Currently the GeneDisc is available only for food pathogen testing. The global molecular diagnostics market was worth $2.9 billion in 2008, with a CAGR of 7.8%, of which infectious disease testing accounted for $1.9 billion, with a CAGR of 6.5%.
High throughput qPCR
It is not correct to refer to the high-throughput instruments as next-generation qPCR; they will not replace the by now traditional 96/384-well instruments, which remain most suited to the small research laboratory, where most operations, such as sample preparation and loading, are done manually. But they do constitute a new generation of qPCR instruments that opens up applications that are either not practical or not cost-efficient on the conventional instruments. The new generation of high-throughput qPCR instruments is represented by the OpenArray from Life Technologies10, the BIOMARK from Fluidigm11, the LC1536 from Roche12 and, soon, the SmartChip from Wafergen13. They are all built on different platforms. The BIOMARK is a microfluidic system based on the company's proprietary valves. The dynamic array for expression profiling loads 96 assays on one side and 96 samples on the other, which are then mixed into 96 × 96 = 9,216 reaction chambers for parallel qPCR analysis. The BIOMARK platform is not compatible with the popular dye reporter SYBR Green I, which adsorbs to the particular material of the microchannels; however, this was recently solved with the introduction of the Chromofy dye (Figure 1)14.
The OpenArray uses a chip with 3,072 33-nanolitre reaction volumes in a footprint the size of a microscope slide. The assays are loaded using proprietary robotics, dried down and sent to the users, who add sample and master mix15. The LC1536 is the big brother of Roche's very successful LC480, running a 1,536-well plate that requires reaction volumes as small as 0.5μl. In the second half of 2010 Wafergen plans to launch its SmartChip, which has 72 × 72 = 5,184 nanowells. Today these instruments are for research use only. Considering the fast development in multimarker diagnostics of complex diseases, we expect this to change, with the instruments becoming platforms for multimarker diagnostics, prognostics and theranostics, where some 20-100 markers are expected to be sufficient, and often optimal, to give the most reliable indications.
Miniaturised and lab-on-chip platforms
Exciting developments in Micro-Electro-Mechanical Systems (MEMS) technology have recently allowed the migration of qPCR machines to lab-on-a-chip systems, and hold promise to eventually bring qPCR to the doctor's office. The main advantage of miniaturised systems is their speed: the reduced heat capacity of the much smaller reaction volumes allows for shorter cycles, since temperature equilibrium is attained faster. The chip designed by Neuzil, for example, performs 40 cycles within six minutes with excellent amplification performance16. Analytical sensitivity is generally sufficient to detect a single molecule if one is present, and high reproducibility can be achieved. One limitation is the very high sensitivity of PCR to inhibitors, which makes it impossible to analyse crude test samples. Another is the small reaction volume, which requires the sample to be concentrated. For field applications, miniaturised PCR systems must be interfaced with, and preferably integrated with, sample preparation and concentration units17,18.
Digital PCR
The exponential nature of PCR is its key strength, contributing to its high sensitivity and wide dynamic range, but it is also its Achilles' heel, since it limits precision. Although replicate qPCR response curves show excellent reproducibility, the exponential increase in the amount of template limits the precision with which a difference between samples can be detected to about 50% when expressed in copy numbers. In the analysis of gene copy number variations, this is the difference between a normal diploid genome and a trisomy. In expression analysis, measurement precision is even lower, since additional pre-processing steps add confounding variation.
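To make the precision limit concrete: the number of cycles needed to reach the detection threshold (the quantification cycle, Cq) relates to the starting amount exponentially, so a Cq difference ΔCq corresponds to a (1+E)^ΔCq-fold difference in copies, where E is the amplification efficiency. A hedged sketch (the 0.3-cycle replicate scatter is an assumed, illustrative figure, not from the text):

```python
# Fold difference in starting template implied by a Cq difference.
def fold_change(delta_cq, efficiency=1.0):
    """(1 + E)**dCq; efficiency=1.0 assumes perfect doubling."""
    return (1.0 + efficiency) ** delta_cq

# If replicate Cq values scatter by ~0.3 cycles (assumed), two samples
# must differ by roughly two standard deviations, 2**0.6 ~ 1.5-fold
# (~50%), before the difference can be called with confidence.
print(fold_change(0.6))    # ~1.52
# A trisomy versus a diploid genome is exactly a 1.5-fold difference:
print(fold_change(0.585))  # ~1.5, ie a Cq gap of barely 0.6 cycles
```

In other words, small Cq scatter translates into large copy-number uncertainty, which is exactly the regime where digital PCR helps.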
At about the same time as Russ Higuchi was developing qPCR, the idea of quantifying target numbers by PCR using limiting dilution was conceived by Sykes et al19. By diluting a sample to such an extent that it contains a very small number of target molecules, the sample can be aliquoted into reaction containers that each initially contain either no template or a single template molecule. After amplification by PCR, the number of target molecules in the initial sample corresponds to the number of positive PCRs20. In 1999, Bert Vogelstein named the technique digital PCR and used it to quantify K-ras mutations in stool DNA from colorectal cancer patients21. However, as long as PCR was performed mainly by manual dispensing in 96- or 384-well plates, digital PCR remained esoteric, with few applications. This will change rapidly with the advent of the high-throughput platforms presented above. Since digital PCR is conceptually an end-point rather than a real-time PCR technique, it also opens the arena for other PCR platforms, such as the innovative RainStorm™ microdroplet-based technology developed at RainDance Technologies22, which produces picolitre-volume droplets at a rate of 10 million per hour. Each droplet is the functional equivalent of a reaction chamber, with encapsulated PCR reagents, reporter molecules and, under limiting-dilution conditions, either no template molecule or one. The droplets are carried in a continuous oil flow through alternating denaturation and annealing zones, resulting in rapid (55-second cycles) and efficient PCR amplification. The formation of product, evidencing the presence of template molecules in the individual droplets, is measured as fluorescence within the microfluidic chip.
Digital PCR enhances our ability to discriminate between copy numbers. Four copies can be distinguished from five using some 1,200 chambers, while with 8,000 chambers 11 copies can be separated from 10 copies24,25. What is critical about these new platforms is that they allow for the automatic distribution of a sample into a very large number of reaction containers for qPCR analysis, which is a prerequisite for any biological and medical studies and eventually for clinical applications. The important digital PCR applications we foresee becoming popular in the near future include early detection of mutations16, detection of non-cultivatable pathogens against excessive backgrounds26, copy number variations (Figure 2), analysis of fetal DNA in plasma27 and qPCR tomography28.
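At higher template loads some partitions receive more than one molecule, so simply counting positives undercounts; the standard fix assumes the molecules distribute across chambers Poissonially. A minimal sketch of the estimator (the chamber counts are invented for illustration):

```python
import math

# Digital PCR readout: estimate template molecules from the fraction of
# positive chambers, correcting for multiple occupancy via Poisson
# statistics: P(chamber is negative) = exp(-lambda).
def digital_pcr_copies(n_chambers, n_positive):
    if not 0 <= n_positive < n_chambers:
        raise ValueError("need at least one negative chamber")
    lam = -math.log(1.0 - n_positive / n_chambers)  # molecules/chamber
    return lam * n_chambers

# Illustrative run: 320 of 1,200 chambers light up.
print(round(digital_pcr_copies(1200, 320)))  # ~372 molecules, not 320
```

The correction grows with occupancy: at low loads positives roughly equal molecules, while a half-full array already undercounts the true number by nearly 30% without it.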
Pre-analytics, experimental design and publication guidelines
qPCR is the final analytical step in the process of quantifying target nucleic acids, which typically involves several upstream steps, starting with sampling, followed by extraction and, in the case of RNA analysis, reverse transcription to produce cDNA. Frequently, additional steps such as storage, freezing/thawing, fixation and transportation are required. All these steps contribute to the variation in the analytical process of quantifying the amount of target nucleic acid in a test sample, and all must be considered.
For studies within one laboratory, the best approach is to perform a small, fully nested pilot study, from whose results the variance contributions of the different pre-processing steps can be estimated29. The subsequent study can then be cost-optimised for performance, using the optimum number of replicates at the different levels and sufficient biological subjects to achieve the required power. The approach can also be used to compare different protocols, kits and approaches. MultiD Analysis offers the software GenEx for this planning30. Results published so far suggest that most variance is contributed by the natural variation among the studied subjects, by sampling in the case of tissue samples and, in a few cases, by the reverse transcription. The qPCR step does not contribute appreciably. Clearly, future efforts should focus on improving the pre-analytical steps rather than fine-tuning the qPCR. In Europe, the project SPIDIA, co-ordinated by QIAGEN, has been launched to tackle the standardisation and improvement of pre-analytical procedures for in vitro diagnostics31. The activities cover all steps, from the creation of evidence-based guidelines and tools for the pre-analytical phase to the testing and optimisation of these tools through the development of novel assays and biomarkers. The biomarkers shall be suitable to control for the natural degradation that occurs when nucleases are released as cells are damaged, for the physical and chemical degradation that occurs when samples are preserved and, most importantly, for the activation of many genes that occurs due to the stress and changed environment when samples are collected. Improved methods and procedures to control the quality and integrity of the sampled material are very much needed, as are standardised procedures to minimise variability between measurements in different laboratories and among independent studies.
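As an illustration of what the nested analysis involves, here is a hedged sketch of classical method-of-moments variance-component estimation for a balanced design (one technical level below the subjects; the Cq values and the two-level simplification are invented for illustration and are not from the cited study):

```python
# Balanced nested design: s subjects, n qPCR replicates per subject.
# ANOVA method-of-moments estimates of the between-subject (biological)
# and within-subject (technical) variance components of Cq values.
def variance_components(groups):
    s, n = len(groups), len(groups[0])
    grand = sum(sum(g) for g in groups) / (s * n)
    means = [sum(g) / n for g in groups]
    ms_between = n * sum((m - grand) ** 2 for m in means) / (s - 1)
    ms_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g) / (s * (n - 1))
    var_technical = ms_within
    var_biological = max(0.0, (ms_between - ms_within) / n)
    return var_biological, var_technical

# Invented Cq triplicates for three subjects.
cq = [[24.1, 24.3, 24.2], [25.0, 25.2, 25.1], [23.8, 23.7, 23.9]]
bio, tech = variance_components(cq)
print(bio, tech)  # here biological variance dwarfs the technical noise
```

Replicates would then be allocated where the estimated variance is largest, typically at the biological rather than the qPCR level.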
Guidelines are requisite for the maturing of qPCR into a robust, accurate and reliable nucleic acid quantification technology. As described and exemplified by Bustin, ill-assorted pre-assay conditions, poor assay design and inappropriate data analysis methodologies have resulted in the recurrent publication of data that are at best inconsistent and at worst irrelevant and even misleading32. A step in the right direction was taken with the set of guidelines that propose a minimum standard for the provision of information for qPCR experiments (MIQE)33. MIQE aims to restructure today's free-for-all qPCR methods into a more consistent format that will encourage detailed auditing of experimental detail, data analysis and reporting principles. Key points of the MIQE guidelines are to present the design of the experiment; to describe the sample and how it was collected; to describe the nucleic acid extraction process and test the integrity of the nucleic acids, which can be done by microfluidic analysis on the Agilent Bioanalyzer34 or the Bio-Rad Experion35, or alternatively with differential assays such as the 3'/5' approach; to specify the reverse transcription conditions; to describe the qPCR target, the qPCR oligonucleotides used and the detailed qPCR protocol; to present the validation of any standards and reference genes used; and to give details of the qPCR data analysis that was performed.
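The 3'/5' integrity check mentioned above compares two assays targeting opposite ends of the same transcript: with 3'-anchored (oligo-dT) priming, degraded RNA yields truncated cDNAs and the 5' assay drops out first, so the 3'/5' copy ratio rises with degradation. A hedged sketch of the calculation (the Cq values are invented):

```python
# 3'/5' RNA integrity assay: copy-number ratio implied by the Cq values
# of a 3'-end and a 5'-end assay on the same transcript.
def three_to_five_ratio(cq_5prime, cq_3prime, efficiency=1.0):
    """(1 + E)**(Cq5' - Cq3'): ~1 for intact RNA, >>1 when degraded."""
    return (1.0 + efficiency) ** (cq_5prime - cq_3prime)

print(three_to_five_ratio(24.0, 23.9))  # ~1.07: intact sample
print(three_to_five_ratio(27.3, 24.0))  # ~9.8: heavily degraded sample
```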
As we learn more about the complexity of the overall process of collecting, preparing and analysing samples for their nucleic acid content, and about the underlying biological variation due to natural diversity, we face the challenge of separating the noise caused by these factors from the relevant effect on gene expression caused by the environmental influence or drug that we are studying. Adequately designed studies have many subjects, often multiple samples collected from each subject, and several levels of technical replicates that are analysed for multiple genes of interest combined with validated genes for normalisation, plus various controls. The studies are often run over multiple plates, occasionally over long periods of time. Analysing these kinds of data with general tools such as Microsoft Excel is not an option; the risk of making errors, or of failing to account for some of the variability, is too large. The instruments' software does not offer appropriate analytical tools either. There is, however, no need to reinvent the wheel. The statistical methods to pre-process and then analyse these kinds of measurement data are known, and have recently been made available to the qPCR community in user-friendly software dedicated to the challenge of processing and mining qPCR data. The market-leading GenEx from MultiD Analysis supports all important qPCR platforms on the market and handles multiplate/multicentre/multilevel studies with appropriate controls; in addition to performing all basic comparisons, such as absolute quantification with standard curves and relative quantification with appropriate univariate tests, GenEx offers powerful classification methods for expression profiling and multimarker diagnostics25. Another option is StatMiner from Integromics, which also offers user-friendly and advanced analysis of qPCR data36.
It is obvious that qPCR is developing into niches with distinct customer bases. Dominant are the infectious disease applications, which are targeted by IVD-approved instruments, preferably fully automated, for whichever FDA/CE-approved kits are available, and the small research laboratories that require open, flexible systems. Upcoming niches are the high-throughput platforms, which require special loading systems but substantially reduce the cost per run and open up novel applications such as digital PCR. They will also be suitable for multimarker diagnostics of complex diseases. Forthcoming niches are closed miniaturised, or at least smaller, systems with integrated sample preparation that will target small diagnostic laboratories and the doctor's office. Major focus is on the pre-analytical steps, where there is plenty of room for improvement in product yield and quality, and where guidelines are needed for diagnostic applications, and on experimental design and the post-processing of information, to retrieve as valid and valuable biological information as possible from a study. There is also a need to improve the quality of published data. DDW
Dr Mikael Kubista is head of the Department of Gene Expression at the Institute of Biotechnology of the Czech Academy of Sciences in Prague, and CEO and founder of TATAA Biocenters (www.tataa.com), leading providers of qPCR services in Europe. TATAA has an intensive R&D programme related to qPCR and has developed several important products, such as the Chromofy and visiblue dyes, the one-step extraction/RT/qPCR reagent CelluLyser, and proprietary panels for the identification of optimum reference genes and for the profiling of embryonic stem cells and tumour cells. TATAA also offers hands-on training courses in qPCR and molecular diagnostics worldwide (www.tataa.com/Courses/Courses.html) and arranges the main qPCR symposia in Europe (www.qpcrsymposium.eu) and in the US (www.qpcrsymposium.com).
1 Mullis, K. Dancing Naked in the Mind Field (2000).
3 Interim Guidance on Specimen Collection, Processing, and Testing for Patients with Suspected Novel Influenza A (H1N1) Virus Infection. CDC.gov. Centers for Disease Control and Prevention. www.cdc.gov/h1n1flu/specimencollection.htm.
4 Hurt, AC et al. Performance of influenza rapid point-of-care tests in the detection of swine lineage A(H1N1) influenza viruses. Influenza and Other Respiratory Viruses 2009;3(4):171-76.
5 http://molecular.roche.com/platforms/fully_automated_pcr_systems.html.
6 http://international.abbottmolecular.com/m2000SPm2000RT_51644.aspx.
7 http://www1.qiagen.com/Products/ByLabFocus/MDX/.
8 http://www.enigmadiagnostics.com.
10 http://www.lifetechnologies.com/.
15 Brenan, C, Morrison, T. High throughput, nanoliter quantitative PCR. Drug Discovery Today: Technologies 2, 247-253 (2005).
16 Neuzil, P, Zhang, C, Pipper, J, Oh, S, Zhuo, L. Ultra fast miniaturized real-time PCR: 40 cycles in less than six minutes. Nucleic Acids Res. 2006 Jun 28; 34(11):e77.
17 Lee, D, Chen, PJ, Lee, GB. The evolution of real-time PCR machines to real-time PCR chips. Biosens Bioelectron. 2010 Mar 15;25(7):1820-4. Epub 2009 Nov 27.
18 Liu, P, Mathies, RA. Integrated microfluidic systems for high-performance genetic analysis. Trends Biotechnol. 2009 Oct;27(10):572-81. Epub 2009 Aug 24.
19 Sykes, PJ, Neoh, SH, Brisco, MJ, Hughes, E, Condon, J, Morley, AA. Quantitation of targets for PCR by use of limiting dilution. Biotechniques. 1992 Sep;13(3):444-9.
20 Kalinina, O, Lebedeva, I, Brown, J, Silver, J. Nanoliter scale PCR with TaqMan detection. Nucleic Acids Res. 1997 May 15;25(10):1999-2004.
21 Vogelstein, B, Kinzler, KW. Digital PCR. Proc Natl Acad Sci U S A. 1999 Aug 3;96(16):9236-41.
23 Kiss, MM, Ortoleva-Donnelly, L, Beer, NR, Warner, J, Bailey, CG, Colston, BW, Rothberg, JM, Link, DR, Leamon, JH. High-throughput quantitative polymerase chain reaction in picoliter droplets. Anal Chem. 2008 Dec 1;80(23):8975-81.
24 Dube, S, Qin, J, Ramakrishnan, R. Mathematical Analysis of Copy Number Variation in a DNA Sample Using Digital PCR on a Nanofluidic Device. PLoS ONE. 2008;3(8):e2876.
25 Weaver, S, Dube, S, Mir, A, Qin, J, Sun, G, Ramakrishnan, R, Jones, RC, Livak, KJ. Taking qPCR to a higher level: Analysis of CNV reveals the power of high throughput qPCR to enhance quantitative resolution. Methods. 2010 Apr;50(4):271-6.
26 Ottesen, EA, Hong, JW, Quake, SR, Leadbetter, JR. Microfluidic digital PCR enables multigene analysis of individual environmental bacteria. Science. 2006 Dec 1;314(5804):1464-7.
27 Lun, FMF, Chiu, RWK, Chan, KCA, Leung, TY, Lau, TK, Lo, YMD. Microfluidics Digital PCR Reveals a Higher than Expected Fraction of Fetal DNA in Maternal Plasma. Clin Chem. 2008;54(10):1664-72.
28 Sindelka, R, Sidova, M, Svec, D, Kubista, M. Spatial expression profiles in the Xenopus laevis oocytes measured with qPCR tomography. Methods. 2010 Jan 4. doi:10.1016/j.ymeth.2009.12.011.
29 Tichopad, A, Kitchen, R, Riedmaier, I, Becker, C, Ståhlberg, A, Kubista, M. Design and optimization of reverse-transcription quantitative PCR experiments. Clin Chem. 2009 Oct;55(10):1816-23. Epub 2009 Jul 30.
32 Bustin, SA. Why the need for qPCR publication guidelines? – The case for MIQE. Methods. 2010 Apr;50(4):217-26. Epub 2009 Dec 16.
33 www.tataa.com/files/PDF/Clin%20Chem%2055,%204%20%282009%29.pdf.