Although the future for microarray technology looks bright, the DNA microarray consumer is faced with a number of important choices regarding sources and technologies.
Developers, users, and aficionados of DNA microarray technology, who happened on July 25, 2002 to peruse the day’s crop of press releases, were probably shocked to learn that Amersham Biosciences had purchased the doomed remnants of Motorola Life Sciences’ CodeLink microarray business for $20 million – mere pocket change on the acquisitions scale. It seems like only yesterday (it was actually mid-2000) that another major electronics and research equipment manufacturer, which shall not be named, quoted figures claiming that microarray sales would reach $2.4 billion by 2004.
The reality is somewhat more sobering, as shown by Amersham’s bargain purchase and other indications to be discussed in a forthcoming report (Microbiotechnology Markets, Drug and Market Development Publications, November 2002). On the plus side, industry leader Affymetrix did show an impressive 50% increase in the number of chips shipped compared with the same quarter a year earlier. The actual number of chips sold totalled 97,000, and product revenue for the quarter was $58.6 million (of which $5.6 million came from sales to Perlegen, an Affymetrix spin-off). During that same quarter, Affymetrix’s installed base of microarray processing and detection instrument systems reached 690, which means that, on average, Affymetrix shipped 140 chips per instrument for the quarter (560 chips annualised).
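The per-instrument figures quoted above are simple arithmetic; a quick sketch (using the numbers from the text) reproduces them, with small differences down to rounding:

```python
# Back-of-envelope check of the quarterly shipment figures:
# 97,000 chips shipped across an installed base of 690 instruments.
chips_shipped = 97_000
installed_instruments = 690

chips_per_instrument_quarter = chips_shipped / installed_instruments
chips_per_instrument_year = chips_per_instrument_quarter * 4

print(round(chips_per_instrument_quarter))  # ~141 chips per instrument per quarter
print(round(chips_per_instrument_year))     # ~562 chips annualised
```

The text's figures of 140 and 560 follow from truncating rather than rounding the quotient.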
Although rather impressive, these growth figures may be anomalous in light of the fact that 2001 product sales grew only 12% over the previous year. Affymetrix’s success in placing the instruments required to process and read its chips can also be considered a bit anaemic at 690. Instrument system placements from industry-leading medical diagnostic companies routinely exceed 10,000. Laboratories conducting gene expression studies in the US alone exceed 5,000 in number.
Although exact industry sales are not known, since Agilent and several other key competitors do not report microarray sales, estimated annual sales of microarrays from all commercial sources for 2002 will be approximately $500 million. If 25% growth can be achieved for the next two years, sales for 2004 will approach $800 million, a highly respectable, but arguably less than stellar, total (considering the levels of investment input) for one of the most important technological innovations of the late 20th century.
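The projection is straightforward compound growth applied to the 2002 estimate:

```python
# Projecting estimated 2002 industry sales forward two years at the
# 25% annual growth rate assumed in the text.
sales_2002 = 500  # estimated 2002 commercial microarray sales, $ millions
growth = 0.25

sales_2004 = sales_2002 * (1 + growth) ** 2
print(round(sales_2004))  # ~781, i.e. approaching $800 million
```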
Yet by other measures microarray technology can be considered an unqualified success. The number of life science publications identified from PubMed abstracts containing the term ‘microarray’ continues to increase at impressive rates. For the first six months of 2002, 688 such articles were published versus 398 for the same period a year earlier, an increase of nearly 73%.
Although precise comparisons are impossible, it is quite likely that commercial sales of microarrays lag considerably behind the number actually used. An estimated 1.7 million chips will be consumed in 2002, of which approximately 800,000 will come from commercial sources. The remainder, more than half, will be ‘homebrewed’ in core microarray facilities located in pharmaceutical and biotechnology companies, universities and other research institutions.
Why have so many decided to make their own arrays? There are two main reasons. The first, not surprisingly, is cost. Commercial arrays range in cost from about $200 to $1,200 and average approximately $500. University core microarray facilities – which often use inexpensive student labour, benefit from various other subsidies and usually do not operate for profit – can sell arrays to their client laboratories for $150 or less.
Although comparative pricing would appear to clearly favour homebrewed arrays, other considerations tend to level the playing field. Quality is a key issue and commercial providers (at least the larger ones) are clearly in a better position to control and assure quality than their do-it-yourself competitors, which generally do not operate on a scale that can justify significant investment in quality control/assurance. In fact, quality among homebrewers continues to improve as core facilities ascend their learning curve, but commercial players still have considerable advantage. The cost of assuring quality, together with other hidden costs of homebrew microarraying, convinced at least one large consumer, Millennium Pharmaceuticals, to essentially abandon its homebrew ways and jump on the Affymetrix bandwagon.
Customisation, the other major factor favouring homebrew microarray production, points to what may be a key vulnerability in the Affymetrix armour. A microarray user must choose between standard and custom arrays. Standard arrays are either ‘genome-wide’, representing on the order of 10,000 to 30,000 genes, or ‘focused’, containing probes for tens to low thousands of genes. When investigators start a project designed to determine, for example, which genes are expressed by a tumour but not by corresponding ‘normal’ tissue, they will most likely use a genome-wide array in order to identify a group of interesting candidate genes. Once researchers identify those candidate genes, they can continue the project using either a standard focused array (containing, say, a generic group of genes expressed by human tumours) or a custom array (containing a specific subgrouping of genes specific to the user’s project). A significant number of microarray users, however, do not feel safe in using anything less than a genome-wide array at any stage of a project. They feel that the odd, but important, gene might pop up unannounced and they do not want to miss it.
Focused arrays tend to deal with particular subjects, such as cancer, apoptosis or toxicology. Manufacturers face some difficulty in providing one-size-fits-all focused gene content. Despite expectations, real or perceived content needs still vary considerably among users.
Focused arrays can be bought off-the-shelf from commercial sources or custom-produced by core microarray facilities. Core facilities have a clear edge over commercial producers regarding customisation. For example, Affymetrix’s light-directed synthesis-based microarray production technology is somewhat less than ideally suited for making low-density, custom arrays. Since the number of production steps is much more closely linked to the length of oligonucleotide probes synthesised than to the number of features (ie, the number of distinct probes) per array, the cost-effectiveness of Affymetrix technology tends to increase in proportion to feature density.
Secondly, the Affymetrix process tends to require the customisation of photolithographic masks and production protocols for different array configurations, and this means that front-end production costs will be considerably greater than for spotted arrays or for synthesised arrays such as those made by Agilent’s inkjet-based process. Does this mean that Agilent has a significant edge over Affymetrix in custom microarray provision? Not necessarily. Like Affymetrix, Agilent must carry significant overhead costs corresponding to its corporate infrastructure dedicated to production, marketing, sales, and quality control/assurance. Qualitatively, at least, one is led to the conclusion that Agilent, and presumably others of their ilk, cannot generate adequate profitability from customisation without keeping prices well above homebrew levels.
Admittedly, Affymetrix has made a major attempt, with some success, to address the custom microarray market opportunity. Its GeneChip® CustomExpress programme permits customers to order microarrays in lots of 90 units containing about 16,000 oligonucleotides (corresponding to 1,000 genes), and get them in four weeks for $250 each. Customers ordering fewer than 540 arrays must pay an undisclosed design fee in addition to the unit cost. Affymetrix claims to have made advances in its production technologies favouring customisation, yet it is tempting to wonder if profitability is being traded to an extent for market share.
Smaller companies have advantages over Affymetrix, Agilent and others of their ilk in regard to customisation, yet they may have little advantage over well-run core microarray facilities aided by subsidies. Many of these smaller companies make spotted arrays, which were originally developed to accommodate cDNA molecules rather than oligonucleotides as capture agents. As researchers have gained experience with microarray technology, they have tended to support oligonucleotides rather than cDNA as the molecules of choice for arraying. Furthermore, they are tending to favour longer oligonucleotides – 50-70-mers – over the 25-30-mers to which Affymetrix is limited by the nature of its technology. Motorola also settled on 30-mers for its gel-pad microarrays.

This brief consideration of some factors involved when researchers must select a source for small lots of relatively low-density microarrays reflects a still-evolving technology area and an industry still working to optimise its business models. It also reflects a technology in need of standardisation. Although efforts to standardise data analysis are well under way, the same cannot be said of the arrays themselves.
Array-making technologies and materials vary, and these variations are reflected in the performance of the arrays themselves. And more array-making technologies are on the way. Nimblegen and Xeotron both use digital light processors, of the sort found in digital slide projectors, to perform massively parallel light-directed oligonucleotide synthesis. Nimblegen’s technology employs digital light processing in much the same manner as Affymetrix employs masks, using similar chemistry. Xeotron, on the other hand, uses digital light processing to generate an acid catalyst for DNA chain elongation in microfluidic cells on the array surface. The combination of Xeotron’s PhotoGenerated Reagent (PGR) chemistry and the microfluidic chip allows Xeotron the unique capability to make long oligo DNA (of more than 100 nucleotides), RNA and peptide arrays. In a similar vein to digital light processor-directed synthesis, CombiMatrix uses electrochemistry to localise DNA synthesis steps. All three companies’ business plans emphasise array customisation. Other companies are advancing still more new array-making technologies.
The equation suffers further complexity when encoded bead technologies are thrown into the mix. In principle, at least, these technologies are much better suited than standard positional microarrays to analyses requiring 100 or fewer capture molecules. Encoded bead microarrays – which have also been called virtual microarrays – consist of beads, each of which carries both a capture biomolecule and an identification code indicating the nature of that biomolecule. Coding may involve dyeing the beads (Luminex), attaching address-tag oligonucleotides (Illumina), or using quantum dots (Quantum Dot Corporation). In standard arrays, the position of a capture molecule is known in advance, precluding the need for coding.
From the production perspective, virtual arrays have a major advantage over their positional counterparts. Each positional microarray is a distinct unit with capture molecules made in situ or placed there, so that production errors may be specific to one or several arrays in a lot. By contrast, encoded beads are produced in lots. If 100 oligos are to be immobilised on beads, one produces 100 lots of beads, each containing one oligo. An array is made by combining small portions of each bead lot. Each lot of beads can therefore be tested for quality before use. The manufacturer is burdened by the need to make and inventory many lots of beads, but once these are available, constructing arrays is trivial and quality control should be much less complicated and costly than for positional arrays. Of course encoded bead arrays cannot be read with the confocal microscope scanners used for positional arrays. Manufacturers of bead arrays have taken several approaches to reading. Luminex – the only company with a system available on the open market – uses flow cytometry for label detection and decoding. Illumina is planning to place each bead at the end of an optical fibre through which a laser is beamed. Quantum Dot is planning to use an imaging microscope together with image analysis software to read its labels and codes. Nanoplex is planning to use laser-induced Raman light scattering.
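The lot-based quality-control advantage described above can be sketched in outline. This is purely illustrative – the function names, data layout, and the trivial QC check are hypothetical, not any vendor's actual process:

```python
# Illustrative sketch of encoded-bead array production: each lot of
# beads carries one capture oligo, each lot is tested once, and an
# 'array' is simply a pooled portion of every lot that passed QC.

def make_bead_lot(code, oligo):
    """One production lot: a batch of beads sharing one code and one oligo."""
    return {"code": code, "oligo": oligo, "qc_passed": None}

def qc_test(lot):
    """Test the whole lot once; here just a stand-in non-empty check."""
    lot["qc_passed"] = bool(lot["oligo"])
    return lot["qc_passed"]

def assemble_array(lots):
    """Combine a small portion of every QC-passed lot into one virtual array."""
    return [(lot["code"], lot["oligo"]) for lot in lots if lot["qc_passed"]]

# 100 capture oligos -> 100 lots, each tested before any array is built.
lots = [make_bead_lot(code=i, oligo=f"OLIGO_{i}") for i in range(100)]
for lot in lots:
    qc_test(lot)

array = assemble_array(lots)
print(len(array))  # 100 distinct coded capture molecules per array
```

The point of the sketch is that QC effort scales with the number of lots, not the number of arrays: once the 100 lots are qualified, every array assembled from them inherits that qualification, whereas each positional array must be inspected individually.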
The DNA microarray consumer is faced with a number of important choices regarding sources and technologies. The high- and medium-density array fields will probably continue to be dominated by Affymetrix and Agilent, each with its own standard. Low-density arrays are very much in need of standards, but no one technology or set of vendors yet stands out. As the need for analysing large numbers of specimens using low-density arrays increases (as is starting to happen in tumour characterisation via gene expression analysis and in genotyping for gene-disease association), encoded bead array technology can be expected to gain prominence.
In short, the future for microarray technology is bright, but users are faced with a multiplicity of choices, and manufacturers face a diverse and dynamic market environment.
Dr Ken Rubenstein has a doctorate in chemistry with post-doctoral training in molecular biology. He served as a scientist and executive with Syntex Corporation, where he invented and developed the EMIT homogeneous enzyme immunoassay technology. For the past 19 years Dr Rubenstein has served as a technical and strategy consultant to the biotechnology industry, specialising in platform technologies.