If we have learned anything from scientific research in the last 20 years, it is that finding cures to complex diseases is difficult. Despite the promise of the genomic revolution, disease progression and patient outcomes are still not easily predicted by genetic factors alone.
It is now readily accepted that we are in a post-genomic era. With a steady flow of genomic information available to researchers worldwide, the focus turns to ways of analysing this information effectively and then utilising it in a practical manner.
Pharmaceutical research and development is changing. The old model of drug discovery, based on a combination of imprecise candidate generation and broad physiological screens, has given way to more specific and intelligent approaches to target identification and drug design. Now, a vast influx of genomic information is set to revolutionise the range of targets available, at least to those able to navigate effectively through the bewildering immensity of the world’s genomic data archives.
Within the next 10 years the benefits of pharmacogenetics and pharmacogenomics will inevitably outweigh the disadvantages. But what are the commercial and legal implications for the pharmaceutical industry, especially for companies that have lead candidates ready to enter development?
Currently available drugs target only around 500 different proteins [4]. Recent reports from efforts to sequence the human genome suggest there are tens of thousands of genes [1,2] and many more different proteins. Popular estimates of the number of ‘new’ drug targets that will emerge from genomic research range from 2,000 to 5,000 [3]. A critical question as we enter the post-genomic world is: how can the pharmaceutical industry rapidly discover and develop medicines for these new targets to improve the human condition?
Adverse drug responses are an important post-marketing public health issue, frequently occurring in subsets of treatment populations. Promising new approaches to predicting physiological responses to drugs are focused on ‘genomic responses’, or toxicogenomics [1]. This article provides a current perspective on toxicogenomics technologies that are aimed at: 1) providing new tools and systems for more rapid, accurate and complete toxicity assessments in advance of human exposure; 2) enhancing the thoroughness and accuracy of toxicity assessments achievable with currently available test systems; and 3) providing predictive assessments of individualised risk of developing adverse drug reactions.
Sequencing of the human genome represents one of the most significant scientific advances of the 20th century, one that will shape the foundation of medical research well into the 21st. This accomplishment was enabled by remarkable technological advances – high throughput sequencing, increased computing power, automated methods of analysis – that 25 years ago seemed unimaginable. Through this project, we have gained the understanding that human beings are an estimated 99.9% identical at the genetic level. Yet, it is the 0.1% of variation among individuals that serves as the foundation for the emerging discipline of pharmacogenomics. It is this variation that contributes to physical diversity in the human population as well as differences in disease susceptibility and response to pharmacological therapies. This contribution of the Human Genome Project offers the opportunity to shape the face of drug discovery and development in this century.
HTS has been in place for approximately 10-12 years and has achieved a three-order-of-magnitude scale-up. Genomics has been in a ‘high throughput’ mode for approximately four years in areas such as genotyping, and is an emerging field driven by sporadic leaps in technology and data generation. Despite their different histories, many parallels exist between the two areas: sample management, assay assembly, parallel processing and data analysis. However, differences exist with respect to regulatory issues, public perception and ethical consent. This article will compare and contrast these fields and highlight where each discipline may learn from the other.
Given the high number of people suffering from skin disease around the world, it is astonishing that relatively few treatments are available and that many of these serve only to relieve symptoms. Can drug development based on a functional genomics approach be the answer to bringing new products to this ‘Cinderella’ market?
The international structural genomics effort has resulted in a number of technological advancements that are accelerating the process of three-dimensional structure determination while continually decreasing the cost per structure.
The rapid growth in proteomic and structural and functional genomic research is driving demand for purified proteins that far exceeds the industry's ability to scale up conventional protein production technologies. This demand has sparked innovation in protein separation and purification technologies. We discuss how the advent of improved systems for protein expression and high throughput parallel protein purification can deliver the opportunity to ignite a revolution in structural and functional genomics, proteomics and high-value drug target discovery.