Cheap DNA Sequencing Will Drive a Revolution in Health Care
Credit: Leonard Lessin / Photo Researchers, Inc.
MIT Technology Review, March/April 2010, by Stephen Cass — The dream of personalized medicine was one of the driving forces behind the 13-year, $3 billion Human Genome Project. Researchers hoped that once the genetic blueprint was revealed, they could create DNA tests to gauge individuals’ risk for conditions like diabetes and cancer, allowing for targeted screening or preëmptive intervention. Genetic information would help doctors select the right drugs to treat disease in a given patient. Such advances would dramatically improve medicine and simultaneously lower costs by eliminating pointless treatments and reducing adverse drug reactions.
Delivering on these promises has been an uphill struggle. Some diseases, like Huntington’s, are caused by mutations in a single gene. But for the most part, when our risk of developing a given condition depends on multiple genes, identifying them is difficult. Even when the genes linked to a condition are identified, using that knowledge to select treatments has proved tough (see “Drowning in Data”). We now have the 1.0 version of personalized medicine, in which relatively simple genetic tests can provide information on whether one patient will benefit from a certain cancer drug or how big a dose of blood thinner another should receive. But there are signs that personalized medicine will soon get more sophisticated. Ever cheaper genetic sequencing means that researchers are getting more and more genomic information, from which they can tease out subtle genetic variations that explain why two otherwise similar people can have very different medical destinies. Within the next few years, it will become cheaper to have your genome sequenced than to get an MRI (see “A Moore’s Law for Genetics”). Figuring out how to use that information to improve your medical care is personalized medicine’s next great challenge.
MIT Technology Review, March/April 2010, by Lauren Gravitz — The market for personalized medicine is growing: according to PricewaterhouseCoopers, the core market will reach $42 billion by 2015. However, that growth is not uniform. Some areas, such as genomic sequencing, are surging ahead; others, such as translating genetic data into clinically useful information, languish.
In this environment, companies developing sequencing technologies, such as Pacific Biosciences, Illumina, and Complete Genomics, have attracted sustained investor interest as they race to create ever cheaper ways to decode DNA (see “Faster Tools to Scrutinize the Genome”). In their most recent rounds of venture funding last summer, Pacific Biosciences and Complete Genomics received $68 million and $45 million, respectively.
Diagnostic technologies, too, are moving at a rapid pace. Startups from Boston to Silicon Valley have been pinning down disease-related genetic markers and creating many new tests that are already in the clinic or on their way. As these companies grow and bring more tests to market, large diagnostics companies are likely to acquire them, says venture capitalist Brook Byers of Kleiner Perkins Caufield and Byers.
One of the biggest undeveloped areas in personalized medicine, however, is the information technology needed to analyze and store the huge quantity of genetic data that is starting to pour forth (see “Drowning in Data”). Of the few bioinformatics companies working to digest the data, Proventys, based in Newton, MA, is among the furthest along. Its technology combines biomarkers and other information to make risk predictions about diseases.
Meanwhile, pharmaceutical companies are responding to the nascent market for personalized therapeutics in different ways. Pfizer, for example, is collaborating with existing biotech companies to develop drugs and diagnostics based on genetic testing. AstraZeneca recently announced a partnership with the Danish diagnostics company Dako, the first of many alliances it plans in a strategy for bringing genetic tests to market. Novartis is taking a different tack, dedicating a large portion of its own resources to developing personalized medicine.
In the United States, benefit management companies, which act as middlemen between patients and insurers or employers, are aggressively moving into the market. One of the largest, Medco, has established a personalized-medicine group to recommend which genetic tests insurers should pay for. In February it acquired DNA Direct, a firm that specializes in analyzing genetic diagnostics, to aid in this effort. One of its largest competitors, CVS Caremark, increased its stake in a similar company, Generation Health, last December. Because such companies serve millions of people, they will play a critical role in making genetic tests broadly available and educating doctors about the benefits of offering such tests to their patients.
A machine for DNA sequencing was invented by Leroy Hood and his colleagues at Caltech. In 1992, Hood and several others were granted U.S. patent 5,171,534 for an “Automated DNA Sequencing Technique.” By replacing slow and expensive manual methods, it became one of the most important pieces of intellectual property in biotechnology.
The FDA recommends genetic testing before patients are given prescriptions for Erbitux, a treatment for colorectal cancer. Credit: Bristol-Myers Squibb
MIT Technology Review, February 24, 2010, by Courtney Humphries — Personalized medicine does not fit easily into established government procedures for approving drugs. After all, clinical trials are designed to test a drug on a large and diverse group of patients, and the whole point of personalized therapeutics is to target the specific genetic populations that will benefit most. The U.S. Food and Drug Administration is now trying to figure out how to judge the usefulness of a drug designed for particular genetic groups while also considering its safety for others who may receive it for off-label purposes.
Last fall the FDA created a post for a genomics advisor, who will coördinate the agency’s efforts to address the subject of genetic data and prescription drugs. Amy Miller, public-policy director of the nonprofit Personalized Medicine Coalition, says the agency has signaled that it’s “now ready to give the industry some guidance on how personalized-medicine products will be regulated in the future.”
One of the first challenges the FDA will probably tackle is how to evaluate genetic and biomarker-based tests aimed at identifying the patients most likely to benefit from a drug. The agency has begun adding recommendations for diagnostic tests to drug labels, and in a handful of cases it has mandated a genetic test before a drug can be prescribed, but there is currently no streamlined path for approving the combination of a drug and a diagnostic test. The FDA has indicated that it will develop guidelines, but so far it’s not clear how, or when, it will resolve the logistical difficulties involved in approving two very different products in one regulatory process.
By MIT Technology Review Editors
Project: Applied Statistical Genetics Group
Wellcome Trust Sanger Institute
Finding ways to analyze large amounts of genetic data and extract information related to diseases that involve multiple genes.
Project: Cancer Biology and Genetics Program
Testing a microfluidic chip that will measure differences in how genes are expressed in tumors.
Project: Coriell Personalized Medicine Collaborative
Coriell Institute for Medical Research
Enrolling 100,000 participants in a research study to measure how genetic information can improve health.
Project: Diagnostic Investigation of Sudden Cardiac Event Risk
Developing a genetic test that will identify patients at risk of sudden cardiac death who should receive an implantable defibrillator.
Catholic University of Leuven and others
An EU-funded consortium seeking to standardize genetic testing and establish guidelines for doctors and patients.
Project: Global Alliance for Pharmacogenomics
National Institutes of Health, RIKEN Yokohama Institute
An American-Japanese scientific alliance studying pharmacogenomics across a broad array of medical conditions, including depression, AIDS, and asthma.
Project: Pharmacogenomics Knowledge Base
Stanford University Medical Center and others
An international consortium building a database detailing the influence of genetic variations on drug reactions.
Project: Plavix, Effient Comparative Effectiveness Study
Investigating whether using genetic tests to determine a patient’s sensitivity to certain drugs is more cost-effective than choosing drugs that are less affected by genetic variations.
Project: The 1000 Genomes Project
Wellcome Trust Sanger Institute, Beijing Genomics Institute Shenzhen, National Human Genome Research Institute
Sequencing the genomes of about 1,200 people around the world to create a database of biomedically relevant genetic variation.
Project: The Cancer Genome Atlas
National Cancer Institute, National Human Genome Research Institute
Sequencing thousands of samples from over 20 types of tumors to understand the genetic changes that underlie these cancers.
MayoClinic.com, GoogleNews.com, February 24, 2010, by Carrie A. Zabel — Personalized medicine offered at your local drugstore?
Two large pharmacy benefit management companies have announced plans to offer genetic testing as part of the prescription-filling process. The testing centers on an emerging science, pharmacogenomics, which studies drug response based on an individual’s genetic makeup. Pharmacogenomic testing is already used for some commonly prescribed drugs, such as tamoxifen and warfarin.
The process would work through a pharmacy benefits management company contracting with large drugstore chains. When certain prescriptions come in, the company would notify the physician that a genetic test is available that may help prescribe the medication more appropriately. The patient could then be offered the test, but wouldn’t be required to take it.
Supporters say this will improve patient safety and health outcomes and decrease overall health-care costs by getting the right medications to the right patients. It may also advance the field of pharmacogenomics by generating data on genetic test results and drug effectiveness.
Others are concerned about the privacy of genetic test information and say the science of pharmacogenomics is premature. Drug metabolism isn’t determined by genetics alone; it is affected by many additional factors, such as body size and age. And because pharmacogenomics is a relatively new science, insurance companies may not reimburse the cost of genetic testing.
About the writer: Carrie A. Zabel, M.S., C.G.C., Genetic Counselor
“We must begin now to prepare for the future; we cannot wait until the details are known or fully understood.”*
— David B. Schowalter, M.D., Ph.D., former Mayo geneticist, (*posthumous)
Carrie A. Zabel, M.S., C.G.C., is a board-certified genetic counselor who specializes in hereditary cancer syndromes. One of her main professional interests is the family medical history.
“Recognizing features in the family history which may suggest an underlying single gene disorder can have a huge impact on families,” she says. “Identifying a genetic susceptibility gene can allow family members to more accurately understand their risk of disease and empower those who have an increased genetic susceptibility to take control of their medical management and lifestyle factors which may influence this risk.”
She received her B.S. in biology from the University of Wisconsin-La Crosse in 2002 and M.S. in genetic counseling from the University of Minnesota in 2004.
She was a clinical genetic counselor at the Marshfield Clinic in Marshfield, Wis., before joining Mayo Clinic in August 2006 as a genetic counselor and educator for the grant-funded Mayo Eisenberg Genomics Education Program. During her time in Wisconsin, she was also an active member of the metabolic subcommittee of the state Newborn Screening Program and co-facilitated a phenylketonuria clinic.
At Mayo Clinic, she provides physician and staff education about clinically relevant topics in genomics. She also manages multiple education projects championed by Mayo Clinic physicians and is a faculty member for Mayo Medical School. In addition to her education roles, she sees adult patients in the Department of Medical Genetics.
Credit: Christopher Harting
A new device can rapidly test biological samples for genetic variations that could cause dangerous reactions to some drugs
MIT Technology Review, March/April 2010, by Erica Naone — Different people can react to drugs in different ways, and in some cases the response can be predicted from their genes. For example, the drug warfarin, often used to prevent blood clots, can cause dangerous bleeding in some patients. Researchers have identified two genetic variations that increase this risk.
Tests for such genetic variations have long been available, but in many cases they cost too much and take too long. Nanosphere, a startup out of Northwestern University based in Northbrook, IL, hopes to change that. Its Verigene system, which takes just a few hours to analyze DNA from blood or other material, lets doctors test for genetic variations without sending samples out to a lab.
A. Disposable cartridge
A single-use cartridge uses a combination of chemical reactions to isolate fragments of DNA from a patient sample and test them for specific genetic characteristics. The top half of the cartridge is discarded after this process is complete, leaving a prepared glass slide behind.
B. Bar Code
To help keep track of samples, a bar code is printed on the test cartridge and the underlying slide.
C. Reagent Wells
The necessary ingredients for the chemical reactions used to process the DNA are stored in wells located around the edges of the test cartridge. After the DNA is extracted from a sample, the machine uses air pressure and mechanical valves to release the ingredients from the wells as needed. Strands of DNA that are complementary to the target sequences are used to bind those sequences to the glass slide below the cartridge, as well as to gold nanoparticles that will allow the DNA to be detected when exposed to light. The cartridge washes away any excess DNA or nanoparticles and then sets off a reaction that coats the remaining nanoparticles with silver, which makes it easier to scan for them.
D. DNA Loading chamber
A DNA sample is loaded into the port shown here. Sonic energy, applied when the cartridge is inserted into the machine that processes the samples, breaks the DNA into small fragments and separates it into its two complementary strands so that it can be captured on the surface of the glass slide.
E. Glass Slide (Microarray)
After the chemical reactions have finished, the target DNA remains on the surface of the prepared glass slide, tagged by silver-coated gold nanoparticles. The Verigene reader scans the slide by shining light onto it and measuring how that light is scattered by the tagged DNA. The system can look for single or multiple genetic targets.
Finding messengers: Researchers use a specially designed sensor to detect the release of dopamine (lower green and purple band) and adenosine (upper green and purple band), both chemical messengers in the brain.
Credit: Kendall H. Lee, MD, PhD, director of Mayo Neural Engineering Laboratories, and Kevin Bennet, Chair of Mayo Division of Engineering.
A system to detect brain chemicals may improve therapies for Parkinson’s and other disorders
MIT Technology Review, February 24, 2010, by Emily Singer — Over the last decade, deep brain stimulation, in which an implanted electrode delivers targeted jolts of electricity, has given surgeons an entirely new way to treat challenging neurological diseases. More than 75,000 people have undergone the procedure for Parkinson’s and other disorders. But despite its success, scientists and surgeons know little about its actual effect on the brain or exactly why it works.
An implantable sensor designed to detect vital chemical signals in the brain, currently being tested in animals, could help scientists measure the impact of electrical stimulation and perhaps provide a way to enhance the effectiveness of the treatment. “For a long time in neurosurgery we’ve been dealing with the brain from purely an electrical perspective,” says Nader Pouratian, a neurosurgeon at the University of California, Los Angeles, who was not directly involved in the research. “This allows us to look at the brain as an electrochemical organ and understand the effect of interventions such as deep brain stimulation.”
During the conventional deep brain stimulation procedure, neurosurgeons insert a small electrode into the brain. The patient is awake during the surgery so that the surgeon can find the optimal location and level of stimulation to reduce the patient’s symptoms. In Parkinson’s patients, for example, muscle tremors are often immediately and visibly reduced with the appropriate stimulation.
However, the actual mechanisms behind its therapeutic effect are hotly debated. Recording the release of the brain’s signaling chemicals, known as neurotransmitters, could help to resolve the question, allowing neurosurgeons to better optimize the procedure.
The device consists of a custom-designed sensor electrode that is implanted along with the stimulating electrode, a microprocessor, a Bluetooth module to send data to a computer, and a battery. “It allows us to record dopamine and serotonin wirelessly in real time,” says Kendall Lee, a neurosurgeon at the Mayo Clinic, Rochester, MN, who helped develop the device. “That means we have tremendous control over the chemistry of the brain.”
To detect neurotransmitters, researchers apply a low voltage across the sensor electrode. That oxidizes nearby dopamine molecules, producing a measurable current. “The amount of current flow gives a relative indication of concentration,” says Kevin Bennet, chairman of the division of engineering at the Mayo Clinic and one of Lee’s collaborators.
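The measurement principle Bennet describes — applied voltage, oxidation, current that scales with concentration — can be sketched in a few lines. This is an illustrative calculation only, not the Mayo team’s software; the function name and the calibration slope are invented for the example.

```python
# Illustrative only: converting a sensor electrode's oxidation current
# into a relative dopamine concentration, assuming the roughly linear
# current-to-concentration relationship described in the article.

def relative_concentration(current_nA, slope_nA_per_uM):
    """Estimate concentration (micromolar) from a background-subtracted
    oxidation current (nanoamps), given a slope obtained from a bench
    calibration against known dopamine concentrations (hypothetical)."""
    return current_nA / slope_nA_per_uM

# A 12 nA signal with a calibration slope of 4 nA per micromolar
# corresponds to 3 uM of dopamine near the electrode.
print(relative_concentration(12.0, 4.0))
```

In practice such readings are relative rather than absolute, which is why Bennet speaks of “a relative indication of concentration”: the slope drifts as the electrode surface changes, so repeated calibration is needed.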
Preliminary research in pigs using the new system has shown that deep brain stimulation of the area targeted in Parkinson’s patients triggers release of dopamine. Researchers now aim to repeat these experiments in pigs that have some of the symptoms of the disease. For example, the sensors could detect whether certain patterns of dopamine correspond to improvements or worsening of Parkinson’s symptoms.
“We have to get more nuanced understanding of how electricity impacts brain chemistry at the microscopic circuit level,” says Helen Mayberg, a physician and neuroscientist at Emory University, in Atlanta, who was not involved in the research. “This type of technology gives us the opportunity to look precisely at very local changes in the chemical mix. As the technology is expanded to be able to detect an even wider range of neurochemical systems, it’s going to really catapult what we can learn about the mechanisms of brain stimulation and the diseases we treat with it.”
In addition to detecting dopamine, preliminary research shows the technology can also detect serotonin, a brain chemical implicated in depression. (Serotonin reuptake inhibitors such as Prozac target this neurotransmitter.) Deep brain stimulation is currently approved to treat Parkinson’s, the movement disorder dystonia, and severe obsessive-compulsive disorder, and is under study for epilepsy, depression, anorexia, and other disorders.
Lee says his team has now been granted approval to test the system in a patient, which they aim to do in the next few months. Initially, it will be tested only during the implantation surgery to determine how moving the electrode alters the level of dopamine released. But the ultimate goal is to incorporate the sensor into the deep brain stimulation system. Researchers are currently developing new sensor electrodes that function effectively over the long term, as well as shrinking the device so that it can be packaged and implanted onto the skull. Once researchers better understand the link between deep brain stimulation and neurochemistry, the accompanying chemical data may help neurosurgeons to better place the electrode.
But some say this step may be premature. “The technology is very intriguing, but we need a lot more research before it can be applied in humans,” says Ali Rezai, a neurosurgeon at Ohio State University, who was not directly involved with the research. He says that researchers need to show that using this technology alongside deep brain stimulation in animals with symptoms of Parkinson’s disease improves outcomes.
In the long term, Lee and his collaborators want to develop a so-called closed loop system, allowing the stimulation device to detect the chemical changes in the brain and adjust its response accordingly. This approach is analogous to cardiac pacemakers, which stimulate the heart only when the instrument detects an abnormality. While abnormalities in heart rhythms are fairly straightforward to detect, “in the brain, it’s much more complicated,” says Rezai.
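The pacemaker analogy above can be made concrete with a minimal sketch of a closed-loop rule, assuming a simple threshold band; the band limits, step size, and function name are all invented for illustration, and a real neural controller would be far more sophisticated.

```python
# Hypothetical closed-loop rule: like a pacemaker that fires only when it
# detects an abnormality, the stimulation amplitude changes only when the
# measured dopamine level leaves a target band. All numbers are invented.

def adjust_stimulation(dopamine_uM, amplitude_mA, low=0.5, high=1.5, step=0.1):
    """Return an updated stimulation amplitude for one control cycle."""
    if dopamine_uM < low:                       # signal too weak: stimulate harder
        return amplitude_mA + step
    if dopamine_uM > high:                      # overshoot: back off, never below zero
        return max(0.0, amplitude_mA - step)
    return amplitude_mA                         # within band: hold steady
```

As Rezai notes, brain signals are far harder to interpret than heart rhythms, so detecting the right “abnormality” to react to is the hard part, not the control rule itself.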