
20091224-5

New publications, experiments and breakthroughs in biomedicine–and what they mean

MIT Technology Review, January/February 2010, by Emily Singer

Three-Dimensional Genome
New technology reveals the higher-order structure of DNA.

Source: “Comprehensive mapping of long-range interactions reveals folding principles of the human genome”
Eric S. Lander, Job Dekker, et al.
Science
326: 289-293

Results: Scientists developed a tool that makes it possible to map the three-dimensional structure of the entire human genome, shedding light on how six feet of DNA is packed into a cell nucleus about three micrometers in diameter. According to the resulting analysis, chromosomes are folded so that the active genes–the ones this particular cell is using to make proteins–are close together.
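The scale of that packing problem can be checked with simple arithmetic, using the article's round figures (six feet of DNA, a nucleus about three micrometers across):

```python
# Back-of-the-envelope check of the article's figures (round numbers,
# not precise measurements): how tightly must DNA fold to fit?
dna_length_m = 6 * 0.3048      # "six feet" of DNA in meters (~1.83 m)
nucleus_diameter_m = 3e-6      # nucleus diameter, ~3 micrometers

# Linear compaction: stretched-out length vs. nuclear diameter.
compaction = dna_length_m / nucleus_diameter_m
print(f"~{compaction:,.0f}-fold linear compaction")  # ~609,600-fold
```

In other words, the molecule must be folded down by a factor of roughly 600,000 in linear extent, which is why its higher-order structure matters so much.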

Why it matters: Growing evidence suggests that the way the genome is packed in a particular cell is key to determining which of its genes are active. The new findings could allow scientists to study this crucial aspect of gene regulation more precisely.

Methods: Scientists treated a folded DNA molecule with a preservative in order to create bonds between genes that are close together in the three-dimensional structure even though they may be far apart in the linear sequence. Then they broke the molecule into a million pieces using a DNA-cutting enzyme. The researchers sequenced these pieces to identify which genes had bonded together and then used this information to develop a model of how the chromosome had been folded.
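The computational half of this method amounts to counting, for every pair of genomic regions, how often fragments from those regions were found bonded together. A minimal sketch of that bookkeeping step, with hypothetical input data (real pipelines add normalization and quality filtering):

```python
from collections import defaultdict

def contact_matrix(read_pairs, bin_size=1_000_000):
    """Count bonded fragment pairs per pair of genomic bins.

    read_pairs: iterable of (pos_a, pos_b) genomic coordinates from
    sequenced crosslinked fragments (hypothetical input format).
    """
    counts = defaultdict(int)
    for pos_a, pos_b in read_pairs:
        bin_a, bin_b = pos_a // bin_size, pos_b // bin_size
        # Order the bins so (a, b) and (b, a) count as the same contact.
        counts[tuple(sorted((bin_a, bin_b)))] += 1
    return counts

# Toy example: three read pairs, two falling in the same pair of bins.
pairs = [(500_000, 2_300_000), (700_000, 2_100_000), (10_000, 50_000)]
m = contact_matrix(pairs)
print(dict(m))  # {(0, 2): 2, (0, 0): 1}
```

Bins that contact each other far more often than their linear separation would predict are the ones inferred to lie close together in the folded structure; the million-base bin size above mirrors the resolution the researchers describe.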

Next steps: Scientists plan to study how the three-dimensional structure of the genome varies between different cell types, between different organisms, and between normal and cancerous cells. They also hope that improving the resolution of the technology might reveal new structural properties of the genome. They can currently analyze DNA in chunks comprising millions of bases, but they would like to zero in on sequences thousands of bases long.
Diabetic Cells
Stem cells derived from patients with diabetes provide a new model for studying the disease.

Source: “Generation of pluripotent stem cells from patients with type 1 diabetes”
Douglas A. Melton et al.
Proceedings of the National Academy of Sciences
106: 15768-15773

Results: Scientists collected cells from patients with type 1 diabetes and turned them into induced pluripotent stem cells–adult cells reprogrammed to have an embryonic cell’s capacity to differentiate into many different cell types. Then they stimulated these cells to differentiate into insulin-producing pancreatic cells.

Why it matters: The stem cells carry the same genetic vulnerabilities that led the patients to develop diabetes. Watching them develop into insulin-producing cells should shed light on the development and progression of diabetes. Researchers may also be able to test new treatments on the developing cells.

Methods: Researchers “reprogrammed” skin cells from two diabetes patients by using a virus to insert three genes involved in normal development. The new genes caused other genes to turn on and off in a pattern more typical of embryonic cells, returning the skin cells to an earlier developmental stage. The scientists then exposed the cells to a series of chemicals, encouraging them to differentiate into insulin-producing cells.

Next steps: The researchers will examine the interaction between the different cell types affected by diabetes: the pancreatic beta cells and the immune cells that attack them. Initially they will study these interactions in a test tube, but ultimately they hope to incorporate the lab-generated human stem cells into mice. This will help scientists understand which cells are affected first. Armed with that knowledge, they could begin developing treatments that involve replacing some of those cells.

20091224-4

Lab on a chip: Fluidigm’s microfluidic chip (the gray square in the center) uses tiny channels and valves to manipulate liquids, allowing fast and sensitive bioassays. Credit: Joshua Scott

Why does it take so long to commercialize new technologies?

 

MIT Technology Review, January/February 2010, by David Rotman  —  The new microfluidic chip fabricated by Fluidigm, a startup based in South San Francisco, represents a decade of successive inventions. This small square of spongy polymer–the same type used in contact lenses and window caulking–holds a complex network of microscopic channels, pumps, and valves. Minute volumes of liquid from, say, a blood sample can flow through the maze of channels to be segregated by the valves and pumps into nearly 10,000 tiny chambers. In each chamber, nanoliters (billionths of a liter) of the liquid can be analyzed.

The ability to move fluids around a chip on a microscopic scale is one of the most impressive achievements of biochemistry over the last 10 years. Microfluidic chips, which are now produced by a handful of startup companies and a similar number of university-based foundries, allow biologists and chemists to manipulate tiny amounts of fluid in a precise and highly automated way. The potential applications are numerous, including handheld devices to detect various diseases and machines that can rapidly analyze the content of a large number of individual cells (each holding about one picoliter of liquid) to identify, for example, rare and deadly cancerous mutations. But microfluidics also represents a fundamental breakthrough in how researchers can interact with the biological world. “Life is water flowing through pipes,” says George Whitesides, a chemist at Harvard University who has invented much of the technology used in microfluidics. “If we’re interested in life, we must be interested in fluids on small scales.”

By way of explaining the importance of the technology and the complexity of its microscopic apparatus, those involved in microfluidics often make comparisons to microprocessors and integrated circuits. Indeed, a microfluidic chip and an electronic microprocessor have similar architectures, with valves replacing transistors and channels replacing wires. But manipulating liquids through channels is far more difficult than routing electrons around an integrated circuit. Fluids are, well, messy. They can be hard to move around, they often consist of a complex stew of ingredients, and they can stick and leak.

Over the last decade, researchers have overcome many such challenges. But if microfluidics is ever to become truly comparable to microelectronics, it will need to overcome a far more daunting challenge: the transition from promising laboratory tool to widely used commercial technology. Can it be turned into products that scientists, medical technicians, and physicians will want to use? Biologists are increasingly interested in using microfluidic systems, Whitesides says. But, he adds, “do you go into the lab and find these devices everywhere? The answer is no. What’s interesting is that it hasn’t really taken off. The question is, why not?”

A similar question could just as well be asked about at least two other important technologies that have emerged over the last decade: genomic-based medicine and nanotechnology. Each began this century with significant breakthroughs and much fanfare. The sequencing of the human genome was first announced in early 2001; the National Nanotechnology Initiative, which helped launch much of today’s nanotech research, got its first federal funding in 2000. While all three technologies have produced a smattering of new products, none has had the transformative effects many experts expected. Why does it take so long for a technology as obviously important and valuable as these to make an impact? How do you create popular products out of radically new technologies? And how do you attract potential users?
Patience, Patience
Despite the economic, social, and scientific importance of technology, the process of creating it is poorly understood. In particular, researchers have largely overlooked the question of how technologies develop over time. That’s the starting point of W. Brian Arthur’s The Nature of Technology, an attempt to develop a comprehensive theory of “what technology is and how it evolves.” Arthur set to work in the library stacks at Stanford University. “As I began to read, I was astonished that some of the key questions had not been very deeply thought about,” he recalled in a recent interview. While much has been written on the sociology of technology and engineering, and there’s plenty on the histories of various technologies, he said, “there were big gaps in the literature. How does technology actually evolve? How do you define technology?”

A patent map created by IPVision, based in Cambridge, MA, shows many of the key inventions by Stephen Quake and Fluidigm over the last decade that make the company’s microfluidic chips possible. The timeline highlights several key early advances and shows how today’s microfluidic devices draw on progress in both microfabrication and biochemistry. Such a complex network of inventions is not uncommon in the development of new bodies of technology.
Credit: IPVision

Arthur hopes to do for technology what Thomas Kuhn famously did for science in his 1962 book The Structure of Scientific Revolutions, which described how scientific breakthroughs come about and how they are adopted. A key part of Arthur’s argument is that technology has its own characteristics and “nature,” and that it has too long been treated as subservient to science or simply as “applied science.” Science and technology are “completely interwoven” but different, he says: “Science is about understanding phenomena, whereas technology is really about harnessing and using phenomena. They build out of each other.”

Arthur, a former professor of economics and population studies at Stanford who is now an external professor at the Santa Fe Institute and a visiting researcher at the Palo Alto Research Center, is perhaps best known for his work on complexity theory and for his analysis of increasing returns, which helped explain how one company comes to dominate the market for a new technology. Whether he fulfills his goal of formulating a rigorous theory of technology is debatable. The book does, however, offer a detailed description of the characteristics of technologies, peppered with interesting historical tidbits. And it provides a context in which to begin understanding the often laborious and lengthy processes by which technologies are commercially exploited.

Particularly valuable are Arthur’s insights into how “domains” of technology evolve differently from individual technologies. Domains, as Arthur defines them, are groups of technologies that fit together because they harness a common phenomenon. Electronics is a domain; its devices–capacitors, inductors, transistors–all work with electrons and thus naturally fit together. Likewise, in photonics, lasers, fiber-optic cables, and optical switches all manipulate light. Whereas an individual technology–say, the jet engine–is designed for a particular purpose, a domain is “a toolbox of useful components”–“a constellation of technologies”–that can be applied across many industries. A technology is invented, Arthur writes. A domain “emerges piece by piece from its individual parts.”

The distinction is critical, he argues, because users may quickly adopt an individual technology to replace existing devices, whereas new domains are “encountered” by potential users who must try to understand them, figure out how to use them, determine whether they are worthwhile, and create applications for them. Meanwhile, those developing the new domains must improve the tools in the toolbox and invent the “missing pieces” necessary for new applications. All this “normally takes decades,” Arthur says. “It is a very, very slow process.”

What Arthur touches on just briefly is that this evolution of a new body of technology is often matched by an even more familiar progression: enthusiasm about a new technology, investor and user disillusionment as the technology fails to live up to the hyperbole, and a slow reëmergence as the technology matures and begins to meet the market’s needs.
A Solution Looking for Problems
In the late 1990s, microfluidics (or, as it is sometimes called, “lab on a chip” technology) became another overhyped advance in an era notorious for them. Advocates talked up the potential of the chips. But the devices couldn’t perform the complex fluid manipulations required for many applications. “They were touted as a replacement for everything. That clearly didn’t pan out too well,” says Michael Hunkapiller, a venture capitalist at Alloy Ventures in Palo Alto, CA, who is now investing in several microfluidics startups, including Fluidigm. The technology’s capabilities in the 1990s, he says, “were far less universal than the hype.”

The problem, as Arthur might put it, was that the toolbox was missing key pieces. Prominent among the needed components were valves, which would allow the flow of liquids to be turned on and off at specific spots on the chip. Without valves, you merely have a hose; with valves you can build pumps and begin to think of ways to construct plumbing. The problem was solved in the lab of Stephen Quake, then a professor of applied physics at Caltech and now in the bioengineering department at Stanford. Quake and his Caltech coworkers found a simple way to make valves in microfluidic channels on a polymer slab. Within two years of publishing a paper on the valves, the group had learned how to create a microfluidic chip with thousands of valves and hundreds of reaction chambers. It was the first such chip worthy of being compared to an integrated circuit. The technology was licensed to Fluidigm, which Quake cofounded in 1999.

Meanwhile, other academic labs invented other increasingly complex ways to manipulate liquids in microfluidic devices. The result is a new generation of companies equipped with far more capable technologies. Still, many potential users remain skeptical. Once again, microfluidics finds itself in a familiar phase of technology development. As David Weitz, a physics professor at Harvard and cofounder of several microfluidics companies, explains: “It is a wonderful solution still looking for the best problems.”

There are plenty of possibilities. Biomedical researchers have begun to use microfluidics to look at how individual cells express genes. In one experiment, cancer researchers are using one of Fluidigm’s chips to analyze prostate tumor cells, seeking patterns that would help them select the drugs that will most effectively combat the tumor. Also, Fluidigm has recently introduced a chip designed to grow stem cells in a precisely controlled microenvironment. Currently, when stem cells are grown in the lab, it can be difficult to mimic the chemical conditions in a living animal. But tiny groups of stem cells could be partitioned in sections of a microfluidic chip and bathed in combinations of biochemicals, allowing scientists to optimize their growing conditions.

And microfluidics could make possible cheap and portable diagnostic devices for use in doctors’ offices or even remote clinics. In theory, a sample of, say, blood could be dropped on a microfluidic chip, which would perform the necessary bioassay–identifying a virus, detecting telltale cancer proteins, or finding biochemical signs of a heart attack. But in medical diagnostics as in biomedical research, microfluidics has yet to be widely adopted.

Again, Arthur’s analysis offers an explanation. Users who encounter the new tools must determine whether they are worthwhile. In the case of many diagnostic applications, biologists must better understand which biochemicals to detect in order to develop tests. Meanwhile, those developing microfluidic devices must make the devices easier to use. As Arthur reminds us, the science and technology must build on each other, and technologists must invent the missing pieces that users want; it is a slow, painstaking evolution.

It’s often hard to predict what those missing pieces will be. Hunkapiller recalls the commercialization history of the automated DNA sequencer, a machine that he and his colleagues invented at Caltech and that was commercialized in 1986 at Applied Biosystems. (The machine helped make possible the Human Genome Project.) “Sometimes, it is a strange thing that makes a technology take off,” he says. Automated sequencing didn’t become popular until around 1991 or 1992, he says, when the company introduced a sample preparation kit. Though it wasn’t a particularly impressive technical advance–certainly not on the level of the automated sequencer itself–the kit had an enormous impact because it made it easier to use the machines and led to more reliable results. Suddenly, he recalls, sales boomed: “It wasn’t a big deal to pay $100,000 for a machine anymore.”

In a recent interview, Whitesides demonstrated a microfluidic chip made out of paper in which liquids are wicked through channels to tiny chambers where test reactions are carried out. Then he pulled a new smart phone, still in its plastic wrapping, out of its box. What if, he mused, you could somehow use the phone’s camera to capture the microchip’s data and use its computational power to process the results, instead of relying on bulky dedicated readers? A simple readout on the phone could give the user the information he or she needs. But before that happens, he acknowledged, various other advances will be needed. Indeed, as if reminded of the difficult job ahead, Whitesides quickly slipped the smart phone back into the box.

20091224-3

Advances in antiaging drugs, acoustic brain surgery, flu vaccines–and the secret to IQ

 

MIT Technology Review, December 24, 2009, by Emily Singer  —  We may look back on 2009 as the year human genome sequencing finally became routine enough to generate useful medical information (“A Turning Point for Personal Genomes”). The number of sequenced and published genomes shot up from two or three to approximately nine, with another 40 or so genomes sequenced but not yet published. In a few cases, scientists have already found the genetic cause of a disorder by sequencing an affected person’s genome.

Scientists have also sequenced the genomes of a number of cancers, comparing those sequences to the patients’ normal genomes to find the genetic mistakes that might have caused the cells to become cancerous and to metastasize (“Sequencing Tumors to Target Treatment”). The results suggest that even low-grade and medium-grade tumors can be genetically heterogeneous, which could be problematic for molecularly targeted drugs. That points to a need for new strategies for drug development and treatment in cancer.

The year brought more good news for aging mice, and maybe humans, too, as scientists identified the first drug that can extend lifespan in mammals (“First Drug Shown to Extend Lifespan in Mammals”). Rapamycin, an antifungal drug currently used to prevent rejection of organ transplants, was found to boost longevity 9 to 13 percent even when it was given to mice that were the mouse equivalent of 60 years old. Previously, genetic engineering and caloric restriction–a nutritionally complete but very low-calorie diet–were the only proven methods of extending lifespan in mammals (“A Clue to Living Longer”).

Because of its potent immunosuppressant effect, the drug isn’t suitable for this application in humans. But researchers have already found that disrupting part of the same signaling pathway has similar life-extending benefits (“Genetic Fountain of Youth”). Mice with the relevant protein disabled showed superior motor skills, stronger bones, and better insulin sensitivity when they reached mouse middle age. Female mice lived about 20 percent longer than their unaltered counterparts. But male mice, while healthy, didn’t have longer lifespans. (In comparison, caloric restriction boosts longevity by about 50 percent.) Scientists now aim to develop drugs that target this pathway, which is thought to act as a kind of gauge for the amount of food available in the environment.

The emergence in April of a new pandemic flu strain, H1N1, rapidly renewed interest in new approaches to making vaccines (“New Vaccines for Swine Flu”). For the first time during an active pandemic, pharmaceutical companies were able to use faster cell-based production methods to create vaccines against the virus, in addition to the traditional egg-based method. (None of these methods has yet been approved for use in the United States–the vaccine currently available was made in eggs.) In November, an advisory panel for the U.S. Food and Drug Administration declared that a novel method of producing flu vaccines in insect cells, while effective, needs more safety testing before it can be approved (“Caterpillar Flu Vaccine Delayed”). The vaccine, developed by Protein Sciences, based in Meriden, CT, uses a single protein from the virus to induce immunity, rather than a dead or weakened version of the virus. Two other companies began clinical trials of flu vaccines made from virus-like particles–protein shells that look just like viruses but do not contain viral DNA (“Delivering a Virus Imposter Quicker”).

A new approach to brain surgery, tested by a Swiss team earlier this year, allows surgeons to burn out small chunks of brain tissue without major surgery, using specialized sound waves (“Brain Surgery Using Sound Waves”). Neurosurgeons used a technology developed by InSightec, an ultrasound technology company headquartered in Israel. The method employs high-intensity focused ultrasound (HIFU) to target the brain. (HIFU is different from the ultrasound used for diagnostic purposes, such as prenatal screening, and has previously been used to remove uterine fibroids.) Beams from an array of more than 1,000 ultrasound transducers are focused through the skull onto a small piece of diseased tissue, heating it up and destroying it. In the study, nine patients with chronic debilitating pain reported immediate pain relief after the procedure.

Scientists also hope to co-opt the technologies developed for HIFU to modulate brain activity, using low-intensity focused ultrasound to activate nerve cells (“Targeting the Brain with Sound Waves”). This approach might one day provide a less invasive alternative to deep-brain stimulation, a procedure in which surgically implanted electrodes stimulate parts of the brain; it is an increasingly common treatment for Parkinson’s disease and other neurological problems.

In another first for the brain, scientists discovered this year that our IQ, or general intelligence, depends in large part on our white matter–the fatty layer of insulation that coats the neural wiring of the brain (“Brain Images Reveal the Secret to Higher IQ”). Using a type of brain imaging called diffusion tensor imaging, researchers analyzed the neural wiring in 92 pairs of fraternal and identical twins and found a strong correlation between the integrity of the white matter and performance on a standard IQ test. In addition, the researchers found that the quality of one’s white matter is largely genetically determined. They are now searching for genetic variants tied to white matter and IQ.

A feature in the November issue of the magazine further explored the secret of intelligence, revealing that our smarts may be determined by the function and efficiency of the networks within the brain, rather than the number of neurons or the size of any particular region (“Intelligence Explained”).

20091224-2

Credit: Technology Review

Scientists are finally starting to find medical information of value

 

MIT Technology Review —  Last year, when more than 100 of the world’s top geneticists, technologists, and clinicians converged on Cold Spring Harbor Laboratory in New York for the first annual Personal Genomes conference, the main focus was James Watson’s genome. The codiscoverer of the structure of DNA was the first to have his genome sequenced and published (aside from Craig Venter, who used his own DNA for the private arm of the human genome project). Watson sat in the front row of the lecture hall as scientists presented their analysis of his genome. They paid special attention to the number of single-letter variations or small insertions and deletions in his DNA–clues as to whether he had a genetic variation that slightly boosted his risk for heart disease or cancer. But there was very little usable information in the genome.

That has all changed. In the last year, the number of sequenced, published genomes has shot up from two or three to approximately nine, with another 40 or so genomes sequenced but not yet published. “While the numbers are still small numbers, we are starting to put this research into the real disease context and get something out of it,” says Jay Shendure, a geneticist at the University of Washington in Seattle, and a TR35 winner in 2006.

Last year, sequencing a genome was still a feat in itself, and much of the conference focused on the technical details–assessing accuracy and error rates and comparing one method to another. While these issues are still of central importance, sequencing a human genome has become routine enough to generate medically useful information. “Now we are able to do things automatically, so the biology starts to come out,” says Paul Flicek, a bioinformaticist with the European Bioinformatics Institute and one of the conference organizers.

In a few cases, scientists have already been able to find the genetic cause of a disorder by sequencing an affected person’s genome. Shendure has sequenced the coding region–the 1 percent of the genome that directs production of proteins–of the genomes of a handful of families with children afflicted with a rare inherited disorder called Miller Syndrome, which is linked to facial and limb abnormalities. Researchers compiled a list of genetic variations in each person and filtered out those commonly found in people without the disease. They then looked for variants present only in affected people, and came up with one candidate gene. Shendure declined to identify the gene prior to publishing the findings, but noted that it was one they would not have anticipated. He hopes the technique can be applied to more common diseases as well, perhaps by studying people with early onset or extreme cases.
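The filtering strategy described here is, at its core, set arithmetic over variant lists. A schematic sketch with invented variant names (the actual study's data, filters, and thresholds differ):

```python
# Schematic of the filter-and-intersect strategy for a rare inherited
# disorder; all variant identifiers below are invented for illustration.

common_variants = {"v1", "v2", "v7"}   # seen in people without the disease

# Variants called in each affected individual (one set per patient).
affected = [
    {"v1", "v3", "v5", "v9"},
    {"v2", "v3", "v5"},
    {"v3", "v5", "v8"},
]

# Step 1: drop variants commonly found in unaffected people.
rare_per_patient = [v - common_variants for v in affected]

# Step 2: keep only variants present in every affected individual.
candidates = set.intersection(*rare_per_patient)
print(sorted(candidates))  # ['v3', 'v5']
```

Each filtering step shrinks the candidate list dramatically, which is how a handful of sequenced families can be enough to converge on a single candidate gene.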

Genome sequencing has also engendered a new approach to cancer research. Last year, Elaine Mardis and her team at Washington University School of Medicine in St. Louis sequenced the complete genomes of cancerous and normal tissue in a patient with acute myeloid leukemia, identifying 10 mutated genes that appear to play a role in this cancer. This year, her team has sequenced the genome of four different types of tissue from a breast-cancer patient–the normal genome, DNA from the primary tumor, DNA from a metastatic brain tumor (a secondary tumor formed from cancer cells originally from the breast tumor), and DNA from the patient’s cancerous tissue implanted into a mouse. (Because the cancerous tissue removed during surgery is often inadequate for genetic research, scientists sometimes grow tumor tissue from the patient’s cancer cells in mice.)

While the vast majority of the sequence will be identical in all four samples, identifying differences could pinpoint the genetic changes that lead to the initial formation of the tumor, as well as those that trigger metastasis. If scientists can find drugs that block the primary tumor from spreading, cancer could be converted into a manageable chronic disease.

Mardis’s team has already identified a number of variants that are unique to either the primary tumor or the metastatic tumor. They have also found some variants that appear in both but are more common in the metastatic tissue, suggesting that this type of mutation might enable cells to spread through the body. “We are now looking at breast-cancer-derived brain, lung, and liver tumors to see if there are commonalities in metastatic disease,” says Mardis. Her center aims to sequence 150 cancer genomes this year. Next year, that number will likely seem small.
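Sorting variants into primary-only, metastasis-only, and shared classes is again a set comparison against the normal genome. A hedged sketch, with invented variant identifiers standing in for real calls:

```python
# Invented variant calls for illustration only; real analyses also
# track how common each variant is within a tumor, not just presence.
normal = {"n1", "n2"}                      # germline variants (baseline)
primary = normal | {"p1", "p2", "s1"}      # calls from the primary tumor
metastasis = normal | {"p2", "s1", "m1"}   # calls from the brain metastasis

somatic_primary = primary - normal         # mutations unique to tumor tissue
somatic_met = metastasis - normal

primary_only = somatic_primary - somatic_met   # lost during metastasis?
met_only = somatic_met - somatic_primary       # acquired during metastasis?
shared = somatic_primary & somatic_met         # present in both tumors
print(sorted(primary_only), sorted(met_only), sorted(shared))
```

The metastasis-only and shared classes are the interesting ones here: variants enriched in metastatic tissue are the candidates for mutations that let cells spread through the body.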

20091224-1

http://www.lef.org/newsletter/2009

December 22, 2009  —  A report appearing in the December 2009 issue of the American Psychological Association journal Behavioral Neuroscience revealed that diets that fail to provide enough of the omega-3 fatty acids eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) may negatively affect the nervous system. The finding could inform the understanding of information-processing deficits that occur in schizophrenia, bipolar disorder, obsessive-compulsive disorder, attention-deficit/hyperactivity disorder (ADHD), Huntington’s disease, and other nervous-system disorders.

Norman Salem Jr, PhD, of the Laboratory of Membrane Biochemistry and Biophysics at the National Institute on Alcohol Abuse and Alcoholism, and his associates fed pregnant mice and their offspring one of four diets: deficient in omega-3 fatty acids, low in alpha-linolenic acid, high in alpha-linolenic acid, or enriched with EPA and DHA. DHA is the primary omega-3 fatty acid in the nervous system, including the brain. While DHA can be synthesized from dietary alpha-linolenic acid, the conversion is minimal, making a direct dietary source of DHA and EPA, such as fish oil or algae, vitally important. “Humans can convert less than one percent of the precursor into DHA, making DHA an essential nutrient in the human diet,” coauthor Irina Fedorova, PhD, noted.

Adult offspring of the mice in the four groups were tested for nervous-system function by exposing them to a loud noise preceded by a softer warning tone. Animals normally flinch upon hearing a loud tone; however, the degree of flinching is reduced when the animals are first exposed to a warning tone, an adaptive process known as sensorimotor gating. Weak sensorimotor gating in humans is associated with a number of nervous-system disorders.

While mice that were raised on EPA and DHA demonstrated normal sensorimotor gating, animals given the other diets were more startled by the loud noise. The finding suggests that a sensory overload state could result from DHA deficiency.

The ability of DHA and EPA to help maintain nerve cell membranes may be responsible for the protective effects observed in the current study. “It is an uphill battle now to reverse the message that ‘fats are bad,’ and to increase omega-3 fats in our diet,” Dr Salem commented. “It only takes a small decrement in brain DHA to produce losses in brain function.”

http://www.lef.org/newsletter/2009

 

December 22, 2009  —  A growing body of scientific literature is helping parents and doctors better understand the link between fatty acids and behavioral disorders such as ADHD. The ratio between omega-3 and omega-6 fatty acids (such as arachidonic acid) seems especially important. Eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) are omega-3 fatty acids found in cold-water fish; flaxseed oil supplies their precursor, alpha-linolenic acid. In the typical Western diet, we tend to consume more omega-6 fatty acids relative to omega-3 fatty acids. The ratio of omega-3 to omega-6 fatty acids has been shown to influence the development of neurotransmitters and other chemicals that are essential for normal brain function. Increased intake of omega-3 fatty acids has been shown to reduce the tendency toward hyperactivity among children with ADHD (Haag M 2003).

Several studies have examined the role of essential fatty acids in ADHD, with very encouraging results:

  • A study examined the effects of flaxseed oil and fish oil, which provide varying degrees of omega-3 fatty acids, on adults with ADHD. The patients were given supplements for 12 weeks, and their blood levels of omega-3 fatty acids were tracked throughout. Researchers found that high-dose fish oil increased omega-3 acids in the blood relative to omega-6 acids. An imbalance between arachidonic acid and omega-3 fatty acids is considered a risk factor for ADHD (Young GS et al 2005).

  • One study compared 20 children with ADHD who were given a dietary supplement (that included omega-3 fatty acids) to children with ADHD who were given methylphenidate. The dietary supplement was a mix of vitamins, minerals, essential fatty acids, probiotics, amino acids, and phytonutrients. Remarkably, the groups showed almost identical improvement on commonly accepted measures of ADHD (Harding KL et al 2003).


 

December 22, 2009  —  Attention deficit/hyperactivity disorder (ADHD) is a distressing diagnosis for any parent to hear. It’s well known that children with ADHD are at a disadvantage in school and that ADHD can have long-term effects. Treatment, moreover, has traditionally relied on a number of powerful pharmaceuticals.

Fortunately, newer findings in nutrition and wellness, and newer generations of pharmaceuticals, have been developed that can help children with ADHD gain control over their lives. The Life Extension Foundation has conducted an extensive survey of the scientific literature to uncover the safest and best approaches for families affected by this increasingly common condition.

ADHD is defined as a persistent lack of attention to tasks (attention deficit) and/or a lack of ability to control impulses and an increase in physical activity (hyperactivity) that is not typical of others at a similar stage of development (National Institutes of Health 2006). ADHD is most prevalent in children and teens, although it can occur in adults. ADHD occurs in 3 to 6 percent of all children in the United States, with rates as high as 15 percent in some areas (Kasper DL et al 2005).

According to the fourth edition of the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), ADHD is now the most commonly diagnosed behavioral disorder of childhood. Boys with ADHD outnumber girls 3 to 1. Some children outgrow ADHD, but 60 percent continue to have symptoms (Biederman J et al 2000).

ADHD: A Typical Profile

Children who have ADHD typically show affected behavior in many settings, such as at home, at school, and with friends. The most prominent feature of ADHD is a consistent pattern of developmentally inappropriate inattention, poor concentration, distractibility, hyperactivity, and impulsivity. It is important to note that these problems must be inappropriate for a child’s developmental level to be considered ADHD. One concern among physicians is rampant overdiagnosis of ADHD, in part because the condition has been so hard to define.

Children who have attention deficits are unable to remain on-task for extended periods of time. They may appear forgetful, in part because their inability to attend to information prevents them from understanding it in the first place. Such children may also have cognitive and language delays. Children with hyperactivity may fidget, have difficulty engaging in quiet activities, be excessively talkative, and always seem to be on the go. Children who have impulse control problems may be impatient (for example, they may blurt out an answer before the question has been finished). They may have difficulty waiting their turn and are often perceived to be intruding on others. All of these manifestations can cause difficulties in academic and social settings (Warner-Rogers J et al 2000).

It is common for children with ADHD to be misdiagnosed as having learning disorders because they often perform poorly on tests that require information processing and concentration (Hartman CA et al 2004; Weiler MD et al 2000). There is also evidence that adults with ADHD are more likely to have a variety of addictive behaviors, among them alcoholism (Ponce AG et al 2000), smoking (Levin ED et al 2001), and cocaine use (Bandstra ES et al 2001).

What Causes ADHD?

Although the exact causes of ADHD are unknown, it most likely arises from an interaction of genetic, environmental, and nutritional factors. Current research focuses on the interplay of multiple genes (genetic loading) that together predispose a child to ADHD.

There is some evidence that people with ADHD do not produce adequate quantities of certain neurotransmitters, among them dopamine, norepinephrine, and serotonin. Some experts theorize that such deficiencies lead to self-stimulatory behaviors that can increase brain levels of these chemicals (Comings DE et al 2000; Mitsis EM et al 2000; Sunohara GA et al 2000).

There may also be structural and functional abnormalities in the brain itself in children who have ADHD (Pliszka SR 2002; Mercugliano M 1999). Evidence suggests that there may be fewer connections between nerve cells, which would further impair neural communication already impeded by decreased neurotransmitter levels (Barkley R 1997). Functional studies in patients with ADHD demonstrate decreased blood flow to the brain areas that underlie “executive function,” including impulse control (Paule MG et al 2000). There may also be a deficit in the amount of myelin (insulating material) produced by brain cells in children with ADHD (Overmeyer S et al 2001).

Diagnosing ADHD

Establishing a diagnosis of ADHD is a considerable challenge, largely because of the lack of reliable, specific tests and firm criteria. ADHD has become a high-profile condition, which may lead to its being both overdiagnosed and underdiagnosed, depending on pressures from parents, teachers, and others. Although DSM-IV contains diagnostic criteria, health professionals often do not follow them. Because of the lifelong implications of an ADHD diagnosis, most experts recommend a multidisciplinary team approach to both diagnosis and treatment, involving physicians, child behavior experts, and parents. Nutritional experts may also be valuable members of the treatment team.

The core symptoms of ADHD in children are listed below. This list was adapted from the Centers for Disease Control. It is important to note that the diagnosis of ADHD cannot be made unless the patient has experienced these symptoms in ways that are disabling for a 6-month period. The DSM-IV diagnosis includes:

I. Either A or B:

A. Six or more of the following symptoms of inattention have been present for at least 6 months to a point that is disruptive and inappropriate for developmental level:

  1. Often does not give close attention to details or makes careless mistakes in schoolwork, work, or other activities.
  2. Often has trouble keeping attention on tasks or play activities.
  3. Often does not seem to listen when spoken to directly.
  4. Often does not follow instructions and fails to finish schoolwork, chores, or duties in the workplace (not due to oppositional behavior or failure to understand instructions).
  5. Often has trouble organizing activities.
  6. Often avoids, dislikes, or does not want to do things that take a lot of mental effort for a long period of time (such as schoolwork or homework).
  7. Often loses things needed for tasks and activities (such as toys, school assignments, pencils, books, or tools).
  8. Is often easily distracted.
  9. Is often forgetful in daily activities.

B. Six or more of the following symptoms of hyperactivity/impulsivity have been present for at least 6 months to an extent that is disruptive and inappropriate for developmental level:

Hyperactivity:

  1. Often fidgets with hands or feet or squirms in seat.
  2. Often gets up from seat when remaining in seat is expected.
  3. Often runs about or climbs when and where it is not appropriate (adolescents or adults may feel very restless).
  4. Often has trouble playing or enjoying leisure activities quietly.
  5. Is often “on the go” or often acts as if “driven by a motor.”
  6. Often talks excessively.

Impulsivity:

  1. Often blurts out answers before questions have been finished.
  2. Often has trouble waiting his/her turn.
  3. Often interrupts or intrudes on others (such as butts into conversations or games).

II. Some symptoms that cause impairment were present before age 7 years.

III. Some impairment from the symptoms is present in two or more settings (such as at school or work and at home).

IV. There must be clear evidence of significant impairment in social, school, or work functioning.

V. The symptoms do not happen only during the course of a pervasive developmental disorder, schizophrenia, or other psychotic disorder. The symptoms are not better explained by another mental disorder (such as a mood disorder, anxiety disorder, dissociative disorder, or personality disorder).
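The criteria above amount to a counting rule: at least six of nine symptoms in either category, persisting for at least six months, with onset before age 7 and impairment in at least two settings. As an illustration only (this mirrors the counting logic of the list above and is in no way a diagnostic tool; the function and parameter names are invented), the rule can be sketched as:

```python
# Illustrative sketch of the DSM-IV ADHD screening rule described above.
# NOT a diagnostic tool: it only reproduces the counting logic of the
# published criteria (>= 6 of 9 symptoms in either category, >= 6 months
# of symptoms, onset before age 7, impairment in >= 2 settings).

def meets_dsm_iv_criteria(inattention_count: int,
                          hyperactive_impulsive_count: int,
                          months_of_symptoms: int,
                          onset_age: int,
                          impaired_settings: int) -> bool:
    # Criterion I: six or more symptoms of inattention (A) OR of
    # hyperactivity/impulsivity (B).
    symptom_threshold = (inattention_count >= 6
                         or hyperactive_impulsive_count >= 6)
    # Criteria II-IV: duration, early onset, and multi-setting impairment.
    return (symptom_threshold
            and months_of_symptoms >= 6
            and onset_age < 7
            and impaired_settings >= 2)

# Example: 7 inattention symptoms, 8 months' duration, onset at age 5,
# impairment at home and at school.
print(meets_dsm_iv_criteria(7, 2, 8, 5, 2))  # True
```

Criterion V (exclusion of pervasive developmental, psychotic, and other mental disorders) requires clinical judgment and is deliberately not encoded here.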

Traditional Medical Treatment

In addition to behavioral management, medical treatment of ADHD includes stimulant and nonstimulant medications.

Stimulant drugs. The effective prescription drugs are primarily the so-called stimulants. These agents increase brain concentrations of several neurotransmitters, most importantly dopamine, and exert a calming effect on people who have ADHD. Because dopamine enhances signaling between nerve cells involved in task-specific activities and also decreases “noise,” or nonsense signaling, increased dopamine concentrations are thought to help individuals stay focused and on-task.

Despite their limitations, stimulants are still considered first-line treatment for ADHD; they alleviate core symptoms such as inattention, hyperactivity, and impulsivity in 70 to 80 percent of patients. Original stimulant preparations had very short periods of action that could produce dramatic rises and falls in drug levels. Newer long-acting preparations were developed to even out these swings.

Even with the newer formulations, some adverse effects are inevitable. Long-term effects, although unusual, can occur. There is some evidence, for example, that long-term use of stimulants, especially methylphenidate (Ritalin®), can cause a delay in growth (Holtkamp K et al 2002). It is understandable that many parents are hesitant to give their young children this medication.

While they are effective, stimulant drugs are amphetamines or closely related compounds, which means they can have significant adverse effects and hold some potential for abuse. Unfortunately, methylphenidate has gained popularity as a recreational drug, especially among adolescents and college students. While methylphenidate paradoxically acts as a calming drug among people diagnosed with ADHD, it acts as a stimulant among people who do not have ADHD. Surveys have indicated that more than 90 percent of college students and adolescents who abuse prescription drugs identified methylphenidate as their drug of choice (White BP et al 2006).

Nonstimulant drugs. The negative effects of stimulant drugs have led to an intensive search for better alternatives. Atomoxetine is the first nonstimulant drug approved by the US Food and Drug Administration (FDA) for treatment of ADHD and the only agent approved by the FDA for treatment of ADHD in adults.

Atomoxetine therapy for ADHD controls symptoms and maintains remission, and has comparable efficacy with methylphenidate, a favorable safety profile, and noncontrolled substance status (Christman AK et al 2004). Atomoxetine is safe and well tolerated (Kelsey DK et al 2004). It effectively reduces ADHD symptoms and improves social functioning in school-aged children, adolescents, and adults. As with stimulant medications, atomoxetine should be used with caution in patients who have hypertension or a cardiovascular disorder (Christman AK et al 2004).

In addition to atomoxetine, other drugs that increase brain concentrations of dopamine and/or serotonin have been used with varying degrees of success. Among these are the anticonvulsant gabapentin (Hamrin V et al 2001), the dopamine-enhancing antidepressant bupropion (Daviss WB et al 2001), the wakefulness-promoting drug modafinil (Taylor FB et al 2000), and donepezil, an acetylcholinesterase inhibitor that increases brain levels of acetylcholine. Studies, however, have cast doubt on donepezil’s effectiveness (Wilens TE et al 2005).