MIT researcher Mark Bear thinks that some forms of autism and mental retardation may be treatable with drugs already on laboratory shelves.

FORBES.com, November/December 2010, by Robert Langreth —  Mark Bear, 53, has been fixated on understanding the brain since he was 6, when he saw news commentators speculating about John F. Kennedy’s brain function after the shooting. He later became a neuroscientist, now at the Massachusetts Institute of Technology, and has spent most of his career doing basic research on how the brain’s cells form connections during learning.

Today researchers are buzzing about Bear and his radical new theory that offers a real glimmer of hope that some forms of autism may be treatable with drugs. The causes of autism have mystified scientists for decades. It has been blamed on everything from genes to environmental toxins to the discredited concept that childhood vaccines are the culprit.

Bear’s work suggests that a specific class of drug already sitting on drug company shelves may help patients with an inherited disease called fragile X syndrome, a common cause of autism. It affects one in 5,000 children and causes mental retardation, anxiety and autism-like symptoms. While years of research remain, Bear theorizes that these drugs might have applications beyond fragile X to autism in general.

In the wake of his results Roche and Novartis have begun testing an old class of experimental anxiety drugs called mGluR5 inhibitors in fragile X patients. Seaside Therapeutics, which Bear cofounded, licensed a similar drug from Merck that is set to enter tests in fragile X patients early next year. Another Seaside drug showed promising early results in a study of 28 autism patients. (Bear owns 5% of the company.)

“I have been in this field for 25 years, and these last two years have been the most exciting in my career,” says Randi Hagerman, a developmental pediatrician at the MIND Institute at UC Davis who is testing several of the drugs.

Bear’s work in fragile X started with a chance encounter a decade ago with Emory University geneticist Stephen Warren, who discovered the gene for fragile X in 1991. Bear gave a speech about how protein production was needed for certain basic cellular processes involved in memory. That grabbed Warren’s attention. He knew that the same gene that caused fragile X also helped control protein production. “After his talk I leaned over and said, ‘I have a mouse you have to look at,’” Warren says.

Bear’s subsequent experiments with mice with fragile X indicate that the brain makes too much of key proteins that prevent proper learning from occurring. By adjusting protein levels with a drug, he realized, it just may be possible to reverse the problem. Better yet, drugs that could do the trick already existed in pharmaceutical laboratories: a class of medications called mGluR5 antagonists that had been originally developed as antianxiety medicines. “I remember the moment. I was sitting in my office at home, and it occurred to me that excessive mGluR could prove a thread that connects what seems to be unrelated symptoms” of fragile X, he recalls. “The hair stood up on the back of my neck.”

When Bear proposed this concept, even his own students were skeptical. His first talk detailing the theory was at an invitation-only conference of fragile X experts in 2002. Bear, an outsider to the field, fretted about how his theory would be received. “The initial reaction was stunned silence. It was like a chin-scratching,” Bear says. “No one could immediately see what was wrong with the idea.”

He founded Seaside Therapeutics in 2005 with Randall Carpenter, who leads the company. It has attracted $60 million in funding from a private family trust. “We were thinking if this is something that has gone wrong in autism we have to do it. Big Pharma was not going to do it because it was too risky,” Bear says. After getting a cool reception from several drug companies, Bear leveraged an MIT connection into a meeting with scientists from Merck, which agreed to license its mGluR5 drug to Seaside.

So far his intuition has proved right. The theory got a huge boost in 2007, when Bear and Gül Dölen took mice with the defective fragile X gene and crossed them with mice missing half their copies of the brain receptor mGluR5, which stimulates protein production. The fragile X symptoms disappeared, suggesting that drugs that block the receptor (and thus reduce the amount of protein produced) might do the same thing.

UNC-Chapel Hill psychologist Geraldine Dawson, chief science officer for nonprofit Autism Speaks, calls Bear’s work “the first demonstration that it is possible” that drugs based on gene research could help treat autism-like symptoms. “It is like lightning to have something in clinical trials” so fast, says former Bear postdoc Kimberly Huber, who did the early lab work leading to the theory and is now a professor at UT Southwestern Medical Center.

The latest version of Bear’s theory, published in 2008, holds that similar protein production problems may be at play in many cases of autism of unknown cause. There may be an optimal level of protein production inside brain cells needed for learning. If there is too much or too little, learning disabilities may occur and grow over time. Under this view, rather than an unfixable problem, many forms of autism may be more like a chronic disease that starts at birth and gradually gets worse. Tweaking protein levels with various drugs may be able to help.

Initial tests of mGluR5 drugs are being conducted on adults with fragile X, but companies ultimately hope to move to treating kids, where the drugs could have a bigger impact. “My dream would be to apply the treatments early enough in life so we could change the course of the disease,” says Roche Vice President Luca Santarelli.

Testing whether the drugs work in people with fragile X will take years, and it’s far from clear that they could work with other forms of autism. “That’s the billion-dollar question,” says Santarelli, who is holding off testing his drug in autism until researchers know more. Says Bear: “If we are right, this could have a profound effect on human health.”

An array of dissolving microneedles is shown on a fingertip. Researchers have been awarded $10 million to develop a microneedle patch for immunizing against influenza. (Credit: Courtesy of Mark Prausnitz)

ScienceDaily.com, November 2010 — The National Institutes of Health (NIH) has awarded $10 million to the Georgia Institute of Technology, Emory University and PATH, a Seattle-based nonprofit organization, to advance a technology for the painless, self-administration of flu vaccine using patches containing tiny microneedles that dissolve into the skin.

The five-year grant will be used to address key technical issues and advance the microneedle patch through a Phase I clinical trial. The grant will also be used to compare the effectiveness of traditional intramuscular injection of flu vaccine with administration of vaccine into the skin using microneedle patches. In animals, vaccination with dissolving microneedles has been shown to provide better immunization than vaccination with hypodermic needles.

“We believe that this technology will increase the number of people being vaccinated, especially among the most susceptible populations of children and the elderly,” said Mark Prausnitz, a professor in the Georgia Tech School of Chemical and Biomolecular Engineering, and the project’s principal investigator. “If we can make it easier for people to be vaccinated and improve the effectiveness of the vaccine, we could significantly reduce the number of deaths caused every year by influenza.”

Vaccine-delivery patches contain hundreds of micron-scale needles so small that they penetrate only the outer layers of skin. Their small size would allow vaccines to be administered without pain — and could allow people to apply the patches themselves without visiting medical facilities.

While the ability to immunize large numbers of people without using trained medical personnel is a key advantage for the microneedle patch, the researchers have learned that administering the vaccine through the skin creates a different kind of immune response — one that may protect vaccine recipients better.

“We have seen evidence that the vaccine works even better when administered to the skin because of the plethora of antigen presenting cells which reside there,” said Ioanna Skountzou, co-principal investigator for the project and an assistant professor in Emory University’s Department of Microbiology and Immunology. “This study will allow us to determine how we can optimize the vaccine to take advantage of those cells that are important in generating the body’s immune response.”

Among the issues to be addressed in the five-year study are:

  • Developing an administration system that will be simple to use, intuitive and reliable. “Our goal is to make these patches suitable for self-administration, so that anybody could take a patch out of an envelope, put it on, and have it work with high reliability,” Prausnitz said.
  • Studying the long-term stability of vaccine used in the patches, and optimizing technology for incorporating it into the dissolving microneedles. “We need to put the vaccine into a dry form in this patch,” said Prausnitz. “That will require different processing than is normally done with vaccines. We expect that this dry vaccine will provide enough stability that the patches can be stored without refrigeration.”
  • Evaluating the economic, regulatory, social and medical implications of a self-administered vaccine. PATH, an international nonprofit organization, will assist with this work, and will help strategically address any issues. “We will be assessing the barriers that may exist to introduction of a self-administered flu vaccine so we can anticipate those issues and develop possible solutions,” said Darin Zehrung, leader of the vaccine delivery technologies group at PATH.

The funding will come from the Quantum program of the National Institute of Biomedical Imaging and Bioengineering (NIBIB), which is part of the NIH. The initiative is designed to bring new medical technologies into clinical use.

While the funding focuses specifically on influenza vaccination, the lessons learned may advance other microneedle applications — including vaccination efforts in developing countries where skilled medical personnel are limited and concerns about re-use of hypodermic needles are significant.

Additional design and development of the microneedle patch will largely be done at Georgia Tech, with vaccine development, immunological studies and the Phase I trial carried out at Emory University. The trial, to be conducted by the Hope Clinic of the Emory Vaccine Center, is expected to take place during the final year of the grant, setting the stage for Phase II and Phase III clinical trials that would be required to obtain FDA approval.

Ultimately, the goal will be to produce an influenza vaccine delivery patch that could be made widely available. Prausnitz expects that will be done by an established company with the ability to manufacture and market the devices.

Microneedle drug and vaccine delivery systems have been under development at Georgia Tech and elsewhere since the 1990s. The technology got a significant boost in July of 2010 with publication of a study in Nature Medicine that showed mice vaccinated with dissolving microneedles were protected against influenza at least as well as mice immunized through traditional hypodermic needle injections.

The patches used in that study contained needles just 650 microns long, assembled into arrays of 100 needles. Pressed into the skin, the needles quickly dissolved into bodily fluids thanks to their hydrophilic polymer material, carrying the vaccine with them and leaving only a water-soluble backing. In contrast, use of hypodermic needles leaves the problem of “sharps” disposal.

Prausnitz hopes that the $10 million in NIH funding will help accelerate development of the microneedle patches to make them available for general use within five to ten years.

“This research will focus on optimizing the microneedle-based delivery of vaccines into the skin and understanding how this method affects immune responses both at the mucosal surfaces of the body and through the systemic response inside the body,” added Skountzou. “Combined with the convenience of self-administration, painless application and absence of sharps waste, this novel immunization route could make the microneedle patch a powerful new weapon against infectious diseases.”

Sections of neurons captured with new imaging technology from Stanford researchers.

FORBES.com, November/December 2010  —  While large pharmaceutical companies like Pfizer, Eli Lilly and Bristol-Myers Squibb spend millions of dollars pursuing new treatments for neurological diseases like Alzheimer’s, a huge impediment to their success lies in the fact that the brain and diseases affecting it are still not well understood. As my colleague Bob Langreth has pointed out several times, there is still quite a bit of controversy about what causes Alzheimer’s.

New research by scientists at Stanford University may help to change that. The researchers created a state-of-the-art imaging system and used it to look at brain tissue from mice (see the photo above), and were able to quickly and accurately count the connections between the brain’s nerve cells with a level of detail not attained before. The research results were published today in the journal Nature.

There are some 200 billion nerve cells (neurons) in the brain, linked together by trillions of contacts called synapses. “In a human, there are more than 125 trillion synapses just in the cerebral cortex area [a thin layer of tissue on the brain’s surface],” said Stephen Smith, PhD, a professor of molecular and cellular physiology and a senior author of the new research paper. Smith said that up to now, scientists have been guessing when they attempt to map the circuitry of the cerebral cortex. The synapses in the brain are packed so close together that traditional microscopes cannot resolve them. “Now we can actually count them and, in the bargain, catalog each of them according to its type.”

The method Smith and others in his lab developed is called array tomography. The researchers took slabs of tissue from a mouse’s cerebral cortex, sliced them into sections 70 nanometers thick, stained the sections with antibodies designed to match certain proteins, and attached fluorescent molecules that glow under light.

After taking very high resolution photographs of the tissue slices, the information in the photos was virtually stitched together using new computational software designed by study co-author Brad Busse. The result: Researchers could move through a 3-D mosaic created by the software. Different colors represent different synaptic types.

The promise is great, and the researchers are optimistic. “I anticipate that within a few years, array tomography will have become an important mainline clinical pathology technique and a drug research tool,” said Smith. Let’s hope that it ultimately leads to unraveling the many mysteries of crippling neurological diseases like Alzheimer’s.



FORBES.com, November 29, 2010, by Terry Waghorn  —  Micro solutions hold the answer to sustainability in developing countries. Advanced infrastructure often taken for granted in the developed world does not exist in many of these countries. This means sustainability innovations must work around what isn’t available in these regions and make use of what little is.

One way this is being accomplished is through txteagle, which uses the mobile phone phenomenon to provide micro-work for people in developing countries. Only 18% of people in the developing world had access to the internet at the end of 2009, but more than 50% owned a mobile phone.

Txteagle, started by Nathan Eagle, a research scientist at MIT, distributes small jobs via text message in return for small payments. These jobs often draw on local knowledge, ranging from checking what street signs say for a satellite-navigation service to translating words into local dialects for companies trying to spread their marketing.

Meanwhile, others are tackling one of the biggest problems in the developing world’s rural areas – the lack of an electrical grid to provide lighting. In many areas, highly polluting kerosene is burned to generate light, contributing significantly to the earth’s carbon dioxide levels.

For example, the Lumina Project, an initiative of the U.S. Department of Energy, and Lighting Africa conducted tests using solar LED lighting in a broiler chicken operation at an off-grid farm in Kenya. The test achieved lower operating costs, produced substantially more light, improved the working environment and had no adverse effect on yields.

Providing light with the solar LED systems is far more economical than connecting to a grid, the cost of which was estimated at 1.7 million Kenyan shillings (about 21,350 USD), roughly 35 times the cost of the LED system. Furthermore, compared to kerosene, switching to the LED system avoids over one metric ton of carbon dioxide emissions per broiler house annually.
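As a rough sanity check on the figures above (the LED system's cost is not stated directly; it is inferred here from the article's "about 35 times" ratio, so treat both derived numbers as approximations):

```python
# Back-of-envelope check on the grid-vs-LED cost comparison.
# The grid-connection estimate and the 35x ratio come from the article;
# the implied LED system cost and exchange rate are inferences.
grid_connection_kes = 1_700_000   # Kenyan shillings, estimated grid cost
grid_connection_usd = 21_350      # the article's USD conversion
cost_ratio = 35                   # grid connection vs. solar LED system

implied_led_cost_usd = grid_connection_usd / cost_ratio
implied_fx_rate = grid_connection_kes / grid_connection_usd  # KES per USD

print(f"Implied solar LED system cost: ~${implied_led_cost_usd:,.0f}")
print(f"Implied exchange rate: ~{implied_fx_rate:.0f} KES/USD")
```

The implied LED system cost of roughly $600 is consistent with small commercial solar lighting kits of the period.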

There is potential for replication of this particular LED lighting strategy in the developing world. But there is also potential for non-industrial use. Solar LED could help produce much-needed light for millions of people in the developing world who do not have access to traditional electrical grids.

Australian company Barefoot Power uses this simple technology to bring alternatives to fuel-consuming appliances, specifically lights, to poverty-stricken areas. This reduces the drain on the very limited money impoverished individuals have.

More than $10 billion is spent each year on kerosene for lighting in the homes of the poor in developing countries. Barefoot Power provides LED lights powered by solar cells, giving poor families a better and cheaper option than spending their scarce cash on fuel.

Carbon emissions from burning kerosene are also heavily cut. Kerosene consumption for lighting is equal to 1.7 million barrels of oil per day, greater than the daily oil production of Libya. Lawrence Berkeley National Laboratory states that the “single greatest way to reduce the greenhouse gas emissions associated with lighting energy use is to replace kerosene lamps with white LED light systems in developing countries.”

Science Weekly: Memory on trial


Can we trust the memory of court witnesses?; a sneak preview of a new climate science exhibition; oxygen tasted on another world; and ‘evidence’ we can see into the future

Panel on Integrating Personalized Health Care Into Clinical Practice

Target Health is pleased to announce that our CMO Mark Horn, MD, MPH, and Glen Park, Pharm.D., our Sr. Director Clinical Research and Regulatory Affairs will be participating in a panel entitled: “Integrating Personalized Health Care Into Clinical Practice.” The panel discussion will take place on Thursday, December 2, 2010, between 6:30-8:00 pm at the Mount Sinai School of Medicine, Icahn Medical Institute, 1425 Madison Avenue, NYC (1st Floor Seminar room).

For the program, the Fundamentals of the Bioscience Industry Program (FOBIP) Alumni Network brings together a clinical scientist and two healthcare experts to discuss, debate and identify the current challenges and opportunities for the development of a personalized approach to medicine and healthcare. They will provide perspective from the different participants of the healthcare industry and offer insights into this promising area of medical research. In addition to Drs. Horn and Park, the other panelist will be Paul Chapman, MD. Eric Vieira, PhD will moderate the program. For more information, go to http://fobip.org/alumni/personalized-medicine-seminar.html.

For more information about Target Health contact Warren Pearlson (212-681-2100 ext 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. Target Health’s software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

DNA Drugs Come of Age

After years of false starts, a new generation of DNA vaccines and medicines for HIV, influenza and other stubborn illnesses is now in clinical 1) ___. What’s next for AIDS: new approaches for tackling HIV in the developing world. The surprise success, this past summer, of a clinical trial of an antiretroviral-based vaginal 2) ___ provides new traction for efforts to combat AIDS in the developing world. Here are some new directions to expect for treatment and prevention of this widespread killer.

Vaccines and therapies containing DNA rings called plasmids have long held promise for treating and preventing 3) ___, but the plasmids made a weak showing in early tests. Improvements to the plasmids and new methods for delivering them have dramatically enhanced their potency.

DNA vaccines and therapies now used in animals or in late-stage human trials demonstrate that 4) ___ are reaching their potential.

In a head-to-head competition held 10 years ago, scientists at the NIH tested two promising new types of 5) ___ to see which might offer the strongest protection against one of the deadliest viruses on earth, the human immunodeficiency virus (HIV) that causes AIDS. One vaccine consisted of DNA rings called plasmids, each carrying a gene for one of five HIV proteins. Its goal was to get the recipient’s own cells to make the viral proteins in the hope they would provoke protective reactions by 6) ___ cells. Instead of plasmids, the second vaccine used another virus called an adenovirus as a carrier for a single HIV gene encoding a viral protein. The rationale for this combination was to employ a “safe” virus to catch the attention of immune 7) ___ while getting them to direct their responses against the HIV protein.

The test results dealt a major blow to believers in this first generation of DNA vaccines, the plasmids. The DNA recipients displayed only weak immune responses to the five HIV proteins or no response at all, whereas recipients of the adenovirus-based vaccine had robust reactions. To academic and pharmaceutical company researchers, 8) ___ clearly looked like the stronger candidates to take forward in developing HIV vaccines. Source: ScientificAmerican.com, by Morrow & Weiner

 

ANSWERS

 

1) trials; 2) microbicide; 3) disease; 4) plasmids; 5) vaccine; 6) immune; 7) cells; 8) adenoviruses

Francis S. Collins and J. Craig Venter

A decade after the human-genome project, biological science is poised on the edge of something wonderful

Ten years ago, on June 26th 2000, a race ended. The result was declared a dead heat, and both runners won the prize of shaking the hand of President Bill Clinton. The runners were J. Craig Venter for the private sector and Francis Collins for the public. The race was to sequence the human genome, all 3 billion genetic letters of it, and, as headline writers put it, to read the book of life.

There was the drama of a maverick upstart, in the form of Dr Venter and his newly created firm, Celera, taking on the medical establishment, in the form of Dr Collins’s International Human Genome Sequencing Consortium. There was the promise of a cornucopia of new drugs as genetic targets previously unknown to biologists succumbed to pharmacological investigation. There was talk of an era of “personalized medicine” in which treatments would be tailored to an individual’s genetic make-up.

As The Economist observed at the time, the race Dr Venter and Dr Collins had been engaged in was a race not to the finish but to the starting line. Moreover, compared with the sprint they had been running in the closing years of the 1990s, the new race marked by that starting line was a marathon. The competitors ran into numerous obstacles. They found at first that there were far fewer genes than they had expected, only to discover later that there were far more. These discoveries changed the meaning of the word “gene”. They found the way genes are switched on and off is at least as important, both biologically and medically, as the composition of those genes. They found that their methods for linking genetic variation to disease were inadequate. And they found, above all, that they did not have enough genomes to work on. Each human genome is different, and that matters.

One by one, however, these obstacles are falling away and the science of biology is being transformed. It seems quite likely that future historians of science will divide biology into the pre- and post-genomic eras. In one way, post-genomic biology, or biology 2.0, has finally killed the idea of vitalism, the persistent belief that to explain how living things work, something more is needed than just an understanding of their physics and chemistry. Just as a computer is fully explained by its hardware and software, so it is with the new biology. The chemicals in a cell are the hardware. The information encoded in the DNA is the preloaded software. The interactions between the cellular chemicals are like the constantly changing states of processing and memory chips. Though understanding the genome has proved more complicated than expected, no discovery made so far suggests anything other than that all the information needed to make a cell is squirreled away in the DNA. Yet the whole is somehow greater than the sum of its parts.

The past few weeks have seen an announcement that may turn out to have been as portentous as the sequencing of the human genome: Dr Venter’s construction of an organism with a completely synthetic genome. The ability to write new genomes in this way brings true biological engineering, as opposed to the tinkering that passes for biotechnology at the moment, a step closer. A second portentous announcement, of the genome of mankind’s closest (albeit extinct) relative, Neanderthal man, shows the power of biology 2.0 in a different way. Putting together some 1.3 billion fragments of 40,000-year-old DNA, contaminated as they were with the fungi and bacteria of millennia of decay and the personal genetic imprints of the dozens of archaeologists who had handled the bones, demonstrates how far the technology of genomics has advanced over the course of the past decade. It also shows that biology 2.0 can solve the other great question besides how life works: how it has evolved and diversified over the course of time.

As is often the way with scientific discovery, technological breakthroughs of the sort that have given science the Neanderthal genome have been as important to the development of genomics as intellectual insights have been. The telescope revolutionized astronomy; the microscope, biology; and the spectroscope, chemistry. The genomic revolution depends on two technological changes. One, in computing power, is generic – though computer-makers are slavering at the amount of data that biology 2.0 will need to process, and the amount of kit that will be needed to do the processing. This torrent of data, however, is the result of the second technological change that is driving genomics, in the power of DNA sequencing.

Eric Lander, the head of the Broad Institute, in Cambridge, Massachusetts, which is America’s largest DNA-sequencing center, calculates that the cost of DNA sequencing at the institute has fallen to a hundred-thousandth of what it was a decade ago. The genome sequenced by the International Human Genome Sequencing Consortium took 13 years and cost $3 billion. Now, using the latest sequencers from Illumina, of San Diego, California, a human genome can be read in eight days at a cost of about $10,000. Another Californian firm, Pacific Biosciences, of Menlo Park, has a technology that can read genomes from single DNA molecules. It thinks that in three years’ time this will be able to map a human genome in 15 minutes for less than $1,000.
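The cost trajectory quoted above can be checked with simple arithmetic. One caveat: the $3 billion consortium figure covers an entire 13-year project rather than a single routine genome, so these are order-of-magnitude comparisons, not like-for-like prices.

```python
# Order-of-magnitude check on the sequencing-cost figures in the text.
# The $3 billion figure is a whole-project cost, so comparisons with
# per-genome prices are approximate at best.
consortium_cost_usd = 3_000_000_000   # public project, ~13 years
illumina_genome_usd = 10_000          # one genome in ~8 days (2010)
projected_genome_usd = 1_000          # Pacific Biosciences' 3-year target

fold_drop_to_2010 = consortium_cost_usd / illumina_genome_usd
fold_drop_projected = consortium_cost_usd / projected_genome_usd

print(f"Project cost vs. 2010 per-genome price: {fold_drop_to_2010:,.0f}x")
print(f"Project cost vs. projected price:       {fold_drop_projected:,.0f}x")
```

A 300,000-fold gap between the project cost and the 2010 per-genome price is broadly consistent with Lander's estimate that the Broad Institute's own sequencing cost fell a hundred-thousandfold over the decade.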

Even before cheap sequencing became available, huge databases were being built up. In alliance with pathology samples, doctors’ notes and – most valuable of all – long-term studies of particular groups of individuals, genetic information can be linked to what biologists refer to as the phenotype. This is an organism’s outward expression: its anatomy, physiology and behavior, whether healthy or pathological. The goal of the new biology is to tie these things together reliably and to understand how the phenotype emerges from the genotype. That will lead to better medical diagnosis and treatment. It will result in the ability to manipulate animals, plants, fungi and bacteria to human ends and it may help to explain the history of life and what it is to be human. Source: by Geoffrey Carr, The Economist (US). Economist Newspaper Ltd. 2010

Daily Hemodialysis Helps Protect Kidney Patients’ Hearts

When there is a loss of about 90% of usual kidney function, either kidney transplantation or dialysis is required. Nearly 400,000 people in the United States and 2 million worldwide are dependent on dialysis. Despite improvements in dialysis technology, new medications and more than 40 years of experience, mortality rates remain high at 18 to 20% a year. Patients experience frequent hospitalizations and reduced health-related quality of life.

Results published online in the New England Journal of Medicine (20 November 2010) showed that frequent hemodialysis improved left ventricular mass (heart size) and self-reported physical health compared to conventional hemodialysis for kidney failure. The trial, called the Frequent Hemodialysis Network (FHN) Daily Trial, was funded by the National Institutes of Health and the Centers for Medicare & Medicaid Services.

The study showed that six hemodialysis treatments per week improved left ventricular mass and physical health compared with the conventional three sessions per week. Frequent hemodialysis was also associated with improved control of high blood pressure and of excess phosphate levels in the blood, a common problem in patients on hemodialysis. There were no significant effects on cognitive performance, self-reported depression, or the use of drugs to treat anemia.

Previous observational data suggested that the dose of hemodialysis correlates directly with patient survival. However, results from the HEMO Study published in 2002, showed no added benefit of increasing the per-treatment dose of hemodialysis in the conventional three times per week method. However, since that time a few small, single-center studies found that the dialysis dose could be greatly increased by adding more dialysis sessions. Those findings led FHN researchers to test the hypothesis that almost daily treatment would improve both objective and subjective, or patient-reported, outcomes.

The FHN Daily Trial involved 245 patients at 10 university and 54 community-based hemodialysis facilities in North America between January 2006 and March 2010. Patients were randomly assigned to receive either conventional hemodialysis three times a week or frequent hemodialysis six times a week.

Patients in the frequent-hemodialysis group averaged 5.2 sessions per week; the weekly standard Kt/Vurea (the product of the urea clearance and the duration of the dialysis session normalized to the volume of distribution of urea) was significantly higher in the frequent-hemodialysis group than in the conventional-hemodialysis group (3.54±0.56 vs. 2.49±0.27). Frequent hemodialysis was associated with significant benefits with respect to both coprimary composite outcomes (hazard ratio for death or increase in left ventricular mass, 0.61; hazard ratio for death or a decrease in the physical-health composite score, 0.70). Patients randomly assigned to frequent hemodialysis were more likely to undergo interventions related to vascular access than were patients assigned to conventional hemodialysis (hazard ratio, 1.71). Frequent hemodialysis was associated with improved control of hypertension and hyperphosphatemia. There were no significant effects of frequent hemodialysis on cognitive performance, self-reported depression, serum albumin concentration, or use of erythropoiesis-stimulating agents.

According to the authors, frequent hemodialysis, as compared with conventional hemodialysis, was associated with favorable results with respect to the composite outcomes of death or change in left ventricular mass and death or change in a physical-health composite score but prompted more frequent interventions related to vascular access.

Pre-Exposure Chemoprophylaxis for HIV Prevention in Gay Men

Antiretroviral chemoprophylaxis before exposure is a promising approach for the prevention of human immunodeficiency virus (HIV) acquisition. To study this approach, a study published in the New England Journal of Medicine (23 November 2010) randomly assigned 2,499 HIV-seronegative men and transgender women who reported having sex with men to receive either a combination of two oral antiretroviral drugs, emtricitabine and tenofovir disoproxil fumarate (FTC-TDF), or placebo once daily. All subjects received HIV testing, risk-reduction counseling, condoms, and management of sexually transmitted infections.

The study subjects were followed for 3,324 person-years (median, 1.2 years; maximum, 2.8 years). Of these subjects, 10 were found to have been infected with HIV at enrollment, and 100 became infected during follow-up (36 in the FTC-TDF group and 64 in the placebo group), indicating a 44% reduction in the incidence of HIV (P=0.005). In the FTC-TDF group, the study drug was detected in 22 of 43 seronegative subjects (51%) and in 3 of 34 HIV-infected subjects (9%) (P<0.001).
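The 44% figure can be reproduced from the raw infection counts, under the simplifying assumption that person-years of follow-up were roughly equal in the two arms (the trial itself reports incidence per person-year, so this is an illustrative calculation, not the trial's actual analysis):

```python
# Illustrative reproduction of the reported 44% reduction in HIV
# incidence from the raw infection counts, assuming the two arms
# accumulated roughly equal follow-up time.
ftc_tdf_infections = 36
placebo_infections = 64

relative_risk = ftc_tdf_infections / placebo_infections
reduction_pct = (1 - relative_risk) * 100

print(f"Relative risk: {relative_risk:.2f}")
print(f"Reduction in HIV incidence: {reduction_pct:.0f}%")
```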

In terms of safety, nausea was reported more frequently during the first 4 weeks in the FTC-TDF group than in the placebo group (P<0.001). The two groups had similar rates of serious adverse events (P=0.57).

According to the authors, oral FTC-TDF provided protection against the acquisition of HIV infection among the subjects and that detectable blood levels strongly correlated with the prophylactic effect.
