Medscape.com, May 2, 2011, by Sue Hughes (Birmingham, Alabama) — A new meta-analysis of studies of aspirin in primary prevention, covering a total of 90,000 subjects, has suggested a 14% reduction in total cardiovascular events (driven by a 19% reduction in nonfatal MI) and nonsignificant reductions in overall mortality and stroke [1].

But clinical-trials guru Dr Sanjay Kaul (Cedars-Sinai Medical Center, Los Angeles, CA) is not overly impressed with these new data and remains skeptical about the role of aspirin in primary prevention.

Kaul commented to heartwire: “The key finding of this meta-analysis is that aspirin use is associated with a statistically significant reduction in nonfatal MI. However, the clinical significance of this finding is not clear, as annualized risk difference or the number-needed-to-treat data are not presented. It is also not clear whether the data analysis included silent MIs identified on ECG examination. It is important to emphasize that, with the exception of one trial (the Thrombosis Prevention Trial), the primary end point was not met in any one of the studies included. For guiding clinical practice, an estimate of benefit/risk is necessary, which these data don’t provide. In addition to significant statistical heterogeneity, there is also clinical heterogeneity (in aspirin dose, treatment duration, concomitant medications, etc) that might impact interpretation of the data. Lack of patient-level data precludes adjustment for variability in critical prognostic covariates and does not permit the more robust time-to-event analysis.”

The latest meta-analysis, published online April 8, 2011 in the American Journal of Cardiology, adds three new studies–the Aspirin for Asymptomatic Atherosclerosis (AAA) trial, the Prevention of Progression of Arterial Disease and Diabetes (POPADAD) trial, and the Japanese Primary Prevention of Atherosclerosis With Aspirin for Diabetes (JPAD) trial–to the previous meta-analysis published by the Antithrombotic Trialists’ Collaboration (ATTC) in 2009.

Lead author Dr Alfred Bartolucci (University of Alabama at Birmingham) told heartwire: “Our results are in line with the ATTC meta-analysis, but we have added more patients, so the results become stronger.”

Meta-Analysis: Predefined End Points

End point Odds ratio (95% CI) p
Total CHD 0.85 (0.69–1.06) 0.154
Nonfatal MI 0.81 (0.67–0.99) 0.042
Total CV events 0.86 (0.80–0.93) 0.001
Stroke 0.92 (0.83–1.02) 0.116
CV mortality 0.96 (0.80–1.14) 0.619
All-cause mortality 0.94 (0.88–1.01) 0.115
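
Kaul’s point about the missing absolute-effect measures is easy to make concrete. Given a control-group event risk and a pooled odds ratio, the implied absolute risk reduction and number needed to treat (NNT) follow directly. The Python sketch below shows the arithmetic; the 0.5% annual baseline risk is a hypothetical figure chosen for illustration, not a number taken from the paper.

```python
def nnt_from_or(control_risk: float, odds_ratio: float) -> float:
    """Number needed to treat implied by an odds ratio,
    given the control-group event risk."""
    control_odds = control_risk / (1.0 - control_risk)
    treated_odds = odds_ratio * control_odds       # the OR rescales the odds
    treated_risk = treated_odds / (1.0 + treated_odds)
    absolute_risk_reduction = control_risk - treated_risk
    return 1.0 / absolute_risk_reduction

# Hypothetical 0.5% annual nonfatal-MI risk combined with the
# meta-analysis odds ratio of 0.81 for nonfatal MI:
print(round(nnt_from_or(0.005, 0.81)))  # about 1,057 treated for a year to prevent one MI
```

At event rates this low the odds ratio approximates the relative risk, and under these assumptions roughly a thousand people would need a year of treatment to prevent a single event despite a statistically significant 19% relative reduction, which is precisely the benefit/risk ambiguity Kaul highlights.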

But What About the Bleeding Risk?

Bartolucci was reluctant to be drawn into the controversy that has been simmering recently over claims that recommendations for the use of aspirin in primary prevention are too enthusiastic. This has been based on concerns about the bleeding risk associated with taking daily aspirin. In particular, the ATTC investigators found that the same people who would derive the greatest benefit (those at higher risk of heart disease) are also at higher bleeding risk with aspirin. This led them to conclude that “there is not good evidence of substantial benefit that outweighs risk enough to justify a public policy recommending routine use in primary prevention.”

The current meta-analysis did not show a significant increase in GI bleeding with aspirin, but Bartolucci said he would not recommend that people take aspirin for primary prevention if they were at risk of GI bleeding.

The GI bleeding rate with aspirin varied from 0.3% to 4.5% across the nine included studies, a variation Bartolucci attributed to differences in patient populations.

He noted that many of these studies included people who were not at especially high risk of heart disease. “Many of these studies just included normal middle-aged individuals. Some had risk factors, some did not.”

Bartolucci was not eager to make recommendations based on his data. “We’re just reporting the data. People have to interpret it for themselves,” he told heartwire. “Our bottom-line results apply to the average person. But not everyone fits into the average.”

When pushed, he said he thought there was a major role for aspirin in primary prevention but that people needed to consult with their doctor to make sure they could tolerate it. He added: “The benefit of aspirin in primary prevention is still there. And we are now getting more knowledge of how different groups of individuals, such as diabetics, are at risk of CHD.”

But Kaul disagreed. “In my opinion, the proper use of a meta-analysis is not only to harness power from inadequately powered individual trials to derive a pooled estimate of treatment effect but also to identify consistency of treatment effects across important subgroups. I also do not consider a meta-analytic p value of <0.05 to provide strong evidence,” he told heartwire.

Kaul added: “Positive secondary end points have typically (and generously) been used to support aspirin recommendations by professional societies. However, the FDA continues (rightly so) not to endorse aspirin for primary prevention, even without considering the contemporary negative trials.”

This study was supported by an unrestricted research grant from Bayer HealthCare AG.

References

  1. Bartolucci AA, Tendera M, and Howard G. Meta-analysis of multiple primary prevention trials of cardiovascular events using aspirin. Am J Cardiol 2011; DOI:10.1016/j.amjcard.2011.02.325. Available at: http://www.ajconline.org.

(American Journal of Cardiology)

ChicagoTribune.com, May 2, 2011, by Bruce Japsen — Price of Kaletra — used in combination therapy — will drop 8 percent, to nearly $5,000 per year

Abbott Laboratories this week reduced the price of its popular AIDS drug Kaletra for some customers.

The move, disclosed Friday during the company’s annual shareholders meeting, comes amid reductions in government spending on programs for low-income Americans with HIV.

Cash-strapped states such as Illinois have curtailed eligibility for people enrolled in AIDS drug assistance programs, which also receive federal funds. Meanwhile, there has been an influx of applicants for AIDS drug assistance programs as people have lost jobs and their ability to pay for HIV prescriptions.

Starting in July, the Illinois Health Department will restrict the state’s AIDS Drug Assistance Program to “new applicants with incomes at or below 300 percent of the federal poverty level,” or $32,670 for a single individual. Currently, the qualification for the program is 500 percent of the federal poverty level, or $54,450 for a single person.

North Chicago-based Abbott on Friday said it reduced the price most AIDS drug assistance programs will pay for Kaletra by 8 percent, to $5,037 per year. Kaletra is a protease inhibitor, a key ingredient in the so-called cocktails of medicines HIV patients take to keep the virus in check.

More than 7,700 patients across the country are on waiting lists for drug assistance programs, the AIDS Healthcare Foundation said, citing state and federal records.

Miles White, Abbott chairman and chief executive, said the company has not raised the price of Kaletra since 2007, while some companies have increased prices on their AIDS drugs 5 to 6 percent annually.

CBSMoneyWatch.com, GoogleNews.com, May 2, 2011, WASHINGTON (AP) — Abbott Laboratories said Friday that it received U.S. regulatory approval for a new, more potent formulation of its testosterone gel.

The gel is approved for men with hypogonadism or low testosterone, a condition associated with fatigue, depression and various sexual dysfunctions.

The new Androgel 1.62 percent formula delivers 40.5 milligrams of testosterone in two pumps of the canister; the older Androgel 1 percent formulation delivered 50 milligrams in four pumps. The two formulations are not interchangeable, and both require a prescription.

The Food and Drug Administration approved the new formula based on a study showing 78 percent of men using the gel had normal testosterone levels after one year of use.

Drug companies are increasingly targeting low testosterone as a new market opportunity. About 14 million American men are believed to have irregularly low testosterone levels, though only 1.3 million are being treated, according to one industry estimate. The exact science is not completely understood, but researchers believe low testosterone levels can lead to a loss of energy, lower libido and osteoporosis.

Analysts estimate the current U.S. market for testosterone therapies at $1.1 billion. The market size has increased 23 percent since 2005, according to IMS Health, a health data firm.

Androgel is one of the oldest testosterone replacement products on the market, first launched in 2000. Last year Eli Lilly & Co. and Endo Pharmaceuticals both received approval for similar products. Lilly’s Axiron solution is applied under the arms, while Endo’s Fortesta is applied to the inner thighs. Other testosterone products, including Androgel, are applied to the chest.

The FDA has placed warnings on all testosterone gels about the risks if the formulas rub off on children. The warning reads: “signs of puberty that are not expected have happened in young children who were accidentally exposed to testosterone through contact with men using topical testosterone products.”

North Chicago-based Abbott Laboratories said it expects to launch Androgel 1.62 percent in the second quarter of 2011.

 

BaltimoreSun.com, May 2, 2011  —  In an effort to find new therapies for 6,000 rare diseases, National Institutes of Health researchers are screening drugs approved for other uses. They’re hoping to find off-label uses for the diseases afflicting some 25 million Americans.

“This is a critical step to explore the full potential of these drugs for new applications,” said Dr. Francis S. Collins, NIH director, in a statement. “The hope is that this process may identify some potential new treatments for rare and neglected diseases.”

The research is being coordinated by the NIH’s Chemical Genomics Center and uses information on 27,000 active drug ingredients included in the center’s publicly available pharmaceutical collection browser. It also includes 2,750 small molecule drugs with regulatory approval in the United States, Canada, Europe and Japan as well as those registered for human clinical trials. The center hopes to add more compounds.

For now, the focus is on collaboration with disease foundations, industry and academic investigators who can test the limited amounts of the compounds in the database. New clinical trials, followed by U.S. Food and Drug Administration approval, would be needed before the drugs could be used against the rare, and neglected, diseases.

The cost of drug development is so high that therapies currently exist for fewer than 300 rare diseases. But because these drugs have already been vetted in large populations, the hope is that their uses can be expanded, and there are already a few cases of new uses being found for existing drugs.

May 2, 2011, by Ryan McBride. FierceBiotechIT.com updates senior biotech, pharma, and IT leaders on how IT advances are shaping clinical trials and clinical research.

On a recent trip to Boston I met several entrepreneurs who are pushing the envelope on the role of computers in discovering new drugs or molecular targets for treatments. Meeting them inspired me to compile a short list of online resources for biological research, all of which are publicly available with limited restrictions on use.

One of the key takeaways from my conversations with those entrepreneurs is that biological data–from such sources as blood assays and DNA sequencers–are a key ingredient in using information technologies to glean new insights about diseases. And while publicly available repositories of biological data are nothing new, such resources are becoming increasingly important as we attempt to store and analyze all the newly available biological data.

This isn’t meant to be an exhaustive list of online research tools. In fact, it’s highly likely that the stewards of a particular resource not mentioned here weren’t able to reply to my questions before the deadline for this report. Please e-mail me with your feedback (including any open source tools for analyzing biological data), as I’d like to update this list or publish a separate list some time soon.

1. Allen Human Brain Atlas
2. caBIG
3. Complete Genomics’ Public Genomic Repository
4. ENCODE
5. MicrobesOnline

1) Allen Human Brain Atlas

Brain disorders pose major challenges to researchers, in part because of the complexity of the central nervous system and the lack of good models of disease. Now there’s an online navigation system of sorts to help guide researchers in studying the anatomy and biochemistry of the human brain–the Allen Human Brain Atlas.

A product of the Seattle-based Allen Institute for Brain Science, the human brain atlas is the first resource to map all known anatomical sites of the human brain with data on the genes that are active at each site. The institute, which was founded in 2003 and is partially funded by Microsoft ($MSFT) co-founder Paul G. Allen, first released data from the project in October 2010. About 4,000 researchers per month now access the vast amounts of data from the online resource, which became fully available with data from two adult human brains in March.

In a way, the atlas is a foe to brain disorders–including Alzheimer’s disease, depression and Parkinson’s disease, among many others–and an ally to the scientists researching them. By knowing which specific genes are active in the brain and where they are expressed, drug developers might better understand where drugs could have an intended effect or where they could cause harm. This might help reduce the high rate of drug trial failures involving patients with CNS diseases.
http://www.brain-map.org/

2) caBIG (cancer Biomedical Informatics Grid)

May 2, 2011 — by Ryan McBride

There are huge amounts of data from worldwide cancer research. So much, perhaps, that individual research sites would have a tough time managing it all on their own. With caBIG, however, there are plenty of ways to share and easily access such data without the burden of keeping it all in-house.

The information technology network provides researchers all over the country with free access to numerous collections of biological data as well as tools to manage and analyze the information. Those available data include studies from the Cancer Genome Atlas project, more than 3 million medical images and more than 2 million biological specimens. The program, founded in 2004, is paid for by the National Cancer Institute Center for Biomedical Informatics.

While caBIG has its sights set on even broader adoption, more than 75 organizations are already connected to its information grid for sharing data among researchers. caBIG has also played a role in many studies and more than 300 scientific publications. The bottom line is that the program addresses a huge need among researchers for easy access to key data on diseases to aid many types of studies. Call it big data versus the big “C.”

https://wiki.nci.nih.gov/download/attachments/24271074/Intro_GridTech_DataSharing.pdf

3) Complete Genomics’ Public Genomic Repository

May 2, 2011 — by Ryan McBride

It’s tough to find a larger collection of publicly available human genome sequences than Mountain View, CA-based Complete Genomics’ Public Genomic Repository. The firm ($GNOM) says that it knows of no larger repository of its kind.

There are datasets from 69 fully sequenced genomes on the firm’s website, where the company made the first 40 genomes available for free on Feb. 3. The repository features genomes from multiple generations and ethnicities. And academic researchers from prestigious institutions–including Stanford University and the University of Illinois–are among those who have downloaded more than 30 terabytes of data from the website.

Those are impressive numbers, for sure. But what really makes the wide availability of this data exciting is its potential impact on human health. With data from sequenced genomes of multiple branches of a family tree, for example, researchers can do the types of analyses that can help identify potential disease genes, according to the company. And since the data are available to all for free, researchers anywhere can download it and make it their own. Still, the genomic data are just the starting point for many phases of experiments and investigations needed to lead to key discoveries about the genetic underpinnings of disease.

http://www.completegenomics.com/sequence-data/download-data/

4) ENCODE

Fast and cheap genomic sequencing has brought us an unprecedented amount of data on human DNA. But what does it mean? The Encyclopedia of DNA Elements (ENCODE) is helping researchers and biology classrooms tackle this big question.

The ENCODE resource (which was recently described in detail in PLoS Biology) includes a large database with information about certain regions of the human genome, specific functions of genes, and RNA transcripts, among other genomic elements. There are also free software tools and algorithms for analyzing genomic data, courtesy of researchers who have contributed mightily to the international project. The National Institutes of Health’s National Human Genome Research Institute is funding the effort.

Perhaps there needs to be a Wikipedia of sorts for the human genome. Genomic discoveries are coming fast and furious. An open-access environment can make these discoveries widely available to researchers, and the ENCODE team consists of many leading scientists who are helping to make sure that the resource provides reliable data and tools.

http://www.genome.gov/10005107

5) MicrobesOnline

May 2, 2011 — by Ryan McBride

MicrobesOnline reminds us of how badly outnumbered we humans are on a planet populated by an unfathomable number of microorganisms. The website provides access to a huge database of microbial genomes and tools for comparative analysis of genomes, among other things.

The Virtual Institute of Microbial Stress and Survival, based at the Lawrence Berkeley National Laboratory, provides MicrobesOnline. Since its beginnings in 2003, the online resource has expanded considerably in terms of the number of genomes and analysis tools available to the public. The website says it has more than 3,000 genomes of various bacteria, fungi and other microorganisms. Some of the useful features for researchers include a comparative genome browser and a search tool for finding out about the roles of certain genes in microbial metabolism.

It’s no coincidence that the U.S. Department of Energy has funded this effort. Understanding microbes could play a key role in the country’s energy future, and many companies have already been using microorganisms in processes to make ethanol and industrial chemicals from renewable sources. And with efforts like the Human Microbiome Project, supported by the National Institutes of Health, we’re improving our understanding about how microbes in the human body can impact our health.

http://www.microbesonline.org/

http://www.microbesonline.org/mo_siteguide_tutorial.pdf

TheHill.com, May 2, 2011, by Sam Baker  —  States could save money on Medicaid benefits for prescription drugs through more aggressive price negotiations and patient tracking, according to a new report from the National Center for Policy Analysis. The report also suggests increasing the use of generic drugs and mail-order pharmacies.

The NCPA found wide variances in what state Medicaid programs pay for prescription drugs and also in the fees they pay to pharmacies. Some Medicaid programs pay higher fees to pharmacies than Medicare drug plans that operate in the same state. “State officials and state legislatures often yield to political pressure and set dispensing fees for conventional Medicaid programs that are often above (or below) what the market would normally compensate pharmacies,” the report says.

The paper also recommends better tracking of people who fill Medicaid prescriptions, which it says would help root out abuse and fraud. And it lays out a series of consumer-driven proposals states could consider, such as special savings accounts for health care and encouraging the use of mail-order pharmacies for patients with chronic conditions.

A tale of two lakes: Paul (the reference lake) is the smaller lake; Peter (the manipulated lake) is in the background. (Credit: Steve Carpenter)

Researchers eavesdropping on complex signals emanating from a remote Wisconsin lake have detected what they say is an unmistakable warning — a death knell — of the impending collapse of the lake’s aquatic ecosystem. Researchers have found that models used to assess catastrophic changes in economic and medical systems can also predict environmental collapse. Stock market crashes, epileptic seizures, and ecological breakdowns are all preceded by a measurable increase in variance—be it fluctuations in brain waves, the Dow Jones index, or, in the case of the Wisconsin lake, chlorophyll.

The finding, reported April 29 in the journal Science by a team of researchers led by Stephen Carpenter, a limnologist at the University of Wisconsin-Madison, is the first experimental evidence that radical change in an ecosystem can be detected in advance, possibly in time to prevent ecological catastrophe.

“For a long time, ecologists thought these changes couldn’t be predicted,” says Carpenter, a UW-Madison professor of zoology and one of the world’s foremost ecologists. “But we’ve now shown that they can be foreseen. The early warning is clear. It is a strong signal.”

The implications of the National Science Foundation-supported study are big, says Carpenter. They suggest that, with the right kind of monitoring, it may be possible to track the vital signs of any ecosystem and intervene in time to prevent what is often irreversible damage to the environment.

“With more work, this could revolutionize ecosystem management,” Carpenter avers. “The concept has now been validated in a field experiment and the fact that it worked in this lake opens the door to testing it in rangelands, forests and marine ecosystems.”

Ecosystems often change in radical ways. Lakes, forests, rangelands, coral reefs and many other ecosystems are often transformed by such things as overfishing, insect pests, chemical changes in the environment, overgrazing and shifting climate.

For humans, ecosystem change can impact economies and livelihoods, as when forests succumb to an insect pest, rangelands to overgrazing, or fisheries to overexploitation.

A vivid example of a collapsed resource is the Atlantic cod fishery. Once the most abundant and sought-after fish in the North Atlantic, cod stocks collapsed in the 1990s due to overfishing, causing widespread economic hardship in New England and Canada. Now, the ability to detect when an ecosystem is approaching the tipping point could help prevent such calamities.

In the new study, the Wisconsin researchers, collaborating with groups from the Cary Institute of Ecosystem Studies in Millbrook, N.Y., the University of Virginia in Charlottesville, and St. Norbert College in De Pere, Wis., focused their attention on Peter and Paul lakes, two isolated and undeveloped lakes in northern Wisconsin. Peter is a six-acre lake whose biota were manipulated for the study; nearby Paul served as a control.

The group led by Carpenter experimentally manipulated Peter Lake during a three-year period by gradually adding predatory largemouth bass to the lake, which was previously dominated by small fish that consumed water fleas, a type of zooplankton. The purpose, Carpenter notes, was to destabilize the lake’s food web to the point where it would become an ecosystem dominated by large predators. In the process, the researchers expected to see a relatively rapid cascading change in the lake’s biological community, one that would affect all of its plants and animals in significant ways.

“We started adding these big ferocious fish and almost immediately this instills fear in the other fish,” Carpenter explains. “The small fish begin to sense there is trouble and they stop going into the open water and instead hang around the shore and structure, things like sunken logs. They become risk averse.”

The biological upshot, according to the Wisconsin lake expert, was that the lake became “water flea heaven,” and the phytoplankton, the preferred food of the lake’s water fleas, became highly variable.

“The phytoplankton get hammered and at some point the system snaps into a new mode,” says Carpenter.

Throughout the lake’s three-year manipulation, all its chemical, biological and physical vital signs were continuously monitored to track even the smallest changes that would announce what ecologists call a “regime shift,” where an ecosystem undergoes radical and rapid change from one type to another. It was in these massive sets of data that Carpenter and his colleagues were able to detect the signals of the ecosystem’s impending collapse.

Ecologists first discovered the signals in computer simulations of spruce budworm outbreaks. Every few decades the insect’s populations explode, causing widespread deforestation in boreal forests in Canada. Computer models of a virtual outbreak, however, seemed to undergo odd blips just before an outbreak.

The problem was solved by William “Buz” Brock, a UW-Madison professor of economics who for decades has worked on the mathematical connections of economics and ecology. Brock used a branch of applied mathematics known as bifurcation theory to show that the odd behavior was in fact an early warning of catastrophic change. In short, he devised a way to sense the transformation of an ecosystem by detecting subtle changes in the system’s natural patterns of variability.
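
The approach is straightforward to demonstrate. Below is a minimal Python sketch, written for illustration and not drawn from the study's own code or data: a simple stochastic model is pushed slowly toward a fold bifurcation, and because recovery from small perturbations slows near the tipping point ("critical slowing down"), the rolling variance of the simulated series climbs well before the collapse.

```python
import numpy as np

# Illustrative model, not the study's: dx = (a - x^2) dt + sigma dW.
# The stable state sits at x* = sqrt(a); the fold bifurcation is at a = 0.
rng = np.random.default_rng(42)
dt, sigma, steps = 0.01, 0.05, 50_000
a_values = np.linspace(1.0, 0.05, steps)  # control parameter drifts toward the fold

x = 1.0
series = np.empty(steps)
for i, a in enumerate(a_values):
    # Euler-Maruyama integration step
    x += (a - x * x) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    series[i] = x

# Early-warning indicator: variance in non-overlapping windows.
window = 2_500
variances = series.reshape(-1, window).var(axis=1)
print(np.round(variances, 5))  # the variance climbs as the tipping point nears
```

In a real monitoring program the series would be detrended first, and complementary indicators such as rising lag-1 autocorrelation would typically be checked alongside the variance.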

The upshot of the Peter Lake field experiment, says Carpenter, is a validated statistical early warning system for ecosystem collapse. The catch, however, is that for the early warning system to work, intense and continuous monitoring of an ecosystem’s chemistry, physical properties and biota is required.

Such an approach may not be practical for every threatened ecosystem, says Carpenter, but he also cites the price of doing nothing: “These regime shifts tend to be hard to reverse. It is like a runaway train once it gets going and the costs — both ecological and economic — are high.”


Journal Reference:

  1. Carpenter SR, Cole JJ, Pace ML, Batt R, Brock WA, Cline T, Coloso J, Hodgson JR, Kitchell JF, Seekell DA, Smith L, and Weidel B. Early warnings of regime shifts: a whole-ecosystem experiment. Science, 28 April 2011; DOI:10.1126/science.1203672.