Study Shows Additional Benefits of Progesterone in Reducing Preterm Birth Risk



Pregnant women with a short cervix are at increased risk of delivering early. The cervix is the part of the uterus that shortens and opens during labor for the infant to pass through. Preterm infants, born three weeks or more before a full 40-week term, are at increased risk for death in the first year of life, as well as for breathing difficulties, cerebral palsy, learning disabilities, blindness and deafness. A previous NIH study had indicated that progesterone, a naturally occurring hormone, was effective in reducing the preterm birth rate.


According to a study published online in the American Journal of Obstetrics and Gynecology (12 December 2011), an analysis of five previous studies has uncovered additional evidence of the effectiveness of progesterone in reducing the rate of preterm birth among a high-risk category of women.


The current study is a meta-analysis, which is a statistical technique that combines the data from several studies addressing a related research question. By combining information from the five studies, results from treatment of 775 women became available. By comparing women who received progesterone treatment with those who did not, the authors separately calculated the rate of preterm delivery at each week of gestation.
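To make the mechanics of such a meta-analysis concrete, here is a minimal sketch of fixed-effect (inverse-variance) pooling of risk ratios, a standard technique for combining studies like these. The counts below are made up for illustration; they are not the data from the five trials, and the paper's authors may have used a different model (e.g., random effects).

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effect (inverse-variance) pooling of risk ratios.

    Each study is (events_treated, n_treated, events_control, n_control).
    Returns the pooled risk ratio and its approximate 95% confidence interval.
    """
    weights, log_rrs = [], []
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        # Approximate variance of log(RR) for a 2x2 table
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2
        weights.append(1 / var)
        log_rrs.append(math.log(rr))
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    lo, hi = pooled_log - 1.96 * se, pooled_log + 1.96 * se
    return math.exp(pooled_log), (math.exp(lo), math.exp(hi))

# Hypothetical counts (preterm births / women enrolled), NOT the trial data:
studies = [(8, 100, 16, 100), (5, 60, 11, 62), (12, 150, 22, 145)]
rr, ci = pooled_risk_ratio(studies)
```

Pooling on the log scale keeps the weighting symmetric for risk reductions and increases; a pooled risk ratio below 1 with a confidence interval excluding 1 is what "substantially reduced risk" means quantitatively.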


Results showed that progesterone, in the doses of 90 to 200 milligrams per day tested in the previous studies, substantially reduced the risk of delivery in the 27th to 34th weeks of gestation. For example, progesterone reduced preterm delivery before week 28 by half. The study also concluded that even when the mother delivers before full term, progesterone treatment can reduce the likelihood that the infant will die (by 43%), have respiratory distress syndrome (by 52%), weigh less than 3.5 pounds (by 45%), be admitted for intensive care (by 25%), or require mechanical ventilation (by 34%).


Based on their findings, the authors recommended that pregnant patients be screened with ultrasound of the cervix routinely at 19 to 24 weeks of gestation. If a short cervix (10 to 20 millimeters) is detected, the authors recommended treatment with 90 mg per day of progesterone between weeks 20 and 37.

TARGET HEALTH excels in Regulatory Affairs. Each week we highlight new information in this challenging area.



FDA Approves Mechanical Cardiac Assist Device for Children with Heart Failure



Heart failure in children is much less common than in adults. Heart transplantation offers effective relief from symptoms. However, far fewer pediatric-sized donor hearts are available for transplantation than for adults, limiting the use of heart transplantation in children and prolonging the waiting period until transplant can occur. In infants, the median waiting time for a donor heart is 119 days. Overall, a reported 12%-17% of children and 23% of infants die while on the waiting list for a heart transplant.


The FDA has approved a medical device that supports the weakened heart of children with heart failure, helping keep them alive until a donor heart for transplant can be found. The mechanical pulsatile cardiac assist device, called the EXCOR Pediatric System, is made by a German company, Berlin Heart. The device comes in graduated sizes to fit children from newborns to teens. It consists of one or two external pneumatic (air-driven) blood pumps, multiple tubes connecting the blood pumps to the heart chambers and the great arteries, and a driving unit.


In the primary U.S. study group of 48 patients, use of the device was found to improve survival to transplant compared with extracorporeal membrane oxygenation (ECMO), which is the current standard of care although it is not FDA-approved for this use. Stroke, which can cause serious brain deficits, is a risk of the EXCOR Pediatric System.


The EXCOR was designated as a Humanitarian Use Device (HUD) by the Office of Orphan Products Development at the FDA. This designation is for medical devices intended to benefit patients in the treatment or diagnosis of a disease or condition that affects fewer than 4,000 individuals in the United States annually. The device was approved under a Humanitarian Device Exemption (HDE), a type of marketing application similar to a premarket approval application in that the level of safety required for approval is the same. Rather than having to show a reasonable assurance of effectiveness, devices submitted under the HDE marketing route must show that the probable benefit from use of the device outweighs the probable risk of illness or injury from its use. FDA approval of an HDE authorizes the applicant to market the device subject to certain use restrictions. Following passage of the Pediatric Medical Device Safety and Improvement Act of 2007, HUDs intended and labeled for use in a pediatric population may be marketed for profit.


The FDA’s Orphan Products Grant Program supported the U.S. clinical trials for the EXCOR Pediatric System with grants of $400,000 per year for three years. For more information: Designating Humanitarian Use Devices

Lack of Alignment



By Mark L. Horn, MD, MPH, Chief Medical Officer, Target Health Inc.


A creative and provocative paper (Allan Detsky, What Patients Really Want From Health Care, JAMA, 14 December 2011), published just this week, ought to give policymakers pause (or at the very least some food for thought).


The author confronts, in an unusually straightforward and, to some, likely “politically incorrect” manner, a serious dilemma for health care policymakers. The piece identifies a significant disconnect between what patients actually want from the health care system and the goals of policymakers who, however well intentioned, may be designing a system with, from patients’ perspectives at least, misplaced priorities. Along the way, some key shibboleths of conventional wisdom are seriously challenged (always fun). If the author is correct, health care policymakers across the political and philosophical spectrums are potentially seriously at odds with the public. The implications of this disconnect are numerous and potentially serious.


Patients, when ill, evidently want rapid and easy access to a caring physician able to quickly initiate effective interventions with (no surprise here) minimal out-of-pocket costs; they are much less concerned with interacting with the system regularly to stay healthy or with a broad redesign of the system to focus on prevention. While these issues are of special concern to the author, they are apparently not a major priority for patients. Finally, regarding economics, for insured patients the cost of their individual (covered) care is not an issue, nor is the nation’s overall rising cost of care of great concern. Patients want easy and rapid access to top-of-the-line medicine when they are sick and are willing to accept that this requires a progressively larger commitment of societal resources.


Although this is a somewhat unusual paper whose takeaways will likely vary with individuals’ political, economic, and social philosophies, there are some “generic” lessons as well.


Without venturing into the obviously dicey areas, it seems safe to conclude that these dichotomies in objectives between the public and policymakers imply either that system reform will receive suboptimal public enthusiasm, or that meaningful market research, educational, and communication efforts will be required to understand what the public wants, help them understand whether their desires are realistic, and engage them more productively in rational system redesign.


This is necessarily a tall order; however, even the possibility that, after all the rancorous discussions and debates, the consequent policy changes, system interventions, and enormous resource allocations, this disconnect with the public exists (i.e., the possibility that the paper is correct) suggests a potentially serious flaw in our processes.


We need to confirm the accuracy of this diagnosis quickly and prescribe the proper therapy.


However, in addressing preventive medicine, beyond any conflicts with the health habits and behavior of our citizens, there are other huge health risks outside individual control, such as air and water pollution, food deserts, and the seductions of the junk food revolution.

Target Health, a full service e*CRO, is committed to serving the pharmaceutical community through knowledge, experience, technology and connectivity. Target Health strives to optimize the life cycle of drugs, biologics and devices with expertise, leadership, innovation and teamwork. Target Health Inc. has full-time staff dedicated to all aspects of Regulatory Affairs, Clinical Research, Biostatistics, Data Management, Strategic Planning and Drug and Device Development. Target Health is committed to the paperless clinical trial and has developed a full suite of eClinical Trial software including:

1) Target e*CRF® (EDC Made Simple)

2) Target e*CTMS™

3) Target Document®

4) Target Encoder®

5) Target e*Pharmacovigilance™

6) Target e*Monitoring™

7) Target Newsletter®

8) Target e*CTR™ (eSource, electronic medical record for clinical trials).

Target Health’s Pharmaceutical Advisory Dream Team assists companies in strategic planning from Discovery to Market Launch. Let us help you on your next project.


261 Madison Avenue
24th Floor
New York, NY 10016
Phone: (212) 681-2100; Fax (212) 681-2105
Ms Joyce Hays, CEO
Dr. Jules T. Mitchel, President

©2011 Target Health Inc. All rights reserved

CERN:   A simulation of a particle collision inside the Large Hadron Collider. When two protons collide inside the machine, they create an energetic explosion that gives rise to new and exotic particles — including, perhaps, the Higgs boson.



Dear All, we’re posting these graphics and articles because the Higgs boson was so much in the news this past week and we like to keep our bloggers well informed. We follow our own curiosity, and whatever we’re reading up on each day, we share. Now, we’re trying to understand the basic concepts of what’s going on at CERN. Enjoy, and send comments, as usual. This kind of science is, for sure, rich material for a discussion of science, philosophy, religion and the arts. We’re not yet quite sure why anyone (except media and PR) would nickname the Higgs boson “The God Particle.” Maybe it’s a reference to Einstein’s “mind of God.” We just have to keep reading.


A simulated event, featuring the appearance of the Higgs boson


The Higgs boson, named after Peter Higgs, a physicist at the University of Edinburgh and one of the people who theorized its existence, is the last fundamental piece of the Standard Model that has yet to be observed. It plays a crucial role in generating a sort of cosmic molasses filling the universe that creates mass in other particles.

The Higgs boson is a hypothetical massive elementary particle that is predicted to exist by the Standard Model (SM) of particle physics. The Higgs boson is an integral part of the theoretical Higgs mechanism. If shown to exist, it would help explain why other particles can have mass. It is the only elementary particle predicted by the Standard Model that has not yet been observed in particle physics experiments. Theories that do not need the Higgs boson also exist and would be considered if the existence of the Higgs boson were ruled out. They are described as Higgsless models.

If shown to exist, the Higgs mechanism would explain why the W and Z bosons, which mediate weak interactions, are massive whereas the related photon, which mediates electromagnetism, is massless. The Higgs boson is expected to be in a class of particles known as scalar bosons. (Bosons are particles with integer spin, and scalar bosons have spin 0.)

Experiments attempting to find the particle are currently being performed using the Large Hadron Collider (LHC) at CERN, and were performed at Fermilab's Tevatron until its closure in late 2011. Some theories suggest that any mechanism capable of generating the masses of elementary particles must become visible at energies above 1.4 TeV; therefore, the LHC (colliding two 3.5 TeV beams) is expected to be able to provide experimental evidence of the existence or non-existence of the Higgs boson.

On 12 December 2011, the ATLAS collaboration at the LHC found that a Higgs mass in the range from 145 to 206 GeV/c2 was excluded at the 95% confidence level. On 13 December 2011, experimental results were announced from the ATLAS and CMS experiments, indicating that if the Higgs boson exists, its mass is limited to the range 116–130 GeV (ATLAS) or 115–127 GeV (CMS), with other masses excluded at 95% confidence level. Observed excesses of events at around 124 GeV (CMS) and 125–126 GeV (ATLAS) are consistent with the presence of a Higgs boson signal, but also consistent with fluctuations in the background. The global statistical significances of the excesses are 1.9 sigma (CMS) and 2.6 sigma (ATLAS) after correction for the look-elsewhere effect. As of 13 December 2011, a combined result is not available.

The particle is sometimes called the God particle, a title deplored by some scientists as a media hyperbole that misleads readers.


Origin of the theory


Five of the six 2010 APS J.J. Sakurai Prize winners. From left to right: Kibble, Guralnik, Hagen, Englert, and Brout.



The sixth of the 2010 APS J.J. Sakurai Prize winners: Peter Higgs



The Higgs mechanism is a process by which vector bosons can acquire mass. It was proposed in 1964 independently and almost simultaneously by three groups of physicists: François Englert and Robert Brout; Peter Higgs (inspired by ideas of Philip Anderson); and Gerald Guralnik, C. R. Hagen, and Tom Kibble.

The three papers written on this discovery were each recognized as milestone papers during Physical Review Letters’ 50th anniversary celebration. While each of these famous papers took similar approaches, the contributions of and differences between the 1964 PRL symmetry-breaking papers are noteworthy. These six physicists were also awarded the 2010 J. J. Sakurai Prize for Theoretical Particle Physics for this work.

The 1964 PRL papers by Higgs and by Guralnik, Hagen, and Kibble (GHK) both displayed equations for the field that would eventually become known as the Higgs boson. In the paper by Higgs the boson is massive, and in a closing sentence Higgs writes that “an essential feature” of the theory “is the prediction of incomplete multiplets of scalar and vector bosons”. In the model described in the GHK paper the boson is massless and decoupled from the massive states. In recent reviews of the topic, Guralnik states that in the GHK model the boson is massless only in a lowest-order approximation, but it is not subject to any constraint and it acquires mass at higher orders. Additionally, he states that the GHK paper was the only one to show that there are no massless Nambu-Goldstone bosons in the model and to give a complete analysis of the general Higgs mechanism. Following the publication of the 1964 PRL papers, the properties of the model were further discussed by Guralnik in 1965 and by Higgs in 1966.

Steven Weinberg and Abdus Salam were the first to apply the Higgs mechanism to the electroweak symmetry breaking. The Higgs mechanism not only explains how the electroweak vector bosons get a mass, but predicts the ratio between the W boson and Z boson masses as well as their couplings with each other and with the Standard Model quarks and leptons. Many of these predictions have been verified by precise measurements performed at the LEP and the SLC colliders, thus confirming that the Higgs mechanism takes place in nature.

The Higgs boson’s existence is not a strictly necessary consequence of the Higgs mechanism: the Higgs boson exists in some but not all theories which use the Higgs mechanism. For example, the Higgs boson exists in the Standard Model and the Minimal Supersymmetric Standard Model yet is not expected to exist in Higgsless models, such as Technicolor. A goal of the LHC and Tevatron experiments is to distinguish among these models and determine if the Higgs boson exists or not.


Theoretical overview



Summary of interactions between particles described by the Standard Model.



A one-loop Feynman diagram of the first-order correction to the Higgs mass. The Higgs boson couples strongly to the top quark so it might decay into top–anti-top quark pairs if it were heavy enough.

The Higgs boson particle is a quantum of the theoretical Higgs field. In empty space, the Higgs field has an amplitude different from zero; i.e. a non-zero vacuum expectation value. The existence of this non-zero vacuum expectation plays a fundamental role; it gives mass to every elementary particle that couples to the Higgs field, including the Higgs boson itself. The acquisition of a non-zero vacuum expectation value spontaneously breaks electroweak gauge symmetry. This is the Higgs mechanism, which is the simplest process capable of giving mass to the gauge bosons while remaining compatible with gauge theories. This field is analogous to a pool of molasses that “sticks” to the otherwise massless fundamental particles that travel through the field, converting them into particles with mass that form (for example) the components of atoms.
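For readers who want the underlying mathematics: the "non-zero vacuum expectation value" described above comes from the standard "Mexican hat" form of the Higgs potential. The equations below are the textbook expressions, added here for illustration rather than taken from this article:

```latex
% Standard Model Higgs potential (textbook form):
V(\phi) = \mu^{2}\,\phi^{\dagger}\phi + \lambda\,(\phi^{\dagger}\phi)^{2},
\qquad \mu^{2} < 0,\ \lambda > 0 .
% Because \mu^{2} < 0, the minimum lies away from \phi = 0, giving the
% non-zero vacuum expectation value v and the Higgs boson mass:
v = \sqrt{-\mu^{2}/\lambda} \approx 246~\mathrm{GeV},
\qquad m_{H} = \sqrt{2\lambda}\,v = \sqrt{-2\mu^{2}} .
```

The W and Z masses are then proportional to v times the gauge couplings, which is why v is pinned down by measured quantities while the Higgs mass itself (set by the unknown coupling λ) is not predicted by the Standard Model.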

In the Standard Model, the Higgs field consists of two neutral and two charged component fields. Both of the charged components and one of the neutral fields are Goldstone bosons, which act as the longitudinal third-polarization components of the massive W+, W−, and Z bosons. The quantum of the remaining neutral component corresponds to the massive Higgs boson. Since the Higgs field is a scalar field, the Higgs boson has no spin, hence no intrinsic angular momentum. The Higgs boson is also its own antiparticle and is CP-even.

The Standard Model does not predict the mass of the Higgs boson. If that mass is between 115 and 180 GeV/c2, then the Standard Model can be valid at energy scales all the way up to the Planck scale (10¹⁶ TeV). Many theorists expect new physics beyond the Standard Model to emerge at the TeV-scale, based on unsatisfactory properties of the Standard Model. The highest possible mass scale allowed for the Higgs boson (or some other electroweak symmetry breaking mechanism) is 1.4 TeV; beyond this point, the Standard Model becomes inconsistent without such a mechanism, because unitarity is violated in certain scattering processes. There are over a hundred theoretical Higgs-mass predictions.

Extensions to the Standard Model including supersymmetry (SUSY) predict the existence of families of Higgs bosons, rather than the one Higgs particle of the Standard Model. Among the SUSY models, in the Minimal Supersymmetric Standard Model (MSSM) the Higgs mechanism yields the smallest number of Higgs bosons; there are two Higgs doublets, leading to the existence of a quintet of scalar particles: two CP-even neutral Higgs bosons h0 and H0, a CP-odd neutral Higgs boson A0, and two charged Higgs particles H±. Many supersymmetric models predict that the lightest Higgs boson will have a mass only slightly above the current experimental limits, at around 120 GeV/c2 or less.


Experimental search



Status as of March 2011, to the indicated confidence intervals



A Feynman diagram of one way the Higgs boson may be produced at the LHC. Here, two gluons convert to two top/anti-top pairs, which then combine to make a neutral Higgs.



A Feynman diagram of another way the Higgs boson may be produced at the LHC. Here, two quarks each emit a W or Z boson, which combine to make a neutral Higgs.

As of November 2011, the Higgs boson had yet to be confirmed experimentally, despite the large efforts invested in accelerator experiments at CERN and Fermilab.

Prior to the year 2000, the data gathered at the LEP collider at CERN allowed an experimental lower bound to be set for the mass of the Standard Model Higgs boson of 114.4 GeV/c2 at the 95% confidence level. The same experiment produced a small number of events that could be interpreted as resulting from Higgs bosons with a mass just above this cutoff (around 115 GeV), but the number of events was insufficient to draw definite conclusions. The LEP was shut down in 2000 to make way for its successor, the LHC, which is expected to be able to confirm or reject the existence of the Higgs boson. Full operational mode was delayed until mid-November 2009 because of a serious fault discovered with a number of magnets during the calibration and startup phase.

At the Fermilab Tevatron, there were ongoing experiments searching for the Higgs boson. As of July 2010, combined data from the CDF and DØ experiments at the Tevatron were sufficient to exclude the Higgs boson in the range 158 GeV/c2 – 175 GeV/c2 at the 95% confidence level. Preliminary results as of July 2011 extended the excluded region to the range 156 GeV/c2 – 177 GeV/c2 at the 90% confidence level. Data collection and analysis in search of the Higgs intensified after 30 March 2010, when the LHC began operating at 3.5 TeV per beam. Preliminary results from the ATLAS and CMS experiments at the LHC as of July 2011 exclude a Standard Model Higgs boson in the mass ranges 155 GeV/c2 – 190 GeV/c2 and 149 GeV/c2 – 206 GeV/c2, respectively, at the 95% confidence level. All of the above confidence intervals were derived using the CLs method.

It may be possible to estimate the mass of the Higgs boson indirectly. In the Standard Model, the Higgs boson has a number of indirect effects; most notably, Higgs loops result in tiny corrections to the masses of the W and Z bosons. Precision measurements of electroweak parameters, such as the Fermi constant and the masses of the W/Z bosons, can be used to constrain the mass of the Higgs. As of 2006, measurements of electroweak observables allowed the exclusion of a Standard Model Higgs boson having a mass greater than 285 GeV/c2 at 95% CL, and estimated its mass to be 129 +74/−49 GeV/c2 (the central value corresponding to approximately 138 proton masses). As of August 2009, the Standard Model Higgs boson is excluded by electroweak measurements above 186 GeV at the 95% confidence level. These indirect constraints assume that the Standard Model is correct. It may still be possible to discover a Higgs boson above 186 GeV if it is accompanied by other particles between the Standard Model and GUT scales.

In a 2009 preprint,  it was suggested that the Higgs boson might not only interact with the above-mentioned particles of the Standard model of particle physics, but also with the mysterious weakly interacting massive particles (or WIMPS) that may form dark matter, and which play an important role in recent astrophysics.

Various reports of potential evidence for the existence of the Higgs boson have appeared in recent years, but to date none has provided convincing evidence. In April 2011, there were suggestions in the media that evidence for the Higgs boson might have been discovered at the LHC in Geneva, Switzerland, but these had been debunked by mid-May. Regarding these rumors, Jon Butterworth, a member of the High Energy Physics group on the ATLAS experiment, stated they were not a hoax, but were based on unofficial, unreviewed results. The LHC detected possible signs of the particle, which were reported in July 2011, the ATLAS Note concluding: “In the low mass range (c. 120−140 GeV) an excess of events with a significance of approximately 2.8 sigma above the background expectation is observed” and the BBC reporting that “interesting particle events at a mass of between 140 and 145 GeV” were found. These findings were echoed shortly thereafter by researchers at the Tevatron, with a spokesman stating: “There are some intriguing things going on around a mass of 140 GeV.”

On 22 August it was reported that the anomalous results had become insignificant on the inclusion of more data from ATLAS and CMS and that the non-existence of the particle had been confirmed by LHC collisions to 95% certainty between 145–466 GeV (except for a few small islands around 250 GeV). A combined analysis of ATLAS and CMS data, published in November 2011, further narrowed the window for the allowed values of the Higgs boson mass to 114-141 GeV. On 12 December 2011, the ATLAS collaboration found that a Higgs mass in the range from 145 to 206 GeV was excluded at the 95% confidence level.

On 13 December 2011, experimental results were announced from the ATLAS and CMS experiments, indicating that if the Higgs boson exists, its mass is limited to the range 116–130 GeV (ATLAS) or 115–127 GeV (CMS), with other masses excluded at 95% confidence level. Observed excesses of events at around 124 GeV (CMS) and 125–126 GeV (ATLAS) are consistent with the presence of a Higgs boson signal, but also consistent with fluctuations in the background. The global statistical significances of the excesses are 1.9 sigma (CMS) and 2.6 sigma (ATLAS) after correction for the look-elsewhere effect. As of 13 December 2011, a combined result is not yet available. The statistical significance of the observations is not large enough to draw conclusions, but the fact that the two independent experiments show excesses at around the same mass has led to considerable excitement in the particle physics community. It is expected that the LHC will provide sufficient data to either exclude or confirm the existence of the Standard Model Higgs boson by the end of 2012.
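The "sigma" significances quoted above translate directly into tail probabilities of a Gaussian distribution. A short sketch, using only the Python standard library, shows why 1.9 and 2.6 sigma fall well short of the conventional 5-sigma discovery threshold:

```python
import math

def one_sided_p(z):
    """One-sided Gaussian tail probability for a significance of z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

p_cms = one_sided_p(1.9)        # ~0.03: roughly a 3% chance of a background fluke
p_atlas = one_sided_p(2.6)      # ~0.005
p_discovery = one_sided_p(5.0)  # ~3e-7: the conventional discovery threshold
```

The look-elsewhere correction mentioned above lowers a raw "local" significance precisely because searching many mass bins multiplies the opportunities for such a fluctuation.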


Alternatives for electroweak symmetry breaking


Higgsless model

In the years since the Higgs boson was proposed, several alternatives to the Higgs mechanism have been put forward. All of these proposed mechanisms use strongly interacting dynamics to produce a vacuum expectation value that breaks electroweak symmetry.


The “God particle”


The Higgs boson is often referred to as “the God particle” by the media, after the title of Leon Lederman’s book, The God Particle: If the Universe Is the Answer, What Is the Question? Lederman initially wanted to call the Higgs boson “the goddamn particle” because “nobody could find the thing,” but his editor would not let him. While use of this term may have contributed to increased media interest in particle physics and the Large Hadron Collider, many scientists dislike it, since it overstates the particle’s importance, not least because its discovery would still leave unanswered questions about the unification of QCD, the electroweak interaction and gravity, and the ultimate origin of the universe. A renaming competition conducted by the science correspondent of the British newspaper The Guardian chose “the champagne bottle boson” as the best of the submissions: “The bottom of a champagne bottle is in the shape of the Higgs potential and is often used as an illustration in physics lectures. So it’s not an embarrassingly grandiose name, it is memorable, and [it] has some physics connection too.”



LHC – Large Hadron Collider

ATLAS experiment  —   A Toroidal LHC ApparatuS


ALICE  —   (A Large Ion Collider Experiment) is one of the six detector experiments at the Large Hadron Collider at CERN. The other five are: ATLAS, CMS, TOTEM, LHCb, and LHCf. ALICE is optimized to study heavy ion collisions. Pb-Pb nuclei collisions will be studied at a centre of mass energy of 2.76 TeV per nucleon. The resulting temperature and energy density are expected to be large enough to generate a quark-gluon plasma, a state of matter wherein quarks and gluons are deconfined.



About the Higgs Field and the Higgs Boson


György Ligeti: Lux Aeterna


Understanding Our Universe through the Higgs


Higgs Boson Particle (The God Particle)


GYORGY LIGETI – ATMOSPHERES – Should have been called Ode to the God Particle


Michio Kaku on the ‘God Particle’


‘The God Particle’: The Higgs Boson


Understanding the Universe Through the God Particle


CERN scientists break the speed of light


Sounds of the God Particle


Mass is Energy Moving Faster than Light


What’s new @CERN ? Higgs boson, standard model, SUSY and neutrinos


Ligeti – Poeme Symphonique for 100 Metronomes (Minimalist Sound Poem)



Data Hints at Elusive Particle, but the Wait Continues


Agence France-Presse – Getty Images: An illustration of photons, provided by the Compact Muon Solenoid team of researchers.



Physicists will have to keep holding their breath a while longer


The New York Times, December 14, 2011, by Dennis Overbye  —  Two teams of scientists sifting debris from high-energy proton collisions in the Large Hadron Collider at CERN, the European Organization for Nuclear Research outside Geneva, said Tuesday that they had recorded tantalizing hints — but only hints — of a long-sought subatomic particle known as the Higgs boson, whose existence is a key to explaining why there is mass in the universe. By next summer, they said, they will have enough data to say finally whether the elusive particle really exists.

If it does, its mass must lie within the range of 115 billion to 127 billion electron volts, according to the new measurements.

The putative particle would weigh in at about 126 billion electron volts, about 126 times heavier than a proton and 250,000 times heavier than an electron, reported one army of 3,000 physicists, known as Atlas, for the name of their particle detector.
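For readers who want to check the arithmetic, the conversion from GeV to familiar mass ratios is a one-line division (the "126 times heavier than a proton" figure rounds the proton mass up to 1 GeV; using the measured proton mass of about 0.938 GeV/c² the ratio is closer to 134, while the electron ratio of about 250,000 holds up):

```python
# Masses in GeV/c^2 (rounded CODATA/PDG values):
m_higgs = 126.0
m_proton = 0.938272
m_electron = 0.000511

proton_ratio = m_higgs / m_proton      # ~134
electron_ratio = m_higgs / m_electron  # ~247,000
```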

Meanwhile, the other team, known as C.M.S. — for its detector, the Compact Muon Solenoid — found what its spokesman, Guido Tonelli, termed “a modest excess” in its data corresponding to masses around 124 billion electron volts. The physicists from the different teams are already discussing whether these differences are significant.

Showing off one striking bump in the data, Fabiola Gianotti, a spokeswoman for the Atlas team, said, “If we are just being lucky, it will take a lot of data to kill it.”

Over the last 20 years suspicious bumps that might have been the Higgs have come and gone — most recently last summer — and the same thing could happen again. Physicists said the chance that these results were a fluke because of random fluctuations in the background of normal physics was about 1 percent, which is too high to claim a discovery, but is enough to inspire excitement.

The fact that two rival teams using two different mammoth particle detectors had recorded similar results was considered good news.

“So CERN is not claiming a discovery, but I am quite optimistic,” said Steven Weinberg of the University of Texas at Austin, whose 1979 Nobel Prize rests partly on the Higgs.

Greg Landsberg of Brown University, a leader of the C.M.S. group, said that how to characterize the new results depended “on whether you see the glass half empty or half full.” He added, “I believe that these are exciting results, but it is just too early to say whether what we see is a glimpse of Higgs or another statistical fluctuation.”

Trying unsuccessfully to hold back an ear-to-ear grin, Kyle Cranmer, a New York University physicist and member of the Atlas team, admitted he was excited. “A bump is the most exciting thing a particle physicist can see on a plot,” he said.

Physicists around the world, fueled by coffee, dreams and Internet rumors of a breakthrough, gathered in lounges and auditoriums early Tuesday morning to watch a lengthy Webcast of the results at CERN.

“Physicists at 8 a.m.,” exclaimed Neal Weiner, a theorist who organized a gathering at New York University. “That’s really impressive!”

The results were posted on the Web sites of Atlas and C.M.S.

As seen on the Webcast, the auditorium at CERN was filled to standing room only. In New York, at the conclusion of the talks, the N.Y.U. physicists burst into applause. And around the world, physicists also seemed cautiously excited.

Lawrence M. Krauss, a cosmologist at Arizona State University, put it this way: “If the Higgs is discovered, it will represent perhaps one of the greatest triumphs of the human intellect in recent memory, vindicating 50 years of the building of one of the greatest theoretical edifices in all of science, and requiring the building of the most complicated machine that has ever been built.”

The Higgs boson is the cornerstone and the last missing part of the so-called Standard Model, a suite of equations that has held sway as the law of the cosmos for the last 35 years and describes all of particle physics. Physicists have been eager to finish the edifice, rule the Higgs either in or out and then use that information to form deeper theories that could explain, for example, why the universe is made of matter and not antimatter, or what constitutes the dark matter and dark energy that rule the larger universe.

The particle is named for the University of Edinburgh physicist Peter Higgs, who was one of six physicists — the others are Tom Kibble, the late Robert Brout, Francois Englert, Gerry Guralnik and Dick Hagen — who suggested that a sort of cosmic molasses pervading space is what gives particles their heft. Particles trying to wade through it gather mass the way a bill moving through Congress gains riders and amendments, becoming more and more ponderous. It was Dr. Higgs who pointed out that this cosmic molasses, normally invisible and, of course, odorless, would have its own quantum particle, and so the branding rights went to him.

In 1967 Dr. Weinberg made the Higgs boson a centerpiece of an effort to unify two of the four forces of nature, electromagnetism and the nuclear “weak” force, and explain why the carriers of electromagnetism — photons — are massless but the carriers of the weak force — the W and Z bosons — are about 100 times as massive as protons.

Unfortunately, the model does not say how heavy the Higgs boson itself — the quantum personification of this field — should be. And so physicists have had to search for it the old-fashioned train-wreck way, by smashing subatomic particles together to see what materializes.

The Large Hadron Collider accelerates protons to energies of 3.5 trillion electron volts around a 17-mile underground racetrack and then crashes them together.

If these crashes have indeed put the Higgs on the horizon of discovery, the news comes in the nick of time. Over the course of the last few years, searches at the CERN collider and the now-defunct Tevatron at the Fermi National Accelerator Laboratory in Batavia, Ill., have come to the verge of ruling the Higgs out.

Perhaps it won’t come to that. Reached in Austin, Dr. Weinberg, who shared the Nobel for coming up with the theory of electroweak unification with Sheldon Glashow, of Boston University, and Abdus Salam, of Pakistan, said: “It’s always a little weird when something that comes out of the mathematics in theoretical work turns out to exist in the real world. You asked me earlier if it’s exciting. Sure is.”

That excitement continues, as Rolf Heuer, CERN’s director general, told the physicists Tuesday. “Keep in mind,” he concluded, “that we are running next year.”


The Brain Under Anesthesia


Brain waves: This figure illustrates the differences in brain activity during anesthesia. The plots with black lines show the electrical activity recorded with EEG, while the colored plots show a spectral analysis of that activity–whether the activity is primarily high or low frequency. When the patient was awake (top), his brain activity was at a high frequency. When he was sedated during surgery (bottom), the frequency of brain waves dropped.
Emery Brown.



Researchers are looking for better ways to prevent awareness during surgery



MIT Technology Review, by Emily Singer  —  According to a study published a while back in the New England Journal of Medicine, a commonly used device designed to prevent anesthesia awareness–the rare event when a patient is actually conscious during surgery–was largely ineffective.

The findings highlight just how little is known about the neural changes that underlie anesthesia. “The challenge is that we don’t understand the physiology and pharmacology underlying memory blocking by anesthetics,” says Beverly Orser, an anesthesiologist and scientist at the University of Toronto, who wrote an editorial accompanying the piece. “If we understood the circuits and brain regions involved in complex memory formation, we’d be in a better position to develop these monitors.”

Emery Brown, an anesthesiologist and neuroscientist at Massachusetts General Hospital, and his colleagues are using both brain imaging of human volunteers and, in animals, electrophysiology approaches–which more directly measure brain activity–to gain a deeper understanding of anesthesia. Preliminary research from his lab suggests that measuring activity at the surface of the brain may not be a reliable indicator of what’s going on deeper down, where the memory circuitry may still be functioning–and forming frightening recollections of a particular surgery.

Every year, more than 20 million people in North America undergo general anesthesia–a combination of drugs that sedate patients, paralyze their muscles, and block perception of pain. The cocktail is carefully titrated to each individual and each surgery, with the aim of maintaining the patient’s crucial functions, such as heart rate and blood pressure, while keeping her blissfully unaware of the procedure.

A small number of those who get general anesthesia–about 0.1 to 0.2 percent–will experience awareness, which ranges from relatively innocuous incidents, such as later remembering a conversation between surgeons and nurses, to reports of excruciating pain while completely paralyzed. While it’s not exactly clear what triggers anesthesia awareness, an insufficient amount of drugs that quiet brain areas involved in learning and memory is thought to be part of the problem.

As recognition of the problem of anesthesia awareness has grown in recent years, so has the market for devices designed to prevent it. Several types of monitors are now commercially available. They are based on a simple concept: that anesthesia drugs quiet the cortex in a predictable manner that can be measured with electroencephalography (EEG), a technology that measures electrical activity on the surface of the head. The frequency of brain waves spikes briefly as the patient is lulled into unconsciousness, and then it slows. The devices convert EEG patterns into a single number that indicates a patient’s level of awareness, allowing physicians to administer more drugs if needed.
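
The single-number idea can be illustrated with a toy index. Real monitors use proprietary, far more elaborate algorithms; the zero-crossing estimate and the 30 Hz scaling below are assumptions chosen purely to show the principle that slower brain waves yield a lower number:

```python
import math

def toy_depth_index(signal, fs):
    """Toy anesthesia-depth index: estimate the dominant EEG frequency
    from the zero-crossing rate and scale it to 0-100."""
    crossings = sum(1 for a, b in zip(signal, signal[1:])
                    if (a < 0) != (b < 0))
    dominant_hz = crossings / (2.0 * (len(signal) / fs))
    return 100.0 * min(dominant_hz / 30.0, 1.0)  # 30 Hz and above -> 100

fs = 256  # samples per second
t = [n / fs for n in range(fs * 4)]
awake = [math.sin(2 * math.pi * 20 * x) for x in t]    # fast beta-band waves
sedated = [math.sin(2 * math.pi * 2 * x) for x in t]   # slow delta-band waves
print(toy_depth_index(awake, fs) > toy_depth_index(sedated, fs))  # True
```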


But Brown and others argue that devices like this give only a rudimentary measure of what’s happening in the brain. “If it’s slow, we think it’s okay to operate; if it’s fast, we think they’re waking up,” says Brown. “That’s all we’re doing.”

Brown and his colleagues are using newly developed technology that allows them to study EEG waves while a patient is simultaneously having his brain imaged with functional magnetic resonance imaging, an indirect measure of brain activity that is more spatially precise than EEG. Preliminary results show that some brain areas actually become more active during the course of anesthesia. It’s not surprising that a broad-acting drug, which inactivates brain areas that are normally involved in selectively inhibiting brain activity, leads other areas to become more active, says Brown. “This is the type of information we really need,” he says.

In corresponding experiments conducted on rodents, scientists used arrays of electrodes to directly measure activity in different parts of the brain. Researchers directed by Matt Wilson, a professor of brain and cognitive sciences at MIT who collaborates with Brown, found that rodents that had been given an increasing dose of an anesthetic showed characteristic changes in the rhythm of brain activity in the cortex. But activity in the hippocampus, a brain area crucial in learning and memory, remained unchanged.

“If the signature [measured via EEG] is coming from the cortex, it’s not telling us what the deeper brain structures are doing, such as the arousal system, the brain stem, the amygdala, and the hippocampus,” says Brown. “If EEG cannot tell you about those structures, it’s not telling you about key systems.”








Tracking Information Flow in the Brain



When a neuron fires, it releases calcium. Alan Jasanoff at the McGovern Institute at MIT used this observation to develop a new way to visualize brain activity using fMRI. Superparamagnetic nanoparticles (illustrated here) are covered with proteins (red and green) that aggregate when calcium is released by the neuron. Aggregation of these particles can be detected by the MRI magnet.



A tiny sensor that tracks calcium levels may one day provide clearer pictures of the brain at work.


MIT Technology Review, by Jennifer Chu  —  Scientists at MIT have engineered a nano-sized calcium sensor that may eventually shed light on the intricate cell-to-cell communications that make up human thought. Alan Jasanoff and his team at the Francis Bitter Magnet Lab and McGovern Institute for Brain Research have found that tracking calcium, a key messenger in the brain, may be a more precise way of measuring neural activity, compared with current imaging techniques, such as traditional functional magnetic resonance imaging (fMRI).

FMRI uses powerful magnets to detect blood flow in the brain, allowing researchers to watch the human brain in action. Through a rapid series of snapshots, scientists can observe key areas of a person’s brain lighting up in response to a given task or command. The technology has been used to pinpoint the brain areas involved in everything from basic motor and verbal skills to murkier cognitive states like jealousy, deception, and morality.

Unfortunately, fMRI, as it is used today, has a major drawback: it measures blood flow, or hemodynamics, which is an indirect measure of neural cell activity. “It turns out hemodynamics basically introduces a delay of five seconds,” says Jasanoff. “It keeps you from being able to detect fast variation [in neural activity].”

Since neurons typically fire on the order of milliseconds, current fMRI techniques provide only a rough estimate of what the brain is doing at any given moment. FMRI scans also have a relatively low spatial resolution, measuring activity in areas of 100 microns, a volume that typically contains 10,000 neurons, each with varying activation patterns.
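
The hemodynamic problem Jasanoff describes can be sketched numerically: convolving fast neural events with a slow hemodynamic response smears them together. The boxcar response below is a crude stand-in for the real hemodynamic response function, used only to show the blurring effect:

```python
def hemodynamic_blur(events, hrf):
    """Convolve a spike train with a hemodynamic response function:
    the slow, blurred result is what BOLD fMRI actually measures."""
    out = [0.0] * (len(events) + len(hrf) - 1)
    for i, e in enumerate(events):
        for j, h in enumerate(hrf):
            out[i + j] += e * h
    return out

# Two neural events a few time steps apart...
events = [1, 0, 0, 1, 0, 0, 0, 0]
hrf = [0.2] * 10  # crude 10-step boxcar "response"
bold = hemodynamic_blur(events, hrf)
# ...produce one broad, merged plateau: the individual events are
# no longer distinguishable in the measured signal.
print(bold)
```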

Efforts to fine-tune fMRI have focused on developing stronger magnets and a better understanding of blood flow and its relationship to brain activity.

But Jasanoff believes there’s a better, more precise way of tracking neural activity. He and his team are looking at calcium as a direct measure of neuronal firing. When a neuron sends an electrical impulse to another neuron, calcium-specific channels in the neuron’s membrane instantaneously open up, letting calcium flow into the cell. “It’s a very dramatic signal change,” says Jasanoff.

Fluorescent calcium sensors are already used in superficial optical imaging, but haven’t yet been applied to the deeper brain tissues that are accessible via the powerful magnets of fMRI machines. To that end, Jasanoff’s lab set about designing a calcium sensor that would be detectable via fMRI. To do this, they combined the sensor with a superparamagnetic iron oxide nanoparticle–essentially, a molecular-sized magnet that can be picked up by fMRI as high-contrast images.

The sensor itself is composed of two separate nanoparticles, each coated with a different protein: calmodulin and M13. In the presence of calcium, these two proteins bind together. “Essentially…we created two sets of Velcro balls,” says Jasanoff. “One has hooks and one has loops, and they only become Velcro balls in the presence of calcium.” The proteins come apart when calcium disappears, a property that might be useful in interpreting the flow of electrical activity in a circuit of neurons during a given task–something that’s not possible with today’s fMRI.


Jasanoff’s research is only a first step toward that goal. So far, he has tested the sensor in test-tube solutions with and without calcium, scanning the interactions with MRI. The initial results, published in a recent issue of the Proceedings of the National Academy of Sciences, are promising: scans were able to pick up high-contrast images of the Velcro-like balls clustering in the presence of calcium. Although the images were only visible after many seconds, or even minutes, Jasanoff says the sensor is highly tweakable, and he plans to improve its time response in future trials. For now, he plans to inject calcium sensors into single cells of flies and eventually rats.

Outside observers like Greg Sorensen of Harvard Medical School are cautiously optimistic about this new generation of brain imaging, particularly for human applications. Sorensen, an associate professor of radiology, is focused on applying novel imaging techniques to the treatment of neurological diseases.

“Intracellular iron oxide particles have in some studies had an unfavorable safety profile in humans,” Sorensen says. “If we learned that this method had some risks but in exchange could identify the best treatment for, say, schizophrenia, then the risk may well be worth the benefit.”

Device Tracks Blood Flow in the Brain




A headset ultrasound monitor could make it easier to detect the dangerous aftereffects of brain injuries



MIT Technology Review, December 13, 2011, by Courtney Humphries  —  A new ultrasound device could make it easier to detect a potentially life-threatening condition that is common in soldiers with blast-related brain injuries and patients who survive aneurysms.

The condition, called cerebral vasospasm, occurs when blood vessels suddenly constrict. The effect is like squeezing a garden hose: the velocity inside the artery builds as pressure grows, and less blood flows to the brain. The condition can develop several days after an initial injury, and is currently detected using ultrasound, which requires a trained technician to find the relevant blood vessels and hold the ultrasound beam in place.
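
The garden-hose effect follows from conservation of flow (Q = v × A): if a vessel narrows, velocity must rise by the inverse square of the diameter ratio. The numbers below are illustrative, not clinical thresholds:

```python
def velocity_after_narrowing(v_cm_s, diameter_ratio):
    """Conservation of flow (Q = v * A): if a vessel's diameter shrinks
    by `diameter_ratio` (e.g. 0.5 for half), the same flow must pass a
    cross-section smaller by that ratio squared, so velocity rises by
    the inverse square."""
    return v_cm_s / diameter_ratio ** 2

# An artery flowing at ~60 cm/s that constricts to half its diameter
# shows ~240 cm/s -- the kind of velocity jump a transcranial Doppler
# monitor looks for in vasospasm.
print(velocity_after_narrowing(60, 0.5))  # 240.0
```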

PhysioSonics, based in Bellevue, Washington, has developed a monitor that makes this process automatic, eliminating the need for a technician. The company is adapting the product for military use, and hopes to expand it to also detect a potentially dangerous buildup of pressure inside the head.

The company’s monitor consists of a headset that directs an array of ultrasound beams through the head and uses a proprietary algorithm to automatically detect the middle cerebral artery, one of the major arteries supplying blood to the brain. The device then locks the relevant beam onto the artery and measures its blood flow. A machine attached to the headset gives an index of flow and peak velocity.

“The point is to give you a variable” that could be read similarly to a heart-rate monitor, says Michel Kliot, company cofounder and a neurosurgeon at the University of Washington, where the technology was initially developed.

In November, the company received a military grant of $2.5 million to adapt the device for monitoring vasospasm in soldiers. Nearly half the soldiers who sustain blast injuries develop vasospasm, and the company plans to make a more rugged version of its commercial device for the battlefield.





Looking into the Brain with Light


Mind reader: The OrNim targeted oximetry probe (above) adheres to the scalp to monitor the oxygen levels of specific areas of the brain.



An Israeli company is working on a new device to monitor oxygen levels in the brain



MIT Technology Review, by Michael Chorost  —  A new noninvasive diagnostic technology could give doctors the single most important sign of brain health: oxygen saturation. Made by an Israeli company called OrNim and slated for trials on patients in U.S. hospitals later this year, the technology, called targeted oximetry, could do what standard pulse oximeters can’t.

A standard pulse oximeter is clipped onto a finger or an earlobe to measure oxygen levels under the skin. It works by transmitting a beam of light through blood vessels in order to measure the absorption of light by oxygenated and deoxygenated hemoglobin. The information allows physicians to know immediately if oxygen levels in the patient’s blood are rising or falling.
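
The two-wavelength principle follows directly from the Beer-Lambert law. In the sketch below, the extinction coefficients are rough textbook-order values (assumptions, not calibration data), and real oximeters rely on empirically calibrated curves rather than this idealized inversion:

```python
# Approximate molar extinction coefficients (cm^-1 / M) of hemoglobin;
# rough textbook-order values, used here only for illustration.
E = {"red": {"hbo2": 320.0, "hb": 3227.0},   # 660 nm
     "ir":  {"hbo2": 1214.0, "hb": 693.0}}   # 940 nm

def absorbance_ratio(s):
    """Red/IR absorbance ratio predicted by Beer-Lambert for oxygen
    saturation s (path length and concentration cancel in the ratio)."""
    red = E["red"]["hbo2"] * s + E["red"]["hb"] * (1 - s)
    ir = E["ir"]["hbo2"] * s + E["ir"]["hb"] * (1 - s)
    return red / ir

def saturation(r):
    """Invert the same two-wavelength model to recover saturation
    from a measured red/IR ratio."""
    num = E["red"]["hb"] - r * E["ir"]["hb"]
    den = (E["red"]["hb"] - E["red"]["hbo2"]
           + r * (E["ir"]["hbo2"] - E["ir"]["hb"]))
    return num / den

# Well-oxygenated arterial blood gives a low red/IR ratio, because
# oxygenated hemoglobin absorbs little red light; inverting the model
# recovers the saturation.
r = absorbance_ratio(0.97)
print(round(saturation(r), 2))  # 0.97
```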

Prior to the development of pulse oximeters, the only way to measure oxygen saturation was to take a blood sample from an artery and analyze it in a lab. By providing an immediate, noninvasive measure of oxygenation, pulse oximeters revolutionized anesthesia and other medical procedures.

While pulse oximeters have become accurate and reliable, they have a key limitation: because they measure only the blood’s overall oxygen levels, they cannot monitor oxygen saturation in a specific region deep inside the body. This is especially problematic in the case of brain injuries, since the brain’s oxygenation can differ from the rest of the body’s.

Information on oxygenation in specific regions of the brain would be valuable to neurologists monitoring a brain-injured patient, as it could be used to search for localized hematomas and give immediate notice of hemorrhagic strokes. When a stroke occurs, an area of the brain is deprived of blood and thus oxygen, but there is no immediate way to detect the attack’s occurrence.

CT and MRI scans give a snapshot of tissue damage, but they can’t be used for continuous monitoring. It can also be very difficult to conduct such scans on unconscious patients hooked up to life-support devices.

Wade Smith, a neurologist at the University of California, San Francisco, and an advisor to OrNim, points out that, while cardiologists have devices to monitor hearts in detail, neurologists have no equivalent tool. With brain-injured patients, Smith says, “the state of the art is flying blind.”

OrNim’s new device uses a technique called ultrasonic light tagging to isolate and monitor an area of tissue the size of a sugar cube located between 1 and 2.5 centimeters under the skin. The probe, which rests on the scalp, contains three laser light sources of different wavelengths, a light detector, and an ultrasonic emitter.

The laser light diffuses through the skull and illuminates the tissue underneath it. The ultrasonic emitter sends highly directional pulses into the tissue. The pulses change the optical properties of the tissue in such a way that they modulate the laser light traveling through the tissue. In effect, the ultrasonic pulses “tag” a specific portion of tissue to be observed by the detector. Since the speed of the ultrasonic pulses is known, a specific depth can be selected for monitoring.
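
Selecting the depth amounts to timing the detector against the pulse. The sound speed used below (~1540 m/s, the standard figure for soft tissue) is an assumption, not a published OrNim parameter:

```python
def tag_delay_us(depth_cm, speed_m_s=1540.0):
    """Time after the ultrasonic pulse is emitted at which it reaches a
    given depth, assuming ~1540 m/s sound speed in soft tissue. Light
    detected while the pulse occupies that depth carries the 'tag' of
    that specific tissue volume."""
    return (depth_cm / 100.0) / speed_m_s * 1e6  # microseconds

# Selecting tissue 2 cm under the scalp: gate the detector about
# 13 microseconds after each pulse.
print(round(tag_delay_us(2.0), 1))  # 13.0
```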

The modulated laser light is picked up by the detector and used to calculate the tissue’s color. Since color is directly related to blood oxygen saturation (for example, arterial blood is bright red, while venous blood is dark red), it can be used to deduce the tissue’s oxygen saturation. The measurement is absolute rather than relative, because color is an indicator of the spectral absorption of hemoglobin and is unaffected by the scalp.

Deeper areas could be illuminated with stronger laser beams, but light intensity has to be kept at levels that will not injure the skin. Given the technology’s current practical depth of 2.5 centimeters, it is best suited for monitoring the upper layers of the brain. Smith suggests that the technology could be used to monitor specific clusters of blood vessels.

While the technology is designed to monitor a specific area, it could also be used to monitor an entire hemisphere of the brain. Measuring any area within the brain could yield better information about whole-brain oxygen saturation than a pulse oximeter elsewhere on the body would. Hilton Kaplan, a researcher at the University of Southern California’s Medical Device Development Facility, says, “If this technology allows us to actually measure deep inside, then that’s a big improvement over the limitations of decades of cutaneous versions.”

Michal Balberg, the CEO and cofounder of OrNim, thinks that it may ultimately be feasible to deploy arrays of probes on the head to get a topographic map of brain oxygenation. In time, brain oxygenation may be considered a critical parameter that should be monitored routinely. Balberg says, “Our development is directed toward establishing a new brain vital sign that will be used to monitor any patient [who’s] unconscious or under anesthesia. We believe that this will affect patient management in the coming decade in a manner comparable to pulse oximeters.”

Michael Chorost covers medical devices for Technology Review. His book about cochlear implants, Rebuilt: How Becoming Part Computer Made Me More Human, was published in 2005.

The New York Times, December 12, 2011, by Steve Lohr  —   When I.B.M.’s Watson computer beat two human “Jeopardy!” champions earlier this year, it was a triumphant demonstration of the company’s technology. It was great for Big Blue’s image, but it was not a moneymaker on its own.

Yet the process of turning the technology behind Watson into moneymaking products is under way. Watson was a bundle of advanced technologies, including speech recognition, machine learning, natural-language processing, data mining and ultrafast in-memory computer hardware. They have been under development at I.B.M. for years, and were pulled into Watson.

The ingredients that went into the Watson arsenal are steadily finding their way into I.B.M. products. For example, WellPoint, the big health insurer, is trying out a system that uses Watson-style software to reduce redundant medical tests.

The latest entry, I.B.M.’s Strategic Intellectual Property Insight Platform, is being announced on Thursday. Clearly, the Watson branding team did not work on this name.

But then again, this is not for television, where Watson performed; it is for major corporate customers seeking competitive advantage. The technology, sold as a cloud-based service, is the result of several years of joint development between IBM Research and four companies — AstraZeneca, Bristol-Myers Squibb, DuPont and Pfizer.

The insight platform uses data mining, natural-language processing and analytics to pore through millions of patent filings and biomedical journals to look for chemical compounds used in drug discovery. It searches for the names of compounds, related words, drawings of the compounds, the names of companies working with specific chemicals and molecules, and the names of scientists who created the patented inventions. It does its work quickly, retrieving information on patents in as little as 24 hours after a filing.
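
A crude sketch of the entity-extraction step described above. The real platform combines natural-language processing, analytics and image recognition far beyond a regular expression, so this is only a cartoon of the idea (the suffix list and sample text are invented for illustration):

```python
import re

def find_candidate_compounds(text):
    """Pull tokens that look like chemical names (common suffixes) or
    element-symbol formulas out of patent text."""
    pattern = re.compile(
        r"\b(?:[A-Z][a-z]?\d*){2,}\b"      # formulas like NaCl, H2O2
        r"|\b[a-z]+(?:ine|ol|ide|ate)\b"   # name suffixes: -ine, -ol, ...
    )
    return sorted(set(pattern.findall(text)))

claim = ("The filing claims a tablet of caffeine and paracetamol "
         "with NaCl and sorbitol as excipients.")
print(find_candidate_compounds(claim))
```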

“It provides a landscape that shows who is working with what chemicals and drugs,” said Chris Moore, head of business analytics and optimization in I.B.M.’s global services unit.

The technology, Mr. Moore said, can be applied to everything from product strategy to recruiting to patent enforcement.

As a byproduct of its research, I.B.M. is also adding to a vast, searchable chemical database housed by the National Institutes of Health.

The company is contributing more than 2.4 million chemical compounds extracted from 4.7 million patents and 11 million biomedical journal abstracts from 1976 to 2000.

The information was all published, but often in costly scientific journals or buried in the mountains of patent filings. It was so difficult to access that it was, for all practical purposes, inaccessible.

“It’s a nice contribution to the field of open chemistry — and that’s a growing trend, inspired by and similar to open source software,” said Marc C. Nicklaus, head of the computer-aided drug discovery group at the National Cancer Institute, which is part of the National Institutes of Health.

I.B.M.’s data contribution, it seems, is both generous and calibrated. The chemical compound data from patents runs from 1976 to 2000, so most of it concerns patents that have already expired, useful for scientific research but far less useful commercially. More recent data, no doubt, will be of greatest interest to I.B.M.’s paying clients.




IBM Contributes Data to the National Institutes of Health to Speed Drug Discovery and Cancer Research Innovation, New York, New York – 12 Dec 2011: IBM (NYSE: IBM) today announced it is contributing a massive database of chemical data extracted from millions of patents and scientific literature to the National Institutes of Health. This contribution will allow researchers to more easily visualize important relationships among chemical compounds to aid in drug discovery and support advanced cancer research.

In collaboration with AstraZeneca, Bristol-Myers Squibb, DuPont and Pfizer, IBM is providing a database of more than 2.4 million chemical compounds extracted from about 4.7 million patents and 11 million biomedical journal abstracts from 1976 to 2000. The announcement was made at an IBM forum on U.S. economic competitiveness in the 21st century, exploring how private sector innovations and investment can be more easily shared in the public domain.

The publicly available chemical data can be used by researchers worldwide to gain new insights and enable new areas of research. It will also help researchers save time by more efficiently finding information buried in millions of pages of patent documents. Access to this data will also allow researchers to analyze far larger sets of documents than the traditional manual process, adding a whole new dimension to the ability to search intellectual property.

The data was extracted using the IBM business analytics and optimization strategic IP insight platform (SIIP), a combination of data and analytics delivered via the IBM SmartCloud and developed by IBM Research in collaboration with several major life sciences organizations. SIIP is a new cloud-driven method for curating and analyzing massive amounts of patents, scientific content and molecular data. It uses techniques such as automated image analysis and enhanced optical recognition of chemical images and symbols to extract information from patents and literature upon publication. This is a task that otherwise takes weeks or months to complete manually, but can be done rapidly using this new technology.

“Information overload continues to be a challenge in drug discovery and other areas of scientific research,” said Steve Heller, project director for the InChI Trust, a non-profit which supports the InChI international standard to represent chemical structures. “Rich data and content is often buried in patents, drawings, figures and scholarly articles. This contribution by IBM and its collaborators will make it easier for researchers to use this data, link to other data using the InChI structure representation and derive new insight.”

Over the past six years, several major life sciences organizations have worked on this project with IBM Research, gaining access to a comprehensive chemical library extracted from worldwide patents and scientific abstracts. Public structure extraction tools developed by researchers at the National Institutes of Health were also used successfully in this project.

“The scientific community will receive enormous benefit from this advancement,” said Heller. “This is an important addition to the open chemistry data sets. The comprehensiveness of the data and the new ways researchers can look at these data and cross-link to other data associated with each chemical is expected to help with drug development to fight many forms of cancers and other human diseases, as well as the development of other chemical compounds.”

The data will be contributed to the National Center for Biotechnology Information (NCBI), part of the National Library of Medicine (NLM), and the Computer-Aided Drug Design (CADD) Group of the National Cancer Institute (NCI) at the National Institutes of Health. It will be incorporated in the NCBI’s PubChem, a public resource for the scientific community that serves as an aggregator for scientific results as well as in NCI CADD Group services such as the Chemical Structure Lookup Service and the Chemical Identifier Resolver.

The National Institutes of Health will make the content available on PubChem.



Back in April 2011, FierceBioTech wrote:



How can IBM’s Watson aid pharma researchers?


Carol Kaelson/Courtesy of Jeopardy Productions Inc., via Associated Press
The technology behind I.B.M.’s Watson computer, best known for beating two human “Jeopardy!” champions earlier this year, is steadily finding its way into other I.B.M. products.



You might know about how IBM’s Watson supercomputer bested human contestants in the game show Jeopardy! in February. But there’s also been some thought about what the supercomputer could do to aid humans faced with sorting through vast amounts of biomedical information to make decisions. In a recent interview with the San Francisco Chronicle, IBM’s ($IBM) Dr. David Ferrucci talked about hypothetical uses of Watson in a clinical setting.

“I think what’s compelling about the medical use case is that, first of all, there’s a huge amount of information out there,” Ferrucci told the Chronicle. “It often doesn’t get considered and some of these diagnostic pieces can be very involved and very complicated but, moreover, you want this evidence trail. You want to know–what did I consider? Why did I consider it? Where’s the evidence for that?”

Taking a step back from this interview, it’s not hard to imagine how Watson could potentially aid drug researchers who are now faced with a dizzying amount of data in their jobs. The bioinformatics groups at major pharmaceutical companies are working on multiple fronts to help their researchers make effective decisions based on all the information available to them. And Big Pharma also has deep pockets to pay for supercomputers.

At IBM’s T.J. Watson Research Center in Hawthorne, NY, where the Watson supercomputer is being developed, the firm’s researchers are already working in computational biology and other areas that hold promise in drug discovery, according to IBM. Also, Swiss healthcare giant Roche last year inked a research deal with IBM to enlist the support of the tech giant in developing cheaper and faster gene sequencing technology. So we know that Big Blue is no stranger to Big Pharma.

We’ll see whether IBM’s Watson grows up to be a major force in the biomedical research world. That would be a great encore to the supercomputer’s stellar performance on Jeopardy!.




Scripps begins world’s largest computer-based project against malaria
November/December 2011  —  Malaria may have lost big when IBM’s ($IBM) Watson supercomputer won Jeopardy! in February. Scientists from the Scripps Research Institute in La Jolla, CA have garnered a portion of Watson’s winnings from the game show to mount the largest-ever computational project to combat drug-resistant malaria. Watson’s providing the money, but the scientists are using a supercomputer of another sort for the major undertaking.

The researchers plan to tap the World Community Grid, which consists of some 2 million PCs from more than half a million volunteers who have made their computers available for computing jobs. With computing capacity from the volunteers’ machines, the Scripps scientists plan to crunch data on millions of potential compounds to home in on those that could disable proteins the deadliest malaria-causing parasite needs to survive, according to the group’s post on IBM’s blog. The goal is to discover new drugs that can treat people with malaria who didn’t get vaccinated or whose vaccination wore off after a while.
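
The grid’s division of labor can be sketched as a scatter/gather: each volunteer machine scores one chunk of the compound library and returns only its best hits, which the project server merges. The scoring function below is a deterministic toy stand-in for a physics-based docking engine, and the compound names are invented:

```python
import heapq

def toy_binding_score(compound):
    """Deterministic stand-in for a docking score (lower = better);
    a real work unit would run a docking simulation instead."""
    return sum(ord(c) for c in compound) % 101

def screen_chunk(compounds, keep=3):
    """One volunteer machine's work unit: score a chunk of the library
    and send back only its best-scoring candidates."""
    return heapq.nsmallest(keep, compounds, key=toy_binding_score)

def merge_hits(chunk_results, keep=3):
    """The project server gathers the per-chunk winners and keeps the
    overall best hits."""
    return heapq.nsmallest(keep,
                           (c for chunk in chunk_results for c in chunk),
                           key=toy_binding_score)

library = [f"compound-{i:06d}" for i in range(10_000)]
chunks = [library[i:i + 1_000] for i in range(0, len(library), 1_000)]
best = merge_hits([screen_chunk(c) for c in chunks])
print(best)
```

Because each chunk returns at least as many hits as the server keeps, the merged result has the same best scores as scoring the whole library on one machine.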

The project is taking aim at the most deadly form of malaria, which is caused by the parasite called Plasmodium falciparum. While many cases of malaria are curable, certain forms of the infectious disease have built up resistance to the drugs. According to the World Health Organization, there were 781,000 deaths and 225 million cases of malaria in 2009. Malaria kills a child in Africa every 45 seconds, the organization said on its website.

Before starting the malaria project, Scripps scientists used the World Community Grid to find two compounds to attack multi-drug resistant HIV, according to the group. Computer-based analysis of compounds and disease-related proteins is nothing new, but recently members of the public have been empowered through organizations such as the World Community Grid and the online video game Foldit to help find answers to difficult scientific questions.



IBM cloud aids fight against superbugs



Big Blue’s cloud has helped Swiss scientists in their hunt for clues about how certain bacteria form resistance against antibiotics and cause disease. It’s perhaps the latest case where the tech giant ($IBM) has been aiding life sciences concerns in managing and analyzing biological or clinical data in the cloud.

For this latest feat, IBM, which has previously made the case that its cloud could help reduce clinical trial costs, worked with Swiss cloud computing start-up CloudBroker and researchers at the prestigious technical university ETH Zurich, according to IBM. With IBM’s cloud and CloudBroker’s queuing and data management software, the Swiss researchers analyzed a huge amount of data within two weeks. Without the technology, the analyses could have taken several months.

Indeed, IBM has made life sciences customers a key market for its cloud computing offerings for years. And as scientists crunch massive amounts of data from disease proteins and genomes, clouds have proven useful in providing the needed computing capacity in short order. The technology also offers potential cost savings for managing and analyzing clinical data from drug trials. As developers look for ways to make their R&D run more efficiently, cloud computing has entered the discussion as one way to achieve this.

The scientists at ETH Zurich (from which CloudBroker spun off in 2008) used IBM’s cloud to find about 250 virulence factors and generate 2.3 million 3D models to gain a better understanding of disease-causing bacteria. For their study of streptococcus bacteria that cause strep throat, the scientists tapped nearly 250,000 computing hours on 1,000 parallel CPUs with Big Blue’s SmartCloud Enterprise.
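As a quick sanity check on those figures (assuming the job was spread evenly across the machines and ran in parallel the whole time, which is an idealization), 250,000 CPU-hours on 1,000 CPUs works out to a little over ten days of wall-clock time, consistent with the two-week window reported above:

```python
# Back-of-envelope check, assuming ideal, perfectly parallel execution.
cpu_hours = 250_000   # total compute reported
cpus = 1_000          # parallel CPUs reported

wall_hours = cpu_hours / cpus   # 250.0 wall-clock hours
wall_days = wall_hours / 24     # about 10.4 days

print(round(wall_days, 1))  # 10.4 -- fits within "two weeks"
```

The same arithmetic run on a single workstation (one CPU) gives roughly 28.5 years of runtime, which is why the researchers say the analyses "could have taken several months" even on a sizable local cluster.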

“For our experiments, we need very high capacity in short time frames,” Dr. Lars Malmstrom, ETH Zurich’s lead researcher, said in IBM’s release. “Cloud computing allows to reserve this computing capacity whenever researchers need it, and it is available quickly. Research teams do not need to set it up or maintain it, and thus can concentrate better on their research.”



Bloomberg: more on IBM



IBM Gives Researchers Data on 2.4 Million Chemicals


By Alex Wayne, Dec 12, 2011




U.S. researchers gained access to a database of 2.4 million chemical formulas and diagrams that International Business Machines Corp. (IBM) culled from 24 years’ worth of patent applications and medical journals.

The catalogue of compounds will be housed at the National Institutes of Health in Bethesda, Maryland. Scientists can use the data to identify new candidates for drug development or new uses for existing drugs.

“The applications are very wide reaching, from life sciences to chemicals, petroleum, to food, to health,” said Chris Moore, a partner and vice president in Armonk, New York-based IBM’s global life sciences division, in a telephone interview.

The NIH has made development of drugs for rare diseases such as sickle cell anemia a priority since Francis Collins became director of the $31 billion research agency in 2009. Collins has asked Congress to create a new institute for drug development and pressed manufacturers to open catalogs of abandoned compounds.

IBM, the world’s biggest computer-services provider, created the database using “Watson-type technology,” Moore said, referring to the supercomputer that beat human competitors on the game show Jeopardy! The data was extracted from about 4.7 million U.S., European and United Nations patents and 11 million biomedical journal abstracts from 1976 to 2000.
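IBM has not published the details of its “Watson-type” extraction pipeline, but the general idea of mining structured chemistry out of free text can be illustrated with a deliberately simple sketch. The toy regex below only catches plain molecular formulas and is nothing like the real system, which also parses diagrams and chemical names:

```python
import re

# Illustrative only: a toy text-mining pass that pulls simple molecular
# formulas (e.g. C9H8O4) out of free text.
FORMULA = re.compile(r"\b(?:[A-Z][a-z]?\d*){2,}\b")

def extract_formulas(text):
    # Keep only matches containing a digit, to skip ordinary capitalized
    # words and acronyms such as "IBM" or "DNA".
    return [m for m in FORMULA.findall(text) if any(ch.isdigit() for ch in m)]

abstract = "Aspirin (C9H8O4) was compared with caffeine (C8H10N4O2) in vitro."
print(extract_formulas(abstract))  # ['C9H8O4', 'C8H10N4O2']
```

Run at the scale Moore describes, millions of patents and abstracts, even a far more sophisticated version of this pass yields a searchable catalogue of compounds linked back to the documents that mention them.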

Researchers can search the database for free. IBM plans to sell a product that will produce more sophisticated analysis of the database that can tell customers who is conducting research on specific therapies, Moore said. He wouldn’t disclose the price.

Biotechnology Showcase 2012



On 11 January 2012, Dr. Jules T. Mitchel, President of Target Health Inc., will be chairing a panel at the Biotech Showcase™ 2012 meeting in San Francisco. The topic will be “Getting a Drug Approved as the FDA is Evolving.” The Biotech Showcase provides private and public life science companies the opportunity to present to an audience of investors and business development executives during the course of the industry’s largest annual healthcare investor conference.


Now in its fourth year, Biotech Showcase 2012 is expected to attract upwards of 1,500 attendees. The program includes lunch plenary sessions featuring top industry leaders and innovators speaking on industry- and time-relevant topics, as well as presentations from both private and public companies. Let us know if you will be attending.


For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

Two Drugs Appear to Delay Progression of Breast Cancer




Breast cancer today is not what it was 20 years ago. Survival rates are climbing, thanks to greater awareness, more early detection, and advances in treatment. For roughly 200,000 Americans who are diagnosed with breast cancer each year, there are plenty of reasons to be hopeful.


Two drugs have been shown to delay by several months the time before advanced breast cancer worsens, potentially providing new options for women with that disease. Both drugs, pertuzumab from Genentech and everolimus from Novartis, also showed signs in clinical trials that they could prolong 1) ___, though it is too early to say that definitively. Results of the studies were presented this week at the San Antonio Breast Cancer Symposium and were published online in The New England Journal of Medicine (7 Dec 2011).


Pertuzumab is designed to complement Genentech’s big-selling drug 2) ___ for the roughly 20% of breast cancer patients whose tumors have elevated levels of a protein called Her2. Both pertuzumab and Herceptin block the action of the protein but in different ways. In a late-stage clinical trial involving 808 patients, women randomly chosen to receive pertuzumab, Herceptin and the chemotherapy drug docetaxel, went a median of 18.5 months before their tumors worsened or they 3) ___, a measure known as progression-free survival. That was significantly longer than the 12.4 months for those who received a placebo, Herceptin and docetaxel.


“We have an improvement in progression-free 4) ___ that is six months,” Dr. José Baselga, chief of hematology and oncology at Massachusetts General Hospital, said in an interview. He said the addition of pertuzumab did not increase cardiac dysfunction, a worrisome side effect of Herceptin. Genentech and its parent company, Roche, have applied in the last few days for permission to market pertuzumab in the United States and 5) ___. Approval could help Roche recover from the recent decision of the FDA to revoke the approval of another Genentech drug, Avastin, for the treatment of breast 6) ___. Avastin had been approved based on a trial that showed it delayed the worsening of tumors by 5.5 months — almost as big a gain as seen now with pertuzumab. But the use of Avastin did not prolong 7) ___, and subsequent studies found a much smaller improvement in progression-free survival. That could make the FDA reluctant to approve pertuzumab unless it also helps women live longer. But Dr. Sandra J. Horning, head of cancer clinical trials at Genentech, said the pertuzumab results on 8) ___ progression were more trustworthy than the original Avastin results because the trial was conducted more carefully.


Novartis’s drug, everolimus, is a tablet that is already sold under the name Afinitor to treat kidney cancer and some rare tumors. Novartis plans to apply for the tablet’s approval as a breast cancer treatment by the end of the year. The clinical trial involved 724 postmenopausal women with hormone receptor-positive metastatic breast cancer. The women who took both everolimus and a drug called exemestane had a median progression-free survival of 7.4 months compared with 3.2 months for those who took a 9) ___ plus exemestane. Exemestane, also known by the brand name Aromasin, deprives tumors of estrogen, which can fuel their growth. Women in the study already had failed to benefit from other estrogen-depriving drugs. So perhaps it is not surprising that the control arm did not do that well on exemestane alone. “They put it up against a weak opponent,” said Dr. Peter Ravdin, a breast cancer specialist at the University of Texas Health Science Center at San Antonio, who was not involved in the study. But trial investigators said the comparison was valid because in daily practice, doctors often use exemestane when other 10) ___-blocking drugs fail. Everolimus works by inhibiting mTOR, a protein that often spurs tumor growth after tumors become resistant to hormone therapy.


Another drug – entinostat, from privately held Syndax Pharmaceuticals – might also be able to do that, according to the results of a small study presented in San Antonio. Women who received that drug plus exemestane had delayed tumor progression and also lived months longer than those who took exemestane alone. Those results will have to be confirmed in a larger trial, and it will be several years before entinostat can reach the market.


Everolimus costs about $7,000 a month when used for kidney cancer. The drug can have significant, even fatal, side effects like mouth sores, infections and lung inflammation. That could give some doctors pause about adding it to 11) ___ therapy.


Update: This past week, an exhaustive new report, put out by The Institute of Medicine, meant to address public fears about possible links between breast cancer and the environment, found evidence strong enough to make only a few firm recommendations, most already well known and none with a large proven benefit. The most consistent data suggest that women can reduce their risk by avoiding unnecessary medical radiation, forgoing hormone treatments for 12) ___ that combine estrogen and progestin, limiting alcohol intake and minimizing weight gain, the report found. Source: NYTimes, 10 Dec 2011, by Andrew Pollack


ANSWERS: 1) lives; 2) Herceptin; 3) died; 4) survival; 5) Europe; 6) cancer; 7) lives; 8) tumor; 9) placebo; 10) estrogen; 11) hormone; 12) menopause
