It’s been right under our noses for years, and we never noticed it.
By Dan Nosowitz

Hello, Olinguito. (Credit: Mark Gurney)

We don’t discover new mammals very often, let alone new mammals in well-known families like Procyonidae (which includes the raccoons, coatis, and ringtails). So we are very pleased to introduce you all to the newest member of the family. Everyone, say hello to the olinguito (pronounced oh-ling-GHEE-toe).

The olinguito, as its name suggests, closely resembles another member of the raccoon family, the olingo, an arboreal, nocturnal animal that looks more like a combination of a possum and a monkey than a raccoon. In fact, according to Smithsonian, an olinguito lived in American zoos in the 1960s, kept in cages with olingos. “Its keepers were mystified as to why it refused to breed with its peers,” writes Joseph Stromberg at Smithsonian. Now we have proof as to why: a new study published today in the journal ZooKeys establishes, through DNA and anatomical evidence, that the olinguito, Bassaricyon neblina, is a distinct species. No wonder it didn’t mate with an olingo!

The olinguito looks fairly similar to the olingo, but its fur is a totally different color: olingos have dull gray fur, while the olinguito’s is reddish-brown. Kristofer Helgen, curator of mammals at the Smithsonian National Museum of Natural History and author of the paper, stumbled on some olinguito skins while researching the olingo in South America. The olinguito is slightly smaller than the olingo, about 14 inches long and weighing about two pounds, and it eats mostly fruit, supplemented with insects. It lives in the high-altitude forests of Ecuador and Colombia, delightfully called “cloud forests,” and rarely comes down from the trees, adept as it is at leaping around the branches.

Researchers aren’t sure whether the species is at critical risk; Helgen estimates that 42 percent of its possible home territory has been deforested, so he says “there is reason to be concerned.”

Helgen thinks there may be up to four subspecies of olinguito roaming around the forest, which is pretty much unheard-of for a mammal discovery at this time in history. “I honestly think that this could be the last time in history that we will turn up this kind of situation—both a new carnivore, and one that’s widespread enough to have multiple kinds,” he said.

Read more about the olinguito over at Smithsonian.

Source: http://www.popsci.com/science/article/2013-08/new-awesome-mammal-raccoon-family-found-south-america

Deterministic quantum teleportation of a photonic quantum bit: each qubit that flies into the teleporter from the left leaves it on the right with a loss of quality of only around 20 percent, a value not achievable without entanglement. (Credit: Ill./©: University of Tokyo)

By means of the quantum-mechanical entanglement of spatially separated light fields, researchers in Tokyo and Mainz have managed to teleport photonic qubits with extreme reliability. This means that a decisive breakthrough has been achieved some 15 years after the first experiments in the field of optical teleportation. The success of the experiment conducted in Tokyo is attributable to the use of a hybrid technique in which two conceptually different and previously incompatible approaches were combined.

“Discrete digital optical quantum information can now be transmitted continuously — at the touch of a button, if you will,” explained Professor Peter van Loock of Johannes Gutenberg University Mainz (JGU). As a theoretical physicist, van Loock advised the experimental physicists in the research team headed by Professor Akira Furusawa of the University of Tokyo on how they could most efficiently perform the teleportation experiment to ultimately verify the success of quantum teleportation. Their findings have now been published in the journal Nature.

Quantum teleportation involves the transfer of arbitrary quantum states from a sender, dubbed Alice, to a spatially distant receiver, named Bob. This requires that Alice and Bob initially share an entangled quantum state across the space in question, e.g., in the form of entangled photons. Quantum teleportation is of fundamental importance to the processing of quantum information (quantum computing) and quantum communication. Photons are especially valued as ideal information carriers for quantum communication since they can be used to transmit signals at the speed of light. A photon can represent a quantum bit or qubit analogous to a binary digit (bit) in standard classical information processing. Such photons are known as ‘flying quantum bits’.
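
As a brief aside in standard notation (an illustration, not taken from the study itself): a polarization-encoded photonic qubit is a superposition of two basis states,

$$|\psi\rangle = \alpha\,|H\rangle + \beta\,|V\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1,$$

and it is this arbitrary, generally unknown pair of amplitudes $(\alpha, \beta)$ that teleportation must deliver to the receiver intact.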

The first attempts to teleport single photons or light particles were made by the Austrian physicist Anton Zeilinger. Various other related experiments have been performed in the meantime. However, teleportation of photonic quantum bits using conventional methods proved to have its limitations because of experimental deficiencies and difficulties with fundamental principles.

What makes the experiment in Tokyo so different is the use of a hybrid technique. With its help, a completely deterministic and highly reliable quantum teleportation of photonic qubits has been achieved. The accuracy of the transfer was 79 to 82 percent for four different qubits. (For comparison, the best fidelity achievable without entanglement, by measuring an unknown qubit and classically transmitting the result, is two-thirds.) In addition, the qubits were teleported much more efficiently than in previous experiments, even at a low degree of entanglement.

Entanglement ‘on demand’ using squeezed light

The concept of entanglement was first formulated by Erwin Schrödinger. It describes a situation in which two quantum systems, such as two light particles, are in a joint state, so that their behavior is mutually dependent to a greater extent than is normally (classically) possible. In the Tokyo experiment, continuous entanglement was achieved by entangling many photons with many other photons, meaning that the complete amplitudes and phases of two light fields were quantum correlated. Previous experiments had entangled only a single photon with another single photon, a less efficient arrangement.
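
As a minimal sketch in standard quantum-optics notation (textbook formalism, not details drawn from the paper): such continuous-variable entanglement can be modeled as a two-mode squeezed vacuum state with squeezing parameter $r$,

$$|\psi\rangle_{AB} = \sqrt{1-\lambda^{2}}\;\sum_{n=0}^{\infty}\lambda^{n}\,|n\rangle_{A}|n\rangle_{B}, \qquad \lambda = \tanh r,$$

whose joint quadratures satisfy $\mathrm{Var}(\hat{x}_{A}-\hat{x}_{B}) = \mathrm{Var}(\hat{p}_{A}+\hat{p}_{B}) = e^{-2r}$ (with the vacuum variance normalized to 1/2), so the correlations tighten as $r$ grows and become perfect only in the limit of infinite squeezing.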

“The entanglement of photons functioned very well in the Tokyo experiment — practically at the press of a button, as soon as the laser was switched on,” said van Loock, Professor for Theory of Quantum Optics and Quantum Information at Mainz University. This continuous entanglement was accomplished with the aid of so-called ‘squeezed light’, which takes the form of an ellipse in the phase space of the light field. Once entanglement has been achieved, a third light field can be attached to the transmitter. From there, in principle, any state and any number of states can be transmitted to the receiver. “In our experiment, there were precisely four sufficiently representative test states that were transferred from Alice to Bob using entanglement. Thanks to continuous entanglement, it was possible to transmit the photonic qubits in a deterministic fashion to Bob, in other words, in each run,” added van Loock.

Earlier attempts to achieve optical teleportation were performed differently and, before now, the concepts used have proved to be incompatible. Although in theory it had already been assumed that the two different strategies, from the discrete and the continuous world, needed to be combined, it represents a technological breakthrough that this has actually now been experimentally demonstrated with the help of the hybrid technique. “The two separate worlds, the discrete and the continuous, are starting to converge,” concluded van Loock.

Story Source:

The above story is based on materials provided by Universität Mainz.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. Shuntaro Takeda, Takahiro Mizuta, Maria Fuwa, Peter van Loock, Akira Furusawa. Deterministic quantum teleportation of photonic quantum bits by a hybrid technique. Nature, 2013; 500 (7462): 315 DOI: 10.1038/nature12366

Source: http://www.sciencedaily.com/releases/2013/08/130815084411.htm

Human Heart (Credit: Patrick J. Lynch via Wikimedia Commons)

A team of scientists from the University of Pittsburgh School of Medicine has created lab-grown human heart tissue that can beat on its own, according to a new study in Nature Communications.

In 2008, a University of Minnesota study showed that the original cells of a rat heart could be completely flushed out of the heart’s underlying structure in a process called decellularization, then replaced with newborn rat cells to regenerate a working heart. A similar process has now allowed Pitt scientists to grow working human heart tissue within the decellularized structure of a mouse heart.

Using various enzymes and special cleansing detergents, the researchers stripped a mouse heart of all its cells to create a scaffold for induced pluripotent stem cells (iPS cells), adult human cells that are reprogrammed to act like embryonic cells. They treated iPS cells taken from a skin biopsy so that they became multipotential cardiovascular progenitor (MCP) cells, the precursors that can develop into any of the three types of cells found in the heart.

Decellularized Mouse Heart (Credit: Lu et al.)

“Nobody has tried using these MCPs for heart regeneration before,” said Lei Yang, an assistant professor of developmental biology at Pitt. After a few weeks, the human cells had repopulated the mouse heart, and it began beating at a rate of 40 to 50 beats per minute. That’s a little slow, though not by much: a typical resting heart rate for an adult is between 60 and 80 bpm, though anything above 50 bpm is still considered normal.

Watch it beat: (video embedded in the original article)

This could eventually lead to personalized organ transplants, or simply provide a great way to study in the lab how the human heart develops and how it responds to drugs.

Next, Yang wants to try to make just a patch of human heart tissue, which could be used to replace only regions of the heart that have been damaged by something like a heart attack. He told PopularScience.com via email that he hopes to test heart tissue patches in animals within the next few years.

The study came out in the Aug. 13 issue of Nature Communications.

Source: http://www.popsci.com/science/article/2013-08/scientists-engineer-lab-grown-heart-tissue-beats-its-own

Perito Moreno Glacier, Patagonia, Argentina. (Credit: iStockphoto)

Science has struggled to fully explain why an ice age occurs every 100,000 years. As researchers now demonstrate with a computer simulation, variations in insolation are not the only key factor; the mutual influence of glaciated continents and climate plays an important role as well.

Ice ages and warm periods have alternated fairly regularly in Earth’s history: Earth’s climate cools roughly every 100,000 years, with vast areas of North America, Europe and Asia being buried under thick ice sheets. Eventually, the pendulum swings back: it gets warmer and the ice masses melt. While geologists and climate physicists found solid evidence of this 100,000-year cycle in glacial moraines, marine sediments and arctic ice, until now they were unable to find a plausible explanation for it.

Using computer simulations, a Japanese, Swiss and American team including Heinz Blatter, an emeritus professor of physical climatology at ETH Zurich, has now managed to demonstrate that the ice-age/warm-period interchange depends heavily on the alternating influence of continental ice sheets and climate.

“If an entire continent is covered in a layer of ice that is 2,000 to 3,000 metres thick, the topography is completely different,” says Blatter, explaining this feedback effect. “This and the different albedo of glacial ice compared to ice-free earth lead to considerable changes in the surface temperature and the air circulation in the atmosphere.” Moreover, large-scale glaciation also alters the sea level and therefore the ocean currents, which also affects the climate.

Weak effect with a strong impact

As the scientists from Tokyo University, ETH Zurich and Columbia University demonstrated in their paper published in the journal Nature, these feedback effects between Earth and the climate occur on top of other known mechanisms. It has long been clear that the climate is greatly influenced by insolation on long-term time scales. Because Earth’s rotation and its orbit around the sun periodically change slightly, the insolation also varies. If you examine this variation in detail, different overlapping cycles of around 20,000, 40,000 and 100,000 years are recognisable.

Because the 100,000-year insolation cycle is comparatively weak, scientists could not easily explain the prominent 100,000-year cycle of the ice ages from insolation alone. With the aid of the feedback effects, however, this is now possible.

Simulating the ice and climate

The researchers obtained their results from a comprehensive computer model that combines an ice-sheet simulation with an existing climate model, which enabled them to calculate the glaciation of the northern hemisphere for the last 400,000 years. The model takes into account not only the astronomical parameters, the ground topography, and the physical flow properties of glacial ice, but also, crucially, the climate and the feedback effects. “It’s the first time that the glaciation of the entire northern hemisphere has been simulated with a climate model that includes all the major aspects,” says Blatter.

Using the model, the researchers were also able to explain why ice ages always begin slowly and end relatively quickly. The ice-age ice masses accumulate over tens of thousands of years and recede within the space of a few thousand years. Now we know why: it is not only the surface temperature and precipitation that determine whether an ice sheet grows or shrinks. Due to the aforementioned feedback effects, its fate also depends on its size. “The larger the ice sheet, the colder the climate has to be to preserve it,” says Blatter. In the case of smaller continental ice sheets that are still forming, periods with a warmer climate are less likely to melt them. It is a different story with a large ice sheet that stretches into lower geographic latitudes: a comparatively brief warm spell of a few thousand years can be enough to cause an ice sheet to melt and herald the end of an ice age.

The Milankovitch cycles

The explanation for the cyclical alternation of ice and warm periods stems from the Serbian mathematician Milutin Milankovitch (1879-1958), who calculated the changes in Earth’s orbit and the resulting insolation on Earth, becoming the first to describe the cyclical changes in insolation as the result of an overlapping of a whole series of cycles: the tilt of Earth’s axis fluctuates by around two degrees in a 41,000-year cycle. Moreover, Earth’s axis gyrates in a cycle of 26,000 years, much like a spinning top. Finally, Earth’s elliptical orbit around the sun changes in a cycle of around 100,000 years in two respects: on the one hand, it shifts between a more nearly circular and a more strongly elliptical shape; on the other, the major axis of the ellipse rotates within the plane of Earth’s orbit. Together, the gyration of Earth’s axis and the rotation of the orbital ellipse cause the day on which Earth is closest to the sun (perihelion) to migrate through the calendar year in a cycle of around 20,000 years: currently it falls at the beginning of January; in around 10,000 years it will fall at the beginning of July.
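
To make the superposition concrete, here is a small, purely illustrative sketch in Python: the three periods come from the paragraph above, but the relative amplitudes are assumptions chosen only to reflect the point, made earlier, that the 100,000-year component of insolation is comparatively weak.

    import numpy as np

    # Periods (in years) of the overlapping cycles described above.
    PRECESSION, OBLIQUITY, ECCENTRICITY = 20_000, 41_000, 100_000

    # Relative amplitudes are illustrative assumptions, not measured values;
    # the ~100,000-year eccentricity term is deliberately the weakest.
    t = np.linspace(0, 400_000, 4001)  # 400,000 years, the span the model simulated
    insolation = (
        1.0 * np.sin(2 * np.pi * t / PRECESSION)
        + 0.6 * np.sin(2 * np.pi * t / OBLIQUITY)
        + 0.1 * np.sin(2 * np.pi * t / ECCENTRICITY)
    )

In a sum like this, the 100,000-year term is barely visible; the puzzle is that glacial records are nonetheless dominated by a 100,000-year rhythm, and the ice-sheet/climate feedbacks described above are what the study invokes to amplify it.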

Based on his calculations, in 1941 Milankovitch postulated that summer insolation at sixty-five degrees north determines the ice and warm periods, a theory the scientific community rejected during his lifetime. From the 1970s, however, it gradually became clear that his theory essentially coincides with the climate archives in marine sediments and ice cores. Nowadays, Milankovitch’s theory is widely accepted. “Milankovitch’s idea that insolation determines the ice ages was right in principle,” says Blatter. “However, science soon recognised that additional feedback effects in the climate system were necessary to explain ice ages. We are now able to name and identify these effects accurately.”

Story Source:

The above story is based on materials provided by ETH Zurich. The original article was written by Fabio Bergamin.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.

Journal Reference:

  1. Ayako Abe-Ouchi, Fuyuki Saito, Kenji Kawamura, Maureen E. Raymo, Jun’ichi Okuno, Kunio Takahashi, Heinz Blatter. Insolation-driven 100,000-year glacial cycles and hysteresis of ice-sheet volume. Nature, 2013; 500 (7461): 190 DOI: 10.1038/nature12374

Can “Brain Freeze” Cause Long-Term Brain Damage?
And is it already too late?!
By Bjorn Carey

Brain Freeze (Credit: Getty Images)

We’ve all sucked down a milkshake so quickly that it causes a sudden headache—the dreaded brain freeze. But . . . milkshake. Tasty. Must. Drink. Could chugging the rest of that shake cause lasting brain damage?

First, let’s get one thing straight. “This condition is referred to as an ‘ice-cream headache,’ ” says Stacey Gray, a sinus surgeon at the Massachusetts Eye and Ear Infirmary in Boston. “It’s a very technical term.” Although there’s no published paper saying as much, a milkshake slurped too quickly probably does not actually lower brain temperature. Besides, Gray says, the temporary pain can’t do any harm because it has nothing to do with the brain.

There are two schools of thought on what causes the ice-cream headache. The drink may chill the air in your sinuses and cause the blood vessels in the nasal cavity near your forehead to constrict, creating pain similar to a migraine. Or perhaps it touches off a branch of the trigeminal nerve in your mouth, triggering a pain response in the nerve that’s responsible for facial sensation.

The condition has not drawn research funding from the National Institutes of Health, so no one has performed the simple experiment that Gray says would settle things once and for all. “You could block the nerve with an injection of lidocaine, cool the area, and if it still happens it’s probably a circulatory-system thing,” she says. “But no one seems that interested.”

Even if a cold drink were able to temporarily chill your brain a few degrees, it wouldn’t be a big deal. Neurosurgeons, including Johns Hopkins Hospital’s Rafael Tamargo, often take the brain from its cozy resting range of between 98.6°F and 100.4°F all the way down to 64°F. “There are situations, particularly for correcting blood-vessel problems like an aneurysm, where we cool the brain in order to stop circulation to an area to perform our work safely,” he says. When the brain is chilled to 68°F, its metabolism and electrical activity drop to 15 percent of their normal levels; surgeons reduce it to 64°F for good measure.

“Even if the patient wasn’t anesthetized, at that temperature they would be in a noninteractive state, unable to sense stimuli or produce a response,” Tamargo says. “But once you warm the brain up, it picks right up from where it left off. It’s not harmful at all.” So whether your brain is frozen or not, if you can handle a little pain, slurp away.

This article originally appeared in the October 2010 issue of Popular Science magazine.

Source: http://www.popsci.com/science/article/2010-09/q-can-%E2%80%9Cbrain-freeze%E2%80%9D-cause-long-term-brain-damage

SLOW IDEAS

Atul Gawande, MD

Some innovations spread fast. How do you speed the ones that don’t?

BY ATUL GAWANDE MD, JULY 29, 2013

Why do some innovations spread so swiftly and others so slowly? Consider the very different trajectories of surgical anesthesia and antiseptics, both of which were discovered in the nineteenth century. The first public demonstration of anesthesia was in 1846. The Boston surgeon Henry Jacob Bigelow was approached by a local dentist named William Morton, who insisted that he had found a gas that could render patients insensible to the pain of surgery. That was a dramatic claim. In those days, even a minor tooth extraction was excruciating. Without effective pain control, surgeons learned to work with slashing speed. Attendants pinned patients down as they screamed and thrashed, until they fainted from the agony. Nothing ever tried had made much difference. Nonetheless, Bigelow agreed to let Morton demonstrate his claim.

On October 16, 1846, at Massachusetts General Hospital, Morton administered his gas through an inhaler in the mouth of a young man undergoing the excision of a tumor in his jaw. The patient only muttered to himself in a semi-conscious state during the procedure. The following day, the gas left a woman, undergoing surgery to cut a large tumor from her upper arm, completely silent and motionless. When she woke, she said she had experienced nothing at all.

Four weeks later, on November 18th, Bigelow published his report on the discovery of “insensibility produced by inhalation” in the Boston Medical and Surgical Journal. Morton would not divulge the composition of the gas, which he called Letheon, because he had applied for a patent. But Bigelow reported that he smelled ether in it (ether was used as an ingredient in certain medical preparations), and that seems to have been enough. The idea spread like a contagion, travelling through letters, meetings, and periodicals. By mid-December, surgeons were administering ether to patients in Paris and London. By February, anesthesia had been used in almost all the capitals of Europe, and by June in most regions of the world.

There were forces of resistance, to be sure. Some people criticized anesthesia as a “needless luxury”; clergymen deplored its use to reduce pain during childbirth as a frustration of the Almighty’s designs. James Miller, a nineteenth-century Scottish surgeon who chronicled the advent of anesthesia, observed the opposition of elderly surgeons: “They closed their ears, shut their eyes, and folded their hands. . . . They had quite made up their minds that pain was a necessary evil, and must be endured.” Yet soon even the obstructors, “with a run, mounted behind—hurrahing and shouting with the best.” Within seven years, virtually every hospital in America and Britain had adopted the new discovery.

Sepsis—infection—was the other great scourge of surgery. It was the single biggest killer of surgical patients, claiming as many as half of those who underwent major operations, such as a repair of an open fracture or the amputation of a limb. Infection was so prevalent that suppuration—the discharge of pus from a surgical wound—was thought to be a necessary part of healing.

In the eighteen-sixties, the Edinburgh surgeon Joseph Lister read a paper by Louis Pasteur laying out his evidence that spoiling and fermentation were the consequence of microorganisms. Lister became convinced that the same process accounted for wound sepsis. Pasteur had observed that, besides filtration and the application of heat, exposure to certain chemicals could eliminate germs. Lister had read about the city of Carlisle’s success in using a small amount of carbolic acid to eliminate the odor of sewage, and reasoned that it was destroying germs. Maybe it could do the same in surgery.

During the next few years, he perfected ways to use carbolic acid for cleansing hands and wounds and destroying any germs that might enter the operating field. The result was strikingly lower rates of sepsis and death. You would have thought that, when he published his observations in a groundbreaking series of reports in The Lancet, in 1867, his antiseptic method would have spread as rapidly as anesthesia.

Far from it. The surgeon J. M. T. Finney recalled that, when he was a trainee at Massachusetts General Hospital two decades later, hand washing was still perfunctory. Surgeons soaked their instruments in carbolic acid, but they continued to operate in black frock coats stiffened with the blood and viscera of previous operations—the badge of a busy practice. Instead of using fresh gauze as sponges, they reused sea sponges without sterilizing them. It was a generation before Lister’s recommendations became routine and the next steps were taken toward the modern standard of asepsis—that is, entirely excluding germs from the surgical field, using heat-sterilized instruments and surgical teams clad in sterile gowns and gloves.

In our era of electronic communications, we’ve come to expect that important innovations will spread quickly. Plenty do: think of in-vitro fertilization, genomics, and communications technologies themselves. But there’s an equally long list of vital innovations that have failed to catch on. The puzzle is why.

Did the spread of anesthesia and antisepsis differ for economic reasons? Actually, the incentives for both ran in the right direction. If painless surgery attracted paying patients, so would a noticeably lower death rate. Besides, live patients were more likely to make good on their surgery bill. Maybe ideas that violate prior beliefs are harder to embrace. To nineteenth-century surgeons, germ theory seemed as illogical as, say, Darwin’s theory that human beings evolved from primates. Then again, so did the idea that you could inhale a gas and enter a pain-free state of suspended animation. Proponents of anesthesia overcame belief by encouraging surgeons to try ether on a patient and witness the results for themselves—to take a test drive. When Lister tried this strategy, however, he made little progress.

The technical complexity might have been part of the difficulty. Giving Lister’s methods “a try” required painstaking attention to detail. Surgeons had to be scrupulous about soaking their hands, their instruments, and even their catgut sutures in antiseptic solution. Lister also set up a device that continuously sprayed a mist of antiseptic over the surgical field.

But anesthesia was no easier. Obtaining ether and constructing the inhaler could be difficult. You had to make sure that the device delivered an adequate dosage, and the mechanism required constant tinkering. Yet most surgeons stuck with it—or else they switched to chloroform, which was found to be an even more powerful anesthetic, but posed its own problems. (An imprecise dosage killed people.) Faced with the complexities, they didn’t give up; instead, they formed an entire new medical specialty—anesthesiology.

So what were the key differences? First, one combatted a visible and immediate problem (pain); the other combatted an invisible problem (germs) whose effects wouldn’t be manifest until well after the operation. Second, although both made life better for patients, only one made life better for doctors. Anesthesia changed surgery from a brutal, time-pressured assault on a shrieking patient to a quiet, considered procedure. Listerism, by contrast, required the operator to work in a shower of carbolic acid. Even low dilutions burned the surgeons’ hands. You can imagine why Lister’s crusade might have been a tough sell.

This has been the pattern of many important but stalled ideas. They attack problems that are big but, to most people, invisible; and making them work can be tedious, if not outright painful. The global destruction wrought by a warming climate, the health damage from our over-sugared modern diet, the economic and social disaster of our trillion dollars in unpaid student debt—these things worsen imperceptibly every day. Meanwhile, the carbolic-acid remedies to them, all requiring individual sacrifice of one kind or another, struggle to get anywhere.

The global problem of death in childbirth is a pressing example. Every year, three hundred thousand mothers and more than six million children die around the time of birth, largely in poorer countries. Most of these deaths are due to events that occur during or shortly after delivery. A mother may hemorrhage. She or her baby may suffer an infection. Many babies can’t take their first breath without assistance, and newborns, especially those born small, have trouble regulating their body temperature after birth. Simple, lifesaving solutions have been known for decades. They just haven’t spread.

Many solutions aren’t ones you can try at home, and that’s part of the problem. Increasingly, however, women around the world are giving birth in hospitals. In India, a government program offers mothers up to fourteen hundred rupees—more than what most Indians live on for a month—when they deliver in a hospital, and now, in many areas, the majority of births are in facilities. Death rates in India have fallen, but they’re still ten times greater than in high-income countries like our own.

Not long ago, I visited a few community hospitals in north India, where just one-third of mothers received the medication recommended to prevent hemorrhage; less than ten per cent of the newborns were given adequate warming; and only four per cent of birth attendants washed their hands for vaginal examination and delivery. In an average childbirth, clinicians followed only about ten of twenty-nine basic recommended practices.

Here we are in the first part of the twenty-first century, and we’re still trying to figure out how to get ideas from the first part of the twentieth century to take root. In the hopes of spreading safer childbirth practices, several colleagues and I have teamed up with the Indian government, the World Health Organization, the Gates Foundation, and Population Services International to create something called the BetterBirth Project. We’re working in Uttar Pradesh, which is among India’s poorest states. One afternoon in January, our team travelled a couple of hours from the state’s capital, Lucknow, with its bleating cars and ramshackle shops, to a rural hospital surrounded by lush farmland and thatched-hut villages. Although the sun was high and the sky was clear, the temperature was near freezing. The hospital was a one-story concrete building painted goldenrod yellow. (Our research agreement required that I keep it unnamed.) The entrance is on a dirt road lined with rows of motorbikes, the primary means of long-distance transportation. If an ambulance or an auto-rickshaw can’t be found, women in labor sit sidesaddle on the back of a bike.

The hospital delivers three thousand newborns a year, a typical volume in India but one that would put it in the top fifth of American hospitals. Yet it had few of the amenities that you’d associate with a modern hospital. I met the physician in charge, a smart and capable internist in his early thirties who had trained in the capital. He was clean-shaven and buzz-cut, with an Argyle sweater, track shoes, and a habitual half smile. He told me, apologetically, that the hospital staff had no ability to do blood tests, to give blood transfusions, or to perform emergency obstetrics procedures such as Cesarean sections. There was no electricity during the day. There was certainly no heating, even though the temperature was barely forty degrees that day, and no air-conditioning, even though summer temperatures routinely reach a hundred degrees. There were two blood-pressure cuffs for the entire facility. The nurse’s office in my neighborhood elementary school was better equipped.

The hospital was severely understaffed, too. The doctor said that half of the staff positions were vacant. To help with child deliveries for a local population of a quarter of a million people, the hospital had two nurses and one obstetrician, who happened to be his wife. The nurses, who had six months of childbirth training, did most of the deliveries, swapping shifts year-round. The obstetrician covered the outpatient clinic, and helped with complicated births whenever she was required, day or night. During holidays or sickness, the two nurses covered for each other, but, if no one was available, laboring women were either sent to another hospital, miles away, or an untrained assistant might be forced to step in.

It may be surprising that mothers are better off delivering in such places than at home in a village, but studies show a consistently higher survival rate when they do. The staff members I met in India had impressive experience. Even the youngest nurses had done more than a thousand child deliveries. They’ve seen and learned to deal with countless problems—a torn placenta, an umbilical cord wrapped around a baby’s neck, a stuck shoulder. Seeing the daily heroism required to keep such places going, you feel foolish and ill-mannered asking how they could do things better.

But then we hung out in the wards for a while. In the delivery room, a boy had just been born. He and his mother were lying on a cot, bundled under woollen blankets, resting. The room was coffin-cold; I was having trouble feeling my toes. I tried to imagine what that baby must have felt like. Newborns have a high body-surface area and lose heat rapidly. Even in warm weather, hypothermia is common, and it makes newborns weak and less responsive, less able to breast-feed adequately and more prone to infection. I noticed that the boy was swaddled separately from his mother. Voluminous evidence shows that it is far better to place the child on the mother’s chest or belly, skin to skin, so that the mother’s body can regulate the baby’s until it is ready to take over. Among small or premature babies, kangaroo care (as it is known) cuts mortality rates by a third.

So why hadn’t the nurse swaddled the two together? She was a skilled and self-assured woman in her mid-thirties with twinkly eyes, a brown knit hat, and a wool sweater over her shalwar kameez. Resources clearly weren’t the issue—kangaroo care costs nothing. Had she heard of it? Oh, yes, she said. She’d taken a skilled-birth-attendant class that taught it. Had she forgotten about it? No. She had actually offered to put the baby skin to skin with the mother, and showed me where she’d noted this in the record.

“The mother didn’t want it,” she explained. “She said she was too cold.”

The nurse seemed to think it was strange that I was making such an issue of this. The baby was fine, wasn’t he? And he was. He was sleeping sweetly, a tightly wrapped peanut with a scrunched brown face and his mouth in a lowercase “o.”

But had his temperature been taken? It had not. The nurse said that she had been planning to do so. Our visit had disrupted her routine. Suppose she had, though, and his temperature was low. Would she have done anything differently? Would she have made the mom unswaddle the child and put him to her chest?

Everything about the life the nurse leads—the hours she puts in, the circumstances she endures, the satisfaction she takes in her abilities—shows that she cares. But hypothermia, like the germs that Lister wanted surgeons to battle, is invisible to her. We picture a blue child, suffering right before our eyes. That is not what hypothermia looks like. It is a child who is just a few degrees too cold, too sluggish, too slow to feed. It will be some time before the baby begins to lose weight, stops making urine, develops pneumonia or a bloodstream infection. Long before that happens—usually the morning after the delivery, perhaps the same night—the mother will have hobbled to an auto-rickshaw, propped herself beside her husband, held her new baby tight, and ridden the rutted roads home.

From the nurse’s point of view, she’d helped bring another life into the world. If four per cent of the newborns later died at home, what could that possibly have to do with how she wrapped the mother and child? Or whether she washed her hands before putting on gloves? Or whether the blade with which she cut the umbilical cord was sterilized?

We’re infatuated with the prospect of technological solutions to these problems—baby warmers, say. You can still find high-tech incubators in rural hospitals that sit mothballed because a replacement part wasn’t available, or because there was no electricity for them. In recent years, though, engineers have produced designs specifically for the developing world. Dr. Steven Ringer, a neonatologist and BetterBirth leader, was an adviser for a team that made a cheap, ingenious, award-winning incubator from old car parts that are commonly available and easily replaced in low-income environments. Yet it hasn’t taken off, either. “It’s in more museums than delivery rooms,” he laments.

As with most difficulties in global health care, lack of adequate technology is not the biggest problem. We already have a great warming technology: a mother’s skin. But even in high-income countries we do not consistently use it. In the United States, according to Ringer, more than half of newborns needing intensive care arrive hypothermic. Preventing hypothermia is a perfect example of an unsexy task: it demands painstaking effort without immediate reward. Getting hospitals and birth attendants to carry out even a few of the tasks required for safer childbirth would save hundreds of thousands of lives. But how do we do that?

The most common approach to changing behavior is to say to people, “Please do X.” Please warm the newborn. Please wash your hands. Please follow through on the twenty-seven other childbirth practices that you’re not doing. This is what we say in the classroom, in instructional videos, and in public-service campaigns, and it works, but only up to a point.

Then, there’s the law-and-order approach: “You must do X.” We establish standards and regulations, and threaten to punish failures with fines, suspensions, the revocation of licenses. Punishment can work. Behavioral economists have even quantified how averse people are to penalties. In experimental games, they will often quit playing rather than risk facing negative consequences. And that is the problem with threatening to discipline birth attendants who are taking difficult-to-fill jobs under intensely trying conditions. They’ll quit.

The kinder version of “You must do X” is to offer incentives rather than penalties. Maybe we could pay birth attendants a bonus for every healthy child who makes it past a week of life. But then you think about how hard it would be to make a scheme like that work, especially in poor settings. You’d need a sophisticated tracking procedure, to make sure that people aren’t gaming the system, and complex statistical calculations, to take prior risks into account. There’s also the impossible question of how you split the reward among all the people involved. How much should the community health worker who provided the prenatal care get? The birth attendant who handled the first twelve hours of labor? The one who came on duty and handled the delivery? The doctor who was called in when things got complicated? The pharmacist who stocked the antibiotic that the child required?

Besides, neither penalties nor incentives achieve what we’re really after: a system and a culture where X is what people do, day in and day out, even when no one is watching. “You must” rewards mere compliance. Getting to “X is what we do” means establishing X as the norm. And that’s what we want: for skin-to-skin warming, hand washing, and all the other lifesaving practices of childbirth to be, quite simply, the norm.

To create new norms, you have to understand people’s existing norms and barriers to change. You have to understand what’s getting in their way. So what about just working with health-care workers, one by one, to do just that? With the BetterBirth Project, we wondered, in particular, what would happen if we hired a cadre of childbirth-improvement workers to visit birth attendants and hospital leaders, show them why and how to follow a checklist of essential practices, understand their difficulties and objections, and help them practice doing things differently. In essence, we’d give them mentors.

The experiment is just getting under way. The project has recruited only the first few of a hundred or so workers whom we are sending out to hospitals across six regions of Uttar Pradesh in a trial that will involve almost two hundred thousand births over two years. There’s no certainty that our approach will succeed. But it seemed worth trying.

Reactions that I’ve heard both abroad and at home have been interestingly divided. The most common objection is that, even if it works, this kind of one-on-one, on-site mentoring “isn’t scalable.” But that’s one thing it surely is. If the intervention saves as many mothers and newborns as we’re hoping—about a thousand lives in the course of a year at the target hospitals—then all that need be done is to hire and develop similar cadres of childbirth-improvement workers for other places around the country and potentially the world. To many people, that doesn’t sound like much of a solution. It would require broad mobilization, substantial expense, and perhaps even the development of a new profession. But, to combat the many antisepsis-like problems in the world, that’s exactly what has worked. Think about the creation of anesthesiology: it meant doubling the number of doctors in every operation, and we went ahead and did so. To reduce illiteracy, countries, starting with our own, built schools, trained professional teachers, and made education free and compulsory for all children. To improve farming, governments have sent hundreds of thousands of agriculture extension agents to visit farmers across America and every corner of the world and teach them up-to-date methods for increasing their crop yields. Such programs have been extraordinarily effective. They have cut the global illiteracy rate from one in three adults in 1970 to one in six today, and helped give us a Green Revolution that saved more than a billion people from starvation.

In the era of the iPhone, Facebook, and Twitter, we’ve become enamored of ideas that spread as effortlessly as ether. We want frictionless, “turnkey” solutions to the major difficulties of the world—hunger, disease, poverty. We prefer instructional videos to teachers, drones to troops, incentives to institutions. People and institutions can feel messy and anachronistic. They introduce, as the engineers put it, uncontrolled variability.

But technology and incentive programs are not enough. “Diffusion is essentially a social process through which people talking to people spread an innovation,” wrote Everett Rogers, the great scholar of how new ideas are communicated and spread. Mass media can introduce a new idea to people. But, Rogers showed, people follow the lead of other people they know and trust when they decide whether to take it up. Every change requires effort, and the decision to make that effort is a social process.

This is something that salespeople understand well. I once asked a pharmaceutical rep how he persuaded doctors—who are notoriously stubborn—to adopt a new medicine. Evidence is not remotely enough, he said, however strong a case you may have. You must also apply “the rule of seven touches.” Personally “touch” the doctors seven times, and they will come to know you; if they know you, they might trust you; and, if they trust you, they will change. That’s why he stocked doctors’ closets with free drug samples in person. Then he could poke his head around the corner and ask, “So how did your daughter Debbie’s soccer game go?” Eventually, this can become “Have you seen this study on our new drug? How about giving it a try?” As the rep had recognized, human interaction is the key force in overcoming resistance and speeding change.

In 1968, The Lancet published the results of a modest trial of what is now regarded as among the most important medical advances of the twentieth century. It wasn’t a new drug or vaccine or operation. It was basically a solution of sugar, salt, and water that you could make in your kitchen. The researchers gave the solution to victims of a cholera outbreak in Dhaka, the capital of what is now Bangladesh, and the results were striking.

Cholera is a violent and deadly diarrheal illness, caused by the bacterium Vibrio cholerae, which the victim usually ingests from contaminated water. The bacteria secrete a toxin that triggers a rapid outpouring of fluid into the intestine. The body, which is sixty per cent water, becomes like a sponge being wrung out. The fluid pouring out is a cloudy white, likened to the runoff of washed rice. It produces projectile vomiting and explosive diarrhea. Children can lose a third of their body’s water in less than twenty-four hours, a fatal volume. Drinking water to replace the fluid loss is ineffective, because the intestine won’t absorb it. As a result, mortality commonly reached seventy per cent or higher. During the nineteenth century, cholera pandemics killed millions across Asia, Europe, Africa, and North America. The disease was dubbed the Blue Death because of the cyanotic blue-gray color of the skin from extreme dehydration.
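
To put “a third of the body’s water” into rough numbers (illustrative arithmetic, not figures from the article): a ten-kilogram child who is sixty per cent water carries about $0.6 \times 10\,\mathrm{kg} \approx 6$ litres of fluid, so losing a third of it means roughly $2$ litres gone in under a day, which is why the replacement volumes described below run to many litres.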

In 1906, a partially effective treatment was found: intravenous fluid solutions reduced mortality to thirty per cent. Prevention was the most effective approach. Modern sewage and water treatment eliminated the disease in affluent countries. Globally, though, millions of children continued to die from diarrheal illness each year. Even if victims made it to a medical facility, the needles, plastic tubing, and litres of intravenous fluid required for treatment were expensive, in short supply, and dependent on medical workers who were themselves in short supply, especially in outbreaks that often produced thousands of victims.

Then, in the nineteen-sixties, scientists discovered that sugar helps the gut absorb fluid. Two American researchers, David Nalin and Richard Cash, were in Dhaka during a cholera outbreak. They decided to test the scientific findings, giving victims an oral rehydration solution containing sugar as well as salt. Many people doubted that victims could drink enough of it to restore their fluid losses, typically ten to twenty litres a day. So the researchers confined the Dhaka trial to twenty-nine patients. The subjects proved to have no trouble drinking enough to reduce or even eliminate the need for intravenous fluids, and none of them died.

Three years later, in 1971, an Indian physician named Dilip Mahalanabis was directing medical assistance at a West Bengal camp of three hundred and fifty thousand refugees from Bangladesh’s war of independence when cholera struck. Intravenous-fluid supplies ran out. Mahalanabis instructed his team to try the Dhaka solution. Just 3.6 per cent died, an unprecedented reduction from the usual thirty per cent. The solution was actually better than intravenous fluids. If cholera victims were alert, able to drink, and supplied with enough of it, they could almost always save their own lives.

One might have expected people to clamor for the recipe after these results were publicized. Oral rehydration solution seems like ether: a miraculous fix for a vivid, immediate, and terrifying problem. But it wasn’t like ether at all.

To understand why, you have to imagine having a child throwing up and pouring out diarrhea like you’ve never seen before. Making her drink seems only to provoke more vomiting. Chasing the emesis and the diarrhea seems both torturous and futile. Many people’s natural inclination is to not feed the child anything.

Furthermore, why believe that this particular mixture of sugar and salt would be any different from water or anything else you might have tried? And it is particular. Throw the salt concentration off by a couple of teaspoons and the electrolyte imbalance could be dangerous. The child must also keep drinking the stuff even after she feels better, for as long as the diarrhea lasts, which is up to five days. Nurses routinely got these steps wrong. Why would villagers do any better?

A decade after the landmark findings, the idea remained stalled. Nothing much had changed. Diarrheal disease remained the world’s biggest killer of children under the age of five.

In 1980, however, a Bangladeshi nonprofit organization called BRAC decided to try to get oral rehydration therapy adopted nationwide. The campaign required reaching a mostly illiterate population. The most recent public-health campaign—to teach family planning—had been deeply unpopular. The messages the campaign needed to spread were complicated.

Nonetheless, the campaign proved remarkably successful. A gem of a book published in Bangladesh, “A Simple Solution,” tells the story. The organization didn’t launch a mass-media campaign—only twenty per cent of the population had a radio, after all. It attacked the problem in a way that is routinely dismissed as impractical and inefficient: by going door to door, person by person, and just talking.

It started with a pilot project that set out to reach some sixty thousand women in six hundred villages. The logistics were daunting. Who, for instance, would do the teaching? How were those workers going to travel? How was their security to be assured? The BRAC leaders planned the best they could and then made adjustments on the fly.

They recruited teams of fourteen young women, a cook, and a male supervisor, figuring that the supervisor would protect them from others as they travelled, and the women’s numbers would protect them from the supervisor. They travelled on foot, pitched camp near each village, fanned out door to door, and stayed until they had talked to women in every hut. They worked long days, six days a week. Each night after dinner, they held a meeting to discuss what went well and what didn’t and to share ideas on how to do better. Leaders periodically debriefed them, as well.

The workers were only semi-literate, but they helped distill their sales script into seven easy-to-remember messages: for instance, severe diarrhea leads to death from dehydration; the signs of dehydration include dry tongue, sunken eyes, thirst, severe weakness, and reduced urination; the way to treat dehydration is to replace salt and water lost from the body, starting with the very first loose stool; a rehydration solution provides the most effective way to do this. BRAC’s scientists had to figure out how the workers could teach the recipe for the solution. Villagers had no precise measuring implements—spoons were locally made in nonstandard sizes. The leaders considered issuing special measuring spoons with the recipe on the handle. But these would be costly; most people couldn’t read the recipe; and how were the spoons to be replaced when lost? Eventually, the team hit upon using finger measures: a fistful of raw sugar plus a three-finger pinch of salt mixed in half a “seer” of water—a pint measure commonly used by villagers when buying milk and oil. Tests showed that mothers could make this with sufficient accuracy.

Initially, the workers taught up to twenty mothers per day. But monitors visiting the villages a few weeks later found that the quality of teaching suffered on this larger scale, so the workers were restricted to ten households a day. Then a new salary system was devised to pay each worker according to how many of the messages the mothers retained when the monitor followed up. The quality of teaching improved substantially. The field workers soon realized that having the mothers make the solution themselves was more effective than just showing them. The workers began looking for diarrhea cases when they arrived in a village, and treating them to show how effective and safe the remedy was. The scientists also investigated various questions that came up, such as whether clean water was required. (They found that, although boiled water was preferable, contaminated water was better than nothing.)

Early signs were promising. Mothers seemed to retain the key messages. Analysis of their sugar solutions showed that three-quarters made them properly, and just four in a thousand had potentially unsafe salt levels. So BRAC and the Bangladeshi government took the program nationwide. They hired, trained, and deployed thousands of workers region by region. The effort was, inevitably, imperfect. But, by going door to door through more than seventy-five thousand villages, they showed twelve million families how to save their children.

The program was stunningly successful. Use of oral rehydration therapy skyrocketed. The knowledge became self-propagating. The program had changed the norms.

Coaxing villagers to make the solution with their own hands and explain the messages in their own words, while a trainer observed and guided them, achieved far more than any public-service ad or instructional video could have done. Over time, the changes could be sustained with television and radio, and the growth of demand led to the development of a robust market for manufactured oral rehydration salt packets. Three decades later, national surveys have found that almost ninety per cent of children with severe diarrhea were given the solution. Child deaths from diarrhea plummeted more than eighty per cent between 1980 and 2005.

As other countries adopted Bangladesh’s approach, global diarrheal deaths dropped from five million a year to two million, despite a fifty-per-cent increase in the world’s population during the past three decades. Nonetheless, only a third of children in the developing world receive oral rehydration therapy. Many countries tried to implement the approach at arm’s length, going “low touch,” without sandals on the ground. As a recent study by the Gates Foundation and the University of Washington has documented, those countries have failed almost entirely. People talking to people is still how the world’s standards change.

Surgeons finally did upgrade their antiseptic standards at the end of the nineteenth century. But, as is often the case with new ideas, the effort required deeper changes than anyone had anticipated. In their blood-slick, viscera-encrusted black coats, surgeons had seen themselves as warriors doing hemorrhagic battle with little more than their bare hands. A few pioneering Germans, however, seized on the idea of the surgeon as scientist. They traded in their black coats for pristine laboratory whites, refashioned their operating rooms to achieve the exacting sterility of a bacteriological lab, and embraced anatomic precision over speed.

The key message to teach surgeons, it turned out, was not how to stop germs but how to think like a laboratory scientist. Young physicians from America and elsewhere who went to Germany to study with its surgical luminaries became fervent converts to their thinking and their standards. They returned as apostles not only for the use of antiseptic practice (to kill germs) but also for the much more exacting demands of aseptic practice (to prevent germs), such as wearing sterile gloves, gowns, hats, and masks. Proselytizing through their own students and colleagues, they finally spread the ideas worldwide.

In childbirth, we have only begun to accept that the critical practices aren’t going to spread themselves. Simple “awareness” isn’t going to solve anything. We need our sales force and our seven easy-to-remember messages. And in many places around the world the concerted, person-by-person effort of changing norms is under way.

I recently asked BetterBirth workers in India whether they’d yet seen a birth attendant change what she does. Yes, they said, but they’ve found that it takes a while. They begin by providing a day of classroom training for birth attendants and hospital leaders in the checklist of practices to be followed. Then they visit them on site to observe as they try to apply the lessons.

Sister Seema Yadav, a twenty-four-year-old, round-faced nurse three years out of school, was one of the trainers. (Nurses are called “sisters” in India, a carryover from the British usage.) Her first assignment was to follow a thirty-year-old nurse with vastly more experience than she had. Watching the nurse take a woman through labor and delivery, she saw how little of the training had been absorbed. The room had not been disinfected; blood from a previous birth remained in a bucket. When the woman came in—moaning, contractions speeding up—the nurse didn’t check her vital signs. She didn’t wash her hands. She prepared no emergency supplies. After delivery, she checked the newborn’s temperature with her hand, not a thermometer. Instead of warming the baby against the mother’s skin, she handed the newborn to the relatives.

When Sister Seema pointed out the discrepancy between the teaching and the practice, the nurse was put out. She gave many reasons that steps were missed—there was no time, they were swamped with deliveries, there was seldom a thermometer at hand, the cleaners never did their job. Sister Seema—a cheerful, bubbly, fast talker—took her to the cleaner on duty and together they explained why cleaning the rooms between deliveries was so important. They went to the medical officer in charge and asked for a thermometer to be supplied. At her second and third visits, disinfection seemed more consistent. A thermometer had been found in a storage closet. But the nurse still hadn’t changed much of her own routine.

By the fourth or fifth visit, their conversations had shifted. They shared cups of chai and began talking about why you must wash hands even if you wear gloves (because of holes in the gloves and the tendency to touch equipment without them on), and why checking blood pressure matters (because hypertension is a sign of eclampsia, which, when untreated, is a common cause of death among pregnant women). They learned a bit about each other, too. Both turned out to have one child—Sister Seema a four-year-old boy, the nurse an eight-year-old girl. The nurse lived in the capital, a two-hour bus ride away. She was divorced, living with her mother, and struggled with the commute. She’d been frustrated not to find a hospital posting in the city. She worked for days at a stretch, sleeping on a cot when she got a break. Sister Seema commiserated, and shared her own hopes for her family and her future. With time, it became clearer to the nurse that Sister Seema was there only to help and to learn from the experience herself. They even exchanged mobile-phone numbers and spoke between visits. When Sister Seema didn’t have the answer to a question, she made sure she got one.

Soon, she said, the nurse began to change. After several visits, she was taking temperatures and blood pressures properly, washing her hands, giving the necessary medications—almost everything. Sister Seema saw it with her own eyes.

She’d had to move on to another pilot site after that, however. And although the project is tracking the outcomes of mothers and newborns, it will be a while before we have enough numbers to know if a difference has been made. So I got the nurse’s phone number and, with a translator to help with the Hindi, I gave her a call.

It had been four months since Sister Seema’s visit ended. I asked her whether she’d made any changes. Lots, she said.

“What was the most difficult one?” I asked.

“Washing hands,” she said. “I have to do it so many times!”

“What was the easiest?”

“Taking the vital signs properly.” Before, she said, “we did it haphazardly.” Afterward, “everything became much more systematic.”

She said that she had eventually begun to see the effects. Bleeding after delivery was reduced. She recognized problems earlier. She rescued a baby who wasn’t breathing. She diagnosed eclampsia in a mother and treated it. You could hear her pride as she told her stories.

Many of the changes took practice for her, she said. She had to learn, for instance, how to have all the critical supplies—blood-pressure cuff, thermometer, soap, clean gloves, baby respiratory mask, medications—lined up and ready for when she needed them; how to fit the use of them into her routine; how to convince mothers and their relatives that the best thing for a child was to be bundled against the mother’s skin. But, step by step, Sister Seema had helped her to do it. “She showed me how to get things done practically,” the nurse said.

“Why did you listen to her?” I asked. “She had only a fraction of your experience.”

In the beginning, she didn’t, the nurse admitted. “The first day she came, I felt the workload on my head was increasing.” From the second time, however, the nurse began feeling better about the visits. She even began looking forward to them.

“Why?” I asked.

All the nurse could think to say was “She was nice.”

“She was nice?”

“She smiled a lot.”

“That was it?”

“It wasn’t like talking to someone who was trying to find mistakes,” she said. “It was like talking to a friend.”

That, I think, was the answer. Since then, the nurse had developed her own way of explaining why newborns needed to be warmed skin to skin. She said that she now tells families, “Inside the uterus, the baby is very warm. So when the baby comes out it should be kept very warm. The mother’s skin does this.”

I hadn’t been sure if she was just telling me what I wanted to hear. But when I heard her explain how she’d put her own words to what she’d learned, I knew that the ideas had spread. “Do the families listen?” I asked.

“Sometimes they don’t,” she said. “Usually, they do.” ♦

20130807-1

Healthy Heart

Published: Aug 7, 2013

By Michael Smith, North American Correspondent, MedPage Today

Reviewed by F. Perry Wilson, MD, MSCE; Instructor of Medicine, Perelman School of Medicine at the University of Pennsylvania and Dorothy Caputo, MA, BSN, RN, Nurse Planner

20130807-2

  • Note that this review of available literature suggests that communication skills are paramount to reducing errors in the cardiac operating room.
  • Be aware that while some of the recommendations have a strong evidence base, others are grounded more in expert opinion.

Better communication is the key to improving patient outcomes after cardiac surgery, according to a new scientific statement from the American Heart Association.

A wide-ranging review of evidence, led by Joyce Wahr, MD, of the University of Michigan, Ann Arbor, called for such things as checklists, preoperative and postoperative briefings, and team training in communication skills.

But the review, published online in Circulation, also cautioned that “research in this area is nascent but informative.”

The investigators noted that cardiac surgical operating rooms form “a complex environment” that includes many highly trained people working together and using sophisticated equipment to treat people with severe disease.

“Preventable errors are often not related to failure of technical skill, training, or knowledge,” they wrote, “but represent cognitive, system, or teamwork failures.”

Key to improvement of patient safety, they argued, are the “nontechnical skills” of communication, cooperation, coordination, and leadership.

Communication skills, in particular, “have been measured as the worst aspect of teamwork behavior in the [operating room],” they wrote.

To help improve matters, they recommended:

  • Using checklists and/or briefings in every cardiac surgery case, with postoperative debriefings “encouraged” by the cardiac OR leadership.
  • Training to improve communication, leadership, and situational awareness involving all members of the cardiac operative team.
  • Formal handoff protocols for transfer of cardiac surgical patients to new personnel.
  • Regular training for “significant and rare” events, such as emergency oxygenator change-out.

They also called for more research into such areas as the best communication models, team-training approaches, and the efficacy of formal training in teamwork and communication skills in improving patient outcomes.

It might also be “reasonable” to investigate setting up an “anonymous national multidisciplinary event-reporting system to obtain data about events and near-misses,” they wrote.

Also important are the physical design of the OR and the organizational culture of safety.

“Many, if not most, cardiac ORs” have poor ergonomics, resulting in hazards for both patients and staff, Wahr and colleagues found.

Studies have shown that the doors of many cardiac ORs open several times an hour during procedures, increasing the risk of infection in patients.

Also, noise levels — a combination of music, alarms, and multiple conversations — can be hazardous, and small, crowded rooms can increase the risk of tripping over equipment or power cords.

For those reasons, Wahr and colleagues suggested researchers should investigate better information systems in order to cut down on distractions and “improve clinicians’ ability to integrate knowledge from multiple sources.”

An “innovative area of future research, which may avoid expensive design errors” would be to test optimal OR design and layout, they argued.

The investigators also suggested that institutions develop policies that “define disruptive behaviors by medical professionals in all hospital settings, with transparent, formal procedures for addressing unacceptable behaviors.”

They also called for hospitals to establish an “institutional culture of safety” by setting up a quality improvement system, with input encouraged from all team members, to find and fix hazards.

 

The statement was supported by the American Heart Association. Wahr reported no conflicts of interest, but several authors reported financial links with the pharmaceutical industry.


20130807-1

The prospect of reversing blindness has made a significant leap, according to scientists in the UK.

An animal study in the journal Nature Biotechnology showed that the part of the eye which actually detects light can be repaired using stem cells.

The team at Moorfields Eye Hospital and University College London say human trials are now, for the first time, a realistic prospect.

Experts described it as a “significant breakthrough” and “huge leap” forward.

Photoreceptors are the cells in the retina which react to light and convert it into an electrical signal which can be sent to the brain.

However, these cells can die off in some causes of blindness such as Stargardt’s disease and age-related macular degeneration.

There are already trials in people to use stem cells to replace the “support” cells in the eye which keep the photoreceptors alive.

Blind mice

Now the London-based team have shown it is possible to replace the light-sensing cells themselves, raising the prospect of reversing blindness.

They have used a new technique for building retinas in the laboratory, which allowed them to collect thousands of stem cells primed to transform into photoreceptors and inject them into the eyes of blind mice.

The study showed that these cells could hook up with the existing architecture of the eye and begin to function.

However, the effectiveness is still low. Only about 1,000 cells out of a transplant of 200,000 actually hooked up with the rest of the eye.
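For a sense of scale, that integration rate works out to

$$\frac{1{,}000}{200{,}000} = 0.005 = 0.5\%,$$

which is why the researchers stress that efficiency must improve before the approach is clinically useful.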

Lead researcher Prof Robin Ali told the BBC News website: “This is a real proof of concept that photoreceptors can be transplanted from an embryonic stem cell source, and it gives us a route map to now do this in humans.

“That’s why we’re so excited. Five years is now a realistic aim for starting a clinical trial.”

20130807-2

Rods (blue) and cones (blue-green) detect light and create electrical signals which are sent to the brain.

The eye is one of the most advanced fields for stem cell research.

It is relatively simple territory: the light-sensing cells only have to pass their electrical message on to one more cell for it to reach the brain. An attempt to reverse dementia, by contrast, would require new cells to hook up with far more cells all across the brain.

The immune system is also very weak in the eye, so there is a low chance of the transplant being rejected. A few cells can also make a big difference in the eye: tens of thousands of stem cells could improve vision, whereas that number would not regenerate a much larger organ such as a failing liver.

Prof Chris Mason, from University College London, told the BBC: “I think they have made a major step forward here, but the efficiency is still too low for clinical uses.

“At the moment the numbers are tiny, and it will take quite a bit of work to get the numbers up. And then the next question is ‘Can you do it in man?’

“But I think it is a significant breakthrough which may lead to cell therapies and will give a much expanded knowledge on how to cure blindness.”

Dr Marcelo Rivolta, from the University of Sheffield, said the study was a “huge leap” forward for treating blindness and could have implications across stem cell research.

Good Sleep Plus Clean Living Cuts Heart Risk

20130806-1

By Chris Kaiser, Cardiology Editor, MedPage Today
Reviewed by Robert Jasmer, MD; Associate Clinical Professor of Medicine, University of California, San Francisco and Dorothy Caputo, MA, BSN, RN, Nurse Planner

20130806-2

MedPage.com, Published: July/August 2013

  • The duration of sleep, alone or in combination with physical activity, a healthy diet, limited alcohol intake, and no smoking, significantly reduced the risk of heart disease.
  • Even achieving sufficient sleep duration of at least 7 hours per night without any of the four traditional lifestyle factors had a positive impact on risk reduction for CVD (22%) and fatal CVD (43%).

The duration of sleep, alone or in combination with four traditional healthy lifestyle factors, significantly reduced the risk of heart disease, researchers found.

Those who adhered to sufficient physical activity, a healthy diet, limited alcohol intake, and no smoking had a 57% reduced risk of a composite of cardiovascular disease (CVD) and a 67% reduced risk of fatal CVD compared with those who adhered to none or one lifestyle factor, according to W.M. Monique Verschuren, PhD, of the National Institute for Public Health and the Environment in Bilthoven, the Netherlands, and colleagues.

When a good night’s sleep (more than 7 hours) was added to those traditional factors, the risk of CVD and of fatal CVD decreased even further — 65% and 83%, respectively, researchers wrote in the July edition of the European Journal of Preventive Cardiology.

However, even achieving sufficient sleep duration without any of the four traditional lifestyle factors had a positive impact on risk reduction — a 22% reduced risk of CVD and a 43% reduced risk of fatal CVD.

Also, not smoking alone carried significant weight, conferring a 43% reduced risk of CVD and of fatal CVD.

Thus, nonsmoking and sufficient sleep duration were both strongly and similarly inversely associated with fatal CVD, researchers said.
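Percent reductions like these are, as such reports conventionally do, read directly off the hazard ratios given in the adjusted analysis below: the relative risk reduction is one minus the hazard ratio. As a worked example with the nonsmoking figure,

$$1 - \mathrm{HR} = 1 - 0.57 = 0.43 \approx 43\%,$$

and the same arithmetic links the sleep-duration hazard ratio of 0.78 to the 22% figure above.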

The composite CVD comprised fatal CVD, nonfatal myocardial infarction (MI), and stroke.

“If all participants adhered to all five healthy lifestyle factors, 36% of composite CVD and 57% of fatal CVD could theoretically be prevented or postponed,” the authors wrote. “The public health impact of sufficient sleep duration, in addition to the traditional healthy lifestyle factors, could be substantial.”
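A figure of the form “X% of cases could theoretically be prevented” is a population attributable fraction. As a minimal sketch, assuming the authors used the standard epidemiological definition (the paper should be consulted for the exact method): if a proportion $p$ of the population is exposed to a risk factor (here, non-adherence to all five lifestyle factors) carrying relative risk $\mathrm{RR}$, then

$$\mathrm{PAF} = \frac{p\,(\mathrm{RR} - 1)}{1 + p\,(\mathrm{RR} - 1)},$$

the share of cases that would not occur if the exposure were removed from the population.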

Verschuren and colleagues noted that only two studies have thus far included sleep duration as a lifestyle factor.

Both studies mirrored the current study by showing an independent association of sleep with CVD risk and a reduced risk of CVD when sleep duration was included with traditional lifestyle factors.

However, neither study separated out the protective benefit of sleep from the other healthy lifestyle benefits.

The investigators therefore decided to examine whether sufficient sleep duration further reduces risk of CVD on top of the traditional lifestyle factors.

Participant data came from the Monitoring Project on Risk Factors for Chronic Diseases (MORGEN), a prospective cohort study from 1993 to 1997. Follow-up was a mean of 12 years.

The study comprised 6,672 men and 7,967 women, with a mean age of 42 and 41, respectively.

A total of 12% of both men and women reported having all five healthy lifestyle factors at baseline, while 6% reported having zero or one factor.

Rounding out the middle were 33% of both men and women who adhered to three lifestyle factors, and 32% and 29% of men and women, respectively, adhering to four factors.

In an analysis that adjusted for age, sex, and educational level, both with and without mutual adjustment for the other lifestyle factors, nonsmoking was “strongly inversely” associated with composite CVD (hazard ratio 0.57), as were sufficient sleep duration (HR 0.78) and limited alcohol consumption (HR 0.79).

In terms of fatal CVD in the adjusted analysis, only nonsmoking (HR 0.61) and sufficient sleep duration (HR 0.57) were significantly associated with a reduced risk of death.

As an explanation for the results, the investigators noted that short sleep duration has been associated with a higher incidence of overweight, obesity, and hypertension, along with higher levels of blood pressure, total cholesterol, hemoglobin A, and triglycerides, effects which are “consistent with the hypothesis that short sleep duration is directly associated with CVD risk.”

The importance of sufficient sleep “should now be mentioned as an additional way to reduce the risk of cardiovascular disease,” Verschuren said in a statement. “It is always important to confirm results, but the evidence is certainly growing that sleep should be added to our list of CVD risk factors.”

An earlier study from this group of researchers found that those who slept less than 7 hours and got up each morning not fully rested had a 63% higher risk of CVD than those sleeping sufficiently — although those who woke rested, even from less than 7 hours of sleep, did not have the increased risk (Sleep 2011; 34: 1487-1489).

Limitations of the current study include the inability to know how, or whether, depressive symptoms, sleep apnea, or psychological stress affected the risk of CVD. Also, the potential for CVD to be misclassified “may have attenuated” the results, researchers said.

The Monitoring Project on Risk Factors for Chronic Diseases (MORGEN study) is supported by the Ministry of Health, Welfare and Sport of the Netherlands and the National Institute for Public Health and the Environment.

The authors declared no conflicts of interest.


Primary source: European Journal of Preventive Cardiology
Source reference:
Verschuren MWM, et al “Sufficient sleep duration contributes to lower cardiovascular disease risk in addition to four traditional lifestyle factors: the MORGEN study” Eur J Prev Cardiol 2013; DOI: 10.1177/2047487313493057.

 

Risk-Based Monitoring Videos From Applied Clinical Trials Are Going Viral

 

Target Health is pleased to share the video clips below, which were recorded at this year’s DIA meeting and produced as part of the Applied Clinical Trials Risk-Based Network. Please share.

 

Target Health will again be presenting at and sponsoring the CBINET Conference on “Risk-Based Monitoring in Clinical Studies,” being held in Philadelphia on October 24 and 25. This meeting promises to be a productive experience for all.

 

Video Clips

 

Video 1

What are your views on risk-based monitoring?

 

Video 2

What is different about risk-based monitoring now, compared with the more innovative approaches to monitoring that companies may have used in the past?

YouTube Posting

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.
