
Date:
August 31, 2015

Source:
University of Dundee

Summary:
Childhood memories of sticky hands from melting ice cream cones could soon become obsolete, thanks to a new food ingredient.

 


Melting ice cream.
Credit: © Janis Smits / Fotolia

 

 

Childhood memories of sticky hands from melting ice cream cones could soon become obsolete, thanks to a new food ingredient.

Scientists have discovered a naturally occurring protein that can be used to create ice cream that is more resistant to melting than conventional products. The protein binds together the air, fat and water in ice cream, creating a super-smooth consistency.

The new ingredient could enable ice creams to keep frozen for longer in hot weather. It could also prevent gritty ice crystals from forming, ensuring a fine, smooth texture like those of luxury ice creams. The development could allow products to be manufactured with lower levels of saturated fat — and fewer calories — than at present.

Researchers at the Universities of Edinburgh and Dundee developed a method of producing the new protein — which occurs naturally in some foods — in friendly bacteria. They estimate that ice cream made with the ingredient could be available within three to five years.

The protein works by adhering to fat droplets and air bubbles, making them more stable in a mixture. Using the ingredient could offer significant advantages for ice cream makers. It can be processed without loss of performance, and can be produced from sustainable raw materials.

Manufacturers could also benefit from a reduced need to deep freeze their product, as the ingredient would keep ice cream frozen for longer. The supply chain would also be eased by a reduced need to keep the product very cold throughout delivery and merchandising.

The protein, known as BslA, was developed with support from the Engineering and Physical Sciences Research Council and the Biotechnology and Biological Sciences Research Council.

Professor Cait MacPhee, of the University of Edinburgh’s School of Physics and Astronomy, who led the project, said: “We’re excited by the potential this new ingredient has for improving ice cream, both for consumers and for manufacturers.”

Dr Nicola Stanley-Wall, of the University of Dundee, said: “It has been fun working on the applied use of a protein that was initially identified due to its practical purpose in bacteria.”


Story Source:

The above post is reprinted from materials provided by University of Dundee. The original item was written by Roddy Isles. Note: Materials may be edited for content and length.

 

Source: University of Dundee. “Slower melting ice cream in pipeline, thanks to new ingredient.” ScienceDaily. ScienceDaily, 31 August 2015. <www.sciencedaily.com/releases/2015/08/150831213057.htm>.


Date:
August 31, 2015

Source:
Woods Hole Oceanographic Institution

Summary:
Ancient rocks harbored microbial life deep below the seafloor, scientists report. This first-time evidence was contained in drilled rock samples of Earth’s mantle — thrust by tectonic forces to the seafloor during the Early Cretaceous period. The discovery confirms a long-standing hypothesis that interactions between mantle rocks and seawater can create potential for life even in hard rocks deep below the ocean floor.

 


These remarkable rocks, recovered by the Ocean Drilling Program (ODP) on board the drilling vessel JOIDES Resolution, are from the Earth’s upper mantle that underwent intense alteration by heated seawater. The rocks show a systematic change in color from rusty brown (top) to green and black (bottom), reflecting the chemical gradients across the fluid mixing zone. These chemical gradients played a key role in supporting microbes with chemical energy and the substrates they needed to thrive. Fossilized microbes were found in white veins consisting of the minerals calcite and brucite.
Credit: Photo courtesy of Ocean Drilling Program

 

 

Ancient rocks harbored microbial life deep below the seafloor, reports a team of scientists from the Woods Hole Oceanographic Institution (WHOI), Virginia Tech, and the University of Bremen. This new evidence was contained in drilled rock samples of Earth’s mantle — thrust by tectonic forces to the seafloor during the Early Cretaceous period. The new study was published in the Proceedings of the National Academy of Sciences.

The discovery confirms a long-standing hypothesis that interactions between mantle rocks and seawater can create potential for life even in hard rocks deep below the ocean floor. The fossilized microbes are likely the same as those found at the active Lost City hydrothermal field, providing potentially important clues about the conditions that support ‘intraterrestrial’ life in rocks below the seafloor.

“We were initially looking at how seawater interacts with mantle rocks, and how that process generates hydrogen,” said Frieder Klein, an associate scientist at WHOI and lead author of the study. “But during our analysis of the rock samples, we discovered organic-rich inclusions that contained lipids, proteins and amino acids — the building blocks of life — mummified in the surrounding minerals.”

This study, which was a collaborative effort between Klein, WHOI scientists Susan Humphris, Weifu Guo and William Orsi, Esther Schwarzenbach from Virginia Tech and Florence Schubotz from the University of Bremen, focused on mantle rocks that were originally exposed to seawater approximately 125 million years ago when a large rift split the massive supercontinent known as Pangaea. The rift, which eventually evolved into the Atlantic Ocean, pulled mantle rocks from Earth’s interior to the seafloor, where they underwent chemical reactions with seawater, transforming the seawater into a hydrothermal fluid.

“The hydrothermal fluid likely had a high pH and was depleted in carbon and electron acceptors,” Klein said. “These extreme chemical conditions can be challenging for microbes. However, the hydrothermal fluid contained hydrogen and methane and seawater contains dissolved carbon and electron acceptors. So when you mix the two in just the right proportions, you can have the ingredients to support life.”

According to Dr. Everett Shock, a professor at Arizona State University’s School of Earth and Space Exploration, the study underscores the influence major geologic processes can have on the prospect for life.

“This research makes the connection all the way from convection of the mantle to the break-up of the continents to ultimately providing geochemical options for microbiology,” Shock said. “It’s just such a nice demonstration of real-world geobiology with a lot of ‘geo’ in it.”

Drilling Deep

The rock samples analyzed in the study were originally drilled from the Iberian continental margin off the coast of Spain and Portugal in 1993. During the expedition aboard the research vessel JOIDES Resolution, operated by the Ocean Drilling Program (ODP), researchers drilled through 690 meters of mud and sediment deposited onto the ocean floor to reach the ancient seafloor created during the break-up of the supercontinent Pangaea and the opening of the Atlantic Ocean. The drill samples had been stored in core repositories at room temperature for more than two decades before Klein and his colleagues began their investigation and discovered the fossilized microbial remains.

“Colonies of bacteria and archaea were feeding off the seawater-hydrothermal fluid mix and became engulfed in the minerals growing in the fractured rock,” Klein said. “This kept them completely isolated from the environment. The minerals proved to be the ultimate storage containers for these organisms, preserving their lipids and proteins for over 100 million years.”

“It’s exciting that the research team was able to go back and examine samples that had been collected years ago for other reasons and find new discoveries,” Shock said. “There will always be active new drilling, but this study raises the possibility of there being a lot more out there in the way of existing samples that could be analyzed.”

In the lab, samples from the rock interior had to be extracted since the outside of the drill core was stored under non-sterile conditions. So Klein and his colleagues took a number of careful steps to ensure the integrity of the sample interior wasn’t compromised, and then analyzed the rocks with high-resolution microscopes, a confocal Raman spectrometer and a range of isotope techniques.

A Link to the Lost City

While Raman spectroscopy enabled Klein to verify the presence of amino acids, proteins and lipids in the samples, it did not provide enough detailed information to correlate them with other hydrothermal systems. The lipids were of particular interest to Klein since they tend to be better preserved over long timescales, and have been studied in a wide range of seafloor environments. This prompted Klein to ask Schubotz, an expert in lipid biomarker analysis at the University of Bremen, if she could tease out further information about the lipids from these ancient rocks.

Schubotz ran the lipids through an advanced liquid chromatography-based mass spectrometer system to separate out and identify their biochemical components. The analysis led to a remarkable discovery: the lipids from the Iberian margin match up with those from the Lost City hydrothermal field, which was discovered in 2000 in the Mid-Atlantic Ridge during an expedition on board the WHOI-operated research vessel Atlantis. This is significant because researchers believe the Lost City is a present-day analog to ancient hydrothermal systems on early Earth where life may have emerged.

“I was stoked when I saw Dr. Schubotz’s email detailing the analytical results,” Klein said. “It was fascinating to find these particular biological substances — which had previously been found only at the Lost City hydrothermal field and in cold seeps — in rocks below the seafloor where life is extremely challenging. At that point we knew we were onto something really cool!”

A Deeper Understanding

According to Klein, confirmation that life is possible in mantle rocks deep below the seafloor may have important implications for understanding subseafloor life across a wide range of geologic environments.

“All the ingredients necessary to drive these ecosystems were made entirely from scratch,” he said. “Similar systems have likely existed throughout most of Earth’s history to the present day and possibly exist(ed) on other water-bearing rocky planetary bodies, such as Jupiter’s moon Europa.”

The study reinforces the idea that life springs up anywhere there is water, even in seemingly hostile geological environments — a tantalizing prospect as scientists find more and more water elsewhere in the solar system. But Klein contends that, while scientists have long understood many of the forces driving microbial life above the seafloor, there is still a great deal of uncertainty when it comes to understanding biogeochemical processes occurring in the oceanic basement.

“In the future, we’ll be trying to learn more about these particular microorganisms and what the environmental conditions were in the mixing zone in that location. We also plan to go to different places where we think similar processes may have taken place, such as along the Newfoundland margin, and analyze samples to see if we find similar signatures. Broadening this research could provide additional insights about Earth’s history and the search for life in the solar system.”


Story Source:

The above post is reprinted from materials provided by Woods Hole Oceanographic Institution. Note: Materials may be edited for content and length.


Journal Reference:

  1. Frieder Klein, Susan E. Humphris, Weifu Guo, Florence Schubotz, Esther M. Schwarzenbach, and William D. Orsi. Fluid mixing and the deep biosphere of a fossil Lost City-type hydrothermal system at the Iberia Margin. PNAS, August 2015. DOI: 10.1073/pnas.1504674112

 

Source: Woods Hole Oceanographic Institution. “Evidence of ancient life discovered in mantle rocks deep below the seafloor.” ScienceDaily. ScienceDaily, 31 August 2015. <www.sciencedaily.com/releases/2015/08/150831163726.htm>.


Date:
August 27, 2015

Source:
Howard Hughes Medical Institute (HHMI)

Summary:
Scientists can now watch dynamic biological processes with unprecedented clarity in living cells using new imaging techniques. The new methods dramatically improve on the spatial resolution provided by structured illumination microscopy, one of the best imaging methods for seeing inside living cells.

 


This is a still image from a video showing the interaction of filamentous actin (mApple-F-tractin, purple) with myosin IIA bipolar head groups (EGFP, myosin IIA, green) at 20-second intervals for 100 time points, as seen with high-NA TIRF-SIM.
Credit: Betzig Lab, HHMI/Janelia Research Campus

 

 

Scientists can now watch dynamic biological processes with unprecedented clarity in living cells using new imaging techniques developed by researchers at the Howard Hughes Medical Institute’s Janelia Research Campus. The new methods dramatically improve on the spatial resolution provided by structured illumination microscopy, one of the best imaging methods for seeing inside living cells.

The vibrant videos produced with the new technology show the movement and interactions of proteins as cells remodel their structural supports or reorganize their membranes to take up molecules from outside the cell. Janelia group leader Eric Betzig, postdoctoral fellow Dong Li and their colleagues have added the two new technologies — both variations on SIM — to the set of tools available for super-resolution imaging. Super-resolution optical microscopy produces images whose spatial resolution surpasses a theoretical limit imposed by the wavelength of light, offering extraordinary visual detail of structures inside cells. But until now, super-resolution methods have been impractical for use in imaging living cells.

“These methods set a new standard for how far you can push the speed and non-invasiveness of super-resolution imaging,” Betzig says of the techniques his team described in the August 28, 2015, issue of the journal Science. “This will bring super-resolution to live-cell imaging for real.”

In traditional SIM, the sample under the lens is observed while it is illuminated by a pattern of light (more like a bar code than the light from a lamp). Several different light patterns are applied, and the resulting moiré patterns are captured from several angles each time by a digital camera. Computer software then extracts the information in the moiré images and translates it into a three-dimensional, high-resolution reconstruction. The final reconstruction has twice the spatial resolution that can be obtained with traditional light microscopy.
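
As a rough illustration of the moiré principle described above (a hedged sketch, not code from the study; the spatial frequencies and detector band are invented for the example), multiplying a fine sample pattern by a coarser illumination pattern creates a low-frequency beat that a band-limited detector can still record, which is how SIM brings high-resolution information into the observable range:

```python
# Minimal sketch of the moire (frequency-mixing) idea behind SIM.
# All frequencies below are arbitrary illustrative values.
import numpy as np

x = np.linspace(0.0, 1.0, 4096)                 # spatial coordinate (arbitrary units)
f_sample = 120.0                                # fine sample detail, beyond the "detector" cutoff
f_illum = 100.0                                 # structured-illumination pattern frequency

sample = 1.0 + np.cos(2 * np.pi * f_sample * x)
illumination = 1.0 + np.cos(2 * np.pi * f_illum * x)
observed = sample * illumination                # emitted fluorescence ~ sample * illumination

freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])
spectrum = np.abs(np.fft.rfft(observed))

band = (freqs > 5) & (freqs < 50)               # frequencies a low-resolution "detector" can pass
beat = freqs[band][np.argmax(spectrum[band])]
print(f"strongest in-band component near {beat:.1f} cycles/unit")  # ~20 = |120 - 100|
```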

Betzig was one of three scientists awarded the 2014 Nobel Prize in Chemistry for the development of super-resolved fluorescence microscopy. He says SIM has not received as much attention as other super-resolution methods largely because those other methods offer more dramatic gains in spatial resolution. But he notes that SIM has always offered two advantages over alternative super-resolution methods, including photoactivated localization microscopy (PALM), which he developed in 2006 with Janelia colleague Harald Hess.

Both PALM and stimulated emission depletion (STED) microscopy, the other super-resolution technique recognized with the 2014 Nobel Prize, illuminate samples with so much light that fluorescently labeled proteins fade and the sample is quickly damaged, making prolonged imaging impossible. SIM, however, is different. “I fell in love with SIM because of its speed and the fact that it took so much less light than the other methods,” Betzig says.

Betzig began working with SIM shortly after the death in 2011 of one of its pioneers, Mats Gustafsson, who was a group leader at Janelia. Betzig was already convinced that SIM had the potential to generate significant insights into the inner workings of cells, and he suspected that improving the technique’s spatial resolution would go a long way toward increasing its use by biologists.

Gustafsson and graduate student Hesper Rego had achieved higher-resolution SIM with a variation called saturated depletion non-linear SIM, but that method trades improvements in spatial resolution for harsher conditions and a loss of speed. Betzig saw a way around that trade-off.

Saturated depletion enhances the resolution of SIM images by taking advantage of fluorescent protein labels that can be switched on and off with light. To generate an image, all of the fluorescent labels in a sample are switched on, then a wave of light is used to deactivate most of them. After exposure to the deactivating light, only molecules at the darkest regions of the light wave continue to fluoresce. These provide higher frequency information and sharpen the resulting image. An image is captured and the cycle is repeated 25 times or more to generate data for the final image. The principle is very similar to the way super-resolution is achieved in STED or a related method called RESOLFT, Betzig says.

The method is not suited to live imaging, he says, because it takes too long to switch the photoactivatable molecules on and off. What’s more, the repeated light exposure damages cells and their fluorescent labels. “The problem with this approach is that you first turn on all the molecules, then you immediately turn off almost all the molecules. The molecules you’ve turned off don’t contribute anything to the image, but you’ve just fried them twice. You’re stressing the molecules, and it takes a lot of time, which you don’t have, because the cell is moving.”

The solution was simple, Betzig says: “Don’t turn on all of the molecules. There’s no need to do that.” Instead, the new method, called patterned photoactivation non-linear SIM, begins by switching on just a subset of fluorescent labels in a sample with a pattern of light. “The patterning of that gives you some high resolution information already,” he explains. A new pattern of light is used to deactivate molecules, and additional information is read out of their deactivation. The combined effect of those patterns leads to final images with 62-nanometer resolution — better than standard SIM and a three-fold improvement over the limits imposed by the wavelength of light.

“We can do it and we can do it fast,” he says. That’s important, he says, because for imaging dynamic processes, an increase in spatial resolution is meaningless without a corresponding increase in speed. “If something in the cell is moving at a micron a second and I have one micron resolution, I can take that image in a second. But if I have 1/10-micron resolution, I have to take the data in a tenth of a second, or else it will smear out,” he explains.
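
Betzig's point can be written as a one-line rule of thumb: the acquisition time must not exceed the resolution divided by the speed of whatever is moving, or the feature smears across more than one resolution element. A minimal sketch (the numbers simply restate the quote, plus the 62-nm figure from the previous paragraph):

```python
# Rule of thumb from the quote above: exposure_time <= resolution / speed.
def max_exposure_s(resolution_um: float, speed_um_per_s: float) -> float:
    return resolution_um / speed_um_per_s

speed = 1.0                                  # object moving at 1 micron per second
print(max_exposure_s(1.0, speed))            # 1.0 s allowed at 1-micron resolution
print(max_exposure_s(0.1, speed))            # 0.1 s at 1/10-micron resolution
print(max_exposure_s(0.062, speed))          # ~0.06 s at the 62-nm resolution reported here
```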

Patterned photoactivation non-linear SIM captures the 25 images that go into a final reconstruction in about one-third of a second. Because it does so efficiently, using low intensity light and gleaning information from every photon emitted from a sample’s fluorescent labels, labels are preserved so that the microscope can image longer, letting scientists watch more action unfold.

The team used patterned photoactivation non-linear SIM to produce videos showing structural proteins break down and reassemble themselves as cells move and change shape, as well as the dynamics of tiny pits on cell surfaces called caveolae.

Betzig’s team also reports in the Science paper that they can boost the spatial resolution of SIM to 84 nanometers by imaging with a commercially available microscope objective with an ultra-high numerical aperture. The aperture restricts light exposure to a very small fraction of a sample, limiting damage to cells and fluorescent molecules, and the method can be used to image multiple colors at the same time, so scientists can simultaneously track several different proteins.

Using the high numerical aperture approach, Betzig’s team was able to watch the movements and interactions of several structural proteins during the formation of focal adhesions, physical links between the interior and exterior of a cell. They also followed the growth and internalization of clathrin-coated pits, structures that facilitate the intake of molecules from outside of the cell. Their quantitative analysis answered several questions about the pits’ distribution and the relationship between pits’ size and lifespan that could not be addressed with previous imaging methods.

Finally, by combining the high numerical-aperture approach with patterned photoactivatable non-linear SIM, Betzig and his colleagues could follow two proteins at a time with higher resolution than the high numerical aperture approach offered on its own.

Betzig’s team is continuing to develop their SIM technologies, and say further improvements are likely. They are also eager to work with biologists to continue to explore potential applications and refine their techniques’ usability.

For now, scientists who want to experiment with the new SIM methods can arrange to do so through Janelia’s Advanced Imaging Center, which provides access to cutting-edge microscopy technology at no cost. Eventually, Betzig says, it should be fairly straightforward to make the SIM technologies accessible and affordable to other labs. “Most of the magic is in the software, not the hardware,” he says.


Story Source:

The above post is reprinted from materials provided by Howard Hughes Medical Institute (HHMI). Note: Materials may be edited for content and length.


Journal Reference:

  1. D. Li, L. Shao, B.-C. Chen, X. Zhang, M. Zhang, B. Moses, D. E. Milkie, J. R. Beach, J. A. Hammer, M. Pasham, T. Kirchhausen, M. A. Baird, M. W. Davidson, P. Xu, E. Betzig. Extended-resolution structured illumination imaging of endocytic and cytoskeletal dynamics. Science, 2015; 349 (6251): aab3500. DOI: 10.1126/science.aab3500

 

Source: Howard Hughes Medical Institute (HHMI). “Imaging techniques set new standard for super-resolution in live cells.” ScienceDaily. ScienceDaily, 27 August 2015. <www.sciencedaily.com/releases/2015/08/150827143751.htm>.


Date:
August 26, 2015

Source:
The Lancet

Summary:
Global life expectancy has risen by more than six years since 1990 as healthy life expectancy grows; ischemic heart disease, lower respiratory infections, and stroke cause the most health loss around the world.

 


People around the world are living longer, even in some of the poorest countries, but a complex mix of fatal and nonfatal ailments causes a tremendous amount of health loss.
Credit: © Maksim Šmeljov / Fotolia

 

 

Global life expectancy has risen by more than six years since 1990 as healthy life expectancy grows; ischemic heart disease, lower respiratory infections, and stroke cause the most health loss around the world.

People around the world are living longer, even in some of the poorest countries, but a complex mix of fatal and nonfatal ailments causes a tremendous amount of health loss, according to a new analysis of all major diseases and injuries in 188 countries.

Thanks to marked declines in death and illness caused by HIV/AIDS and malaria in the past decade and significant advances made in addressing communicable, maternal, neonatal, and nutritional disorders, health has improved significantly around the world. Global life expectancy at birth for both sexes rose by 6.2 years (from 65.3 in 1990 to 71.5 in 2013), while healthy life expectancy, or HALE, at birth rose by 5.4 years (from 56.9 in 1990 to 62.3 in 2013).

Healthy life expectancy takes into account not just mortality but also the impact of nonfatal conditions and summarizes years lived with disability and years lost due to premature mortality. The increase in healthy life expectancy has not been as dramatic as the growth of life expectancy, and as a result, people are living more years with illness and disability.
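
The arithmetic behind that last sentence, using the figures quoted above: life expectancy rose faster than healthy life expectancy, so the expected number of years lived in poor health grew. A quick check:

```python
# Expected years lived with illness or disability = life expectancy - HALE.
le_1990, le_2013 = 65.3, 71.5        # global life expectancy at birth (years)
hale_1990, hale_2013 = 56.9, 62.3    # healthy life expectancy at birth (years)

print(round(le_1990 - hale_1990, 1))  # 8.4 years in poor health in 1990
print(round(le_2013 - hale_2013, 1))  # 9.2 years in 2013: roughly 0.8 more years of ill health
```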

“Global, regional, and national disability-adjusted life years (DALYs) for 306 diseases and injuries and healthy life expectancy (HALE) for 188 countries, 1990-2013: quantifying the epidemiological transition” examines fatal and nonfatal health loss across countries. Published in The Lancet on August 27, the study was conducted by an international consortium of researchers working on the Global Burden of Disease study and led by the Institute for Health Metrics and Evaluation (IHME) at the University of Washington.

“The world has made great progress in health, but now the challenge is to invest in finding more effective ways of preventing or treating the major causes of illness and disability,” said Professor Theo Vos of IHME, the study’s lead author.

For most countries, changes in healthy life expectancy for males and females between 1990 and 2013 were significant and positive, but in dozens of countries, including Botswana, Belize, and Syria, healthy life expectancy in 2013 was not significantly higher than in 1990. In some of those countries, including South Africa, Paraguay, and Belarus, healthy life expectancy has actually dropped since 1990. People born in Lesotho and Swaziland in 2013 could expect to live at least 10 fewer years in good health than people born in those countries two decades earlier. People in countries such as Nicaragua and Cambodia have experienced dramatic increases in healthy life expectancy since 1990, 14.7 years and 13.9 years, respectively. The reverse was true for people in Botswana and Belize, which saw declines of 2 years and 1.3 years, respectively.

The difference between the countries with the highest and lowest healthy life expectancies is stark. In 2013, Lesotho had the lowest, at 42 years, and Japan had the highest globally, at 73.4 years. Even regionally, there is significant variation. Cambodians and Laotians born in 2013 would have healthy life expectancies of only 57.5 years and 58.1 years, respectively, but people born in nearby Thailand and Vietnam could live nearly 67 years in good health.

As both life expectancy and healthy life expectancy increase, changes in rates of health loss become increasingly crucial. The study’s researchers use DALYs, or disability-adjusted life years, to compare the health of different populations and health conditions across time. One DALY equals one lost year of healthy life and is measured by the sum of years of life lost to early death and years lived with disability. The leading global causes of health loss, as measured by DALYs, in 2013 were ischemic heart disease, lower respiratory infections, stroke, low back and neck pain, and road injuries. These causes differed by gender: for males, road injuries were a top-five cause of health loss, but these were not in the top 10 for females, who lose substantially more health to depressive disorders than their male counterparts.
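
The DALY definition in that paragraph reduces to a simple sum, DALY = YLL + YLD (years of life lost to early death plus years lived with disability, the latter weighted by disability severity). A toy sketch with invented numbers, purely to show the bookkeeping:

```python
# DALY = years of life lost (YLL) + years lived with disability (YLD).
# The YLD term weights time lived with a condition by a 0-1 disability weight.
# All numbers below are invented for illustration.
def dalys(years_of_life_lost: float, years_with_condition: float, disability_weight: float) -> float:
    return years_of_life_lost + years_with_condition * disability_weight

# A condition causing 3 years of premature death and 4 years lived at weight 0.3:
print(dalys(3.0, 4.0, 0.3))   # 4.2 DALYs
```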

Ethiopia is one of several countries that have been rising to the challenge to ensure that people live lives that are both longer and healthier. In 1990, Ethiopians could expect to live 40.8 healthy years. But by 2013, the country saw an increase in healthy life expectancy of 13.5 years, more than double the global average, to 54.3 years.

“Ethiopia has made impressive gains in health over the past two decades, with significant decreases in rates of diarrheal disease, lower respiratory infection, and neonatal disorders,” said Dr. Tariku Jibat Beyene of Addis Ababa University. “But ailments such as heart disease, COPD, and stroke are causing an increasing amount of health loss. We must remain vigilant in addressing this new reality of Ethiopian health.”

The fastest-growing global cause of health loss between 1990 and 2013 was HIV/AIDS, which increased by 341.5%. But this dramatic rise masks progress in recent years; since 2005, health loss due to HIV/AIDS has diminished by 23.9% because of global focus on the disease. Ischemic heart disease, stroke, low back and neck pain, road injuries, and COPD have also caused an increasing amount of health loss since 1990. The impact of other ailments, such as diarrheal diseases, neonatal preterm birth complications, and lower respiratory infections, has significantly declined.

Across countries, patterns of health loss vary widely. The countries with the highest rates of DALYs are among the poorest in the world, and include several in sub-Saharan Africa: Lesotho, Swaziland, Central African Republic, Guinea-Bissau, and Zimbabwe. Countries with the lowest rates of health loss include Italy, Spain, Norway, Switzerland, and Israel.

Country-level variation also plays an important role in the changing disease burden, particularly for non-communicable diseases. For communicable, maternal, neonatal, and nutritional disorders, global DALY numbers and age-standardized rates declined between 1990 and 2013. While the number of DALYs for non-communicable diseases has increased during this period, age-standardized rates have declined.

The number of DALYs due to communicable, maternal, neonatal, and nutritional disorders has declined steadily, from 1.19 billion in 1990 to 769.3 million in 2013, while DALYs from non-communicable diseases have increased steadily, rising from 1.08 billion to 1.43 billion over the same period.

The study also examines the role that socio-demographic status — a combination of per capita income, population age, fertility rates, and years of schooling — plays in determining health loss. Researchers’ findings underscore that this accounts for more than half of the differences seen across countries and over time for certain leading causes of DALYs, including maternal and neonatal disorders. But the study notes that socio-demographic status is much less responsible for the variation seen for ailments including cardiovascular disease and diabetes.

“Factors including income and education have an important impact on health but don’t tell the full story,” said IHME Director Dr. Christopher Murray. “Looking at healthy life expectancy and health loss at the country level can help guide policies to ensure that people everywhere can have long and healthy lives no matter where they live.”

Countries with highest healthy life expectancy, both sexes, 2013

1 Japan

2 Singapore

3 Andorra

4 Iceland

5 Cyprus

6 Israel

7 France

8 Italy

9 South Korea

10 Canada

Countries with lowest healthy life expectancy, both sexes, 2013

1 Lesotho

2 Swaziland

3 Central African Republic

4 Guinea-Bissau

5 Zimbabwe

6 Mozambique

7 Afghanistan

8 Chad

9 South Sudan

10 Zambia

Leading causes of DALYs or health loss globally for both sexes, 2013

1 Ischemic heart disease

2 Lower respiratory infection

3 Stroke

4 Low back and neck pain

5 Road injuries

6 Diarrheal diseases

7 Chronic obstructive pulmonary disease

8 Neonatal preterm birth complications

9 HIV/AIDS

10 Malaria


Story Source:

The above post is reprinted from materials provided by The Lancet. Note: Materials may be edited for content and length.


Journal Reference:

  1. Murray CJL et al. Global, regional, and national disability-adjusted life years (DALYs) for 306 diseases and injuries and healthy life expectancy (HALE) for 188 countries, 1990–2013: a systematic analysis for the Global Burden of Disease Study 2013. The Lancet, 2015. DOI: 10.1016/S0140-6736(15)61340-X

 

Source: The Lancet. “Life expectancy climbs worldwide but people spend more years living with illness and disability.” ScienceDaily. ScienceDaily, 26 August 2015. <www.sciencedaily.com/releases/2015/08/150826204220.htm>.


Currents of semi-liquid rock key to seismicity away from tectonic plate boundaries

Date:
August 26, 2015

Source:
University of Southern California

Summary:
Scientists have discovered the mechanism that generates earthquakes that occur away from tectonic plate boundaries.

 


St. Louis skyline in the state of Missouri. Seismicity on the North American plate occurs as far afield as southern Missouri, where earthquakes between 1811 and 1812 estimated at around magnitude 7 caused the Mississippi River to flow backward for hours.
Credit: © digidreamgrafix / Fotolia

 

 

It’s not a huge mystery why Los Angeles experiences earthquakes. The city sits near a boundary between two tectonic plates — they shift, we shake. But what about places that aren’t along tectonic plate boundaries?

For example, seismicity on the North American plate occurs as far afield as southern Missouri, where earthquakes between 1811 and 1812 estimated at around magnitude 7 caused the Mississippi River to flow backward for hours.

Until now, the cause of that seismicity has remained unclear.

While earthquakes along tectonic plate boundaries are caused by motion between the plates, earthquakes away from fault lines are primarily driven by motion beneath the plates, according to a new study published by USC scientist Thorsten Becker in Nature on Aug. 27.

Just beneath the Earth’s crust is a layer of hot, semi-liquid rock that is continually flowing — heating up and rising, then cooling and sinking. That convective process, interacting with the ever-changing motion of the plates at the surface, is driving intraplate seismicity and determining in large part where those earthquakes occur. To a lesser extent, the structure of the crust above also influences the location, according to their models.

“This will not be the last word on the origin of strange earthquakes. However, our work shows how imaging advances in seismology can be combined with mantle flow modeling to probe the links between seismicity and mantle convection,” said Becker, lead author of the study and professor of Earth sciences at the USC Dornsife College of Letters, Arts and Sciences.

Becker and his team used an updated mantle flow model to study the motion beneath the mountain belt that cuts north to south through the interior of the Western United States.

The area is seismically active — the reason Yellowstone has geysers is that it sits atop a volcanic hotspot. Previously, scientists had suggested that the varying density of the plates was the main cause. (Imagine a mountain’s own weight causing it to want to flow apart and thin out.)

Instead, the team found that the small-scale convective currents beneath the plate correlated with seismic events above in a predictable way. They also tried using the varying plate density or “gravitational potential energy variations” to predict seismic events and found a much poorer correlation.

“This study shows a direct link between deep convection and shallow earthquakes that we didn’t anticipate, and it charts a course for improved seismic hazard mapping in plate interiors,” said Tony Lowry, co-author of the paper and associate professor of geophysics and geodynamics at Utah State University.


Story Source:

The above post is reprinted from materials provided by University of Southern California. The original item was written by Robert Perkins. Note: Materials may be edited for content and length.


Journal Reference:

  1. Thorsten W. Becker, Anthony R. Lowry, Claudio Faccenna, Brandon Schmandt, Adrian Borsa, Chunquan Yu. Western US intermountain seismicity caused by changes in upper mantle flow. Nature, 2015; 524 (7566): 458. DOI: 10.1038/nature14867

 

Source: University of Southern California. “Mechanism behind ‘strange’ earthquakes discovered: Currents of semi-liquid rock key to seismicity away from tectonic plate boundaries.” ScienceDaily. ScienceDaily, 26 August 2015. <www.sciencedaily.com/releases/2015/08/150826135726.htm>.


Date:
August 24, 2015

Source:
University of Toronto

Summary:
A team of physicists has taken a step toward making the essential building block of quantum computers out of pure light. Their advance has to do with logic gates that perform operations on input data to create new outputs.

 


This is an artist’s rendition of what occurs when one photon goes through a carefully prepared atomic medium at the same time as a pulse including many photons. The change in colors represents the nonlinear phase shift picked up by each pulse, which is proportional to the number of photons in the other pulse. A measurable nonlinear phase shift caused by a single photon on a pulse with many photons can enable deterministic two-qubit gates, an important missing part of optical quantum information processing hardware.
Credit: Amir Feizpour

 

 

A team of physicists at the University of Toronto (U of T) has taken a step toward making the essential building block of quantum computers out of pure light. Their advance, described in a paper published this week in Nature Physics, has to do with a specific part of computer circuitry known as a “logic gate.”

Logic gates perform operations on input data to create new outputs. In classical computers, logic gates take the form of diodes or transistors. But quantum computer components are made from individual atoms and subatomic particles. Information processing happens when the particles interact with one another according to the strange laws of quantum physics.

Light particles — known as “photons” — have many advantages in quantum computing, but it is notoriously difficult to get them to interact with one another in useful ways. This experiment demonstrates how to create such interactions.

“We’ve seen the effect of a single particle of light on another optical beam,” said Canadian Institute for Advanced Research (CIFAR) Senior Fellow Aephraim Steinberg, one of the paper’s authors and a researcher at U of T’s Centre for Quantum Information & Quantum Computing. “Normally light beams pass through each other with no effect at all. To build technologies like optical quantum computers, you want your beams to talk to one another. That’s never been done before using a single photon.”

The interaction was a two-step process. The researchers shot a single photon at rubidium atoms that they had cooled to a millionth of a degree above absolute zero. The photon became “entangled” with the atoms, which affected the way the rubidium interacted with a separate optical beam. The photon changed the atoms’ refractive index, causing a tiny but measurable “phase shift” in the beam.

This process could be used as an all-optical quantum logic gate, allowing for inputs, information-processing and outputs.
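
As the image caption notes, a conditional phase shift of this kind is the ingredient for a deterministic two-qubit gate. The sketch below is standard textbook quantum optics rather than anything specific to this experiment: a controlled-phase gate multiplies only the |11> amplitude by e^(i*phi), and the phase value used here is arbitrary.

```python
# Sketch of a controlled-phase (CPHASE) gate; phi is an arbitrary illustrative phase,
# not the shift measured in the paper.
import numpy as np

phi = 0.1                                         # nonlinear phase shift in radians
cphase = np.diag([1, 1, 1, np.exp(1j * phi)])     # basis order: |00>, |01>, |10>, |11>

plus = np.array([1.0, 1.0]) / np.sqrt(2)          # (|0> + |1>) / sqrt(2) on each qubit
state = np.kron(plus, plus)
print(np.round(cphase @ state, 3))                # only the |11> component picks up the phase
```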

“Quantum logic gates are the most obvious application of this advance,” said Steinberg. “But being able to see these interactions is the starting page of an entirely new field of optics. Most of what light does is so well understood that you wouldn’t think of it as a field of modern research. But two big exceptions are, ‘What happens when you deal with light one particle at a time?’ and ‘What happens when there are media like our cold atoms that allow different light beams to interact with each other?'”

Both questions have been studied, he says, but never together until now.


Story Source:

The above post is reprinted from materials provided by University of Toronto. Note: Materials may be edited for content and length.


Journal Reference:

  1. Amir Feizpour, Matin Hallaji, Greg Dmochowski & Aephraim M. Steinberg. Observation of the nonlinear phase shift due to single post-selected photons. Nature Physics, 2015; DOI: 10.1038/nphys3433

 

Source: University of Toronto. “A little light interaction leaves quantum physicists beaming.” ScienceDaily. ScienceDaily, 24 August 2015. <www.sciencedaily.com/releases/2015/08/150824114255.htm>.


Date:
August 24, 2015

Source:
ARC Centre of Excellence in Coral Reef Studies

Summary:
New research into the impact of climate change has found that warming oceans will cause profound changes in the global distribution of marine biodiversity. The study found that a rapidly warming climate would cause many species to expand into new regions, which would impact on native species, while others with restricted ranges, particularly those around the tropics, are more likely to face extinction.

 


Species in tropical areas are more likely to face extinction as oceans warm.
Credit: Simon Foale

 

 

New research into the impact of climate change has found that warming oceans will cause profound changes in the global distribution of marine biodiversity.

In a study published in the journal Nature Climate Change, an international research team modelled the impacts of a changing climate on the distribution of almost 13,000 marine species, more than twelve times as many species as previously studied.

The study found that a rapidly warming climate would cause many species to expand into new regions, which would impact on native species, while others with restricted ranges, particularly those around the tropics, are more likely to face extinction.

Co-author, Professor John Pandolfi from the ARC Centre of Excellence for Coral Reef Studies at the University of Queensland says global patterns of species richness will change significantly, with considerable regional variability.

“This study was particularly useful because it not only gave us hope that species have the potential to track and follow changing climates but it also gave us cause for concern, particularly in the tropics, where strong biodiversity losses were predicted,” says Professor Pandolfi.

“This is especially worrying, and highly germane to Australia’s coral reefs, because complementary studies have shown high levels of extinction risk in tropical biotas, where localized human impacts as well as climate change have resulted in substantial degradation.”

To model the projected impact of climate change on marine biodiversity, the researchers used climate-velocity trajectories, a measurement which combines the rate and direction of movement of ocean temperature bands over time, together with information about thermal tolerance and habitat preference.
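
Climate velocity is usually defined as the long-term warming trend divided by the local spatial temperature gradient, giving the speed at which an isotherm, and a species tracking it, must move. The sketch below uses that general definition with invented numbers; it is not the study's actual trajectory calculation, which also folds in direction, thermal tolerance and habitat.

```python
# Climate velocity ~ warming trend / spatial temperature gradient (invented example values).
def climate_velocity_km_per_decade(trend_c_per_decade: float, gradient_c_per_km: float) -> float:
    return trend_c_per_decade / gradient_c_per_km

# Flat tropical seas: a weak gradient means isotherms sweep a long way per decade.
print(climate_velocity_km_per_decade(0.2, 0.002))   # 100 km per decade
# A steep coastal gradient: the same warming shifts isotherms far less.
print(climate_velocity_km_per_decade(0.2, 0.05))    # 4 km per decade
```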

They say the analysis provides the simplest expectation for the future distribution of marine biodiversity, showing recurring spatial patterns of high rates of species invasions coupled with local extinctions.

The researchers say this will make currently distinct ecological communities much more similar to each other in many regions by the end of the century.

Professor Pandolfi warns the resultant novel combinations of resident and migrant species will present unprecedented challenges for conservation planning.

“Above all, this study shows the broad geographic connections of the effects of climate change — conservation efforts need to be facilitated by cooperation among countries to have any real chance of combating the potentially severe biodiversity losses that a changing climate might impose.”


Story Source:

The above post is reprinted from materials provided by ARC Centre of Excellence in Coral Reef Studies. Note: Materials may be edited for content and length.


Journal Reference:

  1. Jorge García Molinos, Benjamin S. Halpern, David S. Schoeman, Christopher J. Brown, Wolfgang Kiessling, Pippa J. Moore, John M. Pandolfi, Elvira S. Poloczanska, Anthony J. Richardson, Michael T. Burrows. Climate velocity and the future global redistribution of marine biodiversity. Nature Climate Change, 2015. DOI: 10.1038/nclimate2769

 

Source: ARC Centre of Excellence in Coral Reef Studies. “Climate’s profound impact on marine biodiversity.” ScienceDaily. ScienceDaily, 24 August 2015. <www.sciencedaily.com/releases/2015/08/150824114237.htm>.


Date:
August 21, 2015

Source:
Oregon State University

Summary:
A recalculation of the dates at which boulders were uncovered by melting glaciers at the end of the last Ice Age has conclusively shown that the glacial retreat was due to rising levels of carbon dioxide and other greenhouse gases, as opposed to other types of forces. The data helps to confirm predictions of future glacial retreat, and that most of the world’s glaciers may disappear in the next few centuries.

 


A melting tongue of Exit Glacier near Seward, Alaska, continues to dwindle and pour water into streams below, as it has been doing for decades.
Credit: Photo courtesy of Oregon State University

 

 

A recalculation of the dates at which boulders were uncovered by melting glaciers at the end of the last Ice Age has conclusively shown that the glacial retreat was due to rising levels of carbon dioxide and other greenhouse gases, as opposed to other types of forces.

Carbon dioxide levels are now significantly higher than they were at that time, as a result of the Industrial Revolution and other human activities since then. Because of that, the study confirms predictions of future glacial retreat, and that most of the world’s glaciers may disappear in the next few centuries.

The findings were published today in Nature Communications by researchers from Oregon State University, Boston College and other institutions. They erase some of the uncertainties about glacial melting that had been due to a misinterpretation of data from some of these boulders, which were exposed to the atmosphere more than 11,500 years ago.

“This shows that at the end of the last Ice Age, it was only the increase in carbon dioxide and other greenhouse gases that could have caused the loss of glaciers around the world at the same time,” said Peter Clark, a professor in the OSU College of Earth, Ocean and Atmospheric Sciences, and co-author on the study.

“This study validates predictions that future glacial loss will occur due to the ongoing increase in greenhouse gas levels from human activities,” Clark said. “We could lose 80-90 percent of the world’s glaciers in the next several centuries if greenhouse gases continue to rise at the current rate.”

Glacial loss in the future will contribute to rising sea levels and, in some cases, have impacts on local water supplies.

As the last Ice Age ended during a period of about 7,000 years, starting around 19,000 years ago, the levels of carbon dioxide in the atmosphere increased from 180 parts per million to 280 parts per million. But just in the past 150 years, they have surged from 280 to about 400 parts per million, far higher than what was required to put an end to the last Ice Age.
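
Turned into rates, the comparison in that paragraph looks like this (simple arithmetic on the quoted concentrations, nothing more):

```python
# Rate comparison using the CO2 concentrations quoted above.
deglacial_rate = (280 - 180) / 7000.0    # ~0.014 ppm per year across the end of the Ice Age
modern_rate = (400 - 280) / 150.0        # ~0.8 ppm per year over the past ~150 years
print(round(deglacial_rate, 3), round(modern_rate, 2))
print(round(modern_rate / deglacial_rate))   # the modern rise is roughly 56 times faster
```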

The new findings, Clark said, were based on a recalculation of the ages at which more than 1,100 glacial boulders from 159 glacial moraines around the world were exposed to the atmosphere after being buried for thousands of years under ice.

The exposure of the boulders to cosmic rays produced cosmogenic nuclides, which had been previously measured and used to date the event. But advances have been made in how to calibrate ages based on that data. Based on the new calculations, the rise in carbon dioxide levels — determined from ancient ice cores — matches up nicely with the time at which glacial retreat took place.
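
For context on how such boulder ages are obtained (a simplified sketch, not the recalibration performed in the study), the basic exposure-age relation for a radioactive cosmogenic nuclide ignores erosion and assumes a known production rate: N = (P/λ)(1 - e^(-λt)), which inverts to t = -ln(1 - Nλ/P)/λ. The sample values below are hypothetical.

```python
# Simplest exposure-age model for a radioactive cosmogenic nuclide such as 10Be:
#   N(t) = (P / lam) * (1 - exp(-lam * t))   =>   t = -ln(1 - N * lam / P) / lam
# Ignores erosion, inheritance and production-rate scaling; values are hypothetical.
import math

def exposure_age_yr(atoms_per_gram: float, production_rate: float, half_life_yr: float) -> float:
    lam = math.log(2) / half_life_yr
    return -math.log(1.0 - atoms_per_gram * lam / production_rate) / lam

# Hypothetical boulder: 60,000 atoms/g of 10Be, production 5 atoms/g/yr, half-life 1.39 Myr.
print(round(exposure_age_yr(6.0e4, 5.0, 1.39e6)))   # ~12,000 years since the ice uncovered it
```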

“There had been a long-standing mystery about why these boulders were uncovered at the time they were, because it didn’t properly match the increase in greenhouse gases,” said Jeremy Shakun, a professor at Boston College and lead author on the study. “We found that the previous ages assigned to this event were inaccurate. The data now show that as soon as the greenhouse gas levels began to rise, the glaciers began to melt and retreat.”

There are other forces that can also cause glacial melting on a local or regional scale, the researchers noted, such as changes in the Earth’s orbit around the sun, or shifts in ocean heat distribution. These factors probably did have localized effects. But the scientists determined that only the change in greenhouse gas levels could have explained the broader global retreat of glaciers all at the same time.

In the study of climate change, glaciers have always been of considerable interest because their long-term behavior is a reliable barometer of climate trends, helping to sort out the ups and downs caused by year-to-year weather variability, including short-term shifts in temperature and precipitation.

Other collaborators on this research were from the University of Wisconsin, Purdue University, and the National Center for Atmospheric Research. The work was supported by the National Oceanic and Atmospheric Administration and the National Science Foundation.


Story Source:

The above post is reprinted from materials provided by Oregon State University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Jeremy D. Shakun, Peter U. Clark, Feng He, Nathaniel A. Lifton, Zhengyu Liu, Bette L. Otto-Bliesner. Regional and global forcing of glacier retreat during the last deglaciation. Nature Communications, 2015; 6: 8059 DOI: 10.1038/ncomms9059

Source: Oregon State University. “Greenhouse gases caused glacial retreat during last Ice Age.” ScienceDaily. ScienceDaily, 21 August 2015. <www.sciencedaily.com/releases/2015/08/150821082727.htm>.


Scientists say increasing heat drives moisture from ground

Date:
August 20, 2015

Source:
The Earth Institute at Columbia University

Summary:
A new study says that global warming has measurably worsened the ongoing California drought. While scientists largely agree that natural weather variations have caused a lack of rain, an emerging consensus says that rising temperatures may be making things worse by driving moisture from plants and soil into the air. The new study is the first to estimate how much worse: as much as a quarter.

 


Drought in California.
Credit: © Tupungato / Fotolia

 

 

A new study says that global warming has measurably worsened the ongoing California drought. While scientists largely agree that natural weather variations have caused a lack of rain, an emerging consensus says that rising temperatures may be making things worse by driving moisture from plants and soil into the air. The new study is the first to estimate how much worse: as much as a quarter. The findings suggest that within a few decades, continually increasing temperatures and resulting moisture losses will push California into even more persistent aridity. The study appears this week in the journal Geophysical Research Letters.

“A lot of people think that the amount of rain that falls out the sky is the only thing that matters,” said lead author A. Park Williams, a bioclimatologist at Columbia University’s Lamont-Doherty Earth Observatory. “But warming changes the baseline amount of water that’s available to us, because it sends water back into the sky.”

The study adds to growing evidence that climate change is already bringing extreme weather to some regions. California is the world’s eighth-largest economy, ahead of most countries, but many scientists think that the nice weather it is famous for may now be in the process of going away. The record-breaking drought is now in its fourth year; it is drying up wells, affecting major produce growers and feeding wildfires now sweeping over vast areas.

The researchers analyzed multiple sets of month-by-month data from 1901 to 2014. They looked at precipitation, temperature, humidity, wind and other factors. They could find no long-term rainfall trend. But average temperatures have been creeping up–about 2.5 degrees Fahrenheit over the 114-year period, in step with building fossil-fuel emissions. Natural weather variations have made California unusually hot over the last several years; added to this was the background trend. Thus, when rainfall declined in 2012, the air sucked already scant moisture from soil, trees and crops harder than ever. The study did not look directly at snow, but in the past, gradual melting of the high-mountain winter snowpack has helped water the lowlands in warm months. Now, melting has accelerated, or the snowpack has not formed at all, helping make warm months even drier, according to other researchers.

Due to the complexity of the data, the scientists could put only a range, not a single number, on the proportion of the drought caused by global warming. The paper estimates 8 to 27 percent, but Williams said that somewhere in the middle–probably 15 to 20 percent–is most likely.

Last year, the U.S. National Oceanic and Atmospheric Administration sponsored a study that blamed the rain deficit on a persistent ridge of high-pressure air over the northeast Pacific, which has been blocking moisture-laden ocean air from reaching land. Lamont-Doherty climatologist Richard Seager, who led that study (and coauthored the new one), said the blockage probably has nothing to do with global warming; normal weather patterns will eventually push away the obstacle, and rainfall will return. In fact, most projections say that warming will eventually increase California’s rainfall a bit. But the new study says that evaporation will overpower any increase in rain, and then some. This means that by around the 2060s, more or less permanent drought will set in, interrupted only by the rainiest years. More intense rainfall is expected to come in short bursts, then disappear.

Many researchers believe that rain will resume as early as this winter. “When this happens, the danger is that it will lull people into thinking that everything is now OK, back to normal,” said Williams. “But as time goes on, precipitation will be less able to make up for the intensified warmth. People will have to adapt to a new normal.”

This study is not the first to make such assertions, but it is the most specific. A paper by scientists from Lamont-Doherty and Cornell University, published this February, warned that climate change will push much of the central and western United States into the driest period for at least 1,000 years. A March study out of Stanford University said that California droughts have been intensified by higher temperatures, and gives similar warnings for the future.

A further twist was introduced in a 2010 study by researchers at the NASA Goddard Institute for Space Studies. They showed that massive irrigation from underground aquifers has been offsetting global warming in some areas, because the water cools the air. The effect has been especially sharp in California’s heavily irrigated Central Valley–possibly up to 3.5 degrees Fahrenheit during some seasons. Now, aquifers are dropping fast, sending irrigation on a downward trajectory. If irrigation’s cooling effect declines, this will boost air temperatures even higher, which will dry aquifers further, and so on. Scientists call this process “positive feedback.”

Climatologist Noah Diffenbaugh, who led the earlier Stanford research, said the new study is an important step forward. It has “brought together the most comprehensive set of data for the current drought,” he said. “It supports the previous work showing that temperature makes it harder for drought to break, and increases the long-term risk.”

Jonathan Overpeck, co-director of the Institute of the Environment at the University of Arizona, said, “It’s important to have quantitative estimates of how much human-caused warming is already making droughts more severe.” But, he said, “it’s troubling to know that human influence will continue to make droughts more severe until greenhouse gas emissions are cut back in a big way.”


Story Source:

The above post is reprinted from materials provided by The Earth Institute at Columbia University. Note: Materials may be edited for content and length.


Journal Reference:

  1. A.P. Williams et al. Contribution of anthropogenic warming to California drought during 2012–2014. Geophysical Research Letters, 2015 DOI: 10.1002/2015GL064924

 

Source: The Earth Institute at Columbia University. “Warming climate is deepening California drought: Scientists say increasing heat drives moisture from ground.” ScienceDaily. ScienceDaily, 20 August 2015. <www.sciencedaily.com/releases/2015/08/150820090713.htm>.


Date:
August 19, 2015

Source:
University of East Anglia

Summary:
New research could one day help build computers from DNA. Scientists have found a way to ‘switch’ the structure of DNA using copper salts and EDTA (Ethylenediaminetetraacetic acid) — an agent commonly found in shampoo and other household products. The applications for this discovery include nanotechnology — where DNA is used to make tiny machines, and in DNA-based computing — where computers are built from DNA rather than silicon.

 


Scientists have found a way to “switch” the structure of DNA using copper salts and EDTA (Ethylenediaminetetraacetic acid) — an agent commonly found in shampoo and other household products. It was previously known that the structure of a piece of DNA could be changed using acid, which causes it to fold up into what is known as an “i-motif.” But new research published today in the journal Chemical Communications reveals that the structure can be switched a second time into a hair-pin structure using positively-charged copper (copper cations). This change can also be reversed using EDTA. The applications for this discovery include nanotechnology — where DNA is used to make tiny machines, and in DNA-based computing — where computers are built from DNA rather than silicon.
Credit: University of East Anglia

 

 

New research from the University of East Anglia could one day help build computers from DNA.

Scientists have found a way to ‘switch’ the structure of DNA using copper salts and EDTA (Ethylenediaminetetraacetic acid) — an agent commonly found in shampoo and other household products.

It was previously known that the structure of a piece of DNA could be changed using acid, which causes it to fold up into what is known as an ‘i-motif’.

But new research published today in the journal Chemical Communications reveals that the structure can be switched a second time into a hair-pin structure using positively-charged copper (copper cations). This change can also be reversed using EDTA.

The applications for this discovery include nanotechnology — where DNA is used to make tiny machines, and in DNA-based computing — where computers are built from DNA rather than silicon.

It could also be used for detecting the presence of copper cations, which are highly toxic to fish and other aquatic organisms, in water.

Lead researcher Dr Zoë Waller, from UEA’s school of Pharmacy, said: “Our research shows how the structure of our genetic material — DNA — can be changed and used in a way we didn’t realise.

“A single switch was possible before — but we show for the first time how the structure can be switched twice.

“A potential application of this finding could be to create logic gates for DNA based computing. Logic gates are an elementary building block of digital circuits — used in computers and other electronic equipment. They are traditionally made using diodes or transistors which act as electronic switches.

“This research expands how DNA could be used as a switching mechanism for a logic gate in DNA-based computing or in nano-technology.”
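
As a hedged illustration only (an assumed way of modelling the behaviour, not anything presented in the paper), the double switch described above can be written as a tiny state machine: acid folds the strand into an i-motif, copper cations switch it on to a hairpin, and EDTA, which chelates the copper, reverses that second step.

```python
# Toy state machine for the two-step DNA switch described in the article.
# The states and triggers follow the text; everything else is an assumption.
TRANSITIONS = {
    ("unfolded", "acid"): "i-motif",
    ("i-motif", "copper"): "hairpin",
    ("hairpin", "EDTA"): "i-motif",   # EDTA removes the copper, reversing the second switch
}

def apply(state: str, trigger: str) -> str:
    # Unknown state/trigger combinations leave the structure unchanged.
    return TRANSITIONS.get((state, trigger), state)

state = "unfolded"
for trigger in ["acid", "copper", "EDTA"]:
    state = apply(state, trigger)
    print(f"{trigger} -> {state}")
```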


Story Source:

The above post is reprinted from materials provided by University of East Anglia. Note: Materials may be edited for content and length.


Journal Reference:

  1. Henry Albert Day, Elisé Patricia Wright, Colin John MacDonald, Andrew James Gates, Zoë Ann Ella Waller. Reversible DNA i-motif to hairpin switching induced by copper(ii) cations. Chem. Commun., 2015; DOI: 10.1039/C5CC05111H

 

Source: University of East Anglia. “Building computers from DNA?.” ScienceDaily. ScienceDaily, 19 August 2015. <www.sciencedaily.com/releases/2015/08/150819083421.htm>.

