Date:
August 28, 2014

 

Source:
Broad Institute of MIT and Harvard

 

Summary:
In response to an ongoing, unprecedented outbreak of Ebola virus disease in West Africa, a team of researchers has rapidly sequenced and analyzed 99 Ebola virus genomes. Their findings could have important implications for rapid field diagnostic tests.

 

 

20140829-1
Created by CDC microbiologist Frederick A. Murphy, this colorized transmission electron micrograph (TEM) revealed some of the ultrastructural morphology displayed by an Ebola virus virion.
Credit: CDC/Frederick A. Murphy

 

 

In response to an ongoing, unprecedented outbreak of Ebola virus disease (EVD) in West Africa, a team of researchers from the Broad Institute and Harvard University, in collaboration with the Sierra Leone Ministry of Health and Sanitation and researchers across institutions and continents, has rapidly sequenced and analyzed 99 Ebola virus genomes. Their findings could have important implications for rapid field diagnostic tests. The team reports its results online in the journal Science.

For the current study, researchers sequenced 99 Ebola virus genomes collected from 78 patients diagnosed with Ebola in Sierra Leone during the first 24 days of the outbreak (a portion of the patients contributed samples more than once, allowing researchers a clearer view into how the virus can change in a single individual over the course of infection). The team found more than 300 genetic changes that make the 2014 Ebola virus genomes distinct from the viral genomes tied to previous Ebola outbreaks. They also found sequence variations indicating that, from the samples sequenced, the EVD outbreak started from a single introduction into humans, subsequently spreading from person to person over many months.

The variations they identified were frequently in regions of the genome encoding proteins. Some of the genetic variation detected in these studies may affect the primers (starting points for DNA synthesis) used in PCR-based diagnostic tests, emphasizing the importance of genomic surveillance and the need for vigilance. To accelerate response efforts, the research team released the full-length sequences on National Center for Biotechnology Information’s (NCBI’s) DNA sequence database in advance of publication, making these data available to the global scientific community.
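
To illustrate the primer concern, the sketch below compares a PCR primer against its binding site in a set of genomes and flags accumulated mismatches. The primer and genome fragments are made-up placeholders, not sequences from the actual Ebola diagnostic assays or the newly released genomes.

```python
# Minimal sketch: flag a PCR primer whose binding site has drifted in newly
# sequenced genomes. The primer and sites below are invented placeholders.

def count_mismatches(primer: str, site: str) -> int:
    """Number of positions where the primer no longer matches its target site."""
    return sum(1 for p, s in zip(primer, site) if p != s)

primer = "ATGGACAGGCTTCAA"                     # hypothetical assay primer
binding_sites = {                               # hypothetical site from each genome
    "reference_1976": "ATGGACAGGCTTCAA",
    "outbreak_2014a": "ATGGACAAGCTTCAA",        # one substitution under the primer
    "outbreak_2014b": "ATGGATAAGCTTCAA",        # two substitutions
}

for genome, site in binding_sites.items():
    n = count_mismatches(primer, site)
    status = "OK" if n == 0 else f"{n} mismatch(es) -- assay may lose sensitivity"
    print(f"{genome}: {status}")
```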

“By making the data immediately available to the community, we hope to accelerate response efforts,” said co-senior author Pardis Sabeti, a senior associate member at the Broad Institute and an associate professor at Harvard University. “Upon releasing our first batch of Ebola sequences in June, some of the world’s leading epidemic specialists contacted us, and many of them are now also actively working on the data. We were honored and encouraged. A spirit of international and multidisciplinary collaboration is needed to quickly shed light on the ongoing outbreak.”

The 2014 Zaire ebolavirus (EBOV) outbreak is unprecedented both in its size and in its emergence in multiple populated areas. Previous outbreaks had been localized mostly to sparsely populated regions of Middle Africa, with the largest outbreak in 1976 reporting 318 cases. The 2014 outbreak has manifested in the more densely populated West Africa, and since it was first reported in Guinea in March 2014, 2,240 cases have been reported with 1,229 deaths (as of August 19).

Augustine Goba, Director of the Lassa Laboratory at the Kenema Government Hospital and a co-first author of the paper, identified the first Ebola virus disease case in Sierra Leone using PCR-based diagnostics. “We established surveillance for Ebola well ahead of the disease’s spread into Sierra Leone and began retrospective screening for the disease on samples as far back as January of this year,” said Goba. “This was possible because of our long-standing work to diagnose and study another deadly disease, Lassa fever. We could thus identify cases and trace the Ebola virus spread as soon as it entered our country.”

The research team increased the amount of genomic data available on the Ebola virus fourfold and used the technique of “deep sequencing” on all available samples. Deep sequencing is sequencing done enough times to generate high confidence in the results. In this study, researchers sequenced at a depth of 2,000 times on average for each Ebola genome to get an extremely close-up view of the virus genomes from 78 patients. This high-resolution view allowed the team to detect multiple mutations that alter protein sequences — potential targets for future diagnostics, vaccines, and therapies.
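
The arithmetic behind that sequencing depth follows the standard coverage relation, sketched below. The roughly 19,000-base genome size is the approximate length of the Ebola virus genome; the 100-base read length is an assumed value for illustration, not a figure reported by the team.

```python
# Minimal sketch of the standard sequencing-depth relation:
#   mean depth = (number of reads x read length) / genome length.

GENOME_LENGTH = 19_000      # approximate Ebola virus genome length, in bases
READ_LENGTH = 100           # assumed read length, in bases (illustrative only)
TARGET_DEPTH = 2_000        # mean depth reported in the study

reads_needed = TARGET_DEPTH * GENOME_LENGTH / READ_LENGTH
print(f"~{reads_needed:,.0f} mapped reads per sample for {TARGET_DEPTH}x depth")
# -> ~380,000 mapped reads per sample for 2000x depth
```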

The Ebola strains responsible for the current outbreak likely have a common ancestor, dating back to the very first recorded outbreak in 1976. The researchers also traced the transmission path and evolutionary relationships of the samples, revealing that the lineage responsible for the current outbreak diverged from the Middle African version of the virus within the last ten years and spread from Guinea to Sierra Leone by 12 people who had attended the same funeral.

The team’s catalog of 395 mutations (over 340 that distinguish the current outbreak from previous ones, and over 50 within the West African outbreak) may serve as a starting point for other research groups. “We’ve uncovered more than 300 genetic clues about what sets this outbreak apart from previous outbreaks,” said Stephen Gire, a research scientist in the Sabeti lab at the Broad Institute and Harvard. “Although we don’t know whether these differences are related to the severity of the current outbreak, by sharing these data with the research community, we hope to speed up our understanding of this epidemic and support global efforts to contain it.”

“There is an extraordinary battle still ahead, and we have lost many friends and colleagues already, like our good friend and colleague Dr. Humarr Khan, a co-senior author here,” said Sabeti. “By providing this data to the research community immediately and demonstrating transparency and partnership, we hope to honor Humarr’s legacy. We are all in this fight together.”

This work was supported by the Common Fund and the National Institute of Allergy and Infectious Diseases at the National Institutes of Health, Department of Health and Human Services, as well as by the National Science Foundation, the European Union Seventh Framework Programme, the World Bank, and the Natural Environment Research Council.

Other researchers who contributed to this work include Augustine Goba, Kristian G. Andersen, Rachel S. G. Sealfon, Daniel J. Park, Lansana Kanneh, Simbirie Jalloh, Mambu Momoh, Mohamed Fullah, Gytis Dudas, Shirlee Wohl, Lina M. Moses, Nathan L. Yozwiak, Sarah Winnicki, Christian B. Matranga, Christine M. Malboeuf, James Qu, Adrianne D. Gladden, Stephen F. Schaffner, Xiao Yang, Pan-Pan Jiang, Mahan Nekoui, Andres Colubri, Moinya Ruth Coomber, Mbalu Fonnie, Alex Moigboi, Michael Gbakie, Fatima K. Kamara, Veronica Tucker, Edwin Konuwa, Sidiki Saffa, Josephine Sellu, Abdul Azziz Jalloh, Alice Kovoma, James Koninga, Ibrahim Mustapha, Kandeh Kargbo, Momoh Foday, Mohamed Yillah, Franklyn Kanneh, Willie Robert, James L. B. Massally, Sinéad B. Chapman, James Bochicchio, Cheryl Murphy, Chad Nusbaum, Sarah Young, Bruce W. Birren, Donald S. Grant, John S. Scheiffelin, Eric S. Lander, Christian Happi, Sahr M. Gevao, Andreas Gnirke, Andrew Rambaut, Robert F. Garry, and S. Humarr Khan.


Story Source:

The above story is based on materials provided by Broad Institute of MIT and Harvard. Note: Materials may be edited for content and length.


Journal Reference:

  1. Gire SK, Goba A, et al. Genomic surveillance elucidates Ebola virus origin and transmission during the 2014 outbreak. Science, 2014; DOI: 10.1126/science.1259657

 

Broad Institute of MIT and Harvard. “Genomic sequencing reveals mutations, insights into 2014 Ebola outbreak.” ScienceDaily. ScienceDaily, 28 August 2014. <www.sciencedaily.com/releases/2014/08/140828142738.htm>.

Filed Under News | Leave a Comment 

Date:
August 27, 2014

 

Source:
Penn State Materials Research Institute

 

Summary:
Silicon has been the most successful material of the 20th century, with major global industries and even a valley named after it. But silicon may be running out of steam for high performance/low power electronics. As silicon strains against the physical limits of performance, could a material like InGaAs provide enough of an improvement over silicon that it would be worth the expense in new equipment lines and training to make the switch worthwhile?

 

 

20140828-1
Scanning Electron Microscope micrograph of multigate InGaAs nanowire field effect transistor with an array of five nanowires of width 40 nm.
Credit: Arun Thathachary, Penn State

 

 

In the consumer electronics industry, the mantra for innovation is higher device performance/less power. Arun Thathachary, a Ph.D. student in Penn State’s Electrical Engineering Department, spends his days and sometimes nights in the cleanroom of the Materials Research Institute’s Nanofabrication Laboratory trying to make innovative transistor devices out of materials other than the standard semiconductor silicon that will allow higher performance using less power.

Silicon has been the most successful material of the 20th century, with major global industries and even a valley named after it. But silicon may be running out of steam for high performance/low power electronics. For example, the compound semiconductor indium gallium arsenide is known to have far higher electron mobility than silicon. As silicon strains against the physical limits of performance, could a material like InGaAs provide enough of an improvement over silicon that it would be worth the expense in new equipment lines and training to make the switch worthwhile? Samsung, one of the world’s largest electronics companies, has funded Thathachary through his adviser, professor of electrical engineering Suman Datta, in a project to help them find out.

In an article in the journal Nano Letters early this year, Thathachary and his coauthors described a novel device prototype designed to test nanowires made of compound semiconductors such as InGaAs. Their goal was to see for the first time if such a compound material would retain its superior electron mobility at nanoscale dimensions in a so-called FinFET device configuration, the standard transistor architecture for sub-22 nanometer technology.

“We developed a novel test structure called a Multi-fin Hall Bar Structure. It is the first such measurement of Hall mobility in a multi-fin 3D device,” Thathachary said. “If you look at mainstream chip production today, all transistors are made in a 3D fashion, and because they are made in 3D rather than the earlier planar design, several mechanisms can degrade performance. What we looked at in that paper is how much degradation do you really suffer when going from a planar 2D surface to, in this case, 30 nm size features that are confined in 3D?”

Thathachary and colleagues found that electron mobility declined along a regular slope and that the experimental results could be modeled using the scattering relaxation-time approximation. Using this technique, they were able to predict how a compound semiconductor device would likely perform at the dimensions where the material might actually be adopted, for example the 7 nm technology node.
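
For context, the relaxation-time picture behind such mobility predictions rests on two textbook relations; this is standard semiconductor transport theory, not the specific model fitted in the paper.

```latex
% Drude-type mobility in the relaxation-time approximation, and
% Matthiessen's rule for combining independent scattering mechanisms:
\mu = \frac{q\,\tau}{m^{*}},
\qquad
\frac{1}{\tau_{\text{total}}} = \frac{1}{\tau_{\text{phonon}}} + \frac{1}{\tau_{\text{surface}}} + \cdots
```

Because InGaAs has a much smaller electron effective mass than silicon, its mobility can remain higher even after confinement in a 3D fin adds an extra surface-roughness scattering term to the total rate.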

“We found that at dimensions of even 5 nm, you can still expect a 2x to 3x advantage in the mobility of these materials over silicon, which is very significant,” said Thathachary. “After we published this paper, it was clear from a fundamental physics point of view that if you engineer the device correctly you should outperform existing silicon devices. But will it really? That’s what we set out to investigate next.”

Conference paper draws interest

The VLSI Symposia, an international conference on semiconductor technology and circuits, is the leading venue for discussing advances in microelectronic devices. The majority of the presenters are from industry, with only a handful of student papers picked for presentation. At this year’s VLSI, one of those student papers was Thathachary’s.

“The paper in Nano Letters was a precursor to the one chosen for the conference,” Thathachary said. They had made a device with 30 nm features and measured the compound semiconductor’s electron mobility down to that scale, but now it was time to actually make nanoscale transistors out of the new materials system and understand transistor behavior in that system.

Two of the most important parameters in transistor technology are the “subthreshold slope” and the “on current.” Subthreshold slope indicates how efficiently you can turn the transistor on and off, while on current simply means how much current you can get out of the device. Especially for mobile devices, if the transistor can deliver the same amount of current at a lower voltage, it will extend battery life and reduce the amount of heat that must be dissipated.
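
For reference, the textbook expression for subthreshold swing shows why this parameter is tied so directly to supply voltage and power; this is general MOSFET theory rather than a result from Thathachary’s devices.

```latex
% Subthreshold swing of a conventional MOSFET (gate voltage needed for a
% tenfold change in subthreshold current):
SS = \ln(10)\,\frac{kT}{q}\left(1 + \frac{C_{\text{dep}}}{C_{\text{ox}}}\right)
\;\gtrsim\; 60\ \text{mV/decade at } T = 300\ \text{K}
```

A conventional transistor cannot do better than about 60 mV per decade at room temperature, so the closer a device sits to that floor, the less voltage it needs to switch cleanly between off and on.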

“In addition, it’s imperative that you increase the functionality of the computer chip,” Thathachary explained, “but that means putting more transistors inside. If you are going to put a 50 or 60 watt limit on the average power consumption of your chip, then those transistors have to require lower power than the existing devices.”

Working on his Ph.D. project and through regular consultation with the Penn State Nanofab engineering staff, Thathachary spent a year in the cleanroom optimizing the processes required to put a new material system into a state of the art 3D FinFET device. That required spending many hours tweaking the conditions, such as temperature, flow rate, types of reactant gases, as well as refining his electron beam lithography and dry etch patterning techniques.

One of the most challenging issues he overcame was etching InGaAs into dense fin arrays with nanoscale dimensions. Once that was accomplished, he then needed to see how the new compound semiconductor system interacted with the other materials systems, such as the high-k dielectric thin film coating that surrounded the InGaAs fin.

“If you can get that process right, then you can make a great device, and that is what we showed at the conference,” he said. “We showed that in terms of on current at lower supply voltage we are seeing very good performance compared to existing silicon devices.”

Between the time the conference paper was accepted and the June meeting in Hawaii, Thathachary had continued to refine his processes. He learned that increasing the percentage of indium in the ternary (3-part) material system increased electron mobility significantly. Another mobility boost comes from engineering the dimensions of the active material so that the electrons are forced toward the middle of the material in a process called quantum confinement. This is important because in traditional transistors the electrons move close to the surface where they are exposed to microscopic roughness that degrades their mobility.

“The paper was remarkably well received at the conference, and we had a lot of requests to share our new and improved results. We had to get permission from Samsung to share that material, and eventually we did,” Thathachary said.

Encouraged by his results, Samsung has since renewed the contract with the Datta lab and Thathachary for another year. His next challenge will be the most difficult so far. They want him to investigate the performance of a 3D transistor at the 7 nm dimension, a node that the semiconductor industry is looking at for the future. Doing this in the Penn State Nanofab for the first time means he will need to develop innovative ideas to overcome the limitations of working with the limited resources of a university lab. Once a device can be made at or approaching those dimensions, Samsung will likely internalize the research and assign a large team of engineers to develop reproducible industry-scale devices.

“If we can show that at those dimensions, these III-V compound semiconductor systems can still beat silicon, that is when it makes sense for industry to move in and invest the billions of dollars required for the new technology generation,” Thathachary concluded.


Story Source:

The above story is based on materials provided by Penn State Materials Research Institute. Note: Materials may be edited for content and length.


Journal Reference:

  1. Arun V. Thathachary, Nidhi Agrawal, Lu Liu, Suman Datta. Electron Transport in Multigate InxGa1–xAs Nanowire FETs: From Diffusive to Ballistic Regimes at Room Temperature. Nano Letters, 2014; 14 (2): 626. DOI: 10.1021/nl4038399

 

Penn State Materials Research Institute. “Materials Other Than Silicon for Next Generation Electronic Devices.” ScienceDaily. ScienceDaily, 27 August 2014. <www.sciencedaily.com/releases/2014/08/140827122509.htm>.

Filed Under News | Leave a Comment 

Date:
August 26, 2014

 

Source:
University of Illinois at Urbana-Champaign

 

Summary:
A new analysis suggests the planet can produce much more land-plant biomass — the total material in leaves, stems, roots, fruits, grains and other terrestrial plant parts — than previously thought. The study recalculates the theoretical limit of terrestrial plant productivity, and finds that it is much higher than many current estimates allow.

 

 

20140827-1
Scientists have historically underestimated the potential productivity of the earth’s land plants, researchers report in a new study.
Credit: NASA Earth Observatory image by Jesse Allen

 

 

A new analysis suggests the planet can produce much more land-plant biomass — the total material in leaves, stems, roots, fruits, grains and other terrestrial plant parts — than previously thought.

The study, reported in Environmental Science and Technology, recalculates the theoretical limit of terrestrial plant productivity, and finds that it is much higher than many current estimates allow.

“When you try to estimate something over the whole planet, you have to make some simplifying assumptions,” said University of Illinois plant biology professor Evan DeLucia, who led the new analysis. “And most previous research assumes that the maximum productivity you could get out of a landscape is what the natural ecosystem would have produced. But it turns out that in nature very few plants have evolved to maximize their growth rates.”

DeLucia directs the Institute for Sustainability, Energy, and Environment at the U. of I. He also is an affiliate of the Energy Biosciences Institute, which funded the research through the Institute for Genomic Biology at Illinois.

Estimates derived from satellite images of vegetation and modeling suggest that about 54 gigatons of carbon is converted into terrestrial plant biomass each year, the researchers report.

“This value has remained stable for the past several decades, leading to the conclusion that it represents a planetary boundary — an upper limit on global biomass production,” the researchers wrote.

But these assumptions don’t take into consideration human efforts to boost plant productivity through genetic manipulation, plant breeding and land management, DeLucia said. Such efforts have already yielded some extremely productive plants.

For example, in Illinois a hybrid grass, Miscanthus x giganteus, without fertilizer or irrigation produced 10 to 16 tons of above-ground biomass per acre, more than double the productivity of native prairie vegetation or corn. And genetically modified no-till corn is more than five times as productive — in terms of total biomass generated per acre — as restored prairie in Wisconsin.

Some non-native species also outcompete native species; this is what makes many of them invasive, DeLucia said. In Iceland, for example, an introduced species, the nootka lupine, produces four times as much biomass as the native boreal dwarf birch species it displaces. And in India bamboo plantations produce about 40 percent more biomass than dry, deciduous tropical forests.

Some of these plants would not be desirable additions to native or managed ecosystems, DeLucia said, but they represent the untapped potential productivity of plants in general.

“We’re saying this is what’s possible,” he said.

The team used a model of light-use efficiency and the theoretical maximum efficiency with which plant canopies convert solar radiation to biomass to estimate the theoretical limit of net primary production (NPP) on a global scale. This newly calculated limit was “roughly two orders of magnitude higher than the productivity of most current managed or natural ecosystems,” the authors wrote.
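
The structure of such a light-use-efficiency estimate can be sketched in a few lines. Every parameter value below is an assumed, illustrative number chosen for a back-of-envelope calculation, not a value taken from the DeLucia et al. analysis, so the output should be read only as a rough indication of why a theoretical ceiling sits far above observed productivity.

```python
# Back-of-envelope sketch of a light-use-efficiency estimate of potential NPP:
#   NPP ~ incident solar energy x conversion efficiency / energy per kg biomass.
# All numbers are assumed, illustrative values, not parameters from the study.

SOLAR_FLUX = 180.0          # assumed mean solar irradiance at the surface, W/m^2
SECONDS_PER_YEAR = 3.15e7
MAX_EFFICIENCY = 0.06       # assumed maximum conversion of solar energy to biomass energy
BIOMASS_ENERGY = 1.75e7     # assumed energy content of dry biomass, J/kg
CARBON_FRACTION = 0.45      # assumed carbon fraction of dry biomass

energy_in = SOLAR_FLUX * SECONDS_PER_YEAR                  # J per m^2 per year
dry_biomass = energy_in * MAX_EFFICIENCY / BIOMASS_ENERGY  # kg per m^2 per year
carbon = dry_biomass * CARBON_FRACTION * 1000              # g C per m^2 per year

print(f"Illustrative ceiling: ~{carbon:,.0f} g C per m^2 per year")
# Typical present-day ecosystems run in the hundreds of g C per m^2 per year,
# so even this crude ceiling sits far above observed productivity.
```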

“We’re not saying that this is even approachable, but the theory tells us that what is possible on the planet is much, much higher than what current estimates are,” DeLucia said.

Taking into account global water limitations reduced this theoretical limit by more than 20 percent in all parts of the terrestrial landscape except the tropics, DeLucia said. “But even that water-limited NPP is many times higher than we see in our current agricultural systems.”

DeLucia cautions that scientists and agronomists have a long way to go to boost plant productivity beyond current limits, and the new analysis does not suggest that shortages of food or other plant-based resources will cease to be a problem.

“I don’t want to be the guy that says science is going to save the planet and we shouldn’t worry about the environmental consequences of agriculture, we shouldn’t worry about runaway population growth,” he said. “All I’m saying is that we’re underestimating the productive capacity of plants in managed ecosystems.”


Story Source:

The above story is based on materials provided by University of Illinois at Urbana-Champaign. Note: Materials may be edited for content and length.


Journal Reference:

  1. Evan H. DeLucia, Nuria Gomez-Casanovas, Jonathan A. Greenberg, Tara W. Hudiburg, Ilsa B. Kantola, Stephen P. Long, Adam D. Miller, Donald R. Ort, William J. Parton. The Theoretical Limit to Plant Productivity. Environmental Science & Technology, 2014; 48 (16): 9471. DOI: 10.1021/es502348e

 

University of Illinois at Urbana-Champaign. “Earth can sustain more terrestrial plant growth than previously thought, analysis shows.” ScienceDaily. ScienceDaily, 26 August 2014. <www.sciencedaily.com/releases/2014/08/140826100855.htm>.

Filed Under News | Leave a Comment 

Date:
August 25, 2014

 

Source:
U.S. Geological Survey

 

Summary:
Natural methane leakage from the seafloor is far more widespread on the U.S. Atlantic margin than previously thought, according to a study by researchers from Mississippi State University, the U.S. Geological Survey, and other institutions.

 

 

20140826-1
Map of the northern U.S. Atlantic margin showing the locations of newly-discovered methane seeps mapped by researchers from Mississippi State University, the U.S. Geological Survey, and other partners. None of the seeps shown here was known to researchers before 2012.
Credit: Image courtesy of U.S. Geological Survey

 

 

Natural methane leakage from the seafloor is far more widespread on the U.S. Atlantic margin than previously thought, according to a study by researchers from Mississippi State University, the U.S. Geological Survey, and other institutions.

Methane plumes identified in the water column between Cape Hatteras, North Carolina and Georges Bank, Massachusetts, are emanating from at least 570 seafloor cold seeps on the outer continental shelf and the continental slope. Taken together, these areas, which lie between the coastline and the deep ocean, constitute the continental margin. Prior to this study, only three seep areas had been identified beyond the edge of the continental shelf, which occurs at approximately 180 meters (590 feet) water depth between Florida and Maine on the U.S. Atlantic seafloor.

Cold seeps are areas where gases and fluids leak into the overlying water from the sediments. They are designated as cold to distinguish them from hydrothermal vents, which are sites where new oceanic crust is being formed and hot fluids are being emitted at the seafloor. Cold seeps can occur in a much broader range of environments than hydrothermal vents.

“Widespread seepage had not been expected on the Atlantic margin. It is not near a plate tectonic boundary like the U.S. Pacific coast, nor associated with a petroleum basin like the northern Gulf of Mexico,” said Adam Skarke, the study’s lead author and a professor at Mississippi State University.

The gas being emitted by the seeps has not yet been sampled, but researchers believe that most of the leaking methane is produced by microbial processes in shallow sediments. This interpretation is based primarily on the locations of the seeps and knowledge of the underlying geology. Microbial methane is not the type found in deep-seated reservoirs and often tapped as a natural gas resource.

Most of the newly discovered methane seeps lie at depths close to the shallowest conditions at which deepwater marine gas hydrate can exist on the continental slope. Gas hydrate is a naturally occurring, ice-like combination of methane and water, and forms at temperature and pressure conditions commonly found in waters deeper than approximately 500 meters (1640 feet).
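
The pressure side of that stability condition follows from simple hydrostatics; the seawater density used below is an assumed typical value of about 1025 kg/m³.

```latex
% Hydrostatic pressure at the ~500 m depth quoted above:
P \approx \rho g h
  = 1025\ \mathrm{kg\,m^{-3}} \times 9.81\ \mathrm{m\,s^{-2}} \times 500\ \mathrm{m}
  \approx 5.0\ \mathrm{MPa} \;(\approx 50\ \text{atmospheres})
```

That pressure, combined with cold bottom water, is what allows methane hydrate to remain solid at such depths.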

“Warming of ocean temperatures on seasonal, decadal or much longer time scales can cause gas hydrate to release its methane, which may then be emitted at seep sites,” said Carolyn Ruppel, study co-author and chief of the USGS Gas Hydrates Project. “Such continental slope seeps have previously been recognized in the Arctic, but not at mid-latitudes. So this is a first.”

Most seeps described in the new study are too deep for the methane to directly reach the atmosphere, but the methane that remains in the water column can be oxidized to carbon dioxide. This in turn increases the acidity of ocean waters and reduces oxygen levels.
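
The oxidation and acidification chain described here corresponds to standard aqueous chemistry:

```latex
% Aerobic methane oxidation, followed by carbonic-acid formation:
\mathrm{CH_4 + 2\,O_2 \rightarrow CO_2 + 2\,H_2O}
\qquad
\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}
```

The first reaction consumes dissolved oxygen; the carbon dioxide it produces then forms carbonic acid, releasing hydrogen ions and lowering pH.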

Shallow-water seeps that may be related to offshore groundwater discharge were detected at the edge of the shelf and in the upper part of Hudson Canyon, an undersea gorge that represents the offshore extension of the Hudson River. Methane from these seeps could directly reach the atmosphere, contributing to increased concentrations of this potent greenhouse gas. More extensive shallow-water surveys than described in this study will be required to document the extent of such seeps.

Some of the new methane seeps were discovered in 2012. In summer 2013, a Brown University undergraduate and National Oceanic and Atmospheric Administration Hollings Scholar, Mali’o Kodis, worked with Skarke to analyze about 94,000 square kilometers (about 36,000 square miles) of water column imaging data to map the methane plumes. The data had been collected by the vessel Okeanos Explorer between 2011 and 2013. The Okeanos Explorer and the Deep Discoverer remotely operated vehicle, which has photographed the seafloor at some of the methane seeps, are managed by NOAA’s Office of Ocean Exploration and Research.

“This study continues the tradition of advancing U.S. marine science research through partnerships between federal agencies and the involvement of academic researchers,” said John Haines, coordinator of the USGS Coastal and Marine Geology Program. “NOAA’s Ocean Exploration program acquired state-of-the-art data at the scale of the entire margin, while academic and USGS scientists teamed to interpret these data in the context of a research problem of global significance.”

The study, “Widespread methane leakage from the sea floor on the northern US Atlantic Margin,” by A. Skarke, C. Ruppel, M. Kodis, D. Brothers and E. Lobecker in Nature Geoscience, is available online.

USGS Gas Hydrates Project

The USGS has a globally recognized research effort studying natural gas hydrates in deepwater and permafrost settings worldwide. USGS researchers focus on the potential of gas hydrates as an energy resource, the impact of climate change on gas hydrates, and seafloor stability issues.

For more information about the U.S. Geological Survey’s Gas Hydrates Project, visit the Woods Hole Coastal and Marine Science Center, U.S. Geological Survey Gas Hydrates Project website (http://woodshole.er.usgs.gov/).

For more information, visit the Mississippi State University website (http://www.msstate.edu/).


Story Source:

The above story is based on materials provided by U.S. Geological Survey. Note: Materials may be edited for content and length.


Journal Reference:

  1. A. Skarke, C. Ruppel, M. Kodis, D. Brothers, E. Lobecker. Widespread methane leakage from the sea floor on the northern US Atlantic margin. Nature Geoscience, 2014; DOI: 10.1038/ngeo2232

 

 

U.S. Geological Survey. “Natural methane seepage on U.S. Atlantic ocean margin widespread.” ScienceDaily. ScienceDaily, 25 August 2014. <www.sciencedaily.com/releases/2014/08/140825141457.htm>.

Filed Under News | Leave a Comment 

Date:
August 24, 2014

 

Source:
Brown University

 

Summary:
In a new study researchers show that they could make faint sensations more vivid by triggering a brain rhythm that appears to shift sensory attention. The study in mice provides the first direct evidence that the brain’s ‘gamma’ rhythms have a causal role in processing the sense of touch.

 

 

20140825-1
Precise pulses of blue light allowed researchers to generate a 40-hertz gamma rhythm in the sensory neocortex of mice. Mice with that rhythm in their brains could more often detect the fainter vibrations the researchers provided to their whiskers.
Credit: Mike Cohea/Brown University

 

 

By striking up the right rhythm in the right brain region at the right time, Brown University neuroscientists report in Nature Neuroscience that they managed to endow mice with greater touch sensitivity than other mice, making hard-to-perceive vibrations suddenly more vivid to them.

The findings offer the first direct evidence that “gamma” brainwaves in the cortex affect perception and attention. With only correlations and associations as evidence before, neuroscientists have argued for years about whether gamma has an important role or whether it’s merely a byproduct — an “exhaust fume” in the words of one — of such brain activity.

“There’s a lot of excitement about the importance of gamma rhythms in behavior, as well as a lot of skepticism,” said co-lead author Joshua Siegle, a former graduate student at Brown University and MIT, who is now at the Allen Institute for Brain Science. “Rather than try to correlate changes in gamma rhythms with changes in behavior, which is what researchers have done in the past, we chose to directly control the cells that produce gamma.”

The result was a mouse with whiskers that were about 20 percent more sensitive.

“There were a lot of ways this experiment could have failed but instead to our surprise it was pretty decisive from the very first subject we looked at — that under certain conditions we can make a super-perceiving mouse,” said Christopher Moore, associate professor of neuroscience at Brown and senior author of the study. “We’re making a mouse do better than a mouse could have done otherwise.”

Specifically, Moore and co-first authors Siegle and Dominique Pritchett performed their experiments by using optogenetics — a technique of using light to control the firing patterns of neurons — to generate a gamma rhythm by manipulating inhibitory interneurons in the primary sensory neocortex of mice. That part of the brain controls a mouse’s ability to detect faint sensations via its whiskers.

A different part of the brain handles stronger, more imposing sensations, Moore said. The primary sensory neocortex, a particular feature of mammals, has the distinction of allowing an animal to purposely pay attention to more subtle sensations. It’s the difference between the feeling of gently brushing a fingertip along a wood board to assess if it needs a bit more sanding and the feeling of dropping the wood board on a foot.

Before anything else in the paper, the researchers confirmed that mice at times naturally produce a 40-hertz gamma rhythm in their sensory neocortex. Then they optogenetically generated that gamma rhythm with precise pulses of blue light. Mice with this rhythm could more often detect the fainter vibrations the researchers supplied to their whiskers than could mice that did not have the rhythm going in their brains.

Control and optogenetically stimulated mice alike had been conditioned to indicate their detection of a supplied stimulus by licking a water bottle. The vibrations presented to the mice spanned 17 different levels of detectability.

The team’s hypothesis was that the gamma rhythm, because the stimulated neurons inhibit the transmission of sensation messages by pyramidal neurons in the neocortex with a structured periodicity, actually orders the pyramidal messages into a more coherent and therefore stronger train.

“It’s not surprising that these synchronized bursts of activity can benefit signal transmission, in the same way that synchronized clapping in a crowd of people is louder than random clapping,” Siegle said.

This idea suggested that the timing of the rhythm matters.

So in another experiment, Siegle, Pritchett, and Moore varied the onset of the gamma rhythm by increments of 5 milliseconds to see whether it made a difference to perception. It did. The mice showed their increased sensitivity only so long as the gamma rhythms were underway 20-25 milliseconds before the subtle sensations were presented. If they weren’t, the mice experienced on average no impact on sensitivity.
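
A quick check of the numbers shows why that window is notable:

```latex
% Period of the 40 Hz gamma rhythm:
T = \frac{1}{f} = \frac{1}{40\ \mathrm{Hz}} = 25\ \mathrm{ms}
```

So the effective 20-25 millisecond lead corresponds to roughly one full gamma cycle before the stimulus arrives, and the 5-millisecond steps sample that cycle in fifths.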

One of the key implications from the findings for neuroscience, Moore said, is that the way gamma rhythms appear to structure the processing of perception is more important than the mere firing rate of neurons in the sensory neocortex. Mice became better able to feel not because neurons became more active (they didn’t), but because they were entrained by a precisely timed rhythm.

Although the study provides causal evidence of a functional importance for gamma rhythms, Moore acknowledged, it still leaves open important questions. The exact mechanism by which gamma rhythms affect sensation processing and attention are not proved, only hypothesized.

And in one experiment, optogenetically stimulated mice appeared less able to detect the most obvious and imposing of the sensations, even as they became more sensitive to the more subtle ones. In other experiments, however, their detection of major sensations was not compromised.

But the possible loss of sensitivity to stimuli that are easier to feel could be consistent with a shifting of attention to fainter ones, said Pritchett, also a former Brown and MIT student now at the Champalimaud Centre for the Unknown in Lisbon, Portugal.

“What we are showing is that, paradoxically, the rhythmic inhibitory input works to amplify threshold stimuli, possibly at the expense of salient stimuli,” he said. “This is precisely what you would expect from a mechanism that might be responsible for selective attention in the brain.”

Therefore, Siegle, Pritchett, and Moore say they do have a better feel now for what’s going on in the brain.


Story Source:

The above story is based on materials provided by Brown University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Joshua H Siegle, Dominique L Pritchett, Christopher I Moore. Gamma-range synchronization of fast-spiking interneurons can enhance detection of tactile stimuli. Nature Neuroscience, 2014; DOI: 10.1038/nn.3797

 

Brown University. “Driving brain rhythm makes mice more sensitive to touch.” ScienceDaily. ScienceDaily, 24 August 2014. <www.sciencedaily.com/releases/2014/08/140824152341.htm>.

Filed Under News | Leave a Comment 

Date:
August 21, 2014

 

Source:
Taylor & Francis

 

Summary:
Researchers have designed a computer program that can accurately recognize users’ emotional states as much as 87% of the time, depending on the emotion. The study combined — for the first time — two established ways of detecting user emotions: keystroke dynamics and text-pattern analysis.

 

 

20140822-1
Researchers have designed a computer program that can accurately recognise users’ emotional states as much as 87% of the time, depending on the emotion.
Credit: © jinga80 / Fotolia

 

 

Researchers in Bangladesh have designed a computer programme that can accurately recognise users’ emotional states as much as 87% of the time, depending on the emotion.

Writing in the journal Behaviour & Information Technology, A.F.M. Nazmul Haque Nahin and his colleagues describe how their study combined — for the first time — two established ways of detecting user emotions: keystroke dynamics and text-pattern analysis.

To provide data for the study, volunteers were asked to note their emotional state after typing passages of fixed text, as well as at regular intervals during their regular (‘free text’) computer use; this provided the researchers with data about keystroke attributes associated with seven emotional states (joy, fear, anger, sadness, disgust, shame and guilt). To help them analyse sample texts, the researchers made use of a standard database of words and sentences associated with the same seven emotional states.

After running a variety of tests, the researchers found that their new ‘combined’ results were better than their separate results; what’s more, the ‘combined’ approach improved performance for five of the seven categories of emotion. Joy (87%) and anger (81%) had the highest rates of accuracy.
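
As a rough illustration of feature-level fusion of the two data sources, the sketch below concatenates hypothetical keystroke-timing features with text-derived emotion-word counts and trains an off-the-shelf classifier on synthetic data. The feature set, model choice and data are assumptions for illustration; they are not the pipeline or features used by Nahin and colleagues.

```python
# Minimal sketch of fusing keystroke-dynamics features with text-derived
# features before classification. Everything below is a placeholder.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-sample features:
#   keystroke: mean key-hold time, mean inter-key latency, backspace rate
#   text:      counts of words matched to each of the seven emotion categories
keystroke_features = rng.random((200, 3))
text_features = rng.random((200, 7))
labels = rng.integers(0, 7, size=200)   # joy, fear, anger, sadness, disgust, shame, guilt

X = np.hstack([keystroke_features, text_features])   # simple feature-level fusion
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

print("training accuracy on toy data:", clf.score(X, labels))
```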

This research is an important contribution to ‘affective computing’, a growing field dedicated to ‘detecting user emotion in a particular moment’. As the authors note, for all the advances in computing power, performance and size in recent years, a lot more can still be done in terms of how computers interact with end users. “Emotionally aware systems can be a step ahead in this regard,” they write.

“Computer systems that can detect user emotion can do a lot better than the present systems in gaming, online teaching, text processing, video and image processing, user authentication and so many other areas where user emotional state is crucial.”

While much work remains to be done, this research is an important step toward making ‘emotionally intelligent’ systems a reality: systems that recognise users’ emotional states and adapt their music, graphics, content or approach to learning accordingly.


Story Source:

The above story is based on materials provided by Taylor & Francis. Note: Materials may be edited for content and length.


Journal Reference:

  1. A.F.M. Nazmul Haque Nahin, Jawad Mohammad Alam, Hasan Mahmud, Kamrul Hasan. Identifying emotion by keystroke dynamics and text pattern analysis. Behaviour & Information Technology, 2014; 33 (9): 987. DOI: 10.1080/0144929X.2014.907343

 

 

Taylor & Francis. “Does your computer know how you’re feeling?.” ScienceDaily. ScienceDaily, 21 August 2014. <www.sciencedaily.com/releases/2014/08/140821090524.htm>.

Filed Under News | Leave a Comment 

Date:
August 20, 2014

 

Source:
University of Toronto

 

Summary:
A group of inhibitory neurons, whose loss leads to sleep disruption in experimental animals, are substantially diminished among the elderly and individuals with Alzheimer’s disease, researchers have found. The authors examined the brains of 45 study subjects (median age at death, 89.2), identifying ventrolateral preoptic neurons by staining the brains for the neurotransmitter galanin. They then correlated the actigraphic rest-activity behavior of the 45 individuals in the year prior to their deaths with the number of remaining ventrolateral preoptic neurons at autopsy.

 

 

As people grow older, they often have difficulty falling asleep and staying asleep, and tend to awaken too early in the morning. In individuals with Alzheimer’s disease, this common and troubling symptom of aging tends to be especially pronounced, often leading to nighttime confusion and wandering.

Now, a study led by researchers at Beth Israel Deaconess Medical Center (BIDMC) and the University of Toronto/Sunnybrook Health Sciences Center helps explain why sleep becomes more fragmented with age. Reported online today in the journal Brain, the new findings demonstrate for the first time that a group of inhibitory neurons, whose loss leads to sleep disruption in experimental animals, are substantially diminished among the elderly and individuals with Alzheimer’s disease, and that this, in turn, is accompanied by sleep disruption.

“On average, a person in his 70s has about one hour less sleep per night than a person in his 20s,” explains senior author Clifford B. Saper, MD, PhD, Chairman of Neurology at BIDMC and James Jackson Putnam Professor of Neurology at Harvard Medical School. “Sleep loss and sleep fragmentation is associated with a number of health issues, including cognitive dysfunction, increased blood pressure and vascular disease, and a tendency to develop type 2 diabetes. It now appears that loss of these neurons may be contributing to these various disorders as people age.”

In 1996, the Saper lab first discovered that the ventrolateral preoptic nucleus, a key cell group of inhibitory neurons, was functioning as a “sleep switch” in rats, turning off the brain’s arousal systems to enable animals to fall asleep. “Our experiments in animals showed that loss of these neurons produced profound insomnia, with animals sleeping only about 50 percent as much as normal and their remaining sleep being fragmented and disrupted,” he explains.

A group of cells in the human brain, the intermediate nucleus, is located in a similar location and has the same inhibitory neurotransmitter, galanin, as the ventrolateral preoptic nucleus in rats. The authors hypothesized that if the intermediate nucleus was important for human sleep and was homologous to the animal’s ventrolateral preoptic nucleus, then it may also similarly regulate humans’ sleep-wake cycles.

In order to test this hypothesis, the investigators analyzed data from the Rush Memory and Aging Project, a community-based study of aging and dementia which began in 1997 and has been following a group of almost 1,000 subjects who entered the study as healthy 65-year-olds and are followed until their deaths, at which point their brains are donated for research.

“Since 2005, most of the subjects in the Memory and Aging Project have been undergoing actigraphic recording every two years. This consists of their wearing a small wristwatch-type device on their non-dominant arm for seven to 10 days,” explains first author Andrew S. P. Lim, MD, of the University of Toronto and Sunnybrook Health Sciences Center and formerly a member of the Saper lab. The actigraphy device, which is waterproof, is worn 24 hours a day and thereby monitors all movements, large and small, divided into 15-second intervals. “Our previous work had determined that these actigraphic recordings are a good measure of the amount and quality of sleep,” adds Lim.

The authors examined the brains of 45 study subjects (median age at death, 89.2), identifying ventrolateral preoptic neurons by staining the brains for the neurotransmitter galanin. They then correlated the actigraphic rest-activity behavior of the 45 individuals in the year prior to their deaths with the number of remaining ventrolateral preoptic neurons at autopsy.

“We found that in the older patients who did not have Alzheimer’s disease, the number of ventrolateral preoptic neurons correlated inversely with the amount of sleep fragmentation,” says Saper. “The fewer the neurons, the more fragmented the sleep became.” The subjects with the largest number of neurons (more than 6,000) spent 50 percent or more of total rest time in the prolonged periods of non-movement most likely to represent sleep, while subjects with the fewest ventrolateral preoptic neurons (fewer than 3,000) spent less than 40 percent of total rest time in extended periods of rest. The results further showed that among Alzheimer’s patients, most sleep impairment seemed to be related to the number of ventrolateral preoptic neurons that had been lost.
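
The rest-consolidation measure described above can be illustrated with a short sketch that takes 15-second actigraphy epochs scored as rest or activity and computes the share of rest time spent in prolonged rest bouts. The 10-minute bout threshold and the toy epoch series are assumptions for illustration, not the scoring rules used in the study.

```python
# Minimal sketch of a rest-consolidation metric from 15-second actigraphy epochs
# scored as rest (0) or activity (1). Thresholds and data are illustrative only.
from itertools import groupby

EPOCH_SECONDS = 15
LONG_BOUT_EPOCHS = (10 * 60) // EPOCH_SECONDS   # assumed: >= 10 min counts as "prolonged"

def rest_consolidation(epochs):
    """Fraction of all rest epochs that fall inside prolonged rest bouts."""
    rest_total = 0
    rest_in_long_bouts = 0
    for is_active, run in groupby(epochs):
        length = len(list(run))
        if is_active == 0:                       # a rest bout
            rest_total += length
            if length >= LONG_BOUT_EPOCHS:
                rest_in_long_bouts += length
    return rest_in_long_bouts / rest_total if rest_total else 0.0

# Toy night: 2 h of consolidated rest, a brief awakening, then fragmented rest.
night = [0] * 480 + [1] * 8 + ([0] * 10 + [1] * 6) * 30
print(f"{rest_consolidation(night):.0%} of rest time in prolonged bouts")
```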

“These findings provide the first evidence that the ventrolateral preoptic nucleus in humans probably plays a key role in causing sleep, and functions in a similar way to other species that have been studied,” says Saper. “The loss of these neurons with aging and with Alzheimer’s disease may be an important reason why older individuals often face sleep disruptions. These results may, therefore, lead to new methods to diminish sleep problems in the elderly and prevent sleep-deprivation-related cognitive decline in people with dementia.”


Story Source:

The above story is based on materials provided by University of Toronto. Note: Materials may be edited for content and length.


Journal Reference:

  1. Andrew S. P. Lim, Brian A. Ellison, Joshua L. Wang, Lei Yu, Julie A. Schneider, Aron S. Buchman, David A. Bennett, and Clifford B. Saper. Sleep is related to neuron numbers in the ventrolateral preoptic/intermediate nucleus in older adults with and without Alzheimer’s disease. Brain, August 2014; DOI: 10.1093/brain/awu222

 

 

University of Toronto. “Why elderly are prone to sleep problems.” ScienceDaily. ScienceDaily, 20 August 2014. <www.sciencedaily.com/releases/2014/08/140820091052.htm>.

Filed Under News | Leave a Comment 

Date:
August 18, 2014

 

Source:
Cardiff University

 

Summary:
Small fluctuations in the sizes of ice sheets during the last ice age were enough to trigger abrupt climate change, scientists have found. The team compared simulated model data with that retrieved from ice cores and marine sediments in a bid to find out why temperature jumps of up to ten degrees took place in far northern latitudes within just a few decades during the last ice age.

 

 

20140820-1
New research shows that small fluctuations in the sizes of ice sheets during the last ice age were enough to trigger abrupt climate change.
Credit: © Kushnirov Avraham / Fotolia

 

 

Small fluctuations in the sizes of ice sheets during the last ice age were enough to trigger abrupt climate change, scientists have found.

The team, which included Cardiff University researchers, compared simulated model data with that retrieved from ice cores and marine sediments in a bid to find out why temperature jumps of up to ten degrees took place in far northern latitudes within just a few decades during the last ice age.

The analysis, led by Germany’s Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI), is published Aug. 21, 2014 in the scientific journal Nature.

The research confirms that thicker ice sheets increased ocean circulation and transferred more heat to the north due to a redirection of the prevailing winds. As the north warmed, glaciers retreated, the winds returned to normal conditions, and the north became cooler once again, completing the cycle.

Conor Purcell, from Cardiff University’s School of Earth and Ocean Sciences, said: “Using the simulations performed with our climate model, we were able to demonstrate that the climate system can respond to small changes with abrupt climate swings. Our study suggests that at medium sea levels, powerful forces, such as the dramatic acceleration of polar ice cap melting, are not necessary to create abrupt climate shifts and temperature changes.”

At present, the extent of Arctic sea ice is far less than during the last glacial period. The Laurentide Ice Sheet, the major driving force for ocean circulation during the glacials, has also disappeared. Climate changes following the pattern of the last ice age are therefore not anticipated under today’s conditions.

Professor Gerrit Lohmann, leader of the Paleoclimate Dynamics group at the AWI, said: “In terms of the Earth’s history, we are currently in one of the climate system’s more stable phases. The preconditions which gave rise to rapid temperature changes during the last ice age do not exist today, but sudden climate changes cannot be excluded in future.”


Story Source:

The above story is based on materials provided by Cardiff University. Note: Materials may be edited for content and length.


Journal Reference:

  1. Xu Zhang, Gerrit Lohmann, Gregor Knorr, Conor Purcell. Abrupt glacial climate shifts controlled by ice sheet changes. Nature, 2014; DOI: 10.1038/nature13592

 

 

Cardiff University. “Minor variations in ice sheet size can trigger abrupt climate change.” ScienceDaily. ScienceDaily, 18 August 2014. <www.sciencedaily.com/releases/2014/08/140818224825.htm>.

Filed Under News | Leave a Comment 

Date:
August 18, 2014

 

Source:
Massachusetts General Hospital

 

Summary:
A microfluidic device may help study key steps in the process by which cancer cells break off from a primary tumor to invade other tissues and form metastases. “This device gives us a platform to be used in testing and comparing compounds to block or delay the epithelial-mesenchymal transition, potentially slowing the progression of cancer,” says one researcher.

 

 

20140819-1
As cells undergoing the epithelial-mesenchymal transition move from left to right through the EMT chip, those expressing mesenchymal markers (red) break away and move independently from other cells, while cells expressing epithelial markers (green) continue to move as a collective front.
Credit: BioMEMS Resource Center, Massachusetts General Hospital

 

 

A microfluidic device developed at Massachusetts General Hospital (MGH) may help study key steps in the process by which cancer cells break off from a primary tumor to invade other tissues and form metastases. In their report published in Nature Materials, the investigators describe the device, which they call the EMT chip. EMT stands for epithelial-mesenchymal transition, a fundamental change in cellular characteristics that has been associated with the ability of tumor cells to migrate and invade other sites in the body. Therapies that target this process may be able to slow or halt tumor metastasis.

“This device gives us a platform to be used in testing and comparing compounds to block or delay the epithelial-mesenchymal transition, potentially slowing the progression of cancer,” says Daniel Irimia, MD, PhD, associate director of the BioMEMS Resource Center in the MGH Department of Surgery.

Normally a stage in embryonic development, EMT is important during normal wound healing and also appears to take place when epithelial cells lining bodily surfaces and cavities become malignant. Instead of adhering to each other tightly in layers, cells that have undergone EMT gain the ability to separate out, move to other parts of the body and implant themselves into the new sites. Cells that have transitioned into a mesenchymal state appear to be more resistant to cancer therapies or other measures designed to induce cell death.

The device developed at the MGH allows investigators to follow the movement of cells passing through a comb-like array of micropillars, which temporarily separates cells that are adhering to each other. To establish baseline characteristics of noncancerous cells, the investigators first studied the passage of normal epithelial cells through the array. They observed that those cells moved at the same speed as neighboring cells, reconnecting when they came into contact with each other into multicellular sheets that repeatedly broke apart and resealed. Tumor cells, however, passed quickly and more directly through the device and did not interact with nearby cells.

When cells in which the process of EMT had been initiated by genetic manipulation were observed passing through the device, at first they migrated collectively. But soon after encountering the first micropillars, many cells broke away from the collective front and migrated individually for the rest of their trajectory. Some cells appeared to undergo the opposite transition, reverting from individual migration back to collective migration. Subsequent analysis revealed that the slower moving cells that continued migrating together expressed epithelial markers, while the faster moving, independently migrating cells expressed mesenchymal markers. The individually migrating cells also appeared to be more resistant to treatment with chemotherapy drugs.
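
A toy version of the slow-collective versus fast-individual distinction can be sketched from tracking data alone. The positions, time step and speed cutoff below are invented for illustration and are not values from the Nature Materials study.

```python
# Minimal sketch: separate "collective" from "individual" migrators in tracking
# data by speed, loosely mirroring the slow/epithelial vs fast/mesenchymal split
# described above. All values are invented placeholders.
import math

# (x, y) positions of each cell at two time points, in micrometres,
# separated by an assumed 10-minute interval.
tracks = {
    "cell_A": [(0.0, 0.0), (2.0, 1.0)],    # slow, stays with the front
    "cell_B": [(5.0, 0.0), (14.0, 3.0)],   # fast, breaks away
}
DT_MIN = 10.0
SPEED_THRESHOLD = 0.5   # assumed cutoff, micrometres per minute

for name, ((x0, y0), (x1, y1)) in tracks.items():
    speed = math.hypot(x1 - x0, y1 - y0) / DT_MIN
    mode = "individual (mesenchymal-like)" if speed > SPEED_THRESHOLD else "collective (epithelial-like)"
    print(f"{name}: {speed:.2f} um/min -> {mode}")
```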

A particular advantage of the EMT chip is the ability to observe how the behavior of a population of cells changes over time. “Instead of providing a snapshot of cells or tissues at a specific moment, as traditional histology studies do, the new chip can capture the changing dynamics of individual or collective cellular migration,” explains Irimia, an assistant professor of Surgery at Harvard Medical School. “In the controlled environment of the EMT chip, these processes resemble such phase transitions as the change from solid to liquid that occurs with melting. Analogies with well studied physical processes are very useful for summarizing the complex EMT process into a few parameters. These parameters are very helpful when making comparisons between different cell types and studying the contribution of various biological processes to EMT. They are also useful when comparing different chemicals to discover new compounds to block or delay EMT.”


Story Source:

The above story is based on materials provided by Massachusetts General Hospital. Note: Materials may be edited for content and length.


Journal Reference:

  1. Ian Y. Wong, Sarah Javaid, Elisabeth A. Wong, Sinem Perk, Daniel A. Haber, Mehmet Toner, Daniel Irimia. Collective and individual migration following the epithelial–mesenchymal transition. Nature Materials, 2014; DOI: 10.1038/nmat4062

 

 

Massachusetts General Hospital. “Device monitors key step in development of tumor metastases.” ScienceDaily. ScienceDaily, 18 August 2014. <www.sciencedaily.com/releases/2014/08/140818153611.htm>.

Filed Under News | Leave a Comment 

Date:
August 17, 2014

 

Source:
University of Cambridge

 

Summary:
One of the most bizarre-looking fossils ever found — a worm-like creature with legs, spikes and a head difficult to distinguish from its tail — has found its place in the evolutionary tree of life, definitively linking it with a group of modern animals for the first time.

 

 

20140818-1
This is a reconstruction of the Burgess Shale animal Hallucigenia sparsa.
Credit: Elyssa Rider

 

 

One of the most bizarre-looking fossils ever found — a worm-like creature with legs, spikes and a head difficult to distinguish from its tail — has found its place in the evolutionary Tree of Life, definitively linking it with a group of modern animals for the first time.

The animal, known as Hallucigenia due to its otherworldly appearance, had been considered an ‘evolutionary misfit’ as it was not clear how it related to modern animal groups. Researchers from the University of Cambridge have discovered an important link with modern velvet worms, also known as onychophorans, a relatively small group of worm-like animals that live in tropical forests. The results are published in the advance online edition of the journal Nature.

The affinity of Hallucigenia and other contemporary ‘legged worms’, collectively known as lobopodians, has been very controversial, as a lack of clear characteristics linking them to each other or to modern animals has made it difficult to determine their evolutionary home.

What is more, early interpretations of Hallucigenia, which was first identified in the 1970s, placed it both backwards and upside-down. The spines along the creature’s back were originally thought to be legs, its legs were thought to be tentacles along its back, and its head was mistaken for its tail.

Hallucigenia lived approximately 505 million years ago during the Cambrian Explosion, a period of rapid evolution when most major animal groups first appear in the fossil record. These particular fossils come from the Burgess Shale in Canada’s Rocky Mountains, one of the richest Cambrian fossil deposits in the world.

Looking like something from science fiction, Hallucigenia had a row of rigid spines along its back, and seven or eight pairs of legs ending in claws. The animals were between five and 35 millimetres in length, and lived on the floor of the Cambrian oceans.

A new study of the creature’s claws revealed an organisation very close to those of modern velvet worms, where layers of cuticle (a hard substance similar to fingernails) are stacked one inside the other, like Russian nesting dolls. The same nesting structure can also be seen in the jaws of velvet worms, which are no more than legs modified for chewing.

“It’s often thought that modern animal groups arose fully formed during the Cambrian Explosion,” said Dr Martin Smith of the University’s Department of Earth Sciences, the paper’s lead author. “But evolution is a gradual process: today’s complex anatomies emerged step by step, one feature at a time. By deciphering ‘in-between’ fossils like Hallucigenia, we can determine how different animal groups built up their modern body plans.”

While Hallucigenia had been suspected to be an ancestor of velvet worms, definitive characteristics linking them together had been hard to come by, and their claws had never been studied in detail. Through analysing both the prehistoric and living creatures, the researchers found that claws were the connection joining them together. Cambrian fossils continue to produce new information on the origins of complex animals, and the use of high-end imaging techniques and data on living organisms further allows researchers to untangle the enigmatic evolution of the earliest creatures.

“An exciting outcome of this study is that it turns our current understanding of the evolutionary tree of arthropods — the group including spiders, insects and crustaceans — upside down,” said Dr Javier Ortega-Hernandez, the paper’s co-author. “Most gene-based studies suggest that arthropods and velvet worms are closely related to each other; however, our results indicate that arthropods are actually closer to water bears, or tardigrades, a group of hardy microscopic animals best known for being able to survive the vacuum of space and sub-zero temperatures — leaving velvet worms as distant cousins.”

“The peculiar claws of Hallucigenia are a smoking gun that solve a long and heated debate in evolutionary biology, and may even help to decipher other problematic Cambrian critters,” said Dr Smith.


Story Source:

The above story is based on materials provided by University of Cambridge. The original story is licensed under a Creative Commons Licence. Note: Materials may be edited for content and length.


Journal Reference:

  1. Martin R. Smith, Javier Ortega-Hernández. Hallucigenia’s onychophoran-like claws and the case for Tactopoda. Nature, 2014; DOI: 10.1038/nature13576

 

 

University of Cambridge. “Evolutionary misfit: Misunderstood worm-like fossil finds its place in the Tree of Life.” ScienceDaily. ScienceDaily, 17 August 2014. <www.sciencedaily.com/releases/2014/08/140817220058.htm>.

Filed Under News | Leave a Comment 

Next Page →