Date:
January 29, 2015

 

Source:
University of Toronto

 

Summary:
Atmospheric physicists predict that global warming will not lead to an increasingly stormy atmosphere overall, a question debated by scientists for decades. Instead, strong storms will become stronger while weak storms become weaker, leaving the total number of storms essentially unchanged.

 

 

Strong storms are likely to get stronger with global warming, atmospheric physicists predict.
Credit: © danmir12 / Fotolia

 

 

A study led by atmospheric physicists at the University of Toronto finds that global warming will not lead to an increasingly stormy atmosphere overall, a question debated by scientists for decades. Instead, strong storms will become stronger while weak storms become weaker, leaving the total number of storms essentially unchanged.

“We know that with global warming we’ll get more evaporation of the oceans,” said Frederic Laliberte, a research associate at U of T’s physics department and lead author of a study published this week in Science. “But circulation in the atmosphere is like a heat engine that requires fuel to do work, just like any combustion engine or a convection engine.”

The atmosphere’s work as a heat engine occurs when an air mass near the surface takes up water through evaporation as it is warmed by the Sun and moves closer to the Equator. The warmer the air mass is, the more water it takes up. As it reaches the Equator, it begins to ascend through the atmosphere, eventually cooling as it radiates heat out into space. Cool air can hold less moisture than warm air, so as the air cools, condensation occurs, which releases heat. When enough heat is released, air begins to rise even further, pulling more air behind it and producing a thunderstorm. The ultimate “output” of this atmospheric engine is the amount of heat and moisture that is redistributed between the Equator and the North and South Poles.
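In the language of classical thermodynamics (a schematic framing consistent with the article, not the paper’s exact equations), the circulation’s work output is capped by its heat input and a Carnot-like efficiency:

    W = \eta \, Q_{\mathrm{in}}, \qquad \eta \le 1 - \frac{T_{\mathrm{out}}}{T_{\mathrm{in}}}

Here Q_in is the heat (largely the latent heat of evaporated water) taken up at the warm surface temperature T_in, and T_out is the colder temperature aloft at which heat is radiated to space. Evaporating water into unsaturated air is an irreversible process that pushes the actual efficiency below the Carnot bound, which is the inefficiency the study quantifies.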

“By viewing the atmospheric circulation as a heat engine, we were able to rely on the laws of thermodynamics to analyze how the circulation would change in a simulation of global warming,” said Laliberte. “We used these laws to quantify how the increase in water vapour that would result from global warming would influence the strength of the atmospheric circulation.”

The researchers borrowed techniques from oceanography and looked at observations and climate simulations. Their approach allowed them to test global warming scenarios and measure the output of atmospheric circulation under warming conditions.

“We came up with an improved technique to comprehensively describe how air masses change as they move from the Equator to the poles and back, which let us put a number on the energy efficiency of the atmospheric heat engine and measure its output,” said Laliberte.

The scientists concluded that the increase in water vapour was making the process less efficient by evaporating water into air that is not already saturated with water vapour. They showed that this inefficiency limited the strengthening of atmospheric circulation, though not in a uniform manner. Air masses that are able to reach the top of the atmosphere are strengthened, while those that cannot are weakened.

“Put more simply, powerful storms are strengthened at the expense of weaker storms,” said Laliberte. “We believe atmospheric circulation will adapt to this less efficient form of heat transfer and we will see either fewer storms overall or at least a weakening of the most common, weaker storms.”


Story Source:

The above story is based on materials provided by University of Toronto. Note: Materials may be edited for content and length.


Journal Reference:

  1. F. Laliberte, J. Zika, L. Mudryk, P. J. Kushner, J. Kjellsson, K. Doos. Constrained work output of the moist atmospheric heat engine in a warming climate. Science, 2015; 347 (6221): 540 DOI: 10.1126/science.1257103

 

University of Toronto. “Global warming won’t mean more storms: Big storms to get bigger, small storms to shrink, experts predict.” ScienceDaily. ScienceDaily, 29 January 2015. <www.sciencedaily.com/releases/2015/01/150129143040.htm>.

Date:
January 28, 2015

 

Source:
University of California – Davis

 

Summary:
From the subarctic Pacific to the Chilean margins, extreme oxygen loss is stretching from the upper ocean to about 3,000 meters deep. In some oceanic regions, such loss occurred within 100 years or less, according to a new study.

 

 

This map of the California Current shows the extent of the low-oxygen seafloor. Yellow indicates intermediate hypoxia, while red zones are areas of severe oxygen loss.
Credit: UC Davis

 

 

Seafloor sediment cores reveal abrupt, extensive loss of oxygen in the ocean when ice sheets melted roughly 10,000-17,000 years ago, according to a study from the University of California, Davis. The findings provide insight into similar changes observed in the ocean today.

In the study, published in the journal PLOS ONE, researchers analyzed marine sediment cores from different world regions to document the extent to which low oxygen zones in the ocean have expanded in the past, due to climate change.

From the subarctic Pacific to the Chilean margins, they found evidence of extreme oxygen loss stretching from the upper ocean to about 3,000 meters deep. In some oceanic regions, such loss took place over a time period of 100 years or less.

“This is a global story that knits these regions together and shows that when you warm the planet rapidly, whole ocean basins can lose oxygen very abruptly and very extensively,” said lead author Sarah Moffitt, a postdoctoral scholar with the UC Davis Bodega Marine Laboratory and formerly a Ph.D. student with the Graduate Group in Ecology.

Marine organisms, from salmon and sardines to crab and oysters, depend on oxygen to exist. Adapting to an ocean environment with rapidly dropping oxygen levels would require a major reorganization of living things and their habitats, much as polar species on land today are retreating to higher, cooler latitudes.

The researchers chose the deglaciation period because it was a time of rising global temperatures, atmospheric carbon dioxide and sea levels — many of the global climate change signs the Earth is experiencing now.

“Our modern ocean is moving into a state that has no precedent in human history,” Moffitt said. “The potential for our oceans to look very, very different in 100-150 years is real. How do you use the best available science to care for these critical resources in the future? Resource managers and conservationists can use science like this to guide a thoughtful, precautionary approach to environmental management.”

The study’s co-authors include: Russell Moffitt with the Marine Conservation Institute; Tessa Hill, professor in the UC Davis Department of Earth and Planetary Sciences and at the Bodega Marine Laboratory; Wilson Sauthoff and Catherine Davis of the Department of Earth and Planetary Sciences; and Kathryn Hewett, UC Davis Department of Civil and Environmental Engineering.

The study arose from a graduate level course that was taught at UC Davis in winter 2013 by Hill. The research was supported by the National Science Foundation.


Story Source:

The above story is based on materials provided by University of California – Davis. Note: Materials may be edited for content and length.


Journal Reference:

  1. Sarah E. Moffitt, Russell A. Moffitt, Wilson Sauthoff, Catherine V. Davis, Kathryn Hewett, Tessa M. Hill. Paleoceanographic Insights on Recent Oxygen Minimum Zone Expansion: Lessons for Modern Oceanography. PLOS ONE, 2015; 10 (1): e0115246 DOI: 10.1371/journal.pone.0115246

 

University of California – Davis. “Smothered oceans: Extreme oxygen loss in oceans accompanied past global climate change.” ScienceDaily. ScienceDaily, 28 January 2015. <www.sciencedaily.com/releases/2015/01/150128152155.htm>.

Date:
January 27, 2015

 

Source:
University of California, San Francisco (UCSF)

 

Summary:
People who carry a variant of a gene that is associated with longevity also have larger volumes in a front part of the brain involved in planning and decision-making, according to researchers.

 

 

The dorsolateral prefrontal cortex, depicted in blue and red, is larger and linked with better function in those who carry one copy of the KLOTHO gene variant.
Credit: Illustration by Michael Griffin Kelly

 

 

People who carry a variant of a gene that is associated with longevity also have larger volumes in a front part of the brain involved in planning and decision-making, according to researchers at UC San Francisco.

The finding bolsters their previous discovery that middle-aged and older people who carry a single copy of the KLOTHO allele, called KL-VS, performed better on a wide range of cognitive tests. When they modeled KL-VS in mice, they found this strengthened the connections between neurons and enhanced learning and memory.

KLOTHO codes for a protein, called klotho, which is produced in the kidney and brain and regulates many different processes in the body. About one in five people carry a single copy of KL-VS, which increases klotho levels and is associated with a longer lifespan and better heart and kidney function. A small minority, about 3 percent, carries two copies, which is associated with a shorter lifespan.

In the current study, published in Annals of Clinical and Translational Neurology, researchers scanned the brains of 422 cognitively normal men and women aged 53 and older to see if the size of any brain area correlated with carrying one, two or no copies of the allele.

They found that the KLOTHO gene variant predicted the size of a region called the right dorsolateral prefrontal cortex (rDLPFC), which is especially vulnerable to atrophy as people age. Deterioration in this area may be one reason why older people have difficulty suppressing distracting information and doing more than one thing at a time.

Researchers found that the rDLPFC shrank with age in all three groups, but those with one copy of KL-VS, about a quarter of the study group, had larger volumes than either non-carriers or those with two copies. Researchers also found that the size of the rDLPFC predicted how well the three groups performed on cognitive tests, such as working memory — the ability to keep a small amount of newly acquired information in mind — and processing speed. Both tests are considered to be good measures of the planning and decision-making functions that the rDLPFC controls.

“We’ve known for a long time that people lose cognitive abilities as they age, but now we’re beginning to understand that factors like klotho can give people a boost and confer resilience in aging,” said senior author Dena Dubal, MD, PhD, assistant professor of neurology at UCSF and the David A. Coulter Endowed Chair in Aging and Neurodegenerative Disease. “Genetic variation in KLOTHO could help us predict brain health and find ways to protect people from the devastating diseases that happen to us as we grow old, like Alzheimer’s and other dementias.”

In statistical tests, the researchers concluded that the larger rDLPFC volumes seen in single copy KL-VS carriers accounted for just 12 percent of the overall effect that the variant had on the abilities tested. However, the allele may have other effects on the brain, such as increasing levels or changing the actions of the klotho protein to enhance synaptic plasticity, or the connections between neurons. In a previous experiment, they found that raising klotho in mice increased the action of a cell receptor critical to forming memories.
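The 12 percent figure is the kind of number a simple mediation analysis produces. A minimal sketch with synthetic data (the variable names, effect sizes, and the use of ordinary least squares here are illustrative assumptions, not the study’s actual model):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 422  # same sample size as the study; the data below are synthetic

    carrier = rng.binomial(1, 0.25, n)            # 1 = single-copy KL-VS carrier (toy coding)
    volume = 0.5 * carrier + rng.normal(size=n)   # rDLPFC volume, larger in carriers
    cognition = 0.1 * carrier + 0.2 * volume + rng.normal(size=n)

    # Total effect of genotype on cognition (c), ignoring brain volume
    c = sm.OLS(cognition, sm.add_constant(carrier)).fit().params[1]

    # Direct effect (c') after adjusting for rDLPFC volume
    X = sm.add_constant(np.column_stack([carrier, volume]))
    c_prime = sm.OLS(cognition, X).fit().params[1]

    # Share of the genotype effect carried through volume
    print(f"proportion mediated ~ {(c - c_prime) / c:.2f}")

In this toy setup the “proportion mediated” plays the role of the reported 12 percent: most of the allele’s cognitive effect is not explained by the volume difference alone.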

“The brain region enhanced by genetic variation in KLOTHO is vulnerable in aging and several psychiatric and neurologic diseases including schizophrenia, depression, substance abuse, and frontotemporal dementia,” said Jennifer Yokoyama, PhD, first author and assistant professor of neurology at UCSF. “In this case, bigger size means better function. It will be important to determine whether the structural boost associated with carrying one copy of KL-VS can offset the cognitive deficits caused by disease.”


Story Source:

The above story is based on materials provided by University of California, San Francisco (UCSF). The original article was written by Laura Kurtzman. Note: Materials may be edited for content and length.


Journal Reference:

  1. Jennifer S. Yokoyama, Virginia E. Sturm, Luke W. Bonham, Eric Klein, Konstantinos Arfanakis, Lei Yu, Giovanni Coppola, Joel H. Kramer, David A. Bennett, Bruce L. Miller, Dena B. Dubal. Variation in longevity gene KLOTHO is associated with greater cortical volumes. Annals of Clinical and Translational Neurology, 2015; DOI: 10.1002/acn3.161

 

University of California, San Francisco (UCSF). “Brain region vulnerable to aging is larger in those with longevity gene variant.” ScienceDaily. ScienceDaily, 27 January 2015. <www.sciencedaily.com/releases/2015/01/150127121158.htm>.

Date:
January 26, 2015

 

Source:
The Optical Society

 

Summary:
A team of scientists has developed, for the first time, a microscopic component, small enough to fit onto a standard silicon chip, that can generate a continuous supply of entangled photons.

 

 

Drawing of the silicon ring resonator with its access waveguide. The green wave at the input represents the laser pump, the red and blue wavepackets at the output represent the generated photon pairs, and the infinity symbol linking the two outputs indicates the entanglement between the pair of photons.
Credit: Università degli Studi di Pavia

 

 

Unlike Bilbo’s magic ring, which entangles human hearts, engineers have created a new micro-ring that entangles individual particles of light, an important first step in a whole host of new technologies.

Entanglement — the instantaneous connection between two particles no matter their distance apart — is one of the most intriguing and promising phenomena in all of physics. Properly harnessed, entangled photons could revolutionize computing, communications, and cyber security. Though readily created in the lab and by comparatively large-scale optoelectronic components, a practical source of entangled photons that can fit onto an ordinary computer chip has been elusive.

New research, reported today in The Optical Society’s (OSA) new high-impact journal Optica, describes how a team of scientists has developed, for the first time, a microscopic component, small enough to fit onto a standard silicon chip, that can generate a continuous supply of entangled photons.

The new design is based on an established silicon technology known as a micro-ring resonator. These resonators are loops etched onto silicon wafers that can corral and then reemit particles of light. By tailoring the design of this resonator, the researchers created a novel source of entangled photons that is incredibly small and highly efficient, making it an ideal on-chip component.

“The main advantage of our new source is that it is at the same time small, bright, and silicon based,” said Daniele Bajoni, a researcher at the Università degli Studi di Pavia in Italy and co-author on the paper. “The diameter of the ring resonator is a mere 20 microns, which is about one-tenth of the width of a human hair. Previous sources were hundreds of times larger than the one we developed.”

From Entanglement to Innovation

Scientists and engineers have long recognized the enormous practical potential of entangled photons. This curious manifestation of quantum physics, which Einstein referred to as “spooky action at a distance,” has two important implications in real-world technology.

First, if something acts on one of the entangled photons, then the other will respond to that action instantly, even if it is on the opposite side of a computer chip or the opposite side of the galaxy. This behavior could be harnessed to increase the power and speed of computations. The second implication is that the two photons can be considered, in some sense, a single entity, which would allow for new communication protocols that are immune to spying.

This seemingly impossible behavior is essential, therefore, for the development of certain next-generation technologies, such as computers that are vastly more powerful than even today’s most advanced supercomputers, and secure telecommunications.

Creating Entanglement on a Chip

To bring these new technologies to fruition, however, requires a new class of entangled photon emitters: ones that can be readily incorporated into existing silicon chip technologies. Achieving this goal has been very challenging.

To date, entangled photon emitters — which are principally made from specially designed crystals — could be scaled down to only a few millimeters in size, which is still many orders of magnitude too large for on-chip applications. In addition, these emitters require a great deal of power, which is a valuable commodity in telecommunications and computing.

To overcome these challenges, the researchers explored the potential of ring resonators as a new source for entangled photons. These well-established optoelectronic components can be easily etched onto a silicon wafer in the same manner that other components on semiconductor chips are fashioned. To “pump,” or power, the resonator, a laser beam is directed along an optical fiber to the input side of the sample, and then coupled to the resonator where the photons race around the ring. This creates an ideal environment for the photons to mingle and become entangled.
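Physically, pair generation in silicon rings of this kind proceeds by spontaneous four-wave mixing: two pump photons are annihilated and a signal/idler pair is created, with energy conserved. Schematically (a standard textbook relation, stated here for orientation rather than taken from the paper):

    2\hbar\omega_{p} = \hbar\omega_{s} + \hbar\omega_{i}

Because the pump, signal, and idler must all sit on resonances of the ring, the light field builds up strongly inside the loop, which is why such a small, weakly pumped device can still be a bright pair source.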

As photons exited the resonator, the researchers were able to observe that a remarkably high percentage of them exhibited the telltale characteristics of entanglement.

“Our device is capable of emitting light with striking quantum mechanical properties never observed in an integrated source,” said Bajoni. “The rate at which the entangled photons are generated is unprecedented for a silicon integrated source, and comparable with that available from bulk crystals that must be pumped by very strong lasers.”

Applications and Future Technology

The researchers believe their work is particularly relevant because it demonstrates, for the first time, a quintessential quantum effect, entanglement, in a well-established technology.

“In the last few years, silicon integrated devices have been developed to filter and route light, mainly for telecommunication applications,” observed Bajoni. “Our micro-ring resonators can be readily used alongside these devices, moving us toward the ability to fully harness entanglement on a chip.” As a result, this research could facilitate the adoption of quantum information technologies, particularly quantum cryptography protocols, which would ensure secure communications in ways that classical cryptography protocols cannot.

According to Bajoni and his colleagues, these protocols have already been demonstrated and tested. What has been missing was a cheap, small, and reliable source of entangled photons capable of propagation in fiber networks, a problem that is apparently solved by their innovation.


Story Source:

The above story is based on materials provided by The Optical Society. Note: Materials may be edited for content and length.


Journal Reference:

  1. Davide Grassani, Stefano Azzini, Marco Liscidini, Matteo Galli, Michael J. Strain, Marc Sorel, J. E. Sipe, Daniele Bajoni. Micrometer-scale integrated silicon source of time-energy entangled photons. Optica, 2015; 2 (2): 88 DOI: 10.1364/OPTICA.2.000088

 

The Optical Society. “Entanglement on a chip: Breakthrough promises secure communications and faster computers.” ScienceDaily. ScienceDaily, 26 January 2015. <www.sciencedaily.com/releases/2015/01/150126095707.htm>.

Growth at Target Health – Expanding to New Jersey

 

Target Health Inc., established in 1993, is expanding to New Jersey. Our software programming department, headed by Senior Director Joonhyuk Choi, will be located near the Meadowlands, not far from New York City. Our executive offices and clinical, data management, statistics and business development departments will remain on the 23rd and 24th floors at 261 Madison Avenue.

 

Abstract Expressionist and Master Nature Photographer

 

Our friend and colleague, James Farley, Clinical Data Manager at TransTech Pharma LLC and a subscriber to the ON TARGET newsletter, is sharing another great photo, this time a gorgeous view of the pier – just after sunset – at Holden Beach, NC.

 


Sunset at Holden Beach, NC, 2015 ©JFarleyPhotography.com

 

ON TARGET is the newsletter of Target Health Inc., a NYC-based, full-service, contract research organization (eCRO), providing strategic planning, regulatory affairs, clinical research, data management, biostatistics, medical writing and software services to the pharmaceutical and device industries, including the paperless clinical trial.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

QUIZ


Scientists Reprogram Fat Cells to Increase Fat Burning


Scientists have now discovered how fat cells can be reprogrammed to burn fat instead of storing it. By analyzing molecular processes, scientists have described for the first time how white fat cells are converted into brown fat cells.

 

Cells responsible for storing 1) ___ can be converted into cells that burn fat and keep you thinner and healthier. Scientists have now discovered how this conversion happens. “We’ve mapped the fat cells’ genomes and identified a protein which activates specific 2) ___ that reprogram the fat cells to burn fat instead of storing it,” says Professor S. Mandrup from the department of biochemistry and molecular biology at the University of Southern Denmark (SDU). The study’s findings, recently published in the January 2015 issue of Genes & Development, are a crucial step on the way to developing 3) ___ that can make the body transform fat into energy instead of storing it.

 

Our body contains three different types of fat cells: white cells, which store fat derived from food so that you put on weight if you eat too much; 4) ___ fat cells, which convert fat into energy instead of storing it; and brite fat cells, which are beige/brownish in color (brite is short for brown-in-white) and, like brown cells, convert fat into heat. Scientists have long been interested in brite fat cells because it has been shown that white fat cells can be browned and turned into brite cells so that they burn fat instead of storing it. Studies show that overweight people have a lot of white cells in their fatty 5) ___. We have known for a long time that exposure to cold, for example, can cause the body to produce more heat-producing brite fat cells, although we have not known much about which molecular mechanisms are at play.

 

Anti-Diabetic Medication Browns Cells

To better understand how fat-storing white cells can become fat-burning brite cells, scientists at SDU grew white cells in the laboratory and added a medication previously used to increase the insulin sensitivity of 6) ___ patients. The medication, known as rosiglitazone (Avandia), makes the fat cells turn brown, thus making them brite — but exactly what effect the drug has on the cells at the molecular level has been unclear until now. To study this, the scientists used advanced gene sequencing technology to map and analyze the genomes of both the white and brite cells in order to find out which genes are active when the 7) ___ cells turn brite.
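In spirit, that comparison is a differential-expression screen: count how active each gene is in white versus brite cells and rank the changes. A toy sketch (synthetic read counts and an arbitrary choice of gene symbols, not the study’s actual pipeline):

    import numpy as np

    rng = np.random.default_rng(1)
    genes = ["UCP1", "KLF11", "LEP", "ACTB"]    # illustrative gene symbols only
    white = rng.poisson([5, 20, 200, 1000])     # synthetic counts, white fat cells
    brite = rng.poisson([400, 60, 150, 1000])   # synthetic counts, brite fat cells

    # log2 fold change, brite vs. white (+1 avoids log of zero)
    lfc = np.log2((brite + 1) / (white + 1))
    for gene, fc in sorted(zip(genes, lfc), key=lambda p: -p[1]):
        print(f"{gene:6s} log2 fold change = {fc:+.1f}")

Genes with large positive fold changes are the candidates for what turns on when white cells turn brite.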

 

A Specific Protein Activates Genes

Their analyses showed not only where in the fat cells’ genome there is activity when white cells are converted to brite cells, but also that a specific 8) ___, known as KLF11, has to be present for the cells to be reprogrammed to burn fat instead of storing it. It is clear that KLF11 finds the genes in the cells which are active when white fat cells are being converted into brite cells. KLF11 was already known as a protein important to the functionality of insulin-producing beta-cells in the pancreas, but this study demonstrates for the first time that the protein is also necessary for white fat cells to be reprogrammed into brite fat cells.

 

White fat cells have been known about for many years, but it was not until 2009 that scientists discovered that adult human beings also have so-called brown fat tissue. The tissue is referred to as brown because of its reddish-brown color when viewed through a microscope; the color occurs because the cells contain many mitochondria and 9) ___ vessels. Over the past decade it was also discovered that humans have brite fat cells, which burn fat just like the brown ones. In other words, the more brite and brown fat cells you have, the more 10) ___ you burn. Unlike the entirely brown fat cells, brite cells are produced from already-existing white fat tissue, the kind we have most of in our body.

Scientists still have a lot to learn about brite fat cells. They do not know, for instance, whether we can cause more brite fat cells to be produced, e.g. by eating specific foods. It is not known, either, whether brite cells can only be produced by converting white cells, or whether the body can produce its own new brite fat cells. In the long term, it should be possible to use the discovery to develop new medication that acts on precisely those areas in the fat cells’ genome, making it possible to prevent overweight people from developing insulin intolerance, diabetes or other diseases related to 11) ___. We now know more about which buttons to push to increase the specific processes in the cells which activate the browning process. KLF11 is just one of the factors that determine which genes are active; researchers are certain there are many others.

 

Jacob B. Hansen, an associate professor at the Department of Biology of the University of Copenhagen, who studies brown fat cells, says that the new results represent an important step towards understanding how fat cell conversion takes place in humans. Much of what we know about the molecular basis of browning comes from experiments on mice; this study, however, is based entirely on 12) ___ cell material. Experiments on mice have shown that KLF11 can regulate parts of the biology of brown fat cells, but in this study the scientists used advanced methods which enabled them to characterize the browning process in great detail. This was much more elegant than the work previously done on mice, because they observed all the genes in one go and found the areas in the genome of the cells which KLF11 binds to.

Pills against obesity are closer to reality. The result certainly is a step in the direction of a cure which, for example, could reduce the risk of people developing type 2 diabetes and counter other health issues related to being overweight. The potential is interesting, because the better we understand the biology of fat cells, the closer we get to being able to design medication which can cause the body to produce more beige and brown fat cells, which experiments on mice have shown to improve glucose tolerance. “But this doesn’t mean that in five years we will have a magic pill against obesity,” Hansen says. “The work done at SDU is basic science, which improves our knowledge of the process known as browning, but a lot more research will have to be done before we fully understand the process and its significance.”

The genome of white adipocytes was investigated using advanced genome sequencing technologies to see how it is reprogrammed during browning. Browning was stimulated in human white adipocytes by a drug used to treat type II diabetes, and the resulting cells were compared with white and brite fat cells. This comparison showed that 13) ___ fat cells have distinct gene programs which, when active, make these cells particularly energy-consuming. By identifying the areas of the genome that are directly involved in the reprogramming, an important factor has been identified in the process: the gene regulatory protein KLF11 (Kruppel Like Factor 11), which is found in all fat cells, is required for the reprogramming to take place. This research has been a long process, taking four years to reach the results now being published. The discovery of the brite fat cell mechanisms and the specific regulatory areas brings scientists closer to understanding how reprogramming of white fat cells takes place. This knowledge potentially means that, in the future, drugs can be 14) ___ to activate the genomic regions and browning factors like KLF11 in the treatment of obesity.

 

ANSWERS: 1) fat; 2) genes; 3) drugs; 4) brown; 5) tissue; 6) diabetes; 7) white; 8) protein; 9) blood; 10) energy; 11) obesity; 12) human; 13) brite; 14) targeted

 

The Ob(Lep) Gene and Weight Loss


Structure of the obese protein leptin-E100

 

The Ob(Lep) gene (Ob for obese, Lep for leptin) is located on chromosome 7 in humans. Human leptin is a 16 kDa protein of 167 amino acids. Leptin should not be confused with lectin or lecithin.

 

Leptin (from the Greek for “thin”), the “satiety hormone,” is a hormone made by fat cells which regulates the amount of fat stored in the body. It does this by adjusting both the sensation of hunger and energy expenditure. Hunger is inhibited (satiety) when the amount of fat stored reaches a certain level: leptin is then secreted and circulates through the body, eventually activating leptin receptors in the arcuate nucleus of the hypothalamus. Energy expenditure is increased both by the signal to the brain and directly via leptin receptors on peripheral targets. The effect of leptin is opposite to that of ghrelin, the “hunger hormone.” Ghrelin receptors are on the same brain cells as leptin receptors, so these cells receive competing satiety and hunger signals. Leptin and ghrelin, along with many other hormones, participate in the complex process of energy homeostasis. Although regulation of fat stores is deemed to be the primary function of leptin, it also plays a role in other physiological processes, as evidenced by its multiple sites of synthesis other than fat cells, and the multiple cell types besides hypothalamic cells which have leptin receptors. Many of these additional functions are yet to be defined.

Leptin was approved in the United States in 2014 for use in congenital leptin deficiency and generalized lipodystrophy.

 

An analog of human leptin, metreleptin (trade name Myalept), was first approved in Japan in 2013, and in the United States (US) in February 2014. In the US it is indicated as a treatment for complications of leptin deficiency, and for the diabetes and hypertriglyceridemia associated with congenital or acquired generalized lipodystrophy. Leptin is known to interact with amylin, a hormone involved in gastric emptying and in creating a feeling of fullness. When both leptin and amylin were given to obese, leptin-resistant rats, sustained weight loss was seen. Due to its apparent ability to reverse leptin resistance, amylin has been suggested as a possible therapy for obesity.

 

Historically, the existence of a hormone regulating hunger and energy expenditure was hypothesized based on studies of mutant obese mice that arose at random within a mouse colony at the Jackson Laboratory in 1950. Mice homozygous for the ob mutation (ob/ob) ate voraciously and were massively obese. In the 1960s, a second mutation causing obesity and a similar phenotype was identified by Douglas Coleman, also at the Jackson Laboratory, and was named diabetes (db), as both ob/ob and db/db were obese. Rudolph Leibel and Jeffrey M. Friedman reported the mapping of the ob gene in 1990. Consistent with Coleman’s and Leibel’s hypothesis, several subsequent studies from Leibel’s and Friedman’s labs and other groups confirmed that the ob gene encoded a novel hormone that circulated in blood and that could suppress food intake and body weight in ob and wild type mice, but not in db mice.

 

In 1994, with the ob gene isolated, Friedman reported its discovery. In 1995, Caro’s laboratory provided evidence that the mutations present in the mouse ob gene did not occur in humans. Furthermore, ob gene expression was increased in human obesity, which led to the postulation of the concept of leptin resistance. At the suggestion of Roger Guillemin, Friedman named this new hormone “leptin,” from the Greek lepto, meaning thin. Leptin was the first fat cell-derived hormone to be discovered. Subsequent studies confirmed that the db gene encodes the leptin receptor and that it is expressed in the hypothalamus, a region of the brain known to regulate the sensation of hunger and body weight.

 

Coleman and Friedman have been awarded numerous prizes acknowledging their roles in the discovery of leptin, including the Gairdner Foundation International Award (2005), the Shaw Prize (2009), the Lasker Award, the BBVA Prize and the King Faisal International Prize. The discovery of leptin has been documented in a series of books, including Fat: Fighting the Obesity Epidemic by Robert Pool, The Hungry Gene by Ellen Ruppel Shell, and Rethinking Thin: The New Science of Weight Loss and the Myths and Realities of Dieting by Gina Kolata. The first and last of these review the work in the Friedman laboratory that led to the cloning of the ob gene.

 

A mutant leptin was first described in 1997, and six additional mutations were subsequently described. All of those affected were from Eastern countries, and all had variants of leptin not detected by the standard immunoreactive technique. The most recently described eighth mutation, reported in January 2015, is unique in that it is detected by the standard immunoreactive technique, indicating that leptin levels are elevated but the leptin is nonfunctional. All eight mutations cause extreme obesity in infancy, with hyperphagia.

Leptin is produced primarily in the adipocytes of white adipose tissue. It is also produced by brown adipose tissue, placenta (syncytiotrophoblasts), ovaries, skeletal muscle, stomach (the lower part of the fundic glands), mammary epithelial cells, bone marrow, pituitary, liver, gastric chief cells and P/D1 cells. Leptin circulates in blood both in free form and bound to proteins, and leptin levels vary exponentially, not linearly, with fat mass. Leptin levels in blood are higher between midnight and early morning, perhaps suppressing appetite during the night; this diurnal rhythm of blood leptin levels can be modified by meal timing. In humans, many instances are seen where leptin dissociates from the strict role of communicating nutritional status between body and brain and no longer correlates with body fat levels:

 

1. Leptin level is decreased after short-term fasting (24-72 hours), even when changes in fat mass are not observed.

2. Leptin plays a critical role in the adaptive response to starvation.

3. In obese patients with obstructive sleep apnea, leptin level is increased, but decreased after the administration of continuous positive airway pressure. In non-obese individuals, however, restful sleep (i.e., 8-12 hours of unbroken sleep) can increase leptin to normal levels.

4. Serum level of leptin is reduced by sleep deprivation.

5. Leptin level is increased by perceived emotional stress.

6. Leptin level is decreased by increases in testosterone levels and increased by increases in estrogen levels.

7. Leptin level is chronically reduced by physical exercise training.

8. Leptin level is increased by dexamethasone.

9. Leptin level is increased by insulin.

10. Leptin levels are paradoxically increased in obesity.

 

Brain Recalls Old Memories Via New Pathways

 

Shift in fear retrieval circuitry eyed in anxiety disorders – NIH-funded studies

 

People with anxiety disorders, such as post-traumatic stress disorder (PTSD), often experience prolonged and exaggerated fearfulness. Now, an animal study suggests that this might involve disruption of a gradual shifting of brain circuitry for retrieving fear memories. Researchers funded by the National Institutes of Health have discovered in rats that an old fear memory is recalled by a separate brain pathway from the one originally used to recall it when it was fresh.

 

Results of the study, published online in the journal Nature (19 January 2015), showed that after rats were conditioned to fear a tone associated with a mild shock, their overt behavior remained unchanged over time, but the pathway engaged in remembering the traumatic event took a detour, perhaps increasing its staying power. Immediately after fear conditioning, a circuit running from the prefrontal cortex, the executive hub, to part of the amygdala, the fear hub, was engaged to retrieve the memory. But several days later, the authors discovered that retrieval had migrated to a different circuit – from the prefrontal cortex to an area in the thalamus called the paraventricular region (PVT). The PVT, in turn, communicates with a different central part of the amygdala that orchestrates fear learning and expression. The authors used a genetic/laser technique called optogenetics to spot the moving memory. This technique can activate or silence specific pathways to tease them apart. The authors hypothesized that the PVT may serve to integrate fear with other adaptive responses, such as stress, thereby strengthening the fear memory, and that in people with anxiety disorders, any disruption of this timing-dependent regulation in retrieval circuits might worsen fear responses occurring long after a traumatic event.

 

In the same issue of Nature, Bo Li, Ph.D., and Mario Penzo, Ph.D., of Cold Spring Harbor Laboratory in New York, and colleagues reveal how the long-term fear memory circuit works in mice to translate detection of stress into adaptive behaviors. These authors independently discovered the same shift in memory retrieval circuitry occurring, over time, after fear conditioning in mice. Using powerful genetic-chemical, as well as optogenetic, methods to experimentally switch pathways on and off, they showed conclusively that neurons originating in the PVT regulate fear processing by acting on a class of neurons that store fear memories in the central amygdala area. The study traced this activity in the PVT to the action of a messenger chemical, brain-derived neurotrophic factor (BDNF), which has previously been implicated in mood and anxiety disorders. For example, altered BDNF expression has been linked to PTSD. BDNF from the PVT, working via a specific receptor, activated the memory-storing amygdala neurons. Simply infusing BDNF into the central amygdala area caused mice to freeze in fear, suggesting that it enables not only the formation of fear memories, but also the expression of fear responses.

 

Registering Eye Movements During Reading in Alzheimer’s Disease

 

Reading requires the fine integration of attention, ocular movements, word identification, and language comprehension, among other cognitive parameters. Several of the associated cognitive processes such as working memory and semantic memory are known to be impaired by Alzheimer’s disease (AD). As a result, a study published online in the Journal of Clinical and Experimental Neuropsychology (28 February 2014) analyzed eye movement behavior of 18 patients with probable AD and 40 age-matched controls during Spanish sentence reading.

 

Results showed that controls focused mainly on word properties and considered syntactic and semantic structures. In addition, the controls’ knowledge and prediction about sentence meaning and grammatical structure were quite evident when considering some aspects of visual exploration, such as word skipping, and forward saccades. Saccades are quick, simultaneous movements of both eyes between two phases of fixation in the same direction. By contrast, in the AD group, the predictability effect of the upcoming word was absent, visual exploration was less focused, fixations were much longer, and outgoing saccade amplitudes were smaller than those in controls.
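A minimal sketch of how the measures named above (fixation durations, saccade amplitudes) can be extracted from a raw gaze trace using a standard velocity-threshold method; the 30 deg/s threshold and all names here are illustrative assumptions, not the study’s protocol:

    import numpy as np

    def fixation_saccade_metrics(x_deg, t_s, vel_thresh=30.0):
        """Split a 1-D horizontal gaze trace (degrees vs. seconds) into
        fixations and saccades by thresholding eye velocity (I-VT), then
        return mean fixation duration (s) and mean saccade amplitude (deg)."""
        v = np.abs(np.gradient(x_deg, t_s))               # eye speed, deg/s
        moving = v > vel_thresh                           # True during saccades
        # indices where the trace switches between fixating and moving
        edges = np.flatnonzero(np.diff(moving.astype(int))) + 1
        runs = np.split(np.arange(len(x_deg)), edges)     # contiguous segments
        fix_durs, sacc_amps = [], []
        for run in runs:
            if moving[run[0]]:
                sacc_amps.append(abs(x_deg[run[-1]] - x_deg[run[0]]))
            else:
                fix_durs.append(t_s[run[-1]] - t_s[run[0]])
        return np.mean(fix_durs), np.mean(sacc_amps)

On metrics like these, the pattern reported above would show up as a larger mean fixation duration and a smaller mean saccade amplitude for the AD group than for controls.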

 

According to the authors, the altered visual exploration and the absence of a contextual predictability effect might be related to impairments in working memory and long-term memory retrieval functions and that these eye movement measures demonstrate considerable sensitivity with respect to evaluating cognitive processes in AD. As a result, these measures could provide a user-friendly marker of early disease symptoms and of its posterior progression.

 

FDA Clears First System of Mobile Medical Apps for Continuous Glucose Monitoring

 

Diabetes is a serious, chronic metabolic condition where the body is unable to convert glucose into the energy needed to carry out daily activities. An estimated 25.8 million people in the U.S. – about 215,000 of them under age 20 – have diabetes. If left untreated, high blood glucose levels (hyperglycemia) can lead to serious long-term problems such as stroke, heart disease, and damage to the eyes, kidneys and nerves.

 

A continuous glucose monitor (CGM) is a device that includes a small, wire-like sensor inserted just under the skin that provides a steady stream of information about glucose levels in the fluid around the cells (interstitial fluid). CGMs are worn externally and continuously display an estimate of blood glucose levels, and the direction and rate of change of these estimates. When used along with a blood glucose meter, CGM information can help people with diabetes detect when blood glucose values are approaching dangerously high and dangerously low levels.
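As a toy illustration of the “direction and rate of change” such a display reports (the thresholds and function names are invented for illustration; real devices use their own proprietary algorithms):

    def glucose_trend(readings_mg_dl, minutes_apart=5):
        """Estimate the rate of change (mg/dL per minute) from the last
        three CGM readings and map it to a display arrow. Toy logic only."""
        recent = readings_mg_dl[-3:]                      # last ~15 minutes
        rate = (recent[-1] - recent[0]) / (minutes_apart * (len(recent) - 1))
        if rate >= 2:
            arrow = "rising fast"
        elif rate >= 1:
            arrow = "rising"
        elif rate > -1:
            arrow = "steady"
        elif rate > -2:
            arrow = "falling"
        else:
            arrow = "falling fast"
        return rate, arrow

    print(glucose_trend([110, 118, 127]))   # -> (1.7, 'rising')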

 

The FDA has cleared for marketing the first set of mobile medical apps that allow people with diabetes to automatically and securely share data from a CGM with other people in real time using an Apple mobile device such as an iPhone. The Dexcom Share Direct Secondary Displays system’s data-sharing capability allows caregivers of a person with diabetes to monitor that individual’s blood sugar levels remotely through a legally marketed device available on mobile devices. Devices like the Dexcom Share were previously available through open source efforts, but were not in compliance with regulatory requirements. The Dexcom Share system is the first of its kind to offer a legally marketed solution for real-time remote monitoring of a patient’s CGM data.

 

The Dexcom Share system displays data from the G4 Platinum CGM System using two apps: one installed on the patient’s mobile device and one installed on the mobile device of another person. Using Dexcom Share’s mobile medical app, the user can designate people (“followers”) with whom to share their CGM data. The app receives real-time CGM data directly from the G4 Platinum System CGM receiver and transmits it to a Web-based storage location. The “follower” app can then download the CGM data and display it in real time.
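The data flow described above amounts to a relay through web storage: the sharer’s app pushes each reading up, and each follower’s app pulls it down. A hypothetical sketch (the endpoint, payload, and function names are invented for illustration and are not Dexcom’s actual interface):

    import json, time, urllib.request

    RELAY = "https://example.com/cgm"   # stand-in for the Web-based storage location

    def share_reading(user, mg_dl):
        """Sharer's app: push the latest CGM value to the relay."""
        body = json.dumps({"user": user, "mg_dl": mg_dl, "ts": time.time()}).encode()
        req = urllib.request.Request(f"{RELAY}/{user}", data=body,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    def latest_reading(user):
        """Follower's app: poll the relay for the most recent shared value."""
        with urllib.request.urlopen(f"{RELAY}/{user}/latest") as resp:
            return json.loads(resp.read())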

 

The FDA reviewed data for the Dexcom Share system through the de novo classification process, a regulatory pathway for low- to moderate-risk medical devices that are novel and not substantially equivalent to any legally marketed device. Data provided by the device maker showed the device functions as intended and transmits data accurately and securely.

 

Because the device is low to moderate risk, a regulatory concept when classifying devices not requiring a Pre-Market Approval (PMA) application, the FDA has classified the device as class II exempt from premarket submissions. In the future, manufacturers wishing to market devices like the Dexcom Share system will not need premarket clearance by the FDA prior to marketing, but they will still need to register and list their device with the agency, as well as follow other applicable laws and regulations.

 

Alberto Gutierrez, PhD, Director of the Office of In Vitro Diagnostics and Radiological Health in the FDA’s Center for Devices and Radiological Health, said that “Exempting devices from premarket review is part of the FDA’s effort to ensure these products provide accurate and reliable results while still encouraging the development of devices that meet the needs of people living with diabetes and their caregivers.”

 

The Dexcom Share system does not replace real-time continuous glucose monitoring or standard home blood glucose monitoring. It is also not intended to be used by the patient in place of a primary display device. Additionally, CGM values alone are not approved to determine dosing of diabetes medications. CGMs must be calibrated by blood glucose meters, and treatment decisions, such as insulin dosing, should be based on readings from a blood glucose meter.

 

The Dexcom Share system is manufactured by Dexcom, Inc., located in San Diego, California.

 
