Date:
December 18, 2014

 

Source:
Harvard School of Public Health

 

Summary:
Women exposed to high levels of fine particulate matter specifically during pregnancy — particularly during the third trimester — may face up to twice the risk of having a child with autism as mothers living in areas with low particulate matter, according to a study. The greater the exposure, the greater the risk, researchers found. It was the first U.S.-wide study exploring the link between airborne particulate matter and autism.

 

 

20141219-1
The greater the exposure a pregnant woman has to fine particulate matter, the greater the risk of her baby having autism, researchers found.
Credit: © Oleg Shelomentsev / Fotolia

 

 

Women exposed to high levels of fine particulate matter specifically during pregnancy — particularly during the third trimester — may face up to twice the risk of having a child with autism as mothers living in areas with low particulate matter, according to a new study from the Harvard School of Public Health (HSPH). The greater the exposure, the greater the risk, researchers found. It was the first U.S.-wide study exploring the link between airborne particulate matter and autism.

“Our data add additional important support to the hypothesis that maternal exposure to air pollution contributes to the risk of autism spectrum disorders,” said Marc Weisskopf, associate professor of environmental and occupational epidemiology and senior author of the study. “The specificity of our findings for the pregnancy period, and third trimester in particular, rules out many other possible explanations for these findings.”

The study appears online December 18, 2014, in Environmental Health Perspectives.

Prior studies have suggested that, in addition to genetics, exposure to airborne environmental contaminants, particularly during pregnancy and early life, may affect risk of autism. This study focused specifically on the pregnancy period.

The study population included offspring of participants living in all 50 states in Nurses’ Health Study II, a cohort of more than 116,000 female U.S. nurses begun in 1989. The researchers collected data on where participants lived during their pregnancies, as well as data from the U.S. Environmental Protection Agency and other sources on levels of fine particulate matter air pollution (PM2.5) — particles 2.5 microns in diameter or smaller — in locations across the U.S. The researchers identified 245 children who were diagnosed with autism spectrum disorder (ASD) and a control group of 1,522 children without ASD during the time period studied.

The researchers explored the association between autism and exposure to PM2.5 before, during, and after pregnancy. They also calculated exposure to PM2.5 during each pregnancy trimester.

Exposure to PM2.5 during pregnancy, but not before or after it, was significantly associated with autism, the study found. Within pregnancy, exposure during the third trimester in particular was significantly associated with an increased risk. Little association was found between autism and air pollution from larger-sized particles (PM10-2.5).

“The evidence base for a role for maternal exposure to air pollution increasing the risk of autism spectrum disorders is becoming quite strong,” said Weisskopf. “This not only gives us important insight as we continue to pursue the origins of autism spectrum disorders, but as a modifiable exposure, opens the door to thinking about possible preventative measures.”


Story Source:

The above story is based on materials provided by Harvard School of Public Health. Note: Materials may be edited for content and length.


Journal Reference:

  1. Marc Weisskopf et al. Autism Spectrum Disorder and Particulate Matter Air Pollution before, during, and after Pregnancy: A Nested Case–Control Analysis within the Nurses’ Health Study II Cohort. Environmental Health Perspectives, December 2014 DOI: 10.1289/ehp.1408133

 

Harvard School of Public Health. “Fine particulate air pollution linked with increased autism risk.” ScienceDaily. ScienceDaily, 18 December 2014. <www.sciencedaily.com/releases/2014/12/141218081334.htm>.


Date:
December 16, 2014

 

Source:
Rice University

 

Summary:
Scientists have detected at least three potential hybridization events that likely shaped the evolutionary paths of ‘old world’ mice, two in recent times and one in the ancient past. The researchers think these instances of introgressive hybridization — a way for genetic material and, potentially, traits to be passed from one species to another through interspecific mating — are only the first of many needles waiting to be found in a very large genetic haystack. While introgressive hybridization is thought to be common among plants, the finding suggests that hybridization in mammals may not be the evolutionary dead end biologists once commonly thought.

 

 

20141218-1
The species used in a Rice University genetic study of mice were collected from 15 locations in Europe and Africa. The green region indicates the range of Mus spretus, the Algerian mouse, while the blue region indicates the range of Mus musculus domesticus, the common European house mouse, which also occupies the green region and beyond. Algerian mouse samples were obtained in Puerto Real (purple diamond). The study showed at least three instances of introgressive hybridization between the two species.
Credit: Courtesy of the Kohn and Nakhleh Labs/Rice University

 

 

Rice University scientists have detected at least three instances of cross-species mating that likely influenced the evolutionary paths of “old world” mice, two in recent times and one in the distant past.

The researchers think these instances of introgressive hybridization — a way for genetic material and, potentially, traits to be passed from one species to another through interspecific mating — are only the first of many needles waiting to be found in a very large genetic haystack. While introgressive hybridization is thought to be common among plants, the finding suggests that hybridization in mammals may not be the evolutionary dead end biologists once commonly thought.

Rice biologist Michael Kohn and computer scientist Luay Nakhleh reported that two species of mice from various locations in Europe and Africa have shared genetic code to their apparent evolutionary advantage at least three times over the centuries.

Kohn, who tracks the genetic roots of mice to see how favorable evolutionary traits develop, and Nakhleh, who studies evolution by comparing genomic data, shared their findings this week in the Proceedings of the National Academy of Sciences.

Mice are common subjects for evolutionary studies of mammals because they breed quickly, allowing a biologist to follow many generations over the course of a career. The ability to track such interactions has implications for human genetics and health, the researchers said.

Kohn previously detailed a mutation in common European house mice (Mus musculus domesticus) that gave them resistance to warfarin, a rodent poison also used as a blood thinner in humans. Evidence indicated the mutation appeared in mice about 10 years after the introduction of warfarin and seemed connected to geographically distant Algerian mice (Mus spretus) that carried the same mutation.

But that project looked at only small sections of the mouse genome on one chromosome where the mutant gene in question, Vkorc1, was known to exist. “That gene had adaptively introgressed between these mice and was known to cause resistance at a time when some scientists thought such events should not happen,” said Kohn, an associate professor of ecology and evolutionary biology.

“The question then became: Is it rare, or is it common? With the approach we used in this paper, we now know it’s not unique,” he said. The new study compares genome-scale data of 21 mice that originated in 15 different locations in Europe and Africa.

Kohn, Nakhleh and lead author Kevin Liu, their former postdoctoral researcher and now an assistant professor at Michigan State University, employed Rice’s supercomputers and the Nakhleh lab’s open-source PhyloNet-HMM software to locate statistically likely connections between the re-sequenced complete genomes, some newly determined and some collected previously in a massive effort to understand the evolutionary origins of the laboratory mouse genome.

They turned up two more sets of genomic regions, or tracts, that showed hybridization events; one appears to predate the colonization of Europe by M. m. domesticus, and the other affected the subjects’ sense of smell — a definite evolutionary advantage for mice looking for food or mates. “That’s apparently an important locus,” Kohn said.

“The category that jumped out was the olfactory genes,” said Nakhleh, an associate professor of computer science and of biosciences, who had been thinking about large-scale studies of the mouse genome with Kohn since both arrived at Rice in 2004. “Now one has to work through the biology to figure out how this hybridization happened.”

“The new statistical method developed in Luay’s group can only tell you whether an event is there or not. It cannot tell you why it’s there or what it does. But it gives you a way to start looking,” Kohn added.

He said the lengths of the shared genomic tracts provide the key to estimating their ages and evolutionary dynamics. The longer the region, the more recently the hybridization event occurred. And some may be “driver genes” that drag along adjacent chunks of DNA. “One challenge is to see which are driver genes, meaning they encode a biological function that could be favored by natural selection, and which are just tagging along,” he said.
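The "longer tract = more recent" relationship Kohn describes can be illustrated with a standard back-of-the-envelope model (this is a textbook approximation, not the PhyloNet-HMM analysis used in the paper): recombination whittles an introgressed tract down each generation, so after g generations the expected surviving tract length is roughly 1/(r·g), where r is the local recombination rate per base pair per generation. The rate used below (about 0.5 cM/Mb, in the range reported for the mouse genome) is an illustrative assumption, not a value from the study.

```python
# Back-of-the-envelope dating of an introgressed tract from its length.
# Expected tract length after g generations ~ 1 / (r * g), so g ~ 1 / (r * L).
R_PER_BP = 0.5e-8  # assumed recombination rate: ~0.5 cM/Mb per generation

def generations_since_introgression(tract_length_bp, r=R_PER_BP):
    """Rough age (in generations) implied by a surviving tract length."""
    return 1.0 / (r * tract_length_bp)

# Longer shared tracts imply more recent hybridization:
recent = generations_since_introgression(1_000_000)  # a 1 Mb tract -> ~200 generations
ancient = generations_since_introgression(10_000)    # a 10 kb tract -> ~20,000 generations
print(recent, ancient)
```

Under these assumed numbers, a megabase-scale shared tract points to a hybridization event only a few hundred mouse generations ago, while a tract of a few kilobases points to one tens of thousands of generations back, which is the intuition behind distinguishing the recent events from the ancient one.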

Kohn expects future studies will show evidence of more hybridization among mice from the regions studied and farther afield, though he also realizes many events will never be found.

“Hybrids don’t always pass the test imposed by evolution and disappear from the record,” he said. “We cannot measure them, because we can’t count what we can’t see.”

Nakhleh said other studies may have missed evidence of hybridization because the researchers weren’t specifically looking for it. “Why is it that biologists in general who look at mammalian genomes haven’t found hybridization? I think it’s because they started with the hypothesis that it couldn’t be there and used tools that would ignore it.

“This paper shows the value of collaboration between biologists and ‘big-data’ scientists,” he said. “It also shows there is a need to develop more sophisticated computational methods for biologists.”

Co-authors of the paper are Rice undergraduate students Ethan Steinberg and Alexander Yozzo and former Rice postdoctoral researcher Ying Song, now at the Chinese Academy of Agricultural Sciences in Beijing.

The National Institutes of Health’s National Heart, Lung and Blood Institute and National Library of Medicine, the National Science Foundation (NSF) and the Keck Center of the Gulf Coast Consortia supported the research. The researchers utilized the NSF-funded DAVinCi supercomputer administered by Rice’s Ken Kennedy Institute for Information Technology.


Story Source:

The above story is based on materials provided by Rice University. The original article was written by Mike Williams. Note: Materials may be edited for content and length.


Journal Reference:

  1. Kevin J. Liu, Ethan Steinberg, Alexander Yozzo, Ying Song, Michael H. Kohn, Luay Nakhleh. Interspecific introgressive origin of genomic diversity in the house mouse. Proceedings of the National Academy of Sciences, 2014; 201406298 DOI: 10.1073/pnas.1406298111

 

Rice University. “Big-data analysis reveals gene sharing in mice.” ScienceDaily. ScienceDaily, 16 December 2014. <www.sciencedaily.com/releases/2014/12/141216175746.htm>.


Date:
December 16, 2014

 

Source:
NASA/Jet Propulsion Laboratory

 

Summary:
It will take about 11 trillion gallons of water (42 cubic kilometers) — around 1.5 times the maximum volume of the largest U.S. reservoir — to recover from California’s continuing drought, according to a new analysis of NASA satellite data.

 

 

20141217-1
Trends in total water storage in California, Nevada and bordering states from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite mission, September 2011 to September 2014. NASA scientists use these images to better quantify drought and its impact on water availability. Two-thirds of the measured losses were a result of groundwater depletion in California’s Central Valley.
Credit: NASA JPL/Caltech

 

 

It will take about 11 trillion gallons of water (42 cubic kilometers) — around 1.5 times the maximum volume of the largest U.S. reservoir — to recover from California’s continuing drought, according to a new analysis of NASA satellite data.

The finding was part of a sobering update on the state’s drought made possible by space and airborne measurements and presented by NASA scientists Dec. 16 at the American Geophysical Union meeting in San Francisco. Such data are giving scientists an unprecedented ability to identify key features of droughts, and can be used to inform water management decisions.

A team of scientists led by Jay Famiglietti of NASA’s Jet Propulsion Laboratory in Pasadena, California, used data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellites to develop the first-ever calculation of this kind — the volume of water required to end an episode of drought.

Earlier this year, at the peak of California’s current three-year drought, the team found that water storage in the state’s Sacramento and San Joaquin river basins was 11 trillion gallons below normal seasonal levels. Data collected since the launch of GRACE in 2002 show this deficit has increased steadily.

“Spaceborne and airborne measurements of Earth’s changing shape, surface height and gravity field now allow us to measure and analyze key features of droughts better than ever before, including determining precisely when they begin and end and what their magnitude is at any moment in time,” Famiglietti said. “That’s an incredible advance and something that would be impossible using only ground-based observations.”

GRACE data reveal that, since 2011, the Sacramento and San Joaquin river basins decreased in volume by four trillion gallons of water each year (15 cubic kilometers). That’s more water than California’s 38 million residents use each year for domestic and municipal purposes. About two-thirds of the loss is due to depletion of groundwater beneath California’s Central Valley.
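The volume figures quoted above can be sanity-checked with a quick unit conversion (a sketch using only the exact US-gallon-to-cubic-meter factor; the underlying water-storage estimates are NASA's):

```python
# Sanity-check the article's gallons <-> cubic-kilometer conversions.
US_GALLON_M3 = 3.785411784e-3  # one US gallon in cubic meters (exact)
M3_PER_KM3 = 1e9

def gallons_to_km3(gallons):
    """Convert US gallons to cubic kilometers."""
    return gallons * US_GALLON_M3 / M3_PER_KM3

total_deficit = gallons_to_km3(11e12)  # the drought-recovery deficit
annual_loss = gallons_to_km3(4e12)     # yearly basin loss since 2011

print(f"11 trillion gallons = {total_deficit:.1f} km^3")  # ~41.6, matching the ~42 km^3 quoted
print(f"4 trillion gallons  = {annual_loss:.1f} km^3")    # ~15.1, matching the 15 km^3 quoted
```

Both of the article's parenthetical metric figures check out against the gallon totals.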

In related results, early 2014 data from NASA’s Airborne Snow Observatory indicate that snowpack in California’s Sierra Nevada range was only half of previous estimates. The observatory is providing the first-ever high-resolution observations of the water volume of snow in the Tuolumne River, Merced, Kings and Lakes basins of the Sierra Nevada and the Uncompahgre watershed in the Upper Colorado River Basin.

To develop these calculations, the observatory measures how much water is in the snowpack and how much sunlight the snow absorbs, which influences how fast the snow melts. These data enable accurate estimates of how much water will flow out of a basin when the snow melts, which helps guide decisions about reservoir filling and water allocation.

“The 2014 snowpack was one of the three lowest on record and the worst since 1977, when California’s population was half what it is now,” said Airborne Snow Observatory Principal Investigator Tom Painter of JPL. “Besides resulting in less snow water, the dramatic reduction in snow extent contributes to warming our climate by allowing the ground to absorb more sunlight. This reduces soil moisture, which makes it harder to get water from the snow into reservoirs once it does start snowing again.”

New drought maps show groundwater levels across the U.S. Southwest are in the lowest 2 to 10 percent since 1949. The maps, developed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, combine GRACE data with other satellite observations.

“Integrating GRACE data with other satellite measurements provides a more holistic view of the impact of drought on water availability, including on groundwater resources, which are typically ignored in standard drought indices,” said Matt Rodell, chief of the Hydrological Sciences Laboratory at Goddard.

The scientists cautioned that while the recent California storms have been helpful in replenishing water resources, they aren’t nearly enough to end the multi-year drought.

“It takes years to get into a drought of this severity, and it will likely take many more big storms, and years, to crawl out of it,” said Famiglietti.

NASA monitors Earth’s vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. The agency develops new ways to observe and study Earth’s interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information on GRACE, visit: http://www.nasa.gov/grace and http://www.csr.utexas.edu/grace

For more on the Airborne Snow Observatory, visit: http://aso.jpl.nasa.gov/

For more information about NASA’s Earth science activities, visit: http://www.nasa.gov/earthrightnow


Story Source:

The above story is based on materials provided by NASA/Jet Propulsion Laboratory. Note: Materials may be edited for content and length.

 

 

NASA/Jet Propulsion Laboratory. “NASA data underscore severity of California drought.” ScienceDaily. ScienceDaily, 16 December 2014. <www.sciencedaily.com/releases/2014/12/141216184146.htm>.


Date:
December 15, 2014

 

Source:
Manchester University

 

Summary:
Scientists have identified the most energy-efficient way to make clouds more reflective to the sun in a bid to combat climate change. Marine Cloud Brightening is a reversible geoengineering method proposed to mitigate rising global temperatures. It relies on propelling a fine mist of salt particles high into the atmosphere to increase the albedo of clouds — the amount of sunlight they reflect back into space.

 

 

20141216-1
University of Manchester scientists have identified the most energy-efficient way to make clouds more reflective to the sun in a bid to combat climate change.
Credit: © magann / Fotolia

 

 

University of Manchester scientists have identified the most energy-efficient way to make clouds more reflective to the sun in a bid to combat climate change.

Marine Cloud Brightening is a reversible geoengineering method proposed to mitigate rising global temperatures. It relies on propelling a fine mist of salt particles high into the atmosphere to increase the albedo of clouds — the amount of sunlight they reflect back into space. This would then reduce temperatures on the surface, as less sunlight reaches Earth.

Clouds form when water droplets gather on dust or other particles in the air. Increasing the amount of salt particles in the atmosphere allows more of these water droplets to form, making the clouds denser and therefore more reflective.
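The droplet-number effect described here can be quantified with the standard cloud-albedo susceptibility relation (a textbook approximation, not a result from this paper): for fixed liquid water content, dA/d(ln N) ≈ A(1−A)/3, where A is cloud albedo and N is droplet number concentration. The baseline albedo of 0.5 below, and the reading of the study's "5% increase in reflection" as a relative increase, are both illustrative assumptions.

```python
import math

# Twomey-style cloud albedo susceptibility: for fixed liquid water content,
# dA/d(ln N) ~ A * (1 - A) / 3, with A = cloud albedo, N = droplet number.
def albedo_change(a0, n_ratio):
    """Approximate albedo change when droplet number is multiplied by n_ratio."""
    return a0 * (1 - a0) / 3 * math.log(n_ratio)

def n_ratio_for_target(a0, delta_a):
    """Droplet-number multiplier needed for a desired albedo increase."""
    return math.exp(3 * delta_a / (a0 * (1 - a0)))

a0 = 0.5                 # assumed baseline albedo of a marine cloud
target = 0.05 * a0       # a 5% relative increase in reflectance
print(albedo_change(a0, 2.0))          # doubling droplet number: ~0.058 albedo gain
print(n_ratio_for_target(a0, target))  # ~1.35x droplets for a 5% brighter cloud
```

Under these assumptions, only a modest (roughly a third) increase in droplet number is needed for the 5% brightening target, which is why the question becomes one of delivery energy rather than particle quantity.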

A new paper, published in the journal Philosophical Transactions of the Royal Society A, has looked at four different ways of getting the particles into the sky, to compare how effective they may be. The researchers found that a technique called the ‘Rayleigh Jet’ proved to be best.

Named after Lord Rayleigh, who provided the theory, the technique relies on spraying a fine jet of water into the sky, where it breaks up into small droplets. The droplets evaporate quickly, leaving behind just the salt particles.

These particles, say the paper’s authors, could be generated from specially built ships that travel the world’s oceans, spraying salt into the air, where the particles hang in the atmosphere for several days until they return to Earth in rain.

Previous studies have optimised the size of the salt particles needed to produce the best increase in cloud reflectance, but haven’t taken into account how much energy the technique would need and how much it would cost to operate. This new paper, by teams at the universities of Manchester, Washington and Edinburgh, tackles that question. The researchers tuned each technique to produce a 5% increase in reflectance, a figure that would offset the predicted effects of increased carbon dioxide levels over the rest of this century, and then compared how much energy each would consume.

The scientists say that the Rayleigh jet method could produce the desired effect using 30 megawatts of power, about the same as the output of two large ships.

Dr Paul Connolly, based in the School of Earth, Atmospheric and Environmental Sciences at The University of Manchester, said: “It can be incredibly energy intensive to propel water high into the atmosphere and the energy required had never really been tested before. Our paper optimises the salt particle sizes to produce the required change in cloud reflectance for the least energy cost. It is an important finding if these techniques should be needed in the future.

“I am not recommending that we use any of these techniques now, but it is important to know how best to use them should they become necessary. Should no progress be made to reduce CO2 levels, then geoengineering techniques, similar to this, might become necessary to avoid dangerous rises in global temperatures.”


Story Source:

The above story is based on materials provided by Manchester University. Note: Materials may be edited for content and length.


Journal Reference:

  1. P. J. Connolly, G. B. McFiggans, R. Wood, A. Tsiamis. Factors determining the most efficient spray distribution for marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2014; 372 (2031): 20140056 DOI: 10.1098/rsta.2014.0056

 

Manchester University. “Cost of cloud brightening for cooler planet revealed.” ScienceDaily. ScienceDaily, 15 December 2014. <www.sciencedaily.com/releases/2014/12/141215203029.htm>.


DIA Meeting 2015: Target Health to Chair a Symposium on eSource and Risk-based Monitoring

 

Target Health Inc. is pleased to announce that at the annual meeting of the DIA, being held in Washington, DC (June 14-18, 2015), Dr. Jules T. Mitchel will chair a Symposium entitled “How Risk-Based Monitoring and eSource Methodologies are Impacting Clinical Sites, Patients, Regulators and Sponsors.”

 

The Symposium will show how risk-based monitoring and eSource methodologies are impacting the way clinical trials are being conducted and managed. Using results and experience from completed and ongoing clinical trials, the speakers will identify how eSource and risk-based monitoring methodologies are impacting the clinical research enterprise, including clinical research sites, patients, regulators, quality assurance, CRAs, and project, safety and data managers.

 

In addition to Dr. Mitchel, speakers include:

 

  1. Ed Seguine, MBA (CEO, Clinical Ink), who will discuss “eClinical and eSource: Global Regulatory Challenges and Opportunities”

  2. Frances Nolan, MBA (Vice President, Quality and Regulatory Affairs, Medidata Solutions Worldwide), who will discuss “Overcoming Clinical Trial Data Collection Challenges with eSource Solution and Leveraging Mobile Technologies”

  3. Avik Pal, MBA (CEO, CliniOps), who will discuss “Innovation by Design: Using eSource to Maximize Clinical Development Productivity and Efficiency”

 

Sunrise From the 24th Floor Offices of Target Health Inc.

 

20141215-12

Sunrise from the 24th Floor © Target Health Inc.

 

ON TARGET is the newsletter of Target Health Inc., a NYC-based, full-service, contract research organization (eCRO), providing strategic planning, regulatory affairs, clinical research, data management, biostatistics, medical writing and software services to the pharmaceutical and device industries, including the paperless clinical trial.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 


Exercise and Your Heart

20141215-11

Leg Press

 

The benefits of exercise have been known since antiquity. Marcus Cicero, around 65 BCE, stated: “It is exercise alone that supports the spirits, and keeps the mind in vigor.” However, the link between physical health and 1) ___ (or lack of it) was only discovered in 1949 and reported in 1953 by a team led by Jerry Morris. Dr. Morris noted that men of similar social class and occupation (bus conductors versus bus drivers) had markedly different rates of heart attacks, depending on the level of exercise they got: bus drivers had a sedentary occupation and a higher incidence of heart disease, while bus conductors were forced to move continually and had a lower incidence of heart disease. This link had not previously been noted and was later confirmed by other researchers. “Just one day of exercise can protect the heart against a heart 2) ___, and this protection is upheld with months of exercise, making exercise one of the few sustainable preconditioning stimuli” (Journal of Applied Physiology, September 2011). Wow.

 

Heart attacks occur when a plaque suddenly breaks off from the walls of an artery supplying blood to the heart. The plaque travels down the ever-narrowing artery until it completely blocks the flow of blood to a part of the heart’s muscle. The heart’s muscle must receive oxygen from the bloodstream all the time. When a part of the heart muscle is suddenly deprived of oxygen, it dies and you suffer a heart attack. The dying heart muscle usually causes severe pain in the chest, back or left arm. Heart attacks are not caused by progressive narrowing of an artery; lack of oxygen is the ultimate cause of heart muscle damage. Anything that increases the ability of the heart muscle to survive oxygen deprivation, or increases 3) ___ supply to the heart muscle, helps to prevent heart attacks.

 

Exercise helps to prevent heart attacks, and the more intensely you exercise, the greater the protection. Researchers in Norway treated recovering heart attack victims with the same intense training methods used by competitive athletes (American Heart Journal). They supervised them as they ran on a treadmill very fast for a few seconds, rested and then repeated their intense intervals. For example, some of the patients ran fast for 30 seconds every five minutes. The interval-training heart attack victims were able to use more oxygen maximally (VO2max) and had their heart rates return toward normal faster than other heart attack victims who did slower continuous training. This advantage persisted for 30 months after the patients completed their 12-week rehabilitation program.

 

Intense training is not accepted as a treatment for heart attack victims, particularly those who have chest pain with exercise or excessive shortness of 4) ___. Intense exercise can precipitate heart attacks in people with blocked arteries. The exercise sessions are usually supervised by trained technicians using electrocardiograms, at least in the beginning. Intense exercise does not damage healthy hearts. All known tests for heart function show no damage from intense exercise. Post-exercise electrocardiograms and echocardiograms are normal, as are blood levels of heart-specific enzymes, creatine kinase and creatine kinase MB, and myoglobin (Medicine and Science in Sports and Exercise).

 

Exercise increases the size and number of mitochondria in the 5) ___ of mice (American Journal of Physiology, September 2011). The mice ran on a treadmill for an hour a day, six days a week, for eight weeks. This could explain how exercise improves memory, treats depression, makes people feel better and helps them to think more clearly. Until now, the leading theory to explain how exercise improves memory and treats depression was that exercise causes the brain to release endorphins, morphine-like compounds that can improve mood. However, endorphins would not explain the improvement in memory and brain function associated with a regular exercise program. Mitochondria are tiny chambers in 6) ___ that turn food into energy more efficiently than any other process in your body. Scientists have known for years that exercise enlarges and increases the number of mitochondria in muscle cells, to increase strength, speed and endurance; but this is the first research paper to offer a plausible explanation for why exercise improves memory and relieves depression. The increase in brain mitochondria could also explain how training for sports increases endurance by making the brain resistant to fatigue. It also could explain how exercise treats mental disorders, delays aging, and improves certain types of nerve damage.

 

People who exercise regularly were far less likely to suffer dementia, and had less than half the risk of death during the 17-year study period, compared to those who did not exercise (Medicine & Science in Sports & Exercise, February 2012). Researchers followed 45,000 men and 15,000 women, ages 20 to 88 years, in the United States for an average of 17 years. Six times as many people in the low fitness group suffered from dementia, compared to those who exercised regularly. While deaths in the United States associated with heart disease, breast cancer and stroke have declined in recent years, deaths related to dementia and Alzheimer’s disease rose 46 percent between 2002 and 2006. Another exciting study, from Japan, shows that many of the benefits that exercise provides to muscles are also provided to your brain (The Journal of Physiology, February 2012; 590 (Pt 3):607-16).

 

THE STUDY: Adult male rats exercised to exhaustion at moderate intensity on a treadmill. Glycogen, sugar stored in muscles, was depleted by 82-90 percent. One day later, the rats’ muscles could store 43-46 percent more than they could originally. In a like manner, brain glycogen levels decreased by 50-64 percent with exhaustive exercise, and the brain was able to store 29-63 percent more on the next day. The greater the depletion of sugar in muscles and brain, the greater the ability to store more sugar after the rats were fed. The brain filled with sugar before the muscles did, and after four weeks of training, the rats’ brains could store significantly more glycogen.

 

EXPLANATION: Sugar is the most efficient source of 7) ___ for your muscles during intense exercise. Sugar is the most efficient source of energy for your brain ALL the time. When you exercise regularly, you increase the ability of your muscles to store sugar so you can move faster and longer. This study suggests that exercise also increases the energy supply to your brain, which will help you to think and reason better.

 

Arthritis is classified into three types: inflammatory, degenerative, and traumatic. Your doctor first checks to see if you have a history of trauma that damaged your joint. If you don’t, he or she runs a host of blood tests looking for a source of inflammation (an overactive immune response): various diseases, infections or the deposition of crystals in the joint fluid. If none of these turns up, your doctor announces that you have osteoarthritis, which means that he or she doesn’t have the foggiest idea what is causing your joint pain. He or she then orders X-rays and MRIs. If the cartilage is gone and a bone at the joint touches the end of the opposing bone, your doctor may recommend a joint replacement. If the cartilage is intact, the major treatment is exercise (Current Pain and Headache Reports, December 2011; 15(6):423-30).

Regardless of the cause of your joint pain, inactivity will increase damage to the joint. If you don’t move that joint, you can expect it to degenerate to the point where it loses its ability to move through its full range of motion. When you stop using a joint, the muscles around it grow weak and the tendons stiffen, leading to further joint damage and lack of mobility. Of course, you should not pound on, or apply too much force to, damaged joints, because too much force can break more cartilage, leading to a joint replacement.

Do not run with arthritis in your knees or hips. Running is almost always contraindicated if you have joint 8) ___ in the legs, hips or lower back, because during running your foot hits the ground with a force that is transmitted up your leg all the way to your back. The faster you run, the greater and more damaging the force of your foot strike. If you want to run with joint pain, you must run very slowly, and even then you are still transmitting the foot strike force up your leg and into your back. The safest sports for people with painful joints are those that do not involve foot strike force.
Cycling is done in a smooth rotary motion. Swimming helps protect your joints because of the buoyancy of the water. Exercise equipment such as elliptical trainers or stair steppers will allow you to move your joints without pounding on your feet. Any of these activities will help you strengthen the muscles around the damaged joints and keep the tendons flexible. Whatever sport you choose, the same rules of training that athletes use apply to you:

 

Start out by pedaling, walking or swimming very 9) ___ and stop if your muscles feel heavy or hurt. Stop even if you have just started your workout for that day. When you can pedal, walk or swim for 30 minutes a day without increasing your joint pain, you are ready to start training. If you do not exercise intensely enough to feel some burning in your muscles, your muscles will not get stronger. You have to work hard enough to damage muscles, because healing is what strengthens them. Athletes take a harder workout on one day, intense enough that they feel a burning in their muscles. When you exercise hard enough to feel sore on the next day, you should go slowly for as many days as it takes for the muscles to heal and the soreness to go away. The normal soreness you feel 8 to 24 hours after an intense workout is called Delayed Onset Muscle Soreness (DOMS). The difference between DOMS and an injury: DOMS is symmetrical; you feel the same soreness on both sides of your body. Also, DOMS does not worsen with easy exercise. Stop exercising immediately if the pain does not improve as you continue to move. If your discomfort increases as you exercise, you have an injury or are headed for one. If your joint pain is so severe that you cannot exercise, you need to find some way to have your joints moved for you. Go to a physical 10) ___ who will move your joints for you and use massage, heat, cold or electrical stimulation to help you regain the ability to move the joints. Allowing your joints to stay in one position without daily movement will cause further joint damage and make your arthritis pain worse.
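The DOMS-versus-injury distinction above amounts to a simple decision rule. The sketch below is illustrative only, not medical software; the function name and inputs are hypothetical self-assessments invented for this example:

```python
# Illustrative sketch of the DOMS-vs-injury rule described above.
# Inputs are hypothetical self-assessments, not clinical measurements.

def likely_doms(symmetrical: bool, worsens_with_easy_exercise: bool,
                hours_since_workout: float) -> bool:
    """Return True if the soreness matches the DOMS pattern described:
    symmetrical, appearing roughly 8 to 24 hours after an intense
    workout, and not worsening with easy exercise."""
    typical_onset = 8 <= hours_since_workout <= 24
    return symmetrical and typical_onset and not worsens_with_easy_exercise

# Soreness on both sides, 12 hours after a hard session, easing with movement:
print(likely_doms(True, False, 12))   # True -> go slowly until it resolves
# One-sided pain that worsens as you move: treat as a possible injury.
print(likely_doms(False, True, 12))   # False -> stop exercising
```

Either negative answer (asymmetry, or pain that worsens with easy movement) flips the result, matching the article's advice to stop immediately in those cases.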

 

ANSWERS: 1) exercise; 2) attack; 3) oxygen; 4) breath; 5) brains; 6) cells; 7) energy; 8) damage; 9) slowly; 10) therapist

 

Filed Under News | Leave a Comment 

Jackie Gleason, An Unhealthy Life Worth Living (1916-1987)


1961 as Minnesota Fats, in The Hustler

 

As Ralph Kramden, the brash, blustering, uncouth and bumbling fool in the TV sitcom “The Honeymooners,” Jackie Gleason was probably the most famous television character of his day. He portrayed similar personalities in movies, with characters such as Minnesota Fats in “The Hustler” and Buford T. Justice in the “Smokey and the Bandit” series. Although his public image was that of a buffoon, he was very smart with money. He came from severe poverty, but he made incredible amounts of money from just about everything he did. The 39 episodes of “The Honeymooners” ran live on television for only two years, 1955 and 1956, but supplied him and his heirs with many millions of dollars from reruns that are still shown today. He couldn’t read or write a note of music, but he would hum tunes to seasoned musicians, who would write them down and sell them; his songs still bring royalties. The characters he played on the screen used many of the same personality traits he developed, or observed, early in life.

 

Gleason was born at 364 Chauncey Street in the Bushwick section of Brooklyn. He grew up nearby, at 328 Chauncey (an address he later used for Ralph and Alice Kramden on “The Honeymooners”). Originally named Herbert Walton Gleason Jr., he was baptized John Herbert Gleason. His parents were Mae “Maisie” (née Kelly), a subway change-booth attendant, and Herbert Walton “Herb” Gleason, an insurance auditor. His mother was from Farranree, Cork, Ireland, and his father was Irish American. Gleason was one of their two children; his brother Clemence died of spinal meningitis at age 14. When Gleason was nine, his father abandoned the family. Gleason remembered his father having “beautiful handwriting,” as Herbert Gleason often worked at the family’s kitchen table writing policies in the evenings. The night before his disappearance, Gleason’s father disposed of any family photos he was pictured in; just after noon on December 15, 1925, he collected his hat, coat, and paycheck, leaving the insurance company and his family permanently. When it was evident he was not coming back, Mae went to work for the Brooklyn-Manhattan Transit Corporation (BMT).

 

Jackie spent his youth hanging out with a local gang and playing pool. He never finished high school. He supported himself by telling jokes in a theater for four dollars a night, and by working in pool halls and carnivals. His mother died when he was 19. He got jobs as a comedian in night clubs, and he spent money as fast as he made it. When he was 24, Jack Warner of Warner Brothers saw his comedy routine at a night club and offered him his first movie role. At age 33, he became Chester A. Riley in the television production “The Life of Riley.” At age 36, he starred in “The Jackie Gleason Show” as a series of characters who screamed instead of speaking and were unable to speak correct English. At age 39, he took the role of Ralph Kramden, a bus driver married to Alice, in “The Honeymooners.” He bullied her and they fought in every show; he was so mean that it was funny, and they always made up at the end of every show. Today a life-size statue of bus driver Ralph Kramden stands in the Port Authority Bus Terminal in New York City.

 

At age 20, Jackie wed a vaudeville dancer, Genevieve Halford, but they had a very stormy marriage because he refused to come home after his night club jobs. He must have come home at least twice, because they had two daughters. They separated in 1941, got back together in 1948, and separated again in 1951. In 1954, while he was still married, he broke his leg. His wife came to visit him in the hospital and found Marilyn Taylor, a dancer on his television show, in his room. Everyone in the entire hospital got to hear them fight, after which Genevieve filed for divorce. It took 16 years for the divorce to become final. In 1970, ten days after his divorce from his first wife, Gleason married Beverly McKittrick, a secretary whom he had met two years earlier. Four years later, while he was still married to Beverly, he went back to seeing Marilyn Taylor, who was now a widow with a young son. He filed for divorce from his second wife, married Taylor, and stayed with her for the rest of his life.

 

Gleason’s lifestyle was so self-destructive that it is amazing he lived to be 71. He was morbidly obese, often weighing in at close to 300 pounds. He was famous for being able to drink every other actor under the table, and he smoked more than four packs of Marlboro cigarettes a day. His favorite food was red meat; he loved rich desserts and hated vegetables. He did not exercise, even after he moved to Miami, where his home had an exercise room that was larger and better equipped than many commercial gyms. His lifestyle caused horrible diseases and prevented him from really enjoying his fame and financial success. In 1978, at age 62, he had chest pains while playing the lead role in the play “Sly Fox” and was treated and released from the hospital. The following week his pain was so bad that he could not perform, and he had to have triple-bypass surgery. In 1986, at age 70, while acting in his last film, “Nothing in Common,” he was diagnosed with colon cancer that had spread to his liver. He was also diagnosed with diabetes, from which he had probably suffered for many years. If that wasn’t enough, he was afflicted with very painful clotted hemorrhoids. During the filming, he told his daughter: “I won’t be around much longer.” In 1987, he died at his Miami mansion. On the base of the statue at his grave are the words “And Away We Go.”

 

The same lifestyle factors that increase risk for heart attacks also increase risk for diabetes, dementia, impotence, and several types of cancer. Jackie Gleason’s lifestyle increased his risk for colon cancer, metastatic cancer to his liver, diabetes, heart attacks and an early death. The association between red meat and colon cancer is stronger than red meat’s association with any other cancer. At any age, you can reduce your risk for these diseases by changing your lifestyle. Cancer survivors have been shown to live longer when they adopt a healthful lifestyle, and many heart attack patients and diabetics can reverse much of the damage they have caused when they eat healthfully and get plenty of exercise. Source: Wikipedia; Gabe Mirkin MD

 

Filed Under History of Medicine, News | Leave a Comment 

Chromosome Region Linked to Gigantism

 

Gigantism, a rare disorder that causes excessive childhood growth, results from a defect in the pituitary, a pea-sized gland at the base of the brain that makes growth hormones and controls the activity of other glands in the body. Some people with gigantism have a tumor in the pituitary that secretes extra hormone; others just have an oversized pituitary. People with this condition are abnormally tall and may have delayed puberty, large hands and feet, and double vision. Gigantism is often treated by removing the tumor, or even the entire pituitary, but can sometimes be treated with medication alone.

 

According to an article published online in the New England Journal of Medicine (3 December 2014), researchers at the NIH have found a duplication of a short stretch of the X chromosome in some people with gigantism. According to the NIH, in theory, the causes of overgrowth and undergrowth in children should be regulated by the same mechanisms, and understanding how children grow is extraordinarily important as an indicator of their general health and their future well-being.

 

The research started with a family who came to the NIH Clinical Center for treatment in the mid-1990s. A mother who had been treated for gigantism had two sons who were also growing rapidly. A second family, with an affected daughter, came to NIH from Australia. The girl had the same duplication the researchers saw in the first family.

 

For the current study, the investigators used whole-genome analysis to find major changes in the DNA of 43 people with gigantism. Results showed that every person in the study who had gigantism as an infant or a toddler had the same defect, a duplication of a stretch of the X chromosome. Family members without gigantism did not have the duplication. Next, the investigators sought to identify which gene might be responsible for the excessive growth. While the length of the DNA duplication varied among the patients, the same four genes were found to be duplicated in all of them. After testing the genes, the most likely suspect was a gene called GPR101, whose activity was up to 1,000 times stronger in the pituitaries of children with the duplication than in the pituitaries of typically developing children. The investigators also looked at samples from the pituitary tumors of 248 people with acromegaly, a condition in which adults produce excess growth hormone. None of these patients had the duplication seen in people with gigantism. However, 11 of the acromegaly patients did have a mutation in GPR101, suggesting that the gene may play a role in that condition as well.
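The gene-narrowing step described above, finding the genes whose duplications are shared by every patient even though the duplicated stretches differ in length, is essentially a set intersection. A minimal sketch with made-up data (only the name GPR101 comes from the study; the other gene names and the per-patient lists are invented for illustration):

```python
# Toy illustration of narrowing a duplicated region to the genes
# duplicated in EVERY patient. Data are invented except the name GPR101.
from functools import reduce

patient_duplications = [
    {"GPR101", "GENE_A", "GENE_B", "GENE_C", "GENE_D"},
    {"GPR101", "GENE_A", "GENE_B", "GENE_C", "GENE_E"},
    {"GPR101", "GENE_A", "GENE_B", "GENE_C"},
]

# Genes shared by all patients = intersection of the per-patient sets.
shared = reduce(set.intersection, patient_duplications)
print(sorted(shared))  # ['GENE_A', 'GENE_B', 'GENE_C', 'GPR101']
```

In the toy data, four genes survive the intersection, mirroring the study's finding that four genes were duplicated in all patients; functional testing (here, not modeled) is what singled out GPR101.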

 

Figuring out exactly how the protein derived from GPR101 works is the next frontier, and it is hoped that these discoveries will lead to new treatments for gigantism in children, as well as insights into undergrowth.

 

Filed Under News | Leave a Comment 

Mouse Study Reveals Potential Clue to Extra Fingers or Toes

 

Polydactyly, a birth defect involving extra fingers on the hand or extra toes on the feet, is estimated to occur in roughly 1 of every 500 to 1,000 births. The condition appears to result from a mutation in any of a number of genes. Polydactyly may involve only one extra finger or toe, or it may include multiple extra digits. The condition may also appear as part of a larger disorder involving other organ abnormalities, some of which may be serious or life-threatening.
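The quoted incidence translates into a rough expected case count. A back-of-the-envelope sketch, in which the annual US birth figure is an assumption for illustration and not from the article:

```python
# Rough expected annual US polydactyly cases from the quoted incidence.
# ~3,900,000 annual US births is an illustrative assumption, not a cited figure.
annual_births = 3_900_000

# "roughly 1 of every 500 to 1,000 births"
low_rate, high_rate = 1 / 1_000, 1 / 500

low = annual_births * low_rate
high = annual_births * high_rate
print(f"{low:,.0f} to {high:,.0f} expected cases per year")
# -> 3,900 to 7,800 expected cases per year
```

The point is only scale: even a "rare" per-birth rate implies thousands of affected newborns each year.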

 

According to a study published online in Cell Reports (21 November 2014), a mouse version of polydactyly results from a malfunction of the cellular machinery that processes one of the cell’s internal transportation vehicles. The authors found that this mouse form of polydactyly appears to result from an error in a single gene within a group containing the information needed to make a protein complex called Endosomal Sorting Complex Required for Transport II (ESCRT-II, for short). Endosomes ferry molecules from the cell’s surface to the cell’s interior, and when ESCRT-II functions normally, it processes endosomes for eventual disassembly within the cell. When the mutation impairs the functioning of ESCRT-II, endosomes cannot be processed properly. As a result, an excess of a hormone known as fibroblast growth factor accumulates on the cell’s surface.

 

According to the authors, since this congenital defect appears both in isolation and in conjunction with other abnormalities, gaining fundamental knowledge of these genetic pathways is vital to developing effective genetic diagnostic screens and directed therapies.

 

The Fibroblast Growth Factors are a Family of Proteins Known as Cell Signaling Molecules

 


At top left: After a fibroblast growth factor binds to a cell, it triggers two potential sequences of events. Under typical conditions, it sets off a cascade of chemical reactions that eventually cause cells to divide, and even to differentiate into tissues and organs. In the other sequence, shown at top left, it is processed and then marked for eventual disassembly. With this internal breakdown process, the cell achieves a balance: once the cell acquires a critical mass of a fibroblast growth factor, the excess is marked for elimination. The process begins with the fibroblast growth factor (FGF) attaching to a special site, or receptor, on the cell surface, something like the way a key fits into a lock. The receptor changes shape, setting off the reactions that trigger the cell into such actions as dividing or giving rise to an organ or tissue type. At some point, sufficient FGF has bound to receptors on the cell surface. In response, pocket-like structures form on the cell’s surface. The cell surface changes shape: it involutes, like the surface of a water-filled balloon with a finger pressed into it. This sunken-in structure contains many such FGF-receptor pairs. Eventually, the involution pinches off inside the cell, forming a spherical structure known as an endosome. Like a bathysphere ferrying ocean researchers to the sea depths, the endosome conveys the FGF-receptor pairs deep into the cell. During their journey, the endosomes encounter a protein complex called Endosomal Sorting Complex Required for Transport II (ESCRT-II, for short). When it is functioning normally, ESCRT-II processes the endosome, reducing it in size. The details of just how ESCRT-II processes the endosome are not fully understood. Once this processing takes place, the endosome fuses with another cellular body, called a lysosome. The lysosome disassembles the endosome, degrading it and its contents: FGF and its receptor.

 

At bottom left: The researchers discovered that, in the mouse strain they bred for the study, a mutation in a gene for a key subunit of ESCRT-II impairs the ability of ESCRT-II to do its job. This gene contains the information needed to make vacuolar protein sorting protein 25, or Vps25, and the mutant Vps25 protein interferes with ESCRT-II’s function. Because the endosomes are not sufficiently processed, the lysosomes have trouble breaking them apart. As a result, endosomes accumulate inside the cell. Fewer new endosomes form at the cell surface, and FGF-receptor pairs accumulate on the surface because they cannot be removed rapidly enough. This situation creates an imbalance, increasing the amount of FGF bound to receptors on the cell surface. The end result in the mice with the Vps25 mutation is polydactyly.

 

Filed Under News | Leave a Comment 

NIH Takes Step to Speed the Initiation of Clinical Research by Ensuring Use of Single IRB

 

Institutional review boards (IRBs) play a critical role in assuring the ethical conduct of clinical research, and studies must be reviewed and approved by an IRB before they can begin. When the regulations for protection of human subjects were first published, most clinical research was conducted at a single institution. Since then, the research landscape has evolved, and many studies are carried out at multiple sites and within large networks. Studies that go beyond a single site are often able to recruit more individuals from diverse populations. These multi-site studies can often generate important results in less time. However, working through IRB review at each site can add delay without increasing the protections for the research participants in the study.

 

The National Institutes of Health (NIH) has issued a draft policy to promote the use of a single IRB in multi-site clinical research studies. The draft NIH policy proposes that all NIH-funded, multi-site studies carried out in the United States, whether supported through grants, contracts, or the NIH intramural program, should use a single IRB. Exceptions to the policy would be allowed if local IRB review is necessary to meet the needs of specific populations or where it is required by federal, state or tribal laws or regulations. Wider use of single IRB review in multi-site studies will help achieve greater efficiencies in the initiation of studies across NIH’s entire clinical research portfolio.

 

A number of NIH institutes and centers have been supporting the use of a single IRB in multi-site studies, and their experiences have shown the benefits and feasibility of the single IRB review model. Examples include the National Cancer Institute’s (NCI) Central Institutional Review Board, which has been in place for the review of NCI-sponsored clinical trials since 1999. The National Institute of Neurological Disorders and Stroke has also incorporated the use of a single IRB for its Network for Excellence in Neuroscience Clinical Trials’ (NeuroNEXT) and its stroke research network, NIH StrokeNET.

 

NIH is seeking public comments on the draft policy through a 60-day comment period closing Jan. 29, 2015. Comments may be submitted by any of the following methods:

 

— Email: SingleIRBpolicy@mail.nih.gov

— Fax: 301-496-9839

— Mail/hand delivery/courier: Office of Clinical Research and Bioethics Policy, Office of Science Policy, National Institutes of Health, 6705 Rockledge Drive, Suite 750, Bethesda, MD 20892.

 

Filed Under News, Regulatory | Leave a Comment 
