New findings link genetic mutation to mammary duct growth as well as shoveled teeth

Date:
April 23, 2018

Source:
University of California – Berkeley

Summary:
Biologists have been puzzled by the evolutionary adaptation behind a common tooth trait of northern Asians and Native Americans: shovel-shaped incisors. An analysis of archeological specimens shows that nearly 100 percent of early Native Americans had shoveled incisors, and genetic evidence pinpoints the selection to the Beringian standstill 20,000 years ago. One researcher proposes that a trait linked to shoveling, mammary duct growth, was selected to provide vitamin D and fat to infants.

 

Photograph of human upper incisors with significant “shoveling,” an anatomical variation influenced by the EDAR V370A allele, which is also associated with increased mammary duct branching.
Credit: Christy G. Turner, II, courtesy G. Richard Scott

 

 

The critical role that breast feeding plays in infant survival may have led, during the last ice age, to a common genetic mutation in East Asians and Native Americans that also, surprisingly, affects the shape of their teeth.

The genetic mutation, which probably arose 20,000 years ago, increases the branching density of mammary ducts in the breasts, potentially providing more fat and vitamin D to infants living in the far north where the scarcity of ultraviolet radiation makes it difficult to produce vitamin D in the skin.

If the spread of this genetic mutation is, in fact, due to selection for increased mammary ductal branching, the adaptation would be the first evidence of selection on the human maternal-infant bond.

“This highlights the importance of the mother-infant relationship and how essential it has been for human survival,” said Leslea Hlusko, an associate professor of integrative biology at the University of California, Berkeley.

As for the teeth, it just so happens that the gene controlling mammary duct growth also affects the shape of human incisors. Consequently, as the genetic mutation was selected for in an ancestral population living in the far north during the last Ice Age, shovel-shaped incisors became more frequent too. Shoveled incisors are common among Native Americans and northeastern Asian populations but rare in everyone else.

Hlusko and her colleagues outline the many threads of evidence supporting the idea in an article published this week in the journal Proceedings of the National Academy of Sciences.

The finding could also have implications for understanding the origins of dense breast tissue and its role in breast cancer.

For the study, Hlusko and her colleagues assessed the occurrence of shovel-shaped incisors in archeological populations in order to estimate the time and place of evolutionary selection for the trait. They found that nearly 100 percent of Native Americans prior to European colonization had shoveled incisors, as do approximately 40 percent of East Asians today.

The team then used the genetic effects that are shared with dental variation as a way to discern the evolutionary history of mammary glands because of their common developmental pathway.

“People have long thought that this shoveling pattern is so strong that there must have been evolutionary selection favoring the trait, but why would there be such strong selection on the shape of your incisors?” Hlusko said. “When you have shared genetic effects across the body, selection for one trait will result in everything else going along for the ride.”

The vitamin D connection

Getting enough vitamin D, which is essential for a robust immune system and proper fat regulation as well as for calcium absorption, is a big problem in northern latitudes because the sun is low on the horizon all year long and, above the Arctic Circle, doesn’t shine at all for part of the year. While humans at lower latitudes can get nearly all the vitamin D they need through exposure of the skin to ultraviolet light, the scarce UV at high latitudes forced northern peoples like the Siberians and Inuit to get their vitamin D from animal fat, hunting large herbivores and sea mammals.

But babies must get their vitamin D from mother’s milk, and Hlusko posits that the increased mammary duct branching may have been a way of delivering more vitamin D and the fat that goes with it.

Hlusko, who specializes in the evolution of teeth among animals, in particular primates and early humans, discovered these connections after being asked to participate in a scientific session on the dispersal of modern humans throughout the Americas at the February 2017 American Association for the Advancement of Science meeting. In preparing her talk on what teeth can tell us about the peopling of the New World, she pulled together the genetics of dental variation with the archaeological evidence to re-frame our understanding of selection on incisor shape.

Incisors are called “shovel-shaped” when the tongue-side of the incisors — the cutting teeth in the front of the mouth, four on top, four on the bottom — have ridges along the sides and biting edge. The trait is distinctive of Native Americans and populations in East Asia — Korea, Japan and northern China — with an increasing incidence as you travel farther north. Unpersuaded by a previously proposed idea that shoveled incisors were selected for use in softening animal hides, she looked at explanations unrelated to teeth.

The genetic mutation responsible for shoveling — which occurs in at least one of the two copies, or alleles, of a gene called EDAR, which codes for a protein called the ectodysplasin A receptor — is also involved in determining the density of sweat glands in the skin, the thickness of hair shafts and ductal branching in mammary glands. Previous genetic analysis of living humans concluded that the mutation arose in northern China due to selection for more sweat glands or sebaceous glands during the last ice age.

“Neither of those is a satisfying explanation,” Hlusko said. “There are some really hot parts in the world, and if sweating was so sensitive to selective pressures, I can think of some places where we would have more likely seen selection on that genetic variation instead of in northern China during the Last Glacial Maximum.”

The Beringian standstill

Clues came from a 2007 paper and later a 2015 study by Hlusko’s coauthor Dennis O’Rourke, in which scientists deduced from the DNA of Native Americans that they split off from other Asian groups more than 25,000 years ago, even though they arrived in North America only 15,000 years ago. Their conclusion was that Native American ancestors settled for some 10,000 years in an area between Asia and North America before finally moving into the New World. This so-called Beringian standstill coincided with the height of the Last Glacial Maximum between 18,000 and 28,000 years ago.

According to the Beringian standstill hypothesis, as the climate became drier and cooler at the onset of the Last Glacial Maximum, people who had been living in Siberia moved into Beringia. Gigantic ice sheets to the east prohibited migration into North America. They couldn’t migrate southwest because of a large expanse of treeless and inhospitable tundra. The area where they found refuge was a biologically productive region thanks to the altered ocean currents associated with the last ice age, a landmass increased in size by the lower sea levels. Genetic studies of animals and plants from the region suggest there was an isolated refugium in Beringia during that time, where species with locally adaptive traits arose. Such isolation is ripe for selection on genetic variants that make it easier for plants, animals and humans to survive.

“If you take these data from the teeth to interpret the evolutionary history of this EDAR allele, you frame-shift the selective episode to the Beringian standstill population, and that gives you the environmental context,” Hlusko said. “At that high latitude, these people would have been vitamin D deficient. We know from the archaeological record that they had a diet that attempted to compensate for it, and because there is evidence of selection in this population for specific alleles of the genes that influence fatty acid synthesis. But even more specifically, these genes modulate the fatty acid composition of breast milk. It looks like this mutation of the EDAR gene was also selected for in that ancestral population, and EDAR’s effects on mammary glands are the most likely target of the selection.”

The EDAR gene influences the development of many structures derived from the ectoderm in the fetus, including tooth shape, sweat glands, sebaceous glands, mammary glands and hair. As a consequence, selection on one trait leads to coordinated evolution of the others. The late evolutionary biologist and author Stephen Jay Gould referred to such byproducts of evolution as spandrels.

“This Beringian population is one example of what has happened thousands of times, over millions of years: Human populations form, exist for a little while and then disperse to form new populations, mixing with other groups of people, all of them leaving traces on modern human variation today,” Hlusko said. “An important take-home message is that human variation today reflects this dynamic process of ephemeral populations, rather than the traditional concept of geographic races with distinct differences between them.”

Story Source:

Materials provided by University of California – Berkeley. Original written by Robert Sanders. Note: Content may be edited for style and length.


Journal Reference:

  1. Leslea J. Hlusko, Joshua P. Carlson, George Chaplin, Scott A. Elias, John F. Hoffecker, Michaela Huffman, Nina G. Jablonski, Tesla A. Monson, Dennis H. O’Rourke, Marin A. Pilloud, G. Richard Scott. Environmental selection during the last ice age on the mother-to-infant transmission of vitamin D and fatty acids through breast milk. Proceedings of the National Academy of Sciences, 2018; 201711788 DOI: 10.1073/pnas.1711788115

 

Source: University of California – Berkeley. “Did last ice age affect breastfeeding in Native Americans? New findings link genetic mutation to mammary duct growth as well as shoveled teeth.” ScienceDaily. ScienceDaily, 23 April 2018. <www.sciencedaily.com/releases/2018/04/180423155057.htm>.

Scientists have tracked down an elusive ‘tangled knot’ of DNA

Date:
April 23, 2018

Source:
Garvan Institute of Medical Research

Summary:
In a world first, researchers have identified a new DNA structure — called the i-motif — inside cells. A twisted ‘knot’ of DNA, the i-motif has never before been directly seen inside living cells.

 

This is an artist’s impression of the i-motif DNA structure inside cells, along with the antibody-based tool used to detect it.
Credit: Chris Hammang

 

 

It’s DNA, but not as we know it.

In a world first, Australian researchers have identified a new DNA structure — called the i-motif — inside cells. A twisted ‘knot’ of DNA, the i-motif has never before been directly seen inside living cells.

The new findings, from the Garvan Institute of Medical Research, are published today in the leading journal Nature Chemistry.

Deep inside the cells in our body lies our DNA. The information in the DNA code — all 6 billion A, C, G and T letters — provides precise instructions for how our bodies are built, and how they work.

The iconic ‘double helix’ shape of DNA has captured the public imagination since 1953, when James Watson and Francis Crick famously uncovered the structure of DNA. However, it’s now known that short stretches of DNA can exist in other shapes, in the laboratory at least — and scientists suspect that these different shapes might play an important role in how and when the DNA code is ‘read’.

The new shape looks entirely different to the double-stranded DNA double helix.

“When most of us think of DNA, we think of the double helix,” says Associate Professor Daniel Christ (Head, Antibody Therapeutics Lab, Garvan) who co-led the research. “This new research reminds us that totally different DNA structures exist — and could well be important for our cells.”

“The i-motif is a four-stranded ‘knot’ of DNA,” says Associate Professor Marcel Dinger (Head, Kinghorn Centre for Clinical Genomics, Garvan), who co-led the research with A/Prof Christ.

“In the knot structure, C letters on the same strand of DNA bind to each other — so this is very different from a double helix, where ‘letters’ on opposite strands recognise each other, and where Cs bind to Gs [guanines].”

Although researchers have seen the i-motif before and have studied it in detail, it has only been witnessed in vitro — that is, under artificial conditions in the laboratory, and not inside cells.

In fact, scientists in the field have debated whether i-motif ‘knots’ would exist at all inside living things — a question that is resolved by the new findings.

To detect the i-motifs inside cells, the researchers developed a precise new tool — a fragment of an antibody molecule — that could specifically recognise and attach to i-motifs with a very high affinity. Until now, the lack of an antibody that is specific for i-motifs has severely hampered the understanding of their role.

Crucially, the antibody fragment didn’t detect DNA in helical form, nor did it recognise ‘G-quadruplex structures’ (a structurally similar four-stranded DNA arrangement).

With the new tool, researchers uncovered the location of ‘i-motifs’ in a range of human cell lines. Using fluorescence techniques to pinpoint where the i-motifs were located, they identified numerous spots of green within the nucleus, which indicate the position of i-motifs.

“What excited us most is that we could see the green spots — the i-motifs — appearing and disappearing over time, so we know that they are forming, dissolving and forming again,” says Dr Mahdi Zeraati, whose research underpins the study’s findings.

The researchers showed that i-motifs mostly form at a particular point in the cell’s ‘life cycle’ — the late G1 phase, when DNA is being actively ‘read’. They also showed that i-motifs appear in some promoter regions (areas of DNA that control whether genes are switched on or off) and in telomeres, ‘end sections’ of chromosomes that are important in the aging process.

Dr Zeraati says, “We think the coming and going of the i-motifs is a clue to what they do. It seems likely that they are there to help switch genes on or off, and to affect whether a gene is actively read or not.”

“We also think the transient nature of the i-motifs explains why they have been so very difficult to track down in cells until now,” adds A/Prof Christ.

A/Prof Marcel Dinger says, “It’s exciting to uncover a whole new form of DNA in cells — and these findings will set the stage for a whole new push to understand what this new DNA shape is really for, and whether it will impact on health and disease.”

Story Source:

Materials provided by Garvan Institute of Medical Research. Note: Content may be edited for style and length.


Journal Reference:

  1. Mahdi Zeraati, David B. Langley, Peter Schofield, Aaron L. Moye, Romain Rouet, William E. Hughes, Tracy M. Bryan, Marcel E. Dinger, Daniel Christ. I-motif DNA structures are formed in the nuclei of human cells. Nature Chemistry, 2018; DOI: 10.1038/s41557-018-0046-3

 

Source: Garvan Institute of Medical Research. “Found: A new form of DNA in our cells: Scientists have tracked down an elusive ‘tangled knot’ of DNA.” ScienceDaily. ScienceDaily, 23 April 2018. <www.sciencedaily.com/releases/2018/04/180423135054.htm>.

Using human stem cells, researchers create 3-D model of the brain to study a mutation tied to schizophrenia, bipolar disorder and depression

Date:
April 19, 2018

Source:
Brigham and Women’s Hospital

Summary:
Researchers are leveraging gene-editing tools and mini-organs grown in the lab to study the effects of DISC1 mutations in cerebral organoids — ‘mini brains’ — cultured from human stem cells.

 

Neurons (stock illustration).
Credit: © whitehoune / Fotolia

 

 

Major mental illnesses such as schizophrenia, severe depression and bipolar disorder share a common genetic link. Studies of specific families with a history of these types of illnesses have revealed that affected family members share a mutation in the gene DISC1. While researchers have been able to study how DISC1 mutations alter the brain during development in animal models, it has been difficult to find the right tools to study changes in humans. However, advancements in engineering human stem cells are now allowing researchers to grow mini-organs in labs, and gene-editing tools can be used to insert specific mutations into these cells.

Researchers from Brigham and Women’s Hospital are leveraging these new technologies to study the effects of DISC1 mutations in cerebral organoids — “mini brains” — cultured from human stem cells. Their results are published in Translational Psychiatry.

“Mini-brains can help us model brain development,” said senior author Tracy Young-Pearse, PhD, head of the Young-Pearse Lab in the Ann Romney Center for Neurologic Diseases at BWH. “Compared to traditional methods that have allowed us to investigate human cells in culture in two-dimensions, these cultures let us investigate the three-dimensional structure and function of the cells as they are developing, giving us more information than we would get with a traditional cell culture.”

The researchers cultured human induced pluripotent stem cells (iPSCs) to create three-dimensional mini-brains for study. Using the gene editing tool CRISPR-Cas9, they disrupted DISC1, modeling the mutation seen in studies of families suffering from these diseases. The team compared mini-brains grown from stem cells with and without this specific mutation.

DISC1-mutant mini-brains showed significant structural disruptions compared to organoids in which DISC1 was intact. Specifically, the fluid-filled spaces, known as ventricles, in the DISC1-mutant mini-brains were more numerous and smaller than in controls, meaning that while the expected cells are present in the DISC1-mutant, they are not in their expected locations. The DISC1-mutant mini-brains also show increased signaling in the WNT pathway, a pathway known to be important for patterning organs and one that is disrupted in bipolar disorder. By adding an inhibitor of the WNT pathway to the growing DISC1-mutant mini-brains, the researchers were able to “rescue” them — instead of having structural differences, they looked similar to the mini-brains developed from normal stem cells. This suggests that the WNT pathway may be responsible for the observed structural disruption in the DISC1-mutants, and could be a potential target pathway for future therapies.

“By producing cerebral organoids from iPSCs we are able to carefully control these experiments. We know that any differences we are seeing are because of the DISC1-mutation that we introduced,” said Young-Pearse. “By looking at how DISC1-mutations disrupt the morphology and gene expression of cerebral organoids, we are strengthening the link between DISC1-mutation and major mental illness, and providing new avenues for investigation of this relationship.”

This study was supported by funding from the Sackler Scholar Programme in Psychobiology, a Young Investigator Award from the Brain and Behavior Research Foundation, and the National Institute of Mental Health.

Story Source:

Materials provided by Brigham and Women’s Hospital. Note: Content may be edited for style and length.


Journal Reference:

  1. Priya Srikanth, Valentina N. Lagomarsino, Christina R. Muratore, Steven C. Ryu, Amy He, Walter M. Taylor, Constance Zhou, Marlise Arellano, Tracy L. Young-Pearse. Shared effects of DISC1 disruption and elevated WNT signaling in human cerebral organoids. Translational Psychiatry, 2018; 8 (1) DOI: 10.1038/s41398-018-0122-x

 

Source: Brigham and Women’s Hospital. “3-D human ‘mini-brains’ shed new light on genetic underpinnings of major mental illness: Using human stem cells, researchers create 3-D model of the brain to study a mutation tied to schizophrenia, bipolar disorder and depression.” ScienceDaily. ScienceDaily, 19 April 2018. <www.sciencedaily.com/releases/2018/04/180419141530.htm>.

Meteorite diamonds tell of a lost planet

Date:
April 18, 2018

Source:
École Polytechnique Fédérale De Lausanne

Summary:
Scientists have examined a slice from a meteorite that contains large diamonds formed at high pressure. The study shows that the parent body from which the meteorite came was a planetary embryo with a size between that of Mercury and Mars.

 

Meteorite sample.
Credit: © 2018 EPFL / Hillary Sanctuary

 

 

Using transmission electron microscopy, EPFL scientists have examined a slice from a meteorite that contains large diamonds formed at high pressure. The study shows that the parent body from which the meteorite came was a planetary embryo with a size between that of Mercury and Mars. The discovery is published in Nature Communications.

On October 7, 2008, an asteroid entered Earth’s atmosphere and exploded 37 km above the Nubian Desert in Sudan. The asteroid, now known as “2008 TC3,” was just over four meters in diameter. When it exploded in the atmosphere, it scattered multiple fragments across the desert. Only fifty fragments, ranging in size from 1-10 cm, were collected, for a total mass of 4.5 kg. Over time, the fragments were gathered and catalogued for study into a collection named Almahata Sitta (Arabic for “Station Six,” after a nearby train station between Wadi Halfa and Khartoum).

The Almahata Sitta meteorites are mostly ureilites, a rare type of stony meteorite that often contains clusters of nano-sized diamonds. Current thinking is that these tiny diamonds can form in three ways: enormous pressure shockwaves from high-energy collisions between the meteorite “parent body” and other space objects; deposition by chemical vapor; or, finally, the “normal” static pressure inside the parent body, like most diamonds on Earth.

The unanswered question, so far, has been the planetary origin of 2008 TC3 ureilites. Now, scientists at Philippe Gillet’s lab at EPFL, with colleagues in France and Germany, have studied large diamonds (100 microns in diameter) in some of the Almahata Sitta meteorites and discovered that the asteroid came from a planetary “embryo” whose size was between that of Mercury and Mars.

The researchers studied the diamond samples using a combination of advanced transmission electron microscopy techniques at EPFL’s Interdisciplinary Centre for Electron Microscopy. The analysis of the data showed that the diamonds had chromite, phosphate, and iron-nickel sulfides embedded in them — what scientists refer to as “inclusions.” These have been known for a long time to exist inside Earth’s diamonds, but are now described for the first time in an extraterrestrial body.

The particular composition and morphology of these materials can only be explained if the pressure under which the diamonds were formed was higher than 20 GPa (gigapascals, a unit of pressure). This level of internal pressure can only be explained if the planetary parent body was a Mercury- to Mars-sized planetary “embryo,” depending on the layer in which the diamonds were formed.
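As a rough plausibility check (a back-of-envelope estimate of ours, not a calculation from the paper), the central pressure of a uniform-density sphere of density \rho and radius R is P_c \approx \tfrac{2\pi}{3} G \rho^2 R^2. Setting P_c = 20 GPa and assuming a bulk density somewhere between that of rocky material (about 3,500 kg/m³) and an iron-rich body like Mercury (about 5,400 kg/m³) gives

R \approx \sqrt{\frac{3 P_c}{2\pi G \rho^{2}}} \approx 2{,}200\ \mathrm{km}\ (\rho = 5{,}400\ \mathrm{kg/m^{3}}) \quad \text{to} \quad 3{,}400\ \mathrm{km}\ (\rho = 3{,}500\ \mathrm{kg/m^{3}}),

which brackets the radii of Mercury (about 2,440 km) and Mars (about 3,390 km). Diamonds formed above the very center would require a correspondingly larger body, consistent with the authors’ note that the inferred size depends on the layer in which the diamonds formed.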

Many planetary formation models have predicted that these planetary embryos existed in the first million years of our solar system, and the study offers compelling evidence for their existence. Many planetary embryos were Mars-sized bodies, such as the one that collided with Earth to give rise to the Moon. Others went on to form larger planets, or collided with the Sun or were ejected from the solar system altogether. The authors write, “This study provides convincing evidence that the ureilite parent body was one such large ‘lost’ planet before it was destroyed by collisions some 4.5 billion years ago.”

Story Source:

Materials provided by École Polytechnique Fédérale De Lausanne. Note: Content may be edited for style and length.


Journal Reference:

  1. Farhang Nabiei, James Badro, Teresa Dennenwaldt, Emad Oveisi, Marco Cantoni, Cécile Hébert, Ahmed El Goresy, Jean-Alix Barrat, Philippe Gillet. A large planetary body inferred from diamond inclusions in a ureilite meteorite. Nature Communications, 2018; 9 (1) DOI: 10.1038/s41467-018-03808-6

 

Source: École Polytechnique Fédérale De Lausanne. “Meteorite diamonds tell of a lost planet.” ScienceDaily. ScienceDaily, 18 April 2018. <www.sciencedaily.com/releases/2018/04/180418144810.htm>.

Dinosaurs ended — and originated — with a bang!

Date:
April 16, 2018

Source:
University of Bristol

Summary:
It is commonly understood that the dinosaurs disappeared with a bang — wiped out by a great meteorite impact on the Earth 66 million years ago. But their origins have been less understood. In a new study, scientists show that the key expansion of dinosaurs was also triggered by a crisis — a mass extinction that happened 232 million years ago.

 

Dinosaur fossil (stock image).
Credit: © ramirezom / Fotolia

 

 

It is commonly understood that the dinosaurs disappeared with a bang — wiped out by a great meteorite impact on the Earth 66 million years ago.

But their origins have been less understood. In a new study, scientists from MUSE — Museum of Science, Trento, Italy, the Universities of Ferrara and Padova, Italy, and the University of Bristol show that the key expansion of dinosaurs was also triggered by a crisis — a mass extinction that happened 232 million years ago.

In the new paper, published today in Nature Communications, evidence is provided to match the two events — the mass extinction, called the Carnian Pluvial Episode, and the initial diversification of dinosaurs.

Dinosaurs had originated much earlier, at the beginning of the Triassic Period, some 245 million years ago, but they remained very rare until the shock events in the Carnian 13 million years later.

The new study shows just when dinosaurs took over by using detailed evidence from rock sequences in the Dolomites, in north Italy — here the dinosaurs are detected from their footprints.

First there were no dinosaur tracks, and then there were many. This marks the moment of their explosion, and the rock successions in the Dolomites are well dated. Comparison with rock successions in Argentina and Brazil, where the first extensive skeletons of dinosaurs occur, shows that the explosion happened at the same time there as well.

Lead author Dr Massimo Bernardi, Curator at MUSE and Research associate at Bristol’s School of Earth Sciences, said: “We were excited to see that the footprints and skeletons told the same story. We had been studying the footprints in the Dolomites for some time, and it’s amazing how clear cut the change from ‘no dinosaurs’ to ‘all dinosaurs’ was.”

The point of explosion of dinosaurs matches the end of the Carnian Pluvial Episode, a time when climates shuttled from dry to humid and back to dry again.

It was long suspected that this event had caused upheavals among life on land and in the sea, but the details were not clear. Then, in 2015, dating of rock sections and measurement of oxygen and carbon values showed just what had happened.

There were massive eruptions in western Canada, represented today by the great Wrangellia basalts — these drove bursts of global warming, acid rain, and killing on land and in the oceans.

Co-author Piero Gianolla, from the University of Ferrara, added: “We had detected evidence for the climate change in the Dolomites. There were four pulses of warming and climate perturbation, all within a million years or so. This must have led to repeated extinctions.”

Professor Mike Benton, also a co-author, from the University of Bristol, said: “The discovery of the existence of a link between the first diversification of dinosaurs and a global mass extinction is important.

“The extinction didn’t just clear the way for the age of the dinosaurs, but also for the origins of many modern groups, including lizards, crocodiles, turtles, and mammals — key land animals today.”

Story Source:

Materials provided by University of Bristol. Note: Content may be edited for style and length.


Journal Reference:

  1. Massimo Bernardi, Piero Gianolla, Fabio Massimo Petti, Paolo Mietto, Michael J. Benton. Dinosaur diversification linked with the Carnian Pluvial Episode. Nature Communications, 2018; 9 (1) DOI: 10.1038/s41467-018-03996-1

 

Source: University of Bristol. “Dinosaurs ended — and originated — with a bang!.” ScienceDaily. ScienceDaily, 16 April 2018. <www.sciencedaily.com/releases/2018/04/180416105803.htm>.

Even moderate alcohol drinking linked to heart and circulatory diseases, study finds

Date:
April 13, 2018

Source:
University of Cambridge

Summary:
Regularly drinking more than the recommended UK guidelines for alcohol could take years off your life, according to new research. The study shows that drinking more alcohol is associated with a higher risk of stroke, fatal aneurysm, heart failure and death.

 

Beer (stock image).
Credit: © Amy Laughinghouse / Fotolia

 

 

Regularly drinking more than the recommended UK guidelines for alcohol could take years off your life, according to new research from the University of Cambridge. Part-funded by the British Heart Foundation, the study shows that drinking more alcohol is associated with a higher risk of stroke, fatal aneurysm, heart failure and death.

The authors say their findings challenge the widely held belief that moderate drinking is beneficial to cardiovascular health, and support the UK’s recently lowered guidelines.

The study compared the health and drinking habits of over 600,000 people in 19 countries worldwide and controlled for age, smoking, history of diabetes, level of education and occupation.

The upper safe limit of drinking was about five drinks per week (100g of pure alcohol, 12.5 units or just over five pints of 4% ABV beer or five 175ml glasses of 13% ABV wine). However, drinking above this limit was linked with lower life expectancy. For example, having 10 or more drinks per week was linked with one to two years shorter life expectancy. Having 18 drinks or more per week was linked with four to five years shorter life expectancy.

The research, published today in the Lancet, supports the UK’s recently lowered guidelines, which since 2016 recommend both men and women should drink no more than 14 units of alcohol each week. This equates to around six pints of beer or six glasses of wine a week.
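For readers unfamiliar with UK units, the arithmetic behind these figures is straightforward (a worked example of ours, using the standard UK definition of one unit as 10 ml, or 8 g, of pure ethanol, and the serving sizes quoted above):

100\ \mathrm{g/week} \div 8\ \mathrm{g/unit} = 12.5\ \text{units/week}

568\ \mathrm{ml\ pint} \times 4\%\ \mathrm{ABV} \approx 22.7\ \mathrm{ml} \approx 2.3\ \text{units} \;\Rightarrow\; 12.5 \div 2.3 \approx 5.5\ \text{pints}

175\ \mathrm{ml\ glass} \times 13\%\ \mathrm{ABV} \approx 22.8\ \mathrm{ml} \approx 2.3\ \text{units} \;\Rightarrow\; 14 \div 2.3 \approx 6\ \text{glasses}

So the 100 g threshold works out to just over five pints or glasses per week, and the 14-unit guideline to roughly six.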

However, the worldwide study carries implications for countries across the world, where alcohol guidelines vary substantially.

The researchers also looked at the association between alcohol consumption and different types of cardiovascular disease. Alcohol consumption was associated with a higher risk of stroke, heart failure, fatal aortic aneurysms and fatal hypertensive disease, and there were no clear thresholds below which drinking less ceased to have a benefit.

By contrast, alcohol consumption was associated with a slightly lower risk of non-fatal heart attacks.

The authors note that the different relationships between alcohol intake and various types of cardiovascular disease may relate to alcohol’s elevating effects on blood pressure and on factors related to elevated high-density lipoprotein cholesterol (HDL-C) (also known as ‘good’ cholesterol). They stress that the lower risk of non-fatal heart attack must be considered in the context of the increased risk of several other serious and often fatal cardiovascular diseases.

The study focused on current drinkers to reduce the risk of bias caused by those who abstain from alcohol due to poor health. However, the study used self-reported alcohol consumption and relied on observational data, so no firm conclusions can be made about cause and effect. The study did not look at the effect of alcohol consumption over the life-course or account for people who may have reduced their consumption due to health complications.

Dr Angela Wood, from the University of Cambridge, lead author of the study said: “If you already drink alcohol, drinking less may help you live longer and lower your risk of several cardiovascular conditions.

“Alcohol consumption is associated with a slightly lower risk of non-fatal heart attacks but this must be balanced against the higher risk associated with other serious — and potentially fatal — cardiovascular diseases.”

Victoria Taylor, Senior dietician at the British Heart Foundation, which part-funded the study, said: “This powerful study may make sobering reading for countries that have set their recommendations at higher levels than the UK, but this does seem to broadly reinforce government guidelines for the UK.

“This doesn’t mean we should rest on our laurels; many people in the UK regularly drink over what’s recommended. We should always remember that alcohol guidelines should act as a limit, not a target, and try to drink well below this threshold.”

The study was funded by the UK Medical Research Council, British Heart Foundation, National Institute for Health Research, European Union Framework 7, and European Research Council.

Story Source:

Materials provided by University of Cambridge. The original story is licensed under a Creative Commons License. Adapted from a press release by British Heart Foundation. Note: Content may be edited for style and length.


Journal Reference:

  1. Angela M Wood et al. Risk thresholds for alcohol consumption: combined analysis of individual-participant data for 599 912 current drinkers in 83 prospective studies. The Lancet, 2018; 391 (10129): 1513 DOI: 10.1016/S0140-6736(18)30134-X

 

Source: University of Cambridge. “Consuming more than five drinks a week could shorten your life: Even moderate alcohol drinking linked to heart and circulatory diseases, study finds.” ScienceDaily. ScienceDaily, 13 April 2018. <www.sciencedaily.com/releases/2018/04/180413121952.htm>.

World Orphan Drug Congress

 

Target Health has extensive experience in Orphan and Rare Diseases including FDA approvals and multiple Orphan Drug Designations.

 

For those attending the World Orphan Drug Congress, April 25-27 at the Gaylord National Harbor Hotel in Oxon Hill, MD, please reach out to Warren Pearlson, Target’s Director of Business Development who will be attending. Warren can meet with you to discuss our Regulatory Strategy and Services in the Orphan space, and delineate how our Full-Service eCRO, supported by Target Health’s paperless eSource EDC software platform, enables efficient and cost-effective Clinical Trials.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 165). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

QUIZ


Older Adults Grow Just as Many New Brain Cells as Young People

Diagram of the human brain with color-coded regions (red: frontal lobe; orange: parietal lobe; yellow: occipital lobe; green: temporal lobe; blue: cerebellum; black: brainstem)

 

Graphic credit: By Original concept by w:User:Washington Irving. Current shape by w:User:Mateuszica. Color modified by w:User:Hdante. Text labels by w:User:SAE1962. SVG by User:King of Hearts. – PNG on English Wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=2221053

 

Researchers show for the first time that healthy older men and women can generate just as many new brain cells as 1) ___ people. There has been controversy over whether adult humans grow new neurons, and some research has previously suggested that the adult brain was hard-wired and that adults did not grow new 2) ___. This study, which appeared in the journal Cell Stem Cell on April 5, 2018, counters that notion. According to the authors the findings may suggest that many senior citizens remain more cognitively and emotionally intact than commonly believed. Results showed that older people have similar ability to make thousands of hippocampal new neurons from progenitor 3) ___ as younger people do. The study also found equivalent volumes of the hippocampus (a brain structure used for emotion and cognition) across ages. Nevertheless, older individuals had less vascularization and maybe less ability of new neurons to make connections.

 

For the study, the authors autopsied hippocampi from 28 previously healthy individuals aged 14-79 who had died suddenly. This is the first time it was possible to look at newly formed neurons and the state of blood vessels within the entire human hippocampus soon after 4) ___. The researchers had determined that study subjects were not cognitively impaired and had not suffered from depression or taken antidepressants, which the authors had previously found could impact the production of new brain cells. In rodents and primates, the ability to generate new hippocampal cells declines with 5) ___. Waning production of neurons and an overall shrinking of the dentate gyrus, part of the hippocampus thought to help form new episodic memories, was believed to occur in aging humans as well.

 

The authors from Columbia University and New York State Psychiatric Institute found that even the oldest brains they studied produced new brain cells. While they found similar numbers of intermediate neural progenitors and thousands of immature neurons, older individuals form fewer new 6) ___ vessels within brain structures and possess a smaller pool of progenitor cells — descendants of stem cells that are more constrained in their capacity to differentiate and self-renew. The authors surmised that reduced cognitive-emotional resilience in old age may be caused by this smaller pool of neural stem cells, the decline in vascularization, and reduced cell-to-cell connectivity within the hippocampus. The authors hypothesized that it is possible that ongoing hippocampal neurogenesis sustains human-specific cognitive function throughout 7) ___ and that declines may be linked to compromised cognitive-emotional resilience.

 

The authors feel future research on the aging brain will continue to explore how neural cell proliferation, maturation, and survival are regulated by hormones, transcription factors, and other inter-cellular pathways. Other researchers are focusing on brain evolution and have suggested that there are specific genes that control the size of the human brain. These genes continue to play a role in brain evolution, implying that the brain is continuing to evolve.

 

The study began with the researchers assessing 214 genes that are involved in brain development. These genes were obtained from humans, macaques, rats and mice. The authors noted points in the DNA sequences that caused protein alterations. These DNA changes were then scaled to the evolutionary time that it took for those changes to occur. The data showed the genes in the human brain evolved much faster than those of the other species. Once this genomic evidence was acquired, the authors decided to find the specific gene or genes that allowed for or even controlled this rapid 8) ___. Two genes were found to control the size of the human brain as it develops. These genes are Microcephalin and Abnormal Spindle-like Microcephaly (ASPM). The researchers at the University of Chicago were able to determine that under the pressures of selection, both of these genes showed significant DNA sequence changes.

 

Earlier studies showed that Microcephalin experienced rapid evolution along the primate lineage that eventually led to the emergence of Homo sapiens. After the emergence of humans, Microcephalin seems to have shown a slower evolution rate. In contrast, ASPM showed its most rapid evolution in the later years of human evolution, once the divergence between chimpanzees and humans had already occurred. Each of the gene sequences went through specific changes that led to the evolution of 9) ___ from ancestral relatives. In order to determine these alterations, the authors used DNA sequences from multiple primates, then compared and contrasted the sequences with those of humans. Following this step, the researchers statistically analyzed the key differences between the primate and human DNA to come to the conclusion that the differences were due to natural selection. The changes in DNA sequences of these 10) ___ accumulated to bring about a competitive advantage and higher fitness that humans possess in relation to other primates. This comparative advantage is coupled with a larger brain size which ultimately allows the human mind to have a higher cognitive awareness.

 

Sources and Researchers: Maura Boldrini, Camille A. Fulmore, Alexandria N. Tartt, Laika R. Simeon, Ina Pavlova, Verica Poposka, Gorazd B. Rosoklija, Aleksandar Stankov, Victoria Arango, Andrew J. Dwork, René Hen, J. John Mann. Human Hippocampal Neurogenesis Persists throughout Aging. Cell Stem Cell, 2018; 22 (4): 589 DOI: 10.1016/j.stem.2018.03.015. Cell Press. “Older adults grow just as many new brain cells as young people.” ScienceDaily. ScienceDaily, 5 April 2018. www.sciencedaily.com/releases/2018/04/180405223413.htm; Wikipedia

 

ANSWERS: 1) younger; 2) neurons; 3) cells; 4) death; 5) age; 6) blood; 7) life; 8) evolution; 9) humans; 10) genes

 

Phrenology

An 1883 phrenology chart

 

Graphic credit: From People’s Cyclopedia of Universal Knowledge (1883). Transferred from en.wikipedia Original uploader was Whbonney at en.wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=6693422

 

Phrenology is a pseudomedicine primarily focused on measurements of the human skull, based on the concept that the brain is the organ of the mind, and that certain brain areas have localized, specific functions or modules. Although both of those ideas have a basis in reality, phrenology extrapolated beyond empirical knowledge in a way that departed from science. Developed by German physician Franz Joseph Gall in 1796, the discipline was very popular in the 19th century, especially from about 1810 until 1840. The principal British center for phrenology was Edinburgh, where the Edinburgh Phrenological Society was established in 1820. Although now regarded as an obsolete amalgamation of primitive neuroanatomy with moral philosophy, phrenological thinking was influential in 19th-century psychiatry. Gall’s assumption that character, thoughts, and emotions are located in specific parts of the brain is considered an important historical advance toward neuropsychology.

 

Phrenologists believe that the human mind has a set of various mental faculties, each one represented in a different area of the brain. For example, the faculty of “philoprogenitiveness“, from the Greek for “love of offspring“, was located centrally at the back of the head (see illustration of the chart from Webster’s Academic Dictionary).

These areas were said to be proportional to a person’s propensities. The importance of an organ was derived from its relative size compared to other organs. It was believed that the skull – like a glove on the hand – conforms to the different sizes of these areas of the brain, so that a person’s capacity for a given personality trait could be determined simply by measuring the area of the skull that overlies the corresponding area of the brain. Phrenology, which focuses on personality and character, is distinct from craniometry, the study of skull size, weight and shape, and from physiognomy, the study of facial features. Phrenology is a process that involves observing and/or feeling the skull to determine an individual’s psychological attributes. Franz Joseph Gall believed that the brain was made up of 27 individual organs that determined personality, the first 19 of which he believed to exist in other animal species. Phrenologists would run their fingertips and palms over the skulls of their patients to feel for enlargements or indentations. The phrenologist would often take measurements of the overall head size with a tape measure and, more rarely, employ a craniometer, a special version of a caliper. In general, instruments to measure the size of the cranium continued to be used after mainstream phrenology had ended. Phrenologists put emphasis on using drawings of individuals with particular traits to determine the character of the person, and thus many phrenology books show pictures of subjects. From the absolute and relative sizes of the skull the phrenologist would assess the character and temperament of the patient.

 

Gall’s list of the “brain organs“ was specific. An enlarged organ meant that the patient used that particular “organ“ extensively. The number – and more detailed meanings – of organs were added later by other phrenologists. The 27 areas varied in function, from sense of color, to religiosity, to being combative or destructive. Each of the 27 “brain organs“ was located under a specific area of the skull. As a phrenologist felt the skull, he would use his knowledge of the shapes of heads and organ positions to determine the overall natural strengths and weaknesses of an individual. Phrenologists believed the head revealed natural tendencies but not absolute limitations or strengths of character. The first phrenological chart gave the names of the organs described by Gall; it was a single sheet, and sold for a cent. Later charts were more expansive.

 

Historically, among the first to identify the brain as the major controlling center for the body were Hippocrates and his followers, inaugurating a major change in thinking from Egyptian, biblical and early Greek views, which based bodily primacy of control on the heart. This belief was supported by the Greek physician Galen, who concluded that mental activity occurred in the brain rather than the heart, contending that the brain, a cold, moist organ formed of sperm, was the seat of the animal soul – one of three “souls” found in the body, each associated with a principal organ. The Swiss pastor Johann Kaspar Lavater (1741-1801) introduced the idea that physiognomy related to the specific character traits of individuals, rather than general types, in his Physiognomische Fragmente, published between 1775 and 1778. His work was translated into English and published in 1832 as The Pocket Lavater, or, The Science of Physiognomy. He believed that thoughts of the mind and passions of the soul were connected with an individual’s external frame. Of the forehead, he wrote: “When the forehead is perfectly perpendicular, from the hair to the eyebrows, it denotes an utter deficiency of understanding.”

 

In 1796 the German physician Franz Joseph Gall (1758-1828) began lecturing on organology (the isolation of mental faculties) and, later, cranioscopy, which involved reading the skull’s shape as it pertained to the individual. It was Gall’s collaborator Johann Gaspar Spurzheim who would popularize the term “phrenology”. In 1809 Gall began writing his principal work, The Anatomy and Physiology of the Nervous System in General, and of the Brain in Particular, with Observations upon the possibility of ascertaining the several Intellectual and Moral Dispositions of Man and Animal, by the configuration of their Heads. It was not published until 1819. In the introduction to this main work, Gall makes the following statement in regard to his doctrinal principles, which comprise the intellectual basis of phrenology:

 

The Brain is the organ of the mind

 

1. The brain is not a homogeneous unit, but an aggregate of mental organs with specific functions

2. The cerebral organs are topographically localized

3. Other things being equal, the relative size of any particular mental organ is indicative of the power or strength of that organ

4. Since the skull ossifies over the brain during infant development, external craniological means could be used to diagnose the internal states of the mental characters

 

Through careful observation and extensive experimentation, Gall believed he had established a relationship between aspects of character, called faculties, and precise organs in the brain. Johann Spurzheim was Gall’s most important collaborator. He worked as Gall’s anatomist until 1813, when for unknown reasons they had a permanent falling out. Publishing under his own name, Spurzheim successfully disseminated phrenology throughout the United Kingdom during his lecture tours of 1814 and 1815, and in the United States in 1832, where he would eventually die. Gall was more concerned with creating a physical science, so it was through Spurzheim that phrenology was first spread throughout Europe and America. Phrenology, while not universally accepted, was hardly a fringe phenomenon of the era. George Combe would become the chief promoter of phrenology throughout the English-speaking world after he viewed a brain dissection by Spurzheim, which convinced him of phrenology’s merits.

 

The popularization of phrenology in the middle and working classes was due in part to the idea that scientific knowledge was important and an indication of sophistication and modernity. Cheap and plentiful pamphlets, as well as the growing popularity of scientific lectures as entertainment, also helped spread phrenology to the masses. Combe created a system of philosophy of the human mind that became popular with the masses because of its simplified principles and wide range of social applications that were in harmony with the liberal Victorian world view. George Combe’s book On the Constitution of Man and its Relationship to External Objects sold over 200,000 copies through nine editions. Combe also devoted a large portion of his book to reconciling religion and phrenology, which had long been a sticking point. Another reason for its popularity was that phrenology balanced between free will and determinism. A person’s inherent faculties were clear, and no faculty was viewed as evil, though the abuse of a faculty was. Phrenology allowed for self-improvement and upward mobility, while providing fodder for attacks on aristocratic privilege. Phrenology also had wide appeal because of its being a reformist philosophy, not a radical one. Phrenology was not limited to the common people, and both Queen Victoria and Prince Albert invited George Combe to read the heads of their children.

 

Phrenology came about at a time when scientific procedures and standards for acceptable evidence were still being codified. In the context of Victorian society, phrenology was a respectable scientific theory. The Phrenological Society of Edinburgh founded by George and Andrew Combe was an example of the credibility of phrenology at the time and included a number of extremely influential social reformers and intellectuals, including the publisher Robert Chambers, the astronomer John Pringle Nichol, the evolutionary environmentalist Hewett Cottrell Watson, and asylum reformer William A.F. Browne. In 1826, out of the 120 members of the Edinburgh society an estimated one third were from a medical background. By the 1840s there were more than 28 phrenological societies in London with over 1000 members. Another important scholar was Luigi Ferrarese, the leading Italian phrenologist. He advocated that governments should embrace phrenology as a scientific means of conquering many social ills, and his Memorie Risguardanti La Dottrina Frenologica (1836), is considered “one of the fundamental 19th century works in the field“.

 

Traditionally the mind had been studied through introspection. Phrenology provided an attractive, biological alternative that attempted to unite all mental phenomena using consistent biological terminology. Gall’s approach prepared the way for studying the mind in ways that would lead to the downfall of his own theories. Phrenology contributed to the development of physical anthropology, forensic medicine, knowledge of the nervous system and brain anatomy, as well as contributing to applied psychology. John Elliotson was a brilliant but erratic heart specialist who became a phrenologist in the 1840s. He was also a mesmerist and combined the two into something he called phrenomesmerism or phrenomagnetism. Changing behavior through mesmerism eventually won out in Elliotson’s hospital, putting phrenology in a subordinate role. Others amalgamated phrenology and mesmerism as well, such as the practical phrenologists Collyer and Joseph R. Buchanan. The benefit of combining mesmerism and phrenology was that the trance the patient was placed in was supposed to allow for the manipulation of his/her penchants and qualities. For example, if the organ of self-esteem was touched, the subject would take on a haughty expression.

 

Phrenology has been psychology’s great faux pas. – J.C. Flugel (1933)

 

Phrenology was mostly discredited as a scientific theory by the 1840s. This was due only in part to a growing amount of evidence against phrenology. Phrenologists had never been able to agree on the most basic mental organ numbers, going from 27 to over 40, and had difficulty locating the mental organs. Phrenologists relied on cranioscopic readings of the skull to find organ locations. Jean Pierre Flourens’ experiments on the brains of pigeons indicated that the loss of parts of the brain either caused no loss of function, or the loss of a completely different function than what had been attributed to it by phrenology. Flourens’ experiment, while not perfect, seemed to indicate that Gall’s supposed organs were imaginary. Scientists had also become disillusioned with phrenology since its exploitation by entrepreneurs among the middle and working classes. The popularization had resulted in the simplification of phrenology and the mixing in of principles of physiognomy, which had from the start been rejected by Gall as an indicator of personality. Phrenology from its inception was tainted by accusations of promoting materialism and atheism, and of being destructive of morality. These were all factors which led to the downfall of phrenology. Recent studies, using modern-day technology like magnetic resonance imaging, have further disproven phrenology’s claims.

 

During the early 20th century, a revival of interest in phrenology occurred, partly because of studies of evolution, criminology and anthropology (as pursued by Cesare Lombroso). The most famous British phrenologist of the 20th century was the London psychiatrist Bernard Hollander (1864-1934). His main works, The Mental Function of the Brain (1901) and Scientific Phrenology (1902), are an appraisal of Gall’s teachings. Hollander introduced a quantitative approach to the phrenological diagnosis, defining a method for measuring the skull, and comparing the measurements with statistical averages. In Belgium, Paul Bouts (1900-1999) began studying phrenology from a pedagogical background, using the phrenological analysis to define an individual pedagogy. Combining phrenology with typology and graphology, he coined a global approach known as psychognomy. Bouts, a Roman Catholic priest, became the main promoter of renewed 20th-century interest in phrenology and psychognomy in Belgium. He was also active in Brazil and Canada, where he founded institutes for characterology. His works Psychognomie and Les Grandioses Destinées individuelle et humaine dans la lumière de la Caractérologie et de l’Évolution cérébro-crânienne are considered standard works in the field. In the latter work, which examines the subject of paleoanthropology, Bouts developed a teleological and orthogenetical view on a perfecting evolution, from the paleo-encephalical skull shapes of prehistoric man, which he considered still prevalent in criminals and savages, towards a higher form of mankind, thus perpetuating phrenology’s problematic racializing of the human frame. Bouts died on March 7, 1999. His work has been continued by the Dutch foundation PPP (Per Pulchritudinem in Pulchritudine), operated by Anette Müller, one of Bouts’ students. During the 1930s, Belgian colonial authorities in Rwanda used phrenology to explain the so-called superiority of Tutsis over Hutus.

 

Is Lack of Sleep a Risk Factor for Alzheimer’s Disease?

 

According to an article published online in the Proceedings of the National Academy of Sciences (9 April 2018), losing just one night of sleep led to an immediate increase in beta-amyloid. The study is among the first to demonstrate that sleep may play an important role in human beta-amyloid clearance. Beta-amyloid is a metabolic waste product present in the fluid between brain cells. In Alzheimer’s disease, beta-amyloid proteins clump together to form amyloid plaques, a hallmark of the disease, negatively impacting communication between neurons. While acute sleep deprivation is known to elevate brain beta-amyloid levels in mice, less is known about the impact of sleep deprivation on beta-amyloid accumulation in the human brain.

 

To understand the possible link between beta-amyloid accumulation and sleep, the authors used positron emission tomography (PET) to scan the brains of 20 healthy subjects, ranging in age from 22 to 72, after a night of rested sleep and after sleep deprivation (being awake for about 31 hours). Results showed that beta-amyloid increased about 5% after losing a night of sleep in brain regions including the thalamus and hippocampus, regions especially vulnerable to damage in the early stages of Alzheimer’s disease.

 

In Alzheimer’s disease, beta-amyloid is estimated to increase about 43% in affected individuals relative to healthy older adults. However, it is unknown whether the increase in beta-amyloid in the study participants would subside after a night of rest. Interestingly, the study also found that study participants with larger increases in beta-amyloid reported worse mood after sleep deprivation. According to the authors, even though the sample was small, the study demonstrated the negative effect of sleep deprivation on beta-amyloid burden in the human brain and that future studies are needed to assess the generalizability to a larger and more diverse population. It is also important to note that the link between sleep disorders and Alzheimer’s risk is considered by many scientists to be “bidirectional,” since elevated beta-amyloid may also lead to sleep disturbances.

 
