Hans Gerhard Creutzfeldt MD

Hans Gerhard Creutzfeldt (June 2, 1885 – December 30, 1964) was a German neuropathologist who first described Creutzfeldt-Jakob disease. He was born in Harburg on the Elbe and died in Munich.

Photo credit: Unknown – http://www.sammlungen.hu-berlin.de/dokumente/11727/, Public Domain, https://commons.wikimedia.org/w/index.php?curid=4008658



Hans Gerhard Creutzfeldt was born into a medical family in Harburg, which was incorporated into Hamburg in 1937. In 1903, at the age of 18, Creutzfeldt was drafted into the German army and spent his service stationed in Kiel. Afterwards, he studied medicine at the Universities of Jena and Rostock, receiving his doctorate at the latter in 1909. Part of his practical training was undertaken at St. Georg Hospital in Hamburg. After qualification he sought adventure as a ship’s surgeon, voyaging across the Pacific Ocean and taking the opportunity to study local crafts, languages, and tropical plants. After returning to Germany, Creutzfeldt worked at the Neurological Institute in Frankfurt am Main, at the psychiatric-neurological clinics in Breslau, Kiel and Berlin, and at the Deutsche Forschungsanstalt für Psychiatrie in Munich. Creutzfeldt was habilitated at Kiel in 1920, and in 1925 became an extraordinary professor (Extraordinarius) of psychiatry and neurology. In 1938 he was appointed professor and director of the university psychiatric and neurological division in Kiel. Together with Alfons Maria Jakob, Creutzfeldt helped to characterize the neurodegenerative disease now known as Creutzfeldt-Jakob disease, in which the brain tissue develops holes and takes on a sponge-like texture. It is now known that this disease is caused by a type of infectious protein called a prion. Prions are misfolded proteins that propagate by converting their properly folded counterparts into the misfolded form.


In the Third Reich, Creutzfeldt became a patron member of Heinrich Himmler’s SS. Nevertheless, when World War II broke out, the 54-year-old Creutzfeldt remained unmoved by the Nazi regime and was able to save some people from death in concentration camps. He also managed to rescue almost all of his patients from being murdered under the Nazi Action T4 euthanasia program, an unusual feat, since most mental patients identified by T4 personnel were gassed or poisoned at dedicated euthanasia clinics such as the Hadamar Euthanasia Centre. During the war, bombing raids destroyed his home and clinic. After the war he was director of the University of Kiel for six months, before being dismissed by the British occupation forces. His efforts to rebuild the university led to a series of conflicts with the British, because he wanted to allow more former army officers to study there. In 1953 he moved on to Munich to work on scientific research commissioned by the Max Planck Society.


Creutzfeldt was married to Clara Sombart, a daughter of Werner Sombart. They had five children, among them Otto Detlev Creutzfeldt and Werner Creutzfeldt (1924-2006), a renowned German internist. Hans Gerhard Creutzfeldt died in 1964 in Munich.


As mentioned above, Creutzfeldt-Jakob disease (CJD) is a subacute spongiform encephalopathy caused by prions and involving the cerebral cortex, the basal ganglia and the spinal cord. Some of the clinical findings described in the first papers by Creutzfeldt and Jakob do not match current criteria for Creutzfeldt-Jakob disease, and it has been speculated that at least two of the patients in those initial studies were suffering from a different ailment. A study published in 1997 counted more than 100 cases of transmissible CJD worldwide, and new cases were continuing to appear at the time. The first report of suspected iatrogenic CJD was published in 1974. Animal experiments showed that corneas of infected animals could transmit CJD, and that the causative agent spreads along visual pathways. A second case of CJD associated with a corneal transplant was reported without details. In 1977, CJD transmission caused by silver electrodes previously used in the brain of a person with CJD was reported for the first time; transmission occurred despite decontamination of the electrodes with ethanol and formaldehyde. Retrospective studies identified four other cases likely of similar cause. The rate of transmission from a single contaminated instrument is unknown, although it is not 100%. In some cases, the exposure occurred weeks after the instruments were used on a person with CJD.


A review article published in 1979 indicated that 25 cases of CJD associated with dura mater grafts had occurred by that date in Australia, Canada, Germany, Italy, Japan, New Zealand, Spain, the United Kingdom, and the United States.

By 1985, a series of case reports in the United States showed that, when injected, human growth hormone extracted from cadaver pituitary glands could transmit CJD to humans. In 1992, it was recognized that human gonadotropin administered by injection could also transmit CJD from person to person. In 2004, a report published by Edinburgh doctors in the Lancet medical journal demonstrated that variant CJD (vCJD) could be transmitted by blood transfusion.


Stanley B. Prusiner of the University of California, San Francisco (UCSF) was awarded the Nobel Prize in Physiology or Medicine in 1997 “for his discovery of Prions – a new biological principle of infection.” However, Yale University neuropathologist Laura Manuelidis has challenged the prion protein (PrP) explanation for the disease. In January 2007, she and her colleagues reported that they had found a virus-like particle in naturally and experimentally infected animals: “The high infectivity of comparable, isolated virus-like particles that show no intrinsic PrP by antibody labeling, combined with their loss of infectivity when nucleic acid-protein complexes are disrupted, make it likely that these 25-nm particles are the causal TSE virions.”

Four Australians had been reported with CJD following transfusion as of 1997. There have been ten cases of healthcare-acquired CJD in Australia: five deaths following treatment with pituitary extract hormone for either infertility or short stature, with no further cases since 1991, and five deaths caused by dura mater grafting during brain surgery, in which the covering of the brain was repaired. There have been no other known healthcare-acquired CJD deaths in Australia. A case was reported in 1989 in a 25-year-old man from New Zealand who had also received a dura mater transplant. Five New Zealanders were confirmed to have died of the sporadic form of Creutzfeldt-Jakob disease (CJD) in 2012.


Researchers believe one in 2,000 people in the UK is a carrier of the abnormal prion protein linked to eating contaminated beef, the agent of vCJD. The survey provides the most robust prevalence measure to date, and it identifies abnormal prion protein across a wider age group than found previously and in all genotypes, indicating that “infection” may be relatively common. This study examined over 32,000 anonymous appendix samples. Of these, 16 samples were positive for abnormal prion protein, indicating an overall prevalence of 493 per million population, or about one in 2,000 people. No difference was seen between the birth cohorts examined (1941-60 and 1961-85) or between the sexes, and there was no apparent difference in abnormal prion prevalence across three broad geographical areas. Genetic testing of the 16 positive samples revealed a higher proportion of the valine homozygous (VV) genotype at codon 129 of the gene encoding the prion protein (PRNP) than in the general UK population. This also differs from the 177 patients with clinical vCJD, all of whom to date have been of the methionine homozygous (MM) genotype. The concern is that individuals with the VV genotype may be susceptible to developing the condition over longer incubation periods.
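The arithmetic behind those headline figures is easy to verify. A minimal sketch (the exact sample count of 32,441 is an assumption for illustration; the article says only “over 32,000”):

```python
# Back-of-the-envelope check of the appendix survey's prevalence figures:
# 16 positive samples out of roughly 32,000 examined.
positives = 16
samples = 32441  # assumed exact count; the article says "over 32,000"

prevalence = positives / samples          # fraction of samples positive
per_million = prevalence * 1_000_000      # expressed per million population
one_in_n = samples / positives            # "one in N people"

print(f"{per_million:.0f} per million, i.e. about 1 in {one_in_n:.0f} people")
```

With these assumed numbers the script reproduces the reported 493 per million, which rounds to the “one in 2,000” figure quoted in the press.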


In 1988, there was a confirmed death from CJD of a person from Manchester, New Hampshire, in the United States. Massachusetts General Hospital believed the patient acquired the disease from a surgical instrument at a podiatrist’s office. In September 2013, another patient in Manchester was posthumously determined to have died of the disease. The patient had undergone brain surgery at Catholic Medical Center three months before his death, and a surgical probe used in the procedure was subsequently reused in other operations. Public health officials identified thirteen patients at three hospitals who may have been exposed to the disease through the contaminated probe, but said the risk of anyone contracting CJD was “extremely low.” In January 2015, the former speaker of the Utah House of Representatives, Rebecca D. Lockhart, died of the disease within a few weeks of diagnosis. John Carroll, former editor of The Baltimore Sun and the Los Angeles Times, died of CJD in Kentucky in June 2015, after having been diagnosed in January. The American actress Barbara Tarbuck (General Hospital, American Horror Story) died of the disease on December 26, 2016.


An experimental treatment was given to a Northern Irish teenager, Jonathan Simms, beginning in January 2003. The medication, called pentosan polysulphate (PPS) and otherwise used to treat interstitial cystitis, is infused into the patient’s lateral ventricle within the brain. PPS does not seem to stop the disease from progressing, and both brain function and tissue continue to be lost. However, the treatment is alleged to slow the progression of the otherwise untreatable disease, and may have contributed to the longer-than-expected survival of the seven patients studied. Simms died in 2011. The CJD Therapy Advisory Group to the UK Health Departments advises that the data are not sufficient to support claims that pentosan polysulphate is an effective treatment, and suggests that further research in animal models is appropriate. A 2007 review of the treatment of 26 patients with PPS found no proof of efficacy because of the lack of accepted objective criteria. Scientists have investigated using RNA interference to slow the progression of scrapie in mice; the RNA blocks production of the protein that the CJD process transforms into prions. This research is unlikely to lead to a human therapy for many years. Both amphotericin B and doxorubicin have been investigated as potentially effective against CJD, but as yet there is no strong evidence that either drug halts the disease. Other drugs have been studied as well, but none have proved effective. However, anticonvulsants and anxiolytic agents, such as valproate or a benzodiazepine, may be administered to relieve associated symptoms.


Scientists from the University of California, San Francisco are currently running a treatment trial for sporadic CJD using quinacrine, a medicine originally created for malaria. Pilot studies showed that quinacrine permanently cleared abnormal prion proteins from cell cultures, but results of the clinical study have not yet been published. The efficacy of quinacrine was also assessed in a rigorous clinical trial in the UK, published in Lancet Neurology, which concluded that quinacrine had no measurable effect on the clinical course of CJD. In a 2013 paper published in the Proceedings of the National Academy of Sciences, scientists from The Scripps Research Institute reported that astemizole, a medication already approved for human use, has anti-prion activity and may lead to a treatment for Creutzfeldt-Jakob disease.


A Short History of Pills

An old Cadmach rotary tablet press

Photo credit: Slashme at the English language Wikipedia, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=21895660



Pills date back to roughly 1500 BCE. They were presumably invented so that measured amounts of a medicinal substance could be delivered to a patient. Around 4,000 years ago, medicines were generally liquid preparations: an inscription on an Assyrian clay tablet instructs the user to pulverize various seeds, plant resins and leaves together, then dissolve them in beer. Pills are first referenced in ancient Egyptian writings. One famous set of papyri is filled with medical remedies, including pills made from bread dough, honey or grease. Medicinal plants and other active ingredients would be reduced to powders and mixed with these substances, and then little balls, or pills, would be formed with the fingers. Early ingredients of pills included saffron, myrrh, cinnamon, tree resins and many other botanicals. Pills came in various sizes and in flat, round and other assorted shapes. As far back as 500 BCE, some were even trademarked with special indentations.


Hippocrates knew about the curative powers of willow bark, and in ancient Greece the round balls or other shapes were called katapotia (meaning “something to be swallowed”). It was the Roman scholar Pliny (23-79 CE) who first coined the word “pilula.”


Some early pills still exist in museums, such as a famous one dating from 500 BCE known as Terra Sigillata, consisting of clay from a particular island that was mixed with goat’s blood and then shaped into pills. Terra Sigillata was supposedly good for practically every ailment, including dysentery, ulcers and gonorrhea. A pill was originally defined as a small, round, solid pharmaceutical oral dosage form of medication. The oldest known pills were made of the zinc carbonates hydrozincite and smithsonite; they were used for sore eyes, and were found aboard the Roman ship Relitto del Pozzino, which wrecked around 140 BCE. Today, pills include tablets, capsules, and variants thereof like caplets – essentially any solid form of medication colloquially falls into the pill category. Pieces of ancient Roman pill-making equipment survive, such as a carved stone in the British Museum. The stone has long flat grooves into which the pill maker would press clay or other substances to make long, snaky strings. The pill maker would then pry the strings out and cut them into discs to form pills – much the way one cuts dough for cookies.


During medieval times, people would coat their pills with slimy plant substances and other materials so they were easier to swallow and tasted less bitter. Some pills were rolled in spices, and later pills began to be coated with gold and silver. Silver, unfortunately, rendered the pills largely inert, since they would pass right through the digestive tract without releasing any of their medicinal compounds. Gilding of pills continued well into the 19th century. Medicines in pill form were popular in 17th-century England and thereafter, and pill manufacturers were granted special patent rights from the king for their top-secret formulas. One famous patented product from the 18th century was “Hooper’s Female Pills,” which were guaranteed to contain “the best purging and anti-hysterik ingredients.” And pills, of course, made their way over to the still-new United States, which had its own set of patent-protected preparations, courtesy of the U.S. Patent Office, including Chase’s Kidney-Liver Pills, Cheeseman’s Female Regulating Pills and Williams’ Pink Pills for Pale People.


The old-fashioned, roll-and-cut kinds of pills had a drawback: their preparation required moisture, and early researchers were learning that this moisture could deactivate the drugs they contained. In the 1800s, innovators began sugar-coating and gelatin-coating pills; gelatin capsules were invented around this time, as was the ability to compress tablets. In 1843, the English scientist William Brockedon invented a different pill form: powder was placed in a tube and compressed with a mallet until it solidified. Eventually, this invention became popular. Holloway’s Pills were perhaps the most famous of the patent medicines, and were popular enough to make Thomas Holloway a wealthy man. Testimonials to the value of the pills can be found from this period in newspapers all over the British Empire, including India, Australia and the North American colonies. The range of diseases the pills claimed to cure is astonishing: along with Holloway’s Ointment, Holloway’s Pills could supposedly treat almost anything. Analysis of the pills showed that they contained aloe, myrrh, and saffron. While probably not harmful, these pills would be unlikely to have the claimed effects. The Holloway advertising changed from time to time, listing a variety of dangers that the pills could prevent. An example, for “Children’s Complaints”:


“It is not generally known, but such is the fact that children require medicine oftener than their parents. Three-fourths of the children die before they attain the age of eight years. Let their mothers, then, be wise, and give to their children small doses of these invaluable pills once or twice every week… The gross humors that are constantly floating about in the blood of children, the forerunners of so many complaints, will thus be expelled, and the lives of thousands saved and preserved to their parents.”


Pills have always been difficult to swallow, and efforts have long been made to make them go down easier. As noted above, medieval people coated pills with slippery plant substances, and as recently as the 19th century pills were gilded in gold and silver, although this often meant they would pass through the digestive tract with no effect. In the 1800s sugar coatings and gelatin coatings were introduced, as were gelatin capsules. In 1843, William Brockedon was granted a patent for a machine capable of “Shaping Pills, Lozenges and Black Lead by Pressure in Dies”; the device could compress powder into a tablet without the use of an adhesive. In the tablet-pressing process, it is important that all ingredients be fairly dry, powdered or granular, somewhat uniform in particle size, and freely flowing. Powders of mixed particle size segregate during manufacturing operations because of their different densities, which can result in tablets with poor uniformity of drug, or active pharmaceutical ingredient (API), content; granulation helps prevent this. Content uniformity ensures that the same API dose is delivered with each tablet. Some APIs may be tableted as pure substances, but this is rarely the case; most formulations include excipients. Normally, a pharmacologically inactive ingredient (excipient) termed a binder is added to help hold the tablet together and give it strength. A wide variety of binders may be used, common ones including lactose, dibasic calcium phosphate, sucrose, corn (maize) starch, microcrystalline cellulose, povidone (polyvinylpyrrolidone) and modified celluloses (for example hydroxypropyl methylcellulose and hydroxyethylcellulose).


Often, an ingredient is also needed to act as a disintegrant, aiding tablet dispersion once swallowed and releasing the API for absorption. Some binders, such as starch and cellulose, are also excellent disintegrants. Tablets are simple and convenient to use. They provide an accurately measured dosage of the active ingredient in a convenient portable package, and can be designed to protect unstable medications or disguise unpalatable ingredients. Colored coatings, embossed markings and printing can be used to aid tablet recognition. Manufacturing processes and techniques can give tablets special properties, for example sustained-release or fast-dissolving formulations. Some drugs, however, are unsuitable for administration by the oral route. For example, protein drugs such as insulin may be denatured by stomach acids; such drugs cannot be made into tablets. Some drugs may be deactivated by the liver when they are carried there from the gastrointestinal tract by the hepatic portal vein (the “first pass effect”), making them unsuitable for oral use. Drugs that can be taken sublingually are absorbed through the oral mucosa, so they bypass the liver and are less susceptible to the first pass effect. The oral bioavailability of some drugs may be low due to poor absorption from the gastrointestinal tract; such drugs may need to be given in very high doses or by injection. For drugs that need rapid onset, or that have severe side effects, the oral route may also be unsuitable. For example, salbutamol, used to treat problems in the respiratory system, can affect the heart and circulation if taken orally; these effects are greatly reduced by inhaling smaller doses directly to the required site of action. Finally, a proportion of the population has difficulty swallowing tablets, either because they simply dislike taking them or because a medical condition makes it difficult (dysphagia, vomiting). In such instances it may be better to consider an alternative dosage form or administration route.


Tablets can be made in virtually any shape, although the requirements of patients and tableting machines mean that most are round, oval or capsule shaped. More unusual shapes have been manufactured, but patients find them harder to swallow, and they are more vulnerable to chipping and manufacturing problems. Tablet diameter and shape are determined by the machine tooling used to produce them: a die plus an upper and a lower punch are required, a set known as a station of tooling. The thickness is determined by the amount of tablet material and the position of the punches in relation to each other during compression. The shorter the distance between the punches (the thinner the tablet), the greater the pressure applied during compression, and often the harder the resulting tablet. Tablets need to be hard enough that they do not break up in the bottle, yet they must still disintegrate in the gastrointestinal tract. They must also be strong enough to resist the stresses of packaging, shipping and handling by the pharmacist and patient. The mechanical strength of tablets is assessed using a combination of (i) simple failure and erosion tests and (ii) more sophisticated engineering tests. The simpler tests are often used for quality-control purposes, whereas the more complex tests are used during the design of the formulation and manufacturing process in the research and development phase. Standards for tablet properties are published in the various international pharmacopeias (USP/NF, EP, JP, etc.). The hardness of tablets is the principal measure of mechanical strength, and is tested using a tablet hardness tester. The units for hardness have evolved since the 1930s, but hardness is commonly measured in kilograms per square centimeter. Models of tester include the Monsanto (or Stokes) hardness tester from 1930, the Pfizer hardness tester from 1950, the Strong-Cobb hardness tester and the Heberlein (or Schleuniger) hardness tester.


Lubricants prevent ingredients from clumping together and from sticking to the tablet punches or capsule-filling machine. Lubricants also ensure that tablet formation and ejection occur with low friction between the solid and the die wall, as well as between granules, which helps in uniform filling of the die. Common minerals like talc or silica, and fats such as vegetable stearin, magnesium stearate or stearic acid, are the most frequently used lubricants in tablets or hard gelatin capsules. In the tablet-pressing process, the main requirement is that the appropriate amount of active ingredient end up in each tablet, so all the ingredients must be well mixed. If a sufficiently homogeneous mix of the components cannot be obtained with simple blending, the ingredients must be granulated prior to compression to assure an even distribution of the active compound in the final tablet. Two basic techniques are used to granulate powders for compression into a tablet: wet granulation and dry granulation. Powders that can be mixed well do not require granulation and can be compressed into tablets through direct compression.
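The choice among the three processing routes described above can be summarized as a simple decision rule. A minimal, illustrative sketch only: the input flags are assumptions, and real formulation work weighs many more factors (dose, particle size distribution, compactibility, cost).

```python
def choose_processing_route(blends_homogeneously: bool,
                            flows_freely: bool,
                            heat_or_moisture_sensitive: bool) -> str:
    """Pick a tablet-processing route from coarse powder properties.

    Illustrative decision sketch; a real formulator considers far more.
    """
    if blends_homogeneously and flows_freely:
        # Simple blending suffices: no granulation step is needed.
        return "direct compression"
    if heat_or_moisture_sensitive:
        # Dry granulation avoids the liquid and drying steps of wet granulation.
        return "dry granulation"
    # Wet granulation is the traditional default for poorly flowing blends.
    return "wet granulation"

print(choose_processing_route(True, True, False))   # direct compression
print(choose_processing_route(False, False, True))  # dry granulation
print(choose_processing_route(False, False, False)) # wet granulation
```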


Combined oral contraceptive pills were nicknamed “the pill” in the 1960s





Frederic Chopin’s Cause of Death

Chopin plays for the Radziwills, 1829 (painting by Henryk Siemiradzki, 1887)

Credit: Henryk Siemiradzki – images.fineartamerica.com, Public Domain, https://commons.wikimedia.org/w/index.php?curid=1086097


In 2014, a team of medical experts received permission to remove the preserved heart of the Polish genius Frederic Chopin from the Holy Cross Church in Warsaw, where it had ultimately been interred, and examine it for clues that might shed light on the mysterious ailment that led to Chopin’s death at the age of 39. The diagnosis, published in the American Journal of Medicine this past week, is the latest and most convincing foray into the long-running dispute over the likely cause of Chopin’s slow decline and death in his 30s. The paper suggests that the composer died of pericarditis, a complication of chronic tuberculosis. Other suggested causes of his debilitation and death have included the inherited disease cystic fibrosis; alpha-1-antitrypsin deficiency, a relatively rare genetic ailment that leaves individuals prone to lung infections; and mitral stenosis, a narrowing of the heart valves. The analysis and diagnosis were based on the great composer’s heart, stored in a jar of cognac for nearly 170 years.


An autopsy was performed at the time to try to determine the cause of the 39-year-old’s death. His heart was removed and later stored in a jar of cognac, then interred in a church pillar in Poland. When the researchers recently examined the jar containing Chopin’s heart – kept in the crypt of the Holy Cross church in Warsaw – they noted that the heart was covered with a fine coating of white fibrous material. In addition, small lesions were visible – the telltale signs of serious complications of tuberculosis, the team concluded. “We didn’t open the jar,” team leader Professor Michael Witt of the Polish Academy of Sciences told the Observer. “But from the state of the heart we can say, with high probability, that Chopin suffered from tuberculosis, while the complication pericarditis was probably the immediate cause of his death.”


The new study is the latest chapter in the strange story of Chopin’s heart. After the composer died in October 1849 in Paris, the rest of his remains were buried in the city’s Pere Lachaise cemetery, also the last resting place of Marcel Proust, Oscar Wilde and Jim Morrison. However, his status as a Polish national hero ensured that his heart became embroiled in controversy. Chopin’s health had begun to falter in the late 1830s, ultimately making it difficult for him to continue composing music. Over the years, a number of diseases have been named as the culprit of his physical decline, from cystic fibrosis to alpha-1-antitrypsin deficiency. According to a 2014 article by Alex Ross of the New Yorker, Ludwika Jedrzejewicz, Chopin’s eldest sister, smuggled the organ past Austrian and Russian authorities on her way to Poland, hiding the jar that held the heart beneath her cloak. The jar was subsequently encased in a wooden urn and buried beneath a monument at the Holy Cross Church.


In the early 20th century, Chopin, as one of Poland’s most famous native sons, became the focus of nationalist fervor in the country. During the World War II era, the Nazi occupiers recognized the symbolic significance of Chopin’s legacy and sought to block the performance of his music. His heart, however, was removed from the Holy Cross and given to the SS officer Heinz Reinefarth, who claimed to admire the composer and kept the heart safe at Nazi headquarters in Poland. The organ was returned to Holy Cross in 1945, where it remained until church officials and medical researchers collaborated to dig it up. The examination of the heart by Professor Witt and colleagues was the first since 1945. “We found it is still perfectly sealed in the jar,” said Witt. “Some people still want to open it in order to take tissue samples to do DNA tests to support their ideas that Chopin had some kind of genetic condition. That would be absolutely wrong. It could destroy the heart and, in any case, I am quite sure we now know what killed Chopin.” The recent examination of Chopin’s heart is unlikely to quell discussion over the cause of his death. As Nature reports, the organ has never been tested for cystic fibrosis, another proposed cause of Chopin’s demise, and some scholars have cast doubt on whether the heart belonged to Chopin at all. But for now, the (possible) relic of the composer can rest undisturbed. Researchers will not be permitted to examine the heart again for another 50 years.

Sources: The Guardian; The Smithsonian; Wikipedia

Read more: http://www.smithsonianmag.com/smart-news/chopins-preserved-heart-may-offer-clues-about-his-death-180967168/#1mR2vDjK42vsapca.99


Chopin on His Deathbed, by Teofil Kwiatkowski, 1849, commissioned by Jane Stirling. Chopin is in the presence of (from left) Aleksander Jelowicki, Chopin’s sister Ludwika, Princess Marcelina Czartoryska, Wojciech Grzymala, Kwiatkowski. Credit: Teofil Kwiatkowski – www.psm.vin.pl, Public Domain, https://commons.wikimedia.org/w/index.php?curid=9613090


Funerary monument on a pillar in Holy Cross Church, Warsaw, enclosing Chopin’s heart.

Photo credit: Nihil novi – Own work, Public Domain, https://commons.wikimedia.org/w/index.php?curid=2704160


Chopin’s grave in Paris

Photo credit: Auguste Clesinger – Marcin L., 26 December 2005, Public Domain, https://commons.wikimedia.org/w/index.php?curid=479220


Here are some favorite Chopin masterpieces.

Frederic Chopin – Prelude in E Minor (Op. 28, No. 4)

Chopin Nocturne C sharp minor (1830) (Arjen Seinen).

Chopin Ballade in G Minor Scene; Pianist, Wladyslaw Szpilman

Chopin, Nocturne in C sharp Minor (1830); Pianist, Jan Lisiecki

Chopin Nocturne No. 20; Pianist, Wladyslaw Szpilman

Chopin Piano Concerto No. 1 in E Minor; Pianist, Lang Lang


Approximately 65 Years Ago, Eugene Aserinsky Discovered REM Sleep

REM sleep, outlined in red, above. Below it are slow EEG waveforms of brain activity during non-REM sleep. – By en:User:MrSandman – Own work, Public Domain, https://commons.wikimedia.org/w/index.php?curid=452350


It was Sigmund Freud who stated, “Dreams are the royal road to the unconscious.”


Eugene Aserinsky (May 6, 1921 – July 22, 1998), a pioneer in sleep research, was a graduate student at the University of Chicago in 1953 when he discovered REM sleep. The son of a dentist of Russian-Jewish descent, Aserinsky, like many great scientists, came from an immigrant family. He made his discovery after hours spent studying the eyelids of sleeping subjects. Aserinsky and his PhD adviser, Nathaniel Kleitman, went on to demonstrate that this “rapid eye movement” was correlated with dreaming and a general increase in brain activity. Using the electroencephalograph, they pioneered procedures that have since been applied to thousands of volunteers. Because of these discoveries, Aserinsky and Kleitman are generally considered the founders of modern sleep research.


In 1953, for his Ph.D. in physiology at the University of Chicago, Dr. Aserinsky produced his ground-breaking thesis, “Eye Movements During Sleep.” His discovery of rapid eye movement, or R.E.M. – the periodic, rapid, jerky movement of the eyeballs under the lids during the stages of sleep associated with dreaming – showed that the brain was in a state of some alertness for about 22% of total sleep time. In a long career, he taught at Jefferson Medical College in Philadelphia, Marshall University Medical School and West Virginia University.


Eugene Aserinsky died on July 22, 1998, when his car hit a tree north of San Diego. He was 77 and lived in Escondido, California. Nathaniel Kleitman lived to be 104 years old.


Editor’s note: All the Eugene Aserinsky sources we searched through were quite limited – dry facts only. Then we discovered a fascinating write-up in Smithsonian Magazine by Chip Brown, such a compelling account of Eugene Aserinsky that we have included the whole article below. https://www.smithsonianmag.com/science-nature/the-stubborn-scientist-who-unraveled-a-mystery-of-the-night-91514538/


Night after night Eugene Aserinsky had been working late. He’d dragged an ancient brain-wave machine, an Offner Dynograph, from the basement to the physiology lab on the second floor of Abbott Hall at the University of Chicago. He had tinkered with it long enough to think it might not be totally unreliable. And now, late one December evening in 1951, his 8-year-old son, Armond, came over to the lab and sat patiently on an Army cot while his father scrubbed his scalp and the skin around his eyes with acetone, taped electrodes to the boy’s head and plugged the leads into a switch box over the bed. From the adjacent room, Aserinsky calibrated the machine, telling Armond to look left, right, up and down. The ink pens jumped in concert with the boy’s eyes. And then it was lights out, the sharp smell of acetone lingering in the darkness. Armond fell asleep; his father tried not to. Sustained by pretzels and coffee, Aserinsky sat at a desk under the hellish red eyes of a gargoyle-shaped lamp. He was 30 years old, a trim, handsome man of medium height, with black hair, a mustache, blue eyes and the mien of a bullfighter. When he was not in his lab coat, he usually wore a bow tie and a dark suit. He was a graduate student in physiology, and his future was riding on this research. He had nothing but a high school degree to fall back on. His wife, Sylvia, was pregnant with their second child. They lived on campus in a converted Army barracks heated by a kerosene stove. Money was so tight Aserinsky would eventually have to accept a small loan from his dissertation advisor, Nathaniel Kleitman, and then be obliged to feign enthusiasm for the distinguished man’s suggestion that he economize by eating chicken necks.


The hours crept by in the spooky gray-stone gloom of Abbott Hall. While the long banner of graph paper unfurled, Aserinsky noticed that the pens tracking his son’s eye movements – as well as the pens registering brain activity – were swinging back and forth, suggesting Armond was alert and looking around. Aserinsky went in to check on his son, expecting to find him wide awake. But Armond’s eyes were closed; the boy was fast asleep. What was going on? Yet another problem with the infernal machine? Aserinsky didn’t know what to think, standing in bewildered excitement, on the threshold of a great discovery.


The existence of rapid eye movement (REM) and its correlation with dreaming was announced 50 years ago last month in a brief, little-noted report in the journal Science. The two-page paper is a fine example of the maxim that the eye can see only what the mind knows: for thousands of years the physical clues of REM sleep were baldly visible to anyone who ever gazed at the eyelids of a napping child or studied the twitching paws of a sleeping dog. The association of a certain stage of sleep with dreaming might have been described by any number of observant cave men; in fact, if the 17,000-year-old Lascaux cave painting of a presumably dreaming Cro-Magnon hunter with an erect penis is any indication, maybe it was. But scientists had long been blinkered by preconceptions about the sleeping brain. It remains an astonishing anachronism in the history of science that Watson and Crick unraveled the structure of DNA before virtually anything was known about the physiological condition in which people spend one-third of their lives. As Tom Roth, the former editor of the journal Sleep, put it: “It’s analogous to going to Mars with a third of the Earth’s surface still unexplored.” The REM state is so important that some scientists have designated it a “third state of being” (after wakefulness and sleep), yet the phenomenon itself remained hidden in plain sight until September 1953, when the experiments conducted in Chicago by Aserinsky were published.


His now-classic paper, coauthored by advisor Kleitman, was less important for what it revealed than what it began. REM opened the terra incognita of the sleeping brain to scientific exploration. Before REM, it was assumed that sleep was a passive state; absent stimulation, the brain simply switched off at night like a desk lamp. After REM, scientists saw that the sleeping brain actually cycled between two distinct electrical and biochemical climates – one characterized by deep, slow-wave sleep, which is sometimes called “quiet sleep” and is now known as non-REM or NREM sleep, and the other characterized by REM sleep, also sometimes called “active” or “paradoxical” sleep. The mind in REM sleep teems with vivid dreams; some brain structures consume oxygen and glucose at rates equal to or higher than in waking. The surprising implication is that the brain, which generates and evidently benefits from sleep, seems to be too busy to get any sleep itself.


The discovery of REM launched a new branch of medicine, leading to the diagnosis and treatment of sleep disorders that afflict tens of millions of people. It also changed the way we view our dreams and ourselves. It shifted scientists’ focus from the dreaming person to the dreaming brain, and inspired new models in which the chimerical dramas of the night were said to reflect random neural fireworks rather than the hidden intentions of unconscious conflict or the escapades of disembodied souls. By showing that the brain cycles through various neurodynamic phases, the discovery of REM underscored the view that the “self” is not a fixed state but reflects fluctuating brain chemistry and electrical activity. Many researchers continue to hope that REM may yet provide a link between the physical activity of the brain during a dream and the experience of dreaming itself. It’s hard to overestimate the importance of Aserinsky’s breakthrough, said Bert States, an emeritus professor of dramatic arts at the University of California at Santa Barbara and the author of three books on dreams and dreaming: “The discovery of REM sleep was just about as significant to the study of cognition as the invention of the telescope was to the study of the stars.”


In 1950, when Aserinsky knocked on Nathaniel Kleitman’s office door, Kleitman, then 55, was considered the “father of modern sleep research.” A Russian emigre, he had received a doctorate from the University of Chicago in 1923 and joined the faculty two years later. There he set up the world’s first sleep lab. The cot where research subjects slept was pitched under a metal hood formerly used to suck out noxious lab fumes. At the time, few scientists were interested in the subject. Despite research on the electrical activity of the brain in the late 1920s, the understanding of sleep hadn’t advanced much beyond the ancient Greeks, who viewed Hypnos, the god of sleep, as the brother of Thanatos, the god of death. Sleep was what happened when you turned out the lights and stopped the influx of sensation. Sleep was what the brain lapsed into, not what it actively constructed. On the face of it, dull stuff.


Kleitman was intrigued nonetheless, and began to explore the physiology of the body’s basic rest-activity cycle. A painstaking researcher, he once stayed up 180 hours straight to appraise the effects of sleep deprivation on himself. In 1938, he and fellow researcher Bruce Richardson moved into Mammoth Cave in Kentucky for more than a month to study fluctuations in their body temperatures and other darkness-engendered changes in their normal sleep-wake cycle – pioneering work in the now booming field of circadian rhythm research. Kleitman backed his fieldwork with formidable scholarship. When he published his landmark book Sleep and Wakefulness in 1939, he apologized for being unable to read in any language other than Russian, English, German, French and Italian. At the office door, Aserinsky found a man with “a grey head, a grey complexion and a grey smock.” As the younger scientist wrote years later, “there was no joy in this initial encounter for either of us. For my part I recognized Kleitman as the most distinguished sleep researcher in the world. Unfortunately, sleep was perhaps the least desirable of the scientific areas I wished to pursue.”


Aserinsky had grown up in Brooklyn in a Yiddish- and Russian-speaking household. His mother died when he was 12, and he was left in the care of his father, Boris, a dentist who loved to gamble. Boris often had his son sit in on pinochle hands if the table was a player short. Meals were catch as catch can. Aserinsky’s son, Armond, recalled: “Dad once told me he said to his father, ‘Pop, I’m hungry,’ and his father said, ‘I’m not hungry, how can you be hungry?’” Eugene graduated from public high school at the age of 16 and for the next 12 years knocked about in search of his métier. At Brooklyn College, he took courses in social science, Spanish and premedical studies but never received a degree. He enrolled at the University of Maryland dental school only to discover that he hated teeth. He kept the books for an ice company in Baltimore. He served as a social worker in the Maryland state employment office. Though he was legally blind in his right eye, he did a stint in the U.S. Army as a high explosives handler. By 1949, Aserinsky, married and with a 6-year-old son, was looking to take advantage of the G.I. Bill of Rights to launch a science career. He aced the entrance exams at the University of Chicago and, though he lacked an undergraduate degree, persuaded the admissions office to accept him as a graduate student. “My father was courtly, intelligent and intensely driven,” says Armond Aserinsky, 60, now a clinical psychologist in North Wales, Pennsylvania. “He could be extremely charming, and he had a fine scientific mind, but he had all kinds of conflicts with authority. He always wore black suits. I once asked him, ‘Dad, how come you never wear a sports jacket?’ He looked at me and said, ‘I’m not a sport.’”


Kleitman’s first idea was to have Aserinsky test a recent claim that the rate of blinking could predict the onset of sleep. But after a number of vexing weeks trying to concoct a way to measure blink rates, Aserinsky confessed his lack of progress. Kleitman proposed that Aserinsky observe infants while they slept and study what their eyelids did. So he sat by cribs for hours but found that it was difficult to differentiate eyelid movements from eyeball movements. Once again he knocked on Kleitman’s door, something he was loath to do because of Kleitman’s austere and formal air. (Ten years after their famous paper was published, Kleitman began a letter to his colleague and coauthor, “Dear Aserinsky.”) Aserinsky had the idea of studying all eye movements in sleeping infants, and with Kleitman’s approval embarked on a new line of inquiry – one that, he would later confess, was “about as exciting as warm milk.” Significantly, he did not at first “see” REM, which is obvious if you know to look for it. Over months of monotonous observations, he initially discerned a 20-minute period in each infant’s sleep cycle in which there was no eye movement at all, after which the babies usually woke up. He learned to exploit the observation. During such periods, the fatigued researcher was able to nap himself, certain he would not miss any important data. And he was also able to impress mothers hovering near the cribs by telling them when their babies would wake up. “The mothers were invariably amazed at the accuracy of my prediction and equally pleased by my impending departure,” he once wrote.


At home, Aserinsky was under considerable pressure. His daughter, Jill, was born in April 1952. His wife, Sylvia, suffered from bouts of mania and depression. Aserinsky couldn’t even afford the rent on the typewriter he leased to draft his dissertation. “We were so poor my father once stole some potatoes so we would have something to eat,” recalls Jill Buckley, now 51 and a lawyer in Pismo Beach, California, for the American Society for the Prevention of Cruelty to Animals. “I think he saw himself as a kind of Don Quixote. Ninety percent of what drove him was curiosity – wanting to know. We had a set of Collier’s Encyclopedias, and my father read every volume.” After studying babies, Aserinsky set out to study sleeping adults. At the time, no scientist had ever made all-night continuous measurements of brain-wave activity. Given the thinking of the era – that sleep was a featureless neurological desert – it was pointless to squander thousands of feet of expensive graph paper making electroencephalogram (EEG) recordings. Aserinsky’s decision to do so, combined with his adapting the balky Offner Dynograph machine to register eye movements during sleep, led to the breakthrough. His son, Armond, liked to hang out at the lab because it meant spending time with his father. “I remember going into the lab for the night,” Armond says. “I knew the machine was harmless. I knew it didn’t read my mind. The setup took a long time. We had to work out some things. It was a long schlep to the bathroom down the hall, so we kept a bottle by the bed.” Aserinsky did a second nightlong sleep study of Armond with the same results – again the pens traced sharp jerky lines previously associated only with eye movements during wakefulness. As Aserinsky recruited other subjects, he was growing confident that his machine was not fabricating these phenomena, but could it be picking up activity from the nearby muscles of the inner ear?
Was it possible the sleeping subjects were waking up but just not opening their eyes? “In one of the earliest sleep sessions, I went into the sleep chamber and directly observed the eyes through the lids at the time that the sporadic eye movement deflections appeared on the polygraph record,” he would recall in 1996 in the Journal of the History of the Neurosciences. “The eyes were moving vigorously but the subject did not respond to my vocalization. There was no doubt whatsoever that the subject was asleep despite the EEG that suggested a waking state.” By the spring of 1952, a “flabbergasted” Aserinsky was certain he had stumbled onto something new and unknown. “The question was, what was triggering these eye movements. What do they mean?” he recalled in a 1992 interview with the Journal of NIH Research. In the fall of 1952, he began a series of studies with a more reliable EEG machine, running more than 50 sleep sessions on some two dozen subjects. The charts confirmed his initial findings. He thought of calling the phenomena “jerky eye movements,” but decided against it. He didn’t want critics to ridicule his findings by playing off the word “jerk.”


Aserinsky went on to find that heart rates increased an average of 10% and respiration went up 20% during REM; the phase began a certain amount of time after the onset of sleep; and sleepers could have multiple periods of REM during the night. He linked REM interludes with increased body movement and particular brain waves that appear in waking. Most amazingly, by rousing people from sleep during REM periods, he found that rapid eye movements were correlated with the recall of dreams – with, as he noted in his dissertation, “remarkably vivid visual imagery.” He later wrote, “The possibility that these eye movements might be associated with dreaming did not arise as a lightning stroke of insight. An association of the eyes with dreaming is deeply ingrained in the unscientific literature and can be categorized as common knowledge. It was Edgar Allan Poe who anthropomorphized the raven, ‘and his eyes have all the seeming of a demon’s that is dreaming.’”


Aserinsky had little patience for Freudian dream theory, but he wondered if the eyes moving during sleep were essentially watching dreams unfold. To test that possibility, he persuaded a blind undergraduate to come into the lab for the night. The young man brought his Seeing Eye dog. “As the hours passed I noticed at one point that the eye channels were a little more active than previously and that conceivably he was in a REM state,” Aserinsky wrote. “It was imperative that I examine his eyes directly while he slept. Very carefully I opened the door to the darkened sleeping chamber so as not to awaken the subject. Suddenly, there was a low menacing growl from near the bed followed by a general commotion which instantaneously reminded me that I had completely forgotten about the dog. By this time the animal took on the proportions of a wolf, and I immediately terminated the session, foreclosing any further exploration along this avenue.” (Other researchers would later confirm that blind people do indeed experience REM.) In any event, Aserinsky wasn’t much interested in the meaning of dreams, said his daughter Jill, adding: “He was a pure research scientist. It always irritated him when people wanted him to interpret their dreams.”


But a future colleague of Aserinsky’s was intrigued. William Dement was a medical student at Chicago, and in the fall of 1952 Kleitman assigned him to help Aserinsky with his overnight sleep studies. Dement recounted his excitement in his 1999 book, The Promise of Sleep. “Aserinsky told me about what he had been seeing in the sleep lab and then threw in the kicker that really hooked me: ‘Dr. Kleitman and I think these eye movements might be related to dreaming.’ For a student interested in psychiatry, this offhand comment was more stunning than if he had just offered me a winning lottery ticket. It was as if he told me, ‘We found this old map to something called the Fountain of Youth.’” By Aserinsky’s account, Dement ran five overnight sessions for him starting in January 1953. With a camera Kleitman had obtained, Dement and Aserinsky took 16-millimeter movie footage of subjects in REM sleep, one of whom was a young medical student named Faylon Brunemeier, today a retired ophthalmologist living in Northern California. They were paying three dollars a night, he recalled, “and that was a lot to an impecunious medical student.” Kleitman had barred women as sleep study subjects, fearing the possibility of scandal, but Dement wheedled permission to wire up his sweetheart, a student named Pamela Vickers. The only provision was that Aserinsky had to be on hand to “chaperon” the session. While the sleep-deprived Aserinsky passed out on the lab couch, Dement documented that Vickers, too, experienced REM. Next, Dement says he recruited three other female subjects, including Elaine May, then a student at the University of Chicago. Even if she had not become famous a few years later as part of the comedy team Nichols and May, and had not gone on to write Heaven Can Wait and other movies, she would still have a measure of fame in the annals of sleep science.


From 1955 to 1957, Dement published studies with Kleitman establishing the correlation between REM sleep and dreaming. Dement went on to help organize the first sleep research society and started the world’s first sleep clinic at Stanford in 1970. With a collaborator, Howard Roffwarg, a psychiatrist now at the University of Mississippi Medical Center, Dement showed that even a 7-month-old premature infant experiences REM, suggesting that REM may occur in the womb. Dement’s colony of dogs with narcolepsy – a condition of uncontrollable sleep – shed light on the physiological basis of the disorder, which in people had long been attributed to psychological disturbances. Dement became such an evangelist about the dangers of undiagnosed sleep disorders that he once approached the managers of the rock band R.E.M., seeking to enlist the group for a fundraising concert. The musicians brushed him off with a shaggy-dog story about the acronym standing for “retired English majors.”


When Aserinsky left the University of Chicago, in 1953, he turned his back on sleep research. He went to the University of Washington in Seattle and for a year studied the effects of electrical currents on salmon. Then he landed a faculty position at Jefferson Medical College in Philadelphia, where he explored high-frequency brain waves and studied animal respiration. In 1957, his wife’s depression came to a tragic conclusion; while staying at a mental hospital in Pennsylvania, Sylvia committed suicide. Two years later, Aserinsky married Rita Roseman, a widow, and became stepfather to her young daughter, Iris; the couple remained together until Rita’s death in 1994.


In the early 1960s, Armond Aserinsky urged his father, then in his 40s, to return to the field he had helped start. Aserinsky finally wrote to Kleitman, who had retired from the University of Chicago. Kleitman replied, “It was good to learn that you have renewed work on rapid eye movements during sleep. The literature on the subject is quite extensive now. I believe that you have ability and perseverance but have had personal hard knocks to contend with. Let us hope that things will be better for you in the future.” Kleitman also took the opportunity to remind his former student that he still owed him a hundred dollars. In March 1963, Aserinsky went home to Brooklyn to attend a meeting of sleep researchers. “People were shocked,” his son recalled. “They looked at him and said, ‘My God, you’re Aserinsky! We thought you were dead!’”


Delving into the night again in an unused operating room at the Eastern Pennsylvania Psychiatric Institute in Philadelphia, Aserinsky worked on the physiology of REM and non-REM sleep, but he had prickly encounters with colleagues. He took offense when he did not receive an invitation to a prestigious dinner at a 1972 meeting of sleep researchers. He was often stung when Dement and Kleitman got credit he felt belonged to him. (For his part, Dement said he resented that Aserinsky never acknowledged all the work he did as low man on the lab totem pole. “I was so naive,” he told me.) In 1976, after more than two decades at Jefferson Medical College, Aserinsky was passed over for the chairmanship of the physiology department. He left, becoming chairman of physiology at Marshall University in Huntington, West Virginia. He retired in 1987. “He could be a deeply suspicious and impolitic person,” Armond Aserinsky said. Narrating his version of events in the Journal of the History of the Neurosciences, Aserinsky criticized Dement’s contention that the discovery of REM was a “team effort,” saying, “If anything is characteristic about the REM discovery, it was that there was no teamwork at all. In the first place, Kleitman was reserved, almost reclusive, and had little contact with me. Secondly, I myself am extremely stubborn and have never taken kindly to working with others. This negative virtue carried on throughout my career as evidenced by my resume, which reveals that I was either the sole or senior author in my first thirty publications, encompassing a period of twenty-five years.” That stubbornness spilled into his family relations as well. Years passed in which he had no contact with Armond. To younger sleep scientists, Aserinsky was only a name on a famous paper, an abstraction from another time. And such he might have remained if not for a license plate and a chance encounter in 1989.
Peter Shiromani, then an assistant professor of psychiatry at the University of California at San Diego, had just nosed his Datsun 310 into the parking lot of a Target department store in Encinitas, California. His custom license plates advertised what had been his scientific obsession since his undergraduate days at City College in New York City: REM SLEP. “A woman walked up to me and said, ‘I really love your plates! Did you know my father discovered REM sleep?’” Shiromani recalled. “I said, ‘You must be Eugene Aserinsky’s daughter!’ She was very pleased. I think she felt a lot of pride in her father’s accomplishment, and here was someone who recognized her father’s name. We chatted briefly with much enthusiasm about REM sleep. Fortunately, I had the presence of mind to ask for her father’s address.” Shiromani passed the address along to Jerry Siegel, a sleep researcher at UCLA and the Sepulveda Veterans Affairs medical center in suburban Los Angeles, who invited Aserinsky to address the June 1995 meeting of the Associated Professional Sleep Societies in Nashville. Siegel was organizing a symposium in honor of Kleitman, who had recently turned 100. “It was very difficult to get Aserinsky to come,” Siegel recalls. “People who knew him in the early days said, ‘Don’t invite him.’ But my dealings with him were very pleasant.” Despite their rivalry, it was Dement who introduced Aserinsky to the crowd of 2,000 people in the ballroom at the Opryland Hotel. They gave him a standing ovation. And when he finished a witty, wide-ranging talk on the history of REM, the audience again rose to its feet. “It was one of the high points of his life,” recalls his daughter Jill, who had accompanied her father to the meeting along with his stepdaughter, Iris Carter. “He wore a name tag, and people would stop and point and say, ‘There’s Aserinsky!’” says Carter.


One July day three years later, Aserinsky, driving down a hill in Carlsbad, California, collided with a tree and was killed. He was 77. An autopsy could not determine the cause of the accident. It’s possible he fell asleep at the wheel.


Today it’s well established that normal sleep in human adults includes between four and six REM periods a night. The first starts about 90 minutes after sleep begins; it usually lasts several minutes. Each subsequent REM period is longer. REM sleep is characterized by not only brain-wave activity typical of waking but also a sort of muscle paralysis, which renders one incapable of acting on motor impulses. (Sleepwalking most often occurs during non-REM sleep.) In men and women, blood flow to the genitals is increased. Parts of the brain burn more energy. The heart may beat faster. Adults spend about two hours a night in REM, or 25% of their total sleep. Newborns spend 50 percent of their sleep in REM, upwards of eight hours a day, and they are much more active than adults during REM sleep, sighing and smiling and grimacing. After 50 years, researchers have learned a great deal about what REM isn’t. For example, it was once thought that people prevented from dreaming would become psychotic. That proved not to be the case; patients with injuries to the brainstem, which controls REM, do not go nuts without it. Still, if you deprive a person of REM sleep, they’ll recoup it at the first chance, plunging directly into the REM phase – a phenomenon discovered by Dement and called REM rebound.


Studies of animals have yielded their own insights into REM. In the early 1960s, Michel Jouvet, a giant of sleep research and a neurophysiologist at the University Claude Bernard in Lyon, France, mapped the brain structures that generate REM sleep and produce the attendant muscle paralysis. Jouvet, who coined the term “paradoxical sleep” as a substitute for REM sleep, also discovered that cats with lesions in one part of the brainstem were “disinhibited” and would act out their dreams, as it were, jumping up and arching their backs. (More recently, University of Minnesota researchers have documented a not-dissimilar condition in people; REM sleep behavior disorder, as it’s called, mainly affects men over 50, who kick, punch and otherwise act out aggressive dream scenarios while they sleep. Researchers believe that REM sleep disorder may be a harbinger of Parkinson’s disease in some people.) Paradoxical sleep has been found in almost all mammals tested so far except for some marine mammals, including dolphins. Many bird species appear to have short bursts of paradoxical sleep, but reptiles, at least the few that have been assessed, do not. Jouvet was especially interested in penguins, because they stay awake for long periods during the brooding season. Hoping to learn more about their physiology, he went to great trouble to implant a costly radio-telemetry chip in an emperor penguin in Antarctica. The prize research subject was released into the sea, only to be promptly gobbled up by a killer whale.


In 1975, Harvard’s Allan Hobson and Robert McCarley proposed that many properties of dreams – the vivid imagery, the bizarre events, the difficulty remembering them – could be explained by neurochemical conditions of the brain in REM sleep, including the ebb and flow of the neurotransmitters norepinephrine, serotonin and acetylcholine. Their theory stunned proponents of the idea that dreams were rooted not in neurochemistry but psychology, and it has been a starting point of dream theorizing for the past 25 years. The once-popular description of REM as “dream sleep” is now considered an oversimplification, and debate rages over questions of what can be properly claimed about the relation of dreaming to the physiology of REM sleep. (In 2000, an entire volume of the journal Behavioral and Brain Sciences was devoted to the debate.) To be sure, you can have REM without dreaming, and you can dream without experiencing REM. But most researchers say that dreaming is probably influenced and may be facilitated by REM. Still, dissenters, some of whom adhere to psychoanalytic theory, say that REM and dreaming have little connection with each other, as suggested by clinical evidence that different brain structures control the two phenomena. In the years to come, new approaches may help clarify these disagreements. In a sort of echo of Aserinsky’s first efforts to probe the sleeping brain with EEG, some researchers have used powerful positron brain-scanning technology to focus on parts of the brain activated during REM.


This past June, more than 4,800 people attended the Associated Professional Sleep Societies’ annual meeting in Chicago. The scientists took time out to mark REM’s golden anniversary. With mock solemnity, Dement echoed the Gettysburg Address in his lecture: “Two score and ten years ago Aserinsky and Kleitman brought forth on this continent a new discipline conceived at night and dedicated to the proposition that sleep is equal to waking.” But to paraphrase the physicist Max Planck, science advances funeral by funeral. Kleitman died in 1999 at the age of 104, and though he was a coauthor of the milestone REM study, he never really accepted that REM was anything other than a phase of especially shallow sleep. “Kleitman died still believing there was only one state of sleep,” Dement told me. Aserinsky had his own blind spots; he never relinquished his doubts that sleeping infants exhibit REM. To honor the research done in Kleitman’s lab five decades ago, the Sleep Research Society commissioned a 65-pound zinc plaque. It now hangs in the psychiatry department at the University of Chicago Medical Center, adjacent to Abbott Hall. To be sure, the inscription – “Commemorating the 50th Anniversary of the Discovery of REM Sleep by Eugene Aserinsky, Ph.D., and Nathaniel Kleitman, Ph.D., at the University of Chicago” – doesn’t speak to the poetry of a lyric moment in the history of science, a moment when, as Michel Jouvet once said, humanity came upon “a new continent in the brain.” If it’s the poetry of REM you want, you need wait only until tonight.


Fifty years ago, Eugene Aserinksy discovered rapid eye movement and changed the way we think about sleep and dreaming



Sources: Smithsonian Magazine, October 2003, By Chip Brown; https://public-media.smithsonianmag.com/filer/rem; NIH.gov; Wikipedia

Read more: http://www.smithsonianmag.com/science-nature/the-stubborn-scientist-who-unraveled-a-mystery-of-the-night-91514538/#SHP3CAzWr84vqbsw.99


For your sheer pleasure, British tenor John Owen-Jones sings “Music of the Night,” from The Phantom of the Opera.


Dr. Peter Dennis Mitchell, British Biochemist


The Nobel Prize in Chemistry 1978 was awarded to Peter Mitchell “for his contribution to the understanding of biological energy transfer through the formulation of the chemiosmotic theory”.

Peter Dennis Mitchell (29 September 1920 – 10 April 1992), British biochemist

Sources: Nobel Prize Foundation: MLA style: “The Nobel Prize in Chemistry 1978“. Nobelprize.org. Nobel Media AB 2014. Web. 24 Oct 2017. http://www.nobelprize.org/nobel_prizes/chemistry/laureates/1978/; Wikipedia: By Source, Fair use, https://en.wikipedia.org/w/index.php?curid=29893461


The Thesis – The rates of synthesis and proportions by weight of the nucleic acid components of a Micrococcus during growth in normal and in penicillin containing media with reference to the bactericidal action of penicillin.


Peter Dennis Mitchell was born in Mitcham, Surrey, on 29 September 1920. His parents were Christopher Gibbs Mitchell, a civil servant, and Kate Beatrice Dorothy (née Taplin). His uncle was Sir Godfrey Way Mitchell, chairman of George Wimpey. He was educated at Queen’s College, Taunton, and Jesus College, Cambridge, where he studied the Natural Sciences Tripos, specializing in Biochemistry. He was appointed to a research post in the Department of Biochemistry, Cambridge, in 1942, and was awarded a Ph.D. in early 1951 for work on the mode of action of penicillin.


In 1955 Mitchell was invited by Professor Michael Swann to set up a biochemical research unit, called the Chemical Biology Unit, in the Department of Zoology at the University of Edinburgh, where he was appointed a Senior Lecturer in 1961 and then Reader in 1962. From 1963 to 1965, he supervised the restoration of a Regency-fronted mansion, known as Glynn House, at Cardinham near Bodmin, Cornwall – adapting a major part of it for use as a research laboratory. He and his former research colleague, Jennifer Moyle, founded a charitable company, known as Glynn Research Ltd., to promote fundamental biological research at Glynn House, and they embarked on a program of research on chemiosmotic reactions and reaction systems.


In the 1960s, ATP was known to be the energy currency of life, but the mechanism by which ATP was created in the mitochondria was assumed to be substrate-level phosphorylation. Mitchell’s chemiosmotic hypothesis was the basis for understanding the actual process of oxidative phosphorylation, whose biochemical mechanism was unknown at the time. Mitchell realized that the movement of ions down an electrochemical potential difference could provide the energy needed to produce ATP. His hypothesis was derived from information that was well known in the 1960s. He knew that living cells had a membrane potential, interior negative relative to the environment. The movement of charged ions across a membrane is thus affected by electrical forces (the attraction of positive to negative charges). Their movement is also affected by thermodynamic forces, the tendency of substances to diffuse from regions of higher concentration to regions of lower concentration. He went on to show that ATP synthesis was coupled to this electrochemical gradient.
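Mitchell’s electrochemical gradient is usually quantified as the proton motive force. The standard textbook expression (a sketch added here; it is not spelled out in the article) simply adds the two contributions described above:

$$\Delta p \;=\; \Delta\psi \;-\; \frac{2.303\,RT}{F}\,\Delta\mathrm{pH}$$

where $\Delta\psi$ is the electrical membrane potential, $\Delta\mathrm{pH} = \mathrm{pH_{in}} - \mathrm{pH_{out}}$ is the concentration (diffusion) term, $R$ is the gas constant, $T$ the absolute temperature, and $F$ the Faraday constant; at 25 °C the factor $2.303RT/F$ is about 59 mV.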


His hypothesis was confirmed by the discovery of ATP synthase, a membrane-bound protein that uses the potential energy of the electrochemical gradient to make ATP, and by André Jagendorf’s discovery that a pH difference across the thylakoid membrane in the chloroplast results in ATP synthesis. Later, Mitchell also hypothesized some of the complex details of electron transport chains. He conceived of the coupling of proton pumping to quinone-based electron bifurcation, which contributes to the proton motive force and thus to ATP synthesis. In 1978 he was awarded the Nobel Prize in Chemistry “for his contribution to the understanding of biological energy transfer through the formulation of the chemiosmotic theory.” He was elected a Fellow of the Royal Society (FRS) in 1974. Mitchell could not have achieved all that he did without standing on the shoulders of at least two other great researchers (among many): Dr. Friedrich Miescher and Dr. Richard Altmann.


Friedrich Miescher (1844-1895)

Photo credit: copied from http://www.pbs.org/wgbh/nova/photo51/images/befo-miescher.jpg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=789048



Miescher isolated various phosphate-rich chemicals, which he called nuclein (now nucleic acids), from the nuclei of white blood cells. This took place in 1869 in Felix Hoppe-Seyler’s laboratory at the University of Tubingen, Germany, paving the way for the identification of DNA as the carrier of inheritance. The significance of the discovery, first published in 1871, was not at first apparent, and it was Albrecht Kossel who made the initial inquiries into its chemical structure. Later, Friedrich Miescher raised the idea that the nucleic acids could be involved in heredity.


Richard Altmann (12 March 1852 – 8 December 1900) was a German pathologist and histologist from Deutsch Eylau in the Province of Prussia. Altmann studied medicine in Greifswald, Königsberg, Marburg, and Giessen, obtaining a doctorate at the University of Giessen in 1877. He then worked as a prosector at Leipzig, and in 1887 became an extraordinary professor of anatomy. He died in Hubertusburg in 1900 from a nervous disorder. Altmann improved fixation methods, for instance his solution of potassium dichromate and osmium tetroxide. Using that along with a new staining technique of applying acid fuchsin contrasted by picric acid with gentle heating, he observed filaments in nearly all cell types, developed from granules. He named the granules “bioblasts” and explained them as the elementary living units, having metabolic and genetic autonomy, in his 1890 book “Die Elementarorganismen” (“The Elementary Organisms”). His explanation drew much skepticism and harsh criticism. Altmann’s granules are now believed to be mitochondria. He is credited with coining the term “nucleic acid”, replacing Friedrich Miescher’s term “nuclein” when it was demonstrated that nuclein was acidic.



A study of embryos by Leonardo da Vinci. Graphic credit: – Hi! Magazine (direct link), Public Domain, https://commons.wikimedia.org/w/index.php?curid=42138639


In humans, the term embryo refers to the ball of dividing cells from the moment the zygote implants itself in the uterus wall until the end of the eighth week after conception. Beyond the eighth week after conception (tenth week of pregnancy), the developing human is then called a fetus.


Aristotle’s On the Generation of Animals is referred to in Latin as De Generatione Animalium. As with many of Aristotle’s writings, the exact date of authorship is unknown, but it was produced in the latter part of the fourth century BCE. This book is the second recorded work on embryology as a subject of philosophy, being preceded by contributions in the Hippocratic corpus by about a century. It was, however, the first work to provide a comprehensive theory of how generation works and an exhaustive explanation of how reproduction works in a variety of different animals. As such, De Generatione was the first scientific work on embryology. Its influence on embryologists, naturalists, and philosophers in later years was profound. Among these were Hieronymus Fabricius, William Harvey, St. Thomas Aquinas, and Charles Darwin.


A brief overview of the general theory expounded in De Generatione requires an explanation of Aristotle’s philosophy. The Aristotelian approach to philosophy is teleological, and involves analyzing the purpose of things, or the cause for their existence. These causes are split into four distinct types: final cause, formal cause, material cause, and efficient cause. The final cause is what a thing exists for, or its ultimate purpose. The formal cause is the definition of a thing’s essence or existence, and Aristotle states that in generation, the formal cause and the final cause are similar to each other, and can be thought of as the goal of creating a new individual of the species. The material cause is the stuff a thing is made of, which in Aristotle’s theory is the female menstrual blood. The efficient cause is the “mover“ or what causes the thing’s existence, and for reproduction Aristotle designates the male semen as the efficient cause. Thus, while the mother’s body contains all the material necessary for creating her offspring, she requires the father’s semen to start and guide the process.


De Generatione consists of five books, each containing multiple chapters. Books I and II are of most interest to embryology. Book III is a comparative study of zoology that applies principles from Book II to distinct species of animals. Book IV contains miscellaneous information about aspects of reproduction, such as how heredity works and birth defects occur. Book V compares the characteristics that all animals share, and is primarily a discussion of sensory organs and the physical appearance of animals, focusing on characteristics like hair, coloration, voice, and teeth. Aristotle’s research and writing influenced Renaissance scholars. Early studies of embryology came from the work of the Italian anatomists Aldrovandi, Aranzio, Leonardo da Vinci, Marcello Malpighi, Gabriele Falloppio, Girolamo Cardano, Emilio Parisano, Fortunio Liceti, Stefano Lorenzini, Lazzaro Spallanzani, Enrico Sertoli, and Mauro Rusconi.


Editor’s note: Dear Readers, it’s not clear to us how Aristotle’s theories of epigenesis became temporarily derailed by the ideas of preformation. We’re guessing it was the prevailing notions of the church. If any reader has more specific knowledge, we would appreciate an email clearing up this approximately 150-year gap, when Aristotle’s ideas were eschewed.


By the 18th century, the prevailing notion in western human embryology was preformation: the idea that semen contains an embryo – a preformed, miniature human, or homunculus – that was planted in the female during intercourse and then grew into a larger being as it developed during pregnancy. The competing explanation of embryonic development was epigenesis, originally proposed 2,000 years earlier by Aristotle. According to epigenesis, the form of an animal emerges gradually from a relatively formless egg. During the late 18th century, an extended and controversial debate among biologists finally led epigenesis to eclipse the established preformationist view, and as microscopy improved during the 19th century, biologists could see that embryos took shape in a series of progressive steps. Epigenesis thus displaced preformation as the favored explanation among embryologists.


After 1827: Karl Ernst von Baer and Heinz Christian Pander proposed the germ layer theory of development; von Baer discovered the mammalian ovum in 1827. Modern embryological pioneers include Charles Darwin, Ernst Haeckel, J.B.S. Haldane, and Joseph Needham. Other important contributors include William Harvey, Kaspar Friedrich Wolff, Heinz Christian Pander, August Weismann, Gavin de Beer, Ernest Everett Just, and Edward B. Lewis.


After the structure of the DNA double helix was unraveled in the 1950s, knowledge increased rapidly in the field of molecular biology, and developmental biology emerged as a field of study that attempts to correlate genes with morphological change, determining which genes are responsible for each morphological change that takes place in an embryo and how those genes are regulated. So, here in the 21st century, we can acknowledge the genius of Hippocrates and Aristotle, whose correct observations of many centuries ago prevail. Those ancient Greeks knew a thing or two. Sources: Wikipedia; nih.gov; asu.edu.


History of Adrenoleukodystrophy

A 1993 movie, “Lorenzo’s Oil”, tells the story of a young boy, Lorenzo Odone, who had adrenoleukodystrophy. His parents created an oil, still used today in the diets of people with this terrible disease. This dietary aid is called Lorenzo’s Oil, named after the boy it was first used to treat. Diagnosed with ALD at age 5 in 1984, Lorenzo Odone died in 2008 at the age of 30. http://wiki.ggc.usg.edu/wiki/Adrenoleukodystrophy


Timeline of the History of ALD – ALD Database

August 23rd, 2017 |

Marc Engelen, M.D., Ph.D. and Stephan Kemp, Ph.D.


1910: In retrospect, Haberfeld and Spieler presented the first clinical description of a patient with X-linked adrenoleukodystrophy (Haberfeld and Spieler, 1910). A previously healthy 6-year-old boy developed deeply bronzed skin (hyperpigmentation) and impaired visual acuity, and his school performance deteriorated. In the following months, the boy became incontinent, lost his ability to speak and developed spastic tetraparesis, which eventually progressed to an inability to walk. He was hospitalized at the age of 7, and died 8 months later. An older brother had died of a similar illness at the age of 8. Postmortem histological examination of the brain revealed extensive changes in brain white matter, combined with perivascular accumulation of lymphocytes and plasma cells in the nervous system, indicating an inflammatory response.

1922: Siemerling and Creutzfeldt reported the case of a boy with a similar disease progression, including the dark skin and neuropathological findings of the case described by Haberfeld & Spieler in 1910, except that here atrophy of the adrenal cortex was also documented.

1963: By now nine comparable cases had been reported. The fact that all patients were males suggested X-linked recessive inheritance (Fanconi et al. 1963).

1970: The name adrenoleukodystrophy was introduced based on the striking association of a leukodystrophy with primary adrenocortical (adrenal) insufficiency (Blaw, 1970).

1972: The key to all subsequent knowledge about the disease was the observation made by Powers, Schaumburg, and Johnson that adrenal cells of ALD patients contained characteristic lipid inclusions (fat droplets), followed by the demonstration that these fat droplets consisted of cholesterol esters that contained a striking and characteristic excess of very long-chain fatty acids (VLCFA).

1976: A more slowly progressive adult form of the disease, characterized by adrenal insufficiency, myelopathy and peripheral neuropathy, was described (Budka et al. 1976). A year later, five more cases were reported by Griffin et al., who proposed that this clinical presentation of ALD be named adrenomyeloneuropathy (AMN) (Griffin et al. 1977; Schaumburg et al. 1977).

1981: The identification of VLCFA as a biomarker for ALD led to the development of a diagnostic test for ALD based on the demonstration of elevated levels of VLCFA in cultured skin cells (fibroblasts), plasma, red blood cells and amniocytes (Moser et al. 1981). These tests have permitted precise postnatal and prenatal diagnosis. Metabolic studies demonstrated that VLCFA are metabolized (through beta-oxidation) exclusively in subcellular organelles called peroxisomes and this oxidation of VLCFA is reduced in fibroblasts from ALD patients (Singh et al 1981). Therefore, ALD is a peroxisomal disease.

1981: The ALD locus was mapped to the terminal segment of the long arm of the X-chromosome, Xq28 (Migeon et al. 1981).

1982: The first bone-marrow transplantation (BMT) was performed in a boy with cerebral ALD. An allogeneic BMT from a normal HLA identical sibling donor was performed in a 13-year-old boy with rapidly progressive ALD. Engraftment and complete hematologic recovery occurred within 4 weeks. Ten days after BMT, the white blood cell VLCFA levels and enzyme activity became normal; after 3 months, there was progressive reduction of plasma VLCFA to levels only slightly above normal. But neurologic deterioration continued. The patient died of an adenovirus infection 141 days after BMT.

1986: Rizzo et al. demonstrated that the addition of oleic acid (C18:1) to the tissue culture medium normalizes the levels of saturated VLCFA in cultured skin fibroblasts from ALD patients. These findings formed the basis for the development of Lorenzo’s oil. Treatment of ALD patients with Lorenzo’s oil normalizes plasma VLCFA levels within 4 weeks (Moser et al. 1987). However, several open-label trials showed that Lorenzo’s oil neither improved neurological or endocrine function nor arrested the progression of the disease. Unfortunately, the clinical efficacy of Lorenzo’s oil has never been evaluated in a proper placebo-controlled clinical trial. In 2001, Prof. Hugo Moser wrote: “It is our view that Lorenzo’s oil therapy is not warranted in most patients who already have neurologic symptoms. The clinical benefit of Lorenzo’s oil is limited at best”.

1990: The team of Prof. Patrick Aubourg reported the first successful bone-marrow transplantation (BMT) (Aubourg et al. 1990). They had transplanted an 8-year-old boy with mild neurological, mild neuropsychological and mild MRI abnormalities. His unaffected non-identical twin was the donor. The patient recovered completely, and the neurological, neuropsychological and MRI abnormalities disappeared. When conducted at the earliest stage of cerebral demyelination, a bone-marrow or hematopoietic stem cell transplantation (HSCT) can stabilize or even reverse cerebral demyelination in boys or adolescents with ALD.

1993: A team led by Drs. Mandel and Aubourg identified the putative gene for ALD (ABCD1) using positional cloning strategies (Mosser et al. 1993). The identification of the ALD gene enabled the detection of disease-causing mutations, prenatal diagnosis and accurate carrier testing.

1997: Three laboratories reported the generation of a mouse model for ALD (Forss-Petter et al. 1997; Kobayashi et al. 1997; Lu et al. 1997). While the ALD mouse exhibits the same biochemical abnormalities as observed in patients, the mouse does not develop ALD (Pujol et al. 2002).

1999: The ALD database was created by Hugo Moser and Stephan Kemp. Initially it served only as a registry for mutations identified in the ABCD1 gene, but soon thereafter it was expanded to provide information on many aspects of ALD.

2001: It was established that ALD affects all ethnic groups and is the most common peroxisomal disorder, with an estimated incidence of 1:17,000 (males and females combined) (Bezman et al. 2001). This makes ALD the most common inherited leukodystrophy.

2005: Biochemically, ALD is not only characterized by a defect in the breakdown of VLCFA in peroxisomes, but there is also an increase in the subsequent chain-elongation of VLCFA (Kemp et al. 2005).

2006: The team led by Dr. Ann Moser developed a high-throughput VLCFA analysis method (with C26:0-lysoPC as the diagnostic metabolite) to be used on dried blood spots (Hubbard et al. 2006). These advancements in VLCFA screening will allow the addition of ALD to newborn screening programs.

2009: The team led by Drs. Cartier and Aubourg reported the successful treatment of two 7-year-old boys with early signs of cerebral ALD using gene therapy (Cartier et al. 2009). Brain MRI scans and cognitive tests showed that progression of the cerebral disease had stopped 14-16 months post-treatment. This is comparable with the clinical outcome of HSCT.

2010: The research team of Dr. Stephan Kemp established that ALDP transports VLCFA across the peroxisomal membrane. A deficiency in ALDP has two major effects: on the one hand, it impairs peroxisomal degradation of VLCFA; on the other hand, it raises cytosolic levels of VLCFA. These VLCFA are then further elongated to even longer fatty acids by ELOVL1, the human C26-specific elongase (Ofman et al. 2010).

2014: In the United States, New York State started newborn screening for ALD (Vogel et al. 2015). Early diagnosis of ALD is the key to saving lives, because newborn screening allows prospective monitoring and early intervention.

2015: In the US, Connecticut initiated ALD newborn screening. In Europe, the Netherlands expanded its national newborn screening program from 17 to 31 conditions, including ALD.

2016: On February 16, ALD was added to the United States Recommended Uniform Screening Panel (RUSP). In the US, California initiated ALD newborn screening. Since then other states and countries have started newborn screening programs, or have initiated processes intended to add ALD to their existing newborn screening program. Detailed and up-to-date information on ALD newborn screening can be found at the newborn screening page.


The Nobel Prize in Physiology or Medicine 2017


The distinguished award goes to Jeffrey C. Hall, Michael Rosbash and Michael W. Young for their discoveries of molecular mechanisms controlling the circadian rhythm.

Michael Rosbash: Photo credit: Howard Hughes Medical Institute


Jeffrey Hall (photo credit: Wikipedia) and Michael Young (photo credit: Wikipedia)


Life on Earth is adapted to the rotation of our planet. For many years we have known that living organisms, including humans, have an internal, biological clock that helps them anticipate and adapt to the regular rhythm of the day. But how does this clock actually work? Jeffrey C. Hall, Michael Rosbash and Michael W. Young were able to peek inside our biological clock and elucidate its inner workings. Their discoveries explain how plants, animals and humans adapt their biological rhythm so that it is synchronized with the Earth’s revolutions.


Nobel winner Jeffrey C. Hall was born in 1945 in New York, USA. He received his doctoral degree in 1971 at the University of Washington in Seattle and was a postdoctoral fellow at the California Institute of Technology in Pasadena from 1971 to 1973. He joined the faculty at Brandeis University in Waltham in 1974. In 2002, he became associated with the University of Maine.


Nobel winner Michael Rosbash was born in 1944 in Kansas City, USA. He received his doctoral degree in 1970 at the Massachusetts Institute of Technology in Cambridge. During the following three years, he was a postdoctoral fellow at the University of Edinburgh in Scotland. Since 1974, he has been on the faculty at Brandeis University in Waltham, USA.


Nobel winner Michael W. Young was born in 1949 in Miami, USA. He received his doctoral degree at the University of Texas in Austin in 1975. Between 1975 and 1977, he was a postdoctoral fellow at Stanford University in Palo Alto. Since 1978, he has been on the faculty at the Rockefeller University in New York.


The earliest recorded account of a circadian process dates from the 4th century BCE, when Androsthenes, a ship captain serving under Alexander the Great, described diurnal leaf movements of the tamarind tree. The observation of a circadian or diurnal process in humans is mentioned in Chinese medical texts dated to around the 13th century, including the Noon and Midnight Manual and the Mnemonic Rhyme to Aid in the Selection of Acu-points According to the Diurnal Cycle, the Day of the Month and the Season of the Year. The first recorded observation of an endogenous circadian oscillation was by the French scientist Jean-Jacques d’Ortous de Mairan in 1729. He noted that 24-hour patterns in the movement of the leaves of the plant Mimosa pudica continued even when the plants were kept in constant darkness, in the first experiment to attempt to distinguish an endogenous clock from responses to daily stimuli. In 1896, Patrick and Gilbert observed that during a prolonged period of sleep deprivation, sleepiness increases and decreases with a period of approximately 24 hours. In 1918, J.S. Szymanski showed that animals are capable of maintaining 24-hour activity patterns in the absence of external cues such as light and changes in temperature.


In the early 20th century, circadian rhythms were noticed in the rhythmic feeding times of bees. Extensive experiments were done by Auguste Forel, Ingeborg Beling, and Oskar Wahl to see whether this rhythm was due to an endogenous clock. Ron Konopka and Seymour Benzer isolated the first clock mutant in Drosophila in the early 1970s and mapped the “period“ gene, the first discovered genetic determinant of behavioral rhythmicity. Joseph Takahashi discovered the first mammalian circadian clock mutation (clock delta19) using mice in 1994. However, recent studies show that deletion of clock does not lead to a behavioral phenotype (the animals still have normal circadian rhythms), which questions its importance in rhythm generation.


The term circadian was coined by Franz Halberg in the 1950s.


Using fruit flies as a model organism, this year’s Nobel laureates isolated a gene that controls the normal daily biological rhythm. They showed that this gene encodes a protein that accumulates in the cell during the night, and is then degraded during the day. Subsequently, they identified additional protein components of this machinery, exposing the mechanism governing the self-sustaining clockwork inside the cell. We now recognize that biological clocks function by the same principles in cells of other multicellular organisms, including humans. With precision, our inner clock adapts our physiology to the dramatically different phases of the day. The clock regulates critical functions such as behavior, hormone levels, sleep, body temperature and metabolism. Our well-being is affected when there is a temporary mismatch between our external environment and this internal biological clock, for example when we travel across several time zones and experience “jet lag.“ There are also indications that chronic misalignment between our lifestyle and the rhythm dictated by our inner timekeeper is associated with increased risk for various diseases.


Most living organisms anticipate and adapt to daily changes in the environment. During the 18th century, the astronomer Jean Jacques d’Ortous de Mairan studied mimosa plants, and found that the leaves opened towards the sun during daytime and closed at dusk. He wondered what would happen if the plant was placed in constant darkness. He found that independent of daily sunlight the leaves continued to follow their normal daily oscillation. Plants seemed to have their own biological clock. Other researchers found that not only plants, but also animals and humans, have a biological clock that helps to prepare our physiology for the fluctuations of the day. This regular adaptation is referred to as the circadian rhythm, originating from the Latin words circa meaning “around“ and dies meaning “day“. But just how our internal circadian biological clock worked remained a mystery.


During the 1970s, Seymour Benzer and his student Ronald Konopka asked whether it would be possible to identify genes that control the circadian rhythm in fruit flies. They demonstrated that mutations in an unknown gene disrupted the circadian clock of flies. They named this gene period. But how could this gene influence the circadian rhythm?


This year’s Nobel Laureates, who were also studying fruit flies, aimed to discover how the clock actually works. In 1984, Jeffrey Hall and Michael Rosbash, working in close collaboration at Brandeis University in Boston, and Michael Young at the Rockefeller University in New York, succeeded in isolating the period gene. Hall and Rosbash then went on to discover that PER, the protein encoded by period, accumulated during the night and was degraded during the day. Thus, PER protein levels oscillate over a 24-hour cycle, in synchrony with the circadian rhythm.

The next key goal was to understand how such circadian oscillations could be generated and sustained. Hall and Rosbash hypothesized that the PER protein blocked the activity of the period gene. They reasoned that by an inhibitory feedback loop, PER protein could prevent its own synthesis and thereby regulate its own level in a continuous, cyclic rhythm. The model was tantalizing, but a few pieces of the puzzle were missing. To block the activity of the period gene, PER protein, which is produced in the cytoplasm, would have to reach the cell nucleus, where the genetic material is located. Hall and Rosbash had shown that PER protein builds up in the nucleus during night, but how did it get there?

In 1994 Michael Young discovered a second clock gene, timeless, encoding the TIM protein that was required for a normal circadian rhythm. In elegant work, he showed that when TIM bound to PER, the two proteins were able to enter the cell nucleus where they blocked period gene activity to close the inhibitory feedback loop. Such a regulatory feedback mechanism explained how this oscillation of cellular protein levels emerged, but questions lingered. What controlled the frequency of the oscillations? Michael Young identified yet another gene, double-time, encoding the DBT protein that delayed the accumulation of the PER protein.
This provided insight into how an oscillation is adjusted to more closely match a 24-hour cycle. The paradigm-shifting discoveries have established key mechanistic principles for the biological clock. During the following years other molecular components of the clockwork mechanism were elucidated, explaining its stability and function. For example, this year’s laureates identified additional proteins required for the activation of the period gene, as well as for the mechanism by which light can synchronize the clock. The biological clock is involved in many aspects of our complex physiology. We now know that all multicellular organisms, including humans, utilize a similar mechanism to control circadian rhythms. A large proportion of our genes are regulated by the biological clock and, consequently, a carefully calibrated circadian rhythm adapts our physiology to the different phases of the day. Since the seminal discoveries by the three laureates, circadian biology has developed into a vast and highly dynamic research field, with implications for our health and wellbeing.
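The PER mechanism described above is, at its core, a delayed negative-feedback oscillator: a protein represses its own synthesis, but only after a lag (transcription, translation and nuclear entry, with DBT stretching out PER’s accumulation). A minimal sketch of that idea in Python – all parameter values are hypothetical choices for illustration, not measured fly kinetics, and this is not the laureates’ actual model – shows how a lag plus self-repression is enough to yield sustained, roughly daily oscillations:

```python
# Illustrative delayed negative-feedback loop: protein P represses its
# own production, but the repression depends on the level P had
# `delay_h` hours earlier, standing in for the transcription /
# translation / nuclear-entry lag of the real PER loop.

DT = 0.1  # simulation step, in hours

def simulate(hours=240.0, delay_h=8.0, degradation=0.3):
    steps = int(hours / DT)
    lag = int(delay_h / DT)
    levels = [0.1] * steps
    for t in range(1, steps):
        past = levels[max(0, t - lag)]         # protein level one lag ago
        production = 1.0 / (1.0 + past ** 4)   # Hill-type self-repression
        levels[t] = levels[t - 1] + DT * (production - degradation * levels[t - 1])
    return levels

levels = simulate()

# Estimate the rhythm's period from the spacing of the last two peaks.
peaks = [i for i in range(1, len(levels) - 1)
         if levels[i - 1] < levels[i] >= levels[i + 1]]
period_h = (peaks[-1] - peaks[-2]) * DT
print(f"sustained oscillation, period roughly {period_h:.0f} hours")
```

In this toy model, shortening `delay_h` shortens the period and removing the repression term abolishes the rhythm altogether – loosely analogous to how DBT tunes the clock’s pace and how period-null flies lose rhythmicity.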


The circadian clock anticipates and adapts our physiology to the different phases of the day. Our biological clock helps to regulate sleep patterns, feeding behavior, hormone release, blood pressure, and body temperature.


Read more about Michael Rosbash

Read more about Jeffrey Hall

Read more about Michael Young

Excellent article: New Yorker


Sources: Nobel Foundation: “2017 Nobel Prize in Physiology or Medicine: Molecular mechanisms controlling the circadian rhythm;“ ScienceDaily, 2 October 2017; Wikipedia


Leon Fleisher, Child Prodigy, Struggled to Recover from Focal Dystonia

Fleisher in 1963. Photo credit: Seattle Symphony Orchestra, where he was one of their featured artists for the season; photographer: Bender – eBay item (photo front, photo back), Public Domain, https://commons.wikimedia.org/w/index.php?curid=18488895


On July 23, 1928, Leon Fleisher was born in San Francisco into a poor Jewish family from Eastern Europe. His father’s business was hat-making, while his mother’s goal was to “make her son a great concert pianist“. Fleisher started studying the piano at age four, made his public debut at age eight, and played with the New York Philharmonic under Pierre Monteux at 16. Monteux famously called him “the pianistic find of the century.“ He became one of the few child prodigies to be accepted for study with Artur Schnabel and also studied with Maria Curcio. Fleisher was linked via Schnabel to a tradition that descended directly from Beethoven himself, handed down through Carl Czerny and Theodor Leschetizky.


“My mother was very ambitious for me and gave me a choice,“ said Fleisher. “Either I was to be the first Jewish President of the United States, or a great concert pianist. Whichever it was, I had to be perfect.“


In the 1950s, Fleisher signed an exclusive recording contract with Columbia Masterworks. He is particularly well known for his interpretations of the piano concerti of Brahms and Beethoven, which he recorded with George Szell and the Cleveland Orchestra. They also recorded Mozart’s Piano Concerto No. 25, the Grieg and Schumann piano concertos, Franck’s Symphonic Variations, and Rachmaninoff’s Rhapsody on a Theme of Paganini.


In 1964, Fleisher lost the use of his right hand, due to a condition that was eventually diagnosed as focal dystonia. At the age of 36, he could barely write his name. “I was preparing for the most important tour of my life when I had a minor accident. I cut my thumb on a piece of cheap garden furniture and required a couple of stitches. When I started practicing again, things didn’t feel quite right on my right side. My fourth and fifth fingers seemed to want to curl under. I practiced even harder, not listening to my body when, through pain, it warned me to stop. Things got progressively worse and in less than a year those two fingers were completely curved under, sticking into the palm of my hand. No way could I play the piano.“ It was as if his arm were a rope becoming unbraided, with creeping numbness in his fingers. Engagements were cancelled, recordings put on hold. “I was desolate,“ he says. “My life fell apart, and this mysterious debilitating condition destroyed my relationship with my second wife, striking deep into my family.“ Doctors were perplexed and could offer no medication or surgical repair to a condition that baffled them. Fleisher even considered suicide. “I grew a beard, wore my hair long and in a ponytail, and I got a Vespa scooter. I felt I had no purpose anymore; I was simply floundering.“


After a couple of years marking time, he realized that his connection was with music, not just with playing the piano with two hands. Out of a disastrous impediment, three new careers beckoned, the first as a left-handed concert pianist. “I thought about Paul Wittgenstein, the Austrian concert pianist whose right arm was shot off in the First World War. He commissioned works for the left hand from Richard Strauss, Korngold, Hindemith, Prokofiev, Ravel and Britten, so there was existing piano literature for pianists with no function in their right hands. And there was Brahms’ magnificent arrangement for left hand of Bach’s Chaconne for solo violin. Thank goodness it wasn’t my left hand that stopped working, since there are hardly any piano works for right hand alone. There are about 1,000 pieces for the left hand out there – most of them pretty bad – but Ravel’s Concerto for left hand, which I must have played over 1,000 times and have also conducted from the keyboard, is a masterpiece in its own right.“ “Secondly, I decided to pursue a musical career through conducting, moving from a sitting to a standing position. It felt so different to be on my feet in front of an orchestra but, worse, I immediately felt my ass to be 10 times its normal size, waving around in front of the audience.“


But it was in teaching that he found real happiness. “I became far better at explaining those elusive areas of expression and nuance that are so difficult to express in words.“ Indeed, his masterclasses, in which, as tutor, it is irrelevant whether you can use five or 10 fingers, are models of gently humorous correction and deeply-felt inspiration. He never gave up the idea of returning to two-handed repertoire. After leaving the concert platform in 1965, Fleisher tried every kind of medical, psychiatric and alternative treatment, from acupuncture and hypnosis to deep-tissue massage, Tiger Balm and others, including more than a few drams of Scotch. But, as a result of conducting and grasping the baton too tightly, he developed carpal tunnel syndrome. This weakness in the forearm and hand caused by pressure on a nerve in the wrist could be alleviated only through surgery. Fleisher agreed to have his wrist cut open with a knife, to the accompaniment, he remembers, of a recording of Mahler’s First Symphony. Astonishingly, the surgery for one ailment helped the other, and his fingers began to straighten out. “After 18 years, I was able to play again. In 1982, I was invited to open the new Meyerhoff concert hall in Baltimore and made the front page of The New York Times for being able to use both hands for the first time since 1965.“ But this supposed cure proved short-lived. “I knew things weren’t quite as they should be,“ said Fleisher. “I had to change the advertised program from Beethoven’s Fourth Piano Concerto to Franck’s Symphonic Variations. It didn’t feel to me like a triumphant return. I broke down in tears in the dressing-room before the concert and felt awful at having to go through an evening of pretense.“


For the remaining 12 of that series of “comeback“ concerts, Fleisher reverted to left-hand repertoire. Only in 1995 was he finally diagnosed with a neurological disorder called focal dystonia. “It’s a malady caused by the brain learning to do a wrong thing, and though no cure has been found, I am a dystonic for life. It’s task-specific. Glass-blowers get it, computer workers can become afflicted and golfers begin to miss their putts.“ Fleisher thinks there could be 10,000 musicians around the world suffering from the condition and that the composer-pianist Robert Schumann may have been an early victim, causing permanent damage by mechanically exercising his troublesome fourth finger.


Fifteen years ago, Botox was still in its experimental stages. However, a small dose injected directly into the appropriate muscle, along with holistic massage therapy involving connective tissues, restored Fleisher’s fingers sufficiently for him to return to two-handed performances. A tiny amount of Botox relaxes the fingers without causing the paralysis that is evident when it is used to reduce facial wrinkles by immobilizing muscles. Crucially, there is no sign of any of the negative effects, such as a diminished quality of emotional experience. In 1995, Fleisher made a second comeback, quietly and without any hype, as he tested his stamina. Only after proving himself to himself did he feel ready to resume his career as a two-handed solo pianist. In 2005, he gave 40 concerts in 31 cities and the following year enjoyed success at New York’s Carnegie Hall. The same two fingers on Fleisher’s right hand still want to curl, but Botox injections every four months keep the condition under control.


When asked if he dances, Fleisher roars with laughter. “Wouldn’t that be a lovely idea?“ he exclaims. “I’m afraid my feet follow my hands. In fact, I have two left feet! It’s a deep regret, along with the fact that I am totally ungifted when it comes to jazz.“ According to his singer-songwriter son Julian, though, Fleisher does have something in common with great jazz players: the importance he places on rhythm. Fleisher feels rhythm as the heartbeat of music. “It regulates the metabolism of the piece, motivates the music and, if it’s infectious enough, makes us tap our toes.“


In 2004, Vanguard Classics released Leon Fleisher’s first “two-handed“ recording since the 1960s, entitled “Two Hands“, to critical acclaim. Two Hands is also the title of a short documentary on Fleisher by Nathaniel Kahn which was nominated for an Academy Award for best short subject on January 23, 2007. Fleisher received the 2007 Kennedy Center Honors. Kennedy Center Chairman Stephen A. Schwarzman described him as “a consummate musician whose career is a moving testament to the life-affirming power of art.“ Fleisher’s musical interests extend beyond the central German Classic-Romantic repertory. The American composer William Bolcom composed his Concerto for Two Pianos, Left Hand for Fleisher and his close friend Gary Graffman, who has also suffered from debilitating problems with his right hand. It received its first performance in Baltimore in April 1996. The concerto is so constructed that it can be performed in one of three ways, with either piano part alone with reduced orchestra, or with both piano parts and the two reduced orchestras combined into a full orchestra.


In 2004, Leon Fleisher played the world premiere of Paul Hindemith’s Klaviermusik (Piano Concerto for the Left Hand), Op. 29, with the Berlin Philharmonic. This work was written in 1923 for Paul Wittgenstein, who disliked it and refused to play it. However, he held sole performing rights and kept the score, not allowing any other pianist to play it. The manuscript was discovered among his papers after the death of his widow in 2002. On October 2, 2005, Fleisher played the American premiere of the work with the San Francisco Symphony under Herbert Blomstedt. In 2012, at the invitation of Justice Ruth Bader Ginsburg, Fleisher performed at the Supreme Court of the United States. Fleisher has continued to be involved in music, both conducting and teaching at the Peabody Conservatory of Music, the Curtis Institute of Music, and the Royal Conservatory of Music in Toronto; he is also closely associated with the Tanglewood Music Center. With Dina Koston, he co-founded and co-directed the Theater Chamber Players from 1968 to 2003; it was the first resident chamber ensemble of the Smithsonian Institution and of The Kennedy Center. Among others, Fleisher has taught Jonathan Biss, Yefim Bronfman, Phillip Bush, Naida Cole, Jane Coop, Enrico Elisi, Enrique Graf, Helene Grimaud, Hao Huang, Kevin Kenner, Dina Koston, Louis Lortie, Wonny Song, Andre Watts, Jack Winerock, Daniel Wnukowski, Alon Goldstein, Dale Anthony and Orit Wolf.


His memoir, My Nine Lives, co-written with the Washington Post music critic Anne Midgette, appeared in November 2010.


History of FDA and Disaster Relief

FDA Building 31 (left photo) houses the Office of the Commissioner and the Office of Regulatory Affairs. The agency, part of the U.S. Department of Health and Human Services, consists of fourteen Centers and Offices. FDA Building 51 (right photo) houses the Center for Drug Evaluation and Research. The FDA campus is located at 10903 New Hampshire Ave., Silver Spring, MD 20993. Photo credits: The U.S. Food and Drug Administration – FDA Bldg 31 – Exterior, Public Domain; Wikipedia Commons


President Abraham Lincoln signed into law an act of Congress establishing the United States Department of Agriculture in 1862. The Act of Incorporation, signed by President Lincoln on March 3, 1863, created the National Academy of Sciences and named 50 charter members. Many of the original NAS members came from an informal network of mostly physical scientists, begun around 1850, working in the vicinity of Cambridge, Massachusetts. These two great scientific bodies paved the way for the Food and Drug Administration, which emerged over time from the USDA. Around the world, these U.S. agencies were hailed as a great step forward in government recognition of the role of science in American society. The United States has long been a global leader in scientific solutions.


The Food and Drug Administration (FDA) is the oldest comprehensive consumer protection agency in the U.S. federal government. Its origins can be traced back to the appointment of Lewis Caleb Beck in the Patent Office around 1848 to carry out chemical analyses of agricultural products, a function that the newly created Department of Agriculture inherited in 1862. Although it was not known by its present name until 1930, FDA’s modern regulatory functions began with the passage of the 1906 Pure Food and Drugs Act, a law a quarter-century in the making that prohibited interstate commerce in adulterated and misbranded food and drugs. Harvey Washington Wiley, Chief Chemist of the Bureau of Chemistry in the Department of Agriculture, had been the driving force behind this law and headed its enforcement in the early years, providing basic elements of protection that consumers had never known before that time.


A rectangular box showing a man battling a skeleton (left) and, on the right, the same picture in the form of a stamp. Photo source: fda.gov


The U.S. Post Office recognized the 1906 Act as a landmark of the 20th century when it released this stamp, whose design was based on a 19th-century patent medicine trading card.


The FDA and its responsibilities have undergone a metamorphosis since 1906. Similarly, the marketplace itself, the sciences undergirding the products the agency regulates, and the social, cultural, political, and economic changes that have formed the context for these developments have all witnessed upheavals over the past century. Yet the core public health mission of the agency remains the same now as it did then. This web site features a variety of portals that offer insight into these changes, from overviews on how consumer protection laws evolved, to case studies that explore and interpret the agency’s work and policies. In addition, the visitor will find links to key related web sites as well as citations to valuable sources to help understand the history of FDA.


Several people gathered around a table examining items. FDA Inspector William Ford is at the center of activity in dealing with the 1937 flooding of the Ohio River and its impact on regulated commodities. Photo credit: fda.gov


Images from FDA History

The FDA History Office has mounted a series of 200 posters around the headquarters campus in Silver Spring, Maryland, illustrating the evolution of FDA’s work to protect and promote the public health. These include posters from public health campaigns, images of FDA inspectors, analysts, and others at work, and the commodities the agency regulates. These photos are also available for public access on FDA’s Flickr photo-stream.


Click here to view FDA photos with captions that capture the history of this important agency.


The following is a statement from FDA about crops impacted by Hurricanes Harvey and Irma and FDA’s work with farmers affected by the storms.


September 14, 2017




This is the first time that two category 4 storms have hit the U.S. back-to-back, and the effects have been devastating. At FDA we have a large team working on providing assistance to those affected by these storms, including American farmers who have suffered crop losses. You’ll be hearing a lot from us in the coming weeks, as we do our part to help people continue to recover from these tragic events. Today, we’re providing more information for farmers and food producers who’ve been impacted by these storms, and in particular, the proper handling of crops that have been exposed to floodwaters.


The FDA has longstanding experience responding to flooding and storms. We play an integral role, working with states, in protecting the safety of the food supply – both human and animal food. We recognize that these hurricanes have presented unique challenges for farmers, and the FDA is committed to work with growers, as well as with our federal and state partners, to ensure that the food we serve our families is safe and that consumers have confidence in the products they consume.


We’ve been in close discussion with farmers, consumer representatives, and state officials regarding concerns about how crops may be impacted by these storms. One crop for which there has been a high number of inquiries is rice. This owes, in particular, to the impact of Hurricane Harvey on the large rice crop in Texas. I want to make it clear that the FDA has not issued a ban on rice or any other food crops. Rice grown in normal conditions and rice that has not been exposed to contaminated floodwaters from the recent hurricanes may enter commerce. Also, rice and other crops that were harvested and stored safely before the storms hit should not be considered impacted by these events. The documents we’re issuing today, as well as the direct consultations we’re continuing to have with state officials and with farmers directly, are aimed at providing our most up-to-date, science-based information on which crops can enter commerce without creating risks to consumers or to animals that may be fed these crops as part of animal feed.


However, we recognize that crops have been and will continue to be impacted in a variety of ways by these storms. There have been substantial crop losses from both storms. Crops may be submerged in flood water, exposed to contaminants, or susceptible to mold. Some of the major concerns for crop safety are heavy metals, chemical, bacterial, and mold contamination. In many cases, it is challenging to determine what contaminants are in crops that were submerged by floodwaters. Both human and animal food must meet well-established safety requirements. FDA has experts that are working closely with state regulators and directly with producers to address questions and concerns.


The FDA takes seriously our obligation to provide guidance to support farmers and food producers, who are responsible for the safety of their products. Many of these resources are already available on FDA’s website. Others will be revised in the coming days and issued directly by the agency, as part of our ongoing effort to provide more timely advice for our stakeholders.


The FDA staff is continuing to work with USDA, state partners, extension services and other stakeholders to help producers as they work to evaluate the safety of their crops. We recognize that in many cases, it is those on the ground who can best advise farmers and help producers evaluate specific concerns and conditions. We have experts in the affected regions who can help provide direct assistance and we are taking additional steps to support recovery efforts. We also understand that state Departments of Agriculture may have specific requirements regarding any attempt to clean, process, test, use or sell crops for human or animal food.


FDA scientists recently had the opportunity to tour farms and packing facilities in Georgia. That trip reminded us that farms are different from the other entities FDA regulates. Farms are not just places of business. Many are homes. Many farms have been in families for generations. As a result, the impact of floods on farms and farmers is especially concerning to FDA. It has hit many farmers hard, destroying their homes and their livelihoods. FDA is leaning forward in our efforts to make sure that we’re providing timely assistance, and that our advice on crop safety reflects our most up-to-date, science-based analysis. Our primary mission is the protection and promotion of the public health. We’re committed to making sure food is safe for consumers. But we recognize there are hard questions that must be quickly answered about crops affected by these storms, or else crops that might be safe — because they were not exposed to contaminated floodwaters — could age past their point of use. We recognize the tremendous impact these storms had on the region’s farming families. We’re working diligently to provide them with timely guidance. FDA is committed to doing its part to help farmers get back to work.


More detailed information on the impacts of flooding on human and animal crop uses can be found on the FDA website. Also available is general information on evaluating the safety of food and animal food crops exposed to flood waters. In addition, you can find Q & A on crops harvested from flooded fields intended for animal food.

The FDA is an agency within the U.S. Department of Health and Human Services that protects the public health by assuring the safety, effectiveness, and security of human and veterinary drugs, vaccines and other biological products for human use, and medical devices. The agency is also responsible for the safety and security of our nation’s food supply, cosmetics, dietary supplements, products that give off electronic radiation, and for regulating tobacco products.


FDA White Oak Campus in Silver Spring, Maryland. Photo credit: FDA.gov


Sources: https://www.fda.gov/aboutfda/whatwedo/history/; Wikipedia

