History of Adrenoleukodystrophy

A 1993 film, "Lorenzo's Oil," tells the story of a young boy, Lorenzo Odone, who had adrenoleukodystrophy (ALD). His parents developed an oil, still used today in the diets of people with this terrible disease, now known as Lorenzo's Oil in his honor. Diagnosed with ALD at age 5 in 1984, Lorenzo died in 2008 at the age of 30. Source: http://wiki.ggc.usg.edu/wiki/Adrenoleukodystrophy


Timeline of the History of ALD – ALD Database

August 23, 2017

Marc Engelen, M.D., Ph.D. and Stephan Kemp, Ph.D.


1910: In retrospect, Haberfeld and Spieler presented the first clinical description of a patient with X-linked adrenoleukodystrophy (Haberfeld and Spieler, 1910). A previously healthy 6-year-old boy developed deeply bronzed skin (hyperpigmentation) and impaired visual acuity, and his school performance deteriorated. Over the following months, the boy became incontinent, lost his ability to speak and developed spastic tetraparesis, which eventually progressed to an inability to walk. He was hospitalized at the age of 7 and died 8 months later. An older brother had died of a similar illness at the age of 8. Postmortem histological examination of the brain revealed extensive changes in brain white matter, combined with perivascular accumulation of lymphocytes and plasma cells in the nervous system, indicating an inflammatory response.

1922: Siemerling and Creutzfeldt reported the case of a boy with a similar disease progression, including the dark skin and the neuropathological findings described by Haberfeld and Spieler in 1910, with the important addition that atrophy of the adrenal cortex was documented.

1963: By now nine comparable cases had been reported. The fact that all patients were males suggested X-linked recessive inheritance (Fanconi et al. 1963).

1970: The name adrenoleukodystrophy was introduced based on the striking association of a leukodystrophy with primary adrenocortical (adrenal) insufficiency (Blaw, 1970).

1972: The key to all subsequent knowledge about the disease was the observation made by Powers, Schaumburg, and Johnson that adrenal cells of ALD patients contained characteristic lipid inclusions (fat droplets), followed by the demonstration that these fat droplets consisted of cholesterol esters that contained a striking and characteristic excess of very long-chain fatty acids (VLCFA).

1976: A more slowly progressive adult form of the disease characterized by adrenal insufficiency, myelopathy and peripheral neuropathy was described (Budka et al. 1976). A year later, five more cases were reported by Griffin et al., who proposed that this clinical presentation of ALD be named adrenomyeloneuropathy (AMN) (Griffin et al. 1977; Schaumburg et al. 1977).

1981: The identification of VLCFA as a biomarker for ALD led to the development of a diagnostic test for ALD based on the demonstration of elevated levels of VLCFA in cultured skin cells (fibroblasts), plasma, red blood cells and amniocytes (Moser et al. 1981). These tests have permitted precise postnatal and prenatal diagnosis. Metabolic studies demonstrated that VLCFA are metabolized (through beta-oxidation) exclusively in subcellular organelles called peroxisomes and this oxidation of VLCFA is reduced in fibroblasts from ALD patients (Singh et al 1981). Therefore, ALD is a peroxisomal disease.

1981: The ALD locus was mapped to the terminal segment of the long arm of the X-chromosome, Xq28 (Migeon et al. 1981).

1982: The first bone-marrow transplantation (BMT) was performed in a boy with cerebral ALD. An allogeneic BMT from a healthy HLA-identical sibling donor was performed in a 13-year-old boy with rapidly progressive ALD. Engraftment and complete hematologic recovery occurred within 4 weeks. Ten days after BMT, white blood cell VLCFA levels and enzyme activity became normal; after 3 months, there was a progressive reduction of plasma VLCFA to levels only slightly above normal. Nevertheless, neurologic deterioration continued, and the patient died of an adenovirus infection 141 days after BMT.

1986: Rizzo et al. demonstrated that the addition of oleic acid (C18:1) to the tissue culture medium normalizes the levels of saturated VLCFA in cultured skin fibroblasts from ALD patients. These findings formed the basis for the development of Lorenzo's oil. Treatment of ALD patients with Lorenzo's oil normalizes plasma VLCFA levels within 4 weeks (Moser et al. 1987). In several open-label trials, however, Lorenzo's oil failed to improve neurological or endocrine function and did not arrest the progression of the disease. Unfortunately, the clinical efficacy of Lorenzo's oil has never been evaluated in a proper placebo-controlled clinical trial. In 2001, Prof. Hugo Moser wrote: "It is our view that Lorenzo's oil therapy is not warranted in most patients who already have neurologic symptoms. The clinical benefit of Lorenzo's oil is limited at best."

1990: The team of Prof. Patrick Aubourg reported the first successful bone-marrow transplantation (BMT) (Aubourg et al. 1990). They had transplanted an 8-year-old boy with mild neurological, neuropsychological and MRI abnormalities. His unaffected non-identical twin was the donor. The patient recovered completely, and the neurological, neuropsychological and MRI abnormalities disappeared. When conducted at the earliest stage of cerebral demyelination, bone-marrow or hematopoietic stem cell transplantation (HSCT) can stabilize or even reverse cerebral demyelination in boys or adolescents with ALD.

1993: A team led by Drs. Mandel and Aubourg identified the putative gene for ALD (ABCD1) using positional cloning strategies (Mosser et al. 1993). The identification of the ALD gene enabled the detection of disease causing mutations, prenatal diagnosis and accurate carrier testing.

1997: Three laboratories reported the generation of a mouse model for ALD (Forss-Petter et al. 1997; Kobayashi et al. 1997; Lu et al. 1997). While the ALD mouse exhibits the same biochemical abnormalities as observed in patients, the mouse does not develop ALD (Pujol et al. 2002).

1999: The ALD database was created by Hugo Moser and Stephan Kemp. Initially it served only as a registry for mutations identified in the ABCD1 gene, but soon thereafter it was expanded to provide information on many aspects of ALD.

2001: It was established that ALD affects all ethnic groups and that it is the most common peroxisomal disorder, with an estimated incidence of 1:17,000 (males and females combined) (Bezman et al. 2001). This makes ALD the most common inherited leukodystrophy.

2005: Biochemically, ALD is not only characterized by a defect in the breakdown of VLCFA in peroxisomes, but there is also an increase in the subsequent chain-elongation of VLCFA (Kemp et al. 2005).

2006: The team led by Dr. Ann Moser developed a high-throughput VLCFA analysis method (with C26:0-lysoPC as the diagnostic metabolite) to be used on dried blood spots (Hubbard et al. 2006). These advancements in VLCFA screening will allow the addition of ALD to newborn screening programs.

2009: The team led by Drs. Cartier and Aubourg reported the successful treatment of two 7-year-old boys with early signs of cerebral ALD using gene therapy (Cartier et al. 2009). Brain MRI scans and cognitive tests showed that progression of the cerebral disease stopped 14-16 months post-treatment. This is comparable with the clinical outcome of HSCT.

2010: The research team of Dr. Stephan Kemp established that ALDP transports VLCFA across the peroxisomal membrane. A deficiency in ALDP has two major effects: on the one hand, it impairs peroxisomal degradation of VLCFA; on the other, it raises cytosolic levels of VLCFA. These VLCFA are then further elongated to even longer fatty acids by ELOVL1, the human C26-specific elongase (Ofman et al. 2010).

2014: In the United States, New York State started newborn screening for ALD (Vogel et al. 2015). Early diagnosis of ALD is the key to saving lives, because newborn screening allows prospective monitoring and early intervention.

2015: In the US, Connecticut initiated ALD newborn screening. In Europe, the Netherlands expanded its national newborn screening program from 17 to 31 conditions, including ALD.

2016: On February 16, ALD was added to the United States Recommended Uniform Screening Panel (RUSP). In the US, California initiated ALD newborn screening. Since then other states and countries have started newborn screening programs, or have initiated processes intended to add ALD to their existing newborn screening program. Detailed and up-to-date information on ALD newborn screening can be found at the newborn screening page.


The Nobel Prize in Physiology or Medicine 2017


The distinguished award goes to Jeffrey C. Hall, Michael Rosbash, and Michael W. Young for their discoveries of molecular mechanisms controlling the circadian rhythm.

Jeffrey Hall, Michael Rosbash, and Michael W. Young

Photo credits: Michael Rosbash, Howard Hughes Medical Institute; Jeffrey Hall and Michael Young, Wikipedia


Life on Earth is adapted to the rotation of our planet. For many years we have known that living organisms, including humans, have an internal, biological clock that helps them anticipate and adapt to the regular rhythm of the day. But how does this clock actually work? Jeffrey C. Hall, Michael Rosbash and Michael W. Young were able to peek inside our biological clock and elucidate its inner workings. Their discoveries explain how plants, animals and humans adapt their biological rhythm so that it is synchronized with the Earth’s revolutions.


Nobel winner Jeffrey C. Hall was born in 1945 in New York, USA. He received his doctoral degree in 1971 at the University of Washington in Seattle and was a postdoctoral fellow at the California Institute of Technology in Pasadena from 1971 to 1973. He joined the faculty at Brandeis University in Waltham in 1974. In 2002, he became associated with the University of Maine.


Nobel winner Michael Rosbash was born in 1944 in Kansas City, USA. He received his doctoral degree in 1970 at the Massachusetts Institute of Technology in Cambridge. During the following three years, he was a postdoctoral fellow at the University of Edinburgh in Scotland. Since 1974, he has been on the faculty at Brandeis University in Waltham, USA.


Nobel winner Michael W. Young was born in 1949 in Miami, USA. He received his doctoral degree at the University of Texas in Austin in 1975. Between 1975 and 1977, he was a postdoctoral fellow at Stanford University in Palo Alto. Since 1978, he has been on the faculty at the Rockefeller University in New York.


The earliest recorded account of a circadian process dates from the 4th century BCE, when Androsthenes, a ship captain serving under Alexander the Great, described diurnal leaf movements of the tamarind tree. The observation of a circadian or diurnal process in humans is mentioned in Chinese medical texts dated to around the 13th century, including the Noon and Midnight Manual and the Mnemonic Rhyme to Aid in the Selection of Acu-points According to the Diurnal Cycle, the Day of the Month and the Season of the Year. The first recorded observation of an endogenous circadian oscillation was by the French scientist Jean-Jacques d’Ortous de Mairan in 1729. He noted that 24-hour patterns in the movement of the leaves of the plant Mimosa pudica continued even when the plants were kept in constant darkness, in the first experiment to attempt to distinguish an endogenous clock from responses to daily stimuli. In 1896, Patrick and Gilbert observed that during a prolonged period of sleep deprivation, sleepiness increases and decreases with a period of approximately 24 hours. In 1918, J.S. Szymanski showed that animals are capable of maintaining 24-hour activity patterns in the absence of external cues such as light and changes in temperature.


In the early 20th century, circadian rhythms were noticed in the rhythmic feeding times of bees. Extensive experiments were done by Auguste Forel, Ingeborg Beling, and Oskar Wahl to see whether this rhythm was due to an endogenous clock. Ron Konopka and Seymour Benzer isolated the first clock mutant in Drosophila in the early 1970s and mapped the “period“ gene, the first discovered genetic determinant of behavioral rhythmicity. Joseph Takahashi discovered the first mammalian circadian clock mutation (clock delta19) using mice in 1994. However, recent studies show that deletion of clock does not lead to a behavioral phenotype (the animals still have normal circadian rhythms), which questions its importance in rhythm generation.


The term circadian was coined by Franz Halberg in the 1950s.


Using fruit flies as a model organism, this year’s Nobel laureates isolated a gene that controls the normal daily biological rhythm. They showed that this gene encodes a protein that accumulates in the cell during the night, and is then degraded during the day. Subsequently, they identified additional protein components of this machinery, exposing the mechanism governing the self-sustaining clockwork inside the cell. We now recognize that biological clocks function by the same principles in cells of other multicellular organisms, including humans. With precision, our inner clock adapts our physiology to the dramatically different phases of the day. The clock regulates critical functions such as behavior, hormone levels, sleep, body temperature and metabolism. Our well-being is affected when there is a temporary mismatch between our external environment and this internal biological clock, for example when we travel across several time zones and experience “jet lag.“ There are also indications that chronic misalignment between our lifestyle and the rhythm dictated by our inner timekeeper is associated with increased risk for various diseases.


Most living organisms anticipate and adapt to daily changes in the environment. During the 18th century, the astronomer Jean Jacques d’Ortous de Mairan studied mimosa plants, and found that the leaves opened towards the sun during daytime and closed at dusk. He wondered what would happen if the plant was placed in constant darkness. He found that independent of daily sunlight the leaves continued to follow their normal daily oscillation. Plants seemed to have their own biological clock. Other researchers found that not only plants, but also animals and humans, have a biological clock that helps to prepare our physiology for the fluctuations of the day. This regular adaptation is referred to as the circadian rhythm, originating from the Latin words circa meaning “around“ and dies meaning “day“. But just how our internal circadian biological clock worked remained a mystery.


During the 1970s, Seymour Benzer and his student Ronald Konopka asked whether it would be possible to identify genes that control the circadian rhythm in fruit flies. They demonstrated that mutations in an unknown gene disrupted the circadian clock of flies. They named this gene period. But how could this gene influence the circadian rhythm?


This year's Nobel Laureates, who were also studying fruit flies, aimed to discover how the clock actually works. In 1984, Jeffrey Hall and Michael Rosbash, working in close collaboration at Brandeis University, and Michael Young at the Rockefeller University in New York, succeeded in isolating the period gene. Hall and Rosbash then went on to discover that PER, the protein encoded by period, accumulated during the night and was degraded during the day. Thus, PER protein levels oscillate over a 24-hour cycle, in synchrony with the circadian rhythm.

The next key goal was to understand how such circadian oscillations could be generated and sustained. Hall and Rosbash hypothesized that the PER protein blocked the activity of the period gene. They reasoned that by an inhibitory feedback loop, PER protein could prevent its own synthesis and thereby regulate its own level in a continuous, cyclic rhythm. The model was tantalizing, but a few pieces of the puzzle were missing. To block the activity of the period gene, PER protein, which is produced in the cytoplasm, would have to reach the cell nucleus, where the genetic material is located. Hall and Rosbash had shown that PER protein builds up in the nucleus during the night, but how did it get there?

In 1994, Michael Young discovered a second clock gene, timeless, encoding the TIM protein that was required for a normal circadian rhythm. In elegant work, he showed that when TIM bound to PER, the two proteins were able to enter the cell nucleus, where they blocked period gene activity to close the inhibitory feedback loop. Such a regulatory feedback mechanism explained how this oscillation of cellular protein levels emerged, but questions lingered. What controlled the frequency of the oscillations? Michael Young identified yet another gene, double-time, encoding the DBT protein that delayed the accumulation of the PER protein. This provided insight into how an oscillation is adjusted to more closely match a 24-hour cycle.

These paradigm-shifting discoveries established key mechanistic principles for the biological clock. During the following years, other molecular components of the clockwork mechanism were elucidated, explaining its stability and function. For example, this year's laureates identified additional proteins required for the activation of the period gene, as well as for the mechanism by which light can synchronize the clock. The biological clock is involved in many aspects of our complex physiology. We now know that all multicellular organisms, including humans, utilize a similar mechanism to control circadian rhythms. A large proportion of our genes are regulated by the biological clock and, consequently, a carefully calibrated circadian rhythm adapts our physiology to the different phases of the day. Since the seminal discoveries by the three laureates, circadian biology has developed into a vast and highly dynamic research field, with implications for our health and wellbeing.
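The inhibitory feedback loop described above lends itself to a tiny simulation. The sketch below is an illustrative toy model, not from the article, and every parameter value is hypothetical: a single protein level P represses its own production after a delay tau, which stands in for the hours consumed by transcription, translation and nuclear entry. With strong enough repression and a long enough delay, the level never settles at a steady state and instead cycles, just as PER does.

```python
# Toy delayed negative-feedback oscillator (illustrative only; parameters
# are hypothetical and not taken from the biology described above).
# dP/dt = beta / (1 + P(t - tau)**n) - gamma * P(t), integrated by Euler steps.

def simulate(beta=1.0, gamma=0.1, n=10, tau=10.0, dt=0.1, t_end=500.0):
    """Return a list of (time, P) samples for the delayed-repression model."""
    steps = int(t_end / dt)
    delay = int(tau / dt)
    # history[0] holds P(t - tau); history[-1] holds the current level P(t).
    history = [0.1] * (delay + 1)
    trace = []
    for step in range(steps):
        # Production is repressed by the protein level tau hours ago.
        production = beta / (1.0 + history[0] ** n)
        p_next = history[-1] + dt * (production - gamma * history[-1])
        history.pop(0)
        history.append(p_next)
        trace.append(((step + 1) * dt, p_next))
    return trace

trace = simulate()
# Discard the initial transient and look at the sustained behavior.
late = [p for t, p in trace if t > 300.0]
print(f"late-time range of P: {min(late):.2f} to {max(late):.2f}")
```

Shortening the delay tau or speeding up the degradation rate gamma shortens the cycle, which is qualitatively the knob that the double-time/DBT mechanism tunes by delaying PER accumulation. (A collections.deque would make the history buffer more efficient than list.pop(0); a plain list keeps the sketch minimal.)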


The circadian clock anticipates and adapts our physiology to the different phases of the day. Our biological clock helps to regulate sleep patterns, feeding behavior, hormone release, blood pressure, and body temperature.


Read more about Michael Rosbash

Read more about Jeffrey Hall

Read more about Michael Young

Excellent article: New Yorker


Sources: Nobel Foundation: “2017 Nobel Prize in Physiology or Medicine: Molecular mechanisms controlling the circadian rhythm;“ ScienceDaily, 2 October 2017; Wikipedia


Leon Fleisher, Child Prodigy, Struggled to Recover from Focal Dystonia

Fleisher in 1963. Photo credit: Seattle Symphony Orchestra, where he was one of their featured artists for the season; photographer: Bender. Public Domain, https://commons.wikimedia.org/w/index.php?curid=18488895


On July 23, 1928, Leon Fleisher was born in San Francisco into a poor Jewish family from Eastern Europe. His father’s business was hat-making, while his mother’s goal was to “make her son a great concert pianist“. Fleisher started studying the piano at age four, made his public debut at age eight, and played with the New York Philharmonic under Pierre Monteux at 16. Monteux famously called him “the pianistic find of the century.“ He became one of the few child prodigies to be accepted for study with Artur Schnabel and also studied with Maria Curcio. Fleisher was linked via Schnabel to a tradition that descended directly from Beethoven himself, handed down through Carl Czerny and Theodor Leschetizky.


“My mother was very ambitious for me and gave me a choice,“ said Fleisher. “Either I was to be the first Jewish President of the United States, or a great concert pianist. Whichever it was, I had to be perfect.“


In the 1950s, Fleisher signed an exclusive recording contract with Columbia Masterworks. He is particularly well known for his interpretations of the piano concerti of Brahms and Beethoven, which he recorded with George Szell and the Cleveland Orchestra. They also recorded Mozart’s Piano Concerto No. 25, the Grieg and Schumann piano concertos, Franck’s Symphonic Variations, and Rachmaninoff’s Rhapsody on a Theme of Paganini.


In 1964, Fleisher lost the use of his right hand, due to a condition that was eventually diagnosed as focal dystonia. At the age of 36, he could barely write his name. “I was preparing for the most important tour of my life when I had a minor accident. I cut my thumb on a piece of cheap garden furniture and required a couple of stitches. When I started practicing again, things didn’t feel quite right on my right side. My fourth and fifth fingers seemed to want to curl under. I practiced even harder, not listening to my body when, through pain, it warned me to stop. Things got progressively worse and in less than a year those two fingers were completely curved under, sticking into the palm of my hand. No way could I play the piano.“ It was as if his arm were a rope becoming unbraided, with creeping numbness in his fingers. Engagements were cancelled, recordings put on hold. “I was desolate,“ he says. “My life fell apart, and this mysterious debilitating condition destroyed my relationship with my second wife, striking deep into my family.“ Doctors were perplexed and could offer no medication or surgical repair to a condition that baffled them. Fleisher even considered suicide. “I grew a beard, wore my hair long and in a ponytail, and I got a Vespa scooter. I felt I had no purpose anymore; I was simply floundering.“


After a couple of years marking time, he realized that his connection was with music, not just with playing the piano with two hands. Out of a disastrous impediment, three new careers beckoned, the first as a left-handed concert pianist. “I thought about Paul Wittgenstein, the Austrian concert pianist whose right arm was shot off in the First World War. He commissioned works for the left hand from Richard Strauss, Korngold, Hindemith, Prokofiev, Ravel and Britten, so there was existing piano literature for pianists with no function in their right hands. And there was Brahms’ magnificent arrangement for left hand of Bach’s Chaconne for solo violin. Thank goodness it wasn’t my left hand that stopped working, since there are hardly any piano works for right hand alone. There are about 1,000 pieces for the left hand out there – most of them pretty bad – but Ravel’s Concerto for left hand, which I must have played over 1,000 times and have also conducted from the keyboard, is a masterpiece in its own right.“ “Secondly, I decided to pursue a musical career through conducting, moving from a sitting to a standing position. It felt so different to be on my feet in front of an orchestra but, worse, I immediately felt my ass to be 10 times its normal size, waving around in front of the audience.“


But it was in teaching that he found real happiness. “I became far better at explaining those elusive areas of expression and nuance that are so difficult to express in words.“ Indeed, his masterclasses, in which, as tutor, it is irrelevant whether you can use five or 10 fingers, are models of gently humorous correction and deeply-felt inspiration. He never gave up the idea of returning to two-handed repertoire. After leaving the concert platform in 1965, Fleisher tried every kind of medical, psychiatric and alternative treatment, from acupuncture and hypnosis to deep-tissue massage, Tiger Balm and others, including more than a few drams of Scotch. But, as a result of conducting and grasping the baton too tightly, he developed carpal tunnel syndrome. This weakness in the forearm and hand caused by pressure on a nerve in the wrist could be alleviated only through surgery. Fleisher agreed to have his wrist cut open with a knife, to the accompaniment, he remembers, of a recording of Mahler’s First Symphony. Astonishingly, the surgery for one ailment helped the other, and his fingers began to straighten out. “After 18 years, I was able to play again. In 1982, I was invited to open the new Meyerhoff concert hall in Baltimore and made the front page of The New York Times for being able to use both hands for the first time since 1965.“ But this supposed cure proved short-lived. “I knew things weren’t quite as they should be,“ said Fleisher. “I had to change the advertised program from Beethoven’s Fourth Piano Concerto to Franck’s Symphonic Variations. It didn’t feel to me like a triumphant return. I broke down in tears in the dressing-room before the concert and felt awful at having to go through an evening of pretense.“


For the remaining 12 of that series of "comeback" concerts, Fleisher reverted to left-hand repertoire. Only in 1995 was he finally diagnosed with a neurological disorder called focal dystonia. "It's a malady caused by the brain learning to do a wrong thing, and though no cure has been found, I am a dystonic for life. It's task-specific. Glass-blowers get it, computer workers can become afflicted and golfers begin to miss their putts." Fleisher thinks there could be 10,000 musicians around the world suffering from the condition and that the composer-pianist Robert Schumann may have been an early victim, causing permanent damage by mechanically exercising his troublesome fourth finger.


Fifteen years ago, Botox was still in its experimental stages. However, a small dose injected directly into the appropriate muscle, along with holistic massage therapy involving the connective tissues, restored Fleisher's fingers sufficiently for him to return to two-handed performances. A tiny amount of Botox relaxes the fingers without causing the paralysis that is evident when it is used to reduce facial wrinkles by immobilizing muscles. Crucially, there is no sign of any negative effects, such as a diminished quality of emotional experience. In 1995, Fleisher made a second comeback, quietly and without any hype, as he tested his stamina. Only after proving himself to himself did he feel ready to resume his career as a two-handed solo pianist. In 2005, he gave 40 concerts in 31 cities and the following year enjoyed success at New York's Carnegie Hall. The same two fingers on Fleisher's right hand still want to curl, but Botox injections every four months keep the condition under control.


When asked if he dances, Fleisher roars with laughter. “Wouldn’t that be a lovely idea?“ he exclaims. “I’m afraid my feet follow my hands. In fact, I have two left feet! It’s a deep regret, along with the fact that I am totally ungifted when it comes to jazz.“ According to his singer-songwriter son Julian, though, Fleisher does have something in common with great jazz players: the importance he places on rhythm. Fleisher feels rhythm as the heartbeat of music. “It regulates the metabolism of the piece, motivates the music and, if it’s infectious enough, makes us tap our toes.“


In 2004, Vanguard Classics released Leon Fleisher’s first “two-handed“ recording since the 1960s, entitled “Two Hands“, to critical acclaim. Two Hands is also the title of a short documentary on Fleisher by Nathaniel Kahn which was nominated for an Academy Award for best short subject on January 23, 2007. Fleisher received the 2007 Kennedy Center Honors. Kennedy Center Chairman Stephen A. Schwarzman described him as “a consummate musician whose career is a moving testament to the life-affirming power of art.“ Fleisher’s musical interests extend beyond the central German Classic-Romantic repertory. The American composer William Bolcom composed his Concerto for Two Pianos, Left Hand for Fleisher and his close friend Gary Graffman, who has also suffered from debilitating problems with his right hand. It received its first performance in Baltimore in April 1996. The concerto is so constructed that it can be performed in one of three ways, with either piano part alone with reduced orchestra, or with both piano parts and the two reduced orchestras combined into a full orchestra.


In 2004, Leon Fleisher played the world premiere of Paul Hindemith's Klaviermusik (Piano Concerto for the Left Hand), Op. 29, with the Berlin Philharmonic. This work was written in 1923 for Paul Wittgenstein, who disliked and refused to play it. However, he had sole performing rights and kept the score, not allowing any other pianists to play it. The manuscript was discovered among his papers after the death of his widow in 2002. On October 2, 2005, Fleisher played the American premiere of the work, with the San Francisco Symphony under Herbert Blomstedt. In 2012, at the invitation of Justice Ruth Bader Ginsburg, Fleisher performed at the Supreme Court of the United States. Fleisher has continued to be involved in music, both conducting and teaching at the Peabody Conservatory of Music, the Curtis Institute of Music, and the Royal Conservatory of Music in Toronto; he is also closely associated with the Tanglewood Music Center. With Dina Koston, he co-founded and co-directed the Theater Chamber Players from 1968 to 2003; it was the first resident chamber ensemble of the Smithsonian Institution and of The Kennedy Center. Among others, Fleisher has taught Jonathan Biss, Yefim Bronfman, Phillip Bush, Naida Cole, Jane Coop, Enrico Elisi, Enrique Graf, Helene Grimaud, Hao Huang, Kevin Kenner, Dina Koston, Louis Lortie, Wonny Song, Andre Watts, Jack Winerock, Daniel Wnukowski, Alon Goldstein, Dale Anthony and Orit Wolf.


His memoir, My Nine Lives, co-written with the Washington Post music critic Anne Midgette, appeared in November 2010.


History of FDA and Disaster Relief

FDA Building 31 (left photo) houses the Office of the Commissioner and the Office of Regulatory Affairs; the agency is part of the Department of Health and Human Services and consists of fourteen Centers and Offices. FDA Building 51 (right photo) houses the Center for Drug Evaluation and Research. The FDA campus is located at 10903 New Hampshire Ave., Silver Spring, MD 20993. Photo credits: The U.S. Food and Drug Administration – FDA Bldg 31 – Exterior, Public Domain; Wikipedia Commons


President Abraham Lincoln signed into law an act of Congress establishing the United States Department of Agriculture in 1862. The Act of Incorporation, signed by President Lincoln on March 3, 1863, created the National Academy of Sciences and named 50 charter members. Many of the original NAS members came from an informal network of mostly physical scientists, begun around 1850, working in the vicinity of Cambridge, Massachusetts. These two great scientific institutions paved the way for the Food and Drug Administration, which emerged over time from the USDA. Around the world, these U.S. agencies were hailed as a great step forward in government recognition of the role of science in American society, and the United States has long been a global leader in scientific solutions.


The Food and Drug Administration (FDA) is the oldest comprehensive consumer protection agency in the U. S. federal government. Its origins can be traced back to the appointment of Lewis Caleb Beck in the Patent Office around 1848 to carry out chemical analyses of agricultural products, a function that the newly created Department of Agriculture inherited in 1862. Although it was not known by its present name until 1930, FDA’s modern regulatory functions began with the passage of the 1906 Pure Food and Drugs Act, a law a quarter-century in the making that prohibited interstate commerce in adulterated and misbranded food and drugs. Harvey Washington Wiley, Chief Chemist of the Bureau of Chemistry in the Department of Agriculture, had been the driving force behind this law and headed its enforcement in the early years, providing basic elements of protection that consumers had never known before that time.


A rectangular box with a man battling a skeleton (left) and, on the right, the same image in the form of a stamp. Photo source: fda.gov


The U. S. Post Office recognized the 1906 Act as a landmark of the 20th century when it released this stamp, the design of which was based on a 19th century patent medicine trading card.


The FDA and its responsibilities have undergone a metamorphosis since 1906. So have the marketplace itself, the sciences undergirding the products the agency regulates, and the social, cultural, political, and economic context in which these developments have unfolded. Yet the core public health mission of the agency remains the same now as it was then. This web site features a variety of portals that offer insight into these changes, from overviews of how consumer protection laws evolved to case studies that explore and interpret the agency’s work and policies. In addition, the visitor will find links to key related web sites, as well as citations to valuable sources that help explain the history of FDA.


Several people gathered around a table examining items. FDA Inspector William Ford is at the center of activity in dealing with the 1937 flooding of the Ohio River and its impact on regulated commodities. Photo credit: fda.gov


Images from FDA History

The FDA History Office has mounted a series of 200 posters around the headquarters campus in Silver Spring, Maryland, illustrating the evolution of FDA’s work to protect and promote the public health. These include posters from public health campaigns, images of FDA inspectors, analysts, and others at work, and the commodities the agency regulates. These photos are also available to the public on FDA’s Flickr photo stream.


Click here to view captioned FDA photos that capture the history of this important agency.


The following is a statement from FDA about crops impacted by Hurricanes Harvey and Irma and FDA’s work with farmers affected by the storms.


September 14, 2017




This is the first time that two category 4 storms have hit the U.S. back-to-back, and the effects have been devastating. At FDA we have a large team working on providing assistance to those affected by these storms, including American farmers who have suffered crop losses. You’ll be hearing a lot from us in the coming weeks, as we do our part to help people continue to recover from these tragic events. Today, we’re providing more information for farmers and food producers who’ve been impacted by these storms, and in particular, the proper handling of crops that have been exposed to floodwaters.


The FDA has longstanding experience responding to flooding and storms. We play an integral role, working with states, in protecting the safety of the food supply – both human and animal food. We recognize that these hurricanes have presented unique challenges for farmers, and the FDA is committed to working with growers, as well as with our federal and state partners, to ensure that the food we serve our families is safe and that consumers have confidence in the products they consume.


We’ve been in close discussion with farmers, consumer representatives, and state officials regarding concerns about how crops may be impacted by these storms. One crop for which there have been a high number of inquiries is rice, owing in particular to the impact of Hurricane Harvey on the large rice crop in Texas. I want to make it clear that the FDA has not issued a ban on rice or any other food crop. Rice grown in normal conditions and rice that has not been exposed to contaminated floodwaters from the recent hurricanes may enter commerce. Also, rice and other crops that were harvested and stored safely before the storms hit should not be considered impacted by these events. The documents we’re issuing today, as well as the direct consultations we’re continuing to have with state officials and with farmers, are aimed at providing our most up-to-date, science-based information on which crops can enter commerce without creating risks to consumers or to the animals that may be fed those crops.


However, we recognize that crops have been and will continue to be impacted in a variety of ways by these storms. There have been substantial crop losses from both storms. Crops may be submerged in floodwater, exposed to contaminants, or susceptible to mold. Some of the major concerns for crop safety are heavy metal, chemical, bacterial, and mold contamination. In many cases, it is challenging to determine what contaminants are in crops that were submerged by floodwaters. Both human and animal food must meet well-established safety requirements. FDA has experts who are working closely with state regulators and directly with producers to address questions and concerns.


The FDA takes seriously our obligation to provide guidance to support farmers and food producers, who are responsible for the safety of their products. Many of these resources are already available on FDA’s website. Others will be revised in the coming days and issued directly by the agency, as part of our ongoing effort to provide more timely advice for our stakeholders.


The FDA staff is continuing to work with USDA, state partners, extension services and other stakeholders to help producers as they work to evaluate the safety of their crops. We recognize that in many cases, it is those on the ground who can best advise farmers and help producers evaluate specific concerns and conditions. We have experts in the affected regions who can help provide direct assistance and we are taking additional steps to support recovery efforts. We also understand that state Departments of Agriculture may have specific requirements regarding any attempt to clean, process, test, use or sell crops for human or animal food.


FDA scientists recently had the opportunity to tour farms and packing facilities in Georgia. That trip reminded us that farms are different from the other entities FDA regulates. Farms are not just places of business; many are homes, and many have been in families for generations. As a result, the impact of floods on farms and farmers is especially concerning to FDA. These storms have hit many farmers hard, destroying their homes and their livelihoods. FDA is leaning forward in our efforts to make sure that we’re providing timely assistance, and that our advice on crop safety reflects our most up-to-date, science-based analysis. Our primary mission is the protection and promotion of the public health. We’re committed to making sure food is safe for consumers. But we recognize there are hard questions that must be answered quickly about crops affected by these storms; otherwise, crops that might be safe, because they were not exposed to contaminated floodwaters, could age past their point of use. We recognize the tremendous impact these storms have had on the region’s farming families. We’re working diligently to provide them with timely guidance. FDA is committed to doing its part to help farmers get back to work.


More detailed information on the impacts of flooding on human and animal crop uses can be found on the FDA website. Also available is general information on evaluating the safety of food and animal food crops exposed to flood waters. In addition, you can find Q & A on crops harvested from flooded fields intended for animal food.

The FDA is an agency within the U.S. Department of Health and Human Services that protects the public health by assuring the safety, effectiveness, and security of human and veterinary drugs, vaccines and other biological products for human use, and medical devices. The agency is also responsible for the safety and security of our nation’s food supply, cosmetics, dietary supplements, and products that give off electronic radiation, and for regulating tobacco products.


FDA White Oak Campus in Silver Spring, Maryland. Photo credit: FDA.gov


Sources: https://www.fda.gov/aboutfda/whatwedo/history/; Wikipedia



From ancient Egyptian medical papyri, like the Edwin Smith Papyrus, above, we know that the symptoms of diabetes were recognized as early as 1552 BCE, including the observation that ants were attracted to the urine of people with this disease. Source: Wikipedia



Diabetes mellitus is one of the oldest known human diseases.


The first known mention of diabetes symptoms dates to 1552 BCE, when Hesy-Ra, an Egyptian physician, documented in a papyrus frequent urination as a symptom of a mysterious disease that also caused emaciation. Around the same time, ancient healers noted that ants seemed to be attracted to the urine of people who had this disease. The term “diabetes,” or “to pass through,” was first used in 230 BCE by the Greek Apollonius of Memphis. The disease was rare during the time of the Roman Empire; Galen commented that he had seen only two cases during his career. In 150 CE, the Greek physician Aretaeus described what we now call diabetes as “the melting down of flesh and limbs into urine.” Type 1 and type 2 diabetes were first identified as separate conditions by the Indian physicians Sushruta and Charaka around 400-500 CE, with type 1 associated with youth and type 2 with being overweight. The term “mellitus,” or “from honey,” was applied by the Briton John Rollo in the late 1700s to distinguish the condition from diabetes insipidus, which is also associated with frequent urination. From then on, physicians began to gain a better understanding of diabetes.


Centuries later, people known as “water tasters” diagnosed diabetes by tasting the urine of people suspected to have it. If the urine tasted sweet, diabetes was diagnosed. To acknowledge this feature, in 1675 the word “mellitus,” meaning honey, was added to the name “diabetes,” meaning siphon. It wasn’t until the 1800s that scientists developed chemical tests to detect the presence of sugar in the urine. As physicians learned more about diabetes, they began to understand how it could be managed. The first diabetes treatment involved prescribed exercise, often horseback riding, which was thought to relieve excessive urination. In the 1700s and 1800s, physicians began to realize that dietary changes could help manage diabetes, and they advised their patients to do things like eat only the fat and meat of animals or avoid large amounts of sugar. During the Franco-Prussian War of the early 1870s, the French physician Apollinaire Bouchardat noted that his diabetic patients’ symptoms improved due to war-related food rationing, and he developed individualized diets as diabetes treatments. This led to the fad diets of the early 1900s, which included the “oat-cure,” “potato therapy,” and the “starvation diet.”


In 1916, Boston scientist Elliott Joslin established himself as one of the world’s leading diabetes experts by creating the textbook “The Treatment of Diabetes Mellitus,” which reported that a fasting diet combined with regular exercise could significantly reduce the risk of death in diabetes patients. Today, doctors and diabetes educators still use these principles when teaching their patients about lifestyle changes for the management of diabetes. Despite these advances, before the discovery of insulin, diabetes inevitably led to premature death. The first big breakthrough that eventually led to the use of insulin to treat diabetes came in 1889, when Oskar Minkowski and Joseph von Mering, researchers at the University of Strasbourg, showed that the removal of a dog’s pancreas could induce diabetes. In the early 1900s, Georg Zuelzer, a German scientist, found that injecting pancreatic extract into patients could help control diabetes. Frederick Banting, a physician in Ontario, Canada, first had the idea to use insulin to treat diabetes in 1920, and he and his colleagues began trying out his theory in animal experiments. Banting and his team finally used insulin to successfully treat a diabetic patient in 1922 and were awarded the Nobel Prize in Physiology or Medicine the following year.


Frederick Banting in 1938: Photo credit: Arthur Goss – Library and Archives of Canada – PA-123481, Public Domain, https://commons.wikimedia.org/w/index.php?curid=468141


A “Flame of Hope” was lit by Her Majesty Queen Elizabeth the Queen Mother in 1989 as a tribute to Dr. Frederick Banting and all the people who have lost their lives to diabetes. The flame will remain lit until there is a cure for diabetes, at which point it will be extinguished by the researchers who discover the cure. The flame is located at Sir Frederick Banting Square in London, Ontario, Canada, beside the Banting House National Historic Site of Canada.


Best and Banting in 1924.  Photo credit: University of Toronto, Public Domain, https://commons.wikimedia.org/w/index.php?curid=5956010


A time capsule was buried in the Sir Frederick Banting Square in 1991 to honor the 100th anniversary of Sir Frederick Banting’s birth. It was buried by the International Diabetes Federation Youth Representatives and Governor General of Canada Ray Hnatyshyn. It will be exhumed if a cure for diabetes is found. Prior to the award of the Nobel Prize in Physiology or Medicine for 1923, which Banting shared with Macleod, he received the Reeve Prize of the University of Toronto (1922). In 1928 Banting gave the Cameron Lecture in Edinburgh. He was a member of numerous medical academies and societies in Canada and abroad, including the British and American Physiological Societies, and the American Pharmacological Society. In 1934 he was knighted as a Knight Commander of the Order of the British Empire (KBE) and became an active Vice-President of the Diabetic Association (now Diabetes UK). In May, 1935 he was elected a Fellow of the Royal Society. In 2004, Banting was inducted into the National Inventors Hall of Fame.


Sir Frederick Grant Banting


Sir Frederick Grant Banting, KBE MC FRS FRSC (November 14, 1891 – February 21, 1941) was a Canadian medical scientist, physician, painter, and Nobel laureate noted as the co-discoverer of insulin and its therapeutic potential. In 1923 Banting and John James Rickard Macleod received the Nobel Prize in Physiology or Medicine. Banting shared the award money with his colleague, Dr. Charles Best. As of November 2016, Banting, who received the Nobel Prize at age 32, remains the youngest Nobel laureate in Physiology or Medicine. In 1923 the Government of Canada granted Banting a lifetime annuity to continue his work. In 1934 he was knighted by King George V.


Frederick Banting was born on November 14, 1891, in a farmhouse near Alliston, Ontario. The youngest of five children of William Thompson Banting and Margaret Grant, Banting attended public high school in Alliston. In 1910, he started at Victoria College, part of the University of Toronto, in the General Arts program. After failing his first year, he petitioned to join the medical program and was accepted, beginning medical school in September 1912. In 1914, he attempted to enter the army on August 5, and again in October, but was refused due to poor eyesight. Banting successfully joined the army in 1915 and spent the summer training before returning to school. His class was fast-tracked to get more doctors into the war, and so he graduated in December 1916 and reported for military duty the next day. He was wounded at the Battle of Cambrai in 1918. Despite his injuries, he helped other wounded men for sixteen hours, until another doctor told him to stop. He was awarded the Military Cross in 1919 for heroism.


Banting returned to Canada after the war and went to Toronto to complete his surgical training. He studied orthopedic medicine and, in 1919-1920, was Resident Surgeon at The Hospital for Sick Children. Banting was unable to gain a place on the hospital staff, so he decided to move to London, Ontario to set up a medical practice. From July 1920 to May 1921, he continued his general practice while teaching orthopedics and anthropology part-time at the University of Western Ontario in London, because his medical practice had not been particularly successful. From 1921 to 1922 he lectured in pharmacology at the University of Toronto. He received his M.D. degree in 1922, and was also awarded a gold medal. An article he read about the pancreas piqued Banting’s interest in diabetes. Banting had to give a talk on the pancreas to one of his classes at the University of Western Ontario on November 1, 1920, and as a result read reports that other scientists had written. Research by German pathologist Bernhard Naunyn, Oskar Minkowski, American physician and pathologist Eugene Lindsay Opie, English physiologist Edward Albert Sharpey-Schafer, and others suggested that diabetes resulted from a lack of a protein hormone secreted by the islets of Langerhans in the pancreas. Schafer had named this putative hormone “insulin.” Insulin was thought to control the metabolism of sugar; its lack led to an increase of sugar in the blood, which was then excreted in urine. Attempts to extract insulin from ground-up pancreas cells were unsuccessful, likely because the insulin was destroyed by the proteolytic enzymes of the pancreas. The challenge was to find a way to extract insulin from the pancreas before it was destroyed.


Moses Barron published an article in 1920 which described experimental closure of the pancreatic duct by ligature; this further influenced Banting’s thinking. The procedure caused deterioration of the cells of the pancreas that secrete trypsin, the enzyme which breaks down insulin, but left the islets of Langerhans intact. Banting realized that once the trypsin-secreting cells had died, insulin could be extracted from the islets of Langerhans. Banting discussed this approach with J. J. R. Macleod, Professor of Physiology at the University of Toronto. Macleod provided experimental facilities and the assistance of one of his students, Dr. Charles Best. Banting and Best, with the assistance of biochemist James Collip, began the production of insulin by this means. As the experiments proceeded, the required quantities could no longer be obtained by performing surgery on living dogs. On November 16, 1921, Banting hit upon the idea of obtaining insulin from the fetal pancreas. He removed the pancreases from fetal calves at a William Davies slaughterhouse and found the extracts to be just as potent as those obtained from dog pancreases. Pork and beef would remain the primary commercial sources of insulin until they were replaced by genetically engineered bacteria in the late 20th century. In the spring of 1922, Banting established a private practice in Toronto and began to treat diabetic patients, including Elizabeth Hughes Gossett, daughter of then U.S. Secretary of State Charles Evans Hughes.


Banting and Macleod were jointly awarded the 1923 Nobel Prize in Physiology or Medicine. Banting split his half of the Prize money with Best, and Macleod split the other half with James Collip. Banting was appointed Senior Demonstrator in Medicine at the University of Toronto in 1922. The following year he was elected to the new Banting and Best Chair of Medical Research, endowed by the Legislature of the Province of Ontario. He also served as Honorary Consulting Physician to the Toronto General, the Hospital for Sick Children, and the Toronto Western Hospital. At the Banting and Best Institute, he focused his research on silicosis, cancer, and the mechanisms of drowning. In 1938, Banting’s interest in aviation medicine resulted in his participation with the Royal Canadian Air Force (RCAF) in research concerning the physiological problems encountered by pilots operating high-altitude combat aircraft. Banting headed the RCAF’s Number 1 Clinical Investigation Unit (CIU), which was housed in a secret facility on the grounds of the former Eglinton Hunt Club in Toronto. During the Second World War he investigated the problems of aviators, such as “blackout” (syncope). He also helped Wilbur Franks with the invention of the G-suit to stop pilots from blacking out when they were subjected to g-forces while turning or diving. Another of Banting’s projects during the Second World War involved mustard gas burns and their treatment. Banting even tested the gas and antidotes on himself to see if they were effective.


Banting developed an interest in painting beginning around 1921 while he was in London, Ontario. Some of his first pieces were done on the back of the cardboard in which his shirts were packed by the dry-cleaners. He became friends with The Group of Seven artists A. Y. Jackson and Lawren Harris, sharing their love of the rugged Canadian landscape. In 1927 he made a sketching trip with Jackson to the St. Lawrence River in Quebec. Later that year they traveled to RCMP outposts in the Arctic on the Canadian Government supply ship Beothic. The sketches, done both in oils on birch panels and in pen and ink, were named after the places he visited: Craig Harbour, Ellesmere Island; Pond Inlet, Bylot Island; Eskimo tents at Etach; others were untitled. Jackson and Banting also made painting expeditions to Great Slave Lake, Walsh Lake (Northwest Territories), Georgian Bay, French River and the Sudbury District.


Banting married twice. His first marriage was to Marion Robertson in 1924; they had one child, William (born 1928). They divorced in 1932, and Banting married Henrietta Ball in 1937. Four years later, in February 1941, Banting died of wounds and exposure following the crash of a Lockheed L-14 Super Electra/Hudson, in which he was a passenger, in Musgrave Harbor, Newfoundland. After departing from Gander, Newfoundland, both of the plane’s engines failed. The navigator and co-pilot died instantly, but Banting and the pilot, Captain Joseph Mackey, survived the initial impact. According to Mackey, the sole survivor, Banting died from his injuries the next day. Banting had been en route to England to conduct operational tests on the Franks flying suit developed by his colleague Wilbur Franks. Banting and his wife are buried at Mount Pleasant Cemetery in Toronto.


In 1994 Banting was inducted into the Canadian Medical Hall of Fame. In 2004, he was nominated as one of the top 10 “Greatest Canadians“ by viewers of the Canadian Broadcasting Corporation. When the final votes were counted, Banting finished fourth behind Tommy Douglas, Terry Fox and Pierre Trudeau.


Banting’s namesake, the Banting Research Foundation, was created in 1925 and provides funding to support health and biomedical research in Canada. Banting’s name is immortalized in the yearly Banting Lectures, given by an expert in diabetes, and by the creation of the Banting and Best Department of Medical Research of the University of Toronto; the Sir Frederick G Banting Research Centre located on Sir Frederick Banting Driveway in the Tunney’s Pasture complex, Ottawa, ON; Banting Memorial High School in Alliston, ON; Sir Frederick Banting Secondary School in London, ON; Sir Frederick Banting Alternative Program Site in Ottawa, ON; Frederick Banting Elementary School in Montréal-Nord, QC; and École Banting Middle School in Coquitlam, BC. The “Major Sir Frederick Banting, MC, RCAMC Award for Military Health Research,” sponsored by the True Patriot Love Foundation, is awarded annually by the Surgeon General to the researcher whose work presented at the annual Military and Veterans Health Research Forum is deemed to contribute most to military health. It was first awarded in 2011 in the presence of several Banting descendants. The “Canadian Forces Major Sir Frederick Banting Term Chair in Military Trauma Research” at Sunnybrook Health Sciences Centre was established in 2012. The first Chair holder is Colonel Homer Tien, Medical Director of Sunnybrook’s Tory Regional Trauma Centre and Senior Specialist and Trauma Adviser to the Surgeon General. The Banting Postdoctoral Fellowship Program is administered by the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada, and the Social Sciences and Humanities Research Council of Canada. The fellowship provides up to two years of funding at $70,000 per year to researchers in health, natural sciences, engineering, social sciences and humanities.


Banting House, his former home located in London, Ontario, was declared a National Historic Site of Canada in 1997. The Banting Interpretation Centre in Musgrave Harbor, Newfoundland and Labrador is a museum named after him which focuses on the circumstances surrounding the 1941 plane crash which claimed his life. The crater Banting on the Moon is also named after him for his contributions to medicine. During the voting for “Greatest Canadians“ in late 2003, controversy arose over the future use of the Banting family farm in New Tecumseth which had been left to the Ontario Historical Society by Banting’s late nephew, Edward, in 1998. The dispute centered on the future use of the 40 ha (100 acre) property and its buildings. In a year-long negotiation, assisted by a provincially appointed facilitator, the Town of New Tecumseth offered $1 million to the Ontario Historical Society (OHS). The town intended to turn the property over to the Sir Frederick Banting Legacy Foundation for preservation of the property and buildings, and the Legacy Foundation planned to erect a Camp for Diabetic Youths. The day after the November 22, 2006, deadline for the OHS to sign the agreement, the OHS announced that it had sold the property for housing development to Solmar Development for more than $2 million. The Town of New Tecumseth announced it would designate the property under the Ontario Heritage Act. This would prevent its commercial development and obligate the owner to maintain it properly. OHS objected. The Ontario Conservation Review Board heard arguments for and against designation in September 2007 and recommended designation of the entire property in October. The Town officially passed the designation by-law on November 12, 2007.


Banting’s artwork has gained attention in the art community. A painting of his, “St. Tite des Caps,” sold for $30,000, including buyer’s premium, at a Canadian art auction in Toronto.


He and his insulin discovery have also been depicted in various media, including comic books, a biography by Michael Bliss, and television. The National Film Board of Canada produced a short film in 1958, The Quest, about his initial insulin experiments on dogs. The 1988 television movie Glory Enough for All depicted the search for insulin by Banting and Best, with R. H. Thomson starring as Banting. Banting is also portrayed by Jason Priestley, boarding his fatal flight, in the 2006 historical drama Above and Beyond.


Gregor Mendel (1822-1884)

This photo is from a book published in 1913 by R.C. Punnett, of Punnett Square fame, on Mendelism. Private Collection, Jules T. Mitchel. ©Target Health Inc.



Gregor Johann Mendel was a scientist, Augustinian friar and abbot of St. Thomas’ Abbey in Brno, Margraviate of Moravia. He was born into a German-speaking family in the Silesian part of the Austrian Empire (today’s Czech Republic) and gained posthumous recognition as the founder of the modern science of genetics. Though farmers had known for millennia that crossbreeding of animals and plants could favor certain desirable traits, Mendel’s pea plant experiments, conducted between 1856 and 1863, established many of the rules of heredity, now referred to as the laws of Mendelian inheritance.


Mendel worked with seven characteristics of pea plants: plant height, pod shape and color, seed shape and color, and flower position and color. Taking seed color as an example, Mendel showed that when a true-breeding yellow pea and a true-breeding green pea were cross-bred their offspring always produced yellow seeds. However, in the next generation, the green peas reappeared at a ratio of 1 green to 3 yellow. To explain this phenomenon, Mendel coined the terms “recessive“ and “dominant“ in reference to certain traits. (In the preceding example, the green trait, which seems to have vanished in the first filial generation, is recessive and the yellow is dominant.) He published his work in 1866, demonstrating the actions of invisible “factors“ – now called genes – in predictably determining the traits of an organism. The profound significance of Mendel’s work was not recognized until the turn of the 20th century (more than three decades later) with the rediscovery of his laws. Erich von Tschermak, Hugo de Vries, Carl Correns, and William Jasper Spillman independently verified several of Mendel’s experimental findings, ushering in the modern age of genetics.


Mendel was the son of Anton and Rosine (Schwirtlich) Mendel, and had one older sister, Veronika, and one younger, Theresia. They lived and worked on a farm which had been owned by the Mendel family for at least 130 years. During his childhood, Mendel worked as a gardener and studied beekeeping. Later, as a young man, he attended gymnasium in Opava (called Troppau in German). He had to take four months off during his gymnasium studies due to illness. From 1840 to 1843, he studied practical and theoretical philosophy and physics at the Philosophical Institute of the University of Olomouc, taking another year off because of illness. He also struggled financially to pay for his studies, and Theresia gave him her dowry. Later he helped support her three sons, two of whom became doctors. He became a friar in part because it enabled him to obtain an education without having to pay for it himself. As the son of a struggling farmer, the monastic life, in his words, spared him the “perpetual anxiety about a means of livelihood.“


When Mendel entered the Faculty of Philosophy, the Department of Natural History and Agriculture was headed by Johann Karl Nestler, who conducted extensive research of hereditary traits of plants and animals, especially sheep. Upon recommendation of his physics teacher Friedrich Franz, Mendel entered the Augustinian St Thomas’s Abbey in Brno (called Brünn in German) and began his training as a priest. Born Johann Mendel, he took the name Gregor upon entering religious life. Mendel worked as a substitute high school teacher. In 1850, he failed the oral part, the last of three parts, of his exams to become a certified high school teacher. In 1851, he was sent to the University of Vienna to study under the sponsorship of Abbot C. F. Napp so that he could get more formal education. At Vienna, his professor of physics was Christian Doppler. Mendel returned to his abbey in 1853 as a teacher, principally of physics. In 1856, he took the exam to become a certified teacher and again failed the oral part. In 1867, he replaced Napp as abbot of the monastery. After he was elevated as abbot in 1868, his scientific work largely ended, as Mendel became overburdened with administrative responsibilities, especially a dispute with the civil government over its attempt to impose special taxes on religious institutions. Mendel died on 6 January 1884, at the age of 61, in Brno, Moravia, Austria-Hungary (now Czech Republic), from chronic nephritis. Czech composer Leoš Janáček played the organ at his funeral. After his death, the succeeding abbot burned all papers in Mendel’s collection, to mark an end to the disputes over taxation.


Gregor Mendel, who is known as the “father of modern genetics,” was inspired by both his professors at the Palacký University, Olomouc (Friedrich Franz and Johann Karl Nestler), and his colleagues at the monastery (such as Franz Diebl) to study variation in plants. In 1854, Napp authorized Mendel to carry out a study in the monastery’s 2 hectares (4.9 acres) experimental garden, which was originally planted by Napp in 1830. Unlike Nestler, who studied hereditary traits in sheep, Mendel focused on plants. Mendel carried out his experiments with the common edible pea in his small garden plot in the monastery. These experiments were begun in 1856 and completed some eight years later. In 1865, he described his experiments in two lectures at a regional scientific conference. In the first lecture he described his observations and experimental results. In the second, which was given one month later, he explained them. After initial experiments with pea plants, Mendel settled on studying seven traits that seemed to be inherited independently of other traits: seed shape, flower color, seed coat tint, pod shape, unripe pod color, flower location, and plant height. He first focused on seed shape, which was either angular or round. Between 1856 and 1863 Mendel cultivated and tested some 28,000 plants, the majority of which were pea plants (Pisum sativum). This study showed that, when true-breeding different varieties were crossed to each other (e.g., tall plants fertilized by short plants), one in four pea plants had purebred recessive traits, two out of four were hybrids, and one out of four were purebred dominant. His experiments led him to make two generalizations, the Law of Segregation and the Law of Independent Assortment, which later came to be known as Mendel’s Laws of Inheritance.


A specific illustration: Crossing tall and short plants clarifies some of Mendel’s key observations and deductions.


At the time, gardeners could obtain true-breeding pea varieties from commercial seed houses. For example, one variety was guaranteed to give only tall pea plants (2 meters or so); another, only short plants (about 1/3 of a meter in height). If a gardener crossed one tall plant to itself or to another tall plant, collected the resultant seeds some three months later, planted them, and observed the height of the progeny, he would observe that all would be tall. Likewise, only short plants would result from a cross between true-breeding short peas. However, when Mendel crossed tall plants to short plants, collected the seeds, and planted them, all the offspring were just as tall, on average, as their tall parents. This led Mendel to the conclusion that the tall characteristic was dominant, and the short recessive. Mendel then crossed these hybrid tall plants to each other. The actual results from this cross were: 787 plants among the next generation (“grandchildren“ of the original true-breeding tall and short plants) were tall, and 277 were short. Thus, the short characteristic – which disappeared from sight in the first filial generation – resurfaced in the second, suggesting that two factors (now known as genes) determined plant height. In other words, although the factor which caused short stature ceased to exert its influence in the first filial generation, it was still present. Note also that the ratio between tall and short plants was 787/277, or 2.84 to 1 (approximately 3 to 1), again suggesting that plant height is determined by two factors. Mendel obtained similar results for six other pea traits, suggesting that a general rule is at work here: that most characteristics of pea plants are determined by a pair of factors (genes in contemporary biology) of which one is dominant and the other is recessive.
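The arithmetic behind this inference is simple enough to check directly. The counts (787 tall, 277 short) come from the passage above; the expected values assume the simple one-gene, dominant-and-recessive model described there.

```python
# Mendel's reported second-generation counts for plant height.
tall, short = 787, 277
total = tall + short  # 1064 plants in all

# Observed ratio of tall to short plants: about 2.84 to 1,
# close to the predicted 3 to 1.
ratio = tall / short

# Under the two-factor model, three quarters of this generation
# should show the dominant (tall) trait.
expected_tall = total * 3 / 4   # 798.0
expected_short = total * 1 / 4  # 266.0
```

The observed 787 against an expected 798 is the small deviation on which the text's "approximately 3 to 1" argument rests.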


Mendel presented his paper, “Versuche über Pflanzen-Hybriden“ (“Experiments on Plant Hybridization“), at two meetings of the Natural History Society of Brno in Moravia on 8 February and 8 March 1865. It generated a few favorable reports in local newspapers, but was ignored by the scientific community. When Mendel’s paper was published in 1866 in Verhandlungen des naturforschenden Vereins Brunn, it was seen as essentially about hybridization rather than inheritance, had little impact, and was cited only about three times over the next thirty-five years. His paper was criticized at the time, but is now considered a seminal work. Notably, Charles Darwin was unaware of Mendel’s paper, and it has been suggested that if he had been, genetics as we know it now might have taken hold much earlier. Mendel’s scientific biography thus provides one more example of the failure of obscure, highly original innovators to receive the attention they deserve.


Mendel began his studies on heredity using mice. He was at St. Thomas’s Abbey, but his bishop did not like one of his friars studying animals, so Mendel switched to plants. Mendel also bred bees in a bee house that was built for him, using bee hives that he designed. He also studied astronomy and meteorology, founding the ‘Austrian Meteorological Society’ in 1865. The majority of his published works were related to meteorology. Mendel also experimented with hawkweed (Hieracium) and honeybees. He published a report on his work with hawkweed, a group of plants of great interest to scientists at the time because of their diversity. However, the results of Mendel’s inheritance study in hawkweeds were unlike his results for peas; the first generation was very variable and many of the offspring were identical to the maternal parent. In his correspondence with Carl Nägeli, he discussed his results but was unable to explain them. It was not appreciated until the end of the nineteenth century that many hawkweed species are apomictic, producing most of their seeds through an asexual process. None of his results on bees survived, except for a passing mention in the reports of the Moravian Apiculture Society. All that is known definitely is that he used Cyprian and Carniolan bees, which were particularly aggressive, to the annoyance of other monks and visitors of the monastery, such that he was asked to get rid of them. Mendel, on the other hand, was fond of his bees, and referred to them as “my dearest little animals“.


During Mendel’s own lifetime, most biologists held the idea that all characteristics were passed to the next generation through blending inheritance, in which the traits from each parent are averaged. Instances of this phenomenon are now explained by the action of multiple genes with quantitative effects. Charles Darwin tried unsuccessfully to explain inheritance through a theory of pangenesis. It was not until the early twentieth century that the importance of Mendel’s ideas was realized. By 1900, research aimed at finding a successful theory of discontinuous inheritance rather than blending inheritance, led to independent duplication of his work by Hugo de Vries and Carl Correns, and the rediscovery of Mendel’s writings and laws. Both acknowledged Mendel’s priority, and it is thought probable that de Vries did not understand the results he had found until after reading Mendel. Though Erich von Tschermak was originally also credited with rediscovery, this is no longer accepted because he did not understand Mendel’s laws. Though de Vries later lost interest in Mendelism, other biologists started to establish modern genetics as a science. All three of these researchers, each from a different country, published their rediscovery of Mendel’s work within a two-month span in the Spring of 1900. Mendel’s results were quickly replicated, and genetic linkage quickly worked out. Biologists flocked to the theory; even though it was not yet applicable to many phenomena, it sought to give a genotypic understanding of heredity which they felt was lacking in previous studies of heredity which focused on phenotypic approaches. Most prominent of these previous approaches was the biometric school of Karl Pearson and W. F. R. Weldon, which was based heavily on statistical studies of phenotype variation. 
The strongest opposition to this school came from William Bateson, who perhaps did the most in the early days of publicizing the benefits of Mendel’s theory (the word “genetics“, and much of the discipline’s other terminology, originated with Bateson). This debate between the biometricians and the Mendelians was extremely vigorous in the first two decades of the twentieth century, with the biometricians claiming statistical and mathematical rigor, whereas the Mendelians claimed a better understanding of biology. (Modern genetics shows that Mendelian heredity is in fact an inherently biological process, though not all genes of Mendel’s experiments are yet understood.) In the end, the two approaches were combined, especially by work conducted by R. A. Fisher as early as 1918. The combination, in the 1930s and 1940s, of Mendelian genetics with Darwin’s theory of natural selection resulted in the modern synthesis of evolutionary biology.


In 1936, R.A. Fisher, a prominent statistician and population geneticist, reconstructed Mendel’s experiments, analyzed results from the F2 (second filial) generation and found the ratios of dominant to recessive phenotypes (e.g. green versus yellow peas; round versus wrinkled peas) to be implausibly and consistently close to the expected ratio of 3 to 1. Fisher asserted that “the data of most, if not all, of the experiments have been falsified so as to agree closely with Mendel’s expectations.“ Mendel’s alleged observations, according to Fisher, were “abominable“, “shocking“, and “cooked“. Other scholars agree with Fisher that Mendel’s various observations come uncomfortably close to Mendel’s expectations. A.W.F. Edwards, for instance, remarks: “One can applaud the lucky gambler; but when he is lucky again tomorrow, and the next day, and the following day, one is entitled to become a little suspicious“. Three other lines of evidence likewise lend support to the assertion that Mendel’s results are indeed too good to be true. Fisher’s analysis gave rise to the Mendelian Paradox, a paradox that remains unsolved to this very day. Thus, on the one hand, Mendel’s reported data are, statistically speaking, too good to be true; on the other, “everything we know about Mendel suggests that he was unlikely to engage in either deliberate fraud or in unconscious adjustment of his observations.“ A number of writers have attempted to resolve this paradox. One attempted explanation invokes confirmation bias. Fisher described Mendel’s experiments as “biased strongly in the direction of agreement with expectation to give the theory the benefit of doubt“. This might arise if Mendel detected an approximate 3 to 1 ratio early in his experiments with a small sample size and, in cases where the ratio appeared to deviate slightly from this, continued collecting more data until the results conformed more nearly to an exact ratio.
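Fisher’s “too good to be true“ argument was statistical: for counts of this size, chance alone should produce larger deviations from 3:1 more often than Mendel’s data show. The following is only a minimal sketch of the kind of goodness-of-fit check involved, applying Pearson’s chi-square with one degree of freedom to the height counts quoted earlier; Fisher’s actual analysis aggregated such statistics across all of Mendel’s experiments.

```python
import math

def chi_square_1df(observed, expected):
    """Pearson chi-square statistic and its one-degree-of-freedom p-value."""
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # For 1 d.f., P(X >= chi2) = erfc(sqrt(chi2 / 2)).
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# Mendel's height counts tested against the 3:1 expectation.
observed = [787, 277]
n = sum(observed)
expected = [n * 3 / 4, n / 4]
chi2, p = chi_square_1df(observed, expected)
# chi2 is about 0.61: a very small deviation from expectation.
```

A single experiment landing this close to expectation proves nothing by itself; Fisher’s suspicion rested on nearly every experiment landing this close simultaneously, which drives the combined probability of so good a fit implausibly high.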


In 2004, J.W. Porteous concluded that Mendel’s observations were indeed implausible. However, reproduction of the experiments has demonstrated that there is no real bias towards Mendel’s data. Another attempt to resolve the Mendelian Paradox notes that a conflict may sometimes arise between the moral imperative of a bias-free recounting of one’s factual observations and the even more important imperative of advancing scientific knowledge. Mendel might have felt compelled “to simplify his data in order to meet real, or feared, editorial objections.“ Such an action could be justified on moral grounds (and hence provide a resolution to the Mendelian Paradox), since the alternative – refusing to comply – might have retarded the growth of scientific knowledge. Similarly, like so many other obscure innovators of science, Mendel, a little-known innovator of working-class background, had to “break through the cognitive paradigms and social prejudices“ of his audience. If such a breakthrough “could be best achieved by deliberately omitting some observations from his report and adjusting others to make them more palatable to his audience, such actions could be justified on moral grounds.“


Daniel L. Hartl and Daniel J. Fairbanks reject outright Fisher’s statistical argument, suggesting that Fisher incorrectly interpreted Mendel’s experiments. They find it likely that Mendel scored more than 10 progeny, and that the results matched the expectation. They conclude: “Fisher’s allegation of deliberate falsification can finally be put to rest, because on closer analysis it has proved to be unsupported by convincing evidence.“ In 2008, Hartl and Fairbanks (with Allan Franklin and A.W.F. Edwards) wrote a comprehensive book in which they concluded that there were no reasons to assert Mendel fabricated his results, nor that Fisher deliberately tried to diminish Mendel’s legacy. Reassessment of Fisher’s statistical analysis, according to these authors, also disproves the notion of confirmation bias in Mendel’s results.



Customs of Central Asians. Circumcision. Photograph shows a group of men seated on the ground near a small boy who is being circumcised. Album print. Illus. in: Turkestanskii al’bom, chast’ etnograficheskaia, 1871-1872, part 2, vol. 1, pl. 71. Batga [sic] buri translated from Persian as circumcision. Photo credit: Unknown – Library of Congress, Public Domain, Wikipedia Commons


There is a huge amount of information regarding the history of circumcision, dating from well before the Bible was written; the Bible itself has many references to circumcision. The vast historical record includes religious and social customs as well as superstitions and taboos, in addition to medical evidence. There is only room here for a glimpse of the medical point of view.


Sir Jonathan Hutchinson, MD – Photo credit: Unknown; Wikipedia Commons



Jonathan Hutchinson (1828-1913), an eminent English physician was the first prominent medical advocate for circumcision. Hutchinson’s activity in the cause of scientific surgery and in advancing the study of the natural sciences was unwearying. He published more than 1,200 medical articles and also produced the quarterly Archives of Surgery from 1890 to 1900, being its only contributor. His lectures on neuropathogenesis, gout, leprosy, diseases of the tongue, etc., were full of original observation; but his principal work was connected with the study of syphilis, on which he became the first living authority. He was the first to describe his triad of medical signs for congenital syphilis: notched incisor teeth, labyrinthine deafness and interstitial keratitis, which was very useful for providing a firm diagnosis long before the Treponema pallidum or the Wassermann test were discovered. Hutchinson was the founder of the Medical Graduates’ College and Polyclinic; and both in his native town of Selby and at Haslemere, Surrey, he started educational museums for popular instruction in natural history. He published several volumes on his own subjects and was given an Hon. LL.D degree by both the University of Glasgow and University of Cambridge. He received a knighthood in 1908.


In 1855, Hutchinson published a study in which he compared the rate of contraction of venereal disease amongst the gentile and Jewish populations of London. His study appeared to demonstrate that circumcised men were significantly less vulnerable to venereal diseases. In fact, a 2006 systematic review concluded that the evidence strongly indicates that circumcised men are at lower risk of chancroid and syphilis. Clearly, Dr. Hutchinson was ahead of his time. Hutchinson also became a notable leader in the campaign for medical circumcision for the next fifty years, publishing “A Plea for Circumcision“ in the British Medical Journal (1890). In that article, he contended that “the foreskin constitutes a harbor for filth, and is a constant source of irritation. It conduces to [self-eroticism], and adds to the difficulties of continence. It increases the risk of syphilis in early life, and of cancer in the aged.“ In an 1893 article, “On circumcision as a preventive of self-eroticism,“ he wrote: “I am inclined to believe that circumcision may often accomplish much, both in breaking the habit as an immediate result, and in diminishing the temptation to it subsequently.“


Nathaniel Heckford, a pediatrician at the East London Hospital for Children, wrote Circumcision as a Remedial Measure in Certain Cases of Epilepsy, Chorea, etc. (1865), in which he argued that circumcision acted as an effective remedial measure in the prevention of certain cases of epilepsy and chorea. These increasingly common medical beliefs were even applied to females. The controversial obstetrical surgeon Isaac Baker Brown founded the London Surgical Home for Women in 1858, where he worked on advancing surgical procedures. In 1866, Baker Brown described the use of clitoridectomy as a cure for several conditions, including epilepsy, catalepsy and mania, which he attributed to self-stimulation. In On the Curability of Certain Forms of Insanity, Epilepsy, Catalepsy, and Hysteria in Females, he claimed a 70% success rate using this treatment. However, during 1866, Baker Brown began to receive negative feedback from within the medical profession from doctors who questioned the validity of his claims of success. An article appeared in The London Times which was favorable towards Baker Brown’s work but suggested that he had treated women of unsound mind. He was also accused of performing procedures without the consent or knowledge of his patients or their families. In 1867, he was expelled from the Obstetrical Society of London for carrying out the operations without consent. Baker Brown’s ideas were more accepted in the United States, where, from the 1860s, the operation was being used to cure hysteria and, in young girls, what was called “rebellion“ or “unfeminine aggression“.


Lewis Sayre, a New York orthopedic surgeon, became a prominent advocate for circumcision in America. In 1870, he examined a five-year-old boy who was unable to straighten his legs, and whose condition had so far defied treatment. Upon noting that the boy’s genitals were inflamed, Sayre hypothesized that chronic irritation of the boy’s foreskin had paralyzed his knees via reflex neurosis. Sayre circumcised the boy, and within a few weeks, he recovered from his paralysis. After several additional incidents in which circumcision also appeared effective in treating paralyzed joints, Sayre began to promote circumcision as a powerful orthopedic remedy. Sayre’s prominence within the medical profession allowed him to reach a wide audience. As more practitioners tried circumcision as a treatment for otherwise intractable medical conditions, sometimes achieving positive results, the list of ailments reputed to be treatable through circumcision grew. By the 1890s, hernia, bladder infections, kidney stones, insomnia, chronic indigestion, rheumatism, epilepsy, asthma, bedwetting, Bright’s disease, erectile dysfunction, syphilis, insanity, and skin cancer had all been linked to the foreskin, and many physicians advocated universal circumcision as a preventive health measure.


Specific medical arguments aside, several hypotheses have been raised in explaining the public’s acceptance of infant circumcision as preventive medicine. The success of the germ theory of disease had not only enabled physicians to combat many of the postoperative complications of surgery, but had made the wider public deeply suspicious of dirt and bodily secretions. Accordingly, the smegma that collects under the foreskin was viewed as unhealthy, and circumcision readily accepted as good hygiene. In this Victorian climate, circumcision could be employed as a means of discouraging self-stimulation. All About the Baby, a popular parenting book of the 1890s, recommended infant circumcision for precisely this purpose. As hospitals proliferated in urban areas, childbirth, at least among the upper and middle classes, was increasingly under the care of physicians in hospitals rather than with midwives in the home. It has been suggested that once a critical mass of infants were being circumcised in the hospital, circumcision became a class marker of those wealthy enough to afford a hospital birth.


During the same time period, circumcision was becoming easier to perform. William Stewart Halsted’s 1885 discovery of hypodermic cocaine as a local anesthetic made it easier for doctors without expertise in the use of chloroform and other general anesthetics to perform minor surgeries. Also, several mechanically aided circumcision techniques, forerunners of modern clamp-based circumcision methods, were first published in the medical literature of the 1890s, allowing surgeons to perform circumcisions more safely and successfully. By the 1920s, advances in the understanding of disease had undermined much of the original medical basis for preventive circumcision. Doctors continued to promote it, however, as good penile hygiene and as a preventive for a handful of conditions such as balanitis, phimosis, and cancer.


By 2014, the American Academy of Pediatrics found that the health benefits of newborn male circumcision outweigh the risks.


Circumcision in English-speaking countries arose in a climate of antiquated, negative attitudes towards relationships. In her 1978 article The Ritual of Circumcision, Karen Erickson Paige writes: “The current medical rationale for circumcision developed after the operation was in wide practice. The original reason for the surgical removal of the foreskin, or prepuce, was to control ‘insanity’ – the range of mental disorders that people believed were caused by the ‘polluting’ practice of self-abuse.“


Editor’s note: Paige is pointing out how hard it is to believe that anyone could have held such outrageous ideas, completely lacking in any scientific evidence and so harshly punitive.


Self-abuse was a term commonly used to describe self-stimulation in the 19th century. According to Paige, treatments ranged from diet, moral exhortations, hydrotherapy, and marriage, to such drastic measures as surgery, physical restraints, frights, and punishment. Some doctors recommended using plaster of Paris, leather, or rubber; cauterization; making boys wear chastity belts or spiked rings; and in extreme cases, castration. Paige details how circumcision became popular as a remedy:


In the 1890s, it became a popular technique to prevent, or cure, insanity. In 1891 the president of the Royal College of Surgeons of England published On Circumcision as a Preventive, and two years later another British doctor wrote Circumcision: Its Advantages and How to Perform It, which listed the reasons for removing the vestigial prepuce. Evidently the foreskin could cause nocturnal incontinence, hysteria, epilepsy, and irritation that might give rise to erotic stimulation. Another physician, P.C. Remondino, added that circumcision is like a substantial and well-secured life annuity as it insures better health, greater capacity for labor, longer life, less nervousness, sickness, loss of time, and less doctor bills. No wonder it became a popular remedy.


One of the leading advocates of circumcision was John Harvey Kellogg. He advocated the consumption of Kellogg’s corn flakes as a remedy, and he believed that circumcision would be an effective way to eliminate stimulation in males.


Editor’s note: Talk about plain old Puritanical meanness; one can hardly believe some of this, but it’s true. Eighteenth- and nineteenth-century solutions: “Covering the organs with a cage has been practiced with entire success. A remedy which is almost always successful in small boys is circumcision, especially when there is any degree of phimosis. The operation should be performed by a surgeon without administering an anesthetic, as the brief pain attending the operation will have a salutary effect upon the mind, especially if it be connected with the idea of punishment, as it may well be in some cases. The soreness which continues for several weeks interrupts the practice, and if it had not previously become too firmly fixed, it may be forgotten and not resumed. If any attempt is made to watch the child, he should be so carefully surrounded by vigilance that he cannot possibly transgress without detection. If he is only partially watched, he soon learns to elude observation, and thus the effect is only to make him cunning in his vice.“


Robert Darby (2003), writing in the Medical Journal of Australia, noted that some 19th-century circumcision advocates – and their opponents – believed that the foreskin was highly erotic and sensitive:


In the 19th century the role of the foreskin in erotic sensation was well understood by physicians who wanted to cut it off precisely because they considered it the major factor leading boys to self-stimulation. The Victorian physician and venereologist William Acton (1814-1875) damned it as a source of serious mischief, and most of his contemporaries concurred. Both opponents and supporters of circumcision agreed that the significant role the foreskin played in responses was the main reason why it should be either left in place or removed. William Hammond, a Professor of Mind in New York in the late 19th century, commented that circumcision, when performed in early life, generally lessens the voluptuous sensations of intimacy, and both he and Acton considered the foreskin necessary for optimal reproductive function, especially in old age. Jonathan Hutchinson, English surgeon and pathologist (1828-1913), and many others, thought this was the main reason why it should be excised.


Born in the United Kingdom during the late nineteenth century, John Maynard Keynes and his brother Geoffrey were both circumcised in boyhood due to their parents’ concern about their habits. Mainstream pediatric manuals continued to recommend circumcision as a deterrent until the 1950s.


Wikipedia; http://www.nytimes.com/1997/04/02/us/study-is-adding-to-doubts-about-circumcision.html


Freudian Psychoanalysis – Two Other Branches, Out of Many

Graphic image: by historicair 16:56, 16 December 2006 (UTC) – en:Image:Structural-Iceberg.png by en:User:Jordangordanier, Public Domain; Wikipedia Commons



Freudian psychoanalytic theory spawned other creative approaches to the practice of psychoanalysis, which built upon Freud’s theories of psychic development.


Object Relations and The Basic Fault


Michael Balint (1896-1970) was a Hungarian psychoanalyst who spent most of his adult life in England. He was a proponent of the Object Relations school.


Balint was born Mihaly Maurice Bergsmann, the son of a practicing physician in Budapest. It was against his father’s will that he changed his name to Balint Mihaly. He also changed religion, from Judaism to Unitarian Christianity. During World War I Balint served at the front, first in Russia, then in the Dolomites. He completed his medical studies in Budapest in 1918. On the recommendation of his future wife, Alice Szekely-Kovacs, Balint read Sigmund Freud’s “Drei Abhandlungen zur Sexualtheorie“ (1905) and “Totem und Tabu“. He also began attending the lectures of Sandor Ferenczi, who in 1919 became the world’s first university professor of psychoanalysis. In 1920, Balint married and then moved to Berlin, where he worked in the biochemical laboratory of Otto Heinrich Warburg (1883-1970), who won the Nobel Prize in 1931. Balint worked on his doctorate in biochemistry, while also working half-time at the Berlin Institute of Psychoanalysis. In 1924 the Balints returned to Budapest, where he assumed a leading role in Hungarian psychoanalysis. During the 1930s the political conditions in Hungary made the teaching of psychotherapy practically impossible, and they emigrated to England in 1938, settling in Manchester. In early 1939, Balint became Clinical Director of the Child Guidance Clinic. In 1944, his parents, about to be arrested by the Nazis in Hungary, committed suicide. That year Balint moved from Manchester to London, where he was attached to the Tavistock Clinic and began learning about group work from W.R. Bion; he also obtained the Master of Science degree in psychology. In 1949, Balint became the leader of the Tavistock Institute of Human Relations and developed what is now known as the “Balint group“: a group of physicians sharing the problems of general practice, focusing in particular on the responses of the doctors to their patients. The first such group of practicing physicians was established in 1950.
In 1968 Balint became president of the British Psychoanalytical Society. In Hamburg, Germany, the Michael-Balint-Institut für Psychoanalyse, Psychotherapie und analytische Kinder- und Jugendlichen-Psychotherapie is named for him.


Balint took an early interest in the mother-infant relationship; his key paper on “Primary Object-Love“ was received with approval by other Freudian psychoanalysts. One respected psychoanalyst wrote that “Michael Balint has analyzed in a thoroughly penetrating way the intricate interaction of theory and technique in the genesis of a new conception of analysis, of a ‘two-body psychology’“. On that basis, Balint explored the idea of what he called “the basic fault“: the experience, often arising in the early two-person relationship, that something was wrong or missing, which then carried over into the Oedipal period (ages 2-5). By 1968, then, Balint had distinguished three levels of experience, each with its particular ways of relating, its own ways of thinking, and its own appropriate therapeutic procedures. Level 3, the “three-person“ level, is the level at which a person is capable of a three-sided experience, primarily the Oedipal problems between self, mother, and father. By contrast, the area of the basic fault is characterized by a very peculiar, exclusively two-person relationship, while a third area, level 1, is characterized by the fact that there are no external objects in it.


Therapeutic failure is attributed by Balint to the analyst’s inability to “click in“ to the mute needs of the patient who has descended to the level of the basic fault; he maintained that the basic fault can only be overcome if the patient is allowed to regress to a state of oral dependence on the analyst and experience a new beginning. Balint developed a process of brief psychotherapy he termed “focal psychotherapy,“ in which one specific problem presented by the patient is chosen as the focus of interpretation. The therapy was carefully targeted around that key area, in part to avoid the risk that the focal therapy would degenerate into long-term psychotherapy or psychoanalysis. Here, as a rule, interpretation remained entirely on the whole-person adult level; the intention was to reduce the intensity of the feelings in the therapeutic relationship. In accordance with the thinking of other members of what is known as the British independent perspective, such as W. R. D. Fairbairn and D. W. Winnicott, great stress was laid upon the creative role of the patient in focal therapy: “To our minds, an ‘independent discovery’ by the patient has the greatest dynamic power.“ It has been suggested that it was in fact this work of Michael Balint and his colleagues which led to time-limited therapies being rediscovered.


Michael Balint, as part of the independent tradition in British psychoanalysis, was influential in setting up groups (now known as “Balint groups“) for medical doctors to discuss psychodynamic factors in relation to patients. Instead of repeating futile investigations of increasing complexity and cost, Balint taught active search for the causes of anxiety and unhappiness, and treatment by remedial education aimed at insight by the patient. Such seminars provided opportunities for GPs to discuss with each other and with him aspects of their work with patients for which they had previously felt ill-equipped. Since his death the continuance of this work has been assured by the formation of the Balint Society.


Psychoanalysis and Low Dose LSD


The term anaclitic (from the Greek anaklinein – to lean upon) refers to various early infantile needs and tendencies directed toward a pregenital love object. This method was developed in the 1950s by two London Freudian psychoanalysts, Joyce Martin MD and Pauline McCririck MD. It is based on clinical observations of deep age regression occurring in LSD sessions of psychiatric patients. During these periods many of them relive episodes of early infantile frustration and emotional deprivation. This is typically associated with agonizing cravings for love, physical contact, and other instinctual needs experienced on a very primitive level. The technique of LSD therapy practiced by Martin and McCririck was based on psychoanalytic understanding and interpretation of all the situations and experiences occurring in drug sessions and in this sense is very close to psycholytic approaches. The critical difference distinguishing this therapy from any other was the element of direct satisfaction of anaclitic needs of the patients. In contrast to the traditional detached attitude characteristic of psychoanalysis and psycholytic treatment, Martin and McCririck assumed an active mothering role and entered into close physical contact with their patients to help them to satisfy primitive infantile needs reactivated by the drug.


More superficial aspects of this approach involve holding the patients and feeding them warm milk from a bottle, caressing and offering reassuring touches, holding their heads in one’s lap, or hugging and rocking. The extreme of psycho-dramatic involvement of the therapist is the so-called “fusion technique,“ which consists of full body contact with the client. The patient lies on the couch covered with a blanket and the therapist lies beside his or her body, in close embrace, usually simulating the gentle comforting movements of a mother caressing her baby. The subjective reports of patients about these periods of “fusion“ with the therapist are quite remarkable. They describe authentic feelings of symbiotic union with the nourishing mother image, experienced simultaneously on the level of the “good breast“ and “good womb.“ In this state, patients can experience themselves as infants receiving love and nourishment at the breast of the nursing mother and at the same time feel totally identified with a fetus in the oceanic paradise of the womb. This state can simultaneously involve archetypal dimensions and elements of mystical rapture, and the above situations can be experienced as contact with the Great Mother or Mother Nature. It is not uncommon that the deepest form of this experience involves feelings of oneness with the entire cosmos and the ultimate creative principle, or God. The fusion technique seems to provide an important channel between the psychodynamic, biographical level of the LSD experience and the transcendental states of consciousness. Patients in anaclitic therapy relate that during their nourishing exchange with the mother image, the milk seemed to be “coming directly from the Milky Way.“ In the imaginary re-enactment of placental circulation, the life-giving blood can be experienced as sacramental communion, not only with the maternal organism, but with the divine source.
Repeatedly, the situations of “fusion” have been described in all their psychological and spiritual ramifications as fulfillment of the deepest needs of human nature, and as extremely healing experiences. Some patients described this technique as offering the possibility of a retroactive intervention in their deprived childhood. When the original traumatic situations from childhood are reenacted in all their relevance and complexity with the help of the “psychedelic time-machine,” the therapist’s affection and loving care can fill the vacuum caused by deprivation and frustration.


The dosages used in this treatment technique ranged between 100 and 200 micrograms of LSD, sometimes with the addition of Ritalin in later hours of the sessions. Martin and McCririck described good and relatively rapidly achieved results in patients with deep neuroses or borderline psychotic disorders who had experienced severe emotional deprivation in childhood. Their papers, presentations at scientific meetings, and a film documenting the anaclitic technique stirred up an enormous amount of interest among LSD therapists and generated a great deal of fierce controversy. The reactions of colleagues to this treatment modality ranged from admiration and enthusiasm to total condemnation. Since most of the criticism from the psychoanalytically oriented therapists revolved around the violation of the psychoanalytic taboo against touching and the possible detrimental consequences of the fusion technique for transference-countertransference problems, it is interesting to describe the authors’ response to this serious objection. Both Martin and McCririck seemed to concur that they had experienced much more difficulty with transference relationships before they started using the fusion technique. According to them, it is the lack of fulfillment in the conventional therapeutic relationship that foments and perpetuates transference. The original traumatic situations are continuously reenacted in the therapeutic relationship, and the patient essentially experiences repetitions of the old painful rejections. When the anaclitic needs are satisfied in the state of deep regression induced by the drug, the patients are capable of detaching themselves emotionally from the therapist and of looking for more appropriate objects in their real lives. This situation has a parallel in the early developmental history of the individual.
Those children whose infantile emotional needs were adequately met and satisfied by their parents find it relatively easy to give up the affective ties to their family and develop an independent existence. By comparison, those individuals who experienced emotional deprivation and frustration in childhood tend to get trapped during their adult life in symbiotic patterns of interaction, destructive and self-destructive clinging behavior, and life-long problems with dependence-independence. According to Martin and McCririck, the critical issue in anaclitic therapy is to use the fusion technique only during periods of deep regression, and to keep the experience strictly on the pregenital level. It should not be used in the termination periods of the sessions, when the anaclitic elements could easily become confused with adult sexual patterns.


The anaclitic technique never achieved wide acceptance; its use seemed to be closely related to the unique personality characteristics of its authors. Most other therapists, particularly males, found it emotionally difficult and uncomfortable to enter into the intimate situation of fusion with their clients. However, the importance of physical contact in LSD psychotherapy is unquestionable, and many therapists have routinely used various less intense forms of body contact.


Sources: History of LSD Therapy by Stanislav Grof, M.D.; Wikipedia

Moritz Kaposi, MD

Moritz Kaposi – Photo credit: Unknown – Images from the History of Medicine (NLM), Public Domain; Wikipedia Commons


According to his biographer, Dr. J.D. Oriel, “in his lifetime, Moritz Kaposi, MD, was acknowledged as one of the great masters of the Vienna School of Dermatology, a superb clinician and renowned teacher.” While his mentor, Ferdinand von Hebra, is considered the “father of dermatology,” Kaposi was one of the first to establish dermatology on the scientific basis of anatomical pathology. He became chairman of the Vienna School of Dermatology after Hebra’s death in 1880.


Moritz Kaposi, a Hungarian physician, was born on 23 October 1837 in Kaposvár, Hungary, and died on 6 March 1902 in Vienna. He is best known as the dermatologist who discovered the skin tumor that bears his name (Kaposi’s sarcoma). Kaposi was born to a Jewish family whose original surname was Kohn; in 1871 he changed it to Kaposi, in reference to his town of birth. One purported reason is that he wanted to marry a daughter of the then chairman of dermatology, Ferdinand Ritter von Hebra, and to advance in society, which he could not have done being of the Jewish faith. This seems unlikely, because he married Martha Hebra and converted to Catholicism several years before changing his name, by which time he was already well established on the Vienna University faculty and a close associate of von Hebra. A more plausible explanation is based on his own comments to colleagues that he changed his name to avoid confusion with five other similarly named physicians on the Vienna faculty. Rumors about the sincerity of both his marriage and his concerns about his Jewish ancestry may have arisen through professional jealousy. According to William Dubreuilh (1857-1935), first professor and chairman of dermatology in Bordeaux: “On disait de Kaposi qu’il avait pris la fille de Hebra, sa maison, sa chaire et sa clientèle, laissant le reste à son beau-frère Hans Hebra.” – “It was said of Kaposi that he had taken Hebra’s daughter, his home, his chair and his clientele, leaving the rest to his brother-in-law, Hans Hebra.”


In 1855, Kaposi began to study medicine at the University of Vienna, attaining his doctorate in 1861. His 1866 dissertation, Dermatologie und Syphilis, was an important contribution to the field. Kaposi was appointed professor at the University of Vienna in 1875, and in 1881 he became a member of the board of the Vienna General Hospital and director of its clinic of skin diseases. Together with his mentor, Ferdinand Ritter von Hebra, he authored the book Lehrbuch der Hautkrankheiten (Textbook of Skin Diseases) in 1878. Kaposi’s main work, however, was Pathologie und Therapie der Hautkrankheiten in Vorlesungen für praktische Ärzte und Studierende (Pathology and Therapy of the Skin Diseases in Lectures for Practical Physicians and Students), published in 1880, which became one of the most significant books in the history of dermatology and was translated into several languages. Kaposi is credited with the description of xeroderma pigmentosum, a rare genetic disorder now known to be caused by defects in nucleotide excision repair (“Ueber Xeroderma pigmentosum,” Medizinische Jahrbücher, Wien, 1882: 619-633). Among other diseases, Kaposi was the first to study lichen scrofulosorum and lupus erythematosus. In all, he published over 150 books and papers and is widely credited with advancing the use of pathologic examination in the diagnosis of dermatologic diseases.


Kaposi’s name entered the history of medicine in 1872, when he described for the first time Kaposi’s sarcoma, a cancer of the skin, which he had observed in five elderly male patients and initially named “idiopathic multiple pigmented sarcoma.” More than a century later, the appearance of this disease in young gay men in New York City, San Francisco and other coastal cities in the United States was one of the first indications that a new disease, now called AIDS, had appeared. In 1994, the discovery that Kaposi’s sarcoma was associated with a herpesvirus, now known as Kaposi’s sarcoma-associated herpesvirus (KSHV), sparked considerable controversy and scientific in-fighting until sufficient data had been collected to show that KSHV was indeed the causative agent of Kaposi’s sarcoma. The virus is now known to be a widespread infection of people living in sub-Saharan Africa; intermediate levels of infection occur in Mediterranean populations (including Israel, Saudi Arabia, Italy and Greece), and low levels of infection occur in most Northern European and North American populations. Kaposi’s sarcoma is now the most commonly reported cancer in parts of sub-Saharan Africa. It is usually a localized tumor that can be treated either surgically or through local irradiation. Chemotherapy with drugs such as liposomal anthracyclines or paclitaxel may be used, particularly for invasive disease. Antiviral drugs, such as ganciclovir, that target the replication of herpesviruses such as KSHV have been used successfully to prevent the development of Kaposi’s sarcoma, although once the tumor develops these drugs are of little or no use.


Michael D. Gershon, MD

Michael Gershon MD: “Serotonin is a sword and a shield of the bowel: serotonin plays offense and defense.” Photo credit: Columbia University Medical School, MD/PhD Program


Michael D. Gershon is Professor of Pathology and Cell Biology at Columbia University Medical School and Center. Gershon has been called the “father of neurogastroenterology” because, in addition to his seminal work on neuronal control of gastrointestinal (GI) behavior and development of the enteric nervous system (ENS), his classic trade book, The Second Brain, has made physicians, scientists, and the lay public aware of the significance of the unique ability of the ENS to regulate GI activity in the absence of input from the brain and spinal cord. Gershon’s demonstration that serotonin is an enteric neurotransmitter was the first indication that the ENS is more than a collection of cholinergic relay neurons transmitting signals from the brain to the bowel. He was the first to identify intrinsic primary afferent neurons that initiate peristaltic and secretory reflexes, and he demonstrated that these cells are activated by the mucosal release of serotonin. Dr. Gershon has published almost 400 peer-reviewed papers, including major contributions on disorders of GI motility such as irritable bowel syndrome, the identification of serotonin as a GI neurotransmitter, and the initial observation in the gut of intrinsic sensory nerve cells that trigger propulsive motor activity. Dr. Gershon also discovered that the serotonin transporter (SERT) is expressed by enterocytes (cells that line the lumen of the gut) as well as by enteric neurons and is critical in the termination of serotonin-mediated effects.


Dr. Gershon has identified the roles that specific serotonin receptor subtypes play in GI physiology, and he has provided evidence that serotonin is not only a neurotransmitter and a paracrine factor that initiates motile and secretory reflexes, but also a hormone that affects bone resorption and inflammation. He has called serotonin “a sword and shield of the bowel” because it is simultaneously proinflammatory and neuroprotective. Mucosal serotonin triggers inflammatory responses that oppose microbial invasion, while neuronal serotonin protects the ENS from the damage that inflammation would otherwise cause. Neuron-derived serotonin also mobilizes precursor cells, which are present in the adult gut, to initiate the genesis of new neurons, an adult function that reflects a similar essential activity of early-born serotonergic neurons in late fetal and early neonatal life to promote the development of late-born sets of enteric neurons.


Dr. Gershon has made many additional contributions to ENS development, including the identification of necessary guidance molecules, adhesion proteins, growth and transcription factors; his observations suggest that defects that occur late in ENS development lead to subtle changes in GI physiology that may contribute to the pathogenesis of functional bowel disorders. More recently, Drs. Michael and Anne Gershon have demonstrated that varicella zoster virus (VZV) infects, becomes latent, and reactivates in enteric neurons, including those of humans. They have demonstrated that “enteric zoster (shingles)“ occurs and may thus be an unexpected cause of a variety of gastrointestinal disorders, the pathogenesis of which is currently unknown.


Born in New York City in 1938, Dr. Michael D. Gershon received his B.A. degree with distinction from Cornell University in 1958 and his M.D. in 1963, again from Cornell. Gershon received postdoctoral training with Edith Bülbring in pharmacology at Oxford University before returning to Cornell as an Assistant Professor of Anatomy in 1967. He was promoted to Professor before leaving Cornell to chair the Department of Anatomy & Cell Biology at Columbia University’s College of P&S, which he did from 1975 to 2005. Gershon is now a Professor of Pathology & Cell Biology at Columbia.


Gershon’s contributions to the identification, location, and functional characterization of enteric serotonin receptors have been important in the design of drugs to treat irritable bowel syndrome, chronic constipation, and chemotherapy-associated nausea. Gershon’s discovery that the serotonin transporter (SERT), which terminates serotonergic signaling, is expressed in the bowel both by enterocytes and neurons opened new paths for research into the pathophysiology of irritable bowel syndrome and inflammatory bowel disease. He has linked mucosal serotonin to inflammation and neuronal serotonin to neuroprotection and the generation of new neurons from adult stem cells. These discoveries have led to the new idea that the function of serotonin is not limited to paracrine signaling and neurotransmission in the service of motility and secretion, but is also a sword and a shield of the gut.


Gershon has teamed with his wife, Anne Gershon, to show that the mannose 6-phosphate receptor plays critical roles in the entry and exit of varicella zoster virus (VZV). The Gershons have also developed the first animal model of VZV disease, which enables lytic and latent infection as well as reactivation to be studied in isolated enteric neurons. The Gershons have also shown that following varicella, VZV establishes latency in the human ENS. Finally, Gershon has made major contributions to understanding the roles played by a number of critical transcription and growth factors in enabling émigrés from the neural crest to colonize the bowel, undergo regulated proliferation, find their appropriate destinations in the gut wall, and terminally differentiate into the most phenotypically diverse component of the peripheral nervous system.


Dr. Michael Gershon has devoted his career to understanding the human bowel (the stomach, esophagus, small intestine, and colon). His thirty years of research have led to an extraordinary rediscovery: nerve cells in the gut that act as a brain. This “second brain” can control our gut all by itself. Our two brains — the one in our head and the one in our bowel — must cooperate. If they do not, then there is chaos in the gut and misery in the head — everything from “butterflies” to cramps, from diarrhea to constipation.


Gershon’s groundbreaking book, The Second Brain, represents a quantum leap in medical knowledge and is already benefiting patients whose symptoms were previously dismissed as neurotic or “all in your head.” Dr. Gershon’s research clearly demonstrates that the human gut actually has a brain of its own. This remarkable scientific breakthrough offers fascinating proof that “gut instinct” is biological, a function of the second brain. An alarming number of people suffer from heartburn, nausea, abdominal pain, cramps, diarrhea, constipation, or related problems. Often thought to be caused by a “weakness” of the mind, these conditions may actually reflect a disorder of the second brain. The second brain, located in the bowel, normally works smoothly with the brain in the head, enabling the head-brain to concentrate on the finer pursuits of life while the gut-brain attends to the messy business of digestion. A breakdown in communication between the two brains can lead to stomach and intestinal trouble, causing sufferers great abdominal grief and too often labeling them as neurotic complainers. Dr. Gershon’s research into the second brain provides understanding for those who suffer from gut-related ailments and offers new insight into their origin, extent, and management. The culmination of his work is an extraordinary contribution to the understanding of gastrointestinal illnesses, as well as a fascinating glimpse into how our gut really works.


A light touch: the irreplaceable, indomitable Stephen Colbert interviews the great Michael Gershon MD about the second brain in the gut


Michael Gershon clearly explains some of his research. This is video one of seven; you can find the other six videos on YouTube.


A very short student note regarding Dr. Gershon

