Patricia Bath MD, Inventor (1942 to Present)

Patricia Bath MD – Inventor of Laserphaco Probe – Photo credit: National Library of Medicine; www.nlm.nih.gov/changingthefaceofmedicine; Public Domain, Wikipedia Commons

 

Patricia Era Bath is an American ophthalmologist, inventor, and academic who broke ground for women and African Americans in a number of areas. Bath was the first African American to serve as a resident in ophthalmology at New York University. She was also the first African American woman to serve on staff as a surgeon at the UCLA Medical Center, and the first African American woman physician to receive a patent for a medical purpose. The holder of four patents, she also founded the non-profit American Institute for the Prevention of Blindness in Washington, D.C.

 

Dr. Bath was born on November 4, 1942, in Harlem, Manhattan. Her father, an immigrant from Trinidad, was a newspaper columnist, a merchant seaman, and the first African American to work for the New York City Subway as a motorman; he inspired her love of culture and encouraged her to explore different cultures. Her mother was descended from African slaves. Bath’s teachers recognized that she was a gifted student and pushed her to develop her strengths in science. With the help of a microscope set she was given as a young child, Bath discovered a love for math and science. She attended Charles Evans Hughes High School, where she excelled at such a rapid pace that she earned her diploma in just two and a half years.

 

Inspired by Albert Schweitzer’s work in medicine, Bath applied for and won a National Science Foundation scholarship while still in high school; this led her to a research project at Yeshiva University and Harlem Hospital Center on the connection between cancer, nutrition, and stress, which shifted her interest from science to medicine. The head of the research program recognized the significance of her findings and incorporated them in a scientific paper that he later presented. In 1960, still a teenager, Bath won the “Merit Award” of Mademoiselle magazine for her contribution to the project. Bath received her Bachelor of Arts in chemistry from Manhattan’s Hunter College in 1964 and relocated to Washington, D.C. to attend Howard University College of Medicine, where she received her medical degree in 1968. During her time at Howard, she was president of the Student National Medical Association and received fellowships from the National Institutes of Health and the National Institute of Mental Health.

 

Bath interned at Harlem Hospital Center and subsequently served as a fellow at Columbia University. Medical school itself had not been easy to reach, since her family did not have the funds for it. Traveling to Yugoslavia in 1967 to study children’s health made her aware that the practice of eye care was uneven among racial minorities and poor populations, with a much higher incidence of blindness among her African American and poor patients, and she determined that, as a physician, she would help address this issue. She persuaded her professors from Columbia to operate at no cost on blind patients at Harlem Hospital Center, which had not previously offered eye surgery. Bath pioneered the worldwide discipline of “community ophthalmology”, a volunteer-based outreach to bring necessary eye care to underserved populations.

 

After completing her education, Bath served briefly as an assistant professor at UCLA’s Jules Stein Eye Institute and at Charles R. Drew University of Medicine and Science before becoming the first woman on the faculty of the Eye Institute. In 1978, Bath co-founded the American Institute for the Prevention of Blindness, for which she served as president. In 1983, she became head of the ophthalmology residency program at Charles R. Drew, the first woman ever to head such a program. In 1993, she retired from UCLA, which subsequently elected her the first woman on its honorary medical staff. She served as a professor of ophthalmology at Howard University College of Medicine and as a professor of telemedicine and ophthalmology at St. George’s University. She was among the co-founders of the King-Drew Medical Center ophthalmology training program.

 

In 1981, she conceived the Laserphaco Probe, a medical device that improves on the use of lasers to remove cataracts by “ablating and removing cataract lenses”. The device was completed in 1986, after Bath conducted research on lasers in Berlin, and was patented in 1988, making her the first African American woman to receive a patent for a medical purpose. The device, which quickly and nearly painlessly dissolves the cataract with a laser, irrigates and cleans the eye, and permits the easy insertion of a new lens, is used internationally. Bath continued to improve the device and successfully restored vision to people who had been unable to see for decades. Three of Bath’s four patents relate to the Laserphaco Probe. In 2000, she was granted a patent for a method she devised for using ultrasound technology to treat cataracts. Bath has been honored by two of her universities: Hunter College placed her in its hall of fame in 1988, and Howard University declared her a “Howard University Pioneer in Academic Medicine” in 1993. A children’s picture book on her life and scientific work, The Doctor with an Eye for Eyes: The Story of Dr. Patricia Bath (The Innovation Press, ISBN 9781943147311), was published in 2017 and was named to best-of-the-year children’s book lists by both the National Science Teachers Association and the Chicago Public Library.

 

Cataract Surgery

A cataract surgery. Dictionnaire Universel de Medecine (1746-1748).

Graphic credit: Robert James (1703-1776); Wikipedia Commons; This work is in the public domain in its country of origin and other countries and areas where the copyright term is the author’s life plus 100 years or less. This file has been identified as being free of known restrictions under copyright law, including all related and neighboring rights.

 

 

Cataract surgery is one of the most frequently performed operations in the world. Recent advances in techniques and instrumentation have resulted in earlier intervention, improved surgical outcomes, and reduced dependence on spectacles.

 

The first record of cataract being treated surgically comes from Susruta, who carried out the procedure around 600 BCE. Cataracts were treated using a technique known as couching, in which the opaque lens is pushed into the vitreous cavity to remove it from the visual axis; couching is still performed in some parts of Africa and the Middle East. In 1753, Samuel Sharp performed the first intracapsular cataract extraction (ICCE) through a limbal incision, using pressure from his thumb to extract the lens. In 1961, Polish surgeon Tadeusz Krwawicz developed a cryoprobe which could be used to grasp and extract cataracts during ICCE surgery. However, an aphakic spectacle correction was still required. When the first edition of the Community Eye Health Journal was published, ICCE was still the most widely practiced method of cataract extraction in low- and middle-income countries. However, in high-income countries, ICCE had been superseded by extracapsular surgery with an intraocular lens (IOL) implant.

 

Modern extracapsular cataract extraction (ECCE) gained acceptance in high-income countries after the introduction of operating microscopes during the 1970s and 1980s made it possible to perform microsurgery. The microscopes offered better intraocular visibility and the ability to safely place multiple corneal sutures. ECCE has the advantage of leaving the posterior capsule intact; this reduces the risk of potentially blinding complications and makes it possible to implant a lens in the posterior chamber. Phacoemulsification was introduced in 1967 by Dr Charles Kelman. Since then, there have been significant improvements in the fluidics, energy delivery, efficiency and safety of this procedure. Advantages include small incision size, faster recovery and a reduced risk of complications.

 

Manual small-incision cataract surgery (MSICS) is a small-incision form of ECCE with a self-sealing wound which is mainly used in low-resource settings. MSICS has several advantages over phacoemulsification, including shorter operative time, less need for technology and a lower cost. It is also very effective in dealing with advanced and hard cataracts. As with modern ECCE techniques, MSICS also allows for a lens to be implanted. A recent introduction is femtosecond laser-assisted cataract surgery, during which a laser is used to dissect tissue at a microscopic level. Initial results from the recent FEMCAT trial suggest little or no improvement in safety and accuracy compared to standard phacoemulsification, and the procedure brings with it new clinical and financial challenges. Today, although phacoemulsification is considered the gold standard for cataract removal in high-income countries, MSICS is hugely popular and practiced widely in many countries of the world because of its universal applicability, efficiency and low cost.

 

Over the three decades since the first issue of the Community Eye Health Journal was published, the availability of microsurgery and of high-quality intraocular lenses (IOLs) at an acceptable cost has made a positive global impact on visual results after cataract surgery. IOLs can be placed in the anterior chamber or posterior chamber, or be supported by the iris. The preferred location is the posterior chamber, where the posterior chamber IOL (or PCIOL) is supported by the residual lens capsule. Sir Harold Ridley is credited with the first intraocular lens implantation in 1949, using polymethyl methacrylate (PMMA). Since then, numerous design and material modifications have made IOLs safer and more effective, and they have been in routine use in high-income countries since the 1980s. However, when the first edition of the CEHJ was published in 1988, an IOL cost approximately $200 and was far too expensive for widespread use in low- and middle-income countries. Thankfully, owing to the foresight and innovation of organizations such as the Fred Hollows Foundation and Aravind Eye Hospitals, IOLs are now produced at low cost in low- and middle-income countries and have become available to even the most disadvantaged patients.

 

With the introduction of the first multifocal and toric IOLs, the focus of IOL development has shifted toward improving refractive outcomes and reducing spectacle dependence. Toric lenses correct postoperative astigmatism, and multifocal lenses reduce dependency on spectacles for near vision. However, multifocal lenses may cause glare and reduced contrast sensitivity after surgery and should only be used in carefully selected patients. The accommodating lenses that are in current use are limited by their low and varied amplitude of accommodation. The light-adjustable lens is made of a photosensitive silicone material. Within two weeks of surgery, the residual refractive error (sphero-cylindrical errors as well as presbyopia) can be corrected by shining an ultraviolet light on the IOL through a dilated pupil to change the shape of the lens. Development of an intraocular lens (IOL) as a drug delivery device has been pursued for many years. Common postoperative conditions such as posterior capsular opacification (PCO), intraocular inflammation or endophthalmitis are potential therapeutic targets for a drug-eluting IOL.

Sources: British Council For Prevention of Blindness; Community Eye Health Journal is published by the International Centre for Eye Health, a research and education group based at the London School of Hygiene and Tropical Medicine (LSHTM), one of the leading Public Health training institutions in the world. Unless otherwise stated, all content is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License

 

Professor Thomas Hunt Morgan, Geneticist

Thomas Hunt Morgan (September 25, 1866 – December 4, 1945) was an American evolutionary biologist, geneticist, embryologist, and science author who won the Nobel Prize in Physiology or Medicine in 1933 for discoveries elucidating the role that the chromosome plays in heredity. Photo credit: Unknown – http://wwwihm.nlm.nih.gov/, Public Domain, https://commons.wikimedia.org/w/index.php?curid=549067; this image is one of several created for the 1891 Johns Hopkins yearbook.

 

Thomas Hunt Morgan received his Ph.D. from Johns Hopkins University in zoology in 1890. Following the rediscovery of Mendelian inheritance in 1900, Morgan began to study the genetic characteristics of the fruit fly Drosophila melanogaster. In his famous Fly Room at Columbia University, Morgan demonstrated that genes are carried on chromosomes and are the mechanical basis of heredity. These discoveries formed the basis of the modern science of genetics. As a result of his work, Drosophila became a major model organism in contemporary genetics. The Division of Biology which he established at the California Institute of Technology has produced seven Nobel Prize winners.

 

Morgan was born in Lexington, Kentucky, to Charlton Hunt Morgan and Ellen Key Howard Morgan. Part of a line of Southern planter elite on his father’s side, Morgan was a nephew of Confederate General John Hunt Morgan and his great-grandfather John Wesley Hunt had been the first millionaire west of the Allegheny Mountains. Through his mother, he was the great-grandson of Francis Scott Key, the author of the “Star Spangled Banner“, and John Eager Howard, governor and senator from Maryland. Beginning at age 16, Morgan attended the State College of Kentucky (now the University of Kentucky). He focused on science and particularly enjoyed natural history. He worked with the U.S. Geological Survey in his summers and graduated as valedictorian in 1886 with a BS degree. Following a summer at the Marine Biology School in Annisquam, Massachusetts, Morgan began graduate studies in zoology at Johns Hopkins University. After two years of experimental work with morphologist William Keith Brooks, Morgan received a master of science degree from the State College of Kentucky in 1888. The college offered Morgan a full professorship; however, he chose to stay at Johns Hopkins and was awarded a relatively large fellowship to help him fund his studies. Under Brooks, Morgan completed his thesis work on the embryology of sea spiders, to determine their phylogenetic relationship with other arthropods. He concluded that with respect to embryology, they were more closely related to spiders than crustaceans. Based on the publication of this work, Morgan was awarded his Ph.D. from Johns Hopkins in 1890, and was also awarded the Bruce Fellowship in Research. He used the fellowship to travel to Jamaica, the Bahamas and to Europe to conduct further research. Nearly every summer from 1890 to 1942, Morgan returned to the Marine Biological Laboratory to conduct research. He became very involved in governance of the institution, including serving as an MBL trustee from 1897 to 1945.

 

In 1890, Morgan was appointed associate professor (and head of the biology department) at Johns Hopkins’ sister school, Bryn Mawr College. During his first few years at Bryn Mawr, he produced descriptive studies of sea acorns, ascidian worms, and frogs. In 1894 Morgan was granted a year’s absence to conduct research in the laboratories of the Stazione Zoologica in Naples, where E. B. Wilson had worked two years earlier. There he worked with German biologist Hans Driesch, whose research in the experimental study of development piqued Morgan’s interest. Among other projects that year, Morgan completed an experimental study of ctenophores (commonly known as comb jellies), which live in marine waters worldwide. At the time, there was considerable scientific debate over the question of how an embryo developed. Following Wilhelm Roux’s mosaic theory of development, some believed that hereditary material was divided among embryonic cells, which were predestined to form particular parts of a mature organism. Driesch and others thought that development was due to epigenetic factors, where interactions between the protoplasm and the nucleus of the egg and the environment could affect development. Morgan was in the latter camp, and his work with Driesch demonstrated that blastomeres isolated from sea urchin and ctenophore eggs could develop into complete larvae, contrary to the predictions (and experimental evidence) of Roux’s supporters.

 

When Morgan returned to Bryn Mawr in 1895, he was promoted to full professor. Morgan’s main lines of experimental work involved regeneration and larval development; in each case, his goal was to distinguish internal and external causes to shed light on the Roux-Driesch debate. He wrote his first book, The Development of the Frog’s Egg (1897). He began a series of studies on different organisms’ ability to regenerate, looking at grafting and regeneration in tadpoles, fish, and earthworms; in 1901 he published his research as Regeneration. Beginning in 1900, Morgan started working on the problem of sex determination, which he had previously dismissed, when Nettie Stevens discovered the impact of the Y chromosome on gender. He also continued to study the evolutionary problems that had been the focus of his earliest work. In 1904, E. B. Wilson invited Morgan to join him at Columbia University. This move freed him to focus fully on experimental work. When Morgan took the professorship in experimental zoology, he became increasingly focused on the mechanisms of heredity and evolution. He had published Evolution and Adaptation (1903); like many biologists at the time, he saw evidence for biological evolution (as in the common descent of similar species) but rejected Darwin’s proposed mechanism of natural selection acting on small, constantly produced variations. Embryological development posed an additional problem in Morgan’s view, as selection could not act on the early, incomplete stages of highly complex organs such as the eye. The common solution, the Lamarckian mechanism of inheritance of acquired characters, which featured prominently in Darwin’s theory, was increasingly rejected by biologists.

Around 1908 Morgan started working on the fruit fly Drosophila melanogaster, and encouraged his students to do so as well. In a typical Drosophila genetics experiment, male and female flies with known phenotypes are put in a jar to mate (the females must be virgins); eggs are laid in porridge, which the larvae feed on, and when the life cycle is complete the progeny are scored for inheritance of the trait of interest. With Fernandus Payne, Morgan mutated Drosophila through physical, chemical, and radiational means. He began cross-breeding experiments to find heritable mutations, but had no significant success for two years; W. E. Castle had also had difficulty identifying mutations in the tiny flies. Finally, in 1909, a series of heritable mutants appeared, some of which displayed Mendelian inheritance patterns, and in 1910 Morgan noticed a white-eyed mutant male among the red-eyed wild types. When white-eyed flies were bred with a red-eyed female, their progeny were all red-eyed. A second-generation cross produced white-eyed males – a sex-linked recessive trait, the gene for which Morgan named white. Morgan also discovered a pink-eyed mutant that showed a different pattern of inheritance. In a paper published in Science in 1911, he concluded that (1) some traits were sex-linked, (2) the trait was probably carried on one of the sex chromosomes, and (3) other genes were probably carried on specific chromosomes as well. Morgan proposed that the amount of crossing over between linked genes differs and that crossover frequency might indicate the distance separating genes on the chromosome. The English geneticist J. B. S. Haldane later suggested that the unit of measurement for linkage be called the morgan. Morgan’s student Alfred Sturtevant developed the first genetic map in 1913.

 

Morgan’s fly room at Columbia became world-famous, and he found it easy to attract funding and visiting academics. In 1927, after 25 years at Columbia and nearing the age of retirement, he received an offer from George Ellery Hale to establish a school of biology in California. Morgan moved to California to head the Division of Biology at the California Institute of Technology in 1928. In 1933 Morgan was awarded the Nobel Prize in Physiology or Medicine. As an acknowledgement of the group nature of his discovery, he gave his prize money to Bridges’, Sturtevant’s, and his own children. Morgan declined to attend the awards ceremony in 1933, instead attending in 1934. The 1933 rediscovery of the giant polytene chromosomes in the salivary gland of Drosophila may have influenced his choice: until that point, the lab’s results had been inferred from phenotypic observations, and the visible polytene chromosomes enabled them to confirm those results on a physical basis. Morgan’s Nobel acceptance speech, entitled “The Contribution of Genetics to Physiology and Medicine”, downplayed the contribution genetics could make to medicine beyond genetic counselling. In 1939 he was awarded the Copley Medal by the Royal Society.

 

Morgan eventually retired in 1942, becoming professor and chairman emeritus. George Beadle returned to Caltech to replace Morgan as chairman of the department in 1946. Although he had retired, Morgan kept offices across the road from the Division and continued laboratory work. In his retirement, he returned to the questions of sexual differentiation, regeneration, and embryology. Morgan had suffered throughout his life from a chronic duodenal ulcer. In 1945, at age 79, he experienced a severe heart attack and died from a ruptured artery.

 

Below is Thomas Hunt Morgan’s Drosophila melanogaster genetic linkage map. This was the first successful gene-mapping work and provided important evidence for the chromosome theory of inheritance. The map shows the relative positions of allelic characteristics on the second Drosophila chromosome. The distances between genes (map units) are equal to the percentage of crossing-over events that occur between the different alleles.
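
As a rough, purely illustrative sketch of this mapping logic (the progeny counts and marker names below are hypothetical, not Morgan’s or Sturtevant’s data), the short Python snippet that follows converts the recombination frequency observed in a test cross between two linked genes into a map distance, using the rule that one map unit (centimorgan) corresponds to a 1% crossover frequency.

# Illustrative sketch with hypothetical counts: estimating the map distance
# between two linked genes from test-cross progeny, where one map unit
# (centimorgan) corresponds to a 1% recombination (crossover) frequency.

def map_distance(parental_counts, recombinant_counts):
    """Return the recombination frequency expressed in map units (centimorgans)."""
    total = sum(parental_counts) + sum(recombinant_counts)
    return 100.0 * sum(recombinant_counts) / total

# Hypothetical test-cross progeny for two linked markers, here just called a and b:
# most offspring show the parental allele combinations; recombinants are rarer.
parental = [430, 440]      # a b  and  + +
recombinant = [70, 60]     # a +  and  + b

print(f"Estimated distance: {map_distance(parental, recombinant):.1f} map units")
# prints 13.0 map units for these made-up counts

For short intervals this recombination frequency approximates the map distance well; over longer intervals, undetected double crossovers cause it to underestimate the true distance, which is one reason genetic maps are assembled from many short, overlapping intervals.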

 

Thomas Hunt Morgan’s Drosophila melanogaster genetic linkage map. This was the first successful gene-mapping work and provided important evidence for the Boveri-Sutton chromosome theory of inheritance. The map shows the relative positions of allelic characteristics on the second Drosophila chromosome; the alleles form a linkage group because of their tendency to be inherited together into gametes. The distances between genes (map units) are equal to the percentage of crossing-over events that occur between the different alleles. The diagram is based on the findings of Thomas Hunt Morgan’s Drosophila crosses. Graphic credit: Twaanders17 – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=40694655

 

Source: https://www.ncbi.nlm.nih.gov; Wikipedia

 

Nori Seaweed

Toasting a sheet of nori. 1864, Japanese painting; Wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=40283081

 

Nori is the Japanese name for edible seaweed species of the red algae genus Pyropia, including P. yezoensis and P. tenera. It is used chiefly as an ingredient (wrap) in sushi. Finished products are made by a shredding and rack-drying process that resembles papermaking. Originally, the term nori was generic and referred to seaweeds, including hijiki. One of the oldest descriptions of nori dates to around the 8th century: in the Taihō Code, enacted in 701, nori was already included as a form of taxation. Local people are described as drying nori in the Hitachi Province Fudoki (ca. 721), and nori was harvested in the Izumo Province Fudoki (ca. 713-733), showing that nori was used as food from ancient times. In Utsubo Monogatari, written around 987, nori was recognized as a common food. Nori was consumed in paste form until the sheet form was invented in Asakusa, Edo (contemporary Tokyo), around 1750 in the Edo period, through a method adapted from Japanese paper-making. The word “nori” first appeared in an English-language publication in C.P. Thunberg’s Trav., published in 1796. It was used in conjunction as “Awa nori”, probably referring to what is now called aonori.

 

The Japanese nori industry was in decline after World War II, when Japan was in need of all the food it could produce. The decline was due to a lack of understanding of nori’s three-stage life cycle, such that local people did not understand why traditional cultivation methods were no longer effective. The industry was rescued by knowledge deriving from the work of British phycologist Kathleen Mary Drew-Baker, who had been researching the organism Porphyra umbilicalis, which grew in the seas around Wales and was harvested for food, as in Japan. Her work was discovered by Japanese scientists, who applied it to artificial methods of seeding and growing nori, rescuing the industry. Kathleen Drew-Baker was hailed as the “Mother of the Sea” in Japan, and a statue was erected in her memory; she is still revered as the savior of the Japanese nori industry. In the 21st century, the Japanese nori industry faces a new decline due to increased competition from seaweed producers in China and Korea and domestic sales-tax hikes.

 

The word nori started to be used widely in the United States, and the product (imported in dry form from Japan) became widely available at natural food stores and Asian-American grocery stores in the 1960s, owing to the macrobiotic movement, and in the 1970s with the increase of sushi bars and Japanese restaurants. In one study by Jan-Hendrik Hehemann, subjects of Japanese descent were shown to be able to digest the seaweed’s polysaccharides after their gut microbes acquired the necessary enzyme from marine bacteria. Gut microbes from the North American subjects lacked these enzymes.

 

Production and processing of nori is an advanced form of agriculture. The biology of Pyropia, although complicated, is well understood, and this knowledge is used to control the production process. Farming takes place in the sea where the Pyropia plants grow attached to nets suspended at the sea surface and where the farmers operate from boats. The plants grow rapidly, requiring about 45 days from “seeding“ until the first harvest. Multiple harvests can be taken from a single seeding, typically at about ten-day intervals. Harvesting is accomplished using mechanical harvesters of a variety of configurations. Processing of raw product is mostly accomplished by highly automated machines that accurately duplicate traditional manual processing steps, but with much improved efficiency and consistency. The final product is a paper-thin, black, dried sheet of approximately 18 cm x 20 cm (7 in x 8 in) and 3 grams (0.11 oz.) in weight. Several grades of nori are available in the United States. The most common, and least expensive, grades are imported from China, costing about six cents per sheet. At the high end, ranging up to 90 cents per sheet, are “delicate shin-nori“ (nori from the first of the year’s several harvests) cultivated in Ariake Sea, off the island of Kyushu in Japan. In Japan, over 600 square kilometres (230 sq mi) of coastal waters are given to producing 340,000 tons of nori, worth over a billion dollars. China produces about a third of this amount.

 

Nori is commonly used as a wrap for sushi and onigiri. It is also a garnish or flavoring in noodle preparations and soups. It is most typically toasted prior to consumption (yaki-nori). A common secondary product is toasted and flavored nori (ajitsuke-nori), in which a flavoring mixture (variable, but typically soy sauce, sugar, sake, mirin, and seasonings) is applied in combination with the toasting process. It is also eaten by making it into a soy sauce-flavored paste, nori no tsukudani. Nori is also sometimes used as a form of food decoration or garnish. A related product, prepared from the unrelated green algae Monostroma and Enteromorpha, is called aonori (literally “blue/green nori”) and is used like an herb on everyday meals, such as okonomiyaki and yakisoba.

 

Since nori sheets easily absorb water from the air and degrade, a desiccant is indispensable when storing it for any significant time.

Harry Harlow PhD

Rhesus Macaque (Macaca mulatta). This file is licensed under the Creative Commons Attribution 2.0 Generic license.

 

Editor’s note: Much of Dr. Harry Harlow’s research has had an incredible impact on the world of infant and child psychology. The work can be shocking, and we don’t condone these kinds of experiments. However, because of the profound effect Harlow’s research has had on our understanding of early influences on children, our cultural mores have changed.

 

The work of Harry Harlow and Abraham Maslow was highly influential regarding the importance of “touch” and “rocking” in normal child development. Equally profound are theories that trace the origins of violence to a lack of touching and rocking in early infant and child rearing. When we viewed the 1970 Time-Life film describing in detail the monkey experiments of Harlow (linked below), we found them shocking. Thankfully, pressure from animal rights groups has helped put an end to such experimentation. However, keep in mind that the monkeys weren’t shocked or physically harmed in any way, and no drugs were used; the film reveals only the deeply sad results of psychological deprivation. You could say that Harlow’s experiments gave even more credence to the theories of Sigmund Freud.

 

 

Harry Frederick Harlow (October 31, 1905 – December 6, 1981) was an American psychologist best known for his maternal-separation, dependency-needs, and social-isolation experiments on rhesus monkeys, which demonstrated the importance of caregiving and companionship to social and cognitive development. He conducted most of his research at the University of Wisconsin-Madison, where humanistic psychologist Abraham Maslow worked with him. Harlow’s experiments were controversial; they included creating inanimate surrogate mothers for rhesus infants from wire and wood (described in detail below) and, later in his career, rearing infant monkeys in isolation chambers for up to 24 months, from which they emerged intensely disturbed. Some researchers cite the experiments as a factor in the rise of the animal liberation movement in the United States. A Review of General Psychology survey, published in 2002, ranked Harlow as the 26th most cited psychologist of the 20th century.

 

Harlow was born on October 31, 1905, to Mabel Rock and Alonzo Harlow Israel, and was raised in Fairfield, Iowa. After a year at Reed College in Portland, Oregon, he obtained admission to Stanford University through a special aptitude test and became a psychology major. Harlow entered Stanford in 1924 and subsequently became a graduate student in psychology, working directly under Calvin Perry Stone, a well-known animal behaviorist, and Walter Richard Miles, a vision expert, who were all supervised by Lewis Terman. Harlow studied largely under Terman, the developer of the Stanford-Binet IQ test, and Terman helped shape Harlow’s future. After receiving a PhD in 1930, Harlow changed his surname from Israel to Harlow. The change was made at Terman’s prompting, for fear of the negative consequences of having a seemingly Jewish last name, even though his family was not Jewish.

 

After completing his doctoral dissertation, Harlow accepted a professorship at the University of Wisconsin-Madison. Harlow was unsuccessful in persuading the Department of Psychology to provide him with adequate laboratory space. As a result, Harlow acquired a vacant building down the street from the University, and, with the assistance of his graduate students, renovated the building into what later became known as the Primate Laboratory, one of the first of its kind in the world. Under Harlow’s direction, it became a place of cutting-edge research at which some 40 students earned their PhDs.

 

After obtaining his doctorate in 1930, at Stanford University, Harlow began his career with nonhuman primate research at the University of Wisconsin. He worked with the primates at Henry Vilas Zoo, where he developed the Wisconsin General Testing Apparatus (WGTA) to study learning, cognition, and memory. It was through these studies that Harlow discovered that the monkeys he worked with were developing strategies for his tests. What would later become known as learning sets, Harlow described as “learning to learn.“ Harlow exclusively used rhesus macaques in his experiments. In order to study the development of these learning sets, Harlow needed access to developing primates, so he established a breeding colony of rhesus macaques in 1932. Due to the nature of his study, Harlow needed regular access to infant primates and thus chose to rear them in a nursery setting, rather than with their protective mothers. This alternative rearing technique, also called maternal deprivation, is highly controversial to this day, and is used, in variants, as a model of early life adversity in primates. Research with and caring for infant rhesus monkeys further inspired Harlow, and ultimately led to some of his best-known experiments: the use of surrogate mothers. Although Harlow, his students, contemporaries, and associates soon learned how to care for the physical needs of their infant monkeys, the nursery-reared infants remained very different from their mother-reared peers. Psychologically speaking, these infants were slightly strange: they were reclusive, had definite social deficits, and clung to their cloth diapers. For instance, babies that had grown up with only a mother and no playmates showed signs of fear or aggressiveness. Noticing their attachment to the soft cloth of their diapers and the psychological changes that correlated with the absence of a maternal figure, Harlow sought to investigate the mother-infant bond. This relationship was under constant scrutiny in the early twentieth century, as B. F. Skinner and the behaviorists took on John Bowlby in a discussion of the mother’s importance in the development of the child, the nature of their relationship, and the impact of physical contact between mother and child.

 

The studies were motivated by John Bowlby’s World Health Organization-sponsored study and report, “Maternal Care and Mental Health“ in 1950, in which Bowlby reviewed previous studies on the effects of institutionalization on child development, and the distress experienced by children when separated from their mothers, such as Rene Spitz’s and his own surveys on children raised in a variety of settings. In 1953, his colleague, James Robertson, produced a short and controversial documentary film, titled A Two-Year-Old Goes to Hospital, demonstrating the almost-immediate effects of maternal separation. Bowlby’s report, coupled with Robertson’s film, demonstrated the importance of the primary caregiver in human and non-human primate development. Bowlby de-emphasized the mother’s role in feeding as a basis for the development of a strong mother-child relationship, but his conclusions generated much debate. It was the debate concerning the reasons behind the demonstrated need for maternal care that Harlow addressed in his studies with surrogates. Physical contact with infants was considered harmful to their development, and this view led to sterile, contact-less nurseries across the country. Bowlby disagreed, claiming that the mother provides much more than food to the infant, including a unique bond that positively influences the child’s development and mental health. To investigate the debate, Harlow created inanimate surrogate mothers for the rhesus infants from wire and wood. Each infant became attached to its particular mother, recognizing its unique face and preferring it above all others. Harlow next chose to investigate if the infants had a preference for bare-wire mothers or cloth-covered mothers. For this experiment, he presented the infants with a clothed mother and a wire mother under two conditions. In one situation, the wire mother held a bottle with food, and the cloth mother held no food. In the other situation, the cloth mother held the bottle, and the wire mother had nothing. Overwhelmingly, the infant macaques preferred spending their time clinging to the cloth mother. Even when only the wire mother could provide nourishment, the monkeys visited her only to feed. Harlow concluded that there was much more to the mother-infant relationship than milk, and that this “contact comfort“ was essential to the psychological development and health of infant monkeys and children. It was this research that gave strong, empirical support to Bowlby’s assertions on the importance of love and mother-child interaction.

 

Successive experiments concluded that infants used the surrogate as a base for exploration, and a source of comfort and protection in novel and even frightening situations. In an experiment called the “open-field test”, an infant was placed in a novel environment with novel objects. When the infant’s surrogate mother was present, it clung to her, but then began venturing off to explore. If frightened, the infant ran back to the surrogate mother and clung to her for a time before venturing out again. Without the surrogate mother’s presence, the monkeys were paralyzed with fear, huddling in a ball and sucking their thumbs. In the “fear test”, infants were presented with a fearful stimulus, often a noise-making teddy bear. Without the mother, the infants cowered and avoided the object. When the surrogate mother was present, however, the infant did not show strong fear responses and often contacted the device – exploring and attacking it. Another study looked at the differential effects of being raised with only a wire mother or only a cloth mother. Both groups gained weight at equal rates, but the monkeys raised by a wire mother had softer stool and trouble digesting the milk, frequently suffering from diarrhea. Harlow’s interpretation of this behavior, which is still widely accepted, was that a lack of contact comfort is psychologically stressful to the monkeys, and the digestive problems are a physiological manifestation of that stress.

 

The importance of these findings is that they contradicted both the traditional pedagogic advice of limiting or avoiding bodily contact in an attempt to avoid spoiling children, and the insistence of the predominant behaviorist school of psychology that emotions were negligible. Feeding was thought to be the most important factor in the formation of a mother-child bond. Harlow concluded, however, that nursing strengthened the mother-child bond because of the intimate body contact that it provided. He described his experiments as a study of love. He also believed that contact comfort could be provided by either mother or father. Though widely accepted now, this idea was revolutionary at the time, provoking new thinking about the study of love. Some of Harlow’s final experiments explored social deprivation in the quest to create an animal model for the study of depression. This study is the most controversial and involved isolation of infant and juvenile macaques for various periods of time. Monkeys placed in isolation exhibited social deficits when introduced or re-introduced into a peer group. They appeared unsure of how to interact with their conspecifics and mostly stayed separate from the group, demonstrating the importance of social interaction and stimulation early in life for forming the ability to interact with conspecifics, in developing monkeys and, comparatively, in children. Critics of Harlow’s research have observed that clinging is a matter of survival in young rhesus monkeys, but not in humans, and have suggested that his conclusions, when applied to humans, overestimate the importance of contact comfort and underestimate the importance of nursing.

 

Harlow first reported the results of these experiments in “The Nature of Love“, the title of his address to the sixty-sixth Annual Convention of the American Psychological Association in Washington, D.C., August 31, 1958. Beginning in 1959, Harlow and his students began publishing their observations on the effects of partial and total social isolation. Partial isolation involved raising monkeys in bare wire cages that allowed them to see, smell, and hear other monkeys, but provided no opportunity for physical contact. Total social isolation involved rearing monkeys in isolation chambers that precluded any and all contact with other monkeys. Harlow et al. reported that partial isolation resulted in various abnormalities such as blank staring, stereotyped repetitive circling in their cages, and self-mutilation. These monkeys were then observed in various settings. For the study, some of the monkeys were kept in solitary isolation for 15 years. In the total isolation experiments, baby monkeys would be left alone for three, six, 12, or 24 months of “total social deprivation“. The experiments produced monkeys that were severely psychologically disturbed. Harlow wrote:

 

No monkey has died during isolation. When initially removed from total social isolation, however, they usually go into a state of emotional shock, characterized by autistic self-clutching and rocking. One of six monkeys isolated for 3 months refused to eat after release and died 5 days later. The autopsy report attributed death to emotional anorexia. The effects of 6 months of total social isolation were so devastating and debilitating that we had assumed initially that 12 months of isolation would not produce any additional decrement. This assumption proved to be false; 12 months of isolation almost obliterated the animals socially.

 

Harlow tried to reintegrate the monkeys who had been isolated for six months by placing them with monkeys who had been raised normally. The rehabilitation attempts met with limited success. Harlow wrote that total social isolation for the first six months of life produced “severe deficits in virtually every aspect of social behavior.” Isolates exposed to monkeys the same age who were reared normally “achieved only limited recovery of simple social responses.” Some monkey mothers reared in isolation exhibited “acceptable maternal behavior when forced to accept infant contact over a period of months, but showed no further recovery.” Isolates given to surrogate mothers developed “crude interactive patterns among themselves.” In another trial, the surrogate mother was designed to ‘reject’ the infant monkey; rejection was delivered through strong jets of air or blunt spikes that forced the baby away. The babies’ reactions were striking: after being rejected, the monkeys clung to the mothers even more tightly than before. These trials showed that nurturance involves more than feeding, and that the bond between mother and child is not based solely on food but on the time spent together.

 

Since Harlow’s pioneering work on touch in development, more recent work in rats has found evidence that touch during infancy results in a decrease in corticosteroid, a steroid hormone involved in stress, and an increase in glucocorticoid receptors in many regions of the brain. Schanberg and Field found that even short-term interruption of mother-pup interaction in rats markedly affected several biochemical processes in the developing pup: a reduction in ornithine decarboxylase (ODC) activity, a sensitive index of cell growth and differentiation; a reduction in growth hormone release (in all body organs, including the heart and liver, and throughout the brain, including the cerebrum, cerebellum, and brain stem); an increase in corticosterone secretion; and suppressed tissue ODC responsivity to administered growth hormone. Additionally, it was found that animals who are touch-deprived have weakened immune systems. Investigators have measured a direct, positive relationship between the amount of contact and grooming an infant monkey receives during its first six months of life and its ability to produce antibody titers (IgG and IgM) in response to an antigen challenge (tetanus) at a little over one year of age. Trying to identify a mechanism for the “immunology of touch”, some investigators point to modulations of arousal and associated CNS-hormonal activity. Touch deprivation may cause stress-induced activation of the pituitary-adrenal system, which, in turn, leads to increased plasma cortisol and adrenocorticotropic hormone. Likewise, researchers suggest, regular and “natural” stimulation of the skin may moderate these pituitary-adrenal responses in a positive and healthful way.

 

Harlow was well known for refusing to use conventional terminology, instead choosing deliberately outrageous terms for the experimental apparatus he devised. This came from an early conflict with the conventional psychological establishment in which Harlow used the term “love“ in place of the popular and archaically correct term, “attachment“. Such terms and respective devices included a forced-mating device he called the “rape rack“, tormenting surrogate-mother devices he called “Iron maidens“, and an isolation chamber he called the “pit of despair“, developed by him and a graduate student, Stephen Suomi. In the last of these devices, alternatively called the “well of despair“, baby monkeys were left alone in darkness for up to one year from birth, or repetitively separated from their peers and isolated in the chamber. These procedures quickly produced monkeys that were severely psychologically disturbed and used as models of human depression.

 

Many of Harlow’s experiments are now considered unethical – in their nature as well as in Harlow’s descriptions of them – and they both contributed to heightened awareness of the treatment of laboratory animals and helped propel the creation of today’s ethics regulations. The monkeys in the experiments were deprived of maternal affection, potentially leading to what in humans would be called panic disorders. University of Washington professor Gene Sackett, one of Harlow’s doctoral students, stated that Harlow’s experiments provided the impetus for the animal liberation movement in the U.S.

 

The monkeys used in these experiments eventually became mothers themselves and were observed to see what effect their ‘childhood’ had on them. The mothers tended to be either indifferent toward their babies or abusive. The indifferent mothers did not nurse, comfort, or protect their babies; however, they did not harm them either. The abusive mothers would violently bite or otherwise injure their infants, and many of these babies died as a result. This showed that how an individual is mothered has a major impact on how she will mother in turn.

 

1970 Time-Life Documentary Examines the Theories and Experiments of Dr. Harry Harlow.

 

Sources: nih.gov; Wikipedia; http://sites.psu.edu/dps16/2016/03/03/harlows-monkeys/

 

CRISPR

Double-Strand DNA Breaks Introduced by CRISPR-Cas9 Allow Further Genetic Manipulation by Exploiting Endogenous DNA Repair Mechanisms.

Graphic credit: by Guido4 – Own work; CC BY-SA 4.0; File:16 Hegasy DNA Rep Wiki E CCBYSA.png; Created: 1 November 2017; Wikipedia Creative Commons

 

The discovery of clustered DNA repeats occurred independently in three parts of the world. The first description of what would later be called CRISPR came from Osaka University researcher Yoshizumi Ishino and his colleagues in 1987. While studying the iap gene of the bacterium E. coli, they accidentally cloned part of a CRISPR together with the iap gene, their target of interest. The organization of the repeats was unusual, because repeated sequences are typically arranged consecutively along DNA. The function of the interrupted clustered repeats was not known at the time. In 1993, researchers studying Mycobacterium tuberculosis in the Netherlands published two articles about a cluster of interrupted direct repeats (DR) in this bacterium. These researchers recognized the diversity of the DR-intervening sequences among different strains of M. tuberculosis and used this property to design a typing method named spoligotyping, which is still in use today. At the same time, repeats were observed in the archaeal organisms Haloferax and Haloarcula, and their function was studied by Francisco Mojica at the University of Alicante in Spain. Although the hypothesis turned out to be wrong, Mojica’s supervisor surmised at the time that the clustered repeats had a role in correctly segregating replicated DNA into daughter cells during cell division, because plasmids and chromosomes with identical repeat arrays could not coexist in Haloferax volcanii. Transcription of the interrupted repeats was also noted for the first time. By 2000, Mojica had performed a survey of the scientific literature, and one of his students had searched published genomes with a program Mojica devised; they found interrupted repeats in 20 species of microbes. It was the first time different repeats with the same properties were identified as belonging to the same family, not yet known as CRISPR. In 2001, Mojica and Ruud Jansen, who was searching for additional interrupted repeats, proposed the acronym CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) to alleviate the confusion stemming from the numerous acronyms used to describe the sequences in the scientific literature.

 

CRISPR-Associated Systems

 

A major addition to the understanding of CRISPR came with Jansen’s observation that the prokaryote repeat cluster was accompanied by a set of homologous genes that make up CRISPR-associated systems, or cas genes. Four cas genes (cas1-cas4) were initially recognized. The Cas proteins showed helicase and nuclease motifs, suggesting a role in the dynamic structure of the CRISPR loci. In this publication the acronym CRISPR was coined as the universal name of this pattern. However, the function of CRISPR remained enigmatic.

 

Simplified diagram of a CRISPR locus: the three major components of a CRISPR locus are cas genes, a leader sequence, and a repeat-spacer array (repeats are typically drawn as gray boxes and spacers as colored bars). The arrangement of the three components is not always as shown, and several CRISPRs with similar sequences can be present in a single genome, only one of which is associated with cas genes.

In 2005, three independent research groups showed that some CRISPR spacers are derived from phage DNA and extrachromosomal DNA such as plasmids. In effect, the spacers are fragments of DNA gathered from viruses that previously tried to attack the cell. The source of the spacers was a sign that the CRISPR/cas system could have a role in adaptive immunity in bacteria. All three studies proposing this idea were initially rejected by high-profile journals, but eventually appeared in other journals. The first publication proposing a role of CRISPR-Cas in microbial immunity, by the researchers at the University of Alicante, predicted a role for the RNA transcript of spacers in target recognition, in a mechanism that could be analogous to the RNA interference system used by eukaryotic cells. This hypothesis had already been defended in a pre-doctoral examination and at a scientific meeting in 2004. Koonin and colleagues extended this RNA interference hypothesis by proposing mechanisms of action for the different CRISPR-Cas subtypes according to the predicted function of their proteins.

 

Experimental work by several groups revealed the basic mechanisms of CRISPR-Cas immunity. In 2007 the first experimental evidence that CRISPR was an adaptive immune system was published: a CRISPR region in Streptococcus thermophilus acquired spacers from the DNA of an infecting bacteriophage, and the researchers manipulated the resistance of S. thermophilus to phage by adding and deleting spacers whose sequences matched those found in the tested phages. In 2008, Brouns and Van der Oost identified a complex of Cas proteins (called Cascade) that in E. coli cuts the CRISPR RNA precursor within the repeats into mature spacer-containing RNA molecules (crRNA), which remain bound to the protein complex. Moreover, it was found that Cascade, crRNA, and a helicase/nuclease (Cas3) were required to provide a bacterial host with immunity against infection by a DNA virus. By designing an anti-virus CRISPR, they demonstrated that two orientations of the crRNA (sense/antisense) provided immunity, indicating that the crRNA guides were targeting dsDNA. That year Marraffini and Sontheimer confirmed that a CRISPR sequence of S. epidermidis targeted DNA and not RNA to prevent conjugation. This finding was at odds with the proposed RNA-interference-like mechanism of CRISPR-Cas immunity, although a CRISPR-Cas system that targets foreign RNA was later found in Pyrococcus furiosus. A 2010 study showed that CRISPR-Cas cuts both strands of phage and plasmid DNA in S. thermophilus.

 

Cas9

 

Researchers studied a simpler CRISPR system from Streptococcus pyogenes that relies on the protein Cas9. The Cas9 endonuclease is a four-component system that includes two small RNA molecules named CRISPR RNA (crRNA) and trans-activating CRISPR RNA (tracrRNA). Jennifer Doudna and Emmanuelle Charpentier re-engineered the Cas9 endonuclease into a more manageable two-component system by fusing the two RNA molecules into a “single-guide RNA“ that, when combined with Cas9, could find and cut the DNA target specified by the guide RNA. By manipulating the nucleotide sequence of the guide RNA, the artificial Cas9 system could be programmed to target any DNA sequence for cleavage. Another group of collaborators comprising Siksnys together with Gasiunas, Barrangou and Horvath showed that Cas9 from the S. thermophilus CRISPR system can also be reprogrammed to target a site of their choosing by changing the sequence of its crRNA. These advances fueled efforts to edit genomes with the modified CRISPR-Cas9 system. Feng Zhang’s and George Church’s groups simultaneously described genome editing in human cell cultures using CRISPR-Cas9 for the first time. It has since been used in a wide range of organisms, including baker’s yeast (Saccharomyces cerevisiae), the opportunistic pathogen C. albicans, zebrafish (D. rerio), fruit flies (Drosophila melanogaster), nematodes (C. elegans), plants, mice, monkeys and human embryos.
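
As a minimal, illustrative sketch of that programmability (this is not any research group’s actual software, and the demo sequence is invented for the example), the Python snippet below scans a DNA strand for the NGG PAM recognized by S. pyogenes Cas9, reports the 20-nucleotide protospacer that a single-guide RNA would be designed to match, and marks the expected blunt cut site about 3 bp upstream of the PAM.

# Illustrative sketch of SpCas9 target selection on one strand: find NGG PAM
# sites, take the 20-nt protospacer immediately 5' of the PAM, and mark the
# expected blunt double-strand cut ~3 bp upstream of the PAM. The sequence is
# hypothetical; real guide design also scans the reverse strand and screens
# candidates for off-target matches, GC content, and other criteria.
import re

def find_cas9_targets(seq, protospacer_len=20):
    seq = seq.upper()
    targets = []
    for m in re.finditer(r"(?=([ACGT]GG))", seq):      # NGG PAM (allows overlapping hits)
        pam_start = m.start(1)
        if pam_start >= protospacer_len:               # need room for a full protospacer
            protospacer = seq[pam_start - protospacer_len:pam_start]
            pam = seq[pam_start:pam_start + 3]
            cut_site = pam_start - 3                   # blunt cut ~3 bp upstream of the PAM
            targets.append((protospacer, pam, cut_site))
    return targets

demo = "ATGCTGACCTTAGGCATCGATTTACGGAGCTAGCTAGGCTAGCTTAGGCTAACGG"
for protospacer, pam, cut in find_cas9_targets(demo):
    print(f"protospacer={protospacer}  PAM={pam}  cut between positions {cut - 1} and {cut}")

By changing only the 20-nucleotide guide sequence, the same Cas9 protein can be redirected to a different site, which is the programmability the paragraph above describes.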

 

CRISPR has been modified to make programmable transcription factors that allow scientists to target and activate or silence specific genes. The CRISPR/Cas9 system has been shown to make effective gene edits in human tripronuclear zygotes, as first described in a 2015 paper by Chinese scientists P. Liang and Y. Xu. The system achieved successful cleavage of the mutant beta-globin (HBB) gene in 28 out of 54 embryos, and 4 of the 28 were successfully recombined using a donor template supplied by the scientists. The scientists showed that during DNA recombination of the cleaved strand, the homologous endogenous sequence HBD competes with the exogenous donor template. DNA repair in human embryos is much more complicated and particular than in derived stem cells.

 

Cpf1

 

In 2015, the nuclease Cpf1 was discovered in the CRISPR/Cpf1 system of the bacterium Francisella novicida. Cpf1 showed several key differences from Cas9, including: causing a ‘staggered’ cut in double-stranded DNA as opposed to the ‘blunt’ cut produced by Cas9, relying on a ‘T-rich’ PAM (providing alternative targeting sites to Cas9), and requiring only a CRISPR RNA (crRNA) for successful targeting. By contrast, Cas9 requires both crRNA and a trans-activating crRNA (tracrRNA). These differences may give Cpf1 some advantages over Cas9. For example, Cpf1’s small crRNAs are ideal for multiplexed genome editing, as more of them can be packaged in one vector than can Cas9’s sgRNAs. As well, the sticky 5′ overhangs left by Cpf1 can be used for DNA assembly that is much more target-specific than traditional restriction enzyme cloning. Finally, Cpf1 cleaves DNA 18-23 bp downstream from the PAM site. This means there is no disruption to the recognition sequence after repair, and so Cpf1 enables multiple rounds of DNA cleavage. By contrast, since Cas9 cuts only 3 bp upstream of the PAM site, the NHEJ pathway results in indel mutations which destroy the recognition sequence, thereby preventing further rounds of cutting. In theory, repeated rounds of DNA cleavage should increase the opportunity for the desired genomic editing to occur.
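
To make that geometric contrast concrete, here is a small, purely illustrative Python sketch; the offsets are representative values taken from the description above (a blunt Cas9 cut about 3 bp upstream of its NGG PAM versus a staggered Cpf1 cut roughly 18-23 bp downstream of its T-rich PAM, leaving a short 5′ overhang), not parameters from any particular protocol.

# Toy comparison of cut geometry (positions are indices along a hypothetical strand).
# SpCas9: PAM at the 3' end of the protospacer; both strands cut at the same point,
# ~3 bp upstream of the PAM (blunt ends), so repair indels destroy the target site.
# Cpf1 (Cas12a): T-rich PAM at the 5' end; the two strands are nicked at different
# points ~18-23 bp downstream of the PAM, leaving a 4-5 nt 5' overhang and keeping
# the PAM/seed region intact, so the site can be cut again after imprecise repair.

def cas9_cut(pam_index):
    """Return (top-strand cut, bottom-strand cut) for a blunt Cas9 cut."""
    cut = pam_index - 3
    return cut, cut

def cpf1_cut(pam_index, pam_len=4, nontarget_offset=18, overhang=5):
    """Return staggered (non-target strand cut, target strand cut) for Cpf1."""
    nontarget_cut = pam_index + pam_len + nontarget_offset
    target_cut = nontarget_cut + overhang
    return nontarget_cut, target_cut

print("Cas9 cut positions (both strands):", cas9_cut(pam_index=30))   # -> (27, 27)
print("Cpf1 cut positions (staggered):  ", cpf1_cut(pam_index=5))     # -> (27, 32)

Because the Cpf1 cut sits well away from the PAM and seed sequence, small indels introduced during repair usually leave the recognition site intact, which is what permits the repeated rounds of cleavage described above.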

 

Phrenology

An 1883 phrenology chart

 

Graphic credit: From People’s Cyclopedia of Universal Knowledge (1883). Transferred from en.wikipedia Original uploader was Whbonney at en.wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=6693422

 

Phrenology is a pseudomedicine primarily focused on measurements of the human skull, based on the concept that the brain is the organ of the mind, and that certain brain areas have localized, specific functions or modules. Although both of those ideas have a basis in reality, phrenology extrapolated beyond empirical knowledge in a way that departed from science. Developed by German physician Franz Joseph Gall in 1796, the discipline was very popular in the 19th century, especially from about 1810 until 1840. The principal British center for phrenology was Edinburgh, where the Edinburgh Phrenological Society was established in 1820. Although now regarded as an obsolete amalgamation of primitive neuroanatomy with moral philosophy, phrenological thinking was influential in 19th-century psychiatry. Gall’s assumption that character, thoughts, and emotions are located in specific parts of the brain is considered an important historical advance toward neuropsychology.

 

Phrenologists believe that the human mind has a set of various mental faculties, each one represented in a different area of the brain. For example, the faculty of “philoprogenitiveness”, from the Greek for “love of offspring”, was located centrally at the back of the head (see the 1883 phrenology chart illustrated above).

These areas were said to be proportional to a person’s propensities. The importance of an organ was derived from its relative size compared to other organs. It was believed that the cranial skull – like a glove on the hand – accommodates the different sizes of these areas of the brain, so that a person’s capacity for a given personality trait could be determined simply by measuring the area of the skull that overlies the corresponding area of the brain. Phrenology, which focuses on personality and character, is distinct from craniometry, which is the study of skull size, weight and shape, and from physiognomy, the study of facial features. Phrenology is a process that involves observing and/or feeling the skull to determine an individual’s psychological attributes. Franz Joseph Gall believed that the brain was made up of 27 individual organs that determined personality, the first 19 of which he believed to exist in other animal species. Phrenologists would run their fingertips and palms over the skulls of their patients to feel for enlargements or indentations. The phrenologist would often take measurements of the overall head size with a tape measure and, more rarely, employ a craniometer, a special version of a caliper. In general, instruments to measure the size of the cranium continued to be used after mainstream phrenology had ended. Phrenologists also put emphasis on using drawings of individuals with particular traits to determine the character of the person, and thus many phrenology books show pictures of subjects. From the absolute and relative sizes of the skull the phrenologist would assess the character and temperament of the patient.

 

Gall’s list of the “brain organs” was specific. An enlarged organ meant that the patient used that particular “organ” extensively. The number – and more detailed meanings – of organs were added later by other phrenologists. The 27 areas varied in function, from sense of color, to religiosity, to being combative or destructive. Each of the 27 “brain organs” was located under a specific area of the skull. As a phrenologist felt the skull, he would use his knowledge of the shapes of heads and organ positions to determine the overall natural strengths and weaknesses of an individual. Phrenologists believed the head revealed natural tendencies but not absolute limitations or strengths of character. The first phrenological chart gave the names of the organs described by Gall; it was a single sheet, and sold for a cent. Later charts were more expansive.

 

Historically, among the first to identify the brain as the major controlling center for the body were Hippocrates and his followers, inaugurating a major change in thinking from Egyptian, biblical and early Greek views, which based bodily primacy of control on the heart. This belief was supported by the Greek physician Galen, who concluded that mental activity occurred in the brain rather than the heart, contending that the brain, a cold, moist organ formed of sperm, was the seat of the animal soul – one of three “souls” found in the body, each associated with a principal organ. The Swiss pastor Johann Kaspar Lavater (1741-1801) introduced the idea that physiognomy related to the specific character traits of individuals, rather than general types, in his Physiognomische Fragmente, published between 1775 and 1778. His work was translated into English and published in 1832 as The Pocket Lavater, or, The Science of Physiognomy. He believed that thoughts of the mind and passions of the soul were connected with an individual’s external frame. Of the forehead, he wrote: “When the forehead is perfectly perpendicular, from the hair to the eyebrows, it denotes an utter deficiency of understanding.”

 

In 1796 the German physician Franz Joseph Gall (1758-1828) began lecturing on organology (the isolation of mental faculties) and, later, cranioscopy, which involved reading the skull’s shape as it pertained to the individual. It was Gall’s collaborator Johann Gaspar Spurzheim who would popularize the term “phrenology”. In 1809 Gall began writing his principal work, The Anatomy and Physiology of the Nervous System in General, and of the Brain in Particular, with Observations upon the possibility of ascertaining the several Intellectual and Moral Dispositions of Man and Animal, by the configuration of their Heads. It was not published until 1819. In the introduction to this main work, Gall makes the following statement in regard to his doctrinal principles, which comprise the intellectual basis of phrenology:

 

1. The brain is the organ of the mind

2. The brain is not a homogenous unity, but an aggregate of mental organs with specific functions

3. The cerebral organs are topographically localized

4. Other things being equal, the relative size of any particular mental organ is indicative of the power or strength of that organ

5. Since the skull ossifies over the brain during infant development, external craniological means could be used to diagnose the internal states of the mental characters

 

Through careful observation and extensive experimentation, Gall believed he had established a relationship between aspects of character, called faculties, and precise organs in the brain. Johann Spurzheim was Gall’s most important collaborator; he worked as Gall’s anatomist until 1813, when for unknown reasons they had a permanent falling out. Publishing under his own name, Spurzheim successfully disseminated phrenology throughout the United Kingdom during his lecture tours of 1814 and 1815, and in the United States in 1832, where he would eventually die. Gall was more concerned with creating a physical science, so it was through Spurzheim that phrenology was first spread throughout Europe and America. Phrenology, while not universally accepted, was hardly a fringe phenomenon of the era. George Combe would become the chief promoter of phrenology throughout the English-speaking world after he viewed a brain dissection by Spurzheim, which convinced him of phrenology’s merits.

 

The popularization of phrenology among the middle and working classes was due in part to the idea that scientific knowledge was important and an indication of sophistication and modernity. Cheap and plentiful pamphlets, as well as the growing popularity of scientific lectures as entertainment, also helped spread phrenology to the masses. Combe created a system of philosophy of the human mind that became popular with the masses because of its simplified principles and wide range of social applications that were in harmony with the liberal Victorian world view. George Combe’s book On the Constitution of Man and its Relationship to External Objects sold over 200,000 copies through nine editions. Combe also devoted a large portion of his book to reconciling religion and phrenology, which had long been a sticking point. Another reason for its popularity was that phrenology struck a balance between free will and determinism. A person’s inherent faculties were clear, and no faculty was viewed as evil, though the abuse of a faculty was. Phrenology allowed for self-improvement and upward mobility, while providing fodder for attacks on aristocratic privilege. Phrenology also had wide appeal because it was a reformist philosophy, not a radical one. Phrenology was not limited to the common people: both Queen Victoria and Prince Albert invited George Combe to read the heads of their children.

 

Phrenology came about at a time when scientific procedures and standards for acceptable evidence were still being codified. In the context of Victorian society, phrenology was a respectable scientific theory. The Phrenological Society of Edinburgh, founded by George and Andrew Combe, was an example of the credibility of phrenology at the time and included a number of extremely influential social reformers and intellectuals, including the publisher Robert Chambers, the astronomer John Pringle Nichol, the evolutionary environmentalist Hewett Cottrell Watson, and the asylum reformer William A.F. Browne. In 1826, an estimated one third of the 120 members of the Edinburgh society were from a medical background. By the 1840s there were more than 28 phrenological societies in London with over 1,000 members. Another important scholar was Luigi Ferrarese, the leading Italian phrenologist. He advocated that governments should embrace phrenology as a scientific means of conquering many social ills, and his Memorie Risguardanti La Dottrina Frenologica (1836) is considered “one of the fundamental 19th century works in the field”.

 

Traditionally the mind had been studied through introspection. Phrenology provided an attractive, biological alternative that attempted to unite all mental phenomena using consistent biological terminology. Gall’s approach prepared the way for studying the mind in ways that would lead to the downfall of his own theories. Phrenology contributed to the development of physical anthropology, forensic medicine, and knowledge of the nervous system and brain anatomy, as well as to applied psychology. John Elliotson was a brilliant but erratic heart specialist who became a phrenologist in the 1840s. He was also a mesmerist and combined the two into something he called phrenomesmerism or phreno-magnetism. Changing behavior through mesmerism eventually won out in Elliotson’s hospital, putting phrenology in a subordinate role. Others amalgamated phrenology and mesmerism as well, such as the practical phrenologists Collyer and Joseph R. Buchanan. The supposed benefit of combining mesmerism and phrenology was that the trance the patient was placed in allowed for the manipulation of his or her penchants and qualities. For example, if the organ of self-esteem was touched, the subject would take on a haughty expression.

 

Phrenology has been psychology’s great faux pas. – J.C. Flugel (1933)

 

Phrenology was mostly discredited as a scientific theory by the 1840s. This was due only in part to a growing amount of evidence against it. Phrenologists had never been able to agree on the most basic number of mental organs, ranging from 27 to over 40, and had difficulty locating them. Phrenologists relied on cranioscopic readings of the skull to find organ locations. Jean Pierre Flourens’ experiments on the brains of pigeons indicated that the loss of parts of the brain either caused no loss of function, or the loss of a completely different function than the one attributed to it by phrenology. Flourens’ experiments, while not perfect, seemed to indicate that Gall’s supposed organs were imaginary. Scientists had also become disillusioned with phrenology because of its exploitation among the middle and working classes by entrepreneurs. The popularization had resulted in the simplification of phrenology and the mixing into it of principles of physiognomy, which Gall had rejected from the start as an indicator of personality. Phrenology from its inception was tainted by accusations of promoting materialism and atheism, and of being destructive of morality. These were all factors which led to the downfall of phrenology. Recent studies, using modern-day technology such as magnetic resonance imaging, have further disproven phrenology’s claims.

 

During the early 20th century, a revival of interest in phrenology occurred, partly because of studies of evolution, criminology and anthropology (as pursued by Cesare Lombroso). The most famous British phrenologist of the 20th century was the London psychiatrist Bernard Hollander (1864-1934). His main works, The Mental Function of the Brain (1901) and Scientific Phrenology (1902), are an appraisal of Gall’s teachings. Hollander introduced a quantitative approach to the phrenological diagnosis, defining a method for measuring the skull, and comparing the measurements with statistical averages. In Belgium, Paul Bouts (1900-1999) began studying phrenology from a pedagogical background, using the phrenological analysis to define an individual pedagogy. Combining phrenology with typology and graphology, he coined a global approach known as psychognomy. Bouts, a Roman Catholic priest, became the main promoter of renewed 20th-century interest in phrenology and psychognomy in Belgium. He was also active in Brazil and Canada, where he founded institutes for characterology. His works Psychognomie and Les Grandioses Destinees individuelle et humaine dans la lumiere de la Caracterologie et de l’Evolution cerebro-cranienne are considered standard works in the field. In the latter work, which examines the subject of paleoanthropology, Bouts developed a teleological and orthogenetical view on a perfecting evolution, from the paleo-encephalical skull shapes of prehistoric man, which he considered still prevalent in criminals and savages, towards a higher form of mankind, thus perpetuating phrenology’s problematic racializing of the human frame. Bouts died on March 7, 1999. His work has been continued by the Dutch foundation PPP (Per Pulchritudinem in Pulchritudine), operated by Anette Muller, one of Bouts’ students. During the 1930’s Belgian colonial authorities in Rwanda used phrenology to explain the so-called superiority of Tutsis over Hutus.

 

King George III of Great Britain

Full-length portrait in oils of a clean-shaven young George in eighteenth century dress: gold jacket and breeches, ermine cloak, powdered wig, white stockings, and buckled shoes.

 

Graphic credit: English painter, Allan Ramsay – vgGv1tsB1URdhg at Google Cultural Institute maximum zoom level, Public Domain, https://commons.wikimedia.org/w/index.php?curid=23604082

 

 

Editor’s note: We are including extra information about King George III not only because all of Europe was seething during his reign, but also because the King’s illness remains curious, with no definitive diagnosis to this day, and because American history of the period is inextricably bound up with him. Finally, we thought readers should know, more than the average educated American does, that King George III was not the tyrant most Americans believe him to have been, but an astute politician, a curious intellectual, a highly cultured person, and a moral man within his own family, which was very dear to him, acting on behalf of his beloved country. We hope you come away understanding King George III better after you read this short piece. Readers may not realize that this king was exceedingly popular with the people of Great Britain.

George III (George William Frederick; 4 June 1738 – 29 January 1820) was King of Great Britain and King of Ireland from 25 October 1760 until the union of the two countries on 1 January 1801. After the union, he was King of the United Kingdom of Great Britain and Ireland until his death. He was concurrently Duke and prince-elector of Brunswick-Luneburg (‘Hanover’) in the Holy Roman Empire before becoming King of Hanover on 12 October 1814. He was the third British monarch of the House of Hanover, but unlike his two predecessors, he was born in England, spoke English as his first language, and never visited Hanover. His life and, with it, his reign were longer than those of any of his predecessors and were marked by a series of military conflicts involving his kingdoms, much of the rest of Europe, and places farther afield in Africa, the Americas and Asia. Early in his reign, Great Britain defeated France in the Seven Years’ War, becoming the dominant European power in North America and India. However, many of Britain’s American colonies were soon lost in the American War of Independence. Further wars against revolutionary and Napoleonic France from 1793 concluded in the defeat of Napoleon at the Battle of Waterloo in 1815.

 

In the later part of his life, George III had recurrent, and eventually permanent, mental illness. Although it has since been suggested that he had the blood disease porphyria, the cause of his illness remains unknown. After a final relapse in 1810, a regency was established, and George III’s eldest son, George, the conniving Prince of Wales, ruled as Prince Regent. On George III’s death, the Prince Regent succeeded his father as George IV. Historical analysis of George III’s life has gone through a ‘kaleidoscope of changing views’ that have depended heavily on the prejudices of his biographers and the sources available to them. Until it was reassessed in the second half of the 20th century, his reputation in the United States was one of a tyrant; and in Britain he became, for a minority, ‘the scapegoat for the failure of imperialism’.

 

George was born in London at Norfolk House in St James’s Square. He was the grandson of King George II, and the eldest son of Frederick, Prince of Wales, and Augusta of Saxe-Gotha. Because Prince George was born two months prematurely and was thought unlikely to survive, he was baptized the same day by Thomas Secker, who was both Rector of St James’s and Bishop of Oxford. One month later, he was publicly baptized at Norfolk House, again by Secker. His godparents were the King of Sweden (for whom Lord Baltimore stood proxy), his uncle the Duke of Saxe-Gotha (for whom Lord Carnarvon stood proxy) and his great-aunt the Queen of Prussia (for whom Lady Charlotte Edwin stood proxy). George grew into a healthy but reserved and shy child. The family moved to Leicester Square, where George and his younger brother Prince Edward, Duke of York and Albany, were educated together by private tutors. Family letters show that he could read and write in both English and German, as well as comment on political events of the time, by the age of eight. He was the first British monarch to study science systematically. Apart from chemistry and physics, his lessons included astronomy, mathematics, French, Latin, history, music, geography, commerce, agriculture and constitutional law, along with sporting and social accomplishments such as dancing, fencing, and riding. His religious education was wholly Anglican. At age 10 George took part in a family production of Joseph Addison’s play Cato and said in the new prologue: ‘What, tho’ a boy! It may with truth be said, A boy in England born, in England bred.’ Historian Romney Sedgwick argued that these lines appear ‘to be the source of the only historical phrase with which he is associated’. Clearly, this historian was one of the minority who downplayed King George’s many talents.

 

George’s grandfather, King George II, disliked the Prince of Wales and took little interest in his grandchildren. However, in 1751 the Prince of Wales died unexpectedly from a lung injury, and George became heir apparent to the throne. He inherited his father’s title of Duke of Edinburgh. Now more interested in his grandson, the King created George Prince of Wales three weeks later (the title is not automatically acquired). In the spring of 1756, as George approached his 18th birthday, the King offered him a grand establishment at St James’s Palace, but George refused the offer, guided by his mother and her confidant, Lord Bute, who would later serve as Prime Minister. George’s mother, now the Dowager Princess of Wales, preferred to keep George at home, where she could imbue him with her strict moral values.

 

In 1759, George was smitten with Lady Sarah Lennox, sister of the Duke of Richmond, but Lord Bute advised against the match and George abandoned his thoughts of marriage. ‘I am born for the happiness or misery of a great nation,’ he wrote, ‘and consequently must often act contrary to my passions.’ Nevertheless, attempts by the King to marry George to Princess Sophie Caroline of Brunswick-Wolfenbuttel were resisted by him and his mother; Sophie married the Margrave of Bayreuth instead. The following year, at the age of 22, George succeeded to the throne when his grandfather, George II, died suddenly on 25 October 1760, two weeks before his 77th birthday. The search for a suitable wife intensified. On 8 September 1761 in the Chapel Royal, St James’s Palace, the King married Princess Charlotte of Mecklenburg-Strelitz, whom he met on their wedding day. A fortnight later on 22 September both were crowned at Westminster Abbey. George remarkably never took a mistress (in contrast with his grandfather and his sons), and the couple enjoyed a genuinely happy marriage until his mental illness struck. They had 15 children – nine sons and six daughters. In 1762, George purchased Buckingham House (on the site now occupied by Buckingham Palace) for use as a family retreat. His other residences were Kew and Windsor Castle. St James’s Palace was retained for official use. He did not travel extensively, and spent his entire life in southern England. In the 1790s, the King and his family took holidays at Weymouth, Dorset, which he thus popularized as one of the first seaside resorts in England.

 

George, in his accession speech to Parliament, proclaimed: ‘Born and educated in this country, I glory in the name of Britain.’ He inserted this phrase into the speech, written by Lord Hardwicke, to demonstrate his desire to distance himself from his German forebears, who were perceived as caring more for Hanover than for Britain. Although his accession was at first welcomed by politicians of all parties, the first years of his reign were marked by political instability, largely generated as a result of disagreements over the Seven Years’ War. George was also perceived as favoring Tory ministers, which led to his denunciation by the Whigs as an autocrat. On his accession, the Crown lands produced relatively little income; most revenue was generated through taxes and excise duties. George surrendered the Crown Estate to Parliamentary control in return for a civil list annuity for the support of his household and the expenses of civil government. Claims that he used the income to reward supporters with bribes and gifts are disputed by historians who say such claims ‘rest on nothing but falsehoods put out by disgruntled opposition’.

 

Debts amounting to over 3 million pounds over the course of George’s reign were paid by Parliament, and the civil list annuity was increased from time to time. He aided the Royal Academy of Arts with large grants from his private funds and may have donated more than half of his personal income to charity. Of his art collection, the two most notable purchases are Johannes Vermeer’s Lady at the Virginals and a set of Canalettos, but it is as a collector of books that he is best remembered. The King’s Library was open and available to scholars and was the foundation of a new national library. In May 1762, the incumbent Whig government of the Duke of Newcastle was replaced with one led by the Scottish Tory Lord Bute. Bute’s opponents worked against him by spreading the calumny that he was having an affair with the King’s mother, and by exploiting anti-Scottish prejudices amongst the English. John Wilkes, a member of parliament, published The North Briton, which was both inflammatory and defamatory in its condemnation of Bute and the government. Wilkes was eventually arrested for seditious libel but he fled to France to escape punishment; he was expelled from the House of Commons, and found guilty in absentia of blasphemy and libel. In 1763, after concluding the Peace of Paris which ended the war, Lord Bute resigned, allowing the Whigs under George Grenville to return to power. Later that year, the Royal Proclamation of 1763 placed a limit upon the westward expansion of the American colonies. The Proclamation aimed to divert colonial expansion to the north (to Nova Scotia) and to the south (Florida). The Proclamation Line did not bother the majority of settled farmers, but it was unpopular with a vocal minority and ultimately contributed to conflict between the colonists and the British government.

 

With the American colonists generally unburdened by British taxes, the government thought it appropriate for them to pay towards the defense of the colonies against native uprisings and the possibility of French incursions. The central issue for the colonists was not the amount of taxes but whether Parliament could levy a tax without American approval, for there were no American seats in Parliament. The Americans protested that like all Englishmen they had rights to ‘no taxation without representation’. In 1765, Grenville introduced the Stamp Act, which levied a stamp duty on every document in the British colonies in North America. Since newspapers were printed on stamped paper, those most affected by the introduction of the duty were the most effective at producing propaganda opposing the tax. Meanwhile, the King had become exasperated at Grenville’s attempts to reduce the King’s prerogatives, and tried, unsuccessfully, to persuade William Pitt the Elder to accept the office of Prime Minister. After a brief illness, which may have presaged his illnesses to come, George settled on Lord Rockingham to form a ministry, and dismissed Grenville. Lord Rockingham, with the support of Pitt and the King, repealed Grenville’s unpopular Stamp Act, but his government was weak and he was replaced in 1766 by Pitt, on whom George bestowed the title, Earl of Chatham. The actions of Lord Chatham and George III in repealing the Act were so popular in America that statues of them both were erected in New York City. Lord Chatham fell ill in 1767, and the Duke of Grafton took over the government, although he did not formally become Prime Minister until 1768. That year, John Wilkes returned to England, stood as a candidate in the general election, and came at the top of the poll in the Middlesex constituency. Wilkes was again expelled from Parliament. Wilkes was re-elected and expelled twice more, before the House of Commons resolved that his candidature was invalid and declared the runner-up as the victor. Grafton’s government disintegrated in 1770, allowing the Tories led by Lord North to return to power.

 

George was deeply devout and spent hours in prayer, but his piety was not shared by his brothers. George was appalled by what he saw as their loose morals. In 1770, his brother Prince Henry, Duke of Cumberland and Strathearn, was exposed as an adulterer, and the following year Cumberland married a young widow, Anne Horton. The King considered her inappropriate as a royal bride: she was from a lower social class and German law barred any children of the couple from the Hanoverian succession. George insisted on a new law that essentially forbade members of the Royal Family from legally marrying without the consent of the Sovereign. The subsequent bill was unpopular in Parliament, including among George’s own ministers, but passed as the Royal Marriages Act 1772. Shortly afterward, another of George’s brothers, Prince William Henry, Duke of Gloucester and Edinburgh, revealed he had been secretly married to Maria, Countess Waldegrave, the illegitimate daughter of Sir Edward Walpole. The news confirmed George’s opinion that he had been right to introduce the law: Maria was related to his political opponents. Neither lady was ever received at court. Lord North’s government was chiefly concerned with discontent in America. To assuage American opinion most of the custom duties were withdrawn, except for the tea duty, which in George’s words was ‘one tax to keep up the right [to levy taxes]’. In 1773, the tea ships moored in Boston Harbor were boarded by colonists and the tea thrown overboard, an event that became known as the Boston Tea Party. In Britain, opinion hardened against the colonists, with Chatham now agreeing with North that the destruction of the tea was ‘certainly criminal’. With the clear support of Parliament, Lord North introduced measures, which were called the Intolerable Acts by the colonists: the Port of Boston was shut down and the charter of Massachusetts was altered so that the upper house of the legislature was appointed by the Crown instead of elected by the lower house. Up to this point, in the words of Professor Peter Thomas, George’s ‘hopes were centered on a political solution, and he always bowed to his cabinet’s opinions even when skeptical of their success. The detailed evidence of the years from 1763 to 1775 tends to exonerate George III from any real responsibility for the American Revolution.’ Though the Americans characterized George as a tyrant, in these years he acted as a constitutional monarch supporting the initiatives of his ministers.

 

George III is often accused of obstinately trying to keep Great Britain at war with the revolutionaries in America, despite the opinions of his own ministers. In the words of the Victorian author George Trevelyan, the King was determined ‘never to acknowledge the independence of the Americans, and to punish their contumacy by the indefinite prolongation of a war which promised to be eternal.’ The King wanted to ‘keep the rebels harassed, anxious, and poor, until the day when, by a natural and inevitable process, discontent and disappointment were converted into penitence and remorse’. However, more recent historians defend George by saying in the context of the times, no king would willingly surrender such a large territory, and his conduct was far less ruthless than contemporary monarchs in Europe. In early 1778, France (Britain’s chief rival) signed a treaty of alliance with the United States and the conflict escalated. The United States and France were soon joined by Spain and the Dutch Republic, while Britain had no major allies of its own. As late as the Siege of Charleston in 1780, Loyalists could still believe in their eventual victory, as British troops inflicted heavy defeats on the Continental forces at the Battle of Camden and the Battle of Guilford Court House. In late 1781, the news of Lord Cornwallis’s surrender at the Siege of Yorktown reached London; Lord North’s parliamentary support ebbed away and he resigned the following year. The King drafted an abdication notice, which was never delivered, finally accepted the defeat in North America, and authorized peace negotiations. The Treaties of Paris, by which Britain recognized the independence of the American states and returned Florida to Spain, were signed in 1782 and 1783. When John Adams was appointed American Minister to London in 1785, George had become resigned to the new relationship between his country and the former colonies. He told Adams, ‘I was the last to consent to the separation; but the separation having been made and having become inevitable, I have always said, as I say now, that I would be the first to meet the friendship of the United States as an independent power.’

George III was extremely popular in Britain. The British people admired him for his piety, and for remaining faithful to his wife. He was fond of his children and was devastated at the death of two of his sons in infancy in 1782 and 1783 respectively. By this time, George’s health was deteriorating. He had a mental illness, characterized by acute mania, which was possibly a symptom of the genetic disease porphyria, although this has been questioned. A study of samples of the King’s hair published in 2005 revealed high levels of arsenic, a possible trigger for the disease. The source of the arsenic is not known, but it could have been a component of medicines or cosmetics.

 

The King may have had a brief episode of disease in 1765, but a longer episode began in the summer of 1788. At the end of the parliamentary session, he went to Cheltenham Spa to recuperate. It was the furthest he had ever been from London – just short of 100 miles (150 km) – but his condition worsened. In November he became seriously deranged, sometimes speaking for many hours without pause, causing him to foam at the mouth and making his voice hoarse. His doctors were largely at a loss to explain his illness, and spurious stories about his condition spread, such as the claim that he shook hands with a tree in the mistaken belief that it was the King of Prussia.

 

Editor’s note: The King’s German wife, Charlotte, was intelligent, educated and cultured. She brought German musicians into the Court of George III. Mozart and his family lived in London for a time and performed at court. Handel was such a favorite of the English court that he not only composed some of his greatest pieces while living in England but also became a naturalized British subject. As many readers may know, the first aria in Handel’s great opera, Xerxes, depicts a man singing to a tree. For your enjoyment, a link to this aria appears below.

 

Treatment for mental illness was primitive by modern standards, and the King’s doctors, who included Francis Willis, treated the King by forcibly restraining him until he was calm, or by applying caustic poultices to draw out ‘evil humors’. In February 1789, the Regency Bill, authorizing his always-scheming eldest son, the Prince of Wales, to act as regent, was introduced and passed in the House of Commons, but before the House of Lords could pass the bill, George III recovered. After George’s recovery, his popularity, and that of Pitt, continued to increase at the expense of Fox and the Prince of Wales. His humane and understanding treatment of two insane assailants, Margaret Nicholson in 1786 and John Frith in 1790, contributed to his popularity. James Hadfield’s failed attempt to shoot the King in the Drury Lane Theatre on 15 May 1800 was not political in origin but motivated by the apocalyptic delusions of Hadfield and Bannister Truelock. George seemed unperturbed by the incident, so much so that he fell asleep during the intermission.

 

The French Revolution of 1789, in which the French monarchy had been overthrown, worried many British landowners. France declared war on Great Britain in 1793; for the war effort, George allowed Pitt to increase taxes, raise armies, and suspend the right of habeas corpus. The First Coalition to oppose revolutionary France, which included Austria, Prussia, and Spain, broke up in 1795 when Prussia and Spain made separate peace with France. The Second Coalition, which included Austria, Russia, and the Ottoman Empire, was defeated in 1800. Only Great Britain was left fighting Napoleon Bonaparte, the First Consul of the French Republic. At about the same time, the King had a relapse of his previous illness, which he blamed on worry over the Catholic question. On 14 March 1801, Pitt was formally replaced by the Speaker of the House of Commons, Henry Addington. Addington opposed emancipation, instituted annual accounts, abolished income tax and began a program of disarmament. In October 1801, he made peace with the French, and in 1802 signed the Treaty of Amiens. George did not consider the peace with France as real; in his view it was an ‘experiment’. In 1803, the war resumed, but public opinion distrusted Addington to lead the nation in war and instead favored Pitt. An invasion of England by Napoleon seemed imminent, and a massive volunteer movement arose to defend England against the French. George’s review of 27,000 volunteers in Hyde Park, London, on 26 and 28 October 1803, at the height of the invasion scare, attracted an estimated 500,000 spectators on each day. The Times said, ‘The enthusiasm of the multitude was beyond all expression.’ A courtier wrote on 13 November that ‘The King is really prepared to take the field in case of attack, his beds are ready and he can move at half an hour’s warning.’ George wrote to his friend Bishop Hurd, ‘We are here in daily expectation that Bonaparte will attempt his threatened invasion … Should his troops effect a landing, I shall certainly put myself at the head of mine, and my other armed subjects, to repel them.’ After Admiral Lord Nelson’s famous naval victory at the Battle of Trafalgar, the possibility of invasion was extinguished.

 

In 1804, George’s recurrent illness returned. In late 1810, at the height of his popularity, already virtually blind with cataracts and in pain from rheumatism, George became dangerously ill. In his view the malady had been triggered by stress over the death of his youngest and favorite daughter, Princess Amelia. The Princess’s nurse reported that ‘the scenes of distress and crying every day were melancholy beyond description.’ He accepted the need for the Regency Act of 1811, and the Prince of Wales acted as Regent for the remainder of George III’s life. Despite signs of a recovery in May 1811, by the end of the year George had become permanently insane and lived in seclusion at Windsor Castle until his death. Prime Minister Spencer Perceval was assassinated in 1812 and was replaced by Lord Liverpool. Liverpool oversaw British victory in the Napoleonic Wars. The subsequent Congress of Vienna led to significant territorial gains for Hanover, which was upgraded from an electorate to a kingdom.

 

Meanwhile, George’s health deteriorated. He developed dementia, and became completely blind and increasingly deaf. He was incapable of knowing or understanding either that he was declared King of Hanover in 1814, or that his wife died in 1818. At Christmas 1819, he spoke nonsense for 58 hours, and for the last few weeks of his life was unable to walk. He died at Windsor Castle at 8:38 pm on 29 January 1820, six days after the death of his fourth son, the Duke of Kent. His favorite son, Frederick, Duke of York, was with him. George III was buried on 16 February in St George’s Chapel, Windsor Castle. George was succeeded by two of his sons, George IV and William IV, who both died without surviving legitimate children, leaving the throne to the only legitimate child of the Duke of Kent, Victoria, the last monarch of the House of Hanover. George III lived for 81 years and 239 days and reigned for 59 years and 96 days: both his life and his reign were longer than those of any of his predecessors. Only Victoria and Elizabeth II have since lived and reigned longer.

 

George III was dubbed ‘Farmer George’ by satirists, at first to mock his interest in mundane matters rather than politics, but later to contrast his homely thrift with his son’s grandiosity and to portray him as a man of the people. Under George III, the British Agricultural Revolution reached its peak and great advances were made in fields such as science and industry. There was unprecedented growth in the rural population, which in turn provided much of the workforce for the concurrent Industrial Revolution. George’s collection of mathematical and scientific instruments is now owned by King’s College London but housed in the Science Museum, London, to which it has been on long-term loan since 1927. He had the King’s Observatory built in Richmond-upon-Thames for his own observations of the 1769 transit of Venus. When William Herschel discovered Uranus in 1781, he at first named it Georgium Sidus (George’s Star) after the King, who later funded the construction and maintenance of Herschel’s 1785 40-foot telescope, which was the biggest ever built at the time. In the mid-twentieth century the work of the historian Lewis Namier, who thought George was ‘much maligned’, started a re-evaluation of the man and his reign.

 

The very cultured court of King George III prized music. The young Mozart performed for the King and Queen Charlotte during his family’s stay in London in 1764-65, and Handel, who had died shortly before George’s accession, remained the King’s favorite composer. Farinelli, the most celebrated opera singer of the age, had dazzled London audiences a generation earlier, in the 1730s.

 

George Frederic Handel (1685-1759); Painting by Balthasar Denner – National Portrait Gallery: NPG 1976; Public Domain, https://commons.wikimedia.org/w/index.php?curid=6364709

 

Carlo Broschi Farinelli (1705-1782), wearing the Order of Calatrava; painting by Jacopo Amigoni, c. 1750-52 – Manuel Parada Lopez de Corselas, User: Manuel de Corselas, ARS SUMMUM, Centro para el Estudio y Difusion Libres de la Historia del Arte, Summer 2007, Public Domain, https://commons.wikimedia.org/w/index.php?curid=2568895

 

Farinelli was the most celebrated Italian castrato singer of the 18th century and one of the greatest singers in the history of opera. Farinelli has been described as having a soprano vocal range and as singing the highest notes customary at the time.

 

 

For your enjoyment

 

Countertenor David Daniels sings from Xerxes, by George Frederic Handel

(The first aria in Xerxes is the well-known Ombra Mai Fu, in which a man sings to a tree of his love and admiration for its existence. Handel composed Xerxes in 1738 and died in 1759, decades before King George III’s episodes, so he could not have heard the malicious gossip that the King had been seen talking to a tree; even so, the coincidence is a striking one.)

 

Handel’s opera Rinaldo sung in the film, Farinelli

 

Smallpox Killed Millions Throughout Human History

 

Edward Jenner (1749-1823) – Graphic credit: Vigneron Pierre Roch (1789-1872) – http://portrait.kaar.at/, http://www2.biusante.parisdescartes.fr/img/?refbiogr=8701&mod=s, Public Domain, https://commons.wikimedia.org/w/index.php?curid=1497994

 

The scourge of the world! The history of smallpox extends into pre-history; the disease likely emerged in human populations about 10,000 BCE. An estimated 300 to 500 million people died from smallpox in the 20th century alone. This virulent disease, which kills a third of those it infects, is known to have co-existed with human beings for thousands of years. The origin of smallpox is unknown. The earliest evidence of the disease dates back to the 3rd century BCE in Egyptian mummies. The disease historically occurred in outbreaks. In 18th century Europe it is estimated 400,000 people per year died from the disease, and one-third of the cases resulted in blindness. These deaths included those of at least five reigning monarchs. As recently as 1967, 15 million cases occurred a year.

 

In 1798, Edward Jenner discovered that vaccinations could prevent smallpox. In 1967, the World Health Organization intensified efforts to eliminate the disease. Smallpox is one of two infectious diseases to have been eradicated, the other being rinderpest in 2011. The term “smallpox” was first used in Britain in the 15th century to distinguish the disease from syphilis, which was then known as the “great pox”. Other historical names for the disease include pox, speckled monster, and red plague.

 

The well-known Shakespearean curse, “A pox on both your houses” (Mercutio’s actual line in Romeo and Juliet is “A plague o’ both your houses”), would have been a very serious utterance.

 

One of Many Stories of Smallpox (and True Love in the 19th Century)

 

Soon after his marriage, the great Irish composer and poet Thomas Moore (1779-1852) was called away on a business trip. Upon his return he was met at the door, not by his beauteous bride Elizabeth, the love of his life, but by the family doctor. “Your wife is upstairs,” said the doctor. “But she asked that you not come up.” The physician related the terrible account: Moore learned that his wife had contracted smallpox. The disease had left her once flawless skin pocked and scarred. She had taken one look at her reflection in the mirror and commanded that the shutters be drawn, and over them the heavy drapes, and that her husband should never see her again. Moore would not listen. He ran upstairs and threw open the door of his wife’s room. It was black as night inside. Not a sound came from the darkness. Groping along the wall, Moore felt for the gas jets. A startled cry came from a black corner of the room. “No! Don’t light the lamps!” Moore hesitated, swayed by the pleading in the voice. “Go!” she begged. “Please go! This is the greatest gift I can give you now.”

 

Moore did go. He went down to his study, where he sat up most of the night, passionately writing what turned out to be one of the greatest love poems ever written; he also composed a song to go with his words, a song that lives on, drawing on certain musical phrases of ancient Irish melodies. He had never written a song before, but now it came naturally, motivated by adoration of his wife and by her profound melancholy. The next morning, as soon as the sun was up, he returned to his wife’s room. In spite of the morning light, her room remained as dark as night. He felt his way to a chair and sat down. “Are you awake?” he asked. “I am,” came a voice from the far side of the room. “But you must not ask to see me. You must not press me, Thomas.” “I will sing to you, then,” he answered. And so, for the first time, Thomas Moore sang to his wife the song that still lives today:

 

Believe me, if all those endearing young charms,

Which I gaze on so fondly today,

Were to change by tomorrow and fleet in my arms,

Like fairy gifts fading away,

Thou wouldst still be adored, as this moment thou art —

Let thy loveliness fade as it will,

 

Moore heard a movement from the dark corner where his wife lay in her loneliness, waiting.  He continued:

 

Let thy loveliness fade as it will,

And around the dear ruin each wish of my heart

Would entwine itself verdantly still —

 

The song ended. As his voice trailed off on the last note, Moore heard his bride rise. She crossed the room to the window, reached up and slowly pulled aside the drapes and drew open the shutters.

 

Their long marriage was a happy one and produced five children, all of whom tragically died before their parents. Towards the end of his life, Moore suffered a stroke and was lovingly cared for by his devoted wife, Elizabeth. He died on 26 February 1852. His remains are in a vault at St. Nicholas churchyard, Bromham, within view of his cottage-home.

 

(Editor’s note: as you listen to this loveliness, try not to look at the photos; they would not have been our choice.)

 

Here is that same love poem/song:

 

Believe Me If All Those Endearing Young Charms

 

Another version of “Believe Me”

 

Another version of “Believe Me”

 

Final evolution of the music: Harvard University’s song, “Fair Harvard”

 

The Last Rose of Summer, words by Thomas Moore set to a traditional Irish air, sung by Renee Fleming. The haunting melody of Moore’s The Last Rose of Summer was adapted by Beethoven and Mendelssohn, and by Flotow in his opera, Martha.

 

Dame Joan Sutherland: Last Rose (adapted for the opera, Martha)

 

Mendelssohn (adaptation): Fantasia, Op. 15

 

Beethoven (adaptation, E-flat): “Sad and luckless was the season”

 

FINAL NOTATION: THE LAST ROSE OF SUMMER IS A PROMINENT MELODY IN THE SOUNDTRACK OF “THREE BILLBOARDS OUTSIDE EBBING, MISSOURI,” A RECENT FILM NOMINATED FOR AN OSCAR.

 

Robert Daniel Lawrence MD (1892-1968)

Robert Daniel Lawrence MD

Photo credit: Unknown – http://wellcomeimages.org/indexplus/image/L0000433.html, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=35452965

  

Dr. Robert “Robin” Daniel Lawrence (1892-1968) MA, MB ChB (Hons), MD, FRCP, LLD was a British physician at King’s College Hospital, London. He was diagnosed with diabetes in 1920 and became an early recipient of insulin injections in the UK in 1923. He devoted his professional life to the care of diabetic patients and is remembered as the founder of the British Diabetic Association. Dr. Lawrence, better known as Robin Lawrence, was born at 10 Ferryhill Place, Aberdeen, Scotland. He was the second son of Thomas and Margaret Lawrence. His father was a prosperous brush manufacturer, whose firm supplied all the brushes to Queen Victoria and her heirs at Balmoral.

 

At eighteen, Lawrence matriculated at Aberdeen University to take an MA in French and English. After graduation, he briefly worked in an uncle’s drapery shop in Glasgow but gave this up after just a few weeks, returning to Aberdeen, where he enrolled again at Aberdeen University to study medicine. He had a brilliant undergraduate career, winning gold medals in Anatomy, Clinical Medicine and Surgery, and graduated ‘with honors’ in 1916. During his second year, on the advice of his anatomy professor, he sat and passed the primary FRCS examination in London. He gave up rugby as a student, but represented the University at both hockey and tennis. He was also President of the Students’ Representative Council. On graduation he immediately joined the RAMC and, after six months’ home service, served on the Indian Frontier until invalided home in 1919 with dysentery; he was discharged with the final rank of Captain. After a few weeks convalescing at home and fishing, he went to London and obtained the post of House Surgeon in the Casualty Department at King’s College Hospital. Six months later, now being accepted as a “King’s Man”, he became an assistant surgeon in the Ear, Nose and Throat Department. Shortly afterwards, while practicing for a mastoid operation on a cadaver, he was chiseling the bone when a bone chip flew into his right eye, setting up an unpleasant infection. He was hospitalized, but the infection failed to settle and he was discovered to have diabetes. At his age, and at that time, this represented a death sentence.

 

Lawrence was initially controlled on a rigid diet and the eye infection settled, but it left permanently impaired vision in that eye. He abandoned thoughts of a career in surgery and worked in the King’s College Hospital Chemical Pathology Department under Dr G A Harrison. Despite his gloomy prognosis and ill-health he managed to conduct enough research to write his MD thesis. A little later, in the expectation that he had only a short time to live, and not wishing to die at home and cause upset to his family, he moved to Florence and set up in practice there. In the winter of 1922-23 his diabetes deteriorated badly after an attack of bronchitis and the end of his life seemed imminent. In early 1922, Banting, Best, Collip and Macleod in Toronto, Canada had discovered and isolated insulin. Supplies were initially scarce and slow to reach the UK, but in May 1923 Harrison cabled Lawrence: “I’ve got insulin – it works – come back quick”. By this time Lawrence was weak and disabled by peripheral neuritis; with difficulty he drove across the continent and reached King’s College Hospital on 28 May 1923. After some preliminary tests he received his first insulin injection on 31 May. His life was saved, and he spent two months in hospital recovering and learning all about insulin. He was then appointed Chemical Pathologist at King’s College Hospital and devoted the rest of his life to the care and welfare of diabetic patients.

 

Dr. Lawrence developed one of the earliest and largest diabetic clinics in the country and in 1931 was appointed assistant physician-in-charge of the diabetic department at King’s College Hospital, becoming full physician-in-charge in 1939. He also had a large private practice. He wrote prolifically on his subject, and his books The Diabetic Life and The Diabetic ABC did much to simplify treatment for doctors and patients. The Diabetic Life was first published in 1925 and became immensely popular, extending to 14 editions and translated into many languages. He published widely on all aspects of diabetes and its management, producing some 106 papers either alone or with colleagues, including important publications on the management of diabetic coma, on the treatment of diabetes and tuberculosis, and on the care of pregnancy in diabetics. In 1934, he conceived the idea of an association which would foster research and encourage the education and welfare of patients. To this end a group of doctors and diabetics met in the London home of Lawrence’s patient, H. G. Wells, the scientist and writer, and the Diabetic Association was formed. When other countries followed suit it became the British Diabetic Association (the BDA). Lawrence was Chairman of its Executive Council from 1934 to 1961 and Honorary Life President from 1962.

 

Dr. Lawrence’s enthusiasm and drive ensured the life and steady growth of this association, which soon became the voice of the diabetic population and constantly sought to promote the welfare of diabetics. There are now active branches throughout the country. He was also a prime mover in the production of “The Diabetic Journal” (forerunner of Balance), the first issue of which appeared in January 1935; many of its articles he contributed himself, anonymously. He and his colleague Joseph Hoet were the main proponents in founding the International Diabetes Federation, and he served as its first president from 1950 to 1958. At its triennial conferences, Lawrence’s appearance was always greeted with acclaim. Almost immediately after his retirement, he suffered a stroke, but his spirit remained indomitable and he continued seeing private patients to the end. His last publication was an account of how hypoglycemia exaggerated the signs of his hemiparesis. Although he preached strict control of diabetes for his patients, he did not keep to a strict diet himself, taking instead supplementary shots of soluble insulin as he judged he needed them. He died at home in London on 27 August 1968, aged 76.

 

Lawrence was Oliver-Sharpey lecturer at the Royal College of Physicians of London in 1946. His lecture was one of the earliest descriptions and detailed studies of the rare condition now known as lipoatrophic diabetes. He was the recipient of the Banting Medal of the American Diabetes Association the same year, was Banting Lecturer of the BDA in 1949, and in 1964 Toronto University conferred on him its LLD “honoris causa.” Charles Best, then professor of physiology in Toronto, was probably the proposer of this honor, as he had met and become friendly with Lawrence while doing postgraduate research in London with Sir Henry Dale and A. V. Hill in 1925-28. They remained lifelong friends, meeting regularly when in each other’s country.

 

RD Lawrence is commemorated by an annual Lawrence lecture, given by a young researcher in the field of diabetes to the Medical & Scientific Section of the BDA, and by the Lawrence Medal, awarded to patients who have been on insulin for 60 years or more. The BDA, now Diabetes UK, remains his lasting memorial.

 
