Phrenology

An 1883 phrenology chart

 

Graphic credit: From People’s Cyclopedia of Universal Knowledge (1883). Public Domain, https://commons.wikimedia.org/w/index.php?curid=6693422

 

Phrenology is a pseudomedicine primarily focused on measurements of the human skull, based on the concept that the brain is the organ of the mind, and that certain brain areas have localized, specific functions or modules. Although both of those ideas have a basis in reality, phrenology extrapolated beyond empirical knowledge in a way that departed from science. Developed by the German physician Franz Joseph Gall in 1796, the discipline was very popular in the 19th century, especially from about 1810 until 1840. The principal British center for phrenology was Edinburgh, where the Edinburgh Phrenological Society was established in 1820. Although now regarded as an obsolete amalgamation of primitive neuroanatomy with moral philosophy, phrenological thinking was influential in 19th-century psychiatry. Gall’s assumption that character, thoughts, and emotions are located in specific parts of the brain is considered an important historical advance toward neuropsychology.

 

Phrenologists believed that the human mind has a set of various mental faculties, each one represented in a different area of the brain. For example, the faculty of “philoprogenitiveness”, from the Greek for “love of offspring”, was located centrally at the back of the head (see the illustration of the chart from Webster’s Academic Dictionary).

These areas were said to be proportional to a person’s propensities, and the importance of an organ was derived from its size relative to the other organs. It was believed that the skull – like a glove on the hand – accommodates to the different sizes of these areas of the brain, so that a person’s capacity for a given personality trait could be determined simply by measuring the area of the skull that overlies the corresponding area of the brain. Phrenology, which focuses on personality and character, is distinct from craniometry, the study of skull size, weight and shape, and from physiognomy, the study of facial features.

Phrenology in practice involved observing and feeling the skull to determine an individual’s psychological attributes. Franz Joseph Gall believed that the brain was made up of 27 individual organs that determined personality, the first 19 of which he believed to exist in other animal species. Phrenologists would run their fingertips and palms over the skulls of their patients to feel for enlargements or indentations. The phrenologist would often take measurements of the overall head size with a tape measure and, more rarely, employ a craniometer, a special version of a caliper. Instruments for measuring the cranium continued to be used even after mainstream phrenology had ended. Phrenologists also placed emphasis on drawings of individuals with particular traits as a guide to character, which is why many phrenology books show pictures of subjects. From the absolute and relative sizes of the skull the phrenologist would assess the character and temperament of the patient.
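As a purely illustrative aside, the core of this (discredited) procedure – judging each “organ” by its size relative to the others – can be sketched in a few lines of Python. The region names and measurements below are invented for illustration only; they are not drawn from any phrenological chart.

# Purely illustrative: phrenologists judged the "importance" of an organ
# from its size relative to the other organs. The regions and measurements
# here are hypothetical, invented only to show the arithmetic.
measurements_mm = {
    "philoprogenitiveness": 34.0,
    "combativeness": 41.0,
    "veneration": 28.0,
}

total = sum(measurements_mm.values())

# Express each region as a share of the total, then rank by relative size,
# which is how a practitioner would have ordered a sitter's "propensities".
for region, size in sorted(measurements_mm.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{region}: {size / total:.0%} of the measured total")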

 

Gall’s list of the “brain organs” was specific. An enlarged organ meant that the patient used that particular “organ” extensively. Later phrenologists added to the number of organs and gave them more detailed meanings. The 27 areas varied in function, from sense of color, to religiosity, to being combative or destructive. Each of the 27 “brain organs” was located under a specific area of the skull. As a phrenologist felt the skull, he would use his knowledge of the shapes of heads and organ positions to determine the overall natural strengths and weaknesses of an individual. Phrenologists believed the head revealed natural tendencies but not absolute limitations or strengths of character. The first phrenological chart gave the names of the organs described by Gall; it was a single sheet, and sold for a cent. Later charts were more expansive.

 

Historically, among the first to identify the brain as the major controlling center for the body were Hippocrates and his followers, inaugurating a major change in thinking from Egyptian, biblical and early Greek views, which based bodily primacy of control on the heart. This belief was supported by the Greek physician Galen, who concluded that mental activity occurred in the brain rather than the heart, contending that the brain, a cold, moist organ formed of sperm, was the seat of the animal soul – one of three “souls” found in the body, each associated with a principal organ. The Swiss pastor Johann Kaspar Lavater (1741-1801) introduced the idea that physiognomy related to the specific character traits of individuals, rather than general types, in his Physiognomische Fragmente, published between 1775 and 1778. His work was translated into English and published in 1832 as The Pocket Lavater, or, The Science of Physiognomy. He believed that thoughts of the mind and passions of the soul were connected with an individual’s external frame. Of the forehead, he wrote: “When the forehead is perfectly perpendicular, from the hair to the eyebrows, it denotes an utter deficiency of understanding.”

 

In 1796 the German physician Franz Joseph Gall (1758-1828) began lecturing on organology, the isolation of the various mental faculties, and later on cranioscopy, which involved reading the skull’s shape as it pertained to the individual. It was Gall’s collaborator Johann Gaspar Spurzheim who would popularize the term “phrenology”. In 1809 Gall began writing his principal work, The Anatomy and Physiology of the Nervous System in General, and of the Brain in Particular, with Observations upon the possibility of ascertaining the several Intellectual and Moral Dispositions of Man and Animal, by the configuration of their Heads. It was not published until 1819. In the introduction to this main work, Gall makes the following statements in regard to his doctrinal principles, which comprise the intellectual basis of phrenology:

 

1. The brain is the organ of the mind.

2. The brain is not a homogeneous unity, but an aggregate of mental organs with specific functions.

3. The cerebral organs are topographically localized.

4. Other things being equal, the relative size of any particular mental organ is indicative of the power or strength of that organ.

5. Since the skull ossifies over the brain during infant development, external craniological means can be used to diagnose the internal states of the mental characters.

 

Through careful observation and extensive experimentation, Gall believed he had established a relationship between aspects of character, called faculties, and precise organs in the brain. Johann Spurzheim was Gall’s most important collaborator. He worked as Gall’s anatomist until 1813, when for unknown reasons they had a permanent falling out. Publishing under his own name, Spurzheim successfully disseminated phrenology throughout the United Kingdom during his lecture tours of 1814 and 1815, and in the United States in 1832, where he would eventually die. Gall was more concerned with creating a physical science, so it was through Spurzheim that phrenology was first spread throughout Europe and America. Phrenology, while not universally accepted, was hardly a fringe phenomenon of the era. George Combe became the chief promoter of phrenology throughout the English-speaking world after viewing a brain dissection by Spurzheim convinced him of phrenology’s merits.

 

The popularization of phrenology among the middle and working classes was due in part to the idea that scientific knowledge was important and an indication of sophistication and modernity. Cheap and plentiful pamphlets, as well as the growing popularity of scientific lectures as entertainment, also helped spread phrenology to the masses. Combe created a system of philosophy of the human mind that became popular with the masses because of its simplified principles and wide range of social applications that were in harmony with the liberal Victorian world view. George Combe’s book On the Constitution of Man and its Relationship to External Objects sold over 200,000 copies through nine editions. Combe also devoted a large portion of his book to reconciling religion and phrenology, which had long been a sticking point. Another reason for its popularity was that phrenology struck a balance between free will and determinism. A person’s inherent faculties were clear, and no faculty was viewed as evil, though the abuse of a faculty was. Phrenology allowed for self-improvement and upward mobility, while providing fodder for attacks on aristocratic privilege. Phrenology also had wide appeal because it was a reformist philosophy, not a radical one. Phrenology was not limited to the common people, and both Queen Victoria and Prince Albert invited George Combe to read the heads of their children.

 

Phrenology came about at a time when scientific procedures and standards for acceptable evidence were still being codified. In the context of Victorian society, phrenology was a respectable scientific theory. The Phrenological Society of Edinburgh, founded by George and Andrew Combe, was an example of the credibility of phrenology at the time and included a number of extremely influential social reformers and intellectuals, including the publisher Robert Chambers, the astronomer John Pringle Nichol, the evolutionary environmentalist Hewett Cottrell Watson, and the asylum reformer William A. F. Browne. In 1826, an estimated one third of the 120 members of the Edinburgh society were from a medical background. By the 1840s there were more than 28 phrenological societies in London with over 1,000 members. Another important scholar was Luigi Ferrarese, the leading Italian phrenologist. He advocated that governments should embrace phrenology as a scientific means of conquering many social ills, and his Memorie Risguardanti La Dottrina Frenologica (1836) is considered “one of the fundamental 19th century works in the field”.

 

Traditionally the mind had been studied through introspection. Phrenology provided an attractive, biological alternative that attempted to unite all mental phenomena using consistent biological terminology. Gall’s approach prepared the way for studying the mind in ways that would lead to the downfall of his own theories. Phrenology contributed to the development of physical anthropology, forensic medicine, and knowledge of the nervous system and brain anatomy, as well as to applied psychology. John Elliotson was a brilliant but erratic heart specialist who became a phrenologist in the 1840s. He was also a mesmerist and combined the two into something he called phrenomesmerism or phrenomagnetism. Changing behavior through mesmerism eventually won out in Elliotson’s hospital, putting phrenology in a subordinate role. Others amalgamated phrenology and mesmerism as well, such as the practical phrenologists Collyer and Joseph R. Buchanan. The supposed benefit of combining mesmerism and phrenology was that the trance the patient was placed in allowed for the manipulation of his or her penchants and qualities. For example, if the organ of self-esteem was touched, the subject would take on a haughty expression.

 

Phrenology has been psychology’s great faux pas. – J.C. Flugel (1933)

 

Phrenology was mostly discredited as a scientific theory by the 1840s. This was due only in part to a growing amount of evidence against it. Phrenologists had never been able to agree on the most basic number of mental organs, which ranged from 27 to over 40, and they had difficulty locating the organs, relying on cranioscopic readings of the skull to find them. Jean Pierre Flourens’ experiments on the brains of pigeons indicated that the loss of parts of the brain either caused no loss of function, or the loss of a completely different function than the one phrenology had attributed to that area. Flourens’ experiments, while not perfect, seemed to indicate that Gall’s supposed organs were imaginary. Scientists had also become disillusioned with phrenology after its exploitation among the middle and working classes by entrepreneurs. The popularization had resulted in the simplification of phrenology and the mixing in of principles of physiognomy, which Gall had rejected from the start as an indicator of personality. Phrenology was, from its inception, tainted by accusations of promoting materialism and atheism and of being destructive of morality. These were all factors that led to the downfall of phrenology. Recent studies, using modern technology such as magnetic resonance imaging, have further disproven phrenology’s claims.

 

During the early 20th century, a revival of interest in phrenology occurred, partly because of studies of evolution, criminology and anthropology (as pursued by Cesare Lombroso). The most famous British phrenologist of the 20th century was the London psychiatrist Bernard Hollander (1864-1934). His main works, The Mental Function of the Brain (1901) and Scientific Phrenology (1902), are an appraisal of Gall’s teachings. Hollander introduced a quantitative approach to phrenological diagnosis, defining a method for measuring the skull and comparing the measurements with statistical averages.

In Belgium, Paul Bouts (1900-1999) began studying phrenology from a pedagogical background, using phrenological analysis to define an individual pedagogy. Combining phrenology with typology and graphology, he coined a global approach known as psychognomy. Bouts, a Roman Catholic priest, became the main promoter of renewed 20th-century interest in phrenology and psychognomy in Belgium. He was also active in Brazil and Canada, where he founded institutes for characterology. His works Psychognomie and Les Grandioses Destinees individuelle et humaine dans la lumiere de la Caracterologie et de l’Evolution cerebro-cranienne are considered standard works in the field. In the latter work, which examines the subject of paleoanthropology, Bouts developed a teleological and orthogenetic view of a perfecting evolution, from the paleo-encephalical skull shapes of prehistoric man, which he considered still prevalent in criminals and savages, towards a higher form of mankind, thus perpetuating phrenology’s problematic racializing of the human frame. Bouts died on March 7, 1999. His work has been continued by the Dutch foundation PPP (Per Pulchritudinem in Pulchritudine), operated by Anette Muller, one of Bouts’ students. During the 1930s, Belgian colonial authorities in Rwanda used phrenology to explain the so-called superiority of Tutsis over Hutus.
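Hollander’s quantitative approach described above amounts to comparing an individual measurement against a population average. As a minimal sketch of that general idea (not Hollander’s actual method or tables; the regions, means and standard deviations below are invented purely for illustration), one might compute a standardized score in Python:

# Hypothetical population statistics: region -> (mean in mm, standard deviation in mm).
# These numbers are invented for illustration; they are not Hollander's data.
population_stats = {
    "frontal_breadth": (120.0, 6.0),
    "occipital_arc": (130.0, 7.5),
}

def standardized_score(region, measurement_mm):
    """Express a measurement as standard deviations above or below the population mean."""
    mean, sd = population_stats[region]
    return (measurement_mm - mean) / sd

for region, value in [("frontal_breadth", 128.0), ("occipital_arc", 126.0)]:
    print(f"{region}: {standardized_score(region, value):+.2f} SD from the mean")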

 

King George III of Great Britain

Full-length portrait in oils of a clean-shaven young George in eighteenth century dress: gold jacket and breeches, ermine cloak, powdered wig, white stockings, and buckled shoes.

 

Graphic credit: Allan Ramsay – Google Cultural Institute, Public Domain, https://commons.wikimedia.org/w/index.php?curid=23604082

 

 

Editor’s note: We are including extra information about King George III not only because all of Europe was seething during his reign, but because the King’s illness remains a curiosity, with no definitive diagnosis to this day, and because American history of the period is inextricably bound up with him. Finally, we thought readers should know, perhaps better than the average educated American does, that King George III was not the tyrant most Americans believe him to have been, but an astute politician, a curious intellectual, a highly cultured man, and a moral person devoted to his family and to his beloved country. Readers may not realize that this king was exceedingly popular with the people of Great Britain, and we hope you come away understanding him better after reading this short piece.

George III (George William Frederick; 4 June 1738 – 29 January 1820) was King of Great Britain and King of Ireland from 25 October 1760 until the union of the two countries on 1 January 1801. After the union, he was King of the United Kingdom of Great Britain and Ireland until his death. He was concurrently Duke and prince-elector of Brunswick-Lüneburg (“Hanover”) in the Holy Roman Empire before becoming King of Hanover on 12 October 1814. He was the third British monarch of the House of Hanover, but unlike his two predecessors, he was born in England, spoke English as his first language, and never visited Hanover. His life, and with it his reign, were longer than those of any of his predecessors and were marked by a series of military conflicts involving his kingdoms, much of the rest of Europe, and places farther afield in Africa, the Americas and Asia. Early in his reign, Great Britain defeated France in the Seven Years’ War, becoming the dominant European power in North America and India. However, many of Britain’s American colonies were soon lost in the American War of Independence. Further wars against revolutionary and Napoleonic France from 1793 concluded in the defeat of Napoleon at the Battle of Waterloo in 1815.

 

In the later part of his life, George III had recurrent, and eventually permanent, mental illness. Although it has since been suggested that he had the blood disease porphyria, the cause of his illness remains unknown. After a final relapse in 1810, a regency was established, and George III’s eldest son, George, the conniving Prince of Wales, ruled as Prince Regent. On George III’s death, the Prince Regent succeeded his father as George IV. Historical analysis of George III’s life has gone through a “kaleidoscope of changing views” that have depended heavily on the prejudices of his biographers and the sources available to them. Until it was reassessed in the second half of the 20th century, his reputation in the United States was that of a tyrant, and in Britain he became, for a minority, “the scapegoat for the failure of imperialism”.

 

George was born in London at Norfolk House in St James’s Square. He was the grandson of King George II, and the eldest son of Frederick, Prince of Wales, and Augusta of Saxe-Gotha. Because Prince George was born two months prematurely and was thought unlikely to survive, he was baptized the same day by Thomas Secker, who was both Rector of St James’s and Bishop of Oxford. One month later, he was publicly baptized at Norfolk House, again by Secker. His godparents were the King of Sweden (for whom Lord Baltimore stood proxy), his uncle the Duke of Saxe-Gotha (for whom Lord Carnarvon stood proxy) and his great-aunt the Queen of Prussia (for whom Lady Charlotte Edwin stood proxy). George grew into a healthy but reserved and shy child. The family moved to Leicester Square, where George and his younger brother Prince Edward, Duke of York and Albany, were educated together by private tutors. Family letters show that he could read and write in both English and German, as well as comment on political events of the time, by the age of eight. He was the first British monarch to study science systematically. Apart from chemistry and physics, his lessons included astronomy, mathematics, French, Latin, history, music, geography, commerce, agriculture and constitutional law, along with sporting and social accomplishments such as dancing, fencing, and riding. His religious education was wholly Anglican. At age 10 George took part in a family production of Joseph Addison’s play Cato and said in the new prologue: “What, tho’ a boy! It may with truth be said, A boy in England born, in England bred.” Historian Romney Sedgwick argued that these lines appear “to be the source of the only historical phrase with which he is associated”. Clearly, this historian was one of the minority who downplayed King George’s many talents.

 

George’s grandfather, King George II, disliked the Prince of Wales and took little interest in his grandchildren. However, in 1751 the Prince of Wales died unexpectedly from a lung injury, and George became heir apparent to the throne. He inherited his father’s title of Duke of Edinburgh. The King, now more interested in his grandson, created George Prince of Wales three weeks later (the title is not automatically acquired). In the spring of 1756, as George approached his 18th birthday, the King offered him a grand establishment at St James’s Palace, but George refused the offer, guided by his mother and her confidant, Lord Bute, who would later serve as Prime Minister. George’s mother, now the Dowager Princess of Wales, preferred to keep George at home, where she could imbue him with her strict moral values.

 

In 1759, George was smitten with Lady Sarah Lennox, sister of the Duke of Richmond, but Lord Bute advised against the match and George abandoned his thoughts of marriage. “I am born for the happiness or misery of a great nation,” he wrote, “and consequently must often act contrary to my passions.” Nevertheless, attempts by the King to marry George to Princess Sophie Caroline of Brunswick-Wolfenbüttel were resisted by him and his mother; Sophie married the Margrave of Bayreuth instead. The following year, at the age of 22, George succeeded to the throne when his grandfather, George II, died suddenly on 25 October 1760, two weeks before his 77th birthday. The search for a suitable wife intensified. On 8 September 1761 in the Chapel Royal, St James’s Palace, the King married Princess Charlotte of Mecklenburg-Strelitz, whom he met on their wedding day. A fortnight later, on 22 September, both were crowned at Westminster Abbey. George, remarkably, never took a mistress (in contrast with his grandfather and his sons), and the couple enjoyed a genuinely happy marriage until his mental illness struck. They had 15 children – nine sons and six daughters. In 1762, George purchased Buckingham House (on the site now occupied by Buckingham Palace) for use as a family retreat. His other residences were Kew and Windsor Castle. St James’s Palace was retained for official use. He did not travel extensively and spent his entire life in southern England. In the 1790s, the King and his family took holidays at Weymouth, Dorset, which he thus popularized as one of the first seaside resorts in England.

 

George, in his accession speech to Parliament, proclaimed: “Born and educated in this country, I glory in the name of Britain.” He inserted this phrase into the speech, written by Lord Hardwicke, to demonstrate his desire to distance himself from his German forebears, who were perceived as caring more for Hanover than for Britain. Although his accession was at first welcomed by politicians of all parties, the first years of his reign were marked by political instability, largely generated as a result of disagreements over the Seven Years’ War. George was also perceived as favoring Tory ministers, which led to his denunciation by the Whigs as an autocrat. On his accession, the Crown lands produced relatively little income; most revenue was generated through taxes and excise duties. George surrendered the Crown Estate to Parliamentary control in return for a civil list annuity for the support of his household and the expenses of civil government. Claims that he used the income to reward supporters with bribes and gifts are disputed by historians, who say such claims “rest on nothing but falsehoods put out by disgruntled opposition”.

 

Debts amounting to over 3 million pounds over the course of George’s reign were paid by Parliament, and the civil list annuity was increased from time to time. He aided the Royal Academy of Arts with large grants from his private funds and may have donated more than half of his personal income to charity. Of his art collection, the two most notable purchases are Johannes Vermeer’s Lady at the Virginals and a set of Canalettos, but it is as a collector of books that he is best remembered. The King’s Library was open and available to scholars and was the foundation of a new national library. In May 1762, the incumbent Whig government of the Duke of Newcastle was replaced with one led by the Scottish Tory Lord Bute. Bute’s opponents worked against him by spreading the calumny that he was having an affair with the King’s mother, and by exploiting anti-Scottish prejudices amongst the English. John Wilkes, a member of parliament, published The North Briton, which was both inflammatory and defamatory in its condemnation of Bute and the government. Wilkes was eventually arrested for seditious libel but fled to France to escape punishment; he was expelled from the House of Commons and found guilty in absentia of blasphemy and libel. In 1763, after concluding the Peace of Paris which ended the war, Lord Bute resigned, allowing the Whigs under George Grenville to return to power. Later that year, the Royal Proclamation of 1763 placed a limit upon the westward expansion of the American colonies. The Proclamation aimed to divert colonial expansion to the north (to Nova Scotia) and to the south (Florida). The Proclamation Line did not bother the majority of settled farmers, but it was unpopular with a vocal minority and ultimately contributed to conflict between the colonists and the British government.

 

With the American colonists generally unburdened by British taxes, the government thought it appropriate for them to pay towards the defense of the colonies against native uprisings and the possibility of French incursions. The central issue for the colonists was not the amount of taxes but whether Parliament could levy a tax without American approval, for there were no American seats in Parliament. The Americans protested that, like all Englishmen, they had a right to “no taxation without representation”. In 1765, Grenville introduced the Stamp Act, which levied a stamp duty on every document in the British colonies in North America. Since newspapers were printed on stamped paper, those most affected by the introduction of the duty were the most effective at producing propaganda opposing the tax. Meanwhile, the King had become exasperated at Grenville’s attempts to reduce the royal prerogatives, and tried, unsuccessfully, to persuade William Pitt the Elder to accept the office of Prime Minister. After a brief illness, which may have presaged the illnesses to come, George settled on Lord Rockingham to form a ministry, and dismissed Grenville. Lord Rockingham, with the support of Pitt and the King, repealed Grenville’s unpopular Stamp Act, but his government was weak and he was replaced in 1766 by Pitt, on whom George bestowed the title of Earl of Chatham. The actions of Lord Chatham and George III in repealing the Act were so popular in America that statues of them both were erected in New York City. Lord Chatham fell ill in 1767, and the Duke of Grafton took over the government, although he did not formally become Prime Minister until 1768. That year, John Wilkes returned to England, stood as a candidate in the general election, and came at the top of the poll in the Middlesex constituency. Wilkes was again expelled from Parliament. He was re-elected and expelled twice more, before the House of Commons resolved that his candidature was invalid and declared the runner-up the victor. Grafton’s government disintegrated in 1770, allowing the Tories led by Lord North to return to power.

 

George was deeply devout and spent hours in prayer, but his piety was not shared by his brothers. George was appalled by what he saw as their loose morals. In 1770, his brother Prince Henry, Duke of Cumberland and Strathearn, was exposed as an adulterer, and the following year Cumberland married a young widow, Anne Horton. The King considered her inappropriate as a royal bride: she was from a lower social class, and German law barred any children of the couple from the Hanoverian succession. George insisted on a new law that essentially forbade members of the Royal Family from legally marrying without the consent of the Sovereign. The subsequent bill was unpopular in Parliament, including among George’s own ministers, but passed as the Royal Marriages Act 1772. Shortly afterward, another of George’s brothers, Prince William Henry, Duke of Gloucester and Edinburgh, revealed he had been secretly married to Maria, Countess Waldegrave, the illegitimate daughter of Sir Edward Walpole. The news confirmed George’s opinion that he had been right to introduce the law: Maria was related to his political opponents. Neither lady was ever received at court. Lord North’s government was chiefly concerned with discontent in America. To assuage American opinion most of the customs duties were withdrawn, except for the tea duty, which in George’s words was “one tax to keep up the right [to levy taxes]”. In 1773, the tea ships moored in Boston Harbor were boarded by colonists and the tea thrown overboard, an event that became known as the Boston Tea Party. In Britain, opinion hardened against the colonists, with Chatham now agreeing with North that the destruction of the tea was “certainly criminal”. With the clear support of Parliament, Lord North introduced measures, which were called the Intolerable Acts by the colonists: the Port of Boston was shut down and the charter of Massachusetts was altered so that the upper house of the legislature was appointed by the Crown instead of elected by the lower house. Up to this point, in the words of Professor Peter Thomas, George’s “hopes were centered on a political solution, and he always bowed to his cabinet’s opinions even when skeptical of their success. The detailed evidence of the years from 1763 to 1775 tends to exonerate George III from any real responsibility for the American Revolution.” Though the Americans characterized George as a tyrant, in these years he acted as a constitutional monarch supporting the initiatives of his ministers.

 

George III is often accused of obstinately trying to keep Great Britain at war with the revolutionaries in America, despite the opinions of his own ministers. In the words of the Victorian author George Trevelyan, the King was determined “never to acknowledge the independence of the Americans, and to punish their contumacy by the indefinite prolongation of a war which promised to be eternal.” The King wanted to “keep the rebels harassed, anxious, and poor, until the day when, by a natural and inevitable process, discontent and disappointment were converted into penitence and remorse”. However, more recent historians defend George by saying that, in the context of the times, no king would willingly surrender such a large territory, and that his conduct was far less ruthless than that of contemporary monarchs in Europe. In early 1778, France (Britain’s chief rival) signed a treaty of alliance with the United States and the conflict escalated. The United States and France were soon joined by Spain and the Dutch Republic, while Britain had no major allies of its own. As late as the Siege of Charleston in 1780, Loyalists could still believe in their eventual victory, as British troops inflicted heavy defeats on the Continental forces at the Battle of Camden and the Battle of Guilford Court House. In late 1781, the news of Lord Cornwallis’s surrender at the Siege of Yorktown reached London; Lord North’s parliamentary support ebbed away and he resigned the following year. The King drafted an abdication notice, which was never delivered, finally accepted the defeat in North America, and authorized peace negotiations. The Treaties of Paris, by which Britain recognized the independence of the American states and returned Florida to Spain, were signed in 1782 and 1783. When John Adams was appointed American Minister to London in 1785, George had become resigned to the new relationship between his country and the former colonies. He told Adams, “I was the last to consent to the separation; but the separation having been made and having become inevitable, I have always said, as I say now, that I would be the first to meet the friendship of the United States as an independent power.”

George III was extremely popular in Britain. The British people admired him for his piety, and for remaining faithful to his wife. He was fond of his children and was devastated at the death of two of his sons in infancy in 1782 and 1783 respectively. By this time, George’s health was deteriorating. He had a mental illness, characterized by acute mania, which was possibly a symptom of the genetic disease porphyria, although this has been questioned. A study of samples of the King’s hair published in 2005 revealed high levels of arsenic, a possible trigger for the disease. The source of the arsenic is not known, but it could have been a component of medicines or cosmetics.

 

The King may have had a brief episode of disease in 1765, but a longer episode began in the summer of 1788. At the end of the parliamentary session, he went to Cheltenham Spa to recuperate. It was the furthest he had ever been from London – just short of 100 miles (160 km) – but his condition worsened. In November he became seriously deranged, sometimes speaking for many hours without pause, causing him to foam at the mouth and making his voice hoarse. His doctors were largely at a loss to explain his illness, and spurious stories about his condition spread, such as the claim that he shook hands with a tree in the mistaken belief that it was the King of Prussia.

 

Editor’s note: The King’s German wife, Charlotte, was intelligent, educated and cultured, and she brought German musicians into the Court of George III. Mozart and his family stayed in London for a time. Handel was such a favorite of the English court that he not only composed some of his greatest pieces while living in England but also became a naturalized British subject. As many readers may know, the first aria in Handel’s great opera, Xerxes, depicts a man singing to a tree. For your enjoyment, a link to this aria appears below.

 

Treatment for mental illness was primitive by modern standards, and the King’s doctors, who included Francis Willis, treated the King by forcibly restraining him until he was calm, or by applying caustic poultices to draw out “evil humors”. In February 1789, the Regency Bill, authorizing his always-scheming eldest son, the Prince of Wales, to act as regent, was introduced and passed in the House of Commons, but before the House of Lords could pass the bill, George III recovered. After George’s recovery, his popularity, and that of Pitt, continued to increase at the expense of Fox and the Prince of Wales. His humane and understanding treatment of two insane assailants, Margaret Nicholson in 1786 and John Frith in 1790, contributed to his popularity. James Hadfield’s failed attempt to shoot the King in the Drury Lane Theatre on 15 May 1800 was not political in origin but was motivated by the apocalyptic delusions of Hadfield and Bannister Truelock. George seemed unperturbed by the incident, so much so that he fell asleep during the intermission.

 

The French Revolution of 1789, in which the French monarchy had been overthrown, worried many British landowners. France declared war on Great Britain in 1793; in the war effort, George allowed Pitt to increase taxes, raise armies, and suspend the right of habeas corpus. The First Coalition to oppose revolutionary France, which included Austria, Prussia, and Spain, broke up in 1795 when Prussia and Spain made separate peace with France. The Second Coalition, which included Austria, Russia, and the Ottoman Empire, was defeated in 1800. Only Great Britain was left fighting Napoleon Bonaparte, the First Consul of the French Republic. At about the same time, the King had a relapse of his previous illness, which he blamed on worry over the Catholic question. On 14 March 1801, Pitt was formally replaced by the Speaker of the House of Commons, Henry Addington. Addington opposed Catholic emancipation, instituted annual accounts, abolished income tax and began a program of disarmament. In October 1801, he made peace with the French, and in 1802 signed the Treaty of Amiens. George did not consider the peace with France as real; in his view it was an “experiment”. In 1803, the war resumed, but public opinion distrusted Addington to lead the nation in war and instead favored Pitt. An invasion of England by Napoleon seemed imminent, and a massive volunteer movement arose to defend England against the French. George’s review of 27,000 volunteers in Hyde Park, London, on 26 and 28 October 1803, at the height of the invasion scare, attracted an estimated 500,000 spectators on each day. The Times said, “The enthusiasm of the multitude was beyond all expression.” A courtier wrote on 13 November that, “The King is really prepared to take the field in case of attack, his beds are ready and he can move at half an hour’s warning.” George wrote to his friend Bishop Hurd, “We are here in daily expectation that Bonaparte will attempt his threatened invasion … Should his troops effect a landing, I shall certainly put myself at the head of mine, and my other armed subjects, to repel them.” After Admiral Lord Nelson’s famous naval victory at the Battle of Trafalgar, the possibility of invasion was extinguished.

 

In 1804, George’s recurrent illness returned. In late 1810, at the height of his popularity, already virtually blind with cataracts and in pain from rheumatism, George became dangerously ill. In his view the malady had been triggered by stress over the death of his youngest and favorite daughter, Princess Amelia. The Princess’s nurse reported that “the scenes of distress and crying every day were melancholy beyond description.” He accepted the need for the Regency Act of 1811, and the Prince of Wales acted as Regent for the remainder of George III’s life. Despite signs of a recovery in May 1811, by the end of the year George had become permanently insane and lived in seclusion at Windsor Castle until his death. Prime Minister Spencer Perceval was assassinated in 1812 and was replaced by Lord Liverpool, who oversaw British victory in the Napoleonic Wars. The subsequent Congress of Vienna led to significant territorial gains for Hanover, which was upgraded from an electorate to a kingdom.

 

Meanwhile, George’s health deteriorated. He developed dementia, and became completely blind and increasingly deaf. He was incapable of knowing or understanding either that he was declared King of Hanover in 1814 or that his wife died in 1818. At Christmas 1819, he spoke nonsense for 58 hours, and for the last few weeks of his life he was unable to walk. He died at Windsor Castle at 8:38 pm on 29 January 1820, six days after the death of his fourth son, the Duke of Kent. His favorite son, Frederick, Duke of York, was with him. George III was buried on 16 February in St George’s Chapel, Windsor Castle. George was succeeded by two of his sons, George IV and William IV, who both died without surviving legitimate children, leaving the throne to the only legitimate child of the Duke of Kent, Victoria, the last monarch of the House of Hanover. George III lived for 81 years and 239 days and reigned for 59 years and 96 days: both his life and his reign were longer than those of any of his predecessors. Only Victoria and Elizabeth II have since lived and reigned longer.
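Those lifespan and reign figures can be checked with simple date arithmetic from the dates given above (born 4 June 1738, acceded 25 October 1760, died 29 January 1820). A minimal sketch in Python:

from datetime import date

def years_and_days(start, end):
    """Return whole years between start and end, plus the leftover days."""
    years = end.year - start.year
    anniversary = start.replace(year=start.year + years)
    if anniversary > end:  # the anniversary has not yet arrived in the final year
        years -= 1
        anniversary = start.replace(year=start.year + years)
    return years, (end - anniversary).days

birth = date(1738, 6, 4)
accession = date(1760, 10, 25)
death = date(1820, 1, 29)

print(years_and_days(birth, death))      # (81, 239) -- lived 81 years and 239 days
print(years_and_days(accession, death))  # (59, 96)  -- reigned 59 years and 96 days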

 

George III was dubbed “Farmer George” by satirists, at first to mock his interest in mundane matters rather than politics, but later to contrast his homely thrift with his son’s grandiosity and to portray him as a man of the people. Under George III, the British Agricultural Revolution reached its peak and great advances were made in fields such as science and industry. There was unprecedented growth in the rural population, which in turn provided much of the workforce for the concurrent Industrial Revolution. George’s collection of mathematical and scientific instruments is now owned by King’s College London but housed in the Science Museum, London, to which it has been on long-term loan since 1927. He had the King’s Observatory built in Richmond-upon-Thames for his own observations of the 1769 transit of Venus. When William Herschel discovered Uranus in 1781, he at first named it Georgium Sidus (George’s Star) after the King, who later funded the construction and maintenance of Herschel’s 1785 40-foot telescope, which was the biggest ever built at the time. In the mid-twentieth century the work of the historian Lewis Namier, who thought George was “much maligned”, started a re-evaluation of the man and his reign.

 

The very cultured court of King George III would have invited creative composers like Handel and Mozart to perform there often. The most talented opera singer of the time, Farinelli, would have performed at this King’s court for King George and his wife Charlotte.

 

George Frederic Handel (1685-1759); painting by Balthasar Denner – National Portrait Gallery: NPG 1976; Public Domain, https://commons.wikimedia.org/w/index.php?curid=6364709

 

Carlo Broschi Farinelli (1705-1782), wearing the Order of Calatrava; painting by Jacopo Amigoni, c. 1750-52. Public Domain, https://commons.wikimedia.org/w/index.php?curid=2568895

 

Farinelli was the most celebrated Italian castrato singer of the 18th century and one of the greatest singers in the history of opera. He has been described as having a soprano vocal range, and he sang the highest notes customary at the time.

 

 

For your enjoyment

 

Countertenor David Daniels: Xerxes, by George Frederic Handel

(The first aria in Xerxes is the well-known Ombra mai fu, in which a man sings to a tree of his love and admiration for its existence. Handel composed it in 1738, half a century before the malicious gossip that King George III, during one of his episodes, had been seen talking to a tree, so the resemblance can only be a striking coincidence.)

 

An aria from Handel’s opera Rinaldo, sung in the film Farinelli

 

Smallpox Killed Millions Throughout Human History

 

Edward Jenner (1749-1823) – Graphic credit: Pierre Roch Vigneron (1789-1872) – http://portrait.kaar.at/, http://www2.biusante.parisdescartes.fr/img/?refbiogr=8701&mod=s, Public Domain, https://commons.wikimedia.org/w/index.php?curid=1497994

 

The scourge of the world! The history of smallpox extends into prehistory; the disease likely emerged in human populations about 10,000 BCE. An estimated 300 to 500 million people died from smallpox in the 20th century alone. This virulent disease, which kills a third of those it infects, is known to have co-existed with human beings for thousands of years. The origin of smallpox is unknown. The earliest evidence of the disease dates back to the 3rd century BCE, in Egyptian mummies. The disease historically occurred in outbreaks. In 18th-century Europe, an estimated 400,000 people died from the disease each year, and it was responsible for a third of all cases of blindness. Those killed included at least five reigning monarchs. As recently as 1967, 15 million cases occurred a year.

 

In 1798, Edward Jenner discovered that vaccinations could prevent smallpox. In 1967, the World Health Organization intensified efforts to eliminate the disease. Smallpox is one of two infectious diseases to have been eradicated, the other being rinderpest in 2011. The term “smallpox” was first used in Britain in the 15th century to distinguish the disease from syphilis, which was then known as the “great pox”. Other historical names for the disease include pox, speckled monster, and red plague.

 

The well-known Shakespearean curse, “A pox on both your houses,” would have been a very serious utterance.

 

One of Many Stories of Smallpox (and True Love in the 19th Century)

 

Soon after his marriage, the great Irish composer and poet Thomas Moore (1779-1852) was called away on a business trip. Upon his return he was met at the door, not by beauteous Elizabeth, his bride and the love of his life, but by the family doctor. “Your wife is upstairs,” said the doctor. “But she asked that you not come up.” The physician related the terrible account: Moore learned that his wife had contracted smallpox. The disease had left her once-flawless skin pocked and scarred. She had taken one look at her reflection in the mirror and commanded that the shutters be drawn, and over them the heavy drapes, and that her husband should never see her again. Moore would not listen. He ran upstairs and threw open the door of his wife’s room. It was black as night inside. Not a sound came from the darkness. Groping along the wall, Moore felt for the gas jets. A startled cry came from a black corner of the room. “No! Don’t light the lamps!” Moore hesitated, swayed by the pleading in the voice. “Go!” she begged. “Please go! This is the greatest gift I can give you now.”

 

Moore did go. He went down to his study, where he sat up most of the night, passionately writing what turned out to be one of the greatest love poems ever written; he also composed a song to go with his words, a song that lives on, drawing on musical phrases of ancient Irish melodies. He had never written a song before, but now it came naturally, motivated by the adoration of his wife and her profound melancholy. The next morning, as soon as the sun was up, he returned to his wife’s room. In spite of the morning light, her room remained as dark as night. He felt his way to a chair and sat down. “Are you awake?” he asked. “I am,” came a voice from the far side of the room. “But you must not ask to see me. You must not press me, Thomas.” “I will sing to you, then,” he answered. And so, for the first time, Thomas Moore sang to his wife the song that still lives today:

 

Believe me, if all those endearing young charms,

Which I gaze on so fondly today,

Were to change by tomorrow and fleet in my arms,

Like fairy gifts fading away,

Thou wouldst still be adored, as this moment thou art —

Let thy loveliness fade as it will,

 

Moore heard a movement from the dark corner where his wife lay in her loneliness, waiting.  He continued:

 

Let thy loveliness fade as it will,

And around the dear ruin each wish of my heart

Would entwine itself verdantly still —

 

The song ended. As his voice trailed off on the last note, Moore heard his bride rise. She crossed the room to the window, reached up and slowly pulled aside the drapes and drew open the shutters.

 

Their long marriage was a happy one, with five children, all of whom tragically died before their parents. Towards the end of his life, Moore suffered a stroke and was lovingly cared for by his devoted wife, Elizabeth. He died on 26 February 1852. His remains are in a vault at St. Nicholas churchyard, Bromham, within view of his cottage home.

 

(Editor’s note: as you listen to this loveliness, try not to look at the photos; they would not have been our choice.)

 

Here is that same love poem/song:

 

Believe Me If All Those Endearing Young Charms

 

Another version of “Believe Me”

 

Another version of “Believe Me”

 

Final evolution of the music: Harvard University’s song, “Fair Harvard.”

 

The Last Rose of Summer, words and music by Thomas Moore, sung by Renée Fleming. The haunting melody of Moore’s The Last Rose of Summer was adapted by Beethoven, by Mendelssohn, and by Flotow in his opera Martha.

 

Dame Joan Sutherland: Last Rose (adapted for the opera, Martha)

 

Mendelssohn (adaptation): Fantasie, Op. 15

 

Beethoven (adaptation), in E-flat: “Sad and luckless was the season.”

 

FINAL NOTATION: THE LAST ROSE OF SUMMER IS A PROMINENT MELODY IN THE SOUNDTRACK OF “THREE BILLBOARDS OUTSIDE EBBING, MISSOURI,” A RECENT FILM NOMINATED FOR AN OSCAR.

 

Robert Daniel Lawrence MD (1892-1968)

Robert Daniel Lawrence MD

Photo credit: Unknown – http://wellcomeimages.org/indexplus/image/L0000433.html, CC BY 4.0, https://commons.wikimedia.org/w/index.php?curid=35452965

  

Dr. Robert “Robin” Daniel Lawrence (1892-1968), MA, MB ChB (Hons), MD, FRCP, LLD, was a British physician at King’s College Hospital, London. He was diagnosed with diabetes in 1920 and became an early recipient of insulin injections in the UK in 1923. He devoted his professional life to the care of diabetic patients and is remembered as the founder of the British Diabetic Association. Dr. Lawrence, better known as Robin Lawrence, was born at 10 Ferryhill Place, Aberdeen, Scotland. He was the second son of Thomas and Margaret Lawrence. His father was a prosperous brush manufacturer, whose firm supplied all the brushes to Queen Victoria and her heirs at Balmoral.

 

At eighteen, Lawrence matriculated at Aberdeen University to take an MA in French and English. After graduation, he briefly worked in an uncle’s drapery shop in Glasgow but gave this up after just a few weeks, returning to Aberdeen, where he enrolled back at Aberdeen University to study medicine. He had a brilliant undergraduate career, winning gold medals in Anatomy, Clinical Medicine and Surgery, and graduated with honors in 1916. During his second year, on the advice of his anatomy professor, he sat and passed the primary FRCS examination in London. He gave up rugby as a student, but represented the University at both hockey and tennis. He was also President of the Students’ Representative Council. On graduation he immediately joined the RAMC and, after six months’ home service, served on the Indian Frontier until invalided home in 1919 with dysentery; he was discharged with the final rank of Captain. After a few weeks convalescing at home and fishing, he went to London and obtained the post of House Surgeon in the Casualty Department at King’s College Hospital. Six months later, now accepted as a “King’s Man”, he became an assistant surgeon in the Ear, Nose and Throat Department. Shortly afterwards, while practicing for a mastoid operation on a cadaver, he was chiseling the bone when a bone chip flew into his right eye, setting up an unpleasant infection. He was hospitalized, but the infection failed to settle and he was discovered to have diabetes. For a man of his age at that time, this diagnosis amounted to a death sentence.

 

Lawrence was initially controlled on a rigid diet, and the eye infection settled but left permanently impaired vision in that eye. He abandoned thoughts of a career in surgery and worked in the King’s College Hospital Chemical Pathology Department under Dr G. A. Harrison. Despite his gloomy prognosis and ill health, he managed to conduct enough research to write his MD thesis. A little later, in the expectation that he had only a short time to live, and not wishing to die at home and cause upset to his family, he moved to Florence and set up in practice there. In the winter of 1922-23 his diabetes deteriorated badly after an attack of bronchitis, and the end of his life seemed imminent. In early 1922, Banting, Best, Collip and Macleod in Toronto, Canada, had announced the discovery and isolation of insulin. Supplies were initially scarce and slow to reach the UK, but in May 1923, Harrison cabled Lawrence: “I’ve got insulin – it works – come back quick”. By this time Lawrence was weak and disabled by peripheral neuritis, but he drove with difficulty across the continent and reached King’s College Hospital on 28 May 1923. After some preliminary tests he received his first insulin injection on 31 May. His life was saved, and he spent two months in hospital recovering and learning all about insulin. He was then appointed Chemical Pathologist at King’s College Hospital and devoted the rest of his life to the care and welfare of diabetic patients.

 

Dr. Lawrence developed one of the earliest and largest diabetic clinics in the country and in 1931 was appointed assistant physician-in-charge of the diabetic department at King’s College Hospital, becoming full physician-in-charge in 1939. He also had a large private practice. He wrote profusely on his subject, and his books The Diabetic Life and The Diabetic ABC did much to simplify treatment for doctors and patients. The Diabetic Life was first published in 1925 and became immensely popular, running to 14 editions and being translated into many languages. He published widely on all aspects of diabetes and its management, producing some 106 papers either alone or with colleagues, including important publications on the management of diabetic coma, on the treatment of diabetes and tuberculosis, and on the care of pregnancy in diabetics. In 1934, he conceived the idea of an association which would foster research and encourage the education and welfare of patients. To this end a group of doctors and diabetics met in the London home of Lawrence’s patient H. G. Wells, the scientist and writer, and the Diabetic Association was formed. When other countries followed suit it became the British Diabetic Association (the BDA). Lawrence was Chairman of its Executive Council from 1934 to 1961 and Honorary Life President from 1962.

 

Dr. Lawrence’s enthusiasm and drive ensured the life and steady growth of this association, which soon became the voice of the diabetic population and constantly sought to promote the welfare of diabetics. There are now active branches throughout the country. He was also a prime mover in the production of “The Diabetic Journal” (forerunner of Balance), the first issue of which appeared in January 1935; he himself contributed many articles anonymously thereafter. He and his colleague Joseph Hoet were the main proponents in founding the International Diabetes Federation, and he served as its first president from 1950 to 1958. At its triennial conferences, Lawrence’s appearance was always greeted with acclaim. Almost immediately after his retirement, he suffered a stroke, but his spirit remained indomitable and he continued seeing private patients to the end. His last publication was an account of how hypoglycemia exaggerated the signs of his hemiparesis. Although he preached strict control of diabetes for his patients, he did not keep to a strict diet himself, taking instead supplementary shots of soluble insulin as he judged he needed them. He died at home in London on 27 August 1968, aged 76.

 

Lawrence was the Oliver-Sharpey lecturer at the Royal College of Physicians of London in 1946. His lecture was one of the earliest descriptions and detailed studies of the rare condition now known as lipoatrophic diabetes. He received the Banting Medal of the American Diabetes Association the same year, was Banting Lecturer of the BDA in 1949, and in 1964 Toronto University conferred on him its LLD honoris causa. Charles Best, then professor of physiology in Toronto, was probably the proposer for this honor, as he had met and become friendly with Lawrence when doing postgraduate research in London with Sir Henry Dale and A. V. Hill in 1925-28. They remained lifelong friends, meeting regularly when in each other’s country.

 

RD Lawrence is commemorated by an annual Lawrence lecture, given by a young researcher in the field of diabetes to the Medical & Scientific Section of the BDA, and by the Lawrence Medal, awarded to patients who have been on insulin for 60 years or more. The BDA, now Diabetes UK, remains his lasting memorial.

 

Most Mysterious Human Organ

Hieroglyphic for the word “brain” (c.1700 BCE)

Source: The Edwin Smith Surgical Papyrus (17th century BCE), Public Domain, Wikipedia Commons

 

From the ancient Egyptian mummifications to 18th-century scientific research on “globules” and neurons, there is evidence of neuroscience practice throughout the early periods of history. Early civilizations lacked adequate means to obtain knowledge about the human brain, and their assumptions about the inner workings of the mind were therefore not accurate. Early views on the function of the brain regarded it as a form of “cranial stuffing” of sorts. In ancient Egypt, from the late Middle Kingdom onwards, in preparation for mummification the brain was regularly removed, for it was the heart that was assumed to be the seat of intelligence.

 

According to Herodotus, during the first step of mummification: “The most perfect practice is to extract as much of the brain as possible with an iron hook, and what the hook cannot reach is mixed with drugs.” Over the next five thousand years, this view came to be reversed; the brain is now known to be the seat of intelligence, although colloquial variations of the former remain, as in “memorizing something by heart”.

 

The Edwin Smith Surgical Papyrus, written in the 17th century BCE, contains the earliest recorded reference to the brain. The hieroglyph for brain, occurring eight times in this papyrus, appears in descriptions of the symptoms, diagnosis, and prognosis of two patients, wounded in the head, who had compound fractures of the skull. The assessments of the author of the papyrus (a battlefield surgeon) allude to ancient Egyptians having a vague recognition of the effects of head trauma. While the symptoms are well written and detailed, the absence of a medical precedent is apparent. The author of the passage notes “the pulsations of the exposed brain” and compares the surface of the brain to the rippling surface of copper slag (which indeed has a gyral-sulcal pattern). The laterality of injury was related to the laterality of symptom, and both aphasia (“he speaks not to thee”) and seizures (“he shudders exceedingly”) after head injury were described. Observations by ancient civilizations of the human brain suggest only a relative understanding of the basic mechanics and the importance of cranial security. Furthermore, considering that the general consensus of medical practice pertaining to human anatomy was based on myths and superstition, the thoughts of the battlefield surgeon appear empirical, based on logical deduction and simple observation.

 

During the second half of the first millennium BCE, the Ancient Greeks developed differing views on the function of the brain. However, because Hippocratic doctors did not practice dissection (the human body was considered sacred), Greek views of brain function were generally uninformed by anatomical study. It is said that the Pythagorean Alcmaeon of Croton (6th and 5th centuries BCE) was the first to consider the brain to be the place where the mind was located. According to ancient authorities, "he believed the seat of sensations is in the brain. This contains the governing faculty. All the senses are connected in some way with the brain; consequently they are incapable of action if the brain is disturbed. The power of the brain to synthesize sensations makes it also the seat of thought: the storing up of perceptions gives memory and belief, and when these are stabilized you get knowledge." In the 4th century BCE, Hippocrates believed the brain to be the seat of intelligence (building, among others before him, on Alcmaeon's work). Also in the 4th century BCE, Aristotle thought that, while the heart was the seat of intelligence, the brain was a cooling mechanism for the blood. He reasoned that humans are more rational than the beasts because, among other reasons, they have a larger brain to cool their hot-bloodedness.

 

In contrast to Greek thought regarding the sanctity of the human body, the Egyptians had been embalming their dead for centuries and went about the systematic study of the human body. During the Hellenistic period, Herophilus of Chalcedon (c. 335/330 – 280/250 BCE) and Erasistratus of Ceos (c. 300 – 240 BCE) made fundamental contributions not only to the anatomy and physiology of the brain and nervous system, but to many other fields of the biosciences. Herophilus not only distinguished the cerebrum from the cerebellum, but provided the first clear description of the ventricles. Erasistratus took a more practical approach, experimenting on the living brain. Their works are now mostly lost, and we know of their achievements mostly through secondary sources. Some of their discoveries had to be rediscovered a millennium after their deaths.

 

During the Roman Empire, the Greek anatomist Galen dissected the brains of sheep, monkeys, dogs, and swine, among other non-human mammals. He concluded that, as the cerebellum was denser than the cerebrum, it must control the muscles, while the cerebrum, being soft, must be where the senses were processed. Galen further theorized that the brain functioned by movement of animal spirits through the ventricles. "Further, his studies of the cranial nerves and spinal cord were outstanding. He noted that specific spinal nerves controlled specific muscles, and had the idea of the reciprocal action of muscles. For the next advance in understanding spinal function we must await Bell and Magendie in the 19th Century." Circa 1000, Al-Zahrawi, living in Islamic Iberia, evaluated neurological patients and performed surgical treatments of head injuries, skull fractures, spinal injuries, hydrocephalus, subdural effusions and headache. Concurrently in Persia, Avicenna also presented detailed knowledge about skull fractures and their surgical treatments. Between the 13th and 14th centuries, the first anatomy textbooks in Europe that included a description of the brain were written by Mondino de Luzzi and Guido da Vigevano.

 

Andreas Vesalius noted many structural characteristics of both the brain and the general nervous system during his dissections of human cadavers. In addition to recording many anatomical features such as the putamen and corpus callosum, Vesalius proposed that the brain was made up of seven pairs of 'brain nerves', each with a specialized function. Other scholars furthered Vesalius' work by adding their own detailed sketches of the human brain. René Descartes also studied the physiology of the brain, proposing the theory of dualism to tackle the issue of the brain's relation to the mind. He suggested that the pineal gland was where the mind interacted with the body, after recording the brain mechanisms responsible for circulating cerebrospinal fluid. Thomas Willis studied the brain, nerves, and behavior to develop neurologic treatments. He described in great detail the structure of the brainstem, the cerebellum, the ventricles, and the cerebral hemispheres.

 

The role of electricity in nerves was first observed in dissected frogs by Luigi Galvani in the second half of the 18th century. In the 1820s, Jean Pierre Flourens pioneered the experimental method of carrying out localized lesions of the brain in animals, describing their effects on movement, sensation and behavior. Richard Caton presented his findings in 1875 on electrical phenomena of the cerebral hemispheres of rabbits and monkeys. Studies of the brain became more sophisticated after the invention of the microscope and the development, by Camillo Golgi in the early 1870s, of a staining procedure that used a silver chromate salt to reveal the intricate structures of single neurons. His technique was used by Santiago Ramon y Cajal and led to the formation of the neuron doctrine, the hypothesis that the functional unit of the brain is the neuron. Golgi and Cajal shared the Nobel Prize in Physiology or Medicine in 1906 for their extensive observations, descriptions and categorizations of neurons throughout the brain. The hypotheses of the neuron doctrine were supported by experiments following Galvani's pioneering work on the electrical excitability of muscles and neurons. In the 19th century, Emil du Bois-Reymond, Johannes Peter Muller, and Hermann von Helmholtz showed that neurons were electrically excitable and that their activity predictably affected the electrical state of adjacent neurons. In parallel with this research, work with brain-damaged patients by Paul Broca suggested that certain regions of the brain were responsible for certain functions.

 

Tell me where is fancie bred,

In the heart or in the head?

 

William Shakespeare (Merchant of Venice)

 

Britain’s Charles II’s Medical Treatment Led to His Suffering and Death

King Charles II, in a portrait by John Michael Wright

Graphic credit: John Michael Wright – National Portrait Gallery: NPG 531, Public Domain, https://commons.wikimedia.org/w/index.php?curid=6373274

 

 

Charles II (29 May 1630-6 February 1685) was king of England, Scotland and Ireland. He was king of Scotland from 1649 until his deposition in 1651, and king of England, Scotland and Ireland from the restoration of the monarchy in 1660 until his death. Charles II was one of the most popular and beloved kings of England, known as the Merry Monarch, in reference to both the liveliness and hedonism of his court and the general relief at the return to normality after over a decade of rule by Cromwell and the Puritans. Charles’s wife, Catherine of Braganza, bore no live children, but Charles acknowledged at least twelve illegitimate children by various mistresses. He was succeeded by his brother James.

 

Charles II’s father, Charles I, was executed at Whitehall on 30 January 1649, at the climax of the English Civil War. Although the Parliament of Scotland proclaimed Charles II King on 5 February 1649, England entered the period known as the English Interregnum or the English Commonwealth, and the country was a de facto republic, led by Oliver Cromwell. Cromwell defeated Charles II at the Battle of Worcester on 3 September 1651, and Charles fled to mainland Europe. Cromwell became virtual dictator of England, Scotland and Ireland, and Charles spent the next nine years in exile in France, the Dutch Republic and the Spanish Netherlands. A political crisis that followed the death of Cromwell in 1658 resulted in the restoration of the monarchy, and Charles was invited to return to Britain. On 29 May 1660, his 30th birthday, he was received in London to public acclaim. After 1660, all legal documents were dated as if he had succeeded his father as king in 1649.

 

Charles’s English parliament enacted laws known as the Clarendon Code, designed to shore up the position of the re-established Church of England. Charles acquiesced to the Clarendon Code even though he favored a policy of religious tolerance. The major foreign policy issue of his early reign was the Second Anglo-Dutch War. In 1670, he entered into the Treaty of Dover, an alliance with his first cousin King Louis XIV of France. Louis agreed to aid him in the Third Anglo-Dutch War and pay him a pension, and Charles secretly promised to convert to Catholicism at an unspecified future date. Charles attempted to introduce religious freedom for Catholics and Protestant dissenters with his 1672 Royal Declaration of Indulgence, but the English Parliament forced him to withdraw it. In 1679, Titus Oates’s revelations of a supposed “Popish Plot“ sparked the Exclusion Crisis when it was revealed that Charles’s brother and heir (James, Duke of York) was a Catholic. The crisis saw the birth of the pro-exclusion Whig and anti-exclusion Tory parties. Charles sided with the Tories, and, following the discovery of the Rye House Plot to murder Charles and James in 1683, some Whig leaders were executed or forced into exile. Charles dissolved the English Parliament in 1681 and ruled alone until his death on 6 February 1685. Ironically, he was received into the Roman Catholic Church on his deathbed.

 

He died in his bed, surrounded by his spaniels, friends, and family, in the early hours of 6 February 1685. His death was torture, owing to a complete lack of medical knowledge; in Charles II's case, the torturers were his doctors. It was not their intention to cause the death of the king, but through their total medical ignorance, their actions led the already ailing Charles to a speedier and more agonizing death. Because the daily accounts of Charles II's demise are so detailed and vivid, we include the next few passages as a stark contrast to the 21st-century hospice and palliative care we are now accustomed to.

 

DAY 1: On the morning of 2nd February 1685, things seemed to be going normally for Charles. As he was preparing to shave, he suddenly cried out in pain, fell to the floor and suffered a series of fits. Six royal physicians rushed into the Royal Bedchamber to help Charles. Their good intentions, however, paved the path to Charles' undoubtedly excruciating end. Once the seizure had passed, the first thing the doctors did was bleed him of 16 ounces of blood. Next, they applied heated cups to the king's skin to form blisters. This treatment, which is still practiced in parts of the world today, was believed to 'stimulate' his system; once the blister was lanced, the disease was supposed to leave the body with its contents. After the cupping procedure was completed, Charles' doctors drained him of 8 more ounces of blood. After this second bleeding session, they gave the king a drug to induce vomiting, an enema to purify his bowels, and a purgative to clean out his intestines. The doctors believed that the bad consequences of the disease lay not only in the blood, but also in the bowels. The next treatment was to force-feed him a syrup containing blackthorn and rock salt, followed by shaving his head and blistering his scalp, which woke the king from a nap. None of the physicians understood the healing nature of sleep. They administered yet another enema to the ailing king, put an irritant powder up his nostrils, blistered his skin again with cupping, and applied cowslip flowers to his stomach. At the end of the day they applied pigeon droppings to his feet. The torturous treatment of the first day lasted 12 hours. After the 'care' was done, the king was put to bed.

 

DAY 2: When the king awoke, he seemed greatly improved. This should have been a sign that something had worked; however, as soon as Charles II woke, his doctors began to bleed him again, this time opening both of his jugular veins and drawing 10 more ounces. At this point, the king had lost 34 ounces of blood. The physicians then proceeded to feed him a potion containing black cherries, peony, lavender, sugar, and crushed pearls. After he ingested the liquid, he slept soundly through the day and night.

 

DAY 3: When Charles awoke on the third morning, he suffered another seizure. His doctors bled him again, after feeding him first senna pods in spring water and white wine with nutmeg, then a force-fed drink made of '40 drops of extract of human skull', taken from a man who had met a very violent demise, as well as a bezoar stone from an East Indian goat. The physicians proudly announced that the king was going to survive.

 

DAY 4: The king was near death on this day. Seeing his pitiful state of health, the doctors applied the hot cups to his skin again to form blisters, gave him another enema and emetic, and bled him yet again. He was then given Jesuit's Powder, a quinine remedy laced with opium and wine. Perhaps this potion helped as a painkiller and a soporific.

 

DAY 5: Dr. Scarborough, one of the royal doctors, wrote on the morning of 5th February 1685: "Alas! After an ill-fated night, His Serene Majesty's strength seemed exhausted to such a degree that the whole assembly of physicians became despondent and lost hope." On this day, in an attempt to revive the king, he was bled until the doctors gave up the technique and turned to creating a new, stronger potion. The physicians prepared an antidote containing 'extracts of all the herbs and animals of the Kingdom' by scouring the palace grounds. These ingredients were then mixed with ammonia and poured down his throat.

 

DAY 6: 6th February 1685 was the final day for the popular monarch. The scene around his deathbed still draws emotion more than 300 years later. Charles, although incredibly weak and in great pain, wished to see each of his surviving children and mistresses one last time. At one point, the king asked for the curtains of his room to be drawn back so that he could view the sun over the Thames one last time. As he took in the view, he said: "I have suffered much more than you can imagine. You must pardon me, gentlemen, for being a most unconscionable time a-dying." He converted to Catholicism shortly before he died. At 11:15 am, on 6th February 1685, at the age of 54, King Charles II died.

 

It is said that Charles was suffering from a variety of ailments at the time: uremia, malaria, mercury poisoning, chronic nephritis, and quite possibly some form of sexually transmitted disease. We know that he was ill, but the exact illness is not known. Could today's physicians have kept the king alive? We'll never know; we do know, however, that today his death would have been peaceful and would have lacked all the suffering Charles II endured.

 

Emanuel Swedenborg (1688 – 1772)

Graphic credit: By Carl Frederik von Breda – http://www.newchurchhistory.org/articles/ceg2006b/ceg2006b.php, Public Domain, https://commons.wikimedia.org/w/index.php?curid=15230429

 

 

Emanuel Swedenborg (29 January 1688 – 29 March 1772) was a Swedish scientist, philosopher, theologian, revelator, mystic and the founder of Swedenborgianism. Swedenborg had a prolific career as an inventor and scientist. During the 1730s, Swedenborg undertook many studies of anatomy and physiology. He had the first known anticipation of the neuron concept; it was not until a century later that science recognized the full significance of the nerve cell. He also had prescient ideas about the cerebral cortex, the hierarchical organization of the nervous system, the localization of the cerebrospinal fluid, the functions of the pituitary gland, the perivascular spaces, the foramen of Magendie, the idea of somatotopic organization, and the association of frontal brain regions with the intellect. In some cases, his conclusions have been experimentally verified in modern times.

 

In the 1730s, Swedenborg became increasingly interested in spiritual matters and was determined to find a theory to explain how matter relates to spirit. Swedenborg’s desire to understand the order and the purpose of creation first led him to investigate the structure of matter and the process of creation itself. In the Principia, he outlined his philosophical method, which incorporated experience, geometry (the means by which the inner order of the world can be known) and the power of reason. He also outlined his cosmology, which included the first presentation of his nebular hypothesis. There is evidence that Swedenborg may have preceded Kant by as much as 20 years in the development of that hypothesis. Although the first known observations of the CSF (cerebrospinal fluid) date back to Hippocrates (460-375 BCE) and later Galen (130-200 CE), its discovery is credited to Emanuel Swedenborg (1688-1772 CE), who, being a devoutly religious man, identified the CSF during his search for the seat of the soul. The 16 centuries of anatomists that came after Hippocrates and Galen may have missed identifying the CSF due to the time period’s prevailing autopsy technique, which included severing the head and draining the blood before dissecting the brain. Although Swedenborg’s work (in translation) was not published until 1887, due in part to his lack of medical credentials, he may have also made the first connection between the CSF and the lymphatic system. His description of the CSF was of a “spirituous lymph.“

 

In the peripheral organs, the lymphatic system performs important immune functions, and runs parallel to the blood circulatory system to provide a secondary circulation that transports excess interstitial fluid, proteins and metabolic waste products from the systemic tissues back into the blood. The efficient removal of soluble proteins from the interstitial fluid is critical to the regulation of both colloidal osmotic pressure and homeostatic regulation of the body's fluid volume. The importance of lymphatic flow is especially evident when the lymphatic system becomes obstructed. In diseases of the lymphatic system such as elephantiasis (where parasites occupying the lymphatic vessels block the flow of lymph), the impact of such an obstruction can be dramatic: the resulting chronic edema is due to the breakdown of lymphatic clearance and the accumulation of interstitial solutes. In 2015, about 300 years after Emanuel Swedenborg, the presence of a meningeal lymphatic system was first identified. For over a century the prevailing hypothesis had been that the flow of cerebrospinal fluid (CSF), which surrounds but does not come in direct contact with the parenchyma of the CNS, could replace peripheral lymphatic functions and play an important role in the clearance of extracellular solutes.

 

The majority of the CSF is formed in the choroid plexus and flows through the brain along a distinct pathway: moving through the cerebral ventricular system, into the subarachnoid space surrounding the brain, then draining into the systemic blood column via arachnoid granulations of the dural sinuses or to peripheral lymphatics along cranial nerve sheaths. Many researchers have suggested that the CSF compartment constitutes a sink for interstitial solute and fluid clearance from the brain parenchyma. However, the distances between the interstitial fluid and the CSF in the ventricles and subarachnoid space are too great for the efficient removal of interstitial macromolecules and wastes by simple diffusion alone. Helen Cserr at Brown University calculated that mean diffusion times for large molecules such as albumin would exceed 100 hours to traverse 1 cm of brain tissue, a rate that is not compatible with the intense metabolic demands of brain tissue. A clearance system based on simple diffusion would additionally lack the sensitivity to respond rapidly to deviations from homeostatic conditions. Key determinants of diffusion through the brain interstitial spaces are the dimensions and composition of the extracellular compartment. In a series of elegantly designed experiments in the 1980s and 1990s, C. Nicholson and colleagues from New York University explored the microenvironment of the extracellular space using ion-selective micropipettes and ionophoretic point sources. Using these techniques, Nicholson showed that solute and water movement through the brain parenchyma slows as the extracellular volume fraction decreases and the space becomes more tortuous.
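To give a feel for the scale of that estimate, the characteristic one-dimensional diffusion time can be approximated as t ≈ x²/(2D). The sketch below is only an order-of-magnitude illustration: the effective diffusion coefficient used is an assumed, representative value for a large protein such as albumin in tortuous brain tissue, not a figure taken from Cserr's calculation.

```python
# Order-of-magnitude estimate of how long diffusion alone would take to move
# a large protein (e.g., albumin) across 1 cm of brain tissue.
# D below is an assumed, illustrative effective diffusion coefficient.

D = 1e-7     # effective diffusion coefficient in brain extracellular space, cm^2/s (assumed)
x = 1.0      # distance to traverse, cm

t_seconds = x ** 2 / (2 * D)   # characteristic 1-D diffusion time, t ~ x^2 / (2D)
t_hours = t_seconds / 3600

print(f"~{t_hours:,.0f} hours")   # on the order of a thousand hours, far exceeding 100 hours
```

Even with generous assumptions about the diffusion coefficient, the timescale comes out at hundreds to thousands of hours, which is precisely the point of the argument that some mechanism faster than diffusion must clear interstitial solutes.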

 

As an alternative explanation to diffusion, Cserr and colleagues proposed that convective bulk flow of interstitial fluid from the brain parenchyma to the CSF was responsible for efficient waste clearance. Experiments conducted at the University of Maryland in the 1980s by Patricia Grady and colleagues suggested the existence of solute exchange between the interstitial fluid of the brain parenchyma and the CSF via paravascular spaces. In 1985, Grady and colleagues proposed that cerebrospinal fluid and interstitial fluid exchange along specific anatomical pathways within the brain, with CSF moving into the brain along the outside of blood vessels. Grady's group suggested that these 'paravascular channels' were functionally analogous to peripheral lymph vessels, facilitating the clearance of interstitial wastes from the brain. Other labs at the time, however, did not observe such widespread paravascular CSF-ISF exchange. The continuity between the brain interstitial fluid and the CSF was confirmed by H. Cserr and colleagues from Brown University and King's College London. The same group postulated that interstitial solutes in the brain parenchyma exchange with CSF via a bulk-flow mechanism rather than by diffusion. However, other work from the same lab indicated that the exchange of CSF with interstitial fluid was inconsistent and minor, contradicting the findings of Grady and colleagues.

 

Gunter Blobel MD, PhD (1936 – 2018) Redefined Cell Biology

Gunter Blobel, who died a few days ago, on 18 February 2018, aged 81. Photo credit: From Wikimedia Commons, the free media repository

 

Gunter Blobel (May 21, 1936 – February 18, 2018) was a Silesian German and American biologist and the 1999 Nobel laureate in Physiology or Medicine for the discovery that proteins have intrinsic signals that govern their transport and localization in the cell. Blobel was born in Waltersdorf in the Prussian Province of Lower Silesia, now a part of Poland. In January 1945 his family fled their native Silesia before the advancing Red Army. After the war, Blobel grew up and attended gymnasium in the Saxon town of Freiberg. He graduated from the University of Tubingen in 1960 with an MD and received his PhD from the University of Wisconsin-Madison in 1967. Blobel joined the Rockefeller University faculty more than 50 years ago, where he was the John D. Rockefeller Jr. Professor, and had been a Howard Hughes Medical Institute Investigator since 1986.

 

Blobel was awarded the 1999 Nobel Prize in Physiology or Medicine for the discovery of signal peptides. Signal peptides form an integral part of protein targeting, a mechanism by which cells direct newly synthesized protein molecules to their proper location by means of an "address tag" (i.e., a signal peptide) within the molecule. Proteins that are manufactured within cells must be transported to the sites where they are needed. Blobel discovered a system of intrinsic signals that explains how a cell is able to accurately distribute the billions of such proteins it makes each day. Along with his colleagues, Blobel learned that sequences in proteins were responsible for directing this traffic, matching these "zip codes" with transport machinery in the cell that facilitates targeting to the proper cellular membranes. This connection results in the proteins either passing through the membranes or becoming embedded within them. His observations were central to uniting the fields of molecular biology, which deals primarily with proteins and nucleic acids, and cell biology, which focuses on the structures inside cells, called organelles. In addition, he found that the same system operates across all eukaryotes, from yeast to humans.
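For readers who think in code, the "zip code" analogy can be pictured as a simple lookup from an N-terminal tag to a destination compartment. This is only a toy sketch: the tag strings, the destinations, and the route_protein helper below are invented for illustration and are not real signal-peptide sequences or any published algorithm.

```python
# Toy illustration of the "address tag" idea behind protein targeting.
# The tags and destinations are invented placeholders, not real signal sequences.

SIGNAL_TAGS = {
    "MALWTRLQ": "secretory pathway (endoplasmic reticulum)",  # hypothetical tag
    "MLSRAVCG": "mitochondrion",                              # hypothetical tag
}

def route_protein(sequence: str) -> str:
    """Return the compartment whose tag matches the start of the protein."""
    for tag, destination in SIGNAL_TAGS.items():
        if sequence.startswith(tag):
            return destination
    return "cytosol (no recognized tag)"

# A protein carrying a recognized N-terminal tag is routed to its compartment;
# one without a tag stays in the cytosol.
print(route_protein("MALWTRLQKVETLCGSH"))   # -> secretory pathway (endoplasmic reticulum)
print(route_protein("MGSSHHHHHHSSGLVPR"))   # -> cytosol (no recognized tag)
```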

 

Blobel became well known for his direct and active support for the rebuilding of Dresden in Germany, becoming, in 1994, the founder and president of the nonprofit "Friends of Dresden, Inc." He donated all of his Nobel award money to the restoration of Dresden, in particular to the rebuilding of the Frauenkirche (completed in 2005) and the building of a new synagogue. In Leipzig he pursued the rebuilding of the Paulinerkirche, the university church of the University of Leipzig, which had been blown up by the communist regime of East Germany in 1968, arguing that "this is a shrine of German cultural history, connected to the most important names in German cultural history." Alongside his research at the Rockefeller University in New York City from 1968 to 2018, Blobel lived on Manhattan's Upper East Side with his wife, Laura Maioglio (owner of Barbetta Restaurant in Manhattan). He was on the board of directors of Nestle and the Board of Scientific Governors at The Scripps Research Institute. He was also co-founder and chairman of the Scientific Advisory Board of Chromocell Corporation, and sat on the Selection Committee for Life Science and Medicine, which chooses winners of the Shaw Prize.

 

Excellent video about the work of Dr. Gunter Blobel

 

CRISPR


 

The discovery of clustered DNA repeats began independently in three parts of the world. One of the first discoveries was in 1987 at Osaka University in Japan, where researcher Yoshizumi Ishino and colleagues published their findings on the sequence of an E. coli gene called "iap". Technological advances in the 1990s allowed researchers to continue this work and speed up their sequencing with a technique called metagenomics, in which DNA is sequenced directly from environmental samples such as seawater or soil. The first description of what would later be called CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) occurred in 1987, when Yoshizumi Ishino accidentally cloned part of a CRISPR array together with the iap gene, the target of interest. The organization of the repeats was unusual: repeated sequences are typically arranged consecutively along DNA, whereas these were separated by intervening stretches. The function of the interrupted clustered repeats was not known at the time.
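To make the idea of "interspaced" repeats concrete, the short sketch below scans a DNA string for recurrences of a repeat unit and reads off the spacer segments between them. It is only an illustration: the sequence and the repeat unit are invented, and real CRISPR arrays involve longer repeats and unique spacers derived from foreign DNA.

```python
# Toy illustration of a CRISPR-like array: a short repeat unit recurring at
# intervals, separated by unique "spacer" sequences.
# The sequence and repeat below are invented for illustration.

import re

REPEAT = "GTTTTAGAGC"  # made-up repeat unit
dna = "AACC" + REPEAT + "TTGACCGTA" + REPEAT + "CGGATTACCA" + REPEAT + "GGTT"

# Find every occurrence of the repeat, then read off the spacers between them.
positions = [m.start() for m in re.finditer(REPEAT, dna)]
spacers = [
    dna[positions[i] + len(REPEAT): positions[i + 1]]
    for i in range(len(positions) - 1)
]

print("repeat found at positions:", positions)   # regularly interspaced hits
print("spacers between repeats:  ", spacers)     # unique intervening sequences
```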

 

Ishino received his BS, MS and PhD degrees in 1981, 1983 and 1986, respectively, from Osaka University. From 1987 to 1989, he was a post-doctoral fellow at Yale University (in Dieter Soll's laboratory). In 2002, he became a professor at Kyushu University. Since October 2013, he has also been a member of the NASA Astrobiology Institute, University of Illinois at Urbana-Champaign.

 

In 1993, researchers studying Mycobacterium tuberculosis in the Netherlands published two articles about a cluster of interrupted direct repeats (DR) in this bacterium. These researchers recognized the diversity of the DR-intervening sequences among different strains of M. tuberculosis and used this property to design a typing method named spoligotyping, which is still in use today. At the same time, repeats were observed in the archaeal organisms of Haloferax and Haloarcula species, and their function was studied by Francisco Mojica at the University of Alicante in Spain. Although his hypothesis turned out to be wrong, Mojica surmised at the time that the clustered repeats had a role in correctly segregating replicated DNA into daughter cells during cell division, because plasmids and chromosomes with identical repeat arrays could not coexist in Haloferax volcanii. Transcription of the interrupted repeats was also noted for the first time. In 2017, Mojica was a winner of the Albany Medical Center Prize.

 

The three articles below are well written and informative regarding this new and exciting technology.

 

https://www.wired.com/2015/07/crispr-dna-editing-2/

https://www.the-scientist.com/?articles.view/articleNo/44919/title/Credit-for-CRISPR–A-Conversation-with-George-Church/

https://genotopia.scienceblog.com/573/a-whig-history-of-crispr/

 

Santiago Ramon y Cajal (1852 – 1934)

Santiago Ramon y Cajal. Spanish Nobel laureate in medicine.

Photo credit: Original photo is anonymous although published by Clark University in 1899. Restoration by Garrondo – Cajal.PNG, Public Domain, https://commons.wikimedia.org/w/index.php?curid=12334552

  

Santiago Ramon y Cajal was a Spanish neuroscientist and pathologist, specializing in neuroanatomy, particularly the histology of the central nervous system. He won the Nobel Prize in 1906, becoming the first person of Spanish origin to win a scientific Nobel Prize. His original investigations of the microscopic structure of the brain made him a pioneer of modern neuroscience. Hundreds of his drawings illustrating the delicate arborizations of brain cells are still in use for educational and training purposes.

 

Santiago Ramon y Cajal was born on 1 May 1852 in the town of Petilla de Aragon, Navarre, Spain. His father was an anatomy teacher. As a child, he was transferred many times from one school to another because of behavior that was deemed poor, rebellious, and anti-authoritarian. An extreme example of his precociousness and rebelliousness is his imprisonment in 1863, at the age of eleven, for destroying his neighbor's yard gate with a homemade cannon. He was an avid painter, artist, and gymnast, but his father neither appreciated nor encouraged these abilities, even though these artistic talents would contribute to his success later in life. In order to tame the unruly character of his son, his father apprenticed him to a shoemaker and a barber.

 

Ramon y Cajal as a young captain in the Ten Years’ War in Cuba, 1874.

Graphic credit: Izquierdo Vives, Public Domain, https://commons.wikimedia.org/w/index.php?curid=32562868

 

 

Over the summer of 1868, his father, hoping to interest his son in a medical career, took him to graveyards to find human remains for anatomical study. Sketching bones proved a turning point, and he subsequently pursued studies in medicine. Ramon y Cajal attended the medical school of the University of Zaragoza, where his father was an anatomy teacher, and graduated in 1873, aged 21. After a competitive examination, he served as a medical officer in the Spanish Army. He took part in an expedition to Cuba in 1874-75, where he contracted malaria and tuberculosis. In order to recover, he visited the spa town of Panticosa in the Pyrenees. After returning to Spain, he received his doctorate in medicine in Madrid in 1877. In 1879, he became the director of the Zaragoza Museum, and he married Silveria Fananas García, with whom he had four daughters and three sons. Cajal worked at the University of Zaragoza until 1883, when he was awarded the position of anatomy professor at the University of Valencia. His early work at these two universities focused on the pathology of inflammation, the microbiology of cholera, and the structure of epithelial cells and tissues.

 

In 1887 Cajal moved to Barcelona for a professorship. There he first learned about Golgi's method, a cell-staining method that uses potassium dichromate and silver nitrate to (randomly) stain a few neurons a dark black color while leaving the surrounding cells transparent. This method, which he improved, was central to his work, allowing him to turn his attention to the central nervous system (brain and spinal cord), in which neurons are so densely intertwined that standard microscopic inspection would be nearly impossible. During this period he made extensive detailed drawings of neural material, covering many species and most major regions of the brain. In 1892, he became a professor in Madrid. In 1899 he became director of the National Institute of Hygiene, and in 1922 founder of the Laboratory of Biological Investigations, later renamed the Cajal Institute. He died in Madrid on October 17, 1934, at the age of 82, continuing to work even on his deathbed.

 

Ramon y Cajal made several major contributions to neuroanatomy. He discovered the axonal growth cone and demonstrated experimentally that the relationship between nerve cells was not continuous, but contiguous. This provided definitive evidence for what Heinrich Waldeyer would later name the neuron theory, as opposed to the reticular theory, and is now widely considered the foundation of modern neuroscience. Cajal was an advocate of the existence of dendritic spines, although he did not recognize them as the site of contact from presynaptic cells. He was a proponent of the polarization of nerve cell function, and his student Rafael Lorente de Nó would continue this study of input-output systems into cable theory and some of the earliest circuit analysis of neural structures. Beyond producing excellent depictions of neural structures and their connectivity and providing detailed descriptions of cell types, he discovered a new type of cell, subsequently named after him: the interstitial cell of Cajal (ICC). This cell is found interleaved among neurons embedded within the smooth muscle lining the gut, where it serves as the generator and pacemaker of the slow waves of contraction that move material along the gastrointestinal tract, mediating neurotransmission from motor neurons to smooth muscle cells. In his 1894 Croonian Lecture, Ramon y Cajal suggested (in an extended metaphor) that cortical pyramidal cells may become more elaborate with time, as a tree grows and extends its branches.

 

Cajal devoted a considerable amount of time to the study of hypnosis, which he used to help his wife during labor, and of parapsychological phenomena; a book he had written on these topics was lost during the Spanish Civil War. Cajal received many prizes, distinctions, and society memberships during his scientific career, including honorary doctorates in medicine from Cambridge University and Wurzburg University and an honorary doctorate in philosophy from Clark University in the United States. The most famous distinction he was awarded was the Nobel Prize in Physiology or Medicine in 1906, shared with the Italian scientist Camillo Golgi "in recognition of their work on the structure of the nervous system". This caused some controversy because Golgi, a staunch supporter of reticular theory, disagreed with Ramon y Cajal's view of the neuron doctrine. Cajal published more than 100 scientific works and articles in Spanish, French and German. Among his most notable works were:

 

Rules and advice on scientific investigation

Histology

Degeneration and regeneration of the nervous system

Manual of normal histology and micrographic technique

Elements of histology

 

A list of his books includes:

 

Ramon y Cajal, Santiago (1905) [1890]. Manual de Anatomia Patologica General (Handbook of general Anatomical Pathology) (in Spanish) (fourth ed.).

Ramon y Cajal, Santiago; Richard Greeff (1894). Die Retina der Wirbelthiere: Untersuchungen mit der Golgi-Cajal'schen Chromsilbermethode und der Ehrlich'schen Methylenblaufärbung (The retina of vertebrates) (in German). Bergmann.

Ramon y Cajal, Santiago; L. Azoulay (1894). Les nouvelles idees sur la structure du systeme nerveux chez l'homme et chez les vertebres (New ideas on the fine anatomy of the nerve centres) (in French). C. Reinwald.

Ramon y Cajal, Santiago; Johannes Bresler; E. Mendel (1896). Beitrag zum Studium der Medulla Oblongata: Des Kleinhirns und des Ursprungs der Gehirnnerven (in German). Verlag von Johann Ambrosius Barth.

Ramon y Cajal, Santiago (1898). “Estructura del quiasma optico y teoria general de los entrecruzamientos de las vias nerviosas. (Structure of the Chiasma opticum and general theory of the crossing of nerve tracks)“ [Die Structur des Chiasma opticum nebst einer allgemeine Theorie der Kreuzung der Nervenbahnen (German, 1899, Verlag Joh. A. Barth)]. Rev. Trim. Micrografica (in Spanish). 3: 15-65.

Ramon y Cajal, Santiago (1899). Comparative study of the sensory areas of the human cortex. p. 85. Archived from the original on 10 September 2009.

Ramon y Cajal, Santiago (1899-1904). Textura del sistema nervioso del hombre y los vertebrados (in Spanish). Madrid.

Histologie du systeme nerveux de l’homme & des vertebres (in French) – via Internet Archive.

Texture of the Nervous System of Man and the Vertebrates.

Ramon y Cajal, Santiago (1906). Studien uber die Hirnrinde des Menschen v.5 (Studies on the cerebral cortex of man) (in German). Johann Ambrosius Barth.

Ramon y Cajal, Santiago (1999) [1897]. Advice for a Young Investigator. Translated by Neely Swanson and Larry W. Swanson. Cambridge: MIT Press. ISBN 0-262-68150-1.

Ramon y Cajal, Santiago (1937). Recuerdos de mi Vida (in Spanish). Cambridge: MIT Press. ISBN 84-206-2290-7.

 

Other accomplishments and honors:

 

In 1905, he published five science-fiction stories called “Vacation Stories“ under the pen name “Dr. Bacteria“.

The asteroid 117413 Ramonycajal has been named in his honor.

 

In the 21st Century:

In 2014, the National Institutes of Health exhibited original Ramon y Cajal drawings in its Neuroscience Research Center.

 

This year 2018:

An exhibition called The Beautiful Brain: The Drawings of Santiago Ramon y Cajal is travelling through the US, beginning in 2017 at the Weisman Art Museum in Minneapolis and ending in April 2019 at the Ackland Art Museum in Chapel Hill, North Carolina.

 

A short documentary by REDES is available on YouTube, and Spanish public television filmed a biopic series to commemorate his life.

 

Take a look at the beauty of the drawings by the great neuroscientist Santiago Ramon y Cajal

 

Review of Cajal’s work

Life of the genius at work

Short biography

NIH discusses the great drawings

Discussion of 21 drawings, with a short pause between each discussion
