Date:
January 18, 2018

Source:
NASA

Summary:
Continuing the planet’s long-term warming trend, globally averaged temperatures in 2017 were 1.62 degrees Fahrenheit (0.90 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists.

 

This map shows Earth’s average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA’s Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline.
Credit: NASA’s Scientific Visualization Studio

 

 

Earth’s global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA.

Continuing the planet’s long-term warming trend, globally averaged temperatures in 2017 were 1.62 degrees Fahrenheit (0.90 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York. That is second only to global temperatures in 2016.

In a separate, independent analysis, scientists at the National Oceanic and Atmospheric Administration (NOAA) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies’ records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.

Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017’s global mean change is accurate to within 0.1 degree Fahrenheit, with a 95 percent certainty level.

“Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we’ve seen over the last 40 years,” said GISS Director Gavin Schmidt.

The planet’s average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.

Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event — and with a La Niña starting in the later months of 2017 — last year’s temperatures ranked between 2015 and 2016 in NASA’s records.

In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.

Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.

Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.

NASA’s temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.

These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
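
As a rough illustration of the anomaly approach described above, the sketch below computes a baseline-relative, latitude-weighted global mean from a handful of hypothetical station records. It is only a conceptual sketch in Python: GISTEMP's real algorithm also grids the stations, handles missing data, and corrects for urban heating, none of which is shown here, and the station values are invented.

import numpy as np

# Hypothetical annual-mean records (degrees C) for a few stations, 1951-2017.
years = np.arange(1951, 2018)
rng = np.random.default_rng(0)
stations = {
    # name: (latitude in degrees, synthetic temperature series)
    "station_a": (64.0, 0.02 * (years - 1951) + rng.normal(0, 0.3, years.size)),
    "station_b": (40.0, 0.01 * (years - 1951) + rng.normal(0, 0.3, years.size)),
    "station_c": (-2.0, 0.008 * (years - 1951) + rng.normal(0, 0.3, years.size)),
}

baseline = (years >= 1951) & (years <= 1980)
weighted_sum = np.zeros(years.size)
weight_total = 0.0
for lat, series in stations.values():
    anomaly = series - series[baseline].mean()   # deviation from the 1951-1980 mean
    weight = np.cos(np.radians(lat))             # crude area weighting by latitude
    weighted_sum += weight * anomaly
    weight_total += weight

global_anomaly = weighted_sum / weight_total
print(dict(zip(years[-3:], np.round(global_anomaly[-3:], 2))))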

NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth’s polar regions and global temperatures.

The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available at: https://data.giss.nasa.gov/gistemp

GISS is a laboratory within the Earth Sciences Division of NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University’s Earth Institute and School of Engineering and Applied Science in New York.

Story Source:

Materials provided by NASA. Note: Content may be edited for style and length.

 

Source: NASA. “Long-term warming trend continued in 2017: NASA, NOAA.” ScienceDaily. ScienceDaily, 18 January 2018. <www.sciencedaily.com/releases/2018/01/180118173711.htm>.

Novel approach lays groundwork for using 3-D printing to repair tissue in the body

Date:
January 17, 2018

Source:
The Optical Society

Summary:
For the first time, researchers have shown that an optical fiber as thin as a human hair can be used to create microscopic structures with laser-based 3-D printing. The innovative approach might one day be used with an endoscope to fabricate tiny biocompatible structures directly into tissue inside the body.

 

Researchers used an optical fiber housed inside the needle pictured to deliver light for 3-D printing microstructures. The light selectively hardens volumes inside the droplet of photopolymer on the glass slide. The new system could one day allow 3-D printing inside the body.
Credit: Damien Loterie and Paul Delrot, École Polytechnique Fédérale de Lausanne

 

 

For the first time, researchers have shown that an optical fiber as thin as a human hair can be used to create microscopic structures with laser-based 3D printing. The innovative approach might one day be used with an endoscope to fabricate tiny biocompatible structures directly into tissue inside the body. This capability could enable new ways to repair tissue damage.

“With further development our technique could enable endoscopic microfabrication tools that would be valuable during surgery,” said research team leader Paul Delrot, from École Polytechnique Fédérale de Lausanne, Switzerland. “These tools could be used to print micro- or nano-scale 3D structures that facilitate the adhesion and growth of cells to create engineered tissue that restores damaged tissues.”

In The Optical Society (OSA) journal Optics Express, the researchers show that their new approach can create microstructures with a 1.0-micron lateral (side-to-side) and 21.5-micron axial (depth) printing resolution. Although these microstructures were created on a microscope slide, the approach could be useful for studying how cells interact with various microstructures in animal models, which would help pave the way for endoscopic printing in people.

To create the microstructures, the researchers dipped the end of an optical fiber into a liquid known as photopolymer that solidifies, or cures, when illuminated with a specific color of light. They used the optical fiber to deliver and digitally focus laser light point-by-point into the liquid to build a three-dimensional microstructure.

By printing delicate details onto large parts, the new ultra-compact microfabrication tool could also be a useful add-on to today’s commercially available 3D printers that are used for everything from rapid prototyping to making personalized medical devices. “By using one printer head with a low resolution for the bulk parts and our device as a secondary printer head for the fine details, multi-resolution additive manufacturing could be achieved,” said Delrot.

Simplifying the setup

Current laser-based microfabrication techniques rely on a non-linear optical phenomenon called two-photon photopolymerization to selectively cure a volume deep inside a liquid photosensitive material. These techniques are difficult to use for biomedical applications because two-photon photopolymerization requires complex and expensive lasers that emit very short pulses as well as bulky optical systems to deliver the light.

“Our group has expertise in manipulating and shaping light through optical fibers, which led us to think that microstructures could be printed with a compact system. In addition, to make the system more affordable, we took advantage of a photopolymer with a nonlinear dose response. This can work with a simple continuous-wave laser, so expensive pulsed lasers were not required,” said Delrot.

To selectively cure a specific volume of material, the researchers took advantage of a chemical phenomenon in which solidification only occurs above a certain threshold in light intensity. By performing a detailed study of the light scanning parameters and the photopolymer’s behavior, the researchers discovered the best parameters for using this chemical phenomenon to print microstructures using a low-power, inexpensive laser that emits continuously (rather than pulsed).
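
To make the threshold idea concrete, here is a minimal, hypothetical Python sketch: a focused Gaussian spot is scanned across a 2-D grid of photopolymer, light dose accumulates at each dwell point, and only voxels whose dose exceeds a curing threshold solidify. The spot size, scan path, and threshold are invented for illustration and are not the parameters reported in the paper.

import numpy as np

grid = np.zeros((200, 200))                 # accumulated light dose, arbitrary units
yy, xx = np.mgrid[0:200, 0:200]
spot_sigma = 4.0                            # focused spot radius in pixels (hypothetical)
threshold = 0.8                             # curing threshold (hypothetical)

# Scan the focal spot along a simple L-shaped path, depositing dose at each dwell point.
path = [(100, x) for x in range(60, 140)] + [(y, 140) for y in range(100, 140)]
for cy, cx in path:
    grid += 0.5 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * spot_sigma ** 2))

cured = grid > threshold                    # voxels that solidify
print("cured voxels:", int(cured.sum()))

Because solidification only happens where the accumulated dose clears the threshold, the cured line can be narrower than the focal spot itself, which is the basic trick that lets a simple continuous-wave laser confine polymerization.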

To create hollow and solid microstructures, the researchers used an organic polymer precursor doped with photoinitiator made of off-the-shelf chemical components. They focused a continuous-wave laser emitting light at 488-nanometer wavelength — visible-wavelength light that is potentially safe for cells — through an optical fiber small enough to fit in a syringe. Using an approach known as wavefront shaping they were able to focus the light inside the photopolymer so that only a small 3D point was cured. Performing a calibration step prior to microfabrication allowed them to digitally focus and scan laser light through the ultra-thin optical fiber without moving the fiber.
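
Digital focusing through a multimode fiber is commonly done by first measuring a transmission matrix during calibration and then launching the phase-conjugate field that concentrates light on a chosen output spot. The toy numpy example below uses a random matrix as a stand-in for a calibrated fiber; it is an assumption-laden illustration of the general principle, not the specific calibration procedure used by the EPFL team.

import numpy as np

rng = np.random.default_rng(1)
n_modes = 256                                # input modes controlled during calibration
# Stand-in for a measured fiber transmission matrix (complex-valued, random).
T = (rng.normal(size=(n_modes, n_modes)) + 1j * rng.normal(size=(n_modes, n_modes))) / np.sqrt(2 * n_modes)

target = 42                                  # output pixel where we want the focus
field_in = np.conj(T[target, :])             # phase-conjugate input for that pixel
field_in /= np.abs(field_in)                 # keep phase only (phase-only modulator)

intensity_out = np.abs(T @ field_in) ** 2
print("enhancement at target:", intensity_out[target] / intensity_out.mean())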

“Compared to two-photon photopolymerization state-of-the-art systems, our device has a coarser printing resolution, however, it is potentially sufficient to study cellular interactions and does not require bulky optical systems nor expensive pulsed lasers,” said Delrot. “Since our approach doesn’t require complex optical components, it could be adapted to use with current endoscopic systems.”

Moving toward clinical use

The researchers are working to develop biocompatible photopolymers and a compact photopolymer delivery system, which are necessary before the technique could be used in people. A faster scanning speed is also needed, but in cases where the instrument size is not critical, this limitation could be overcome by using a commercial endoscope instead of the ultra-thin fiber. Finally, a technique to finalize and post-process the printed structure inside the body is required to create microstructures with biomedical functions.

“Our work shows that 3D microfabrication can be achieved with techniques other than focusing a high-power femtosecond pulsed laser,” said Delrot. “Using less complex lasers or light sources will make additive manufacturing more accessible and create new opportunities of applications such as the one we demonstrated.”

Story Source:

Materials provided by The Optical Society. Note: Content may be edited for style and length.


Journal Reference:

  1. Paul Delrot, Damien Loterie, Demetri Psaltis, Christophe Moser. Single-photon three-dimensional microfabrication through a multimode optical fiber. Optics Express, 2018; 26 (2): 1766. DOI: 10.1364/OE.26.001766

 

Source: The Optical Society. “Ultra-thin optical fibers offer new way to 3-D print microstructures: Novel approach lays groundwork for using 3-D printing to repair tissue in the body.” ScienceDaily. ScienceDaily, 17 January 2018. <www.sciencedaily.com/releases/2018/01/180117102644.htm>.

Date:
January 16, 2018

Source:
University of York

Summary:
Researchers have found no evidence to support the theory that video games make players more violent.

 

Playing video games.
Credit: © panuwat / Fotolia

 

 

Researchers at the University of York have found no evidence to support the theory that video games make players more violent.

In a series of experiments, with more than 3,000 participants, the team demonstrated that video game concepts do not ‘prime’ players to behave in certain ways and that increasing the realism of violent video games does not necessarily increase aggression in game players.

The dominant model of learning in games is built on the idea that exposing players to concepts, such as violence in a game, makes those concepts easier to use in ‘real life’.

This is known as ‘priming’, and is thought to lead to changes in behaviour. Previous experiments on this effect, however, have so far provided mixed conclusions.

Researchers at the University of York ran experiments with larger numbers of participants than earlier studies and compared different types of gaming realism to explore whether more conclusive evidence could be found.

In one study, participants played a game where they had to either be a car avoiding collisions with trucks or a mouse avoiding being caught by a cat. Following the game, the players were shown various images, such as a bus or a dog, and asked to label them as either a vehicle or an animal.

Dr David Zendle, from the University’s Department of Computer Science, said: “If players are ‘primed’ through immersing themselves in the concepts of the game, they should be able to categorise the objects associated with this game more quickly in the real world once the game had concluded.

“Across the two games we didn’t find this to be the case. Participants who played a car-themed game were no quicker at categorising vehicle images, and indeed in some cases their reaction time was significantly slower.”

In a separate, but connected study, the team investigated whether realism influenced the aggression of game players. Research in the past has suggested that the greater the realism of the game the more primed players are by violent concepts, leading to antisocial effects in the real world.

Dr Zendle said: “There are several experiments looking at graphic realism in video games, but they have returned mixed results. There are, however, other ways that violent games can be realistic, besides looking like the ‘real world’, such as the way characters behave for example.

“Our experiment looked at the use of ‘ragdoll physics’ in game design, which creates characters that move and react in the same way that they would in real life. Human characters are modelled on the movement of the human skeleton and how that skeleton would fall if it was injured.”

The experiment compared player reactions to two combat games, one that used ‘ragdoll physics’ to create realistic character behaviour and one that did not, in an animated world that nevertheless looked real.

Following the game the players were asked to complete word puzzles called ‘word fragment completion tasks’, where researchers expected more violent word associations would be chosen for those who played the game that employed more realistic behaviours.
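
As a sketch of how such a comparison might be scored, the hypothetical Python below counts violent word completions per participant in each condition and runs an independent-samples t-test. The numbers are invented and the paper's actual statistical analysis may well differ.

import numpy as np
from scipy import stats

# Hypothetical counts of violent completions per participant in each condition.
ragdoll_group = np.array([4, 3, 5, 2, 4, 3, 6, 4, 3, 5])
rigid_group = np.array([4, 5, 3, 4, 2, 4, 5, 3, 4, 4])

t_stat, p_value = stats.ttest_ind(ragdoll_group, rigid_group)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # a large p suggests no detectable priming difference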

They compared the results of this experiment with another test of game realism, where a single bespoke war game was modified to form two different games. In one of these games, enemy characters used realistic soldier behaviours, whilst in the other game they did not employ realistic soldier behaviour.

Dr Zendle said: “We found that the priming of violent concepts, as measured by how many violent concepts appeared in the word fragment completion task, was not detectable. There was no difference in priming between the game that employed ‘ragdoll physics’ and the game that didn’t, as well as no significant difference between the games that used ‘real’ and ‘unreal’ soldier tactics.

“The findings suggest that there is no link between these kinds of realism in games and the kind of effects that video games are commonly thought to have on their players.

“Further study is now needed into other aspects of realism to see if this has the same result. What happens when we consider the realism of by-standing characters in the game, for example, and the inclusion of extreme content, such as torture?

“We also only tested these theories on adults, so more work is needed to understand whether a different effect is evident in children players.”

Story Source:

Materials provided by University of York. Note: Content may be edited for style and length.


Journal Reference:

  1. David Zendle, Daniel Kudenko, Paul Cairns. Behavioural realism and the activation of aggressive concepts in violent video games. Entertainment Computing, 2018; 24: 21. DOI: 10.1016/j.entcom.2017.10.003

 

Source: University of York. “No evidence to support link between violent video games and behavior.” ScienceDaily. ScienceDaily, 16 January 2018. <www.sciencedaily.com/releases/2018/01/180116131317.htm>.

Date:
January 15, 2018

Source:
Marine Biological Laboratory

Summary:
Many of the genes involved in natural repair of the injured spinal cord of the lamprey are also active in the repair of the peripheral nervous system in mammals, according to a new study.

 

Jennifer Morgan and Ona Bloom with juvenile lamprey in the MBL Whitman Center.
Credit: Amanda R. Martinez

 

 

Many of the genes involved in natural repair of the injured spinal cord of the lamprey are also active in the repair of the peripheral nervous system in mammals, according to a study by a collaborative group of scientists at the Marine Biological Laboratory (MBL) and other institutions. This is consistent with the possibility that in the long term, the same or similar genes may be harnessed to improve spinal cord injury treatments.

“We found a large overlap with the hub of transcription factors that are driving regeneration in the mammalian peripheral nervous system,” says Jennifer Morgan, director of the MBL’s Eugene Bell Center for Regenerative Biology and Tissue Engineering, one of the authors of the study published this week in Scientific Reports.

Lampreys are jawless, eel-like fish that shared a common ancestor with humans about 550 million years ago. This study arose from the observation that a lamprey can fully recover from a severed spinal cord without medication or other treatment.

“They can go from paralysis to full swimming behaviors in 10 to 12 weeks,” says Morgan.

“Scientists have known for many years that the lamprey achieves spontaneous recovery from spinal cord injury, but we have not known the molecular recipe that accompanies and supports this remarkable capacity,” says Ona Bloom of the Feinstein Institute for Medical Research and the Zucker School of Medicine at Hofstra/Northwell, a former MBL Whitman Center Fellow who collaborated on the project.

“In this study, we have determined all the genes that change during the time course of recovery and now that we have that information, we can use it to test if specific pathways are actually essential to the process,” Bloom says.

The researchers followed the lampreys’ healing process and took samples from the brains and spinal cords at multiple points in time, from the first hours after injury until three months later when they were healed. They analyzed the material to determine which genes and signaling pathways were activated as compared to a non-injured lamprey.

As expected, they found many genes in the spinal cord that change over time with recovery. Somewhat unexpectedly, they also discovered a number of injury-induced gene expression changes in the brain. “This reinforces the idea that the brain changes a lot after a spinal cord injury,” says Morgan. “Most people are thinking, ‘What can you do to treat the spinal cord itself?’ but our data really support the idea that there’s also a lot going on in the brain.”

They also found that many of the genes associated with spinal cord healing are part of the Wnt signaling pathway, which plays a role in tissue development. “Furthermore, when we treated the animals with a drug that inhibits the Wnt signaling pathway, the animals never recovered their ability to swim,” says Morgan. Future research will explore why the Wnt pathway seems particularly important in the healing process.

The paper is the result of a collaboration between Morgan, Bloom and other scientists including Jeramiah Smith of University of Kentucky and Joseph Buxbaum of Icahn School of Medicine at Mount Sinai, both former Whitman Center Fellows. The collaboration was made possible by the MBL Whitman Center Fellowship program.

“[This study] involved several different labs located in different parts of the country with different types of expertise, but it absolutely could not and would not have been done without the support of the MBL that allows us to work collaboratively in a shared laboratory setting,” says Morgan.

Story Source:

Materials provided by Marine Biological Laboratory. Original written by Diana Kenney. Note: Content may be edited for style and length.


Journal Reference:

  1. Paige E. Herman, Angelos Papatheodorou, Stephanie A. Bryant, Courtney K. M. Waterbury, Joseph R. Herdy, Anthony A. Arcese, Joseph D. Buxbaum, Jeramiah J. Smith, Jennifer R. Morgan, Ona Bloom. Highly conserved molecular pathways, including Wnt signaling, promote functional recovery from spinal cord injury in lampreys. Scientific Reports, 2018; 8 (1). DOI: 10.1038/s41598-017-18757-1

 

Source: Marine Biological Laboratory. “Genes that aid spinal cord healing in lamprey also present in humans, researchers discover.” ScienceDaily. ScienceDaily, 15 January 2018. <www.sciencedaily.com/releases/2018/01/180115094222.htm>.

A Breathless Sunset in New York City as Seen From the 24th Floor

 

Every once in a while the sky explodes. Two days after a big storm, a gorgeous sunset surprised us. The entire company stopped to take in one of the most breathtaking views we have ever seen. It is our pleasure to share this photo with our readers from all over the world as we welcome in the new year.

 

Sunset Behind the Empire State Building ©Jules T. Mitchel

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 165). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

QUIZ


Clinical Trial Designs

Timeline of various approval tracks and research phases in the US

Graph credit: Kernsters – Graph created based on information provided in Scientific American article, “Faster Evaluation of Vital Drugs”, CC BY-SA 3.0, https://en.wikipedia.org/w/index.php?curid=39972696

 

 

Clinical trials are experiments or observations done in clinical research. Such prospective biomedical or behavioral research studies on human participants are designed to answer specific questions about biomedical or behavioral interventions, including new treatments (such as novel vaccines, drugs, dietary choices, dietary supplements, and medical 1) ___) and known interventions that warrant further study and comparison. Clinical trials generate data on safety and efficacy. They are conducted only after they have received health authority/ethics committee approval in the country where approval of the therapy is sought. These authorities are responsible for vetting the risk/2) ___ratio of the trial – their approval does not mean that the therapy is ‘safe’ or effective, only that the trial may be conducted.

 

Depending on product type and development stage, investigators initially enroll volunteers or patients into small pilot studies, and subsequently conduct progressively larger scale comparative 3) ___. Clinical trials can vary in size and cost, and they can involve a single research center or multiple centers, in one country or in multiple countries. Clinical study design aims to ensure the scientific validity and reproducibility of the results. Costs for clinical trials can be in the millions. The sponsor may be a governmental organization or a pharmaceutical, biotechnology or medical device company. Certain functions necessary to the trial, such as monitoring and lab work, may be managed by an outsourced partner, such as a contract 4) ___ organization (CRO) or a central laboratory.

 

An adaptive clinical trial is a clinical trial that evaluates a medical device or treatment by observing participant outcomes (and possibly other measures, such as side-5) ___) on a prescribed schedule, and modifying parameters of the trial protocol in accord with those observations. The adaptation process generally continues throughout the trial, as prescribed in the trial protocol. Modifications may include dosage, sample size, drug undergoing trial, patient selection criteria and “cocktail” mix. In some cases, trials have become an ongoing process that regularly adds and drops therapies and patient groups as more information is gained. Importantly, the trial protocol is set before the trial begins; the protocol pre-specifies the adaptation schedule and processes. The aim of an adaptive trial is to more quickly identify drugs or devices that have a therapeutic effect, and to zero in on patient populations for whom the drug is appropriate. A key modification is to adjust 6) ___ levels. Traditionally, non-adverse patient reactions are not considered until a trial is completed.

 

In 2004, the Critical Path Initiative was introduced by the United States’ FDA (Food and 7) ___ Administration) to modify the way drugs travel from lab to market. This initiative aimed at dealing with the high attrition levels observed in the clinical phase. It also attempted to offer flexibility to investigators to find the optimal clinical benefit without affecting the study’s validity. Adaptive clinical trials initially came under this regime. The FDA issued draft guidance on adaptive trial design in 2010. In 2012, the President’s Council of Advisors on Science and Technology (PCAST) recommended that FDA “run pilot projects to explore adaptive approval mechanisms to generate evidence across the lifecycle of a drug from the premarket through the 8) ___ phase.” While not specifically related to clinical trials, the Council also recommended that FDA “make full use of accelerated approval for all drugs meeting the statutory standard of addressing an unmet need for a serious or life threatening disease, and demonstrating an impact on a clinical endpoint other than survival or irreversible morbidity, or on a surrogate endpoint, likely to predict clinical benefit.”

 

In the 2007-2009 period, the Department of Biostatistics at the M. D. Anderson Cancer Center was running 89 Bayesian adaptive trials, 36% of the total designed by the faculty. The FDA adaptive trial design guidance is a 50-page document covering wide-ranging and important topics “such as what aspects of adaptive design trials (i.e., clinical, statistical, regulatory) call for special consideration, when to interact with FDA while planning and conducting adaptive design studies, what information to include in the adaptive design for FDA review, and issues to consider in the evaluation of a completed adaptive design study.” Attempts have been made to excerpt the guidance and make it more accessible.

 

According to FDA guidelines, an adaptive Bayesian clinical trial can involve the following (a brief illustrative sketch of items 2 and 5 appears after the list):

 

1. Interim looks to stop or to adjust patient accrual

2. Interim looks to assess stopping the trial early either for success, futility or harm

3. Reversing the hypothesis of non-inferiority to superiority or vice versa

4. Dropping arms or doses or adjusting doses

5. Modification of the randomization rate to increase the probability that a patient is allocated to the most appropriate arm
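
The sketch below illustrates items 2 and 5 in miniature: with binary responses and Beta priors, the posterior probability that the experimental arm beats control is estimated by simulation at an interim look, and can be used either to stop early or to tilt the randomization toward the better arm. The thresholds, priors, and data are hypothetical choices for illustration, not FDA-endorsed values.

import numpy as np

rng = np.random.default_rng(2)

# Interim data: responders / enrolled on each arm (hypothetical).
resp_ctl, n_ctl = 9, 40
resp_trt, n_trt = 17, 40

# Beta(1, 1) priors updated with the interim data give Beta posteriors.
post_ctl = rng.beta(1 + resp_ctl, 1 + n_ctl - resp_ctl, size=100_000)
post_trt = rng.beta(1 + resp_trt, 1 + n_trt - resp_trt, size=100_000)

prob_trt_better = float(np.mean(post_trt > post_ctl))
print("P(treatment > control) =", round(prob_trt_better, 3))

if prob_trt_better > 0.99:
    print("stop early for success")
elif prob_trt_better < 0.05:
    print("stop early for futility")
else:
    # Response-adaptive randomization: allocate the next patients in proportion
    # to a tempered version of the posterior probability of being the better arm.
    alloc_trt = prob_trt_better ** 0.5 / (prob_trt_better ** 0.5 + (1 - prob_trt_better) ** 0.5)
    print("randomize next patient to treatment with probability", round(alloc_trt, 2))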

 

The logistics of managing traditional, fixed format clinical trials are quite complex. Adapting the design as results arrive adds to the complexity of design, monitoring, drug supply, data capture and randomization. However, according to PCAST, “One approach is to focus studies on specific subsets of patients most likely to benefit, identified based on validated biomarkers. In some cases, using appropriate biomarkers can make it possible to dramatically decrease the sample size required to achieve statistical significance – for example, from 1500 to 50 patients.”

An adaptive trial design enabled two experimental breast cancer drugs to deliver promising results after just six months of testing, far shorter than usual. Researchers assessed the results while the trial was in process and found that 9) ___ had been eradicated in more than half of one group of patients. The trial, known as I-SPY 2, tested 12 experimental drugs.

For its predecessor, I-SPY 1, 10 cancer centers and the National Cancer Institute (NCI SPORE program and the NCI Cooperative groups) collaborated to identify response indicators that would best predict survival for women with high-risk breast cancer. During 2002-2006, the study monitored 237 patients undergoing neoadjuvant therapy before surgery. Iterative MRI scans and tissue samples monitored the patients’ biological response to chemotherapy given in a neoadjuvant, or presurgical, setting. Evaluating chemotherapy’s direct impact on tumor tissue took much less time than monitoring outcomes in thousands of patients over long time periods. The approach helped to standardize the imaging and tumor sampling processes, and led to miniaturized assays.

Key findings included that tumor response was a good predictor of patient survival, and that tumor shrinkage during treatment was a good predictor of long-10) ___ outcome. Importantly, the vast majority of tumors were identified as high risk by molecular signature. However, the group was heterogeneous, and measuring response within tumor subtypes was more informative than viewing the group as a whole. Within genetic signatures, level of response to treatment appears to be a reasonable predictor of outcome. Additionally, the trial’s shared database has furthered the understanding of drug response and generated new targets and agents for subsequent testing. Sources: nih.gov; Wikipedia
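
The PCAST figure quoted above (roughly 1,500 patients down to 50) can be made plausible with a standard two-sample power calculation: the required sample size scales with the inverse square of the standardized effect size, so enriching for biomarker-positive patients who respond much more strongly shrinks the trial dramatically. The effect sizes below are assumptions chosen to reproduce numbers of that order, not figures taken from the PCAST report.

from scipy.stats import norm

def total_n(effect_size, alpha=0.05, power=0.80):
    """Approximate total N (both arms) for a two-sample z-test of means."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    per_arm = 2 * (z_a + z_b) ** 2 / effect_size ** 2
    return 2 * per_arm

print(round(total_n(0.145)))   # small effect in an unselected population -> about 1,500 patients
print(round(total_n(0.79)))    # large effect in a biomarker-selected group -> about 50 patients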

 

ANSWERS: 1) devices; 2) benefit; 3) studies; 4) research; 5) effects; 6) dosing; 7) Drug; 8) post-market; 9) cancer; 10) term

 

Thomas Bayes (1701-1761)

Graphic credit: Unknown, Public Domain, https://commons.wikimedia.org/w/index.php?curid=14532025

 

The randomized clinical trial is widely viewed as the gold standard for evaluation of treatments, diagnostic procedures, or disease screening. The proper design and analysis of a clinical trial requires careful consideration of the study objectives (e.g., whether to demonstrate treatment superiority or non-inferiority) and the nature of the primary end point. Different statistical methods apply when the end point variable is discrete (counts), continuous (measurements), or time to event (survival analysis). Other complicating factors include patient noncompliance, loss to follow-up, missing data, and multiple comparisons when more than two treatments are evaluated in the same study. The best-known statistical method used today in clinical trials is the Bayesian method, named after the 18th-century statistician Thomas Bayes.

 

Thomas Bayes (1701 – 1761) was an English statistician, philosopher and Presbyterian minister who is known for having formulated a specific case of the theorem that bears his name: Bayes’ theorem. Bayes never published what would eventually become his most famous accomplishment; his notes were edited and published after his death by Richard Price. Bayes was the son of London Presbyterian minister Joshua Bayes, and was possibly born in Hertfordshire. He came from a prominent nonconformist family from Sheffield. In 1719, he enrolled at the University of Edinburgh to study logic and theology. On his return around 1722, he assisted his father at the latter’s chapel in London before moving to Tunbridge Wells, Kent, around 1734. There he was minister of the Mount Sion chapel, until 1752.

 

Bayes is known to have published two works in his lifetime, one theological and one mathematical:

 

1. Divine Benevolence, or an Attempt to Prove That the Principal End of the Divine Providence and Government is the Happiness of His Creatures (1731)

 

2. An Introduction to the Doctrine of Fluxions, and a Defence of the Mathematicians Against the Objections of the Author of The Analyst (published anonymously in 1736), in which he defended the logical foundation of Isaac Newton’s calculus (“fluxions”) against the criticism of George Berkeley, author of The Analyst

 

It is speculated that Bayes was elected as a Fellow of the Royal Society in 1742 on the strength of the Introduction to the Doctrine of Fluxions, as he is not known to have published any other mathematical works during his lifetime. In his later years, Bayes took a deep interest in probability. Professor Stephen Stigler, historian of statistical science, thinks that Bayes became interested in the subject while reviewing a work written in 1755 by Thomas Simpson, but George Alfred Barnard thinks he learned mathematics and probability from a book by Abraham de Moivre.  Others speculate he was motivated to rebut David Hume’s argument against believing in miracles on the evidence of testimony in An Enquiry Concerning Human Understanding.  His work and findings on probability theory were passed in manuscript form to his friend Richard Price after his death. By 1755 he was ill and by 1761 had died in Tunbridge Wells. He was buried in Bunhill Fields burial ground in Moorgate, London, where many nonconformists lie.

 

Bayes’ solution to a problem of inverse probability was presented in “An Essay towards solving a Problem in the Doctrine of Chances” which was read to the Royal Society in 1763 after Bayes’ death. Richard Price shepherded the work through this presentation and its publication in the Philosophical Transactions of the Royal Society of London the following year. This was an argument for using a uniform prior distribution for a binomial parameter and not merely a general postulate. This essay contains a statement of a special case of Bayes’ theorem. In the first decades of the eighteenth century, many problems concerning the probability of certain events, given specified conditions, were solved. For example: given a specified number of white and black balls in an urn, what is the probability of drawing a black ball? Or the converse: given that one or more balls has been drawn, what can be said about the number of white and black balls in the urn? These are sometimes called “inverse probability” problems. Bayes’ “Essay” contains his solution to a similar problem posed by Abraham de Moivre, author of The Doctrine of Chances (1718). In addition, a paper by Bayes on asymptotic series was published posthumously.
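
A small worked version of such an inverse-probability problem, in Python: suppose an urn holds three balls and every mixture of white and black is equally likely a priori; after one ball drawn at random turns out to be black, Bayes' theorem gives the updated probability of each mixture. The three-ball setup is an invented example for illustration, not the specific problem Bayes or de Moivre posed.

# Possible urn compositions: number of black balls among the 3.
compositions = [0, 1, 2, 3]
prior = [1 / 4] * 4                          # uniform prior over compositions
likelihood = [k / 3 for k in compositions]   # P(draw a black ball | k black balls)

unnormalized = [p * l for p, l in zip(prior, likelihood)]
evidence = sum(unnormalized)                 # P(draw a black ball)
posterior = [u / evidence for u in unnormalized]

for k, prob in zip(compositions, posterior):
    print(f"P({k} black balls | black drawn) = {prob:.3f}")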

 

Bayesian probability is the name given to several related interpretations of probability as an amount of epistemic confidence (the strength of beliefs, hypotheses, etc.) rather than a frequency. This allows the application of probability to all sorts of propositions rather than just ones that come with a reference class. “Bayesian” has been used in this sense since about 1950. Since its rebirth in the 1950s, advancements in computing technology have allowed scientists from many disciplines to pair traditional Bayesian statistics with random walk techniques. The use of the Bayes theorem has been extended in science and in other fields.

Bayes himself might not have embraced the broad interpretation now called Bayesian, which was in fact pioneered and popularized by Pierre-Simon Laplace; it is difficult to assess Bayes’ philosophical views on probability, since his essay does not go into questions of interpretation. There Bayes defines probability of an event as “the ratio between the value at which an expectation depending on the happening of the event ought to be computed, and the value of the thing expected upon its happening”. Within modern utility theory, the same definition would result by rearranging the definition of expected utility (the probability of an event times the payoff received in case of that event – including the special cases of buying risk for small amounts or buying security for big amounts) to solve for the probability. As Stigler points out, this is a subjective definition, and does not require repeated events; however, it does require that the event in question be observable, for otherwise it could never be said to have “happened”. Stigler argues that Bayes intended his results in a more limited way than modern Bayesians. Given Bayes’ definition of probability, his result concerning the parameter of a binomial distribution makes sense only to the extent that one can bet on its observable consequences.

 

Bayes was elected to membership in the Royal Society in 1742; and his nomination letter has been posted with other membership records at the Royal Society website. Those signing that nomination letter were: Philip Stanhope; Martin Folkes; James Burrow; Cromwell Mortimer; John Eames.

 

Click here if you’re interested in reading a short piece about the multi-armed bandits, or slot machines, and how/why statistics are important when it comes to gambling at these machines in Las Vegas.

 

Slot machines in Las Vegas

Photo credit: Wikipedia

 

HIV/AIDS


Short-Term HIV Treatment Interruption in Clinical Trials

 

Antiretroviral therapy (ART) benefits the health of people living with HIV, prolongs their lives and prevents transmission of the virus to others. If taken daily as directed, ART can reduce viral load — the amount of HIV in the blood — to levels that are undetectable with standard tests. However, the virus remains dormant in a small number of immune cells, and people living with HIV must take ART daily to keep the virus suppressed. If a person with ART-suppressed HIV stops taking medication, viral load will almost invariably rebound to high levels. Studies are now underway to develop therapeutic strategies to induce sustained ART-free remission, the absence of viral rebound following discontinuation of ART. Clinical trials to assess the efficacy of such experimental therapies may require participants to temporarily stop taking ART, an approach known as analytical treatment interruption, or ATI.

 

According to an article published in PLOS Pathogens (11 January 2018), a short-term pause in HIV treatment during a carefully monitored clinical trial does not lead to lasting expansion of the HIV reservoir or cause irreversible damage to the immune system. The study was designed to better understand the immunologic and virologic effects of ATI. For the study, blood samples were analyzed from 10 volunteers who had participated in a clinical trial evaluating whether infusions of a broadly neutralizing antibody could control HIV in the absence of ART. During the trial, participants temporarily stopped taking ART, subsequently experienced viral rebound, and resumed ART 22 to 115 days after stopping. While treatment was paused, the participants’ HIV reservoirs expanded along with increases in viral load, and abnormalities in the participants’ immune cells were observed. However, six to 12 months after the participants resumed ART, the size of the HIV reservoirs and the immune parameters returned to levels observed prior to ATI.

 

According to the authors, the findings support the use of ATI in clinical trials to evaluate the efficacy of therapeutic strategies aimed at achieving sustained ART-free remission. However, the authors added that larger studies that do not involve any interventional drugs are needed to confirm and expand on these results. The authors are currently conducting a clinical trial to monitor the impacts of short-term ATI on a variety of immunologic and virologic parameters in people living with HIV.

 

MERS Antibodies Produced in Cattle Tolerated in Phase 1 Clinical Trial

 

The first confirmed case of Middle East respiratory syndrome (MERS) was reported in Saudi Arabia in 2012. Since then, the MERS coronavirus has spread to 27 countries and sickened more than 2,000 people, of whom about 35% have died, according to the World Health Organization. There are no licensed treatments for MERS.

 

According to an article published in The Lancet Infectious Diseases (9 January 2018), SAB-301, an experimental treatment developed from cattle plasma for MERS infection, was found to be well-tolerated by healthy volunteers, with only minor reactions. SAB-301 was developed by SAB Biotherapeutics of Sioux Falls, South Dakota, and has been successfully tested in mice. The treatment comes from so-called “transchromosomic cattle.” These cattle have genes that have been slightly altered to enable them to produce fully human antibodies instead of cow antibodies against killed microbes with which they have been vaccinated — in this case the MERS virus. The clinical trial, conducted by NIH’s National Institute of Allergy and Infectious Diseases, took place at the NIH Clinical Center.

 

For the study, 28 healthy volunteers were treated with SAB-301 and 10 received a placebo. Six groups of volunteers given different intravenous doses were assessed six times over 90 days. Complaints among the treatment and placebo groups — such as headache and common cold symptoms — were similar and generally mild. The authors stated that they may be able to use these transchromosomic cattle to rapidly produce human antibodies, in as few as three months, against other human pathogens as well. This means they could conceivably develop antibody treatments against a variety of infectious diseases in a much faster timeframe and in much greater volume than currently possible.

 

SAB Biotherapeutics is planning a larger study of SAB-301 in patients infected with MERS coronavirus.

 

First Treatment for Breast Cancer with a Certain Inherited Mutation

 

Breast cancer is the most common form of cancer in the United States. The National Cancer Institute at the National Institutes of Health estimates approximately 252,710 women will be diagnosed with breast cancer this year, and 40,610 will die of the disease. Approximately 20-25% of patients with hereditary breast cancers and 5-10% of patients with any type of breast cancer have a BRCA mutation. BRCA genes are involved with repairing damaged DNA and normally work to prevent tumor development. However, mutations of these genes may lead to certain cancers, including breast cancers.

 

The FDA has expanded the approved use of Lynparza (olaparib tablets) to include the treatment of patients with certain types of breast cancer that have spread (metastasized) and whose tumors have a specific inherited (germline) BRCA mutation, as detected by the FDA-approved BRACAnalysis CDx test. This is the first PARP (poly ADP-ribose polymerase) inhibitor approved to treat breast cancer, and it is the first time any drug has been approved to treat certain patients with metastatic breast cancer who have a “BRCA” gene mutation.

 

Lynparza is a PARP inhibitor that blocks an enzyme involved in repairing damaged DNA. By blocking this enzyme, DNA inside the cancerous cells with damaged BRCA genes may be less likely to be repaired, leading to cell death and possibly a slow-down or stoppage of tumor growth. Lynparza was first approved by the FDA in 2014 to treat certain patients with ovarian cancer and is now indicated for the treatment of patients with germline breast cancer susceptibility gene (BRCA) mutated, human epidermal growth factor receptor 2 (HER2)-negative metastatic breast cancer, who have been previously treated with chemotherapy. Patients with hormone receptor (HR)-positive breast cancer should have been treated with a prior hormonal (endocrine) therapy or be considered inappropriate for endocrine treatment.

 

The FDA has also expanded the approval of the BRACAnalysis CDx, an approved companion diagnostic to Lynparza, to include the detection of BRCA mutations in blood samples from patients with breast cancer.

 

The safety and efficacy of Lynparza for the treatment of breast cancer was based on a randomized clinical trial of 302 patients with HER2-negative metastatic breast cancer with a germline BRCA mutation. The trial measured the length of time the tumors did not have significant growth after treatment (progression-free survival). The median progression-free survival for patients taking Lynparza was 7 months compared to 4.2 months for patients taking chemotherapy only. Common side effects of Lynparza include low levels of red blood cells (anemia), low levels of certain white blood cells (neutropenia, leukopenia), nausea, fatigue, vomiting, common cold (nasopharyngitis), respiratory tract infection, influenza, diarrhea, joint pain (arthralgia/myalgia), unusual taste sensation (dysgeusia), headache, indigestion (dyspepsia), decreased appetite, constipation and inflammation and sores in the mouth (stomatitis). Severe side effects of Lynparza include development of certain blood or bone marrow cancers (myelodysplastic syndrome/acute myeloid leukemia) and inflammation in the lungs (pneumonitis). Lynparza can cause harm to a developing fetus; women should be advised of the potential risk to the fetus and to use effective contraception. Women taking Lynparza should not breastfeed as it could cause harm to a newborn baby.

 

This application was granted Priority Review, under which the FDA’s goal is to take action on an application within 6 months where the agency determines that the drug, if approved, would significantly improve the safety or effectiveness of treating, diagnosing or preventing a serious condition. Lynparza is also approved for the treatment of patients with BRCA-mutated, advanced ovarian cancer who have received three or more treatments of chemotherapy, and for the maintenance treatment of patients with recurrent epithelial ovarian, fallopian tube or primary peritoneal cancer whose tumors have completely or partially responded to chemotherapy.

 

The FDA granted the approval of Lynparza to AstraZeneca Pharmaceuticals LP. The approval of the BRACAnalysis CDx was granted to Myriad Genetic Laboratories, Inc.

 
