February 23, 2018

NASA/Goddard Space Flight Center

A new analysis of data from two lunar missions finds evidence that the Moon’s water is widely distributed across the surface and is not confined to a particular region or type of terrain.


If the Moon has enough water, and if it’s reasonably convenient to access, future explorers might be able to use it as a resource.
Credit: NASA’s Goddard Space Flight Center



A new analysis of data from two lunar missions finds evidence that the Moon’s water is widely distributed across the surface and is not confined to a particular region or type of terrain. The water appears to be present day and night, though it’s not necessarily easily accessible.

The findings could help researchers understand the origin of the Moon’s water and how easy it would be to use as a resource. If the Moon has enough water, and if it’s reasonably convenient to access, future explorers might be able to use it as drinking water or to convert it into hydrogen and oxygen for rocket fuel or oxygen to breathe.

“We find that it doesn’t matter what time of day or which latitude we look at, the signal indicating water always seems to be present,” said Joshua Bandfield, a senior research scientist with the Space Science Institute in Boulder, Colorado, and lead author of the new study published in Nature Geoscience. “The presence of water doesn’t appear to depend on the composition of the surface, and the water sticks around.”

The results contradict some earlier studies, which had suggested that more water was detected at the Moon’s polar latitudes and that the strength of the water signal waxes and wanes according to the lunar day (29.5 Earth days). Taken together, these observations led some researchers to propose that water molecules can “hop” across the lunar surface until they enter cold traps in the dark reaches of craters near the north and south poles. In planetary science, a cold trap is a region so cold that water vapor and other volatiles coming into contact with the surface remain stable for an extended period, perhaps up to several billion years.

The debates continue because of the subtleties of how the detection has been achieved so far. The main evidence has come from remote-sensing instruments that measured the strength of sunlight reflected off the lunar surface. When water is present, instruments like these pick up a spectral fingerprint at wavelengths near 3 micrometers, which lies beyond visible light and in the realm of infrared radiation.

But the surface of the Moon also can get hot enough to “glow,” or emit its own light, in the infrared region of the spectrum. The challenge is to disentangle this mixture of reflected and emitted light. To tease the two apart, researchers need to have very accurate temperature information.
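The size of that emitted term can be sketched with Planck’s law for blackbody radiance. The Python sketch below is illustrative only: the temperatures are round numbers chosen for the example, not mission measurements, and the surface is treated as an ideal blackbody.

```python
import math

# Spectral radiance of an ideal blackbody via Planck's law.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance in W / (sr * m^3)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K * temp_k)
    return a / (math.exp(b) - 1.0)

lam = 3.0e-6                          # the 3-micrometer water band
day   = planck_radiance(lam, 390.0)   # roughly lunar noon near the equator
night = planck_radiance(lam, 150.0)   # a much colder surface

# At 390 K the surface emits orders of magnitude more 3-um radiance
# than at 150 K, so daytime spectra mix a large thermal term into the
# reflected-sunlight signal unless the temperature is known precisely.
```

Because the emitted term grows so steeply with temperature, even modest errors in the assumed surface temperature translate into large errors in the inferred water signal.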

Bandfield and colleagues came up with a new way to incorporate temperature information, creating a detailed model from measurements made by the Diviner instrument on NASA’s Lunar Reconnaissance Orbiter, or LRO. The team applied this temperature model to data gathered earlier by the Moon Mineralogy Mapper, a visible and infrared spectrometer that NASA’s Jet Propulsion Laboratory in Pasadena, California, provided for India’s Chandrayaan-1 orbiter.

The new finding of widespread and relatively immobile water suggests that it may be present primarily as OH, a more reactive relative of H2O that is made of one oxygen atom and one hydrogen atom. OH, also called hydroxyl, doesn’t stay on its own for long, preferring to attack molecules or attach itself chemically to them. Hydroxyl would therefore have to be extracted from minerals in order to be used.

The research also suggests that any H2O present on the Moon isn’t loosely attached to the surface.

“By putting some limits on how mobile the water or the OH on the surface is, we can help constrain how much water could reach the cold traps in the polar regions,” said Michael Poston of the Southwest Research Institute in San Antonio, Texas.

Sorting out what happens on the Moon could also help researchers understand the sources of water and its long-term storage on other rocky bodies throughout the solar system.

The researchers are still discussing what the findings tell them about the source of the Moon’s water. The results point toward OH and/or H2O being created by the solar wind hitting the lunar surface, though the team didn’t rule out that OH and/or H2O could come from the Moon itself, slowly released from deep inside minerals where it has been locked since the Moon was formed.

“Some of these scientific problems are very, very difficult, and it’s only by drawing on multiple resources from different missions that we are able to home in on an answer,” said LRO project scientist John Keller of NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

LRO is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland, for the Science Mission Directorate at NASA Headquarters in Washington, D.C. JPL designed, built and manages the Diviner instrument.

Story Source:

Materials provided by NASA/Goddard Space Flight Center. Note: Content may be edited for style and length.

Journal Reference:

  1. Joshua L. Bandfield, Michael J. Poston, Rachel L. Klima, Christopher S. Edwards. Widespread distribution of OH/H2O on the lunar surface inferred from spectral data. Nature Geoscience, 2018; DOI: 10.1038/s41561-018-0065-0


Source: NASA/Goddard Space Flight Center. “On second thought, the Moon’s water may be widespread and immobile.” ScienceDaily, 23 February 2018.

February 21, 2018

Howard Hughes Medical Institute

Fueled by advances in analyzing DNA from the bones of ancient humans, scientists have dramatically expanded the number of samples studied — revealing vast and surprising migrations and genetic mixing of populations in our prehistoric past.


The use of stylized bell-shaped pots like this one from Sierentz, France spread across Europe beginning about 4,700 years ago. DNA analysis shows that this so-called Bell Beaker culture was brought to Britain by people who largely replaced the island’s existing inhabitants.
Credit: Anthony Denaire



Scientists once could reconstruct humanity’s distant past only from the mute testimony of ancient settlements, bones, and artifacts.

No longer. Now there’s a powerful new approach for illuminating the world before the dawn of written history — reading the actual genetic code of our ancient ancestors. Two papers published in the journal Nature on February 21, 2018, more than double the number of ancient humans whose DNA has been analyzed and published, to 1,336 individuals, up from just 10 in 2014.

The new flood of genetic information represents a “coming of age” for the nascent field of ancient DNA, says lead author David Reich, a Howard Hughes Medical Institute investigator at Harvard Medical School — and it upends cherished archeological orthodoxy. “When we look at the data, we see surprises again and again and again,” says Reich.

Together with his lab’s previous work and that of other pioneers of ancient DNA, the Big Picture message is that our prehistoric ancestors were not nearly as homebound as once thought. “There was a view that migration is a very rare process in human evolution,” Reich explains. Not so, says the ancient DNA. Actually, Reich says, “the orthodoxy — the assumption that present-day people are directly descended from the people who always lived in that same area — is wrong almost everywhere.”

Instead, “the view that’s emerging — for which David is an eloquent advocate — is that human populations are moving and mixing all the time,” says John Novembre, a computational biologist at the University of Chicago.

Stonehenge’s Builders Largely Vanish

In one of the new papers, Reich and a cast of dozens of collaborators chart the spread of an ancient culture known by its stylized bell-shaped pots, the so-called Bell Beaker phenomenon. This culture first spread between Iberia and central Europe beginning about 4,700 years ago. By analyzing DNA from several hundred samples of human bones, Reich’s team shows that only the ideas — not the people who originated them — made the move initially. That’s because the genes of the Iberian population remain distinct from those of the central Europeans who adopted the characteristic pots and other artifacts.

But the story changes when the Bell Beaker culture expanded to Britain after 4,500 years ago. Then, it was brought by migrants who almost completely supplanted the island’s existing inhabitants — the mysterious people who had built Stonehenge — within a few hundred years. “There was a sudden change in the population of Britain,” says Reich. “It was an almost complete replacement.”

For archeologists, these and other findings from the study of ancient DNA are “absolutely sort of mind-blowing,” says archaeologist Barry Cunliffe, a professor emeritus at the University of Oxford. “They are going to upset people, but that is part of the excitement of it.”

Vast Migration from the Steppe

Consider the unexpected movement of people who originally lived on the steppes of Central Asia, north of the Black and Caspian seas. About 5,300 years ago, the local hunter-gatherer cultures were replaced in many places by nomadic herders, dubbed the Yamnaya, who were able to expand rapidly by exploiting horses and the new invention of the cart, and who left behind big, rich burial sites.

Archeologists have long known that some of the technologies used by the Yamnaya later spread to Europe. But the startling revelation from the ancient DNA was that the people moved, too: from the Atlantic coast of Europe in the west to Mongolia in the east and India in the south. This vast migration helps explain the spread of Indo-European languages. And it largely replaced the local hunter-gatherer genes across Europe with the indelible stamp of steppe DNA, as later happened in Britain with the migration of the Bell Beaker people to the island.

“This whole phenomenon of the steppe expansion is an amazing example of what ancient DNA can show,” says Reich. And, adds Cunliffe, “no one, not even archeologists in their wildest dreams, had expected such a high steppe genetic content in the populations of northern Europe in the third millennium B.C.”

This ancient DNA finding also explains the “strange result” of a genetic connection that had been hinted at in the genomes of modern-day Europeans and Native Americans, adds Chicago’s Novembre. The link traces back to people who lived in Siberia 24,000 years ago, whose telltale DNA is found both in Native Americans and in the Yamnaya steppe populations and their European descendants.

New Insights from Southeastern Europe

Reich’s second new Nature paper, on the genomic history of southeastern Europe, reveals an additional migration as farming spread across Europe, based on data from 255 individuals who lived between 14,000 and 2,500 years ago. It also adds a fascinating new nugget — the first compelling evidence that the genetic mixing of populations in Europe was biased toward one sex.

Hunter-gatherer genes remaining in northern Europeans after the influx of migrating farmers came more from males than females, Reich’s team found. “Archaeological evidence shows that when farmers first spread into northern Europe, they stopped at a latitude where their crops didn’t grow well,” he says. “As a result, there were persistent boundaries between the farmers and the hunter-gatherers for a couple of thousand years.” This gave the hunter-gatherers and farmers a long time to interact. According to Reich, one speculative scenario is that during this long, drawn-out interaction, there was a social or power dynamic in which farmer women tended to be integrated into hunter-gatherer communities.

So far that’s only a guess, but the fact that ancient DNA provides clues about the different social roles and fates of men and women in ancient society “is another way, I think, that these data are so extraordinary,” says Reich.
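One common way such a sex bias is read out of genetic data is to compare an ancestry component’s share on the autosomes, inherited equally through both sexes, with its share on the X chromosome, which spends two-thirds of its history in females. The sketch below is a toy version of that logic; the ancestry fractions are invented, not values from the paper.

```python
def admixture_sex_bias(x_fraction, autosomal_fraction):
    """Classify the transmission bias of one ancestry component.

    The X chromosome spends 2/3 of its history in females, so a
    component transmitted mostly through men is depleted on the X
    relative to the autosomes, and vice versa.
    """
    if x_fraction < autosomal_fraction:
        return "male-biased"
    if x_fraction > autosomal_fraction:
        return "female-biased"
    return "unbiased"

# Invented example: 30% hunter-gatherer ancestry on the autosomes but
# only 20% on the X suggests that ancestry entered mainly through men.
bias = admixture_sex_bias(x_fraction=0.20, autosomal_fraction=0.30)
```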

Advanced Machines

These scientific leaps forward have been fueled by three key developments. One is the dramatic cost reduction (and speed increase) in gene sequencing made possible by advanced machines from Illumina and other companies. The second is a discovery spearheaded by Ron Pinhasi, an archaeologist at University College Dublin. His group showed that the petrous bone, containing the tiny inner ear, harbors 100 times more DNA than other ancient human remains, offering a huge increase in the amount of genetic material available for analysis. The third is a method implemented by Reich for reading the genetic codes of 1.2 million carefully chosen variable parts of DNA (known as single nucleotide polymorphisms) rather than having to sequence entire genomes. That speeds the analysis and reduces its cost even further.
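As a toy illustration of the third development: an individual reduced to a panel of single nucleotide polymorphisms can be stored as a short vector of genotypes, and individuals compared by allele sharing. The panel below has six sites instead of 1.2 million, and every genotype is invented.

```python
# Each genotype is 0, 1, or 2 copies of the non-reference allele at a
# pre-chosen variable site; whole-genome sequencing is not required.
ancient_a = [0, 1, 2, 0, 1, 0]
ancient_b = [0, 1, 2, 1, 1, 0]
modern_c  = [2, 0, 0, 2, 0, 2]

def similarity(g1, g2):
    """Fraction of maximal allele sharing across the panel (0 to 1)."""
    shared = sum(2 - abs(a - b) for a, b in zip(g1, g2))
    return shared / (2 * len(g1))

# ancient_a and ancient_b share far more alleles with each other than
# either does with modern_c, hinting at a closer relationship.
```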

The new field made a splash when Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology, working with Reich and many other colleagues, used ancient DNA to prove that Neanderthals and humans interbred. Since then, the number of ancient humans whose DNA Reich has analyzed has risen exponentially. His lab has generated about three-quarters of the world’s published data and, including unpublished data, has now reached 3,700 genomes. “Every time we jump an order of magnitude in the number of individuals, we can answer questions that we couldn’t even have asked before,” says Reich.

Now, with hundreds of thousands of ancient skeletons (and their petrous bones) still to be analyzed, the field of ancient DNA is poised to both pin down current questions and tackle new ones. For example, Reich’s team is working with Cunliffe and others to study more than 1,000 samples from Britain to more accurately measure the replacement of the island’s existing gene pool by the steppe-related DNA from the Bell Beaker people. “The evidence we have for a 90 percent replacement is very, very suggestive, but we need to test it a bit more to see how much of the pre-Beaker population really survived,” explains Cunliffe.

Beyond that, ancient DNA offers the promise of studying not only the movements of our distant ancestors, but also the evolution of traits and susceptibilities to diseases. “This is a new scientific instrument that, like the microscope when it was invented in the seventeenth century, makes it possible to study aspects of biology that simply were not possible to examine before,” explains Reich. In one example, scientists at the University of Copenhagen found DNA from plague in the steppe populations. If the groups that migrated to Britain after 4,500 years ago brought the disease with them, that could help explain why the existing population shrank so quickly.

With the possibility of many such discoveries still ahead, “it is a very exciting time,” says Cunliffe. “Ancient DNA is going to revitalize archeology in a way that few of us could have guessed even ten years ago.”

Story Source:

Materials provided by Howard Hughes Medical Institute. Note: Content may be edited for style and length.

Journal References:

  1. Iain Mathieson et al. The genomic history of southeastern Europe. Nature, 2018; DOI: 10.1038/nature25778
  2. Iñigo Olalde et al. The Beaker phenomenon and the genomic transformation of northwest Europe. Nature, 2018; DOI: 10.1038/nature25738



Study of hominin fossils shows that brain size increased gradually and consistently, driven by evolution within populations, introduction of larger-brained species and extinction of smaller-brained ones

February 20, 2018

University of Chicago Medical Center

Modern humans have brains that are more than three times larger than our closest living relatives, chimpanzees and bonobos. Scientists don’t agree on when and how this dramatic increase took place, but new analysis of 94 hominin fossils shows that average brain size increased gradually and consistently over the past three million years.


These are models of the brain sizes of human ancestors compared to modern-day humans.
Credit: Matt Wood, UChicago




The research, published this week in the Proceedings of the Royal Society B, shows that the trend was caused primarily by evolution of larger brains within populations of individual species, but the introduction of new, larger-brained species and extinction of smaller-brained ones also played a part.

“Brain size is one of the most obvious traits that makes us human. It’s related to cultural complexity, language, tool making and all these other things that make us unique,” said Andrew Du, PhD, a postdoctoral scholar at the University of Chicago and first author of the study. “The earliest hominins had brain sizes like chimpanzees, and they have increased dramatically since then. So, it’s important to understand how we got here.”

Du began the work as a graduate student at the George Washington University (GW). His advisor, Bernard Wood, GW’s University Professor of Human Origins and senior author of the study, gave his students an open-ended assignment to understand how brain size evolved through time. Du and his fellow students, who are also co-authors on the paper, continued working on this question during his time at George Washington, forming the basis of the new study.

“Think about the entrance to a building. You can reach the front door by walking up a ramp, or you can take the steps,” Wood said. “The conventional wisdom was that our large brains had evolved because of a series of step-like increases, each one making our ancestors smarter. Not surprisingly, the reality is more complex, with no clear link between brain size and behavior.”

“The moral is this: When you don’t understand something, ask a bunch of bright and motivated students to figure it out,” he said.

Du and his colleagues compared published research data on the skull volumes of 94 fossil specimens from 13 different species, beginning with the earliest unambiguous human ancestors, Australopithecus, from 3.2 million years ago, through pre-modern species, including Homo erectus, from 500,000 years ago, when brain size began to overlap with that of modern-day humans.

The researchers saw that when the species were counted at the clade level, that is, as groups descending from a common ancestor, the average brain size increased gradually over three million years. Looking more closely, the increase was driven by three factors: primarily the evolution of larger brain sizes within the populations of individual species, but also the addition of new, larger-brained species and the extinction of smaller-brained ones. The team also found that the rate of brain size evolution within hominin lineages was much slower than rates observed in modern populations, although why this discrepancy exists is still an open question.

The study quantifies for the first time when, and by how much, each of these factors contributed to the clade-level pattern. Du likens it to how a football coach might build a roster of bigger, stronger players. One way would be to make all the players hit the weight room to bulk up. But the coach could also recruit new, larger players and cut the smallest ones.

“That’s exactly what we see going on in brain size,” he said. “The dominant process is like the players hitting the gym. They’re evolving larger brains within a population. But we also see speciation events adding larger-brained daughter species, or recruiting bigger players, and we see extinction, or cutting the smallest players too.”
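Du’s coaching analogy can be made concrete with a toy decomposition; the species labels and endocranial volumes below are invented for illustration, not taken from the fossil data.

```python
# Decompose a change in clade-level mean brain size into a
# within-species part ("players hitting the gym") and a turnover part
# ("recruiting bigger players"). Volumes are invented, in cc.
t0 = {"A": 450.0, "B": 500.0}                 # species at time 0
t1 = {"A": 480.0, "B": 530.0, "C": 700.0}     # A and B evolved; C originated

def mean(pop):
    return sum(pop.values()) / len(pop)

survivors_t1 = {s: v for s, v in t1.items() if s in t0}

total    = mean(t1) - mean(t0)            # clade-level change in the mean
within   = mean(survivors_t1) - mean(t0)  # survivors evolving larger brains
turnover = mean(t1) - mean(survivors_t1)  # effect of adding species C

# In this two-step toy, within + turnover equals total exactly; an
# extinction of a small-brained species would contribute a third term.
```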

Story Source:

Materials provided by University of Chicago Medical Center. Note: Content may be edited for style and length.

Journal Reference:

  1. Andrew Du, Andrew M. Zipkin, Kevin G. Hatala, Elizabeth Renner, Jennifer L. Baker, Serena Bianchi, Kallista H. Bernal, Bernard A. Wood. Pattern and process in hominin brain size evolution are scale-dependent. Proceedings of the Royal Society B: Biological Sciences, 2018; 285 (1873): 20172738. DOI: 10.1098/rspb.2017.2738


Source: University of Chicago Medical Center. “Brain size of human ancestors evolved gradually over 3 million years: Study of hominin fossils shows that brain size increased gradually and consistently, driven by evolution within populations, introduction of larger-brained species and extinction of smaller-brained ones.” ScienceDaily, 20 February 2018.

February 20, 2018

Ecole Polytechnique Fédérale de Lausanne

Scientists have found that the activity of proprotein convertases, the enzymes that turn on proteins, is regulated by the location of the enzyme inside the cell. The study uses a novel biosensor, CLIP, and has significant implications for cancer treatment.


Most proteins in the cell are not produced “ready to go.” Instead, they are first synthesized with chains of amino acids that block their activity until they are removed by enzymes called “proprotein convertases” (PCs). This family of enzymes plays significant but very different roles in various cancers, and regulating the activity of PCs could help develop cancer treatments. But PCs overlap in terms of activity, meaning that two or more of these enzymes can process the same protein. This overlap makes it very difficult to distinguish and map out the functional profile of each PC.

One important aspect of a PC’s life is its trafficking in secretory vesicles to the cell surface, and its internalization in endosomes that mediate recycling to the so-called trans-Golgi network, a trafficking hub where secretory vesicles and endosomes exchange protein cargo, and where PCs have been thought to perform most of their protein-activating work.

The lab of Daniel Constam at ISREC (EPFL) has now developed targeted biosensors that can image specific PCs on a subcellular level, thereby overcoming the problems of overlap. The biosensors are based on an original sensor molecule developed by Constam in 2010 called CLIP (Cell-Linked Indicator of Proteolysis), to image where a cell’s convertase enzymes are active in living cells.

With the CLIP biosensors, the researchers were able to successfully track the activity of the “prototypical” PC, furin, at subcellular resolution. Furin’s trafficking has been shown to influence its activity by regulating which proteins it can access and activate; in other words, furin activity depends on where it is enriched inside the cell.

Using a PC inhibitor, the researchers found that inside endosomes, furin is ten times less inhibited, yet enriched more than threefold, compared to the trans-Golgi network. Another PC that resists this inhibitor (PC7) was active in distinct vesicles; it reached the trans-Golgi network, the endosomes and the cell surface only when it was overexpressed.

“We are particularly interested in one of the substrates, Activin-A, because of its newly discovered immunosuppressive role in melanoma,” says Daniel Constam. “We now found that different ‘rooms’ cleave it in a stepwise fashion, leaving a mark of where it has been, presumably to regulate where it should go next and with whom to interact.”

The researchers then turned their attention to the amino acid sequence of the PCs they were studying. They found that a “PLC motif” (a sequence of three amino acids, Proline, Leucine, and Cysteine) in the cytosolic tail of PC7 was specifically required for its recycling in the trans-Golgi network and for rescuing proActivin-A cleavage in furin-depleted melanoma cells, but was relatively dispensable for PC7’s activity inside endosomes.

“Our study provides a proof of principle that compartment-specific biosensors can be used to gain insight into the regulation of PC trafficking and to map the tropism of PC-specific inhibitors,” says the study’s first author, Pierpaolo Ginefra. “It’s like a game of Cluedo: we want to know in which of the room(s) in a cell a given substrate is cleaved and by which PC.”

Story Source:

Materials provided by Ecole Polytechnique Fédérale de Lausanne. Note: Content may be edited for style and length.

Journal Reference:

  1. Pierpaolo Ginefra, Bruno G.H. Filippi, Prudence Donovan, Sylvain Bessonnard, Daniel B. Constam. Compartment-Specific Biosensors Reveal a Complementary Subcellular Distribution of Bioactive Furin and PC7. Cell Reports, 2018; 22 (8): 2176. DOI: 10.1016/j.celrep.2018.02.005


Source: Ecole Polytechnique Fédérale de Lausanne. “Enzyme location controls enzyme activity.” ScienceDaily, 20 February 2018.

ACRP Webinar: Regulatory Concerns When Running Paperless Clinical Trials

On March 7, 2018, Jonathan Helfgott (formerly at FDA) and Jules Mitchel (President of Target Health Inc.) will be presenting an ACRP Webinar entitled “Regulatory Concerns When Running Paperless Clinical Trials.” Here is a summary:


The FDA and other regulatory bodies have issued multiple guidance documents addressing requirements for using technology tools to execute paperless clinical trials. The push from the regulatory agencies has occurred despite a risk-averse pharmaceutical industry still conflicted about living in a “paper world.” This risk aversion stems in part from an irrational fear among sites and sponsors of receiving an FDA Form 483 when, for example, an FDA inspector discovers that a patient was born in 1982 but the study database records 1983, even though the discrepancy does not matter. Clinical sites also fear losing business as a result of any FDA Form 483 finding, however minor.


The following are examples of regulated clinical trial software: 1) Data capture (EDC) systems; 2) Electronic informed consent; 3) Electronic trial master file (eTMF); 4) ePRO and eCOA; 5) Dedicated tablets collecting and storing data in real time; 6) Mobile apps; and 7) Web-based systems collecting data in real time, which transmit the data to the study database only after the source record is securely received and stored in an independent eSource storage location under the control of the study site.


Upon completion of this Webinar, attendees should be able to:


1. Understand the future landscape of paperless clinical trials

2. Understand regulatory concerns when running paperless clinical trials

3. Know how to “separate toys from tools” when choosing mobile and related devices


For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 165). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.


Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor



Filed Under News | Leave a Comment

Pancreatic Islet Transplantation

The process of clinical islet transplantation for the treatment of diabetes mellitus

Graphic credit: Giovanni Maki – Naftanel MA, Harlan DM (2004) Pancreatic Islet Transplantation. PLoS Med 1(3): e58 (image link), CC BY 2.5,


Islet transplantation is the transplantation of isolated islets from a donor pancreas into another person. It is an experimental treatment for type 1 diabetes mellitus. Once transplanted, the islets begin to produce insulin, actively regulating the level of glucose in the 1) ___. Islets are usually infused into the patient’s liver. If the cells are not from a genetically identical donor, the patient’s body will recognize them as foreign, and the immune system will begin to attack them, as with any transplant rejection. To prevent this, immunosuppressant 2) ___ are used. Recent studies have shown that islet transplantation has progressed to the point that 58% of the patients in one study were insulin independent one year after the operation.


The concept of islet transplantation is not new. Investigators as early as the English surgeon Charles Pybus (1882-1975) attempted to graft pancreatic tissue to cure diabetes. Most, however, credit the modern era of islet transplantation research to Paul Lacy’s studies dating back more than three decades. In 1967, Lacy’s group described a novel collagenase-based method to isolate islets, paving the way for future in vitro and in vivo islet experiments. Subsequent studies showed that transplanted islets could reverse diabetes in both rodents and non-human primates. In a summary of the 1977 Workshop on Pancreatic Islet Cell Transplantation in Diabetes, Lacy commented on the feasibility of “islet cell transplantation as a therapeutic approach [for] the possible prevention of the complications of diabetes in man.” Improvements in isolation techniques and immunosuppressive regimens ushered in the first human islet transplantation clinical trials in the mid-1980s.


The first successful trial of human islet allotransplantation resulting in long-term reversal of 3) ___ was performed at the University of Pittsburgh in 1990. Yet despite continued procedural improvements, only about 10% of islet recipients in the late 1990s achieved euglycemia (normal blood glucose). In 2000, Dr. James Shapiro and colleagues published a report describing seven consecutive patients who achieved euglycemia following islet transplantation using a steroid-free protocol and large numbers of donor islets, since referred to as the Edmonton protocol. This protocol has been adopted by islet transplant centers around the world and has greatly increased islet transplant success.

The goal of islet transplantation is to infuse enough islets to control the blood glucose level, removing the need for 4) ___ injections. For an average-size person (70 kg), a typical transplant requires about one million islets, isolated from two donor pancreases. Because good control of blood glucose can slow or prevent the progression of complications associated with diabetes, such as nerve or eye damage, a successful transplant may reduce the risk of these complications. But a transplant recipient will need to take immunosuppressive drugs that stop the immune system from rejecting the transplanted islets.

A mixture of highly purified enzymes (collagenase) is used to isolate islets from the pancreas of a deceased donor. The collagenase solution is injected into the pancreatic duct, which runs through the head, body and tail of the pancreas. Delivered this way, the enzyme solution causes distension of the pancreas, which is subsequently cut into small chunks and transferred into the so-called Ricordi chamber, where digestion takes place until the islets are liberated and removed from the solution. The isolated islets are then separated from the exocrine tissue and debris in a process called purification.
During the transplant, a radiologist uses ultrasound and radiography to guide placement of a catheter through the upper abdomen and into the portal vein of the liver. The islets are then infused through the 5) ___ into the liver. The patient will receive a local anesthetic. If a patient cannot tolerate local anesthesia, the surgeon may use general anesthesia and do the transplant through a small incision. Possible risks of the procedure include bleeding or blood clots.
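The dose arithmetic quoted above can be written out explicitly. The per-donor yield below is an assumed round number for illustration, not a clinical figure.

```python
# Rough islet-dose arithmetic from the figures quoted in the text:
# about one million islets for a 70 kg recipient.
islets_total   = 1_000_000
body_weight_kg = 70
islets_per_kg  = islets_total / body_weight_kg   # roughly 14,000 islets/kg

# Assumed average usable yield from one deceased-donor pancreas; with
# this figure the quoted dose requires islets pooled from two donors.
yield_per_donor = 500_000
donors_needed   = -(-islets_total // yield_per_donor)  # ceiling division
```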


In 2000, the Edmonton protocol used a combination of immunosuppressive drugs, including daclizumab (Zenapax), sirolimus (Rapamune) and tacrolimus (Prograf). Daclizumab is given intravenously right after the transplant and then discontinued. Sirolimus and tacrolimus, the two main drugs that keep the immune system from destroying the transplanted islets, must be taken for life.

While significant progress has been made in the islet transplantation field, many obstacles remain that currently preclude its widespread application. Two of the most important limitations are the currently inadequate means for preventing islet rejection, and the limited supply of islets for transplantation. Current immunosuppressive regimens are capable of preventing islet failure for months to years, but the agents used in these treatments are expensive and may increase the risk for specific malignancies and opportunistic infections. Perhaps of greatest concern to the patient and physician is the harmful effect of certain widely employed immunosuppressive agents on renal function. For the patient with diabetes, renal function is a crucial factor in determining long-term outcome, and calcineurin inhibitors (tacrolimus and ciclosporin) are significantly nephrotoxic. Thus, while some patients with a pancreas transplant tolerate the immunosuppressive agents well, and for such patients diabetic nephropathy can gradually improve, in other patients the net effect (decreased risk due to the improved blood glucose control, increased risk from the immunosuppressive agents) may worsen kidney function. Indeed, Ojo et al. have published an analysis indicating that among patients receiving other-than-kidney allografts, 7%-21% end up with renal 6) ___ as a result of the transplant and/or subsequent immunosuppression.
Seen another way, patients with heart, liver, lung, or kidney failure have a dismal prognosis for survival, so the toxicity associated with immunosuppression is warranted (the benefits of graft survival outweigh the risks associated with the medications). But for the subset of patients with diabetes and preserved kidney function, even those with long-standing and difficult-to-control disease, the prognosis for survival is comparatively much better.


Like all transplantation therapies, islet transplantation is also handicapped by the limited donor pool. The numbers are striking: at least 1 million Americans have type 1 diabetes mellitus, and only a few thousand donors are available each year. To circumvent this organ shortage problem, researchers continue to look for ways to “grow” islets – or at least cells capable of physiologically regulated insulin secretion – in vitro, but currently only islets from cadaveric donors can be used to restore euglycemia. Further exacerbating the problem (and unlike kidney, liver, and heart transplants, where only one donor is needed for each recipient), most islet transplant patients require islets from two or more donors to achieve euglycemia. Lastly, the current methods for islet isolation need improvement, since only about half of attempted isolations produce transplant-ready islets.


While islet transplantation research has made important progress and the success stories are encouraging, the long-term safety and efficacy of the procedure remain unclear. Other concerns relating to the field include questions about the impact of having insulin-producing foreign cells within the hepatic parenchyma, the long-term consequences of elevated portal pressures resulting from the islet infusion, and the fact that islet recipients can be sensitized against donor tissue types, making it more difficult to find a suitable donor should another life-saving transplant be required in the future. Also, very few islet transplant recipients have remained euglycemic without the use of any exogenous insulin beyond four years post-transplant. Thus, while most islet recipients achieve better glycemic control and suffer less serious hypoglycemia, islet transplantation continues to fall short of a definitive diabetes cure.


Pancreatic islet 7) ___ has been reappraised based on accumulated clinical evidence. Although initially expected to therapeutically target long-term insulin independence, islet transplantation is now indicated for more specific clinical benefits. With the long-awaited report of the first phase 3 clinical trial in 2016, allogeneic islet transplantation is now transitioning from an experimental to a proven therapy for type 1 diabetes with problematic hypoglycemia. Islet autotransplantation (IAT) has already been therapeutically proven in chronic pancreatitis with severe abdominal pain refractory to conventional treatments, and it holds promise for preventing diabetes after partial pancreatectomy due to benign pancreatic tumors. Based on current evidence, this review focuses on islet transplantation as a realistic approach to treating diabetes.
Recently, the French-Swiss GRAGIL Network successfully reproduced the long-term outcome achieved with the Edmonton protocol in terms of the graft survival rate (~80%), with a 58% rate of HbA1c levels at < 7% and no severe hypoglycemia 5 years after islet transplantation. Several cases of partial islet graft function have also been reported in Korea (ROK). Studies revealed that the islet yield and islet function in this clinical setting were superior to those of allogeneic islet transplantation, in which islets are isolated from brain-dead donors. Additionally, it was shown that transplanted islets can promote the regeneration of endogenous beta-cells in experimental models of IAT after partial pancreatectomy. In summary, IAT after partial pancreatectomy for benign tumors could be a promising indication for clinical islet transplantation. In this setting, IAT may improve the metabolic milieu after pancreatic resection and offers a unique opportunity to understand the biological effects of intraportal islet transplantation beyond the simple replacement of islet cell mass. Recent results from international cohort studies and the phase 3 clinical trial of allogeneic islet transplantation prompt reappraisal of this method as an important component of the stepwise approach to the treatment of problematic hypoglycemia. The 5-year insulin-independence rate of islet transplantation patients has also improved at some experienced centers. IAT has already been proven to be an effective therapy for intractable pain due to advanced chronic pancreatitis. Partial pancreatectomy for the treatment of benign pancreatic 8) ___ could be another indication for the use of IAT in the near future.


Successful islet 9) ___ transplantation can provide the following benefits:


1. Restore or improve the body’s ability to regulate blood sugar levels. The need for frequent blood sugar measurements and daily insulin injections can be reduced and, in a minority of patients, eliminated for up to three years after transplantation. Although freedom from insulin injections may last only several months to a year, islet cell transplantation reduces episodes of low blood sugar for a longer time.

2. Improve the quality of life.

3. Reduce the progression of long-term complications of diabetes, including heart disease, kidney disease, stroke, and nerve and eye damage.


Because it is still considered an experimental therapy, islet cell transplantation for diabetes is not widely available. There are currently 17 U.S. centers participating in islet cell research programs. The American Diabetes Association recommends that pancreas or islet cell transplantation be performed only in certain major centers, which are best equipped to handle the complex and long-term medical and personal needs of transplant patients.


Collecting enough islet cells to do the transplant: Obtaining enough islet cells for transplantation is a major challenge. In most cases, islet cells from several different donors are needed. Because the need surpasses the number of human donors available, researchers are studying the use of cells from other sources, including fetal tissue and animals such as pigs. Researchers are also attempting to grow human islet cells in the laboratory.


Preventing rejection: Researchers are continuously seeking to develop new and better anti-rejection drugs. Many advances have been made in anti-rejection 10) ___ over the past 15 years. Newer drugs — such as tacrolimus (FK506) and rapamycin — have fewer and less harmful side effects than some older drugs like cyclosporine and prednisone. Researchers are also working to develop methods of transplanting islet cells that will reduce or eliminate the risk of rejection and the need for immunosuppression. One approach involves coating the islet cells with a special gel that prevents the immune system from recognizing and targeting the donor cells.

Sources: Korean Health Technology R&D Project, Ministry of Health and Welfare, Republic of Korea (HI13C0954); Sang-Man Jin and Kwang-Won Kim; …/type-1-diabetes-islet-transplantation-gains-momentum.



ANSWERS: 1) blood; 2) drugs; 3) diabetes; 4) insulin; 5) catheter; 6) failure; 7) transplantation; 8) tumors; 9) cell; 10) drugs





The discovery of clustered DNA repeats began independently in three parts of the world. One of the first discoveries was in 1987 at Osaka University in Japan, where researcher Yoshizumi Ishino and colleagues published the sequence of an E. coli gene called “iap.” In the course of cloning the iap gene, their target of interest, Ishino accidentally cloned part of what would later be called CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats). The organization of the repeats was unusual because repeated sequences are typically arranged consecutively along DNA, and the function of the interrupted clustered repeats was not known at the time. Technological advances in the 1990s allowed researchers to continue this work and speed up their sequencing with a technique called metagenomics, in which DNA is sequenced directly from collected seawater or soil samples.


Ishino received his BS, MS and PhD degrees in 1981, 1983 and 1986, respectively, from Osaka University. From 1987 to 1989, he was a post-doctoral fellow at Yale University (Dieter Soll’s laboratory). In 2002, he became a professor at Kyushu University. Since October 2013, he has also been a member of the NASA Astrobiology Institute, University of Illinois at Urbana-Champaign.


In 1993 researchers of Mycobacterium tuberculosis in the Netherlands published two articles about a cluster of interrupted direct repeats (DR) in this bacterium. These researchers recognized the diversity of the DR-intervening sequences among different strains of M. tuberculosis and used this property to design a typing method that was named spoligotyping, which is still in use today. At the same time, repeats were observed in the archaeal organisms of Haloferax and Haloarcula species, and their function was studied by Francisco Mojica at the University of Alicante in Spain. Mojica surmised at the time that the clustered repeats had a role in correctly segregating replicated DNA into daughter cells during cell division, because plasmids and chromosomes with identical repeat arrays could not coexist in Haloferax volcanii; his hypothesis later turned out to be wrong. Transcription of the interrupted repeats was also noted for the first time. In 2017 Mojica was a winner of the Albany Medical Center Prize.


The three articles below, including “A Conversation with George Church,” are well written and informative regarding this new and exciting technology.


Eye Imaging Could Provide “Window to the Brain” After Stroke

According to an article published in Neurology (7 February 2018), research into curious bright spots in the eyes on stroke patients’ brain images could one day alter the way these individuals are assessed and treated. A team of scientists at the NIH found that a chemical routinely given to stroke patients undergoing brain scans can leak into their eyes, highlighting those areas and potentially providing insight into their strokes. The eyes glowed brightly on those images due to gadolinium, a harmless, transparent chemical often given to patients during magnetic resonance imaging (MRI) scans to highlight abnormalities in the brain. In healthy individuals, gadolinium remains in the bloodstream and is filtered out by the kidneys. However, when someone has experienced damage to the blood-brain barrier, which controls whether substances in the blood can enter the brain, gadolinium leaks into the brain, creating bright spots that mark the location of brain damage.


Previous research had shown that certain eye diseases could cause a similar disruption to the blood-ocular barrier, which does for the eye what the blood-brain barrier does for the brain. The authors discovered that a stroke can also compromise the blood-ocular barrier and that the gadolinium that leaked into a patient’s eyes could provide information about his or her stroke.


The authors performed MRI scans on 167 stroke patients upon admission to the hospital without administering gadolinium and compared them to scans taken using gadolinium two hours and 24 hours later. Because gadolinium is transparent, it did not affect patients’ vision and could only be detected with MRI scans. Roughly three-quarters of the patients experienced gadolinium leakage into their eyes on one of the scans, with 66% showing it on the two-hour scan and 75% on the 24-hour scan. The phenomenon was present in both untreated patients and patients who received a treatment, called tPA, to dissolve the blood clot responsible for their strokes. Gadolinium was typically present in the front part of the eye, called the aqueous chamber, after two hours, and in a region towards the back, called the vitreous chamber, after 24 hours. Patients showing gadolinium in the vitreous chamber at the later timepoint tended to be older, to have a history of hypertension, and to have more bright spots on their brain scans, called white matter hyperintensities, which are associated with brain aging and decreased cognitive function.


In a minority of patients, the two-hour scan showed gadolinium in both eye chambers. The strokes in those patients tended to affect a larger portion of the brain and cause even more damage to the blood-brain barrier than the strokes of patients with a slower pattern of gadolinium leakage or no leakage at all. The findings raise the possibility that, in the future, clinicians could administer a substance to patients that would collect in the eye just like gadolinium and quickly yield important information about their strokes without the need for an MRI.


According to the authors, it is much easier for us to look inside somebody’s eye than to look into somebody’s brain, so if the eye truly is a window to the brain, we can use one to learn about the other. Despite the relationship between gadolinium leakage and stroke severity, the phenomenon was not found to be related to the level of disability the patients developed following their strokes. It also remains unclear whether gadolinium can enter the eye in healthy people.


Ebola Virus Infects Reproductive Organs in Monkeys

According to an article published in the American Journal of Pathology (8 February 2018), Ebola virus can infect the reproductive organs of male and female macaques, suggesting that humans could be similarly infected. Prior studies of survivors of the 2014-2016 Ebola outbreak in West Africa have revealed that viral RNA (Ebola virus genetic material) can persist in male and female human reproductive tracts following recovery. While little is known about viral persistence in female reproductive tissues, pregnant women with Ebola virus disease have a maternal death rate of more than 80% and a fetal death rate of nearly 100%.

For the study, four female and eight male macaques were infected with the Makona variant of Ebola virus, the variant responsible for the recent West Africa outbreak. All the macaques succumbed to Ebola virus disease and were euthanized six to nine days after infection. The authors then took reproductive tissue samples from each macaque and analyzed the samples for signs of Ebola virus infection, organ and tissue damage, and immune responses. They found widespread Ebola virus infection of reproductive organs with minimal tissue immune response or signs of disease. Based on the findings, the authors hypothesize that Ebola virus can persist in these tissues in human survivors, and that the virus may reach seminal fluid in men by infecting immune cells called tissue macrophages. However, it is unclear if the detection of Ebola virus RNA in semen documented in human studies means that infectious virus is present. The authors noted that additional research is needed to learn how Ebola virus persists in these sites and to determine whether drugs and vaccines can cure or prevent such infections. To do this, NIAID scientists at NIH are developing a new nonhuman primate model of Ebola virus disease in which monkeys survive infection. Few macaques survive in the current model, making it difficult to study virus persistence and its long-term impacts.

FDA Expands Approval of Imfinzi to Reduce the Risk of NSCLC Progression


Lung cancer is the leading cause of cancer death in the United States, with an estimated 222,500 new diagnoses and 155,870 deaths in 2017, according to the National Cancer Institute at the National Institutes of Health. The most common type of lung cancer, non-small cell lung cancer (NSCLC), occurs when cancer cells form in the tissues of the lung. Stage III NSCLC means tumors have spread to nearby lymph nodes or into other parts of the body near the lungs.


The FDA has approved Imfinzi (durvalumab) for the treatment of patients with stage III NSCLC whose tumors are not able to be surgically removed (unresectable) and whose cancer has not progressed after treatment with chemotherapy and radiation (chemoradiation). According to FDA, this is the first treatment approved for stage III unresectable NSCLC to reduce the risk of the cancer progressing, when the cancer has not worsened after chemoradiation. For patients with stage III lung cancer that cannot be removed surgically, the current approach to prevent progression is chemoradiation. Although a small number of patients may be cured with chemoradiation, the cancer may eventually progress. Patients now have an approved therapy that has been shown to keep the cancer from progressing for a longer time after chemoradiation.


Imfinzi targets the PD-1/PD-L1 pathway (proteins found on the body’s immune cells and some cancer cells). By blocking these interactions, Imfinzi may help the body’s immune system attack cancer cells. Imfinzi was previously granted accelerated approval in 2017 for the treatment of certain patients with locally advanced or metastatic bladder cancer. The approval of Imfinzi for the treatment of stage III, unresectable NSCLC was based on a randomized trial of 713 patients whose cancer had not progressed after completing chemotherapy and radiation. The trial measured the length of time the tumors did not have significant growth after starting treatment with Imfinzi or a placebo (progression-free survival). The median progression-free survival for patients taking Imfinzi was 16.8 months compared to 5.6 months for patients receiving a placebo. In addition, the sponsor has agreed to a post-marketing commitment to provide additional information from their study to the FDA about how long patients lived following treatment with Imfinzi after chemotherapy and radiation (overall survival).


Common side effects of Imfinzi in patients with stage III unresectable NSCLC include cough, fatigue, inflammation in the lungs (pneumonitis/radiation pneumonitis), upper respiratory tract infections, difficulty breathing and rash. Serious risks of Imfinzi include immune-mediated side effects, where the body’s immune system attacks healthy cells or organs, such as the lungs (pneumonitis), liver (hepatitis), colon (colitis), hormone-producing glands (endocrinopathies) and kidneys (nephritis). Other serious side effects of Imfinzi include infection and infusion-related reactions. Imfinzi can cause harm to a developing fetus; women should be advised of the potential risk to the fetus and to use effective contraception.


The FDA granted this application Priority Review and Breakthrough Therapy designations. Imfinzi is marketed by AstraZeneca.

