Ultra-low-power sensors carrying genetically engineered bacteria can detect gastric bleeding

Date:
May 24, 2018

Source:
Massachusetts Institute of Technology

Summary:
Researchers have built an ingestible sensor equipped with genetically engineered bacteria that can diagnose bleeding in the stomach or other gastrointestinal problems.

 

MIT engineers have designed an ingestible sensor equipped with bacteria programmed to sense environmental conditions and relay the information to an electronic circuit.
Credit: Lillie Paquette/MIT

 

 

MIT researchers have built an ingestible sensor equipped with genetically engineered bacteria that can diagnose bleeding in the stomach or other gastrointestinal problems.

This “bacteria-on-a-chip” approach combines sensors made from living cells with ultra-low-power electronics that convert the bacterial response into a wireless signal that can be read by a smartphone.

“By combining engineered biological sensors together with low-power wireless electronics, we can detect biological signals in the body and in near real-time, enabling new diagnostic capabilities for human health applications,” says Timothy Lu, an MIT associate professor of electrical engineering and computer science and of biological engineering.

In the new study, appearing in the May 24 online edition of Science, the researchers created sensors that respond to heme, a component of blood, and showed that they work in pigs. They also designed sensors that can respond to a molecule that is a marker of inflammation.

Lu and Anantha Chandrakasan, dean of MIT’s School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, are the senior authors of the study. The lead authors are graduate student Mark Mimee and former MIT postdoc Phillip Nadeau.

Wireless communication

In the past decade, synthetic biologists have made great strides in engineering bacteria to respond to stimuli such as environmental pollutants or markers of disease. These bacteria can be designed to produce outputs such as light when they detect the target stimulus, but specialized lab equipment is usually required to measure this response.

To make these bacteria more useful for real-world applications, the MIT team decided to combine them with an electronic chip that could translate the bacterial response into a wireless signal.

“Our idea was to package bacterial cells inside a device,” Nadeau says. “The cells would be trapped and go along for the ride as the device passes through the stomach.”

For their initial demonstration, the researchers focused on bleeding in the GI tract. They engineered a probiotic strain of E. coli to express a genetic circuit that causes the bacteria to emit light when they encounter heme.

They placed the bacteria into four wells on their custom-designed sensor, covered by a semipermeable membrane that allows small molecules from the surrounding environment to diffuse through. Underneath each well is a phototransistor that can measure the amount of light produced by the bacterial cells and relay the information to a microprocessor that sends a wireless signal to a nearby computer or smartphone. The researchers also built an Android app that can be used to analyze the data.
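As a rough sketch of the decision logic at the end of that chain (purely illustrative; the team's actual firmware and app are not described in this article, and every name and threshold below is hypothetical), the app only needs to compare each well's light reading against its pre-ingestion baseline:

```python
import statistics

def detect_heme(baseline_counts, current_counts, threshold_ratio=2.0):
    """Flag one sensor well as positive if its luminescence rises well above baseline.

    baseline_counts: pre-ingestion phototransistor readings for the well
    current_counts:  recent readings relayed wirelessly from the capsule
    threshold_ratio: illustrative cutoff; a real device would be calibrated
    """
    baseline = statistics.mean(baseline_counts)
    signal = statistics.mean(current_counts)
    return signal > threshold_ratio * baseline

# Example: a well brightens ~3x after the engineered bacteria encounter heme.
print(detect_heme([10, 12, 11], [31, 35, 33]))  # True -> possible bleeding
```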

The sensor, which is a cylinder about 1.5 inches long, requires about 13 microwatts of power. The researchers equipped the sensor with a 2.7-volt battery, which they estimate could power the device for about 1.5 months of continuous use. They say it could also be powered by a voltaic cell sustained by acidic fluids in the stomach, using technology that Nadeau and Chandrakasan have previously developed.
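Those numbers imply that only a tiny battery is needed; as a back-of-the-envelope check (not a calculation from the paper):

```python
# Reported figures: ~13 microwatts at 2.7 volts, ~1.5 months of continuous use.
power_w = 13e-6
voltage_v = 2.7
hours = 1.5 * 30 * 24          # ~1080 hours

current_a = power_w / voltage_v             # ~4.8 microamps of average draw
capacity_mah = current_a * hours * 1000     # ~5.2 milliamp-hours

print(f"average draw: {current_a * 1e6:.1f} uA")
print(f"battery capacity needed: {capacity_mah:.1f} mAh")
```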

“The focus of this work is on system design and integration to combine the power of bacterial sensing with ultra-low-power circuits to realize important health sensing applications,” Chandrakasan says.

Diagnosing disease

The researchers tested the ingestible sensor in pigs and showed that it could correctly determine whether any blood was present in the stomach. They anticipate that this type of sensor could be either deployed for one-time use or designed to remain in the digestive tract for several days or weeks, sending continuous signals.

Currently, if patients are suspected to be bleeding from a gastric ulcer, they have to undergo an endoscopy to diagnose the problem, which often requires the patient to be sedated.

“The goal with this sensor is that you would be able to circumvent an unnecessary procedure by just ingesting the capsule, and within a relatively short period of time you would know whether or not there was a bleeding event,” Mimee says.

To help move the technology toward patient use, the researchers plan to reduce the size of the sensor and to study how long the bacteria cells can survive in the digestive tract. They also hope to develop sensors for gastrointestinal conditions other than bleeding.

In the Science paper, the researchers adapted previously described sensors for two other molecules, which they have not yet tested in animals. One of the sensors detects a sulfur-containing ion called thiosulfate, which is linked to inflammation and could be used to monitor patients with Crohn’s disease or other inflammatory conditions. The other detects a bacterial signaling molecule called AHL, which can serve as a marker for gastrointestinal infections because different types of bacteria produce slightly different versions of the molecule.

“Most of the work we did in the paper was related to blood, but conceivably you could engineer bacteria to sense anything and produce light in response to that,” Mimee says. “Anyone who is trying to engineer bacteria to sense a molecule related to disease could slot it into one of those wells, and it would be ready to go.”

The researchers say the sensors could also be designed to carry multiple strains of bacteria, allowing them to diagnose a variety of conditions.

“Right now, we have four detection sites, but if you could extend it to 16 or 256, then you could have multiple different types of cells and be able to read them all out in parallel, enabling more high-throughput screening,” Nadeau says.

Story Source:

Materials provided by Massachusetts Institute of Technology. Note: Content may be edited for style and length.


Journal Reference:

  1. Mark Mimee, Phillip Nadeau, Alison Hayward, Sean Carim, Sarah Flanagan, Logan Jerger, Joy Collins, Shane McDonnell, Richard Swartwout, Robert J. Citorik, Vladimir Bulović, Robert Langer, Giovanni Traverso, Anantha P. Chandrakasan, Timothy K. Lu. An ingestible bacterial-electronic system to monitor gastrointestinal health. Science, 2018; 360 (6391): 915. DOI: 10.1126/science.aas9315

 

Source: Massachusetts Institute of Technology. “Ingestible ‘bacteria on a chip’ could help diagnose disease: Ultra-low-power sensors carrying genetically engineered bacteria can detect gastric bleeding.” ScienceDaily. ScienceDaily, 24 May 2018. <www.sciencedaily.com/releases/2018/05/180524141712.htm>.

Filed Under News

Date:
May 23, 2018

Source:
DOE/Oak Ridge National Laboratory

Summary:
Scientists have now simulated an atomic nucleus using a quantum computer. The results demonstrate the ability of quantum systems to compute nuclear physics problems and serve as a benchmark for future calculations.

 

An image of a deuteron, the bound state of a proton and a neutron.
Credit: Andrew Sproles, Oak Ridge National Laboratory

 

 

Scientists at the Department of Energy’s Oak Ridge National Laboratory are the first to successfully simulate an atomic nucleus using a quantum computer. The results, published in Physical Review Letters, demonstrate the ability of quantum systems to compute nuclear physics problems and serve as a benchmark for future calculations.

Quantum computing, in which computations are carried out based on the quantum principles of matter, was proposed by American theoretical physicist Richard Feynman in the early 1980s. Unlike ordinary computer bits, the qubits used by quantum computers store information in two-state systems, such as electrons or photons, that can exist in a combination of both states at once (a phenomenon known as superposition).

“In classical computing, you write in bits of zero and one,” said Thomas Papenbrock, a theoretical nuclear physicist at the University of Tennessee and ORNL who co-led the project with ORNL quantum information specialist Pavel Lougovski. “But with a qubit, you can have zero, one, and any possible combination of zero and one, so you gain a vast set of possibilities to store data.”
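In standard notation (general quantum computing background, not specific to this study), that quote reads:

```latex
\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle ,
\qquad |\alpha|^2 + |\beta|^2 = 1 ,
\]
```

so a register of n qubits is described by 2^n complex amplitudes, the "vast set of possibilities" Papenbrock describes.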

In October 2017 the multidivisional ORNL team started developing codes to perform simulations on the IBM QX5 and the Rigetti 19Q quantum computers through DOE’s Quantum Testbed Pathfinder project, an effort to verify and validate scientific applications on different quantum hardware types. Using freely available pyQuil software, a library designed for producing programs in the quantum instruction language, the researchers wrote a code that was sent first to a simulator and then to the cloud-based IBM QX5 and Rigetti 19Q systems.
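For a feel of the workflow (a toy example written against the pyQuil 2.x API of that era, not the team's actual code), a two-qubit program runs on the local simulator with a few lines:

```python
from pyquil import Program, get_qc
from pyquil.gates import H, CNOT

# Toy circuit: put qubit 0 into superposition, then entangle it with qubit 1.
program = Program(H(0), CNOT(0, 1))

# "2q-qvm" targets the local quantum virtual machine (simulator); swapping in
# a real-device lattice name sends the same program to cloud hardware instead.
qc = get_qc("2q-qvm")
results = qc.run_and_measure(program, trials=1000)

# The two qubits should agree on nearly every shot: ~50% 00 and ~50% 11.
print((results[0] == results[1]).mean())
```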

The team performed more than 700,000 quantum computing measurements of the energy of a deuteron, the nuclear bound state of a proton and a neutron. From these measurements, the team extracted the deuteron’s binding energy — the minimum amount of energy needed to disassemble it into these subatomic particles. The deuteron is the simplest composite atomic nucleus, making it an ideal candidate for the project.

“Qubits are generic versions of quantum two-state systems. They have no properties of a neutron or a proton to start with,” Lougovski said. “We can map these properties to qubits and then use them to simulate specific phenomena — in this case, binding energy.”
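The calculation itself is variational: prepare a trial state with a tunable angle, measure the energy, and let a classical optimizer drive the angle toward the minimum. Below is a minimal classical emulation of that loop, with illustrative coefficients of the same form as the deuteron Hamiltonian, not the values from the paper:

```python
import numpy as np

# Pauli matrices.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0])

# Two-qubit Hamiltonian with identity, Z, and (XX + YY) hopping terms;
# the coefficients here are placeholders, not the paper's.
H = (1.0 * np.kron(I, I) + 0.5 * np.kron(Z, I) - 2.0 * np.kron(I, Z)
     - 1.5 * (np.kron(X, X) + np.kron(Y, Y)))

def trial_state(theta):
    """One-parameter ansatz mixing the |01> and |10> basis states."""
    psi = np.zeros(4, dtype=complex)
    psi[1] = np.cos(theta / 2)   # amplitude on |01>
    psi[2] = np.sin(theta / 2)   # amplitude on |10>
    return psi

thetas = np.linspace(0, 2 * np.pi, 400)
energies = [np.real(np.vdot(trial_state(t), H @ trial_state(t))) for t in thetas]

print(f"variational minimum: {min(energies):.4f}")
print(f"exact ground state:  {np.linalg.eigvalsh(H)[0]:.4f}")  # should agree
```

On hardware, the energy at each angle comes from repeated measurements rather than exact linear algebra, which is why the team needed hundreds of thousands of shots.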

A challenge of working with these quantum systems is that scientists must run simulations remotely and then wait for results. ORNL computer science researcher Alex McCaskey and ORNL quantum information research scientist Eugene Dumitrescu ran single measurements 8,000 times each to ensure the statistical accuracy of their results.
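The repetition count is a shot-noise budget: the standard error of a probability estimated from N repeated measurements falls as one over the square root of N, so (a generic estimate, not a figure from the paper):

```latex
\[
\sigma_{\hat p} = \sqrt{\frac{p(1-p)}{N}} \;\le\; \frac{1}{2\sqrt{N}}
= \frac{1}{2\sqrt{8000}} \approx 0.006 .
\]
```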

“It’s really difficult to do this over the internet,” McCaskey said. “This algorithm has been done primarily by the hardware vendors themselves, and they can actually touch the machine. They are turning the knobs.”

The team also found that quantum devices become tricky to work with due to inherent noise on the chip, which can alter results drastically. McCaskey and Dumitrescu successfully employed strategies to mitigate high error rates, such as artificially adding more noise to the simulation to see its impact and deduce what the results would be with zero noise.
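That strategy is a form of zero-noise extrapolation: measure the observable at deliberately amplified noise levels and extrapolate the trend back to zero noise. A minimal sketch with made-up readings (the team's actual protocol and fit may differ):

```python
import numpy as np

# Hypothetical energies measured at increasing noise amplification factors r
# (r = 1 is the bare circuit; r > 1 means noise was added deliberately).
noise_scale = np.array([1.0, 2.0, 3.0])
energy_mev = np.array([-1.95, -1.70, -1.45])   # made-up values

# Fit E(r) = a*r + b; the intercept b is the estimated zero-noise energy.
a, b = np.polyfit(noise_scale, energy_mev, deg=1)
print(f"zero-noise estimate: {b:.2f} MeV")   # -> -2.20 MeV
```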

“These systems are really susceptible to noise,” said Gustav Jansen, a computational scientist in the Scientific Computing Group at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL. “If particles are coming in and hitting the quantum computer, it can really skew your measurements. These systems aren’t perfect, but in working with them, we can gain a better understanding of the intrinsic errors.”

At the completion of the project, the team’s results on two and three qubits were within 2 and 3 percent, respectively, of the correct answer on a classical computer, and the quantum computation became the first of its kind in the nuclear physics community.

The proof-of-principle simulation paves the way for computing much heavier nuclei with many more protons and neutrons on quantum systems in the future. Quantum computers have potential applications in cryptography, artificial intelligence, and weather forecasting because each additional qubit becomes entangled — or tied inextricably — to the others, exponentially increasing the number of possible outcomes for the measured state at the end. This very benefit, however, also has adverse effects on the system because errors may also scale exponentially with problem size.

Papenbrock said the team’s hope is that improved hardware will eventually enable scientists to solve problems that cannot be solved on traditional high-performance computing resources — not even on the ones at the OLCF. In the future, quantum computations of complex nuclei could unravel important details about the properties of matter, the formation of heavy elements, and the origins of the universe.

Story Source:

Materials provided by DOE/Oak Ridge National Laboratory. Note: Content may be edited for style and length.


Journal Reference:

  1. E. F. Dumitrescu, A. J. McCaskey, G. Hagen, G. R. Jansen, T. D. Morris, T. Papenbrock, R. C. Pooser, D. J. Dean, P. Lougovski. Cloud Quantum Computing of an Atomic Nucleus. Physical Review Letters, 2018; 120 (21): 210501. DOI: 10.1103/PhysRevLett.120.210501

 

Source: DOE/Oak Ridge National Laboratory. “Nuclear physicists leap into quantum computing with first simulations of atomic nucleus.” ScienceDaily. ScienceDaily, 23 May 2018. <www.sciencedaily.com/releases/2018/05/180523133216.htm>.

Filed Under News

Most comprehensive analysis of Earth’s largest animal group — the euarthropods — shows they evolved gradually, challenging major theories of early animal evolution

Date:
May 21, 2018

Source:
University of Oxford

Summary:
All the major groups of animals appear in the fossil record for the first time around 540-500 million years ago — an event known as the Cambrian Explosion — but new research suggests that for most animals this ‘explosion’ was in fact a more gradual process.

 

These are exceptionally preserved soft-bodied fossils of the Cambrian predator and stem-lineage euarthropod Anomalocaris canadensis from the Burgess Shale, Canada. Top left: Frontal appendage showing segmentation similar to modern-day euarthropods. Bottom right: Full body specimen showing one pair of frontal appendages (white arrows) and mouthparts consisting of plates with teeth (black arrow) on the head.
Credit: A. Daley

 

 

All the major groups of animals appear in the fossil record for the first time around 540-500 million years ago — an event known as the Cambrian Explosion — but new research from the University of Oxford in collaboration with the University of Lausanne suggests that for most animals this ‘explosion’ was in fact a more gradual process.

The Cambrian Explosion produced the largest and most diverse grouping of animals the Earth has ever seen: the euarthropods. Euarthropoda contains the insects, crustaceans, spiders, trilobites, and a huge diversity of other animal forms alive and extinct. They comprise over 80 percent of all animal species on the planet and are key components of all of Earth’s ecosystems, making them the most important group since the dawn of animals over 500 million years ago.

A team based at Oxford University Museum of Natural History and the University of Lausanne carried out the most comprehensive analysis ever made of early fossil euarthropods from every different possible type of fossil preservation. In an article published today in the Proceedings of the National Academy of Sciences they show that, taken together, the total fossil record shows a gradual radiation of euarthropods during the early Cambrian, 540-500 million years ago.

The new analysis presents a challenge to the two major competing hypotheses about early animal evolution. The first of these suggests a slow, gradual evolution of euarthropods starting 650-600 million years ago, which had been consistent with earlier molecular dating estimates of their origin. The other hypothesis claims the nearly instantaneous appearance of euarthropods 540 million years ago because of highly elevated rates of evolution.

The new research suggests a middle-ground between these two hypotheses, with the origin of euarthropods no earlier than 550 million years ago, corresponding with more recent molecular dating estimates, and with the subsequent diversification taking place over the next 40 million years.

“Each of the major types of fossil evidence has its limitation and they are incomplete in different ways, but when taken together they are mutually illuminating and allow a coherent picture to emerge of the origin and radiation of the euarthropods during the lower to middle Cambrian,” explains Professor Allison Daley, who carried out the work at Oxford University Museum of Natural History and at the University of Lausanne. “This indicates that the Cambrian Explosion, rather than being a sudden event, unfolded gradually over the ~40 million years of the lower to middle Cambrian.”

The timing of the origin of Euarthropoda is very important as it affects how we view and interpret the evolution of the group. By working out which groups developed first we can trace the evolution of physical characteristics, such as limbs.

It has been argued that the absence of euarthropods from the Precambrian Period, earlier than around 540 million years ago, is the result of a lack of fossil preservation. But the new comprehensive fossil study suggests that this isn’t the case.

“The idea that arthropods are missing from the Precambrian fossil record because of biases in how fossils are preserved can now be rejected,” says Dr Greg Edgecombe FRS from the Natural History Museum, London, who was not involved in the study. “The authors make a very compelling case that the late Precambrian and Cambrian are in fact very similar in terms of how fossils preserve. There is really just one plausible explanation — arthropods hadn’t yet evolved.”

Harriet Drage, a PhD student at Oxford University Department of Zoology and one of the paper’s co-authors, says: “When it comes to understanding the early history of life the best source of evidence that we have is the fossil record, which is compelling and very complete around the early to middle Cambrian. It speaks volumes about the origin of euarthropods during an interval of time when fossil preservation was the best it has ever been.”

Story Source:

Materials provided by University of Oxford. Note: Content may be edited for style and length.


Journal Reference:

  1. Allison C. Daley, Jonathan B. Antcliffe, Harriet B. Drage, Stephen Pates. Early fossil record of Euarthropoda and the Cambrian Explosion. Proceedings of the National Academy of Sciences, 2018; 201719962. DOI: 10.1073/pnas.1719962115

 

Source: University of Oxford. “Major fossil study sheds new light on emergence of early animal life 540 million years ago: Most comprehensive analysis of Earth’s largest animal group — the euarthropods — shows they evolved gradually, challenging major theories of early animal evolution.” ScienceDaily. ScienceDaily, 21 May 2018. <www.sciencedaily.com/releases/2018/05/180521154302.htm>.

Filed Under News

New analysis compares 22 named storms with possible hurricanes of the future

Date:
May 21, 2018

Source:
National Science Foundation

Summary:
Scientists have developed a detailed analysis of how 22 recent hurricanes would be different if they formed under the conditions predicted for the late 21st century.

 

Will future hurricanes resemble 2017’s Jose (top) and Maria? Scientists have new answers.
Credit: NASA

 

 

Scientists have developed a detailed analysis of how 22 recent hurricanes would be different if they formed under the conditions predicted for the late 21st century.

While each storm’s transformation would be unique, on balance, the hurricanes would become a little stronger, a little slower-moving, and a lot wetter.

In one example, Hurricane Ike — which killed more than 100 people and devastated parts of the U.S. Gulf Coast in 2008 — could have 13 percent stronger winds, move 17 percent slower, and be 34 percent wetter if it formed in a future, warmer climate.

Other storms could become slightly weaker (for example, Hurricane Ernesto) or move slightly faster (such as Hurricane Gustav). None would become drier. The rainfall rate of simulated future storms would increase by an average of 24 percent.

The study, led by scientists at the National Center for Atmospheric Research (NCAR) and published in the Journal of Climate, compares high-resolution computer simulations of more than 20 historical, named Atlantic storms with a second set of simulations that are identical but for a warmer, wetter climate that’s consistent with the average scientific projections for the end of the century.

A future with Hurricane Harvey-like rains

“Our research suggests that future hurricanes could drop significantly more rain,” said NCAR scientist Ethan Gutmann, who led the study. “Hurricane Harvey demonstrated last year just how dangerous that can be.”

Harvey produced more than 4 feet of rain in some locations, breaking records and causing devastating flooding across the Houston area.

The research was funded by the National Science Foundation (NSF), which is NCAR’s sponsor, and by DNV GL (Det Norske Veritas Germanischer Lloyd), a global quality assurance and risk management company.

“This study shows that the number of strong hurricanes, as a percent of total hurricanes each year, may increase,” said Ed Bensman, a program director in NSF’s Division of Atmospheric and Geospace Sciences, which supported the study. “With increasing development along coastlines, that has important implications for future storm damage.”

Tapping a vast dataset to see storms

With more people and businesses relocating to coastal regions, the potential influence of environmental change on hurricanes has significant implications for public safety and the economy.

Last year’s hurricane season, which caused an estimated $215 billion in losses according to reinsurance company Munich RE, was the costliest on record.

It’s been challenging for scientists to study how hurricanes might change in the future as the climate continues to warm. Most climate models, which are usually run on a global scale over decades or centuries, are not run at a high enough resolution to “see” individual hurricanes.

Most weather models, on the other hand, are run at a high enough resolution to accurately represent hurricanes, but because of the high cost of computational resources, they are not generally used to simulate long-term changes in climate.

For the current study, the researchers took advantage of a massive new dataset created at NCAR. The scientists ran the Weather Research and Forecasting (WRF) model at high resolution (4 kilometers, or about 2.5 miles) across the contiguous United States for two 13-year periods.

The simulations took about a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center in Cheyenne.

The first set of model runs simulates weather as it unfolded between 2000 and 2013, and the second simulates the same weather patterns but in a climate that’s warmer by about 5 degrees Celsius (9 degrees Fahrenheit) — the amount of warming that may be expected by the end of the century.
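This "pseudo-global warming" design re-runs the same weather with a climate-change signal added to the model's driving fields. Schematically (a toy scalar version; the real workflow perturbs full three-dimensional WRF boundary conditions, including moisture, with climate-model deltas):

```python
import numpy as np

def pseudo_global_warming(historic_temps_k, delta_k=5.0):
    """Perturb boundary-condition temperatures: same weather, warmer baseline.

    historic_temps_k: temperatures (kelvin) driving the regional model
    delta_k: warming offset; ~5 K matches the end-of-century scenario here
    """
    return np.asarray(historic_temps_k) + delta_k

historic = np.array([288.0, 290.5, 285.2])   # tiny, fake boundary field
print(pseudo_global_warming(historic))       # [293.  295.5 290.2]
```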

Drawing on the vast amount of data, the scientists created an algorithm that enabled them to identify 22 named storms that appear with very similar tracks in the historic and future simulations, allowing the hurricanes to be more easily compared.
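One plausible form of such a track-matching criterion (a sketch, not the authors' published algorithm) is to pair storms whose centers stay within some mean great-circle distance of each other at matching times:

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlat, dlon = p2 - p1, np.radians(lon2) - np.radians(lon1)
    a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

def tracks_match(track_a, track_b, max_mean_km=100.0):
    """track_a, track_b: arrays of (lat, lon) storm centers at matching times."""
    a, b = np.asarray(track_a), np.asarray(track_b)
    dists = haversine_km(a[:, 0], a[:, 1], b[:, 0], b[:, 1])
    return dists.mean() < max_mean_km

# Example: a current-climate track and its slightly shifted future twin.
hist = [(25.0, -80.0), (26.0, -81.0), (27.1, -82.2)]
futr = [(25.1, -80.1), (26.1, -81.2), (27.0, -82.4)]
print(tracks_match(hist, futr))   # True -> treat as the same named storm
```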

As a group, storms in simulations of the future had 6 percent stronger average hourly maximum wind speeds than those in the past. They also moved at 9 percent slower speeds and had 24 percent higher average hourly maximum rainfall rates. Average storm radius did not change.

Each storm unique

“Some past studies have also run the WRF at a high resolution to study the impact of climate change on hurricanes, but those studies have tended to look at a single storm, like Sandy or Katrina,” Gutmann said.

“What we find in looking at more than 20 storms is that some change one way, while others change in a different way. There is so much variability that you can’t study one storm and then extrapolate to all storms.”

But there was one consistent feature across storms: They all produced more rain.

While the study sheds light on how a particular storm might look in a warmer climate, it doesn’t provide insight into how environmental change might affect storm genesis. That’s because the hurricanes analyzed in this study formed outside the region simulated by the WRF model and passed into the WRF simulation as fully formed storms.

Other research has suggested that fewer storms may form in the future because of increasing atmospheric stability or greater high-level wind shear, though the storms that do form are apt to be stronger.

“It’s possible that in a future climate, large-scale atmospheric changes wouldn’t allow some of these storms to form,” Gutmann said. “But from this study, we get an idea of what we can expect from the storms that do form.”

Story Source:

Materials provided by National Science Foundation. Note: Content may be edited for style and length.


Journal Reference:

  1. Ethan D. Gutmann, Roy M. Rasmussen, Changhai Liu, Kyoko Ikeda, Cindy L. Bruyere, James M. Done, Luca Garrè, Peter Friis-Hansen, Vidyunmala Veldore. Changes in Hurricanes from a 13-Yr Convection-Permitting Pseudo–Global Warming Simulation. Journal of Climate, 2018; 31 (9): 3643. DOI: 10.1175/JCLI-D-17-0391.1

 

Source: National Science Foundation. “Hurricanes: Stronger, slower, wetter in the future? New analysis compares 22 named storms with possible hurricanes of the future.” ScienceDaily. ScienceDaily, 21 May 2018. <www.sciencedaily.com/releases/2018/05/180521131532.htm>.

Filed Under News

BIOMED 2018 Conference – Tel Aviv

 

Target Health just returned from the 17th MIXiii-BIOMED 2018 Conference and Exhibition, held May 15-17, 2018, in Tel Aviv, Israel. Target Health has been working in Israel since 2001 and has developed long-lasting and successful professional and personal relationships, including approvals of products now marketed globally.

 

This year, BIOMED explored the following:

 

Digital Health, IoT, and Big Data – New Armamentarium in Medicine

Next Generation Oncology Treatments

Brain Health

Personalized Diagnostics and Treatments

Fighting Rare Genetic Diseases Using Novel Therapeutic Approaches

Nanomedicine and its Role in New Medical Therapeutics

From Academia Research to Industry

Cutting Edge Medical Device Technologies: Metabolic, Ophthalmology

Novel Clinical Trial Designs and Technologies to Accelerate Drug Development

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 165). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

Filed Under News, What's New

Dietary Seaweed Used to Manipulate Gut Bacteria in Mice

Nori, roasted sheets of seaweed used in Japanese cuisine for sushi. The smaller ones are already seasoned with sesame oil and spices. Nori is typically toasted prior to consumption (yaki-nori); the toasting process often adds umami flavors like soy sauce and other seasonings. It is also eaten as a soy sauce-flavored paste (nori no tsukudani). Photo credit: Alice Wiegand (Lyzzy) – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=740793

 

A paper published online in Nature (9 May 2018) showed, in experiments in 1) ___, that it is possible to favor the engraftment of one gut bacterial strain over others by manipulating the diet. The study also showed that it is possible to control how much a bacterial strain grows in the intestine by calibrating the amount of a specific carbohydrate in the water or food.

 

Gut bacteria thrive on the 2) ___ we eat. In turn, they provide essential nutrients that keep us healthy, repel pathogens and even help guide our immune responses. Understanding how and why some bacterial strains we ingest can successfully take up residence in the large intestine, while others are quickly evicted, could help us learn how to manipulate the makeup of the thousands of bacterial species found there in ways that enhance our health or help fend off 3) ___. But the sheer complexity of gut ecology has hampered this task. Now, researchers at the Stanford University School of Medicine working with laboratory mice have shown that it is possible to favor the engraftment of one bacterial strain over others by manipulating the mice's 4) ___. The authors also have shown it is possible to control how much a bacterium grows in the intestine by calibrating the amount of a specific carbohydrate in each mouse's water or food. According to the authors, we are all endowed with a microbial community in our guts that is assembled during our first few years of 5) ___, and although we continue to acquire new strains throughout life, this acquisition is a poorly orchestrated and not-well-understood process. The study suggests it could be possible to reshape our microbiome in a deliberate manner to enhance health and fight disease.

 

The burgeoning field of probiotics, which comprises live, presumably healthful bacterial cultures naturally found in foods such as yogurt or included in over-the-counter oral supplements, is an example of a growing public awareness of the importance of gut bacteria. But even if you don't take probiotics or eat 6) ___, each of us unknowingly consumes low levels of gut-adapted microbes throughout our lives. Regardless of the source, it is not known what causes one strain to be successful over another; many pass quickly through our digestive tract without gaining a foothold in our teeming intestinal carpet. To investigate whether a dietary boost would give specific bacterial strains a leg up in the gut microbiome, the authors went to the San Jose Wastewater Treatment Facility to find members of the Bacteroides — the most prominent genus in the human 7) ___ microbiota — specifically looking for strains that are able to digest an ingredient relatively rare in American diets: the seaweed called nori used in sushi rolls and other Japanese foods. The authors screened the bacteria collected in the primary effluent for the ability to use a carbohydrate found in nori called porphyran. Apparently, the genes that allow a bacterium to digest porphyran are exceedingly rare in the gut microbiomes of people who don't have 8) ___ as a common part of their diet. This allowed the authors to test whether it was possible to circumvent the rules of complex ecosystems by creating a privileged niche that could favor a single microbe, allowing it to exist in the absence of competition from the 30 trillion other microbes in the gut. Once a nori-gobbling strain of Bacteroides was identified, the authors attempted to introduce it into each of three groups of laboratory mice. Two groups of the mice had their own gut bacteria eliminated and replaced with the naturally occurring gut 9) ___ from two healthy human donors, each of whom donated exclusively to one group or the other. The third group of mice harbored a conventional mouse-specific community of gut microbiota.

 

Results showed that when the mice were fed a typical diet of mouse chow, the porphyran-digesting strain was able to engraft in two groups of mice to varying and limited degrees; one of the groups of mice with human gut bacteria rejected the new strain completely. However, when the mice were fed a porphyran-rich diet, the results were dramatically different: The bacteria engrafted robustly at similar levels in all the mice. Furthermore, it was possible to precisely calibrate the population size of the engrafted bacteria by increasing or decreasing the amount of nori the animals ingested. In addition to showing that they could favor the engraftment and growth of the nori-gobbling bacterial strain, the authors went one step further by showing that the genes necessary to enable the digestion of porphyran exist as a unit that can be engineered into other Bacteroides strains, giving them the same engraftment advantage. Now they're working to identify other genes that confer similar dietary abilities. The authors also envision developing bacteria that harbor kill switches and logic gates that will permit clinicians to toggle bacterial activity on and off at will, or when a specific set of circumstances occurs. For example, a physician whose patient is about to begin immunotherapy for 10) ___ may choose to also administer a bacterial strain known to activate the immune system. Conversely, a patient with an autoimmune disease may benefit from a different set of microbiota that can dial down an overactive immune response. The gut microbiota is simply a very powerful lever for modulating our biology in health and disease. Sources: Stanford University School of Medicine; Elizabeth Stanley Shepherd, William C. DeLoache, Kali M. Pruss, Weston R. Whitaker, Justin L. Sonnenburg. An exclusive metabolic niche enables strain engraftment in the gut microbiota. Nature, 2018; DOI: 10.1038/s41586-018-0092-4; ScienceDaily, Krista Conger; Wikipedia

 

ANSWERS: 1) mice; 2) food; 3) disease; 4) diet; 5) life; 6) yogurt; 7) gut; 8) seaweed; 9) bacteria; 10) cancer

 

Filed Under News

Nori Seaweed

Toasting a sheet of nori. 1864, Japanese painting; Wikipedia, Public Domain, https://commons.wikimedia.org/w/index.php?curid=40283081

 

Nori is the Japanese name for edible seaweed species of the red algae genus Pyropia, including P. yezoensis and P. tenera. It is used chiefly as an ingredient (wrap) of sushi. Finished products are made by a shredding and rack-drying process that resembles papermaking. Originally, the term nori was generic and referred to seaweeds, including hijiki. One of the oldest descriptions of nori dates to around the 8th century: in the Taiho Code, enacted ca. 701, nori was already included as a form of taxation. Local people have been described as drying nori in the Hitachi Province Fudoki (ca. 721), and nori was harvested in the Izumo Province Fudoki (ca. 713-733), showing that nori was used as food from ancient times. In Utsubo Monogatari, written around 987, nori was recognized as a common food. Nori had been consumed in paste form until the sheet form was invented in Asakusa, Edo (contemporary Tokyo), around 1750 in the Edo period, using the methods of Japanese paper-making. The word “nori” first appeared in an English-language publication in C.P. Thunberg’s Trav., published in 1796. It was used in conjunction as “Awa nori”, probably referring to what is now called aonori.

 

The Japanese nori industry was in decline after WW II, when Japan was in need of all the food it could produce. The decline was due to a lack of understanding of nori’s three-stage life cycle, such that local people did not understand why traditional cultivation methods were not effective. The industry was rescued by knowledge deriving from the work of British phycologist Kathleen Mary Drew-Baker, who had been researching the organism Porphyra umbilicalis, which grew in the seas around Wales and was harvested for food, as in Japan. Her work was discovered by Japanese scientists who applied it to artificial methods of seeding and growing the nori, rescuing the industry. Drew-Baker was hailed as the “Mother of the Sea” in Japan, and a statue was erected in her memory; she is still revered as the savior of the Japanese nori industry. In the 21st century, the Japanese nori industry faces a new decline due to increased competition from seaweed producers in China and Korea and domestic sales tax hikes.

 

The word nori started to be used widely in the United States, and the product (imported in dry form from Japan) became widely available at natural food stores and Asian-American grocery stores in the 1960s due to the macrobiotic movement, and in the 1970s with the increase of sushi bars and Japanese restaurants. In one study by Jan-Hendrik Hehemann, subjects of Japanese descent were shown to be able to digest the polysaccharides of the seaweed after their gut microbes acquired the relevant enzyme from marine bacteria. Gut microbes from the North American subjects lacked these enzymes.

 

Production and processing of nori is an advanced form of agriculture. The biology of Pyropia, although complicated, is well understood, and this knowledge is used to control the production process. Farming takes place in the sea, where the Pyropia plants grow attached to nets suspended at the sea surface and where the farmers operate from boats. The plants grow rapidly, requiring about 45 days from “seeding” until the first harvest. Multiple harvests can be taken from a single seeding, typically at about ten-day intervals. Harvesting is accomplished using mechanical harvesters of a variety of configurations. Processing of raw product is mostly accomplished by highly automated machines that accurately duplicate traditional manual processing steps, but with much improved efficiency and consistency. The final product is a paper-thin, black, dried sheet of approximately 18 cm x 20 cm (7 in x 8 in) and 3 grams (0.11 oz.) in weight. Several grades of nori are available in the United States. The most common, and least expensive, grades are imported from China, costing about six cents per sheet. At the high end, ranging up to 90 cents per sheet, are “delicate shin-nori” (nori from the first of the year’s several harvests) cultivated in the Ariake Sea, off the island of Kyushu in Japan. In Japan, over 600 square kilometres (230 sq mi) of coastal waters are given over to producing 340,000 tons of nori, worth over a billion dollars. China produces about a third of this amount.

 

Nori is commonly used as a wrap for sushi and onigiri. It is also a garnish or flavoring in noodle preparations and soups. It is most typically toasted prior to consumption (yaki-nori). A common secondary product is toasted and flavored nori (ajitsuke-nori), in which a flavoring mixture (variable, but typically soy sauce, sugar, sake, mirin, and seasonings) is applied in combination with the toasting process. It is also eaten by making it into a soy sauce-flavored paste, nori no tsukudani. Nori is also sometimes used as a form of food decoration or garnish. A related product, prepared from the unrelated green algae Monostroma and Enteromorpha, is called aonori (literally blue/green nori) and is used like herbs on everyday meals, such as okonomiyaki and yakisoba.

 

Since nori sheets easily absorb water from the air and degrade, a desiccant is indispensable when storing it for any significant time.

Filed Under History of Medicine, News

Hibernation and Survival – A Tale of 13 Squirrels

 

How squirrel tissues adapt to the cold and metabolic stress has confounded researchers for years.

 

A structure in cells known to be vulnerable to cold is the microtubule cytoskeleton. This network of small tubes within a cell provides structural support and acts as a kind of inner cellular railway system, transporting organelles and molecular complexes vital for a cell’s survival.

 

According to an article published online in the journal Cell (3 May 2018), researchers identified cellular mechanisms that help the 13-lined ground squirrel survive hibernation, during which it endures near-freezing temperatures by dramatically slowing its heart rate and respiration. The findings could be a step toward extending the storage of human donor tissues awaiting transplantation and protecting traumatic brain injury patients who undergo induced hypothermia.

 

In a series of experiments, the research team compared cells from non-hibernators to cells from the ground squirrel to determine differences in their response to cold. Results showed that in ground squirrel neurons, the microtubule cytoskeleton remains intact, while it deteriorates in the neurons of humans and other non-hibernating animals, including rats.

 

To investigate the biological factors supporting the squirrel’s cold adaptation, the authors created “hibernation in a dish”. They took cells from a newborn ground squirrel and reprogrammed them to become stem cells, which are undifferentiated cells capable of becoming any type of tissue in the body. Importantly, these lab-made cells, also known as induced pluripotent stem cells (iPSCs), retained the intrinsic cold-adaptive characteristics of the adult squirrel’s cells, thus providing a platform for studying how various kinds of the rodent’s cells adapt to the cold. Next, the authors compared gene expression of stem cell-derived neurons from ground squirrels and humans. Cold exposure revealed distinct differences in the reaction of mitochondria, organelles that provide energy to the cell in the form of adenosine triphosphate (ATP). Cold-exposed human stem cell-derived neurons tended to overproduce a byproduct of metabolism known as reactive oxygen species (ROS). The overabundance of ROS in human neurons appeared to cause proteins along the microtubules to oxidize, wreaking havoc with the microtubule structure. By comparison, ground squirrel ROS levels remained relatively low and their microtubules remained intact.

 

Cold exposure also interfered with the human stem cell-derived neurons’ ability to dispose of the toxic oxidized proteins via their protein quality control system. Under normal conditions, lysosomes envelop oxidized proteins and digest them via enzymes called proteases, but in the cold-exposed human neurons, the proteases leaked from the lysosomes and digested nearby microtubules. The authors then treated non-hibernating cells prior to cold exposure with two drugs to alter the course of the cold-induced damage. One of the drugs, BAM15, inhibits the production of ATP, which reduces the production of ROS. The second drug inhibited protease activity. After bathing a variety of cell types from non-hibernators in both drugs, the research team exposed them to 4 degrees Celsius for four to 24 hours. The drug combination preserved microtubule structure in human stem cell-derived neurons and in rat retina, the light-sensitive tissue at the back of the eye. Subsequent tests showed that the rat retina also remained functional. The drug combination preserved nonneural tissue as well: microtubules in renal cells from mouse kidneys showed improved structural integrity after cooling and rewarming.

 

In addition to the implications for organ transplantation, these findings pave the way for future studies looking at possible therapeutic applications. For example, inducing hypothermia is a commonly used strategy to protect the brain following a traumatic injury, but the potential benefits are weighed against the potential harm from cold-induced cellular damage. According to the authors, by understanding the biology of cold adaptation in hibernation, it may be possible to improve and broaden the applications of induced hypothermia in the future, and perhaps prolong the viability of organs prior to transplantation. For example, kidneys are typically stored for no more than 30 hours. After that, the tissue starts to deteriorate, impairing the organ’s ability to function properly after it has been rewarmed and reperfused. Hearts, lungs and livers have an even shorter shelf life. The findings also suggest that stem cell-derived neurons from the ground squirrel can serve as a platform for studying other aspects of hibernation adaptation, a field of research that has been limited by a lack of transgenic animal models and the inability to induce hibernation in the animals.

 

Filed Under News

Bacteria Therapy for Eczema Shows Promise

 

Atopic dermatitis is an inflammatory skin disease that can make skin dry and itchy, cause rashes and lead to skin infections. The disease is linked to an increased risk of developing asthma, hay fever and food allergy. Atopic dermatitis is common in children and sometimes resolves on its own, but it also can persist into or develop during adulthood. The cause of atopic dermatitis is unknown, but studies suggest that the skin microbiome — the community of bacteria and other microbes living on the skin — plays a key role. For years, scientists have known that people with atopic dermatitis tend to have large populations of Staphylococcus aureus bacteria on their skin. These bacteria can cause skin infections and trigger immune responses that increase inflammation and worsen symptoms. Recent work by NIAID researchers using mouse and cell culture models of atopic dermatitis revealed that treatment with isolates of Roseomonas mucosa (R. mucosa) collected from the skin of healthy people improved disease outcomes in the models. In contrast, R. mucosa isolates from people with atopic dermatitis either had no impact or worsened outcomes in the models. Based on these preclinical findings, an early stage clinical trial was designed to test the safety and potential benefit of a treatment containing live R. mucosa in people with atopic dermatitis.

 

According to an article published in JCI Insight (3 May 2018), topical treatment with live Roseomonas mucosa – a bacterium naturally present on the skin — was safe for adults and children with atopic dermatitis (eczema) and was associated with reduced disease severity. These results are based on initial findings from an ongoing early-phase clinical trial. Preclinical work in a mouse model of atopic dermatitis had suggested that R. mucosa strains collected from healthy skin can relieve disease symptoms.

 

The authors first tested the experimental treatment in 10 adult volunteers with atopic dermatitis. Twice a week for six weeks, the volunteers sprayed a solution of sugar water containing increasing doses of live R. mucosa onto their inner elbows and one additional skin area of their choice. The R. mucosa strains included in the treatment were originally isolated from the skin of healthy individuals and grown under carefully controlled laboratory conditions. Participants were instructed to continue their normal eczema treatments, including topical steroids and other medications. Study participants did not report any adverse reactions or complications. Most participants experienced improvements in their atopic dermatitis, and four weeks after stopping the bacteria therapy, some reported needing fewer topical steroids.

 

The authors next enrolled five volunteers aged 9 to 14 years with atopic dermatitis. Treatments were applied to all affected skin areas twice weekly for 12 weeks and every other day for an additional four weeks. Consistent with the findings in adults, there were no complications or adverse effects, and most participants experienced improvements in their eczema, including a reduced need for topical steroids. The authors also found that treatment was associated with decreases in the S. aureus population on the children’s skin.

 

According to the authors, although larger studies comparing the bacteria therapy with a placebo will be required to assess the effectiveness of this potential treatment, results from the current study showed a greater than 50% improvement in atopic dermatitis severity in four of the five children and six of the 10 adults.

 

To better understand factors that may contribute to imbalances in the bacteria on the skin, the authors also investigated whether chemicals produced by R. mucosa or present in certain skin products may be associated with atopic dermatitis. Results showed that strains of R. mucosa from people with atopic dermatitis produced skin irritants, while strains isolated from healthy skin produced chemicals that may enhance the skin’s barrier and help regulate the immune system. In addition, some forms of parabens, a common preservative in skin products, and some topical emollients (moisturizers) blocked the growth of R. mucosa from healthy skin and did not have as strong an effect on growth of S. aureus or eczema-associated R. mucosa. These findings suggest that certain products may worsen atopic dermatitis and/or affect the effectiveness of microbiome-based therapies.

 

Final results from the ongoing study will provide the foundation for larger trials to evaluate the efficacy of this novel investigational therapy, as well as to better understand the role of R. mucosa in atopic dermatitis. NIH has exclusively licensed the technology to Forte Biosciences to advance this potential new therapy through further clinical development.

 

Filed Under News

FDA Expands Approval of Gilenya to Treat Multiple Sclerosis in Pediatric Patients

 

Multiple Sclerosis (MS) is a chronic, inflammatory, autoimmune disease of the central nervous system that disrupts communication between the brain and other parts of the body. It is among the most common causes of neurological disability in young adults and occurs more frequently in women than men. For most people with MS, episodes of worsening function and appearance of new symptoms, called relapses or flare-ups, are initially followed by recovery periods (remissions). Over time, recovery may be incomplete, leading to progressive decline in function and increased disability. Most people with MS experience their first symptoms, like vision problems or muscle weakness, between the ages of 20 and 40. Two to five percent of people with MS have symptom onset before age 18, and estimates suggest that 8,000 to 10,000 children and adolescents in the U.S. have MS.

 

The FDA has approved Gilenya (fingolimod) to treat relapsing MS in children and adolescents age 10 years and older. This is the first FDA approval of a drug to treat MS in pediatric patients. Gilenya was first approved by the FDA in 2010 to treat adults with relapsing MS.

 

The clinical trial evaluating the effectiveness of Gilenya in treating pediatric patients with MS included 214 evaluated patients aged 10 to 17 and compared Gilenya to another MS drug, interferon beta-1a. In the study, 86% of patients receiving Gilenya remained relapse-free after 24 months of treatment, compared to 46% of those receiving interferon beta-1a. The side effects of Gilenya in pediatric trial participants were similar to those seen in adults. The most common side effects were headache, liver enzyme elevation, diarrhea, cough, flu, sinusitis, back pain, abdominal pain and pain in extremities.

 

Gilenya must be dispensed with a patient Medication Guide that describes important information about the drug’s uses and risks. Serious risks include slowing of the heart rate, especially after the first dose. Gilenya may increase the risk of serious infections. Patients should be monitored for infection during treatment and for two months after discontinuation of treatment. A rare brain infection that usually leads to death or severe disability, called progressive multifocal leukoencephalopathy (PML) has been reported in patients being treated with Gilenya. PML cases usually occur in patients with weakened immune systems. Gilenya can cause vision problems. Gilenya may increase the risk for swelling and narrowing of the blood vessels in the brain (posterior reversible encephalopathy syndrome). Other serious risks include respiratory problems, liver injury, increased blood pressure and skin cancer. Gilenya can cause harm to a developing fetus; women of child-bearing age should be advised of the potential risk to the fetus and to use effective contraception.

 

The FDA granted Priority Review and Breakthrough Therapy designation to Novartis for this indication.

 

Filed Under News, Regulatory
