Malaria – Part 2

 

 

 

 

Malaria is a mosquito-borne infectious disease caused by eukaryotic protists of the genus Plasmodium. It is widespread in tropical and subtropical regions, including much of Sub-Saharan Africa, Asia and the Americas. Malaria is prevalent in these regions because of their significant rainfall and consistently high temperatures.

 

The genus Plasmodium was described in 1885 by Ettore Marchiafava and Angelo Celli. Of the over 200 known species of Plasmodium, at least 11 infect humans. Other species infect other animals, including monkeys, rodents, birds, and reptiles. The parasite always has two hosts in its life cycle: a mosquito vector and a vertebrate host.

 

The first effective treatment for malaria came from the bark of the cinchona tree, which contains quinine. This tree grows on the slopes of the Andes, mainly in Peru. The indigenous peoples of Peru made a tincture of cinchona bark to control malaria. The Jesuits noted the efficacy of the practice and introduced the treatment to Europe during the 1640s, where it was rapidly accepted. It was not until 1820 that the active ingredient, quinine, was extracted from the bark, isolated and named by the French chemists Pierre Joseph Pelletier and Joseph Bienaimé Caventou.

 

The cause of malaria is a protozoan, discovered in 1880 by Charles Louis Alphonse Laveran. While Laveran was working in the military hospital in Constantine, Algeria, he observed the parasites in a blood smear taken from a patient who had just died of malaria.

 

Dr. Probert’s Malarial Remedy: “Will cure chills and fever, dyspepsia. Will cure bilious fever, liver complaint.” c. 1881, New York

 

 

Laveran proposed that malaria is caused by this organism, the first time a protist was identified as causing disease. For this and later discoveries, he was awarded the 1907 Nobel Prize for Physiology or Medicine. In April 1894, the Scottish physician Sir Ronald Ross began a four-year collaboration with Sir Patrick Manson that culminated in 1898 when Ross, working in the Presidency General Hospital in Calcutta, proved the complete life cycle of the malaria parasite in mosquitoes. He did this by isolating malaria parasites from the salivary glands of mosquitoes that had fed on infected birds, showing that certain mosquito species transmit malaria to birds. For this work, Ross received the 1902 Nobel Prize in Medicine.

 

In the 20th century, chloroquine replaced quinine as the treatment of both uncomplicated and severe falciparum malaria until resistance supervened. Malaria was the most important health hazard encountered by U.S. troops in the South Pacific during World War II, where about 500,000 men were infected. According to Joseph Patrick Byrne, “Sixty thousand American soldiers died of malaria during the African and South Pacific campaigns.”

 

IV. ONCOLOGY

Mosaic Analysis with Double Markers Reveals Tumor Cell of Origin in Glioma

 

Malignant glioma is diagnosed in 10,000 Americans annually, with most patients dying within two years. Victims have included U.S. Sen. Ted Kennedy (2009), pianist George Gershwin (1937), marine biologist Thor Heyerdahl (2002) and film critic Gene Siskel (1999). An autopsy on Charles Whitman, who killed 13 during a shooting rampage at the University of Texas in 1966, uncovered a glioma tumor intruding into areas of the brain related to emotion and aggression.

 

The cancer cell of origin is difficult to identify by analyzing cells within terminal-stage tumors, whose identity may be concealed by acquired plasticity. Thus, an ideal approach to identifying the cell of origin is to analyze proliferative abnormalities in distinct lineages prior to malignancy. According to an article published in the journal Cell (2011;146:209-221), mosaic analysis with double markers (MADM) was used to model gliomagenesis by initiating concurrent p53/Nf1 mutations sporadically in neural stem cells (NSCs) in mice. The essence of MADM is its unambiguous labeling of mutant cells with green fluorescent protein, which allows probing of pre-clinical, tumor-initiating stages that are inaccessible with conventional tools.

 

Study results showed that MADM-based lineage tracing revealed significant aberrant growth prior to malignancy only in oligodendrocyte precursor cells (OPCs), but not in any other NSC-derived lineages or NSCs themselves. Upon tumor formation, phenotypic and transcriptome analyses of tumor cells revealed salient OPC features. Finally, introducing the same p53/Nf1 mutations directly into OPCs consistently led to gliomagenesis. The findings suggest OPCs as the cell of origin in this model, even when initial mutations occur in NSCs, and highlight the importance of analyzing premalignant stages to identify the cancer cell of origin.

 

Further analyses of all cell lineages derived from neural stem cells also pointed to OPCs as the cell of origin, since mutant green OPCs outnumbered their normal red counterparts by 130-fold before any visible signs of tumor could be detected. To confirm that OPCs can initiate tumors independently of any mutation-passing process from NSCs, the study also introduced the p53 and NF1 mutations directly into OPCs.

 

According to the authors, studying cancer in this way should not only lead to molecular diagnostics that detect the earliest emergence of cancers, but also allow an understanding of the molecular mechanisms by which initial mutant cells progressively gain advantages over normal cells. The authors added that this type of knowledge should eventually help increase the efficacy of cancer treatment.

Zinc Sparks Fly From Egg Within Minutes of Fertilization

 

 

 

 

According to an article published in ACS Chemical Biology (2011;6:716-723), at fertilization a massive release of the metal zinc appears to set the fertilized egg cell on the path to dividing and growing into an embryo. The authors referred to the zinc discharge and accompanying light flash as zinc sparks. The zinc discharge followed the egg cell’s steady accumulation of zinc atoms in the developmental stages before fertilization. The study documented the discharge by bathing the eggs in a solution that gives off light when exposed to zinc. The authors suggest that zinc acts as a switch, turning off the process of cell division while the egg matures and turning it on again after fertilization.

 

In the study, the researchers observed egg cells from mice and from monkeys. To conduct the study, the authors devised a microscope that would allow them to view the concentration and distribution of zinc atoms in individual cells. With the aid of the chemical that gives off light when exposed to zinc, the first zinc sparks were documented 20 minutes after fertilization. Most fertilized eggs released two or three rounds of sparks, but as few as one and as many as five were observed within the first two hours after fertilization. The sparks flared every 10 minutes, on average.

 

Previous research had shown that fertilization triggers cyclical changes in the level of calcium in the egg cell. The researchers noted that the zinc sparks always occurred after a peak in calcium levels inside the cell.

 

According to the authors, the number, timing and intensity of these sparks could provide important information about the quality of the egg and will be an important area for future research. The authors added that it may also be worth investigating whether the amount of zinc in a woman’s diet plays a role in fertility.

 

Additional experiments helped confirm a role for zinc in the fertilization process. Typically, once the egg is released from the ovary, it must get rid of excess chromosomes in two stages as it prepares to fuse with the sperm. The team’s earlier research showed that the early accumulation of zinc is essential for properly completing the first stage. The latest results suggest that zinc may act as a brake in between these stages, as the egg awaits fertilization. If the cell is fertilized, the zinc release appears to lift the brake. The cell discards its excess genetic material and begins to divide. The study also showed that even unfertilized eggs would start to divide if zinc levels were artificially reduced, mimicking release. In addition, when fertilized cells were forced to take on additional zinc, the process was reversed.

ONCOLOGY


Height and Cancer Incidence in the Million Women Study: Prospective Cohort, and Meta-Analysis of Prospective Studies of Height and Total Cancer Risk

 

Epidemiological studies have shown that taller people are at increased risk of cancer, but it is unclear if height-associated risks vary by cancer site, or by other factors such as smoking and socioeconomic status. As a result, a study published in The Lancet Oncology, Early Online Publication (21 July 2011), was performed to investigate these associations in a large UK prospective cohort of women.

 

For the study, information on height and other factors relevant for cancer was obtained between 1996 and 2001 for middle-aged women without previous cancer who were followed up for cancer incidence. The authors used Cox regression models to calculate adjusted relative risks (RRs) per 10 cm increase in measured height for total incident cancer and for 17 specific cancer sites, taking attained age as the underlying time variable. The authors also performed a meta-analysis of published results from prospective studies of total cancer risk in relation to height.

 

A total of 1,297,124 women were followed up for a total of 11.7 million person-years (median 9.4 years per woman), during which time 97,376 incident cancers occurred. The RR for total cancer was 1.16 (p<0.0001) for every 10 cm increase in height. Risk increased for 15 of the 17 cancer sites assessed, and was statistically significant for ten sites: colon (RR 1.25), rectum (1.14), malignant melanoma (1.32), breast (1.17), endometrium (1.19), ovary (1.17), kidney (1.29), CNS (1.20), non-Hodgkin lymphoma (1.21), and leukemia (1.26).
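
For readers who want to see how such an estimate is produced and rescaled, below is a minimal, hypothetical sketch in Python using the lifelines package. The data file and column names (cohort.csv, height_cm, age_entry, age_exit, cancer) are invented for illustration and are not the Million Women Study variables; only the modeling choices (Cox regression, attained age as the time scale, height per 10 cm) and the published RR of 1.16 come from the text above.

```python
# Hedged sketch: hypothetical data and column names, not the Million Women Study dataset.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")              # assumed columns: height_cm, age_entry, age_exit, cancer
df["height_per_10cm"] = df["height_cm"] / 10.0

# Cox model with attained age as the underlying time variable (entry age as left truncation).
cph = CoxPHFitter()
cph.fit(df, duration_col="age_exit", event_col="cancer",
        entry_col="age_entry", formula="height_per_10cm")
rr_per_10cm = np.exp(cph.params_["height_per_10cm"])     # the study reports ~1.16 for total cancer

# Rescaling the published estimate, assuming log-linearity in height:
print(1.16 ** (15 / 10))                                  # implied RR ~1.25 for a 15 cm difference
```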

 

The increase in total cancer RR per 10 cm increase in height did not vary significantly by socioeconomic status or by ten other personal characteristics, but was significantly lower in current than in never smokers (p<0.0001). In current smokers, smoking-related cancers were not as strongly related to height as were other cancers (RR per 10 cm increase in height 1.05, and 1.17, respectively; p=0.0004). In a meta-analysis including the present study and ten other prospective studies, height-associated RRs for total cancer showed little variation across Europe, North America, Australasia, and Asia.

 

According to the authors, cancer incidence increases with increasing adult height for most cancer sites, and the relation between height and total cancer RR is similar in different populations.

TARGET HEALTH excels in Regulatory Affairs and Public Policy issues. Each week we highlight new information in these challenging areas.

 

 

FDA Outlines Oversight of Mobile Medical Applications

 

 

Today, mobile medical applications, or “mobile medical apps,” include a variety of functions, ranging from monitoring calorie intake and helping people maintain a healthy weight to allowing doctors to view a patient’s radiology images on a mobile communications device. According to a 2010 Research2Guidance report, 500 million smartphone users worldwide will be using a health care application by 2015.

 

The FDA announced that it is seeking input on its proposed oversight approach for certain mobile applications specific to medicine or health care called mobile medical applications (“apps”) that are designed for use on smartphones and other mobile computing devices. This approach encourages the development of new apps, focuses only on a select group of applications and will not regulate the sale or general consumer use of smartphones or tablets.

 

“The use of mobile medical apps on smart phones and tablets is revolutionizing health care delivery,” said Jeffrey Shuren, M.D., J.D., director of the FDA’s Center for Devices and Radiological Health. “Our draft approach calls for oversight of only those mobile medical apps that present the greatest risk to patients when they don’t work as intended.”

 

The agency’s draft guidance defines a small subset of mobile medical apps that impact or may impact the performance or functionality of currently regulated medical devices. This subset includes mobile medical apps that:

 

  • are used as an accessory to a medical device already regulated by the FDA (e.g., an application that allows a health care professional to make a specific diagnosis by viewing a medical image from a picture archiving and communication system (PACS) on a smartphone or a mobile tablet); or

 

  • transform a mobile communications device into a regulated medical device by using attachments, sensors or other devices (e.g., an application that turns a smartphone into an ECG machine to detect abnormal heart rhythms or determine if a patient is experiencing a heart attack).

 

The agency is seeking public input on this approach. Once posted, comments can be submitted for 90 days online or in writing to: Division of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061, Rockville, MD 20852. The FDA will update the guidance based on feedback received. For more information, go to:

 

 

 

 

For more information about our expertise in Medical Affairs, contact Dr. Mark L. Horn. For Regulatory Affairs, please contact Dr. Jules T. Mitchel or Dr. Glen Park.

Target Health (www.targethealth.com) is a full service eCRO with full-time staff dedicated to all aspects of drug and device development. Areas of expertise include Regulatory Affairs, comprising, but not limited to, IND (eCTD), IDE, NDA (eCTD), BLA (eCTD), PMA (eCopy) and 510(k) submissions, execution of Clinical Trials, Project Management, Biostatistics and Data Management, EDC utilizing Target e*CRF®, and Medical Writing.
Target Health has developed a full suite of eClinical Trial software including:
1) Target e*CRF® (EDC plus randomization and batch edit checks)

2) Target e*CTMS™

3) Target Document®

4) Target Encoder®

5) Target Newsletter®

6) Target e*CTR™ (electronic medical record for clinical trials).
Target Health’s Pharmaceutical Advisory Dream Team assists companies in strategic planning from Discovery to Market Launch. Let us help you on your next project.

 


TARGET HEALTH INC.


261 Madison Avenue
24th Floor
New York, NY 10016
Phone: (212) 681-2100; Fax (212) 681-2105

http://blog.targethealth.com
www.targethealth.com
Ms Joyce Hays, CEO
Dr. Jules T. Mitchel, President
©2011 Target Health Inc. All rights reserved

 

Medscape.com, by Sue Hughes, July 21, 2011 (Madrid, Spain) — A new study has confirmed the importance of continuing to take aspirin long term for patients with a history of heart disease [1].

The study, published online in the British Medical Journal on July 19, 2011, found that patients who stop taking aspirin are at a significantly increased risk of MI compared with those who continue treatment.

Researchers, led by Dr Luis Garcia Rodriguez (Spanish Centre for Pharmacoepidemiologic Research, Madrid, Spain), explain that low-dose aspirin is a standard treatment for the secondary prevention of cardiovascular outcomes. However, despite strong evidence supporting the protective effects of low-dose aspirin, around half of patients discontinue treatment. While many studies have shown this to be associated with an increased risk of cardiovascular events, they have all taken place in secondary care centers.

To study this issue in a primary care population, Garcia Rodriguez and colleagues analyzed data on 39,513 patients from the Health Improvement Network, a large UK database of primary care records. Patients were aged 50 to 84 years, had received a first low-dose aspirin prescription for the prevention of cardiovascular outcomes between 2000 and 2007, and were followed for three years.

The authors conducted a nested case-control analysis and compared 1222 cases (patients who had an MI or coronary heart disease [CHD] death) with 5000 controls. Aspirin had been discontinued in 12.2% of cases and 11.0% of controls. Compared with current use, recent discontinuation was associated with a clinically and statistically significant increase in risk of nonfatal MI and in the combined outcome of death from CHD and nonfatal MI. There was no significant difference in risk of CHD death alone.

Risk of Major Cardiovascular Events in Those Who Discontinued vs Those Who Continued Aspirin

Event           Rate ratio (95% CI)
Nonfatal MI     1.63 (1.23–2.14)
MI/CHD death    1.43 (1.12–1.84)

The results translate into four additional MIs each year for every 1000 patients who discontinued aspirin. The increased risk was present irrespective of the length of time the patient had previously been taking low-dose aspirin.
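
A quick way to sanity-check that absolute figure is the standard conversion from a rate ratio to an excess event rate. The baseline incidence below is an assumed value chosen only to illustrate the arithmetic; it is not a number reported in the paper.

```python
# Hedged illustration: baseline_rate is assumed, not taken from the BMJ paper.
baseline_rate = 0.0063      # assumed ~6.3 nonfatal MIs per 1,000 patient-years among continuers
rate_ratio = 1.63           # published estimate for nonfatal MI in recent discontinuers

excess_per_1000 = baseline_rate * (rate_ratio - 1) * 1000
print(round(excess_per_1000, 1))   # ~4.0 extra MIs per 1,000 patient-years under these assumptions
```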

The authors note that these results support those of previous studies in secondary care and show that they are applicable to the general population. “Reducing the number of patients who discontinue low-dose aspirin could have a major impact on the benefit obtained with low-dose aspirin in the general population,” they add. They call for further research to test whether efforts to encourage patients to continue prophylactic treatment with low-dose aspirin will decrease MI rates.

”Any day off aspirin is a day at risk for patients with previous cardiovascular disease.”

In an accompanying editorial [2], Dr Giuseppe Biondi-Zoccai (University of Modena, Italy) and Dr Giovanni Landoni (Università Vita-Salute San Raffaele, Milan, Italy) write: “any day off aspirin is a day at risk for patients with previous cardiovascular disease.”

But the editorialists note that some patients may not be able to take aspirin because of bleeding risk, and that the risk-benefit ratio of aspirin in individual patients should always be considered. But in general, “patients on chronic low-dose aspirin for secondary prevention of cardiovascular disease should be advised that unless severe bleeding ensues or an informed colleague explicitly says so, aspirin should never be discontinued given its overwhelming benefits on atherothrombosis, as well as colorectal cancer and venous thromboembolism.”

They add: “Accordingly, doctors should maintain their patients on low-dose aspirin as long as they can and carefully assess individual patients for the risk of both thrombosis and bleeding before discontinuing aspirin for invasive procedures. Patients who need to discontinue aspirin should do so for the minimum time necessary.”

ZINC ‘SPARKS’ FLY FROM EGG WITHIN MINUTES OF FERTILIZATION

 

 

Sparks of Life

 

 

 

NIH-funded study of animal eggs reveals major role for metal
For Immediate Release: Thursday, July 21, 2011

 

 

At fertilization, a massive release of the metal zinc appears to set the fertilized egg cell on the path to dividing and growing into an embryo, according to the results of animal studies supported by the National Institutes of Health.

 

The zinc discharge follows the egg cell’s steady accumulation of zinc atoms in the developmental stages before fertilization.  The researchers documented the discharge by bathing the eggs in a solution that gives off light when exposed to zinc.  They referred to the zinc discharge and accompanying light flash as zinc sparks.

 

“The discovery of egg cells’ massive intake and later release of zinc defines a new role for this element in biology,” said Louis DePaolo, chief of the Reproductive Sciences Branch at the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), one of the NIH institutes supporting the study.  “We anticipate the findings will one day lead to information useful for the treatment of infertility as well as the development of new ways to prevent fertilization from occurring.”

 

The study’s authors suggest that zinc acts as a switch, turning off the process of cell division while the egg matures and turning it on again after fertilization.

 

“These findings suggest zinc is essential for developing a healthy egg and, ultimately, a healthy embryo,” said Teresa Woodruff, Ph.D., one of the article’s senior authors.

 

The study’s first author is Alison M. Kim, Ph.D., of Northwestern University, Evanston, Ill.  The other authors are Miranda L. Bernhardt, Betty Y. Kong, Richard W. Ahn, Dr. Woodruff and Thomas V. O’Halloran, Ph.D., of Northwestern, and Stefan Vogt, Ph.D., of Argonne National Laboratory, Argonne, Ill.

 

Their findings appear in the July issue of ACS Chemical Biology.

 

In this study, the researchers observed egg cells from mice and from monkeys.  To conduct the study, they devised a microscope that would allow them to view the concentration and distribution of zinc atoms in individual cells.  With the aid of the chemical that gives off light when exposed to zinc, the researchers documented the first zinc sparks 20 minutes after fertilization.  Most fertilized eggs released two or three rounds of sparks, but the researchers saw as few as one and as many as five within the first two hours after fertilization.  The sparks flared every 10 minutes, on average.

 

Previous research had shown that fertilization triggers cyclical changes in the level of calcium in the egg cell.  The researchers noted that the zinc sparks always occurred after a peak in calcium levels inside the cell.

 

“The number, timing and intensity of these sparks could tell us something important about the quality of the egg and will be an important area for future research,” said Dr. O’Halloran, the article’s other senior author.  “It may also be worth investigating whether the amount of zinc in a woman’s diet plays a role in fertility.”

 

Additional experiments helped confirm a role for zinc in the fertilization process.  Typically, once the egg is released from the ovary, it must get rid of excess chromosomes in two stages as it prepares to fuse with the sperm.  The team’s earlier research showed that the early accumulation of zinc is essential for properly completing the first stage, Dr. O’Halloran explained.  The latest results suggest that zinc may act as a brake in between these stages, as the egg awaits fertilization.  If the cell is fertilized, the zinc release appears to lift the brake. The cell discards its excess genetic material and begins to divide.

 

The researchers also showed that even unfertilized eggs would start to divide if zinc levels were artificially reduced, mimicking release.  In addition, when fertilized cells were forced to take on additional zinc, the process was reversed.

 

“We have shown that zinc appears to regulate this precisely calibrated, intricate process,” Dr. Woodruff said.  “The findings give us new insights into what these cells need to grow and mature properly.”

 

The NICHD sponsors research on development, before and after birth; maternal, child, and family health; reproductive biology and population issues; and medical rehabilitation.  For more information, visit the Institute’s website at <http://www.nichd.nih.gov/>.

 

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit <www.nih.gov>.

 

 

This NIH News Release is available online at:

http://www.nih.gov/news/health/jul2011/nichd-21.htm

 


 

 

 

By Chris Jablonski | July 21, 2011

Summary

ZDNet.com  —  Using high magnetic fields, researchers have managed to suppress quantum decoherence, a key stumbling block for quantum computing.

 

Researchers announced that they’ve managed to predict and suppress environmental decoherence, a phenomenon that has been described as a “quantum bug” that destroys fundamental properties that quantum computers would rely on.

 

Decoherence is the tendency of atomic-scale particles to get quickly tangled up with the larger physical world we live in. Electrons, for instance, obey the laws of quantum physics and can therefore be in two places at once, like a coin simultaneously showing heads and tails. Scientists refer to this as state superposition. In contrast, larger, more complex physical systems appear to be in one consistent physical state because they interact and “entangle” with other objects in their environment and “decay” into a single state. The resultant decoherence is like a noise or interference that knocks the quantum particle, in this case the electron, out of superposition.

The realization of quantum computing’s promise depends on switches that are capable of state superposition. Until now, all efforts to achieve such superposition with many molecules at once were blocked by decoherence.
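
To make the idea of decoherence concrete, here is a toy NumPy sketch (not related to the Iron-8 experiment or its parameters) that evolves a single-qubit density matrix under pure dephasing: the populations stay fixed while the off-diagonal coherences, which encode the “two places at once” superposition, decay with a characteristic time T2.

```python
# Toy dephasing model: arbitrary units, purely illustrative of how coherence decays.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2.0)     # equal superposition |+> = (|0> + |1>)/sqrt(2)
rho0 = np.outer(plus, plus.conj()).astype(complex)

def dephase(rho, t, T2):
    """Pure dephasing: off-diagonal (coherence) terms decay as exp(-t/T2); populations unchanged."""
    out = rho.copy()
    out[0, 1] *= np.exp(-t / T2)
    out[1, 0] *= np.exp(-t / T2)
    return out

T2 = 1.0                                        # arbitrary coherence time
for t in (0.0, 0.5, 2.0, 10.0):
    rho = dephase(rho0, t, T2)
    print(f"t = {t:4.1f}  coherence |rho01| = {abs(rho[0, 1]):.3f}")
# The coherence falls from 0.5 toward 0: the superposition degrades into a classical mixture,
# which is exactly the effect the high-field experiment is designed to suppress.
```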

By using high magnetic fields, researchers at the University of British Columbia and the University of California, Santa Barbara discovered a way to reduce the level of noise in the surroundings so that the decoherence can be suppressed efficiently.

“For the first time we’ve been able to predict and control all the environmental decoherence mechanisms in a very complex system, in this case a large magnetic molecule called the ‘Iron-8 molecule,’” said Phil Stamp, UBC professor of physics and astronomy and director of the Pacific Institute of Theoretical Physics. “Our theory also predicted that we could suppress the decoherence, and push the decoherence rate in the experiment to levels far below the threshold necessary for quantum information processing, by applying high magnetic fields.”

The findings, which are published in today’s edition of the journal Nature, could help pave the way for the development of quantum computers that perform complex calculations orders of magnitude beyond what today’s traditional computers can handle.

Christopher Jablonski is a freelance technology writer. Previously, he held research analyst positions in the IT industry and was the manager of marketing editorial at CBS Interactive. He’s been contributing to ZDNet since 2003.

Christopher received a bachelor’s degree in business administration from the University of Illinois at Urbana/Champaign. With over 12 years in IT, he’s an expert on transformational technologies, particularly those influential in B2B.

Sources & further reading:

Nature: Decoherence in crystals of quantum molecular magnets

USC: USC Scientists Contribute to a Breakthrough in Quantum Computing

UBC: Discovery may overcome obstacle for quantum computing: UBC, California researchers

 

 

 

Quantum Computing Starts Yielding Real Applications

GoogleNews.com, July 21, 2011  —  Quantum Computing Starts Yielding Real Applications: Google has recently announced that it is successfully investigating the use of Quantum Algorithms to run its next generation of faster applications. So far, Google search services run in warehouses filled with conventional computers. As pointed out by ATCA in our briefings over the last few years, Quantum Computing and Quantum Algorithms have the potential to make search problems much easier to solve, so it is no surprise that Google finds it extremely important to get involved in this emerging area of Quantum Technology with the potential to bring about asymmetric disruptive change, garnering a massive first mover competitive advantage over its rivals.

Quantum Computers point to much faster processing, by exploiting the principle of quantum superposition: that a particle such as an ion, electron or photon can be in two different states at the same time. While each basic “binary digit” or “bit” of data in a conventional computer can be either a 1 or a 0 at any given time, a Qubit can be both at once! Classical computers use what is known as a von Neumann architecture, in which data is fetched from memory and processed according to rules defined in a program to generate results that are then stored step by step. It is essentially a sequential process, though multiple versions of it can run in parallel to speed things up!

Way back in 2007, a Canadian company called D-Wave claimed it demonstrated a Quantum Computer, although the jury is still out because no other research labs in the world — despite their large budgets and talented scientists — have been able to produce a fully functional Quantum Computer yet. D-Wave developed an on-chip array of Quantum bits – or Qubits – encoded in magnetically coupled superconducting loops. D-Wave’s investors include Goldman Sachs and Draper Fisher Jurvetson.

Hartmut Neven, Head of Google’s Image Recognition team, has revealed that the firm has been quietly developing a Quantum Algorithm application over the last three years that can identify particular objects in a database of stills or video. The team adapted Quantum adiabatic algorithms, discovered by Edward Farhi and collaborators at MIT, for the D-Wave chip so that it could learn to recognize cars in photos, and reported at the Neural Information Processing Systems summit in Vancouver, Canada, recently that they have succeeded. Using 20,000 photographs of street scenes, half of which contained cars and half of which didn’t, they trained the algorithm to recognize what cars look like by hand-labeling all the cars with boxes drawn around them. After that training, the quantum algorithm was set loose on a second set of 20,000 photos, again with half containing cars. It sorted the images with cars from those without faster than an algorithm on a conventional computer could, in fact, faster than anything running in a Google data centre at present. The team appears to have perfected a simulated annealing system that is well suited to searching images for well-defined objects. Normally this costs too much to have a computer do it in real time. However if Google and D-Wave can get it working then common searches can be pre-calculated and the results stored in databases for fast retrieval!
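
For a sense of what that kind of training involves, the sketch below runs a purely classical simulated-annealing search that selects a sparse subset of “weak classifiers” to vote on a binary label. It is only a toy stand-in for the quantum adiabatic/annealing approach described above: the data are random numbers, not images, and nothing here reflects Google’s actual algorithm or the D-Wave hardware.

```python
# Classical toy: simulated annealing over binary selection vectors (a stand-in for quantum annealing).
import numpy as np

rng = np.random.default_rng(0)

# Fake data: 200 "images", 30 weak classifiers voting -1/+1; labels are -1 (no car) or +1 (car).
n_samples, n_weak = 200, 30
labels = rng.choice([-1, 1], size=n_samples)
votes = np.where(rng.random((n_samples, n_weak)) < 0.6,
                 labels[:, None], -labels[:, None])      # each weak classifier is right ~60% of the time

def loss(w):
    """Misclassifications of the ensemble majority vote, plus a small sparsity penalty."""
    if w.sum() == 0:
        return float(n_samples)
    pred = np.sign(votes[:, w == 1].sum(axis=1) + 1e-9)
    return float((pred != labels).sum()) + 0.5 * w.sum()

w = rng.integers(0, 2, size=n_weak)                      # random initial selection
best_w, best_loss, temp = w.copy(), loss(w), 5.0
for step in range(3000):
    cand = w.copy()
    cand[rng.integers(n_weak)] ^= 1                      # flip one bit
    delta = loss(cand) - loss(w)
    if delta < 0 or rng.random() < np.exp(-delta / temp):
        w = cand                                         # accept downhill moves, and some uphill ones
        if loss(w) < best_loss:
            best_w, best_loss = w.copy(), loss(w)
    temp *= 0.999                                        # cool slowly

print("classifiers selected:", int(best_w.sum()), " best loss:", best_loss)
```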

There has been some dispute over whether D-Wave’s Chimera chip is actually a Quantum Computer. Hartmut Neven at Google acknowledges, “It is not easy to demonstrate that a multi-Qubit system such as the D-Wave chip exhibits the desired quantum behavior, and physicists are still in the process of characterizing it.” However, while questions remain over the exact capabilities of D-Wave’s hardware, future developments are likely to centre on different Quantum Computing hardware. For example, it is widely accepted that trapped ions are the most successful implementation of Quantum Technology.

The mi2g Intelligence Unit and the ATCA Research and Analysis Wing (RAW) expect more and more government agencies and companies around the world to pursue research in Quantum Computing and Quantum Algorithms in the coming few years due to their vast potential not only in search applications but also for a multiplicity of other complex problem solving capabilities.

 

 

 

A Few Interesting Science Articles to Ponder Over the Weekend

 

 

 

Infinity Bridge, UK, as a Symbol of The Holotropic State

 

Infinity Bridge, Stockton-on-Tees, U.K.    Credit: E8 Album HQR Initiative

 

 

Infinity Bridge, UK, as a Symbol of The Holotropic State: The key findings of Quantum Physics point towards Holistic Quantum Relativity and the Holotropic State, especially:

1. Dr David Bohm’s work about the universe being made up of an ‘interconnected unbroken wholeness';

2. The Non-Locality phenomena, ie, Bell’s Theorem; and

3. The ‘Observer Effect’ implying that consciousness underlies all reality.

Original References from Holistic Quantum Relativity (HQR):

I. Quantum Physics — The Holotropic State

The key findings of Quantum Physics point towards Holistic Quantum Relativity and the Holotropic State, especially: 1. Dr David Bohm’s work about the universe being made up of an “interconnected unbroken wholeness”; 2. The Non-Locality phenomena, ie, Bell’s Theorem; and 3. The ‘Observer Effect’ implying that consciousness underlies all reality.

Points 1, 2 and 3 — taken together — have striking parallels with the timeless spiritual concepts that all reality is the manifestation of an Infinite Singularity — Creative Principle — which we may choose to name Self-Designing Source, Supra-Universal Consciousness, Divine Principle or God. This demonstrates that Quantum Physics is moving the boundaries of our thinking towards the Holotropic State. Holotropic means “moving toward wholeness.” [Origin: Greek “Holos” = whole and “Trepein” = moving in the direction of]

However, none of this should be surprising to those mystics who have experienced the ‘Oneness’ via some sort of deep spiritual inner experience or Holotropic State.

The Growth of Quantum Physics beyond The Classical Model of Science

Many consider Quantum Physics to be at the cutting edge of Western science and in many respects it goes beyond Einstein’s Theory of Relativity. The interesting challenge associated with quantum physics is that the original impetus giving rise to it, namely the pursuit of the elemental building blocks of the Universe (separate elementary particles) has become meaningless with the discovery that the Universe appears to be an undivided Whole in a perpetual state of dynamic flux.

Like Einstein’s Theory of Relativity, the latest experiments and associated theories of Quantum Physics reveal the Universe to be a single gigantic field of energy in which matter itself is just a ‘slowed down’ form of energy. Further, Quantum Physics has postulated that matter/energy does not exist with any certainty in definite places, but rather shows ‘tendencies’ to exist. [Heisenberg’s Uncertainty Principle]

Even more intriguing is the notion that the existence of an observer is fundamental to the existence of the Universe, a concept known as ‘The Observer Effect’, implying that the Universe is a product of consciousness and that, beyond it, the Universe and Supra-Universe are products of the Super-Consciousness. [The Mind of God concept]

“Through experiments over the past few decades physicists have discovered matter to be completely mutable into other particles or energy and vice-versa and on a subatomic level, matter does not exist with certainty in definite places, but rather shows ‘tendencies’ to exist. Quantum physics is beginning to realize that the Universe appears to be a dynamic web of interconnected and inseparable energy patterns. If the universe is indeed composed of such a web, there is logically no such thing as a part. This implies we are not separated parts of a whole but rather we are the Whole.” [The Hands of Light, Barbara Brennan, American physicist]

On the other hand, the methodology of contemporary Western science, which is still taught in most of our educational institutions today, works on the basis of breaking the world into its component parts. Quantum physicist Dr David Bohm stated that “primary physical laws cannot be discovered by a science that attempts to break the world into its parts.” Bohm wrote of an “implicate enfolded order” which exists in an un-manifested state and which is the foundation upon which all manifest reality rests. He called this manifest reality “the explicate unfolded order”. He went on to say that “parts are seen to be in immediate connection, in which their dynamical relationships depend in an irreducible way on the state of the whole system . . . Thus, one is led to a new notion of unbroken wholeness which denies the classical idea of analyzability of the world into separately and independently existent parts.” [The Implicate Order]

The classical perception of the ‘rules’ of the Universe is changing to reflect the multiple universes predicted by mathematics based on Quantum Physics. In other words, the mathematical formulae that were initially developed to describe the behavior of the universe (with multiple universes) turn out to govern the behavior of the multiple universes and planes described in spirituality. Thus, the mathematics that was thought to produce some absurd results has in fact come to be relied upon to demonstrate that matter and energy somehow indeed change to behave in exactly that absurd manner, reflecting the formulae!

Of course, all these notions completely contradict the understanding of reality held by most humans, whose perception of reality is still based upon “Linear Cause and Effect” Newtonian Physics, if only because Newtonian Physics seems to describe the observed Universe, as we knew it, so well at a preliminary level (though not at the micro level of subatomic ‘particles’, nor in accounting for acceleration and other unusual phenomena).

Non-Locality is defined as that phenomenon in which occurrences on one side of the Universe can instantly affect ‘matter’ or ‘energy’ on the other side of the Universe. Non-locality has profound implications for the prevailing world view of reality in that it clearly demonstrates the interconnectedness of all ‘matter’ and ‘energy’ in the Physical Universe and the illusory nature of Space and Time, something that those who have had some sort of deep spiritual experience are already well aware of.

 

 

 

 

 

 

 


 

 

 

 

 

 

 

 

The Gaping Chasm between Economics and Physics

 

 

The Gaping Chasm between Economics and Physics – Rising Systemic Risk and Multiple Black Swans: Does today’s dominant economic and financial thinking violate the laws of physics? Mainstream finance and economics have long been inconsistent with the underlying laws of thermodynamics, which are fast catching up as a result of globalization! At present, economics is the study of how people transform nature to meet their needs and it treats the exploitation of finite natural resources including energy, water, air, arable land and oceans as externalities, which they are not! For example, we cannot pollute and damage natural ecosystems and their local communities ad infinitum without severe repercussions to their underlying sustainability. It is widely recognized both within the distinguished ATCA community and beyond that exchange rate instability, equity and commodity market speculation — particularly fuel, food and finance — and resultant volatilities, as well as external debt, are the main causes of asymmetric threats and disruption at the international level, manifest as known unknowns, ie, low-probability high-impact risks, and unknown unknowns, or black swans.

There is a fundamental conflict between economic growth and environmental protection, including conservation of biodiversity, clean air and water, and atmospheric stability. This conflict is due to the laws of thermodynamics. An economic translation of the first law of thermodynamics is that we cannot make something from nothing. All economic production must come from resources provided by nature. Also, any waste generated by the economy cannot simply disappear. At given levels of technology, therefore, economic growth entails increasing resource use and waste in the form of pollution. According to the second law of thermodynamics, although energy and materials remain constant in quantity, they degrade in quality or structure. This is the meaning of increasing entropy. In the context of the economy, whatever resources we transform into something useful must decay or otherwise fall apart to return as waste to the environment. The current model of the disposable economy operates as a system for transforming low-entropy raw materials and energy into high-entropy toxic waste and unavailable energy, while providing society with interim goods and services and the temporary satisfaction that most deliver. Any such transformations in the economy mean that there will be less low-entropy materials and energy available for natural ecosystems. Mounting evidence of this conflict demonstrates the limits to our global growth!

Where do massive turbulences actually come from and what is the underlying cause of periodic financial and economic crises with accelerating levels of severity at national and trans-national levels? Mainstream economics is fundamentally flawed in its measurement of: 1. The value of human capital; 2. The real long term cost of renewal of natural ecosystems and resources; and 3. The overall health of the economy as assessed by Gross Domestic Product (GDP). The near-universal quest for constant economic growth — translated as rising GDP — ignores the world’s diminishing supply of natural resources at humanity’s peril, failing to take account of the principle of net Energy Return On Investment (EROI). The Great Reset — the protracted 21st century financial and economic crisis and global downturn which ATCA originally labeled as The Great Unwind in 2007 — has led to much soul-searching amongst economists and policy makers, the vast majority of whom never saw it coming because they never understood that the credit pyramid is an inversion of the energy-dependence pyramid.

When we look beyond the narrow lens of the current human perspective, survival of all living creatures — including ourselves — is limited by the concept of Energy Return on Investment (EROI). What does EROI really mean? Any living thing or living society can survive only so long as it is capable of getting more net energy from any activity than it expends during the performance of that activity. This simple concept is ignored by present-day economics when focusing solely on demand and supply curves or daily financial market gyrations. For example, if a human burns energy eating food, that food ought to give that person more energy back than s/he expended, or the person will not survive. It is a golden rule that lies at the core of studying all flora and fauna, whether they are micro-organisms, thousand year old trees or mighty elephants. Human society should be looked at no differently: even technologically complex societies are still governed by the EROI and the laws of thermodynamics!

The petroleum sector’s EROI in USA was about 100-to-1 in the 1930s, meaning one had to burn approximately 1 barrel of oil’s worth of energy to get 100 barrels out of the ground. By the 1990s, that number slid to less than 36-to-1, and further down to 19-to-1 by 2006. It has fallen even further in 2009. Oil extraction has evolved by leaps and bounds since the early 1900s, and yet companies must expend much more energy to get less and less oil than they did a hundred years ago. If one were to go from using a 19-to-1 energy return on fuel down to a 3-to-1 EROI, economic disruption is guaranteed as nothing is left for other economic activity at all!
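
The practical meaning of those ratios is easy to see with a little arithmetic: an EROI of N-to-1 means 1/N of the gross energy output is consumed just getting the next unit of energy. The snippet below simply restates the figures quoted above in those terms (the essay’s stronger claim about 3-to-1 presumably reflects additional downstream energy costs, such as refining and distribution, that this bare ratio ignores).

```python
# Share of gross energy output that must be reinvested in extraction, for the EROI values cited above.
for eroi in (100, 36, 19, 3):
    reinvested = 1 / eroi                  # fraction of output burned to obtain the next barrel
    net = 1 - reinvested                   # fraction left over for all other economic activity
    print(f"EROI {eroi:>3}-to-1: {reinvested:6.1%} reinvested, {net:6.1%} available for other uses")
```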

Is it because we don’t have the technology that we find ourselves cornered? No. Technology is in a global race with rocketing energy consumption and accelerating depletion of energy, and that’s a very complex set of challenges to confront simultaneously. The resource constraints foreseen by the Club of Rome in 1972 are more evident today than at any time since the publication of the think tank’s famous book, “The Limits to Growth” which stated: “If the present growth trends in world population, industrialization, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.”

Although more than 60 years have elapsed since The Great Depression of the 1930s and the subsequent horrors of the second world war, our understanding of severe economic downturns has improved very little in the 20th and 21st centuries. Economists, financiers, and policy makers are too often at a loss when asked to provide a diagnosis and propose a remedy for the recurrence of complex systemic risk. The main problem with mainstream economics is that it treats energy the same as any other commodity input in the production function, thinking of it purely in money terms and treating it the same as it would other raw materials and sub-components, but without energy, one can’t have any of the other inputs or outputs! We have to begin regarding Calories, Joules and Watts as the key currencies rather than the Dollar, Euro and Yen!

Is lowering the carbon footprint, as Copenhagen would have us believe, the only answer, or is conservation and energy efficiency another important thread in the global solution that we all seek? Neither would be sufficient at our present rate of accelerating energy consumption worldwide! The International Energy Agency’s data shows that global energy use is doubling every 37 years or so, while energy productivity takes about 56 years to double! Energy and resource conservation is somewhat pointless in the mainstream economic system as it is now legislated and operates. Whilst such efforts are noteworthy, as they buy the world a bit more time (as Copenhagen no doubt will claim), the destination is inevitably unaltered! A barrel of oil not burned by an American or European will be burned by someone else in an emerging country as that nation seeks to topple its rival in the high GDP growth league!

What is needed is a unified working model consistent with the nature of Energy Return on Investment (EROI) and capable of accounting for the process of all types of capital accumulation, treatment of externalities as internalities, mapping global energy flows and their circulation. In 1926, Frederick Soddy, a chemist who had been awarded the Nobel Prize a few years earlier, published “Wealth, Virtual Wealth and Debt,” one of the first books to argue that energy, not supply-and-demand curves, should lie at the heart of economics. Soddy was critical of traditional monetary policy for seemingly ignoring the fact that “real wealth” is derived from using energy to transform physical objects, and that these physical objects are inescapably subject to the laws of thermodynamics, or inevitable decline and disintegration.

The main problem is that we as a global society, even in 2009, are almost incapable of detecting and measuring systemic risk in a complex system, as we have seen in the global financial and economic crises: The Great Unwind and The Great Reset. Given this inability, we tend to focus on a single cause — such as capping carbon emissions at Copenhagen — and extrapolate from it a wider perspective, which can lead to an incomplete and distorted view. A new model of economics must aim to contribute to the development of the scientific understanding of the way our economic systems work holistically, with particular reference to inherent monetary disequilibria caused by dependence on energy, water, air, arable land and natural resources, and how those could practically be dealt with via modern physics.

We welcome your thoughts, observations and views. To reflect further on this, please respond within Twitter, LinkedIn and Facebook’s ATCA Open and the related discussion platform of HQR.

 

 

 

The Great Reset

HQR.com, The Great Reset — How To Regenerate World Growth? The World Trade Organization (WTO) recently held its first major ministerial meeting in Geneva, since Hong Kong in 2005, as it sought to find some way to get world trade liberalization back on track, post ‘The Great Reset’.

What is the Great Reset? The Great Reset occurred between the third quarter of 2008 and the second quarter of 2009, when global demand for durable products collapsed abruptly, leaving a vast gap relative to supply capacity worldwide. The Great Reset is colossal — the steepest fall of world trade in recorded history and the deepest fall since the Great Depression of the 1930s. World demand experienced a sudden, severe and synchronized plunge on an unprecedented scale in the last quarter of 2008 after the insolvency of Lehman Brothers in September. Signs are that we might be turning a corner in the second half of 2009. However, according to Pascal Lamy, Director General, WTO, “Despite some evidence that trade volumes grew over the summer this year, global recovery has been patchy — and so fragile that a sudden shock in equity or currency markets could once again undermine consumer and business confidence, leading to a further deterioration of world trade.”

Severity and Speed

It took 24 months in the Great Depression of the 1930s for world trade to fall as far as it fell in the 9 months from November 2008 to July 2009. The seven biggest month-on-month drops in world trade — based on data compiled by the OECD over the past 44 years — all occurred since November 2008. Global trade has dropped before, three times since WWII, but nothing when compared to The Great Reset we are going through at present. Those three recessions were the oil-shock of 1974, the inflation-check of 1982, and the DotCom bust plus 9/11 in 2001. The Great Reset of 2008-09 is much worse; for two quarters in a row, world trade flows have been 15% below their previous year levels. “Driven largely by collapsing domestic demand and production levels, but also by a shortage of affordable trade finance, trade volumes are likely to fall by a further 10% this year. Whether world trade will recover next year is an open question,” states Pascal Lamy.

Scale and Synchronicity

All 104 nations on which the WTO reports data experienced a drop in both imports and exports during the second half of 2008 and the first half of 2009. Imports and exports collapsed for the European Union’s 27 member countries and 10 other leading nations, which together account for three-quarters of world trade. Each of these trade flows dropped by more than 20% during that period; many fell 30% or more. Why did world trade fall so much more than GDP? Given the global recession, a drop in global trade is not surprising. The question remains: Why so big? During the four large post-war recessions — 1975, 1982, 1991 and 2001 — world trade dropped nearly 5 times more than GDP. This time the drop is far, far larger.

Trans-national Supply Chains

Evidence shows that the world trade-to-GDP ratio rose steeply in the late 1990s before stagnating in the 21st century right up to the start of The Great Reset in 2008, when it fell off a cliff. The rise in the 1990s is explained by a number of interlinked factors including trade liberalization and trans-national supply chains. Essentially, geography became history! Manufacturing was geographically unbundled with various modules of the value-added processes being placed in the most cost-effective or time-efficient nations on the planet. This unbundling meant that the same value-added item crossed national borders several times. In a simple trans-national supply chain, imported sub-components would be transformed step-by-step into exported components which in turn would be assembled into final goods and exported again, so the world trade figures counted the final value add several times over. The presence of these highly integrated and tightly synchronized production and distribution networks has played an important and unprecedented role in precipitating the severity, speed, scale and synchronicity of The Great Reset worldwide.
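
A small, invented numerical example makes the double-counting effect concrete. Suppose a single final good embodies 100 units of value added across three countries; gross trade statistics record the full value of each shipment at every border crossing, so the recorded trade is roughly double the value actually created, and any drop in final demand is amplified in the trade figures.

```python
# Invented numbers: value added at each stage of a three-country chain for one final good.
value_added = [40, 30, 30]                  # sub-components, components, final assembly

cumulative, gross_trade = 0, 0
for stage_value in value_added:
    cumulative += stage_value               # value embodied in the shipment leaving this country
    gross_trade += cumulative               # the whole shipment value is recorded as an export

print("total value added:         ", sum(value_added))   # 100
print("recorded gross trade flows:", gross_trade)         # 210: the same value counted repeatedly
```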

The Great Reset is manifest as a gigantic drop in international sales and is mostly a demand shock, although supply-side factors did play some role. The demand shock operated through two distinct reinforcing channels: 1. commodity prices, which tumbled when the price bubble burst in mid-2008; and 2. the production and export of manufactured goods, which collapsed as modern trade in durable manufactured goods fell dramatically. In the face of financial crisis and uncertainty, consumers and corporations postponed purchases of anything that wasn’t needed immediately.

Conclusion

If we carefully study world events in the 17th, 18th and 19th centuries, as well as the Great Depression in the last century, we can see that Globalization has had a long and cyclical propensity to generate bubble after bubble followed by collapse and whiplash. Clearly, the greatest danger of The Great Reset we are living through, is not simply the destruction of demand, wealth and living standards, however unpalatable that may be. It is also the destruction of “value” in the ethical and moral sense of an interlinked system of trust, commitments and social obligations, which allow trans-national capitalism to operate in harmonious concert. When in 1931 there was a collapse of confidence that resulted in the proliferation of beggar-thy-neighbor policies regarding currencies, trade and immigration, there was “De-Globalization” that damaged economies around the world and ultimately led to geo-political upheaval and confrontation. If we do not pay heed to shore up that core system of values and global trade agreements and descend into tit-for-tat tariffs and penalties — as in the Great Depression of the 1930s — we may be truly on the edge of an abyss within The Great Reset of 2008-?

[ENDS]

We welcome your thoughts, observations and views. To reflect further on this, please respond within Twitter, LinkedIn and Facebook’s ATCA Open and the related discussion platform of HQR.

All the best

DK Matai

Chairman and Founder: mi2g.net, ATCA, The Philanthropia, HQR, @G140

To connect directly with:

. DK Matai: twitter.com/DKMatai

. Open HQR: twitter.com/OpenHQR

. ATCA Open: twitter.com/ATCAOpen

. @G140: twitter.com/G140

. mi2g: twitter.com/intunit

– ATCA, The Philanthropia, mi2g, HQR, @G140 —

This is an “ATCA Open, Philanthropia and HQR Socratic Dialogue.”

The “ATCA Open” network on LinkedIn and Facebook is for professionals interested in ATCA’s original global aims, working with ATCA step-by-step across the world, or developing tools supporting ATCA’s objectives to build a better world.

The original ATCA — Asymmetric Threats Contingency Alliance — is a philanthropic expert initiative founded in 2001 to resolve complex global challenges through collective Socratic dialogue and joint executive action to build a wisdom based global economy. Adhering to the doctrine of non-violence, ATCA addresses asymmetric threats and social opportunities arising from climate chaos and the environment; radical poverty and microfinance; geo-politics and energy; organized crime & extremism; advanced technologies — bio, info, nano, robo & AI; demographic skews and resource shortages; pandemics; financial systems and systemic risk; as well as transhumanism and ethics. Present membership of the original ATCA network is by invitation only and has over 5,000 distinguished members from over 120 countries: including 1,000 Parliamentarians; 1,500 Chairmen and CEOs of corporations; 1,000 Heads of NGOs; 750 Directors at Academic Centers of Excellence; 500 Inventors and Original thinkers; as well as 250 Editors-in-Chief of major media.

The Philanthropia, founded in 2005, brings together over 1,000 leading individual and private philanthropists, family offices, foundations, private banks, non-governmental organizations and specialist advisors to address complex global challenges such as countering climate chaos, reducing radical poverty and developing global leadership for the younger generation through the appliance of science and technology, leveraging acumen and finance, as well as encouraging collaboration with a strong commitment to ethics. Philanthropia emphasizes multi-faith spiritual values: introspection, healthy living and ecology. Philanthropia Targets: Countering climate chaos and carbon neutrality; Eliminating radical poverty — through micro-credit schemes, empowerment of women and more responsible capitalism; Leadership for the Younger Generation; and Corporate and social responsibility.

The “E8 Album” photos are visual intersections of Spirituality, Science, Art and Sustainability!

The Holistic Quantum Relativity (HQR) Group is on:

groups.google.com/group/holistic-quantum-relativity-hqr

Drug Discovery with Computational Chemistry

 

Unhappy water: Chemical modeling software called WaterMap can predict how water molecules, shown here in red and green, influence how strongly drug candidates bind to their intended target.     Credit: Schrödinger

 

 

 

A startup is banking on new software that incorporates the energy of water molecules into chemical models

 

 

 

MIT Technology Review, July 20, 2011, by Emily  Singer  —  Most pharmaceutical companies use software to model chemical interactions, with the hope of speeding up the drug development process. But it’s typically a small component of a complex array of approaches. Nimbus Discovery, a startup based in Cambridge, Massachusetts, is using computational chemistry to drive the entire process.

The company emerged from a partnership with Schrödinger, a maker of computational drug discovery software, and venture capital firm Atlas Venture. Nimbus will use Schrödinger’s software, computing power, and modeling experts to develop drugs for disease-linked proteins that have historically been difficult to target.

If successful, this computationally driven approach could make drug development faster and cheaper by making much of the trial and error process virtual.  Nimbus recently raised $24 million in venture funding. Bill Gates was one of the investors.

Schrödinger’s software, which is used by many pharmaceutical companies, models the various chemical forces that drive a candidate drug molecule to bind to a specific spot on the target protein. That allows drug developers to predict how well various candidate molecules bind to targets of interest. While this approach has been in use for about two decades, it has yet to truly transform the drug-discovery process.
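To make that concrete, here is a minimal, purely illustrative Python sketch of how an empirical scoring function can rank candidate poses by summing weighted interaction terms. It is not Schrödinger's actual model; every term, weight and name below is invented for illustration.

import math
from dataclasses import dataclass

@dataclass
class Contact:
    kind: str        # "hbond", "hydrophobic" or "clash"
    distance: float  # ligand-protein atom distance in angstroms

# Hypothetical per-interaction weights in kcal/mol (negative = favorable).
WEIGHTS = {"hbond": -1.2, "hydrophobic": -0.3, "clash": +4.0}

def score_pose(contacts):
    """Toy additive score: lower (more negative) means a more favorable predicted pose."""
    total = 0.0
    for c in contacts:
        # Crude distance damping: interactions fade beyond roughly 3-4 angstroms.
        damping = math.exp(-max(0.0, c.distance - 3.0))
        total += WEIGHTS[c.kind] * damping
    return total

pose_a = [Contact("hbond", 2.9), Contact("hydrophobic", 3.8)]
pose_b = [Contact("hbond", 3.4), Contact("clash", 2.1)]
print(sorted([("A", score_pose(pose_a)), ("B", score_pose(pose_b))], key=lambda t: t[1]))

Real packages use far richer terms (electrostatics, desolvation, ligand strain), but the ranking principle is the same: score many virtual candidates, then synthesize and test only the best-scoring ones.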

Nimbus researchers think that part of the reason is that most tools fail to incorporate the thermodynamics of the resident water molecules in the protein’s binding site. “The need for improved water models is a widely acknowledged yet seldom-addressed limitation of current methods,” says Christopher Snow, a postdoctoral researcher at Caltech who is not involved with the company. It’s difficult to model the energy of water molecules.

WaterMap, a new tool from Schrödinger that predicts how water will affect the binding reaction, could overcome that barrier. “We think we can use our technology to transform the way drug development is done,” says Ramy Farid, president of Schrödinger and cofounder of Nimbus. Researchers have used WaterMap to explain the success or failure of some molecules, as well as to develop new candidate molecules. “It led in a number of cases to rapid development of drug candidates that were of higher quality than what appeared to be otherwise possible,” says Farid.
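Schrödinger's published descriptions of WaterMap characterize each binding-site water by its excess enthalpy and entropy relative to bulk solvent, derived from explicit-solvent molecular dynamics. The back-of-the-envelope Python sketch below is not that algorithm; it only illustrates, with invented numbers, how displacing a high-energy ("unhappy") hydration site would translate into an estimated gain in binding affinity.

import math

R = 0.001987  # gas constant, kcal/(mol*K)
T = 298.15    # temperature, K

# Hypothetical hydration sites: excess enthalpy dH and entropic penalty -T*dS
# relative to bulk water, both in kcal/mol. All values are invented.
sites = {
    "site_1": (1.8, 1.1),   # frustrated, "unhappy" water: costly to keep in place
    "site_2": (0.4, 0.6),
    "site_3": (-0.5, 0.3),  # comfortable water: little to gain by displacing it
}

def displacement_gain(displaced):
    """Estimated binding free-energy gain (kcal/mol) from displacing the named sites."""
    return sum(dH + minus_TdS for name, (dH, minus_TdS) in sites.items() if name in displaced)

ddG = displacement_gain({"site_1"})
fold = math.exp(ddG / (R * T))  # corresponding improvement in the binding constant
print(f"ddG ~ {ddG:.1f} kcal/mol -> ~{fold:.0f}-fold tighter binding")

With these made-up numbers, displacing the single frustrated site is worth about 2.9 kcal/mol, roughly a hundred-fold change in the binding constant, which is the scale of effect described below.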

The startup spent its first year using the software to narrow a list of 1,200 potential drug targets (chemical binding sites on different disease-linked proteins) down to the 20 that looked most amenable to the technology. (That depended on a number of factors, including knowledge of the protein’s three-dimensional structure, its desirability as a target for disease, and the number of water molecules that reside in the binding site.) The company will focus on targets involved in inflammation, oncology, metabolic disease, and antibiotics.
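The exact criteria and thresholds Nimbus applied are not public. As a hypothetical illustration only, the triage described above amounts to a filter along these lines, with every target name, score and cutoff made up.

candidate_targets = [
    {"name": "IRAK4",    "has_structure": True,  "disease_relevance": 0.9, "site_waters": 11},
    {"name": "TARGET_X", "has_structure": False, "disease_relevance": 0.8, "site_waters": 7},
    {"name": "TARGET_Y", "has_structure": True,  "disease_relevance": 0.3, "site_waters": 14},
]

def worth_pursuing(t, min_relevance=0.5, min_site_waters=5):
    # Keep targets with a solved 3-D structure, sufficient disease relevance,
    # and enough binding-site waters for a WaterMap-style analysis.
    return (t["has_structure"]
            and t["disease_relevance"] >= min_relevance
            and t["site_waters"] >= min_site_waters)

shortlist = [t["name"] for t in candidate_targets if worth_pursuing(t)]
print(shortlist)  # ['IRAK4'] in this made-up example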

The most advanced target to date is called IRAK4, a kinase enzyme that plays a role in inflammation and drives an aggressive form of non-Hodgkin’s lymphoma. Researchers conducted a virtual drug screen, looking for molecules that would bind to IRAK4, and then put those virtual molecules to the test by synthesizing them and running real chemical reactions. “We have been able to quickly find a highly selective molecule with drug-like properties,” says Rosana Kapeller, Nimbus’s chief scientific officer. It took just nine months to go from virtual screening to testing in animal models of disease.

“We have seen powerful examples of how minor changes to the molecule can result in profound changes in binding,” says Bruce Booth, one of Nimbus’s cofounders. By displacing one “unhappy” molecule, a high-energy water molecule in the binding site, “we can improve binding a hundred-fold,” he says.
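For scale, a hundred-fold improvement in a binding constant corresponds to a free-energy change of about RT·ln(100), roughly 2.7 kcal/mol at room temperature, so the claim amounts to attributing a few kilocalories per mole to a single displaced water, consistent with the rough sketch above.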

While WaterMap is available to pharmaceutical companies for purchase, Farid says, the newness of the technology and the fact that it requires intense computing power have made it difficult to implement effectively. Part of the reason for founding Nimbus, he says, was to demonstrate how powerful the tool can be.

But it remains to be seen how significantly the WaterMap tool will speed drug discovery or how broadly applicable it will be. It may turn out to be very useful for some targets but not others.

Cell rebirth: Charles Limoli hopes that neural stem cells, like the ones shown here, can help regenerate brain cells damaged or destroyed by cancer treatment. Cell nuclei are shown in red. Credit: Charles Limoli

The study could offer hope for brain cancer patients, who often suffer dire cognitive problems as a result of radiation treatment

MIT Technology Review, July 20, 2011, by Karen Weintraub — Radiation treatment for brain cancer can be lifesaving, but it can come at a terrible cost. The radiation that kills cancer cells also kills brain cells, destroying memories, impairing intelligence, and causing confusion.

Charles Limoli and colleagues at the University of California, Irvine, have shown that stem cells could help reverse some of this damage. In a new paper in the journal Cancer Research, Limoli shows that it’s possible to cause new brain cells to grow by injecting human neural stem cells into the brains of mice whose cognitive abilities had been damaged by radiation. The mice regained lost skills after the stem-cell treatment.

Stem cells have long been used to repair the damage caused by cancer treatment. Bone-marrow transplants for leukemia rely on stem cells to replenish blood cells, for instance. But Limoli says his team is the only one using neural stem cells to treat symptoms in the brain.

Several peers praised his work, calling it an important proof of the idea that human stem cells can repair neuronal damage.

“The results are very promising,” says Howard B. Lieberman, professor of radiation oncology and environmental health sciences at Columbia University. “If the findings continue to be as positive as what’s published in this paper, I would assume Dr. Limoli will take great effort to try to move it into the clinic as quickly as possible.”

Limoli’s team irradiated three groups of mice, later treating two of them with human neural stem cells. The third, a control group, received a sham surgery, but no cells were implanted. One month after the damage, 23 percent of implanted stem cells were active in the brains of the first group of mice. After four months, 12 percent were still active in the second group. Using cellular labeling, Limoli’s team also showed that tens of thousands of new neurons and astrocyte cells had grown in the brains of the treated mice. The treated mice performed better than the untreated ones on cognitive tests, and recovered their preradiation abilities.

Protein activity in the treated mice suggests that the implanted stem cells are integrating into the brain, Limoli says, replacing cells that have been lost or damaged.

Both Limoli and Lieberman say the treatment could also be effective against “chemo brain,” a side effect often reported by breast cancer patients. The chemotherapy can impair their ability to focus and think clearly.

Rob Coppes, a radiation and stem-cell biologist at the University Medical Center Groningen, in the Netherlands, says he would next like to see Limoli test how long the benefits of the stem cells last. He also hopes Limoli will repeat his experiments using induced pluripotent stem cells (iPS cells), adult stem cells that have been converted back to an embryonic-like state. These would likely be the cells that doctors would use in patients. Ideally they’d be taken from the patients themselves to avoid an immune rejection.

It will be important to show that mice—and later, humans in a trial—don’t reject these cells, and also that the stem cells don’t trigger new cancers, says Coppes, who employs stem cells in his own work, which involves regenerating salivary glands.

Limoli plans to carry out further work involving human neuronal stem cells and iPS cells. He also wants to figure out the optimal time to transplant these stem cells into the brain.
