Stanford University Medical Center, November 18, 2009  –  A well-known Eastern medicine supplement may help prevent the liver damage that is the most common reason for liver transplantation, according to a study by researchers at the Stanford University School of Medicine. The finding came as a surprise to the scientists, who used a number of advanced genetic and genomic techniques in mice to identify a molecular pathway that counters acetaminophen toxicity, which can lead to liver failure.

“I didn’t know anything about the substance that was necessary for the pathway’s function, so I had to look it up,” said Gary Peltz, MD, PhD, professor of anesthesiology. “My postdoctoral fellow, whose parents and other family members in Asia were taking this compound in their supplements, started laughing. He recognized it immediately.”

The molecule was S-methylmethionine, which had been marketed as an herbal medicine known as Vitamin U for treatment of the digestive system. It is highly abundant in many plants, including cabbage and wheat, and is routinely ingested by people. Coincidentally, Garnett Cheney, MD, at Stanford University performed a series of studies in the 1950s in which he used the compound to treat peptic ulcers.

Peltz is the senior author of the research, which will be published online Nov. 18 in Genome Research. The experiments were conducted in Peltz’s laboratory at Roche Palo Alto in Palo Alto, Calif., where Peltz worked before coming to Stanford in July 2008. He is continuing the research at Stanford. The first author of the paper, Hong-Hsing Liu, MD, PhD, is now a postdoctoral scholar in Peltz’s Stanford lab.

Acetaminophen is a pain reliever present in many over-the-counter cold and flu medicines. It is broken down, or metabolized, in the body into byproducts — one of which can be very toxic to the liver. At normal, therapeutic levels, this byproduct is easily deactivated when it binds to a naturally occurring, protective molecule called glutathione. But the body’s glutathione stores are finite, and are quickly depleted when the recommended doses of acetaminophen are exceeded.

Unfortunately, the prevalence of acetaminophen makes it easy to exceed the recommended levels accidentally, whether by dosing more frequently than indicated or by combining two or more acetaminophen-containing products. Severe liver damage can occur at as little as two to three times the recommended dose (the maximum adult dose is 4 grams per day; toxic daily levels range from 7 to 10 grams).

“It’s a huge public health problem,” said Peltz. “It’s particularly difficult for parents, who may not realize that acetaminophen is in so many pediatric medicines.” Acetaminophen overdose is the most common cause of liver transplantation in this country. The only effective antidote is an unpalatable compound called N-acetylcysteine, or NAC, which can induce nausea and vomiting and must be administered as soon as possible after the overdose.

Peltz and his colleagues used 16 inbred strains of laboratory mice for their investigations. Most strains are susceptible to acetaminophen toxicity, but one is resistant. They compared how the drug is metabolized by the different strains and looked for variations in gene expression and changes in endogenous metabolites in response to acetaminophen administration. They identified 224 candidate genes that might explain the resistant strain’s ability to ward off liver damage, and then plumbed computer databases to identify those involved in metabolizing acetaminophen’s dangerous byproducts.

One, an enzyme called Bhmt2, fit the bill: It helped generate more glutathione, and its sequence varied between the resistant and non-resistant strains of mice. Bhmt2 works by converting the diet-derived molecule S-methylmethionine, or SMM, into methionine, which is subsequently converted in a series of steps into glutathione. The researchers confirmed the importance of the pathway by showing that SMM conferred protection against acetaminophen-induced liver toxicity only in strains of mice in which the Bhmt2 pathway was functional.

“By administering SMM, which is found in every flowering plant and vegetable, we were able to prevent a lot of the drug’s toxic effect,” said Peltz. He and his colleagues are now working to set up clinical trials at Stanford to see whether it will have a similar effect in humans. In the meantime, though, he cautions against assuming that dosing oneself with SMM will protect against acetaminophen overdose.

“There are many pathways involved in the metabolism of this drug, and individuals’ genetic backgrounds are tremendously variable. This is just one piece of the puzzle; we don’t have the full answer,” he said. However, if subsequent studies are promising, Peltz envisions a drug co-formulated to contain both acetaminophen and SMM, or the use of SMM as a routine dietary supplement.

The research was partially funded by the National Institute of General Medical Sciences and the National Institute of Diabetes and Digestive and Kidney Diseases of the National Institutes of Health, and by Roche. Peltz and Liu are co-inventors on a patent filed on the use of SMM to prevent acetaminophen toxicity in humans. SandHill Bio, a drug discovery startup co-founded by Peltz, is further investigating the potential therapeutic applications of the finding.

Plavix Dulled by Nexium, Prilosec, Tagamet, Prozac, Other Drugs

By Daniel J. DeNoon, WebMD Health News; reviewed by Louise Chang, MD

Nov. 19, 2009 – The FDA today warned patients not to combine Plavix with Nexium, Prilosec, and nine other drugs — including Prozac and Tagamet.

The drugs may make the anti-clotting drug Plavix (clopidogrel) dangerously less effective, the FDA says: they block the enzyme that turns the drug into its active form. Plavix is the second most-prescribed drug in the world.

It’s not the first time questions have been raised about the potent stomach-acid-reducing drug Prilosec blunting the benefit of Plavix. But data from more recent studies suggested that Prilosec — and a similar drug called Nexium — might actually help patients on Plavix.

This led the FDA to ask Plavix maker Sanofi-Aventis to specifically study interactions between Plavix and Prilosec and between Plavix and Nexium, FDA safety evaluator Mary Ross Southworth, PharmD, said at a news conference.

“We think there is a significant interaction,” Southworth said. “In general, patients should avoid the combination of these two medications.”

But the FDA recommendation goes far beyond the two drugs studied. Nexium and Prilosec interfere with Plavix not because they are members of a particularly powerful class of acid-reducing drugs called proton pump inhibitors or PPIs, but because they inhibit an enzyme, CYP 2C19, that activates Plavix.

Several other drugs, including Tagamet and Prozac, also inhibit CYP 2C19, and the FDA says patients on Plavix should avoid them as well.

Patients taking Plavix should not stop taking any of these drugs — including Nexium and Prilosec — until speaking with a doctor. It’s dangerous to stop taking any medication without medical advice.

Patients taking Plavix should call their doctor as soon as possible to discuss all the other drugs and supplements they are taking, including over-the-counter medications.

Apart from Tagamet, which also inhibits CYP 2C19, the FDA warning does not cover other acid-reducing drugs. The agency is actively looking at data on other PPIs but, at this time, is not warning patients to avoid taking Plavix along with them.

The FDA says there’s no evidence that the acid reducers Zantac, Pepcid, or Axid interfere with Plavix.

While some doctors have tried to avoid drug interactions by having patients take Nexium or Prilosec 12 hours before or after taking other drugs, the FDA says this strategy will not avoid interactions between Nexium or Prilosec and Plavix.

The Washington Post, November 18, 2009, by Steven Pearlstein  —  Even as Washington policymakers struggle to reform the country’s health-care system, about 20 miles away in Fairfax County, Dietrich Stephan is hatching a plot to revolutionize it.

The current system, as everyone knows, is the world’s costliest machine for healing you when you get sick, largely by using drugs and devices and surgical procedures that have proven themselves effective with most other people with the same ailment. But what if there were a way, based on your genetic makeup, to anticipate whether you’re likely to come down with cancer, heart disease or Alzheimer’s and prevent it with a fix specially designed for you?

It’s called personalized medicine. And while people have been anticipating it for a decade, ever since the human genome was mapped, it’s been slow in coming. Now Stephan — working with the Inova hospital juggernaut, the scientists at George Mason University, and the researchers and health policy experts at George Washington University — thinks he can foment this health-care revolution and create a new economic engine for Northern Virginia.

Although it’s been in the works for months, the official announcement came Monday when Gov. Tim Kaine, with his Republican successor looking on, announced that Virginia had come up with $25 million to finance operations at the new Ignite Institute. If its supervisors approve, Fairfax County will throw in $150 million in financing guarantees to construct a state-of-the-art, 300,000-square-foot research lab somewhere along the Route 28 corridor. At full strength, the institute will have an annual operating budget of $100 million and 400 employees working closely with a new Center for Personalized Medicine at Inova Fairfax Hospital, part of a $1 billion overhaul that Inova has planned for its flagship campus.

You’d be right, of course, to be a bit skeptical. For decades, we’ve heard how, thanks to the innovation gushing out of the National Institutes of Health and the Johns Hopkins medical complex, the fertile crescent between Rockville and Baltimore was destined to become the Silicon Valley of biotech. Watching that develop has been about as exciting as watching grass grow.

And just as Maryland has plenty of competition from other biotech clusters, Northern Virginia is the latest entry in the individualized-medicine sweepstakes, along with the Translational Genomics Research Institute (a.k.a. T-Gen) in Phoenix, where Stephan himself held the No. 2 position as director of research. Similar efforts are underway at the Institute for Systems Biology at the University of Washington, the Mayo Clinic and Duke University, along with the Broad Institute in Boston, which boasts not only the cachet and intellectual horsepower of Harvard and MIT but also a $400 million gift from Los Angeles real estate billionaire Eli Broad.

Closer to home, genome pioneer J. Craig Venter has his own genomic research empire, with headquarters in both Rockville and San Diego. And from the commercial sector, competition comes not only from every major drug and biotech company, but also from hot start-ups like Navigenics and 23andMe, which for a fee will tell you the diseases to which your genetic makeup is inclined.

Navigenics, in fact, was a spinoff of T-Gen, and Stephan was one of its co-founders. His experiences at both places, and at the genome labs of the National Institutes of Health, convinced him that the best place to launch this revolution is not a pure research lab, a medical complex or a commercial start-up, but an entity that straddles the divide between nonprofit inquiry and for-profit commercialization and is driven by the everyday collaboration of researchers and clinicians.

Stephan considered locating his new venture in San Francisco or Boston, each of which had the necessary academic, medical and venture capital infrastructure. But in Northern Virginia he found a place where he would not be overshadowed by more-established players, and where he found public and private partners who, like himself, were ambitious and entrepreneurial and eager to break into the next big thing. If he, and they, have any competitive advantage, it is that the shift to individualized medicine will raise a myriad of questions about privacy, medical ethics and financing that will require difficult decisions from policymakers in Washington. Being close will give Stephan and his partners a front-row seat from which to participate in those conversations.

It’s way too early to say whether the Ignite Institute will be able to attract superstar talent or big-time funding, or whether its partnerships with Inova and the universities will bear fruit, or whether through new company start-ups it will be able to generate lots of jobs, wealth and tax revenue for the region. But it says a lot about Virginia and Fairfax County that, even in the midst of economic downturn and budget shortfall, they saw the potential, seized the opportunity and invested in the future. Having also won funding for Metrorail’s extension out to Dulles and won the headquarters competitions for Hilton, SAIC, CSC and Volkswagen of America, Northern Virginia is now primed to emerge from the economic doldrums and once again lead the region’s growth.

The-Scientist.com, November 19, 2009, by Edyta Zielinska  —  In early 2009, the chieftains of several pharmaceutical giants got together to discuss a new kind of biotechnology company. It would be a company guided by pharma to find and develop the technologies missing from the drug-making enterprise. But for it to work, the companies, typically rivals, would have to work together to identify the biggest technological bottlenecks facing the industry. The new company, dubbed Enlight Biosciences, didn’t want merely an investment of expertise; it was asking for $13 million from each participating company.

“We didn’t know, when we brought these companies together, what it would be like,” said Raju Kucherlapati, a member of Enlight’s scientific advisory board. But to the surprise of meeting organizers Daphne Zohar and David Steinberg, they saw a real appetite to “interact in an open and honest way,” says Steinberg.

In the past, the idea of pharmaceutical companies working together was unheard of. “If you know about this industry and how insular and proprietary it is, then these are very substantial changes in approach and strategy,” says Ken Kaitin, director of the Tufts Center for the Study of Drug Development.

After consulting with pharmaceutical companies about their needs, Enlight narrows its list to a set of key projects, with each company deciding which projects it wants to support. Then, with the help of experts and its scientific advisory board, Enlight scours academic labs for emerging products that could satisfy pharma’s wish list and become viable companies.

Launched only months before the meeting between pharma leaders, Enlight may be the first company to capitalize on pharma’s newfound curiosity about the talents of their rivals, and its success (six pharmaceutical companies have already agreed to participate) rests largely on presenting companies with a model that was too attractive to reject. “It is an industry that desperately needs a new strategy,” says Kaitin. “They’ve been talking about this for two decades, but now it’s an imperative.”

Introduction

Enlight’s model was an outgrowth of a concept developed by its parent company, PureTech Ventures, based in Boston, Mass. PureTech built companies from the top down. First, find a need, such as a noninvasive way to perform gastric ballooning in obese patients. Next, find academicians or industry scientists who either already work on the problem or who have a product that can be licensed and tweaked. Then create a company (in the case of gastric ballooning, Gelesis) that can develop that idea.

PureTech was started in 2001 as a biotech incubator, providing the guidance and management support that an entrepreneur may lack. “This process of starting a company is something that people learn by trial and error,” says board member Zohar. Indeed, many companies fail despite having excellent underlying science or technology, in part, Zohar and her team noticed, because they weren’t getting enough guidance. “Incubators tend to be really passive” in the way they support entrepreneurs, she says. So PureTech would take a decidedly more active role, in many ways acting as its own entrepreneur: coming up with the ideas, acquiring or filing the intellectual property (IP) rights, and building a self-sufficient company. With cofounders Robert Langer of the Massachusetts Institute of Technology and John Zabriskie, former president and CEO of Pharmacia and Upjohn, helping guide the projects, PureTech launched seven companies, which have attracted hundreds of millions of dollars in follow-on financing. Some of those companies already have products on the market.

Over the course of brainstorming new ideas for companies, PureTech’s team of experts realized that scientific instruments and medical devices typically don’t capture the attention of venture capitalists as well as drugs do. In order to build companies around devices, they would need another funding source. So Zohar approached Mervyn Turner, Merck’s chief strategy officer, during a meeting in the summer of 2006, and asked him whether he thought pharmaceutical companies might want to fund technologies that could facilitate their development processes. Turner was all for it; with his input and early support, PureTech initiated plans for what would become Enlight.

Enlight “almost started like any other PureTech project. We saw a need and an opportunity,” says Steinberg, who became Enlight’s CEO. “What was really attractive about this notion” was having pharma companies that were “willing to work together,” says Kucherlapati of Harvard University, who was brought on to recruit Enlight’s board of scientific advisors. After the first infusion of venture capital, many biotech companies struggle to win the attention and validation of pharma backers. With Enlight, “you have the validation of an idea to start with right from the beginning,” says Kucherlapati.

Results

Armed with the early interest of Merck and two other big companies, Pfizer and Eli Lilly, members of the Enlight team descended on individual companies to get at the essence of the industry’s needs. Following discussions with a range of people, from pharmaceutical executives and external technology experts to the scientists who work on the problems, the team developed a “master list of pharma needs,” says Steinberg.

Once the needs were tabulated, it was time to discuss the plan of attack at a stakeholders’ summit, including all the pharmaceutical partners. “You can envision that it would be a show and tell” by Enlight, “but it was just the opposite,” says Reid Leonard, executive director of external research and licensing at Merck. No one held back, and “we had many more ideas coming out of the summit than going in,” he says.

Enlight’s website lists some of the items on pharma’s “wish list,” such as predictive models, biomarkers, and biologics platforms, but the company won’t discuss specifics of a planned device until it’s ready to launch the company. Quite early on, Kucherlapati says, the group decided that developing “photoacoustic technology would be great.” Standard ultrasound is limited by its contrast, while standard optical imaging is limited by the depth of light penetration. By combining the two techniques, photoacoustic imaging gets the best of both worlds. You get “high-resolution images with optical-type contrast at much [greater] depths than existing optical techniques (cm vs. mm),” says Steinberg in an email.

The goal was to create a noninvasive technology that was as effective in the lab as it was in the clinic. The technology had been explored in academic circles since the 1990s, but never had “core customers showing what would be [the] most impactful” application, says Steinberg.

When Enlight started Endra, its first (and so far, only) spin-off company, based on photoacoustic technology, the company went around to five different institutions to negotiate the intellectual property rights to acquire all of the components necessary to assemble the technology. Though acquiring dispersed IP may seem a Herculean task, “this is our area of focus and expertise,” says Zohar. “We have a good sense of the benchmarks.”

All of Enlight’s spin-off companies will function as independent entities. The pharmaceutical companies that initially funded them can buy their products (potentially with discounts), as can their competitors. The primary stipulation, says Steinberg, is that “for a predetermined period of time, the developed technologies must remain commercially available to Enlight pharmas that participate in that particular effort.” Zohar says Enlight has a number of other companies in the works around other technologies.

It’s too early to say whether the model will succeed, says Jeff Elton, former global chief operating officer and head of strategy at Novartis, who had some discussions with Enlight before he left the company. But clearly, he says, “the concept is working”: the number of pharmaceutical companies interested in participating has grown from the three founding companies to include Johnson & Johnson and Novartis (and a sixth company that has yet to be made public), each contributing $13 million over five years.

“It’s a lot of fun,” says Zohar, to develop ideas that fill a missing piece of the drug development process and to build companies around them. “It’s the best part of being entrepreneurial.”

By Gabe Mirkin MD, November 19, 2009  —  After just one day of switching from a plant-based diet to a high-fat-and-sugar diet, mice carrying human intestinal bacteria developed the bacteria associated with obesity in humans, and soon became grossly obese (Science Translational Medicine, November 11, 2009).

Dr. Jeffrey Gordon of Washington University in St. Louis first showed that certain types of bacteria in the human intestinal tract can break down food more efficiently and help you absorb a greater percentage of calories from the food that you eat. He also showed that humans whose intestinal tracts are dominated by these bacteria tend to be overweight.

In this new study, Dr. Gordon created germ-free mice and fed them a low-fat, plant-rich diet. He then colonized them with bacteria extracted from human stool and continued to feed them a low-fat, plant-based diet for one month. By sequencing the microbes’ 16S rRNA gene, he showed that the intestinal bacteria in the mice were the same as those living in a healthy human’s intestines.

One month later, he switched half the mice to a high-fat, high-sugar diet. After 24 hours, the intestines of these mice showed increases in the obesity-promoting bacteria, the Firmicutes, and decreases in the obesity-preventing Bacteroidetes. The mice continued to grow fatter and fatter, even when switched back to the low-fat, plant-based diet.

What does this mean to you? When you eat a diet rich in refined carbohydrates (fruit juices, sugared drinks and foods made from flour and sugar) and fat (meat, fried foods and fatty desserts), you develop intestines full of bacteria that thrive on these foods, break them down more efficiently, and extract far more calories from them. If you want your gut flora to help you maintain a healthful weight, you should eat primarily fruits, vegetables, whole grains, beans, seeds and nuts.

Fast-rising carbon emissions mean that worst-case predictions for climate change are coming true

GoogleNews.com, The-Scientist.com, November 19, 2009, by Steve Connor and Michael McCarthy  —    The world is now firmly on course for the worst-case scenario in terms of climate change, with average global temperatures rising by up to 6C by the end of the century, leading scientists said yesterday. Such a rise – which would be much higher nearer the poles – would have cataclysmic and irreversible consequences for the Earth, making large parts of the planet uninhabitable and threatening the basis of human civilisation.

We are headed for it, the scientists said, because the carbon dioxide emissions from industry, transport and deforestation which are responsible for warming the atmosphere have increased dramatically since 2002, in a way which no one anticipated, and are now running at treble the annual rate of the 1990s.

This means that the most extreme scenario envisaged in the last report from the UN Intergovernmental Panel on Climate Change, published in 2007, is now the one for which society is set, according to the 31 researchers from seven countries involved in the Global Carbon Project.

Although the 6C rise and its potentially disastrous effects have been speculated upon before, this is the first time that scientists have said that society is now on a path to meet it.

Their chilling and remarkable prediction throws into sharp relief the importance of next month’s UN climate conference in Copenhagen, where the world community will come together to try to construct a new agreement to bring the warming under control.

For the past month there has been a lowering of expectations about the conference, not least because the US may not be ready to commit itself to cuts in its emissions. But yesterday President Barack Obama and President Hu Jintao of China issued a joint communiqué after a meeting in Beijing, which reignited hopes that a serious deal might be possible after all.

It cannot come too soon, to judge by the results of the Global Carbon Project study, led by Professor Corinne Le Quéré, of the University of East Anglia and the British Antarctic Survey, which found that there has been a 29 per cent increase in global CO2 emissions from fossil fuel between 2000 and 2008, the last year for which figures are available.

On average, the researchers found, there was an annual increase in emissions of just over 3 per cent during the period, compared with an annual increase of 1 per cent between 1990 and 2000. Almost all of this decade’s increase resulted from the boom in the Chinese economy. The researchers predict a small decrease this year due to the recession, but further increases from 2010.

In total, CO2 emissions from the burning of fossil fuels have increased by 41 per cent between 1990 and 2008, yet 1990 emissions are the reference level set by the Kyoto Protocol, below which countries are trying to bring their own emissions.
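Those figures hang together as simple compound-growth arithmetic. The sketch below is not from the study; it is a back-of-the-envelope check in Python, with the annual rates rounded from the article (“just over 3 per cent” is taken here as 3.2 per cent).

```python
# Rough consistency check of the emissions figures quoted above, using
# compound annual growth: ~1%/yr for 1990-2000, ~3.2%/yr for 2000-2008.

rate_1990s = 0.010  # assumed annual growth rate, 1990-2000
rate_2000s = 0.032  # assumed annual growth rate, 2000-2008

increase_2000_2008 = (1 + rate_2000s) ** 8 - 1
increase_1990_2008 = (1 + rate_1990s) ** 10 * (1 + rate_2000s) ** 8 - 1

print(f"2000-2008: {increase_2000_2008:.0%}")  # ~29%, matching the study
print(f"1990-2008: {increase_1990_2008:.0%}")  # ~42%, close to the quoted 41%
                                               # (real year-to-year growth was
                                               # not perfectly smooth)
```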

The 6C rise now being anticipated is in stark contrast to the 2C rise at which all international climate policy, including that of Britain and the EU, hopes to stabilise the warming – two degrees being seen as the threshold of climate change which is dangerous for society and the natural world.

The study by Professor Le Quéré and her team, published in the journal Nature Geoscience, envisages a far higher figure. “We’re at the top end of the IPCC scenario,” she said.

Professor Le Quéré said that Copenhagen was the last chance of coming to a global agreement that would curb carbon-dioxide emissions on a time-course that would hopefully stabilise temperature rises within the danger threshold. “The Copenhagen conference next month is in my opinion the last chance to stabilise climate at 2C above pre-industrial levels in a smooth and organised way,” she said.

“If the agreement is too weak, or the commitments not respected, it is not 2.5C or 3C we will get: it’s 5C or 6C – that is the path we’re on. The timescales here are extremely tight for what is needed to stabilise the climate at 2C,” she said.

Meanwhile, the scientists have for the first time detected a failure of the Earth’s natural ability to absorb man-made carbon dioxide released into the air.

They found significant evidence that more man-made CO2 is staying in the atmosphere to exacerbate the greenhouse effect because the natural “carbon sinks” that have absorbed it over previous decades on land and sea are beginning to fail, possibly as a result of rising global temperatures.

The proportion of emitted CO2 that has remained in the atmosphere has, as a result, increased from about 40 per cent in 1990 to 45 per cent in 2008. This suggests that the sinks are beginning to fail, they said.
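That rising percentage is what climate scientists call the airborne fraction: the share of emitted CO2 that stays in the atmosphere rather than being taken up by land and ocean sinks. A minimal sketch of the calculation follows; the example quantities are hypothetical and chosen only to illustrate the formula.

```python
# Illustrative airborne-fraction calculation. Only the formula reflects
# the article's definition; the example quantities below are hypothetical.

def airborne_fraction(emissions_gtc: float, atmospheric_increase_gtc: float) -> float:
    """Fraction of emitted CO2 that remains in the atmosphere."""
    return atmospheric_increase_gtc / emissions_gtc

# E.g., if 10 GtC were emitted in a year and sinks soaked up 5.5 GtC,
# the 4.5 GtC left in the air would put the airborne fraction at 45%.
print(f"{airborne_fraction(10.0, 4.5):.0%}")  # 45%
```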

Professor Le Quéré emphasised that there are still many uncertainties over carbon sinks, such as the ability of the oceans to absorb dissolved CO2, but all the evidence suggests that there is now a cycle of “positive feedbacks”, whereby rising carbon dioxide emissions are leading to rising temperatures and a corresponding rise in carbon dioxide in the atmosphere.

“Our understanding at the moment in the computer models we have used – and they are state of the art – suggests that carbon-cycle climate feedback has already kicked in,” she said.

“These models, if you project them on into the century, show quite large feedbacks, with climate amplifying global warming by between 5 per cent and 30 per cent. There are still large uncertainties, but this is carbon-cycle climate feedback that has already started,” she said.

The study also found that, for the first time since the 1960s, the burning of coal has overtaken the burning of oil as the major source of carbon-dioxide emissions produced by fossil fuels.

Much of this coal was burned by China in producing goods sold to the West – the scientists estimate that 45 per cent of Chinese emissions resulted from making products traded overseas.

It is clear that China, having overtaken the US as the world’s biggest carbon emitter, must be central to any new climate deal, and so the communiqué from the Chinese and US leaders issued yesterday was widely seized on as a sign that progress may be possible in the Danish capital next month.

Presidents Hu and Obama specifically said an accord should include emission-reduction targets for rich nations, and a declaration of action plans to ease greenhouse-gas emissions in developing countries – key elements in any deal.

6C rise: The consequences

If two degrees is generally accepted as the threshold of dangerous climate change, it is clear that a rise of six degrees in global average temperatures must be very dangerous indeed, writes Michael McCarthy. Just how dangerous was signalled in 2007 by the science writer Mark Lynas, who combed all the available scientific research to construct a picture of a world where the temperature rise is three times the danger limit.

His verdict was that a rise in temperatures of this magnitude “would catapult the planet into an extreme greenhouse state not seen for nearly 100 million years, when dinosaurs grazed on polar rainforests and deserts reached into the heart of Europe”.

He said: “It would cause a mass extinction of almost all life and probably reduce humanity to a few struggling groups of embattled survivors clinging to life near the poles.”

Very few species could adapt in time to the abruptness of the transition, he suggested. “With the tropics too hot to grow crops, and the sub-tropics too dry, billions of people would find themselves in areas of the planet which are essentially uninhabitable. This would probably even include southern Europe, as the Sahara desert crosses the Mediterranean.

“As the ice-caps melt, hundreds of millions will also be forced to move inland due to rapidly-rising seas. As world food supplies crash, the higher mid-latitude and sub-polar regions would become fiercely-contested refuges.

“The British Isles, indeed, might become one of the most desirable pieces of real estate on the planet. But, with a couple of billion people knocking on our door, things might quickly turn rather ugly.”