Medscape.com, by Sue Hughes, July 21, 2011 (Madrid, Spain) — A new study has confirmed the importance of continuing to take aspirin long term for patients with a history of heart disease [1].

The study, published online in the British Medical Journal on July 19, 2011, found that patients who stop taking aspirin are at a significantly higher risk of MI than those who continue treatment.

Researchers, led by Dr Luis Garcia Rodriguez (Spanish Centre for Pharmacoepidemiologic Research, Madrid, Spain), explain that low-dose aspirin is a standard treatment for the secondary prevention of cardiovascular outcomes. However, despite strong evidence supporting the protective effects of low-dose aspirin, around half of patients discontinue treatment. While many studies have shown this to be associated with an increased risk of cardiovascular events, they have all taken place in secondary care centers.

To study this issue in a primary care population, Garcia Rodriguez and colleagues analyzed data on 39,513 patients from the Health Improvement Network, a large UK database of primary care records. Patients were aged 50 to 84 years, had received a first low-dose aspirin prescription for the prevention of cardiovascular outcomes between 2000 and 2007, and were followed for three years.

The authors conducted a nested case-control analysis and compared 1222 cases (patients who had an MI or coronary heart disease [CHD] death) with 5000 controls. Aspirin had been discontinued in 12.2% of cases and 11.0% of controls. Compared with current use, recent discontinuation was associated with a clinically and statistically significant increase in risk of nonfatal MI and in the combined outcome of death from CHD and nonfatal MI. There was no significant difference in risk of CHD death alone.

Risk of Major Cardiovascular Events in Those Who Discontinued vs Those Who Continued Aspirin

Event            Rate ratio (95% CI)
Nonfatal MI      1.63 (1.23–2.14)
MI/CHD death     1.43 (1.12–1.84)

The results translate into four additional MIs each year for every 1000 patients who discontinued aspirin. The increased risk was present irrespective of the length of time the patient had previously been taking low-dose aspirin.
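
The absolute figure can be reproduced from the rate ratio once a baseline event rate is assumed. A minimal sketch in Python, where the baseline MI rate on aspirin is an illustrative assumption rather than a number reported in the paper:

rate_ratio = 1.63            # nonfatal MI, discontinuers vs continuers (from the table above)
baseline_rate = 6.3 / 1000   # ASSUMED annual MI rate per patient on aspirin, for illustration

excess_rate = baseline_rate * (rate_ratio - 1)
print(f"Extra MIs per 1000 patients per year: {excess_rate * 1000:.1f}")
# prints roughly 4, consistent with the figure quoted in the article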

The authors note that these results support those of previous studies in secondary care and show that they are applicable to the general population. “Reducing the number of patients who discontinue low-dose aspirin could have a major impact on the benefit obtained with low-dose aspirin in the general population,” they add. They call for further research to test whether efforts to encourage patients to continue prophylactic treatment with low-dose aspirin will decrease MI rates.

“Any day off aspirin is a day at risk for patients with previous cardiovascular disease.”

In an accompanying editorial [2], Dr Giuseppe Biondi-Zoccai (University of Modena, Italy) and Dr Giovanni Landoni (Università Vita-Salute San Raffaele, Milan, Italy) write: “any day off aspirin is a day at risk for patients with previous cardiovascular disease.”

The editorialists note, however, that some patients may not be able to take aspirin because of bleeding risk, and that the risk-benefit ratio of aspirin in individual patients should always be considered. In general, though, “patients on chronic low-dose aspirin for secondary prevention of cardiovascular disease should be advised that unless severe bleeding ensues or an informed colleague explicitly says so, aspirin should never be discontinued given its overwhelming benefits on atherothrombosis, as well as colorectal cancer and venous thromboembolism.”

They add: “Accordingly, doctors should maintain their patients on low-dose aspirin as long as they can and carefully assess individual patients for the risk of both thrombosis and bleeding before discontinuing aspirin for invasive procedures. Patients who need to discontinue aspirin should do so for the minimum time necessary.”

ZINC ‘SPARKS’ FLY FROM EGG WITHIN MINUTES OF FERTILIZATION

Sparks of Life

NIH-funded study of animal eggs reveals major role for metal
For Immediate Release: Thursday, July 21, 2011

At fertilization, a massive release of the metal zinc appears to set the fertilized egg cell on the path to dividing and growing into an embryo, according to the results of animal studies supported by the National Institutes of Health.

 

The zinc discharge follows the egg cell’s steady accumulation of zinc atoms in the developmental stages before fertilization.  The researchers documented the discharge by bathing the eggs in a solution that gives off light when exposed to zinc.  They referred to the zinc discharge and accompanying light flash as zinc sparks.

 

“The discovery of egg cells’ massive intake and later release of zinc defines a new role for this element in biology,” said Louis DePaolo, chief of the Reproductive Sciences Branch at the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), one of the NIH institutes supporting the study.  “We anticipate the findings will one day lead to information useful for the treatment of infertility as well as the development of new ways to prevent fertilization from occurring.”

 

The study’s authors suggest that zinc acts as a switch, turning off the process of cell division while the egg matures and turning it on again after fertilization.

 

“These findings suggest zinc is essential for developing a healthy egg and, ultimately, a healthy embryo,” said Teresa Woodruff, Ph.D., one of the article’s senior authors.

 

The study’s first author is Alison M. Kim, Ph.D., of Northwestern University, Evanston, Ill.  The other authors are Miranda L. Bernhardt, Betty Y. Kong, Richard W. Ahn, Dr. Woodruff and Thomas V. O’Halloran, Ph.D., of Northwestern, and Stefan Vogt, Ph.D., of Argonne National Laboratory, Argonne, Ill.

 

Their findings appear in the July issue of ACS Chemical Biology.

 

In this study, the researchers observed egg cells from mice and from monkeys.  To conduct the study, they devised a microscope that would allow them to view the concentration and distribution of zinc atoms in individual cells.  With the aid of the chemical that gives off light when exposed to zinc, the researchers documented the first zinc sparks 20 minutes after fertilization.  Most fertilized eggs released two or three rounds of sparks, but the researchers saw as few as one and as many as five within the first two hours after fertilization.  The sparks flared every 10 minutes, on average.
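
As an illustration of how flash-like events can be pulled out of such a recording, here is a minimal Python sketch of peak detection on a synthetic fluorescence trace; the spark times, noise level and detection threshold are all assumptions for illustration, not the authors’ actual data or analysis:

import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
t = np.arange(0, 120, 0.1)              # two hours post-fertilization, in minutes
trace = rng.normal(1.0, 0.02, t.size)   # noisy baseline fluorescence

for spark_time in (20, 30, 41):         # ASSUMED spark times, echoing the timings above
    trace += 0.5 * np.exp(-((t - spark_time) ** 2) / 2.0)   # add a flash-shaped bump

peaks, _ = find_peaks(trace, height=1.2, distance=50)       # at least 5 min apart
print("Detected sparks at minutes:", np.round(t[peaks], 1))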

 

Previous research had shown that fertilization triggers cyclical changes in the level of calcium in the egg cell.  The researchers noted that the zinc sparks always occurred after a peak in calcium levels inside the cell.

 

“The number, timing and intensity of these sparks could tell us something important about the quality of the egg and will be an important area for future research,” said Dr. O’Halloran, the article’s other senior author.  “It may also be worth investigating whether the amount of zinc in a woman’s diet plays a role in fertility.”

 

Additional experiments helped confirm a role for zinc in the fertilization process.  Typically, once the egg is released from the ovary, it must get rid of excess chromosomes in two stages as it prepares to fuse with the sperm.  The team’s earlier research showed that the early accumulation of zinc is essential for properly completing the first stage, Dr. O’Halloran explained.  The latest results suggest that zinc may act as a brake in between these stages, as the egg awaits fertilization.  If the cell is fertilized, the zinc release appears to lift the brake. The cell discards its excess genetic material and begins to divide.

 

The researchers also showed that even unfertilized eggs would start to divide if zinc levels were artificially reduced, mimicking the release that normally follows fertilization.  In addition, when fertilized cells were forced to take on additional zinc, the process was reversed.

 

“We have shown that zinc appears to regulate this precisely calibrated, intricate process,” Dr. Woodruff said.  “The findings give us new insights into what these cells need to grow and mature properly.”

 

The NICHD sponsors research on development, before and after birth; maternal, child, and family health; reproductive biology and population issues; and medical rehabilitation.  For more information, visit the Institute’s website at <http://www.nichd.nih.gov/>.

 

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit <www.nih.gov>.

 

 

This NIH News Release is available online at:

http://www.nih.gov/news/health/jul2011/nichd-21.htm

By Chris Jablonski | July 21, 2011

Summary

ZDNet.com — Using high magnetic fields, researchers have managed to suppress quantum decoherence, a key stumbling block for quantum computing.

Researchers announced that they’ve managed to predict and suppress environmental decoherence, a phenomenon that has been described as a “quantum bug” that destroys fundamental properties that quantum computers would rely on.

 

Decoherence is the tendency of atomic-scale particles to get quickly tangled up with the larger physical world we live in. Electrons, for instance, obey the laws of quantum physics and can therefore be in two places at once, like a coin simultaneously showing heads and tails. Scientists refer to this as state superposition. In contrast, larger, more complex physical systems appear to be in one consistent physical state because they interact and “entangle” with other objects in their environment and “decay” into a single state. The resultant decoherence is like a noise or interference that knocks the quantum particle, in this case the electron, out of superposition.
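
A toy numerical picture may help. In the standard formalism, decoherence appears as the decay of the off-diagonal (coherence) elements of a particle’s density matrix; the sketch below assumes a simple exponential decay with an arbitrary coherence time, and is an illustration rather than the model used in the Nature paper:

import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition: "heads and tails at once"
rho = np.outer(psi, psi.conj())           # density matrix of the pure state

T2 = 5.0                                  # ASSUMED coherence time, arbitrary units
for t in (0.0, 1.0, 5.0, 20.0):
    rho_t = rho.copy()
    rho_t[0, 1] *= np.exp(-t / T2)        # environmental decoherence damps only
    rho_t[1, 0] *= np.exp(-t / T2)        # the off-diagonal coherence terms
    print(f"t={t:4.1f}  coherence={abs(rho_t[0, 1]):.3f}")

# As the coherence term approaches zero, the state behaves like a classical
# mixture: the superposition is effectively destroyed.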

The realization of quantum computing’s promise depends on switches that are capable of state superposition. Until now, all efforts to achieve such superposition with many molecules at once were blocked by decoherence.

By using high magnetic fields, researchers at the University of British Columbia and the University of California, Santa Barbara discovered a way to reduce the level of noise in the surroundings, allowing them to suppress decoherence efficiently.

“For the first time we’ve been able to predict and control all the environmental decoherence mechanisms in a very complex system, in this case a large magnetic molecule called the ‘Iron-8 molecule,’” said Phil Stamp, UBC professor of physics and astronomy and director of the Pacific Institute of Theoretical Physics. “Our theory also predicted that we could suppress the decoherence, and push the decoherence rate in the experiment to levels far below the threshold necessary for quantum information processing, by applying high magnetic fields.”

The findings, which are published in today’s edition of the journal Nature, could help pave the way for the development of quantum computers that perform complex calculations orders of magnitude faster than today’s traditional computers.

Christopher Jablonski is a freelance technology writer. Previously, he held research analyst positions in the IT industry and was the manager of marketing editorial at CBS Interactive. He’s been contributing to ZDNet since 2003.

Christopher received a bachelor’s degree in business administration from the University of Illinois at Urbana-Champaign. With over 12 years in IT, he’s an expert on transformational technologies, particularly those influential in B2B.

Sources & further reading:

Nature: Decoherence in crystals of quantum molecular magnets

USC: USC Scientists Contribute to a Breakthrough in Quantum Computing

UBC: Discovery may overcome obstacle for quantum computing: UBC, California researchers

Quantum Computing Starts Yielding Real Applications

GoogleNews.com, July 21, 2011 — Quantum Computing Starts Yielding Real Applications: Google has recently announced that it is successfully investigating the use of Quantum Algorithms to run its next generation of faster applications. To date, Google’s search services have run in warehouses filled with conventional computers. As pointed out by ATCA in our briefings over the last few years, Quantum Computing and Quantum Algorithms have the potential to make search problems much easier to solve — so it is no surprise that Google finds it extremely important to get involved in this emerging area of Quantum Technology, with its potential to bring about asymmetric disruptive change and to garner a massive first-mover competitive advantage over its rivals.

Quantum Computers point to much faster processing, by exploiting the principle of quantum superposition: that a particle such as an ion, electron or photon can be in two different states at the same time. While each basic “binary digit” or “bit” of data in a conventional computer can be either a 1 or a 0 at any given time, a Qubit can be both at once! Classical computers use what is known as a von Neumann architecture, in which data is fetched from memory and processed according to rules defined in a program to generate results that are then stored step by step. It is essentially a sequential process, though multiple versions of it can run in parallel to speed things up!
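
That claim can be made concrete with a small classical simulation: a qubit is described by two amplitudes, and a measurement returns a single bit with probabilities given by the squared amplitudes (the Born rule). A minimal sketch, illustrative only:

import numpy as np

amplitudes = np.array([1.0, 1.0]) / np.sqrt(2)   # qubit in an equal superposition
probs = np.abs(amplitudes) ** 2                  # Born rule: probability = |amplitude|^2

rng = np.random.default_rng(1)
samples = rng.choice([0, 1], size=10_000, p=probs)   # repeated measurements
print("P(0) ~", (samples == 0).mean(), "  P(1) ~", (samples == 1).mean())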

Way back in 2007, a Canadian company called D-Wave claimed it demonstrated a Quantum Computer, although the jury is still out because no other research labs in the world — despite their large budgets and talented scientists — have been able to produce a fully functional Quantum Computer yet. D-Wave developed an on-chip array of Quantum bits – or Qubits – encoded in magnetically coupled superconducting loops. D-Wave’s investors include Goldman Sachs and Draper Fisher Jurvetson.

Hartmut Neven, Head of Google’s Image Recognition team, has revealed that the firm has been quietly developing a Quantum Algorithm application over the last three years that can identify particular objects in a database of stills or video. The team adapted Quantum adiabatic algorithms, developed by Edward Farhi and collaborators at MIT, for the D-Wave chip so that it could learn to recognize cars in photos, and reported at the Neural Information Processing Systems conference in Vancouver, Canada, that they had succeeded. Using 20,000 photographs of street scenes, half of which contained cars and half of which didn’t, they trained the algorithm to recognize what cars look like by hand-labeling all the cars with boxes drawn around them. After that training, the quantum algorithm was set loose on a second set of 20,000 photos, again with half containing cars. It sorted the images with cars from those without faster than an algorithm on a conventional computer could; in fact, faster than anything running in a Google data centre at present. The team appears to have perfected a simulated annealing system that is well suited to searching images for well-defined objects. Normally such searches cost too much for a computer to do in real time. However, if Google and D-Wave can get it working, then common searches can be pre-calculated and the results stored in databases for fast retrieval!
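
As a classical stand-in for the protocol described above (train on hand-labeled examples, then sort an unseen set), here is a minimal sketch using synthetic features in place of real street-scene images; it illustrates the train-and-classify workflow, not the quantum algorithm itself:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 20_000
labels = np.repeat([1, 0], n // 2)                    # half "car", half "no car"
features = rng.normal(labels[:, None], 1.0, (n, 8))   # toy stand-in for image features

clf = LogisticRegression().fit(features, labels)      # the "hand-labeled" training set

test = rng.normal(np.repeat([1, 0], n // 2)[:, None], 1.0, (n, 8))   # unseen set
print("images flagged as containing cars:", int(clf.predict(test).sum()))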

There has been some dispute over whether D-Wave’s Chimera chip is actually a Quantum Computer. Hartmut Neven at Google acknowledges, “It is not easy to demonstrate that a multi-Qubit system such as the D-Wave chip exhibits the desired quantum behavior, and physicists are still in the process of characterizing it.” However, while questions remain over the exact capabilities of D-Wave’s hardware, future developments are likely to centre on different Quantum Computing hardware. For example, it is widely accepted that trapped ions are the most successful implementation of Quantum Technology.

The mi2g Intelligence Unit and the ATCA Research and Analysis Wing (RAW) expect more and more government agencies and companies around the world to pursue research in Quantum Computing and Quantum Algorithms in the coming few years due to their vast potential not only in search applications but also for a multiplicity of other complex problem solving capabilities.

A Few Interesting Science Articles to Ponder Over the Weekend

Infinity Bridge, UK, as a Symbol of The Holotropic State

Infinity Bridge, Stockton-on-Tees, U.K. Credit: E8 Album HQR Initiative

Infinity Bridge, UK, as a Symbol of The Holotropic State: The key findings of Quantum Physics point towards Holistic Quantum Relativity and the Holotropic State, especially:

1. Dr David Bohm’s work about the universe being made up of an ‘interconnected unbroken wholeness’;

2. The Non-Locality phenomenon, ie, Bell’s Theorem; and

3. The ‘Observer Effect’ implying that consciousness underlies all reality.

Original References from Holistic Quantum Relativity (HQR):

I. Quantum Physics — The Holotropic State

The key findings of Quantum Physics point towards Holistic Quantum Relativity and the Holotropic State, especially: 1. Dr David Bohm’s work about the universe being made up of an “interconnected unbroken wholeness”; 2. The Non-Locality phenomenon, ie, Bell’s Theorem; and 3. The ‘Observer Effect’ implying that consciousness underlies all reality.

Points 1, 2 and 3 — taken together — have striking parallels with the timeless spiritual concepts that all reality is the manifestation of an Infinite Singularity — Creative Principle — which we may choose to name Self-Designing Source, Supra-Universal Consciousness, Divine Principle or God. This demonstrates that Quantum Physics is moving the boundaries of our thinking towards the Holotropic State. Holotropic means “moving toward wholeness.” [Origin: Greek “Holos” = whole and “Trepein” = moving in the direction of]

However, none of this should be surprising to those mystics who have experienced the ‘Oneness’ via some sort of deep spiritual inner experience or Holotropic State.

The Growth of Quantum Physics beyond The Classical Model of Science

Many consider Quantum Physics to be at the cutting edge of Western science and in many respects it goes beyond Einstein’s Theory of Relativity. The interesting challenge associated with quantum physics is that the original impetus giving rise to it, namely the pursuit of the elemental building blocks of the Universe (separate elementary particles) has become meaningless with the discovery that the Universe appears to be an undivided Whole in a perpetual state of dynamic flux.

Like Einstein’s Theory of Relativity, the latest experiments and associated theories of Quantum Physics reveal the Universe to be a single gigantic field of energy in which matter itself is just a ‘slowed down’ form of energy. Further, Quantum Physics has postulated that matter/energy does not exist with any certainty in definite places, but rather shows ‘tendencies’ to exist. [Heisenberg’s Uncertainty Principle]

Even more intriguing is the notion that the existence of an observer is fundamental to the existence of the Universe — a concept known as ‘The Observer Effect’ — implying that the Universe is a product of consciousness, and that the Universe and Supra-Universe beyond are products of the Super-Consciousness. [The Mind of God concept]

“Through experiments over the past few decades physicists have discovered matter to be completely mutable into other particles or energy and vice-versa and on a subatomic level, matter does not exist with certainty in definite places, but rather shows ‘tendencies’ to exist. Quantum physics is beginning to realize that the Universe appears to be a dynamic web of interconnected and inseparable energy patterns. If the universe is indeed composed of such a web, there is logically no such thing as a part. This implies we are not separated parts of a whole but rather we are the Whole.” [Hands of Light, Barbara Brennan, American physicist]

On the other hand, the methodology of contemporary Western science, which is still taught in most of our educational institutions today, works on the basis of breaking the world into its component parts. The quantum physicist Dr David Bohm stated that “primary physical laws cannot be discovered by a science that attempts to break the world into its parts.” Bohm wrote of an “implicate enfolded order” which exists in an un-manifested state and which is the foundation upon which all manifest reality rests. He called this manifest reality “the explicate unfolded order”. He went on to say “parts are seen to be in immediate connection, in which their dynamical relationships depend in an irreducible way on the state of the whole system . . . Thus, one is led to a new notion of unbroken wholeness which denies the classical idea of analyzability of the world into separately and independently existent parts.” [The Implicate Order]

The classical perception of the ‘rules’ of the Universe is changing to reflect the multiple universes predicted by the mathematics of Quantum Physics. In other words, the mathematical formulae that were initially developed to describe the behavior of the universe turn out to govern the behavior of the multiple universes and planes described in spirituality. Thus, the mathematics that was once thought to produce absurd results has in fact come to be relied upon to demonstrate that matter and energy do indeed behave in exactly that ‘absurd’ manner, reflecting the formulae!

Of course, all these notions completely contradict the understanding of reality held by most humans, whose perception of reality is still based upon the “Linear Cause and Effect” of Newtonian Physics, if only because Newtonian Physics seems to describe the observed Universe, as we knew it, so well at a preliminary level (though not at the micro level of subatomic ‘particles’, nor in accounting for acceleration and other unusual phenomena).

Non-Locality is defined as that phenomenon in which occurrences on one side of the Universe can instantly affect ‘matter’ or ‘energy’ on the other side of the Universe. Non-Locality has profound implications for the prevailing world view of reality in that it clearly demonstrates the inter-connectedness between all ‘matter’ and ‘energy’ in the Physical Universe and the illusory nature of Space and Time, something that those who have had some sort of deep spiritual experience are already well aware of.

The Gaping Chasm between Economics and Physics

The Gaping Chasm between Economics and Physics – Rising Systemic Risk and Multiple Black Swans: Does today’s dominant economic and financial thinking violate the laws of physics? Mainstream finance and economics have long been inconsistent with the underlying laws of thermodynamics, which are fast catching up as a result of globalization! At present, economics is the study of how people transform nature to meet their needs, and it treats the exploitation of finite natural resources including energy, water, air, arable land and oceans as externalities, which they are not! For example, we cannot pollute and damage natural ecosystems and their local communities ad infinitum without severe repercussions to their underlying sustainability. It is widely recognized, both within the distinguished ATCA community and beyond, that exchange rate instability, equity and commodity market speculation — particularly in fuel, food and finance — and the resulting volatilities, as well as external debt, are the main causes of asymmetric threats and disruption at the international level, manifest as known unknowns, ie, low probability high impact risks, and unknown unknowns, or black swans.

There is a fundamental conflict between economic growth and environmental protection, including conservation of biodiversity, clean air and water, and atmospheric stability. This conflict is due to the laws of thermodynamics. An economic translation of the first law of thermodynamics is that we cannot make something from nothing. All economic production must come from resources provided by nature. Also, any waste generated by the economy cannot simply disappear. At given levels of technology, therefore, economic growth entails increasing resource use and waste in the form of pollution. According to the second law of thermodynamics, although energy and materials remain constant in quantity, they degrade in quality or structure. This is the meaning of increasing entropy. In the context of the economy, whatever resources we transform into something useful must decay or otherwise fall apart to return as waste to the environment. The current model of the disposable economy operates as a system for transforming low-entropy raw materials and energy into high-entropy toxic waste and unavailable energy, while providing society with interim goods and services and the temporary satisfaction that most deliver. Any such transformations in the economy mean that there will be less low-entropy materials and energy available for natural ecosystems. Mounting evidence of this conflict demonstrates the limits to our global growth!

Where do massive turbulences actually come from and what is the underlying cause of periodic financial and economic crises with accelerating levels of severity at national and trans-national levels? Mainstream economics is fundamentally flawed in its measurement of: 1. The value of human capital; 2. The real long term cost of renewal of natural ecosystems and resources; and 3. The overall health of the economy as assessed by Gross Domestic Product (GDP). The near-universal quest for constant economic growth — translated as rising GDP — ignores the world’s diminishing supply of natural resources at humanity’s peril, failing to take account of the principle of net Energy Return On Investment (EROI). The Great Reset — the protracted 21st century financial and economic crisis and global downturn which ATCA originally labeled as The Great Unwind in 2007 — has led to much soul-searching amongst economists and policy makers, the vast majority of whom never saw it coming because they never understood that the credit pyramid is an inversion of the energy-dependence pyramid.

When we look beyond the narrow lens of the current human perspective, survival of all living creatures — including ourselves — is limited by the concept of Energy Return on Investment (EROI). What does EROI really mean? Any living thing or living society can survive only so long as it is capable of getting more net energy from an activity than it expends in performing that activity. This simple concept is ignored by present-day economics when focusing solely on demand and supply curves or daily financial market gyrations. For example, if a human burns energy eating food, that food ought to give that person more energy back than s/he expended, or the person will not survive. It is a golden rule that lies at the core of studying all flora and fauna, whether they are micro-organisms, thousand-year-old trees or mighty elephants. Human society should be looked at no differently: even technologically complex societies are still governed by EROI and the laws of thermodynamics!

The petroleum sector’s EROI in the USA was about 100-to-1 in the 1930s, meaning one had to burn approximately 1 barrel of oil’s worth of energy to get 100 barrels out of the ground. By the 1990s, that number slid to less than 36-to-1, and further down to 19-to-1 by 2006. It has fallen even further in 2009. Oil extraction has evolved by leaps and bounds since the early 1900s, and yet companies must expend much more energy to get less and less oil than they did a hundred years ago. If one were to go from a 19-to-1 energy return on fuel down to a 3-to-1 EROI, economic disruption is guaranteed as nothing is left for other economic activity at all!
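
The arithmetic behind that warning is worth making explicit. Using the EROI figures quoted above (the calculation itself is generic), the share of gross energy output consumed just to obtain energy rises steeply as EROI falls:

for label, eroi in [("1930s", 100), ("1990s", 36), ("2006", 19), ("at 3-to-1", 3)]:
    extraction_share = 1 / eroi   # fraction of gross output burned to get the energy
    print(f"{label:>9}: EROI {eroi:>3}-to-1 -> {extraction_share:.0%} of output consumed by extraction")

# At 100-to-1 the energy sector consumes 1% of its own output; at 3-to-1 it
# consumes a third, before refining, distribution and the rest of the energy
# chain take their own share of what remains.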

Is it because we don’t have the technology that we find ourselves cornered? No. Technology is in a global race with rocketing energy consumption and accelerating depletion of energy, and that’s a very complex set of challenges to confront simultaneously. The resource constraints foreseen by the Club of Rome in 1972 are more evident today than at any time since the publication of the think tank’s famous book, “The Limits to Growth” which stated: “If the present growth trends in world population, industrialization, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.”

Although more than 60 years have elapsed since the Great Depression of the 1930s and the subsequent horrors of the Second World War, our understanding of severe economic downturns has improved very little in the 20th and 21st centuries. Economists, financiers, and policy makers are too often at a loss when asked to provide a diagnosis and propose a remedy for the recurrence of complex systemic risk. The main problem with mainstream economics is that it treats energy like any other commodity input in the production function, thinking of it purely in money terms and handling it as it would other raw materials and sub-components; yet without energy, one cannot have any of the other inputs or outputs! We have to begin regarding Calories, Joules and Watts as the key currencies rather than the Dollar, Euro and Yen!

Is lowering the carbon footprint, as Copenhagen would have us believe, the only answer, or are energy conservation and efficiency another important thread in the global solution that we all seek? Neither would be sufficient at our present rate of accelerating energy consumption worldwide! The International Energy Agency’s data shows that global energy use is doubling every 37 years or so, while energy productivity takes about 56 years to double! Energy and resource conservation is somewhat pointless in the mainstream economic system as it is now legislated and operates. While such efforts are noteworthy, as they buy the world a bit more time — as Copenhagen no doubt will claim — the destination is inevitably unaltered! A barrel of oil not burned by an American or European will be burned by someone else in another emerging country as that nation seeks to topple its rival in the high GDP growth league!
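
Those doubling times translate into annual growth rates as follows; the 37- and 56-year figures are taken from the paragraph above, and the conversion uses the continuous-compounding approximation:

import math

def annual_rate(doubling_years):
    return math.log(2) / doubling_years   # rate r such that e**(r*T) = 2

use = annual_rate(37)            # global energy use doubles every ~37 years
productivity = annual_rate(56)   # energy productivity doubles every ~56 years
print(f"energy use: ~{use:.1%}/yr   energy productivity: ~{productivity:.1%}/yr")
print(f"net growth in consumption: ~{use - productivity:.1%}/yr")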

What is needed is a unified working model consistent with the nature of Energy Return on Investment (EROI) and capable of accounting for the process of all types of capital accumulation, the treatment of externalities as internalities, and the mapping of global energy flows and their circulation. In 1926, Frederick Soddy, a chemist who had been awarded the Nobel Prize a few years earlier, published “Wealth, Virtual Wealth and Debt,” one of the first books to argue that energy, not supply-and-demand curves, should lie at the heart of economics. Soddy was critical of traditional monetary policy for seemingly ignoring the fact that “real wealth” is derived from using energy to transform physical objects, and that these physical objects are inescapably subject to the laws of thermodynamics, or inevitable decline and disintegration.

The main problem is that we as a global society, even in 2009, are almost incapable of detecting and measuring systemic risk in a complex system, as we have seen in the global financial and economic crises: The Great Unwind and The Great Reset. Given this inability, what we tend to do is focus on a single cause — such as capping carbon emissions at Copenhagen — and extrapolate from it a wider perspective, which can lead to an incomplete and distorted view. A new model of economics must aim to contribute to the development of a scientific understanding of the way our economic systems work holistically, with particular reference to the inherent monetary disequilibria caused by dependence on energy, water, air, arable land and natural resources, and to how those could practically be dealt with via modern physics.

We welcome your thoughts, observations and views. To reflect further on this, please respond within Twitter, LinkedIn and Facebook’s ATCA Open and related discussion platform of HQR.

The Great Reset

HQR.com, The Great Reset — How To Regenerate World Growth? The World Trade Organization (WTO) recently held its first major ministerial meeting in Geneva, since Hong Kong in 2005, as it sought to find some way to get world trade liberalization back on track, post ‘The Great Reset’.

What is the Great Reset? The Great Reset occurred between the third quarter of 2008 and the second quarter of 2009, when global demand for durable products collapsed abruptly, leaving a vast gap relative to supply capacity worldwide. The Great Reset is colossal — the steepest fall of world trade in recorded history and the deepest fall since the Great Depression of the 1930s. World demand experienced a sudden, severe and synchronized plunge on an unprecedented scale in the last quarter of 2008 after the insolvency of Lehman Brothers in September. Signs are that we might be turning a corner in the second half of 2009. However, according to Pascal Lamy, Director-General of the WTO, “Despite some evidence that trade volumes grew over the summer this year, global recovery has been patchy — and so fragile that a sudden shock in equity or currency markets could once again undermine consumer and business confidence, leading to a further deterioration of world trade.”

Severity and Speed

It took 24 months in the Great Depression of the 1930s for world trade to fall as far as it fell in the 9 months from November 2008 to July 2009. The seven biggest month-on-month drops in world trade — based on data compiled by the OECD over the past 44 years — all occurred since November 2008. Global trade has dropped before — three times since WWII — but nothing when compared to The Great Reset we are going through at present. Those three recessions were the oil-shock of 1974, the inflation-check of 1982, and the DotCom bust plus 9/11 in 2001. The Great Reset of 2008-09 is much worse; for two quarters in a row, world trade flows have been 15% below their previous year levels. “Driven largely by collapsing domestic demand and production levels, but also by a shortage of affordable trade finance, trade volumes are likely to fall by a further 10% this year. Whether world trade will recover next year is an open question,” states Pascal Lamy.

Scale and Synchronicity

All 104 nations on which the WTO reports data experienced a drop in both imports and exports during the second half of 2008 and the first half of 2009. Imports and exports collapsed for the European Union’s 27 member countries and 10 other leading nations, which together account for three-quarters of world trade. Each of these trade flows dropped by more than 20% during that period; many fell 30% or more. Why did world trade fall so much more than GDP? Given the global recession, a drop in global trade is not surprising. The question remains: Why so big? During the four large post-war recessions — 1975, 1982, 1991 and 2001 — world trade dropped nearly 5 times more than GDP. This time the drop is far, far larger.

Trans-national Supply Chains

Evidence shows that the world trade-to-GDP ratio rose steeply in the late 1990s before stagnating in the 21st century right up to the start of The Great Reset in 2008, when it fell off a cliff. The rise in the 1990s is explained by a number of interlinked factors including trade liberalization and trans-national supply chains. Essentially, geography became history! Manufacturing was geographically unbundled, with various modules of the value-added processes being placed in the most cost-effective or time-efficient nations on the planet. This unbundling meant that the same value-added item crossed national borders several times. In a simple trans-national supply chain, imported sub-components would be transformed step-by-step into exported components, which in turn would be assembled into final goods and exported again, so the world trade figures counted the same value added several times over. The presence of these highly integrated and tightly synchronized production and distribution networks has played an important and unprecedented role in precipitating the severity, speed, scale and synchronicity of The Great Reset worldwide.
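
The double counting is easy to see with numbers. In the illustrative three-stage chain below (all values hypothetical), a good with 100 units of final value generates 210 units of recorded gross exports, which is why a fall in final demand shows up amplified in the trade statistics:

value_added = [40, 30, 30]   # hypothetical value added at each stage of production

gross_exports = 0
cumulative_value = 0
for va in value_added:
    cumulative_value += va              # the good embodies all upstream value added
    gross_exports += cumulative_value   # each border crossing records the full value

print("Final value of the good:", cumulative_value)   # 100
print("Gross exports recorded:", gross_exports)       # 40 + 70 + 100 = 210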

The Great Reset is manifest as a gigantic drop in international sales and is mostly a demand shock — although supply side factors did play some role. The demand shock operated through two distinct reinforcing channels. 1. Commodity prices, which tumbled when the price bubble burst in mid 2008. 2. The production and exports of manufacturing collapsed as modern trade in durable manufactured goods fell dramatically. In the face of financial crisis and uncertainty, consumers and corporations postponed purchases of anything that wasn’t needed immediately.

Conclusion

If we carefully study world events in the 17th, 18th and 19th centuries, as well as the Great Depression in the last century, we can see that Globalization has had a long and cyclical propensity to generate bubble after bubble followed by collapse and whiplash. Clearly, the greatest danger of The Great Reset we are living through, is not simply the destruction of demand, wealth and living standards, however unpalatable that may be. It is also the destruction of “value” in the ethical and moral sense of an interlinked system of trust, commitments and social obligations, which allow trans-national capitalism to operate in harmonious concert. When in 1931 there was a collapse of confidence that resulted in the proliferation of beggar-thy-neighbor policies regarding currencies, trade and immigration, there was “De-Globalization” that damaged economies around the world and ultimately led to geo-political upheaval and confrontation. If we do not pay heed to shore up that core system of values and global trade agreements and descend into tit-for-tat tariffs and penalties — as in the Great Depression of the 1930s — we may be truly on the edge of an abyss within The Great Reset of 2008-?

[ENDS]

We welcome your thoughts, observations and views. To reflect further on this, please respond within Twitter, LinkedIn and Facebook’s ATCA Open and related discussion platform of HQR.

All the best

DK Matai

Chairman and Founder: mi2g.net, ATCA, The Philanthropia, HQR, @G140

To connect directly with:

. DK Matai: twitter.com/DKMatai

. Open HQR: twitter.com/OpenHQR

. ATCA Open: twitter.com/ATCAOpen

. @G140: twitter.com/G140

. mi2g: twitter.com/intunit

— ATCA, The Philanthropia, mi2g, HQR, @G140 —

This is an “ATCA Open, Philanthropia and HQR Socratic Dialogue.”

The “ATCA Open” network on LinkedIn and Facebook is for professionals interested in ATCA’s original global aims, working with ATCA step-by-step across the world, or developing tools supporting ATCA’s objectives to build a better world.

The original ATCA — Asymmetric Threats Contingency Alliance — is a philanthropic expert initiative founded in 2001 to resolve complex global challenges through collective Socratic dialogue and joint executive action to build a wisdom based global economy. Adhering to the doctrine of non-violence, ATCA addresses asymmetric threats and social opportunities arising from climate chaos and the environment; radical poverty and microfinance; geo-politics and energy; organized crime & extremism; advanced technologies — bio, info, nano, robo & AI; demographic skews and resource shortages; pandemics; financial systems and systemic risk; as well as transhumanism and ethics. Present membership of the original ATCA network is by invitation only and has over 5,000 distinguished members from over 120 countries: including 1,000 Parliamentarians; 1,500 Chairmen and CEOs of corporations; 1,000 Heads of NGOs; 750 Directors at Academic Centers of Excellence; 500 Inventors and Original thinkers; as well as 250 Editors-in-Chief of major media.

The Philanthropia, founded in 2005, brings together over 1,000 leading individual and private philanthropists, family offices, foundations, private banks, non-governmental organizations and specialist advisors to address complex global challenges such as countering climate chaos, reducing radical poverty and developing global leadership for the younger generation through the appliance of science and technology, leveraging acumen and finance, as well as encouraging collaboration with a strong commitment to ethics. Philanthropia emphasizes multi-faith spiritual values: introspection, healthy living and ecology. Philanthropia Targets: Countering climate chaos and carbon neutrality; Eliminating radical poverty — through micro-credit schemes, empowerment of women and more responsible capitalism; Leadership for the Younger Generation; and Corporate and social responsibility.

The “E8 Album” photos are visual intersections of Spirituality, Science, Art and Sustainability!

The Holistic Quantum Relativity (HQR) Group is on:

groups.google.com/group/holistic-quantum-relativity-hqr