Lawrence Berkeley Labs — Imagine a technology that would not only provide a green and renewable source of electrical energy, but could also help scrub the atmosphere of excess carbon dioxide from the burning of fossil fuels. That’s the promise of artificial versions of photosynthesis, the process by which green plants have been converting solar energy into electrochemical energy for millions of years. To get there, however, scientists need a far better understanding of how Nature does it, starting with the harvesting of sunlight and the transporting of this energy to electrochemical reaction centers.

Elizabeth Read, Graham Fleming and Gabriela Schlau-Cohen have extended the technique known as 2D electronic spectroscopy to the study of energy-transferring functions within pigment-protein complexes, a crucial capability for understanding the astonishing efficiency of photosynthesis. (Photo by Roy Kaltschmidt, Berkeley Lab Public Affairs)

Graham Fleming, a physical chemist who holds joint appointments with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) at Berkeley, is the leader of an ongoing effort to discover how plants are able to transfer energy through a network of pigment-protein complexes with nearly 100-percent efficiency. In previous studies, he and his research group used a laser-based technique they developed called two-dimensional electronic spectroscopy to track the flow of excitation energy through both time and space. Now, for the first time, they’ve been able to connect that flow to energy-transferring functions by providing direct experimental links between atomic and electronic structures in pigment-protein complexes.

“To fully understand how the energy-transfer system in photosynthesis works, you can’t just study the spatial landscape of these pigment-protein complexes; you also need to study the electronic energy landscape. This has been a challenge because the electronic energy landscape is not confined to a single molecule but is spread out over an entire system of molecules,” Fleming said. “Our new 2D electronic spectroscopy technique has enabled us to move beyond the imaging of structures and to start imaging functions. This makes it possible for us to examine the crucial aspects of the energy-transfer system that enable it to work the way it does.”

In a paper published in the Biophysical Journal, Fleming and his group report on a study of the energy-transferring functions within the Fenna-Matthews-Olson (FMO) photosynthetic light-harvesting protein, a pigment-protein complex in green sulfur bacteria that serves as a model system because it consists of only seven well-characterized pigment molecules. The paper, entitled “Visualization of Excitonic Structure in the Fenna-Matthews-Olson Photosynthetic Complex by Polarization-Dependent Two-Dimensional Electronic Spectroscopy,” was co-authored by Elizabeth Read, along with Gabriela Schlau-Cohen, Gregory Engel, Jianzhong Wen and Robert Blankenship.

“The optical properties of bacteriochlorophyll pigments are well understood, and the spatial arrangement of the pigments in FMO is known, but this has not been enough to understand how the protein as a whole responds to light excitation,” said Read. “By polarizing the laser pulses in our 2D electronic spectroscopy set-up in specific ways, we were able to visualize the direction of electronic excitation states in the FMO complex and probe the way individual states contribute to the collective behavior of the pigment-protein complex after broadband excitation.”

Fleming has compared 2D electronic spectroscopy to early superheterodyne radios, in which an incoming high-frequency radio signal was converted by an oscillator to a lower frequency for more controllable amplification and better reception. In 2D electronic spectroscopy, a sample is sequentially flashed with light from three laser beams, delivered in femtosecond bursts, while a fourth beam serves as a local oscillator to amplify and phase-match the resulting spectroscopic signals.
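
For readers who want a feel for how such a spectrum is assembled, the sketch below (illustrative Python, not the group’s analysis code) walks through the standard procedure: the signal is recorded as a function of the delay between the first two pulses (the coherence time) and the detection time after the third pulse, at a fixed waiting time, and Fourier transforms along both time axes produce a two-dimensional map that correlates excitation and detection frequencies. The transition wavelengths, dephasing time and cross-peak amplitude are placeholder values, not FMO parameters.

```python
# Minimal sketch of how a 2D electronic spectrum is assembled (illustrative
# values; not Berkeley Lab's analysis code). The third-order signal is laid
# out on a grid of coherence time t1 and detection time t3 at a fixed waiting
# time t2; Fourier transforming along both axes correlates excitation and
# detection frequencies.
import numpy as np

fs = 1e-15
t1 = np.arange(0, 400, 1.0)[:, None] * fs        # coherence-time axis (s)
t3 = np.arange(0, 400, 1.0)[None, :] * fs        # detection-time axis (s)

w_a = 2 * np.pi * 3e8 / 780e-9                   # transition "a" (~780 nm), rad/s
w_b = 2 * np.pi * 3e8 / 820e-9                   # transition "b" (~820 nm), rad/s
T2 = 80 * fs                                     # dephasing time (placeholder)

def pathway(w_excite, w_detect, amp=1.0):
    """Toy rephasing pathway: oscillates at -w_excite during t1, +w_detect during t3."""
    return amp * np.exp(1j * (-w_excite * t1 + w_detect * t3) - (t1 + t3) / T2)

# Two diagonal peaks plus a weaker cross-peak (excite "a", detect "b").
signal = pathway(w_a, w_a) + pathway(w_b, w_b) + pathway(w_a, w_b, amp=0.4)

# The 2D spectrum: Fourier transform over both time axes. Its magnitude holds
# two diagonal peaks and the cross-peak that links the two transitions.
spectrum = np.fft.fftshift(np.fft.fft2(signal))
print("2D spectrum grid:", spectrum.shape)
```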

This simplified schematic depicts the harvesting of sunlight (photons) and the transfer of this energy via pigment-protein complexes to a photosynthetic reaction center. (Image from the National Energy Research Scientific Computing Center)

“By providing femtosecond temporal resolution and nanometer spatial resolution, 2D electronic spectroscopy allows us to simultaneously follow the dynamics of multiple electronic states, which makes it an especially useful tool for studying photosynthetic complexes,” Fleming said. “Because the pigment molecules within protein complexes have a fixed orientation relative to each other and each absorbs light polarized along a particular molecular axis, the use of 2D electronic spectroscopy with polarized laser pulses allows us to follow the electronic couplings and interactions (between pigments and the surrounding protein) that dictate the mechanism of energy flow. This suggests the possibility of designing future experiments that use combinations of tailored polarization sequences to separate and monitor individual energy relaxation pathways.”

In all photosynthetic systems, the conversion of light into chemical energy is driven by electronic couplings that give rise to collective excitations – called molecular or Frenkel excitons (after Russian physicist Yakov Frenkel) – which are distinct from individual pigment excitations. Energy in the form of these molecular excitons is transferred from one molecule to the next down specific energy pathways as determined by the electronic energy landscape of the complex. Polarization-selective 2D electronic spectroscopy is sensitive to molecular excitons – their energies, transition strengths, and orientations – and therefore is an ideal probe of complex functions.
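
The notion of an energy landscape spread over a whole system of molecules can be made concrete with a small numerical sketch. In the standard Frenkel-exciton picture, the excitation energies of the individual pigments sit on the diagonal of a Hamiltonian matrix and the pigment-pigment electronic couplings sit off the diagonal; diagonalizing that matrix yields the collective exciton energies, and each exciton’s transition dipole is a coefficient-weighted mixture of the pigment dipoles. The three-pigment energies, couplings and dipole directions below are illustrative placeholders, not the measured FMO values.

```python
# Toy Frenkel-exciton calculation: how collective excitons arise from coupled
# pigments. All numbers are illustrative placeholders, not FMO parameters.
import numpy as np

# Site (pigment) energies in cm^-1 on the diagonal, couplings off the diagonal.
H = np.array([[12400.0,   -90.0,     6.0],
              [  -90.0,  12500.0,   30.0],
              [    6.0,     30.0, 12250.0]])

# Unit transition-dipole directions of the individual pigments (illustrative).
mu_site = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.7, 0.7, 0.1]])
mu_site /= np.linalg.norm(mu_site, axis=1, keepdims=True)

# Eigenvalues are the exciton energies; eigenvector columns describe how each
# exciton is delocalized over the pigments.
energies, coeffs = np.linalg.eigh(H)

# Exciton transition dipoles: coefficient-weighted sums of the pigment dipoles.
# Their strengths and orientations set the peak amplitudes and the polarization
# response probed in the 2D experiment.
mu_exciton = coeffs.T @ mu_site

for k, E in enumerate(energies):
    weights = np.round(coeffs[:, k] ** 2, 2)
    print(f"exciton {k}: {E:8.1f} cm^-1, pigment weights {weights}, "
          f"dipole {np.round(mu_exciton[k], 2)}")
```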

“Using specialized polarization sequences that select for a particular cross-peak in a spectrum allows us to probe any one particular electronic coupling even in a system containing many interacting chromophores,” said Read. “The ability to probe specific interactions between electronic states more incisively should help us better understand the design principles of natural light-harvesting systems, which in turn should help in the design of artificial light-conversion devices.”
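
A rough numerical illustration of why such polarization sequences work (a sketch of the general principle, not the particular pulse scheme reported in the paper): for an isotropic sample, the detected amplitude carries an orientational average of the four transition dipoles projected onto the four pulse polarizations. Averaging over randomly oriented complexes shows that a sequence such as (45°, -45°, 90°, 0°) cancels contributions in which all four dipoles are parallel, as in a diagonal peak, while leaving a pathway that involves two differently oriented dipoles, as in a coupling cross-peak. The two dipole directions used below are placeholders.

```python
# Monte Carlo illustration of cross-peak-selective polarization sequences
# (the general principle, not the paper's exact pulse scheme).
import numpy as np

rng = np.random.default_rng(0)

def pol(angle_deg):
    """Lab-frame pulse polarization vector in the x-y plane."""
    a = np.radians(angle_deg)
    return np.array([np.cos(a), np.sin(a), 0.0])

def random_rotations(n):
    """Uniformly distributed rotation matrices from random unit quaternions."""
    q = rng.normal(size=(n, 4))
    q /= np.linalg.norm(q, axis=1, keepdims=True)
    w, x, y, z = q.T
    return np.stack([
        np.stack([1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],     axis=-1),
        np.stack([2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],     axis=-1),
        np.stack([2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)], axis=-1),
    ], axis=1)

def orientational_average(dipoles, angles_deg, n=200_000):
    """<(e1.R mu1)(e2.R mu2)(e3.R mu3)(e4.R mu4)> over random orientations R."""
    R = random_rotations(n)
    prod = np.ones(n)
    for angle, mu in zip(angles_deg, dipoles):
        prod *= (R @ mu) @ pol(angle)   # rotated dipole projected on the pulse polarization
    return prod.mean()

# Two illustrative transition-dipole directions, 70 degrees apart.
mu_a = np.array([1.0, 0.0, 0.0])
mu_b = np.array([np.cos(np.radians(70)), np.sin(np.radians(70)), 0.0])

seq = (45, -45, 90, 0)                  # pulse polarizations in degrees
diagonal = orientational_average([mu_a, mu_a, mu_a, mu_a], seq)
cross = orientational_average([mu_a, mu_b, mu_a, mu_b], seq)
print(f"all-parallel (diagonal) pathway: {diagonal:+.4f}  -> cancelled")
print(f"two-dipole (cross-peak) pathway: {cross:+.4f}  -> survives")
```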


Lawrence Berkeley Labs, BERKELEY, Calif., May 5, 2008 — Three researchers from the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have proposed an innovative way to improve global climate change predictions by using a supercomputer with low-power embedded microprocessors, an approach that would overcome limitations posed by today’s conventional supercomputers.

Berkeley Lab has signed a collaboration agreement with Tensilica®, Inc. to explore the use of Tensilica’s Xtensa processor cores as the basic building blocks in a massively parallel system design. Tensilica’s Xtensa processor is about 400 times more efficient in floating point operations per watt than the conventional server processor chip shown here.

In a paper published in the May issue of the International Journal of High Performance Computing Applications, Michael Wehner and Lenny Oliker of Berkeley Lab’s Computational Research Division, and John Shalf of the National Energy Research Scientific Computing Center (NERSC) lay out the benefits of a new class of supercomputers for modeling climate conditions and understanding climate change. Using the embedded microprocessor technology found in cell phones, iPods, toaster ovens and most other modern-day electronic conveniences, they propose designing a cost-effective machine for running these models and improving climate predictions.

In April, Berkeley Lab signed a collaboration agreement with Tensilica®, Inc. to explore such new design concepts for energy-efficient high-performance scientific computer systems. The joint effort is focused on novel processor and system architectures using large numbers of small processor cores, connected with optimized links and tuned to the requirements of highly parallel applications such as climate modeling.

Understanding how human activity is changing the global climate is one of the great scientific challenges of our time. Scientists have tackled this issue by developing climate models that use historical data on the factors that shape the Earth’s climate, such as rainfall, hurricanes, sea surface temperatures and carbon dioxide in the atmosphere. One of the greatest challenges in creating these models, however, is developing accurate cloud simulations.

Although cloud systems have been included in climate models in the past, their representation lacks the detail that could improve the accuracy of climate predictions. Wehner, Oliker and Shalf set out to establish a practical estimate for building a supercomputer capable of running climate models at a 1-kilometer (km) scale. A cloud-system model at the 1-km scale would provide rich detail that is not available from existing models.

To develop a 1-km cloud model, scientists would need a supercomputer that is 1,000 times more powerful than what is available today, the researchers say. But building a supercomputer powerful enough to tackle this problem is a huge challenge.

Historically, supercomputer makers have built larger and more powerful systems by increasing the number of conventional microprocessors, usually the same kinds used in personal computers. That approach is feasible for many scientific problems, but a system built this way that could model clouds at a 1-km scale would cost about $1 billion. It would also require 200 megawatts of electricity to operate, enough to power a small city of 100,000 residents.


Berkeley Lab scientists Michael Wehner, Lenny Oliker and John Shalf have made the case that using a supercomputer with low-power embedded microprocessors would overcome limitations posed by today’s conventional supercomputers and greatly benefit such challenges as modeling climate conditions and understanding global climate change.

In their paper, “Towards Ultra-High Resolution Models of Climate and Weather,” the researchers present a radical alternative that would cost less to build and require less electricity to operate. They conclude that a supercomputer using about 20 million embedded microprocessors would deliver the needed performance at a construction cost of about $75 million. This “climate computer” would consume less than 4 megawatts of power and achieve a peak performance of 200 petaflops.
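
A quick back-of-envelope reading of those figures, assuming for simplicity that performance and power divide evenly across the cores (the conventional-chip numbers below are illustrative 2008-era assumptions, not values taken from the paper), shows how they line up with the per-processor description given later in the article:

```python
# Back-of-envelope check of the quoted system figures, assuming performance
# and power split evenly across cores. Conventional-chip numbers are
# illustrative assumptions, not values from the paper.
cores = 20e6                     # embedded processor cores in the proposed machine
peak_flops = 200e15              # 200 petaflops peak
power_watts = 4e6                # under 4 megawatts

per_core_flops = peak_flops / cores            # ~10 gigaflops per core
per_core_watts = power_watts / cores           # ~0.2 W, "a few hundred milliwatts"
system_flops_per_watt = peak_flops / power_watts

# Illustrative conventional server chip of the era (assumed figures).
server_flops = 10e9              # ~10 gigaflops peak
server_watts = 100.0             # ~100 W
server_flops_per_watt = server_flops / server_watts

print(f"per core: {per_core_flops / 1e9:.0f} Gflop/s at {per_core_watts * 1e3:.0f} mW")
print(f"proposed system: {system_flops_per_watt / 1e9:.0f} Gflop/s per watt")
print(f"conventional chip: {server_flops_per_watt / 1e9:.1f} Gflop/s per watt")
print(f"implied efficiency ratio: ~{system_flops_per_watt / server_flops_per_watt:.0f}x")
```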

“Without such a paradigm shift, power will ultimately limit the scale and performance of future supercomputing systems, and therefore fail to meet the demanding computational needs of important scientific challenges like climate modeling,” Shalf said.

The researchers arrived at their findings by extrapolating performance data from the Community Atmosphere Model (CAM). Developed at the National Center for Atmospheric Research in Boulder, Colorado, CAM is a series of global atmosphere models commonly used by weather and climate researchers.

The “climate computer” is not merely a concept. Wehner, Oliker and Shalf, along with researchers from UC Berkeley, are working with scientists from Colorado State University to build a prototype system in order to run a new global atmospheric model developed at Colorado State.

“What we have demonstrated is that in the exascale computing regime, it makes more sense to target machine design for specific applications,” Wehner said. “It will be impractical from a cost and power perspective to build general-purpose machines like today’s supercomputers.”

Under the agreement with Tensilica, the team will use Tensilica’s Xtensa LX extensible processor cores as the basic building blocks in a massively parallel system design. Each processor will dissipate a few hundred milliwatts of power, yet deliver billions of floating point operations per second and be programmable using standard programming languages and tools. This equates to an order-of-magnitude improvement in floating point operations per watt, compared to conventional desktop and server processor chips. The small size and low power of these processors allows tight integration at the chip, board and rack level and scaling to millions of processors within a power budget of a few megawatts.

Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California.