Target Health Inc. is pleased to have Ferring Pharmaceuticals as a client.

For this approved product we…

European Commission grants Ferring Pharmaceuticals approval of FIRMAGON® (degarelix) for treatment of prostate cancer

New gonadotrophin-releasing hormone (GnRH) receptor antagonist demonstrates rapid, long-term suppression of testosterone

Saint-Prex, Switzerland, 19 February 2009 – Ferring Pharmaceuticals announced today that it has received marketing authorisation from the European Commission for FIRMAGON® (degarelix), a new GnRH receptor antagonist indicated for patients with advanced, hormone-dependent prostate cancer.

In Phase III studies, degarelix produced a significant reduction in testosterone levels[1][2] within three days in more than 96% of study patients.[3] Testosterone plays a major role in the growth and spread of prostate cancer cells.

The data show that degarelix had an extremely rapid effect on testosterone levels, close to the immediate effect achieved with surgery (orchidectomy).[2][3]

“We are delighted with the approval of FIRMAGON® (degarelix), which demonstrated in clinical trials both an immediate onset of action and a profound long-term suppression of testosterone and PSA,” commented Dr. Pascal Danglas, Executive Vice President, Clinical & Product Development at Ferring Pharmaceuticals. “We will work with local authorities to ensure the launch of FIRMAGON to patients across European Union countries as soon as possible.”

The European Commission approval for FIRMAGON® (degarelix) follows approval from the FDA in the US in December 2008.

– ENDS –

Notes to Editors:

About Prostate Cancer
Prostate cancer is the most common form of cancer in men and the second leading cause of cancer death. In the US, an estimated 218,890 new cases and 27,050 deaths were expected for 2007. In 2005, 127,490 new cases were diagnosed in the five largest European countries and 18,310 in Japan.

About degarelix
Degarelix is a GnRH receptor antagonist indicated for advanced prostate cancer. Ferring plans to communicate a range of information about the treatment at the European Association of Urology (EAU) congress in Stockholm in March.

About Ferring
Ferring is a Swiss-headquartered, research-driven, speciality biopharmaceutical group active in global markets. The company identifies, develops and markets innovative products in the areas of urology, endocrinology, gastroenterology, gynaecology, and fertility. In recent years Ferring has expanded beyond its traditional European base and now has offices in over 45 countries. To learn more about Ferring or our products, please visit the company's website.


FDA Approves Drug for Patients with Advanced Prostate Cancer

The U.S. Food and Drug Administration recently approved the injectable drug degarelix, the first new drug in several years for prostate cancer.

Degarelix is intended to treat patients with advanced prostate cancer. It belongs to a class of agents called gonadotropin-releasing hormone (GnRH) receptor inhibitors. These agents slow the growth and progression of prostate cancer by suppressing testosterone, which plays an important role in the continued growth of prostate cancer.

Hormonal treatments for prostate cancer may cause an initial surge in testosterone production before lowering testosterone levels. This initial stimulation of the hormone receptors may temporarily prompt tumor growth rather than inhibit it. Degarelix does not cause this surge.

“Prostate cancer is the second leading cause of cancer death among men in the United States and there is an ongoing need for additional treatment options for these patients,” said Richard Pazdur, M.D., director of the Office of Oncology Drug Products, Center for Drug Evaluation and Research, FDA.

Prostate cancer is one of the most commonly diagnosed cancers in the United States. In 2004, the most recent year for which statistics are currently available, nearly 190,000 men were diagnosed with prostate cancer and 29,000 men died from the cancer.

Several treatment options exist for different stages of prostate cancer including observation, prostatectomy (surgical removal of the prostate gland), radiation therapy, chemotherapy, and hormone therapy with agents that affect GnRH receptors.

The efficacy of degarelix was established in a clinical trial in which patients with prostate cancer received either degarelix or leuprolide, a drug currently used for hormone therapy in treating advanced prostate cancer. Degarelix treatment did not cause the temporary increase in testosterone that is seen with some other drugs that affect GnRH receptors.

In fact, nearly all of the patients on either drug had suppression of testosterone to levels seen with surgical removal of the testes.

The most frequently reported adverse reactions in the clinical study included injection site reactions (pain, redness, and swelling), hot flashes, increased weight, fatigue, and increases in some liver enzymes.

Degarelix is manufactured for Ferring Pharmaceuticals Inc., Parsippany, N.J., by Rentschler Biotechnologie GmbH, Laupheim, Germany.




The New York Times, February 23, 2009, by Donald G. McNeil Jr — In a discovery that could radically change how the world fights flu, researchers have engineered antibodies that protect against many strains of influenza, including even the 1918 Spanish flu and the H5N1 bird flu.

The discovery, experts said, could lead to the development of a flu vaccine that would not have to be changed yearly. And the antibodies already developed can be injected as a treatment, targeting the virus in ways that drugs like Tamiflu do not. Clinical trials to prove they are safe in humans could begin within three years, a researcher estimated.

“This is a really good study,” said Dr. Anthony S. Fauci, the head of the National Institute of Allergy and Infectious Diseases, who was not part of the study. “It’s not yet at the point of practicality, but the concept is really quite interesting.”

The work is so promising that his institute will offer the researchers grants and access to its ferrets, which can catch human flu.

The study, done by researchers from Harvard Medical School, the Centers for Disease Control and Prevention and the Burnham Institute for Medical Research, was published Sunday in the journal Nature Structural & Molecular Biology.

In an accompanying editorial, Dr. Peter Palese, a leading flu researcher from Mount Sinai School of Medicine, said the researchers apparently found “a viral Achilles heel.”

Dr. Anne Moscona, a flu specialist at Cornell University’s medical school, called it “a big advance in itself, and one that shows what’s possible for other rapidly evolving pathogens.”

But Henry L. Niman, a biochemist who tracks flu mutations, was skeptical, arguing that human immune systems would have long ago eliminated flu if the virus were as vulnerable in one spot as this discovery suggests. Also, he noted, protecting the mice in the study took huge doses of antibodies, which today are expensive and cumbersome to infuse.

One team leader, Dr. Wayne A. Marasco of Harvard, said the team began by screening a library of 27 billion antibodies he had created, looking for ones that target the hemagglutinin “spikes” on the shells of flu viruses.

Antibodies are proteins normally produced by white blood cells that attach to invaders, either neutralizing them by clumping on, or tagging them so white cells can find and engulf them. Today, they can be built in the laboratory and then “farmed” in plants, driving prices down, Dr. Marasco said.

The flu virus uses the lollipop-shaped hemagglutinin spike to invade nose and lung cells. There are 16 known types of spikes, H1 through H16.

The spike’s tip mutates constantly, which is why flu shots have to be reformulated each year. But the team found a way to expose the spike’s neck, which apparently does not mutate, and picked antibodies that clamp onto it. Once its neck is clamped, a spike can still penetrate a human cell, but it cannot unfold to inject the genetic instructions that hijack the cell’s machinery to make more virus.

The team then turned the antibodies into full-length immunoglobulins and tested them in mice.

Immunoglobulin — antibodies derived from the blood of survivors of an infection — has a long history in medicine. As early as the 1890s, doctors injected blood from sheep that had survived diphtheria to save a girl dying of it. But there can be dangerous side effects, including severe immune reactions or accidental infection with other viruses.

The mice in the antibody experiments were injected both before and after getting doses of H5N1. In 80 percent of cases, they were protected. The team then showed that their new antibodies could protect against both H1 and H5 viruses. Most of this season’s flu is H1, and experts still fear that the lethal H5N1 bird flu might start a human pandemic.

However, each year’s other seasonal flu outbreaks are usually caused by H3 or B strains, so flu shots must also contain those. But there is always at least a partial mismatch because vaccine makers must pick from among strains circulating in February since it takes months to make supplies. By the time the flu returns in November, its “lollipop heads” have often mutated.

Therefore, other antibodies that clamp to and disable H3 and B will have to be found before doctors even think of designing a once-a-lifetime flu shot. It is also unclear how long an antibody-producing vaccine will offer protection; new antibodies themselves fade out of the blood after about three weeks.

Dr. Marasco said his team had already found a stable neck in the H3 “and we’re going after that one too.” They have not tried with B strains yet.

To make a vaccine work, researchers also need a way to teach the immune system to expose the spike’s neck for attack. It is hidden by the fat lollipop head, whose rapid mutations may act as a decoy, attracting the immune system.

As a treatment for people already infected with flu, Dr. Marasco said, the antibodies are “ready to go, no additional engineering needed.”

They will, of course, need the safety testing required by the Food and Drug Administration.

Anti-flu drugs like Tamiflu, Relenza and rimantadine do not target the hemagglutinin spike at all.

Tamiflu and Relenza inhibit neuraminidase (the “N” in flu names like H5N1), which has been described as a helicopter blade on the outside of the virus that chops up the receptors on the outside of the infected cell so the new virus being made inside can escape. Rimantadine is believed to attack a layer of the virus’s shell.


New Berkeley Lab Report Shows Significant Historical Reductions in the Installed Costs of Solar Photovoltaic Systems in the U.S.

Berkeley, CA — A new study on the installed costs of solar photovoltaic (PV) power systems in the U.S. shows that the average cost of these systems declined significantly from 1998 to 2007, but remained relatively flat during the last two years of this period.

Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) who conducted the study say that the overall decline in the installed cost of solar PV systems is mostly the result of decreases in nonmodule costs, such as the cost of labor, marketing, overhead, inverters, and the balance of systems.

“This suggests that state and local PV deployment programs — which likely have a greater impact on nonmodule costs than on module prices — have been at least somewhat successful in spurring cost reductions,” states the report, which was written by Ryan Wiser, Galen Barbose, and Carla Peterman of Berkeley Lab’s Environmental Energy Technologies Division.

Installations of solar PV systems have grown at a rapid rate in the U.S., and governments have offered various incentives to expand the solar market.

“A goal of government incentive programs is to help drive the cost of PV systems lower. One purpose of this study is to provide reliable information about the costs of installed systems over time,” says Wiser.

The study examined 37,000 grid-connected PV systems installed between 1998 and 2007 in 12 states. It found that average installed costs, in terms of real 2007 dollars per installed watt, declined from $10.50 per watt in 1998 to $7.60 per watt in 2007, equivalent to an average annual reduction of 30 cents per watt or 3.5 percent per year in real dollars.
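
Both reported rates follow directly from the endpoint figures; here is a minimal Python sketch of the arithmetic, using only the $/W values quoted above:

```python
# Reproduce the reported cost decline from the study's endpoint figures.
start_cost = 10.50   # $/W installed in 1998 (real 2007 dollars)
end_cost = 7.60      # $/W installed in 2007
years = 2007 - 1998  # nine annual intervals

linear_drop = (start_cost - end_cost) / years               # average $/W per year
compound_rate = 1 - (end_cost / start_cost) ** (1 / years)  # compound annual decline

print(f"average reduction: ${linear_drop:.2f}/W per year")  # ~$0.32, i.e. roughly 30 cents
print(f"compound decline:  {compound_rate:.1%} per year")   # ~3.5%
```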

The researchers found that the reduction in nonmodule costs was responsible for most of the overall decline in costs. According to the report, this trend, along with a reduction in the number of higher-cost “outlier” installations, suggests that state and local PV-deployment policies have achieved some success in fostering competition within the industry and in spurring improvements in the cost structure and efficiency of the delivery infrastructure for solar power.

Costs differ by region and type of system

Other information about differences in costs by region and by installation type emerged from the study. The cost reduction over time was largest for smaller PV systems, such as those used to power individual households. Also, installed costs show significant economies of scale. Systems completed in 2006 or 2007 that were less than two kilowatts in size averaged $9.00 per watt, while systems larger than 750 kilowatts averaged $6.80 per watt.

Installed costs were also found to vary widely across states. Among systems completed in 2006 or 2007 and smaller than 10 kilowatts, average costs ranged from a low of $7.60 per watt in Arizona, followed by California and New Jersey at $8.10 per watt and $8.40 per watt respectively, to a high of $10.60 per watt in Maryland. Based on these data, and on installed-cost data from the sizable Japanese and German PV markets, the authors suggest that PV costs can be driven lower through sizable deployment programs.

The study also found that the new construction market offers cost advantages for residential PV systems. Among small residential PV systems in California completed in 2006 or 2007, those systems installed in residential new construction cost 60 cents per watt less than comparably-sized systems installed as retrofit applications.

Cash incentives declined

The study also found that direct cash incentives provided by state and local PV incentive programs declined over the 1998–2007 study period. Other sources of incentives, however, have become more significant, including federal investment tax credits (ITCs). As a result of the increase in the federal ITC for commercial systems in 2006, total after-tax incentives for commercial PV were $3.90 per watt in 2007, an all-time high based on the data analyzed in the report. Total after-tax incentives for residential systems, on the other hand, averaged $3.10 per watt in 2007, their lowest level since 2001.

Because incentives for residential PV systems declined over this period, the net installed cost of residential PV has remained relatively flat since 2001. At the same time, the net installed cost of commercial PV has dropped — it was $3.90 per watt in 2007, compared to $5.90 per watt in 2001, a drop of 32 percent, thanks in large part to the federal ITC.
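
The “net installed cost” the report tracks is simply the pre-incentive installed cost minus total after-tax incentives. A minimal sketch of that identity; the $3.90/W incentive figure is from the report, while the $7.80/W installed cost is purely an illustrative assumption:

```python
# Net installed cost: what the system owner pays after incentives, in $/W.
def net_installed_cost(installed: float, after_tax_incentives: float) -> float:
    return installed - after_tax_incentives

# Hypothetical 2007 commercial system: $3.90/W in incentives (from the report)
# against an assumed $7.80/W pre-incentive installed cost.
print(net_installed_cost(7.80, 3.90))  # -> 3.9 ($/W net)
```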

“Tracking the Sun: The Installed Cost of Photovoltaics in the U.S. from 1998–2007,” by Ryan Wiser, Galen Barbose, and Carla Peterman, is available for download. The research was supported by funding from the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy (Solar Energy Technologies Program) and Office of Electricity Delivery and Energy Reliability (Permitting, Siting and Analysis Division), and by the Clean Energy States Alliance.

Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California.

Solar Energy

The sun has produced energy for billions of years. Solar energy is the sun’s rays (solar radiation) that reach the earth.

Solar energy can be converted into other forms of energy, such as heat and electricity. In the 1830s, the British astronomer John Herschel used a solar thermal collector box (a device that absorbs sunlight to collect heat) to cook food during an expedition to Africa.

Solar energy can be converted to thermal (or heat) energy and used to:

· Heat water – for use in homes, buildings, or swimming pools.

· Heat spaces – inside greenhouses, homes, and other buildings.

Solar energy can be converted to electricity in two ways:

· Photovoltaic (PV) devices, or “solar cells” – change sunlight directly into electricity. PV systems are often used in remote locations that are not connected to the electric grid. They are also used to power watches, calculators, and lighted road signs.

· Solar Power Plants – indirectly generate electricity: heat from solar thermal collectors is used to heat a fluid, producing steam that powers a generator. Of the 15 known solar electric generating units operating in the United States at the end of 2006, 10 are in California and 5 in Arizona. No statistics are collected on solar plants that produce less than 1 megawatt of electricity, so there may be smaller solar plants in a number of other states.

The major disadvantages of solar energy are:

· The amount of sunlight that arrives at the earth’s surface is not constant. It depends on location, time of day, time of year, and weather conditions.

· Because the sun doesn’t deliver that much energy to any one place at any one time, a large surface area is required to collect the energy at a useful rate.

Photovoltaic Energy

Photovoltaic energy is the conversion of sunlight into electricity. A photovoltaic cell, commonly called a solar cell or PV cell, is the technology used to convert solar energy directly into electrical power. A photovoltaic cell is a nonmechanical device usually made from silicon alloys.

Sunlight is composed of photons, or particles of solar energy. These photons contain various amounts of energy corresponding to the different wavelengths of the solar spectrum. When photons strike a photovoltaic cell, they may be reflected, pass right through, or be absorbed. Only the absorbed photons provide energy to generate electricity. When enough sunlight (energy) is absorbed by the material (a semiconductor), electrons are dislodged from the material’s atoms. Special treatment of the material surface during manufacturing makes the front surface of the cell more receptive to free electrons, so the electrons naturally migrate to the surface.

When the electrons leave their position, holes are formed. When many electrons, each carrying a negative charge, travel toward the front surface of the cell, the resulting imbalance of charge between the cell’s front and back surfaces creates a voltage potential, like the negative and positive terminals of a battery. When the two surfaces are connected through an external load, electricity flows.

The photovoltaic cell is the basic building block of a photovoltaic system. Individual cells can vary in size from about 1 centimeter (1/2 inch) to about 10 centimeters (4 inches) across. However, one cell only produces 1 or 2 watts, which isn’t enough power for most applications. To increase power output, cells are electrically connected into a packaged, weather-tight module. Modules can be further connected to form an array. The term array refers to the entire generating plant, whether it is made up of one or several thousand modules. The number of modules connected together in an array depends on the amount of power output needed.
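
The cell-to-module-to-array scaling described above is easy to sketch. The per-cell wattage comes from the “1 or 2 watts” figure in the text; the cells-per-module count and target output are illustrative assumptions:

```python
import math

# Size an array from individual cells, per the cell -> module -> array hierarchy.
watts_per_cell = 1.5    # text: one cell produces 1-2 W (midpoint assumed)
cells_per_module = 36   # illustrative assumption for a packaged module
target_watts = 2000.0   # desired array output (illustrative)

watts_per_module = watts_per_cell * cells_per_module
modules_needed = math.ceil(target_watts / watts_per_module)

print(f"{watts_per_module:.0f} W per module")                     # 54 W
print(f"{modules_needed} modules to reach {target_watts:.0f} W")  # 38 modules
```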

The performance of a photovoltaic array is dependent upon sunlight. Climate conditions (e.g., clouds, fog) have a significant effect on the amount of solar energy received by a photovoltaic array and, in turn, on its performance. Most current-technology photovoltaic modules are about 10 percent efficient in converting sunlight to electricity. Further research is being conducted to raise this efficiency to 20 percent.
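
A worked example of what 10 percent efficiency means in practice. The irradiance figure is the standard peak-sun assumption, not a number from the text:

```python
# Electrical output of a module at peak sun: irradiance * area * efficiency.
irradiance_w_m2 = 1000.0  # standard "peak sun" assumption, W/m^2
area_m2 = 1.0             # module area (illustrative)
efficiency = 0.10         # ~10% per the text; the research target is 20%

power_w = irradiance_w_m2 * area_m2 * efficiency
print(f"{power_w:.0f} W from {area_m2:.0f} m^2 at {efficiency:.0%}")  # 100 W
# At the 20% research target, the same square meter would yield ~200 W.
```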

The photovoltaic cell was discovered in 1954 by Bell Telephone researchers examining the sensitivity of a properly prepared silicon wafer to sunlight. Beginning in the late 1950s, photovoltaic cells were used to power U.S. space satellites. The success of PV in space generated commercial applications for this technology. The simplest photovoltaic systems power many of the small calculators and wristwatches used every day. More complicated systems provide electricity to pump water, power communications equipment, and even provide electricity to our homes.

Some advantages of photovoltaic systems are:

1. Conversion from sunlight to electricity is direct, so that bulky mechanical generator systems are unnecessary.

2. PV arrays can be installed quickly and in any size required or allowed.

3. The environmental impact is minimal, requiring no water for system cooling and generating no by-products.

Photovoltaic cells, like batteries, generate direct current (DC), which is generally used for small loads (electronic equipment). When DC from photovoltaic cells is used for commercial applications or sold to electric utilities through the electric grid, it must be converted to alternating current (AC) using inverters, solid-state devices that convert DC power to AC.
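
The inverter step costs a few percent of the array’s output. A minimal sketch; the efficiency figure is an illustrative assumption, not a value from the text:

```python
# AC power delivered after the DC-to-AC inverter stage.
dc_watts = 1000.0            # DC output from the PV array (illustrative)
inverter_efficiency = 0.94   # illustrative; real units vary with load

ac_watts = dc_watts * inverter_efficiency
print(f"{ac_watts:.0f} W AC from {dc_watts:.0f} W DC")  # 940 W
```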

Historically, PV has been used at remote sites to provide electricity. In the future, PV arrays may be located at sites that are also connected to the electric grid, enhancing the reliability of the distribution system.

Solar Thermal Heat

Solar thermal (heat) energy is often used for heating swimming pools, heating water used in homes, and space heating of buildings. Solar space heating systems can be classified as passive or active.

Passive space heating is what happens to your car on a hot summer day. In buildings, the air is circulated past a solar heat surface and through the building by convection (i.e., less dense warm air tends to rise while more dense cooler air moves downward). No mechanical equipment is needed for passive solar heating.

Active heating systems require a collector to absorb and collect solar radiation. Fans or pumps are used to circulate the heated air or heat absorbing fluid. Active systems often include some type of energy storage system.

Solar collectors can be either nonconcentrating or concentrating.

1) Nonconcentrating collectors – have a collector area (i.e., the area that intercepts the solar radiation) that is the same as the absorber area (i.e., the area absorbing the radiation). Flat-plate collectors are the most common and are used when temperatures below about 200°F are sufficient, such as for space heating.

2) Concentrating collectors – where the area intercepting the solar radiation is greater, sometimes hundreds of times greater, than the absorber area (see the sketch below).
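
The difference between the two collector types reduces to a single geometric ratio, as sketched here (the areas are illustrative assumptions):

```python
# Geometric concentration ratio: intercept (collector) area over absorber area.
def concentration_ratio(collector_area_m2: float, absorber_area_m2: float) -> float:
    return collector_area_m2 / absorber_area_m2

print(concentration_ratio(2.0, 2.0))    # 1.0   -> nonconcentrating (flat-plate)
print(concentration_ratio(50.0, 0.25))  # 200.0 -> concentrating ("hundreds of times")
```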

Solar Thermal Power Plants

Solar thermal power plants use the sun’s rays to heat a fluid, from which heat-transfer systems may be used to produce steam. The steam, in turn, is converted into mechanical energy in a turbine and into electricity by a conventional generator coupled to the turbine. Solar thermal power generation works essentially the same way as generation from fossil fuels, except that instead of using steam produced from the combustion of fossil fuels, the steam is produced by the heat collected from sunlight. Solar thermal technologies use concentrator systems because of the high temperatures needed to heat the fluid. The three main types of solar-thermal power systems are parabolic trough, solar dish, and solar power tower systems.
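
Each stage in that chain (collected heat to steam, steam to turbine, turbine to generator) discards some energy, so the electric output is the product of the stage efficiencies. All figures in this sketch are illustrative assumptions:

```python
# Electric output of a solar thermal plant as a product of stage efficiencies.
incident_solar_mw = 100.0  # solar power falling on the collector field (illustrative)
collector_eff = 0.60       # sunlight captured as heat in the fluid (illustrative)
turbine_eff = 0.35         # steam cycle, heat to mechanical energy (illustrative)
generator_eff = 0.98       # mechanical to electric conversion (illustrative)

electric_mw = incident_solar_mw * collector_eff * turbine_eff * generator_eff
print(f"{electric_mw:.1f} MW electric from {incident_solar_mw:.0f} MW of sunlight")  # ~20.6 MW
```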

Solar Energy And The Environment

Solar energy is free, and its supplies are unlimited. Using solar energy produces no air or water pollution but does have some indirect impacts on the environment. For example, manufacturing the photovoltaic cells used to convert sunlight into electricity consumes silicon and produces some waste products. In addition, large solar thermal farms can harm desert ecosystems if not properly managed.


Easy, Clean and Cheap

Spanish Trade Group Visits German PV Installation. German manufacturers produce the most PV systems, followed by the Japanese, with increasing competition from cheaper Chinese PV systems.

IBM is running an ad that shows two workers sinking into quicksand. The article below is a perfect example of what that really means.

Gov. Arnold Schwarzenegger announces that he will sign the newly approved state …

Another High-Profile SAP Failure: State Of California (SAP), February 21, 2009, by Eric Krangel — Yet another black eye for German software giant SAP (SAP). Last month, jeweler Shane Co. blamed difficulties in getting SAP software running as partly responsible for the company’s bankruptcy. Now the State of California — already in dire financial straits — is giving up on its SAP implementation after sinking $25 million into the project and seeing nothing out of it.

CIOinsight: Schwarzenegger has long eyed the payroll as a way to stave off financial shortfalls until a budget reaches his desk for signing. Last July, the governor attempted to cut back payroll by temporarily dropping workers down to minimum wage until a budget deal was hammered out by lawmakers.

This plan was obstructed not by political wrangling or labor lobbyists — it was held up by absolutely ancient IT infrastructure and a beleaguered project to upgrade to SAP.

The California State Controller’s Office (SCO) is currently running on an old COBOL-based payroll system that dates back to the 1970s. The SCO began an initiative in 2006 to update this system, with initial estimates targeting full implementation by 2009. State Controller John Chiang said that the systems needed to carry out Schwarzenegger’s minimum wage plan would not be available for six months. That was last summer.

Just this January, the SCO announced that it was canceling its contract with the consulting company in charge of the project and had not estimated when it would hire another firm to carry on. That was $25 million into an estimated $69 million project.

Earlier this month, SAP co-CEO Leo Apotheker angrily denied there were problems with SAP’s software, and blamed consulting firms like IBM (IBM) and Accenture (ACN) for sending people who knew nothing about the software to clients as experts on SAP. Leo also has said SAP’s new cloud-like package, SAP Business Suite 7, should be easier to implement.

Plenty of blame to go around, we think. At least in the California bomb, the consulting firm involved was BearingPoint, which yesterday filed for bankruptcy. Accenture has already moved to acquire part of BearingPoint’s operations.

February 20, 2009, by George Gilder — EZchip CEO Eli Fruchter is a kindly, tough, humble, inspiring man, with sandy hair above a broad, blunt, weather-beaten face. You would not recognize him as a miracle worker. He does not make grand claims. He is not an agile debater on a panel. He is not full of artful analogies and elegant prose or riveting details or luminous PowerPoints. He does not have avian or angular features or dark hair or other prototypical Israeli characteristics. He tells his story lucidly but without embellishment in careful Hebrew-accented English. Take it or leave it.

If you believe his glib rivals with their claims of chips that will excel and eclipse his own, he does not seem to care. He knows what the customers say, what he has accomplished, and he seems indifferent to the hyperbole of others. Wall Street, the journals and magazines, the tech blogs, they will learn in time. His competitors will learn in time. In an unimpressive gray glass-clad multistory building by a pitted road on a hill in Yokneam, far from the centers of Israeli enterprise, with no architectural distinction or flourish, Fruchter has performed a miracle. But Eli does not preen as a miracle worker. He is embarrassed by prophetic language. When I informed him of my plans for this book, describing Israeli entrepreneurship and technology as the consummation of the Jewish science of the Twentieth Century, he balked, waving me aside.

“I am not important,” he said.

Then he asked me about Einstein.

“You are going to put me in a book with Einstein?” the entrepreneur of EZchip asked incredulously.

“Yes,” I said, “Einstein, and Bohr, and Pauli, and Von Neumann, and Feynman. All those guys were just preparing the way for you, Eli, providing the theoretical foundations for network processors that can compute at the speed of fiber optic communications, at the speed of light.”

Eli peered back at me full of skepticism.

I tried to explain.

Science finds its test in engineering. If scientific theories cannot be incorporated in machines that work, they are a form of theology. Throughout most of the history of science, the pioneers actually built the devices that proved their theories. Faraday, Hertz, Michelson, all those guys described by George Johnson in his book on the great experiments, they all proved their mastery of their ideas by creating the apparatus that tested them and embodied them. If you cannot build something that incorporates your idea, you cannot fully understand it and you probably cannot build on it.

The regnant physicists today are mostly mythopoeic metaphysicians reifying math: string theorists exploring dozens of mythical dimensions; exponents of mythical infinite parallel universes, with anthropic principles to explain us and our ideas as mere random happenings; nanotech evangelists who imagine a mythical reduction of all engineering to pure physics and its replication; cosmologists with their black holes and myriad particle types and unfathomable dark matter and dark energy dominating the universe. These guys cannot begin to construct anything that proves their increasingly fantastic theories.

You, Eli, take the best work of twentieth century science—quantum chemistry and solid state physics and optical engineering and computer science and information theory—and make it into an entirely new device: a network processor that can apply programmable computer intelligence to millions of frames of data and packets of information traveling at rates of a hundred billion bits a second. That’s one hundred gigabits a second. Equivalent to 100,000 400-page books, with each page or so scanned and addressed and sorted, and all sent in one second to the right destination.

When conditions on the network change, the network processor can be reprogrammed. With as many as eight “touches” of the data per packet, classifying the packet, looking up addresses, finding the best route—parsing, searching, resolving, modifying, resetting the packet headers. That means trillions of programmable operations per second. You make the most efficient computers on the planet.

You build things that the world has never seen before. In fact, even so, you and your team—Gil Koren, Amir Ayal, Ran Giladi and the rest—may well not fully understand everything that is going on in your machines. No one has fully fathomed the quantum mysteries underlying modern electronics. But I believe that von Neumann was the paramount figure of Twentieth Century science because he was the link between the pioneers of quantum theory and the machines that won World War II, that prevailed in the cold war, and that enabled the emergence of a global economy tied together and fructified by the Internet. The entire saga is one fabric. And you are the current embodiment of this great tradition, mainly a Jewish tradition.

Von Neumann was the man who outlined the path between the new quantum science of materials and the new computer science of information. “You Eli are a leading figure in the next generation of computer technology: the creation of parallel processors made of sand that can link at fiberspeed with the new optical communications technology.”

“But there are thousands of entrepreneurs in Israel more important than me,” Fruchter insisted. “Thousands. You should speak to Zohar Zisapel. He and his brother created RAD in 1981, put the first modem on a single chip and then started five companies that emerged from RAD. Today they have 2,500 employees in Israel and hundreds more around the world.” I looked it up. They do signaling for high speed trains, electronic messaging to motorists seeking free parking spaces, communications for remote surgery across the globe.

“Zohar’s an Israeli entrepreneur,” says Eli. “He laid the foundations of Israeli technology. EZchip is still just a small company…”

I first heard Eli describe his plans for a network processor at a forum in Atlanta called InterOp 2000. At the time, EZchip was one of at least fifty companies pursuing the technology. Linking the network to computers around the world, it was the most challenging target for the next generation of microchips. A network processor has to function at the speed of a network increasingly made of fiber optic lines. For most of the decade of the 1990s, fiber optics—light transmitted down glass threads—grew in bandwidth and capability at a pace at least three times as fast as the pace of advance of electronics. That pace of advance of computing, called Moore’s Law after Gordon Moore of Intel and named and researched by Carver Mead of Caltech, ordains that computer technology doubles in cost effectiveness every 18 months to two years. During the first decade of the 21st century, fiber optic technology has been advancing nearly twice as fast as Moore’s law.
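
The pacing claims in that paragraph can be made concrete: doubling every 18 months to two years corresponds to roughly 41 to 59 percent annual growth, and a technology advancing “twice as fast” halves the doubling period. A minimal sketch of the arithmetic:

```python
# Annual growth rate implied by a given doubling period (Moore's-law framing).
def annual_growth(doubling_years: float) -> float:
    return 2 ** (1 / doubling_years) - 1

print(f"doubling every 2 years:   {annual_growth(2.0):.0%} per year")   # ~41%
print(f"doubling every 18 months: {annual_growth(1.5):.0%} per year")   # ~59%
# "Twice as fast" halves the doubling period:
print(f"doubling every 9 months:  {annual_growth(0.75):.0%} per year")  # ~152%
```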

The network processor has to bridge this gap. Just as the Pentium is the microprocessor that makes the PC work, the network processor has become the device that makes the next generation Internet work—that does the crucial routing and switching at network nodes on the net.

I first encountered Eli Fruchter not in person but in a series of tapes. Intrigued by the promise of network processors, I ordered them from a major technology conference called InterOp that was holding a network processor forum. At InterOp, engineers have to prove that their technologies can interoperate with other networking technologies and standards. In communications, systems must interoperate or they are useless. Interoperation between systems that are rapidly changing requires devices that are programmable. In the 1990s the fastest changing technology in the world was the network. In those days, nearly anyone who was anybody in networking showed up at InterOp and made his interoperability pitch.

With the market tumbling and my own company in chaos, I had missed InterOp 1999. But I was interested in network processors, and InterOp hosted a two-day forum on the subject. I ordered the tapes and drove around the Berkshires where I live, listening to all the vendors of new network processor designs.

At the time, the leaders were Motorola, Intel, IBM, Trimedia (now part of Alcatel), Cisco, Lucent, Texas Instruments, AMCC, Broadcom, and Agere. You name your technology champion, they were investing billions of dollars apiece in network processor projects. The largest electronics and computer companies in the world put more than $20 billion into network processor design and development over the last decade.

I listened to seven or eight hours of tapes, and I decided that the most plausible, scalable design for a network processor was presented by Eli Fruchter of EZchip. Alone among the presenters, Fruchter seemed to grasp that network processors would have to scale faster than computer technology. Ordinary arrays of parallel RISC (reduced instruction set computing) microprocessors might perform the role for a couple years. But within five years they would be obsolete. Fruchter saw that a new architecture would be needed.

This meant moving beyond the von Neumann computer architecture that had dominated computing since the beginning. The von Neumann model was based on the successive step-by-step movement of data and instructions from memory to a processor. With scores of homogeneous RISC machines requiring data and instructions at once, the performance of the system depended on the bandwidth to memory. It seemed to me that none of the existing network processors had addressed this challenge in a way that would scale with the constant acceleration of dataflows across the Internet.
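
To see why memory bandwidth becomes the binding constraint for a homogeneous RISC array, a rough model helps; every figure below is an illustrative assumption, not EZchip or competitor data:

```python
# Rough model of the aggregate memory demand of a parallel RISC array.
cores = 64                     # parallel RISC engines (illustrative)
instructions_per_sec = 500e6   # per core: 500 MHz at one instruction/cycle (illustrative)
bytes_per_instruction = 8      # instruction fetch plus data traffic (illustrative)

demand_gb_per_s = cores * instructions_per_sec * bytes_per_instruction / 1e9
print(f"aggregate memory demand: {demand_gb_per_s:.0f} GB/s")  # 256 GB/s
# If shared memory cannot sustain this, the cores stall waiting on fetches:
# the von Neumann bottleneck described above.
```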

Most router and switch companies, such as Cisco, Lucent, Juniper, Alcatel and others, had contrived specialized machines. These application specific devices could perform network processing at tremendous speeds for particular protocols and datatypes. But these processors could not change with the changes in the network. They could not adapt. They could not scale. Every time the network changed, the network processing function would have to change. That, it seemed to me, would not be a successful solution.

Nonetheless, at InterOp, Motorola, Intel, AMCC, Agere, Bay Microsystems, and IBM, among others, were presenting programmable processors. Their devices were available in the market and were being produced in volume as workable programmable silicon devices.

As I said at the time, Eli Fruchter had developed a leading edge device, alright, and it met the challenge of changeability and scalability, because it was inscribed upon the easily adaptable and programmable substrate of PowerPoint slides.

Now I reminded Eli: “You had at least 50 competitors and no customers, and no product, and you invested maybe a hundredth of the money that they did.

“Now, just eight years later, you have more than 50 customers, six industry leading products, and virtually no serious competitors. All the large players—Intel, Motorola, IBM—have essentially left the field. That is stunning. How did you do it, Eli?”