Bacterial clock: Scientists have engineered bacteria to glow in synchronous waves, as shown in this still image from a video. The genetic circuit might one day be used to detect toxins or deliver drugs.
Credit: Tal Danino, Octavio Mondragon-Palamino, Lev Tsimring, and Jeff Hasty

A Synchronous Clock Made of Bacteria

Such microorganisms might make environmental sensors or drug delivery systems

MIT Technology Review, January 21, 2010, by Emily Singer  —  It’s not your typical clock. Rather than a quartz movement and sweeping second hand, the heart of this device is a colony of genetically engineered bacteria. A deceptively simple circuit of genes allows the microorganisms to keep time with synchronized pulses of fluorescent light, flickering with a slow rhythm whose period runs from 50 to 100 minutes.

The bacteria represent the first synchronized genetic oscillator. Scientists say the tool will be foundational for synthetic biology, an offshoot of genetic engineering that attempts to create microorganisms designed to perform useful functions. The oscillator might one day provide the basis for new biosensors tuned to detect toxins, or for cellular drug delivery systems designed to release chemicals into the body at preprogrammed intervals.

Oscillators are an integral part of the biological world, defining cycles from heartbeats to brain waves to circadian rhythms. They also provide a vital control mechanism in electronic circuits. Biologists first set out to engineer a biological version more than a decade ago, creating a circuit dubbed the “repressilator.” (The creation of the repressilator, along with a genetic on-off switch, in 2000 is generally considered the birth of synthetic biology.) However, early oscillators lacked precision: the rhythm quickly decayed, and its frequency and amplitude couldn’t be controlled.

In 2008, Jeff Hasty and his team at the University of California, San Diego, created a more robust oscillator that could be tuned by the temperature at which the bacteria were grown, the nutrients they were fed, and specific chemical triggers. But the oscillations were still limited to individual cells–the bacteria did not flash together in time. In the new research, published today in the journal Nature, Hasty and colleagues build on this work by incorporating quorum sensing, a molecular form of communication that many bacteria use to coordinate their activity.

The new oscillator consists of a simple circuit of two genes that creates both a positive and negative feedback loop. The circuit is activated by a signaling molecule, which triggers the production of both more of itself and of a glowing molecule called green fluorescent protein. The signaling molecule diffuses out of the cell and activates the circuit in neighboring bacteria.

The activated circuit also produces a protein that breaks down the signaling molecule, providing a time-delayed brake to the cycle. The dynamic interactions of different parts of the circuit in individual and neighboring cells create regular pulses of the signaling molecule and the fluorescent protein, appearing as a wave of synchronous activity. It’s “a feat analogous to engineering all the world’s traffic lights to blink in unison,” wrote Martin Fussenegger, a bioengineer at the Swiss Federal Institute of Technology, in Zurich, in a commentary accompanying the paper in Nature.
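In outline, the circuit behaves like a delayed negative feedback loop: the signal boosts its own production, and the brake the circuit also produces catches up only after a lag. The toy model below is a standard delayed-negative-feedback oscillator with hand-picked, illustrative parameters, not the model published in Nature; it simply shows how a lagged brake alone can generate sustained pulses.

```python
import numpy as np

# Toy model of a delayed-negative-feedback oscillator, the general
# mechanism behind the circuit described above. Illustrative sketch with
# hand-picked parameters, NOT the model from the Nature paper.
beta, K, n = 10.0, 1.0, 4.0   # max production rate, threshold, steepness
gamma, tau = 1.0, 2.0         # decay rate and feedback delay (arbitrary units)
dt, t_end = 0.001, 60.0

steps, lag = int(t_end / dt), int(tau / dt)
H = np.zeros(steps)           # concentration of the signaling molecule
H[:lag] = 0.1                 # initial history needed by the delay term

for i in range(lag, steps - 1):
    delayed = H[i - lag]                              # signal level tau ago
    production = beta / (1.0 + (delayed / K) ** n)    # delayed repression
    H[i + 1] = max(H[i] + dt * (production - gamma * H[i]), 0.0)

# H now traces sustained pulses; the period is set mainly by the delay tau
# and the decay rate gamma, which is why changing growth conditions can
# retune the rhythm of the real circuit.
```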

The colonies of bacteria are grown in a custom-designed microfluidics chip, a device that allows scientists to precisely control the conditions the microorganisms are exposed to. Changing the rate at which nutrients flow into the chip alters the period of the oscillations, says Hasty.

“The ability to synchronize activity among cells in a population could be an important building block for many applications, from biomedicine to bioenergy,” says Ron Weiss, a former TR35 winner and a bioengineer at MIT who was not involved in the research. For example, the bacteria could be engineered to detect a specific toxin, with the frequency of the fluorescence indicating its concentration in the environment. While a microscope is currently needed to read the output, Hasty’s team is now working on a version that can be seen with the naked eye.

The oscillator could also be used to deliver drugs, such as insulin, that function best when dosed at certain intervals. “In the future you could think of implants that produce a therapeutic effect,” says Fussenegger. The dosing of the drug would relate to the strength or amplitude of the oscillation, while the timing of the dosing would be determined by its frequency. “There would be nothing to worry about for the patient,” he says.

Researchers are now trying to make the system more robust, as well as to extend the timescale over which it can synchronize activity. They also want to combine it with previous genetic oscillators, and transfer it into different cell types that might be suited to different biotech applications.

Missouri Univ School of Medicine, 2009/2010  —  As soon as babies are born, they are susceptible to diseases and infections, such as jaundice and E. coli. For up to a month, their immune systems aren’t adequately developed to fight diseases. Although these infections are often minor, they can lead to serious problems if left untreated. To help strengthen newborns’ immune systems, University of Missouri researchers have pinpointed a group of depleted white blood cells, a finding that might lead to an immune-strengthening vaccine.

“We’re trying to improve the immune system of newborns to make them more like adults’ immune systems and, therefore, less susceptible to diseases,” said Christine Hoeman, doctoral student in the MU School of Medicine. “Although our testing has only been on animal models thus far, our ultimate goal is to create better pediatric vaccines for humans to improve the balance within the immune system.”

Hoeman and Habib Zaghouani, professor of molecular microbiology and immunology and child health at the MU School of Medicine, have found that newborns have an imbalance between two groups of T-helper cells (TH cells), white blood cells that are the main fighters in the immune system. Newborns have a large number of TH2 cells, which mediate allergic reactions, but not enough TH1 cells, which fight infections.

Environmental factors also affect the imbalance between these two groups of T-helper cells. The first time newborns are exposed to an antigen, a foreign substance that elicits a response from the immune system, their white blood cells are balanced; but the second time they are exposed to the antigen, they produce too many TH2 cells and not enough TH1 cells. This imbalance is what leads to possible infections and allergic reactions.

“What’s happening is that the TH2 cells are killing the TH1 cells, creating the imbalance,” Hoeman said. “Once we know more about the timeline of the imbalance, we can start to develop the vaccine, which would increase the levels of TH1 and would ideally be administered in newborns soon after they’re born.”

Hoeman and Zaghouani’s research has been published in The Journal of Experimental Medicine and Trends in Immunology.
Regenerative Medicine and Transplantations: How the Immune System Produces T Cells
Babraham Institute*, Cambridge University, January 21, 2010  —  Research from the Babraham Institute, reported in the Journal of Experimental Medicine, provides new insights into how our immune system produces T cells, a type of white blood cell that is an essential part of the body’s immune surveillance system for fighting infection. The findings pave the way for a new means of making purified T cells, overcoming one of the many hurdles faced in the use of T cells in regenerative medicine and transplantation, and will also open up new avenues of research and applications in drug and toxicity testing in industry.

This international collaboration of immunologists draws together academic and commercial researchers from the UK, Japan, GlaxoSmithKline USA and a Da Vinci exchange student from Italy. It reveals for the first time how immature T cells can be grown without the need for supporting “feeder” cells — these cannot be easily separated from T cell preparations, reducing their suitability for transplantation in the clinic. This may advance the field of regenerative medicine.

The discovery will enable scientists to ask fundamental questions about the immune system. Dr Martin Turner, a Group Leader and Head of Babraham’s Laboratory of Lymphocyte Signalling and Development, who led the research team, said, “Studying how T cells develop helps us to understand healthy development, how T cells acquire specialised functions and what factors can cause lymphomas or other devastating illnesses. A goal of research in the field of regenerative medicine is T cell reconstitution for therapeutic purposes.”

“One of the challenges for the scientific community is to reproduce the process of T cell development in the laboratory,” said Dr Michelle Janas, lead author on the paper. “This technology could enable the production of T cells for clinical applications such as their transplantation into immuno-compromised individuals.”

T cells develop in the thymus from progenitor cells recruited from the bone marrow. It is a complicated process requiring many biochemical signals and growth factors that bind to developing T cells. This binding transmits signals inside the cell, causing genetic changes that are required to produce mature, active T cells capable of detecting foreign bodies (viruses, bacteria or fungi) and mounting an appropriate attack.

Thymic function and T cell development are most active in early life, but around the onset of puberty the thymus starts shrinking and fewer T cells are made as we age. This progressive deterioration normally has little effect on healthy people. However, in the event of chemo/radiotherapy or infections like HIV/AIDS, the body’s ability to replace T cells is severely compromised, resulting in an abnormally low level of lymphocytes (T cell lymphopenia). Even after a bone marrow transplant, T cell numbers do not recover for at least two years. This decrease in thymic output also reduces the diversity of T cells patrolling our systems, leaving individuals vulnerable to opportunistic infection. This discovery at Babraham, an institute of the Biotechnology and Biological Sciences Research Council (BBSRC), may facilitate the production of pure, tailor-made T cells for transplantation.

Central to the team’s discovery is a family of signalling proteins called phosphoinositide 3-kinases, or PI3Ks, and their interaction with T cells as they mature in the thymus. PI3Ks are used by cells to transmit signals from receptors on their outside to the machinery inside, dictating how the cell should react, for example when a T cell recognises the presence of a pathogen. However, the receptors with which each of these molecules is associated had not, until now, been identified.

This study reveals that PI3K-p110δ transmits signals from the pre-T cell receptor, a precursor of the T cell receptor, which detects foreign antigens in the body. Another signalling molecule, PI3K-p110γ, transmits signals from a receptor known as CXCR4, which binds to the chemokine CXCL12 produced in the thymus. Chemokines conventionally stimulate cells of the immune system to move (chemotaxis) to a site of infection; these findings, however, indicate that CXCL12 is an important growth factor for developing T cells.

Dr Janas added, “The generation of T cells in culture is currently possible, but requires supporting feeder cells; these mimic the thymus environment but have the disadvantage of contaminating the recovered T cells. Producing T cells without additional feeder cells requires a greater understanding of the growth factors normally provided by the thymus. The discovery that CXCL12 is critical for immature T cell growth brings us a step closer to achieving this goal. We have shown that immature T cells isolated from the thymus could only continue their developmental program when cultured in the presence of CXCL12 and another growth factor known as Notch-ligand. This is the first demonstration of T cell development in vitro that does not require supporting feeder cells.”

These patented discoveries could also be highly valuable in the field of drug discovery and toxicology, where reliable methods to screen and understand the mode of action of pharmacological reagents on lymphocytes are sought, and in clinical settings where sources of purified T cells free of contaminating accessory cells are required for transplantation. The research was funded by the BBSRC and MRC.
*Babraham Institute
The Babraham Institute, set in an attractive and extensive parkland estate just south of Cambridge University, is an independent charitable life sciences institute undertaking innovative biomedical research. The aim of this research is to discover the molecular mechanisms that underlie normal cellular processes and functions, and how their failure or abnormality may lead to disease. The Babraham Institute is sponsored by the Biotechnology and Biological Sciences Research Council (BBSRC) and is based in Babraham, Cambridgeshire, England.

The Babraham Institute provides a highly successful research environment, has the status of a postgraduate department within the University of Cambridge and trains PhD students who are registered with the University’s Faculty of Biology.

Babraham Bioscience Technologies Ltd (BBT), the wholly owned trading subsidiary of the Babraham Institute, promotes knowledge transfer and translation of the Institute’s research discoveries, actively managing and exploiting the Institute’s intellectual property, promoting and negotiating commercial research partnerships and establishing spin-out companies when appropriate. BBT is developing the campus to promote innovation and to become one of the leading centres for bioscience innovation in the UK.


January 22, 2010, by Gabe Mirkin MD  —  A study from the University of California, Davis shows that a high-fat diet prevents exercising mice from enlarging their muscles (Journal of Physiology, December 2009). The mice received either a low-fat, high-carbohydrate diet or a high-fat, low-carbohydrate diet for 14 weeks. Each group was divided into mice that performed progressive resistance exercises with their plantaris muscles and mice that did not.

Those who exercised on the low-fat, high-carbohydrate diet had substantially larger muscles than those who exercised on the high-fat diet. Chemical analysis of their muscles showed that the high-fat diet group had lower levels of the signaling proteins Akt and S6K1, which are necessary for making protein.

If this study can be applied to humans, it will mean that a high-saturated-fat diet not only makes you fatter, it also keeps you from enlarging your muscles. We know that both fat cells that are full and eating large amounts of saturated fat (the dominant fat in meat) turn on your immune system to cause inflammation, which can prevent the body from making the protein necessary for enlarging muscles (Journal of Nutrition, January 2009).

A high-saturated-fat diet also blocks insulin receptors and thus prevents your body from responding to insulin, which is necessary for muscles to heal from intense workouts. Insulin drives amino acids, the protein building blocks, into muscles to help them heal faster. Anything that blocks muscles’ ability to respond to insulin will decrease amino acid entry into muscles and thus delay healing, so you can’t recover as fast for your next workout. Further journal references and recommendations can be found at http://www.drmirkin.com/fitness/muscle_growth.html

All carbohydrates are made up of combinations of sugars. Before any carbohydrate can be absorbed, it must first be broken down into single sugars, which are almost always absorbed before they reach your colon; only single sugars can be absorbed. Dried skins of fruits contain fiber that your body cannot break down, and sugars embedded so deeply in the fiber that they cannot be absorbed in the small intestine. When these sugars reach the colon, bacteria ferment them rapidly, breaking them into 1) small particles that draw large amounts of fluid into the colon, and 2) gas that dilates the colon and pushes stool toward the opening.

When you eat, the pyloric sphincter at the end of the stomach closes, and food is not allowed to pass into your small intestine until it has been converted to a liquid soup. Nutrients are absorbed from the food there, but most of the liquid soup passes to your colon, where the fluid is rapidly absorbed. As a general rule, food reaches your colon four to ten hours after you eat, but it can remain in your colon from many hours to many days, depending on when you push it out. The longer stool remains in your colon, the more water is absorbed, the harder the stool becomes, and the more difficult it is to pass. To prevent or treat constipation, try to move everything from your colon as soon as possible.

If you have chronic constipation, check with your doctor, who will probably order tests to rule out a cancer or other obstruction, or diabetic nerve damage. Usually these tests are normal and you simply need to change your diet.

  • Exercise every day. Exercise causes giant contractions of the colon which push food out.  The longer and harder you exercise, the greater the movement of food toward the outside.
  • Drink plenty of fluid because dehydration increases the rate that fluid is absorbed from your colon.
  • Avoid constipating foods: low-fiber foods such as flour and sugar, and high-fat foods such as cheese, eggs, and meats.
  • Eat lots of high-fiber foods that hold extra water in your colon: vegetables, whole grains, beans, fruits, nuts.
  • Dried fruits are particularly effective for pushing food onward: prunes, apricots, cranberries, apples and so forth.
  • Try to empty your colon within a half hour of eating. When food reaches your stomach, the stomach is stretched, sending a message along nerves from the stomach that causes the colon to contract and push food forward. This is called the gastro-colic reflex. The longer stool remains in your colon, the drier and harder it becomes, so you want to empty your colon as soon as it fills.

Dilithium crystals: Crystals of dilithium oxalate (white strands) form in a solution containing a blue-green copper-based material. This is one stage of a catalytic process that captures carbon dioxide from the atmosphere to produce useful chemicals.
Credit: Elisabeth Bouwman, University of Leiden

A copper-based catalyst helps turn the gas into antifreeze and household cleaners.

MIT Technology Review, January 21, 2010, by Kevin Bullis  —  When it’s exposed to the elements, the surface of copper turns green because it reacts with oxygen. But now scientists have discovered a copper-based material with a surprising property: it reacts with carbon dioxide in air rather than oxygen. Though the reaction is not a practical way to remove large quantities of carbon dioxide from the atmosphere, it does provide a new route to useful chemicals from a cheap, nonpetroleum feedstock.

Researchers have been looking for such a material for a long time, taking a cue from plants, which use atmospheric carbon dioxide to produce a wide range of useful materials. But previous approaches have fallen short in a variety of ways. For example, they’ve required large amounts of energy and concentrated streams of carbon dioxide rather than the trace amounts found in air. One of the big challenges is that materials tend to preferentially react with oxygen, which is much more reactive than carbon dioxide and far more abundant. Oxygen makes up over 20 percent of the atmosphere, whereas there are only a few hundred parts per million of carbon dioxide.

With the new material, “the energy you need to put in is very low,” says Daniel DuBois, a senior scientist at the Pacific Northwest National Laboratory in Richland, WA, who was not involved with the research. “And the fact that it will bind and reduce CO2 directly from the atmosphere is pretty startling. I wouldn’t have thought that you could do that.”

When the copper material is exposed to air, it binds two molecules of carbon dioxide to form oxalate. The researchers then expose the material to a lithium salt, which removes the oxalate from the material, forming lithium oxalate. By applying a low voltage to the copper material, the researchers reduce it to its original state, and it can again bind carbon dioxide. Lithium oxalate can easily be converted to oxalic acid, an ingredient in household cleaners such as rust removers, says Elisabeth Bouwman, a senior lecturer in inorganic chemistry at the University of Leiden in the Netherlands, who led the group that discovered the material. Oxalic acid can also be used to make ethylene glycol, an antifreeze and a precursor to some common plastics.
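Written schematically, the cycle looks something like the following, where L stands for the complex’s organic ligand. This is a simplified sketch consistent with the description above, not the exact published structures:

\begin{aligned}
2\,[\mathrm{Cu^{I}L}]^{+} + 2\,\mathrm{CO_2} &\longrightarrow [(\mathrm{Cu^{II}L})_2(\mathrm{C_2O_4})]^{2+}\\
[(\mathrm{Cu^{II}L})_2(\mathrm{C_2O_4})]^{2+} + 2\,\mathrm{Li^{+}} &\longrightarrow \mathrm{Li_2C_2O_4} + 2\,[\mathrm{Cu^{II}L}]^{2+}\\
2\,[\mathrm{Cu^{II}L}]^{2+} + 2\,e^{-} &\longrightarrow 2\,[\mathrm{Cu^{I}L}]^{+}
\end{aligned}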

But Bouwman notes that the material is “a long way” from commercial applications. For one thing, it is very slow: each cycle takes an hour. It’s also unlikely that the process, or indeed any other process for turning carbon dioxide into commodity chemicals, will significantly reduce atmospheric levels of carbon dioxide. Even if the process can be made much faster, and if expensive lithium salts can be replaced with sodium to reduce costs, there’s not enough demand for such chemicals to make a dent in carbon dioxide levels. For example, the chemicals produced in the largest volumes in the United States are only made at the scale of tens of millions of tons a year. Annual carbon dioxide emissions in the United States are around six billion tons.
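A quick back-of-envelope check of that scale argument, using the article’s own figures (the 50-million-ton output is an illustrative stand-in for “tens of millions of tons”):

```python
# Rough scale comparison from the figures above.
chemical_output_tons = 50e6    # "tens of millions of tons" a year (assumed)
us_co2_emissions_tons = 6e9    # "around six billion tons" a year

fraction = chemical_output_tons / us_co2_emissions_tons
print(f"{fraction:.1%} of annual US CO2 emissions")   # -> 0.8%
```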

The researchers are currently experimenting with the new material, changing it in small ways to learn more about how it works and to improve its performance.

The New York Times, January 21, 2010, by Ashlee Vance  —  Back at the dawn of the Web, the most popular account password was “12345.”

Today, it’s one digit longer but hardly safer: “123456.”

Despite all the reports of Internet security breaches over the years, including the recent attacks on Google’s e-mail service, many people have reacted to the break-ins with a shrug.

According to a new analysis, one out of five Web users still decides to leave the digital equivalent of a key under the doormat: they choose a simple, easily guessed password like “abc123,” “iloveyou” or even “password” to protect their data.

“I guess it’s just a genetic flaw in humans,” said Amichai Shulman, the chief technology officer at Imperva, which makes software for blocking hackers. “We’ve been following the same patterns since the 1990s.”

Mr. Shulman and his company examined a list of 32 million passwords that an unknown hacker stole last month from RockYou, a company that makes software for users of social networking sites like Facebook and MySpace. The list was briefly posted on the Web, and hackers and security researchers downloaded it. (RockYou, which had already been widely criticized for lax privacy practices, has advised its customers to change their passwords, as the hacker gained information about their e-mail accounts as well.)

The trove provided an unusually detailed window into computer users’ password habits. Typically, only government agencies like the F.B.I. or the National Security Agency have had access to such a large password list.

“This was the mother lode,” said Matt Weir, a doctoral candidate in the e-crimes and investigation technology lab at Florida State University, where researchers are also examining the data.

Imperva found that nearly 1 percent of the 32 million people it studied had used “123456” as a password. The second-most-popular password was “12345.” Others in the top 20 included “qwerty,” “abc123” and “princess.”

More disturbing, said Mr. Shulman, was that about 20 percent of people on the RockYou list picked from the same, relatively small pool of 5,000 passwords.

That suggests that hackers could easily break into many accounts just by trying the most common passwords. Because of the prevalence of fast computers and speedy networks, hackers can fire off thousands of password guesses per minute.

“We tend to think of password guessing as a very time-consuming attack in which I take each account and try a large number of name-and-password combinations,” Mr. Shulman said. “The reality is that you can be very effective by choosing a small number of common passwords.”
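The numbers make the point vividly. A minimal sketch using the figures reported in this article (the 1,000-guesses-per-minute rate is an illustrative pick from the quoted “thousands per minute”):

```python
# Back-of-envelope illustration using the figures reported above:
# 32 million accounts, ~1 percent on "123456", ~20 percent drawn from
# the same pool of 5,000 passwords.
accounts = 32_000_000
share_123456 = 0.01          # "nearly 1 percent"
share_top_5000 = 0.20        # "about 20 percent"
guesses_per_minute = 1_000   # illustrative; "thousands ... per minute"

print(f"Openable with one guess:     {accounts * share_123456:,.0f}")
print(f"Openable with 5,000 guesses: {accounts * share_top_5000:,.0f}")
print(f"Minutes to try the whole pool on one account: "
      f"{5_000 / guesses_per_minute:.0f}")
```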

Some Web sites try to thwart the attackers by freezing an account for a certain period of time if too many incorrect passwords are typed. But experts say that the hackers simply learn to trick the system, by making guesses at an acceptable rate, for instance.

To improve security, some Web sites are forcing users to mix letters, numbers and even symbols in their passwords. Others, like Twitter, prevent people from picking common passwords.

Still, researchers say, social networking and entertainment Web sites often try to make life simpler for their users and are reluctant to put too many controls in place.

Even commercial sites like eBay must weigh the consequences of freezing accounts, since a hacker could, say, try to win an auction by freezing the accounts of other bidders.

Overusing simple passwords is not a new phenomenon. A similar survey examined computer passwords used in the mid-1990s and found that the most popular ones at that time were “12345,” “abc123” and “password.”

Why do so many people continue to choose easy-to-guess passwords, despite so many warnings about the risks?

Security experts suggest that we are simply overwhelmed by the sheer number of things we have to remember in this digital age.

“Nowadays, we have to keep probably 10 times as many passwords in our head as we did 10 years ago,” said Jeff Moss, who founded a popular hacking conference and is now on the Homeland Security Advisory Council. “Voice mail passwords, A.T.M. PINs and Internet passwords — it’s so hard to keep track of.”

In the idealized world championed by security specialists, people would have different passwords for every Web site they visit and store them in their head or, if absolutely necessary, on a piece of paper.

But bowing to the reality of our overcrowded brains, the experts suggest that everyone choose at least two different passwords — a complex one for Web sites where security is vital, such as banks and e-mail, and a simpler one for places where the stakes are lower, such as social networking and entertainment sites.

Mr. Moss relies on passwords at least 12 characters long, figuring that those make him a more difficult target than the millions of people who choose five- and six-character passwords.

“It’s like the joke where the hikers run into a bear in the forest, and the hiker that survives is the one who outruns his buddy,” Mr. Moss said. “You just want to run that bit faster.”


There’s more to it than microblog posts and social network updates

MIT Technology Review, January 21, 2010, by Erica Naone  —  The “real-time Web” is a hot concept these days. Both Google and Microsoft are racing to add more real-time information to their search results, and a slew of startups are developing technology to collect and deliver the freshest information from around the Web.

But there’s more to the real-time Web than just microblogging posts, social network updates, and up-to-the-minute news stories. Huge volumes of data are generated, behind the scenes, every time a person watches a video, clicks on an ad, or performs just about any other action online. And if this user-generated data can be processed rapidly, it could provide new ways to tailor the content on a website, in close to real time.

Many Web companies already use analytics to optimize their content throughout the course of a day. Some online news sites will, for example, tweak the layout of their home page by monitoring the popularity of different articles. But traditionally, information has been collected, stored, and then analyzed afterward. Using seconds-old data to tailor content automatically is the next step. In particular, a lot of the information generated in real time relates to advertising. A few startup companies are developing technologies to process this data rapidly.

Sailesh Krishnamurthy, vice president and cofounder of the data-analysis company Truviso, based in Foster City, CA, points to the hundreds of billions of data points created each day through the delivery of online video. “If you think of each one of those hits and the associated advertisements being served by those hits,” he says, “then it’s this complex ecosystem of companies serving the ads, managing the ads, companies trying to figure out metrics. It’s pretty amazing to think that just that one user interaction leads to this explosion of activity happening under the covers.”

Real-time data analysis has its roots in the financial markets, but Ben Lorica, a senior analyst in the research group at O’Reilly Media, believes that Web companies will want to optimize ads, video, and multimedia campaigns as fast as possible. He adds that services that deliver Web content instantly make the approach relevant to end users, too. “As people realize that they can push content out and others will start consuming it in real time, then people will also naturally want the reporting of how that is being consumed in real time,” he says.

Truviso and another startup, StreamBase, based in Lexington, MA, have created software to process real-time analytics data. Both companies were spun out of university research aimed at processing real-time data from sensor networks, such as those used to monitor environmental conditions. Richard Tibbetts, CTO of StreamBase, explains that financial markets make up about 80 percent of his company’s customers today. Web companies are just starting to adopt the technology.

“You’re going to see real-time Web mashups, where data is integrated from multiple sources,” Tibbetts says. Such a mashup could, for example, monitor second-to-second fluctuations in the price of airline tickets and automatically purchase one when it falls below a certain price.
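A toy version of that airline-fare mashup might look like the following; the sources, prices, and buy() call are hypothetical placeholders, not a real airline feed or StreamBase’s API:

```python
# Watch a stream of (source, price) fare updates and act on the first
# fare that drops below a threshold. Everything here is invented for
# illustration.
THRESHOLD = 250.00

def buy(source: str, price: float) -> None:
    print(f"BUY from {source} at ${price:.2f}")

def monitor(price_stream) -> None:
    """Consume fare events; purchase the first qualifying ticket."""
    for source, price in price_stream:
        if price < THRESHOLD:
            buy(source, price)
            return

monitor([("airlineA", 310.0), ("airlineB", 289.5), ("airlineA", 244.0)])
```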

Both StreamBase and Truviso rely on accessing conventional, structured databases. Lorica sees potential for real-time analysis of unstructured data: numbers scattered through a paragraph of text, for example, rather than formatted in a chart. Truviso, meanwhile, recently launched a feature that lets users calculate the number of unique visitors to a website in real time. This has historically been a difficult problem, because several steps must be performed for each event to make sure a visitor is really distinct.
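The article doesn’t describe Truviso’s method, but the heart of any streaming unique-visitor count is checking each incoming event against the visitors already seen. A minimal exact-counting sketch follows; production systems often substitute probabilistic structures such as HyperLogLog to bound memory:

```python
import hashlib

class UniqueVisitorCounter:
    """Exact streaming distinct count over hashed visitor IDs."""

    def __init__(self):
        self._seen = set()

    def observe(self, visitor_id: str) -> int:
        """Record one event; return the running unique-visitor count."""
        digest = hashlib.sha1(visitor_id.encode()).digest()[:8]
        self._seen.add(digest)
        return len(self._seen)

counter = UniqueVisitorCounter()
for event in ["alice", "bob", "alice", "carol"]:   # a toy click stream
    print(event, "->", counter.observe(event))      # 1, 2, 2, 3
```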

Software frameworks such as Hadoop and Google’s MapReduce, which process large amounts of Web data using large numbers of computers, are often used to analyze unstructured data. Recent research from Yahoo and the University of California, Berkeley promises to make these frameworks work in real time, too.

Joseph Hellerstein, a UC Berkeley professor of computer science who was involved with this work, explains that the key was to find a way to make Hadoop and MapReduce faster and more interactive without compromising their ability to protect data.

Real-time applications, whether using traditional database technology or Hadoop, stand to become much more sophisticated going forward. “When people say real-time Web today, they have a narrow view of it–consumer applications like Twitter, Facebook, and a little bit of search,” says StreamBase’s Tibbetts.


Search giant compresses the time frame of its results from minutes to seconds

MIT Technology Review, 2009/2010, by David Talbot  —  Gradually, over the past decade, Google has compressed the time between content appearing on the Web and its appearance in a fresh index from months to mere minutes. On Monday the search giant upped the ante in time-sensitive search, saying that within a few days it will offer search results (including headlines, blogs, tweets, and feeds from Facebook and MySpace) that are just seconds old.

At the same press event, the company unveiled new search features for mobile devices. These include a prototype visual search technology, which allows snapshots of real objects, like signs and buildings, to be used as search “terms.” It also tweaked its geographic search: your GPS-derived position now causes Google to offer different search results based on location. For example, if you start a search with the letters “R” and “E” in Boston, the service will suggest various “Red Sox” search results, while the same two letters typed in San Francisco suggest the retailer REI.
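A toy sketch of that location-biased behavior, with invented terms and scores; this illustrates the idea, not Google’s actual ranking system:

```python
# Hypothetical per-city popularity scores for completions of the prefix
# "re". All names and numbers are invented for illustration.
SUGGESTIONS = {
    "re": [("red sox", {"boston": 5.0, "san francisco": 0.5}),
           ("rei",     {"boston": 0.5, "san francisco": 5.0}),
           ("recipes", {"boston": 1.0, "san francisco": 1.0})],
}

def suggest(prefix: str, city: str, k: int = 2):
    """Rank completions for a prefix by a per-city popularity score."""
    candidates = SUGGESTIONS.get(prefix.lower(), [])
    ranked = sorted(candidates, key=lambda c: -c[1].get(city, 0.0))
    return [term for term, _ in ranked[:k]]

print(suggest("re", "boston"))          # ['red sox', 'recipes']
print(suggest("re", "san francisco"))   # ['rei', 'recipes']
```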

However, Google clearly sees up-to-the-second search results as its most important new offering. The search giant has recently come under unfamiliar pressure from Microsoft’s revamped search engine, Bing, which also provides some “real-time” search results.

“This is the first time, ever, that any search engine has integrated the real-time Web into the results page,” Amit Singhal, a Google fellow, said yesterday in making the announcement.

“Information is being created at a pace I have never seen before–and in this environment, seconds matter,” Singhal added. “I cannot emphasize enough–relevance is the foundation of this product. It is relevance, relevance, relevance. There is so much info being generated out there, getting you relevant information is the key to success of a product like this.”

The arrival of Twitter, in particular, has focused the attention of many Internet companies on the value of real-time information on the Web. By tapping into customers’ interest in time-sensitive information–from Twitter posts to breaking news stories–Google stands to build its audience and, ultimately, its advertising revenues.

The new feature will be available when a user clicks the “Latest results” tab on Google searches. It will be available immediately in English-language countries, but will soon be expanded to other languages, the company says. Searchers will see updates from popular social sites such as Twitter and Friendfeed, and headlines from news sites. Visiting Google Trends and clicking on a “hot topic” will reveal a search results page showing the most popular real-time information.

Other search engines are working to make their results just as fresh. Bing includes some recent results in its search returns, and the newcomer Cuil launched streaming results last month. “It is a good thing to see Google innovate on their search page thanks to competition brought on by other search engines like Bing and Cuil,” said Seval Oz Ozveren, VP for business development at Cuil.

The visual search tool, released in Google Labs, lets users take a photo of a landmark or a store sign, for example, and then searches billions of images for matches, and for Web pages providing relevant information. However, this feature will not include face-recognition software until Google devises a system to protect privacy. “We have decided to delay that until we have more safeguards in place,” says Vic Gundotra, Google’s vice president for engineering.

Dan Weld, a computer scientist and search researcher at the University of Washington, tested the visual search technology and pronounced it “pretty darn cool.” He says that it recognized a can of Diet Dr Pepper and found relevant search returns. And after initially drawing a blank on a bottle of Lipton Iced Tea, it recognized the bottle in a closer shot and delivered good search results.

Weld suggests that the technology works by doing optical-character recognition of the words on an object, rather than recognizing the label image itself, since at one point it caught the letters “API” from a label and gave him search results for “application programming interface”. The technology also recognized the Seattle Space Needle and gave him tourist websites. “Not a formal evaluation, but it’s pretty neat,” he says. “And it seems like it has the potential to be a huge opportunity for them if it takes off.”
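A bare-bones version of the pipeline Weld is guessing at, assuming the open-source Tesseract engine via the pytesseract package; it illustrates the OCR-then-search idea only, not Google’s actual system:

```python
from PIL import Image
import pytesseract
import urllib.parse

def photo_to_query_url(image_path: str) -> str:
    """OCR the photo, then build a search URL from the recovered words."""
    text = pytesseract.image_to_string(Image.open(image_path))
    query = " ".join(text.split())            # collapse whitespace
    return "https://www.google.com/search?q=" + urllib.parse.quote(query)

# e.g. a snapshot of a soda can might yield a query like "Diet Dr Pepper"
# print(photo_to_query_url("can.jpg"))
```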

With the convergence of billions of mobile networked devices, powerful cloud computing resources, and ubiquitous sensors like cameras and GPS chips, “it could be that we are on the cusp of a new computing era,” Gundotra added. “Take the camera and connect it to the cloud, it becomes an eye. The microphone connected to the cloud becomes an ear. Search by sight, search by location, search by voice.”