20090730-10

The New York Times, July 27, 2009, by John Markoff  —  A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones, which, though still controlled remotely by humans, come close to a machine that can kill autonomously.

Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.

Their concern is that further advances could create profound social disruptions and even have dangerous consequences.

 

As examples, the scientists pointed to a number of technologies as diverse as experimental medical systems that interact with patients to simulate empathy, and computer worms and viruses that defy extermination and could thus be said to have reached a “cockroach” stage of machine intelligence.

 

While the computer scientists agreed that we are a long way from Hal, the computer that took over the spaceship in “2001: A Space Odyssey,” they said there was legitimate concern that technological progress would transform the work force by destroying a widening range of jobs, as well as force humans to learn to live with machines that increasingly copy human behaviors.

 

The researchers – leading computer scientists, artificial intelligence researchers and roboticists who met at the Asilomar Conference Grounds on Monterey Bay in California – generally discounted the possibility of highly centralized superintelligences and the idea that intelligence might spring spontaneously from the Internet. But they agreed that robots that can kill autonomously are either already here or will be soon.

 

They focused particular attention on the specter that criminals could exploit artificial intelligence systems as soon as they were developed. What could a criminal do with a speech synthesis system that could masquerade as a human being? What happens if artificial intelligence technology is used to mine personal information from smart phones?

 

The researchers also discussed possible threats to human jobs, like self-driving cars, software-based personal assistants and service robots in the home. Just last month, a service robot developed by Willow Garage in Silicon Valley proved it could navigate the real world.

 

A report from the conference, which took place in private on Feb. 25, is to be issued later this year. Some attendees discussed the meeting for the first time with other scientists this month and in interviews.

 

The conference was organized by the Association for the Advancement of Artificial Intelligence, and in choosing Asilomar for the discussions, the group purposefully evoked a landmark event in the history of science. In 1975, the world’s leading biologists also met at Asilomar to discuss the new ability to reshape life by swapping genetic material among organisms. Concerned about possible biohazards and ethical questions, scientists had halted certain experiments. The conference led to guidelines for recombinant DNA research, enabling experimentation to continue.

 

The meeting on the future of artificial intelligence was organized by Eric Horvitz, a Microsoft researcher who is now president of the association.

 

Dr. Horvitz said he believed computer scientists must respond to the notions of superintelligent machines and artificial intelligence systems run amok.

 

The idea of an “intelligence explosion” in which smart machines would design even more intelligent machines was proposed by the mathematician I. J. Good in 1965. Later, in lectures and science fiction novels, the computer scientist Vernor Vinge popularized the notion of a moment when humans will create smarter-than-human machines, causing such rapid change that the “human era will be ended.” He called this shift the Singularity.

 

This vision, embraced in movies and literature, is seen as plausible and unnerving by some scientists like William Joy, co-founder of Sun Microsystems. Other technologists, notably Raymond Kurzweil, have extolled the coming of ultrasmart machines, saying they will offer huge advances in life extension and wealth creation.

“Something new has taken place in the past five to eight years,” Dr. Horvitz said. “Technologists are replacing religion, and their ideas are resonating in some ways with the same idea of the Rapture.”

 

The Kurzweil version of technological utopia has captured imaginations in Silicon Valley. This summer an organization called the Singularity University began offering courses to prepare a “cadre” to shape the advances and help society cope with the ramifications.

 

“My sense was that sooner or later we would have to make some sort of statement or assessment, given the rising voice of the technorati and people very concerned about the rise of intelligent machines,” Dr. Horvitz said.

 

The A.A.A.I. report will try to assess the possibility of “the loss of human control of computer-based intelligences.” It will also grapple, Dr. Horvitz said, with socioeconomic, legal and ethical issues, as well as probable changes in human-computer relationships. How would it be, for example, to relate to a machine that is as intelligent as your spouse?

 

Dr. Horvitz said the panel was looking for ways to guide research so that technology improved society rather than moved it toward a technological catastrophe. Some research might, for instance, be conducted in a high-security laboratory.

 

The meeting on artificial intelligence could be pivotal to the future of the field. Paul Berg, who was the organizer of the 1975 Asilomar meeting and received a Nobel Prize in chemistry in 1980, said it was important for scientific communities to engage the public before alarm and opposition become unshakable.

 

“If you wait too long and the sides become entrenched like with G.M.O.,” he said, referring to genetically modified foods, “then it is very difficult. It’s too complex, and people talk right past each other.”

 

Tom Mitchell, a professor of artificial intelligence and machine learning at Carnegie Mellon University, said the February meeting had changed his thinking. “I went in very optimistic about the future of A.I. and thinking that Bill Joy and Ray Kurzweil were far off in their predictions,” he said. But, he added, “The meeting made me want to be more outspoken about these issues and in particular be outspoken about the vast amounts of data collected about our personal lives.”

 

Despite his concerns, Dr. Horvitz said he was hopeful that artificial intelligence research would benefit humans, and perhaps even compensate for human failings. He recently demonstrated a voice-based system that he designed to ask patients about their symptoms and to respond with empathy. When a mother said her child was having diarrhea, the face on the screen said, “Oh no, sorry to hear that.”

 

A physician told him afterward that it was wonderful that the system responded to human emotion. “That’s a great idea,” Dr. Horvitz said he was told. “I have no time for that.”

Ken Conley/Willow Garage

20090730-11

20090730-9

Updated: July 24, 2009

Intelligence officials call unmanned aerial vehicles, often referred to as drones, their most effective weapon against Al Qaeda. The remotely piloted planes are used to transmit live video from Iraq, Afghanistan and Pakistan to American forces, and to carry out air strikes.

Predator spy planes were first used in Bosnia and Kosovo in the 1990s. The Air Force’s fleet has grown quickly in recent years, and consists of 195 Predators – which are 27 feet long and cost $4.5 million apiece – and 28 Reapers, a new, more heavily armed drone. Unmanned drones fly 34 surveillance patrols each day in Iraq and Afghanistan, up from 12 in 2006. They are also transmitting 16,000 hours of video each month, some of it directly to troops on the ground.

In addition, Army units have used hand-launched models, which look like toy planes, to peer over hills or buildings. Other drones monitor the seas and eavesdrop from high altitudes, much like the storied U-2 spy planes.

Despite their popularity, the drones have many shortcomings that have resulted from the rush to deploy them. Air Force officials acknowledge that more than a third of their Predators have crashed. Complaints about civilian casualties, particularly from strikes in Pakistan, have also stirred some concerns among human rights advocates.

In July 2009, the Air Force released a report that envisions building, over the next several decades, larger drones that could do the work of bombers and cargo planes, and even tiny ones that could spy inside a room.

The Air Force also said it could eventually field swarms of drones to attack enemy targets, and that it will have to be ready to defend against the same threat, which could become another inexpensive way for insurgents to attack American forces.

The report envisions a family of drones ranging from “nano”-size craft that could flit inside buildings like moths to gather intelligence, to large aircraft that could be used as strategic bombers or aerial refueling tankers. Midsize drones could act like jet fighters, attacking other planes or ground targets and jamming enemy communications.

Perhaps the most controversial idea is that of drones swarming on attack. Advances in computing power could enable them to mount preprogrammed attacks on their own, though that would be a difficult legal and ethical barrier for the military to cross.

But before long, even a single insurgent could dispatch several small drones at once. Referring to the improvised explosive devices that insurgents have planted like mines in Iraq and Afghanistan, the report warned that the next inexpensive threat to American troops could be “an airborne I.E.D.”

20090730-7

2008-2009 Study

Association for the Advancement of Artificial Intelligence

Co-chairs: Eric Horvitz and Bart Selman

Panel: Margaret Boden, Craig Boutilier, Greg Cooper, Tom Dean, Tom Dietterich, Oren Etzioni, Barbara Grosz, Eric Horvitz, Toru Ishida, Sarit Kraus, Alan Mackworth, David McAllester, Sheila McIlraith, Tom Mitchell, Andrew Ng, David Parkes, Edwina Rissland, Bart Selman, Diana Spears, Peter Stone, Milind Tambe, Sebastian Thrun, Manuela Veloso, David Waltz, Michael Wellman

Terms of Reference

The AAAI President has commissioned a study to explore and address potential long-term societal influences of AI research and development. The panel will consider the nature and timing of potential AI successes, and will define and address societal challenges and opportunities in light of these potential successes. On reflecting about the long term, panelists will review expectations and uncertainties about the development of increasingly competent machine intelligences, including the prospect that computational systems will achieve “human-level” abilities along a variety of dimensions, or surpass human intelligence in a variety of ways. The panel will appraise societal and technical issues that would likely come to the fore with the rise of competent machine intelligence. For example, how might AI successes in multiple realms and venues lead to significant or perhaps even disruptive societal changes?

The committee’s deliberation will include a review and response to concerns about the potential for loss of human control of computer-based intelligences and, more generally, the possibility for foundational changes in the world stemming from developments in AI. Beyond concerns about control, the committee will reflect about potential socioeconomic, legal, and ethical issues that may come with the rise of competent intelligent computation, the changes in perceptions about machine intelligence, and likely changes in human-computer relationships.

In addition to projecting forward and making predictions about outcomes, the panel will deliberate about actions that might be taken proactively over time in the realms of preparatory analysis, practices, or machinery so as to enhance long-term societal outcomes.

On issues of control and, more generally, on the evolving human-computer relationship, writings, such as those by statistician I.J. Good on the prospects of an “intelligence explosion,” followed by mathematician/science fiction author Vernor Vinge’s writings on the inevitable march towards an AI “singularity,” propose that major changes might flow from the unstoppable rise of powerful computational intelligences. Popular movies have portrayed computer-based intelligence to the public with attention-catching plots centering on the loss of control of intelligent machines. Well-known science fiction stories have included reflections (e.g., the “Laws of Robotics” described in Asimov’s Robot Series) on the need for and value of establishing behavioral rules for autonomous systems. Discussion, media, and anxieties about AI in the public and scientific realms highlight the value of investing more thought as a scientific community on perceptions, expectations, and concerns about long-term futures for AI. The committee will study and discuss these issues and will address in their report the myths and potential realities of anxieties about long-term futures. Beyond reflection about the validity of such concerns by scientists and the lay public about disruptive futures, the panel will reflect about the value of formulating guidelines for guiding research and of creating policies that might constrain or bias the behaviors of autonomous and semi-autonomous systems so as to address concerns.

Focus groups:

  • Pace, Concerns, Control, Guidelines

Chair: David McAllester

  • Potentially Disruptive Advances: Nature and timing

Chair: Milind Tambe

  • Ethical and Legal Challenges

Chair: David Waltz

 

Asilomar meeting, February 2009

20090730-81

Attendees at Asilomar, Pacific Grove, February 21-22, 2009 (left to right): Michael Wellman, Eric Horvitz, David Parkes, Milind Tambe, David Waltz, Thomas Dietterich, Edwina Rissland (front), Sebastian Thrun, David McAllester, Margaret Boden, Sheila McIlraith, Tom Dean, Greg Cooper, Bart Selman, Manuela Veloso, Craig Boutilier, Diana Spears (front), Tom Mitchell, Andrew Ng.

Feedback on study: aifutures@aaai.org

20090730-6

Motile cilia on airway epithelial cells
Image: SEM by Tom Moninger
 

The-Scientist.com, July 29, 2009, by Bob Grant  —  Your sense of taste doesn’t end in your mouth: Cilia lining airways leading to the lungs express taste receptors and alter their undulations in the presence of bitter chemicals, says a study published online in the July 23rd edition of Science.

These cilia are linked to signaling pathways that regulate their motility, allowing epithelial tissues in airways to sense toxins or noxious compounds and help protect the lungs.

 

“This is the first paper that shows that motile cilia can have sensory function,” said Gáspár Jékely, a cell and molecular biologist at the Max Planck Institute for Developmental Biology in Tuebingen, Germany, who was not involved in the study. “That’s really quite remarkable that you have structures that move around but are also intimately tied to signaling pathways.”

Primary cilia — relatively immobile cellular extensions that play important roles in sight and olfaction — have long been known to serve in sensory capacities, but motile cilia were thought only to wave (or “beat”), moving mucus through airways and ova down fallopian tubes, for example. Alok Shah, a University of Iowa graduate student in the lab of Michael Welsh and co-first author on the paper, said that researchers typically thought of primary cilia as “the smart ones” while motile cilia were characterized as the “workhorses” of the cilia universe.

“The motile cilia were thought to be, well, just motile and not much more,” Maxence Nachury, a Stanford molecular and cell biologist who was not involved with the study, told The Scientist.

Shah and his collaborators searched for sensory-related genes in samples of human airway epithelia, the cells that line the trachea and upper bronchi. They found several members of the T2R family, which senses and responds to bitter tastes, lurking in microarray expression data. Using antibodies to specific T2Rs, they singled out four receptors localized in cilia protruding from epithelial cells that line airways.

“I was very surprised” by the results, Shah told The Scientist. “When I first saw that, I thought, ‘Oh my God. What are taste receptors doing in the airways?'”

Moreover, the team noticed that the four different T2Rs localized to different parts of a single cilium. “As far as I know, no one has seen different receptors of the same family localizing differentially like that,” Shah said, adding that this differential localization may play a role in tweaking sensitivities or signaling pathways.

To nail down the motile cilia’s role in these signaling pathways, Shah subjected cultured airway epithelial tissues to several bitter compounds, including nicotine. “We found a very nice dose dependent increase in intracellular calcium” — the hallmark of cell signaling cascades — “in response to bitter compounds,” Shah noted. This calcium boost was followed by increased rates of ciliary beating.

The taste-sensing abilities of motile cilia could also lead to drugs that mimic bitter compounds and target motile cilia, increasing their beating to help clear clogged airways, as is common in cystic fibrosis. “This is a clear target that could be used to upregulate ciliary beating in the airways,” said Jékely. Conversely, in asthma, where epithelial airway surfaces become inflamed and irritated, drugs targeting motile cilia could quell the inflammation. “If one could modulate the excitation at the surface of these cilia,” said Nachury, “one could really have a way to dampen that reaction.”

Shah, who recently received his PhD and is now shopping for a postdoc position, said that much about the sensory role and molecular mechanisms of motile cilia is yet to be discovered.

Sudipto Roy, a researcher at the Institute of Molecular and Cell Biology in Singapore, agreed. “It will not be too surprising if subsequent research shows that motile cilia act as signaling hubs for many other kinds of sensory pathways,” Roy wrote in an email to The Scientist.

by Gabe Mirkin MD  —  Why are some people skinny, even though they eat large amounts of food, while others become fat? Jeffrey Gordon of Washington University in St. Louis thinks it’s because some people have types of bacteria that cause them to absorb more calories from their food.

You have two absorption systems in your body. You absorb most of your food as it passes through your small intestine. Food that is not absorbed in the small intestine goes to your colon. The colon contains a huge colony of bacteria that ferment undigested carbohydrates such as soluble fiber into short-chain fatty acids and simple sugars that can then be absorbed through the colon walls into the bloodstream. Most people get about ten percent of their total calories from food absorbed through their colons.

Animal studies lead us to the next step. The dominant bacteria in the gut of obese mice are Firmicutes, types of bacteria that have more genes for breaking down complex starches and fiber. Thin mice have more Bacteroidetes in their guts, and these bacteria are not as efficient at breaking down fiber and complex carbohydrates. Transplanting Firmicutes bacteria into the guts of lean mice made them fat.

These researchers also found that fat humans had far more Firmicutes bacteria than thinner ones. They then asked their overweight subjects to go on a low-fat, low-refined-carbohydrate diet for one year. As they lost weight, their bacteria changed to predominantly Bacteroidetes.

Today you may be able to lose weight by changing the composition of your diet in a way that changes the bacteria in your gut so you absorb fewer calories. In the future, you may be able to get a pill that contains primarily Bacteroidetes bacteria, take it daily, and watch the pounds melt off because of the change in intestinal bacteria. – www.DrMirkin.com

20090730-1

WebMD.com, July 28, 2009, by Denise Foley  —  Thirty billion dollars a year — that’s about how much Americans spend on slim-down products, many of which don’t even work. A better way to get real weight-loss results? Go grocery shopping. New research points to more than a dozen foods, from beans to beef, that can help you fight hunger, kick your candy addiction, boost your metabolism, and ultimately shed pounds. And some of these superfoods deliver health bonuses too.


1. Eggs. Skip the bagel this morning. Eggs, which are full of protein, will help you feel fuller longer, a lot longer. A multicenter study of 30 overweight or obese women found that those who ate two scrambled eggs (with two slices of toast and a reduced-calorie fruit spread) consumed less for the next 36 hours than women who had a bagel breakfast of equal calories. Other research has shown that protein may also prevent spikes in blood sugar, which can lead to food cravings.

2. Beans. You’ve probably never heard of cholecystokinin, but it’s one of your best weight-loss pals. This digestive hormone is a natural appetite suppressant. So how do you get more cholecystokinin? One way, report researchers at the University of California at Davis, is by eating beans: A study of eight men found that their levels of the hormone (which may work by keeping food in your stomach longer) were twice as high after a meal containing beans as after a low-fiber meal containing rice and dry milk. There’s also some evidence that beans keep blood sugar on an even keel, so you can stave off hunger longer. Heart-health bonus: High-fiber beans can lower your cholesterol.

3. Salad. Do you tend to stuff yourself at meals? Control that calorie intake by starting with a large salad (but hold the creamy dressing). In a study of 42 women at Penn State University, those who ate a big, low-cal salad consumed 12 percent less pasta afterward, even though they were offered as much as they wanted. The secret, say researchers, is the sheer volume of a salad, which makes you feel too full to pig out. Health bonus: A study published in the Journal of the American Dietetic Association found that people who ate one salad a day with dressing had higher levels of vitamins C and E, folic acid, lycopene, and carotenoids (all disease fighters) than those who didn’t add salad to their daily menu.

4. Green tea. The slimming ingredient isn’t caffeine. Antioxidants called catechins are what help speed metabolism and fat burning. In a recent Japanese study, 35 men who drank a bottle of oolong tea mixed with green tea catechins lost weight, boosted their metabolism, and had a significant drop in their body mass index. Health bonus: The participants also lowered their (bad) LDL cholesterol.


5. Pears. They’re now recognized as having more fiber, thanks to a corrected calculation by the U.S. Food and Drug Administration. At six grams (formerly four grams) per medium-size pear, they’re great at filling you up. Apples come in second, with about three grams per medium-size fruit. Both contain pectin fiber, which decreases blood-sugar levels, helping you avoid between-meal snacking. This may explain why, in a Brazilian study that lasted 12 weeks, overweight women who ate three small pears or apples a day lost more weight than women on the same diet who ate three oat cookies daily instead of the fruit.

6. Soup. A cup of chicken soup is as appetite-blunting as a piece of chicken: That was the finding of a Purdue University study with 18 women and 13 men. Why? Researchers speculate that even the simplest soup satisfies hunger because your brain perceives it as filling.

7. Lean beef. It’s what’s for dinner, or should be, if you’re trying to shed pounds. The amino acid leucine, which is abundant in proteins like meat and fish as well as in dairy products, can help you pare down while maintaining calorie-burning muscle. That’s what it did for 24 overweight middle-aged women in a study at the University of Illinois at Urbana-Champaign. Eating anywhere from nine to ten ounces of beef a day on a roughly 1,700-calorie diet helped the women lose more weight and more fat, while losing less muscle, than a control group consuming the same number of calories but less protein. The beef eaters also had fewer hunger pangs.

8. Olive oil. Fight off middle-age pounds with extra virgin olive oil. A monounsaturated fat, it’ll help you burn calories. In an Australian study, 12 postmenopausal women (ages 57 to 73) were given a breakfast cereal dressed with either a mixture of cream and skim milk or half an ounce of olive oil and skim milk. The women who ate the oil-laced muesli boosted their metabolism. Don’t want to add olive oil to your oatmeal? That’s OK; it works just as well in salad dressings, as a bread dip, or for sautéing.


9. Grapefruit. It’s back! A 2006 study of 91 obese people conducted at the Nutrition and Metabolic Research Center at Scripps Clinic found that eating half a grapefruit before each meal or drinking a serving of the juice three times a day helped people drop more than three pounds over 12 weeks. The fruit’s phytochemicals reduce insulin levels, a process that may force your body to convert calories into energy rather than flab.

10. Cinnamon. Sprinkle it on microwave oatmeal or whole-grain toast to help cure those mid-afternoon sugar slumps. Research from the U.S. Department of Agriculture found that a little cinnamon can help control post-meal insulin spikes, which make you feel hungry. Health bonus: One USDA study showed that just a quarter teaspoon of cinnamon a day lowered the blood sugar, cholesterol, and triglyceride levels in people with type 2 diabetes.

11. Vinegar. It’s a great filler-upper. In a Swedish study, researchers found that people who ate bread dipped in vinegar felt fuller than those who had their slices plain. The probable reason: Acetic acid in the vinegar may slow the passage of food from the stomach into the small intestine, so your tummy stays full longer. Vinegar can also short-circuit the swift blood-sugar rise that occurs after you eat refined carbs such as white bread, cookies, and crackers.

12. Tofu. It seems too light to be filling, but a study at Louisiana State University showed that tofu does the job. Researchers tested it against chicken as a pre-meal appetizer for 42 overweight women, and the participants who had tofu ate less food during the meal. The secret: Tofu is an appetite-quashing protein.


13. Nuts. Yes, they are fattening: A handful of peanuts is about 165 calories. But research shows that people who snack on nuts tend to be slimmer than those who don’t. A study from Purdue University found that when a group of 15 normal-weight people added about 500 calories worth of peanuts to their regular diet, they consumed less at subsequent meals. The participants also revved up their resting metabolism by 11 percent, which means they burned more calories even when relaxing. Health bonus: Walnuts contain omega-3 fatty acids. And researchers at Loma Linda University recently found that eating 10 to 20 whole pecans daily can reduce heart disease risks.

14. High-fiber cereal. Studies show that you can curb your appetite by eating a bowl for breakfast. But how well does it really work? Researchers at the VA Medical Center and the University of Minnesota in Minneapolis tested the theory against the ultimate diet challenge: the buffet table. They gave 14 volunteers one of five cereals before sending them out to the smorgasbord. Those who’d had the highest-fiber cereal ate less than those who didn’t have as much fiber in the morning. Try General Mills Fiber One (14 grams per serving) or Kellogg’s All Bran With Extra Fiber (13 grams per serving).

15. Hot red pepper. Eating a bowl of spicy chili regularly can help you lose weight. In a Japanese study, 13 women who ate breakfast foods with red pepper (think southwestern omelet) ate less than they normally did at lunch. The magic ingredient may be capsaicin, which helps suppress appetite.