Curry Eggplant Pancake with a Wide Variety of Toppings

Medium-rare lamb pieces, cooked in oil and garlic with a fig reduction, are the topping on the eggplant pancake that you see above. It’s garnished with chopped scallion and served with fresh mango and mango chutney. ©Joyce Hays, Target Health Inc.

 

Here, topping the eggplant pancake is a delicious shrimp salad (finely chopped celery, a few spices and a very light creamy dressing) ©Joyce Hays, Target Health Inc.

 

Over the eggplant pancake are curried lamb meatballs served with yogurt and mango chutney; garnished with cilantro. ©Joyce Hays, Target Health Inc.

 

The eggplant pancake is topped with ripe avocados mixed with fresh garlic, finely chopped parsley and cilantro and a simple extra virgin olive oil & fresh lemon dressing. ©Joyce Hays, Target Health Inc.

 

Homemade lentils cooked with tomatoes (sun-dried, fresh, or canned), onion, garlic, parsley, spices, and seasonings. I always cook lentils in chicken stock or broth. Not shown is the finely chopped fresh parsley garnish. ©Joyce Hays, Target Health Inc.

 

On the eggplant pancake is leftover chicken or turkey, turned into a salad with fresh apples, green and red grapes, garlic, scallions, and nuts. ©Joyce Hays, Target Health Inc.

 

Curried eggplant pancake, served plain without any topping, as a veggie along with a cooked but rare salmon entrée and a couple of asparagus spears. ©Joyce Hays, Target Health Inc.

 

Get all your ingredients together. ©Joyce Hays, Target Health Inc.

 

 

Ingredients

 

2 or 3 cups roasted eggplant

2 large eggs, slightly beaten

1 onion, well chopped

6 fresh garlic cloves, thinly sliced

1/2 cup chickpea flour

1/4 to 1/2 cup creamy goat cheese or plain Greek yogurt

1/2 teaspoon baking powder

Pinch salt

Pinch black pepper

Pinch dried oregano

1 teaspoon curry powder

1/2 cup fresh mint, very finely chopped

1/4 to 1/2 cup canola oil, more if needed

 

 

Directions

 

  1. Roast 1 or 2 Italian eggplants until the inside is very soft. With an oven mitt on, squeeze the eggplant to feel the degree of softness. Remove from the oven when done and (using oven mitts) cut each eggplant in half and open like a book to cool on a plate.
  2. When cool enough to handle, discard any seeds and the skin, and transfer the soft eggplant to a food processor to break up and soften any fibers. Set aside.
  3. While the eggplant is roasting, do all your chopping.
  4. Into a large mixing bowl, add the eggs, garlic, onion, creamy goat cheese (or yogurt), and finely chopped parsley. Stir to combine everything well.
  5. Add the eggplant from the food processor to the bowl of mixed wet ingredients.
  6. In another bowl, mix the dry ingredients: flour, baking powder, salt, black pepper, curry powder, dried oregano, and fresh mint. Mix well.
  7. Slowly add the mixed dry ingredients to the eggplant mixture and stir until the batter is just moistened. Don’t over-mix.

 

Cooking the Eggplant Pancakes

 

Heat canola oil in a large skillet over medium-high heat. Drop rounded spoonfuls of eggplant batter into hot oil and fry until golden, 2 to 3 minutes per side. Drain pancakes on a paper towel-lined plate.

 

Bake the whole eggplant. You can rub extra virgin olive oil over the skin before baking. ©Joyce Hays, Target Health Inc.

 

I covered the whole eggplants lightly with foil. ©Joyce Hays, Target Health Inc.

 

When roasted and soft when you squeeze (but not mushy), remove from the oven and let cool. ©Joyce Hays, Target Health Inc.

 

Italian eggplants don’t have as many seeds as other varieties, so use these. When eggplant has cooled enough to handle, cut in half and open like a book to cool more. ©Joyce Hays, Target Health Inc.

 

Scoop all the flesh out of the eggplant and put into food processor. ©Joyce Hays, Target Health Inc.

 

Do all your chopping at the same time. ©Joyce Hays, Target Health Inc.

 

 

Mix all wet ingredients together in a bowl. ©Joyce Hays, Target Health Inc.

 

After mixing the wet ingredients, add the eggplant from the food processor to the wet ingredients. ©Joyce Hays, Target Health Inc.

 

Slowly add the mixed dry ingredients to the bowl of wet ingredients and stir only until you get a moist batter. Don’t over-stir. ©Joyce Hays, Target Health Inc.

 

Don’t crowd the pan. Cook over a medium-high flame. ©Joyce Hays, Target Health Inc.

 

Cook for about 3 or 4 minutes on each side. ©Joyce Hays, Target Health Inc.

 

From pan to plate covered with paper towel to drain. ©Joyce Hays, Target Health Inc.

 

On a platter, serve plain as a vegetable with fish, seafood, poultry, meat. Or with a topping of your choice, make this into a meal in itself. ©Joyce Hays, Target Health Inc.

 

With curried lamb cooked in a wine/fig sauce, the eggplant pancake turned into a gourmet feast. With it, we had a delicious, robust Shiraz from Southern Australia (you’ve got to try it). ©Joyce Hays, Target Health Inc.

 

This wine was a gift from dinner guests. We didn’t open it right away. Recently, when we opened it for the eggplant pancake with lamb, we were so-o surprised at the unique quality of this Shiraz. You take notice the minute the first sip hits your mouth; then comes such a pleasant sensation as it explodes and sets your throat on fire, and the long finish continues as the heat lingers. This is a fabulous 2012 Henschke blend, with 65% Shiraz, 20% Cab, 10% Merlot, and 5% Cabernet Franc. We highly recommend this wine. ©Joyce Hays, Target Health Inc.

 

 

From Our Table to Yours!

 

Bon Appetit!

 

Viruses Created to Selectively Attack Tumor Cells

Date:
March 16, 2017

Source:
Institute for Research in Biomedicine (IRB Barcelona)

Summary:
This innovative approach takes advantage of the different expression profiles of certain proteins in tumor and healthy cells, so that the virus infects only the tumor cells.

 

The image shows tumor cells infected by the virus, which expresses a fluorescent protein. Over the days (the image shows day five), the virus multiplies, generating new virions that infect more cancer cells.
Credit: IDIBAPS, IRB Barcelona

 

 

Scientists at the IDIBAPS Biomedical Research Institute and at the Institute for Research in Biomedicine (IRB Barcelona) led a study in which they designed a new strategy to get genetically modified viruses to selectively attack tumor cells without affecting healthy tissues. The study, published today by the journal Nature Communications, is part of Eneko Villanueva’s work for his PhD and is co-led by Cristina Fillat, head of the Gene Therapy and Cancer Group at IDIBAPS, and Raúl Méndez, ICREA researcher at IRB Barcelona.

Conventional cancer treatment may cause undesirable side effects as a result of poor selectivity. To avoid them, it is important that new therapies efficiently remove cancer cells while preserving healthy ones. One of the new approaches in cancer therapy is based on the development of oncolytic viruses, i.e., viruses modified to infect only tumor cells. In recent years several studies have focused on the development of viruses created by genetic engineering to maximize their anticancer effect but, as their potency increases, so does the associated toxicity. Limiting this effect on healthy cells is now the key to the application of this promising therapy.

An innovative and specific approach

In the study published in the journal Nature Communications, researchers from IDIBAPS and IRB Barcelona have developed an innovative approach to provide adenovirus with high specificity against tumor cells. “We have taken advantage of the different expression of a type of protein, CPEBs, in normal and tumor tissues,” explains Raúl Méndez from IRB Barcelona.

CPEB is a family of four RNA-binding proteins (RNA being the molecules that carry information from genes to synthesize proteins) that control the expression of hundreds of genes and maintain the functionality and the ability to repair tissues under normal conditions. When CPEBs become imbalanced, they change the expression of these genes in cells and contribute to the development of pathological processes such as cancer. “We have focused on the double imbalance of two of these proteins in healthy tissues and tumors: on the one hand we have CPEB4, which previous studies have shown to be highly expressed in cancer cells and necessary for tumor growth; and, on the other hand, CPEB1, expressed in normal tissue and lost in cancer cells. We have taken advantage of this imbalance to make a virus that only attacks cells with high levels of CPEB4 and low levels of CPEB1, which means that it only affects tumor cells, ignoring the healthy tissues,” says Méndez.
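
To make that logic concrete, here is a minimal toy sketch in Python of the selectivity rule described above: the engineered virus should replicate only where CPEB4 is high and CPEB1 is low. The threshold values and the function name are invented for illustration and are not part of the study.

def virus_replicates(cpeb4_level, cpeb1_level, cpeb4_threshold=1.0, cpeb1_threshold=1.0):
    """Toy rule only: the engineered adenovirus is activated by CPEB4 and repressed by CPEB1."""
    return cpeb4_level > cpeb4_threshold and cpeb1_level < cpeb1_threshold

# Tumor-like profile (high CPEB4, low CPEB1) -> True; healthy-like profile -> False.
print(virus_replicates(cpeb4_level=3.0, cpeb1_level=0.2))   # True
print(virus_replicates(cpeb4_level=0.4, cpeb1_level=2.5))   # False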

“In this study we have worked with adenoviruses, a family of viruses that can cause infections of the respiratory tract and urinary tract, conjunctivitis, or gastroenteritis, but which have features that make them very attractive for use in therapy against tumors,” explains Cristina Fillat. To do this, it is necessary to modify the genome of these viruses. In the study, the researchers inserted sequences recognized by CPEB proteins into key regions for the control of viral proteins. Their activity was checked in in vitro models of pancreatic cancer, and control of tumor growth was observed in mouse models.

The oncoselective viruses created in this study were very sophisticated, being activated by CPEB4 but repressed by CPEB1. Thus, the researchers achieved attenuated viral activity in normal cells, while in tumor cells the virus potency was maintained or even increased. “When the modified viruses entered tumor cells, they replicated their genome and, when going out, they destroyed the cell and released more particles of the virus with the potential to infect more cancer cells,” says Fillat. She adds, “This new approach is very interesting since it is a therapy selectively amplified in the tumor.”

Since CPEB4 is overexpressed in several tumors, this oncoselective strategy may be valid for other solid tumors. Researchers are now trying to combine this treatment with therapies that are already being used in clinical practice, or that are in a very advanced stage of development, to find synergies that make them more effective.


Story Source:

Materials provided by Institute for Research in Biomedicine (IRB Barcelona). Note: Content may be edited for style and length.


Journal Reference:

  1. Eneko Villanueva, Pilar Navarro, Maria Rovira-Rigau, Annarita Sibilio, Raúl Méndez, Cristina Fillat. Translational reprogramming in tumour cells can generate oncoselectivity in viral therapies. Nature Communications, 2017; 8: 14833 DOI: 10.1038/NCOMMS14833

 

Source: Institute for Research in Biomedicine (IRB Barcelona). “Viruses created to selectively attack tumor cells.” ScienceDaily. ScienceDaily, 16 March 2017. <www.sciencedaily.com/releases/2017/03/170316112147.htm>.

Newly available tool will help researchers worldwide shed light on inner workings of cells

Date:
March 15, 2017

Source:
University of Alberta

Summary:
Researchers have developed a new method of controlling biology at the cellular level using light. The tool — called a photocleavable protein — breaks into two pieces when exposed to light, allowing scientists to study and manipulate activity inside cells in new and different ways.

 

Light-activated control of protein localization in mammalian cells. The protein is initially in the cytoplasm and excluded from the nucleus of the cell (blue area in the middle of each cell). Upon illumination, the part of the protein that prevents it from entering the nucleus is cleaved off, and the protein is then able to enter the nucleus. The concentration of protein is represented by the color, with blue indicating a low concentration and red representing a high concentration.
Credit: Robert Campbell

 

 

Researchers at the University of Alberta have developed a new method of controlling biology at the cellular level using light.

The tool — called a photocleavable protein — breaks into two pieces when exposed to light, allowing scientists to study and manipulate activity inside cells in new and different ways.

First, scientists use the photocleavable protein to link cellular proteins to inhibitors, preventing the cellular proteins from performing their usual function. This process is known as caging.

“By shining light into the cell, we can cause the photocleavable protein to break, removing the inhibitor and uncaging the protein within the cell,” said lead author Robert Campbell, professor in the Department of Chemistry. Once the protein is uncaged, it can start to perform its normal function inside the cell.

The tool is relatively easy to use and widely applicable for other research that involves controlling processes inside a cell.

The power of light-sensitive proteins, Campbell explained, is that they can be used to study the inner workings of any living cell. For example, optogenetic tools are widely used to activate brain activity in mice.

“We could use the photocleavable protein to study single bacteria, yeast, human cells in the lab or even whole animals such as zebrafish or mice,” explained Campbell. “To put these proteins inside an animal, we simply splice the gene for the protein into DNA and insert it into the cells using established techniques.”

The gene for the photocleavable protein will be made available on Addgene, providing access to other researchers and scientists.

“We want to provide new ways to learn about cell biology,” said Campbell. “I see countless potential applications for research and future investigation — from looking at which cells become which tissues in developmental biology, to investigating the possibilities of gene-editing technology.”

The research was published in Nature Methods in March 2017. It was conducted in collaboration with Roger Thompson and post-doctoral fellow Alex Lohman of the Hotchkiss Brain Institute at the University of Calgary.


Story Source:

Materials provided by University of Alberta. Original written by Katie Willis. Note: Content may be edited for style and length.


Journal Reference:

  1. Wei Zhang, Alexander W Lohman, Yevgeniya Zhuravlova, Xiaocen Lu, Matthew D Wiens, Hiofan Hoi, Sine Yaganoglu, Manuel A Mohr, Elena N Kitova, John S Klassen, Periklis Pantazis, Roger J Thompson, Robert E Campbell. Optogenetic control with a photocleavable protein, PhoCl. Nature Methods, 2017; DOI: 10.1038/nmeth.4222

 

Source: https://www.sciencedaily.com/releases/2017/03/170315125623.htm

World’s Oldest Plant-Like Fossils Show Multicellular Life Appeared Earlier Than Thought

Date:
March 14, 2017

Source:
PLOS

Summary:
Scientists have found fossils of 1.6 billion-year-old probable red algae. The spectacular finds indicate that advanced multicellular life evolved much earlier than previously thought.

 

X-ray tomographic picture (false colors) of fossil thread-like red algae.
Credit: Stefan Bengtson; CCAL

 

 

Scientists at the Swedish Museum of Natural History have found fossils of 1.6 billion-year-old probable red algae. The spectacular finds, publishing on 14 March in the open access journal PLOS Biology, indicate that advanced multicellular life evolved much earlier than previously thought.

The scientists found two kinds of fossils resembling red algae in uniquely well-preserved sedimentary rocks at Chitrakoot in central India. One type is thread-like; the other consists of fleshy colonies. The scientists were able to see distinct inner cell structures and so-called cell fountains, the bundles of packed and splaying filaments that form the body of the fleshy forms and are characteristic of red algae.

“You cannot be a hundred per cent sure about material this ancient, as there is no DNA remaining, but the characters agree quite well with the morphology and structure of red algae,” says Stefan Bengtson, Professor emeritus of palaeozoology at the Swedish Museum of Natural History.

The earliest traces of life on Earth are at least 3.5 billion years old. These single-celled organisms, unlike eukaryotes, lack nuclei and other organelles. Large multicellular eukaryotic organisms became common much later, about 600 million years ago, near the transition to the Phanerozoic Eon, the “time of visible life.”

Discoveries of early multicellular eukaryotes have been sporadic and difficult to interpret, challenging scientists trying to reconstruct and date the tree of life. The oldest known red algae before the present discovery are 1.2 billion years old. The Indian fossils, 400 million years older and by far the oldest plant-like fossils ever found, suggest that the early branches of the tree of life need to be recalibrated.

“The ‘time of visible life’ seems to have begun much earlier than we thought,” says Stefan Bengtson.

The presumed red algae lie embedded in fossil mats of cyanobacteria, called stromatolites, in 1.6 billion-year-old Indian phosphorite. The thread-like forms were discovered first, and when the then doctoral student Therese Sallstedt investigated the stromatolites she found the more complex, fleshy structures.

“I got so excited I had to walk three times around the building before I went to my supervisor to tell him what I had seen!” she says.

The research group was able to look inside the algae with the help of synchrotron-based X-ray tomographic microscopy. Among other things, they have seen regularly recurring platelets in each cell, which they believe are parts of chloroplasts, the organelles within plant cells where photosynthesis takes place. They have also seen distinct and regular structures at the centre of each cell wall, typical of red algae.


Story Source:

Materials provided by PLOS. Note: Content may be edited for style and length.


Journal Reference:

  1. Stefan Bengtson, Therese Sallstedt, Veneta Belivanova, Martin Whitehouse. Three-dimensional preservation of cellular and subcellular structures suggests 1.6 billion-year-old crown-group red algae. PLOS Biology, 2017; 15 (3): e2000735 DOI: 10.1371/journal.pbio.2000735

 

Source: PLOS. “World’s oldest plant-like fossils show multicellular life appeared earlier than thought.” ScienceDaily. ScienceDaily, 14 March 2017. <www.sciencedaily.com/releases/2017/03/170314150937.htm>.

Rapid Decline of Arctic Sea Ice: A Combination of Climate Change and Natural Variability

Date:
March 13, 2017

Source:
University of Washington

Summary:
The dramatic decline of Arctic sea ice in recent decades is caused by a mixture of global warming and a natural, decades-long atmospheric hot spot over Greenland and the Canadian Arctic.

 

Arctic sea ice, as seen from an ice breaker ship in 2014.
Credit: Bonnie Light/University of Washington

 

 

Arctic sea ice in recent decades has declined even faster than predicted by most models of climate change. Many scientists have suspected that the trend now underway is a combination of global warming and natural climate variability.

A new study finds that a substantial chunk of summer sea ice loss in recent decades was due to natural variability in the atmosphere over the Arctic Ocean. The study, from the University of Washington, the University of California Santa Barbara and federal scientists, is published March 13 in Nature Climate Change.

“Anthropogenic forcing is still dominant — it’s still the key player,” said first author Qinghua Ding, a climate scientist at the University of California Santa Barbara who holds an affiliate position at the UW, where he began the work as a research scientist in the UW’s Applied Physics Laboratory. “But we found that natural variability has helped to accelerate this melting, especially over the past 20 years.”

The paper builds on previous work by Ding and other UW scientists that found changes in the tropical Pacific Ocean have in recent decades created a “hot spot” over Greenland and the Canadian Arctic that has boosted warming in that region.

The hot spot is a large region of higher pressure where air is squeezed together so it becomes warmer and can hold more moisture, both of which bring more heat to the sea ice below. The new paper focuses specifically on what this atmospheric circulation means for Arctic sea ice in September, when the ocean reaches its maximum area of open water.

“The idea that natural or internal variability has contributed substantially to the Arctic sea ice loss is not entirely new,” said second author Axel Schweiger, a University of Washington polar scientist who tracks Arctic sea ice. “This study provides the mechanism and uses a new approach to illuminate the processes that are responsible for these changes.”

Ding designed a new sea ice model experiment that combines forcing due to climate change with observed weather in recent decades. The model shows that a shift in wind patterns is responsible for about 60 percent of sea ice loss in the Arctic Ocean since 1979. Some of this shift is related to climate change, but the study finds that 30-50 percent of the observed sea ice loss since 1979 is due to natural variations in this large-scale atmospheric pattern.

“What we’ve found is that a good fraction of the decrease in September sea ice in the past several decades is most likely natural variability. That’s not really a surprise,” said co-author David Battisti, a UW professor of atmospheric sciences.

“The method is really innovative, and it nails down how much of the observed sea ice trend we’ve seen in recent decades in the Arctic is due to natural variability and how much is due to greenhouse gases.”

The long-term natural variability is ultimately thought to be driven by the tropical Pacific Ocean. Conditions in the tropical Pacific set off ripple effects, and atmospheric waves snake around the globe to create areas of higher and lower air pressure.

Teasing apart the natural and human-caused parts of sea ice decline will help to predict future sea ice conditions in Arctic summer. Forecasting sea ice conditions is relevant for shipping, climate science, Arctic biology and even tourism. It also helps to understand why sea ice declines may be faster in some decades than others.

“In the long term, say 50 to 100 years, the natural internal variability will be overwhelmed by increasing greenhouse gases,” Ding said. “But to predict what will happen in the next few decades, we need to understand both parts.”

What will happen next is unknown. The tropical Pacific Ocean could stay in its current phase or it could enter an opposite phase, causing a low-pressure center to develop over Arctic seas that would temporarily slow the long-term loss of sea ice due to increased greenhouse gases.

“We are a long way from having skill in predicting natural variability on decadal time scales,” Ding said.


Story Source:

Materials provided by University of Washington. Original written by Hannah Hickey. Note: Content may be edited for style and length.


Journal Reference:

  1. Qinghua Ding, Axel Schweiger, Michelle L’Heureux, David S. Battisti, Stephen Po-Chedley, Nathaniel C. Johnson, Eduardo Blanchard-Wrigglesworth, Kirstin Harnos, Qin Zhang, Ryan Eastman, Eric J. Steig. Influence of high-latitude atmospheric circulation changes on summertime Arctic sea ice. Nature Climate Change, 2017; DOI: 10.1038/nclimate3241

 

Source: University of Washington. “Rapid decline of Arctic sea ice a combination of climate change and natural variability.” ScienceDaily. ScienceDaily, 13 March 2017. <www.sciencedaily.com/releases/2017/03/170313160827.htm>.

Climate Change: Springtime in NYC: First a Snow Storm, then Sun and Flowers

 

The photo below was taken in the front gardens of our building in NYC showing the crocuses pushing through.

 

On this same morning we had a very cold snow storm, but by the time I was coming home in the afternoon, the sun was out and crocuses blooming.

 

First 2017 crocuses blooming in our front garden. © Joyce Hays, Target Health Inc.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 165). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

QUIZ


Sharing Patient Records Is Still a Digital Dilemma

Structure and basic components of the Austrian Electronic Health Records (ELGA). Source: Sebastian 19781, published under the Creative Commons Attribution-Share Alike 3.0 Unported license.

 

Privacy and security are still important issues connected with Electronic Health Records (EHRs). EHRs allow providers to use information more effectively to improve the quality and efficiency of your care, but EHRs do not change the privacy protections or security safeguards that apply to health information. EHRs and Personal Health Records (PHRs) are electronic versions of the 1) ___ charts in doctors’ or other health care providers’ offices. An EHR may include one’s medical history, notes, and other information about one’s health, including symptoms, diagnoses, medications, lab results, vital signs, immunizations, and reports from diagnostic tests such as x-rays. The information in EHRs can be shared with other organizations if the computer systems are set up to talk to each other. Information in these records should be shared only for purposes authorized by law or by the individual. There are privacy rights whether information is stored as a paper record or in electronic form. The same federal laws that already protect health information also apply to information in EHRs.

 

As health care providers begin to use EHRs and set up ways to securely share health information with other providers, it will become easier for everyone to work together to make sure that each patient gets the care they need. For example: information about medications will be available in EHRs, so that health care providers don’t prescribe another 2) ___ that might be harmful. EHR systems are backed up like most computer systems, so health information can be retrieved in the event of a computer shutdown. EHRs can also be available in an emergency. If one is involved in an accident and is non-verbal, a hospital could retrieve personal information so decisions about emergency care can be faster and more informed. Doctors using EHRs may find it easier or faster to track lab results and share progress with patients. If doctors’ systems can share information, one doctor can see test results from another 3) ___, so the test doesn’t always have to be repeated. Especially with x-rays and certain lab tests, this means one is at less risk from radiation and other side effects. When tests are not repeated unnecessarily, it also means care costs less.

 

Most of us feel that our health information is private and should be protected. The federal government put in place the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule to ensure one has rights over one’s own health information, no matter what form it is in. The government also created the HIPAA Security Rule to require specific protections to safeguard electronic health information. A few of the safeguards that can be built into EHR systems include: “access control“ tools such as passwords and PINs, which help limit access to authorized individuals; encryption, so that stored health information cannot be read or understood except by someone using a system that can decrypt it with a “key“; and an audit trail feature, which records who accessed information, what changes were made, and when. Finally, federal law requires doctors, hospitals, and other health care providers to notify patients of any breach. The law also requires the health care provider to notify the Secretary of Health and Human Services if a breach affects more than 500 residents of a state or jurisdiction. In that case, the health care provider must also notify media outlets serving the state or jurisdiction. This requirement helps patients know if something has gone wrong with the protection of their information and helps keep providers accountable for EHR protection.
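
As a rough, hypothetical illustration of two of these safeguards (encryption of stored data and an audit trail), the short Python sketch below uses the widely available cryptography package. The record contents, field names, and helper function are invented for illustration and do not come from any real EHR system.

# Hypothetical sketch: encrypting a stored note and keeping an audit trail.
# Requires the "cryptography" package (pip install cryptography).
from datetime import datetime, timezone
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in a real system the key would live in a secure key store
cipher = Fernet(key)

# Encrypting stored information: without the key, the token cannot be read.
note = b"Illustrative clinical note - not real patient data."
token = cipher.encrypt(note)

audit_trail = []   # audit trail: who accessed the record, what they did, and when

def read_note(user_id):
    """Decrypt the stored note and record the access in the audit trail."""
    audit_trail.append({
        "user": user_id,
        "action": "read",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return cipher.decrypt(token)

print(read_note("dr_example"))
print(audit_trail)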

 

How can electronic health records (EHRs) and regulations be designed to positively affect doctors’ practices? The meaningful use program has been successful in “forcing the adoption of EHRs but they weren’t ready for prime time,“ said AMA President Steven J. Stack, MD, recently, during a town hall meeting on EHRs at the Swedish Medical Center in Seattle. This was the third AMA town hall on EHRs and was co-hosted by the Washington State Medical Association (WSMA). The focus of this special session was what is wrong with current EHRs and how they could be designed to benefit physicians in practice. Earlier this month, Centers for Medicare & Medicaid Services Acting Administrator Andy Slavitt said the agency is changing its culture to focus more on listening to physician needs and will implement better policy in place of the meaningful use program when the new streamlined Medicare reporting program is created. With this statement, there has never been a better time to speak up and offer constructive solutions to regulatory missteps that have stolen time physicians would rather have spent with patients.

 

Taking control: Regulations should not hinder care of 4) ___.

 

As it did in Boston and Atlanta last year, the physician voice resounded through Seattle during Tuesday night’s town hall, emphasizing that EHR design should be focused on usability and interoperability and the physician voice must be heard. “Administrative burdens are strangling medical practice and creating unnecessary and costly inefficiencies in health care delivery while adding stress to physicians and their teams,“ said WSMA president Ray Hsiao, MD, kicking off the discussion. “It can make a cynic out of the happiest people and can lead to discouragement, professional dissatisfaction and burnout, and even drive 5) ___ to leave the profession. We cannot let that happen.“ Regulations force physicians to do “a lot of busy work that has nothing to do with the quality of care we provide,“ said Jane Fellner, MD, a primary care physician at the University of Washington School of Medicine. “It needs to stop.“

 

How we can make EHRs more functional in practice

 

Speaking to what they really need from these tools to help them in their practices, many physicians offered solutions and suggestions for how 6) ___ should work for the end-users who depend on them daily. Interoperability proved top of mind as the current EHRs struggle to communicate. “My EHR does not necessarily have the tools to interoperate well with other EHRs,“ Dr. Fellner said. “But within the universe of the other medical centers who use the same software – it is magic. I can import an entire record from Florida in 20 seconds.“ If all EHRs could talk to each other in this way, it would have a very positive effect on the way physicians treat patients nationally, she said. “It has revolutionized the care I provide.“

Another focus for improvement during the discussion was the need for more data usage focused on population health to show physicians how their patients’ health compares to national trends. “What we don’t see is our information going in to create this big picture that we can then [see] in real time,“ said Reena Koshy, MD, a family physician in Seattle. This capability is currently available but not to everyone using EHRs. Dr. Koshy said it would be very helpful if national patient data coordination were available to all practices.

Thomas Payne, MD, medical director of information technology services at the University of Washington School of Medicine and board chair of the American Medical Informatics Association, said he uses his EHR in every patient visit. “We need to address documentation because that is the source of a lot of unnecessary new time that [we] spend,“ he said. “Natural language processing is a great example. As we speak, as we are this evening we can use that same capability to communicate in the medical record and be able to record what kinds of care people have received.“

“When you’re searching for billing 7) ___, you have to type it exactly correct or it boots it out,“ said Carrie Horwitch, MD, a primary care physician in Seattle. She suggested physicians could work much more efficiently if EHRs had the same kind of spell-check and search option drop-down menus as Internet search engines.

 

U.S. taxpayers have poured $30 billion into funding electronic records systems in hospitals and doctors’ offices since 2009. But most of those systems still can’t talk to each other, which makes transfer of medical information tough.

Technology entrepreneur Jonathan Bush says he was recently watching a patient move from a hospital to a nursing home. The patient’s information was in an electronic medical record, or EMR. And getting the patient’s records from the hospital to the nursing home, Bush says, wasn’t exactly drag and drop. “These two guys then type – I kid you not – the printout from the brand new EMR into their EMR, so that their fax server can fax it to the bloody nursing home,“ Bush says.

“We should be working off the same set of standards,“ says Dr. Karen DeSalvo, coordinator for information technology, Department of Health and 8) ___ Services. In an era when most industries easily share big, complicated, digital files, health care still leans hard on paper printouts and fax machines. The American taxpayer has funded the installation of electronic records systems in hospitals and doctors’ offices – to the tune of $30 billion since 2009. While those systems are supposed to make health care better and more efficient, most of them can’t talk to each other.

Bush lays a lot of blame for that at the feet of this federal financing. “I called it the ‘Cash for Clunkers’ bill,“ he says. “It gave $30 billion to buy the very pre-internet systems that all of the doctors and hospitals had already looked at and rejected,“ he says. “And the vendors of those systems were about to die. And then they got put on life support by this bill that pays you billions of dollars, and didn’t get you any coordination of information!“ Bush’s assessment is colored by the fact that the company he runs – AthenaHealth – is cloud-based, and stresses easily sharing electronic health records. The firm also got a lot of the federal cash.

Dr. Robert Wachter, with the University of California, San Francisco, says sure – in hindsight, the government could have mandated that stimulus money be spent only on software that made sharing information easy. But, he says, “I think the right call was to get the systems in. Then to toggle to, OK, now you have a computer, now you’re using it, you’re working out some of the kinks. The next thing we need to do is to be sure all these systems 9) ___ to each other.“ Right now, the ability of the systems to converse is at about a 2 or 3 on a scale of 0 to 10, Wachter and Bush agree.

Wachter is about to publish The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, a book that assesses the value of information technology in health care. Up until now, he says, there has actually been a financial dis-incentive for doctors and hospitals to share information. For example, if a doctor doesn’t have a patient’s record immediately available, the doctor may order a test that has already been done – and can bill for that test. Keeping EMRs from talking to each other also makes it easier to keep patients from taking their medical records – and their business – to a competing doctor. It’s time for that to change, says Dr. Karen DeSalvo, the federal government’s health IT coordinator.

 

The billions of dollars a year the government pays to doctors, hospitals and other institutions for patients enrolled in Medicare are a pretty good motivator. Already, 10) ___ is starting to increase pay to doctors and hospitals that work together to streamline care and avoid duplicative tests, and to penalize those that don’t. Winning the new payments and avoiding the penalties increasingly require proving that all of a patient’s doctors, no matter where they are, are working together. That requires using good electronic records that can seamlessly move from one system to the next.

Wachter says that consumers are now demanding better health information technology, too – “because we’re all used to our app stores and we know how magical it can be when core IT platforms invite in a number of apps.“ “So I think,“ he says, “that even the vendors and healthcare delivery organizations that have been fighting interoperability recognize it’s the future.“ He says a lot of IT companies are now eager to come up with software that meets the demands of the health care industry and consumers. About a dollar of every $6 in the U.S. economy is spent on health care. A new IT boom in that sector means there are billions of dollars to be made.

 

The effort to change meaningful use and fix EHRs

 

Early last year, the Medicare Access and CHIP Reauthorization Act of 2015 repealed the sustainable growth rate formula and called for the new Merit-Based Incentive Payment System (MIPS), which is intended to sunset the three existing reporting programs and streamline them into a single program. The AMA and 100 state and specialty medical associations recently submitted 10 principles to guide the foundation of the MIPS, and the AMA provided detailed comments as part of its ongoing efforts on this issue and submitted a detailed framework for what needs to change. The AMA and MedStar Health’s National Center for Human Factors in Healthcare last year developed an EHR User-Centered Design Evaluation Framework to compare the design and testing processes for optimizing EHR usability. Visit BreakTheRedTape.org, the AMA’s grassroots campaign to advocate for ways to solve medicine’s regulatory and legislative challenges.

 

Sources: NPR’s reporting partnership with Montana Public Radio and Kaiser Health News; www.hhs.gov/ocr/privacy/

 

ANSWERS: 1) paper; 2) medicine; 3) doctor; 4) patients; 5) physicians; 6) EHRs; 7) codes; 8) Human; 9) talk; 10) Medicare

 

Stephen Wolfram (1959 to Present)

Stephen Wolfram at home. Source: Wikipedia

 

Target Health Inc. is an eCRO that creates software for clinical trials and is interested in how Big Data is sorted through, especially as it applies to medical research and clinical trials.

 

Wolfram Alpha (also: WolframAlpha and Wolfram|Alpha) is a computational knowledge engine or answer engine developed by Wolfram Research, which was founded by Stephen Wolfram. Stephen Wolfram is a brilliant computer scientist and physicist who thinks about the world and its issues in a very big way. Wolfram Alpha is an online service that answers factual queries directly by computing the answer from externally sourced “curated data“, rather than providing a list of documents or web pages that might contain the answer, as a search engine might. Wolfram Alpha, which was released on May 18, 2009, is based on Wolfram’s earlier flagship product Wolfram Mathematica, a computational platform or toolkit that encompasses computer algebra, symbolic and numerical computation, visualization, and statistics capabilities. Additional data is gathered from both academic and commercial websites such as the CIA’s The World Factbook, the United States Geological Survey, a Cornell University Library publication called All About Birds, Chambers Biographical Dictionary, Dow Jones, the Catalogue of Life, CrunchBase, Best Buy, the FAA, and optionally a user’s Facebook account.
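
As a small, hedged sketch of what “computing the answer“ might look like from a program, the Python snippet below calls Wolfram|Alpha’s public query interface. The endpoint and parameters reflect the commonly documented v2 query API, the AppID is a placeholder you would obtain from the Wolfram developer portal, and the exact response layout may differ.

# Hypothetical sketch of calling the Wolfram|Alpha v2 query API; requires "requests".
import requests

APP_ID = "YOUR_APP_ID"  # placeholder, not a real key

response = requests.get(
    "https://api.wolframalpha.com/v2/query",
    params={"appid": APP_ID, "input": "distance from the Earth to the Moon", "output": "JSON"},
    timeout=10,
)
response.raise_for_status()

# Each "pod" is one block of computed results (for example, a "Result" pod).
for pod in response.json().get("queryresult", {}).get("pods", []):
    print(pod.get("title"))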

 

Stephen Wolfram (born 29 August 1959) is a British-American computer scientist, physicist, and businessman. He is known for his work in computer science, mathematics, and in theoretical physics. He is the author of the book A New Kind of Science. In 2012 he was named an inaugural fellow of the American Mathematical Society. His recent work has been on knowledge-based programming, expanding and refining the programming language of Mathematica into what is now called the Wolfram Language. His book An Elementary Introduction to the Wolfram Language appeared in 2015 and Idea Makers appeared in 2016.

 

Stephen Wolfram was born in London in 1959 to Hugo and Sybil Wolfram. Wolfram’s father, Hugo Wolfram (1925-2015), a textile manufacturer born in Bochum, Germany, served as managing director of the Lurex Company, makers of the fabric Lurex, and was the author of three novels. He emigrated to England in 1933. When World War II broke out, young Hugo left school at 15 and subsequently found it hard to get a job, since he was regarded as an “enemy alien.“ As an adult, he took correspondence courses in philosophy and psychology. Wolfram’s mother, Sybil Wolfram (1931-1993), was a Fellow and Tutor in philosophy at Lady Margaret Hall at the University of Oxford from 1964 to 1993. She published two books, Philosophical Logic: An Introduction (1989) and In-laws and Outlaws: Kinship and Marriage in England (1987). She was the daughter of criminologist and psychoanalyst Kate Friedlander (1902-1949), an expert on the subject of juvenile delinquency, and the physician Walter Misch (1889-1943) who, together, wrote Die vegetative Genese der neurotischen Angst und ihre medikamentöse Beseitigung. After the Reichstag fire in 1933, she emigrated from Berlin, Germany, to England with her parents and the Jewish psychoanalyst Paula Heimann (1899-1982).

 

As a young child, Wolfram initially struggled in school and had difficulties learning arithmetic. At the age of 12, he wrote a dictionary on physics. By 13 or 14, he had written three books on particle physics. They were not published. Wolfram was a wunderkind. By age 15, he began research in applied quantum field theory and particle physics and published scientific papers. Topics included matter creation and annihilation, the fundamental interactions, elementary particles and their currents, hadronic and leptonic physics, and the parton model, published in professional peer-reviewed scientific journals including Nuclear Physics B, Australian Journal of Physics, Nuovo Cimento, and Physical Review D. Working independently, Wolfram published a widely cited paper on heavy quark production at age 18 and nine other papers. He continued to do research and to publish on particle physics into his early twenties. Wolfram’s work with Geoffrey C. Fox on the theory of the strong interaction is still used in experimental particle physics.

 

He was educated at Eton College but left prematurely in 1976. He entered St. John’s College, Oxford at age 17 but found lectures “awful“, and left in 1978 without graduating to attend the California Institute of Technology the following year, where he received a PhD in particle physics on November 19, 1979, at age 20. Wolfram’s thesis committee was composed of Richard Feynman, Peter Goldreich, Frank J. Sciulli and Steven Frautschi, and chaired by Richard D. Field. A 1981 letter from Feynman to Gerald Freund giving reference for Wolfram for the MacArthur grant appears in Feynman’s collected letters, Perfectly Reasonable Deviations from the Beaten Track. Following his PhD, Wolfram joined the faculty at Caltech and became the youngest recipient of the MacArthur Fellowship in 1981, at age 21. In 1983, Wolfram left for the School of Natural Sciences of the Institute for Advanced Study in Princeton, where he conducted research into cellular automata, mainly with computer simulations. He produced a series of papers systematically investigating the class of elementary cellular automata, conceiving the Wolfram code, a naming system for one-dimensional cellular automata, and a classification scheme for the complexity of their behavior. He conjectured that the Rule 110 cellular automaton might be Turing complete.

 

A 1985 letter, from Feynman to Wolfram, also appears in Feynman’s letters. In it, in response to Wolfram writing to him that he was thinking about creating some kind of institute where he might study complex systems, Feynman tells Wolfram, “You do not understand ordinary people,“ and advises him “find a way to do your research with as little contact with non-technical people as possible.“ In the mid-1980s, Wolfram worked on simulations of physical processes (such as turbulent fluid flow) with cellular automata on the Connection Machine alongside Richard Feynman and helped initiate the field of complex systems, founding the first institute devoted to this subject, The Center for Complex Systems Research (CCSR) at the University of Illinois at Urbana-Champaign, and the journal Complex Systems in 1987.

Wolfram led the development of the computer algebra system SMP (Symbolic Manipulation Program) in the Caltech physics department during 1979-1981. A dispute with the administration over the intellectual property rights regarding SMP – patents, copyright, and faculty involvement in commercial ventures – eventually caused him to resign from Caltech. SMP was further developed and marketed commercially by Inference Corp. of Los Angeles during 1983-1988. In 1986 Wolfram left the Institute for Advanced Study for the University of Illinois at Urbana-Champaign, where he founded their Center for Complex Systems Research and started to develop the computer algebra system Mathematica, which was first released in 1988, when he left academia.

From 1992 to 2002, Wolfram worked on his controversial book A New Kind of Science, which presents an empirical study of very simple computational systems. Additionally, it argues that for fundamental reasons these types of systems, rather than traditional mathematics, are needed to model and understand complexity in nature. Wolfram’s conclusion is that the universe is digital in its nature, and runs on fundamental laws which can be described as simple programs. He predicts that a realization of this within the scientific communities will have a major and revolutionary influence on physics, chemistry and biology and the majority of the scientific areas in general, which is the reason for the book’s title. Since the release of the book in 2002, Wolfram has split his time between developing Mathematica and encouraging people to get involved with the subject matter of A New Kind of Science by giving talks, holding conferences, and starting a summer school devoted to the topic.

 

In March 2009, Wolfram announced Wolfram|Alpha, an answer engine. Wolfram|Alpha launched in May 2009, and a paid-for version with extra features launched in February 2012. The engine is based on natural language processing and a large library of algorithms, and answers queries using the approach described in A New Kind of Science. The application programming interface allows other applications to extend and enhance Alpha. Wolfram believes that as Wolfram Alpha comes into common use, “It will raise the level of scientific things that the average person can do.“ Wolfram|Alpha is one of the answer engines behind Microsoft’s Bing and Apple’s Siri answering factual questions.

In June 2014, Wolfram officially announced the Wolfram Language as a new general multi-paradigm programming language. The documentation for the language was pre-released in October 2013 to coincide with the bundling of Mathematica and the Wolfram Language on every Raspberry Pi computer. While the Wolfram Language has existed for over 25 years as the primary programming language used in Mathematica, it was not officially named until 2014. Wolfram’s son, Christopher Wolfram, appeared on the program of SXSW giving a live-coding demonstration using the Wolfram Language and has blogged about the Wolfram Language for Wolfram Research. On 8 December 2015, Wolfram published the book An Elementary Introduction to the Wolfram Language to introduce people with no knowledge of programming to the Wolfram Language and the kind of computational thinking it allows. Both Stephen Wolfram and Christopher Wolfram were involved in helping create the alien language for the film Arrival, for which they used the Wolfram Language.

The significance that data has for the products Wolfram creates carries over into his own life. He has an extensive log of personal analytics, including emails received and sent, keystrokes made, meetings and events attended, phone calls, even physical movement dating back to the 1980s. He has stated, “[personal analytics] can give us a whole new dimension to experiencing our lives.“

Sources: Wikipedia; Edge.org

 

Click here to read a short piece by Stephen Wolfram, written beautifully with great clarity, about his vision of artificial intelligence and how humans should deal with it, through a shared language, through personal analytics and more. To learn more about shared language, read the recently published An Elementary Introduction to the Wolfram Language. If you saw the Academy Award nominee Arrival, you would have heard a version of this shared language, which a human linguist is finally able to understand.

 

“Beauty is truth, truth beauty,” – that is all

Ye know on earth, and all ye need to know.

John Keats

 

TED talk by Stephen Wolfram: Computing a Theory of Everything

 

Chimpanzee Adenovirus Vector Ebola Vaccine

 

The unprecedented 2014 epidemic of Ebola virus disease (EVD) prompted an international response to accelerate the availability of a preventive vaccine. A replication-defective recombinant chimpanzee adenovirus type 3-vectored ebolavirus vaccine (cAd3-EBO), encoding the glycoproteins from the Zaire and Sudan species and offering protection in the nonhuman primate model, was rapidly advanced into phase 1 clinical evaluation.

 

In a study published in the New England Journal of Medicine (2017; 376:928-938), 20 healthy adults, in sequentially enrolled groups of 10 each, received intramuscular vaccination at a dose of 2×10^10 or 2×10^11 particle units. Primary and secondary end points related to safety and immunogenicity were assessed throughout the first 8 weeks after vaccination; in addition, longer-term vaccine durability was assessed at 48 weeks after vaccination.

 

While no safety concerns were identified, transient fever developed within 1 day after vaccination in two participants who had received the 2×10^11 particle-unit dose. Glycoprotein-specific antibodies were induced in all 20 participants; the titers were of greater magnitude in the group that received the 2×10^11 particle-unit dose than in the group that received the 2×10^10 particle-unit dose (geometric mean titer against the Zaire antigen at week 4, 2037 vs. 331; P=0.001). Glycoprotein-specific T-cell responses were more frequent among those who received the 2×10^11 particle-unit dose than among those who received the 2×10^10 particle-unit dose, with a CD4 response in 10 of 10 participants versus 3 of 10 participants (P=0.004) and a CD8 response in 7 of 10 participants versus 2 of 10 participants (P=0.07) at week 4. Assessment of the durability of the antibody response showed that titers remained high at week 48, with the highest titers in those who received the 2×10^11 particle-unit dose.
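
For readers unfamiliar with the statistic, a geometric mean titer (GMT) is the antilog of the mean of the log titers. The short Python sketch below computes a GMT from a handful of made-up antibody titers; the values are purely illustrative and are not the study’s data.

# Geometric mean titer (GMT) = exp(mean(ln(titer))); values below are illustrative only.
import math

titers = [800, 1600, 3200, 1600, 6400]  # hypothetical endpoint titers from five participants
gmt = math.exp(sum(math.log(t) for t in titers) / len(titers))
print(round(gmt))  # about 2111 for these made-up values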

 

According to the authors, reactogenicity and immune responses to cAd3-EBO vaccine were dose-dependent. At the 2×10^11 particle-unit dose, glycoprotein Zaire-specific antibody responses were in the range reported to be associated with vaccine-induced protective immunity in challenge studies involving nonhuman primates, and responses were sustained to week 48. Phase 2 studies and efficacy trials assessing cAd3-EBO are in progress.

 

Blood-based NfL: A Biomarker for Differential Diagnosis of Parkinsonian Disorders

 

In order to improve the diagnostic workup of parkinsonian disorders, a study published online in Neurology (2017; 88:930-937) was performed to determine whether blood neurofilament light chain (NfL) protein can discriminate between Parkinson disease (PD) and atypical parkinsonian disorders (APD) with diagnostic accuracy as high as that of CSF NfL.

 

The study included 3 independent prospective cohorts: the Lund (n = 278) and London (n = 117) cohorts, comprising healthy controls and patients with PD, progressive supranuclear palsy (PSP), corticobasal syndrome (CBS), and multiple system atrophy (MSA), as well as an early disease cohort (n = 109) of patients with PD, PSP, MSA, or CBS with disease duration <3 years. Blood NfL concentration was measured using an ultrasensitive single molecule array (Simoa) method, and the diagnostic accuracy to distinguish PD from APD was assessed.

 

The results showed strong correlations between blood and CSF concentrations of NfL (ρ ≥ 0.73-0.84, p < 0.001). Blood NfL was increased in patients with MSA, PSP, and CBS (i.e., all APD groups) when compared to patients with PD as well as healthy controls in all cohorts (p < 0.001). Furthermore, in the Lund cohort, blood NfL could accurately distinguish PD from APD (area under the curve [AUC] 0.91), with similar results in both the London cohort (AUC 0.85) and the early disease cohort (AUC 0.81).
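
For context, an AUC of 0.91 means that in roughly 91% of randomly chosen PD/APD pairs, the APD patient has the higher blood NfL value. The Python sketch below shows how such a figure is computed with scikit-learn; the labels and NfL values are invented for illustration and are not the cohort data.

# Illustrative ROC AUC calculation for a blood biomarker; requires scikit-learn.
# The labels and concentrations are made up, not taken from the Lund or London cohorts.
from sklearn.metrics import roc_auc_score

labels = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]               # 0 = PD, 1 = APD (hypothetical)
blood_nfl = [12, 15, 18, 22, 25, 30, 21, 52, 60, 75]  # hypothetical blood NfL values

auc = roc_auc_score(labels, blood_nfl)
print(f"AUC for separating APD from PD: {auc:.2f}")   # 0.92 for these made-up values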

 

According to the authors, quantification of blood NfL concentration can be used to distinguish PD from APD, and that blood-based NfL might consequently be included in the diagnostic workup of patients with parkinsonian symptoms in both primary care and specialized clinics.

 
