Protein in Urine Linked to Increased Risk of Memory Problems, Dementia
Credit: Wikipedia Commons
Dementia, also known as 1) ___, is a broad category of brain diseases that cause a long-term and often gradual decrease in the ability to think and remember that is great enough to affect a person’s daily functioning. Other common symptoms include emotional problems, problems with language, and a decrease in motivation. A person’s consciousness is usually not affected. A dementia diagnosis requires a change from a person’s usual mental functioning and a greater decline than one would expect from aging alone. These diseases also have a significant effect on a person’s caregivers. The most common type of dementia is 2) ___ disease, which makes up 50% to 70% of cases. Other common types include vascular dementia (25%), Lewy body dementia (15%), and frontotemporal dementia. Less common causes include normal pressure hydrocephalus, Parkinson’s disease, syphilis, and Creutzfeldt-Jakob disease, among others. More than one type of dementia may exist in the same person. A small proportion of cases run in families. In the DSM-5, dementia was reclassified as a neurocognitive disorder, with various degrees of severity. Diagnosis is usually based on the history of the illness and cognitive testing, with medical imaging and blood work used to rule out other possible causes. The mini-mental state examination is one commonly used cognitive test. Efforts to prevent dementia include trying to decrease risk factors such as high 3) ___ pressure, smoking, diabetes, and obesity. Screening the general population for the disorder is not recommended.
There is no cure for dementia. Cholinesterase inhibitors such as donepezil are often used and may be beneficial in mild to moderate disease, although the overall benefit may be minor. For people with dementia and those who care for them, many measures can improve their lives. Cognitive and behavioral interventions may be appropriate. Educating and providing emotional support to the caregiver is important. Exercise programs may be beneficial with respect to activities of daily living and may potentially improve outcomes. Treatment of behavioral problems with antipsychotics is common but not usually recommended, due to limited benefit and side effects, including an increased risk of 4) ___.
People who have protein in their 5) ___, which is a sign of kidney problems, may also be more likely to later develop problems with thinking and memory skills or even dementia, according to a meta-analysis published in the December 14, 2016, online issue of Neurology, the medical journal of the American Academy of Neurology. The researchers looked at all available studies on kidney problems and the development of cognitive impairment or dementia. “Kidney dysfunction has been considered a possible risk factor for cognitive impairment or dementia,” said Kay Deckers, MSc, of Maastricht University in the Netherlands, author of the systematic review and meta-analysis. Chronic 6) ___ disease and dementia share many risk factors, such as high blood pressure, diabetes and high cholesterol, and both show similar effects on the brain, so they may have shared vascular factors, or there may even be a direct effect on the 7) ___ from kidney problems. A total of 22 studies on the topic were included in the systematic review. Five of the studies, including 27,805 people, were evaluated in the meta-analysis on protein in the urine, also called albuminuria or proteinuria. The analysis showed that people with protein in the urine were 35% more likely to develop cognitive impairment or dementia than people who did not have 8) ___ in their urine. For another marker of kidney function, estimated glomerular filtration rate, the results were mixed and did not show an association. For three other markers of kidney function (cystatin C, serum creatinine and creatinine clearance), no meta-analysis could be completed because the few studies available did not use the same methods and could not be compared. Protein in the urine was associated with a modestly increased 9) ___ of cognitive impairment or dementia, Deckers said. More research is needed to determine whether the kidney problems are a cause of the cognitive problems or whether both are caused by the same mechanisms.
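The 35% figure is a pooled estimate across the five albuminuria studies. As a rough illustration of how such a pooled estimate is typically computed, the sketch below uses a fixed-effect, inverse-variance model on the log scale; the per-study risk ratios and confidence intervals are invented for illustration and are not the actual data from the five studies in the meta-analysis.

```python
import math

# Hypothetical per-study risk ratios with 95% confidence intervals
# (illustrative values only, not the published study data).
studies = [
    (1.40, 1.10, 1.78),
    (1.25, 0.95, 1.64),
    (1.50, 1.05, 2.14),
    (1.20, 0.90, 1.60),
    (1.45, 1.12, 1.88),
]

weighted_sum = 0.0
weight_total = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    # Standard error recovered from the 95% CI width on the log scale.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    w = 1.0 / se**2  # inverse-variance weight: precise studies count more
    weighted_sum += w * log_rr
    weight_total += w

# Pool on the log scale, then exponentiate back to a risk ratio.
pooled = math.exp(weighted_sum / weight_total)
print(f"Pooled risk ratio: {pooled:.2f}")
```

With these made-up inputs the pooled risk ratio lands near 1.35, i.e. roughly a 35% increased risk, mirroring the shape of the published result. Real meta-analyses often use a random-effects model instead when the studies are heterogeneous.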
The study was supported by the In-MINDD (Innovative Midlife Intervention for Dementia Deterrence) project, funded by the European Union’s Seventh Framework Programme. To learn more about the brain, visit www.aan.com/patients.
The American Academy of Neurology is the world’s largest association of neurologists and neuroscience professionals, with 30,000 members. The AAN is dedicated to promoting the highest quality patient-centered neurologic care. A neurologist is a doctor with specialized training in diagnosing, treating and managing disorders of the brain and nervous system such as Alzheimer’s disease, stroke, migraine, multiple sclerosis, concussion, Parkinson’s disease and epilepsy.
ANSWERS: 1) senility; 2) Alzheimer’s; 3) blood; 4) death; 5) urine; 6) kidney; 7) brain; 8) protein; 9) risk
Credit: Garrondo, National Institute on Aging; Public Domain, Wikipedia Commons
Credit: National Institutes of Health nlm.nih.gov/medline; Public Domain, Wikipedia Commons
The history of dementia is probably as old as mankind itself. In recent years, considerable advances have been made in our understanding of the epidemiology, pathogenesis and diagnosis of Alzheimer’s disease (AD) and related disorders, and the nosology of these disorders is under scrutiny. Furthermore, we are witnessing the emergence of therapeutic agents specifically designed to enhance memory and cognition in AD patients. Despite the limited efficacy of the agents currently available, their introduction has shed an entirely new light on the field. We therefore feel that this is a good time to look at the past to understand the present and perhaps gain insight into the future. Until the end of the 19th century, dementia was a much broader clinical concept. It included mental illness and any type of psychosocial incapacity, including conditions that could be reversed. The term was applied to anyone who had lost the ability to reason, equally to the psychosis of mental illness, to organic diseases like syphilis that destroy the brain, and to the dementia associated with old age, which was attributed to hardening of the arteries.
Dementia has been referred to in medical texts since antiquity. One of the earliest known accounts was written by the 6th century BCE Greek philosopher and mathematician Pythagoras, who divided the human lifespan into six distinct phases: 0-6 (infancy), 7-21 (adolescence), 22-49 (young adulthood), 50-62 (middle age), 63-79 (old age), and 80 onward (advanced age). The last two he described as the senium, a period of mental and physical decay, with the final phase being when “the scene of mortal existence closes after a great length of time that very fortunately, few of the human species arrive at, where the mind is reduced to the imbecility of the first epoch of infancy.” In 550 BCE, the Athenian statesman and poet Solon argued that the terms of a man’s will might be invalidated if he exhibited loss of judgement due to advanced age. Chinese medical texts made allusions to the condition as well, and the characters for dementia translate literally as “foolish old person.”
Aristotle and Plato of Ancient Greece spoke of the mental decay of advanced age, but apparently viewed it simply as an inevitable process that affected all old men, and which nothing could prevent. Plato stated that the elderly were unsuited for any position of responsibility because “there is not much acumen of the mind that once carried them in their youth, those characteristics one would call judgement, imagination, power of reasoning, and memory. They see them gradually blunted by deterioration and can hardly fulfill their function.” By comparison, the Roman statesman Cicero held a view much more in line with modern-day medical wisdom: that loss of mental function was not inevitable in the elderly and affected only those old men who were weak-willed. He spoke of how those who remained mentally active and eager to learn new things could stave off dementia. However, Cicero’s views on aging, although progressive, were largely ignored in a world that would be dominated by Aristotle’s medical writings for centuries. Subsequent physicians during the time of the Roman Empire, such as Galen and Celsus, simply repeated the beliefs of Aristotle while adding few new contributions to medical knowledge.
Byzantine physicians sometimes wrote of dementia, and it is recorded that at least seven emperors whose lifespans exceeded the age of 70 displayed signs of cognitive decline. In Constantinople, special hospitals existed to house those diagnosed with dementia or insanity, but these naturally did not apply to the emperors, who were above the law and whose health conditions could not be publicly acknowledged. Otherwise, little is recorded about senile dementia in Western medical texts for nearly 1,700 years. One of the few references to it came from the 13th-century friar Roger Bacon, who viewed old age as divine punishment for original sin. Although he repeated the existing Aristotelian belief that dementia was inevitable after a long enough lifespan, he did make the extremely progressive assertion that the brain was the center of memory and thought rather than the heart. Poets, playwrights, and other writers, however, made frequent allusions to the loss of mental function in old age. Shakespeare notably mentions it in some of his plays, including Hamlet and King Lear.
Dementia in the elderly was once called senile dementia or senility, and viewed as a normal and somewhat inevitable aspect of growing old rather than as being caused by any specific disease. Meanwhile, in 1907, a specific organic dementing process of early onset, called Alzheimer’s disease, had been described. This was associated with particular microscopic changes in the brain, but was seen as a rare disease of middle age because the first patient diagnosed with it was a 50-year-old woman. During the 19th century, doctors generally came to believe that dementia in the elderly was the result of cerebral atherosclerosis, although opinions fluctuated between the idea that it was due to blockage of the major arteries supplying the brain or to small strokes within the vessels of the cerebral cortex. This viewpoint remained conventional medical wisdom through the first half of the 20th century, but by the 1960s it was increasingly challenged as the link between neurodegenerative diseases and age-related cognitive decline was established. By the 1970s, the medical community maintained that vascular dementia was rarer than previously thought and that Alzheimer’s disease caused the vast majority of mental impairments in old age. More recently, however, it is believed that dementia is often a mixture of both conditions.
Much like other diseases associated with aging, dementia was comparatively rare before the 20th century, because it is most common in people over 80, and such lifespans were uncommon in preindustrial times. Conversely, syphilitic dementia was widespread in the developed world until it was largely eradicated by the use of penicillin after WWII. With significant increases in life expectancy following WWII, the number of people in developed countries over 65 started climbing rapidly. While elderly persons constituted an average of 3-5% of the population prior to 1945, by 2010 it was common in many countries for 10-14% of the population to be over 65, and in Germany and Japan this figure exceeded 20%. Public awareness of Alzheimer’s disease was greatly increased in 1994 when former US president Ronald Reagan announced that he had been diagnosed with the condition. By the period 1913-20, schizophrenia had been well defined in a way similar to today, and the term dementia praecox had been used to suggest the development of senile-type dementia at a younger age. Eventually the two terms fused, so that until 1952 physicians used the terms dementia praecox (precocious dementia) and schizophrenia interchangeably. The term precocious dementia for a mental illness suggested that a type of mental illness like schizophrenia (including paranoia and decreased cognitive capacity) could be expected to arrive normally in all persons with greater age (see paraphrenia). After about 1920, the use of dementia for both what we now understand as schizophrenia and for senile dementia helped limit the word’s meaning to permanent, irreversible mental deterioration. This began the change toward the more recognizable use of the term today.
In 1976, neurologist Robert Katzman suggested a link between senile dementia and Alzheimer’s disease. Katzman argued that much of the senile dementia occurring (by definition) after the age of 65 was pathologically identical with Alzheimer’s disease occurring before age 65 and therefore should not be treated differently. He noted that because senile dementia was not considered a disease, but rather part of aging, millions of aged patients experiencing what was otherwise identical with Alzheimer’s disease were not being diagnosed as having a disease process, but were simply considered to be aging normally. Katzman thus suggested that Alzheimer’s disease, if taken to occur over age 65, is actually common, not rare, and was the 4th or 5th leading cause of death, even though it was rarely reported on death certificates in 1976. This suggestion opened the view that dementia is never normal and must always be the result of a particular disease process; it is not part of the normal healthy aging process per se. The ensuing debate led for a time to the proposed disease diagnosis of senile dementia of the Alzheimer’s type (SDAT) in persons over the age of 65, with Alzheimer’s disease diagnosed in persons younger than 65 who had the same pathology. Eventually, however, it was agreed that the age limit was artificial and that Alzheimer’s disease was the appropriate term for persons with the particular brain pathology seen in this disorder, regardless of the age of the person with the diagnosis. A helpful finding was that although the incidence of Alzheimer’s disease increases with age (from 5-10% of 75-year-olds to as many as 40-50% of 90-year-olds), there is no age at which all persons develop it, so it is not an inevitable consequence of aging, no matter how great an age a person attains. Evidence of this is provided by numerous documented supercentenarians (people living to 110 or beyond) who experienced no serious cognitive impairment.
There is some evidence that dementia is most likely to develop between the ages of 80 and 84, and that individuals who pass that point without being affected have a lower chance of developing it. Women account for a larger percentage of dementia cases than men, although this can be attributed to their longer overall lifespan and greater odds of attaining an age at which the condition is likely to occur. Also, after 1952, mental illnesses like schizophrenia were removed from the category of organic brain syndromes, and thus (by definition) removed from possible causes of dementing illnesses (dementias). At the same time, however, the traditional cause of senile dementia, hardening of the arteries, returned as a set of dementias of vascular cause (small strokes). These were now termed multi-infarct dementias or vascular dementias.
In the 21st century, a number of other types of dementia have been differentiated from Alzheimer’s disease and vascular dementia (these two being the most common types). This differentiation is made on the basis of pathological examination of brain tissue, symptomatology, and different patterns of brain metabolic activity in nuclear medical imaging tests such as SPECT and PET scans of the brain. The various forms of dementia have differing prognoses (expected outcomes of illness) and differing sets of epidemiologic risk factors. The causal etiology of many of them, including Alzheimer’s disease, remains unclear, although many theories exist, such as accumulation of protein plaques as part of normal aging, inflammation (either from bacterial pathogens or exposure to toxic chemicals), inadequate blood sugar, and traumatic brain injury. Sources: nih.gov; Wikipedia
Cellular Immunotherapy Targets a Common Human Cancer Mutation
More than 30% of all human cancers are driven by mutations in a family of genes known collectively as RAS, which has three members: KRAS, NRAS, and HRAS. Mutations in the KRAS gene are thought to drive 95% of all pancreatic cancers and 45% of all colorectal cancers. A mutation called G12D is the most common KRAS mutation and is estimated to occur in more than 50,000 new cases of cancer in the United States each year. Because of their importance in cancer causation, worldwide efforts to successfully target mutant RAS genes are being pursued. Such efforts have met with limited success to date.
According to a report published online in the New England Journal of Medicine (8 December 2016), an immunotherapy study involving a single patient identified a method for targeting the cancer-causing protein produced by a mutant form of the KRAS gene in colorectal cancer. This targeted immunotherapy led to cancer regression in the patient in the study. The study was led by Steven A. Rosenberg, M.D., Ph.D., chief of the Surgery Branch at NCI’s Center for Cancer Research, and was conducted at the NIH Clinical Center. NCI is part of the National Institutes of Health.
In attempting to develop more effective approaches to targeting RAS, the authors isolated tumor-infiltrating lymphocytes (TILs) that targeted the KRAS G12D mutation from tumor nodules in the patient’s lungs that developed after colorectal cancer cells had spread to the lungs. TILs are white blood cells that migrate from the bloodstream into a tumor. The isolated TILs were grown in the laboratory to large numbers and then infused into the patient intravenously. Following the TIL infusion, all seven metastatic lung nodules in the patient regressed, and the regression persisted for nine months. After nine months, one of the lesions progressed and was surgically removed. This lesion was found to have lost a segment of chromosome 6 that includes a gene known as HLA-C*0802. This gene is involved in antigen presentation, a process through which an antigen produced by a cell is displayed on the cell’s outer surface and is thereby presented to the immune system. If the immune system recognizes the antigen as abnormal or foreign, an immune response against it will be mounted. In this case, because of the loss of the segment of chromosome 6, the immune system was unable to recognize the cancer cells as being abnormal, and they were able to escape immune attack and continue to thrive. Since the lesion was removed, however, the patient has been disease-free for more than eight months.
According to the authors, the study demonstrates for the first time, that this method of administering TILs, called adoptive T cell transfer immunotherapy, can mediate effective antitumor immune responses against cancers that express the KRAS G12D mutation. The authors added that they have also identified multiple T cell receptors that recognize this KRAS product, thus opening the possibility of T cell receptor gene therapy against multiple types of cancer that express this common mutation.
Veterans Endure Higher Pain Severity Than Non-Veterans
According to a paper published online in the Journal of Pain (21 November 2016) with graphics posted on the NIH website, American veterans experience higher prevalence of pain and more severe pain than nonveterans, with young and middle-aged veterans suffering the most. This survey provides the first national estimate of severe pain associated with painful health conditions in veterans and nonveterans and underscores the importance of sustaining efforts to monitor and manage pain among veterans.
The analysis, conducted by researchers at the National Center for Complementary and Integrative Health (NCCIH), is based on data from the 2010-2014 National Health Interview Survey, in which 67,696 adults (6,647 veterans and 61,049 nonveterans) responded to questions about the persistence and intensity of self-reported pain during the three months prior to the survey. The majority of veteran participants were men (92.5%), while the majority of nonveteran participants were women (56.5%). The survey data did not identify any specific aspects of military service, such as branch of the armed forces, years of service, or whether the veteran served in a combat role.
Among the findings from this analysis:
1. More veterans (65.5%) than nonveterans (56.4%) reported having pain in the previous three months.
2. A higher proportion of veterans (9.1%) reported having severe pain than nonveterans (6.3%).
3. Younger veterans (7.8%) were substantially more likely to report suffering from severe pain than nonveterans (3.2%) of similar ages, even after controlling for underlying demographic characteristics.
4. Veterans were more likely than nonveterans to have any back pain (32.8%), back pain with or without sciatica (12.2%, 20.5%), or joint pain (43.6%), but less likely to have jaw pain (3.6%) or migraines (10.0%).
5. The prevalence of severe pain was significantly higher in veterans with back pain (21.6%), jaw pain (37.5%), severe headaches or migraine (26.4%), and neck pain (27.7%) than in nonveterans with these conditions.
6. For nonveterans, as age increased, the prevalence of any pain and severe pain also increased; however, for veterans, those aged 50 to 59 were most likely to have severe pain, while the youngest and oldest groups were least likely to have severe pain.
7. Veterans aged 18-39 and 50-59 were more likely than nonveterans of the same ages to have any pain. Veterans aged 18-39 were also more likely to have severe pain than nonveterans in the same age group. However, veterans aged 70 or older were less likely to have severe pain than similarly aged nonveterans.
8. Male veterans (9.0%) were more likely to report severe pain than male nonveterans (4.7%); however, no significant difference was seen between the two female groups.
According to the NIH, these findings show that there is much more to do to help our veterans who are suffering from pain, and that this new knowledge can help inform effective health care strategies for veterans of all ages. In addition, more research is needed to generate additional evidence-based options for veterans managing pain and over time this research may help nonveterans as well.
NCCIH is partnering with the Department of Veterans Affairs and Department of Defense on 13 grants to research military and veteran health with a focus on nonpharmacological approaches to pain and related conditions.
21st Century Cures Act: Making Progress on Shared Goals for Patients
The following was excerpted from FDA Voice, authored by Robert M. Califf, M.D., FDA Commissioner
On 13 December 2016, President Obama signed into law the 21st Century Cures Act, which builds on FDA’s ongoing efforts to advance medical product innovation and ensure that patients get access to treatments as quickly as possible, with continued assurance from high-quality evidence that they are safe and effective. The 21st Century Cures Act will greatly improve FDA’s ability to hire and retain scientific experts; one of FDA’s ongoing challenges has been recruiting and retaining the experts needed in specialized areas to meet FDA’s growing responsibilities. This is an especially important need given the tremendous advances in biological sciences, engineering, information technology and data science. Preventive, diagnostic and therapeutic strategies will become more complex, with much greater potential for benefit and, in some cases, greater risk if used without adequate evidence to exclude risks that exceed potential benefits.
This new law rightly recognizes that patients should play an essential role in the development of drugs and devices to diagnose and treat their disease, since patients are in a unique position to provide essential insights about what it is like to live with and fight their disease. That’s been FDA’s perspective as well, and it’s why FDA has continued to advance the science of patient input through its patient-focused drug development program and its partner-with-patients program for medical devices. 21st Century Cures will enhance these ongoing efforts to better incorporate the patient’s voice into FDA’s decision-making.
21st Century Cures will also support FDA’s efforts to modernize and improve efficiency in clinical trial design. This has been an important FDA priority for decades, but exciting new approaches are now available to develop a common understanding of which designs should be used for which clinical issues. In cancer, for example, FDA is already weighing the use of common control trials, which share a control arm, involve multiple different drugs for the same indication, and may even involve different companies. One of the benefits of using a common control arm is that the overall number of patients who need to be recruited and enrolled decreases, thereby optimizing clinical trial resources and potentially shortening the time it takes to get a new study off the ground.
Even without the benefit of Cures, patients have been well-served by FDA’s program efficiencies, emphasis on early meetings, and use of expedited pathway programs to speed approval and delivery of new drugs and devices to patients. Rather than passively processing product applications, FDA works to advise companies and inventors from the earliest stages of the development process on the kinds of medical products needed, how to do the necessary research, and how to viably and effectively translate from concept to product. This not only means that important new products will be developed as efficiently as possible but also that medicines and devices with no chance of success are identified much earlier so that money isn’t wasted on futile development. These programs have been embraced by developers of medical products in this country, and they are making a real and positive difference.
In the United States, the FDA uses expedited programs (fast track, priority review, accelerated approval, and breakthrough therapy) for drugs and biologics more than comparable drug and biologic regulators in other countries use theirs, and as a result FDA is the first to approve a majority of novel drugs compared with its foreign counterparts. For devices, this past year was the first full year of operation for FDA’s expedited access pathway (EAP) program, which helps speed the development and availability of certain medical devices that demonstrate the potential to address unmet medical needs for life-threatening or irreversibly debilitating diseases or conditions. So far, FDA has granted 24 devices access to this program. 21st Century Cures builds on EAP by creating the breakthrough device pathway.
The law establishes other new programs as well. For instance, the Limited Population pathway will help streamline the development programs for certain anti-bacterials and anti-fungals intended to treat targeted groups of patients suffering from serious or life-threatening infections where unmet need exists due to lack of available therapies. Approvals of these antimicrobials are expected to rely on data primarily targeting these limited populations. The statement Limited Population will appear prominently next to the drug’s name in labeling, which will provide notice to healthcare providers that the drug is indicated for use in a limited and specific population of patients. The limited population statement, additional labeling statements describing the data, and FDA review of promotional materials, will help assure these drugs are used narrowly to treat these serious and life-threatening infections while additional evidence is generated to assess safety and effectiveness for broader use.
21st Century Cures also creates a new program for the development of regenerative medicine products, an important and exciting new field that deserves this special focus. The program designates drugs as regenerative advanced therapies and takes appropriate actions to improve the efficiency of development and to enhance the exchange of information among FDA, researchers and developers. An especially important element of this program is the creation of a research network and a public-private partnership to assist developers in generating definitive evidence about whether their proposed therapies indeed provide clinical benefits that are hoped for.
Looking ahead, much still needs to be done to spur product development. There have yet to be successful therapies identified for certain diseases, such as Alzheimer’s disease, where underlying scientific knowledge is still lacking. In addition, we are only at the early stage in building a national evidence generation system based on registries, claims data, and electronic health records that will be a rich source of post-market data and an avenue for conducting more efficient research. Last week, FDA published a consensus of FDA leadership on the use of real world evidence in the New England Journal of Medicine, focusing on the misperception that randomized trials and real world data are incompatible. In fact, the use of randomization within the context of clinical practice will constitute a major advance in evidence generation and we are actively encouraging proposals with this combination of randomized trials conducted in real world practice. Cures provides support for continued exploration of the use of real world evidence in the regulatory context. The law also addresses drug firms providing healthcare economic information to payers and formulary committees. This complex area will require careful delineation of principles to guide information exchange to enable these entities to appropriately assess the value of drugs.
With 21st Century Cures, great progress has been made towards our shared goal of advancing regulatory science so that we can continue to speed the discovery, development, and delivery of medical products to prevent and cure disease and improve health while sustaining the evidence framework that enables assurance to the public of the safety and effectiveness of medical products. FDA now stands ready to work with Congress, other federal agencies and the medical products ecosystem to implement these important provisions as we all continue to work on behalf of all Americans to protect and promote public health and promote innovation in this exciting time.
Baked Cauliflower with Marinara
This is a recipe worth trying. If you’re too busy to make your own marinara sauce, get a good store-bought marinara like Newman’s Own. ©Joyce Hays, Target Health Inc.
Mop the delicious juices up with warm Italian or French bread. ©Joyce Hays, Target Health Inc.
1/2 cup almond flour
1 teaspoon fresh cilantro, well chopped
1 teaspoon fresh parsley, well chopped
1 teaspoon fresh dill, well chopped
4 large eggs, lightly beaten
10 fresh garlic cloves, large slices
3 cups panko
1 onion, chopped
Pinch Kosher salt
Pinch Black pepper
Pinch chili flakes
1 teaspoon turmeric
1 or 2 heads cauliflower, trimmed and cut into 2-inch florets
1/2 cup or more, olive oil, for frying (more as needed)
5 cups Marinara Sauce (more if needed). Use your favorite recipe.
1 or more cups finely grated Parmesan, preferably Parmigiano-Reggiano
1/2 to 1 pound fresh Burrata mozzarella, torn into bite-size pieces
Fresh ingredients make a huge difference in the outcome. However, when making marinara sauce, I always use Cento tomatoes, because I never know if Whole Foods or FreshDirect will have peak flavor tomatoes. Long ago, a chef at a very fine Italian restaurant told me Cento is the secret ingredient of many chefs. I have been doing that ever since. ©Joyce Hays, Target Health Inc.
1. Heat the oven to 400 degrees.
2. Do all the chopping that needs to be done.
Chop all the herbs at the same time. When you chop this way and mix the herbs together, just use 3 teaspoons of the mixed chopped herbs. ©Joyce Hays, Target Health Inc.
3. Make the marinara sauce. Set aside. If pressed for time, use store-bought, like Newman’s Own.
Making the marinara sauce. ©Joyce Hays, Target Health Inc.
4. Place the almond flour, eggs and panko in three wide, shallow bowls. Season each generously with salt and pepper. Dip a cauliflower piece first in the flour, then the eggs, then coat with panko. Repeat with the remaining cauliflower.
Dip the cauliflower florets first into almond flour, then egg, then the Panko. Now, you’re ready to fry. ©Joyce Hays, Target Health Inc.
5. Sauté the onions and garlic in olive oil. Then add the spices and herbs. Add more oil if needed and stir to combine well. Set aside; you will add all of this to the marinara sauce.
Cooking the onions & garlic. ©Joyce Hays, Target Health Inc.
6. Fill a large skillet with 1/2-inch oil. Place over medium-high heat. When oil is hot, fry cauliflower in batches, turning halfway through, until golden brown. Transfer fried cauliflower pieces to a paper towel-lined plate.
Frying the cauliflower. ©Joyce Hays, Target Health Inc.
The fried cauliflower should look like this when you’re done. It happens to be scrumptious like this, as well. If you just wanted to stop and serve at this point, you would not be disappointed. ©Joyce Hays, Target Health Inc.
7. Spoon a thin layer of sauce over the bottom of a 9-by-13-inch baking pan. Sprinkle one-third of the Parmesan over the sauce. Scatter half the cauliflower over the Parmesan and top with half the mozzarella pieces. Top with half the remaining sauce, sprinkle with another third of the Parmesan and repeat the layering, ending with a final layer of sauce and Parmesan.
Here, the final layer of sauce, Parmesan and burrata is about to be added. ©Joyce Hays, Target Health Inc.
8. Transfer pan to oven and bake until cheese is golden and casserole is bubbling, about 40 minutes. Let cool a few minutes before serving.
About to go into the oven. ©Joyce Hays, Target Health Inc.
This is a fairly low-calorie comfort food, delicious on a cold night with a full-bodied Italian Banfi, a Napa cab, or… the list of fine red wines is endless. Oh, and serve warm Italian or French bread to sop up the flavorful juices. ©Joyce Hays, Target Health Inc.
Saturday, we saw the show Love, Love, Love, named after the song by John Lennon, which I do love.
A great song but not a great show. The actors gave it their all, but the play itself was lacking in depth and never seemed to be quite in focus. Afterwards, at the recommendation of friends, we dined at an informal French bistro on Broadway, near Carnegie Hall and City Center, called Brasserie Cognac. Apparently, it’s open around the clock, so if we get out of the theater at, say, 4, we could go to a place like this. Most well-known Manhattan restaurants don’t serve dinner until 5 or 5:30 at the earliest. These days, many shows are 90 minutes long, so we need to know what’s open after 90-minute productions. Anyway, we sat at a cozy corner banquette. Jules had a delicious tomato/goat cheese tart and I had a tuna tart. Next, we shared a cheese soufflé, which was wonderful. The wine selection is not extensive, but we were happy with a Napa red. They had run out of Stags Leap cab. Jules had chicken paillard and I had a perfect filet mignon flambé; both were very fine. I don’t have very good luck with the thin, pounded chicken Jules had; mine always comes out too dry, whereas this bistro-style chicken was moist and tasty. Probably the real reason we tried this bistro is that we heard they serve crêpes Suzette, flambéed tableside. The last time we had this dessert was at The Four Seasons Restaurant, where we used to hang out a lot. Now, sadly, this great NY restaurant has closed. Anyway, we had the world’s most delicious dessert once again, with a scoop of vanilla ice cream (Four Seasons style), and were not disappointed.
We headed home up Park Avenue, with a slight powdering of snow and its mall of holiday lights. Manhattan is very lovely at this time of year. ©Joyce Hays, Target Health Inc.
May Your Days Be Merry and Bright! Irving Berlin. Photo: ©Joyce Hays, Target Health Inc.
From Our Table to Yours!
Research suggests that large submarine landslides off Great Bahama Bank in the past were large enough to generate tsunamis
December 15, 2016
University of Miami Rosenstiel School of Marine & Atmospheric Science
While the Caribbean is not thought to be at risk for tsunamis, a new study by researchers at the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science indicates that large submarine landslides on the slopes of the Great Bahama Bank have generated tsunamis in the past and could potentially again in the future.
“Our study calls attention to the possibility that submarine landslides can trigger tsunami waves,” said UM Rosenstiel School Ph.D. student Jara Schnyder, the lead author of the study. “The short distance from the slope failures to the coastlines of Florida and Cuba makes potential tsunamis low-probability but high-impact events that could be dangerous.”
The team identified margin collapses and submarine landslides along the slopes of the western Great Bahama Bank — the largest of the carbonate platforms that make up the Bahamas archipelago — using multibeam bathymetry and seismic reflection data. These landslides are several kilometers long, and their mass can slide up to 20 kilometers (12 miles) into the basin.
An incipient failure scar nearly 100 kilometers (62 miles) long was identified as a potential future landslide, which could be triggered by the earthquakes that occasionally occur off the coast of Cuba.
Using the mathematical models commonly used to evaluate tsunami potential in the U.S., the researchers then simulated the tsunami waves for multiple scenarios of submarine landslides originating off the Great Bahama Bank. They found that submarine landslides and margin collapses in the region could generate dangerous ocean currents and possibly hazardous tsunami waves several meters high along the east coast of Florida and northern Cuba.
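The arithmetic behind Schnyder’s “low-probability but high-impact” framing is easy to sketch. In shallow water, long waves such as tsunamis travel at roughly c = sqrt(g·h), so over the short distances involved here, arrival times are short. The depth and distance values below are illustrative assumptions for the Straits of Florida, not figures from the study:

```python
import math

def tsunami_speed(depth_m, g=9.81):
    """Long-wave (shallow-water) phase speed: c = sqrt(g * h), in m/s."""
    return math.sqrt(g * depth_m)

def travel_time_minutes(distance_km, depth_m):
    """Rough arrival time for a tsunami crossing water of roughly uniform depth."""
    c = tsunami_speed(depth_m)  # m/s
    return (distance_km * 1000.0) / c / 60.0

# Illustrative values (assumptions, not from the study): the Straits of
# Florida reach several hundred meters deep, and the slope of the western
# Great Bahama Bank lies on the order of 100 km from the Florida coast.
speed = tsunami_speed(700.0)                  # roughly 83 m/s, about 300 km/h
arrival = travel_time_minutes(100.0, 700.0)   # on the order of 20 minutes
print(f"speed = {speed:.0f} m/s, arrival = {arrival:.0f} min")
```

Even with generous uncertainty in these assumed numbers, the timescale is tens of minutes, which is why a nearby landslide-generated tsunami would leave little warning time.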
“Residents in these areas should be aware that tsunamis do not necessarily have to be created by large earthquakes, but can also be generated by submarine landslides that can be triggered by smaller earthquakes,” said UM Rosenstiel School Professor of Marine Geosciences Gregor Eberli, senior author of the study.
The study, titled “Tsunamis caused by submarine slope failures along western Great Bahama Bank,” was published in the Nov. 4 issue of the journal Scientific Reports. The paper’s co-authors include: Jara S.D. Schnyder, Gregor P. Eberli of the CSL-Miami, James T. Kirby, Fengyan Shi, and Babak Tehranirad of the University of Delaware, Thierry Mulder and Emmanuelle Ducassou of the Université de Bordeaux in France, and Dierk Hebbeln and Paul Wintersteller of the University of Bremen in Germany.
Materials provided by University of Miami Rosenstiel School of Marine & Atmospheric Science. Note: Content may be edited for style and length.
- Jara S.D. Schnyder, Gregor P. Eberli, James T. Kirby, Fengyan Shi, Babak Tehranirad, Thierry Mulder, Emmanuelle Ducassou, Dierk Hebbeln, Paul Wintersteller. Tsunamis caused by submarine slope failures along western Great Bahama Bank. Scientific Reports, 2016; 6: 35925 DOI: 10.1038/srep35925
Source: University of Miami Rosenstiel School of Marine & Atmospheric Science. “Tsunami risk for Florida and Cuba modeled: Research suggest that large submarine landslides off Great Bahama Bank in the past were large enough to generate tsunamis.” ScienceDaily. ScienceDaily, 15 December 2016. <www.sciencedaily.com/releases/2016/12/161215085924.htm>.
December 14, 2016
NASA/Goddard Space Flight Center
Scientists from NASA and three universities have presented new discoveries about the way heat and energy move and manifest in the ionosphere, a region of Earth’s atmosphere that reacts to changes from both space above and Earth below.
Far above Earth’s surface, within the tenuous upper atmosphere, is a sea of particles that have been split into positive and negative ions by the sun’s harsh ultraviolet radiation. Called the ionosphere, this is Earth’s interface to space, the area where Earth’s neutral atmosphere and terrestrial weather give way to the space environment that dominates most of the rest of the universe — an environment that hosts charged particles and a complex system of electric and magnetic fields. The ionosphere is both shaped by waves from the atmosphere below and uniquely responsive to the changing conditions in space, conveying such space weather into observable, Earth-effective phenomena — creating the aurora, disrupting communications signals, and sometimes causing satellite problems.
Many of these effects are not well-understood, leaving the ionosphere, for the most part, a region of mystery. Scientists from NASA’s Goddard Space Flight Center in Greenbelt, Maryland, the Catholic University of America in Washington, D.C., the University of Colorado Boulder, and the University of California, Berkeley, presented new results on the ionosphere at the fall meeting of the American Geophysical Union on Dec. 14, 2016, in San Francisco.
One researcher explained how the interaction between the ionosphere and another layer in the atmosphere, the thermosphere, counteracts heating in the thermosphere — heating that leads to expansion of the upper atmosphere, which can cause premature orbital decay. Another researcher described how energy outside the ionosphere accumulates until it discharges — not unlike lightning — offering an explanation for how energy from space weather crosses over into the ionosphere. A third scientist discussed two upcoming NASA missions that will provide key observations of this region, helping us better understand how the ionosphere reacts both to space weather and to terrestrial weather.
Changes in the ionosphere are primarily driven by the sun’s activity. Though it may appear unchanging to us on the ground, our sun is, in fact, a very dynamic, active star. Watching the sun in ultraviolet wavelengths of light from space — above our UV light-blocking atmosphere — reveals constant activity, including bursts of light, particles, and magnetic fields.
Occasionally, the sun releases huge clouds of particles and magnetic fields that explode out from the sun at more than a million miles per hour. These are called coronal mass ejections, or CMEs. When a CME reaches Earth, its embedded magnetic fields can interact with Earth’s natural magnetic field — called the magnetosphere — sometimes compressing it or even causing parts of it to realign.
It is this realignment that transfers energy into Earth’s atmospheric system, by setting off a chain reaction of shifting electric and magnetic fields that can send the particles already trapped near Earth skittering in all directions. These particles can then create one of the most recognizable and awe-inspiring space weather events — the aurora, otherwise known as the Northern Lights.
But the transfer of energy into the atmosphere isn’t always so innocuous. It can also heat the upper atmosphere — where low-Earth satellites orbit — causing it to expand like a hot-air balloon.
“This swelling means there’s more stuff at higher altitudes than we would otherwise expect,” said Delores Knipp, a space scientist at the University of Colorado Boulder. “That extra stuff can drag on satellites, disrupting their orbits and making them harder to track.”
This phenomenon is called satellite drag. New research shows that this understanding of the upper atmosphere’s response to solar storms — and the resulting satellite drag — may not always hold true.
“Our basic understanding has been that geomagnetic storms put energy into the Earth system, which leads to swelling of the thermosphere, which can pull satellites down into lower orbits,” said Knipp, lead researcher on these new results. “But that isn’t always the case.”
Sometimes, the energy from solar storms can trigger a chemical reaction that produces a compound called nitric oxide in the upper atmosphere. Nitric oxide acts as a cooling agent at very high altitudes, promoting energy loss to space, so a significant increase in this compound can cause a phenomenon called overcooling.
“Overcooling causes the atmosphere to shed energy from the geomagnetic storm much more quickly than anticipated,” said Knipp. “It’s like the thermostat for the upper atmosphere got stuck on the ‘cool’ setting.”
That quick loss of energy counteracts the previous expansion, causing the upper atmosphere to collapse back down — sometimes to an even smaller state than it started in, leaving satellites traveling through lower-density regions than anticipated.
A new analysis by Knipp and her team classifies the types of storms that are likely to lead to this overcooling and rapid upper atmosphere collapse. By comparing over a decade of measurements from Department of Defense satellites and NASA’s Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics, or TIMED, mission, the researchers were able to spot patterns in energy moving throughout the upper atmosphere.
“Overcooling is most likely to happen when very fast and magnetically-organized ejecta from the sun rattle Earth’s magnetic field,” said Knipp. “Slow clouds or poorly-organized clouds just don’t have the same effect.”
This means that, counterintuitively, the most energetic solar storms are likely to provide a net cooling and shrinking effect on the upper atmosphere, rather than heating and expanding it as had been previously understood.
Competing with this cooling process is the heating caused by solar storm energy making its way into Earth’s atmosphere. Though scientists have known that solar wind energy eventually reaches the ionosphere, they have understood little about where, when and how this transfer takes place. New observations show that the process is localized and impulsive, and partly dependent on the state of the ionosphere itself.
Traditionally, scientists have thought that the way energy moves throughout Earth’s magnetosphere and atmosphere is determined by the characteristics of the incoming particles and magnetic fields of the solar wind — for instance, a long, steady stream of solar particles would produce different effects than a faster, less consistent stream. However, new data shows that the way energy moves is much more closely tied to the mechanisms by which the magnetosphere and ionosphere are linked.
“The energy transfer process turns out to be very similar to the way lightning forms during a thunderstorm,” said Bob Robinson, a space scientist at NASA Goddard and the Catholic University of America.
During a thunderstorm, a buildup of electric potential difference — called voltage — between a cloud and the ground leads to a sudden, violent discharge of that electric energy in the form of lightning. This discharge can only happen if there’s an electrically conducting pathway between the cloud and the ground, called a leader.
Similarly, the solar wind striking the magnetosphere can build up a voltage difference between different regions of the ionosphere and the magnetosphere. Electric currents can form between these regions, creating the conducting pathway needed for that built-up electric energy to discharge into the ionosphere as a kind of lightning.
“Terrestrial lightning takes several milliseconds to occur, while this magnetosphere-ionosphere ‘lightning’ lasts for several hours — and the amount of energy transferred is hundreds to thousands of times greater,” said Robinson, lead researcher on these new results. These results are based on data from the global Iridium satellite communications constellation.
Because solar storms enhance the electric currents that let this magnetosphere-ionosphere lightning take place, this type of energy transfer is much more likely when Earth’s magnetic field is jostled by a solar event.
The huge energy transfer from this magnetosphere-ionosphere lightning is associated with heating of the ionosphere and upper atmosphere, as well as increased aurora.
Though scientists are making progress in understanding the key processes that drive changes in the ionosphere and, in turn, on Earth, there is still much to be understood. In 2017, NASA is launching two missions to investigate this dynamic region: the Ionospheric Connection Explorer, or ICON, and Global Observations of the Limb and Disk, or GOLD.
“The ionosphere doesn’t only react to energy input by solar storms,” said Scott England, a space scientist at the University of California, Berkeley, who works on both the ICON and GOLD missions. “Terrestrial weather, like hurricanes and wind patterns, can shape the atmosphere and ionosphere, changing how they react to space weather.”
ICON will simultaneously measure the characteristics of charged particles in the ionosphere and neutral particles in the atmosphere — including those shaped by terrestrial weather — to understand how they interact. GOLD will take many of the same measurements, but from geostationary orbit, which gives a global view of how the ionosphere changes.
Both ICON and GOLD will take advantage of a phenomenon called airglow — the light emitted by gas that is excited or ionized by solar radiation — to study the ionosphere. By measuring the light from airglow, scientists can track the changing composition, density, and even temperature of particles in the ionosphere and neutral atmosphere.
ICON’s position 350 miles above Earth will enable it to study the atmosphere in profile, giving scientists an unprecedented look at the state of the ionosphere at a range of altitudes. Meanwhile, GOLD’s position 22,000 miles above Earth will give it the chance to track changes in the ionosphere as they move across the globe, similar to how a weather satellite tracks a storm.
“We will be using these two missions together to understand how dynamic weather systems are reflected in the upper atmosphere, and how these changes impact the ionosphere,” said England.
Source: NASA/Goddard Space Flight Center. “Revolutions in understanding the ionosphere, Earth’s interface to space.” ScienceDaily. ScienceDaily, 14 December 2016. <www.sciencedaily.com/releases/2016/12/161214151652.htm>.
Researchers say results should be considered when designing iPSC therapies
December 12, 2016
Scripps Health
As it is in much of life, the aging process isn’t kind to an important type of stem cell that has great therapeutic promise.
Researchers at the Scripps Translational Science Institute (STSI) and The Scripps Research Institute (TSRI) who looked at the effect of aging on induced pluripotent stem cells (iPSCs) found that genetic mutations increased with the age of the donor who provided the source cells, according to study results published by the journal Nature Biotechnology.
The findings reinforce the importance of screening iPSCs for potentially harmful DNA mutations before using them for therapeutic purposes, said the study’s co-lead investigators, Ali Torkamani, Ph.D., director of genome informatics at STSI, and Kristin Baldwin, Ph.D., associate professor of molecular and cellular neuroscience at the Dorris Neuroscience Center at TSRI.
“Any time a cell divides, there is a risk of a mutation occurring. Over time, those risks multiply,” Torkamani said. “Our study highlights the increased risk of mutations in iPSCs made from older donors of source cells.”
Researchers found that iPSCs made from donors in their late 80s had twice as many mutations among protein-encoding genes as stem cells made from donors in their early 20s.
That trend followed a predictable linear track with age, with one exception. Unexpectedly, iPSCs made from blood cells donated by people over 90 years old actually contained fewer mutations than researchers had expected. In fact, stem cells from those extremely elderly participants had mutation numbers more comparable to those of iPSCs made from donors one-half to two-thirds their age.
Researchers said the reason for this could be tied to the fact that blood stem cells remaining in elderly people have been protected from mutations over their lifetime by dividing less frequently.
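The linear trend described above, and the over-90 exception to it, can be illustrated with a toy model. The mutation counts used here are hypothetical anchor values chosen only to match the reported ratio (roughly twice as many coding mutations in iPSCs from donors in their late 80s as from donors in their early 20s), not the study’s actual data:

```python
def expected_mutations(age, n_at_22=7.0, n_at_87=14.0):
    """Linear model anchored at two ages: donors in their early 20s (age 22)
    and donors in their late 80s (age 87), with twice as many mutations at
    the older anchor. Anchor counts are illustrative assumptions."""
    slope = (n_at_87 - n_at_22) / (87 - 22)
    return n_at_22 + slope * (age - 22)

for age in (22, 55, 87):
    print(age, round(expected_mutations(age), 1))
```

Notably, the over-90 donors in the study fell well below what a straight line like this predicts, which is why a single linear model cannot capture the whole story; the surviving blood stem cells of the very old appear to have divided, and therefore mutated, less.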
“Using iPSCs for treatment has already been initiated in Japan in a woman with age-related macular degeneration,” said paper co-author and STSI Director Eric Topol, M.D. “Accordingly, it’s vital that we fully understand the effects of aging on these cells being cultivated to treat patients in the future.”
STSI is a National Institutes of Health-sponsored site led by Scripps Health in collaboration with TSRI. This innovative research partnership is leading the effort to translate wireless and genetic medical technologies into high-quality, cost-effective treatments and diagnostics for patients.
Of the 336 different mutations that were identified in the iPSCs generated for the study, 24 were in genes that could impair cell function or trigger tumor growth if they malfunctioned.
How troublesome these mutations could be depends on how well the stem cells are screened to filter out the defects and how they are used therapeutically, Torkamani said. For example, cells made from iPSCs for a bone marrow transplant would be potentially dangerous if they contained a TET2 gene mutation linked to blood cancer, which surfaced during the study.
“We didn’t find any overt evidence that these mutations automatically would be harmful or pathogenic,” he said.
For the study, researchers tapped three sources for 16 participant blood samples: The Wellderly Study, an ongoing STSI research project that is searching for the genetic secrets behind lifelong health by looking at the genes of healthy elderly people ages 80 to 105; the STSI GeneHeart Study, which involves people with coronary artery disease; and TSRI’s research blood donor program.
The iPSCs were generated by study co-authors Valentina Lo Sardo, Ph.D., and Will Ferguson, M.S., researchers in the TSRI group led by Baldwin.
“When we proposed this study, we weren’t sure whether it would even be possible to grow iPSCs from the blood of the participants in the Wellderly Study, since others have reported difficulty in making these stem cells from aged patients,” Baldwin said. “But through the hard work and careful experiments designed by Valentina and Will, our laboratories became the first to produce iPSCs from the blood of extremely elderly people.”
- Valentina Lo Sardo, William Ferguson, Galina A Erikson, Eric J Topol, Kristin K Baldwin, Ali Torkamani. Influence of donor age on induced pluripotent stem cells. Nature Biotechnology, 2016; DOI: 10.1038/nbt.3749
Source: Scripps Health. “Aging process increases DNA mutations in important type of stem cell: Researchers say results should be considered when designing iPSC therapies.” ScienceDaily. ScienceDaily, 12 December 2016. <www.sciencedaily.com/releases/2016/12/161212115837.htm>.
December 12, 2016
Most robots achieve grasping and tactile sensing through motorized means, which can be excessively bulky and rigid. A Cornell University group has devised a way for a soft robot to feel its surroundings internally, in much the same way humans do.
A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of Organic Robotics Lab, has published a paper describing how stretchable optical waveguides act as curvature, elongation and force sensors in a soft robotic hand.
Doctoral student Huichan Zhao is lead author of “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” which is featured in the debut edition of Science Robotics.
“Most robots today have sensors on the outside of the body that detect things from the surface,” Zhao said. “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
Optical waveguides have been in use since the early 1970s for numerous sensing functions, including tactile, position and acoustic. Fabrication was originally a complicated process, but the advent over the last 20 years of soft lithography and 3-D printing has led to development of elastomeric sensors that are easily produced and incorporated into a soft robotic application.
Shepherd’s group employed a four-step soft lithography process to produce the core (through which light propagates), and the cladding (outer surface of the waveguide), which also houses the LED (light-emitting diode) and the photodiode.
The more the prosthetic hand deforms, the more light is lost through the core. That variable loss of light, as detected by the photodiode, is what allows the prosthesis to “sense” its surroundings.
“If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” Shepherd said. “The amount of loss is dependent on how it’s bent.”
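The sensing principle Shepherd describes, more bending means more light lost, amounts to inverting a monotonic calibration curve: the photodiode signal drops as curvature increases, so a measured voltage can be mapped back to a bend state. The function names and calibration numbers below are assumptions for illustration; the paper’s actual calibration is not reproduced here:

```python
import bisect

# Hypothetical calibration table: (photodiode output in volts, bend angle in
# degrees). More bend -> more light lost -> lower voltage, so the table is
# monotonic and invertible. Values are made up for illustration.
CALIBRATION = [(0.4, 90.0), (0.9, 60.0), (1.5, 30.0), (2.0, 0.0)]

def bend_angle(photodiode_v):
    """Linearly interpolate the bend angle from a photodiode reading,
    clamping readings outside the calibrated range."""
    volts = [v for v, _ in CALIBRATION]
    angles = [a for _, a in CALIBRATION]
    if photodiode_v <= volts[0]:
        return angles[0]
    if photodiode_v >= volts[-1]:
        return angles[-1]
    i = bisect.bisect_left(volts, photodiode_v)
    v0, v1 = volts[i - 1], volts[i]
    a0, a1 = angles[i - 1], angles[i]
    frac = (photodiode_v - v0) / (v1 - v0)
    return a0 + frac * (a1 - a0)

print(bend_angle(1.2))  # midway between 0.9 V and 1.5 V, so roughly 45 degrees
```

In a real device the curve would be measured per waveguide, since, as Shepherd notes, the amount of loss depends on exactly how the waveguide is bent.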
The group used its optoelectronic prosthesis to perform a variety of tasks, including grasping and probing for both shape and texture. Most notably, the hand was able to scan three tomatoes and determine, by softness, which was the ripest.
- Huichan Zhao, Kevin O’Brien, Shuo Li, Robert F. Shepherd. Optoelectronically innervated soft prosthetic hand via stretchable optical waveguides. Science Robotics, 2016; 1 (1): eaai7529 DOI: 10.1126/scirobotics.aai7529
Source: Cornell University. “New robot has a human touch.” ScienceDaily. ScienceDaily, 12 December 2016. <www.sciencedaily.com/releases/2016/12/161212134605.htm>.