Chicken-Radish Salad with Frisee & Green Grapes
Easy to make; delicious to eat! ©Joyce Hays, Target Health Inc. 2015
4 cups bite-size, cooked chicken
2 stalks fresh celery, chopped well
1/2 sweet onion, chopped
1 garlic clove, juiced
1 or 2 radishes, cut in half, then sliced thin
1 cup green grapes, cut in half
1/2 of a fresh endive, leaves cut on the diagonal
1 cup Frisee
3 Tablespoons low-fat Kraft mayo or Kraft regular mayo (your choice)
Using fresh ingredients makes all the difference!
Combine above ingredients, add the mayo, then toss.
This recipe is so quick and easy that it’s amazing how delicious it is. The secret is the right blend of fabulous, very fresh ingredients. What makes it so quick for me is that, for the cooked chicken (if I don’t have any leftovers), I simply buy a rotisserie chicken at Dean & DeLuca (a spacious, high-end NYC deli) and use all the white meat, and if there isn’t enough of that, then the dark meat. We had this last night with a veggie burger recipe I’m working on and will share very soon, and another experiment with broccoli that still needs work. We nibbled on warm pita bread dipped in our best olive oil.
Re: the chicken/radish salad – I was raised on Hellmann’s mayo. Only years later, when I took a big step away from what I was used to, did I discover (just a personal preference) that Kraft mayo tasted better. I’m not invested in, or a spokesperson for, Kraft products at all; I’m just suggesting that you might like it better too. Even the Kraft no-fat and low-fat mayos have more flavor.
We ended our sumptuous repast with the jello cake that we’re currently hooked on, topped with big dollops of no-fat Cool Whip.
We had a Chateauneuf-du-Pape that he couldn’t wait to get his hands on. Couldn’t blame him; such a nice warm finish.
We had a wonderful, relaxing weekend. We saw a terrific play at one of the theater clubs we’re patrons of (an MCC Theater production at The Lucille Lortel Theater, 121 Christopher Street in Greenwich Village). The play, Lost Girls, by actor/writer John Pollono, first opened in LA to excellent reviews and has a limited run here, which has been extended to December 4th. If you like good theater, try to get tickets. John Pollono also wrote Small Engine Repair (which also opened first in LA, to rave reviews), another terrific play that we saw in the Village. Lost Girls is 90 minutes long, so a lot is packed into that time frame. The first half sets up the plot and the characters, who are beautifully written and acted by the whole company. We won’t give away the ending, which spun our heads around. Jules, next to me, said something like “Whoa” or “Wow,” and I immediately teared up. I predict that we will see much more excellent theater from the gifted John Pollono. Watch for him.
John Pollono, playwright, Lost Girls
I love these theater clubs that we support. They are where the most creative aspect of theater is born, and a show often heads to Broadway a year later. Theater clubs are creative laboratories, which, combined with Broadway, make New York the theater capital of the world. We’re so happy, in our small way, to be a part of it all.
From Our Table to Yours!
November 12, 2015
A new RNA test of blood platelets can be used to detect, classify and pinpoint the location of cancer by analyzing a sample equivalent to one drop of blood. Using this new method for blood-based RNA tests of blood platelets, researchers have been able to identify cancer with 96 percent accuracy, according to a study at Umeå University in Sweden, recently published in the journal Cancer Cell.
“Being able to detect cancer at an early stage is vital. We have studied how a whole new blood-based method of biopsy can be used to detect cancer, which in the future renders an invasive cell tissue sample unnecessary in diagnosing lung cancer, for instance. In the study, nearly all forms of cancer were identified, which proves that blood-based biopsies have an immense potential to improve early detection of cancer,” according to Jonas Nilsson, cancer researcher at Umeå University and co-author of the article.
In the study, researchers from Umeå University, in collaboration with researchers from the Netherlands and the US, investigated how a new method of blood-based RNA tests of the part of the blood called platelets could be used in detecting and classifying cancer.
The results show that blood platelets could constitute a complete and easily accessible blood-based source for sampling and hence be used in diagnosing cancer as well as in the choice of treatment method.
Blood samples from 283 individuals were studied, of which 228 had some form of cancer and 55 showed no evidence of cancer. By comparing the blood samples’ RNA profiles, researchers could identify the presence of cancer with an accuracy of 96 percent. Among the 39 patients in the study for whom an early detection of cancer had been made, 100 percent of the cases could be identified and classified.
In follow-up tests using the same method, researchers could identify the origin of tumors with a so-far-unsurpassed accuracy of 71 percent in patients with diagnosed cancer of the lung, breast, pancreas, brain, liver, colon and rectum. The samples could also be sorted into subdivisions based on molecular differences in the cancer form, which can be of great use in the choice of treatment method.
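As a rough illustration of how RNA profiles can separate patient classes, here is a minimal nearest-centroid sketch in Python. The transcript counts and labels below are invented for illustration only; the published pipeline trained support-vector-machine classifiers on full platelet RNA-seq profiles of thousands of transcripts, not this toy method.

```python
import math

# Toy RNA "profiles": each sample is expression levels for three transcripts.
# All numbers here are made up for illustration.
training = {
    "cancer":  [[9.0, 2.0, 7.5], [8.5, 2.5, 7.0], [9.2, 1.8, 7.8]],
    "healthy": [[3.0, 6.0, 2.5], [2.5, 6.5, 2.0], [3.2, 5.8, 2.8]],
}

def centroid(samples):
    """Mean expression vector for one class."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def classify(profile, centroids):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(profile, centroids[label]))

centroids = {label: centroid(samples) for label, samples in training.items()}
print(classify([8.8, 2.1, 7.3], centroids))  # → cancer
```

The real study additionally classified tumor origin (lung, breast, pancreas, etc.), which in this sketch would simply mean more class labels in the training dictionary.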
- Myron G. Best, Nik Sol, Irsan Kooi, Jihane Tannous, Bart A. Westerman, François Rustenburg, Pepijn Schellen, Heleen Verschueren, Edward Post, Jan Koster, Bauke Ylstra, Najim Ameziane, Josephine Dorsman, Egbert F. Smit, Henk M. Verheul, David P. Noske, Jaap C. Reijneveld, R. Jonas A. Nilsson, Bakhos A. Tannous, Pieter Wesseling, Thomas Wurdinger. RNA-Seq of Tumor-Educated Platelets Enables Blood-Based Pan-Cancer, Multiclass, and Molecular Pathway Cancer Diagnostics. Cancer Cell, 2015; 28 (5): 666 DOI: 10.1016/j.ccell.2015.09.018
Source: Umea University. “Blood sample new way of detecting cancer.” ScienceDaily. ScienceDaily, 12 November 2015. <www.sciencedaily.com/releases/2015/11/151112123708.htm>.
November 9, 2015
Focused Ultrasound Foundation
The blood-brain barrier has been non-invasively opened in a patient for the first time. A team at Sunnybrook Health Sciences Centre in Toronto used focused ultrasound to enable temporary and targeted opening of the blood-brain barrier (BBB), allowing more effective delivery of chemotherapy into a patient’s malignant brain tumor.
The team, led by neurosurgeon Todd Mainprize, MD, and physicist Kullervo Hynynen, PhD, infused the chemotherapy agent doxorubicin, along with tiny gas-filled bubbles, into the bloodstream of a patient with a brain tumor. They then applied focused ultrasound to areas in the tumor and surrounding brain, causing the bubbles to vibrate, loosening the tight junctions of the cells comprising the blood-brain barrier and allowing high concentrations of the chemotherapy to enter targeted tissues.
“The blood-brain barrier has been a persistent impediment to delivering valuable therapies to treat tumors,” said Dr. Mainprize. “We are encouraged that we were able to open this barrier to deliver chemotherapy directly into the brain, and we look forward to more opportunities to apply this revolutionary approach.”
This patient treatment is part of a pilot study of up to 10 patients to establish the feasibility, safety and preliminary efficacy of focused ultrasound to temporarily open the blood-brain barrier to deliver chemotherapy to brain tumors. The Focused Ultrasound Foundation is currently funding this trial through its Cornelia Flagg Keller Memorial Fund for Brain Research.
“Breaching this barrier opens up a new frontier in treating brain disorders,” said Neal Kassell, MD, Chairman of the Focused Ultrasound Foundation. “We are encouraged by the momentum building for the use of focused ultrasound to non-invasively deliver therapies for a number of brain disorders.”
Opening the blood-brain barrier in a localized region to deliver chemotherapy to a tumor is a predicate for utilizing focused ultrasound for the delivery of other drugs, DNA-loaded nanoparticles, viral vectors, and antibodies to the brain to treat a range of neurological conditions, including various types of brain tumors, Parkinson’s, Alzheimer’s and some psychiatric diseases.
The procedure was conducted using Insightec’s ExAblate Neuro system. “This first patient treatment is a technological breakthrough that may lead to many clinical applications,” said Eyal Zadicario, Vice President for R&D and Director of Neuro Programs, Insightec.
While the current trial is a first-in-human achievement, Dr. Kullervo Hynynen, senior scientist at the Sunnybrook Research Institute, has been performing similar pre-clinical studies for about a decade. His research has shown that the combination of focused ultrasound and microbubbles may not only enable drug delivery, but might also stimulate the brain’s natural responses to fight disease. For example, the temporary opening of the blood-brain barrier appears to facilitate the brain’s clearance of a key pathologic protein related to Alzheimer’s and improves cognitive function.
A recent study by Gerhard Leinenga and Jürgen Götz from the Queensland Brain Institute in Australia further corroborated Hynynen’s research, demonstrating that opening the blood-brain barrier with focused ultrasound reduced brain plaques and improved memory in a mouse model of Alzheimer’s disease.
Based on these two pre-clinical studies, a pilot clinical trial using focused ultrasound to treat Alzheimer’s is being organized.
About The Blood-Brain Barrier
The blood-brain barrier (BBB) is a protective layer of tightly joined cells that lines the blood vessels of the brain and keeps harmful substances, such as toxins and infectious agents, from entering the surrounding tissue. Unfortunately, this barrier also prevents certain drugs from reaching their targets within the brain in adequate concentrations. Safely and temporarily opening the barrier in a well-defined area to deliver drugs at therapeutic levels is a long-sought goal for treatment of a wide variety of neurological conditions including brain tumors, Alzheimer’s disease, Parkinson’s disease and epilepsy.
Currently, there are limited options to circumvent the blood-brain barrier and deliver drugs. Drugs can be directly injected into the brain, with the risk of hemorrhage, infection or damage to normal brain tissue from the needle or catheter. The pharmacological agent mannitol has been used to disrupt the barrier when injected into the blood supply, but this approach is uncontrolled and non-selective and can further be associated with significant effects on blood pressure and the body’s fluid balance.
Source: Focused Ultrasound Foundation. “Blood-brain barrier opened non-invasively with focused ultrasound for the first time.” ScienceDaily. ScienceDaily, 9 November 2015. <www.sciencedaily.com/releases/2015/11/151109085103.htm>.
Textbooks from different major publishers give climate deniers the same weight as the vast majority of climate scientists who cite scientific evidence of human-caused global warming
November 10, 2015
Southern Methodist University
A new study that analyzed four California science textbooks from major publishers found that they position climate change as a debate over differing opinions. Contrary to the clear majority of climate scientists, who cite scientific data and evidence of human-caused climate change, the textbooks present the topic as uncertain: humans may or may not cause it, and it’s unclear whether we need immediate mitigating action, the researchers found.
If American teens are unsure about climate change or its cause, some school textbooks aren’t helping, says teaching expert Diego Román, Southern Methodist University, Dallas, co-author of a new study on the subject.
Studies estimate that only 3 percent of scientists who are experts in climate analysis disagree about the causes of climate change. But the most recent report from the Intergovernmental Panel on Climate Change — the evidence of 600 climate researchers in 32 countries reporting changes to Earth’s atmosphere, ice and seas — in 2013 stated “human influence on the climate system is clear.”
Yet only 54 percent of American teens believe climate change is happening, 43 percent don’t believe it’s caused by humans, and 57 percent aren’t concerned about it.
The new study measured how four sixth-grade science textbooks adopted for use in California frame the subject of global warming. Sixth grade is the first time California state standards indicate students will encounter climate change in their formal science curriculum.
The researchers examined the four textbooks, each published in 2007 or 2008 by a different major publisher. They found and analyzed 279 clauses containing 2,770 words discussing climate change.
“We found that climate change is presented as a controversial debate stemming from differing opinions,” said Román, an assistant professor in the Department of Teaching and Learning in the SMU Simmons School. “Climate skeptics and climate deniers are given equal time and treated with equal weight as scientists and scientific facts — even though scientists who refute global warming total a miniscule number.”
The message communicated in the four textbooks was that climate change is possibly happening, that humans may or may not be causing it, and that it’s unclear whether we need to take immediate mitigating action, the researchers found.
That representation matches the public discourse around global warming, in which previous studies have shown that media characterize climate change as unsettled science with high levels of scientific uncertainty. The researchers said only 33 percent of the U.S. public believes climate change is a serious threat.
The textbooks misrepresented, however, actual scientific discourse, which asserts climate change is an environmental problem bearing immense risk, where the human impact is clear, and where immediate action is warranted, the authors said.
“The primary purpose of science education is to represent the science accurately, but this analysis of textbooks shows this not to be the case for climate science,” they said.
Co-author on the article is K.C. Busch, a Ph.D. candidate in science education in Stanford University’s Graduate School of Education.
The authors reported the findings in October at the 11th Conference of the European Science Education Research Association (ESERA), held in Helsinki, Finland.
The findings were also published in the journal Environmental Education Research in the article “Textbooks of doubt: Using systemic functional analysis to explore the framing of climate change in middle-school science textbooks.”
New national standards align with scientific discourse
An extensive body of prior research has revealed that students have many misconceptions about climate change, confusing it, for example, with acid rain and ozone depletion, as well as linking it to skin cancer, the authors note.
Now there’s an opportunity to ensure textbooks aren’t part of the problem, by altering misleading language, Román said.
States have begun adopting new national standards for science education as a result of recommendations by the U.S. Next Generation Science Standards. Those standards were developed in part by the National Science Teachers Association and the American Association for the Advancement of Science and align more accurately with the scientific discourse.
“As the Next Generation Science Standards become adopted and implemented, publishers are writing new textbooks that include climate change,” the authors said. “This reworking of science textbooks provides a rare opportunity to reflect on how we can create texts that enhance science teaching and learning.” The standards were completed in April 2013.
Specifically, the textbook researchers recommend against stripping out all uncertainty, since even well-proven theories carry the possibility that a better theory will contradict one or more of their postulates.
Instead they recommend clarifying what exactly is unknown and why.
They also recommend the inclusion of humans as agents and as the cause of climate change. That fact is scientifically supported and not controversial among scientists who study climate from a broad range of disciplines, including geology, geophysics, geography, paleoclimatology, glaciology, hydrology, ecology, evolutionary biology, environmental studies and oceanography.
Textbook language doesn’t reflect science of climate change
To study the textbooks, the researchers applied text analysis to conduct an exhaustive examination of the choices and frequency of language, including the level of uncertainty as well as the agents involved.
The textbooks did promote uncertainty when addressing the causes of climate change by using verbs such as could, may or might. And some passages created the view that global warming could even be beneficial. One textbook wrote:
“Global warming could have some positive effects. Farmers in some areas that are now cool could plant two crops a year instead of one. Places that are too cold for farming today could become farmland. However, many effects of global warming are likely to be less positive. Higher temperatures would cause water to evaporate from exposed soil, such as plowed farmland. Dry soil blows away easily. Thus, some fertile fields might become ‘dust bowls.'”
The texts emphasized abstractions, such as deforestation or the burning of wood, without referencing humans.
When attributing information to scientists, the textbooks used verbs such as believe, think or propose, but rarely were scientists said to be drawing conclusions from evidence or data. There was one occurrence when the noun evidence was used, the authors said, and then it was to suggest the notion that climate change is not new:
“Scientists have found evidence of many major ice ages throughout Earth’s geologic history.”
Less frequently used were verbs that describe scientific practices — such as “find,” “determine,” “measure,” “obtain.” The most frequently used word when scientists were present in the sentence was “think,” which introduces the idea that it was decided rather than observed or found as the result of scientific observation and research, Román said.
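The kind of verb-frequency tally described above can be sketched in a few lines of Python. The word lists and the sample passage (drawn from the textbook excerpts quoted earlier) are illustrative assumptions; the study itself used systemic functional analysis of 279 clauses, not a simple word count.

```python
import re
from collections import Counter

# Hedging modals vs. verbs of scientific practice, per the examples
# quoted in the article. These word lists are assumptions for illustration.
HEDGES = {"could", "may", "might"}
PRACTICE = {"find", "found", "determine", "measure", "obtain", "observed"}

passage = ("Global warming could have some positive effects. Farmers in some "
           "areas that are now cool could plant two crops a year instead of one. "
           "Scientists have found evidence of many major ice ages.")

# Tokenize to lowercase words and tally each category.
words = Counter(re.findall(r"[a-z]+", passage.lower()))
hedge_count = sum(words[w] for w in HEDGES)
practice_count = sum(words[w] for w in PRACTICE)
print(hedge_count, practice_count)  # → 2 1
```

Even on this short excerpt, hedging modals outnumber verbs of scientific practice, which is the imbalance the researchers report across all four textbooks.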
Language matters, particularly in California, Texas, New York
The findings suggest that textbooks should be more specific about the facts, should cite sources, and should accurately reflect the methods by which scientists reached their conclusions.
“The work of scientists should be represented accurately rather than saying that scientists think or believe, as if it’s a matter of opinion,” Román said.
As a social scientist who studies linguistics and the impact of words, Román said language matters, particularly in the textbooks in the nation’s three most populated states, California, Texas and New York, which set standards for the rest of the country.
“These textbooks discuss the impact of climate change on the Earth in hypothetical terms, in complete contradiction to scientific research findings,” he said.
The researchers note that while it’s accurate that agreement isn’t unanimous, only about 3 percent of climate scientists disagree about the causes of climate change. “Yet textbooks characterize that with the description ‘some scientists,’ so students can assume it’s 50-50, which is very different from saying ’97 percent of scientists,'” he said.
Does the language reflect a compromise by publishers as they walk a fine line?
“It appears textbook publishers include discussion of climate change to appease one segment of their market — but then to appease another segment they suggest doubt, which doesn’t reflect the scientific reality,” he said.
Textbooks lack specific language to guide student action
Textbook language should reflect the language used in scientific reports, be explicit about the sources of information and should clarify human cause, with specific actions students can take to produce change, the authors recommend.
Yet none of the textbooks explicitly called students to act to mitigate climate change, the authors note.
Generic advice, such as “take care of the environment” or “stop burning coal and wood,” lacks specific solutions for action.
“Students think, ‘that’s not me — that’s the people in the Amazon who are burning forests,'” Román said. “Textbooks must draw the connection between specifics, such as turning off lights or driving less, to relate solutions to students and their lives.”
- Diego Román, K.C. Busch. Textbooks of doubt: using systemic functional analysis to explore the framing of climate change in middle-school science textbooks. Environmental Education Research, 2015; 1 DOI: 10.1080/13504622.2015.1091878
Source: Southern Methodist University. “California 6th grade science books: Climate change a matter of opinion not scientific fact: Textbooks from different major publishers give climate deniers equal weight as vast majority of climate scientists who cite scientific evidence of human-caused global warming.” ScienceDaily. ScienceDaily, 10 November 2015. <www.sciencedaily.com/releases/2015/11/151110120441.htm>.
November 9, 2015
The thickness of the cortex in a region of the brain that specializes in facial recognition can predict an individual’s ability to recognize faces and other objects.
When you see a familiar face, when a bird-watcher catches a glimpse of a rare bird perched on a limb, or when a car-fancier spots a classic auto driving past, the same small region in the brain becomes engaged.
For almost two decades, neuroscientists have known that this area, called the fusiform face area (FFA), plays a vital role in the brain’s ability to recognize faces and objects that an individual has learned to identify.
Now a new study, accepted for publication by the Journal of Cognitive Neuroscience, has taken this a step further by finding that the thickness of the cortex in the FFA — as measured using magnetic resonance imaging — can predict a person’s ability to recognize faces and objects.
“It is the first time we have found a direct relationship between brain structure and visual expertise,” said Isabel Gauthier, David K. Wilson Professor of Psychology at Vanderbilt University, who directed the study. “It shows more clearly than ever that this part of the brain is relevant to both face and object recognition abilities.”
Surprising twist on cortical thickness
Relationships between cortical thickness and other types of processes, such as motor learning and acquisition of musical skills, have been observed before. The relationship seems relatively straightforward: the process of learning to type faster or play a violin causes the neurons in the relevant area of the cortex to make new connections, which causes the cortex to appear thicker. However, the link between cortical thickness and how well we recognize faces and objects turns out to have a surprising twist.
To establish this surprising relationship, Gauthier and her co-authors, post-doctoral fellow Rankin McGugin and Ana Van Gulick from Carnegie Mellon University, measured the ability of 27 men to identify objects from several different categories divided into two groups: living and non-living. They also tested subjects’ ability at recognizing faces.
Using advanced brain-mapping techniques, the researchers were able to pinpoint the exact location of the FFA in each individual and to measure its cortical thickness. When they analyzed the results, the researchers found that the men with thicker FFA cortex performed generally better at identifying non-living objects while those having thinner FFA cortex performed better at identifying faces and living objects.
“It was really a surprise to find that the effects are in opposite directions for faces and non-living objects,” said Gauthier. “One possibility that we are exploring is that we acquire expertise for faces much earlier than we learn about cars, and brain development is quite different earlier versus later in life.”
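The opposite-sign relationship can be pictured with a simple Pearson correlation. The numbers below are invented solely to show the pattern the study reports (thicker FFA cortex tracking better object-recognition scores and worse face-recognition scores); they are not the study’s data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented numbers, for illustration only:
thickness = [2.1, 2.3, 2.5, 2.7, 2.9]   # FFA cortical thickness (mm)
vehicle_score = [55, 60, 66, 71, 78]     # non-living object recognition
face_score = [82, 78, 73, 69, 64]        # face recognition

print(round(pearson_r(thickness, vehicle_score), 2))  # positive
print(round(pearson_r(thickness, face_score), 2))     # negative
```

A positive r for objects and a negative r for faces in the same brain region is the “surprising twist” the researchers describe.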
There are significant sex differences in facial and object recognition, so the researchers would like to repeat the experiment using women to see if this same relation holds true. They would also like to start with a group of non-experts and then track how the thickness of their FFA cortex changes as they undergo the training process to become experts.
This research was supported by National Science Foundation grant SBE-0542013 and National Eye Institute grant R01-EY013441-06A2.
- Isabel Gauthier, Marlene Behrmann, Michael J. Tarr. Can Face Recognition Really be Dissociated from Object Recognition? Journal of Cognitive Neuroscience, 1999; 11 (4): 349 DOI: 10.1162/089892999563472
Source: Vanderbilt University. “Thickness of grey matter predicts ability to recognize faces and objects.” ScienceDaily. ScienceDaily, 9 November 2015. <www.sciencedaily.com/releases/2015/11/151109133903.htm>.
Target Health Greets and Trains SFDA Officials from Shandong Province
On Nov. 3, 2015, Target Health provided FDA regulatory affairs training to a delegation of State FDA (SFDA) officials from Jinan, in Shandong Province, China. The delegation consisted of 15 SFDA Chief and Deputy Chief Directors from Jinan and its various districts and counties. The delegation is here for three weeks of training on FDA regulatory affairs and the pharmaceutical industry in the U.S., as well as for business meetings. Thank you to Mary Shatzoff and our Chinese-speaking regulatory experts, Adam Harris and Hui Wang, for a very impressive introduction to US regulatory processes and for enhancing US/China relations.
The training in New York/New Jersey was organized by LFI Solutions, Inc., a management consulting firm specializing in US-China cross-border growth strategies and execution. LFI advises clients in both the U.S. and China on market entry and expansion strategies, business development, marketing, licensing, joint ventures, and mergers and acquisitions. Industry sectors include healthcare, clean tech, consumer products and financial services. Its clients range from global brand-name corporations to venture-funded startups.
ON TARGET is the newsletter of Target Health Inc., a NYC-based, full-service contract research organization (eCRO), providing strategic planning, regulatory affairs, clinical research, data management, biostatistics, medical writing and software services to the pharmaceutical and device industries, including the paperless clinical trial.
For more information about Target Health, contact Warren Pearlson (212-681-2100, ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.
Joyce Hays, Founder and Editor in Chief of On Target
Jules Mitchel, Editor
Autism Speaks, The MSSNG Project
The crystallized DNA on the cover of the January 2015 Nature Medicine is the iconic image of the Autism Speaks MSSNG project, with its aim of uncovering the unique beauty spelled out in the genome of every person with autism.
LARGEST-EVER AUTISM GENOME STUDY FINDS MOST SIBLINGS HAVE DIFFERENT AUTISM-RISK GENES
The largest-ever autism genome study reveals that the disorder’s genetic underpinnings are even more complex than previously thought: most siblings who have 1) ___ ___ ___ (ASD) have different autism-linked genes. Led by Stephen Scherer, director of the Autism Speaks MSSNG project (pronounced “missing”), the report made the cover of the January 2015 Nature Medicine.
Simultaneously with the publication, the study’s data became part of the historic first upload of approximately 1,000 autism genomes to the Autism Speaks MSSNG portal on the Google Cloud Platform. Autism Speaks will be making the de-identified data freely available for global research to speed understanding of autism and the development of individualized 2) ___. The goal of the project’s first phase is to upload 10,000 sequenced autism genomes, together with state-of-the-art, web-based analytic tools.
This is a historic day, says study leader Stephen Scherer, as it marks the first time whole 3) ___ sequences for autism will be available for research on the MSSNG open-science database. This is an exemplar for a future when open-access genomics will lead to personalized treatments for many developmental and medical disorders. In addition to leading Autism Speaks’ MSSNG program, Dr. Scherer directs the Centre for Applied Genomics at Toronto’s Hospital for Sick Children and the McLaughlin Centre at the University of Toronto. By using the cloud to make data like this openly available to researchers around the world, we’re literally breaking down barriers in a way never done before, says Autism Speaks Chief Science Officer Robert Ring. As always, our goal at Autism Speaks is to accelerate scientific discovery that will ultimately improve the lives of individuals with 4) ___ at home and around the world. Dr. Ring is among the co-authors of the Nature Medicine report.
In its first phase, the MSSNG project aims to make at least 10,000 sequenced autism genomes available for 5) ___, along with a tool box of state-of-the-art instruments to aid analysis.
Autism’s surprising diversity is a challenge to research scientists. In the new study, the researchers sequenced 340 whole genomes from 85 families, each with two children affected by autism. They found that the majority of siblings (69%) had little to no overlap in the gene variations known to contribute to autism. Less than a third (31%) of the sibling pairs shared the same autism-associated 6) ___. The findings challenge long-held presumptions. Because autism often runs in families, experts had assumed that siblings with the disorder were 7) ___ the same autism-predisposing genes from their parents. It now appears this may not be true in most cases. We knew that there were many differences in autism, but our recent findings firmly nail that down, Dr. Scherer says. We believe that each child with autism is like a snowflake – unique from the other. This means we should not be looking just for suspected autism-risk genes, as is typically done in 8) ___ genetic testing, Dr. Scherer adds. A full assessment of each individual’s genome is needed to determine how to best use knowledge of genetic factors in personalized autism treatment. Whole genome sequencing goes far beyond traditional genetic testing to analyze an individual’s complete 9) ___ sequence.
Known autism-risk genes showed up in 42% of the families participating in the study. This may help explain why autism came about in their child or provide insight into related medical conditions, Dr. Scherer says. For example, genome sequencing found an autism-10) ___ gene with a strong link to epilepsy in one of the study participants. Her brother, who also had autism, had a completely different risk gene – one related to Angelman’s syndrome. In a 2013 pilot genome sequencing study, Dr. Scherer’s team identified autism-linked genes in more than half of 32 participating families. That study provided several families with medically important information.
Other leading authors of the new study include Ryan Yuen, whose work grew out of his Autism Speaks Meixner Postdoctoral Fellowship in Translational Research. In addition to Autism Speaks and Autism Speaks Canada, study funders included KRG Children’s Charitable Foundation, The Catherine and Maxwell Meighen Foundation, NeuroDevNet, Canadian Institutes for Advanced Research, the University of Toronto McLaughlin Centre, Genome Canada, Ontario Genomics Institute, Ontario Brain Institute, the Government of Ontario, Canadian Institutes of Health Research and SickKids Foundation.
ANSWERS: 1) autism spectrum disorder; 2) treatments; 3) genome; 4) autism; 5) research; 6) genes; 7) inheriting; 8) diagnostic; 9) DNA; 10) -risk
Stephen W. Scherer (1964 to present)
Dr. Steve Scherer is a molecular geneticist whose research focuses on understanding how human genes interact to cause disease. Dr. Scherer has received international acclaim for his discovery of the regions of the human chromosome that contain genes linked to autism. He has been able to identify these regions because he specializes in identifying variations in the structure of the human genome. Although we all carry genetic variations – they are what make us individuals – some occur in important developmental genes, which can lead to disease. Dr. Scherer’s work strengthens our knowledge and understanding of genomics and makes him a key player in the advancement of biomedical research.
Dr. Scherer was born in Windsor, Ontario. He completed his Honors Science degree at the University of Waterloo, and his Master of Science and Doctor of Philosophy degrees in the Faculty of Medicine at the University of Toronto. Scherer’s discoveries led to the initial description of genome-wide copy number variations (CNVs) of genes and DNA, including defining CNV as a highly abundant form of human genetic variation. Previous theory held that humans were 99.9% identical at the DNA level, with the small remaining variation accounted for almost entirely by some 3 million single nucleotide polymorphisms (SNPs) per genome. Larger genomic CNV changes, involving losses or gains of thousands or millions of nucleotides encompassing one or several genes, were thought to be exceptionally rare and almost always involved in disease. Scherer’s discovery of frequent CNV events in the genomes of all cells in every individual, co-published with Charles Lee of Harvard in 2004, opened a new window for studies of natural genetic variation, evolution and disease. Scherer recalled, “When the scientific establishment didn’t believe it, we knew we were on to something big. In retrospect, it’s so simple to see these copy number variations were not at all biological outliers, just outliers of the scientific dogma of the time.” Scherer, Lee and collaborators at the Wellcome Trust Sanger Institute then generated the first CNV maps of human DNA, revealing the structural properties, mechanisms of formation, and population genetics of this previously unrecognized, ubiquitous form of natural variation. These studies were also the first to discover that CNVs number in the thousands per genome and encompass at least ten times more DNA letters than SNPs, revealing a ‘dynamic patchwork’ structure of chromosomes. These findings were further substantiated through work with J. Craig Venter’s team, which contributed to the completion of the first genome sequence of an individual.
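The scale of these variation figures can be checked with back-of-the-envelope arithmetic. The ~3-billion-base-pair genome size used below is a round approximation for illustration, not a number from the text:

```python
# Back-of-the-envelope check of the genetic variation figures quoted above.
# The ~3-billion-bp genome size is a round approximation, not from the study.
GENOME_BP = 3_000_000_000

# "99.9% identical" implies roughly 0.1% of positions differ between genomes.
snp_positions = int(GENOME_BP * 0.001)
print(f"SNP positions implied by 99.9% identity: ~{snp_positions:,}")
# ~3,000,000, consistent with the "some 3 million SNPs per genome" figure.

# CNVs were found to encompass "at least ten times more DNA letters" than SNPs.
cnv_bp_lower_bound = 10 * snp_positions
print(f"DNA letters covered by CNVs (lower bound): ~{cnv_bp_lower_bound:,}")
# ~30,000,000 bp, on the order of 1% of the genome.
```

The arithmetic shows why CNVs upended the earlier view: even a conservative estimate puts them ahead of SNPs by an order of magnitude in total bases affected.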
Between 2007 and 2010, Scherer and collaborators went on to discover numerous disease-associated CNVs, and the corresponding disease-susceptibility genes, in upwards of 10% of individuals with autism spectrum disorder. These discoveries have led to broadly available tests facilitating early diagnostic information for autism. In 2013, with collaborators at the Beijing Genomics Institute, Duke University and Autism Speaks USA, Scherer’s team used whole genome sequencing to find genetic variants of clinical relevance in Canadian families with autism. Earlier (1988-2003), with Lap-Chee Tsui, he led studies of human chromosome 7, in particular in the mapping phase of the Human Genome Project. Through collaborative research, genes causative in holoprosencephaly, renal carcinoma, Williams syndrome, sacral agenesis, citrullinemia, renal tubular acidosis, and many others were identified. His group also discovered the largest gene in the genome, which was later found to be involved in autism. The sum of this work, including contributions from scientists worldwide and J. Craig Venter’s Celera Genomics, generated the first published description of human chromosome 7. In other studies with Berge Minassian, disease genes causing deadly forms of epilepsy were identified. In 2012, Scherer and colleagues launched the Personal Genome Project Canada and in 2013, with Mansoor Mohammed and Michael Scott Smith, he founded YouNique Genomics.
Scherer appears regularly on the Canadian Broadcasting Corporation (CBC) and other national TV, radio, and media, including Quirks and Quarks, explaining scientific discoveries. He was featured in Roger Martin’s book The Design of Business and served as the scientific consultant for two documentaries, the MediCinema Film creation Cracking the Code, the continuing saga of genetics, and the Gemini Award-winning documentary, After Darwin by GalaFilms-Telefilm Canada. Scherer holds the GlaxoSmithKline-Canadian Institutes of Health Research Chair in Genome Sciences at the Hospital for Sick Children and University of Toronto. He has been awarded Canada’s Top 40 under 40 Award (1999), Honorary Doctorate-University of Windsor (2001), Scholar of the Howard Hughes Medical Institute (2002), Genetics Society of Canada Scientist Award (2002), the Canadian Institute for Advanced Research Explorer Award (2002), the Steacie Prize in the Natural Sciences (2004), Fellow of the Royal Society of Canada (2007), Fellow of the American Association for the Advancement of Science (AAAS) (2011) and the inaugural Distinguished Science Alumni Award-University of Waterloo (2007).
Scherer is on the Scientific Advisory Board of Autism Speaks, the Board of Trustees of Genome Canada and the international Human Genome Organization, and is a Fellow of the Canadian Institute for Advanced Research. He won the $5 million Premier’s Summit Award for Medical Research (2008) for his seminal contributions in redefining our understanding of genetic variation and disease studies. Recently he was also recognized as a Significant Sigma Chi (2011), became a Distinguished High Impact Professor of the King Abdulaziz University, and was awarded the Queen Elizabeth II Diamond Jubilee Medal for unique contributions to Canada (2013).
Brain Repair Post-Stroke
Stroke can occur when a brain blood vessel becomes blocked, preventing nearby tissue from getting essential nutrients. When brain tissue is deprived of oxygen and nutrients, it begins to die. Once this occurs, repair mechanisms, such as axonal sprouting, are activated as the brain attempts to overcome the damage. During axonal sprouting, healthy neurons send out new projections (sprouts) that re-establish some of the connections lost or damaged during the stroke and form new ones, resulting in partial recovery. Until the study below, recently published in Nature Neuroscience (26 October 2015), it was not known what triggered axonal sprouting.
By looking at brain tissue from mice, monkeys and humans, the authors found that a molecule known as growth and differentiation factor 10 (GDF10) is a key player in repair mechanisms following stroke. The findings suggest that GDF10 may be a potential therapy for recovery after stroke. According to the NIH, these findings help elucidate the mechanisms of repair following stroke; identifying this key protein advances our knowledge of how the brain heals itself from the devastating effects of stroke and may help in developing new therapeutic strategies to promote recovery. Previous studies suggested that GDF10 was involved in the early stages of axonal sprouting, but its exact role in the process was unclear.
By examining animal models of stroke as well as human autopsy tissue, the authors found that GDF10 was activated very early after stroke. Then, using rodent and human neurons in vitro, the authors tested the effect of GDF10 on the length of axons, the neuronal projections that carry messages between brain cells. Results showed that GDF10 stimulated axonal growth and increased the length of the axons. The authors also found that GDF10 may be important for functional recovery after stroke. To examine this, the authors treated mouse models of stroke with GDF10 and had the animals perform various motor tasks to test recovery. The results suggested that increased levels of GDF10 were associated with significantly faster recovery after stroke. When the authors blocked GDF10, the animals did not perform as well on the motor tasks, suggesting that the repair mechanisms were impaired and that the natural levels of GDF10 in the brain represent a signal for recovery.
It has been widely believed that mechanisms of brain repair are similar to those that occur during development. The study team conducted comprehensive analyses to compare the effects of GDF10 on genes related to stroke repair with genes involved in development and learning and memory, processes that result in connections forming between neurons. Surprisingly, there was little similarity. The findings revealed that GDF10 affected entirely different genes following stroke than those involved in development or learning and memory. It was found that regeneration is a unique program in the brain that occurs after injury and that it is not simply Development 2.0, using the same mechanisms that take place when the nervous system is forming. According to the authors, more research is necessary to determine whether GDF10 can be a potential treatment for stroke recovery.
Trends in Prescription Drug Use Among Adults in the United States From 1999-2012
It is important to document patterns of prescription drug use to inform both clinical practice and research. As a result, a study published in JAMA (2015;314:1818-1830) was performed to evaluate trends in prescription drug use among adults living in the United States. For the study, temporal trends in prescription drug use were evaluated using nationally representative data from the National Health and Nutrition Examination Survey (NHANES). Participants included 37,959 noninstitutionalized US adults, aged 20 years and older. Seven NHANES cycles were included (1999-2000 to 2011-2012), and the sample size per cycle ranged from 4,861 to 6,212. The main outcome measure within each NHANES cycle was use of prescription drugs in the prior 30 days, assessed overall and by drug class. Temporal trends across cycles were evaluated and weighted to represent the US adult population.
Results indicated an increase in overall use of prescription drugs among US adults between 1999-2000 and 2011-2012, with an estimated 51% of US adults reporting use of any prescription drugs in 1999-2000 and an estimated 59% reporting use of any prescription drugs in 2011-2012 (difference, 8%; P for trend <.001). The prevalence of polypharmacy (use of ≥5 prescription drugs) increased from an estimated 8.2% in 1999-2000 to 15% in 2011-2012 (difference, 6.6%; P for trend <.001). These trends remained statistically significant with age adjustment. Among the 18 drug classes used by more than 2.5% of the population at any point over the study period, the prevalence of use increased in 11 drug classes, including antihyperlipidemic agents, antidepressants, prescription proton-pump inhibitors, and muscle relaxants.
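Because NHANES is a complex survey rather than a simple random sample, prevalence figures like those above are computed with sampling weights so that the estimate represents the US adult population. A minimal sketch of a weighted 30-day prevalence calculation, using invented respondent records (the values and weights below are purely illustrative, not NHANES data):

```python
# Survey-weighted prevalence, in the style of NHANES analyses.
# Each record is (used_any_prescription_drug, sampling_weight).
# All values here are invented for illustration.
records = [
    (True, 1.2), (False, 0.8), (True, 2.5),
    (False, 1.0), (True, 0.5), (False, 3.0),
]

# Weighted prevalence = (weight of users) / (total weight), so respondents
# who represent larger population segments count proportionally more.
weighted_users = sum(w for used, w in records if used)
total_weight = sum(w for _, w in records)
prevalence = weighted_users / total_weight
print(f"Weighted 30-day prevalence: {prevalence:.1%}")
```

An unweighted proportion here would be 3/6 = 50%, while the weighted estimate differs because the non-users carry more sampling weight; the same mechanism is how NHANES cycle estimates such as 51% and 59% are produced from raw interview records.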
According to the authors, in this nationally representative survey significant increases in overall prescription drug use and polypharmacy were observed, and these increases persisted after accounting for changes in the age distribution of the population. In conclusion, the prevalence of prescription drug use increased in the majority of, but not all, drug classes.