Date:
October 21, 2014

 

Source:
University of California – Santa Barbara

 

Summary:
A longstanding question among scientists is whether evolution is predictable. A team of researchers from the University of California, Santa Barbara, may have found a preliminary answer. The genetic underpinnings of complex traits in cephalopods may in fact be predictable because they evolved in the same way in two distinct species of squid.

 

 

Hawaiian bobtail squid; swordtip squid (Uroteuthis edulis), a Japanese species used for sushi.
Credit: Sabrina Pankey

 

 

A longstanding question among scientists is whether evolution is predictable. A team of researchers from UC Santa Barbara may have found a preliminary answer. The genetic underpinnings of complex traits in cephalopods may in fact be predictable because they evolved in the same way in two distinct species of squid.

Last year, UCSB professor Todd Oakley and then-Ph.D. student Sabrina Pankey profiled bioluminescent organs in two species of squid and found that while they evolved separately, they did so in a remarkably similar manner. Their findings are published today in the Proceedings of the National Academy of Sciences.

Oakley, professor and vice chair of UCSB’s Department of Ecology, Evolution and Marine Biology, and Pankey, now a postdoctoral scholar at the University of New Hampshire, leveraged advances in sequencing technology and cutting-edge genomic tools to test predictability in the evolution of biological light production.

They chose to work with the Hawaiian bobtail squid (Euprymna scolopes) and the swordtip squid (Uroteuthis edulis), a Japanese species used for sushi. These distantly related species are two of five genera known to have bioluminescent organs called photophores. The photophores contain symbiotic, light-emitting bacteria, and the squid are capable of controlling the aperture of their organ to modulate how much light is produced.

The scientists wanted to know how similar the two species’ photophores are in terms of their genetic makeup. To find the answer, they sequenced all of the genes expressed in these light organs, something that could not be done using older sequencing technology.

“They are much more similar than we expected in terms of their genetic makeup,” Oakley said. “Usually when two complicated organs evolve separately we would expect them to take very different evolutionary paths to arrive where they are today. The unexpectedly similar genetic makeup demonstrates that these two squid species took very similar paths to evolve these traits.”

More specifically, the researchers demonstrated that bioluminescent organs originated repeatedly during squid evolution and then showed that the global gene expression profiles (transcriptomes) underlying those organs are strikingly — even predictably — similar. To confirm their hypothesis and findings, Oakley and Pankey enlisted the assistance of statisticians from the University of Washington and UCLA, who developed new statistical methods to test the idea of convergent (separately evolved) origins.

“I did find some individual genes that were counter to the main pattern, which means we can no longer study just one gene anymore in order to test these questions about the genetic basis of convergence,” said Pankey. “We’re at the point now where we need to — and can — study all of them.”

Some previous experiments have indicated that these squid use their bioluminescent capabilities for camouflage, as counterintuitive as that may seem. “If you imagine lying on your back in the deep ocean and looking up, almost all the light comes from straight above,” Oakley explained. “There’s no structure like walls or trees to reflect the light, so if there’s something above you, it’s going to cast a shadow. The squid can produce light that then matches the light from behind them so it blocks their shadow to a viewer below, which is a type of camouflage.”

The team’s results demonstrate that the evolution of overall gene expression underlying convergent complex traits may be predictable. This finding is unexpected and could indicate unusually strong constraints: The probability of complex organs evolving multiple times with similar trajectories should be vanishingly small, noted Oakley. Yet the team’s novel bioinformatic approaches indicate the evolution of convergent phenotypes is associated with the convergent expression of thousands of genes.

“These results have broad implications for workers in the fields of evolution, genetics, genomics/bioinformatics, biomaterials, symbiosis, invertebrate zoology and evolutionary development,” Oakley concluded.


Story Source:

The above story is based on materials provided by University of California – Santa Barbara. The original article was written by Julie Cohen. Note: Materials may be edited for content and length.


Journal Reference:

  1. M. Sabrina Pankey, Vladimir N. Minin, Greg C. Imholte, Marc A. Suchard and Todd H. Oakley. Predictable transcriptome evolution in the convergent and complex bioluminescent organs of squid. PNAS, 2014. DOI: 10.1073/pnas.1416574111

 

University of California – Santa Barbara. “Let there be light: Evolution of complex bioluminescent traits may be predictable.” ScienceDaily. ScienceDaily, 21 October 2014. <www.sciencedaily.com/releases/2014/10/141021135020.htm>.

Filed Under News

Date:
October 21, 2014

 

Source:
Keele University

 

Summary:
Human exposure to aluminum may be a significant factor in falling sperm counts and reduced male fertility, new research suggests. Fluorescence microscopy using an aluminum-specific stain confirmed the presence of aluminum in semen and showed aluminum inside individual sperm.

 

 


New research from scientists in the UK and France suggests that human exposure to aluminum may be a significant factor in falling sperm counts and reduced male fertility.

 

 

Fluorescence microscopy using an aluminum-specific stain confirmed the presence of aluminum in semen and showed aluminum inside individual sperm.

And the team of scientists, at the universities of Lyon and Saint-Etienne in France and Keele in the UK, found that the higher the aluminum content, the lower the sperm count.

The research, led by Professor Christopher Exley, a leading authority on human exposure to aluminum at Keele, and Professor Michele Cottier, a specialist in cytology and histology at Saint-Etienne, measured the aluminum content of semen from 62 donors at a French clinic.

Professor Exley said: “There has been a significant decline in male fertility, including sperm count, throughout the developed world over the past several decades and previous research has linked this to environmental factors such as endocrine disruptors.

“Human exposure to aluminum has increased significantly over the same time period and our observation of significant contamination of male semen by aluminum must implicate aluminum as a potential contributor to these changes in reproductive fertility.”

The mean aluminum content across all 62 donors was very high, at 339 ppb, and the semen of several donors exceeded 500 ppb. A statistically significant inverse relationship was found between the aluminum content of semen and the sperm count: higher aluminum was associated with a lower sperm count.
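
As a rough illustration (not the analysis from the Reproductive Toxicology paper), an inverse association like the one reported here can be tested with a rank correlation. The sketch below, in Python, runs a Spearman correlation on entirely hypothetical donor values; all numbers are made up for illustration only.

# Illustrative sketch: testing for an inverse relationship between semen
# aluminum content and sperm count with a Spearman rank correlation.
# All donor values below are hypothetical, not data from the study.
import numpy as np
from scipy import stats

aluminum_ppb = np.array([120, 210, 260, 339, 410, 480, 520, 610])  # semen aluminum, ppb
sperm_count = np.array([88, 72, 65, 58, 41, 37, 30, 22])           # million sperm per mL

rho, p_value = stats.spearmanr(aluminum_ppb, sperm_count)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
# A negative rho with a small p-value indicates a statistically significant
# inverse association: higher aluminum, lower sperm count.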


Story Source:

The above story is based on materials provided by Keele University. Note: Materials may be edited for content and length.


Journal Reference:

  1. J.P. Klein, M. Mold, L. Mery, M. Cottier, C. Exley. Aluminium content of human semen: implications for semen quality. Reproductive Toxicology, 2014. DOI: 10.1016/j.reprotox.2014.10.001

 

Keele University. “Exposure to aluminum may impact on male fertility, research suggests.” ScienceDaily. ScienceDaily, 21 October 2014. <www.sciencedaily.com/releases/2014/10/141021085114.htm>.

Filed Under News

Date:
October 15, 2014

 

Source:
Vanderbilt University

 

Summary:
Engineers have developed a surgical robot that performs brain surgery by entering through the cheek instead of the skull, and that can operate on a patient inside an MRI scanner. Additionally, the engineers have designed the system so that much of it can be made using 3-D printing in order to keep the price low.

 

 

This mockup of a patient in an MRI machine shows how the surgical robot that can perform epilepsy surgery through the cheek is set up.
Credit: David Comber, Vanderbilt University

 

 

For those most severely affected, treating epilepsy means drilling through the skull deep into the brain to destroy the small area where the seizures originate — invasive, dangerous and with a long recovery period.

Five years ago, a team of Vanderbilt engineers wondered: Is it possible to address epileptic seizures in a less invasive way? They decided it was. Because the area of the brain involved, the hippocampus, sits at the bottom of the brain, they could develop a robotic device that pokes through the cheek and enters the brain from underneath, which avoids drilling through the skull and puts the entry point much closer to the target area.

To do so, however, meant developing a shape-memory alloy needle that can be precisely steered along a curving path and a robotic platform that can operate inside the powerful magnetic field created by an MRI scanner.

The engineers have developed a working prototype, which was unveiled in a live demonstration this week at the Fluid Power Innovation and Research Conference in Nashville by David Comber, the graduate student in mechanical engineering who did much of the design work.

The business end of the device is a 1.14 mm nickel-titanium needle that operates like a mechanical pencil, with concentric tubes, some of which are curved, that allow the tip to follow a curved path into the brain. (Unlike many common metals, nickel-titanium is compatible with MRIs.) Using compressed air, a robotic platform controllably steers and advances the needle segments a millimeter at a time.

According to Comber, they have measured the accuracy of the system in the lab and found that it is better than 1.18 mm, which is considered sufficient for such an operation. In addition, the needle is inserted in tiny, millimeter steps so the surgeon can track its position by taking successive MRI scans.

According to Associate Professor of Mechanical Engineering Eric Barth, who headed the project, the next stage in the surgical robot’s development is testing it with cadavers. He estimates it could be in operating rooms within the next decade.

To come up with the design, the team began with capabilities that they already had. “I’ve done a lot of work in my career on the control of pneumatic systems,” Barth said. “We knew we had this ability to have a robot in the MRI scanner, doing something in a way that other robots could not. Then we thought, ‘What can we do that would have the highest impact?'”

At the same time, Associate Professor of Mechanical Engineering Robert Webster had developed a system of steerable surgical needles. “The idea for this came about when Eric and I were talking in the hallway one day and we figured that his expertise in pneumatics was perfect for the MRI environment and could be combined with the steerable needles I’d been working on,” said Webster.

The engineers identified epilepsy surgery as an ideal, high-impact application through discussions with Associate Professor of Neurological Surgery Joseph Neimat. They learned that currently neuroscientists use the through-the-cheek approach to implant electrodes in the brain to track brain activity and identify the location where the epileptic fits originate. But the straight needles they use can’t reach the source region, so they must drill through the skull and insert the needle used to destroy the misbehaving neurons through the top of the head.

Comber and Barth shadowed Neimat through brain surgeries to understand how their device would work in practice.

“The systems we have now that let us introduce probes into the brain — they deal with straight lines and are only manually guided,” Neimat said. “To have a system with a curved needle and unlimited access would make surgeries minimally invasive. We could do a dramatic surgery with nothing more than a needle stick to the cheek.”

The engineers have designed the system so that much of it can be made using 3-D printing in order to keep the price low. This was achieved by collaborating with Jonathon Slightam and Vito Gervasi at the Milwaukee School of Engineering who specialize in novel applications for additive manufacturing.


Story Source:

The above story is based on materials provided by Vanderbilt University. The original article was written by Heidi Hall and David Salisbury. Note: Materials may be edited for content and length.

Vanderbilt University. “Brain surgery, by robot, through the cheek.” ScienceDaily. ScienceDaily, 15 October 2014.

Filed Under News

Target Health – What We Do

 

ON TARGET is now sent directly to over 5,200 readers each week, by request only, and we thank our loyal readers for the very kind words we constantly receive about ON TARGET. Several times a year we dedicate an edition of ON TARGET to Target Health Inc., the parent company of ON TARGET.

 

Target Health is a New York City-based, full-service contract research organization (eCRO), providing high-level, sophisticated development services to the drug and device industries. Because we have developed software tools to optimize the paperless clinical trial, it is not uncommon for clients to also ask us to provide data management and software product services for projects where the client may be using the services of another CRO (e.g., global studies or studies with highly specialized indications). As a result, we are also known as the eCRO for CROs.

 


View from the 24th Floor Offices of Target Health Inc. © Target Health 2014

 

The following is a detailed discussion of who we are and what we do.

 

Innovative Thinking at Target Health

 

Here are a few examples of creative thinking in the development space where Target Health has made a difference:

 

For the treatment of Gaucher disease, we helped design a program that went from a Phase 1 study in normal volunteers directly to Phase 3. NDA APPROVED

 

For head lice, we designed the Phase 3 study as placebo-controlled, but on the day after the initial treatment, we brought all subjects back; if lice were present, the subject was designated a treatment failure but was then retreated, this time with the active drug. The subject came back the next day, and if lice were not present, the subject continued in the study. This is now the standard protocol for head lice, and we have finished working on our 3rd head lice program. 2 NDAs APPROVED

 

For a program in emergency contraception, there were delays in enrolling subjects in a study in Europe where the control drug was not approved in the US. We were able to get FDA to agree to expand the study to the US, where we studied 2 unapproved drugs. We demonstrated to FDA that the control drug was approved and marketed in Europe, so there was no resistance to bringing the drug into the US for the clinical trial. NDA APPROVED

 

For a PMA device where, from the client’s and our point of view, a control was not necessarily needed and could delay implementation of the study, we negotiated with FDA to follow a small number of untreated controls rather than randomize to treatment and no treatment. PMA TO BE SUBMITTED THIS YEAR

 

Strategic Planning

 

Planning the drug and device development process involves broad skills, including knowledge of science, medicine, chemistry, toxicology, clinical research and, yes, regulatory affairs. Joining Target Health in May 2005, Sr. Director Glen Park, PharmD, now leads the regulatory group, which acts as the U.S. Agent for 36 companies at FDA. We advise companies on possible and alternative regulatory pathways and how to optimize time to market. We have also helped clients communicate with potential licensors.

 

In recognition of Target Health’s commitment, knowledge and expertise in drug and device development, Dr. Mitchel, President of Target Health, was recently nominated by the Steering Committee of the Clinical Trials Transformation Initiative (CTTI) to be a member of its Executive Committee. CTTI is a public/private partnership with FDA and Duke University whose mandate is to transform the way the pharmaceutical and device industries manage and execute clinical trials.

 

Data Management

 

It is all about the data when convincing regulators to approve, physicians to prescribe and patients to use a drug or device. Joining Target Health in May 1999, Executive Director Yong Joong Kim now leads our data management team. Whether it is helping clients in the overall data management process, designing Data Management Plans (DMPs), creating industry-standard SDTM datasets, making sure that the data “make sense” or working closely with programming to design EDC applications, Yong Joong has developed the best group of DM professionals in the industry. For multiple FDA pre-approval inspections where Target Health performed DM and EDC services, there has never been a finding by FDA inspectors associated with DM.

 

Software Development

 

It is all about our commitment to the paperless trial. Joining Target Health in July 2001, Sr. Director Joonhyuk Choi now leads Target Health’s software development team, which is the best in the pharmaceutical industry. Target Health started with traditional electronic data capture (EDC) software (Target e*CRF®) in 1999. Since that time, and with multiple NDA, BLA, PMA and EMA approvals, Target Health has made great strides in being the industry leader and advocate for the paperless clinical trial. Flagship products include Target e*Studio® (version 2 of Target e*CRF®), Target eClinical Trial Record (Target e*CTR®; eSource), Target Document® (eTMF), Target Encoder™ (MedDRA and WHODrug coding), Target e*Pharmacovigilance (SAE management) and a development program with Life-on-Key, funded by the BIRD Foundation, to integrate any EDC database with any EMR database.

 

Biostatistics

 

It is all about proving and convincing others that a product actually works. Joining Target Health in August 2002, Director Leigh Ren now leads Target Health’s Biostatistics team, which is one of the best in the pharmaceutical industry. Whether we are addressing difficult statistical issues, creating statistical analysis plans (SAPs), running and validating the statistical analyses or writing reports, Leigh’s team can do it all. Supporting Leigh’s team, when necessary, is Dr. Ralph D’Agostino, Jr., Professor of Biostatistics at Wake Forest School of Medicine, who has been our statistical consultant since August 1997. A partial list of accomplishments includes multiple analyses supporting clinical study reports and the following regulatory approvals: NDA (Gaucher disease, head lice, emergency contraception, infertility) and PMA (periodontal disease, adhesion prevention).

 

Clinical Research

 

It is all about managing the process that shows that a product actually works. Glen Park (2005) runs the clinical research group at Target Health, which includes project management, onsite/remote monitoring and medical writing. Clinical programs that have led to approved products include: Gaucher disease, head lice, emergency contraception, periodontal disease and adhesion prevention. One regulatory submission is planned for November of this year and another in Q3 2015, for which full clinical research services have been provided. Current programs include autism, men’s health, rheumatology, and multiple programs in oncology.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

Filed Under News, What's New

Date:
October 16, 2014

 

Source:
University of Virginia

 

Summary:
Parasitic bacteria were the first cousins of the mitochondria that power cells in animals and plants — and first acted as energy parasites in those cells before becoming beneficial, according to a new study.

 

 

Artist’s 3-D rendering of mitochondria (stock illustration).
Credit: © Mopic / Fotolia

 

 

Parasitic bacteria were the first cousins of the mitochondria that power cells in animals and plants — and first acted as energy parasites in those cells before becoming beneficial, according to a new University of Virginia study that used next-generation DNA sequencing technologies to decode the genomes of 18 bacteria that are close relatives of mitochondria.

The study appears this week in the online journal PLoS ONE, published by the Public Library of Science. It provides an alternative theory to two current theories of how simple bacterial cells were swallowed up by host cells and ultimately became mitochondria, the “powerhouse” organelles within virtually all eukaryotic cells — animal and plant cells that contain a nucleus and other features. Mitochondria power the cells by providing them with adenosine triphosphate, or ATP, considered by biologists to be the energy currency of life.

Mitochondria originated about 2 billion years ago, in one of the seminal events in the evolutionary history of life. However, little is known about the circumstances surrounding that origin, which remains an enigma in modern biology.

“We believe this study has the potential to change the way we think about the event that led to mitochondria,” said U.Va. biologist Martin Wu, the study’s lead author. “We are saying that the current theories — all claiming that the relationship between the bacteria and the host cell at the very beginning of the symbiosis was mutually beneficial — are likely wrong.

“Instead, we believe the relationship likely was antagonistic — that the bacteria were parasitic and only later became beneficial to the host cell by switching the direction of the ATP transport.”

The finding, Wu said, is a new insight into an event in the early history of life on Earth that ultimately led to the diverse eukaryotic life we see today. Without mitochondria to provide energy to the rest of a cell, there could not have evolved such amazing biodiversity, he said.

“We reconstructed the gene content of mitochondrial ancestors, by sequencing DNAs of its close relatives, and we predict it to be a parasite that actually stole energy in the form of ATP from its host — completely opposite to the current role of mitochondria,” Wu said.

In his study, Wu also identified many human genes that are derived from mitochondria — an identification that has the potential to help researchers understand the genetic basis of human mitochondrial dysfunction, which may contribute to several diseases, including Alzheimer’s disease, Parkinson’s disease and diabetes, as well as aging-related diseases.

In addition to the basic essential role of mitochondria in the functioning of cells, the DNA of mitochondria is used by scientists for DNA forensics, genealogy and tracing human evolutionary history.


Story Source:

The above story is based on materials provided by University of Virginia. Note: Materials may be edited for content and length.


Journal Reference:

  1. Zhang Wang, Martin Wu. Phylogenomic Reconstruction Indicates Mitochondrial Ancestor Was an Energy Parasite. PLOS ONE, 2014. DOI: 10.1371/journal.pone.0110685

 

University of Virginia. “Cells’ powerhouses were once energy parasites: Study upends current theories of how mitochondria began.” ScienceDaily. ScienceDaily, 16 October 2014. <www.sciencedaily.com/releases/2014/10/141016165955.htm>.

Filed Under News

Date:
October 15, 2014

 

Source:
Drexel University

 

Summary:
One of the tenets for minimizing the risk of spreading Ebola Virus has been a 21-day quarantine period for individuals who might have been exposed to the virus. But a new study suggests that 21 days might not be enough to completely prevent spread of the virus. Experts say there could be up to a 12 percent chance that someone could be infected even after the 21-day quarantine.

 

 

Ebola virus (stock illustration). A new study suggests that 21 days of quarantine might not be enough to completely prevent spread of the virus.
Credit: © krishnacreations / Fotolia

 

 

As medical personnel and public health officials are responding to the first reported cases of Ebola Virus in the United States, many of the safety and treatment procedures for treating the virus and preventing its spread are being reexamined. One of the tenets for minimizing the risk of spreading the disease has been a 21-day quarantine period for individuals who might have been exposed to the virus. But a new study by Charles Haas, PhD, a professor in Drexel’s College of Engineering, suggests that 21 days might not be enough to completely prevent spread of the virus.

Haas’s study, “On the Quarantine Period for Ebola Virus,” recently published in PLOS Currents: Outbreaks, looks at the murky basis for our knowledge about the virus, namely previous outbreaks in Africa in 1976 (Zaire) and 2000 (Uganda) as well as the first 9 months of the current outbreak.

In both cases, data gathered by the World Health Organization reported a 2-21 day incubation period for the virus, meaning that if an individual has not presented symptoms after 21 days, he or she is likely not infected or contagious. This is likely the genesis of the Centers for Disease Control and Prevention’s 21-day quarantine period, but there is little indication from the CDC as to what other considerations played into this policy.

“Twenty-one days has been regarded as the appropriate quarantine period for holding individuals potentially exposed to Ebola Virus to reduce risk of contagion, but there does not appear to be a systematic discussion of the basis for this period,” said Haas, who is the head of the Department of Civil, Architectural and Environmental Engineering at Drexel.

Haas suggests that a broader look at risk factors and costs and benefits should be considered when setting this standard. With any scientific data of this nature there is a standard deviation in results, a percentage by which they may vary. In the case of Ebola’s incubation period, the range of results generated from the Zaire and Uganda data varied little. This might have contributed to the health organizations’ certainty that a 21-day quarantine period was a safe course of action.

But looking more broadly at data from other Ebola outbreaks, in Congo in 1995 and recent reports from the outbreak in West Africa, the range of deviation is between 0.1 and 12 percent, according to Haas. This means that there could be up to a 12 percent chance that someone could be infected even after the 21-day quarantine.
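
To make the reasoning concrete, the sketch below (a rough illustration, not the model in Haas’s paper) fits a two-parameter gamma distribution to an assumed mean and standard deviation of incubation times and computes the probability that the incubation period exceeds 21 days. The parameter values are hypothetical; shifting them within plausible ranges moves this tail probability substantially, which is the sensitivity Haas highlights.

# Illustrative sketch: probability that an Ebola incubation period exceeds a
# 21-day quarantine under an assumed incubation-time distribution.
# The mean and standard deviation are hypothetical, not estimates from Haas's paper.
from scipy import stats

mean_days = 9.0   # assumed mean incubation period (hypothetical)
sd_days = 5.5     # assumed standard deviation (hypothetical)

# Match a gamma distribution to the first two moments.
shape = (mean_days / sd_days) ** 2
scale = sd_days ** 2 / mean_days

p_exceed_21 = stats.gamma.sf(21.0, shape, scale=scale)  # survival function, P(T > 21)
print(f"P(incubation > 21 days) = {p_exceed_21:.1%}")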

“While the 21-day quarantine value, currently used, may have arisen from reasonable interpretation of early outbreak data, this work suggests reconsideration is in order and that 21 days might not be sufficiently protective of public health,” Haas said.

Haas, who has extensive background in analyzing risk of transmitting biological pathogens, explains that these quarantine periods must be determined by looking at the cost of enforcing the quarantine versus the cost of releasing exposed individuals. Looking at the potential tradeoff between costs and benefits as the quarantine time is extended should guide public health officials in determining the appropriate time. Obviously, with more contagious and potentially deadly diseases the cost of making a mistake on the short side when determining a quarantine is extremely high.

“Clearly for pathogens that have a high degree of transmissibility and/or a high degree of severity, the quarantine time should be greater than for agents with lower transmissibility and/or severity. The purpose of this paper is not to estimate where the balancing point should be, but to suggest a method for determining the balancing point.”


Story Source:

The above story is based on materials provided by Drexel University. Note: Materials may be edited for content and length.

 

Journal Reference:

  1. Haas CN. On the Quarantine Period for Ebola Virus. PLOS Currents Outbreaks, 2014 DOI: 10.1371/currents.outbreaks.2ab4b76ba7263ff0f084766e43abbd89

 

 

Drexel University. “Study questions 21-day quarantine period for Ebola.” ScienceDaily. ScienceDaily, 15 October 2014. <www.sciencedaily.com/releases/2014/10/141015112323.htm>.

Filed Under News

Date:
October 14, 2014

 

Source:
University of Copenhagen

 

Summary:
The climate is getting warmer, the ice sheets are melting and sea levels are rising — but how much? The report of the UN’s Intergovernmental Panel on Climate Change (IPCC) in 2013 was based on the best available estimates of future sea levels, but the panel was not able to come up with an upper limit for sea level rise within this century. Now researchers have calculated the risk for a worst-case scenario. The results indicate that at worst, the sea level would rise a maximum of 1.8 meters.

 

 

The worst-case sea level projection is shown in red. There is 95% certainty that sea level will not rise faster than this upper limit. Purple shows the likely range of sea level rise as projected in the IPCC Fifth Assessment Report under a scenario with rising emissions throughout the 21st century (RCP8.5).
Credit: Aslak Grinsted, NBI

The climate is getting warmer, the ice sheets are melting and sea levels are rising — but how much? The report of the UN’s Intergovernmental Panel on Climate Change (IPCC) in 2013 was based on the best available estimates of future sea levels, but the panel was not able to come up with an upper limit for sea level rise within this century. Now researchers from the Niels Bohr Institute and their colleagues have calculated the risk for a worst-case scenario. The results indicate that at worst, the sea level would rise a maximum of 1.8 meters.

The results are published in the scientific journal Environmental Research Letters.

The sea rises when water that is now frozen as ice on land melts and flows into the ocean. First and foremost, this means the two large, kilometer-thick ice sheets on Greenland and Antarctica, but also mountain glaciers.

In addition, large amounts of groundwater are pumped for both drinking water and agricultural use in many parts of the world, and more groundwater is pumped than seeps back down into the ground, so this water also ends up in the oceans.

Finally, when the climate gets warmer, the oceans also get warmer, and warm water expands and takes up more space. But how much, at most, do the experts expect sea levels to rise during this century?

Melting of the ice sheets

“We wanted to try to calculate an upper limit for the rise in sea level, and the biggest question is the melting of the ice sheets and how quickly this will happen. The IPCC restricted their projections to only using results based on models of each process that contributes to sea level. But the greatest uncertainty in assessing the evolution of sea levels is that ice sheet models have only a limited ability to capture the key driving forces in the dynamics of the ice sheets in relation to climatic impact,” said Aslak Grinsted, Associate Professor at the Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.

Aslak Grinsted has therefore, in collaboration with researchers from England and China, worked out new calculations. The researchers combined the IPCC numbers with published data about the expectations within the ice-sheet expert community for how the ice sheets will evolve, including the risk that parts of Antarctica will collapse and how quickly such a collapse would take place.

“We have created a picture of the probable limits for how much global sea levels will rise in this century. Our calculations show that the seas will likely rise around 80 cm. An increase of more than 180 cm has a likelihood of less than 5 percent. We find that a rise in sea levels of more than 2 meters is improbable,” said Aslak Grinsted, but he points out that the results only concern this century; sea levels will continue to rise for centuries to come.
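
As a rough sketch of the approach (not the study’s actual model or data), the example below combines hypothetical probability distributions for the individual contributions to sea level rise and reads the likely value and the upper limit off the percentiles of the combined total. Every distribution and number in it is made up for illustration.

# Illustrative Monte Carlo sketch: combining uncertain contributions to sea level
# rise by 2100 (in cm) and reading off a median and a 95th-percentile upper limit.
# All distributions below are hypothetical, not those of Jevrejeva, Grinsted and Moore.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

thermal_expansion = rng.normal(30, 8, n)                # hypothetical, cm
glaciers = rng.normal(15, 5, n)                         # hypothetical, cm
greenland = rng.lognormal(np.log(12), 0.5, n)           # skewed toward high values, hypothetical
antarctica = rng.lognormal(np.log(10), 0.8, n)          # heavy upper tail (collapse risk), hypothetical
groundwater = rng.normal(5, 2, n)                       # hypothetical, cm

total = thermal_expansion + glaciers + greenland + antarctica + groundwater

print(f"likely (median) rise: {np.percentile(total, 50):.0f} cm")
print(f"upper limit (95th percentile): {np.percentile(total, 95):.0f} cm")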


Story Source:

The above story is based on materials provided by University of Copenhagen. Note: Materials may be edited for content and length.


Journal Reference:

  1. S Jevrejeva, A Grinsted, J C Moore. Upper limit for sea level projections by 2100. Environmental Research Letters, 2014; 9 (10): 104008 DOI: 10.1088/1748-9326/9/10/104008

 

University of Copenhagen. “Rising sea levels of 1.8 meters in worst-case scenario, researchers calculate.” ScienceDaily. ScienceDaily, 14 October 2014. <www.sciencedaily.com/releases/2014/10/141014085902.htm>.

Filed Under News

Date:
October 9, 2014

 

Source:
University of California – Davis

 

Summary:
Neuroscientists have used light to erase a specific memory in mice, showing how the hippocampus and cortex work together to retrieve memories.

 

 

During memory retrieval, cells in the hippocampus connect to cells in the brain cortex.
Credit: Photo illustration by Kazumasa Tanaka and Brian Wiltgen/UC Davis

 

 

Just look into the light: not quite, but researchers at the UC Davis Center for Neuroscience and Department of Psychology have used light to erase specific memories in mice, and proved a basic theory of how different parts of the brain work together to retrieve episodic memories.

Optogenetics, pioneered by Karl Deisseroth at Stanford University, is a new technique for manipulating and studying nerve cells using light. The techniques of optogenetics are rapidly becoming the standard method for investigating brain function.

Kazumasa Tanaka, Brian Wiltgen and colleagues at UC Davis applied the technique to test a long-standing idea about memory retrieval. For about 40 years, Wiltgen said, neuroscientists have theorized that retrieving episodic memories — memories about specific places and events — involves coordinated activity between the cerebral cortex and the hippocampus, a small structure deep in the brain.

“The theory is that learning involves processing in the cortex, and the hippocampus reproduces this pattern of activity during retrieval, allowing you to re-experience the event,” Wiltgen said. If the hippocampus is damaged, patients can lose decades of memories.

But this model has been difficult to test directly, until the arrival of optogenetics.

Wiltgen and Tanaka used mice genetically modified so that when nerve cells are activated, they both fluoresce green and express a protein that allows the cells to be switched off by light. They were therefore able both to follow exactly which nerve cells in the cortex and hippocampus were activated in learning and memory retrieval, and switch them off with light directed through a fiber-optic cable.

They trained the mice by placing them in a cage where they got a mild electric shock. Normally, mice placed in a new environment will nose around and explore. But when placed in a cage where they have previously received a shock, they freeze in place in a “fear response.”

Tanaka and Wiltgen first showed that they could label the cells involved in learning and demonstrate that they were reactivated during memory recall. Then they were able to switch off the specific nerve cells in the hippocampus, and show that the mice lost their memories of the unpleasant event. They were also able to show that turning off other cells in the hippocampus did not affect retrieval of that memory, and to follow fibers from the hippocampus to specific cells in the cortex.

“The cortex can’t do it alone, it needs input from the hippocampus,” Wiltgen said. “This has been a fundamental assumption in our field for a long time and Kazu’s data provides the first direct evidence that it is true.”

They could also see how the specific cells in the cortex were connected to the amygdala, a structure in the brain that is involved in emotion and in generating the freezing response.

Co-authors are Aleksandr Pevzner, Anahita B. Hamidi, Yuki Nakazawa and Jalina Graham, all at the Center for Neuroscience. The work was funded by grants from the Whitehall Foundation, McKnight Foundation, Nakajima Foundation and the National Science Foundation.


Story Source:

The above story is based on materials provided by University of California – Davis. Note: Materials may be edited for content and length.


Journal Reference:

  1. Kazumasa Z. Tanaka, Aleksandr Pevzner, Anahita B. Hamidi, Yuki Nakazawa, Jalina Graham, Brian J. Wiltgen. Cortical Representations Are Reinstated by the Hippocampus during Memory Retrieval. Neuron, 2014. DOI: 10.1016/j.neuron.2014.09.037

 

University of California – Davis. “Manipulating memory with light: Scientists erase specific memories in mice.” ScienceDaily. ScienceDaily, 9 October 2014. <www.sciencedaily.com/releases/2014/10/141009163803.htm>.

Filed Under News

New Publication – Society for Clinical Research Sites

 

Target Health is pleased to announce that the Society for Clinical Research Sites has published an article entitled “The Impact on Clinical Research Sites When Direct Data Entry Occurs at the Time of the Office Visit: A Tale of 6 Studies” (InSite, 3rd Quarter 2014). The paper is coauthored with Target Health and 4 clinical research sites, including Dr. Tessa Cigler (Weill Cornell Medical College), Dr. Marc Gittelman (South Florida Medical Research), Dr. Stephen Auerbach (Newport Urology) and Dr. Mitchell Efros (AccuMed Research Associates).

 

Abstract: Over the course of 3 years beginning in 2011, 16 studies were initiated under US Investigational New Drug Applications (INDs) and Canadian Clinical Trial Applications (CTAs) and 1 US Investigational Device Exemption (IDE), where direct data entry (DDE) at the time of the office visit was fully integrated with risk-based monitoring (RBM) and centralized monitoring (CM). These studies have demonstrated major beneficial effects, not just within the clinical research operations of pharmaceutical companies, but also within the clinical research sites. Not unexpectedly, skeptics have challenged the notion that the clinical research sites will be amenable to changing from paper to electronic source records and to embracing DDE. To address this concern, of the 16 protocols where DDE was performed, six studies were selected as representative case studies in order to evaluate and demonstrate the feasibility of DDE. For reference, one “old” study was selected which used paper source records. The studies ranged from a single-center Phase 1 study in normal volunteers to multiple Phase 2 clinical trials to multiple pivotal studies. Results showed that while there is some variability in the time to data entry from the date of the office visit between studies and sites within studies, it is clearly possible for clinical sites to enter at least 90% of clinical trial subject data on the day of the visit, without the need to “write it down first.”

 

ON TARGET is the newsletter of Target Health Inc., a NYC-based, full-service, contract research organization (eCRO), providing strategic planning, regulatory affairs, clinical research, data management, biostatistics, medical writing and software services to the pharmaceutical and device industries, including the paperless clinical trial.

 

For more information about Target Health contact Warren Pearlson (212-681-2100 ext. 104). For additional information about software tools for paperless clinical trials, please also feel free to contact Dr. Jules T. Mitchel or Ms. Joyce Hays. The Target Health software tools are designed to partner with both CROs and Sponsors. Please visit the Target Health Website.

 

Joyce Hays, Founder and Editor in Chief of On Target

Jules Mitchel, Editor

 

Filed Under News, What's New

How Well Do You Know Your Spleen?


The spleen is an organ found in virtually all vertebrates. Similar in structure to a large lymph node, it acts primarily as a blood 1) ___. It is possible to remove the spleen without jeopardizing life. The spleen plays important roles in regard to red blood cells (also referred to as erythrocytes) and the immune system. It removes old red blood cells and holds a reserve of blood, which can be valuable in case of hemorrhagic shock, and also recycles iron. As a part of the mononuclear phagocyte system, it metabolizes hemoglobin removed from senescent erythrocytes. The globin portion of hemoglobin is degraded to its constitutive amino 2) ___, and the heme portion is metabolized to bilirubin, which is removed in the liver. The spleen synthesizes antibodies in its white pulp and removes antibody-coated bacteria and antibody-coated 3) ___ cells by way of blood and lymph node circulation.

 

A study using mice found that the spleen contains, in its reserve, half of the body’s monocytes within the red pulp. These monocytes, upon moving to injured tissue (such as the heart), turn into dendritic cells and macrophages while promoting tissue healing. The spleen is a center of activity of the mononuclear phagocyte system and can be considered analogous to a large lymph node, as its absence causes a predisposition to certain infections.

 

In humans, the spleen is brownish in color and is located in the left upper quadrant of the 4) ___, and is approximately 7 centimeters (2.8 in) to 14 centimeters (5.5 in) in length. It usually weighs between 150 grams (5.3 oz.) and 200 grams (7.1 oz.). An easy way to remember the anatomy of the spleen is the 1x3x5x7x9x11 rule. The spleen is 1″ by 3″ by 5″, weighs approximately 7 oz, and lies between the 9th and 11th ribs on the left hand side.

 

 


Visceral surface of the spleen

 

The diaphragmatic surface of the spleen (or phrenic surface) is convex, smooth, and is directed upward, backward, and to the left, except at its upper end, where it is directed slightly to the middle. It is in relation with the under surface of the diaphragm, which separates it from the ninth, tenth, and eleventh 5) ___ of the left side, and the intervening lower border of the left lung and pleura. The visceral surface of the spleen is divided by a ridge into two regions: an anterior or gastric and a posterior or renal. The gastric surface (facies gastrica) is directed forward, upward, and toward the middle, is broad and concave, and is in contact with the posterior wall of the stomach. Below this it is in contact with the tail of the pancreas. Near to its mid-border is a long fissure, termed the hilum. This is pierced by several irregular openings, for the entrance and exit of vessels and nerves. The renal surface (facies renalis) is directed medialward and downward. It is somewhat flattened, considerably narrower than the gastric surface, and is in relation with the upper part of the anterior surface of the left 6) ___ and occasionally with the left suprarenal gland.

 

Like the thymus, the spleen possesses only efferent lymphatic vessels. The spleen is part of the 7) ___ system. Both the short gastric arteries and the splenic artery supply it with blood. The germinal centers are supplied by arterioles called penicilliary radicles. The spleen is unique in respect to its development within the gut. While most of the gut viscera are endodermally derived (with the exception of the neural-crest derived suprarenal gland), the spleen is derived from mesenchymal tissue. Specifically, the spleen forms within, and from, the dorsal mesentery. However, it still shares the same blood supply – the celiac trunk – as the foregut organs. Mesenchyme is a type of tissue characterized by loosely associated cells that lack polarity and are surrounded by a large extracellular matrix. Mesenchymal cells are able to develop into the tissues of the lymphatic and circulatory systems, as well as connective 8) ___ throughout the body, such as bone and cartilage. Endoderm is one of the three primary germ 9) ___ layers in the very early embryo. The other two layers are the ectoderm (outside layer) and mesoderm (middle layer), with the endoderm as the innermost layer.

 

Other functions of the spleen are less prominent, especially in the healthy adult:

 

  1. Production of opsonins, properdin, and tuftsin.
  2. Creation of red blood cells. While the bone marrow is the primary site of hematopoiesis in the adult, the spleen has important hematopoietic functions up until the fifth month of gestation. After birth, erythropoietic functions cease, except in some hematologic disorders. As a major lymphoid organ and a central player in the reticuloendothelial system, the spleen retains the ability to produce lymphocytes and, as such, remains a hematopoietic organ.
  3. Storage of red blood cells, lymphocytes and other formed elements. In horses, roughly 30% of the red blood cells are stored there. The red blood cells can be released when needed. In humans, up to a cup (236.5 ml) of red blood cells can be held in the spleen and released in cases of hypovolemia. It can store platelets in case of an emergency and also clears old platelets from the circulation. Up to a quarter of lymphocytes can be stored in the spleen at any one time.

 

Disorders of the spleen include splenomegaly, where the spleen is enlarged for various reasons, such as cancer (specifically blood-based leukemias), and asplenia, where the spleen is not present or functions abnormally. Traumas, such as a motor vehicle accident, can cause rupture of the spleen, which is a situation requiring immediate medical attention. Asplenia refers to a non-functioning spleen, which may be congenital or due to surgical removal. These conditions may cause:

 

  1. modest increases in circulating white blood cells and platelets,
  2. diminished responsiveness to some vaccines,
  3. increased susceptibility to 10) ___ by bacteria and protozoa; in particular, there is an increased risk of sepsis from polysaccharide encapsulated bacteria. Encapsulated bacteria inhibit binding of complement or prevent complement assembled on the capsule from interacting with macrophage receptors. Natural antibodies are required for phagocytosis, which are immunoglobulins that facilitate phagocytosis either directly or by complement deposition on the capsule. They are produced by IgM memory B cells in the marginal zone of the spleen. Splenectomy greatly diminishes the frequency of memory B cells.

 

A 28-year follow-up of 740 World War II veterans who had their spleens removed (splenectomy) on the battlefield showed a significant increase in the usual death rate from pneumonia (6 rather than the expected 1.3) and an increase in the death rate from ischemic heart disease (4.1 rather than the expected 3), but not from other conditions.

 

The word spleen comes from the Ancient Greek and is the idiomatic equivalent of the heart in English; i.e., to be good-spleened means to be good-hearted or compassionate. In English, the word spleen was in customary use during the 18th century. Authors like Richard Blackmore or George Cheyne employed it to characterize the hypochondriacal and hysterical affections. William Shakespeare, in Julius Caesar, uses the spleen to describe Cassius’ irritable nature.

 

Must I observe you? must I stand and crouch
Under your testy humour? By the gods
You shall digest the venom of your spleen,
Though it do split you; for, from this day forth,
I’ll use you for my mirth, yea, for my laughter,
When you are waspish.


In French, “splénétique” refers to a state of pensive sadness or melancholy. It was popularized by the poet Charles Baudelaire (1821-1867) but was already in use before that, particularly in the Romantic literature of the 19th century. The French word for the organ is “rate”. The connection between spleen (the organ) and melancholy (the temperament) comes from the humoral medicine of the ancient Greeks. One of the humors (body fluids) was the black bile, secreted by the spleen organ and associated with melancholy. In contrast, the Talmud (tractate Berachoth 61b) refers to the spleen as the organ of laughter, while possibly suggesting a link with the humoral view of the organ. In eighteenth- and nineteenth-century England, women in bad humor were said to be afflicted by the spleen, or the vapors of the spleen. In modern English, “to vent one’s spleen” means to vent one’s anger, e.g. by shouting, and can be applied to both males and females. Similarly, the English term “splenetic” is used to describe a person in a foul mood.

 

ANSWERS: 1) filter; 2) acids; 3) blood; 4) abdomen; 5) ribs; 6) kidney; 7) lymphatic; 8) tissues; 9) cell; 10) infection

 

Filed Under News
