Culture

The rules of attraction: Scientists find elusive molecule that helps sperm find egg

image: U. Benjamin Kaupp holds a sea urchin in the Marine Resources Center at the Marine Biological Laboratory, Woods Hole, Mass. Kaupp is a Whitman Center scientist at the MBL, from the Center of Advanced European Studies (Caesar) in Bonn, Germany.

Image: 
Megan Costello

WOODS HOLE, Mass. -- Scientists affiliated with the Marine Biological Laboratory (MBL) have identified a key molecule driving chemoattraction between sperm and egg cells in marine invertebrates. The study was recently published in Nature Communications.

More than 100 years ago, MBL Director F.R. Lillie of the University of Chicago discovered that eggs from marine invertebrates release a chemical factor that attracts sperm, a process called chemotaxis. Sperm, for their part, swim up a chemical gradient to reach the egg, assisted by a pulsatile rise in calcium ion (Ca2+) concentration in the sperm tail that controls its beating.

In recent years, many of the cellular components that translate chemoattractant stimulation into a Ca2+ response have been identified, but a crucial ingredient has been missing. Before Ca2+ ions from the sperm's environment can enter the tail, the sperm cell's interior must become more alkaline. The molecule that brings about this change in pH has been elusive.

In this new report, U. Benjamin Kaupp, an MBL Whitman Center Scientist from the Center of Advanced European Studies (Caesar) in Bonn, Germany, identifies this molecule. Kaupp spent 18 summers at the MBL conducting research in the footsteps of F.R. Lillie's original quest.

The molecule that Kaupp and colleagues identified allows sodium ions to flow into the sperm cell and, in exchange, transports protons out of the cell. Such so-called sodium/proton exchangers have been known for a long time, but this one is special. It is a chimaera that shares structural features with ion channels, called pacemaker channels, which control our heartbeat and electrical activity in the brain.

This sodium/proton exchanger, like the pacemaker channels, is activated by a stretch of positively charged amino acids called the voltage sensor. When sperm capture chemoattractant molecules, the membrane voltage becomes more negative, because potassium channels open and potassium ions leave the cell. The voltage sensor registers this voltage change, and the exchanger begins exporting protons from the cell; the cell's interior becomes more alkaline. When this mechanism is disabled, the Ca2+ pulses in the sperm tail are suppressed, and sperm are lost on their voyage to the egg.

Credit: 
Marine Biological Laboratory

Reading the motor intention from brain activity within 100ms

image: A) The researchers propose to use a sensory stimulator in parallel with EEG, and to decode whether the stimulation matches (or not) the sensory feedback corresponding to the user's motor intention. The presented experiment simulated a wheelchair-turning scenario and utilized a galvanic vestibular stimulator (GVS). B) The subjects were fitted with GVS electrodes during the EEG recording. The subliminal GVS stimulations induced a sensation of turning either right or left. C) Experiment timeline: in each trial, using stereo speakers and a 'high'-frequency beep, the subjects were instructed to imagine turning either left or right while sitting in a wheelchair. A subliminal GVS stimulation was applied 2 seconds after the end of each cue, randomly corresponding to turning either right or left. This was followed by a rest period of 3 seconds cued by a 'low'-frequency beep (stop cue).

Image: 
Science Advances

A collaborative study by researchers at Tokyo Institute of Technology has developed a new technique to decode the motor intention of humans from electroencephalography (EEG). The technique is motivated by the well-documented ability of the brain to predict the sensory outcomes of self-generated and imagined actions using so-called forward models. The method enabled, for the first time, nearly 90% single-trial decoding accuracy across tested subjects within 96 ms of stimulation, with zero user training and no additional cognitive load on the users.

The ultimate dream of brain-computer interface (BCI) research is to develop an efficient connection between machines and the human brain, such that the machines can be used at will. For example, an amputee could use an attached robot arm just by thinking of it, as if it were their own arm. A big challenge for such a task is deciphering a human user's movement intention from brain activity while minimizing the user's effort. While a plethora of methods have been suggested for this over the last two decades (1-2), they all demand considerable effort on the part of the human user: they either require extensive user training, work well for only a subset of users, or rely on a conspicuous stimulus, inducing additional attentional and cognitive loads. In this study, researchers from Tokyo Institute of Technology (Tokyo Tech), Le Centre national de la recherche scientifique (CNRS, France), AIST and Osaka University propose a new movement intention decoding philosophy and technique that overcomes all these issues while providing much better decoding performance.

The fundamental difference between the previous methods and this proposal lies in what is decoded. All previous methods decode what movement a user intends or imagines, either directly (as in so-called active BCI systems) or indirectly, by decoding what the user is attending to (as in reactive BCI systems). Here, the researchers propose to use a subliminal sensory stimulator together with the EEG and to decode, not what movement a user intends or imagines, but whether the intended movement matches (or not) the sensory feedback sent to the user through the stimulator. Their proposal is motivated by the multitude of studies on so-called forward models in the brain: the neural circuitry implicated in predicting the sensory outcomes of self-generated movements (3). The sensory prediction errors, between the forward-model predictions and the actual sensory signals, are known to be fundamental to our sensory-motor abilities: haptic perception (4), motor control (5), motor learning (6), and even inter-personal interactions (7-8) and the cognition of self (9). The researchers therefore hypothesized that these prediction errors leave a large signature in the EEG, and that perturbing them (using an external sensory stimulator) would be a promising way to decode movement intentions.

This proposal was tested in a binary simulated wheelchair task, in which users imagined turning their wheelchair either left or right. The researchers stimulated the users' vestibular system (the dominant sensory feedback during turning) toward either the left or the right, subliminally, using a galvanic vestibular stimulator. They then decoded the presence of prediction errors (i.e., whether the stimulation direction matched the direction the user imagined) and consequently, since the direction of stimulation was known, the direction the user imagined. This procedure provided excellent single-trial decoding accuracy (87.2% median) in all tested subjects, within 96 ms of stimulation. These results were obtained with zero user training and with no additional cognitive load on the users, as the stimulation was subliminal.
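The inference step in this protocol is simple once a prediction error has been detected. A minimal sketch (not the authors' code; the function name and logic are illustrative) of how the imagined direction follows from the known stimulation direction:

```python
# Illustrative sketch: once an EEG classifier flags a prediction error
# (the subliminal vestibular stimulation does NOT match the imagined turn),
# the imagined direction follows directly, because the stimulation
# direction is known to the system.

def infer_imagined_direction(stim_direction: str, prediction_error: bool) -> str:
    """Return the decoded turn intention ('left' or 'right').

    stim_direction: direction of the subliminal GVS stimulation.
    prediction_error: True if the EEG classifier detected a mismatch
    between the stimulation and the user's motor imagery.
    """
    opposite = {"left": "right", "right": "left"}
    return opposite[stim_direction] if prediction_error else stim_direction

# A matching stimulation (no prediction error) means the user imagined
# the stimulated direction; a mismatch means the opposite direction.
print(infer_imagined_direction("left", prediction_error=False))  # left
print(infer_imagined_direction("left", prediction_error=True))   # right
```

The hard part of the method, of course, is the EEG classification of match versus mismatch trials; this sketch only captures the final logical step described above.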

This proposal promises to radically change how movement intention is decoded, for several reasons. Primarily, the method promises better decoding accuracy with no user training and without inducing additional cognitive load on the users. Furthermore, the fact that decoding can be done within 100 ms of stimulation highlights its suitability for real-time decoding. Finally, this method is distinct from other methods utilizing ERP, ERD and ERN, meaning it can be used in parallel with current methods to improve their accuracy.

Credit: 
Tokyo Institute of Technology

The new tree of life of freshwater macroinvertebrates in Europe

image: Researchers gathered and analyzed the information of the ecology of about 6,600 species.

Image: 
Núria Bonada (IRBio-UB)

A study from the Faculty of Biology and the Biodiversity Research Institute of the University of Barcelona (IRBio-UB) analysed how aquatic macroinvertebrate species, such as beetles, mosquitoes and dragonflies, have evolved and diversified since their origins. By analysing the ecological features of about 6,600 European species, the researchers reconstructed the functional space these species occupy.

At the same time, they used DNA sequencing to reconstruct the tree of life of aquatic macroinvertebrates (the evolutionary and phylogenetic relationships between species) to estimate when they first appeared and how they evolved. The results confirm previous studies suggesting that the number of species in each lineage does not depend on evolutionary time. The study concludes that the oldest lineages have more functional diversity (they can do more things and live in more habitats) than younger ones, whose functional diversity is constrained by the older lineages that colonized the habitat first.

The new study has been selected as the article of the month (July) in the journal Ecography. Its first author is the ecologist Cesc Múrria (IRBio-UB), and the study is led by Professor Núria Bonada (IRBio-UB), head of the Freshwater Ecology, Hydrology and Management (FEHM) research group at the UB. Other participating experts are Anna Papadopoulou (Doñana Biological Station, CSIC), Sylvain Dolédec (University of Lyon, France), and Alfried Vogler (Natural History Museum - Imperial College London, United Kingdom).

Age of lineage and functional diversity

Macroecology is the field of ecology that studies global patterns in biodiversity, such as the decrease in species richness from tropical areas towards the poles, or how this variety declines as the elevation of a mountain rises. In this study, researchers analysed the tree of life of European aquatic macroinvertebrates to determine when these lineages colonized freshwater ecosystems from terrestrial or marine ancestors. For instance, it is well established that lineages such as dragonflies colonized continental freshwaters before others, such as beetles or mosquitoes. The next step was to relate the age of a lineage to the functional diversity it currently has. "To understand global biodiversity patterns and the processes that created them, it is important to know what these species do -breathe, eat, breed- and where they live -elevation, pH, temperature, amount of oxygen and organic matter of the habitat-, which is known as functional diversity", says Cesc Múrria, member of the Department of Evolutionary Biology, Ecology and Environmental Sciences and the FEHM.

The youngest lineages occupy less-used functional space

To relate evolutionary age and functional diversity, the researchers gathered ecological data, published in previous studies, on about 6,600 species of aquatic macroinvertebrates. The results support the hypothesis that the oldest lineages have greater functional diversity than young ones, and they also show how this diversification occurred. "Our results show that young lineages occupy a functional space that was not used before by other lineages, such as salty environments, where we cannot find old lineages. This diversification would occur because older lineages colonized continental waters with no competitors to limit their functional space. Therefore, as other lineages appeared and occupied functional space, the new ones would evolve to use ecological spaces not used before, and they would do fewer things and live in more particular habitats", says Cesc Múrria.

A pioneer research in evolutionary studies

This research is among the first in the field of evolution to determine how the lineages present in a new habitat can condition the functional diversity of lineages that colonize the habitat later. "We offer a new perspective for evolutionary studies, which have to consider the ecology of species and not only the number of species within different lineages. Although it seems obvious, since the origin of species depends on what the species do, this combined ecological and evolutionary view is rare in studies that analyse diversity patterns at large temporal and spatial scales. This implication goes further than the study of aquatic organisms and can be applied to the whole biota", adds Cesc Múrria.

"The new study is a step forward to a better understanding of the evolutionary and ecological history of rivers, since the study mixes three research fields that have been worked on separately: phylogeny, functional ecology and evolution", conclude the researchers.

Credit: 
University of Barcelona

New light shed on relationship between calorie-burning fat and muscle function

image: Dr. Rosen studies specific proteins that regulate genes, called transcription factors, focusing on those that are critical for adipose function and metabolism.

Image: 
Beth Israel Deaconess Medical Center

BOSTON - Abundant in human babies and small mammals, brown adipose tissue (BAT), or brown fat, was only recently discovered in human adults, and its role remains unclear. Known to play an integral part in generating body heat and burning stored energy, its presence is linked to lower body weight and improved blood sugar levels, making it an attractive research target for potential treatments for diabetes, obesity and other metabolic diseases.

Now, endocrinologists at Beth Israel Deaconess Medical Center (BIDMC) have shown for the first time that brown fat can exert control over skeletal muscle function. Alterations to the brown adipose tissue in mice resulted in a significant and consistent reduction in exercise performance. The findings, published today in the journal Cell Metabolism, shed new light on the biology of the enigmatic brown adipose tissue as well as open the door to potential new therapies for certain metabolic and muscular diseases.

Evan Rosen, MD, PhD, Chief of the Division of Endocrinology, Diabetes and Metabolism at BIDMC, studies specific proteins that regulate genes, called transcription factors, focusing on those that are critical for adipose function and metabolism. Much of his recent work has centered on a factor called IRF4, which is usually thought of as a protein that regulates the immune system. A decade ago, Rosen and his team discovered that IRF4 is vital for adipose tissue function, and in 2014 they further identified IRF4 as a key regulator of energy burning and heat production (collectively known as thermogenesis) in brown fat.

"We knew that muscles could regulate brown fat - exercising increases brown fat - but it was unknown whether brown fat affected muscle function," says Rosen. "In this new study, we closed the loop and demonstrated that the loss of IRF4 in brown fat tissue reduces exercise capacity in rodents, affecting cellular function and causing physiological abnormalities in the muscle tissue itself."

Rosen - with colleagues including Xingxing Kong, PhD, formerly a post-doctoral fellow in Rosen's lab and now a faculty member at the David Geffen School of Medicine at the University of California, Los Angeles - compared the exercise capacity of mice designed to lack IRF4 in their brown adipose tissue with that of normal - or "wild-type" - mice. While the altered mice looked and acted normal, they consistently demonstrated diminished exercise capacity compared to the wild-type animals.

Mice with brown fat lacking IRF4 performed about 14 percent worse on a slow-speed treadmill and about 38 percent worse at higher speeds. When Rosen and colleagues compared the rodents' muscles, they saw distinct abnormalities in the thigh muscles of mice with altered brown fat. In some of the muscle cells, the structures that allow muscles to contract, the sarcoplasmic reticulum, were abnormally large, coiled up like a garden hose inside each muscle cell - a characteristic reminiscent of a rare muscle disease in humans called tubular aggregate myopathy.

"It left the muscle cells unable to process energy well, but the key finding here is that by altering brown fat tissue, we altered the muscle inadvertently," said Rosen.

To tease out the molecular pathway by which IRF4 exerts this control over muscle, the team of scientists looked at the genes expressed, not in the muscle tissue, but in the altered brown fat. They found that a hormone called myostatin - well-known to suppress muscle function and normally silenced in brown adipose tissue - was switched on in the absence of IRF4.

The team also showed that placing normal mice at warm temperatures, which naturally shuts off IRF4, had the same effect as knocking out the gene; these mice also showed a reduced ability to exercise. This was associated with higher myostatin levels, and surgically removing the BAT (and thus lowering the myostatin levels) in these warm mice restored their ability to exercise normally.

In addition to furthering researchers' basic understanding of brown adipose tissue, the findings may also open the door to new therapies for people with certain muscle diseases. What's more, many athletes suspect training in the cold can improve performance. Rosen suggests his team's findings support this idea.

Credit: 
Beth Israel Deaconess Medical Center

Big-data study pinpoints more than 150 genes associated with atrial fibrillation

ANN ARBOR, Mich. - Drawing on genomic data from more than one million individuals, researchers from the University of Michigan have led a large collaborative effort to discover as-yet unknown genetic risk factors for atrial fibrillation: an irregular, often rapid heart rate affecting millions of Americans and more than 30 million people worldwide. Atrial fibrillation increases one's risk for blood clots, stroke, heart failure, and death.

By performing one large genome-wide association study (GWAS) comprising data from six smaller studies, scientists identified 151 candidate genes for atrial fibrillation. Many of the genes identified are important for fetal development of the heart, implying either that genetic variation predisposes the heart to atrial fibrillation during fetal development, or that the genetic variation could reactivate genes in the adult heart that normally only function during fetal development.

The results of the study have been published in Nature Genetics ("Biobank-driven genomic discovery yields new insight into atrial fibrillation biology").

The study's deeper understanding of the biological processes underlying atrial fibrillation could lead to better treatment and prevention. "We are hopeful that additional molecular biology experiments will determine how to create sustained regular heart rhythms by studying the genes we and others have identified," said study author Cristen Willer, Ph.D., associate professor at Michigan Medicine and head of U-M's Willer lab.

If atrial fibrillation is detected early, it is possible to prevent complications such as stroke and heart failure. Current treatment options for atrial fibrillation are limited, however, include serious side effects, and are rarely curative. The genetic variants uncovered in this study could potentially improve both early detection and treatment. By identifying genes important for atrial fibrillation, researchers constructed a risk score to help identify high-risk individuals and monitor them accordingly, which "may have important implications for precision health and prevention of cardiovascular disease," said Willer.
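The risk score mentioned above is, in the standard GWAS approach, a weighted sum of an individual's risk-allele counts. A minimal sketch (hypothetical variants and effect sizes, not figures from the study) of how such a score is typically computed:

```python
# Illustrative sketch of a genetic (polygenic) risk score: a weighted sum
# of risk-allele counts, with weights taken from GWAS effect sizes
# (log odds ratios). All numbers below are hypothetical.

def genetic_risk_score(allele_counts, effect_sizes):
    """allele_counts: risk-allele count (0, 1 or 2) per variant;
    effect_sizes: per-variant effect size (log odds ratio) from the GWAS."""
    return sum(n * beta for n, beta in zip(allele_counts, effect_sizes))

# Three hypothetical variants for one individual:
counts = [2, 0, 1]          # copies of the risk allele at each variant
betas = [0.10, 0.25, 0.05]  # hypothetical GWAS effect sizes
score = genetic_risk_score(counts, betas)
print(round(score, 2))  # 0.25
```

In practice, such scores are built from many variants (here, the 151 candidate loci and their neighbors) and calibrated against population distributions to flag high-risk individuals.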

Of the 151 genes identified as important for atrial fibrillation, 32 are likely to interact with existing drugs not necessarily developed to treat atrial fibrillation. This study lays the groundwork for follow-up experiments to test whether any of the identified drugs could prevent or terminate atrial fibrillation.

This study used data from multiple biobanks from around the world, including U-M's Michigan Genomics Initiative (MGI), the UK Biobank, Norway's HUNT study, DiscovEHR, Iceland's deCODE Genetics, and the AFGen Consortium. This big-data, precision-health approach yielded insights that may not have been discoverable using a smaller dataset.

"Discovery of novel genetic variants and genes important for atrial fibrillation was only possible because we combined information from multiple biobanks from around the world in a large collaborative effort," said first author Jonas Bille Nielsen, M.D., Ph.D., a cardiovascular researcher at U-M. "Combining the advantages of each of the data sources helped us to better understand the biology underlying atrial fibrillation [and]... revealed the risk score we constructed is very specific for atrial fibrillation. By combining multiple independent data sources, we also found that people with early-onset atrial fibrillation have a higher genetic burden of atrial fibrillation compared with people who develop the disease later in life."

The study's researchers acknowledge that their findings, while significant, need further confirmation, but are hopeful that this work will form the foundation for future experiments to understand the biology behind atrial fibrillation, and to identify tailored, more effective treatment options for the condition.

"As scientists, we need to continue to focus on the goal--helping patients with cardiovascular disease--and collaborate toward that goal," said Willer. "That's exactly what happened here, with the additional benefit of helping train the next generation of cardiovascular geneticists, like first author Jonas Nielsen."

Credit: 
Michigan Medicine - University of Michigan

Measuring the climate impact of forest management -- a groundbreaking approach

A JRC-led group of forestry research experts has developed a rigorous new fact-based carbon accounting system that reflects how forest management practices can help mitigate greenhouse gas (GHG) emissions.

This new system has been recently adopted by the EU as the scientific basis for integrating the land-use, land-use change and forestry (LULUCF) sector in its climate strategy.

Forests can play a big role in mitigating greenhouse gases

While they are growing, trees absorb carbon dioxide from the atmosphere through photosynthesis and store it as carbon in their wood.

Through proper forest management, trees acting as "carbon sinks" can have a significant impact on carbon reduction.

Conversely, deforestation can make them "carbon sources", exacerbating global warming.

Therefore, sustainable forest management can help mitigate GHG emissions.

Under the Paris Climate Agreement of 2015, the EU has pledged to cut GHG emissions by at least 40% by 2030.

EU forests absorb the equivalent of nearly 10% of all EU GHG emissions each year.

Conserving and enhancing this sink, while using wood products as substitutes for more carbon-intensive energy and materials, could play an important role in reaching this target.

However, credibly measuring and reporting on the effect of forest management on emission reductions (or absorption increases) has proven to be difficult.

The use of projected 'forest reference levels', as implemented under the Kyoto Protocol (2013-2020), is controversial as it includes the assumed (and therefore unverifiable) effect of future policy impacts, which can lead to counterfactual scenarios that include inflated future harvest figures or fail to account for future increased emissions.

Calculating GHG mitigation by forests based on factual evidence rather than forecasts

The new science-based approach for credible accounting of mitigation in managed forests described in a recent article sets reference levels based on documented historical forest management practices rather than on projected future policy impacts.

In other words, it is based on factual evidence (what has actually happened) rather than projected future outcomes (which may never materialise).

Applied to 26 EU Member States using the Canadian Forest Service's Carbon Budget Model, the approach found that forests actually absorbed more carbon dioxide in the years 2013-16 than was accounted for under the current Kyoto Protocol method.

This was because the emission forecasts were based on projected increases in forest harvesting that never actually occurred.
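The accounting logic behind this difference can be sketched in a few lines. A minimal illustration (all figures hypothetical, not from the JRC analysis): a country's credit or debit is its measured net sink minus a reference level, and the choice of reference level drives the result.

```python
# Minimal sketch of forest carbon accounting: the accounted outcome is the
# actual sink minus a reference level. A projected (Kyoto-style) reference
# that assumes a harvest increase which never occurred understates the
# sink; a reference based on documented historical management does not.
# All numbers are hypothetical.

def accounting_outcome(actual_sink, reference_level):
    """Positive result = creditable removals beyond the reference level.
    Sinks are expressed as positive removals (e.g., Mt CO2/yr)."""
    return actual_sink - reference_level

actual = 100.0               # hypothetical measured sink, Mt CO2/yr
projected_reference = 110.0  # assumed a harvest increase that never happened
historical_reference = 95.0  # based on documented past practice

print(accounting_outcome(actual, projected_reference))   # -10.0 (debit)
print(accounting_outcome(actual, historical_reference))  # 5.0 (credit)
```

The same measured sink thus yields a debit under an inflated projected baseline and a credit under the historical baseline, which is the discrepancy the new system is designed to remove.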

Based on the new system, EU forest harvest levels are expected to increase by 12% by 2030, but at a slower rate than in forecasts based on the Kyoto Protocol method.

This is because the system takes age-related dynamics into account but disregards future (unrealised and unverifiable) impacts of policy on harvest volumes.

The new approach leaves countries free to manage forests as they wish, but requires that the atmospheric impact of changes in management relative to a historical period is fully reflected in the accounts.

This ensures the comparability of forest accounting with other sectors, such as the energy sector.

This offers a credible solution to the debate on how to account for forest sinks at the country level, and helps improve transparency and scientific credibility within the Paris Agreement.

Involving land use and forestry in reducing GHG emissions

A new EU Regulation on Land Use, Land Use Change and Forestry (LULUCF) published on 19 June 2018 involves, for the first time, the LULUCF sector in cutting GHG emissions.

The Regulation sets out the commitments and rules for the inclusion of GHG emissions and removals from the LULUCF sector in the framework of the EU's 2030 climate and energy targets.

According to the new Regulation, Member States may use certain mitigation actions in forestry and agricultural land uses to meet their climate targets.

This is in line with the Paris Agreement, which points to the critical role of the LULUCF sector in reaching long-term climate mitigation objectives.

The JRC contributed significantly to this legislative proposal, both during its design and the discussions with Member States and the European Parliament.

In particular, the JRC was heavily involved in the most complex and debated issue in the entire legislation - the rules on how to account for the climate impact of forest management.

A recently published Technical Guidance for implementing the forest reference levels, coordinated by IIASA, extends and further elaborates the scientific principles outlined in the JRC work.

Credit: 
European Commission Joint Research Centre

Insulin resistance under-diagnosed in non-diabetics with Parkinson's disease

Amsterdam, NL, August 2, 2018 - Nearly 60% of non-diabetic patients with Parkinson's disease (PD) may be insulin resistant, despite having normal blood sugar, report scientists in the Journal of Parkinson's Disease. Their findings suggest that insulin resistance in PD is a common and largely undetected problem, especially in patients who are overweight.

Reduced glucose tolerance has long been recognized as a potential risk factor for PD, and there is increasing scrutiny of insulin resistance as a pathologic driver of neurodegeneration. The key link between PD and type 2 diabetes (DM2) appears to be insulin resistance, a potentially reversible condition that not only predisposes individuals to DM2 but is also associated with neurodegeneration. However, the prevalence of insulin resistance in PD is unknown.

"There is growing interest in the study of this relationship and the use of diabetes medications in the treatment of PD. However, there is little information regarding the prevalence of insulin resistance in PD," explained lead investigator Michele Tagliati, MD, from the Department of Neurology, Cedar-Sinai Medical Center, Los Angeles, CA, USA. "This study is the first to address this question in a large population of non-diabetic patients."

Investigators tested 154 non-diabetic PD patients for fasting blood sugar and insulin to assess the prevalence of insulin resistance and to correlate insulin resistance with other metabolic indicators, motor and non-motor symptoms of PD, and quality of life. Based on a widely used formula, known as the HOMA index, they determined how many of these patients had a reduced response to their own insulin. Among other measurements, the patients' weight and height were recorded and their movement and cognitive performance were measured.
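The HOMA index mentioned above is a simple arithmetic formula. A short sketch of the standard HOMA-IR calculation (the cutoff for calling a patient insulin resistant varies between studies, so the threshold below is illustrative only):

```python
# Sketch of the HOMA-IR calculation (the "HOMA index" above). The standard
# formula uses fasting glucose in mg/dL and fasting insulin in microU/mL.

def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR = (glucose [mg/dL] * insulin [microU/mL]) / 405."""
    return (fasting_glucose_mg_dl * fasting_insulin_uU_ml) / 405.0

# Example: normal fasting glucose (90 mg/dL) but elevated fasting insulin
# (12 microU/mL) -- the pattern the study describes in undiagnosed patients.
score = homa_ir(90, 12)
print(round(score, 2))  # 2.67

IR_CUTOFF = 2.0  # illustrative threshold; published cutoffs vary (~1.8-2.7)
print(score > IR_CUTOFF)  # True
```

This illustrates how a patient can have entirely normal fasting glucose yet still register as insulin resistant once fasting insulin is factored in.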

Results showed that nearly 60% of patients (58.4%) had undiagnosed insulin resistance, despite normal fasting glucose and, in many cases, normal hemoglobin A1c (HbA1c), a test regularly performed for type 1 and type 2 diabetes. The data confirmed previous findings that insulin resistance is more than twice as common in obese as in lean individuals, but the investigators also found a substantial percentage (41%) of lean PD patients with insulin resistance. They found no correlation between insulin resistance and cognitive decline.

The potential impact of this study is two-fold. First, weight gain and obesity are a major public health challenge, and insulin resistance appears linked to body weight; these findings could lead to increased screening of PD patients to detect and correct the condition.

The second and more specific impact is that identifying patients with insulin resistance could allow for personalized medicine, whereby PD patients with insulin resistance may be treated with medications targeted to reverse the condition. Research on the use of diabetic medications for PD, such as GLP-1 agonists like exenatide and liraglutide, is ongoing.

"Now that, for the first time, we understand how common insulin resistance is in non-diabetic patients with PD, we can begin to address this public health challenge," commented Dr. Tagliati. "This increases the importance of finding new treatments and lifestyle interventions that can address this metabolic dysfunction with multiple implications, from diabetes to neurodegenerative disorders like PD and Alzheimer's disease."

Credit: 
IOS Press

New research shows juvenile diversion programs work, curbing reoffending into adulthood

image: This is Jeff Kretschmar.

Image: 
CWRU

Juveniles who complete diversion programs for their crimes are less likely to continue their criminal activity as adults, according to new research from Case Western Reserve University.

The researchers got a rare opportunity to examine early adulthood recidivism for juvenile justice-involved youth with behavioral health issues who participated in a diversion program.

The Montgomery County court system in southwest Ohio was able to provide both juvenile and early adulthood data to researchers from Case Western Reserve's Jack, Joseph and Morton Mandel School of Applied Social Sciences.

Their conclusion: youth diversion programs work.

"We examined data from Ohio's Behavioral Health Juvenile Justice (BHJJ) Initiative, a diversion program for juvenile justice-involved youth with behavioral health issues," said Jeff Kretschmar, research associate professor and managing director of the Begun Center for Violence Prevention Research and Education.

"When we evaluate diversion programs like this, we typically only have access to juvenile records. We don't know what happens when kids age out of the juvenile system." he said. "We wanted to know. The data out of Dayton (Montgomery County) suggests the effects of juvenile diversion programs extend to early adulthood."

Three groups were examined for the research: youth appropriate for diversion programs but who did not participate; youth who participated but did not complete treatment; and youth who successfully completed treatment.

Highlights from the research show that compared to the other groups, youths who successfully completed the juvenile diversion program had lower odds of reoffending as young adults, with fewer young-adult offenses.

Next, Kretschmar said researchers will attempt to gather similar data from other program sites in an attempt to replicate these findings. "You can imagine the possibilities additional data can bring," Kretschmar said. "With more data, from all over the state, we could see what treatments work best, for whom, and why."

The work was done with colleagues Fredrick Butcher and Krystel Tossone, research associates from the Begun Center, and Barbara Marsh, from the Board of Public Health of Dayton Montgomery County.

Credit: 
Case Western Reserve University

Clothing, furniture also to blame for ocean and freshwater pollution

image: a, Top five continent-level displacements of marine (top, kt N eq.) and freshwater (bottom, kt P eq.) eutrophication associated with trade satisfying food and non-food demand. b, Global trade marine (top, Mt N eq.) and freshwater (bottom, kt P eq.) footprints (FPs) over time based on production sector and consumed product type. Arrows in a represent the gross flow of embodied impacts that occur in the country of origin (start of arrow) for the consuming country (point of arrow). Grey shading differentiates regions.

Image: 
Helen Hamilton/Nature Sustainability

Think summer holidays and you'll likely call up images of a beautiful beach or a glittering blue lake. But more and more lakes, rivers and coastal areas are plagued by an oversupply of nutrients that causes algae to grow at an explosive rate, which can eventually lead to water bodies that can't support aquatic life.

Scientists call this type of water pollution eutrophication, and it is an enormous problem worldwide: There are more than 400 marine 'dead zones' caused by over-fertilization, covering an estimated 245,000 km², an area six times the size of Switzerland.

In some water bodies, eutrophication causes huge fish kills and toxic blue-green algae blooms, which affect food supply, biodiversity and your favorite swimming spot.

Governments around the globe have battled eutrophication by working with farmers to control nutrient-laden runoff from fields and feedlots. But there's more to the picture, a new study published in Nature Sustainability shows.

Using a detailed modeling tool called MRIO, a team of researchers identified important but often overlooked sources of water pollution: clothing and other manufactured products and services.

When they did their analysis, the team found that the overall demand for non-food products in 2011 accounted for more than one-third of the nutrients causing eutrophication in both marine and freshwater systems worldwide. This was a 28 percent increase compared to 2000.

"Normally we think of food production as being the culprit behind eutrophication. However, if we're trying to fully understand and control eutrophication, ignoring the contributions from other consumer products such as clothing and furniture means that we're only addressing part of the cause of the pollution," said Helen Hamilton, a postdoc in the Norwegian University of Science and Technology's Industrial Ecology Programme, and first author of the paper. "We need to look at the whole picture to address the whole problem."

Wealthier world, more pollution

Agriculture will most likely always be the most important cause of eutrophication, the researchers said. But as countries develop and people become richer, the amount of money that is spent on food relative to the total GDP decreases.

With increased wealth, people have the opportunity to spend their extra cash on products that can also depend on agriculture in their supply chains, such as textiles, clothing and furniture.

A second challenge with goods and services is that they can often have long, complex supply chains across a number of countries before reaching the consumer, the researchers said.

"For example, when we buy a shirt that was made in China, it is China and not the consumer that has to deal with the pollution related to producing it. All traded goods have this problem: the place of production and, thus, pollution is often far removed from the consumers," Hamilton said. "This makes it difficult to tackle pollution because the relevant players, such as farmers, policy makers and consumers, are spread across several countries."

All those reasons, the researchers say, make it even more important to know how much goods and services contribute to eutrophication worldwide.

"From our work, we know that non-food consumption is growing over time and as people get richer. It is, therefore, increasingly important to consider the consumption of clothing, textiles and furniture in our strategies for solving this major ecological problem," she said.

Nitrogen and phosphorus most important

Fertilizer typically contains a mix of nitrogen, phosphorus and potassium, all of which are vital to plant growth. But when excess fertilizer reaches water bodies, it's mainly the nitrogen and phosphorus that matter in feeding algal blooms.

The production of non-food goods, such as clothing, can involve the release of nutrients directly, as when a farmer grows cotton or linen for the fabric to make the clothing. There are also more indirect sources, such as when electricity or another energy source is used to power the factories where the clothing is made. That can release NOx, oxides of nitrogen, as air pollution which then can be absorbed by the oceans and add to the nutrient load.

Knowing the importance of these nutrients and how they can be released in any number of different steps during production gives researchers the ability to nail down when and in which stage of production the pollution occurs.

Eutrophication footprints

By looking at how much nitrogen and phosphorous are released along the entire global supply chain for the product, they can then figure out how much the production of different goods and services contributes to eutrophication.

Using their MRIO (multi-region input-output) method, the researchers calculate country-specific "eutrophication footprints", which are simply the sum of all the pollution that occurs worldwide due to a country's consumption.

This includes both the pollution that occurs within the country's own borders and the pollution that is generated in other parts of the world due to the production of imported goods.
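The footprint accounting described above can be sketched numerically. In input-output analysis, total production needed to satisfy a country's final demand is obtained from the Leontief inverse, and multiplying by a per-sector pollution intensity yields the consumption-based footprint. The following is a minimal illustration only, with entirely hypothetical numbers (two regions, one sector), not data from the study:

```python
import numpy as np

# Illustrative two-region MRIO example; all numbers are hypothetical.
# A: technical coefficients (inputs needed per unit of output),
# f: nutrient-release intensity per unit of output (e.g. kg N eq. per $),
# y: the consuming country's final demand, split by producing region.
A = np.array([[0.10, 0.20],
              [0.05, 0.10]])
f = np.array([0.3, 0.8])      # region 2 pollutes more per unit of output
y = np.array([100.0, 40.0])   # demand met domestically vs. by imports

# Total output required to satisfy demand: x = (I - A)^-1 y
leontief = np.linalg.inv(np.eye(2) - A)
x = leontief @ y

# Pollution by producing region; the total is the consumption-based
# ("footprint") figure, and the import share is the displaced pollution.
footprint_by_region = f * x
total_footprint = footprint_by_region.sum()
```

In this toy setup the consuming country's footprint includes pollution generated abroad (the second entry of `footprint_by_region`), mirroring the displacement effect the researchers describe.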

A good example of this is the EU, Hamilton says.

"Our results show that the vast majority of all eutrophication related to the EU's non-food consumption occurs in other regions," she said, a phenomenon that researchers call displacement.

"In other words, the EU is generating an enormous amount of pollution in other countries by consuming imported products without having to deal with the consequences," Hamilton said.

The researchers found that the EU drives the largest global non-food eutrophication displacements, to the Asia-Pacific region for marine eutrophication and to Africa for freshwater eutrophication.

US, China play a major role in overall eutrophication

Not surprisingly, the US and China have some of the biggest eutrophication footprints, the researchers found -- although most of this pollution occurs within their own borders due to the high consumption of domestic goods, both food and non-food.

China had the largest non-food eutrophication footprint for marine ecosystems. The country's total marine eutrophication footprint was 8.6 megatonnes of nitrogen equivalents, with fully 3 megatonnes of this attributable to the consumption of both imported and domestically produced non-food goods.

"This was also double China's 2000 non-food marine eutrophication footprint, which really exemplifies the recent boom in the Chinese economy," Hamilton said.

The researchers also found a similar trend in China's food-related marine eutrophication footprint, which increased by over 25% from 2000 to peak at 5.4 megatonnes of nitrogen equivalents in 2011. That's the highest country-level food footprint, Hamilton said.

"It was also interesting to look at food-related eutrophication impacts to find trends there as well. In China, population growth combined with changes in diet have certainly contributed towards making them the world leader in food-related eutrophication," says Hamilton.

However, when considering total eutrophication, the sum of both food- and non-food-related eutrophication, the U.S. takes the lead. In 2011, the U.S. was the largest overall country-level contributor for both marine and freshwater eutrophication, with footprints nearly triple its 2000 values, which highlights how much U.S. consumption has grown over time.

More than one-third of eutrophication due to clothing, other products

When they looked at the big picture, the researchers found that clothing, goods for shelter, services and other manufactured products accounted for 35% of global marine eutrophication and 38% of the global freshwater eutrophication footprints in 2011, up from 31 and 33%, respectively, in 2000.

"By comparison, the global food footprints only modestly increased by roughly 10% from 2000 to 2011 values," Hamilton said.

In the end, from a production standpoint, agriculture is the most important contributor to the problem, accounting for 84% of the total footprints for both marine and freshwater eutrophication. But the researchers pointed out that approximately one-quarter of these agricultural impacts in 2011 were due to non-food consumption.

Another important aspect about non-food consumption is that, compared to food, it is also significantly more sensitive to changes in wealth and is more likely to be traded across borders.

"Simply put, there are natural limits to how much people can eat. This means that as the population gets richer, diets and food consumption might change a bit, but where we see the biggest increases is with buying other products such as cars, clothing and furniture," Hamilton said. "These are also the products that are easiest to trade around the world because, unlike food, they don't have an expiration date. Therefore, we see much higher pollution displacement with non-food as compared to food."

Wealthy countries can drive improvements in developing countries

As economies develop, this points to the need for trade agreements and policies to consider the displacement of ecosystem impacts, the researchers said.

And while the EU has developed frameworks and strategies for tackling eutrophication within Europe, for example, there aren't many policies that integrate international supply chains for addressing eutrophication abroad.

"Countries that are responsible for the largest footprints could set consumption-based targets, such as a 40% reduction in the EU's global eutrophication footprint," Hamilton said. "That could help increase the transfer of technology or skills, such as improving fertilizer efficiencies or animal waste management, in producing countries."

It also provides consumers in wealthier countries a way to drive improved environmental policies in developing countries, the researchers said, since wealthy regions can more easily afford the resources needed to support the implementation of these policies in developing countries.

Credit: 
Norwegian University of Science and Technology

How cues drive our behavior

MINNEAPOLIS, MN - August 2, 2018 - Do dopamine neurons have a role in causing cues in our environment to acquire value? And, if so, do different groups of dopamine neurons serve different functions within this process?

Those are the questions researchers at the University of Minnesota Medical School are looking to answer.

Recent research published in Nature Neuroscience by University of Minnesota Medical School neuroscientist Benjamin Saunders, PhD, uses a Pavlovian conditioning model to test whether turning on a light - a simple cue - just before dopamine neurons were activated could motivate action. In Pavlov's classic experiments, the ringing of a bell was paired with a tasty steak until, over time, the dog drooled at the sound of the bell with or without a steak. In this research, however, there was no "real" reward like food or water, allowing the researchers to isolate the function of dopamine neuron activity.

"We wanted to know if dopamine neurons are actually directly responsible for assigning a value to these transient environmental cues, like signs," said Saunders, who conducted some of his research as a postdoctoral fellow in the laboratory of Patricia Janak, PhD, at Johns Hopkins University.

Dopamine neurons are the cells in the brain that turn on when we experience a reward. They are also the neurons that degenerate in Parkinson's disease.

"We learned that dopamine neurons are one way our brains give the cues around us meaning," said Saunders. "The activity of dopamine neurons alone - even in the absence of food, drugs, or other innately rewarding substances - can imbue cues with value, giving them the ability to motivate actions."

To answer the second core question, the researchers targeted specific groups of dopamine neurons - those located in the substantia nigra pars compacta (SNc) and those located in the ventral tegmental area (VTA). These two types of neurons have historically been studied in different disease research fields - SNc neurons in Parkinson's disease, and VTA neurons in addiction studies.

Scientists learned that cues predicting activation of the two types of neurons drove very different responses - those predicting SNc activation led to a sort of "get up and go" response of invigorated rapid movement. The cue predicting VTA neuron activation, however, became enticing on its own, driving approach to the cue's location, a sort of "where do I go?" response.

"Our results reveal parallel motivational roles for dopamine neurons in response to cues. In a real world situation, both forms of motivation are critical," said Saunders. "You have to be motivated to move around and behave, and you have to be motivated to go to the specific location of things you want and need."

These results provide important understanding of the function of dopamine neurons related to motivations triggered by environmental cues. And this work contributes to the understanding of relapse for those struggling with addictions.

"If a cue - a sign, an alley, a favorite bar - takes on this powerful motivational value, it becomes a difficult-to-resist trigger for relapse," said Saunders. "We know dopamine is involved, but an essential goal for future studies is to understand how normal, healthy cue-triggered motivation differs from dysfunctional motivation that occurs in humans with addiction and related diseases."

Credit: 
University of Minnesota Medical School

Disease threatening the most plentiful starfish in Antarctica discovered

image: This disease attacks the tissues of the epidermis and causes areas of discoloration (white spots), ulcerations and inflammation of tissues as deep as the subepithelial level.

Image: 
UB-IRBio

A study led by experts from the University of Barcelona's Faculty of Biology and Institute for Research on Biodiversity (IRBio) has identified a disease that is affecting the starfish Odontaster validus, one of the most common species on the Antarctic sea floor. The disease, which is the first to be described in an echinoderm in Antarctica's marine environment, has afflicted up to 10% of the populations of the species, which is the most important benthic predator in the coastal communities of Deception Island and other marine regions in Antarctic latitudes.

The new study, which appears in the journal Scientific Reports, is the work of experts Conxita Àvila and Carlos Angulo-Preckler (IRBio, a UB research institute), Laura Núñez-Pons (Anton Dohrn Zoological Station of Naples, Italy), Thierry M. Work (United States Geological Survey) and Juan Moles (Harvard University, US).

Surviving in an extreme environment

Low temperatures, ocean currents, the erosion caused by icebergs, seasonal changes in the light regime and access to food present extreme conditions that affect marine ecosystems on the Antarctic sea floor. In these extreme habitats, the starfish Odontaster validus is a plentiful species in marine substrates across a wide range of depths.

The new study expands our knowledge of the vulnerability of Antarctic benthos communities--in this case, to new diseases--and it reveals that such an emblematic starfish on the Antarctic sea floor could be under threat from a disease "that attacks the tissues of the epidermis and causes areas of discoloration (white spots), ulcerations and inflammation of tissues as deep as the subepithelial level," according to Professor Conxita Àvila. Àvila, who teaches in the UB's Department of Evolutionary Biology, Ecology and Environmental Sciences and works at the research institute IRBio, is the lead investigator on a number of projects investigating the ecology of marine invertebrate communities in Antarctica, including Bluebio, Distantcom, Ecoquim and Actiquim.

"The disease is not bacterial or fungal in origin," Àvila explains. "Everything points to an infection caused by a virus or mycoplasma, which is a hypothesis that we are now studying in greater depth. We don't know whether there is a direct relationship with the temperature, but it's possible that there is. We've seen that the years with higher percentages of diseased starfish--up to 10% of the population--coincide with the years when temperatures have been extremely high. While the disease has only affected 3% of communities in other years, we don't yet know the reasons for the variations."

A disease affecting the starfish of Deception Island

The disease described in the study in Scientific Reports is the first to affect only a single species of echinoderm. Previous scientific literature had defined a number of diseases in other geographical areas that normally affect more than one species of echinoderm.

"By contrast," Àvila points out, "our study shows that this disease affects only the populations of the species Odontaster validus at Deception Island, indicating that its features are different from earlier descriptions in other marine regions of the globe. We've worked in other areas, such as the Weddell Sea, the Antarctic Peninsula and southward as far as the Rothera Research Station, and we've never seen any starfish, not the Odontaster validus or any other, exhibiting these signs of disease. As a result, it's impossible to confirm whether or not there are affected individuals elsewhere."

According to the experts, the disease could be linked to the density of specimens, which is the most common case with echinoderms in other geographical areas. Apparently, though, the disease is not transmissible between individuals. Nor does it appear to progress rapidly, although metabolic processes in Antarctic organisms are very slow and this could obscure the disease's evolution over time.

Protecting the biodiversity of Antarctic ecosystems

The chief aims of the IRBio team are to gain a deeper understanding of the disease and its effects and, if possible, to fight its spread in Antarctic habitats. If the cause is a pathogenic, viral or mycoplasma effect that is directly related to the rise in global temperatures, the only viable strategy will be to implement all possible measures in the struggle against climate change at the individual and collective levels.

"Antarctica is now suffering from the effects of climate change and the steps that must be taken globally are extremely urgent," Àvila warns. "If a species like Odontaster validus disappears, it will doubtless lead to very significant changes in the fauna composition of these Antarctic communities, disturbing the trophic network and upsetting the natural balance in the marine ecosystem."

"If the disease doesn't lead to the disappearance of the species but only to a reduction in the density of specimens, the effect won't be so drastic. But even so, it would still affect both the remaining fauna and the relationships among organisms in Antarctica's marine ecosystems. If the effects on the species' physiology are minor, that is, if they only cause inactivity, reduced predation and so on, the impact will be the same but on a smaller scale," Àvila notes, before she concludes, "In the long run, we still don't know whether the persistence of the disease in Antarctic habitats might end up affecting other local species too."

Credit: 
University of Barcelona

New link between hypoxia and blood clot risk

image: Rinku Majumder, PhD, Associate Professor of Biochemistry at LSU Health New Orleans School of Medicine

Image: 
LSU Health New Orleans

New Orleans, LA - Research led by Rinku Majumder, PhD, Associate Professor of Biochemistry at LSU Health New Orleans School of Medicine, has found how hypoxia (a low concentration of oxygen) decreases Protein S, a natural anticoagulant, resulting in an increased risk for the development of potentially life-threatening blood clots (thrombosis). Although hypoxia has been associated with an increased risk for thrombosis, this research showed a molecular cause for the first time. The work is published in the current issue of Blood.

"Hypoxia is common in many diseases including cancer, alcoholism, sickle cell anemia, nonalcoholic fatty liver disease and more," notes Dr. Majumder. "Human Protein S (PS) is a natural blood anticoagulant. Although discovered 40 years ago, the exact mechanism of PS's anticoagulant action was deduced only in the last few years. Our earlier work found that PS inhibits a key clotting protein, Factor IXa. We knew that PS deficiency could occur in hypoxia but not why. With this study, our group identified the gene regulatory mechanism by which oxygen concentration controls PS production."

Because Protein S is primarily produced in the liver, the team of researchers cultured human hepatocarcinoma cells under both normal-oxygen and hypoxic conditions and then measured levels of the protein. They found that increasing hypoxia not only reduced PS but also significantly increased a protein whose production is switched on by hypoxia. This suggested that the protein, hypoxia-inducible factor 1, might regulate Protein S, which the researchers confirmed through biochemical and genetic approaches in a mouse model.

The research is included in the journal's "Issue Highlights" featured on the cover and is accompanied by a commentary that calls the discovery "an important contribution to our understanding of the molecular basis of the augmentation of thrombosis by hypoxia."

"This study will open a new direction for targeting hypoxia-mediated thrombotic disorders," Majumder concludes.

Credit: 
Louisiana State University Health Sciences Center

Machine learning links dimensions of mental illness to abnormalities of brain networks

image: Cross clinical diagnostic categories.

Image: 
Penn Medicine

PHILADELPHIA--A new study using machine learning has identified brain-based dimensions of mental health disorders, an advance towards much-needed biomarkers to more accurately diagnose and treat patients. A team at Penn Medicine led by Theodore D. Satterthwaite, MD, an assistant professor in the department of Psychiatry, mapped abnormalities in brain networks to four dimensions of psychopathology: mood, psychosis, fear, and disruptive externalizing behavior. The research is published in Nature Communications this week.

Currently, psychiatry relies on patient reporting and physician observations alone for clinical decision making, while other branches of medicine have incorporated biomarkers to aid in diagnosis, determination of prognosis, and selection of treatment for patients. While previous studies using standard clinical diagnostic categories have found evidence for brain abnormalities, the high level of diversity within disorders and comorbidity between disorders has limited how this kind of research may lead to improvements in clinical care.

"Psychiatry is behind the rest of medicine when it comes to diagnosing illness," said Satterthwaite. "For example, when a patient comes in to see a doctor with most problems, in addition to talking to the patient, the physician will recommend lab tests and imaging studies to help diagnose their condition. Right now, that is not how things work in psychiatry. In most cases, all psychiatric diagnoses rely on just talking to the patient. One of the reasons for this is that we don't understand how abnormalities in the brain lead to psychiatric symptoms. This research effort aims to link mental health issues and their associated brain network abnormalities to psychiatric symptoms using a data-driven approach."

To uncover the brain networks associated with psychiatric disorders, the team studied a large sample of adolescents and young adults (999 participants, ages 8 to 22). All participants completed both functional MRI scans and a comprehensive evaluation of psychiatric symptoms as part of the Philadelphia Neurodevelopmental Cohort (PNC), an effort led by Raquel E. Gur, MD, PhD, professor of Psychiatry, Neurology, and Radiology, that was funded by the National Institute of Mental Health. The brain and symptom data were then jointly analyzed using a machine learning method called sparse canonical correlation analysis.
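Canonical correlation analysis finds paired weightings of two datasets (here, brain connectivity and symptom scores) whose projections are maximally correlated; the sparse variant used in the study additionally penalizes the weights so that each dimension involves only a few features. A minimal, plain (non-sparse) sketch of the core idea, run on simulated stand-in data rather than the PNC data:

```python
import numpy as np

def first_canonical_corr(X, Y, eps=1e-8):
    """First canonical correlation via SVD of the whitened cross-covariance.
    Plain CCA only; the study used a sparse variant of this idea."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx, Syy = Xc.T @ Xc / len(X), Yc.T @ Yc / len(Y)
    Sxy = Xc.T @ Yc / len(X)

    def inv_sqrt(S):
        # Symmetric inverse square root, with a small floor for stability.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(np.maximum(w, eps))) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)[0]

# Simulated stand-ins (NOT the study's data): connectivity features and
# symptom scores that share a single latent dimension, mimicking one
# brain-linked dimension of psychopathology.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
brain = latent @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(200, 50))
symptoms = latent @ rng.normal(size=(1, 10)) + 0.5 * rng.normal(size=(200, 10))

r1 = first_canonical_corr(brain, symptoms)  # high: a strong shared axis exists
```

Because the simulated brain and symptom matrices share a latent factor, the first canonical correlation comes out close to 1; on real data, each such dimension corresponds to one brain-linked symptom pattern.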

This analysis revealed patterns of changes in brain networks that were strongly related to psychiatric symptoms. In particular, the findings highlighted four distinct dimensions of psychopathology - mood, psychosis, fear, and disruptive behavior - all of which were associated with a distinct pattern of abnormal connectivity across the brain.

The researchers found that each brain-guided dimension contained symptoms from several different clinical diagnostic categories. For example, the mood dimension comprised symptoms from three categories, e.g. depression (feeling sad), mania (irritability), and obsessive-compulsive disorder (recurrent thoughts of self-harm). Similarly, the disruptive externalizing behavior dimension was driven primarily by symptoms of both Attention Deficit Hyperactivity Disorder (ADHD) and Oppositional Defiant Disorder (ODD), but also included the irritability item from the depression domain. These findings suggest that when both brain and symptomatic data are taken into consideration, psychiatric symptoms do not neatly fall into established categories. Instead, groups of symptoms emerge from diverse clinical domains to form dimensions that are linked to specific patterns of abnormal connectivity in the brain.

"In addition to these specific brain patterns in each dimension, we also found common brain connectivity abnormalities that are shared across dimensions," said Cedric Xia, a MD-PhD candidate and the paper's lead author. "Specifically, a pair of brain networks called default mode network and frontal-parietal network, whose connections usually grow apart during brain development, become abnormally integrated in all dimensions."

These two brain networks have long intrigued psychiatrists and neuroscientists because of their crucial role in complex mental processes such as self-control, memory, and social interactions. The findings in this study support the theory that many types of psychiatric illness are related to abnormalities of brain development.

The team also examined how psychopathology differed across age and sex. They found that patterns associated with both mood and psychosis became significantly more prominent with age. Additionally, brain connectivity patterns linked to mood and fear were both stronger in female participants than males.

"This study shows that we can start to use the brain to guide our understanding of psychiatric disorders in a way that's fundamentally different than grouping symptoms into clinical diagnostic categories. By moving away from clinical labels developed decades ago, perhaps we can let the biology speak for itself," said Satterthwaite. "Our ultimate hope is that understanding the biology of mental illnesses will allow us to develop better treatments for our patients."

Credit: 
University of Pennsylvania School of Medicine

Metabolomics applications for precision nutrition, formula, & neurodegenerative disorders

image: OMICS: A Journal of Integrative Biology addresses the latest advances at the intersection of postgenomics medicine, biotechnology and global society, including the integration of multi-omics knowledge, data analyses and modeling, and applications of high-throughput approaches to study complex biological and societal problems.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, August 2, 2018--Metabolomics is the latest omics systems science technology with emerging applications towards psychiatry, personalized medicine, and most recently, precision nutrition research. Infant formula, for example, is manufactured to match the molecular composition of human milk. A new study reporting on the comparative lipid profiles of infant formulas and human milk using metabolomics is published in OMICS: A Journal of Integrative Biology, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers.

In the article "Toward Precision Nutrition: Commercial Infant Formulas and Human Milk Compared for Stereospecific Distribution of Fatty Acids Using Metabolomics," Glaucia B. Alcantara, Universidade Federal de Mato Grosso do Sul (Campo Grande, Brazil) and coauthors present new findings on the lipid profiles of several commercially available infant formulas and compare them with human milk. The authors note that the observations can inform optimal design of infant formulas with a view to precision nutrition, and to better approximate the distribution of fatty acids naturally found in human milk, thus contributing to the better nutrition of newborns.

Attesting to the growing importance of metabolomics in medical research, "Metabolic Biomarkers and Neurodegeneration: A Pathway Enrichment Analysis of Alzheimer's Disease, Parkinson's Disease, and Amyotrophic Lateral Sclerosis," an article coauthored by Dilek Kazan, Marmara University (Istanbul, Turkey), examined neurodegenerative diseases such as Alzheimer's disease and Parkinson's disease using data on metabolite-disease associations published over a decade (2006-2016).

A strategic roadmap article on metabolomics and multi-omics science in psychiatry "Toward a Global Roadmap for Precision Medicine in Psychiatry: Challenges and Opportunities" coauthored by Shareefa Dalvie (University of Cape Town, South Africa) has been featured as part of a two-volume special issue on Precision medicine 2.0 edited by Collet Dandara (University of Cape Town) chronicling the past, present and futures of new omics technology innovations in global health.

These articles and special issues highlighting the emerging and recent applications of metabolomics and omics systems sciences in precision nutrition and precision medicine are available free on the OMICS website until September 2, 2018.

"Metabolomics and multi-omics integration with genomics, proteomics and other systems science technologies bring about new insights," says Vural Özdemir, MD, PhD, DABCP, Editor-in-Chief of OMICS, "most notably for preventive medicine, precision nutrition and global health, disease susceptibility, and mechanisms of person-to-person variations in response to food, drugs, vaccines, and other health interventions. Each new omics technology is also an opportunity to think about responsible innovation, and the broad societal contexts in which scientific innovations emerge."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

World experts target guidance on managing dementia symptoms

New research which brings together the views of the world's leading experts has concluded that non-drug approaches should be prioritised in treating agitation in people with Alzheimer's Disease.

The research, published in International Psychogeriatrics and led by the University of Michigan, the University of Exeter and Johns Hopkins University, provided more specific guidance on the management of behavioural and psychological symptoms in people with Alzheimer's Disease.

It gives the most specific and targeted treatment guidance to date for psychosis and agitation. Both symptoms are common in dementia and have a significant impact on individuals, families and carers.

The International Delphi Consensus paper incorporates views of a panel of experts from across the globe, who have both clinical and research expertise. Undertaken as an International Psychogeriatric Association taskforce, it brought together the latest evidence on how best to treat symptoms such as psychosis and agitation, to help get the best treatment for the 40 million people with dementia worldwide.

By ranking available treatments in order of the quality of evidence, the paper provides guidance on the order in which clinicians should prioritize treatments.

For treating agitation in people with dementia, the first four highly ranked treatments were all non-pharmacological approaches.

Assessment and management of underlying causes, educating caregivers, adapting the environment, person-centred care and a tailored activity programme all ranked more highly than any of the pharmacological treatments. The highest ranked pharmacological treatment was the antidepressant citalopram, which came in at number six.

Of note, of the currently used atypical antipsychotic drugs only risperidone reached consensus as a recommended treatment, at number seven in the list.

Helen C. Kales MD, Director of the Program for Positive Aging at the University of Michigan and Research Investigator at the VA Center for Clinical Management Research noted: "This research advocates a significant shift from current practice, recommending that non-pharmacological treatments are a first-line approach for agitation in dementia. Aside from risperidone at number 7 in the list, none of the other atypical antipsychotic drugs were recommended. This is a very welcome change, given the known harms associated with these treatments."

For the treatment of psychosis in people with dementia, including symptoms such as hallucinations and delusions, the panel advocated a thorough assessment and management of underlying causes as the first approach. The atypical antipsychotic risperidone came second, as the only pharmacological treatment with any supporting evidence that it works. This highlights a particular gap in the treatment of psychosis in people with dementia, which is a distressing and disabling symptom, and emphasizes this as a priority area for further research.

Overall, the DICE (Describe, Investigate, Create and Evaluate) therapy approach, which involves identifying triggers, and the use of music were both found to be effective in managing symptoms without prescribing drugs.

Clive Ballard, Professor of Age-Related Diseases at the University of Exeter Medical School, said: "Symptoms such as psychosis and agitation can be particularly distressing and challenging for people with dementia, their carers and their families. Many commonly prescribed medications can cause harm, in some cases significantly increasing risk of stroke or death. We now know that non-drug approaches are the best starting points and can prove effective. This research provides more specific and targeted guidance to support clinicians to give the best possible treatment options."

Credit: 
University of Exeter