Many newborn screening recommendations do not assess key evidence on benefits and harms

Many national recommendations on whether to screen newborn babies for rare conditions do not assess the evidence on the key benefits and harms of screening, warn researchers in a study published by The BMJ today.

Effective screening programmes can save lives, whereas ineffective programmes can do more harm than good, yet decisions about which conditions to screen for vary widely between countries, despite similar populations and healthcare systems.

Reasons for these differences are unclear, but it has been suggested that differences in the evidence review process used to generate policy - in particular the use of systematic reviews - may play a role.

Systematic reviews bring together evidence from existing studies and use statistical methods to summarise the results, to help make evidence-based decisions.

To explore this further, a team of UK researchers assessed whether use of a systematic review affects national decisions on whether to screen for a range of conditions using the newborn blood spot test, which is offered to every baby to detect rare but serious health conditions.

Their analysis included 93 reports that assessed 104 conditions across 14 countries, giving a total of 276 recommendations.

Screening was favoured in 159 (58%) recommendations, not favoured in 98 (36%), and not recommended either way in 19 (7%).

Only 60 (22%) of the recommendations were based on evidence from a systematic review. Use of a systematic review was associated with a reduced probability of screening being recommended (38% v 63%).

Evidence for test accuracy was not considered in 115 (42%) of recommendations, while evidence on the benefits of early detection and the potential harm of overdiagnosis was not considered in 83 (30%) and 211 (76%) of recommendations, respectively.
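As a quick sanity check, the percentages reported above can be reproduced from the raw counts in the article (a minimal sketch; all counts and the total of 276 recommendations come from the article, and the rest is simple arithmetic):

```python
# Counts reported in the article, each out of 276 recommendations
total = 276
counts = {
    "screening favoured": 159,
    "screening not favoured": 98,
    "no recommendation either way": 19,
    "based on a systematic review": 60,
    "test accuracy not considered": 115,
    "benefits of early detection not considered": 83,
    "harm of overdiagnosis not considered": 211,
}

# Print each count with its share of the total, rounded to whole percent
for label, n in counts.items():
    print(f"{label}: {n} ({n / total:.0%})")
```

Running this reproduces the figures quoted in the text (58%, 36%, 7%, 22%, 42%, 30% and 76%).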

The researchers point to some study limitations, the key one being that the use of systematic review methods may have been driven by country-level factors. However, strengths include the large number of documents analysed and the ability to take account of potentially influential factors across different conditions.

"This study showed that many national policy decisions about whether to screen for conditions are being made without systematically reviewing the evidence," say the authors. "Yet it remains essential to make evidence based policy decisions because once screening programmes are started they are difficult to stop."

They call for further research "to understand why policy makers do not employ systematic review methods in their evaluations of evidence" - and they propose more international collaboration to undertake such reviews.

Credit: 
BMJ Group

New research reveals how energy dissipates outside Earth's magnetic field

image: In this visualization, as the supersonic solar wind (yellow haze) flows around the Earth's magnetic field (blue wavy lines), it forms a highly turbulent boundary layer called the 'magnetosheath' (yellow swirling area). A new research paper describes observations of small-scale magnetic reconnection within the magnetosheath, revealing important clues about heating in the sun's outer layers and elsewhere in the universe.

Image: 
NASA/GSFC

Earth's magnetic field provides an invisible but crucial barrier that protects Earth from the solar wind--a stream of charged particles launched from the sun's outer layers. The protective properties of the magnetic field can fail due to a process known as magnetic reconnection, which occurs when two opposing magnetic field lines break and reconnect with each other, dissipating massive amounts of energy and accelerating particles that threaten air traffic and satellite communication systems.

Just outside of Earth's magnetic field, the solar wind's onslaught of electrons and ionized gases creates a turbulent maelstrom of magnetic energy known as the magnetosheath. While magnetic reconnection has been well documented closer to Earth, physicists have sought to determine whether reconnection also happens in this turbulent zone.

A new research paper co-authored by University of Maryland Physics Professor James Drake suggests that the answer to this question is yes. The observations, published in the May 10, 2018 issue of the journal Nature, provide the first evidence of magnetic reconnection occurring at very small spatial scales in the turbulent magnetosheath. However, unlike the reconnection that occurs with the Earth's magnetic field, which involves electrons as well as ions, turbulent reconnection in the magnetosheath involves electrons alone.

"We know that magnetic energy in churning, turbulent systems cascades to smaller and smaller scales. At some point that energy is fully dissipated. The big question is how that happens, and what role magnetic reconnection plays at such small scales," Drake said. "This study shows that reconnection indeed can happen at the electron scale, with no ions involved at all, suggesting that reconnection may help dissipate magnetic energy at very small scales."

By drawing a clearer picture of the physics of magnetic reconnection, the discovery promises to advance scientists' understanding of several open questions in solar physics. For example, electron-scale magnetic reconnection may play a role in heating of the solar corona--an expansive layer of charged particles that surrounds the sun and reaches temperatures hundreds of times higher than the sun's visible surface. This in turn could help explain the physics of the solar wind, as well as the nature of turbulent magnetic systems elsewhere in space.

NASA's Magnetospheric Multiscale (MMS) mission gathered the data for the analysis. Its four identical spacecraft fly in a pyramid formation, separated by as little as 4.5 miles, and image the electrons inside the pyramid once every 30 milliseconds. These highly precise measurements enabled the researchers to capture turbulent, electron-only magnetic reconnection, a phenomenon not previously observed.

"MMS discovered electron magnetic reconnection, a new process much different from the standard magnetic reconnection that happens in calmer areas around Earth," said Tai Phan, a senior fellow in the Space Sciences Laboratory at the University of California, Berkeley, and the lead author of the paper. "This finding helps scientists understand how turbulent magnetic fields dissipate energy throughout the cosmos."

Because turbulent reconnection involves only electrons, it remained hidden from scientists looking for the telltale signature of standard magnetic reconnection: ion jets. Whereas standard reconnection drives broad jets of ions streaming tens of thousands of miles from the reconnection site, turbulent reconnection ejects narrow jets of electrons only a couple of miles wide.

But MMS scientists were able to leverage the design of one instrument, the Fast Plasma Investigation, to create a technique that allowed them to read between the lines and gather extra data points to resolve the jets.

"The key event of the paper happens in 45 milliseconds. This would be one data point with the regular data," said Amy Rager, a graduate student at the Catholic University of America in Washington, D.C., who worked at NASA's Goddard Space Flight Center to develop the technique. "But instead we can get six to seven data points in that region with this method, allowing us to understand what is happening."
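The sampling arithmetic behind that quote can be sketched in a few lines (a back-of-envelope illustration; the 45 millisecond event duration, 30 millisecond cadence and the six-to-seven data points are from the article, and the variable names are my own):

```python
# Back-of-envelope check of the sampling improvement described above
event_duration_ms = 45.0      # duration of the key reconnection event
standard_cadence_ms = 30.0    # MMS's standard electron measurement cadence
new_method_points = 6         # points recoverable with the new technique (6-7 reported)

# At the standard cadence, the event spans only one full sample
standard_points = int(event_duration_ms // standard_cadence_ms)

# The new technique effectively shrinks the spacing between data points
effective_cadence_ms = event_duration_ms / new_method_points

print(f"standard sampling: {standard_points} data point(s) in the event")
print(f"new method: ~{effective_cadence_ms:.1f} ms between data points")
```

One sample in 45 milliseconds becomes a point roughly every 7.5 milliseconds, which is what makes the electron jets resolvable at all.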

With the new method, MMS scientists are hopeful they can comb through existing data sets to find more of these events and other unexpected discoveries as well.

"There were some surprises in the data," Drake said. "Magnetic reconnection occurs when you have two magnetic fields pointing in opposite directions and they annihilate each other. In the present case a large ambient magnetic field survived after annihilation occurred. Honestly, we were surprised that turbulent reconnection at very small scales could occur with this background magnetic field present."

Magnetic reconnection occurs throughout the universe, so whatever scientists learn about it near Earth can be applied to other phenomena. For example, the discovery of turbulent electron reconnection may help scientists understand the role that magnetic reconnection plays in heating the inexplicably hot solar corona--the sun's outer atmosphere--and accelerating the supersonic solar wind. NASA's upcoming Parker Solar Probe mission will travel directly toward the sun in the summer of 2018 to investigate these questions--armed with a new understanding of magnetic reconnection near Earth.

Credit: 
University of Maryland

The weak side of the proton

image: The Q-weak experiment was conducted in Jefferson Lab's Experimental Hall C, and its goal was to very precisely measure the proton's weak charge, a term that quantifies the influence that the weak force can exert on protons. The Q-weak apparatus, shown here, was installed in the hall for the experimental run, which concluded in 2012.

Image: 
DOE's Jefferson Lab

A new result from the Q-weak experiment at the Department of Energy's Thomas Jefferson National Accelerator Facility provides a precision test of the weak force, one of four fundamental forces in nature. This result, published recently in Nature, also constrains possibilities for new particles and forces beyond our present knowledge.

"Precision measurements like this one can act as windows into a world of potential new particles that otherwise might only be observable using extremely high-energy accelerators that are currently beyond the reach of our technical capabilities," said Roger Carlini, a Jefferson Lab scientist and a co-spokesperson for the Q-weak Collaboration.

While the weak force is difficult to observe directly, its influence can be felt in our everyday world. For example, it initiates the chain of reactions that power the sun and it provides a mechanism for radioactive decays that partially heat the Earth's core and that also enable doctors to detect disease inside the body without surgery.

Now, the Q-weak Collaboration has revealed one of the weak force's secrets: the precise strength of its grip on the proton. They did this by measuring the proton's weak charge to high precision, which they probed using the high-quality beams available at the Continuous Electron Beam Accelerator Facility, a DOE Office of Science User Facility.

The proton's weak charge is analogous to its more familiar electric charge, a measure of the influence the proton experiences from the electromagnetic force. These two interactions are closely related in the Standard Model, a highly successful theory that describes the electromagnetic and weak forces as two different aspects of a single force that interacts with subatomic particles.

To measure the proton's weak charge, an intense beam of electrons was directed onto a target containing cold liquid hydrogen, and the electrons scattered from this target were detected in a precise, custom-built measuring apparatus. The key to the Q-weak experiment is that the electrons in the beam were highly polarized - prepared prior to acceleration to be mostly "spinning" in one direction, parallel or anti-parallel to the beam direction. By rapidly reversing the direction of polarization in a controlled manner, the experimenters were able to latch onto the weak interaction's unique violation of parity (akin to mirror symmetry) and isolate its tiny effects to high precision: the scattering rates for the two beam polarization states differed by about 2 parts in 10 million.
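The quantity extracted from such a measurement is a parity-violating asymmetry between the two polarization states. The sketch below illustrates the standard definition; the baseline rate is hypothetical, and only the roughly 2-parts-in-10-million scale comes from the article:

```python
def asymmetry(rate_plus, rate_minus):
    """Parity-violating asymmetry between the two beam polarization states."""
    return (rate_plus - rate_minus) / (rate_plus + rate_minus)

# Hypothetical baseline scattering rate; the ~2e-7 scale matches the
# article's "2 parts in 10 million" rate difference.
r0 = 1.0e6                     # arbitrary scattering rate (events/s)
a_pv = 2.0e-7                  # asymmetry at the reported scale
rate_plus = r0 * (1 + a_pv)    # parallel polarization
rate_minus = r0 * (1 - a_pv)   # anti-parallel polarization

print(f"A = {asymmetry(rate_plus, rate_minus):.1e}")
```

Extracting such a tiny asymmetry is what demands the rapid, controlled reversal of the beam polarization: slow drifts in the apparatus cancel between the two helicity states.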

The proton's weak charge was found to be QWp = 0.0719 ± 0.0045, in excellent agreement with the predictions of the Standard Model, which takes into account all known subatomic particles and the forces that act on them. Because the proton's weak charge is so precisely predicted in this model, the new Q-weak result provides insight into predictions of hitherto unobserved heavy particles, such as those that may be produced by the Large Hadron Collider (LHC) at CERN in Europe or by future high-energy particle accelerators.

"This very challenging experimental result is yet another clue in the world-wide search for new physics beyond our current understanding. There is ample evidence the Standard Model of particle physics provides only an incomplete description of nature's phenomena, but where the breakthrough will come remains elusive," said Timothy J. Hallman, Associate Director for Nuclear Physics of the Department of Energy Office of Science. "Experiments like Q-weak are pressing ever closer to finding the answer."

For example, the Q-weak result has set limits on the possible existence of leptoquarks, which are hypothetical particles that can reverse the identities of two broad classes of very different fundamental particles - turning quarks (the building blocks of nuclear matter) into leptons (electrons and their heavier counterparts) and vice versa.

"After more than a decade of careful work, Q-weak not only informed the Standard Model, it showed that extreme precision can enable moderate-energy experiments to achieve results on par with the largest accelerators available to science," said Anne Kinney, Assistant Director for the Mathematical and Physical Sciences Directorate at the National Science Foundation. "Such precision will be important in the hunt for physics beyond the Standard Model, where new particle effects would likely appear as extremely tiny deviations."

"It's complementary information. So, if they find evidence for new physics in the future at the LHC, we can help identify what it might be, from the limits that we're setting already in this paper," said Greg Smith, Jefferson Lab scientist and Q-weak project manager.

Credit: 
DOE/Thomas Jefferson National Accelerator Facility

York U researcher identifies 15 new species of stealthy cuckoo bees

video: Cuckoo bees sneakily lay their eggs in the nests of other bee species, after which their newly hatched progeny kill the host egg or larva and then feed on the stored pollen. The host, a solitary bee, never knows anything is awry. Nine new species of these clandestine bees have been found hiding in collections and museums across North America by York University PhD candidate Thomas Onuferko, along with another six that had gone unpublished in a decades-old academic thesis.

Image: 
York University

TORONTO, Tuesday, May 8, 2018 - Cuckoo bees sneakily lay their eggs in the nests of other bee species, after which their newly hatched progeny kill the host egg or larva and then feed on the stored pollen. The host, a solitary bee, never knows anything is awry. Nine new species of these clandestine bees have been found hiding in collections and museums across North America by York University PhD candidate Thomas Onuferko, along with another six that had gone unpublished in a decades-old academic thesis.

More closely resembling wasps in appearance, cuckoo bees lack the typical fuzzy look usually attributed to bees, as they don't need hairs to collect pollen for their young. Although not much is known about them, cuckoo bees are named after cuckoo birds, which exhibit the same kleptoparasitic behaviour.

There are now a total of 43 known cuckoo bee species in the genus Epeolus (pronounced ee-pee-oh-lus) in North America, many of which go unnoticed hovering low to the ground in backyards or "sleeping" on leaves, as they don't have nests of their own. They are only 5.5 to 10 mm in length, smaller and rarer than the polyester bees whose nests they invade.

"It may seem surprising to some that in well-researched places like Canada and the United States there is still the potential for the discovery of new species," says Onuferko of York University's Faculty of Science. "People have been aware of a few of the new species that I'm describing, but they've never been formally named. There are a whole bunch of other species, however, that no one knew about."

Part of the reason it's taken so long to identify these new cuckoo bees is that they are small, uncommonly collected and can be difficult to tell apart. Onuferko visited collections across North America and had specimens sent to the Packer Lab at York University for examination.

Many of the newly described cuckoo bees, including one Onuferko named after the well-known British broadcaster and naturalist Sir David Attenborough (Epeolus attenboroughi), possess very short black, white, red and yellow hairs that form attractive patterns.

Onuferko named another cuckoo bee after York University bee expert and thesis adviser Professor Laurence Packer - Epeolus packeri.

Where did the name "Epeolus" come from? Onuferko thinks it's likely a diminutive of Epeus/Epeius, the name of the soldier in Greek mythology credited with devising the Trojan horse stratagem.

All 15 new species are now formally described, which will allow other researchers and bee enthusiasts to keep a lookout for them.

Credit: 
York University

Troubling stats for kids with intellectual disabilities

COLUMBUS, Ohio - By federal law passed in 1975, children with intellectual disabilities are supposed to spend as much time as possible in general education classrooms.

But a new study suggests that progress toward that goal has stalled.

Findings showed that over the past 40 years, between 55 and 73 percent of students with intellectual disabilities have spent most or all of the school day in self-contained classrooms or schools, not with their peers without disabilities.

"Given the legal mandate, it is surprising that such a large proportion of students are consistently placed in restrictive settings," said Matthew Brock, author of the study and assistant professor of special education at The Ohio State University.

The study is the first to look at national trends in education placement for students with intellectual disability - previously called mental retardation - for the entire 40 years since the law was enacted.

"I found historical trends of incremental progress toward less restrictive settings, but no evidence of such progress in recent years," said Brock, who is affiliated with Ohio State's Crane Center for Early Childhood Research and Policy.

The study has been accepted for publication by the American Journal on Intellectual and Developmental Disabilities.

The Individuals with Disabilities Education Improvement Act (as the law is now called) has the aim of educating students with disabilities in what it calls the "least restrictive environment." That means they should be placed in general education classrooms alongside peers without disabilities to the maximum extent appropriate.

Decisions about what is appropriate for each child are made by an Individual Education Program team that includes the child's parents, teachers and others.

Brock used several data sources to determine the proportion of students 6 to 21 years old with intellectual disability who were placed in each federally reported educational environment from 1976 to 2014.

The definitions of placement categories changed several times over the 40 years, so it is impossible to directly compare statistics over the entire time period, Brock said. But some general trends can be detected.

He found that in the first years following passage of the law, the proportion of students in less restrictive settings actually decreased. Students served in regular general education classrooms decreased from 38 percent in 1976 to 30 percent in 1983.

From 1984 to 1989, the overall trend is less clear.

From 1990 to 2014, the proportion of students in less restrictive placements initially increased and then plateaued, Brock said.

The proportion of students who spent at least 80 percent of the school day in general education classrooms trended up to near 14 percent in 1998, dropped to 11 percent in 2002, hit a peak of 18 percent in 2010 and decreased slightly to 17 percent in 2014.
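For illustration, the trend described above can be tabulated directly (approximate values as reported in the article, treating "near 14 percent" as 14; the variable names are my own):

```python
# Approximate share of students with intellectual disability spending at least
# 80 percent of the school day in general education classrooms (from the article)
inclusion_pct = {1998: 14, 2002: 11, 2010: 18, 2014: 17}

# Find the peak year and the change over the most recent interval
peak_year = max(inclusion_pct, key=inclusion_pct.get)
recent_change = inclusion_pct[2014] - inclusion_pct[2010]

print(f"peak: {inclusion_pct[peak_year]}% in {peak_year}")
print(f"change 2010-2014: {recent_change} percentage point(s)")
```

The small negative change over 2010-2014 is the plateau, and slight decline, that the study highlights.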

"Overall, the most rapid progress toward inclusive placements was in the 1990s, with more gradual progress in the 2000s and a plateau between 2010 and 2014," Brock said.

He believes the rapid progress in the 1990s occurred because advocacy for special education was strongest during this period, at least at the national level.

"There are still people working really hard toward the goal of inclusion in some parts of the country, but that doesn't come through in this national data," he said.

One argument could be that inclusion has plateaued in the United States because nearly all students are already in the least restrictive environments possible, as decided by their Individual Education Program teams, Brock said.

But state-by-state data suggests something else must be going on. In 2014, students with intellectual disabilities in Iowa were 13.5 times more likely to spend most of the school day in a general education setting compared to students in the bordering state of Illinois.

These huge discrepancies in placements between states can't be explained by differences in the students. The issue is that states and even individual school districts follow different policies and practices for working with students with disabilities - and not all succeed at giving students the least restrictive environment, according to Brock.

"I don't want to send the message that all kids with intellectual disabilities should spend 100 percent of their time in general education classrooms," he said.

"But I think we need to find opportunities for all kids to spend some time with peers who don't have disabilities if we are going to follow the spirit and letter of the law."

Credit: 
Ohio State University

A new mechanism for neurodegeneration in a form of dementia

Philadelphia, May 8, 2018
A new study in Biological Psychiatry reports that dementia-related and psychiatric-related proteins cluster together to form aggregates in the brain, leading to abnormal cell function and behavior. Aggregation of the protein TDP-43 is a hallmark of a pathological process that leads to dementia called frontotemporal lobar degeneration (FTLD). The study showed that as TDP-43 accumulates in the brain of patients with FTLD, it ropes in DISC1, an important protein in the pathology of many mental conditions.

The findings provide a clue to the unsolved puzzle of why psychiatric disorders often emerge in neurodegenerative disorders. “From the clinical point of view, it is critical to understand the molecular mechanisms underlying psychiatric symptoms in neurodegenerative diseases,” said senior author Motomasa Tanaka, PhD, of RIKEN Brain Science Institute, Japan. The findings reveal that the TDP-43/DISC1 protein clusters disrupt the production of new proteins in neurons, a process critical for higher brain functions that are impaired in psychiatric disorders.

First author Ryo Endo, PhD, and colleagues found the co-aggregates in postmortem brain tissue from patients with FTLD and in a mouse model of FTLD. The FTLD model mice were hyperactive and displayed abnormal social interactions, behaviors relevant to multiple psychiatric conditions. Because aggregation renders the protein unusable, Dr. Endo and colleagues added DISC1 back into the mice, and the behavioral abnormalities returned to normal.

“At the mechanistic level, TDP-43 and DISC1 co-aggregation disrupted activity-dependent local translation in dendrites,” said Dr. Tanaka, referring to the process by which neurons build proteins from messenger RNA in response to neural activity. The disruption in translation resulted in reduced synaptic protein expression in the mice. Adding DISC1 back also restored the reduced protein levels. The findings demonstrate that the co-aggregation of DISC1 caused the abnormalities in the model mice, suggesting that co-aggregation of DISC1 with TDP-43 may disrupt cellular function and trigger psychiatric manifestations.

“DISC1 has long been a focus of research as a consequence of its implication in the heritable risk for schizophrenia,” said John Krystal, MD, Editor of Biological Psychiatry. “However, this new study implicates DISC1 in the biology of FTLD. There are still unanswered questions about whether DISC1 is disrupted in association with schizophrenia risk. However, the new study by Dr. Endo and colleagues provides a compelling case for further exploring the role of this protein in frontotemporal dementia,” he added.

Credit: 
Elsevier

Type of maternal homework assistance affects child's persistence

image: Type of maternal homework assistance affects child's persistence.

Image: 
UEF / Varpu Heiskanen

Different types of maternal homework assistance have a different impact on the child's way of completing school assignments in grades 2 to 4 of elementary school, according to a new study from the University of Eastern Finland and the University of Jyväskylä. Although all homework assistance presumably aims at helping the child, not all types of homework assistance lead to equally positive outcomes.

Researchers in the longitudinal First Steps Study found that the more opportunities for autonomous work the mother offered the child, the more task-persistent the child's behaviour. In other words, the child later worked persistently on his or her school assignments, which encouraged mothers to offer more and more opportunities for autonomous working.

However, the more the mother provided concrete, hands-on help, the less task-persistent the child's later behaviour. This, in turn, made mothers offer more and more help.

These associations between different types of maternal homework assistance and the child's task-persistent behaviour remained even after the child's skill level was controlled for.

"One possible explanation is that when the mother gives her child an opportunity to do homework autonomously, the mother also sends out a message that she believes in the child's skills and capabilities. This, in turn, makes the child believe in him- or herself, and in his or her skills and capabilities," Associate Professor Jaana Viljaranta from the University of Eastern Finland explains.

Similarly, concrete homework assistance - especially if not requested by the child - may send out a message that the mother doesn't believe in the child's ability to do his or her homework.

Homework assistance should consider the child's needs

The findings also indicate that task-persistence is a mediating factor between different types of maternal homework assistance and the child's academic performance. This helps to understand some earlier findings on how some types of maternal homework assistance predict better academic performance than others. When the mother offers the child an opportunity for autonomous working, the child will work persistently, which leads to better development of skills. If, however, the mother's homework assistance involves plenty of concrete help, the child will work less persistently, leading to poorer development of skills.

"It is important for parents to take the child's needs into consideration when offering homework assistance. Of course, parents should offer concrete help when their child clearly needs it. However, concrete help is not something that should be made automatically available in every situation - only when needed," Viljaranta says.

The First Steps Study is an extensive longitudinal study carried out by the University of Jyväskylä, the University of Eastern Finland and the University of Turku. The study examines student learning and motivation among approximately 2,000 children from kindergarten onwards. Children currently participating in the study are in secondary education.

Credit: 
University of Eastern Finland

Study looks at barriers to getting treatment for substance use disorders

May 8, 2018 - For patients with substance use disorders seen in the emergency department or doctor's office, locating and accessing appropriate treatment all too often poses difficult challenges. Healthcare providers and treatment facility administrators share their views on delays and obstacles to prompt receipt of substance use disorder treatment after referral in a study in the Journal of Addiction Medicine, the official journal of the American Society of Addiction Medicine (ASAM). This journal is published in the Lippincott portfolio by Wolters Kluwer.

Issues related to patient eligibility, treatment capacity, understanding of options, and communication problems all contribute to gaps in referral and delays in getting treatment for patients with substance use disorders, according to the new research by Claire Evelyn Blevins, PhD, of Warren Alpert Medical School of Brown University and Butler Hospital, Providence, RI; Nishi Rawat, MD, of OpenBeds, Inc., Washington, DC; and Michael Stein, MD, of Boston University and Butler Hospital.

Four Themes Affecting Obstacles to Treatment for Substance Use Disorders

The ongoing opioid crisis has drawn attention to the widening gap between the high need for, and limited access to, substance use treatment in the United States. A recent Substance Abuse and Mental Health Services Administration report found that of 21.7 million Americans in need of substance use disorder treatment, only 2.35 million received treatment at a specialty facility. Yet there is little information on the organizational-level barriers to treatment for substance use disorders.

To address this issue, Dr. Blevins and colleagues performed a series of interviews with 59 stakeholders in the treatment referral process. The study gathered input from those who make referrals for substance use treatment, including emergency medicine physicians, addiction specialists, and other medical providers; as well as those who receive referrals, including substance use treatment facility staff and administrators.

Analysis of the interviews identified four broad themes:

Patient Eligibility. Healthcare providers face difficulties in determining whether patients meet criteria for admission to a particular treatment center, including the application of treatment eligibility criteria. "Eligibility requirements may prevent a patient from entering a treatment center," the researchers write.

Treatment Capacity. Even if a patient is eligible, providers have trouble finding out whether space is available. "Despite the need for services, treatment centers may not run at capacity, because of frustrations encountered and time wasted on the referral and admission process."

Knowledge of Treatment Options. Providers may not understand the levels of available care for substance use treatment, and how to select the best treatment for their patient. "After determining appropriate level of care, a provider must then find a program that meets the patient's needs, which becomes more difficult with the differences in terminology and program guidelines."

Communication. Difficulties in communication between referring providers and treatment facilities can contribute to delays to starting treatment. The need for direct referral - "from the emergency department to a bed" - is particularly high for patients with opioid use disorders.

"Access to substance use disorder treatment is often a maze that can be difficult to navigate for both providers and patients," Dr. Blevins and coauthors write. Based on the themes identified, they make recommendations for improvement in the referral process, including a database of clear eligibility criteria, real-time information on treatment capacity, and increased education and training for providers on substance use treatment.

They also propose ways to improve communication and reduce treatment waiting times, including new information technologies. The researchers write: "By improving systems that enhance communication across organizations, patient referrals may be more easily completed, improving access to care and expanding the use of appropriate treatments for the many patients in need."

In an accompanying commentary, David L. Rosenbloom, PhD, of Boston University School of Public Health discusses the underlying reasons for the current "dysfunctional referral system." He notes that referrals for other chronic diseases "may be more effective because they are to 'in-house' affiliated providers." Dr. Rosenbloom writes: "The standard of care should be to stabilize, initiate treatment, and provide a hands-on transfer to an entity that can complete a diagnostic assessment and provide evidence-based treatment" for patients with substance use disorders.

Credit: 
Wolters Kluwer Health

Impaired brain pathways may cause attention problems after stroke

image: Stroke lesions. A, Lesion incidence map in patients with acute stroke. B, Lesion incidence map shows regions in which at least 10 patients had a lesion. Color bar denotes the probability of lesion distribution. C, Brain region that is correlated with attention deficit in the voxel-based lesion-symptom mapping (VLSM) analysis. Color bar denotes the t values.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - Damage to some of the pathways that carry information throughout the brain may be responsible for attention deficit in patients who have had a subcortical stroke in the brain's right hemisphere, according to a study published online in the journal Radiology. Researchers hope the findings may provide a measure for selecting suitable patients for early interventions aimed at reducing cognitive decline following stroke.

A stroke may affect cortical regions of the cerebral cortex, which includes the gray matter that lines the surface of the brain, or it may affect brain regions below the cortex, including white matter tracts connecting different regions of the brain. A stroke affecting brain structures below the cortex is known as a subcortical stroke.

More than one-third of patients experience cognitive decline after a stroke, including attention deficit, which can affect and impair the patient's ability to carry out routine activities of daily living.

"Impairment of attention has been observed in patients with both cortical and subcortical stroke," said senior study author Chunshui Yu, M.D., from the Department of Radiology at Tianjin Medical University General Hospital in Tianjin, China. "In cortical stroke, the direct involvement of cortical regions associated with attention may account for the deficit. However, the brain systems underlying attention deficit in subcortical stroke remain largely unknown."

To investigate the mechanisms underlying attention deficit in chronic subcortical stroke, Dr. Yu and colleagues combined voxel-based lesion-symptom mapping (VLSM) and diffusion tensor tractography (DTT) in 49 patients (32 men and 17 women between the ages of 40 and 71) after subcortical stroke and 52 control patients (30 men and 22 women, age 40-68). VLSM is a method of analyzing relationships between tissue damage and behavioral deficits, and DTT is an MRI technique that allows for 3-D visualization of specific white matter tracts in the brain.
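Conceptually, VLSM is a mass-univariate analysis: at each voxel, patients whose lesion covers that voxel are compared with patients whose lesion spares it on the behavioural score (here, reaction time). A minimal sketch of this idea in Python; the function name and structure are illustrative assumptions, not the authors' exact pipeline (the minimum of 10 lesioned patients per voxel comes from the study's figure legend):

```python
import numpy as np
from scipy import stats

def vlsm_t_map(lesion_masks, scores, min_patients=10):
    """Minimal voxel-based lesion-symptom mapping (VLSM) sketch.

    lesion_masks: (n_patients, n_voxels) binary array, 1 = voxel lesioned.
    scores:       (n_patients,) behavioural measure (e.g. reaction time).
    Returns a t value per voxel; voxels lesioned in fewer than
    `min_patients` patients are left as NaN.
    """
    n_patients, n_voxels = lesion_masks.shape
    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = lesion_masks[:, v] == 1
        # Skip voxels with too few lesioned (or intact) patients to compare.
        if lesioned.sum() < min_patients or (~lesioned).sum() < 2:
            continue
        t, _ = stats.ttest_ind(scores[lesioned], scores[~lesioned])
        t_map[v] = t
    return t_map
```

Voxels with large positive t values are those where the presence of a lesion is associated with slower reaction times, which is how a lesion map like the one in the figure is derived.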

A modified version of the attention network test was used to assess visual attention function. VLSM was used to identify lesion locations related to attention deficit in the stroke patients. Then DTT was used to determine the responsible impaired brain connections at the chronic stage (> 6 months post-stroke).

The results showed that, compared with controls, patients with chronic stroke exhibited prolonged reaction times during the attention task. VLSM revealed that an acute stroke lesion in the right caudate nucleus and nearby white matter correlated with the prolonged reaction time. DTT performed in the controls showed that this lesion site fell within the right thalamic-prefrontal and caudate-prefrontal pathways.

The right-hemisphere damage subgroup had significantly decreased fractional anisotropy (FA) in these pathways, and the FA reductions correlated with the prolonged reaction times. FA provides a way to measure the directionality of water diffusion within a region of the brain and is typically higher in highly organized brain regions; reductions in FA have previously been associated with advancing age and with cognitive impairment.
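For reference, FA is derived from the three eigenvalues of the diffusion tensor at each voxel. This is the standard textbook definition, not anything specific to this study's processing pipeline:

```python
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the three diffusion tensor eigenvalues.

    FA = sqrt(1/2) * sqrt((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2)
         / sqrt(l1^2 + l2^2 + l3^2)

    Ranges from 0 (fully isotropic diffusion) to 1 (diffusion along
    a single direction).
    """
    num = np.sqrt((l1 - l2)**2 + (l2 - l3)**2 + (l3 - l1)**2)
    den = np.sqrt(l1**2 + l2**2 + l3**2)
    return np.sqrt(0.5) * num / den
```

An isotropic voxel (equal eigenvalues) gives FA = 0 and diffusion confined to one direction gives FA = 1, so coherent white matter tracts score high while damaged tracts score lower.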

"The impairment of the right thalamic- and caudate-prefrontal pathways was consistently associated with attention deficit in patients with right subcortical stroke," Dr. Yu said. "Based on this association, one can estimate which patients with stroke are more likely to develop long-term, persistent attention deficits by evaluating the lesion-induced damage to these pathways."

Credit: 
Radiological Society of North America

Carbon satellite to serve as an important tool for politicians and climate change experts

CLIMATE: A new satellite that measures and provides detailed carbon balance information is one of the most important new tools in carbon measurement since the advent of infrared methods, believe researchers from the University of Copenhagen. The researchers expect the satellite to be a valuable tool for the UN's work on climate change related to the Paris climate accord.

Carbon balance is important for climate and environment because whenever stored carbon is converted into carbon dioxide, CO2 emissions increase. At the same time, carbon is an essential aspect of life on Earth: a felled tree releases carbon into the atmosphere, whereas a planted one takes up carbon into vegetation and soil. A loss of carbon from vegetation and soil creates a carbon imbalance and can have climate-related consequences.

University of Copenhagen researchers have tested a new French satellite that can measure carbon balance far more precisely than the current method, which uses aerial photography. The satellite uses low-frequency passive microwaves to measure the biomass of above ground vegetation. The studies have recently been published in Nature Ecology and Evolution.

"This is one of the biggest steps related to carbon measurement since infrared measurements were developed in the 1970s," according to postdoc Martin Stefan Brandt of the Department of Geosciences and Natural Resource Management, the researcher behind the study.

"The new satellite can measure emissions from all types of vegetation - including trunks and branches, not just the crowns as has been the case until now. This presents a much more detailed account of the carbon balance in the region concerned."

Vital for further work on climate change

The group of Danish researchers compiled satellite imagery of the African continent over seven years, which made it possible to produce a detailed map of the carbon balance across the whole of Africa.

Over the seven years, the researchers documented that drought and deforestation had a dramatic influence on carbon emissions, which has a negative effect on climate. For this reason, it is important to have a tool on hand for monitoring changes to the landscape.

"We will need to understand how various factors like deforestation and drought affect the carbon balance in order to provide a knowledge base for experts and politicians whose job it is to make decisions related to work on climate change," says Martin Stefan Brandt.

The satellite can prove to be an important tool for future work on climate change and the reduction of CO2 emissions. For example, researchers expect that the UN Intergovernmental Panel on Climate Change (IPCC) will be able to use the satellite in relation to the Paris climate accord because it is well suited to present emissions by country.

Credit: 
University of Copenhagen - Faculty of Science

Hunting dogs may benefit from antioxidant boost in diet

image: Researchers from the University of Illinois tested an antioxidant-rich performance diet in American Foxhounds and found evidence of lower oxidative stress when vitamin E and taurine were consumed at higher concentrations.

Image: 
Preston Buff

URBANA, Ill. - Free radicals, reactive molecules that can damage DNA, are produced in spades during exercise. Dogs that exercise a lot, like hunting dogs, may need to consume more antioxidants than their less-active counterparts to protect against this damage. But what diet formulation best meets the needs of these furry athletes? A new University of Illinois study provides some answers in a real-world scenario.

Researchers visited a kennel of American Foxhounds in Alabama over the course of a hunting season, providing one group a high-performance commercial diet and another group a test diet similar to the commercial diet, but with added antioxidants (vitamins C and E, and lutein), zinc, and taurine. During the study, dogs from both groups went on two to three hunts per week, each 2 to 5 hours in length.

"We think of it as unstructured endurance exercise. They're not running the entire time. They might stop to sniff or go more slowly to pick up a scent," says Kelly Swanson, corresponding author on the study and Kraft Heinz Company Endowed Professor in Human Nutrition in the Department of Animal Sciences and the Division of Nutritional Sciences at U of I.

Before starting the diets and on four occasions during the seven-month study, researchers took blood samples from the dogs to examine oxidative stress markers and other blood metabolites.

"We hypothesized that dogs fed the test diet would have a lower concentration of oxidative stress markers and improved performance compared to the dogs fed the commercial diet," Swanson says. "It turns out performance wasn't affected by diet, but the test diet did improve indirect measures of oxidative stress. Therefore, improved performance may be expected with more strenuous exercise when metabolic demands are higher."

The amino acid taurine, once thought to be non-essential for dogs but now recognized as an important nutrient for heart health, declined over the course of the season for dogs fed the commercial diet. The same pattern occurred with vitamin E. Although one dog did come close to a critically low level of taurine during the study, all dogs fed the commercial diet stayed within the normal range for all blood metabolites.

For dogs fed the test diet, taurine and vitamin E levels were maintained at or above the baseline. The results suggest to Swanson and his co-authors that these compounds are compromised in athletic dogs over months of unstructured exercise, and more-active dogs such as sled dogs may experience greater depletion.

"We can conclude that athletic dogs may benefit from supplementation of vitamin E and taurine to minimize oxidation and maintain taurine status," he says.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Majority of the population trusts state structures in consumer health protection

The latest issue shows that people are becoming more and more aware of glyphosate, the active substance used in certain plant protection products, with three quarters of the population already having heard of it. Despite this, food in Germany is still regarded as safe by over 80 percent of respondents, and more than half trust the state authorities to protect the health of consumers.

This issue determined for the first time how great the interest in consumer health topics is. "More than two thirds of the population are interested in consumer health protection. That makes our mandate of providing people with comprehensive information on actual and perceived risks all the more important," says BfR President Professor Dr. Dr. Andreas Hensel. "The goal is for consumers to remain able to decide what to do by themselves and maintain their competence in assessing risks."

http://www.bfr.bund.de/cm/364/bfr-consumer-monitor-02-2018.pdf

The BfR Consumer Monitor is an important instrument of consumer health protection. As a representative consumer survey, it provides an insight every six months into how the German-speaking population perceives health risks. To do so, roughly 1,000 people aged at least 14 years and living in private households in Germany are interviewed by telephone on behalf of the BfR.

Respondents still perceive smoking, climate and environmental pollution as well as a wrong or unhealthy diet as the greatest health risks. In focus once again, and moving up into fourth place in the list of the greatest health risks, are the shortcomings of the health system. These include a perceived shortage of medical staff, the care crisis and the difficult situation in hospitals. Alcohol and unhealthy or contaminated foods are seen as further risks.

When questions are asked about selected topics, salmonella, genetically modified foods, antimicrobial resistance and plant protection product residues head the list of subjects of which people are most aware. These are also the four topics that cause concern among the most respondents. Compared to the previous year, the topics of aluminium, microplastics and glyphosate in food are much better known. Almost half of the population is concerned about glyphosate, with a similar percentage of people concerned about microplastics. By way of comparison, only a good third of respondents find aluminium in food a cause for concern.

Toys and cosmetics are estimated to be safe by a larger percentage of consumers compared to the previous survey. There has been a slight decrease in the feeling of safety where textiles are concerned.

The BfR Consumer Monitor is dedicated on the one hand to topics that receive a lot of public attention. On the other hand, it also analyses issues that have drawn less attention but are equally relevant, such as Campylobacter and pyrrolizidine alkaloids in food, or new "genome editing" methods for the targeted modification of genetic material. As was the case last year, these topics are hardly visible in public perception and are consequently not regarded as being of particular concern. Food hygiene at home also plays only a minor role in consumer awareness.

To what extent public perception deviates from the scientific estimation of health risks is of particular interest for the work of the BfR. Through follow-up studies and specific communicative measures on such topics as kitchen hygiene, the BfR aims to counteract false estimations and misunderstandings.

Credit: 
BfR Federal Institute for Risk Assessment

Inequality is normal: Dominance of the big trees

image: These are large-diameter trees in the Douglas-fir/western hemlock forest of Wind River, Washington, USA

Image: 
James Lutz/Utah State University

The top 1% of the forest has been sharing some vital information with researchers. Ninety-eight scientists and thousands of field staff have completed the largest study of its kind to date with the Smithsonian Forest Global Earth Observatory (ForestGEO), and what they have found has profound implications for ecological theory and for carbon storage in forests. Rather than examining tree species diversity in temperate and tropical ecosystems, this global study emphasized forest structure over a vast scale. Using large forest plots from 21 countries and territories, Utah State researchers found that, on average, the largest 1% of trees in mature and older forests comprised 50% of forest biomass worldwide. Furthermore, the amount of carbon that forests can sequester depends mostly on the abundance of big trees; the size of the largest trees was found to be even more important to forest biomass than high densities of small and medium trees. Lead author Jim Lutz, assistant professor at Utah State University, said, "Big trees provide functions that cannot be duplicated by small or medium-sized trees. They provide unique habitat, strongly influence the forest around them, and store large amounts of carbon."
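The headline statistic, the largest 1% of stems holding roughly half the biomass, is simple to compute from a per-tree inventory. A hypothetical helper, for illustration only:

```python
import numpy as np

def top_fraction_biomass_share(biomass, fraction=0.01):
    """Share of total biomass held by the largest `fraction` of trees.

    biomass: iterable of per-tree biomass values (any consistent units).
    """
    biomass = np.sort(np.asarray(biomass, dtype=float))[::-1]  # largest first
    n_top = max(1, int(round(len(biomass) * fraction)))
    return biomass[:n_top].sum() / biomass.sum()
```

For a stand of 100 trees where one giant holds 100 units of biomass and the other 99 hold 1 unit each, the top 1% (a single tree) holds just over half the total, which is the kind of skew the study reports for mature forests.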

This study has shown that the structure of the forest is as important to consider as species diversity - the largest trees follow their own set of rules. Using 48 of the large forest dynamics plots from around the world coordinated by the Smithsonian ForestGEO Program, scientists were able to examine the variability of forest structure on a consistent basis. Co-author Dan Johnson, Research Associate at Utah State University said, "Having a worldwide group of scientists following the same methods offers us unique opportunities to explore forests at a global scale. This is a really wonderful group of scientists united by a passion for deepening our understanding of forests."

Tropical forests are well known to typically have many more species than temperate forests. However, this study found that temperate forests have higher structural complexity, both in terms of different tree sizes within an area and also between adjacent areas of forest. Co-lead author Tucker Furniss, PhD student at Utah State University said, "The distribution of big trees has not been well explained by theory. Our results emphasize the importance of considering these rare, but disproportionately important ecosystem elements. We clearly need more applied and theoretical research on these important big trees."

The researchers also found that the largest trees are representatives of the more common tree species. The ability of some trees in any given forest to reach very large sizes relative to the other trees and concentrate resources seems to be a global phenomenon. "Big trees are special," Lutz continued. "They take a long time to regrow if they are eliminated from a forest. Making sure that we conserve some big trees in forests can promote and maintain all the benefits that forests provide to us."

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University

Stomata -- the plant pores that give us life -- arise thanks to a gene called MUTE

image: Without MUTE, Arabidopsis plants cannot produce stomata, and do not develop past the seedling stage.

Image: 
Soon-Ki Han/ Xingyun Qi

Plants know how to do a neat trick.

Through photosynthesis, they use sunlight and carbon dioxide to make food, belching out the oxygen that we breathe as a byproduct. This evolutionary innovation is so central to plant identity that nearly all land plants use the same pores -- called stomata -- to take in carbon dioxide and release oxygen.

Stomata are microscopic but critical for photosynthesis; thousands of them dot the surfaces of plants. Understanding how stomata form is critical basic information toward understanding how plants grow and produce the biomass upon which we thrive.

In a paper published May 7 in the journal Developmental Cell, a University of Washington-led team describes the delicate cellular symphony that produces tiny, functional stomata. The scientists discovered that a gene in plants known as MUTE orchestrates stomatal development. MUTE directs the activity of other genes that tell cells when to divide and not to divide -- much like how a conductor tells musicians when to play and when to stay silent.

"The MUTE gene acts as a master regulator of stomatal development," said senior author Keiko Torii, a UW professor of biology and investigator at the Howard Hughes Medical Institute. "MUTE exerts precision control over the proper formation of stomata by initiating a single round of cell division -- just one -- in the precursor cell that stomata develop from."

Stomata resemble doughnuts -- a circular pore with a hole in the middle for gas to enter or leave the plant. The pore consists of two cells -- each known as a guard cell. They can swell or shrink to open or close the pore, which is critical for regulating gas exchange for photosynthesis, as well as moisture levels in tissues.

"If plants cannot make stomata, they are not viable -- they cannot 'breathe,'" said Torii, who also is a professor at Nagoya University in Japan.

Torii and her team investigated which genes governed stomata formation in Arabidopsis thaliana, a small weed that is one of the most widely studied plants on the planet. Past research by Torii's team and other researchers had indicated that, in Arabidopsis, MUTE plays a central role in the formation of stomata. The MUTE gene encodes instructions for a cellular protein that can control the "on" or "off" state of other plant genes.

The researchers created a strain of Arabidopsis that can artificially produce a lot of the MUTE protein, so they could easily identify which genes the MUTE protein turned on or off. They discovered that many of the activated genes control cell division -- a process that is critical for stomatal development.

In Arabidopsis, as in nearly all plants, stomata form from precursor cells known as guard mother cells, or GMCs. To form a working stoma -- singular for stomata -- a GMC divides once to yield the paired guard cells. Since their data showed that MUTE proteins switched on genes that regulated cell division, Torii and her team wondered if MUTE is the gene that activates this single round of cell division. If so, it would have to be a tightly regulated process: the genetic program would have to switch on cell division in the GMC, and then quickly switch it right back off to ensure that only a single round of division occurs.

Torii's team showed that one of the genes the MUTE protein binds and activates is CYCD5;1, a gene that causes the GMC to divide. The researchers also found that MUTE proteins turn on two genes called FAMA and FOUR LIPS. This was an important discovery because, while CYCD5;1 turns on cell division of the GMC, FAMA and FOUR LIPS turn off -- or repress -- the cell division program.

"Our experiments showed that MUTE was turning on both activators of cell division and repressors of cell division, which seemed counterintuitive -- why would it do both?" said Torii. "That made us very interested in understanding the temporal regulation of these genes in the GMC and the stomata."

Through precise experiments, they gathered data on the timing of MUTE's activation of these cell division activators and repressors. They incorporated this information into a mathematical model, which simulated how MUTE acts to both activate and repress cell division in the GMC. First, MUTE turns on the activator CYCD5;1, which triggers one round of cell division. Then, FAMA and FOUR LIPS act to prevent further cell division, yielding one functional stoma consisting of two guard cells.
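The circuit described here is what systems biologists call an incoherent feed-forward loop: one input drives both an activator and a slower repressor of the same process, which yields a single pulse of output. A toy simulation with made-up rate constants (not the authors' fitted model) reproduces that qualitative behaviour:

```python
import numpy as np

def mute_pulse(t_end=20.0, dt=0.01, k_act=1.0, k_rep=0.2, decay=0.5):
    """Toy incoherent feed-forward loop inspired by the MUTE circuit.

    MUTE (held constant at 1) turns on both a fast activator of cell
    division (standing in for CYCD5;1) and slowly accumulating repressors
    (standing in for FAMA/FOUR LIPS). Division activity, modelled as
    activator / (1 + repressor), rises and then shuts off: one pulse,
    i.e. one round of division. All rates are illustrative only.
    """
    steps = int(t_end / dt)
    act = rep = 0.0
    activity = []
    for _ in range(steps):
        act += dt * (k_act * 1.0 - decay * act)  # fast, saturating activation
        rep += dt * (k_rep * 1.0)                # slow, accumulating repression
        activity.append(act / (1.0 + rep))
    return np.array(activity)
```

The division activity rises while the CYCD5;1-like activator accumulates, then falls as the FAMA/FOUR LIPS-like repression builds up, so only one burst of division occurs.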

"Like a conductor at the podium, MUTE appears to signal its target genes -- each of which has specific, and even opposite, parts to play in the ensuing piece," said Torii. "The result is a tightly coupled sequence of activation and repression that gives rise to one of the most ancient structures on land plants."

Credit: 
University of Washington

The effect of night shifts: Gene expression fails to adapt to new sleep patterns

Have you ever considered that working night shifts may, in the long run, have an impact on your health? A team of researchers from the McGill University-affiliated Douglas Mental Health University Institute (DMHUI) has discovered that genes regulating important biological processes are incapable of adapting to new sleeping and eating patterns, and that most of them stay tuned to their daytime biological clock rhythms.

In a study published in the Proceedings of the National Academy of Sciences, Laura Kervezee, Marc Cuesta, Nicolas Cermakian and Diane B. Boivin, researchers at the DMHUI (CIUSSS de l'Ouest-de-l'Île-de-Montréal), were able to show the impact that a four-day simulation of night shift work had on the expression of 20,000 genes.

"We now better understand the molecular changes that take place inside the human body when sleeping and eating behaviours are out of sync with our biological clock. For example, we found that the expression of genes related to the immune system and metabolic processes did not adapt to the new behaviours," says Dr. Boivin, Director of the Centre for Study and Treatment of Circadian Rhythms and a full professor at McGill University's Department of Psychiatry.

It is known that the expression of many of these genes varies over the course of the day and night. Their repetitive rhythms are important for the regulation of many physiological and behavioural processes. "Almost 25% of the rhythmic genes lost their biological rhythm after our volunteers were exposed to our night shift simulation. 73% did not adapt to the night shift and stayed tuned to their daytime rhythm. And less than 3% partly adapted to the night shift schedule," adds Dr. Cermakian, Director of the Laboratory of Molecular Chronobiology at the DMHUI and a full professor at McGill University's Department of Psychiatry.

Health problems ahead?

For this study, eight healthy volunteers were artificially subjected to a five-day schedule simulating night shift work. In a time-isolation room, they were deprived of any light or sound cues characteristic of the time of day, and were not allowed to use their phones or laptops. The first day the participants slept during their normal bedtimes. The four following days were "night shifts": the volunteers remained awake during the night and slept during the day.

On the first day and after the last night shift, the team collected blood samples at different times for a period of 24 hours. Laura Kervezee, a postdoctoral fellow on Boivin's team, then measured the expression of more than 20,000 genes using a technique called transcriptomic analysis, and assessed which of these genes presented a variation over the day-night cycle.
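Rhythmicity of this kind is commonly assessed with a cosinor fit: regressing each gene's expression on 24-hour cosine and sine terms and extracting an amplitude and a peak time (acrophase). A generic sketch of that standard approach, not the authors' exact pipeline:

```python
import numpy as np

def cosinor_fit(times_h, expression):
    """Least-squares cosinor fit of a 24 h rhythm to one gene's expression.

    Fits y = mesor + a*cos(2*pi*t/24) + b*sin(2*pi*t/24) and returns
    (amplitude, acrophase_in_hours). An acrophase that shifts between the
    baseline day and the post-night-shift day indicates the gene's rhythm
    adapted to the new schedule; an unchanged acrophase means it stayed
    tuned to daytime.
    """
    t = np.asarray(times_h, dtype=float)
    w = 2 * np.pi * t / 24.0
    X = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
    mesor, a, b = np.linalg.lstsq(X, np.asarray(expression, float), rcond=None)[0]
    amplitude = np.hypot(a, b)
    acrophase = (np.arctan2(b, a) * 24.0 / (2 * np.pi)) % 24.0
    return amplitude, acrophase
```

Comparing acrophases fitted before and after the simulated night shifts is one way to separate genes whose rhythm shifted with the new schedule from those that stayed on their daytime rhythm.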

"We think the molecular changes we observed potentially contribute to the development of health problems, like the diabetes, obesity and cardiovascular diseases more frequently seen in night-shift workers over the long term," explains Dr. Boivin. However, she adds, this will require further investigation.

As the study was conducted under highly controlled laboratory conditions, future research should extend these findings by studying the gene expression of actual night shift workers, whose physical activity, food intake and timing of sleep may differ from one another. The approach could also be applied to other people at risk of biological clock misalignment, such as travellers who cross time zones frequently.

Around 20% of the workforce in Canada, the United States and Europe is involved in shift work.

Credit: 
McGill University