Culture

Men take care of their spouses just as well as women (new research suggests)

Men respond to their spouse's illness just as much as women do and, as a result, are better caregivers in later life than previous research suggested, according to a new Oxford University collaboration.

The study, published in Journals of Gerontology, Series B, is good news for our increasingly stretched adult care services, which have become more reliant on patients' families and spouses for support. Conducted with peers from the University of Pennsylvania, the research sits in contrast to previous studies on spousal caregiving, which found that female caregivers tend to be more responsive. However, the new results reveal that men are just as responsive to a partner's illness as women.

Using data from the German Socio-Economic Panel Study, the research, carried out by Dr Langner of Oxford University and Professor Frank Furstenberg of the University of Pennsylvania, focused on 538 couples in Germany, with an average age of 69, in which one partner developed a need for spousal care between 2001 and 2015. It looked at how caregivers adjusted their hours in response to the new care need, whether by responding directly to their partner's physical needs or by performing errands and housework.

The findings show that men increased their care hours as much as women did, resulting in similar levels of care once their partner became ill. These similarities were particularly pronounced when a spouse was deemed severely ill, when there was little to no difference in the level of care given.

Perhaps surprisingly, when their spouse is severely ill, men also increase the time they spend on housework and errands more than women do. However, at lower levels of spousal care need, when a spouse is only slightly unwell, women still spend more time on housework and errands than men, because they already did more of it before the onset of illness.

There were also significant differences in the level of care given in couples where the spouse was only unofficially considered 'in need of care'. However, these differences disappeared in homes where no other household help was available: regardless of gender, spouses stepped up to care for each other.

Dr Laura Langner, Research Fellow at Nuffield College, Oxford and ESRC Future Research Leader, said: 'Our results suggest that gender differences in spousal caregiving in old age are not as pronounced as previously thought. Past studies had numerous limitations, which we could overcome with our data.

'We found that, unlike many previous studies on caregiving in later life, male caregivers were just as responsive towards their partner's onset of illness as female caregivers. This stands in sharp contrast to the division of caregiving (i.e. childcare) and housework in mid-life. There could be a number of reasons for this, but a key factor may be that in later life many people retire and no longer have the responsibility of work, so are able to focus on other priorities - ones that their spouse may have been handling already'.

Discussing the potential future implications of the research for patients and services, she adds: 'People are living longer, meaning that we have an increasingly dependent aging population and we face an elderly care cost problem. Reforms are likely to continue reducing more expensive institutionalised care and increasing cheaper home care. With the gender gap in life expectancy closing, and children becoming less available to care for their parents, it is likely that many more men will be called upon to care for their partners. But our findings at least suggest that women won't have to worry that their partners are not up to the job of caring for them, should they need to.'

The team intend to build on the findings by applying the research approach to other countries and assessing how the results compare.

Credit: 
University of Oxford

Generating DNA sequence data in the developing world

image: Left: Endro Setiawan and Acun Hery Yanto preparing collections at Riam Berasap, Gunung Palung National Park. Right: Rani Asmarayani and Gillian Dean in the Molecular Systematics Laboratory at the Herbarium Bogoriense, Research Center for Biology, Indonesian Institute of Sciences (LIPI), Cibinong, West Java, Indonesia.

Image: 
Campbell O. Webb

Globally, biodiversity is concentrated around the equator, but the scientific institutions generating DNA sequence data to study that biodiversity tend to be clustered in developed countries toward the poles. However, the rapidly decreasing cost of DNA sequencing has the potential to change this dynamic and create a more equitable global distribution of genetic research. In research published in a recent issue of Applications in Plant Sciences, Dr. Gillian Dean and colleagues show the feasibility of producing high-quality sequence data at a laboratory in Indonesia.

For many laboratories in the developing world, a lack of funding and practical experience are major hurdles to generating their own DNA sequence data. However, the financial, technical, and logistical burden of producing DNA sequence data has dropped precipitously in recent years. DNA sequencing is increasingly done at centralized "core" facilities dedicated to producing high-quality sequence data from prepared samples at high volume and low cost. This means that laboratories need only do initial processing of tissue to prepare DNA samples to be sent to a sequencing core facility.

Molecular techniques like DNA extraction, purification, and PCR, necessary to prepare samples for sequencing, are now quite well established, with protocols that are relatively simple using fairly inexpensive reagents (ingredients). "It can seem daunting to set up a molecular biology lab, but as methods become cheaper and more robust, it really is within reach of more and more researchers," explains Dr. Gillian Dean, the lead author of the study. "We want to share our expertise to help people set up these methods, and take full advantage of the kind of data that can be generated." Accordingly, Dr. Dean and colleagues compared methods for generating DNA sequence data at the Molecular Systematics Laboratory at the Herbarium Bogoriense, Research Center for Biology, Indonesian Institute of Sciences (LIPI), in Cibinong, West Java, Indonesia. The DNA barcode data generated in this pilot study was part of a project to create an open access digital flora of Gunung Palung National Park in West Kalimantan, Indonesia.

Helping laboratories in developing countries set up the capacity to produce DNA sequence data expands the number of contributors to DNA sequence databases, which is good for science generally and may be helpful for any number of bioinformatic analyses. According to Dr. Dean, "The wider scientific community benefits by having more capacity in terms of people producing data." But it also strengthens scientific institutions in the developing world, expanding the kinds of studies scientists can perform and questions they can answer.

"The ability to prepare the DNA of interest for sequencing means that scientists in developing countries have more autonomy when it comes to designing projects, to choose exactly the data they generate, and for what purpose," explains Dr. Dean. "It also enables those scientists to have equal access to and control over the resulting data." Aside from the scientific benefits, there is a legal rationale for expanding the capacity to generate DNA sequence data into more countries, as many biodiverse countries have enacted laws restricting access to sampling for studies conducted outside their borders.

Dr. Dean also stresses the ethical dimension of expanding the capacity of molecular laboratories in developing countries, which helps to create a more equitable global scientific establishment and address the legacy of colonialism. "I think the main point is that we have to be more inclusive of scientists from countries where samples originate. Historically, researchers from more developed countries have arrived and taken specimens back to their home country, and in many cases have not shared the resulting data with the country of origin," says Dr. Dean. "We must move away from this, and start to have an equitable approach to doing research in less developed countries." This study represents a step on that long journey.

Credit: 
Botanical Society of America

Mere expectation of checking work email after hours harms health of workers and families

image: A new study demonstrates that employees do not need to spend actual time on work in their off-hours to experience the harmful effects. The mere expectations of availability increase strain for employees and their significant others -- even when employees do not engage in actual work during nonwork time.

Image: 
Virginia Tech

Employer expectations of work email monitoring during nonwork hours are detrimental to the health and well-being of not only employees but their family members as well.

William Becker, a Virginia Tech associate professor of management in the Pamplin College of Business, co-authored a new study, "Killing me softly: electronic communications monitoring and employee and significant-other well-being," showing that such expectations result in anxiety, which adversely affects the health of employees and their families.

"The competing demands of work and nonwork lives present a dilemma for employees," Becker said, "which triggers feelings of anxiety and endangers work and personal lives."

Other studies have shown that the stress of increased job demands leads to strain and conflict in family relationships when the employee is unable to fulfill nonwork roles at home -- "such as when someone brings work home to finish up."

Their new study, he said, demonstrates that employees do not need to spend actual time on work in their off-hours to experience the harmful effects. The mere expectations of availability increase strain for employees and their significant others -- even when employees do not engage in actual work during nonwork time.

Unlike work-related demands that deplete employee resources, physical and psychological, by requiring time away from home, "the insidious impact of 'always on' organizational culture is often unaccounted for or disguised as a benefit -- increased convenience, for example, or higher autonomy and control over work-life boundaries," Becker said.

"Our research exposes the reality: 'flexible work boundaries' often turn into 'work without boundaries,' compromising an employee's and their family's health and well-being."

Since negative health outcomes are costly to employers, what can they do to mitigate the adverse effects identified by the study? Becker said policies that reduce expectations to monitor electronic communication outside of work would be ideal.

When that is not an option, the solution may be to establish boundaries on when electronic communication is acceptable during off-hours by setting up off-hour email windows or schedules when employees are available to respond.

Additionally, he said, organizational expectations should be communicated clearly. "If the nature of a job requires email availability, such expectations should be stated formally as a part of job responsibilities." Knowing these expectations upfront may reduce anxiety in employees and increase understanding from their family members, he said.

As for employees, they could consider practicing mindfulness, which has been shown to be effective in reducing anxiety, Becker said. Mindfulness may help employees "be present" in family interactions, which could help reduce conflict and improve relationship satisfaction. And, he added, mindfulness is within the employee's control when email expectations are not.

Becker, whose research interests include work emotion, turnover, organizational neuroscience, and leadership, is based at Virginia Tech's National Capital Region campus in metro Washington, D.C.

His study, co-authored with Liuba Y. Belkin, of Lehigh University; Samantha A. Conroy, of Colorado State University; and Sarah Tuskey, a Virginia Tech Ph.D. student in executive business research, will be presented at the Academy of Management annual meeting in Chicago on August 10-14.

"Employees today must navigate more complex boundaries between work and family than ever before," said Becker. "Employer expectations during nonwork hours appear to increase this burden, as employees feel an obligation to shift roles throughout their nonwork time.

"Efforts to manage these expectations are more important than ever, given our findings that employees' families are also affected by these expectations."

Credit: 
Virginia Tech

Young drinkers beware: Binge drinking may cause stroke, heart risks

You might want to think before you go out drinking again tonight.

Research by Mariann Piano, senior associate dean of research at Vanderbilt University School of Nursing, has found that young adults who frequently binge drink were more likely to have specific cardiovascular risk factors such as higher blood pressure, cholesterol and blood sugar at a younger age than non-binge drinkers.

In a study published in the Journal of the American Heart Association, researchers found that binge drinking by young men was associated with higher systolic blood pressure (the force on blood vessels when the heart beats) and that frequent binge drinking had additional effects on cholesterol, both factors in contributing to cardiovascular disease. Female binge drinkers had higher blood glucose levels than abstainers.

In reporting her findings, Piano, PhD, FAAN, the Nancy and Hilliard Travis Professor at Vanderbilt, said that young adults need to be aware that repeated binge drinking may have consequences beyond the immediate.

"The risk extends beyond poor school performance and increased risk for accidental injury," she said.

Current evidence suggests that development of high blood pressure before age 45 is associated with significantly higher risks of cardiovascular death later in life.

The study also found differences in how binge drinking affected young men and women. Young men who reported that they repeatedly binge drink had higher systolic blood pressure and total cholesterol while young women who repeatedly binge drink had higher blood sugar levels compared to non-binge drinkers.

Piano and her co-authors examined high blood pressure, cholesterol, blood sugar and other cardiovascular risks in 4,710 adults ages 18-45 who responded to the 2011-2012 and 2013-2014 U.S. National Health and Nutrition Examination Survey. Participants were classified as non-drinkers, binge drinkers 12 times or less a year, and high-frequency binge drinkers (more than 12 times a year).
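As a minimal illustrative sketch (not the study's analysis code), the grouping described above can be expressed as a simple classification rule; the function name and input variable here are hypothetical.

```python
# Hypothetical sketch: assign a respondent to one of the three categories
# described in the article, based on self-reported binge-drinking episodes
# in the past year. Not taken from the study's code.
def classify_respondent(binge_episodes_per_year: int) -> str:
    """Return the drinking category for a reported number of binge episodes per year."""
    if binge_episodes_per_year == 0:
        return "non-binge drinker"
    if binge_episodes_per_year <= 12:
        return "binge drinker (12 times or less a year)"
    return "high-frequency binge drinker (more than 12 times a year)"

# Example: a respondent reporting 20 binge episodes in the past year
print(classify_respondent(20))  # high-frequency binge drinker (more than 12 times a year)
```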

High-frequency binge drinking was reported by 25.1 percent of men and 11.8 percent of women. Binge drinking 12 times a year or less was reported by 29.0 percent of men and 25.1 percent of women.

Binge drinking rates are at an all-time high, Piano said. One in five college-age students reports three or more binge drinking episodes in the prior two weeks. More students drink to get drunk, then black out. They consume six to seven drinks per binge drinking episode. Compared to previous generations, the pervasiveness, regularity and intensity of binge drinking may place today's youth at greater risk for alcohol-related harm.

Credit: 
Vanderbilt University Medical Center

The Lancet: Sodium reduction programmes may only be appropriate for communities with very high salt intake

A new study shows that for the vast majority of communities, sodium consumption is not associated with an increase in health risks except for those whose average consumption exceeds 5g/day (equivalent to 12.5g of salt, or 2½ teaspoons). Communities with high average levels of sodium intake (above 5g/day) were mostly seen in China, with only about 15% of communities outside China exceeding this level of consumption.
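For readers checking the arithmetic, the sodium-to-salt conversion quoted above follows from the molar masses of sodium and sodium chloride; this is a back-of-envelope check, not a figure taken from the paper.

```latex
% Converting a mass of sodium into the equivalent mass of table salt (NaCl).
% Molar masses: Na = 22.99 g/mol, NaCl = 58.44 g/mol.
\[
m_{\mathrm{NaCl}} \approx m_{\mathrm{Na}} \times \frac{58.44}{22.99}
\approx 2.5\, m_{\mathrm{Na}},
\qquad
5\,\mathrm{g\ sodium} \approx 12.5\,\mathrm{g\ salt}.
\]
```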

WHO guidelines recommend a global approach to reducing sodium intake in all populations to below 2g/day, but this has not been achieved in any country. The authors say that sodium reduction strategies should instead target communities with high average levels of sodium consumption (above 5g/day).

The findings come from a new observational study of over 90,000 people in more than 300 communities in 18 countries, published in The Lancet.

"No country has managed to reduce levels of sodium consumption from moderate to very low (below 2g/day), and our study shows we should be far more concerned about targeting communities and countries with high average sodium intake (above 5g/day, such as China) and bringing them down to the moderate range (3 to 5g/day)," says Professor Andrew Mente, Population Health Research Institute (PHRI) of Hamilton Health Sciences and McMaster University (Canada). [1]

Data from the ongoing Prospective Urban Rural Epidemiology (PURE) study was used in the analysis, and 95,767 participants aged 35-70 years in 369 communities in 18 countries [2] were included in the study. A morning fasting midstream urine sample was collected from every participant and was used to estimate 24h urinary sodium and potassium intake. Information about demographic factors, lifestyle, health history, and medication use were recorded and height, weight and blood pressure were measured.

Average follow-up was 8.1 years, during which time 3695 people died, 3543 had major cardiovascular events (1372 myocardial infarctions; 1965 strokes; 343 heart failures; 914 cardiovascular deaths). The analysis was based on the number of people who suffered a cardiovascular event or death (6281).

The analysis was done at a community level: 255 communities (all with over 100 participants) for cardiovascular disease and mortality, and 369 (all with over 50 participants) for blood pressure.

80% (82/103) of the communities in China had a mean sodium intake greater than 5g/day, whereas in other countries, 84% (224/266) of communities had a mean intake of 3-5g/day. No communities in the study had a mean sodium intake below 3g/day.

Higher sodium intake was associated with increased blood pressure and increased incidence of stroke, but the association was found in communities with very high sodium intake (mostly in China) and not others. Higher sodium intake was associated with lower rates of myocardial infarction and total mortality.

"Our study adds to growing evidence to suggest that, at moderate intake, sodium may have a beneficial role in cardiovascular health, but a potentially more harmful role when intake is very high or very low. This is the relationship we would expect for any essential nutrient and health. Our bodies need essential nutrients like sodium, but the question is how much. The recommendation to lower sodium consumption to 2g/day is based on short-term trials of sodium intake and blood pressure, and the assumption that any approach to reduce blood pressure will necessarily translate into a lower risk of cardiovascular disease with no unintended consequences. While low sodium intake does reduce blood pressure, at very low levels it may also have other effects, including adverse elevations of certain hormones associated with an increase in risk of death and cardiovascular diseases," adds Professor Mente. [1]

Furthermore, rates of stroke, cardiovascular death, and total mortality decreased with increasing potassium intake in these communities. Diets rich in fruit and vegetables are high in potassium. However, it is not known whether potassium itself is protective, or whether it might simply be a marker of a healthy diet.

Professor Martin O'Donnell, McMaster University, co-author on the study, adds: "Our findings support other research recommending an all-round healthy diet with an emphasis on fruit and vegetables, dairy foods, potatoes, nuts and beans. Very high sodium consumption (above 5g/day) is harmful, but the amount that is consumed by the majority of people does not appear to be linked to an increased risk of cardiovascular disease or death." [1]

The study published today follows a paper published in The Lancet in 2016 [3], which used the same cohort but the analyses were performed at an individual level, rather than community. Compared with moderate sodium intake, the study found that high sodium intake (above 7g/day) was associated with an increased risk of cardiovascular events and mortality in hypertensive populations, and low sodium (below 3g/day) intake was associated with an increased risk of cardiovascular events and mortality in people with or without hypertension. By including the community level analyses, and additional years' follow-up, the new study adds additional evidence and approaches to prevention for communities and countries.

Writing in a linked Comment, Franz H Messerli and Louis Hofstetter, University Hospital, Bern (Switzerland) and Sripal Bangalore, New York University School of Medicine (USA), note: "A cursory look at 24h urinary sodium excretion in 2010 and the 2012 UN healthy life expectancy at birth in 182 countries, ignoring potential confounders, such as gross domestic product, does not seem to indicate that salt intake, except possibly when very high, curtails life span... Before we change recommendations, let us remember, that Mente and colleagues' findings are observational data in a predominately Asian population, and base 24 h sodium excretion calculations on overnight fasting urine measurements. It does not necessarily follow that active intervention, such as decreasing salt intake in patients at risk of stroke or increasing salt intake in patients at risk of myocardial infarction, will turn out to be beneficial. Nevertheless, the findings are exceedingly interesting and should be tested in a randomised controlled trial. Indeed, such a trial has been proposed in a closely controlled environment, the federal prison population in the USA... The simple fact that a trial looking at salt restriction has to be done in the federal prison population indicates that curtailing salt intake is notoriously difficult. Incentivising people to enrich their diets with potassium through eating more fruit and vegetables is likely to need less persuasion."

Credit: 
The Lancet

Increase acceptance of police body-worn cameras by following federal guidelines?

Thousands of police departments have adopted body-worn cameras over the last few years. Previous research on acceptance of the cameras has yielded mixed findings. A new study that examined how Tempe, Arizona, planned and carried out a body-worn camera program found that adhering to federal guidelines helped ensure integration and acceptance among police, citizens, and other stakeholders.

The study, conducted by researchers at Arizona State University (ASU), the University of Alabama at Birmingham, and East Carolina University, appears in the journal Criminology & Public Policy, a publication of the American Society of Criminology.

"Police departments that are considering a program involving body-worn cameras would be well-advised to closely follow the federal implementation guidelines," suggests Michael D. White, professor in the School of Criminology and Criminal Justice at ASU and associate director of ASU's Center for Violence Prevention and Community Safety, who led the study.

"The guidelines offer a road map to successfully plan and implement programs, and increase the likelihood of achieving positive results," White adds. Such results can include reductions in complaints by citizens and in use of force, as well as smoother and faster processing of criminal cases.

The researchers assessed how the use of body-worn cameras was perceived in Tempe in 2015-16. The Tempe Police Department adhered closely to the principles and strategies outlined in the U.S. Department of Justice's BWC Implementation Guide, a best-practices, evidence-based guide that offers more than two dozen steps to follow, including forming a working group, developing policy, and communicating with stakeholders.

The study surveyed 200 officers before and after they used body-worn cameras, interviewed 279 citizens by phone who had recent encounters with police, and interviewed 17 stakeholders in person (e.g., advocacy groups, first responders, prosecutors). Researchers also looked at official agency data to determine the impact of the cameras on officers' levels of proactivity (i.e., self-initiated calls), the time it took for misdemeanor cases to be processed and resolved, and outcomes of cases.

The study found high levels of acceptance of the cameras among all groups surveyed. Officers' level of proactivity did not change. The authors reported two changes in processing misdemeanor cases: The time it took for cases involving body cameras to be resolved decreased slightly, arguably because of the ability of the cameras to document the criminal activity, and the proportion of guilty outcomes increased modestly; the latter may be an indicator of the acceptance of body cameras by judges, prosecutors, and others in the courtroom.

The researchers caution that their findings, based on one police department, may not be generalizable to other jurisdictions. They also point out that the analysis is primarily descriptive, so while there are associations between Tempe's close adherence to the federal guidelines and the levels of integration and acceptance, causality cannot be inferred.

"Many departments use body-worn cameras without a full understanding of the issues, costs, and challenges involved," explains Natalie Todak, assistant professor in the Department of Criminal Justice at the University of Alabama at Birmingham, who coauthored the study. "If a department rushes to use the cameras without proper planning and implementation, the likelihood of benefiting declines.

"Given the widespread and rapid use of body-worn cameras in American policing, it is difficult to overstate the importance of using empirically supported guidance such as the Justice Department guidelines," Todak adds.

Credit: 
Crime and Justice Research Alliance

A video game can change the brain, may improve empathy in middle schoolers

image: In the experimental video game "Crystals of Kaydor," a robot crash lands on an alien planet. In order to rebuild the spaceship, players must, as the robot, build rapport with the aliens by deciphering the emotions on their humanlike faces.

Image: 
Center for Healthy Minds/University of Wisconsin-Madison

MADISON, Wis. -- A space-exploring robot crashes on a distant planet. In order to gather the pieces of its damaged spaceship, it needs to build emotional rapport with the local alien inhabitants. The aliens speak a different language but their facial expressions are remarkably humanlike.

This fantastical scenario is the premise of a video game developed for middle schoolers by University of Wisconsin-Madison researchers to study whether video games can boost kids' empathy, and to understand how learning such skills can change neural connections in the brain.

Results published this week in npj Science of Learning (a Nature journal) reveal for the first time that, in as few as two weeks, kids who played a video game designed to train empathy showed greater connectivity in brain networks related to empathy and perspective taking. Some also showed altered neural networks commonly linked to emotion regulation, a crucial skill that this age group is beginning to develop, the study authors say.

"The realization that these skills are actually trainable with video games is important because they are predictors of emotional well-being and health throughout life, and can be practiced anytime -- with or without video games," says Tammi Kral, a UW-Madison graduate student in psychology who led the research at the Center for Healthy Minds.

Richard Davidson, director of the center and a professor of psychology and psychiatry at UW-Madison, explains that empathy is the first step in a sequence that can lead to prosocial behavior, such as helping others in need.

"If we can't empathize with another's difficulty or problem, the motivation for helping will not arise," says Davidson, who headed the research team. "Our long-term aspiration for this work is that video games may be harnessed for good and if the gaming industry and consumers took this message to heart, they could potentially create video games that change the brain in ways that support virtuous qualities rather than destructive qualities."

On average, youth between the ages of 8 and 18 rack up more than 70 minutes of video gameplay daily, according to data from the Kaiser Family Foundation. This spike in gameplay during adolescence coincides with an explosion in brain growth as well as a time when kids are susceptible to first encounters with depression, anxiety and bullying. The team wanted to learn whether there were ways to use video games as a vehicle for positive emotional development during this critical period.

Researchers randomly assigned 150 middle schoolers to two groups. One played the experimental game, called "Crystals of Kaydor," which was created for research purposes and intended to teach empathy. The second group played a commercially available and entertaining control game called "Bastion" that does not target empathy.

In Crystals of Kaydor, kids interacted with the aliens on the distant planet and learned to identify the intensity of emotions they witnessed on their humanlike faces, such as anger, fear, happiness, surprise, disgust and sadness. The researchers measured how accurate the players were in identifying the emotions of the characters in the game. The activity was also intended to help the kids practice and learn empathy.

Those who played Bastion partook in a storyline where they collected materials needed to build a machine to save their village, but tasks were not designed to teach or measure empathy. Researchers used the game because of its immersive graphics and third-person perspective.

The team obtained functional magnetic resonance imaging scans in the laboratory from both groups before and after two weeks of gameplay, looking at connections among areas of the brain, including those associated with empathy and emotion regulation. Participants in the study also completed tests during the brain scans that measured how accurately they empathized with others.

The researchers found stronger connectivity in empathy-related brain networks after the middle schoolers played Crystals of Kaydor compared to Bastion. Moreover, Crystals players who showed strengthened neural connectivity in key brain networks for emotion regulation also improved their score on the empathy test. Kids who did not show increased neural connectivity in the brain did not improve on the test of empathic accuracy.

"The fact that not all children showed changes in the brain and corresponding improvements in empathic accuracy underscores the well-known adage that one size does not fit all," says Davidson. "One of the key challenges for future research is to determine which children benefit most from this type of training and why."

Teaching empathy skills in such an accessible way may benefit populations who find these skills challenging, including individuals on the autism spectrum, Davidson adds.

Credit: 
University of Wisconsin-Madison

Penalty kick research hits the spot

New research from the University of Portsmouth could help Premiership footballers ahead of the new season, which starts tonight (10 August).

The study, published in the journal Human Movement Science, has come up with the best way to practice penalty kicks if a player favours waiting for the goalkeeper to move rather than just deciding on a spot before taking their penalty.

Football players adopt two penalty-taking methods. One is to decide where to place the ball irrespective of the goalkeeper's actions, known as the goalkeeper-independent strategy. The second is to place the ball on the side opposite the goalkeeper's dive, known as the goalkeeper-dependent strategy.

In this second method, the penalty taker must anticipate and decide where to kick the ball at the same time as running up and taking the penalty. However, research has shown that when the time available to make the decision is reduced, for instance because the goalkeeper starts moving late, this adversely affects a kicker's ability to accurately direct the ball to the side opposite the goalkeeper's dive.

If a player waits for the goalkeeper to move before deciding where to place their penalty, they should use 'implicit' practice methods to improve their penalty-taking skills. Implicit learning methods encourage the player to develop their skills through independent decision-making, rather than explicit methods, which involve coaching-led development.

This means that players should practice taking penalties by gradually increasing the difficulty of the penalty kick, for instance by initially kicking from shorter distances and by using relatively large targets.

Using this method, the amount of thinking required by the player during their run-up is reduced as their penalty-taking skills improve, so they can focus on the accuracy of their kick.

Lead author of the study, Dr Martina Navarro, a lecturer in Sport and Exercise Science at the University of Portsmouth, said: "A successful penalty kick requires that the penalty taker produces an accurate, well-controlled kicking action and at the same time watches the goalkeeper and makes a decision to which side to kick the ball. In other words, it is a defining feature of the goalkeeper-dependent strategy that a conscious decision is made while kicking. This makes the goalkeeper-dependent strategy essentially a dual task.

"By practicing kicking skill and accuracy in an implicit manner will benefit penalty kick performance with a goalkeeper-dependent strategy compared to performance following an explicit intervention to improve kicking accuracy."

The study compared the effects of implicit and explicit training methods on the penalty-taking performance of 20 football players from the youth academies of VU University Amsterdam in the Netherlands and Red Bull Brasil in São Paulo, Brazil.

The players were divided into two groups and took part in a practice session to improve kicking accuracy (without a goalkeeper) and then in a post-test to examine the accuracy of their penalty kick performance (including a decision to kick to the side opposite the goalkeeper's dive).

The results showed that the implicit and explicit training methods led to similar levels of decision-making, but after implicit training this was achieved with higher penalty accuracy.

Dr Navarro added: "When compared to explicit training, an implicit training strategy results in higher kicking accuracy because it relies on an unconscious way of learning, resulting in lower cognitive demands while controlling the kick and therefore more attentional resources for deciding which side to kick."

Credit: 
University of Portsmouth

Doctors reduced opioid prescriptions after learning a patient overdosed

Will clinicians become more careful in prescribing opioids if they are made aware of the risks of these drugs first-hand? That was one of the core questions researchers set out to explore in a new study published in the August 2018 issue of Science. In doing so, they found that many clinicians never learn of the deaths of patients who overdose, as those patients simply disappear from their practice, outcomes unknown.

This disconnect from the personal experience of losing a patient due to fatal overdose, related to a prescription for opioids to relieve pain, makes the problem of the nation's opioid crisis seem remote - statistics happening elsewhere. While the epidemic continues to exert its outsized impact, opioid prescription-writing levels have not responded with adequate risk-benefit analysis by prescribers tasked with caring for patients with complaints around pain.

"Clinicians may never know a patient they prescribed opioids to suffered a fatal overdose," explained lead author Jason Doctor. "What we wanted to evaluate is whether closing that information gap will make them more judicious prescribers." Doctor is the Director of Health Informatics at the USC Schaeffer Center for Health Policy & Economics and Associate Professor at the Price School of Public Policy.

The study leverages behavioral insights and psychology to give prescribers personal experience of the risk associated with opioids, and finds that when clinicians learned that one of their patients had suffered a fatal overdose, they reduced the amount of opioids they prescribed by almost 10 percent over the following three months.

Doctor and his colleagues conducted a randomized trial between July 2015 and June 2016 of 861 clinicians who had prescribed to 170 patients who subsequently suffered a fatal overdose involving prescription opioids. Half the clinicians, who all practiced in San Diego County, were randomly selected to receive a letter from the county medical examiner notifying them that a patient they had prescribed opioids to in the past twelve months had a fatal overdose. The letter, which was supportive in tone, also provided information from the Centers for Disease Control and Prevention on safe prescribing guidelines, nudging clinicians toward better prescribing habits.

In the three months after receiving the letter, prescribing decreased by 9.7 percent compared to the control group who didn't receive a letter. Furthermore, clinicians who received the letter were 7 percent less likely to start a new patient on opioids and less likely to prescribe higher doses.

The results are particularly exciting given that more traditional state regulations, which often involve mandated limits on opioids, have not been shown to have much impact. The authors point to several reasons why this study showed more promising results, including its simplicity, the fact that the letters still allow clinicians to decide when they will prescribe opioid analgesics, and the fact that the letters provide clinicians with an important missing piece of clinical information.

This intervention is easily scalable nationwide as existing state and national resources already track the information necessary around overdose deaths associated with prescription and illicit drugs.

"Interventions that use behavioral insights to nudge clinicians to correct course are powerful, low-cost tools because they maintain the autonomy of the physician to ultimately decide the best course of care for their patient," said Doctor. "In this case, we know opioids, though beneficial to some patients with certain conditions, come with high risks that the doctor may not fully grasp when observing patients in the clinic. Providing information about harm that would otherwise go unseen by them gives physicians a clearer picture."

Credit: 
University of Southern California

Can rare lymphocytes combat rheumatoid arthritis?

Immunologists at Friedrich-Alexander-Universität Erlangen-Nürnberg have demonstrated that ILC2s, a group of rare lymphoid cells, play a key role in the development of inflammatory arthritis. ILCs have several functional similarities to T-cells and are important agents of our innate immune system. The FAU researchers' findings could form the basis for new approaches for treating rheumatoid arthritis. The findings have now been published in the renowned journal Cell Reports.

Rheumatoid arthritis is the most common inflammatory joint disease. In contrast to osteoarthritis, in which patients' joints degenerate, the symptoms of rheumatoid arthritis, such as warmth, swelling and redness, occur in flare-ups and are frequently caused by disturbances in the immune system. The disease mainly affects the fingers and toes, but also knees, shoulders and hip joints. Around one percent of the population suffers from the condition, and women are three times more likely to be affected than men. Treatment usually focuses on easing pain and slowing the progression of the disease, as there is no cure for rheumatoid arthritis.

Rare immune cell regulates arthritis

Immunologists at FAU have now proven that ILC2, a rare form of lymphocyte, plays a key role in the development of rheumatoid arthritis. Although ILCs, so-called 'innate lymphoid cells', have neither the T-cell and B-cell receptors nor the cell-type markers that are otherwise typical of lymphocytes, they are pivotal in defending the human body from pathogens. They are often the 'first aiders' who alert the immune system before the actual immune response begins. 'From earlier research, we know that ILC2 can initiate the suppression of chronic inflammation by producing the cell signal molecule IL-9', says project manager Dr. Mario Zaiss from the Department of Medicine 3 - Rheumatology and Immunology at Universitätsklinikum Erlangen. 'In our current study, we specifically examined the role of ILC2s in the early stage of rheumatoid arthritis'.

ILC2 only helps before the onset of the disease

Firstly, Zaiss and his colleagues were able to demonstrate that the number of ILC2 in the peripheral blood and in the joints of patients with rheumatoid arthritis is significantly higher than in healthy people. Laboratory tests confirmed the regulatory function of ILC2. When the researchers reduced the number of these immune cells genetically, this exacerbated the progression of the disease later on, while increasing the number of ILC2 during therapy significantly reduced the arthritis. The researchers, however, cannot hold out any hope that they will be able to cure patients who already have inflammatory arthritis through targeted enrichment of ILC2. 'There is no doubt that ILC2 has a regulatory effect during the early stage of arthritis,' explains Mario Zaiss. 'However, any treatment must start before the onset of the disease - transferring ILC2 later on does not improve symptoms.'

Further research is set to find safe methods of increasing the number of ILC2 in the body in a targeted manner. Researchers must also find new and reliable methods of detecting signs of arthritis before the onset of the disease as this is the only time when these rare lymphocytes can be used as a treatment.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Ebola virus experts discover powerful, new approach for future therapeutics

image: The study included (left to right) Marnie Fusco, Erica Ollmann Saphire and Sharon Schendel.

Image: 
Photo by Don Boomer, Scripps Research

A one-two punch of powerful antibodies may be the best way to stop Ebola virus, reports an international team of scientists in the journal Cell. Their findings suggest new therapies should disable Ebola virus's infection machinery and spark the patient's immune system to call in reinforcements.

"This study presents results from an unprecedented international collaboration and demonstrates how 43 previously competing labs can together accelerate therapeutics and vaccine design," says Erica Ollmann Saphire, PhD, professor at Scripps Research and director of the Viral Hemorrhagic Fever Immunotherapeutic Consortium (VIC).

From 2013-2016, West Africa faced the deadliest Ebola outbreak the world has ever seen. By the time the outbreak was declared over, 11,325 people had died. The VIC is an international group of the world's leading virologists, immunologists, systems biologists and structural biologists working to stop an outbreak on that scale from ever striking again.

The VIC researchers aim to understand which Ebola-fighting antibodies are best-and why. The hope is that the most effective antibodies can be combined in a therapeutic "cocktail." Unlike an Ebola vaccine, these cocktails could be given to those already infected, which is important for stopping a disease that tends to emerge unexpectedly in remote locations.

Ollmann Saphire and her colleagues in the VIC have published more than 40 studies in just the last five years. This landmark study is the first-ever side-by-side comparison of 171 antibodies against Ebola virus and other related viruses, known as filoviruses. All antibodies in the panel were donated by different labs around the world, and many had not been previously characterized in such extensive detail.

"Through the VIC, we could test a larger pool of antibodies in parallel, which increased the potential to detect statistically significant relationships between antibody features and protection," says Saphire. "We used this global pool of antibodies to evaluate, and streamline, the research pipeline itself."

In addition to identifying links between antibody target locations and activity, VIC researchers tested this huge pool of antibodies to reveal which antibodies "neutralized" the virus, why neutralization assays so often disagree, and whether or not neutralization in test tubes adequately predicted how well these antibodies would protect live animals from Ebola virus infection. Unexpectedly, neutralization alone was not always associated with the protective ability of an antibody.

Notably, the scientists found nine antibodies that protected mice from infection without neutralizing the virus in test tubes. These antibodies likely fight infection by interacting with an infected person's immune system, helping orchestrate a better immune response to the virus.

This "immune effector" activity is featured in the team's companion study published simultaneously in Cell Host & Microbe. "The ability to evoke an immune response will likely represent a new avenue of study for therapeutic antibodies for Ebola virus infection," says Sharon Schendel, project manager for the VIC and science writer in the Saphire lab.

From the large body of results, VIC member and Scripps Research faculty member Kristian Andersen, PhD, and his graduate student Karthik Gangavarapu developed a network describing how each antibody feature correlates to protection, which can serve as a guide to predict whether newly identified antibodies will have therapeutic value. Saphire says the next steps for the VIC are to further test promising antibody cocktails in non-human primates. The team will also pursue engineering of antibodies that carry signature features to better drive immune system response.

Credit: 
Scripps Research Institute

For UW physicists, the 2-D form of tungsten ditelluride is full of surprises

image: When two monolayers of WTe2 are stacked into a bilayer, a spontaneous electrical polarization appears, one layer becoming positively charged and the other negatively charged. This polarization can be flipped by applying an electric field.

Image: 
Joshua Kahn

The general public might think of the 21st century as an era of revolutionary technological platforms, such as smartphones or social media. But for many scientists, this century is the era of another type of platform: two-dimensional materials, and their unexpected secrets.

These 2-D materials can be prepared in crystalline sheets as thin as a single monolayer, only one or a few atoms thick. Within a monolayer, electrons are restricted in how they can move: Like pieces on a board game, they can move front to back, side to side or diagonally -- but not up or down. This constraint makes monolayers functionally two-dimensional.

The 2-D realm exposes properties predicted by quantum mechanics -- the probability-wave-based rules that underlie the behavior of all matter. Since graphene -- the first monolayer -- debuted in 2004, scientists have isolated many other 2-D materials and shown that they harbor unique physical and chemical properties that could revolutionize computing and telecommunications, among other fields.

For a team led by scientists at the University of Washington, the 2-D form of one metallic compound -- tungsten ditelluride, or WTe2 -- is a bevy of quantum revelations. In a paper published online July 23 in the journal Nature, researchers report their latest discovery about WTe2: Its 2-D form can undergo "ferroelectric switching." They found that when two monolayers are combined, the resulting "bilayer" develops a spontaneous electrical polarization. This polarization can be flipped between two opposite states by an applied electric field.

"Finding ferroelectric switching in this 2-D material was a complete surprise," said senior author David Cobden, a UW professor of physics. "We weren't looking for it, but we saw odd behavior, and after making a hypothesis about its nature we designed some experiments that confirmed it nicely."

Materials with ferroelectric properties can have applications in memory storage, capacitors, RFID card technologies and even medical sensors.

"Think of ferroelectrics as nature's switch," said Cobden. "The polarized state of the ferroelectric material means that you have an uneven distribution of charges within the material -- and when the ferroelectric switching occurs, the charges move collectively, rather as they would in an artificial electronic switch based on transistors."

The UW team created WTe2 monolayers from its 3-D crystalline form, which was grown by co-authors Jiaqiang Yan at Oak Ridge National Laboratory and Zhiying Zhao at the University of Tennessee, Knoxville. Then the UW team, working in an oxygen-free isolation box to prevent the WTe2 from degrading, used Scotch Tape to exfoliate thin sheets of WTe2 from the crystal -- a technique widely used to isolate graphene and other 2-D materials. With these sheets isolated, they could measure their physical and chemical properties, which led to the discovery of the ferroelectric characteristics.

WTe2 is the first exfoliated 2-D material known to undergo ferroelectric switching. Before this discovery, scientists had only seen ferroelectric switching in electrical insulators. But WTe2 isn't an electrical insulator; it is actually a metal, albeit not a very good one. WTe2 also maintains the ferroelectric switching at room temperature, and its switching is reliable and doesn't degrade over time, unlike many conventional 3-D ferroelectric materials, according to Cobden. These characteristics may make WTe2 a promising material for smaller, more robust technological applications than other ferroelectric compounds.

"The unique combination of physical characteristics we saw in WTe2 is a reminder that all sorts of new phenomena can be observed in 2-D materials," said Cobden.

Ferroelectric switching is the second major discovery Cobden and his team have made about monolayer WTe2. In a 2017 paper in Nature Physics, the team reported that this material is also a "topological insulator," the first 2-D material with this exotic property.

In a topological insulator, the electrons' wave functions -- mathematical summaries of their quantum mechanical states -- have a kind of built-in twist. Thanks to the difficulty of removing this twist, topological insulators could have applications in quantum computing -- a field that seeks to exploit the quantum-mechanical properties of electrons, atoms or crystals to generate computing power that is exponentially faster than today's technology. The UW team's discovery also stemmed from theories developed by David J. Thouless, a UW professor emeritus of physics who shared the 2016 Nobel Prize in Physics in part for his work on topology in the 2-D realm.

Cobden and his colleagues plan to keep exploring monolayer WTe2 to see what else they can learn.

"Everything we have measured so far about WTe2 has some surprise in it," said Cobden. "It's exciting to think what we might find next."

Credit: 
University of Washington

Lichen is losing to wildfire, years after flames are gone

image: Lichen on a Napa County oak tree.

Image: 
Jesse Miller/UC Davis

As increasingly hot and severe wildfires scorch the West, some lichen communities integral to conifer forests aren't returning, even years after the flames have been extinguished, according to a study from scientists at the University of California, Davis.

Lichen, an often overlooked organism that forms fuzzy, leaf-like layers over tree bark and rocks, is an unsung hero in forest ecosystems. It provides food for deer, caribou, and elk and is sometimes the only food source for flying squirrels, which are key prey for threatened spotted owls. Birds and insects use it to eat and nest. An important contributor to the nutrient cycle, it also helps fix nitrogen in forest soils.

"Lichen are beautiful, ecologically important, are all around us and tell us important things about the environment," said lead author Jesse Miller, a postdoctoral scholar in the Department of Environmental Science and Policy at UC Davis. "But even if you don't notice lichens, you would notice the consequences in ecosystems when they are lost."

LICHEN LOSS AND FIRE SEVERITY

For the study, published August 9 in the journal Global Change Biology, researchers sampled lichen communities in about 100 study plots across California's Sierra Nevada region. Five wildfires had burned, at varying levels of severity, in and around the plots between four and 16 years before the study's sampling.

The results show that lichen communities were largely unaffected by low-severity fires. This suggests that prescribed fires and natural wildfires under moderate weather and fuels conditions are compatible with lichen diversity.

But areas that experienced higher severity wildfires had significantly lower abundance and diversity of lichen.

In severely burned areas where most of the trees died, nearly all the lichen were gone, even 16 years after the fire.

RECOVERY RACE

The lichens' recovery is likely held back by the loss of tree canopy after the fire, the researchers said. The hot, dry microclimate left in the forest post-fire is not conducive to lichen growth. This indicates that lichen communities burned in Sierra Nevada forests likely won't recolonize until mature trees regrow and the forest canopy is restored. This may exacerbate the effects of climate change that already threaten lichens.

"If the species could keep pace with the rate of climate change, the effects of fire might not be so bad," Miller said. "But the concern is they might not. These fires happen so quickly and in such a large area, they could cause species ranges to contract faster than they are expanding."

The study also indicates that the trend of increasingly dry forests and hotter, bigger and more severe wildfires could cause broad impacts to lichen diversity across the landscape, which could impact nutrient cycling and multiple food-chain interactions among wildlife.

Credit: 
University of California - Davis

Despite ACA, lesbian, gay and bisexual adults still have trouble affording health care

PROVIDENCE, R.I. [Brown University] -- The Affordable Care Act (ACA), commonly known as Obamacare, has reduced the number of Americans without health insurance from 18 percent to about 13 percent, statistics from the Centers for Disease Control and Prevention (CDC) show.

And though the percentage of lesbian, gay and bisexual (LGB) adults with health insurance prior to the launch of the ACA isn't known precisely, a new study reports that they are now insured at the same rate as their straight peers. However, they are still more likely to avoid necessary medical treatment due to cost.

"I started looking at this question because I had read a few studies indicating that following the ACA's implementation in 2014 and the legalization of same-sex marriage in 2015, there were comparable rates of uninsurance for LGB adults," said Kevin Nguyen, lead author of the Aug. 6 study in the August issue of Health Affairs. "However, insurance is only one step in receiving care -- I was curious to see if there were other differences in the access to care and health outcomes."

Nguyen is a doctoral student at the Brown University School of Public Health. Dr. Amal N. Trivedi, an associate professor of health services, policy and practice, and an associate professor of medicine at Brown, and Theresa I. Shireman, a professor of health services, policy and practice at Brown, were co-authors on the study.

The researchers analyzed three publicly available data sets from the CDC spanning from 2014-15 -- the first year the survey asked about sexual orientation -- to 2016-17, the most recent data set. The study included about 330,000 adults between ages 18 and 64, 4.3 percent of whom identified as lesbian, gay or bisexual.

Of those surveyed, 16.4 percent of LGB adults reported avoiding or delaying medical treatment for financial reasons, compared to 14.2 percent of their straight peers. Given the large number of people surveyed, that difference proved statistically significant, the researchers said.
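To illustrate why a gap of roughly two percentage points is statistically significant at this sample size, a quick two-proportion z-test can be run on the published rates; the group sizes below are rough figures inferred from the article (about 330,000 adults, 4.3 percent identifying as LGB), not the authors' exact numbers or method.

```python
# Hedged illustration, not the study's analysis: a two-proportion z-test on
# the reported rates of cost-related avoidance of care, using approximate
# group sizes inferred from the article.
from math import sqrt
from scipy.stats import norm

n_total = 330_000
n_lgb = round(0.043 * n_total)        # ~14,190 LGB respondents (assumed)
n_straight = n_total - n_lgb          # remaining respondents (assumed)

p_lgb, p_straight = 0.164, 0.142      # reported rates of avoiding care due to cost

# Pooled proportion and standard error under the null hypothesis of equal rates
p_pool = (p_lgb * n_lgb + p_straight * n_straight) / n_total
se = sqrt(p_pool * (1 - p_pool) * (1 / n_lgb + 1 / n_straight))

z = (p_lgb - p_straight) / se
p_value = 2 * norm.sf(abs(z))         # two-sided p-value

print(f"z = {z:.1f}, p = {p_value:.1e}")  # z is roughly 7, p far below 0.05
```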

Though the study did not explore the cause of the difference in being able to afford medical treatment, Nguyen said, based on prior studies, one possible explanation is that more LGB adults had individually purchased insurance, which may have higher copays or deductibles than employer-sponsored insurance. Another possibility is that on average LGB individuals may need more medical services than straight individuals.

In addition to similar levels of health insurance, the study also found that LGB individuals now report having a primary care doctor and having an annual check-up at rates comparable to their straight peers -- approximately 77 percent and 66 percent, respectively.

Yet despite comparable levels of access to health care, LGB adults reported 5.3 days in the last month when poor health prevented them from doing normal activities, compared to 4.9 for their straight peers. Similarly, they reported 5.6 poor mental health days (days with stress, depression or emotional problems) in the last month, compared to 3.9.

The researchers at Brown used an adjusted analysis of the survey data to take demographic differences into account, including variations in age, race, marital status, income and education levels between LGB and straight adults. They did not focus on transgender individuals in this study, as they face additional challenges in accessing health care, whether they identify as LGB or not.

Nguyen is interested in looking at the data from the 2017-18 survey upon its release to see how various health policy debates and changes have impacted the LGB community, he said.

"It really is important to collect sexual orientation data in these nationwide surveys because it allows us to answer questions about access to care and health outcomes," he said, adding that the CDC has not yet decided whether the sexual orientation module will be included in the 2019 survey. "This helps us understand the experiences of certain communities who face very high barriers to care and allows us to monitor whether we're making improvements or not."

Credit: 
Brown University

Ketogenic diets may lead to an increased risk of diabetes

New research published in the Journal of Physiology indicates that ketogenic diets, which are low carbohydrate high fat eating plans that are known to lead to weight loss, may cause an increased risk of Type 2 diabetes in the early stage of the diet.

Type 2 diabetes is one of the most pressing challenges of our time and its ultimate cause has not been fully understood. Ketogenic diets, which are low in carbohydrate and high in fat, are known to lead to weight loss and have been considered to be healthy. These findings raise new questions about ketogenic diets and whether or not they are actually healthy.

Insulin is released into the blood and used to control blood sugar levels, including by signaling the liver to stop producing sugar. If this system is impaired and the body does not use insulin properly, which is called insulin resistance, individuals are likely to develop high blood sugar levels. In this study the researchers showed that, under ketogenic diets, this process for controlling blood sugar levels does not work properly and there was insulin resistance in the liver. When the liver is unable to respond to normal levels of insulin to control blood sugar levels, this may lead to an increased risk of Type 2 diabetes.

The study, which was conducted by ETH Zurich in conjunction with University Children's Hospital Zurich, involved feeding mice two different types of diet (a ketogenic diet and a high fat diet, which causes the liver to become resistant to insulin) and then performing standard metabolic tests on them. Using specialized procedures the researchers were able to determine the effects of internal sugar production from the animal (mostly the liver), and sugar uptake into tissues (mostly the muscle), during insulin action.

It is important to note that the research did not analyze whether the diet causes obesity if given long term. The mechanism behind the whole process also remains undetermined; whether low-carbohydrate and regular-carbohydrate high-fat diets share a physiological response that causes insulin resistance in the liver therefore requires further exploration.

Christian Wolfrum, one of the corresponding authors on the paper said 'Diabetes is one of the biggest health issues we face. Although ketogenic diets are known to be healthy, our findings indicate that there may be an increased risk of insulin resistance with this type of diet that may lead to Type 2 diabetes. The next step is to try to identify the mechanism for this effect and to address whether this is a physiological adaptation. Our hypothesis is that when fatty acids are metabolized, their products might have important signaling roles to play in the brain.'

Credit: 
The Physiological Society