All in the family: Kin of gravitational wave source discovered

image: This image provides three different perspectives on GRB150101B, the first known cosmic analogue of GW170817, the gravitational wave event discovered in 2017. At center, an image from the Hubble Space Telescope shows the galaxy where GRB150101B took place. At top right, two X-ray images from NASA's Chandra X-ray Observatory show the event as it appeared on January 9, 2015 (left), with a jet visible below and to the left; and a month later, on February 10, 2015 (right), as the jet faded away. The bright X-ray spot is the galaxy's nucleus.

Image: 
NASA/CXC

On October 16, 2017, an international group of astronomers and physicists excitedly reported the first simultaneous detection of light and gravitational waves from the same source--a merger of two neutron stars. Now, a team that includes several University of Maryland astronomers has identified a direct relative of that historic event.

The newly described object, named GRB150101B, was reported as a gamma-ray burst localized by NASA's Neil Gehrels Swift Observatory in 2015. Follow-up observations by NASA's Chandra X-ray Observatory, the Hubble Space Telescope (HST) and the Discovery Channel Telescope (DCT) suggest that GRB150101B shares remarkable similarities with the neutron star merger, named GW170817, discovered by the Laser Interferometer Gravitational-wave Observatory (LIGO) and observed by multiple light-gathering telescopes in 2017. 

A new study suggests that these two separate objects may, in fact, be directly related. The results were published on October 16, 2018 in the journal Nature Communications.

"It's a big step to go from one detected object to two," said study lead author Eleonora Troja, an associate research scientist in the UMD Department of Astronomy with a joint appointment at NASA's Goddard Space Flight Center. "Our discovery tells us that events like GW170817 and GRB150101B could represent a whole new class of erupting objects that turn on and off--and might actually be relatively common."

Troja and her colleagues suspect that both GRB150101B and GW170817 were produced by the same type of event: a merger of two neutron stars. These catastrophic coalescences each generated a narrow jet, or beam, of high-energy particles. The jets each produced a short, intense gamma-ray burst (GRB)--a powerful flash that lasts only a few seconds. GW170817 also created detectable ripples in space-time called gravitational waves, suggesting that such waves might be a common feature of neutron star mergers.

The apparent match between GRB150101B and GW170817 is striking: both produced an unusually faint and short-lived gamma-ray burst, and both were a source of bright, blue optical light and long-lasting X-ray emission. The host galaxies are also remarkably similar, based on HST and DCT observations. Both are bright elliptical galaxies with a population of stars a few billion years old that display no evidence of new star formation.

"We have a case of cosmic look-alikes," said study co-author Geoffrey Ryan, a postdoctoral researcher in the UMD Department of Astronomy and a fellow of the Joint Space-Science Institute. "They look the same, act the same and come from similar neighborhoods, so the simplest explanation is that they are from the same family of objects."

In the cases of both GRB150101B and GW170817, the explosion was likely viewed "off-axis," that is, with the jet not pointing directly towards Earth. So far, these events are the only two off-axis short GRBs that astronomers have identified.  

The optical emission from GRB150101B is largely in the blue portion of the spectrum, providing an important clue that this event is another kilonova, as seen in GW170817. A kilonova is a luminous flash of radioactive light that produces large quantities of important elements like silver, gold, platinum and uranium.

While there are many commonalities between GRB150101B and GW170817, there are two very important differences. One is their location: GW170817 is relatively close, at about 130 million light years from Earth, while GRB150101B lies about 1.7 billion light years away.

The second important difference is that, unlike GW170817, GRB150101B has no accompanying gravitational wave data. Without this information, the team cannot calculate the masses of the two objects that merged. It is possible that the event resulted from the merger of a black hole and a neutron star, rather than two neutron stars.

"Surely it's only a matter of time before another event like GW170817 will provide both gravitational wave data and electromagnetic imagery. If the next such observation reveals a merger between a neutron star and a black hole, that would be truly groundbreaking," said study co-author Alexander Kutyrev, an associate research scientist in the UMD Department of Astronomy with a joint appointment at NASA's Goddard Space Flight Center. "Our latest observations give us renewed hope that we'll see such an event before too long."

It is possible that a few mergers like the ones seen in GW170817 and GRB150101B have been detected previously, but were not properly identified using complementary observations in different wavelengths of light, according to the researchers. Without such detections--in particular, at longer wavelengths such as X-rays or optical light--it is very difficult to determine the precise location of events that produce gamma-ray bursts. 

In the case of GRB150101B, astronomers first thought that the event might coincide with an X-ray source detected by Swift in the center of the galaxy. The most likely explanation for such a source would be a supermassive black hole devouring gas and dust. However, follow-up observations with Chandra placed the event further away from the center of the host galaxy.

According to the researchers, even if LIGO had been operational in early 2015, it would very likely not have detected gravitational waves from GRB150101B because of the event's greater distance from Earth. All the same, every new event observed with both LIGO and multiple light-gathering telescopes will add important new pieces to the puzzle.

"Every new observation helps us learn better how to identify kilonovae with spectral fingerprints: silver creates a blue color, whereas gold and platinum add a shade of red, for example," Troja added. "We've been able identify this kilonova without gravitational wave data, so maybe in the future, we'll even be able to do this without directly observing a gamma-ray burst."

Credit: 
University of Maryland

Just how blind are bats? Color vision gene study examines key sensory tradeoffs

image: A new study led by Bruno Simões, Emma Teeling and colleagues has examined the evolution of color vision genes across a large and diverse group of bat species.

Image: 
Professor Gareth Jones

Could bats' cave-dwelling, nocturnal habits over eons have enhanced their acoustic echolocation abilities, but also spurred their loss of vision?

A new study led by Bruno Simões, Emma Teeling and colleagues has examined this question in the evolution of color vision genes across a large and diverse group of bat species.

They show that the popular expression "blind as a bat" really doesn't hold true. Some bats with the most advanced type of echolocation appear to have traded UV vision for exquisite hearing, and all bats that do not echolocate but live in caves have also lost UV vision. This suggests that bats are not blind, but that some have clearly favored other senses over vision.

"Bats' sensory abilities have long been a source of fascination for evolutionary biologists," said Emma Teeling, the corresponding author of the study, which appears in the advanced online edition of the journal Molecular Biology and Evolution. "Using phylogenetics and molecular biology we are now able to delve more deeply into the evolutionary price of acquiring echolocation and nocturnality."

Bats are not just the only mammals that can truly fly; they also rely on echolocation to navigate and find prey in the dark. Scientists have long argued that bat vision suffered tradeoffs as a result of gaining this unique nocturnal sensory adaptation.

To better understand the sources of these tradeoffs, the research team performed DNA sequencing and analyzed the key vision genes in bats, including the SWS1 (short wavelength sensitive, for blue/UV light) and MWS/LWS (medium or long wavelength sensitive, for green, yellow and red light) opsin genes.

An opsin gene's job is to make the photosensitive retinal proteins that convert photons of light at particular wavelengths into the signals underlying sight. The research team's opsin gene analyses surveyed the largest bat dataset to date, representing 20 of the 21 existing bat families, chosen primarily for their diverse echolocation types and ecological niches.

In the study, the authors show that, among the 111 species examined, loss of SWS1 gene function is more common in bats than previously thought, and they suggest it may be associated with the adoption of cave roosting, first traced back almost 30 million years. They found various mutations in bat genomes that affected SWS1 gene function, including a completely nonfunctional gene in two species.

Overall, spectral fine-tuning of vision led to a complete loss of sensitivity to short-wavelength light in the blue/UV range in 26 of the 111 species examined. The majority of Old World, cave-roosting bats were found to have a non-functional SWS1 opsin.

Selection on the blue-sensitive SWS1 opsin gene, however, was found to vary significantly among bat species. The research team found evidence of multiple genetic mutations in which different bat species have lost the function of the SWS1 gene. To identify these genetic roots, they used phylogenetic analyses to build gene trees based on the SWS1 results and compared signatures of selection between different ecological niches, such as echolocating vs. non-echolocating species and cave roosting vs. non-cave roosting.

"Our work supports previous hypotheses which suggest the pseudogenization [or loss] of the SWS1 opsin may be related to the adoption of advanced echolocation (high-duty cycle) and, cave roosting habits," said Teeling.

When the SWS1 gene is present and working, the authors confirmed that it gives bats the ability to see in UV light.

"Our spectral tuning analysis of the 11 sites responsible for light sensitivity in the SWS1 opsin gene in both ancestral and extant bat species, provide further support for the presence of UV vision in bats," said Teeling. "In closed-canopy, forest-dwelling mammals a UV sensitive SWS1 opsin is associated with a nocturnal lifestyle. Furthermore, our results demonstrate that this visual pigment has been UV sensitive in all bats since they first diverged from other placental mammals about 78 MYA."

Importantly, the new data makes clear that loss of the SWS1 gene function is not always associated with the acquisition of advanced echolocation, as has been previously suggested.

For the other visual genes, they found that among the 45 species examined, the MWS/LWS opsin gene is highly conserved across lineages and under strong evolutionary pressure to maintain its function.

"Our spectral tuning analysis of the 5 amino-acid sites responsible for the λ max revealed that the majority of bat MWS/LWS visual pigments are tuned to a long-wavelength (~555 - 560nm)," said Teeling. "This suggests that despite the acquisition of laryngeal echolocation and a long history of nocturnality, the MWS/LWS opsin gene has evolved under very strong functional constraint in bats."

"Bats are not blind, with most species capable of seeing in both the UV and middle range of the color spectrum. This suggests that vision is still an important means of sensory perception even in echolocating, nocturnal bats. However, acquisition of the most advanced type of echolocation does coincide with loss of UV vision in most bats and surprisingly cave-roosting drives loss of UV vision in the non-echolocating lineages. This suggests that sensory trade-offs are more complex that previously considered and that bats still make fascinating subjects to understand the evolution of the mammalian sensome!"

This study makes a strong contribution to the ongoing scientific debate regarding the importance of color vision for nocturnal animals.

Credit: 
SMBE Journals (Molecular Biology and Evolution and Genome Biology and Evolution)

Diabetics more at risk of death from alcohol, accidents, suicide

Diabetic patients are more likely to die from alcohol-related factors, accidents or suicide, according to a study published in the European Journal of Endocrinology. The study findings suggest that the increased risk of death from these causes may be related to the mental health of patients, which may be adversely affected by the psychological burden of living with and self-treating this debilitating disease, with potentially serious complications.

Eating with your eyes: Virtual reality can alter taste

ITHACA, N.Y. - Humans not only relish the sweetness, savoriness and saltiness of foods, but they are also influenced by the environment in which they eat. Cornell University food scientists used virtual reality to show how people's perception of real food can be altered by their surroundings, according to research published in the Journal of Food Science.

"When we eat, we perceive not only just the taste and aroma of foods, we get sensory input from our surroundings - our eyes, ears, even our memories about surroundings," said Robin Dando, associate professor of food science and senior author of the study.

About 50 panelists ate three identical samples of blue cheese while wearing virtual reality headsets. Using custom-recorded 360-degree videos, the participants were virtually placed in a standard sensory booth, on a pleasant park bench and in the Cornell cow barn.

The panelists were unaware that the cheese samples were identical, and rated the pungency of the blue cheese significantly higher in the cow barn setting than in the sensory booth or the virtual park bench.

To control for the pungency results, panelists also rated the saltiness of the three samples - and researchers found there was no statistical difference among them.

The purpose of this project was to develop an easy-to-implement and affordable method for adapting virtual reality technology for use in food sensory evaluation, said Dando.

Our environs are a critical part of the eating experience, he said. "We consume foods in surroundings that can spill over into our perceptions of the food," said Dando. This kind of testing offers advantages of convenience and flexibility, compared to building physical environments.

"This research validates that virtual reality can be used, as it provides an immersive environment for testing," said Dando. "Visually, virtual reality imparts qualities of the environment itself to the food being consumed - making this kind of testing cost-efficient."

Credit: 
Cornell University

Older people who self-harm at highest risk of suicide, finds study

People over 65 who harm themselves are more likely to die by suicide than other age groups, according to new research published in The Lancet Psychiatry by University of Manchester and Keele University academics.

Funded by the NIHR Greater Manchester Patient Safety Translational Research Centre, the study analysed patient records using the Clinical Practice Research Datalink (CPRD) and found that 4,124 patients harmed themselves between 2001 and 2014, mostly by taking overdoses of medication.

It showed that people over 65 who self-harm are 20 times more likely to die an unnatural death and 145 times more likely to die by suicide than people of the same age who had not self-harmed.

The study also found that only 12% of older patients who self-harmed had a record of being referred to a mental health service for aftercare. National Institute for Health and Care Excellence (NICE) guidelines suggest that the involvement of mental health specialists is important because older people who self-harm may have higher suicidal intent than younger people.

Physical health problems were more common in older patients who had harmed themselves compared to those who had not.

Following a self-harm episode, over one in ten of those aged 65 and over were prescribed tricyclic antidepressants, which can be toxic when taken in overdose.

Dr Cathy Morgan, who led the study and analysed the data, said: "With an aging population and a lack of research in this age group, this study - the first of its kind conducted in primary care - has gone some way in highlighting the risks of self-harm in older people.

"This study emphasises the need for early intervention, careful alternative prescribing and better support for older people who may see their GP following an episode of self-harm or for other health problems."

Professor Nav Kapur, one of the authors of the paper, who chaired the NICE guidelines for self-harm, said: "We sometimes think of self-harm as a problem in younger people and of course it is. But it affects older adults too and the concerning issue is the link with increased risk of suicide.

"Older people might be particularly vulnerable as they are uniquely exposed to issues such as bereavement, isolation and physical as well as mental illness.

"They also might fear the consequences of becoming a burden to their family or friends, or not being able to function from day to day."

He added: "We hope our study will alert clinicians, service planners, and policy makers to the need to implement preventative measures for this potentially vulnerable group of people. Referral and management of mental health conditions are likely to be key."

Professor Carolyn Chew-Graham from Keele University, one of the research team and a practising GP, said: "Since drug ingestion is one of the main methods of self-harm, we highlight the need to consider less toxic medication in older adults for the management of both mental illness and pain-related conditions.

"We also recommend maintaining frequent medication reviews following self-harm."

"GPs are after all in a unique position to intervene as older patients come to the surgery more frequently than younger adults."

Credit: 
University of Manchester

Men in leadership gain from psychopathic behavior, women punished

TUSCALOOSA, Ala. -- People with psychopathic tendencies are slightly more likely to be a company boss, but a new study finds men are allowed a pass for those inclinations while women are punished.

The study, published online today in the Journal of Applied Psychology, finds concern over psychopathic tendencies in bosses may be overblown, but that gender can function to obscure the real effects.

"Aggressive behavior is seen as more prototypical of men, and so people allow more displays of that kind of behavior without social sanctions," said Dr. Peter Harms, associate professor of management at The University of Alabama. "If women behave counter to gender norms, it seems like they get punished for it more readily."

Karen Landay, lead author of the paper and a doctoral student in management at UA, and co-author Dr. Marcus Credé, assistant professor of psychology at Iowa State University, combined a review of previous studies with new research.

A psychopathic personality has three characteristics: boldness in asserting dominance over others, impulsiveness without inhibition, and a lack of empathy. Several studies have shown that people with some level of those traits are overrepresented among organizational bosses.

For this study, the researchers set out to determine whether there are optimal levels of psychopathy for success as a leader and whether gender makes a difference. They found that psychopathic tendencies help slightly when a person is rising through the management ranks, but bosses behaving this way are less effective as leaders.

"Overall, although there is no positive or negative relation to a company's bottom line when psychopathic tendencies are present in organizational leaders, their subordinates will still hate them," Harms said. "So we can probably assume they behave in a manner that is noxious and whatever threats they make to 'motivate' workers don't really pay off."

When the data is broken down by gender, though, it shows that psychopathic traits in men help them emerge as leaders and be seen as effective, but these same tendencies are seen as a negative in women.

"The existence of this double standard is certainly disheartening," Landay said. "I can imagine that women seeking corporate leadership positions getting told that they should emulate successful male leaders who display psychopathic tendencies. But these aspiring female leaders may then be unpleasantly surprised to find that their own outcomes are not nearly as positive."

This double standard could be a fruitful area for future research, but, for Harms, the implications for organizations are clear.

"We should be more aware of and less tolerant of bad behavior in men," he said. "It is not OK to lie, cheat, steal and hurt others whether it is in the pursuit of personal ambition, organizational demands or just for fun."

Credit: 
University of Alabama in Tuscaloosa

Mammals cannot evolve fast enough to escape current extinction crisis

image: An illustration of how the smaller mammals will have to evolve and diversify over the next 3-5 million years to make up for the loss of the large mammals.

Image: 
Matt Davis, Aarhus University

We humans are exterminating animal and plant species so quickly that nature's built-in defence mechanism, evolution, cannot keep up. An Aarhus-led research team calculated that if current conservation efforts are not improved, so many mammal species will become extinct during the next five decades that nature will need 3-5 million years to recover.

There have been five upheavals over the past 450 million years when the environment on our planet has changed so dramatically that the majority of Earth's plant and animal species became extinct. After each mass extinction, evolution has slowly filled in the gaps with new species.

The sixth mass extinction is happening now, but this time the extinctions are not being caused by natural disasters; they are the work of humans. A team of researchers from Aarhus University and the University of Gothenburg has calculated that the extinctions are moving too rapidly for evolution to keep up.

If mammals diversify at their normal rates, it will still take them 5-7 million years to restore biodiversity to its level before modern humans evolved, and 3-5 million years just to reach current biodiversity levels, according to the analysis, which was published recently in the prestigious scientific journal, PNAS.

Some species are more distinct than others

The researchers used their extensive database of mammals, which includes not only species that still exist, but also the hundreds of species that lived in the recent past and became extinct as Homo sapiens spread across the globe. This meant that the researchers could study the full impact of our species on other mammals.

However, not all species have the same significance. Some extinct animals, such as the Australian leopard-like marsupial lion Thylacoleo, or the strange South American Macrauchenia (imagine a llama with an elephant trunk), were evolutionarily distinct lineages with only a few close relatives. When these animals became extinct, they took whole branches of the evolutionary tree of life with them. We not only lost these species, we also lost the unique ecological functions and the millions of years of evolutionary history they represented.

"Large mammals, or megafauna, such as giant sloths and sabre-toothed tigers, which became extinct about 10,000 years ago, were highly evolutionarily distinct. Since they had few close relatives, their extinctions meant that entire branches of Earth's evolutionary tree were chopped off" says palaeontologist Matt Davis from Aarhus University, who led the study. And he adds:

"There are hundreds of species of shrew, so they can weather a few extinctions. There were only four species of sabre-toothed tiger; they all went extinct."

Long waits for replacement rhinos

Regenerating 2.5 billion years of evolutionary history is hard enough, but today's mammals are also facing increasing rates of extinction. Critically endangered species such as the black rhino are at high risk of becoming extinct within the next 50 years. Asian elephants, one of only two surviving species of a once mighty mammalian order that included mammoths and mastodons, have less than a 33 percent chance of surviving past this century.

The researchers incorporated these expected extinctions in their calculations of lost evolutionary history and asked themselves: Can existing mammals naturally regenerate this lost biodiversity?

Using powerful computers, advanced evolutionary simulations and comprehensive data about evolutionary relationships and body sizes of existing and extinct mammals, the researchers were able to quantify how much evolutionary time would be lost from past and potential future extinctions as well as how long recovery would take.

The researchers came up with a best-case scenario of the future, where humans have stopped destroying habitats and eradicating species, reducing extinction rates to the low background levels seen in fossils. However, even with this overly optimistic scenario, it will take mammals 3-5 million years just to diversify enough to regenerate the branches of the evolutionary tree that they are expected to lose over the next 50 years. It will take more than 5 million years to regenerate what was lost from giant Ice Age species.

Prioritizing conservation work

"Although we once lived in a world of giants: giant beavers, giant armadillos, giant deer, etc., we now live in a world that is becoming increasingly impoverished of large wild mammalian species. The few remaining giants, such as rhinos and elephants, are in danger of being wiped out very rapidly," says Professor Jens-Christian Svenning from Aarhus University, who heads a large research program on megafauna, which includes the study.

The research team doesn't have only bad news, however. Their data and methods could be used to quickly identify endangered, evolutionarily distinct species, so that we can prioritise conservation efforts, and focus on avoiding the most serious extinctions.

As Matt Davis says: "It is much easier to save biodiversity now than to re-evolve it later."

Credit: 
Aarhus University

Us vs. them: Understanding the neurobiology of stereotypes

BIDMC Research Briefs showcase groundbreaking scientific advances that are transforming medical care.

Recent studies into how human beings think about members of other social groups reveal that biases sometimes operate beyond our conscious control. Called implicit bias, the tendency to be suspicious of people we perceive as strangers or "not like us" probably evolved early in our ancestry, when small groups of humans competed against each other for precious resources like food and water. Today, our brains' inherent tendency to stereotype can result in discrimination, injustice and conflict.

In a review published in the journal Trends in Cognitive Sciences, Alvaro Pascual-Leone, MD, PhD, and colleagues describe how non-invasive brain stimulation - a technique he and others have pioneered to unlock the secrets of the brain - could shed light on the neurobiology underlying implicit bias. What's more, Pascual-Leone and his co-authors suggest the technique may also be used to evaluate potential behavioral interventions intended to reduce stereotyping and discriminatory practices.

In non-invasive brain stimulation, an electric current applied to the outside of the skull influences brain cell activity. Neuroscientists apply the current to research participants as they perform mental tasks - filling out a crossword puzzle, for example - to determine whether a given brain region is involved in that task.

"Unlike traditional brain imaging techniques, non-invasive brain stimulation can directly impact brain activity and provide powerful evidence that specific brain regions are linked to specific social behaviors - in this case, we applied it to attitudes and stereotypes towards groups that vary in social characteristics, such as race and ethnicity," said Pascual-Leone, Chief for the Division of Cognitive Neurology and the Director of the Berenson-Allen Center for Noninvasive Brain Stimulation at BIDMC. "Modulating the brain activity in these regions can yield insights relevant to our modern, more diverse societies - in which our primitive group allegiances can be in conflict even with one's own standards of equal opportunity, fairness and justice."

In their review, Pascual-Leone and colleagues consider publications by investigators who administered the well-validated implicit bias test, in which participants quickly sort words related to social characteristics ("obese" or "thin," for example) with words that convey a value judgment (such as "lazy" or "good") while undergoing non-invasive brain stimulation. One such study demonstrated that stimulation to the brain's anterior temporal lobe reduced participants' stereotypical association between "Arab" and "terrorist." Another experiment reduced the implicit cognitive associations between "male" and "science" and "female" and "humanities."

"Social beliefs reflect associations that strongly ingrained in our brains, and changing them will likely entail the reconfiguration of their underlying biological processes," said the paper's lead author, Maddalena Marini, formerly a post doctorate fellow in the Department of Psychology at Harvard University, who approached Pascuale-Leone with the multidisciplinary research proposal. "No behavioral interventions designed to shift social beliefs so far - such as empathy training - have produced robust and long-lasting effects. Non-invasive brain stimulation techniques can provide insights that may help meet the urgent need in our society to better understand our intergroup social behavior."

Co-authors included lead author Maddalena Marini of Istituto Italiano di Tecnologia, and Mahzarin R. Banaji of the Department of Psychology, Harvard University.

This work was supported by a 2016-2017 Postdoctoral Fellow Award of the Harvard Mind Brain Behavior Interfaculty Initiative (MBB) to Maddalena Marini.

Journal

Trends in Cognitive Sciences

DOI

10.1016/j.tics.2018.07.014

Credit: 
Beth Israel Deaconess Medical Center

Cumulative sub-concussive impacts in a single season of youth football

In an investigation of head impact burden and change in neurocognitive function during a season of youth football, researchers find that sub-concussive impacts are not correlated with worsening performance in neurocognitive function.

Each year, more than 3 million children in primary and high school play tackle football in the United States. Growing concern about the possible negative effects of repetitive sub-concussive head impacts has led an increasing number of physicians and parents to counsel against youth participation in full-contact sports.

A research team, led by Sean Rose, MD, pediatric sports neurologist and co-director of the Complex Concussion Clinic at Nationwide Children's Hospital, followed 112 youth football players age 9-18 during the 2016 season in a prospective study.

"When trying to determine the chronic effects of repetitive sub-concussive head impacts, prospective outcomes studies are an important complement to the existing retrospective studies," says Dr. Rose. "In this study of primary school and high school football players, a battery of neurocognitive outcomes tests did not detect any worsening of performance associated with cumulative head impacts."

The pre- and post-season assessments used to measure outcomes included:

Neuropsychological testing

Symptoms assessment

Vestibular and ocular-motor screening

Balance testing

Parent-reported ADHD symptoms

Self-reported behavioral adjustment

Sensors placed in the helmets recorded sub-concussive head impacts during practices and games. Researchers added the impact g-forces to yield a cumulative impact measure. According to the study, cumulative impact did not predict changes (from pre-season to post-season) in any of the outcome measures.

Additionally, Dr. Rose notes, having sustained one or more concussions prior to entering the study was not associated with worse pre-season testing.
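As a rough illustration of the kind of season-long burden metric described above, the sketch below sums per-impact peak g-forces for each player. This is a minimal, hypothetical Python example: the record format, field names and recording threshold are illustrative assumptions, not details taken from the study's actual sensor pipeline.

from collections import defaultdict

# Hypothetical helmet-sensor records: (player_id, peak linear acceleration in g)
impacts = [
    ("player_01", 22.5),
    ("player_01", 48.0),
    ("player_02", 15.3),
    ("player_02", 61.7),
    ("player_02", 19.9),
]

RECORDING_THRESHOLD_G = 10.0  # assumed sensor cutoff; real systems vary

def cumulative_impact(records, threshold=RECORDING_THRESHOLD_G):
    """Sum recorded peak g-forces per player across a season of impacts."""
    totals = defaultdict(float)
    for player_id, peak_g in records:
        if peak_g >= threshold:  # ignore readings below the recording cutoff
            totals[player_id] += peak_g
    return dict(totals)

for player, total_g in cumulative_impact(impacts).items():
    print(f"{player}: cumulative impact = {total_g:.1f} g")

In the study, a measure of this kind was related to pre- to post-season changes in the outcomes listed above; the sketch is included only to make the summing step concrete.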

In their secondary analysis, they found that younger age and reported history of attention deficit hyperactivity disorder (ADHD) predicted score changes on several cognitive testing measures and parent-reported ADHD symptoms. Additionally, a reported history of anxiety or depression predicted changes in scores of symptom reporting.

"We expected repetitive impacts to correlate with worsening neurocognitive function, but we found that sub-concussive head impacts sustained over the course of a single season were not associated with neurocognitive functional outcomes. And also surprising, sustaining isolated high g-force impacts was also not associated with worse outcome," says Dr. Rose. "The lack of a significant association may reflect the need for longer follow up - so we are continuing to follow kids across multiple seasons."

This publication is the first analysis in a four-year prospective cohort study. Dr. Rose will be presenting data from the second year of the study at the upcoming Child Neurology Society meeting in mid-October. The team is currently collecting data for a third year.

Credit: 
Nationwide Children's Hospital

Survey shows widespread skepticism of flu shot

video: More than half of parents think their children can get the flu from the vaccine, a third say it doesn't work.

Image: 
Orlando Health

When a child gets the flu, they're not only sick in bed for a week or more, but the illness can also have serious and even life-threatening consequences. In fact, 180 children died after contracting the flu during the 2017-2018 season, one of the most severe on record. Despite the consensus of the medical community, a new national survey by Orlando Health Arnold Palmer Hospital for Children finds that a shocking number of parents are still skeptical about the safety and effectiveness of the flu shot.

"We know that there are a lot of myths and misconceptions about the flu," said Jean Moorjani, MD, a board-certified pediatrician at Arnold Palmer Hospital. "In this day and age we have so many ways to get information, so if anybody has questions or concerns, we recommend they talk to a doctor they trust to get the right information about what's best to protect themselves and their families."

The survey found that more than half of parents with children under age 18 believe that their child can get the flu from the flu shot, while a third think that the shot does not protect against the flu.

Moorjani says many parents are unaware that it takes about two weeks after getting the vaccine for the body to build up antibodies to adequately protect against the flu, during which someone is still susceptible to contracting the virus. When someone gets sick within that time period, they may incorrectly assume that the flu was caused by the shot.

"The parts of the virus that are used in the vaccine are completely dead, so you cannot get the flu from the flu shot," said Moorjani. "It takes time for your body to get strong and ready for flu season, which is why we recommend everybody get the shot as soon as they can. If you are infected with the flu shortly after getting your flu shot, your body may not be able to fight it off."

The survey also found that many parents question the vaccine's safety as well as its effectiveness: 30 percent think that the flu shot is a conspiracy, while 28 percent believe it can cause autism.

"After extensive studies, we know that the flu vaccine is safe," said Moorjani. "You cannot get autism from the flu vaccine. It is not a conspiracy for doctors to recommend the flu vaccine. Doctors recommend it because we know -- based on science, research and facts -- that it is the best way to protect yourself and your family against the flu."

Ehren McMichael makes it a point to take her three children to get their flu shots every fall. While she's aware that the shot is not 100 percent effective in preventing the flu, she knows that it is the best method available to keep her family healthy through flu season.

"My youngest child got the flu last year before we were able to get our flu shots, and he was miserable for about four days," said McMichael. "I know moms who believe a lot of the misinformation that's out there, but I think that our pediatrician is the best person to get our information from. As a parent, the flu shot is just another level of protection I can give my kids, and with so many places offering flu shots, it's really simple."

Experts recommend that everyone over the age of six months get the flu shot, and the sooner the better. If you have the choice, doctors say the shot has been found to be more effective than the nasal spray in preventing the flu. If your child is under the age of eight and it is the first time they've received the flu shot, they will need two shots, spaced a month apart, to build up their resistance. Getting the shot not only helps protect those who receive it, but also vulnerable populations, such as infants, who can't be vaccinated and are more likely to experience serious health effects from the flu.

Credit: 
MediaSource

Costs of Medicare diabetes prevention program may exceed reimbursements

October 15, 2018 - For some healthcare providers - especially those serving racial/ethnic minority and low-income patients - the costs of delivering a new Medicare Diabetes Prevention Program (MDPP) may be much higher than the expected reimbursement, reports a study in the November issue of Medical Care. The journal is published in the Lippincott portfolio by Wolters Kluwer Health.

Payments may cover as little as one-fifth of the costs of delivering recommended diabetes prevention services in "safety-net" healthcare settings, according to the brief report by Natalie D. Ritchie, PhD, of Denver Health and Hospital Authority and R. Mark Gritz, PhD, of University of Colorado School of Medicine. "While many MDPP suppliers are needed to reach all Medicare beneficiaries with prediabetes, insufficient reimbursement may be a deterrent," the researchers write.

Analysis Suggests $661 Gap Between MDPP Costs and Payments

The MDPP is an innovative program to prevent diabetes in Medicare beneficiaries. A previous report on the National Diabetes Prevention Program suggested that educational and coaching sessions over one year reduce the risk of developing type 2 diabetes in older adults at high risk. The MDPP targets older adults meeting criteria for "prediabetes" - estimated to be present in about 48 percent of US seniors.

Under a "pay-for-performance" scheme announced last year, healthcare providers will be reimbursed for MDPP services based on patients' attendance and weight loss. The program targets a five percent reduction in body weight, which can delay or prevent type 2 diabetes. But there are concerns about whether reimbursements will cover the costs of MDPP services - particularly in minority and low-income populations, who have a disparately high risk of diabetes.

Drs. Ritchie and Gritz analyzed the costs of and expected reimbursement for providing MDPP services to 213 Medicare beneficiaries with prediabetes or other diabetes risk factors in Denver's safety-net healthcare system. Most patients were of minority race/ethnicity (41 percent Hispanic, 32 percent black) and classified as low-income (about 70 percent).

Average projected reimbursement was about $139 per patient, based on coverage rules issued by the Centers for Medicare and Medicaid Services. Consistent with previous studies, outcomes of the MDPP intervention were not as good in this group of largely minority, low-income Medicare beneficiaries. Average weight loss was just under two percent, which may still be beneficial for reducing diabetes risk. Less than five percent of participants met all milestones (attendance and weight loss) needed to reach the maximum payment of $470 for a full year of services.

By comparison, the costs of delivering the program were estimated at $800 per patient. Subtracting the average payment of $139, there was a $661 gap between the costs of the program and the expected payments. Expenses for coaching staff accounted for a little over half of the costs of providing MDPP services. "Even if program delivery costs were reduced and Medicare beneficiaries performed better in the future, it appears that an unsustainable funding gap might likely remain," commented Dr. Ritchie.

Preventing diabetes is a major priority for improving health and reducing healthcare costs in older Americans. The introduction of the MDPP offers an "unprecedented opportunity to provide lifestyle intervention to Medicare beneficiaries with prediabetes," the researchers believe.

Their study adds to concerns that reimbursement provided by Medicare will fall short of covering the costs for providing MDPP services - especially for healthcare providers serving diverse, underserved patient populations. The payment gap could limit access to recommended diabetes prevention services and contribute to widening health disparities.

Drs. Ritchie and Gritz present an analysis suggesting that reimbursing the full beneficiary cost of $800 per patient would pay for itself within the first year due to healthcare expenditures avoided - and bring an even larger return on investment in subsequent years. The authors conclude: "The high economic and societal costs of diabetes and the importance of reducing health disparities suggest that financially sustainable payments are necessary to ensure benefit access, while yielding cost-savings for Medicare."

Credit: 
Wolters Kluwer Health

Cesarean-born mice show altered patterns of brain development, study finds

image: This is Dr. Nancy Forger, director of the Neuroscience Institute at Georgia State University.

Image: 
Georgia State University

ATLANTA--Cesarean-born mice show altered patterns of cell death across the brain, exhibiting greater nerve cell death than vaginally delivered mice in at least one brain area. The finding, by Georgia State University researchers, suggests that birth mode may have acute effects on human neurodevelopment that could lead to long-lasting changes in the brain and behavior.

The team of neuroscientists examined the effect of birth mode (vaginal delivery versus Cesarean section) on neuronal cell death, an important process that reshapes neural circuits early in development. This process, which takes place in mice during the first week after birth, also occurs in humans. Their study's findings are published in the journal Proceedings of the National Academy of Sciences.

With the advent of modern medicine, Cesarean births, also known as C-sections, are becoming a widespread practice around the world. In the United States, C-sections account for about 30 percent of births every year, and many of these are elective. Cesarean births have been linked to behavioral effects in the offspring, which suggests effects on the brain, but human studies are confounded by the medical complications, altered birth timing and maternal factors often associated with Cesarean delivery.

Dr. Alexandra Castillo-Ruiz, Dr. Nancy Forger and their students in the Neuroscience Institute at Georgia State addressed these limitations in a carefully controlled study in mice by examining the brains of offspring before and after a vaginal or Cesarean birth up to weaning age, carefully matching pups for time of delivery.

They found that vaginally delivered mice had a decrease in cell death across the brain within hours of birth, but this did not occur in Cesarean-born offspring. The most dramatic difference was seen in a region of the hypothalamus that regulates the stress response and brain-immune interactions. The greater cell death in Cesarean-delivered newborns was associated with a reduction in the number of neurons in at least one brain area and was also associated with altered behavior in a maternal separation test.

Birth mode did not affect general measures of development such as total brain size or day of eye-opening in juvenile mice. However, the authors did observe increased weight gain in Cesarean-born mice at weaning age, which is consistent with clinical reports of higher body mass index in humans born by C-section.

Credit: 
Georgia State University

Males have greater reproductive success if they spend more time taking care of kids

Male mountain gorillas who spend more time taking care of kids are more reproductively successful

Research challenges assumptions scientists have made about paternal care in gorillas and other primates

Researcher: "Our findings suggest an alternative pathway by which evolution might have eventually led to fathering behavior in humans."

EVANSTON, Ill. --- Males have greater reproductive success if they spend more time taking care of kids -- and not necessarily only their own, according to new research published by anthropologists at Northwestern University.

In a previous study, the researchers found that wild male mountain gorillas living in Rwanda do something that is quite unusual for a mammal -- they help take care of all of the kids that live in their social group, regardless of whether they are the father. The goal of the new study was to figure out why.

"Mountain gorillas and humans are the only great apes in which males regularly develop strong social bonds with kids, so learning about what mountain gorillas do and why helps us understand how human males may have started down the path to our more involved form of fatherhood," said Stacy Rosenbaum, lead author of the study and a post-doctoral fellow in anthropology at Northwestern.

Christopher Kuzawa, a co-author of the study, said the findings run counter to how we typically think of male mountain gorillas -- huge, competitive and with reproduction in the group dominated by a single alpha male.

"Males are spending a lot of time with groups of kids -- and those who groom and rest more with them end up having more reproductive opportunities," said Kuzawa, professor of anthropology at Northwestern and a faculty fellow at the University's Institute for Policy Research. "One likely interpretation is that females are choosing to mate with males based upon these interactions."

Added Rosenbaum: "We've known for a long time that male mountain gorillas compete with one another to gain access to females and mating opportunities, but these new data suggest that they may have a more diverse strategy. Even after multiple controls for dominance ranks, age and the number of reproductive chances they get, males who have these bonds with kids are much more successful."

This research suggests an alternative route by which fathering behaviors might have evolved in our own species, Rosenbaum said.

"We traditionally have believed that male caretaking is reliant on a specific social structure, monogamy, because it helps ensure that males are taking care of their own kids. Our data suggest that there is an alternative pathway by which evolution can generate this behavior, even when males may not know who their offspring are," Rosenbaum said.

This raises the possibility that similar behaviors could have been important in the initial establishment of fathering behaviors in distant human ancestors.

The researchers are currently investigating whether hormones might play a role in helping facilitate these male behaviors, as they do in humans. Seminal work on the hormonal changes that men experience as they become fathers and care for kids has been conducted in the anthropology department at Northwestern.

"In human males, testosterone declines as men become fathers, and this is believed to help focus their attention on the needs of the newborn," said Kuzawa, who co-authored a study on this topic in the journal Proceedings of the National Academy of Sciences in 2011. "Might gorillas that are particularly engaged in infant interaction experience similar declines in testosterone? Because this would probably impede their ability to compete with other males, evidence that testosterone goes down would be a clear indication that they must be gaining some real benefit -- such as attracting mates. Alternatively, if it does not go down, this suggests that high testosterone and caretaking behavior don't have to be mutually exclusive in mountain gorillas."

The researchers look forward to exploring these new questions. "We're working on characterizing these males' hormone profiles across time, to see if events such as the birth of new infants might be related to their testosterone levels," Rosenbaum said. "We're fortunate to have data that span many years of their lives."

The study's senior author, Tara Stoinski of The Dian Fossey Gorilla Fund, added that such work highlights the critical importance of long-term research studies.

"Dian Fossey first went to study these mountain gorillas in the 1960s hoping to further our understanding of human evolution," Stoinski said. "More than 50 years later, the continued research on this population is still providing insights, not only on a critically endangered species, but also into what it means to be human."

Credit: 
Northwestern University

Study reveals best use of wildflowers to benefit crops on farms

image: This is a bumblebee foraging on a cup plant. Cup plants attracted the most bees to wildflower strips, according to a survey done by Cornell researchers.

Image: 
Heather Grab

ITHACA, N.Y. - With bee pollinators in decline and pesky crop pests lowering yields, sustainable and organic farmers need environmentally friendly solutions.

One strategy is to border crops with wildflower plantings to attract pollinators and pest predators. But scientists have suggested that such plantings may only be effective when farms are surrounded by the right mix of natural habitat and agricultural land.

For the first time, a Cornell University study of strawberry crops on New York farms tested this theory and found that wildflower strips on farms added pollinators when the farm lay within a "Goldilocks zone," where 25 to 55 percent of the surrounding area contained natural lands. Outside this zone, flower plantings also drew more strawberry pests, while having no effect on wasps that kill those pests.

Still, more pollinators in this ideal landscape zone boosted strawberry yields overall.

The analysis has implications for many types of state and federal programs in the United States and abroad that promote establishing pollinator habitats on farms.

"We're investing huge amounts of money on these programs and right now it's not part of the policy to think about the landscape context of where these habitats are placed," said Heather Grab, Ph.D. '17, the paper's first author and a postdoctoral researcher in the lab of Katja Poveda, associate professor of entomology and a co-author of the study.

The paper, "Landscape Context Shifts the Balance of Costs and Benefits From Wildflower Border on Multiple Ecosystem Services," published Aug. 1 in Proceedings of the Royal Society B, suggests targeting wildflower borders to farms with the right conditions and modifying wildflower plant species could maximize success.

The rationale behind the Goldilocks zone theory: wildflower strips surrounded by too much natural land would not add additional beneficial insects, because ample habitat would drown out a small strip of flowers. On the other hand, farms surrounded by other farms are already low on natural habitat and beneficial insects, making a wildflower strip too small to attract more insects.

"It's in this zone in the middle, where there's enough natural habitat around, and there are beneficial insects there, and you can attract them [insects] from the natural habitat into the crop habitat to actually see a benefit in terms of crop production," Grab said.

In the study, the researchers planted strawberry plots on 12 small New York state farms that represented a gradient of landscapes, from farms surrounded by natural habitat to farms next to agricultural lands. Each farm had two strawberry plantings, one plot bordered by a wildflower strip, and another control plot on the other side of the farm, edged with mowed grass.

The researchers conducted surveys of pollinators, pests, wasps that parasitize pests, fruit yield and fruit damage over three years. The tiny parasitizing wasps lay their eggs inside tarnished plant bug nymphs - a pest that costs New York strawberry growers 30 percent of their annual yield. When the eggs hatch, the larvae feed on the nymphs.

The wildflower strips were increasingly effective at attracting pollinators as each year passed. The result that "between 25 and 55 percent [surrounding natural landscapes] was the best range in terms of promoting bees," closely matched what the Goldilocks theory predicted, Grab said.

But when it came to pests, wildflowers outside the Goldilocks zone attracted the most pests and didn't add more wasps. "It suggests the parasitoids are not responding to wildflower strips at all," Grab said. More study is needed to understand why.

Analyses revealed many wildflower species attracted both pests and bees, but some species like fleabane (Erigeron annuus) lured the most pests and were least effective at drawing bees.

"If you wanted to optimize the wildflower patches, I would suggest we eliminate some of those from the list of recommended species in the plantings," Grab said.

Future work will investigate how floral habitats influence pathogen transmission for bees, the leading driver of bee declines.

Bryan Danforth and Greg Loeb, Cornell professors of entomology, co-authored the study, which was funded in part by a Northeast Sustainable Agriculture Research and Education grant.

Credit: 
Cornell University

Feminine leadership traits: Nice but expendable frills?

Despite expectations that stereotypically feminine leadership traits like communality will define 21st century leaders, the higher up we look across different types of organizations, the fewer women we find. A new study exploring this apparent contradiction reveals these communal leader traits -- like being tolerant and cooperative -- are viewed as desirable but ultimately superfluous add-ons. Instead, both men and women believe successful leaders need stereotypically masculine traits such as assertiveness and competence. Published in Frontiers in Psychology, the study is the first to examine potential tradeoffs in masculine versus feminine leadership traits and provides insights into the continued concentration of men in top leadership roles.

"When looking at the tradeoff that people make between communal and agency leadership traits, we found both men and women continue to see agentic traits as the hallmark of leadership. These are traits that are often associated more with men than with women," says Dr Andrea Vial of New York University, USA, who carried out the study along with Dr Jaime Napier of New York University Abu Dhabi.

"The idea that one must be highly agentic in order to succeed as a leader could discourage women from pursuing a high-power role -- and also discourage men from appointing women in such roles."

The researchers asked 273 men and women to design their ideal leader by "purchasing" traits from a list of stereotypically masculine (i.e., agentic) and feminine (i.e., communal) leadership characteristics. The results indicate that when the choice is not constrained, communal traits are valued. However, when the choice is limited, both men and women view competence and assertiveness as more of a necessity and communality as more of a luxury -- with the effect being stronger in men for competence.

The participants were also asked how much they would pay to minimize negative leadership traits. Here, both men and women would rather curb traits typically thought of as masculine, such as arrogance and stubbornness, than those typically thought of as feminine, such as shyness and being emotional -- with the desire to curb negative masculine traits being stronger among women.

The study further examined how a different sample of 249 men and women think of themselves in either a leadership role or an assistant role, and what kinds of attributes they would need to be effective. Here, both men and women equally think they should primarily be agentic in order to be a successful leader. In contrast, they view communal attributes as important to help them succeed in low-power assistant positions.

"Our results underscore that women internalize a stereotypically masculine view of leadership," says Vial. "Although women seem to value communality more than men when thinking about other leaders, they may feel that acting in a stereotypically feminine way themselves could place them at a disadvantage compared to male leaders."

Although it examined a selection of leader attributes without specific context, the study -- which is part of a research collection on gender roles in the future -- makes an important contribution to illuminating the continued scarcity of women at the very top of organizations.

"Our results suggest that the concentration of men in top decision-making roles such as corporate boards and chief executive offices may be self-sustaining because men in particular tend to devalue more communal styles of leadership -- and men are typically the gatekeepers to top organizational positions of prestige and authority," explains Vial.

On a positive note, the study indicates women might be more supportive than men of leaders with communal leadership styles.

"While it may be too soon to tell whether more communal traits will indeed define the leaders of the 21st century, our research suggests women might be more willing to embrace this trend," says Vial.

Credit: 
Frontiers