Concussion may bring greater risks for athletes with ADHD

INDIANAPOLIS - Athletes who have attention deficit hyperactivity disorder (ADHD) may be at greater risk for experiencing persistent anxiety and depression after a concussion than people who do not have ADHD, according to a preliminary study released today that will be presented at the American Academy of Neurology's Sports Concussion Conference in Indianapolis, July 20 to 22, 2018. ADHD is a brain disorder that affects attention and behavior.

"These findings suggest that ADHD and concussion may have a cumulative effect on anxiety and depression beyond that of either ADHD or concussion alone," said study author Robert Davis Moore, MS, PhD, of the University of South Carolina in Columbia. "Athletes with ADHD should be monitored with this in mind, as they may be more susceptible to experiencing symptoms of depression and anxiety following a concussion."

The study involved 979 NCAA Division I college athletes at the University of South Carolina. Information on ADHD diagnosis and any history of concussion was gathered, along with the athletes' scores on questionnaires measuring anxiety and depression, administered before the start of their sporting seasons.

Athletes were divided into four groups: those with ADHD who also had experienced concussion; those with ADHD who had not experienced a concussion; those with concussion and no ADHD; and those with neither a history of concussion nor ADHD.

The researchers found that athletes with both ADHD and concussion had significantly higher scores on the tests for anxiety and depression than any of the other groups. Moore noted that athletes with a history of concussion were evaluated six or more months after the injury, indicating that the differences lasted longer than what might be expected in the weeks after the concussion.

Athletes with ADHD but no history of concussion did not show increased anxiety or depression.

The anxiety test asks people how often they agree with statements such as "I am tense; I am worried" and "I worry too much over something that really doesn't matter," with answers ranging from "Almost never" to "Almost always." Scores range from 20 to 80. The athletes with both concussion and ADHD had average scores of 42 on the test, compared to an average score of 33 for the other three groups.

The depression test asks how often during the past week people have agreed with statements such as "I did not feel like eating; my appetite was poor" and "I felt that everything I did was an effort," with answers ranging from "Rarely or none of the time (less than 1 day)" to "Most or all of the time (5-7 days)." Scores range from zero to 60, with scores of 16 or higher indicating that a person may be at risk for clinical depression. The athletes with both concussion and ADHD had average scores of 26, compared to an average score of 16 for the other three groups.
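The scoring logic described above can be sketched in a few lines. This is an illustration only: the release does not name the questionnaire, and the 20-item count is an assumption inferred from the 0-to-60 range with four response options (0 to 3) per item.

```python
# Illustrative scoring for a depression questionnaire of the kind described:
# items answered 0 ("rarely or none of the time") through 3 ("most or all
# of the time") are summed to a total of 0-60, and totals of 16 or higher
# flag possible risk of clinical depression. The 20-item count is an
# assumption inferred from the stated 0-60 range, not from the release.

RISK_CUTOFF = 16

def depression_score(item_responses):
    """Sum per-item responses (each 0-3) into a total score."""
    assert all(0 <= r <= 3 for r in item_responses), "responses must be 0-3"
    return sum(item_responses)

def at_risk(score):
    """True if the total score meets the clinical-risk cutoff."""
    return score >= RISK_CUTOFF

# The ADHD-plus-concussion group averaged 26 (above the cutoff), while the
# other three groups averaged 16 (exactly at the cutoff).
print(at_risk(26))  # True
print(at_risk(16))  # True
```

Under this reading, the average score of the ADHD-plus-concussion group sits well above the risk threshold, while the other groups' average lands right at it.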

A limitation of the study is that it is cross-sectional, which means it looks at one point in time and does not show cause and effect. Moore said additional research is needed with multiple tests both before and after any concussions occur.

Credit: 
American Academy of Neurology

Hospitals may take too much of the blame for unplanned readmissions

BIDMC Research Briefs showcase groundbreaking scientific advances that are transforming medical care.

A major goal of hospitals is to prevent unplanned readmissions of patients after they are discharged. A new study reveals that the preventability of readmissions changes over time: readmissions within the first week after discharge are often preventable by the hospital, whereas later readmissions are often related to patients' difficulty accessing outpatient clinics.

"Patients discharged from a hospital are usually recovering from a serious medical condition as well as managing other chronic medical conditions, and they often encounter new logistical challenges adapting to this recovery period," said Kelly Graham, MD, MPH, Director of Ambulatory Residency Training at BIDMC and an Instructor in Medicine at Harvard Medical School. "Hospitals and outpatient clinics must work together more seamlessly to ensure that patients are equipped to manage these challenges at home."

For their study, published in the Annals of Internal Medicine, Graham and her colleagues examined information on 822 general medicine patients readmitted to 10 academic medical centers in the United States. Overall, 36.2% of early readmissions versus 23.0% of late readmissions were deemed preventable. Hospitals were identified as better locations for preventing early readmissions, whereas outpatient clinics and home were better for preventing late readmissions.

Premature discharge and problems with physician decision-making related to diagnosis and management during the initial hospitalization were likely causes of readmissions in the early period. Later readmissions, which are more likely to be amenable to interventions outside the hospital, were most often caused by factors over which the hospital has less direct control, such as the monitoring and management of symptoms after discharge by primary care clinicians, as well as end-of-life issues.

Taken together, the findings suggest that readmissions in the week after discharge are more preventable and more likely to be caused by factors over which the hospital has direct control than those later in the 30-day window. In the current US system, however, unplanned readmissions within the 30 days after hospital discharge are considered uniformly preventable by hospitals, and thus hospitals are punished with financial penalties for these events.

"Our findings suggest that the 30 days following hospital discharge are not the same with regard to what influences outcomes for sick patients, and that the current model over-simplifies this high-risk time," said Graham. "One potential unintended consequence of this is that outpatient environments have not been involved in efforts to manage this high-risk timeframe, which results in poorly coordinated care and worse outcomes for our patients."

Graham noted that interventions to improve outcomes after hospital discharge should engage the ambulatory care system, with attention to improving access to primary care. "We should also be careful not to put too much focus on reducing length of stay in the hospital, which may be a driver of premature discharge and early readmissions," she said.

Credit: 
Beth Israel Deaconess Medical Center

Understanding the social dynamics that cause cooperation to thrive, or fail

image: Despite their reputation, spotted hyenas are often cooperative animals, dwelling in large groups and assisting one another during hunts. Penn biologist Erol Akçay modeled a theoretical social group to show how cooperation can arise or collapse.

Image: 
Amiyaal Ilany

Examples of cooperation abound in nature, from honeybee hives to human families. Yet it's also easy enough to find examples of selfishness and conflict. Studying the conditions that give rise to cooperation has occupied researchers for generations, with implications for understanding the forces that drive workplace dynamics, charitable giving, animal behavior, even international relations.

A basic tenet of these studies is that cooperative behavior arises when individuals interacting in a social network derive some benefit from being generous with one another. Yet social networks are not fixed. What if the structure of the network itself alters as individuals become more cooperative?

In a new report in the journal Nature Communications, Erol Akçay, an assistant professor of biology in Penn's School of Arts and Sciences, addresses this question of how an evolving social network influences the likelihood of cooperation in a theoretical social group. He finds that, although networks where connected individuals are closely related are more likely to cooperate, such groups can trigger a feedback loop that alters the structure of the network and leads to cooperation's collapse.

"We know from a half-century of study that cooperation is quite easy to evolve in principle," says Akçay, "in the sense that there are many, many sets of conditions that can make cooperative behaviors a better strategy than non-cooperative behaviors. So given that, why isn't the world a cooperative paradise? Because we know it isn't."

Akçay's theoretical work points to a reason why: It's possible that the social structure that gave rise to high levels of cooperation may not be stable in such a cooperative environment. Yet, his model also suggests that cooperation can be maintained if the benefits of generous behavior are great enough.

The work builds upon studies that Akçay pursued with former postdoctoral researcher Amiyaal Ilany, now a faculty member at Bar-Ilan University. They developed a mathematical model of how individual animals inherit their social connections that can explain the structure of social networks in animal groups. In the new work, Akçay built on that earlier model by adding in an element of choice; individuals in the network could either connect with a parent's connection, or randomly with individuals aside from a parent's connections. The probabilities of making each type of connection determine the structure of the network.

Each individual in the model was further designated to be either a cooperator or a defector. Cooperators provide a benefit to those they connect with, but the total amount they provide is fixed, so the more connections they have, the less each connection receives. Both cooperators and defectors reap a benefit based on the number of links to cooperators they possess, but defectors don't offer anything in return.
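The payoff rule described above can be sketched as a short simulation step. This is a minimal illustration of the verbal description only, not the paper's actual model; the variable names and the benefit value are my own assumptions.

```python
# Minimal sketch of the payoff rule described above (names and the benefit
# value B are illustrative assumptions, not taken from the paper): each
# cooperator splits a fixed total benefit B equally among its neighbors, so
# more connections mean a smaller share per neighbor. Every individual's
# payoff is the sum of benefits received from cooperating neighbors;
# defectors receive benefits but contribute nothing.

B = 1.0  # fixed total benefit each cooperator distributes (assumed value)

def payoffs(neighbors, is_cooperator):
    """neighbors: dict mapping node -> set of connected nodes (undirected).
    is_cooperator: dict mapping node -> bool."""
    received = {node: 0.0 for node in neighbors}
    for node, links in neighbors.items():
        if is_cooperator[node] and links:
            share = B / len(links)  # fixed total, split across connections
            for other in links:
                received[other] += share
    return received

# Tiny example: cooperator 'a' linked to 'b' and 'c'; 'b' is a defector.
net = {'a': {'b', 'c'}, 'b': {'a'}, 'c': {'a'}}
coop = {'a': True, 'b': False, 'c': True}
print(payoffs(net, coop))
# 'a' receives 1.0 (all of cooperator 'c''s benefit);
# 'b' and 'c' each receive 0.5 (half of 'a''s benefit).
```

The sketch makes the defector's advantage concrete: 'b' collects a share of 'a''s benefit while paying nothing out, which is the dynamic that can erode cooperation once random links to defectors become common.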

Somewhat intuitively, Akçay found that groups with low levels of random linking--that is, connections not made through a parent--were more likely to have cooperation emerge, because they resulted in high relatedness between connected individuals. In contrast, the probability of making connections through one's parent had a relatively small effect on cooperation. But when he let the model continue to run, he found something he hadn't anticipated.

"If you suddenly find yourself in a population where most individuals are cooperators," he says, "then you shouldn't be selective about who you connect to, you should just make links with anyone who comes along."

In other words, in a mostly cooperative population, making random links is just as beneficial as only making links to your parent's connections. That leads to a situation in which cooperators begin forming connections with defectors, triggering a decline in the overall cooperative nature of the network.

"If everyone is handing out candy," Akçay says, "you should just go collect candy from everyone without being too selective about the connection."

But Akçay did find a way to push back against the descent into defection. When making a social link is costly--such as the time primates spend grooming one another, or the effort that goes into remembering to send a holiday gift to distant relatives--the likelihood of making random links goes down, and so, too, does the probability that a cooperative society will collapse into selfishness.

In future work, Akçay hopes to consider other elements of social groups that influence the rise of cooperation beyond social network structure, including individual preference, life history, and the costs and benefits of cooperating. Factoring in these other aspects of a group dynamic may, he hopes, shed light on strategies to foster more cooperation and generosity in our own society.

Credit: 
University of Pennsylvania

The more you smoke, the greater your risk of a heart rhythm disorder

Sophia Antipolis, 12 July 2018: The more you smoke, the greater your risk of a heart rhythm disorder called atrial fibrillation. That's the finding of a study published today in the European Journal of Preventive Cardiology, a European Society of Cardiology (ESC) journal.(1)

The study found a 14% increase in the risk of atrial fibrillation for every ten cigarettes smoked per day. There was a linear dose-response relationship, meaning that the risk increased with each additional cigarette smoked.

Compared to people who had never smoked, current smokers had a 32% increased risk of atrial fibrillation, while ever smokers (current and former smokers combined) had a 21% increased risk, and former smokers had a 9% increased risk - providing further evidence of a dose-response relationship.

"If you smoke, stop smoking and if you don't smoke, don't start," said study author Dr Dagfinn Aune, postdoctoral researcher at Imperial College London, UK, and associate professor at Bjørknes University College in Oslo, Norway. "We found that smokers are at increased risk of atrial fibrillation, but the risk is reduced considerably in those who quit."

Smoking is a lethal addictive disorder.(2) A lifetime smoker has a 50% probability of dying due to smoking, and on average will lose ten years of life. Slightly less than half of lifetime smokers will continue smoking until death. The rate of smoking is declining in Europe, but it is still very common and is increasing in women, adolescents and the socially disadvantaged.

Atrial fibrillation is the most common heart rhythm disorder (arrhythmia). It causes 20-30% of all strokes and increases the risk of dying prematurely.(3) One in four middle-aged adults in Europe and the US will develop atrial fibrillation. It is estimated that by 2030 there will be 14-17 million patients with atrial fibrillation in the European Union, with 120,000-215,000 new diagnoses each year.

Few studies have assessed whether there is a dose-response relationship between the number of cigarettes smoked and the risk of atrial fibrillation. The authors of the current study investigated this issue by conducting a meta-analysis of 29 prospective studies from Europe, North America, Australia and Japan with a total of 39,282 incident cases of atrial fibrillation among 677,785 participants.

Compared to zero cigarettes per day, smoking five, ten, 15, 20, 25 and 29 cigarettes per day was associated with a 9%, 17%, 25%, 32%, 39%, and 45% increased risk of atrial fibrillation, respectively.

Every ten pack-years of smoking was associated with a 16% increased risk of developing atrial fibrillation. Pack-years are calculated by multiplying the number of packs of cigarettes smoked per day by the number of years the person has smoked.
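The pack-years arithmetic described above is simple enough to show directly. Note that applying the 16%-per-10-pack-years figure multiplicatively, as done below, is a rough illustration on my part; the study's exact dose-response model may differ.

```python
# Pack-years as described: packs smoked per day multiplied by years smoked.
# The 20-cigarettes-per-pack default is the standard convention.

def pack_years(cigarettes_per_day, years, cigarettes_per_pack=20):
    return (cigarettes_per_day / cigarettes_per_pack) * years

py = pack_years(20, 15)  # one pack a day for 15 years
print(py)  # 15.0

# Rough illustration of the reported association: a 16% increase in risk
# per 10 pack-years, compounded multiplicatively (an assumption; the
# study's dose-response model may be specified differently).
relative_risk = 1.16 ** (py / 10)
print(round(relative_risk, 2))  # 1.25
```

So a one-pack-a-day smoker of 15 years accumulates 15 pack-years, corresponding under this rough reading to roughly a 25% higher risk of atrial fibrillation than a never-smoker.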

European guidelines on the prevention of cardiovascular disease recommend avoiding tobacco in any form.(2) All types of smoked tobacco, including low-tar ("mild" or "light") cigarettes, filtered cigarettes, cigars, pipes, and water pipes are harmful.

Dr Aune said: "Our results provide further evidence of the health benefits of quitting smoking and, even better, to never start smoking in the first place. This is important from a public health perspective to prevent atrial fibrillation and many other chronic diseases."

Dr Aune noted that more research is needed to identify the duration of smoking cessation needed to reduce the risk of atrial fibrillation, and whether the risk at some point reaches that of people who have never smoked.

Credit: 
European Society of Cardiology

The Lancet Psychiatry: Automated virtual reality-based psychological therapy may help reduce fear of heights

Psychological therapy delivered by a virtual reality coach can help people with a clinically diagnosed fear of heights overcome their fear, according to a randomised controlled trial of 100 people published in The Lancet Psychiatry journal.

The study is the first to use virtual reality technology as a treatment without a therapist, providing a proof of concept for how some psychological interventions might be offered in future. However, more research is needed to understand how automated therapy would apply in other conditions, including more severe mental health disorders such as psychosis, where therapy is currently delivered by experienced mental health professionals.

A fear of heights is the most common phobia: one in five people report a fear of heights during their lifetime, and one in 20 have a clinically diagnosed fear of heights.

In previous research, people with a fear of heights used virtual reality training in sessions with a therapist. That study found that it was as effective as exposure to heights in real life, and that the reduced fear lasted for at least a year.

In the new study, 100 people with clinically diagnosed fear of heights who were not receiving psychological therapy for it were given either the new automated virtual reality treatment (49 people) or usual care, which was typically no treatment (51 people). On average, participants had suffered a fear of heights for 30 years.

All participants completed questionnaires on the severity of their fear of heights at the start of the trial, at the end of treatment (two weeks later), and at follow-up after four weeks.

Participants given the virtual reality treatment had roughly six 30-minute sessions over two weeks, where they wore a virtual reality headset. In the first session, participants discussed their fear of heights with the virtual coach, explaining what caused their fear (for example, fear of falling, fear of throwing oneself off the building, fear of the building collapsing) while the virtual coach gave basic information about fear of heights [1].

Participants then entered a virtual office complex with ten floors and a large atrium space, where they took part in activities that challenged their fears and helped them learn that they were safer than they thought. These started with simpler tasks, such as watching a safety barrier to a drop gradually lowering, and built up to harder tasks, such as walking out on a platform over a large drop. Other tasks also included rescuing a cat from a tree, playing a xylophone near an edge, and throwing balls over the edge of a drop.

Throughout the activities the virtual coach offered encouragement, and afterwards they explained what the participant had learnt from their activities and asked whether they felt safer than before. The virtual coach also encouraged participants to try real heights between sessions.

Of the 49 participants offered the virtual reality treatment, 47 took part in at least one session, and 44 completed the full course of treatment. The three people who did not complete the intervention either found the sessions too difficult (two people) or were unable to attend further appointment sessions (one participant).

At the end of treatment and at follow-up, control group participants rated their fear of heights as remaining similar, but all participants in the virtual reality treatment group rated that their fear of heights had reduced. By follow-up, 34 of 49 people in this group were not rated as having a fear of heights, compared with none of the 51 people in the control group.

There were no adverse events reported by any participants.

"Immersive virtual reality therapies that do not need a therapist have the potential to dramatically increase access to psychological interventions," says lead author Professor Daniel Freeman, University of Oxford, UK. "We need a greater number of skilled therapists, not fewer, but to meet the large demand for mental health treatment we also require powerful technological solutions. As seen in our clinical trial, virtual reality treatments have the potential to be effective, and faster and more appealing for many patients than traditional face-to-face therapies. With our unique automation of therapy using virtual reality there is the opportunity to provide really high quality treatment to many more people at an affordable cost. Our study is an important first step, and we are carrying out clinical testing to learn whether automation of psychological treatment using virtual reality works for other mental health disorders." [1]

The authors note some limitations in their study, including that they did not compare against the psychological therapies currently used to treat phobias, such as counselling, psychotherapy, or cognitive behavioural therapy. Instead participants in the usual care group typically received no treatment. The participants also referred themselves to take part in the study, so may not be representative of all people with fear of heights.

The authors also note that they relied on questionnaires to assess participants' fear of heights, and did not test them in real-world scenarios.

The initial cost of software development was high, with a team of psychologists, programmers, script writers, and an actor working intensively for six months, but subsequent costs for the treatment were low, as a therapist does not need to be present and consumer virtual reality hardware is now inexpensive.

The trial did not assess which part of the treatment caused the improvements, and did not test long-term outcomes of the treatment as previous studies have shown that virtual reality treatment can reduce anxiety for several years. The authors note that the treatment was brief, and further benefits could be possible with a longer treatment duration.

Commenting on the virtual reality treatment, one participant said: "What I'm noticing is that in day-to-day life I'm much less averse to edges, and steps, and heights, and I'm noticing in myself that when I'm doing the VR and outside I'm able to say 'Hello' to the edge instead of bracing against it and backing up. When I'm doing the VR I'm, as best as I'm able to, being open and curious around me as much as I can and noticing how the anxiety feels in my body, and then noticing that it goes really quickly now. So, when I've always got anxious about an edge I could feel the adrenaline in my legs, that fight/flight thing; that's not happening as much now. I'm still getting a bit of a reaction to it, both in VR and outside as well, but it's much more brief, and I can then feel my thighs soften up as I'm not bracing up against that edge. I feel as if I'm making enormous progress, and feel very happy with what I've gained." [3]

Writing in a linked Comment, Dr Mark Hayward, University of Sussex, UK, welcomes the study's benefits for people with a fear of heights, but questions how virtual reality treatments could apply for people with more severe mental health problems. He says: "Psychological treatments for patients with psychosis face many challenges, because access to the treatments can be restricted and the treatment might generate only small effects. Symptom-specific treatments targeting either paranoia or auditory hallucinations are generating promising outcomes that might increase effect sizes, but their delivery in traditional face-to-face formats by expert therapists will do little to increase access (even when technology is utilised, such as in AVATAR therapy). VR is a promising method for delivering psychological treatments to patients with psychosis, but can a fully automated delivery system increase access? And are greater effects also possible because of the virtual exposure to everyday situations that are experienced as threatening?"

Credit: 
The Lancet

People trust scientific experts more than the government even when the evidence is outlandish

Members of the public in the UK and US have far greater trust in scientific experts than the government, according to a new study by Queen Mary University of London.

In three large scale experiments, participants were asked to make several judgments about nudges: behavioural interventions designed to improve decisions in our day-to-day lives.

The nudges were introduced either by a group of leading scientific experts or a government working group consisting of special interest groups and policy makers.

Some of the nudges were real and had been implemented, such as using catchy pictures in stairwells to encourage people to take the stairs, while others were fictitious and implausible, such as stirring coffee anti-clockwise for two minutes to avoid cancerous effects.

The study, published in the journal Basic and Applied Social Psychology, found that trust was higher for scientists than the government working group, even when the scientists were proposing fictitious nudges.

Co-author Professor Norman Fenton, from Queen Mary's School of Electronic Engineering and Computer Science, said: "While people judged genuine nudges as more plausible than fictitious nudges, people trusted some fictitious nudges proposed by scientists as more plausible than genuine nudges proposed by government. For example, people were more likely to trust the health benefits of coffee stirring than exercise if the former was recommended by scientists and the latter by government."

The results also revealed that there was a slight tendency for the US sample to find the nudges more plausible and more ethical overall compared to the UK sample.

Lead author Dr Magda Osman from Queen Mary's School of Biological and Chemical Sciences, said: "In the context of debates regarding the loss of trust in experts, what we show is that in actual fact, when compared to a government working group, the public in the US and UK judge scientists very favourably, so much so that they show greater levels of trust even when the interventions that are being proposed are implausible and most likely ineffective. This means that the public still have a high degree of trust in experts, in particular, in this case, social scientists."

She added: "The evidence suggests that trust in scientists is high, but that the public are sceptical about nudges in which they might be manipulated without their knowing. They consider these less ethical, and trust the experts proposing them less, compared with nudges in which they do have an idea of what is going on."

Nudges have become highly popular decision-support methods used by governments to help in a wide range of areas such as health, personal finances, and general wellbeing.

The scientific claim is that, to help people make better decisions regarding their lifestyle choices and those that improve the welfare of the state, it is potentially effective to subtly change the framing of the decision-making context so that the option maximising long-term future gains becomes more prominent.

In essence, the position adopted by nudge enthusiasts is that poor social outcomes are often the result of poor decision-making, and in order to address this, behavioural interventions such as nudges can be used to reduce the likelihood of poor decisions being made in the first place.

Dr Osman said: "Overall, the public make pretty sensible judgments, and what this shows is that people will scrutinise the information they are provided by experts, so long as they are given a means to do it. In other words, ask the questions in the right way, and people will show a level of scrutiny that is often not attributed to them. So, before there are strong claims made about public opinion about experts, and knee-jerk policy responses to this, it might be worth being a bit more careful about how the public are surveyed in the first place."

Credit: 
Queen Mary University of London

Healthy diet reduces asthma symptoms

People who eat a healthy diet experience fewer asthma symptoms and better control of their condition, according to a new study published in the European Respiratory Journal [1].

Diets with better asthma outcomes are characterised by being healthier, with greater consumption of fruits, vegetables and whole grain cereals. Unhealthy diets, with high consumption of meat, salt and sugar, have the poorest outcomes.

The study strengthens the evidence on the role of a healthy diet in managing asthma symptoms, and offers new insights on the potential impact of diet in the prevention of asthma in adults.

Lead researcher Dr Roland Andrianasolo, from the Nutritional Epidemiology Research Team at Inserm, Inra [2], and Paris 13 University said: "Existing research on the relationship between diet and asthma is inconclusive, and compared to other chronic diseases, the role of diet in asthma is still debated. This has resulted in a lack of clear nutritional recommendations for asthma prevention, and little guidance for people living with asthma on how to reduce their symptoms through diet.

"To address this gap, we wanted to make more detailed and precise assessments of dietary habits and the associations between several dietary scores and asthma symptoms, as well as the level of asthma control."

The research team analysed data from 34,776 French adults who answered a detailed respiratory questionnaire as part of the 2017 NutriNet-Santé study. Of these, 28% of the women and 25% of the men were identified by the researchers as having at least one asthma symptom. The number of asthma symptoms experienced by the participants was measured using self-report data over a 12-month period.

To assess asthma control in the participants already living with asthma, the researchers used a self-administered questionnaire, which evaluates asthma control over a four-week period. Measures such as occurrence of asthma symptoms, use of emergency medication and limitations on activity indicated the level of asthma control.

Quality of diet was assessed based on three randomly collected 24-hour dietary records and each participant's adherence to three dietary scores. Generally, the dietary scores all considered diets with high fruit, vegetable and whole grain cereal intake as the healthiest, while diets high in meat, salt and sugar were the least healthy.

The researchers adjusted their analysis to consider other factors known to be linked with asthma, such as smoking and exercise.

The data showed that, overall, men who ate a healthier diet had a 30% lower chance of experiencing asthma symptoms. In women with healthier diets, the chance of experiencing symptoms was 20% lower.

The research also showed that for men with asthma the likelihood of poorly controlled symptoms was around 60% lower in those who had healthy diets and among women with asthma, poorly controlled disease was 27% lower in those with healthy diets.

The researchers say that the results suggest a healthy diet may have a role in preventing the onset of asthma as well as controlling asthma in adults. Dr Andrianasolo explained: "This study was designed to assess the role of an overall healthy diet on asthma symptoms and control, rather than identify particular specific foods or nutrients. Our results strongly encourage the promotion of healthy diets for preventing asthma symptoms and managing the disease.

"A healthy diet, as assessed by the dietary scores we used, is mostly made up of a high intake of fruit, vegetables and fibre. These have antioxidant and anti-inflammatory properties and are elements in a healthy diet that potentially lower symptoms. In contrast, the least healthy diets include high consumption of meat, salt and sugar, and these are elements with pro-inflammatory capacities that may potentially worsen symptoms of asthma."

The researchers note that caution is needed when interpreting the results from this study as it only provides a snap-shot of the possible effects of diet on asthma, and say they plan to conduct longer-term studies in future to confirm their findings.

Dr Andrianasolo added: "Although further studies are needed to confirm our observations, our findings contribute to evidence on the role of diet in asthma, and extend and justify the need to continually support public health recommendations on promoting a healthy diet."

Professor Mina Gaga, President of the European Respiratory Society, and Medical Director and Head of the Respiratory Department of Athens Chest Hospital, said: "This research adds to the evidence on the importance of a healthy diet in managing asthma and its possible role in helping prevent the onset of asthma in adults. Healthcare professionals must find the time to discuss diet with their patients, as this research suggests it could play an important role in preventing asthma."

Credit: 
European Respiratory Society

Fuzzy yellow bats reveal evolutionary relationships in Kenya

image: This is one of the African yellow house bats studied by scientists to better understand the evolution of this family of bats.

Image: 
(c) P.W. Webala, Maasai Mara University

After Halloween, people tend to forget about bats. But, for farmers, residents of Kenya, and scientists, bats are a part of everyday life. While North America has 44 species, Kenya, a country the size of Texas, has 110 bat species. Many of these species also contain subspecies and further divisions that can make the bat family tree look like a tangled mess. Researchers set out to cut the clutter by sorting the lineages of yellow house bats and in the process found two new species.

The bats of Scotophilus vary in size and other characteristics but, in general, "They're cute. They look a lot like the bats you see in Chicago but they're this great yellow color," says Terry Demos, a Postdoctoral Fellow at Chicago's Field Museum and lead author of a recent paper in the journal Frontiers in Ecology and Evolution. These furry creatures can roost in the nooks and crannies of homes in Kenya. "These are bats that live with people--they don't call them house bats for nothing," adds Bruce Patterson, MacArthur Curator of Mammals at the Field Museum and co-author of the study. Bats usually don't fly too far to find a home either. Despite having wings, bats prefer to stay in a specific region, resulting in huge amounts of diversity throughout Africa.

Before understanding how these bat species related to one another, it was difficult to even research them. "We were using three different names for these bats in the field," says Patterson. That kind of evolutionary confusion is enough to make anyone batty. As Demos and Patterson explain, bats that look very similar could have wildly different genetic information. This means that new species could be hiding in plain sight due to their physical similarities to other species. The only way to solve this mystery is to use cutting-edge genetic analysis techniques.

Skin samples collected from the field in Kenya, combined with information from an online genetic database, helped clear up the species confusion. Comparing all the DNA sequences of the samples revealed how similar they were. The more similar the DNA, the closer the species are to each other evolutionarily. This information was then used to make a chart that looks like a tree, with branches coming off one point. The tree is similar to a family tree, but instead of showing the relationships between different family members, it shows the relationships between species. The results delimited the known species as intended, but they also held a surprise. Besides sorting the known species, the tree predicted at least two new bat species. "These new species are unknown to science," says Demos. "There was no reason to expect that we'd find two new species there." When Patterson saw these two undescribed species, he got excited: "It's cool because it says there's a chapter of evolution that no one's stumbled across before."
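The tree-building idea described above can be sketched in a few lines of code. This is a toy illustration only, not the authors' actual pipeline: the sample names and sequences are invented, and real studies use aligned genetic markers and dedicated phylogenetics software.

```python
# Toy sketch of the core idea: more similar DNA means more closely
# related, and the most similar pair joins first when building the tree.
# Sequences and names below are invented for illustration.

def p_distance(a, b):
    """Fraction of aligned sites that differ between two sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Short, pre-aligned sequences for four hypothetical bat samples.
seqs = {
    "bat_A": "ACGTACGTAC",
    "bat_B": "ACGTACGTAT",  # one site differs from bat_A
    "bat_C": "TTGTACCTAC",
    "bat_D": "TTGTACCTGA",
}

names = sorted(seqs)
dist = {(i, j): p_distance(seqs[i], seqs[j])
        for i in names for j in names if i < j}

# The most similar pair groups first, as on a phylogenetic tree.
closest = min(dist, key=dist.get)
print(closest)  # → ('bat_A', 'bat_B')
```

Repeating this joining step on the merged groups yields the full tree of relationships.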

These findings are not only interesting to scientists but also to the local farming industry. Stocking the organic-produce aisles at Trader Joe's would be next to impossible without bats. They act as a natural pesticide, eating insects that threaten crops. Besides farmers, local health officials also rely on bat research because bats can be disease vectors that threaten public health. Understanding bats means that scientists can protect both public health and plates of food.

This unexpected finding attests to the diversity of life in Kenya and other tropical locales in Africa. The variety of species in these regions has not yet been described because, "Africa is understudied, and its biodiversity is underestimated, and it's critical because there are threats to its biodiversity," says Demos. This research gives a framework for future scientists to categorize species of bats and describe new species.

In the United States, because our bats are well researched, there is an app that can recognize bat calls, kind of like Shazam for bats. Patterson plays bat sounds off his phone: "I recorded this in my driveway and an app was able to identify the bat. This is what we want to be able to do in the field someday." The next step in this research is using the genetic analysis of Scotophilus bats as a framework that allows scientists to categorize and eventually recognize species based on observable features, such as the chirps, squeaks, and sounds human ears can't hear.

Demos notes that it is important to better understand these mysterious flying mammals to help conservation and local farming efforts. This study surveying Kenya paves the way for exploring other regions using the same methods. Science has brought us closer to understanding how bat species relate to one another, but Patterson says there is still more to discover--"No interesting biological questions are ever fully answered, and progress towards answering them invariably opens up a variety of others."

Credit: 
Field Museum

Giant, recently extinct seabird also inhabited Japan

image: Geographic locations of Bering Island and Shiriya. The red area indicates the possible past distribution of the cormorant, eventually residing only on Bering Island.

Image: 
Kyoto University / Junya Watanabe

Japan -- Scientists report that a large, extinct seabird called the spectacled cormorant, Phalacrocorax perspicillatus -- originally thought to be restricted to Bering Island, far to the north -- also resided in Japan nearly 120,000 years ago.

Writing in The Auk: Ornithological Advances, the team indicates that the species underwent a drastic range contraction or shift, and that specimens found on Bering Island are 'relicts' -- remnants of a species that was once more widespread.

Human activity poses a grave threat to species diversity worldwide. To correctly assess related extinction events, it is imperative to study natural distributions before first contact with humans. This is where archaeological and fossil records play crucial roles.

The spectacled cormorant, a large-bodied seabird first discovered in the 18th century on Bering Island, was later driven to extinction through hunting, following colonization of the island by humans in the early 1800s.

"Before our report, there was no evidence that the cormorant lived outside of Bering Island," explains first author Junya Watanabe of Kyoto University's Department of Ge-ology and Mineralogy.

Studying bird fossils recovered from Shiriya, Aomori prefecture, Watanabe and his team identified 13 bones of the spectacled cormorant from upper Pleistocene deposits, formed nearly 120,000 years ago.

"It became clear that we were seeing a cormorant species much larger than any of the four native species in present-day Japan," states co-author Hiroshige Matsuoka. "At first we thought this might be a new species, but these fossils matched bones of the specta-cled cormorant stored at the Smithsonian Institution."

Changes in oceanographic conditions may be responsible for the local disappearance of the species in Japan. Paleoclimate studies show that oceanic productivity around Shiriya dropped drastically in the Last Glacial Maximum, around 20,000 years ago. This would have seriously affected the population of the cormorant.

Although it might be possible that hunting of the species by humans took place in prehistoric Japan, archaeological evidence of this has yet to be discovered. The entire picture of the extinction event of the spectacled cormorant may be more complex than previously thought.

"The cormorant was a gigantic animal, its large size thought to have been achieved through adaptation to the island-oriented lifestyle on Bering," adds Watanabe. "But our finding suggests that this might not have been the case; after all, it just resided there as a relict. The biological aspects of these animals deserve much more attention."

Credit: 
Kyoto University

Colorful celestial landscape

image: New observations with ESO's Very Large Telescope show the star cluster RCW 38 in all its glory. This image was taken during testing of the HAWK-I camera with the GRAAL adaptive optics system. It shows the cluster and its surrounding clouds of brightly glowing gas in exquisite detail, with dark tendrils of dust threading through the bright core of this young gathering of stars.

Image: 
ESO/K. Muzic

This image shows the star cluster RCW 38, as captured by the HAWK-I infrared imager mounted on ESO's Very Large Telescope (VLT) in Chile. By gazing into infrared wavelengths, HAWK-I can examine dust-shrouded star clusters like RCW 38, providing an unparalleled view of the stars forming within. This cluster contains hundreds of young, hot, massive stars, and lies some 5500 light-years away in the constellation of Vela (The Sails).

The central area of RCW 38 is visible here as a bright, blue-tinted region, an area inhabited by numerous very young stars and protostars that are still in the process of forming. The intense radiation pouring out from these newly born stars causes the surrounding gas to glow brightly. This is in stark contrast to the streams of cooler cosmic dust winding through the region, which glow gently in dark shades of red and orange. The contrast creates this spectacular scene -- a piece of celestial artwork.

Previous images of this region taken in optical wavelengths are strikingly different -- optical images appear emptier of stars due to dust and gas blocking our view of the cluster. Observations in the infrared, however, allow us to peer through the dust that obscures the view in the optical and delve into the heart of this star cluster.

HAWK-I is installed on Unit Telescope 4 (Yepun) of the VLT, and operates at near-infrared wavelengths. It has many scientific roles, including obtaining images of nearby galaxies or large nebulae as well as individual stars and exoplanets. GRAAL is an adaptive optics module which helps HAWK-I to produce these spectacular images. It makes use of four laser beams projected into the night sky, which act as artificial reference stars, used to correct for the effects of atmospheric turbulence -- providing a sharper image.

This image was captured as part of a series of test observations -- a process known as science verification -- for HAWK-I and GRAAL. These tests are an integral part of the commissioning of a new instrument on the VLT, and include a set of typical scientific observations that verify and demonstrate the capabilities of the new instrument.

Credit: 
ESO

New approach to treating infectious diseases as an alternative to antibiotics

image: Fig.1. The mechanism of initial attachment of ETEC to human intestinal epithelium and its inhibition by antibodies.

Image: 
Osaka University

Enterotoxigenic Escherichia coli (ETEC) is known as a major cause of diarrhea in travelers and people living in developing countries. According to the World Health Organization (WHO), ETEC is responsible for 300,000 to 500,000 deaths a year, constituting a serious problem.

Effective vaccines for ETEC have not been developed, so patients infected with ETEC are treated with antibiotics and supporting measures. However, the emergence of multidrug-resistant bacteria has become a social issue, so the development of new treatment methods is sought after.

Adherence to the host intestinal epithelium is an essential step for ETEC infection in humans. It was thought that a filamentous structure on the surface of bacteria called 'type IV pilus' was important for bacterial attachment, but its detailed adhesion mechanism was not known.

Osaka University-led researchers clarified how pathogenic E. coli attached to the host intestinal epithelium using type IV pili and secreted proteins. Their research results were published in PNAS.

One of the corresponding authors, Shota Nakamura, says, "We demonstrated that type IV pili on the surface of the bacteria were not sufficient for ETEC adherence to intestinal epithelial cells and that proteins secreted by E. coli were also necessary. The administration of antibodies against the secreted proteins inhibited attachment of the E. coli."

Using X-ray crystallography, the researchers studied how a protein located only at the pilus-tip interacts with a protein secreted by E. coli in the intestines (Figure 2), clarifying the attachment mechanism of ETEC; that is, secreted proteins serve as molecular bridges that bind both type IV pili on the surface of the bacteria and intestinal epithelial cells in humans.

Nakamura also says, "It's possible that this attachment is a common feature in many type IV pili expressing enteropathogens such as Vibrio cholerae and constitutes a new therapeutic target against such bacterial pathogens."

Their research results will lead to the development of not only new vaccines for ETEC, but also anti-adhesion agents for preventing the binding of proteins implicated in bacterial attachment. Anti-adhesion agents can rinse pathogenic bacteria out from the body without destroying them, so there is no danger of producing drug-resistant bacteria. These agents, once developed, will act as a novel treatment approach that may serve as an alternative to antibiotics.

Credit: 
Osaka University

New report says individual research results should be shared with participants more often

WASHINGTON - When conducting research involving the testing of human biospecimens, investigators and their institutions should routinely consider whether and how to return individual research results on a study-specific basis through an informed decision-making process, says a new report from the National Academies of Sciences, Engineering, and Medicine. Decisions on whether to return individual research results will vary depending on the characteristics of the research, the nature of the results, and the interests of participants.

The undertaking of biomedical research with human participants -- from exploratory, basic science inquiries to clinical trials using well-validated tests -- often includes development of laboratory test results associated with an individual research participant. Recent changes to federal regulations have promoted transparency and allowed individuals greater access to these results; however, regulations are not consistent, the report says. For example, the Centers for Medicare & Medicaid Services (CMS) prohibits the return of results from laboratories that are not certified under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), but in some circumstances the Health Insurance Portability and Accountability Act of 1996 (HIPAA) may require the return of results requested by a participant, regardless of whether they were generated in a CLIA-certified laboratory. CLIA requirements ensure the quality and integrity of data, accurate reconstruction of test validation and test performance, and the comparability of test results regardless of performance location.

"There is a long-standing tension in biomedical research arising from a conflict in core values - the desire to respect the interests of research participants by communicating results versus the responsibility to protect participants from uncertain, perhaps poorly validated information," said Jeffrey Botkin, associate vice president for research and professor of pediatrics at University of Utah and chair of the study committee that wrote the report. "In weighing the complex and competing considerations, we recommend a transition away from firm rules embodied in current CLIA and HIPAA regulations toward a process-oriented approach favoring communication of results while seeking to enhance the quality of results emerging from research laboratories. Our hope is that this report will provide a road map toward better and more collaborative and transparent research practices that will benefit participants, investigators, and society more broadly."

The justification for returning results becomes stronger as both the potential value of the result to participants and the feasibility of return increase, the report says. To harmonize relevant regulations, regulators and policymakers should revise them in a way that respects the interests of research participants in obtaining individual research results and balances the competing considerations of safety, quality, and burdens on the research enterprise. For example, CMS should revise CLIA regulations to allow for the return of results from non-CLIA certified laboratories when results are requested under the HIPAA access right and also when an institutional review board process determines it is permissible.

Establishing laboratory processes to give all stakeholders confidence in the validity of the individual research results is critical to ensuring the accuracy of information provided to research participants as well as the quality of the science. Currently, there is no accepted quality management system (QMS) for research laboratories that could serve as an alternative to CLIA certification. The committee recommended that the National Institutes of Health lead an effort with other relevant federal agencies, nongovernmental organizations, and patient and community groups to develop a QMS with external accountability for non-CLIA certified research laboratories testing human biospecimens.

To minimize the burden for research laboratories with constrained resources to put such a QMS in place, sponsors, funding agencies, and research institutions should facilitate access to resources and support training and the development of the necessary laboratory infrastructure. The initial training, cost, and time commitment will likely be significant, but the value added will be considerable for both participants and science, the report says.

Furthermore, the use of effective communication strategies can minimize the risk of misinterpretation or over-interpretation of research results. In the consent process, investigators should communicate clearly to research participants whether, under what circumstances, and how investigators will offer and return research results. When individual results are communicated to participants, investigators should facilitate understanding of the meaning and limitations of results by, for example, ensuring there is a clear take-away message, explaining the level of uncertainty, and providing mechanisms for the participants to obtain additional information and guidance for follow-up consultation, when appropriate.

The report also includes recommendations for investigators to engage community groups and advocacy organizations to make sure participant needs and values are incorporated into decisions about returning individual results, regardless of participant social or economic status, and for research sponsors to require planning for the return of individual results in funding applications.

Credit: 
National Academies of Sciences, Engineering, and Medicine

Primates adjust grooming to their social environment

image: Chimpanzees prefer grooming mothers with babies, or their friends.

Image: 
MPI f. Evolutionary Anthropology/ A. Mielke

Working together and exchanging services for the benefit of everyone involved is crucial for humans and partly responsible for our success as a species. In order to achieve a goal, we need to choose the best possible cooperation partners. Yet who qualifies as the best possible partner depends on the task at hand, the abilities of all available candidates, and on our social relationships with them. Like humans, many non-human primates live in close-knit social groups. Individuals cooperate with each other to their mutual benefit, and often by exchanging services.

Grooming interactions play a special role within this system as they can be exchanged for support during a fight, for a share of food from a hunt, or for other services. Different group members can offer different services as reciprocation for the grooming; for example, high-ranking individuals are more useful supporters in a fight. Since grooming time is limited, individuals strive to pick the best grooming partner from all available candidates. Importantly, the success of a grooming interaction depends not only on the individual and his or her chosen partner, but also on the audience: if the partner has a friend in the audience, the friend might interfere, or the partner might leave the grooming bout early, in which case the groomer's time and effort would be wasted.

Data out of the rainforest

Primatologist Alexander Mielke and his colleagues from the Max Planck Institute for Evolutionary Anthropology have now investigated at Taï National Park, Côte d'Ivoire, which properties chimpanzees and sooty mangabeys take into account when selecting a grooming partner, and if the composition of their audience influences their decision. Within the context of the Taï Chimpanzee Project, the researchers collected data in two chimpanzee communities and one mangabey community. In contrast to previous studies, they regarded each grooming initiation as an individual's personal decision: From all possible partners with their specific properties, which one would they choose?

In order to determine the social relationships and ranks of all individuals, Mielke and his colleagues analyzed data that researchers and field assistants had collected over many years. They then assessed the impact of the reproductive state of the potential partner (whether they had a baby or were receptive females), their social relationship with the decision-maker, whether there had been aggression between the two just before the decision was made, their sex and their dominance rank.

The researchers checked for each individual whether they had friends amongst the bystanders, and they differentiated between two types of dominance rank: global (compared to all individuals in the community) and local (in comparison to those nearby). This was to test whether non-human primates change their preference for individuals fluidly based on the social environment, or whether they simply preferred certain individuals based on their rank.
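The distinction between global and local rank can be made concrete with a small sketch. This is purely illustrative, with invented names, and is not the authors' analysis code.

```python
# Illustrative sketch (invented names, not the study's code) of the
# two rank measures: "global" rank in the whole community versus
# "local" rank among only the individuals currently nearby.

def global_rank(individual, hierarchy):
    """Rank in the full community hierarchy (1 = highest)."""
    return hierarchy.index(individual) + 1

def local_rank(individual, nearby, hierarchy):
    """Rank counted only among the individuals present nearby."""
    present = sorted(nearby | {individual}, key=hierarchy.index)
    return present.index(individual) + 1

# Community dominance hierarchy, highest-ranking first.
hierarchy = ["alpha", "beta", "gamma", "delta", "epsilon"]

# "delta" ranks fourth overall, but first among those actually present.
print(global_rank("delta", hierarchy))               # → 4
print(local_rank("delta", {"epsilon"}, hierarchy))   # → 1
```

The same individual can thus be a low-value partner globally yet the best option locally, which is exactly the flexibility the study tested for.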

Friends may interrupt grooming

"Choosing a grooming partner from among ten, fifteen possible candidates - some of them friends, high-ranking, or with babies - is a very difficult task indeed. And yet individuals from both species, chimpanzee and sooty mangabey, chose their partner flexibly", says Alexander Mielke, first author of the study. "Both mangabeys and chimpanzees actually preferred grooming mothers with babies, something we did not know was the case in chimpanzees. Both species used grooming for reconciliation, and both species groomed their friends more. Most strikingly, in both species grooming choices depended strongly on the social environment. The animals avoided grooming individuals with close friends nearby, possibly because these friends might interrupt the grooming interaction, or because their potential partner might prefer to go and groom these friends", explains Mielke. "Individuals also chose partners who were higher-ranking compared to other possible partners, independent of their overall rank in the community. This shows that both species are flexible when it comes to taking a decision, and that individuals use the information they have about all available partners. Yet they also consider the wider social circumstances, and adapt their choice to maximize their own benefit."

These results show that primates are not only aware of the ranks and social relationships of their conspecifics, but that they can judge many individuals simultaneously and flexibly select the best option. The impact of the social environment suggests that additionally, mangabeys and chimpanzees can inhibit a preferred response (e.g. grooming a high-ranking group mate) if their action would not lead to success (because this individual's friend is present), a skill often considered too difficult for non-human animals.

"The fact that we found these results in both mangabeys and chimpanzees might indicate that this impressive cognitive feat is more widespread amongst primates than previously known", concludes Mielke. "Grooming is an important part of primate cooperation and choosing the best partner in a specific situation is a vital skill. As in humans, primate social groups consist of many individuals, each with their own status, own objectives, own history. This study gives further evidence that at least mangabeys and chimpanzees are equipped with the cognitive abilities to navigate and thrive in this complex social world."

Credit: 
Max Planck Institute for Evolutionary Anthropology

Ancient bones reveal 2 whale species lost from the Mediterranean Sea

image: Aerial view of some of the fish-salting tanks (cetaria) in the ancient Roman city of Baelo Claudia, near today's Tarifa in Spain. The largest circular tank is 3 meters wide, with an 18 m³ capacity. These tanks were used to process large fish, particularly tuna. This study supports the possibility that they could have also been used to process whales.

Image: 
D. Bernal-Casasola, University of Cadiz

Two thousand years ago the Mediterranean Sea was a haven for two species of whale which have since virtually disappeared from the North Atlantic, a new study analysing ancient bones suggests.

The discovery of the whale bones in the ruins of a Roman fish processing factory located at the strait of Gibraltar also hints at the possibility that the Romans may have hunted the whales.

Prior to the study, by an international team of ecologists, archaeologists and geneticists, it was assumed that the Mediterranean Sea was outside the historical ranges of the right whale and the gray whale.

Academics from the Archaeology Department at the University of York used ancient DNA analysis and collagen fingerprinting to identify the bones as belonging to the North Atlantic right whale (Eubalaena glacialis) and the Atlantic gray whale (Eschrichtius robustus).

After centuries of whaling, the right whale currently occurs as a very threatened population off eastern North America and the gray whale has completely disappeared from the North Atlantic and is now restricted to the North Pacific.

Co-author of the study Dr Camilla Speller, from the University of York, said: "These new molecular methods are opening whole new windows into past ecosystems. Whales are often neglected in archaeological studies, because their bones are frequently too fragmented to be identifiable by their shape.

"Our study shows that these two species were once part of the Mediterranean marine ecosystem and probably used the sheltered basin as a calving ground.

"The findings contribute to the debate on whether, alongside catching large fish such as tuna, the Romans had a form of whaling industry or if perhaps the bones are evidence of opportunistic scavenging from beached whales along the coast line."

Both species of whale are migratory, and their presence east of Gibraltar is a strong indication that they previously entered the Mediterranean Sea to give birth.

The Gibraltar region was at the centre of a massive fish-processing industry during Roman times, with products exported across the entire Roman Empire. The ruins of hundreds of factories with large salting tanks can still be seen today in the region.

Lead author of the study Dr Ana Rodrigues, from the French National Centre for Scientific Research, said: "Romans did not have the necessary technology to capture the types of large whales currently found in the Mediterranean, which are high-seas species. But right and gray whales and their calves would have come very close to shore, making them tempting targets for local fishermen."

It is possible that both species could have been captured using small rowing boats and hand harpoons, methods used by medieval Basque whalers centuries later.

The knowledge that coastal whales were once present in the Mediterranean also sheds new light on ancient historical sources.

Anne Charpentier, lecturer at the University of Montpellier and co-author in the study, said: "We can finally understand a 1st-Century description by the famous Roman naturalist Pliny the Elder, of killer whales attacking whales and their new-born calves in the Cadiz bay.

"It doesn't match anything that can be seen there today, but it fits perfectly with the ecology if right and gray whales used to be present."

The study authors are now calling for historians and archaeologists to re-examine their material in the light of the knowledge that coastal whales were once part of the Mediterranean marine ecosystem.

Dr Rodrigues added: "It seems incredible that we could have lost and then forgotten two large whale species in a region as well-studied as the Mediterranean. It makes you wonder what else we have forgotten."

"Forgotten Mediterranean calving grounds of gray and North Atlantic right whales: evidence from Roman archaeological records" is published in the journal Proceedings of the Royal Society of London B.

The study was an international collaboration between scientists at the universities of York, Montpellier (France), Cadiz (Spain), Oviedo (Spain) and the Centre for Fishery Studies in Asturias, Spain.

Credit: 
University of York

Rising carbon dioxide levels pose a previously unrecognized threat to monarch butterflies

image: This is researcher Leslie Decker with a monarch butterfly in a University of Michigan laboratory.

Image: 
Austin Thomason/Michigan Photography

ANN ARBOR--A new study conducted at the University of Michigan reveals a previously unrecognized threat to monarch butterflies: Mounting levels of atmospheric carbon dioxide reduce the medicinal properties of milkweed plants that protect the iconic insects from disease.

Milkweed leaves contain bitter toxins that help monarchs ward off predators and parasites, and the plant is the sole food of monarch caterpillars. In a multi-year experiment at the U-M Biological Station, researchers grew four milkweed species with varying levels of those protective compounds, which are called cardenolides.

Half the plants were grown under normal carbon dioxide levels, and half of them were bathed, from dawn to dusk, in nearly twice that amount. Then the plants were fed to hundreds of monarch caterpillars.

The study showed that the most protective of the four milkweed species lost its medicinal properties when grown under elevated CO2, resulting in a steep decline in the monarch's ability to tolerate a common parasite, as well as a lifespan reduction of one week.

The study looked solely at how elevated carbon dioxide levels alter plant chemistry and how those changes, in turn, affect interactions between monarchs and their parasites. It did not examine the climate-altering effects of the heat-trapping gas emitted when fossil fuels are burned.

"We discovered a previously unrecognized, indirect mechanism by which ongoing environmental change--in this case, rising levels of atmospheric CO2--can act on disease in monarch butterflies," said Leslie Decker, first author of the study, which is scheduled for publication July 10 in the journal Ecology Letters.

"Our results emphasize that global environmental change may influence parasite-host interactions through changes in the medicinal properties of plants," said Decker, who conducted the research for her doctoral dissertation in the U-M Department of Ecology and Evolutionary Biology. She is now a postdoctoral researcher at Stanford University.

U-M ecologist Mark Hunter, Decker's dissertation adviser and co-author of the Ecology Letters paper, said findings of the monarch study have broad implications. Many animals, including humans, use chemicals in the environment to help them control parasites and diseases. Aspirin, digitalis, Taxol and many other drugs originally came from plants.

"If elevated carbon dioxide reduces the concentration of medicines in plants that monarchs use, it could be changing the concentration of drugs for all animals that self-medicate, including humans," said Hunter, who has studied monarchs at the U-M Biological Station, at the northern tip of Michigan's Lower Peninsula, for more than a decade.

"When we play Russian roulette with the concentration of atmospheric gases, we are playing Russian roulette with our ability to find new medicines in nature," he said.

Earlier work in Hunter's lab had shown that some species of milkweed produce lower cardenolide levels when grown under elevated carbon dioxide. That finding caught the attention of Decker, who with Hunter designed a follow-up study to look at the potential impact of rising CO2 on the disease susceptibility of monarchs in the future.

They created an experimental system that allowed them to manipulate and measure all the key links in the chain: carbon dioxide levels, toxin concentrations in milkweed leaves, infection by parasites, and monarch susceptibility to those parasites. The fieldwork was conducted in 2014 and 2015.

Inside 40 growth chambers on a hilltop at the Biological Station, they exposed milkweed plants to two different carbon dioxide levels. Twenty chambers were maintained at current global CO2 concentrations of around 400 parts per million, and 20 chambers received 760 ppm of CO2, a level that could be reached well before the end of the century if the burning of fossil fuels continues unabated.

The four milkweed species differed in their levels of protective cardenolide compounds. The most protective species was Asclepias curassavica, commonly known as tropical milkweed. The chamber-raised plants were fed to monarch caterpillars, and each caterpillar got a steady diet of a single milkweed species with known carbon dioxide exposure.

Three-day-old caterpillars were also infected with carefully controlled doses of a common monarch parasite that is distantly related to the malaria pathogen. Ophryocystis elektroscirrha is a protozoan that shortens the adult monarch's lifespan, impedes its ability to fly and reduces the number of offspring it produces.

Over about two weeks' time, the infected caterpillars grew to a length of about 2 inches, with striking yellow, white and black bands. Then they pupated inside a hard-shelled chrysalis for about 10 days before emerging as orange-and-black butterflies.

At their Biological Station lab, Decker and Hunter raised hundreds of adult monarchs. The lifespan of each individual--in Michigan, monarch butterflies typically live for about a month--was recorded, and the number of parasitic spores on each carcass was counted.

Piecing together all this data, the researchers were able to determine how changes in atmospheric carbon dioxide levels altered toxin concentrations in the four milkweed species and, in turn, how exposure to those plants affected the monarch's lifespan and disease susceptibility.

The largest declines in parasite tolerance and butterfly lifespan occurred in monarchs that fed on A. curassavica, a milkweed species in which cardenolide production declined by nearly 25 percent when grown under elevated CO2.

In caterpillars that fed on A. curassavica milkweed grown under elevated CO2, tolerance to the parasite declined by a whopping 77 percent when compared to caterpillars that fed on A. curassavica grown under ambient-level CO2.

Monarchs that fed on A. curassavica grown under elevated CO2 suffered a reduction in lifespan of seven days due to parasitic infection. Parasites reduced mean lifespan by only two days for monarchs that ate A. curassavica grown under ambient CO2 levels.

"We've been able to show that a medicinal milkweed species loses its protective abilities under elevated carbon dioxide," Decker said. "Our results suggest that rising CO2 will reduce the tolerance of monarch butterflies to their common parasite and will increase parasite virulence."

In recent years, monarch populations have been declining rapidly. Most discussions of the monarch butterfly's plight focus on habitat loss: logging of trees in the Mexican forest where monarchs spend the winter, as well as the loss of wild milkweed plants that sustain them during their annual migration across North America.

"Habitat loss, problems during migration and climate change all contribute to monarch declines," Hunter said. "Unfortunately, our results add to that list and suggest that parasite-infected monarchs will become steadily sicker if atmospheric concentrations of CO2 continue to rise."

Credit: University of Michigan