
New species may arise from rapid mitochondrial evolution

CORVALLIS, Ore. - Genetic research at Oregon State University has shed new light on how isolated populations of the same species evolve toward reproductive incompatibility and thus become separate species.

Scientists sequenced the entire genome of a Pacific tidepool crustacean, Tigriopus californicus, a model species for differentiation based on geographic separation - an early stage of one species becoming multiple species.

They examined the co-evolution of mitochondrial and nuclear genes. Mitochondria act as a cell's power plant, generating adenosine triphosphate, or ATP - a source of chemical energy.

As in all animals, most of a T. californicus cell's genes are in its nucleus, but some are in the mitochondria.

"The mitochondrial organelle contains a small chromosome with only 37 genes, but these genes are absolutely essential for metabolism," said the study's corresponding author, Felipe Barreto, assistant professor of integrative biology in OSU's College of Science. "In order for ATP to be produced properly in a cell, a few hundred other genes encoded in the nucleus must interact directly with the 37 mitochondrial genes. Mutations in the mitochondrial genes may cause these interactions to be subpar and thus cause reductions in metabolic performance."

T. californicus populations along the Pacific coast of North America have mitochondrial genes that differ widely from one population to the next - each population has accumulated many mutations relative to the others.

"As a result, hybrid offspring between populations suffer from lowered fitness in the form of lower fecundity, slow development and lower ATP production as determined by several previous experiments," Barreto said.

Barreto and collaborators from the University of California, San Diego, the University of Southern California and the University of North Carolina used molecular statistical models to screen the genomes of eight populations in order to detect which genes might be incompatible between populations.

"Those genes may therefore be candidate genes for understanding how different populations become incompatible and possibly eventually become different species," he said.

Credit: 
Oregon State University

Massive genome havoc in breast cancer is revealed

image: "Long-read sequencing enabled the team to reconstruct in great detail the history of how the HER2 gene gets massively amplified in HER2-positive breast cancer cells," says Dr. Schatz. Top rectangle shows a 2 million base-pair segment of chromosome 17 occupied by the HER2 gene (also called ERBB2). A small segment of the gene, already massively amplified, breaks off and fuses with chromosome 8 (lower rectangle). On that chromosome, parts of the gene are copied as many as 1,000 times, with various segments jumping around within the chromosome (green arcs). "This shows why we want to identify HER2-positive patients as early as possible, to prevent the kind of chaos that we register here cumulatively," says Schatz.

Image: 
Schatz Lab, CSHL/JHU

Cold Spring Harbor, NY - In cancer cells, genetic errors wreak havoc. Misspelled genes, as well as structural variations - larger-scale rearrangements of DNA that can encompass large chunks of chromosomes - disturb carefully balanced mechanisms that have evolved to regulate cell growth. Genes that are normally silent are massively activated, and mutant proteins are formed. These and other disruptions lead to a plethora of problems that cause cells to grow without restraint, cancer's most infamous hallmark.

This week, scientists at Cold Spring Harbor Laboratory (CSHL) published in Genome Research one of the most detailed maps ever made of structural variations in a cancer cell's genome. The map reveals about 20,000 structural variations, few of which had previously been detected, owing to technological limitations of a long-popular method of genome sequencing.

The team, led by sequencing experts Michael C. Schatz and W. Richard McCombie, read genomes of the cancer cells with so-called long-read sequencing technology. This technology reads much lengthier segments of DNA than older short-read technology. When the results are interpreted with two sophisticated software packages recently published by the team, two advantages are evident: long-read sequencing is richer in terms of both information and context. It can, for instance, make better sense of repetitive stretches of DNA letters - which pervade the genome - in part by seeing them within a physically larger context.

The team demonstrated the power of long-read technology by using it to read the genomes of cells derived from a cell line called SK-BR-3, an important model for breast cancer cells with variations in a gene called HER2 (sometimes also called ERBB2). About 20% of breast cancers are "HER2-positive," meaning they overproduce the HER2 protein. These cancers tend to be among the most aggressive.

"Most of the 20,000 variants we identified in this cell line were missed by short-read sequencing," says Maria Nattestad, Ph.D., who performed the work with colleagues while still a member of the Schatz lab at CSHL and Johns Hopkins University. "Of particular interest, we found a highly complex set of DNA variations surrounding the HER2 gene."

In their analysis, the team combined the results of long-read sequencing with results of another kind of experiment that reads the messages, or transcripts, that are being generated by activated genes. This fuller picture yielded an extraordinarily detailed account of how structural variations disrupt the genome in cancer cells and sheds light on how cancer cells rapidly evolve.

Schatz, Adjunct Associate Professor at CSHL and Bloomberg Distinguished Associate Professor at Johns Hopkins University, and McCombie, a CSHL Professor, say it is "essential to continue building a catalog of variant cancer cell types using the best available technologies. Long-read sequencing is an invaluable tool to capture the complexity of structural variations, so we expect its widespread adoption for use in research and clinical practice, especially as sequencing costs further decline."

Credit: 
Cold Spring Harbor Laboratory

The Lancet: UK-US post-Brexit trade deal risks increased drug prices, and may threaten the NHS

A trade deal between the UK and USA could risk increasing drug prices in the UK, which could diminish the affordability and accessibility of the NHS, according to a Viewpoint published in The Lancet.

The opinion piece outlines how the USA's targeting of so-called 'foreign free-riding' in trade deals could lead to a poor deal on pharmaceuticals for the UK post-Brexit. The authors raise concerns that the USA could pressure the UK to change the way it regulates pharmaceuticals in trade deals.

Currently, all of the UK's international trade deals are negotiated through the European Union, but the UK Government will need to negotiate new deals to replace existing agreements post-Brexit. The USA is one of the UK's most important trading partners after the rest of the European Union, and a US-UK bilateral trade deal is a key post-Brexit priority, with conversations already taking place to achieve this.

US policies to reduce 'foreign free-riding'

American drug prices are some of the highest in the world, leading to economic hardship and poorer health outcomes for uninsured Americans and those who cannot afford to pay out-of-pocket. The US Government does not negotiate drug prices, and Medicare is prohibited by law from doing so. In addition, the Affordable Care Act prevents the US Food and Drug Administration and the Secretary of Health and Human Services from basing drug approval decisions on cost-effectiveness.

By contrast, similar drugs are often significantly cheaper in the UK, where the Government negotiates prices with pharmaceutical companies via the Pharmaceutical Price Regulation Scheme, and the NHS makes purchases based on clinical and cost-effectiveness assessments from the UK's National Institute for Health and Care Excellence (NICE).

In May 2018, the US Department of Health and Human Services introduced the American Patients First blueprint, outlining measures to reduce US drug prices which include putting pressure on other countries to allow drug prices to rise in their jurisdictions. Unveiling the new policy, President Trump stated that "as we demand fairness for American patients at home, we will also demand fairness overseas. When foreign governments extort unreasonably low prices from US pharmaceutical companies, Americans have to pay more to subsidise the enormous cost of research and development".

The Trump administration blueprint criticises single-payer healthcare systems which impose drug price controls, and accuses foreign governments of not paying their fair share of research and development costs to bring innovative drugs to market - proposing that other nations 'free-ride' on American investment in drug development.

However, the Viewpoint authors argue that this interpretation overlooks the high cost of drugs for patients, and that drugs should be priced in line with how much benefit they give. They raise concerns that, rather than tackling the inherent issues of funding drug development, American Patients First instead represents the interests of pharmaceutical companies and their stakeholders, aiming to maintain or increase revenues.

"American Patients First is an attempt by the Trump Administration to make the USA's drug pricing problem everybody else's problem," says lead author Dr Holly Jarman, University of Michigan, USA. "By shifting the economic, political, and social costs of policies made in the USA onto America's trading partners, the Trump Administration is attempting to show voters that they are doing something about high drug prices while providing benefits to pharmaceutical companies and sympathetic campaign donors." [1]

Potential effects for the UK post-Brexit

With the American Patients First policy and a turn towards protectionist, pro-American trade deals, the authors believe that the USA will push to alter drug regulation in future trade negotiations with the UK. This raises concerns because the ability to enter into trade deals is an executive power in the UK.

One of the authors, Professor Tamara Hervey, University of Sheffield, UK, points out: "While deals have to be ratified by Parliament, Parliament cannot amend the agreement that the Government negotiates, or be directly involved as the negotiation takes place. In addition, health groups are rarely consulted on trade deals, and Parliament and other health-focused stakeholders have limited opportunities to hold the Government to account in its trade negotiations." [1]

This is also exacerbated by the Brexit process. The Trade Bill introduced this year - if passed into law - could further limit parliamentary and public scrutiny of US-UK trade deals.

In addition, until recently the UK did not have a fully functioning trade ministry capable of complex negotiations with a highly experienced country like the USA. The authors say that the UK is unlikely to get a better deal with the USA than it has as part of the European Union, as the UK market is much smaller than the whole of the EU, meaning the UK has less bargaining power in negotiations.

Lastly, the authors raise concerns that pharmaceuticals will be one part of a wider trade agreement between the UK and USA, and could be an area that the UK compromises on to ensure better deals elsewhere.

"Perhaps the ultimate issue is the extent to which UK politicians are willing to defend the existing regulatory regime in the context of risks to their political careers," says Professor Martin McKee, one of the authors from the London School of Hygiene & Tropical Medicine, and member of the advisory board of Healthier In (an NGO which campaigned to remain in the EU). [1]

He continues: "In view of President Trump's recent trade policy actions, policy makers should take the administration's argument over drug prices very seriously. The UK health policy community should press the UK Government to make clear to the public and to the USA that they are not willing to bring the UK's drug pricing and evaluation regime to the negotiating table in any future UK-US trade talks." [1]

Credit: 
The Lancet

Concussion may bring greater risks for athletes with ADHD

INDIANAPOLIS - Athletes who have attention deficit hyperactivity disorder (ADHD) may be at greater risk for experiencing persistent anxiety and depression after a concussion than people who do not have ADHD, according to a preliminary study released today that will be presented at the American Academy of Neurology's Sports Concussion Conference in Indianapolis, July 20 to 22, 2018. ADHD is a brain disorder that affects attention and behavior.

"These findings suggest that ADHD and concussion may have a cumulative effect on anxiety and depression beyond that of either ADHD or concussion alone," said study author Robert Davis Moore, MS, PhD, of the University of South Carolina in Columbia. "Athletes with ADHD should be monitored with this in mind, as they may be more susceptible to experience symptoms of depression and anxiety following a concussion."

The study involved 979 NCAA Division I college athletes at the University of South Carolina. Information on ADHD diagnosis and any history of concussion was gathered, along with the athletes' scores on questionnaires measuring anxiety and depression, administered before the start of their sporting seasons.

Athletes were divided into four groups: those with ADHD who also had experienced concussion; those with ADHD who had not experienced a concussion; those with concussion and no ADHD; and those with neither a history of concussion nor ADHD.

The researchers found that athletes with both ADHD and concussion had significantly higher scores on the tests for anxiety and depression than any of the other groups. Moore noted that athletes with a history of concussion were evaluated six or more months after the injury, indicating that the differences lasted longer than what might be expected in the weeks after the concussion.

Athletes with ADHD but no history of concussion did not show increased anxiety or depression.

The anxiety test asks people how often they agree with statements such as "I am tense; I am worried" and "I worry too much over something that really doesn't matter," with answers ranging from "Almost never" to "Almost always." Scores range from 20 to 80. The athletes with both concussion and ADHD had average scores of 42 on the test, compared to an average score of 33 for the other three groups.

The depression test asks how often during the past week people have agreed with statements such as "I did not feel like eating; my appetite was poor" and "I felt that everything I did was an effort," with answers ranging from "Rarely or none of the time (less than 1 day)" to "Most or all of the time (5-7 days)." Scores range from zero to 60, with scores of 16 or higher indicating that a person may be at risk for clinical depression. The athletes with both concussion and ADHD had average scores of 26, compared to an average score of 16 for the other three groups.

A limitation of the study is that it is cross-sectional, which means it looks at one point in time and does not show cause and effect. Moore said additional research is needed with multiple tests both before and after any concussions occur.

Credit: 
American Academy of Neurology

Hospitals may take too much of the blame for unplanned readmissions

BIDMC Research Briefs showcase groundbreaking scientific advances that are transforming medical care.

A major goal of hospitals is to prevent unplanned readmissions of patients after they are discharged. A new study reveals that the preventability of readmissions changes over time: readmissions within the first week after discharge are often preventable by the hospital, whereas readmissions later are often related to patients' difficulty accessing outpatient clinics.

"Patients discharged from a hospital are usually recovering from a serious medical condition as well as managing other chronic medical conditions, and they often encounter new logistical challenges adapting to this recovery period," said Kelly Graham, MD, MPH, Director of Ambulatory Residency Training at BIDMC and an Instructor in Medicine at Harvard Medical School. "Hospitals and outpatient clinics must work together more seamlessly to ensure that patients are equipped to manage these challenges at home."

For their study, published in the Annals of Internal Medicine, Graham and her colleagues examined information on 822 general medicine patients readmitted to 10 academic medical centers in the United States. Overall, 36.2% of early readmissions versus 23.0% of late readmissions were deemed preventable. Hospitals were identified as better locations for preventing early readmissions, whereas outpatient clinics and home were better for preventing late readmissions.

Premature discharge and problems with physician decision-making related to diagnosis and management during the initial hospitalization were likely causes of readmissions in the early period. Later readmissions, which are more amenable to interventions outside the hospital, were most often caused by factors over which the hospital has less direct control, such as the monitoring and management of symptoms after discharge by primary care clinicians, as well as end-of-life issues.

Taken together, the findings suggest that readmissions in the week after discharge are more preventable and more likely to be caused by factors over which the hospital has direct control than those later in the 30-day window. In the current US system, however, unplanned readmissions within the 30 days after hospital discharge are considered uniformly preventable by hospitals, and thus hospitals are punished with financial penalties for these events.

"Our findings suggest that the 30 days following hospital discharge are not the same with regard to what influences outcomes for sick patients, and that the current model over-simplifies this high-risk time," said Graham. "One potential unintended consequence of this is that outpatient environments have not been involved in efforts to manage this high-risk timeframe, which results in poorly coordinated care and worse outcomes for our patients."

Graham noted that interventions to improve outcomes after hospital discharge should engage the ambulatory care system, with attention to improving access to primary care. "We should also be careful not to put too much focus on reducing length of stay in the hospital, which may be a driver of premature discharge and early readmissions," she said.

Credit: 
Beth Israel Deaconess Medical Center

Understanding the social dynamics that cause cooperation to thrive, or fail

image: Despite their reputation, spotted hyenas are often cooperative animals, dwelling in large groups and assisting one another during hunts. Penn biologist Erol Akçay modeled a theoretical social group to show how cooperation can arise or collapse.

Image: 
Amiyaal Ilany

Examples of cooperation abound in nature, from honeybee hives to human families. Yet it's also easy enough to find examples of selfishness and conflict. Studying the conditions that give rise to cooperation has occupied researchers for generations, with implications for understanding the forces that drive workplace dynamics, charitable giving, animal behavior, even international relations.

A basic tenet of these studies is that cooperative behavior arises when individuals interacting in a social network derive some benefit from being generous with one another. Yet social networks are not fixed. What if the structure of the network itself alters as individuals become more cooperative?

In a new report in the journal Nature Communications, Erol Akçay, an assistant professor of biology in Penn's School of Arts and Sciences, addresses this question of how an evolving social network influences the likelihood of cooperation in a theoretical social group. He finds that, although networks where connected individuals are closely related are more likely to cooperate, such groups can trigger a feedback loop that alters the structure of the network and leads to cooperation's collapse.

"We know from a half-century of study that cooperation is quite easy to evolve in principle," says Akçay, "in the sense that there are many, many sets of conditions that can make cooperative behaviors a better strategy than non-cooperative behaviors. So given that, why isn't the world a cooperative paradise? Because we know it isn't."

Akçay's theoretical work points to a reason why: It's possible that the social structure that gave rise to high levels of cooperation may not be stable in such a cooperative environment. Yet, his model also suggests that cooperation can be maintained if the benefits of generous behavior are great enough.

The work builds upon studies that Akçay pursued with former postdoctoral researcher Amiyaal Ilany, now a faculty member at Bar-Ilan University. They developed a mathematical model of how individual animals inherit their social connections that can explain the structure of social networks in animal groups. In the new work, Akçay built on that earlier model by adding in an element of choice; individuals in the network could either connect with a parent's connection, or randomly with individuals aside from a parent's connections. The probabilities of making each type of connection determine the structure of the network.
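A minimal Python sketch of such a growth rule might look like the following. The function name and its details (for instance, that a newborn always links to its parent) are assumptions made for illustration; the published model's exact mechanics are not given in this summary.

```python
import random

def add_newborn(connections, parent, p_social, p_random):
    """One growth step of a hypothetical inheritance-plus-randomness network.

    connections maps each individual to the set of its links. The newborn
    always links to its parent (an assumption), links to each of the
    parent's connections with probability p_social, and links to each
    individual outside the parent's circle with probability p_random.
    """
    newborn = max(connections) + 1
    links = {parent}
    for other in connections[parent]:
        if random.random() < p_social:      # inherit a parental connection
            links.add(other)
    for other in connections:
        if other != parent and other not in connections[parent]:
            if random.random() < p_random:  # make a random connection
                links.add(other)
    connections[newborn] = links            # register the newborn...
    for other in links:
        connections[other].add(newborn)     # ...and make every link mutual
    return newborn
```

Running steps like this repeatedly, with deaths removing old individuals, yields networks whose structure is governed by the two probabilities, as the paragraph above describes.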

Each individual in the model was further designated to be either a cooperator or a defector. Cooperators provide a benefit to those they connect with, but the total amount they provide is fixed, so the more connections they have, the less each connection receives. Both cooperators and defectors reap a benefit based on the number of links to cooperators they possess, but defectors don't offer anything in return.
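The payoff rule just described can be written down directly. The sketch below is one literal reading of that description, with invented names; it is not the paper's implementation.

```python
def payoffs(connections, cooperators, benefit=1.0):
    """Payoff to each individual under the rule described above.

    Each cooperator hands out a fixed total `benefit`, split evenly among
    its connections, so the more links a cooperator has, the less each
    link receives. Every individual, cooperator or defector, collects
    whatever its cooperating neighbours pass along; defectors simply
    give nothing back.
    """
    received = {ind: 0.0 for ind in connections}
    for coop in cooperators:
        links = connections[coop]
        if links:
            share = benefit / len(links)    # fixed total, diluted per link
            for neighbour in links:
                received[neighbour] += share
    return received
```

Because a cooperator's total contribution is fixed, every extra link dilutes what each neighbour receives, which is the trade-off the model turns on.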

Somewhat intuitively, Akçay found that groups with low levels of random linking (that is, connections not made through a parent) were more likely to have cooperation emerge, because they resulted in high relatedness between connected individuals. In contrast, the probability of making connections through one's parent had a relatively small effect on cooperation. But when he let the model continue to run, he found something he hadn't anticipated.

"If you suddenly find yourself in a population where most individuals are cooperators," he says, "then you shouldn't be selective about who you connect to, you should just make links with anyone who comes along."

In other words, in a mostly cooperative population, making random links is just as beneficial as only making links to your parent's connections. That leads to a situation in which cooperators begin forming connections with defectors, triggering a decline in the overall cooperative nature of the network.

"If everyone is handing out candy," Akçay says, "you should just go collect candy from everyone without being too selective about the connection."

But Akçay did find a way to push back against the descent into defection. When making a social link is costly--such as the time primates spend grooming one another, or the effort that goes into remembering to send a holiday gift to distant relatives--the likelihood of making random links goes down, and so, too, does the probability that a cooperative society will collapse into selfishness.

In future work, Akçay hopes to consider other elements of social groups that influence the rise of cooperation beyond social network structure, including individual preference, life history, and the costs and benefits of cooperating. Factoring in these other aspects of a group dynamic may, he hopes, shed light on strategies to foster more cooperation and generosity in our own society.

Credit: 
University of Pennsylvania

The more you smoke, the greater your risk of a heart rhythm disorder

Sophia Antipolis, 12 July 2018: The more you smoke, the greater your risk of a heart rhythm disorder called atrial fibrillation. That's the finding of a study published today in the European Journal of Preventive Cardiology, a European Society of Cardiology (ESC) journal.(1)

The study found a 14% increase in the risk of atrial fibrillation for every ten cigarettes smoked per day. There was a linear dose-response relationship, meaning that the risk increased with each additional cigarette smoked.

Compared to people who had never smoked, current smokers had a 32% increased risk of atrial fibrillation, while ever smokers (current and former smokers combined) had a 21% increased risk, and former smokers had a 9% increased risk - providing further evidence of a dose-response relationship.

"If you smoke, stop smoking and if you don't smoke, don't start," said study author Dr Dagfinn Aune, postdoctoral researcher at Imperial College London, UK, and associate professor at Bjørknes University College in Oslo, Norway. "We found that smokers are at increased risk of atrial fibrillation, but the risk is reduced considerably in those who quit."

Smoking is a lethal addictive disorder.(2) A lifetime smoker has a 50% probability of dying due to smoking, and on average will lose ten years of life. Slightly less than half of lifetime smokers will continue smoking until death. The rate of smoking is declining in Europe, but it is still very common and is increasing in women, adolescents and the socially disadvantaged.

Atrial fibrillation is the most common heart rhythm disorder (arrhythmia). It causes 20-30% of all strokes and increases the risk of dying prematurely.(3) One in four middle-aged adults in Europe and the US will develop atrial fibrillation. It is estimated that by 2030 there will be 14-17 million patients with atrial fibrillation in the European Union, with 120,000-215,000 new diagnoses each year.

Few studies have assessed whether there is a dose-response relationship between the number of cigarettes smoked and the risk of atrial fibrillation. The authors of the current study investigated this issue by conducting a meta-analysis of 29 prospective studies from Europe, North America, Australia and Japan with a total of 39,282 incident cases of atrial fibrillation among 677,785 participants.

Compared to zero cigarettes per day, smoking five, ten, 15, 20, 25 and 29 cigarettes per day was associated with a 9%, 17%, 25%, 32%, 39%, and 45% increased risk of atrial fibrillation, respectively.

Every ten pack-years of smoking was associated with a 16% increased risk of developing atrial fibrillation. Pack-years are calculated by multiplying the number of packs of cigarettes smoked per day by the number of years the person has smoked.
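Taken at face value, the two dose-response figures above can be combined into a rough risk calculator. The sketch below is purely illustrative: the function names are invented, and it assumes the reported per-ten-unit increases compound multiplicatively, which is an extrapolation the study itself does not make.

```python
def pack_years(packs_per_day, years_smoked):
    """Pack-years as defined in the study: packs per day times years smoked."""
    return packs_per_day * years_smoked

def relative_risk(increase_per_ten, exposure):
    """Extrapolate a reported 'X% per ten units' increase to any exposure,
    assuming the increase compounds multiplicatively; an assumption of this
    sketch, not a claim of the meta-analysis, which fitted its own curve."""
    return (1.0 + increase_per_ten) ** (exposure / 10.0)

# Someone smoking one pack (20 cigarettes) a day for 15 years:
py = pack_years(1.0, 15)            # 15 pack-years
rr_cigs = relative_risk(0.14, 20)   # ~1.30, close to the reported 32% at 20/day
rr_py = relative_risk(0.16, py)     # ~1.25
```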

European guidelines on the prevention of cardiovascular disease recommend avoiding tobacco in any form.(2) All types of smoked tobacco, including low-tar ("mild" or "light") cigarettes, filtered cigarettes, cigars, pipes, and water pipes are harmful.

Dr Aune said: "Our results provide further evidence of the health benefits of quitting smoking and, even better, to never start smoking in the first place. This is important from a public health perspective to prevent atrial fibrillation and many other chronic diseases."

Dr Aune noted that more research is needed to identify the duration of smoking cessation needed to reduce the risk of atrial fibrillation, and whether the risk at some point reaches that of people who have never smoked.

Credit: 
European Society of Cardiology

The Lancet Psychiatry: Automated virtual reality-based psychological therapy may help reduce fear of heights

Psychological therapy delivered by a virtual reality coach can help people with a clinically diagnosed fear of heights overcome their fear, according to a randomised controlled trial of 100 people published in The Lancet Psychiatry journal.

The study is the first to use virtual reality technology as a treatment without a therapist, providing a proof of concept for how some psychological interventions might be offered in future. However, more research is needed to understand how automated therapy would apply in other conditions, including more severe mental health disorders such as psychosis, where therapy is currently delivered by experienced mental health professionals.

Having a fear of heights is the most common phobia, with one in five people reporting having a fear of heights during their lifetime, and one in 20 people clinically diagnosed with a fear of heights.

In previous research, people with a fear of heights used virtual reality training in sessions with a therapist. That research found that it was as effective as exposure to heights in real life, and that the reduced fear lasted for at least a year.

In the new study, 100 people with clinically diagnosed fear of heights who were not receiving psychological therapy for it were given either the new automated virtual reality treatment (49 people) or usual care, which was typically no treatment (51 people). On average, participants had suffered a fear of heights for 30 years.

All participants completed questionnaires on the severity of their fear of heights at the start of the trial, at the end of treatment (two weeks later), and at follow-up after four weeks.

Participants given the virtual reality treatment had roughly six 30-minute sessions over two weeks, where they wore a virtual reality headset. In the first session, participants discussed their fear of heights with the virtual coach, explaining what caused their fear (for example, fear of falling, fear of throwing oneself off the building, fear of the building collapsing) while the virtual coach gave basic information about fear of heights [1].

Participants then entered a virtual office complex with ten floors and a large atrium space, where they took part in activities that challenged their fears and helped them learn that they were safer than they thought. These started with simpler tasks, such as watching a safety barrier at the edge of a drop gradually lower, and built up to harder tasks, such as walking out on a platform over a large drop. Other tasks also included rescuing a cat from a tree, playing a xylophone near an edge, and throwing balls over the edge of a drop.

Throughout the activities the virtual coach offered encouragement, and afterwards they explained what the participant had learnt from their activities and asked whether they felt safer than before. The virtual coach also encouraged participants to try real heights between sessions.

Of the 49 participants offered the virtual reality treatment, 47 took part in at least one session, and 44 completed the full course of treatment. The three people who did not complete the intervention either found the sessions too difficult (two people) or were unable to attend further appointment sessions (one person).

At the end of treatment and at follow-up, control group participants rated their fear of heights as remaining similar, but all participants in the virtual reality treatment group reported that their fear of heights had decreased. By follow-up, 34 of 49 people in this group were no longer rated as having a fear of heights, compared with none of the 51 people in the control group.

There were no adverse events reported by any participants.

"Immersive virtual reality therapies that do not need a therapist have the potential to dramatically increase access to psychological interventions," says lead author Professor Daniel Freeman, University of Oxford, UK. "We need a greater number of skilled therapists, not fewer, but to meet the large demand for mental health treatment we also require powerful technological solutions. As seen in our clinical trial, virtual reality treatments have the potential to be effective, and faster and more appealing for many patients than traditional face-to-face therapies. With our unique automation of therapy using virtual reality there is the opportunity to provide really high quality treatment to many more people at an affordable cost. Our study is an important first step, and we are carrying out clinical testing to learn whether automation of psychological treatment using virtual reality works for other mental health disorders." [1]

The authors note some limitations in their study, including that they did not compare against the psychological therapies currently used to treat phobias, such as counselling, psychotherapy, or cognitive behavioural therapy. Instead, participants in the usual care group typically received no treatment. The participants also referred themselves to take part in the study, so may not be representative of all people with fear of heights.

The authors also note that they relied on questionnaires to assess participants' fear of heights, and did not test them in real-world scenarios.

The initial cost of software development was high, with a team of psychologists, programmers, script writers, and an actor working intensively for six months, but subsequent costs for the treatment were low, as a therapist does not need to be present and consumer virtual reality hardware is now inexpensive.

The trial did not assess which part of the treatment caused the improvements, and did not test long-term outcomes, although previous studies have shown that virtual reality treatment can reduce anxiety for several years. The authors note that the treatment was brief, and further benefits could be possible with a longer treatment duration.

Commenting on the virtual reality treatment, one participant said: "What I'm noticing is that in day-to-day life I'm much less averse to edges, and steps, and heights, and I'm noticing in myself that when I'm doing the VR and outside I'm able to say 'Hello' to the edge instead of bracing against it and backing up. When I'm doing the VR I'm, as best as I'm able to, being open and curious around me as much as I can and noticing how the anxiety feels in my body, and then noticing that it goes really quickly now. So, when I've always got anxious about an edge I could feel the adrenaline in my legs, that fight/flight thing; that's not happening as much now. I'm still getting a bit of a reaction to it, both in VR and outside as well, but it's much more brief, and I can then feel my thighs soften up as I'm not bracing up against that edge. I feel as if I'm making enormous progress, and feel very happy with what I've gained." [3]

Writing in a linked Comment, Dr Mark Hayward, University of Sussex, UK, welcomes the study's benefits for people with a fear of heights, but questions how virtual reality treatments could apply for people with more severe mental health problems. He says: "Psychological treatments for patients with psychosis face many challenges, because access to the treatments can be restricted and the treatment might generate only small effects. Symptom-specific treatments targeting either paranoia or auditory hallucinations are generating promising outcomes that might increase effect sizes, but their delivery in traditional face-to-face formats by expert therapists will do little to increase access (even when technology is utilised, such as in AVATAR therapy). VR is a promising method for delivering psychological treatments to patients with psychosis, but can a fully automated delivery system increase access? And are greater effects also possible because of the virtual exposure to everyday situations that are experienced as threatening?"

Credit: 
The Lancet

People trust scientific experts more than the government even when the evidence is outlandish

Members of the public in the UK and US have far greater trust in scientific experts than the government, according to a new study by Queen Mary University of London.

In three large-scale experiments, participants were asked to make several judgments about nudges - behavioural interventions designed to improve decisions in our day-to-day lives.


The nudges were introduced either by a group of leading scientific experts or a government working group consisting of special interest groups and policy makers.

Some of the nudges were real and had been implemented, such as using catchy pictures in stairwells to encourage people to take the stairs, while others were fictitious and implausible, such as stirring coffee anti-clockwise for two minutes to avoid any cancerous effects.

The study, published in the journal Basic and Applied Social Psychology, found that trust was higher for scientists than the government working group, even when the scientists were proposing fictitious nudges.

Co-author Professor Norman Fenton, from Queen Mary's School of Electronic Engineering and Computer Science, said: "While people judged genuine nudges as more plausible than fictitious nudges, people trusted some fictitious nudges proposed by scientists as more plausible than genuine nudges proposed by government. For example, people were more likely to trust the health benefits of coffee stirring than exercise if the former was recommended by scientists and the latter by government."

The results also revealed that there was a slight tendency for the US sample to find the nudges more plausible and more ethical overall compared to the UK sample.

Lead author Dr Magda Osman from Queen Mary's School of Biological and Chemical Sciences, said: "In the context of debates regarding the loss of trust in experts, what we show is that in actual fact, when compared to a government working group, the public in the US and UK judge scientists very favourably, so much so that they show greater levels of trust even when the interventions that are being proposed are implausible and most likely ineffective. This means that the public still have a high degree of trust in experts, in particular, in this case, social scientists."

She added: "The evidence suggests that trust in scientists is high, but that the public are sceptical about nudges in which they might be manipulated without them knowing. They consider these as less ethical, and trust the experts proposing them less than with nudges in which they do have an idea of what is going on."

Nudges have become highly popular decision-support methods used by governments to help in a wide range of areas such as health, personal finances, and general wellbeing.

The scientific claim is that, to help people make better decisions about their lifestyle choices and about those that improve the welfare of the state, it can be effective to subtly change the framing of the decision-making context so that the option maximising long-term future gains becomes more prominent.

In essence, the position adopted by nudge enthusiasts is that poor social outcomes are often the result of poor decision-making, and in order to address this, behavioural interventions such as nudges can be used to reduce the likelihood of poor decisions being made in the first place.

Dr Osman said: "Overall, the public make pretty sensible judgments, and what this shows is that people will scrutinise the information they are provided by experts, so long as they are given a means to do it. In other words, ask the questions in the right way, and people will show a level of scrutiny that is often not attributed to them. So, before there are strong claims made about public opinion about experts, and knee-jerk policy responses to this, it might be worth being a bit more careful about how the public are surveyed in the first place."

Credit: 
Queen Mary University of London

Healthy diet reduces asthma symptoms

People who eat a healthy diet experience fewer asthma symptoms and better control of their condition, according to a new study published in the European Respiratory Journal [1].

Diets with better asthma outcomes are characterised by being healthier, with greater consumption of fruits, vegetables and whole grain cereals. Unhealthy diets, with high consumption of meat, salt and sugar, have the poorest outcomes.

The study strengthens the evidence on the role of a healthy diet in managing asthma symptoms, and offers new insights on the potential impact of diet in the prevention of asthma in adults.

Lead researcher Dr Roland Andrianasolo, from the Nutritional Epidemiology Research Team at Inserm, Inra [2], and Paris 13 University said: "Existing research on the relationship between diet and asthma is inconclusive, and compared to other chronic diseases, the role of diet in asthma is still debated. This has resulted in a lack of clear nutritional recommendations for asthma prevention, and little guidance for people living with asthma on how to reduce their symptoms through diet.

"To address this gap, we wanted to make more detailed and precise assessments of dietary habits and the associations between several dietary scores and asthma symptoms, as well as the level of asthma control."

The research team analysed data from 34,776 French adults who answered a detailed respiratory questionnaire as part of the 2017 NutriNet-Santé study. Of these, 25% of men and 28% of women were identified by the researchers as having at least one asthma symptom. The number of asthma symptoms experienced by participants was measured using self-report data over a 12-month period.

To assess asthma control in the participants already living with asthma, the researchers used a self-administered questionnaire, which evaluates asthma control over a four-week period. Measures such as occurrence of asthma symptoms, use of emergency medication and limitations on activity indicated the level of asthma control.

Quality of diet was assessed based on three randomly collected 24-hour dietary records and each participant's adherence to three dietary scores. Generally, the dietary scores all considered diets with high fruit, vegetable and whole grain cereal intake as the healthiest, while diets high in meat, salt and sugar were the least healthy.
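
To make the scoring idea concrete, here is a hypothetical toy dietary score, not one of the three validated scores the study used: servings of food groups the release describes as healthy add points, and servings of groups described as unhealthy subtract them.

```python
# Hypothetical toy dietary score (illustrative only; the study used
# three validated dietary scores, not this scheme).
HEALTHY = {"fruit", "vegetables", "whole_grains"}
UNHEALTHY = {"meat", "salt", "sugar"}

def toy_diet_score(daily_servings):
    """Sum servings of healthy groups minus servings of unhealthy ones."""
    score = 0
    for food, servings in daily_servings.items():
        if food in HEALTHY:
            score += servings
        elif food in UNHEALTHY:
            score -= servings
    return score

day = {"fruit": 2, "vegetables": 3, "whole_grains": 1, "meat": 2, "sugar": 1}
print(toy_diet_score(day))  # 2 + 3 + 1 - 2 - 1
```

Real scores weight food groups and portion sizes far more carefully, but the direction of the arithmetic - healthier groups raise the score, unhealthier ones lower it - mirrors how the study's scores rank diets.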

The researchers adjusted their analysis to consider other factors known to be linked with asthma, such as smoking and exercise.

The data showed that, overall, men who ate a healthier diet had a 30% lower chance of experiencing asthma symptoms. In women with healthier diets, the chance of experiencing symptoms was 20% lower.

The research also showed that, among men with asthma, the likelihood of poorly controlled symptoms was around 60% lower in those who had healthy diets; among women with asthma, the likelihood of poorly controlled disease was 27% lower in those with healthy diets.
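
The release does not state whether these relative reductions are on the risk or the odds scale. As a purely hypothetical illustration (all numbers invented, not taken from the study), here is how a "60% lower" figure would translate into probabilities if it were a 60% reduction in odds - the resulting probability depends strongly on the baseline:

```python
# Hypothetical illustration: what a 60% reduction in odds implies
# for the outcome probability at different (invented) baselines.
def prob_after_odds_reduction(baseline_prob, reduction):
    """Probability after multiplying the odds by (1 - reduction)."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * (1 - reduction)
    return new_odds / (1 + new_odds)

for baseline in (0.10, 0.30, 0.50):
    p = prob_after_odds_reduction(baseline, 0.60)
    print(f"baseline {baseline:.0%} -> {p:.1%} after a 60% odds reduction")
```

This is why epidemiologists are careful to say whether a headline percentage refers to risk or odds: at low baselines the two are close, but at a 50% baseline a 60% odds reduction lowers the probability to about 29%, not 20%.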

The researchers say that the results suggest a healthy diet may have a role in preventing the onset of asthma as well as controlling asthma in adults. Dr Andrianasolo explained: "This study was designed to assess the role of an overall healthy diet on asthma symptoms and control, rather than identify particular specific foods or nutrients. Our results strongly encourage the promotion of healthy diets for preventing asthma symptoms and managing the disease.

"A healthy diet, as assessed by the dietary scores we used, is mostly made up of a high intake of fruit, vegetables and fibre. These have antioxidant and anti-inflammatory properties and are elements in a healthy diet that potentially lower symptoms. In contrast, the least healthy diets include high consumption of meat, salt and sugar, and these are elements with pro-inflammatory capacities that may potentially worsen symptoms of asthma."

The researchers note that caution is needed when interpreting the results from this study as it only provides a snapshot of the possible effects of diet on asthma, and say they plan to conduct longer-term studies in future to confirm their findings.

Dr Andrianasolo added: "Although further studies are needed to confirm our observations, our findings contribute to evidence on the role of diet in asthma, and extend and justify the need to continually support public health recommendations on promoting a healthy diet."

Professor Mina Gaga, President of the European Respiratory Society, and Medical Director and Head of the Respiratory Department of Athens Chest Hospital, said: "This research adds to the evidence on the importance of a healthy diet in managing asthma and its possible role in helping prevent the onset of asthma in adults. Healthcare professionals must find the time to discuss diet with their patients, as this research suggests it could play an important role in preventing asthma."

Credit: 
European Respiratory Society

Fuzzy yellow bats reveal evolutionary relationships in Kenya

image: This is one of the African yellow house bats studied by scientists to better understand the evolution of this family of bats.

Image: 
(c) P.W. Webala, Maasai Mara University

After Halloween, people tend to forget about bats. But, for farmers, residents of Kenya, and scientists, bats are a part of everyday life. While North America has 44 species, Kenya, a country the size of Texas, has 110 bat species. Many of these species also contain subspecies and further divisions that can make the bat family tree look like a tangled mess. Researchers set out to cut the clutter by sorting the lineages of yellow house bats and in the process found two new species.

The bats of Scotophilus vary in size and other characteristics but, in general, "They're cute. They look a lot like the bats you see in Chicago but they're this great yellow color," says Terry Demos, a Postdoctoral Fellow at Chicago's Field Museum and lead author of a recent paper in the journal Frontiers in Ecology and Evolution. These furry creatures can roost in the nooks and crannies of homes in Kenya. "These are bats that live with people--they don't call them house bats for nothing," adds Bruce Patterson, MacArthur Curator of Mammals at the Field Museum and co-author of the study. Bats usually don't fly too far to find a home either. Despite having wings, bats prefer to stay in a specific region, resulting in huge amounts of diversity throughout Africa.

Before understanding how these bat species related to one another, it was difficult to even research them. "We were using three different names for these bats in the field," says Patterson. That kind of evolutionary confusion is enough to make anyone batty. As Demos and Patterson explain, bats that look very similar could have wildly different genetic information. This means that new species could be hiding in plain sight due to their physical similarities to other species. The only way to solve this mystery is to use cutting-edge genetic analysis techniques.

Skin samples collected from the field in Kenya, combined with information from an online genetic database, provided clarity to species confusion. Comparing all the DNA sequences of the samples showed the amount of similarity. The more similar the DNA, the closer species are to each other evolutionarily. This information was then used to make a chart that looks like a tree, with branches coming off one point. The tree is similar to a family tree, but instead of showing the relationships between different family members, it shows the relationships between species. The results accomplished the goal of finding the limits of species but also showed unexpected results. Besides sorting the known species, the tree predicted at least two new bat species. "These new species are unknown to science," says Demos. "There was no reason to expect that we'd find two new species there." When Patterson saw these two undescribed species, he got excited: "It's cool because it says there's a chapter of evolution that no one's stumbled across before."
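
The distance logic described above - more similar DNA means a smaller evolutionary distance, and the closest pair is joined first when building the tree - can be sketched in a few lines. The sequences below are invented stand-ins, not the study's data, and the species labels are hypothetical:

```python
# Minimal sketch of DNA-distance comparison (hypothetical aligned
# sequences; real analyses use thousands of sites and model-based trees).
from itertools import combinations

sequences = {
    "Scotophilus_sp_A": "ACGTACGTACGTACGT",
    "Scotophilus_sp_B": "ACGTACGTACGAACGT",
    "Scotophilus_sp_C": "TCGAACGTTCGAACCT",
}

def p_distance(a, b):
    """Proportion of sites that differ between two aligned sequences."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Pairwise distances; the smallest value marks the closest relatives,
# which would be joined first when building the tree.
distances = {
    (n1, n2): p_distance(sequences[n1], sequences[n2])
    for n1, n2 in combinations(sequences, 2)
}
for pair, d in sorted(distances.items(), key=lambda kv: kv[1]):
    print(pair, round(d, 3))
```

In this toy example, species A and B differ at only 1 of 16 sites and would cluster together, while C sits on a longer branch - the same principle, scaled up, that let the team spot lineages too divergent to belong to any named species.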

These findings are not only interesting to scientists but to the local farming industry. Organic groceries at Trader Joe's would be next to impossible without bats. They act as a natural pesticide, eating insects that threaten crops. Besides farmers, local health officials also rely on bat research because bats can be disease vectors that threaten public health. Being able to understand bats means that scientists can protect public health and plates of food.

This unexpected finding attests to the diversity of life in Kenya and other tropical locales in Africa. The variety of species in these regions is not yet described because, "Africa is understudied, and its biodiversity is underestimated, and it's critical because there are threats to its biodiversity," says Demos. This research gives a framework for future scientists to categorize species of bats and describe new species.

In the United States, because our bats are well researched, there is an app that can recognize bat calls, kind of like Shazam for bats. Patterson plays bat sounds off his phone: "I recorded this in my driveway and an app was able to identify the bat. This is what we want to be able to do in the field someday." The next step in this research is using the genetic analysis of Scotophilus bats as a framework that allows scientists to categorize and eventually recognize species based on observable features, such as the chirps, squeaks, and sounds human ears can't hear.

Demos notes that it is important to better understand these mysterious flying mammals to help conservation and local farming efforts. This study surveying Kenya paves the way for exploring other regions using the same methods. Science has brought us closer to understanding how bat species relate to one another, but Patterson says there is still more to discover--"No interesting biological questions are ever fully answered, and progress towards answering them invariably opens up a variety of others."

Credit: 
Field Museum

Giant, recently extinct seabird also inhabited Japan

image: Geographic locations of Bering Island and Shiriya. The red area indicates the possible past distribution of the cormorant, eventually residing only on Bering Island.

Image: 
Kyoto University / Junya Watanabe

Japan -- Scientists report that a large, extinct seabird called the spectacled cormorant, Phalacrocorax perspicillatus -- originally thought to be restricted to Bering Island, far to the north -- also resided in Japan nearly 120,000 years ago.

Writing in The Auk: Ornithological Advances, the team indicates that the species underwent a drastic range contraction or shift, and that specimens found on Bering Island are 'relicts' -- remnants of a species that was once more widespread.

The global threat of human activity on species diversity is grave. To correctly assess related extinction events, it is imperative to study natural distributions before first contact with humans. This is where archaeological and fossil records play crucial roles.

The spectacled cormorant, a large-bodied seabird first discovered in the 18th century on Bering Island, was later driven to extinction through hunting, following colonization of the island by humans in the early 1800s.

"Before our report, there was no evidence that the cormorant lived outside of Bering Island," explains first author Junya Watanabe of Kyoto University's Department of Ge-ology and Mineralogy.

Studying bird fossils recovered from Shiriya, Aomori prefecture, Watanabe and his team identified 13 bones of the spectacled cormorant from upper Pleistocene deposits, formed nearly 120,000 years ago.

"It became clear that we were seeing a cormorant species much larger than any of the four native species in present-day Japan," states co-author Hiroshige Matsuoka. "At first we thought this might be a new species, but these fossils matched bones of the specta-cled cormorant stored at the Smithsonian Institution."

Changes of oceanographic conditions may be responsible for the local disappearance of the species in Japan. Paleoclimate studies show that oceanic productivity around Shiriya dropped drastically in the Last Glacial Maximum, around 20,000 years ago. This would have seriously affected the population of the cormorant.

Although it might be possible that hunting of the species by humans took place in prehistoric Japan, archaeological evidence of this has yet to be discovered. The entire picture of the extinction event of the spectacled cormorant may be more complex than previously thought.

"The cormorant was a gigantic animal, its large size thought to have been achieved through adaptation to the island-oriented lifestyle on Bering," adds Watanabe. "But our finding suggests that this might not have been the case; after all, it just resided there as a relict. The biological aspects of these animals deserve much more attention."

Credit: 
Kyoto University

Colorful celestial landscape

image: New observations with ESO's Very Large Telescope show the star cluster RCW 38 in all its glory. This image was taken during testing of the HAWK-I camera with the GRAAL adaptive optics system. It shows the cluster and its surrounding clouds of brightly glowing gas in exquisite detail, with dark tendrils of dust threading through the bright core of this young gathering of stars.

Image: 
ESO/K. Muzic

This image shows the star cluster RCW 38, as captured by the HAWK-I infrared imager mounted on ESO's Very Large Telescope (VLT) in Chile. By gazing into infrared wavelengths, HAWK-I can examine dust-shrouded star clusters like RCW 38, providing an unparalleled view of the stars forming within. This cluster contains hundreds of young, hot, massive stars, and lies some 5500 light-years away in the constellation of Vela (The Sails).

The central area of RCW 38 is visible here as a bright, blue-tinted region, an area inhabited by numerous very young stars and protostars that are still in the process of forming. The intense radiation pouring out from these newly born stars causes the surrounding gas to glow brightly. This is in stark contrast to the streams of cooler cosmic dust winding through the region, which glow gently in dark shades of red and orange. The contrast creates this spectacular scene -- a piece of celestial artwork.

Previous images of this region taken in optical wavelengths are strikingly different -- optical images appear emptier of stars due to dust and gas blocking our view of the cluster. Observations in the infrared, however, allow us to peer through the dust that obscures the view in the optical and delve into the heart of this star cluster.

HAWK-I is installed on Unit Telescope 4 (Yepun) of the VLT, and operates at near-infrared wavelengths. It has many scientific roles, including obtaining images of nearby galaxies or large nebulae as well as individual stars and exoplanets. GRAAL is an adaptive optics module which helps HAWK-I to produce these spectacular images. It makes use of four laser beams projected into the night sky, which act as artificial reference stars, used to correct for the effects of atmospheric turbulence -- providing a sharper image.

This image was captured as part of a series of test observations -- a process known as science verification -- for HAWK-I and GRAAL. These tests are an integral part of the commissioning of a new instrument on the VLT, and include a set of typical scientific observations that verify and demonstrate the capabilities of the new instrument.

Credit: 
ESO

New approach to treating infectious diseases as an alternative to antibiotics

image: Fig.1. The mechanism of initial attachment of ETEC to human intestinal epithelium and its inhibition by antibodies.

Image: 
Osaka University

Enterotoxigenic Escherichia coli (ETEC) is known as a major cause of diarrhea in travelers and people living in developing countries. According to the World Health Organization (WHO), ETEC is responsible for 300,000 to 500,000 deaths a year, constituting a serious problem.

Effective vaccines for ETEC have not been developed, so patients infected with ETEC are treated with antibiotics and supportive measures. However, the emergence of multidrug-resistant bacteria has become a social issue, so new treatment methods are being sought.

Adherence to the host intestinal epithelium is an essential step for ETEC infection in humans. It was thought that a filamentous structure on the surface of bacteria called 'type IV pilus' was important for bacterial attachment, but its detailed adhesion mechanism was not known.

Osaka University-led researchers clarified how pathogenic E. coli attached to the host intestinal epithelium using type IV pili and secreted proteins. Their research results were published in PNAS.

One of the corresponding authors, Shota Nakamura, says, "We demonstrated that type IV pili on the surface of the bacteria were not sufficient for ETEC adherence to intestinal epithelial cells and that proteins secreted by E. coli were also necessary. The administration of antibodies against the secreted proteins inhibited attachment of the E. coli."

Using X-ray crystallography, the researchers studied how a protein located only at the pilus-tip interacts with a protein secreted by E. coli in the intestines (Figure 2), clarifying the attachment mechanism of ETEC; that is, secreted proteins serve as molecular bridges that bind both type IV pili on the surface of the bacteria and intestinal epithelial cells in humans.

Nakamura also says, "It's possible that this attachment is a common feature in many type IV pili expressing enteropathogens such as Vibrio cholerae and constitutes a new therapeutic target against such bacterial pathogens."

Their research results will lead to the development of not only new vaccines for ETEC, but also anti-adhesion agents for preventing the binding of proteins implicated in bacterial attachment. Anti-adhesion agents can rinse pathogenic bacteria out from the body without destroying them, so there is no danger of producing drug-resistant bacteria. These agents, once developed, will act as a novel treatment approach that may serve as an alternative to antibiotics.

Credit: 
Osaka University

New report says individual research results should be shared with participants more often

WASHINGTON - When conducting research involving the testing of human biospecimens, investigators and their institutions should routinely consider whether and how to return individual research results on a study-specific basis through an informed decision-making process, says a new report from the National Academies of Sciences, Engineering, and Medicine. Decisions on whether to return individual research results will vary depending on the characteristics of the research, the nature of the results, and the interests of participants.

The undertaking of biomedical research with human participants -- from exploratory, basic science inquiries to clinical trials using well-validated tests -- often includes development of laboratory test results associated with an individual research participant. Recent changes to federal regulations have promoted transparency and allowed individuals greater access to these results; however, regulations are not consistent, the report says. For example, the Centers for Medicare & Medicaid Services (CMS) prohibits the return of results from laboratories that are not certified under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), but in some circumstances the Health Insurance Portability and Accountability Act of 1996 (HIPAA) may require the return of results requested by a participant, regardless of whether they were generated in a CLIA-certified laboratory. CLIA requirements ensure the quality and integrity of data, accurate reconstruction of test validation and test performance, and the comparability of test results regardless of performance location.

"There is a long-standing tension in biomedical research arising from a conflict in core values - the desire to respect the interests of research participants by communicating results versus the responsibility to protect participants from uncertain, perhaps poorly validated information," said Jeffrey Botkin, associate vice president for research and professor of pediatrics at University of Utah and chair of the study committee that wrote the report. "In weighing the complex and competing considerations, we recommend a transition away from firm rules embodied in current CLIA and HIPAA regulations toward a process-oriented approach favoring communication of results while seeking to enhance the quality of results emerging from research laboratories. Our hope is that this report will provide a road map toward better and more collaborative and transparent research practices that will benefit participants, investigators, and society more broadly."

The justification for returning results becomes stronger as both the potential value of the result to participants and the feasibility of return increase, the report says. To harmonize relevant regulations, regulators and policymakers should revise them in a way that respects the interests of research participants in obtaining individual research results and balances the competing considerations of safety, quality, and burdens on the research enterprise. For example, CMS should revise CLIA regulations to allow for the return of results from non-CLIA certified laboratories when results are requested under the HIPAA access right and also when an institutional review board process determines it is permissible.

Establishing laboratory processes to give all stakeholders confidence in the validity of the individual research results is critical to ensuring the accuracy of information provided to research participants as well as the quality of the science. Currently, there is no accepted quality management system (QMS) for research laboratories that could serve as an alternative to CLIA certification. The committee recommended that the National Institutes of Health lead an effort with other relevant federal agencies, nongovernmental organizations, and patient and community groups to develop a QMS with external accountability for non-CLIA certified research laboratories testing human biospecimens.

To minimize the burden for research laboratories with constrained resources to put such a QMS in place, sponsors, funding agencies, and research institutions should facilitate access to resources and support training and the development of the necessary laboratory infrastructure. The initial training, cost, and time commitment will likely be significant, but the value added will be considerable for both participants and science, the report says.

Furthermore, the use of effective communication strategies can minimize the risk of misinterpretation or over-interpretation of research results. In the consent process, investigators should communicate clearly to research participants whether, under what circumstances, and how investigators will offer and return research results. When individual results are communicated to participants, investigators should facilitate understanding of the meaning and limitations of results by, for example, ensuring there is a clear take-away message, explaining the level of uncertainty, and providing mechanisms for the participants to obtain additional information and guidance for follow-up consultation, when appropriate.

The report also includes recommendations for investigators to engage community groups and advocacy organizations to make sure participant needs and values are incorporated into decisions about returning individual results, regardless of participant social or economic status, and for research sponsors to require planning for the return of individual results in funding applications.

Credit: 
National Academies of Sciences, Engineering, and Medicine