Culture

Identifying colorectal cancer subtypes could lead to improved treatment decisions

Image: Heinz-Josef Lenz, MD. Credit: Ricardo Carrasco III, Keck Medicine of USC

LOS ANGELES - Colorectal cancer is the second leading cause of cancer death in the U.S., expected to cause about 51,000 deaths in 2019. But until now, it was unclear which drugs were most effective for which patients.

Researchers at the USC Norris Comprehensive Cancer Center found that identifying a metastatic colorectal cancer patient's Consensus Molecular Subtype (CMS) could help oncologists determine the most effective course of treatment. CMS also had prognostic value, meaning each subgroup was indicative of a patient's overall survival, regardless of therapy. The results come from the multi-center Phase III CALGB/SWOG 80405 trial and were published in the Journal of Clinical Oncology.

CMS categorizes colorectal cancer into four distinct, biologically characterized subgroups based on how mutations in the tumor behave. The subgroups were created using data from several research teams around the world that had previously analyzed tumors of colorectal cancer patients who were treated with surgery and adjuvant chemotherapy. Although CMS classification is not based on clinical outcomes, there seemed to be patterns in how different subtypes responded to treatment.

"We wanted to understand the importance of CMS for patients with metastatic disease who are treated with the two most important first-line therapies," says Heinz-Josef Lenz, MD, Professor of Medicine in the Division of Oncology at the Keck School of Medicine of USC and J. Terrence Lanni Chair in Gastrointestinal Cancer Research at USC Norris. Lenz was the lead author on the study. "We anticipated that CMS had prognostic value, but we were impressed at how strongly CMS was associated with outcomes."

The study compared the efficacy of two therapies (chemotherapy plus cetuximab versus chemotherapy plus bevacizumab) in 581 metastatic colorectal cancer patients categorized by CMS. The data showed a strong association between a patient's CMS subtype and both overall survival and progression-free survival. For example, patients in CMS2 had a median overall survival of 40 months, compared with 15 months for patients in CMS1.

CMS was also predictive of overall survival under each treatment, with patients in certain subtypes faring better on one therapy than on the other. Survival for CMS1 patients on bevacizumab was twice that of those on cetuximab, whereas survival for CMS2 patients on cetuximab was six months longer than on bevacizumab.
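
The press release does not include the underlying survival analysis, but median overall survival figures like these are typically read off Kaplan-Meier curves fit per subtype. Here is a minimal, hypothetical sketch (the lifelines library, toy data, and column names are illustrative assumptions, not the CALGB/SWOG 80405 code):

```python
# Hypothetical sketch: Kaplan-Meier median overall survival by CMS subtype.
# Toy data and column names are invented; this is not the study's dataset or code.
import pandas as pd
from lifelines import KaplanMeierFitter

patients = pd.DataFrame({
    "os_months": [40, 52, 33, 15, 12, 18, 44, 9],   # follow-up in months
    "died":      [1,  0,  1,  1,  1,  0,  1,  1],   # 1 = death observed
    "cms":       ["CMS2", "CMS2", "CMS2", "CMS1",
                  "CMS1", "CMS1", "CMS2", "CMS1"],
})

kmf = KaplanMeierFitter()
for subtype, group in patients.groupby("cms"):
    kmf.fit(group["os_months"], event_observed=group["died"])
    print(subtype, "median OS (months):", kmf.median_survival_time_)
```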

"This study establishes the clinical utility of CMS in treating colorectal cancer," Lenz says. "It also provides the basis for more research to uncover additional clinically significant predictive signatures within these subtypes that might better personalize patient care."

Currently, it is not possible to order patient subtyping, though multiple efforts are underway to develop an assay approved for clinical use. Lenz estimates that this could happen in a matter of months. Until then, he and his colleagues continue to analyze data from more than 44,000 samples of blood, tissue and plasma in one of the largest, most comprehensive research efforts to characterize tumor DNA, RNA and germline DNA in colon cancer. "This is only one study of many more to come that will help us understand this disease at the molecular level so we can provide better care for patients," Lenz says.

Credit: 
University of Southern California - Health Sciences

Algorithm to transform investment banking with higher returns

Image: Dr Arman Hassanniakalager of the University of Bath's School of Management. Credit: University of Bath

A University of Bath researcher has created an algorithm which aims to remove the elements of chance, bias or emotion from investment banking decisions, a development which has the potential to reduce errors in financial decision making and improve financial returns in global markets.

"There is a global race to find a viable solution to create more reliable - and better performing - investment decisions in financial trading. Our model offers consistently higher returns compared to others developed to date," says Dr Arman Hassanniakalager of the university's School of Management.

Hassanniakalager, who will present the research at the Financial Management Association conference in Glasgow this week, says his model has been shown to deliver a 3% higher return than the benchmark US federal funds rate, based on evidence from 12 stock market indices from around the globe. An improvement of 0.5-1.0% would be regarded as significant.

The search for an all-powerful investment algorithm has stepped up in recent years and early results have been mixed. The challenge is to create a level of reliability that consistently outperforms investment bankers and financiers and a tool that can function equally well in rising and falling markets.

The continued development of algorithms and their perceived benefits are raising hopes and optimism among many in the markets. But the increasing reliance on the tools has also created some nervousness in the top tiers of the world's financial systems - and some scepticism from those who believe there will always be a role for the inspired human touch.

Hassanniakalager, whose expertise is in developing novel artificial intelligence and statistical methods for financial decision making, said his algorithm has reached the point where it is consistently outperforming both conventional methods of investment and algorithmic tools.

"There is a lot of theoretical thinking and aspirations around about such investment tools but the key question is solving how to make them work in the real world. We think we have addressed that question," Hassanniakalager said.

The algorithm can be linked to artificial intelligence, which will learn from investment decisions and fine-tune itself automatically. He envisages a black-box solution for investment managers who will be able to run complex alternative investment scenarios in real time.

The primary use would be in trading rooms, in particular in the technical analysis field, assessing how stock markets react to company news or in gauging the performance of derivative instruments and offering different investment paths to managers.
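
The release gives no detail on how the algorithm actually works, so nothing below should be read as its logic. Purely as an illustration of the kind of technical-analysis rule such tools evaluate, a naive moving-average crossover backtest against a fixed benchmark rate might look like this (prices, windows, and the benchmark figure are all invented):

```python
# Illustrative sketch only: a naive moving-average crossover backtest.
# This is NOT Hassanniakalager's algorithm; prices, windows, and the
# benchmark rate are invented for demonstration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Simulated daily closing prices for one trading year
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 252))))

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
# Hold the index when the fast average is above the slow one, else stay in cash.
position = (fast > slow).shift(1, fill_value=False)

daily_returns = prices.pct_change().fillna(0.0)
strategy_return = (1 + daily_returns[position]).prod() - 1
benchmark_rate = 0.024  # stand-in for a risk-free benchmark rate

print(f"strategy: {strategy_return:.1%}, benchmark: {benchmark_rate:.1%}")
```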

The tool will change the decision-making process and potentially the market landscape itself - the days of multiple screens in trading rooms and managers seeking to make sense of an increasingly complex multitude of real-time and historic data may be numbered.

There may even be a question mark over the future of decision-makers themselves.

"Whoever succeeds in this has the potential to transform financial markets and particularly investment banking and equities trading. There will be winners and losers - it isn't hard to imagine the radical impact on employment at the highest banking levels if investment decisions are increasingly automated," Hassanniakalager says.

The algorithm, which Hassanniakalager describes as universal, may have applications beyond financial markets. "If you learn what is changing statistically, you can apply that to other fields, such as genetics. That's the beauty of statistics," he says.

Hassanniakalager will present the findings of the research team, which includes academics from the Universities of Glasgow and St Andrews, on Friday 14 June at the FMA International conference at the University of Strathclyde.

Credit: 
University of Bath

Cancer survival rates in the young show inconsistent progress

A new study in JNCI Cancer Spectrum finds that dramatic increases in cancer survival in adolescents and young adults are undermined by continuing disparities by race, ethnicity, and socioeconomic status. The patterns here suggest that most of the recent survival increases in this age group were driven by improvements in treatments for HIV/AIDS and related cancers.

In a 2006 report, Closing the Gap, the National Cancer Institute Progress Review Group reported that from 1977 to 1997, adolescents and young adults (people between 15 and 39) demonstrated an alarming lack of improvement in cancer survival compared with other age groups.

The report served as a call to action. Thereafter several groups worked to improve outcomes for adolescents and young adults with cancer. These efforts included specific research initiatives and establishing an oncology discipline committee for this age group in the Children's Oncology Group and in other adult oncology cooperative groups.

This latest study, looking at the more recent period of 1988-2014, shows that among 30 to 34 year olds with cancer, the 5-year survival rate increased by 20.6% in males and 4.2% in females from 1988-2000 to 2001-2014. Among 35 to 39 year olds, the 5-year survival rate increased by 18.9% in males and 4.2% in females. Among males of all ages, survival improvement was greatest for adolescents and young adults.

This study also examined the changes in survival among adolescents and young adults over two time periods: 1988-2000 and 2001-2014. There were no differences in survival between the time periods for adolescents and young adults with bone/soft tissue sarcoma, ovarian carcinoma, ovarian germ cell tumor, stomach cancer, testicular cancer, thyroid cancer, and uterine cancer. Cancer types showing the greatest improvement in survival between the two time periods were Kaposi sarcoma, chronic myeloid leukemia, and non-Hodgkin lymphoma. There were no cancer types that had worse survival after adjusting for cancer and demographic factors. Both Kaposi sarcoma and non-Hodgkin lymphoma are cancers that disproportionately affect people with HIV/AIDS.

Subgroups of adolescents and young adults that showed worsening trends in survival between the two time periods included black people with bone/soft tissue sarcoma, 20 to 24 year old women with cervical cancer, and poor adolescents and young adults with cervical cancer. The survival improvements in Kaposi sarcoma and non-Hodgkin lymphoma were more evident in those from higher socioeconomic groups. In addition, survival improvements in brain cancer, colorectal cancer, melanoma, stomach cancer, and testicular cancer appear to be greater among whites than among other racial/ethnic groups.

For race/ethnicity, compared to white adolescents and young adults, black patients had the highest risk of dying for all cancers combined, a disparity present in almost every cancer type, followed by Asian-Pacific Islanders and Latinos. For every racial/ethnic minority group compared to whites, survival disparities for all cancers combined worsened over time. For all cancers, poorer patients had an increased risk of death compared to those of higher socioeconomic status; and like the racial/ethnic disparities, this risk increased over time. The largest risk factor for death in adolescents and young adults was stage of disease, with those with advanced stage cancer having the worst outcomes.

It appears the gap in survival improvement observed in this age group from 1977 to 1997 was driven largely by the rise and peak of HIV/AIDS-related cancers in males. Similarly, the dramatic survival improvement researchers found among adolescents and young adults in this study from 1988 to 2014, especially among males, reflects a baseline period that coincided with the height of the HIV/AIDS epidemic; the large improvement essentially resulted from the epidemic's decline.

Even though the historical gap in survival improvement within this age group has largely been closed with the advent of effective treatment for HIV/AIDS, adolescents and young adults are a vulnerable population with unique challenges.

"This highlights that continued research is required in this vulnerable age group," said the paper's lead author, Diana Moke. "This research is necessary to improve survival in certain cancer types that have not shown any recent progress, find novel therapies for advanced stage disease in all cancer types, and ensure that survival improvements reach all young patients, regardless of race/ethnicity or socioeconomic status."

Credit: 
Oxford University Press USA

Epilepsy drugs linked to increased risk of suicidal behavior, particularly in young people

Treatment with gabapentinoids - a group of drugs used for epilepsy, nerve pain and anxiety disorders - is associated with an increased risk of suicidal behaviour, unintentional overdose, injuries, and road traffic incidents, finds a study from Sweden published by The BMJ today.

Prescriptions have risen steeply in recent years, and gabapentinoids are among the top 15 drugs globally in terms of revenue.

The risks are strongest among 15 to 24 year-olds, prompting the researchers to suggest that treatment guidelines for young people should be reviewed.

Previous studies have linked gabapentinoids to suicidal behaviour and overdose related deaths, but findings have been inconsistent and data on longer term harms are lacking.

Concerns that these drugs are also being used as an opioid substitute and for recreational use have led to prescribing restrictions in several countries, including the UK.

To help fill this evidence gap, an international research team examined associations between gabapentinoids and a range of harms including suicidal behaviour, unintentional overdose, injuries, road traffic incidents, and violent crime.

Using national prescription, patient, death, and crime registers, they identified 191,973 people aged 15 years and older who were prescribed pregabalin or gabapentin in Sweden between 2006 and 2013.

Overall, 59% of participants were women, and most were 45 years or older.

The researchers then compared the risk of harms during treatment periods with baseline risk during periods without treatment.

After taking account of potentially influential factors, they found that during treatment periods, participants were at a 26% increased risk of suicidal behaviour or death from suicide, a 24% increased risk of unintentional overdose, a 22% increased risk of head or body injuries, and a 13% increased risk of road traffic incidents or offences.
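
The core quantity behind figures like these is a comparison of event rates during treated and untreated person-time. As a hypothetical sketch of that arithmetic (the counts are invented; the study's actual within-individual models adjust for much more):

```python
# Hypothetical sketch of a crude incidence rate ratio (IRR) comparing
# event rates during treated vs untreated person-time. Counts are
# invented; the study's within-individual models are more sophisticated.
def incidence_rate_ratio(events_on, person_years_on,
                         events_off, person_years_off):
    rate_on = events_on / person_years_on
    rate_off = events_off / person_years_off
    return rate_on / rate_off

# e.g. 126 events over 10,000 treated person-years vs
#      100 events over 10,000 untreated person-years -> IRR = 1.26,
#      i.e. a 26% increased rate during treatment periods
print(incidence_rate_ratio(126, 10_000, 100, 10_000))
```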

There were no statistically significant associations between gabapentinoid treatment and violent crime.

When drugs were examined separately, only pregabalin, not gabapentin, was associated with increased risks of harm.

And when the results were analysed by age, risks were greatest among 15 to 24 year-olds. This could be due to impulsivity and risk taking behaviour, or use of alcohol and illicit drugs alongside gabapentinoids, suggest the authors.

This is an observational study, so it can't establish cause, and the researchers weren't able to account for drug adherence or any interplay between alcohol and illicit drug use. Nevertheless, this was a large study that examined a wide range of outcomes and was designed to minimise the effects of unmeasured (confounding) factors.

Further research is needed to better understand the increased risks found in adolescents and young adults prescribed gabapentinoids, particularly for suicidal behaviour and unintentional overdoses, while clinical guidelines may also need review, they conclude.

In a linked editorial, Derek Tracy, a consultant psychiatrist at Queen Mary's Hospital in London, says these findings provide "solid data to inform patients on the risks associated with treatment."

The findings suggest it might be time to uncouple pregabalin and gabapentin for the purposes of legislation and guidelines, he writes. "We also need to understand what is driving the age related differences in risks and how recent legal restrictions will affect the illicit market in diverted drugs."

Despite reasonable concerns, gabapentinoids "remain a valued therapeutic option for many people," he concludes. "Medicines can harm as well as heal, and the best treatment decisions are made in full partnership with patients, after consideration of all available evidence on both."

Peer-reviewed? Yes (research); No (linked editorial)
Evidence type: Observational; Opinion
Subjects: People

Credit: 
BMJ Group

Braces won't always bring happiness

Research undertaken at the University of Adelaide overturns the belief that turning your crooked teeth into a beautiful smile will automatically boost your self-confidence.

The study, carried out by Dr Esma Dogramaci and Professor David Brennan from the University of Adelaide's Dental School, followed 448 13-year-olds from South Australia starting in 1988 and 1989. By the time they turned 30, in 2005 and 2006, more than a third of them had received orthodontic treatment.

"The study, which is the first of its type undertaken in Australia and only the second in the world, examined if having braces lead to a greater level of happiness or psychosocial outcomes, later in life," says Dr Dogramaci.

"There was a pattern of higher psychosocial scores in people who did not have orthodontic treatment meaning people who hadn't had braces fitted were significantly more optimistic than the ones that did have braces.

"Those who didn't have braces had varying levels of crooked teeth, just like those who had braces treatment - ranging from mild through to very severe."

The study looked at four psychosocial aspects: how well people felt they coped with new or difficult situations and associated setbacks; how much they felt they could take care of their own health; the support they believed they received from their personal network; and their own level of optimism.

"These indicators were chosen because they are important for psychosocial functioning and are relevant to health behaviours and health outcomes; since the core research question was the impact of braces treatment on patients' self-confidence and happiness in later life," says Dr Dogramaci.

Fourth year dental student Alex Furlan has never had braces fitted: "My orthodontist recommended that I have braces fitted but I'm quite happy without them. I've never felt the need to straighten my teeth - I can get on in life without having perfectly straight teeth," he says.

"A lot of people are convinced that if they have braces, they will feel more positive about themselves and do well, psychosocially, in later life. This study confirmed that other factors play a role in predicting psychosocial functioning as adults - braces as a youngster was not one of them," says Dr Dogramacci.

"But brushing at least twice a day and seeing a dentist regularly were amongst the factors related to better psychosocial scores."

"On a population level, those who have never had braces were more positive than those who had braces. While experiencing braces treatment won't guarantee happiness later in life, brushing teeth twice a day and seeing a dentist for regular check-ups will help to keep you healthy and happy."

Credit: 
University of Adelaide

Determining risk of recurrence in triple-negative breast cancer

Image: Katherine Varley in her lab at Huntsman Cancer Institute at the University of Utah. Credit: Huntsman Cancer Institute

SALT LAKE CITY - A personalized prognosis for patients diagnosed with triple-negative breast cancer was the goal of a new study by Katherine Varley, PhD, researcher at Huntsman Cancer Institute (HCI) and assistant professor of oncological sciences at the University of Utah.

Twenty percent of women diagnosed with breast cancer in the United States will learn they have triple-negative breast cancer. That diagnosis means the three most common proteins known to fuel breast cancer growth - estrogen receptor, progesterone receptor, and HER2 - are not present in the tumor. Because their tumors lack these targets, those patients will not respond to the targeted therapies developed to treat breast cancers with those characteristics. After surgery, their only treatment option is chemotherapy. Targeted therapy allows healthy cells to survive, but chemotherapy can kill normal cells while eliminating the cancer cells.

Sixty percent of patients with triple-negative breast cancer will survive more than five years without disease, but four out of ten women will have a rapid recurrence of the disease. There are currently no clinical tests to assess an individual patient's prognosis, so all patients receive aggressive chemotherapy that can include up to four chemotherapy drugs and six months of treatment. Varley's new findings, recently published in Cancer Research, could change that. "We could very accurately predict which patients were going to have long-term disease-free survival and which patients were likely to have recurring disease. This is very exciting because it could be the first clinical test to enable personalized prognosis for triple-negative breast cancer patients," said Varley.

Varley previously discovered triple-negative breast cancer patients, whose tumors naturally turned on an immune response, were disease-free for much longer than those who did not. The objective of the new study was to find a way to translate this discovery into a clinical test to determine which patients have an inherently good prognosis and might safely be treated with less aggressive therapy. "That's significant because chemotherapy can lead to long-term heart and nerve problems," Varley noted. "If we can understand which patients need aggressive treatment and which patients will likely do well with less aggressive treatment, we could make a big difference in their lives."

Varley worked closely on the study with Rachel Stewart, DO, PhD, assistant professor of pathology and laboratory medicine at the University of Kentucky. They used specimens from patients treated at HCI. The tumor samples were taken more than five years ago, so the researchers could determine how each patient fared in the long term. The next step was developing a way to test for biomarkers of the immune response. The biomarker test was developed using formalin-fixed, paraffin-embedded tissues. This is important because it means this test can be run on tumor biopsy specimens that are routinely collected for breast cancer diagnosis.
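
The release doesn't specify how the biomarker readout is turned into a prognosis, so the following is purely a hypothetical sketch of a gene-signature score of the general kind such tests compute (gene names, expression values, and the median cut-off are invented stand-ins):

```python
# Hypothetical sketch: scoring tumors by the mean expression of an
# immune-response gene signature and splitting into prognosis groups.
# Gene names, values, and the cut-off are invented stand-ins, not the
# actual assay described in the study.
import pandas as pd

expression = pd.DataFrame(
    {"CXCL9": [8.2, 2.1, 7.5, 1.9],
     "GZMB":  [7.8, 1.5, 6.9, 2.3],
     "PRF1":  [6.9, 2.0, 7.2, 1.8]},
    index=["tumor_A", "tumor_B", "tumor_C", "tumor_D"],
)

immune_score = expression.mean(axis=1)            # per-tumor signature score
good_prognosis = immune_score >= immune_score.median()
print(pd.DataFrame({"immune_score": immune_score,
                    "predicted_good_prognosis": good_prognosis}))
```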

The research team is currently applying the test to triple-negative breast cancer patient samples from clinical trials of chemotherapy and immunotherapy. Their next step is to validate that the test can be used to predict prognosis and choose the most effective and safest treatments. They are also investigating whether this test could be used for patients with HER2 positive breast cancer, lung cancer, ovarian cancer, and melanoma because the immune response is similar in those diseases.

"We're working as fast as possible to validate the test so it can benefit patients," said Varley. "One of my goals is to translate the discoveries we make in basic science and in our genomics research into clinical tests because I know patients are waiting."

Credit: 
Huntsman Cancer Institute

Increasing red meat consumption linked with higher risk of premature death

People who increased their daily servings of red meat over an eight-year period were more likely to die during the subsequent eight years compared to people who did not increase their red meat consumption, according to a new study led by researchers from Harvard T.H. Chan School of Public Health. The study also found that decreasing red meat and simultaneously increasing healthy alternative food choices over time was associated with lower mortality.

The study will appear online June 12, 2019 in BMJ.

A large body of evidence has shown that higher consumption of red meat, especially processed red meat, is associated with higher risk of type 2 diabetes, cardiovascular disease, certain types of cancers including those of the colon and rectum, and premature death. This is the first longitudinal study to examine how changes in red meat consumption over time may influence risk of early death.

For this study, researchers used health data from 53,553 women in The Nurses' Health Study and 27,916 men in the Health Professionals Follow-up Study who were free of cardiovascular disease and cancer at baseline. They looked at whether changes in red meat consumption from 1986-1994 predicted mortality in 1994-2002, and whether changes from 1994-2002 predicted mortality in 2002-2010.

Increasing total processed meat intake by half a daily serving or more was associated with a 13% higher risk of mortality from all causes. The same amount of unprocessed meat increased mortality risk by 9%. The researchers also found significant associations between increased red meat consumption and increased deaths due to cardiovascular disease, respiratory disease, and neurodegenerative disease.

The association of increases in red meat consumption with increased relative risk of premature mortality was consistent across participants irrespective of age, physical activity level, dietary quality, smoking status, or alcohol consumption, according to the researchers.

Study results also showed that, overall, a decrease in red meat together with an increase in nuts, fish, poultry without skin, dairy, eggs, whole grains, or vegetables over eight years was associated with a lower risk of death in the subsequent eight years.

The researchers suggest that the association between red meat consumption and increased risk of death may be due to a combination of components that promote cardiometabolic disturbances, including saturated fat, cholesterol, heme iron, preservatives, and carcinogenic compounds produced by high-temperature cooking. Red meat consumption has also recently been linked to the gut microbiota-derived metabolite trimethylamine N-oxide (TMAO), which may promote atherosclerosis.

"This long-term study provides further evidence that reducing red meat intake while eating other protein foods or more whole grains and vegetables may reduce risk of premature death. To improve both human health and environmental sustainability, it is important to adopt a Mediterranean-style or other diet that emphasizes healthy plant foods," said senior author Frank Hu, Fredrick J. Stare Professor of Nutrition and Epidemiology and chair, Department of Nutrition.

Credit: 
Harvard T.H. Chan School of Public Health

Increasing red meat intake linked with heightened risk of death

Increasing red meat intake, particularly processed red meat, is associated with a heightened risk of death, suggests a large US study published in The BMJ today.

However, reducing red meat intake while increasing healthy protein sources, such as eggs and fish, whole grains and vegetables over time may lower the risk, the researchers say.

High intake of red meat, such as beef, pork and lamb, has been previously linked with a higher risk of type 2 diabetes, cardiovascular disease, certain types of cancers, and premature death. But little is known about how changes in red meat intake may influence risk of death.

So to explore this further, a team of researchers based in the US and China looked at the link between changes in red meat consumption over an eight year period and mortality during the next eight years, starting from 1986 to the end of follow-up in 2010.

They used data for 53,553 US registered female nurses, aged 30 to 55, from the Nurses' Health Study (NHS) and 27,916 US male health professionals, aged 40 to 75, from the Health Professionals Follow-up Study (HPFS), who were free of cardiovascular disease and cancer at the start of the study.

Every four years the participants completed a food frequency questionnaire (FFQ) where they were asked how often, on average, they ate each food of a standard portion size in the past year, ranging from "never or less than once per month" to "6 or more times a day". They were then divided into five categories based on their changes in red meat intake.
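
As a hypothetical sketch of that categorization step (column names, serving values, and the category cut-offs below are invented for illustration, not the study's definitions):

```python
# Hypothetical sketch: binning participants into five categories by
# change in red meat intake between two FFQ cycles. Column names,
# values, and cut-offs are invented for illustration.
import pandas as pd

ffq = pd.DataFrame({
    "servings_per_week_1986": [3.5, 7.0, 1.0, 10.5, 5.0, 2.0],
    "servings_per_week_1994": [7.0, 3.5, 1.0, 14.0, 2.0, 6.0],
})
ffq["change"] = (ffq["servings_per_week_1994"]
                 - ffq["servings_per_week_1986"])

# Five categories of change, from large decrease to large increase
bins = [-float("inf"), -3.5, -0.5, 0.5, 3.5, float("inf")]
labels = ["large decrease", "moderate decrease", "stable",
          "moderate increase", "large increase"]
ffq["change_category"] = pd.cut(ffq["change"], bins=bins, labels=labels)
print(ffq[["change", "change_category"]])
```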

During the study period, the total number of deaths from any cause (known as "all cause mortality") reached 14,019 (8,426 women and 5,593 men). The leading causes were cardiovascular disease, cancer, respiratory disease and neurodegenerative disease.

After adjusting for age and other potentially influential factors, increasing total red meat intake (both processed and unprocessed) by 3.5 servings a week or more over an eight year period was associated with a 10% higher risk of death in the next eight years.

Similarly, increasing processed red meat intake, such as bacon, hot dogs, sausages and salami, by 3.5 servings a week or more was associated with a 13% higher risk of death, whereas increasing intake of unprocessed red meat was associated with a 9% higher risk.

These associations were largely consistent across different age groups, levels of physical activity, dietary quality, smoking and alcohol consumption habits.

Overall, reducing red meat intake while eating more whole grains, vegetables, or other protein foods such as poultry without skin, eggs and fish, was associated with a lower risk of death among both men and women.

For example, swapping out one serving per day of red meat for one serving of fish per day over eight years was linked to a 17% lower risk of death in the subsequent eight years.

Similar findings were seen in the shorter-term (four years) and longer-term (12 years) for the link between changes in red meat intake and mortality, and for replacing red meat with healthier food alternatives.

This is an observational study, and as such, can't establish cause. And the authors point out some limitations, including that they did not look at the reasons for changes in red meat consumption which could have influenced the results.

And the study participants were mainly white registered health professionals so the findings may not be more widely applicable.

But the authors say that the data gathered covered a large number of people over a long follow-up period, with repeated assessment of diet and lifestyle factors, and consistent results between the two cohorts. What's more, this is the first study of its kind to examine the association between changes in red meat intake and subsequent risk of mortality.

The findings provide "a practical message to the general public of how dynamic changes in red meat consumption are associated with health," they write.

"A change in protein source or eating healthy plant based foods such as vegetables or whole grains can improve longevity," they conclude.

Credit: 
BMJ Group

Strobe lighting at dance music festivals linked to tripling in epileptic fit risk

Strobe lighting at electronic dance music festivals may be linked to a tripling in the risk of epileptic fits in susceptible individuals, suggests research published in the online journal BMJ Open.

Organisers need to issue warnings and advice on preventive measures, particularly for those who have a history of epilepsy that responds to flashing lights, known as photosensitive epilepsy, argue the researchers, who note that the popularity of dance music events generates revenues of US$5.7 billion every year worldwide.

Strobe lighting is known to heighten the risk of epileptic seizures in susceptible individuals. But the risks associated with attending electronic dance music festivals are not widely known, and organisers consequently don't routinely warn visitors about them.

Prompted by the case of a 20 year old who collapsed at one such festival and then experienced an epileptic seizure for the first time, the researchers decided to look in more detail at the potential health impacts of strobe lighting at dance music events.

They drew on data for incidents requiring medical assistance, including for ecstasy use, among 400,343 visitors to 28 daytime and nighttime electronic dance music festivals in The Netherlands throughout 2015.

They used data from one company which provides medical services to nearly all dance music festivals in The Netherlands.

Eyewitness reports of sudden loss of consciousness and muscle twitching combined with physical findings, such as evidence of tongue biting and temporary urinary incontinence, were used to inform a diagnosis of an epileptic seizure.

Some 241,543 people attended nighttime gigs, where strobe lighting was used, and 158,800 attended daytime gigs, where strobe lighting was less intense because of the effects of sunlight.

In all, medical assistance was provided on 2776 occasions. In 39 cases this was for an epileptic seizure, 30 of which occurred during nighttime gigs, meaning that the risk of a seizure associated with a nighttime event was 3.5 times greater than for a daytime event.

Use of ecstasy, which is the most commonly used recreational drug at dance music events, and which has been associated with heightened epileptic seizure risk, was more likely among those attending nighttime events: around one in four compared with one in 10 of those attending daytime events.

But the proportion of cases in which the drug had been used was similar in both groups of visitors, suggesting that this alone wasn't responsible for the heightened seizure risk, suggest the researchers.

This is an observational study, and as such, can't establish cause. What's more, the researchers weren't able to glean other potentially influential factors, such as medical history, sleep deprivation, or use of other medication, and they relied on witness reports/on site medical assessments, all of which may have affected the accuracy of the figures.

But, they write: "We think, however, that our numbers are probably an underestimate of the total number of people who had epileptic seizures."

And they add: "Regardless of whether stroboscopic light effects are solely responsible or whether sleep deprivation and/or substance abuse also play a role, the appropriate interpretation is that large [electronic dance music] festivals, especially during nighttime, probably cause at least a number of people per event to suffer epileptic seizures."

They advise anyone with photosensitive epilepsy to either avoid such events or to take precautionary measures, such as getting enough sleep and not taking drugs, not standing close to the stage, and leaving quickly if they experience any prodromal 'aura' effects.

"Given the large dataset, we believe our findings are externally valid, at least for other [electronic dance music] festivals in other countries which generally attract a similar audience," they conclude.

Credit: 
BMJ Group

Why Noah's ark won't work

Video: A purple sea urchin in a University of Vermont laboratory, part of a new study showing that for ocean species to survive climate change, large populations will be needed. Credit: Joshua Brown/UVM

A Noah's Ark strategy will fail. In the roughest sense, that's the conclusion of a first-of-its-kind study that illuminates which marine species may have the ability to survive in a world where temperatures are rising and oceans are becoming acidic.

Two-by-two, or even moderately sized, remnants may have little chance to persist on a climate-changed planet. Instead, for many species, "we'll need large populations," says Melissa Pespeni, a biologist at the University of Vermont who led the new research examining how hundreds of thousands of sea urchin larvae responded to experiments in which their seawater was made either moderately or extremely acidic.

The study was published on June 11, 2019, in the Proceedings of the Royal Society B.

RARE RELIEF

Pespeni and her team were surprised to discover that rare variants in the DNA of a small minority of the urchins were highly useful for survival. These rare genetic variants are "a bit like having one winter coat among fifty lightweight jackets when the weather hits twenty below in Vermont," Pespeni says. "It's that coat that lets you survive." When the water conditions were made extremely acidic, these rare variants increased in frequency in the larvae. These are the genes that let the next generation of urchins alter how various proteins function - like the ones they use to make their hard-but-easily-dissolved shells and to manage the acidity in their cells.

But to maintain these rare variants in the population--plus other needed genetic variation that is more common and allows for response to a range of acid levels in the water--requires many individuals.

"The bigger the population, the more rare variation you'll have," says Reid Brennan, a post-doctoral researcher in Pespeni's UVM lab and lead author on the new study. "If we reduce population sizes, then we're going to have less fodder for evolution--and less chance to have the rare genetic variation that might be beneficial."

In other words, some organisms might persist in a climate-changed world because they're able to change their physiology - think of sweating more; some will be able to migrate, perhaps farther north or upslope. But for many others, their only hope is to evolve - rescued by the potential for change that lies waiting in rare stretches of DNA.

RAPID ADAPTATION

The purple sea urchins the UVM team studied in their Vermont lab are part of natural populations that stretch from Baja California to Alaska. Found in rocky reefs and kelp forests, these prickly creatures are a favorite snack of sea otters - and a key species in shaping life in the intertidal and subtidal zones. Because of their huge numbers, geographic range, and the varying conditions they live in, the urchins have high "standing genetic variation," the scientists note. This makes purple urchins likely survivors in the harsh future of an acidified ocean - and good candidates for understanding how marine creatures may adapt to rapidly changing conditions.

It is well understood that rising average global temperatures are a fundamental driver of the imminent extinction faced by a million or more species - as a recent UN biodiversity report notes. But it's not just rising averages that matter. It may be the hottest - or most acidic - moments that test an organism's limits and control its survival. And, as the UVM team writes, "the genetic mechanisms that allow rapid adaptation to extreme conditions have been rarely explored."

CURRENCY IN THE CURRENT SEA

The new study used an innovative "single-generation selection" experiment that began with twenty-five wild-caught adult urchins. Each female produced about 200,000 eggs, from which the scientists were able to extract DNA from pools of about 20,000 surviving larvae living in differing water conditions. This very large number of individuals gave the scientists a clear view that purple urchins possess a genetic heritage that lets them adapt to extremely acidic ocean water. "This species of sea urchin is going to be okay in the short term. They can respond to these low pH conditions and have the needed genetic variation to evolve," says UVM's Reid Brennan. "So long as we do our part to protect their habitats and keep their populations large."
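
The heart of such a single-generation selection experiment is comparing a variant's frequency between larval pools reared under different conditions. Here is a hypothetical sketch with invented read counts (real pooled-sequencing analyses add sequencing-error and sampling corrections):

```python
# Hypothetical sketch: allele-frequency shift of one variant between
# larval pools reared in control vs extremely acidified water.
# Read counts are invented; real pool-seq analyses add many corrections.
def allele_frequency(alt_reads: int, total_reads: int) -> float:
    return alt_reads / total_reads

control_freq = allele_frequency(alt_reads=24, total_reads=1200)   # rare variant, ~2%
low_ph_freq  = allele_frequency(alt_reads=180, total_reads=1200)  # enriched, ~15%

print(f"control: {control_freq:.3f}, extreme low pH: {low_ph_freq:.3f}, "
      f"shift: {low_ph_freq - control_freq:+.3f}")
```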

But surviving the ferocious challenge of rapid climate change may come at a high cost. "It's hopeful that evolution happens - and it's surprising and exciting that these rare variants play such a powerful role," says Melissa Pespeni, an assistant professor in UVM's biology department and an expert on ocean ecosystems. "This discovery has important implications for long-term species persistence. These rare variants are a kind of currency that urchins have to spend," she says. "But they can only spend it once."

Credit: 
University of Vermont

Love songs from paradise take a nosedive

Image: Fledgling tree finches may be infested in the nest. Credit: Dr Katharina Peters, Flinders University

The Galápagos Islands finches named after Charles Darwin are starting to sing a different tune because of a pest introduced to the once-pristine environment.

International bird ecology experts, including Professor Sonia Kleindorfer and Dr Katharina Peters from Flinders University in South Australia, have found the beaks of Darwin's finches have changed to cope with infestations of the parasite Philornis downsi - and this is now affecting the birds' mating powers.

In a new paper published in Proceedings of the Royal Society B, the researchers show that Darwin's finch males whose beaks and nostrils (nares) have been damaged by the parasitic invasion are producing "sub-par song".

"In our newest research, we show that Darwin's finch males whose nares have been deformed by the parasite had greater vocal deviation - which females didn't like during mate choice - and had songs with lower maximum frequency," says Professor Kleindorfer, adding this also "confused the species identity of the singer".

The researchers, including University of California, Berkeley Adjunct Professor Frank J Sulloway, conclude that the Galápagos investigation specifically showed that critically endangered medium tree finches with enlarged naris size produced song that was indistinguishable from song of other finches.

"Given that small tree finch and medium tree finch are hybridised on Floreana Island, we suspect that this blurred species-signalling function of song may be partly to blame for the observed reverse speciation that is currently occurring," Dr Peters says.

"This research is evidence that parasite-induced morphological deformation can disrupt host mating signal with devastating effects on bird populations.

"The Philornis downsi larvae -- an accidentally introduced parasite - feed internally on the beaks of Galápagos birds causing permanent structural damage and enlarged naris (nostril) size.

The so-called Darwin's finches captivated the British naturalist during his Galápagos research in the 1830s and became the first vertebrate system to provide compelling field-based evidence for evolution by natural selection.

Years after Darwin's investigations, the finches became known as Darwin's finches.

Credit: 
Flinders University

Rescuers often driven by emotion

Scientists from James Cook University and Royal Life Saving Society - Australia have found reason can go out the window when people's family members, children and pets are in trouble in the water, and people should be better trained in water rescue skills.

JCU's Associate Professor Richard Franklin was part of a study that examined successful rescues, and drownings where someone had died trying to rescue another in trouble in the water.

He said many drowning prevention organisations emphasise the need for people to think about whether they can safely complete a rescue before they attempt it.

"What we have found is that about half of all those being rescued were close family or friends of the rescuer, and half were aged under 10 years. It's possible that thoughts of self-preservation go out the window when the potential drowning victim is from these groups," Dr Franklin said.

He said seventeen rescuers drowned between 2002 and 2017 while trying to rescue children, and another six, who died between January 2006 and December 2015, drowned trying to rescue a dog.

He said another piece of information was also important.

"About 14 % of the people who had completed a rescue were either current or former lifeguards, and none of them drowned in the attempt. This was almost certainly due to their training and experience," he said.

Dr Franklin said the scientists were calling for skills on safe rescues and effective resuscitation to be taught in high schools and regularly renewed.

"It's best to use primary prevention methods - targeted interventions such as concentrating on specific age groups, locations or activities - to prevent drownings, but we think secondary prevention measures such as rescues and resuscitation are also important for reducing the drowning toll," he said.

Amy Peden, Senior Research Fellow with Royal Life Saving Society - Australia, and an author of the paper, encouraged people to learn resuscitation and consider participating in a Bronze Medallion course.

"Most of those in our study were unprepared for undertaking rescues and often acted in the heat of the moment, to rescue a loved one. Having the skills to act in an emergency is vital to reducing the risk and avoiding an all too common scenario, the rescuer who drowns," she said.

Credit: 
James Cook University

Genetic marker linked to increased risk of diabetic peripheral neuropathy

Image: Alessandro Doria, MD, PhD, MPH, Director of Molecular Phenotyping and Genotyping at Joslin Diabetes Center and Professor of Medicine at Harvard Medical School. Credit: John Soares

BOSTON - (June 11, 2019) - Researchers from Joslin Diabetes Center, using a genome-wide association study, have identified a genetic factor linked to the development of diabetic peripheral neuropathy. This finding suggests a new target for preventive therapies. The research has been published online and will appear in the August print issue of Diabetes.

While neuropathy, which causes pain or numbness in the legs and an increased risk of foot ulcers, is a major problem for many people with diabetes, there is significant variability in its onset: some people develop this complication, and others do not, says Alessandro Doria, MD, PhD, MPH, a study senior author, Director of Molecular Phenotyping and Genotyping at Joslin Diabetes Center, and Professor of Medicine at Harvard Medical School in Boston. "Therefore, we wanted to see if we could discover genetic factors that predispose people with diabetes to developing this complication versus being protected from it."

For this study, researchers used an approach called a genome-wide association study, or GWAS. This analysis is used to find disease-associated variants throughout the genome. A GWAS for diabetic peripheral neuropathy was carried out in 5,168 participants from the Action to Control Cardiovascular Risk in Diabetes (ACCORD) clinical trial - 4,384 with evidence of peripheral neuropathy and 784 who were spared this complication.

After screening millions of small variations of the genome sequence (genetic variants), the study identified a region on chromosome 2q24 as having a powerful impact on the risk of peripheral neuropathy in type 2 diabetes. While the precise mechanisms are not known, there were some hints that the genetic variants in this region may act by affecting a nearby sodium channel regulating the transmission of sensory signals in peripheral nerves.
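
The release doesn't detail the statistics, but a case-control GWAS boils down to an association test repeated at each of millions of variants. A minimal, hypothetical sketch of one such test (allele counts are invented; the ACCORD analysis used adjusted models and genome-wide significance thresholds):

```python
# Hypothetical sketch of a single-variant case-control association test,
# the unit of work a GWAS repeats across millions of variants.
# Allele counts are invented; the actual analysis used adjusted models.
from scipy.stats import chi2_contingency

# 2x2 table of allele counts (2 alleles per person):
# rows = cases / controls, columns = minor / major allele.
table = [
    [900, 7868],   # 4,384 neuropathy cases  -> 8,768 alleles
    [220, 1348],   # 784 unaffected controls -> 1,568 alleles
]
chi2, p_value, dof, expected = chi2_contingency(table)
# Here the minor allele is more common in controls, i.e. "protective"
print(f"chi2 = {chi2:.2f}, p = {p_value:.2e}")
```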

"People carrying the less frequent variant at that location were protected from neuropathy and people carrying the more common variant at that same location were predisposed to this complication," says Doria.

The implication is that this could be a target for pharmacological therapy to protect people from diabetic peripheral neuropathy. "We found that people with the protective allele have higher amounts of this sodium channel," says Doria. "This suggests that the sodium channel in the peripheral nerves might be used to protect people from neuropathy, by developing a drug that activates this channel."

This finding was replicated in an independent study, the Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) trial.

"The study is important because it's the first real effort to have a genome wide search for genes predisposing to this complication of diabetes. Diabetic peripheral neuropathy is often overlooked," says Hetal Shah, MD, MPH - a study senior author and a Research Associate at the Joslin Diabetes Center and Instructor of Medicine at Harvard Medical School. "Yet nearly one-fourth of the annual US expenditure on diabetes is due to diabetic peripheral neuropathy."

One limitation of this study is that it included only white subjects, so it's not known whether these findings also apply to people of other races.

Credit: 
Joslin Diabetes Center

Superweed resists another class of herbicides, study finds

Image: Aaron Hager stands in a soybean field infested with multiple-herbicide-resistant waterhemp, a superweed becoming harder and harder to kill. Hager and his co-authors demonstrate that the weed is now resistant to one more class of popular herbicide products. Credit: L. Brian Stauffer, University of Illinois

URBANA, Ill. - We've all heard about bacteria that are becoming resistant to multiple types of antibiotics. These are the so-called superbugs perplexing and panicking medical science. The plant analogue may just be waterhemp, a broadleaf weed common to corn and soybean fields across the Midwest. With resistance to multiple common herbicides, waterhemp is getting much harder to kill.

In a new study from the University of Illinois, scientists document waterhemp's resistance to yet another class of herbicides, known as Group 15s. The study provides the first documentation of resistance to Group 15 herbicides in any non-grass plant.

There are many herbicides on the market, but they all fall into one of 16 classes describing their mode of action (MOA), or specific target in the plant that the chemical attacks. Because of various regulations and biological realities, a smaller number of herbicide MOAs can be used on any given crop and the suite of weeds that goes along with it. Historically, about nine have been useful for waterhemp - and now the weed appears to be resistant to at least seven.

"In some areas, we're one or two MOAs away from completely losing chemical control of waterhemp and other multiple-herbicide-resistant weeds," says Adam Davis, head of the Department of Crop Sciences at Illinois and co-author on the study. "And there are no new herbicide MOAs coming out. There haven't been for 30 years."

Illinois weed scientist and co-author Aaron Hager adds, "We don't want to panic people, but farmers need to be aware this is real. It continues on with the challenges we've warned people about for years."

The research team tested the effectiveness of soil-applied Group 15 herbicides in a Champaign County population already resistant to five MOAs. They applied eight Group 15 formulations in the field at their label rates, and chose three - non-encapsulated acetochlor (Harness), S-metolachlor (Dual Magnum), and pyroxasulfone (Zidua) - for a rate-titration experiment in which the herbicides were applied at one-half, one, two, and four times the label rate.

The eight Group 15 products varied in their effectiveness, with encapsulated acetochlor (Warrant), S-metolachlor, metolachlor (Stalwart), and dimethenamid-P (Outlook) performing the worst. These products provided less than 25% control 28 days after application and less than 6% control 14 days later.

Of the rate-titration experiment, Hager says, "We found we could apply significantly higher than the labeled dose and still see resistance." For example, S-metolachlor provided only 10% control at the standard label rate, 20% at 2x the label rate, and 45% at 4x the label rate.
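
Rate-titration results like these are commonly summarized by fitting a dose-response curve and estimating the dose giving 50% control (ED50). Here is a hypothetical sketch using only the S-metolachlor figures quoted above (real analyses use more doses, replicates, and a proper error structure):

```python
# Hypothetical sketch: fitting a two-parameter log-logistic dose-response
# curve to the quoted S-metolachlor rate-titration results. Real analyses
# use more doses, replicates, and formal model selection.
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([1.0, 2.0, 4.0])        # multiples of the label rate
control = np.array([10.0, 20.0, 45.0])  # percent control observed

def log_logistic(x, ed50, slope):
    """Percent weed control, with asymptotes fixed at 0 and 100."""
    return 100.0 / (1.0 + (ed50 / x) ** slope)

(ed50, slope), _ = curve_fit(log_logistic, dose, control, p0=[4.0, 1.5])
# An ED50 well above 1x the label rate is one way resistance shows up.
print(f"estimated ED50: {ed50:.1f}x label rate (slope {slope:.2f})")
```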

Hager says farmers might not notice the poor performance of these soil-applied pre-emergence herbicides because waterhemp germinates continuously throughout the season. When a weed pops up mid-season, it's hard to tell exactly when it emerged and whether it was exposed to residual soil-applied herbicides.

"If you think about how you use these products, rarely do they last the entire year. They're very dependent on environmental conditions to work effectively. It could be too wet or too dry. Generally speaking, you have some weed escape. But many farmers would chalk it up to these weather issues. If you're not thinking about it, you could very easily overlook resistance," Hager says.

To confirm results from the field, the team performed a dose response test in the greenhouse. In that test, four waterhemp populations - three with resistance to multiple herbicides and one that is sensitive to all herbicides - were dosed with increasing levels of S-metolachlor, acetochlor, dimethenamid-P, and pyroxasulfone. Populations from Champaign County and McLean County survived higher levels of the Group 15 herbicides than the other populations.

Hager suspects the plants are breaking the chemicals down before they cause damage, a trick known as metabolic resistance. All organisms can turn on cellular defenses against toxins, but it is rather worrisome when weeds and other undesirable pests use their biology against human interventions.

"As we get into the era of metabolic resistance, our predictability is virtually zero. We have no idea what these populations are resistant to until we get them under controlled conditions," Hager says. "It's just another example of how we need a more integrated system, rather than relying on chemistry only. We can still use the chemistry, but have to do something in addition.

"We want farmers to understand that we have to rethink how we manage waterhemp long term."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Brain activation provides individual-level prediction of bipolar disorder risk

Philadelphia, June 11, 2019 - Patterns of brain activation during reward anticipation may help identify people most at risk for developing bipolar spectrum disorders (BPSD), according to a study in Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, published by Elsevier. Mania in people with BPSD is often accompanied by impulsivity, including impulsive responses to potential rewards. In the study, patterns of neural activation during a reward task predicted the severity of mania symptoms in young adults who had not yet developed the disorder.

"Given that emerging manic symptoms predispose to bipolar disorders, these findings can provide neural biomarkers to aid early identification of bipolar disorder risk in young adults," said first author Leticia de Oliveira, PhD, Federal Fluminense University, Brazil.

Having a family member with BPSD puts a person at risk for the disorder, but the relationship doesn't provide enough information to make decisions about potential interventions to help delay or prevent the disorder. The new study shows for the first time that brain activation patterns could be used to predict BPSD risk on an individual level. "These findings could be potentially used to guide the development and choice of early therapeutic interventions, reducing the significant social costs and deleterious outcomes associated with the disorder in these vulnerable individuals," said Dr. Oliveira.

To be sure that the approach would apply to anyone at risk, Dr. Oliveira and colleagues performed the brain imaging in a transdiagnostic group of young adults - the participants had a variety of psychiatric conditions, but none had yet developed BPSD.

Across the whole brain, activation in a region engaged during decision making in reward contexts, the ventrolateral prefrontal cortex (vlPFC), contributed the most to the prediction of symptom severity. This suggests that vlPFC activity in particular may be useful for predicting the severity of mania symptoms associated with BPSD risk in young adults.
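
The release doesn't describe the machine-learning pipeline, so the following is purely a hypothetical sketch of individual-level prediction of symptom severity from regional activation (the data, features, and model are invented stand-ins):

```python
# Hypothetical sketch: predicting mania symptom severity from regional
# reward-task activation with a cross-validated linear model. Data,
# features, and the model are invented stand-ins for the study's pipeline.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_subjects, n_regions = 80, 100
activation = rng.normal(size=(n_subjects, n_regions))  # per-region betas
# Simulate severity driven mostly by one "vlPFC-like" region (column 0)
severity = 2.0 * activation[:, 0] + rng.normal(0.5, 1.0, n_subjects)

model = RidgeCV(alphas=np.logspace(-2, 3, 20))
predicted = cross_val_predict(model, activation, severity, cv=5)
r = np.corrcoef(predicted, severity)[0, 1]
print(f"cross-validated prediction r = {r:.2f}")
```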

"This study shows how the powerful combination of computational image analysis tools and functionally targeted task fMRI (in this case reward processing) can provide insights into the neural systems underlying symptoms that may indicate liability to mania, in a young, non-bipolar transdiagnostic group of psychiatric patients," said Cameron Carter, MD, Editor of Biological Psychiatry: Cognitive Neuroscience and Neuroimaging.

The researchers replicated the results and the role of the vlPFC in a second independent sample of young adults in the same study, further confirming the potential utility of neural activation in this brain region as a biomarker for BPSD risk.

Credit: 
Elsevier