Culture

Sports playbook helps doctors predict cancer patient outcomes, say Stanford researchers

In this season of global soccer competitions and hotly contested political primaries, bookies and pundits are scouring every evolving scrap of information and sifting through mountains of data in an effort to predict the outcome of the next game or election. These predictions can change on a dime, however, based on a player's poor pass or a candidate's stellar debate performance.

Statisticians refer to the technique of incorporating a variety of continuously generated information -- who is on the bench, who was injured in the first half of the match, who polled well in Iowa yesterday -- as calculating in-game win probability, and it's been used for decades to predict the outcome of ongoing sports matches or elections.

Now researchers at the Stanford University School of Medicine have taken a page from this playbook to generate more accurate prognoses for cancer patients. They've done so by designing a computer algorithm that can integrate many different types of predictive data -- including a tumor's response to treatment and the amount of cancer DNA circulating in a patient's blood during therapy -- to generate a single, dynamic risk assessment at any point in time during a patient's course of treatment. Such an advance could be deeply meaningful for patients and their doctors.

"When we care for our patients, we are walking on eggshells for a profound period of time while we try to determine whether the cancer is truly gone, or if it is likely to return," said associate professor of medicine Ash Alizadeh, MD, PhD. "And patients are wondering 'Should I be planning to attend my child's wedding next summer, or should I prioritize making my will?' We are trying to come up with a better way to predict at any point during a patient's course of treatment what their outcome is likely to be."

Surprisingly, the researchers have also found that the approach, which they've termed CIRI for Continuous Individualized Risk Index, may also help doctors to pinpoint people who might benefit from early, more aggressive treatments as well as those who are likely to be cured by standard methods.

The study will be published online July 4 in Cell. Alizadeh, a Stanford Health Care oncologist who specializes in treating patients with blood cancers, shares senior authorship with associate professor of radiation oncology Maximilian Diehn, MD, PhD. Instructor of medicine David Kurtz, MD, PhD, and postdoctoral scholars Mohammad Esfahani, PhD, and Florian Scherer, MD, are the lead authors.

Getting a more complete picture

The researchers began their study by looking at people previously diagnosed with diffuse large B-cell lymphoma, which is the most common blood cancer in the United States. Although nearly two-thirds of adults with DLBCL are cured with standard treatment protocols, the remaining third will likely die from the disease.

When a DLBCL patient is diagnosed, clinicians like Alizadeh, Diehn and Kurtz assess the initial symptoms, the cell type from which the cancer originated and the size and location of the tumor after the first imaging scan to generate an initial prognosis. More recently, clinicians have also been able to assess the amount of tumor DNA circulating in a patient's blood after the first one or two rounds of therapy to determine how the tumor is responding and estimate a patient's overall risk of succumbing to their disease.

But each of these situations gives a risk based on a snapshot in time rather than aggregating all the data available to generate a single, dynamic risk assessment that can be updated throughout the course of a patient's treatment.

"What we're doing now is somewhat like trying to predict the outcome of a basketball game by tuning in at halftime to check the score, or by watching only the tipoff," Diehn said, "when in reality we know that there are any number of things that could have happened during the first half that we aren't taking into account. We wanted to learn if it's best to look at the latest information available about a patient, the earliest information we gathered, or whether it's best to aggregate all of this data over many time points."

Alizadeh and his colleagues gathered data on more than 2,500 DLBCL patients from 11 previously published studies for whom the three most common predictors of prognosis were available. They used the data to train a computer algorithm to recognize patterns and combinations likely to affect whether a patient lived for at least 24 months after seemingly successful treatment without experiencing a recurrence of their disease. They also included information from 132 patients for whom data about circulating tumor DNA levels were available prior to and after the first and second rounds of treatment.
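To give a concrete sense of how serial measurements can be folded into a single, continuously updated risk estimate, here is a minimal Python sketch. It assumes a naive Bayes-style update, in which a baseline probability is converted to odds and multiplied by a likelihood ratio for each new observation; the numbers and variable names are illustrative placeholders, not the published CIRI model.

```python
# Minimal sketch of a continuously updated risk index (illustrative only).
# Assumption: each new piece of evidence (interim scan, ctDNA level, ...) is
# summarized as a likelihood ratio that multiplies the patient's current odds
# of an event (e.g., progression within 24 months). Values below are made up.

def update_risk(pretest_prob, likelihood_ratios):
    """Combine a baseline probability with serial likelihood ratios."""
    odds = pretest_prob / (1.0 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr                      # each observation updates the odds
    return odds / (1.0 + odds)          # convert back to a probability

# Example: moderate baseline risk, a reassuring interim scan (LR < 1),
# then a worrying rise in circulating tumor DNA (LR > 1).
baseline = 0.35
serial_evidence = [0.5, 2.4]
print(f"updated risk: {update_risk(baseline, serial_evidence):.2f}")
```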

"Our standard methods of predicting prognoses in these patients are not that accurate," Kurtz said. "Using standard baseline variables it becomes almost a crystal ball exercise. If a perfectly accurate test has a score of 1, and a test that assigns patients randomly to one of two groups has a score of 0.5 -- essentially a coin toss -- our current methods score at about 0.6. But CIRI's score was around 0.8. Not perfect, but markedly better than we've done in the past."

Identifying better treatment options

The researchers next tested CIRI's performance on previously published data from people with a common leukemia and from breast cancer patients. Although the prognostic indicators varied for each disease, they found that, by serially integrating the predictive information over time, CIRI outperformed the standard methods. The results also suggested that the approach might help identify, within one or two rounds of treatment, patients who need more aggressive intervention, rather than waiting to see whether the disease recurs.

"What I didn't expect was that aggregating all this information through time may also be predictive," Alizadeh said. "It might tell us 'you're going down the wrong path with this therapy, and this other therapy might be better.' Now we have a mathematical model that might help us identify subsets of patients who are unlikely to do well with standard treatments."

The researchers are next planning to test CIRI's predictive capabilities in people recently diagnosed with aggressive lymphoma.

Credit: 
Stanford Medicine

Scientists invent fast method for 'directed evolution' of molecules

CHAPEL HILL, N.C. - UNC School of Medicine scientists created a powerful new "directed evolution" technique for the rapid development of scientific tools and new treatments for many diseases.

The scientists, whose breakthrough is reported in Cell, demonstrated the technique by evolving several proteins to perform precise new tasks, each time doing it in a matter of days. Existing methods of directed evolution are more laborious and time-consuming, and are typically applied in bacterial cells, which limits the usefulness of this technology for evolving proteins for use in human cells.

Directed evolution is an artificial, sped-up version of the evolution process in nature. The idea is to focus the evolutionary process on a single DNA sequence to make it perform a specified task. Directed evolution can be used, in principle, to make new therapeutics that work powerfully to stop diseases and have few or no side effects. The initial groundbreaking scientific work on directed evolution won the 2018 Nobel Prize in Chemistry.

"What we have developed is the most robust system yet for directed evolution in mammalian cells," said study lead author Justin English, PhD, a postdoctoral research associate in the Department of Pharmacology at the UNC School of Medicine.

"The scientific community has needed a tool like this for a long time", said study senior author Bryan L. Roth, MD, PhD, the Michael Hooker Distinguished Professor in the Department of Pharmacology at the UNC School of Medicine. "We believe our technique will accelerate research and ultimately lead to better therapeutics for people suffering with many of the diseases for which we need much better treatments."

The broad concept of directed evolution is not new. Researchers have been applying it for centuries in selecting and breeding variants of animals and plants that have desired characteristics, such as crop varieties with larger fruits. Biologists in recent decades also have used directed evolution at the molecular level in the laboratory, for example, by mutating a gene randomly until a variant appears that has a desired property. But on the whole, directed evolution methods for biological molecules have been difficult to use and limited in their application.

The new method developed by Roth, English, and colleagues is comparatively quick, easy, and versatile. It uses the Sindbis virus as the carrier of the gene to be modified. The virus with its genetic cargo can infect cells in a culture dish and mutate quite rapidly. The researchers set up conditions so that the only mutant genes to thrive are the ones encoding proteins capable of accomplishing a desired function within the cells, such as activating a certain receptor, or switching on certain genes. Because the system works in mammalian cells, it can be used to evolve new human, mouse, or other mammalian proteins that would be burdensome or impossible to generate with traditional bacterial cell-based methods.
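As a rough illustration of the mutate-and-select logic underlying any directed evolution scheme (and not the VEGAS system itself), the following toy Python loop evolves random letter strings toward a target "function"; the target string, mutation rate and population sizes are arbitrary placeholders.

```python
# Toy mutate-and-select loop illustrating the logic of directed evolution
# (not the VEGAS system itself): sequences that score better on the desired
# function are more likely to seed the next round.
import random

TARGET = "ACTIVATE"                      # stand-in for the desired function
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"        # the 20 amino-acid letters

def fitness(seq):
    """Score a sequence by how many positions match the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.2):
    """Randomly change each position with a fixed probability."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(50)]

for _ in range(30):                      # each round: mutate, then select
    population = [mutate(seq) for seq in population]
    population.sort(key=fitness, reverse=True)
    population = population[:10] * 5     # the fittest variants take over

print("best variant:", population[0], "fitness:", fitness(population[0]))
```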

English and his colleagues call the new system "VEGAS" for Viral Evolution of Genetically Actuating Sequences. In an initial demonstration, Roth's lab modified a protein called a tetracycline transactivator (tTA), which works as a switch to activate genes and is a standard tool used in biology experiments. Normally tTA stops working if it encounters the antibiotic tetracycline or the closely related doxycycline, but the researchers evolved a new version with 22 mutations that allows tTA to keep working despite very high levels of doxycycline. The process took just seven days.

"To get a sense of how efficient that is, consider that a previously reported mammalian directed evolution method applied to the tetracycline transactivator took four months to yield just two mutations that conferred only partial insensitivity to doxycycline," English said.

The scientists next applied VEGAS to a common type of cellular receptor called a G protein-coupled receptor (GPCR). There are hundreds of different GPCRs on human cells, and many are targeted by modern drugs to treat a wide variety of conditions. Precisely how a given GPCR changes shape when it switches from being inactive to active is of great interest to researchers trying to create more precise treatments. English and colleagues used VEGAS to quickly mutate a little-studied GPCR called MRGPRX2 so that it would stay in an always-active state.

"Identifying the mutations that occurred during this rapid evolution helps us understand for the first time the key regions in the receptor protein involved in the transition to an active state," English said.

In a final demonstration, the team showed the potential of VEGAS to guide drug development more directly. They used VEGAS to rapidly evolve small biological molecules called nanobodies that could activate different GPCRs - including serotonin and dopamine receptors, which are found on brain cells and are targeted by many psychiatric drugs.

The team is now using VEGAS in an effort to develop highly efficient gene-editing tools, potentially for curing genetic diseases, and to engineer nanobodies that can neutralize cancer-causing genes.

Credit: 
University of North Carolina Health Care

Discovery of mechanism behind precision cancer drug opens door for more targeted treatment

image: University of Alberta oncologist and cell biologist Michael Hendzel (right) was part of a national research team that identified how a new class of cancer drugs known as PARP inhibitors work, opening the door to better targeted therapy for cancer patients.

Image: 
Faculty of Medicine & Dentistry, University of Alberta

New research that uncovers the mechanism behind the newest generation of cancer drugs is opening the door for better targeted therapy.

PARP inhibitors are molecular targeted cancer drugs used to treat women with ovarian cancer who have BRCA1 or BRCA2 gene mutations.

The drugs are showing promise in late-stage clinical trials for breast cancer, prostate cancer and pancreatic cancer and are part of an approach known as precision medicine, which targets treatments based on genetic, environmental and lifestyle factors.

"What we've done is identify how the drugs work," said University of Alberta oncologist and cell biologist Michael Hendzel. "Knowing how they work will enable us to come up with new applications for them, so we can make this drug as useful as possible for as many patients as possible."

People with the BRCA1 or BRCA2 gene mutation have a defect in their cells' ability to repair double-strand breaks in the DNA, which puts them at increased risk of developing breast cancer. The PARP inhibitors take advantage of that weakness and further interfere with the proteins known as poly-ADP ribose polymerase (PARP1 and PARP2), which cells use to repair the daily damage to DNA that occurs normally. When the cells can't repair themselves, they die. Normal cells are unaffected.

"Cells often have redundancy, so, for example, if they have a defect in one way they use to repair damage, they can use a different way to do it," explained Hendzel, who is also lead for the genomics stability research group at the Cancer Research Institute of Northern Alberta.

"But when they have that defect, and you have a drug to interfere with the backup pathway, then you kill those cells."

The research is the result of a 20-year collaboration between Hendzel's research lab and those of Guy Poirier at Laval University's Centre de recherche sur le cancer and Jean-Yves Masson of the CHU de Québec's research centre.

PARP inhibitors are the first cancer therapies developed to exploit a process known as synthetic lethality, in which cancers with specific mutations are many times more sensitive to the drug than normal cells are.

Poirier said that one per cent of all cancer clinical trials now involve PARP inhibitors and they could be key to treating some intractable, aggressive cancers.

"PARP inhibitors work for cancers where no other treatment shows promise, such as metastatic pancreatic cancers and castration-resistant prostate cancer," he said.

Until now, it was not understood how PARP inhibitors interfere with cell repair. The new research reveals that PARP proteins regulate double-strand break repair in DNA, and that the inhibitors disrupt control of the process that digests away one strand of DNA so the break can be matched to an undamaged copy used to repair it.

In previous research, Hendzel, Poirier and Masson were the first to establish that PARP played a role in double-strand break repair. Their new results explain many effects of PARP inhibition that were not previously understood.

The new study shows there is additional potential to develop and improve existing combination cancer treatments where radiation or chemotherapy that damages DNA is combined with drugs that target PARP. The results predict what properties a cancer must have in order for PARP inhibition to improve therapeutic effectiveness in combination therapy. A large number of clinical trials are currently combining PARP inhibitors with radiotherapy or chemotherapy.

"Our work explains why PARP inhibitors and radiotherapy are a good combo," said Masson. "In fact, PARP inhibitors will increase sensitivity to radiation therapy in some patients."

"In an era where we will commonly have complete genome sequences of human cancers, this study will enable the deployment of PARP inhibitors as a precision medicine in combination therapies," said Hendzel.

Credit: 
University of Alberta Faculty of Medicine & Dentistry

More money, skills and knowledge needed for social prescribing to serve as route into work

New funding, greater expertise and wider awareness in the system - and beyond - are needed to embed work outcomes into social prescribing practice.

A new report from The Work Foundation, Embedding Work and Related Outcomes into Social Prescribing: Overcoming Challenges and Maximising Opportunities, says social prescribing can be an effective means of integrating people into work. However, government and wider stakeholders must address a range of cultural and practical barriers to realise this potential.

The report, supported by a grant from AbbVie, as part of the Fit for Work UK Coalition, looked at changes in the social prescribing landscape since 2016. In that period, senior government ministers and government departments, including the Department of Health and Social Care (DHSC) and the Department for Work and Pensions, have increasingly referenced social prescribing, both generally and, more particularly, as a route to work.

However, while there is more widespread recognition of social prescribing as a route to work since 2016 - both in literature and among practitioners - such high-level policy aspirations to enhance the focus on work are still some way from being realised.

The research found that social prescribing practitioners often do not recognise the role work can play in improving health, and that they lack the capacity and expertise to focus on work and related outcomes, but it also highlighted the potential for positive results if these challenges are overcome.

Additional funding to incentivise work and related outcomes is needed to fill a gap in capacity and provide training for social prescribing practitioners, such as 'link workers', to develop the relevant expertise.

Building on social prescribing's capacity to foster partnerships - including across local health services and with wider services in the community - links could be sought with employment bodies/services, further expanding social prescribing's remit beyond the health and social care space.

A new message to articulate the effects social prescribing can have in relation to work and related outcomes is also needed to make the case for expansion into the employment space - ensuring 'work as a health outcome' is better recognised among healthcare professionals and work-focused providers, while breaking down barriers between the two groups. The DHSC's proposed National Academy for Social Prescribing could play a key role.

The report's authors stress that integrating work and related outcomes into social prescribing must be done carefully. There needs to be recognition that it often serves as an 'indirect' rather than 'direct' route to work - via training, volunteering or other means - and there would be risks and problems if an expectation of an immediate return to work were imposed, or if such results were rewarded through 'payment by results' to stakeholders.

They also call for new research to demonstrate evidence of social prescribing's effectiveness in breaking down clients' barriers to work.

"Social prescribing is a powerful means of reintegrating the 'hardest to reach' groups into the community, improving their health and wellbeing, and, by first prioritising their basic needs, empowering them to take steps towards employment," said lead author Dr James Chandler.

"Work - and in particular 'good work' - is hugely important for people's self-esteem, personal fulfilment, and, in turn, their health and wellbeing. As such, work is increasingly recognised as a 'health outcome' in and of itself.

"Social prescribing is currently primarily considered a health 'intervention' and often seen as a means of reducing direct health and social care costs. However, through its holistic approach, building on people's assets, improving their confidence and self-esteem, it naturally breaks down barriers to work, setting people on a journey towards improved health and wellbeing which - provided they want to - can culminate in employment.

"There are barriers to integrating work and related outcomes into social prescribing. One way of addressing them is improving awareness of work as a health outcome among social prescribing stakeholders. Another is equipping link workers with knowledge and expertise to support clients on a journey towards work."

Credit: 
Lancaster University

Collision course: Amateur astronomers play a part in efforts to keep space safe

image: Members of the Basingstoke Astronomical Society and Dstl at Porton Down.

Image: 
Dstl

Heavy traffic is commonplace on Earth but now congestion is becoming an increasing problem in space. With over 22,000 artificial satellites in orbit it is essential to keep track of their positions in order to avoid unexpected collisions. Amateur astronomers from the Basingstoke Astronomical Society have been helping the Ministry of Defence explore what is possible using high-end consumer equipment to track objects in space.

Grant Privett, of the Defence Science and Technology Laboratory (Dstl), will talk about this surprising collaboration on Thursday 4th July at the Royal Astronomical Society's National Astronomy Meeting in Lancaster.

When the Basingstoke Astronomical Society (BAS) heard about Dstl's Space Programme they were keen to find ways that they could help. Dstl's team came up with an idea that would help them to find out whether a low-cost distributed network of cameras could make a worthwhile contribution to the future UK space situational awareness effort.

The amateur astronomers used commercially available telescopes, tripod-mounted DSLR cameras and low-light cameras to record images of satellites such as the International Space Station, Cryosat, and Remove Debris. By collecting accurate time stamps for the images, Dstl were then able to process the data and compare expected orbits to the data provided by the astronomers.

Dstl were pleased to find that small-aperture, prosumer-level lenses and cooled CCDs similar to those used by amateur astronomers are capable of monitoring the positions of low Earth orbit satellites down to about the size of a kitchen freezer, and of maintaining reasonably accurate orbits for them.
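For readers curious about what "comparing expected orbits to the data" involves, here is a minimal sketch using the open-source Skyfield library to predict a satellite's apparent position for a time-stamped exposure, which could then be compared with the position measured from the frame. The two-line element set (an old, publicly available ISS element set used purely as a placeholder), the observing site and the timestamp are illustrative; this is not Dstl's processing pipeline.

```python
# Sketch: predict a satellite's apparent position for a time-stamped image,
# so it can be compared with the position measured from the frame.
# The TLE lines, site and timestamp below are placeholders.
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()
line1 = "1 25544U 98067A   14020.93268519  .00009878  00000-0  18200-3 0  5082"
line2 = "2 25544  51.6498 109.4756 0003572  55.9686 274.8005 15.49815350868473"
iss = EarthSatellite(line1, line2, "ISS (ZARYA)", ts)

site = wgs84.latlon(51.26, -1.09)            # roughly Basingstoke, UK
t = ts.utc(2014, 1, 21, 22, 15, 30)          # exposure timestamp (UTC)

alt, az, distance = (iss - site).at(t).altaz()
print(f"predicted alt {alt.degrees:.2f} deg, az {az.degrees:.2f} deg, "
      f"range {distance.km:.0f} km")
```

The critical input, as Privett notes below, is the exposure timestamp: a small timing error translates directly into a large along-track position error for a fast-moving satellite.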

"The accuracy of the exposure timing is absolutely critical, and requires some attention to detail" explains Privett, "the BAS astronomers were very good and clearly highly talented so together we formed a good team".

"We found there are no obvious impediments to using commercially available kit to provide small component of a more capable and diverse system for monitoring space, where satellites of importance to UK communications, economy, and defence operate".

The potential of such a relatively low-cost and readily deployable approach to data gathering is being examined to ensure Dstl can provide the best possible guidance and advice to the UK Government in its future procurements. The full technical results from the collaboration will be published later this year.

Credit: 
Royal Astronomical Society

Deep-CEE: The AI deep learning tool helping astronomers explore deep space

image: Image showing the galaxy cluster Abell1689. The novel deep learning tool Deep-CEE has been developed to speed up the process of finding galaxy clusters such as this one, and takes inspiration in its approach from the pioneer of galaxy cluster finding, George Abell, who manually searched thousands of photographic plates in the 1950s.

Image: 
NASA/ESA

Galaxy clusters are some of the most massive structures in the cosmos, but despite being millions of lightyears across, they can still be hard to spot. Researchers at Lancaster University have turned to artificial intelligence for assistance, developing "Deep-CEE" (Deep Learning for Galaxy Cluster Extraction and Evaluation), a novel deep learning technique to speed up the process of finding them. Matthew Chan, a PhD student at Lancaster University, is presenting this work at the Royal Astronomical Society's National Astronomy meeting on 4 July at 3:45pm in the Machine Learning in Astrophysics session.

Most galaxies in the universe live in low-density environments known as "the field", or in small groups, like the one that contains our Milky Way and Andromeda. Galaxy clusters are rarer, but they represent the most extreme environments that galaxies can live in and studying them can help us better understand dark matter and dark energy.

During the 1950s the pioneer of galaxy cluster-finding, astronomer George Abell, spent many years searching for galaxy clusters by eye, using a magnifying lens and photographic plates to locate them. Abell manually analysed around 2,000 photographic plates, looking for visual signatures of galaxy clusters and detailing the astronomical coordinates of the dense regions of galaxies. His work resulted in the 'Abell catalogue' of galaxy clusters found in the northern hemisphere.

Deep-CEE builds on Abell's approach for identifying galaxy clusters but replaces the astronomer with an AI model that has been trained to "look" at colour images and identify galaxy clusters. It is a state-of-the-art model based on neural networks, which are designed to mimic the way a human brain learns to recognise objects by activating specific neurons when visualizing distinctive patterns and colours.

Chan trained the AI by repeatedly showing it examples of known, labelled objects in images until the algorithm learned to associate objects on its own. He then ran a pilot study to test the algorithm's ability to identify and classify galaxy clusters in images that contain many other astronomical objects.
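That training loop is the standard supervised-learning recipe. Below is a toy PyTorch sketch of the recipe (not the published Deep-CEE model): random tensors stand in for labelled colour cutouts, and a tiny convolutional network is repeatedly shown the examples until its loss falls.

```python
# Toy sketch of supervised training on labelled colour image cutouts,
# in the spirit of (but far simpler than) the Deep-CEE pipeline.
# The data here are random tensors standing in for survey images;
# labels are 1 = "contains a cluster", 0 = "no cluster".
import torch
import torch.nn as nn

class TinyClusterNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Stand-in dataset: 64x64 RGB cutouts with random labels.
images = torch.randn(128, 3, 64, 64)
labels = torch.randint(0, 2, (128,))

model = TinyClusterNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):                   # repeatedly show the labelled examples
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```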

"We have successfully applied Deep-CEE to the Sloan Digital Sky Survey" says Chan, "ultimately, we will run our model on revolutionary surveys such as the Large Synoptic Survey telescope (LSST) that will probe wider and deeper into regions of the Universe never before explored.

New state-of-the-art telescopes have enabled astronomers to observe wider and deeper than ever before, for example to study the large-scale structure of the universe and to map its vast undiscovered content.

By automating the discovery process, scientists can quickly scan sets of images and return precise predictions with minimal human interaction. This will be essential for analysing data in the future: the upcoming LSST sky survey (due to come online in 2021) will image the skies of the entire southern hemisphere, generating an estimated 15 TB of data every night.

"Data mining techniques such as deep learning will help us to analyse the enormous outputs of modern telescopes" says Dr John Stott (Chan's PhD supervisor). "We expect our method to find thousands of clusters never seen before by science".

Chan will present the findings of his paper, "Fishing for galaxy clusters with 'Deep-CEE' neural nets" (Chan and Stott 2019), on 4 July at 3:45pm in the 'Machine Learning in Astrophysics' session. The paper has been submitted to MNRAS and is available on arXiv: https://arxiv.org/abs/1906.08784.

Credit: 
Royal Astronomical Society

The Lancet Public Health: Incarceration and economic hardship strongly associated with drug-related deaths in the USA

Unique analysis of US county-level data finds a strong association between incarceration and drug-mortality, and economic hardship and drug-mortality, independent of opioid prescription rates

County-level incarceration may provide a further, plausible explanation to the underlying geographic variations in US drug-mortality, with the highest incarceration rates linked with a more than 50% increase in drug-mortality compared to counties with lowest incarceration

Growing rates of incarceration in the USA since the mid-1970s may be linked with a rise in drug-related mortality, and may exacerbate the harmful health effects of economic hardship, according to an observational study involving 2,640 US counties between 1983 and 2014, published in The Lancet Public Health journal.

Major increases in admissions rates to local jails (with an average rate of 7,018 per 100,000 population) and state prisons (averaging 255 per 100,000 population) were associated with a 1.5% and 2.6% increase in death rates from drug overdoses respectively, over and above the effects of household income and other county-level factors, such as violent crime, ethnicity, and education. Even after taking into account the role of opioid prescription rates, the association between incarceration and overdose mortality persists.

Counties with the highest incarceration rates had an average of two excess drug-related deaths per 100,000 population compared to counties with the lowest rates (5.4 deaths per 100,000 vs 3.5 deaths per 100,000), a more than 50% increase in drug-mortality.

For counties experiencing large decreases in average household income (equivalent to a drop of about US$12,000 from average income of US$46,841), there was an associated 13% increase in drug-related deaths, even after controlling for the same county-level factors.

The study is the first to examine the link between the expansion of the jail and prison population and overdose deaths at the county level, and comes as the opioid epidemic continues to harm communities across the country. In 2017, more than 72,000 people died from overdose in the USA. The number of overdose deaths has increased in every county since 1980 but at very different rates--ranging from 8% to more than 8,000%.

"The rapid expansion of the prison and jail population since the mid-1970s, largely driven by a series of sentencing reforms including mandatory sentences for drug convictions, is likely to have made a substantial contribution to the more than 500,000 overdose deaths across America over the past 35 years", says Professor Lawrence King from the University of Massachusetts, Amherst, USA who co-led the research.

"Our findings indicate that the 3,000 local jails in the USA are an overlooked but important independent contributor to overdose deaths and may help to explain the geographical differences in drug-related deaths--identifying a potential cause that has remained elusive until now." [2]

For decades, the challenge of America's opioid epidemic - a major contributor to the rise in drug-related deaths - has been debated. Many have linked rising drug-misuse and its harmful health effects with the role of pharmaceutical companies in lobbying doctors to prescribe more opioid painkillers, and of individuals substituting prescription opioids for heroin and fentanyl. Other research has suggested links to economic decline and downward social mobility. However, neither of these explanations completely account for the wide geographical variation in overdose deaths.

In the study, the authors used data from the US National Vital Statistics, the US Census Bureau, and previously unavailable county-level incarceration data from the Vera Institute of Justice between 1983 and 2014, to model associations between county-level economic decline, incarceration rates, and deaths from drug use.

The analysis suggested that household income and incarceration rates explained almost all of the variation in drug-death rates within counties over time.
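Schematically, the kind of county-level panel model described here can be pictured as a regression of drug-death rates on incarceration and income with county fixed effects. The sketch below, using simulated data and the statsmodels library, is only meant to illustrate that structure; it is not the authors' published specification, and the simulated coefficients are meaningless.

```python
# Schematic two-way panel regression of drug-death rates on jail admissions
# and household income, with county fixed effects. Data are simulated;
# this is not the published model or its coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
counties, years = 50, 10
df = pd.DataFrame({
    "county": np.repeat(np.arange(counties), years),
    "year": np.tile(np.arange(1983, 1983 + years), counties),
})
df["jail_rate"] = rng.gamma(5, 1000, len(df))        # admissions per 100,000
df["income"] = rng.normal(45000, 8000, len(df))      # household income, USD
df["drug_deaths"] = (3 + 0.0002 * df["jail_rate"]
                     - 0.00003 * df["income"]
                     + rng.normal(0, 0.5, len(df)))  # deaths per 100,000

model = smf.ols("drug_deaths ~ jail_rate + income + C(county)", data=df).fit()
print(model.params[["jail_rate", "income"]])
```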

"Incarceration can lead to an increased number of overdose deaths in multiple ways," says lead author Dr Elias Nosrati from the University of Oxford, UK. "At the community level the criminal justice system removes working-age men from their local communities, separates families, and disrupts social networks. When coupled with economic hardship, prison and jail systems may constitute an upstream determinant of 'despair', whereby regular exposures to neighbourhood violence, unstable social and family relationships, and stress can trigger destructive behaviours." [1]

County jails generally house inmates who are serving sentences of less than a year or who are awaiting trial. State and federal prisons, on the other hand, hold inmates convicted of more serious crimes serving lengthier sentences. At any given time, jails hold about half as many inmates as prisons (744,600 vs 1,562,000). However, many more people (over 11 million) enter jail every year, most awaiting trial--almost 20 times more than the roughly 600,000 admitted to prison.

"The criminal justice system can either be part of the problem, or part of the solution", says Professor King "Whilst policies to reverse regional economic decline are likely to be difficult and expensive to implement, reform of arrest and sentencing policies is technically simple, would be economical, and could potentially save thousands of lives." [1]

Although the USA is home to only 5% of the world's population, it imprisons nearly 25% of all incarcerated people globally, the highest rate of imprisonment in the world.

An accompanying Editorial in The Lancet Public Health journal makes the case for "drastic changes in the justice system" in response to these findings. It states: "Over the past 40 years, US politicians of all stripes have sought to appear tough on crime, which has led to an over-reliance on incarceration across many types of offences and damaged public health...Drug misuse is a public health issue, more than a criminal one, and like many other petty crimes, it would be more effectively addressed by investment in social and community services, and not in steel bars."

The authors note several limitations, including that unmeasured confounding (ie, differences in unmeasured factors which may have affected incarceration or mortality from drug use) such as state- or county-level differences in treatment and rehabilitation programmes for drug users, and differences in social and health care may moderate the deleterious effects of economic decline and/or incarceration and may limit the conclusions that can be drawn. They also point out that missing data for prison rates in several counties might have skewed the study sample away from Southern states where incarceration rates are the highest. Lastly, they were unable to examine state-level differences in the quality of post-incarceration support, which is an important predictor of health and employment outcomes for ex-inmates--which may have influenced results.

Writing in a linked Comment, Dr James LePage from VA North Texas Health Care System in the USA discusses incarceration as a potential contributor to the public health crisis of harmful drug use, and calls for more research on arrest and sentencing policies. He adds: "A focus on local and county policies and laws is also required. At present, county laws and policies such as trespassing, loitering, and vagrancy laws, unfairly criminalise individuals of low economic status and homeless individuals. These laws increase the level of interaction these at-risk groups have with the legal system, resulting in higher incarceration rates...Future studies should focus on racial and ethnic biases in arrests and sentencing, and the subsequent effect on drug-related mortality."

Credit: 
The Lancet

Exercise improves anxiety and mood in older adults undergoing chemotherapy

Although we know that exercise improves anxiety and mood problems in younger people with cancer, few studies have looked at the effects of exercise on older adults with cancer. Since most new cancer cases occur in adults aged 60 or older, a team of researchers from the University of Rochester Medical Center and other institutions designed a study to learn more.

Their study appeared in the June issue of the Journal of the American Geriatrics Society (JAGS).

Having cancer increases the chances of people experiencing anxiety and mood issues, which can affect emotional and social well-being. In turn, this may lead people to discontinue cancer treatments--which can mean shortening their survival.

Chemotherapy can benefit older adults with cancer, even though older people receiving this type of treatment often experience higher rates of dangerous side effects than younger people do. Older adults often experience anxiety and other mood disorders during their treatment for cancer, too--and treating those problems with medications can often cause potentially dangerous side effects.

What's more, many anti-anxiety medications such as benzodiazepines and antidepressants are listed in the American Geriatrics Society (AGS) Beers Criteria® as being potentially inappropriate for older adults. That's why it is desirable to seek alternative treatments that are safe and effective at improving anxiety, mood disturbances, and emotional and social well-being, including treatments that don't rely on medications. For example, several studies have been conducted to examine the relationship between exercise and mood in cancer survivors and most have shown positive results.

The researchers in the new JAGS study examined the Exercise for Cancer Patients (EXCAP) program, a home-based, low- to moderate-intensity aerobic and resistance exercise program. In the study, those who were assigned to the EXCAP program received an exercise kit. It contained a pedometer, three exercise bands (medium, heavy, extra heavy), and an instruction manual.

During the program, participants increased the length and intensity of their workouts over time. For example, participants received an individually tailored, progressive walking routine, and they wore a pedometer and recorded their daily steps over six weeks, starting on their first day of chemotherapy treatment. They were encouraged to gradually increase their steps by five to 20 percent every week. For resistance exercise, they performed exercises with therapeutic exercise bands. Participants were given individually-tailored workout plans that encouraged them to perform 10 required exercises (such as squats and chest presses) and four optional exercises daily. Participants were also encouraged to increase the intensity and number of repetitions of resistance band exercises gradually over the course of the program.

The researchers concluded that a low- to moderate-intensity home-based exercise program improved anxiety, mood, and social and emotional well-being for older patients with cancer who received chemotherapy treatments.

The researchers also noted that in the study, the people who benefited the most from the exercise program were older adults who received chemotherapy and started off with worse anxiety, mood, and social and emotional well-being.

Credit: 
American Geriatrics Society

Researchers save images not with a microchip, but with metabolites

image: Writing and reading data encoded in mixtures of metabolites.
(a) Binary image data is mapped onto a set of metabolite mixtures, with each bit determining the presence/absence of one compound in one mixture. For example, a spot mapped to four bits with values [0 1 0 1] may contain the 2nd and 4th metabolite at that location. (b) Small volumes of the mixtures are spotted onto a steel plate and the solvent is evaporated (scale bars: 5 mm). This chemical dataset is analyzed by MALDI mass spectrometry (b, bottom). Using the observed mass spectrum peaks, decisions are made about which metabolites are present. These decisions are assembled from the array of spots to recover the original image. The image shown is the Rhode Island Hope Regiment Colors [28].

Image: 
Kennedy et al., 2019

An anchor, an ibex and an Egyptian cat: all images that a research team from Brown University, led by Jacob Rosenstein, encoded and decoded from mixtures of small molecules called metabolites. They demonstrate the potential of this small-molecule information storage system in a new paper published July 3 in the open-access journal PLOS ONE.

Microchips have become the standard way for people to store data, such as a document on a home computer or a snapshot on a phone. Recently, scientists have experimented with ways to encode information using biomolecules, such as DNA, by synthesizing an artificial genome. In this new study, the researchers show that they can encode information using a different type of biomolecule, metabolites. They used liquid-handling robots to write digital information by dotting mixtures of metabolites into a grid on a surface. The locations and identities of the metabolites, when read by an instrument called a mass spectrometer, reported out binary data. Using this method, they could encode the information from a picture and then decode it to redraw the image with 98 to 99.5 percent accuracy.

Mixtures of metabolites, called metabolomes, offer many advantages over genomes for recording information. Metabolites are smaller, more diverse, and have the potential to store information at greater density. This proof-of-principle study shows that small-molecule information storage can successfully encode more than 100,000 bits of digital images into synthetic metabolomes. They expect that with additional development, the amount and density of information that can be encoded will grow significantly.
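The write/read scheme can be illustrated in a few lines of Python: each spot on the plate holds a mixture, and each bit of the image decides whether a given metabolite is present in that spot. The metabolite names and the tiny two-spot "image" below are invented placeholders, and peak detection by mass spectrometry is reduced to a simple set-membership check.

```python
# Minimal sketch of the encoding idea: each spot holds a mixture, and each bit
# of the image decides whether one metabolite is present in that spot.
# Metabolite names and the tiny 2x8 "image" are illustrative placeholders.
import numpy as np

metabolites = ["glycine", "alanine", "serine", "proline",
               "valine", "leucine", "histidine", "arginine"]  # 8 bits per spot

# A toy binary image: 2 spots x 8 bits.
image_bits = np.array([[0, 1, 0, 1, 1, 0, 0, 1],
                       [1, 1, 1, 0, 0, 0, 1, 0]])

def write_spots(bits):
    """Map each row of bits to the set of metabolites placed in that spot."""
    return [{m for m, b in zip(metabolites, row) if b} for row in bits]

def read_spots(mixtures):
    """Recover the bits by checking which metabolites each spot contains
    (standing in for peak detection in a mass spectrum)."""
    return np.array([[int(m in mix) for m in metabolites] for mix in mixtures])

spots = write_spots(image_bits)
recovered = read_spots(spots)
print("round-trip exact:", bool((recovered == image_bits).all()))
```

In the real system the read step is noisy, which is why the reported decoding accuracy is 98 to 99.5 percent rather than perfect.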

Rosenstein adds: "We encoded several small digital images into mixtures of metabolites, and read back the data by chemically analyzing the mixtures. A molecular hard drive or a chemical computer might still seem like science fiction, but biology shows us it is possible. We wanted to show in a mathematically precise way how to write and read digital data using some of the small molecules that our bodies use every day."

Credit: 
PLOS

Scent composition data reveal new insights into perfume success

Mathematical analysis of online perfume data shows how the unique scent combinations found in different perfumes contribute to product popularity and consumer ratings. Vaiva Vasiliauskaite and Tim Evans of Imperial College London, U.K., present these findings in the open-access journal PLOS ONE on July 3, 2019.

Each perfume is a unique combination of different olfactory ingredients -- oils and chemical molecules -- that together form a harmonious aroma. The smell of a perfume is often described using so-called notes, such as vanilla, and their combinations, such as musk together with jasmine, which are called accords. Assuming that a perfume is popular largely because it smells good, the researchers aimed to understand what constitutes a popular smell. To achieve this, they studied the structure of perfumes and their constituent notes using the principles of a mathematical field known as complex network analysis.

To better understand how accords contribute to the success of perfumes, Vasiliauskaite and Evans applied complex network analysis to online data on 1,000 notes found in more than 10,000 perfume products. The dataset included consumer ratings and information on the popularity of each perfume.

This analysis revealed which notes and accords are used more often than one would expect by chance (are "over-represented"), which are the most popular, and which are found in the highest-rated perfumes. The researchers found that the most popular notes and the most commonly used accords do not necessarily correlate with the highest perfume ratings. For instance, the accord of jasmine and mint notes contributed significantly to higher ratings but was under-represented in the studied perfumes.
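A minimal way to ask whether an accord is "over-represented" is to compare how often a pair of notes co-occurs with what the individual note frequencies alone would predict under independence. The toy perfume list in the Python sketch below is invented for illustration and is far simpler than the network analysis used in the study.

```python
# Sketch: test whether a pair of notes (an accord) appears together more often
# than chance would predict, given each note's individual frequency.
# The toy "perfume" data below are invented for illustration.
from itertools import combinations
from collections import Counter

perfumes = [
    {"jasmine", "mint", "musk"},
    {"jasmine", "mint", "vanilla"},
    {"musk", "vanilla"},
    {"jasmine", "vanilla", "musk"},
    {"mint", "vanilla"},
]
n = len(perfumes)

note_freq = Counter(note for p in perfumes for note in p)
pair_freq = Counter(frozenset(pair) for p in perfumes
                    for pair in combinations(sorted(p), 2))

for pair, observed in pair_freq.items():
    a, b = tuple(pair)
    expected = (note_freq[a] / n) * (note_freq[b] / n) * n  # independence
    print(f"{a}+{b}: observed {observed}, expected {expected:.1f}, "
          f"ratio {observed / expected:.2f}")
```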

The researchers also determined which notes, when added to existing accords, appeared to enhance accords the most. They found that notes with high popularity, such as musk and vanilla, tended to enhance accords the most, as did generically-named notes such as floral notes. Further analysis showed that major fashion brands produced many of the most successful perfumes, but a perfume's popularity did not seem to be linked to its price, nor when it was launched.

These findings suggest that complex network analysis could be a useful tool for perfume-makers to explore new accords that boost a perfume's potential for success.

The authors add: "We studied data on perfumes and their odour descriptors-notes-as a complex network to gain insights about how note compositions, called accords, influence success of fragrances. We obtained accords which tend to be present in perfumes that receive significantly more customer ratings, for example, an accord composed of Geranium, Lavender and Oakmoss. This work presents a framework which would be a timely tool for perfumers to explore a multidimensional space of scent compositions."

Credit: 
PLOS

Plants don't think, they grow: The case against plant consciousness

If a tree falls, and no one's there to hear it, does it feel pain and loneliness? No, experts argue in an opinion article publishing on July 3rd in the journal Trends in Plant Science. They draw this conclusion from the research of Todd Feinberg and Jon Mallatt, which explores the evolution of consciousness through comparative studies of simple and complex animal brains.

"Feinberg and Mallatt concluded that only vertebrates, arthropods, and cephalopods possess the threshold brain structure for consciousness. And if there are animals that don't have consciousness, then you can be pretty confident that plants, which don't even have neurons--let alone brains--don't have it either," says Lincoln Taiz, Professor Emeritus of molecular, cell, and developmental biology at University of California at Santa Cruz.

The topic of whether plants can think, learn, and intentionally choose their actions has been under debate since the establishment of plant neurobiology as a field in 2006 (10.1016/j.tplants.2006.06.009). Taiz was an original signer of a letter, also in Trends in Plant Science (10.1016/j.tplants.2007.03.002), arguing against the suggestion that plants have neurobiology to study at all.

"The biggest danger of anthropomorphizing plants in research is that it undermines the objectivity of the researcher," Taiz says. "What we've seen is that plants and animals evolved very different life strategies. The brain is very expensive organ, and there's absolutely no advantage to the plant to have a highly developed nervous system."

Plant neurobiology proponents draw parallels between electrical signaling in plants and nervous systems in animals. But Taiz and his co-authors argue that the proponents draw this parallel by describing the brain as something no more complex than a sponge. The Feinberg-Mallatt model of consciousness, by contrast, describes a specific level of organizational complexity of the brain that is required for subjective experience.

Plants use electrical signals in two ways: to regulate the distribution of charged molecules across membranes and to send messages long-distance across the organism. In the former, a plant's leaves might curl up because the movement of ions results in the movement of water out of the cells, which changes their shape; and in the latter, an insect bite on one leaf might initiate defense responses in distant leaves. Both actions can appear like a plant is choosing to react to a stimulus, but Taiz and his co-authors emphasize that these responses are genetically encoded and have been fine-tuned through generations of natural selection.

"I feel a special responsibility to take a public position because I'm a co-author of a plant physiology textbook," he says. "I know a lot of people in the plant neurobiology community would like to see their field in the textbooks, but so far, there are just too many unanswered questions."

One frequently referenced study on plant learning is the apparent habituation of Mimosa pudica (10.1007/s00442-013-2873-7). In this experiment, a plant is dropped, and its leaves curl up in defense. After being dropped many times, but sustaining no serious damage, the leaves stop curling. When the plant is shaken, the leaves do curl, ostensibly ruling out motor fatigue as a cause of the lack of response when dropped.

"The shaking was actually quite violent. Because the shaking stimulus was stronger than the dropping stimulus, it doesn't definitively rule out sensory adaptation, which doesn't involve learning," Taiz argues. "Related experiments with peas purporting to show Pavlovian classical conditioning are also problematical because of the lack of sufficient controls."

Taiz and his co-authors hope that further research will address the questions left unanswered by current plant neurobiology experiments by using more stringent conditions and controls.

Credit: 
Cell Press

Discovery linking microbes to methane emissions could make agriculture more sustainable

Common dairy cows share the same core group of genetically inherited gut microbes, which influence factors such as how much methane the animals release during digestion and how efficiently they produce milk, according to a new study. By identifying these microbes, the research may help enable scientists to manipulate the rumen (a cow's first stomach, where microbes break down ingested food), facilitating a transition towards more eco-friendly and productive agriculture. Scientists have long wondered about the connection between a cow's genetics, its productivity, and the composition of its microbiome. To begin to uncover answers, Robert John Wallace and his team used common nucleotide variations between genes to study the genotypes of an unprecedented 1000 Holstein and Nordic Red dairy cows from the UK, Italy, Sweden, and Finland--the most popular and productive milking cow breeds in developed countries. They identified a core microbiome: a selection of closely related microbes present in at least 50% of all the cattle. The researchers then used two machine learning algorithms to determine that they could accurately predict rumen metabolism, diet, and traits including milk output and methane emissions based on this core microbiome's composition. A Canonical Correlation Analysis (CCA) showed that the core microbiome was correlated with genetics, suggesting that inherited genes give rise to microbes responsible for methane emissions and other cattle traits. The finding that these influential microbes are linked to heritable genes could enable programs in which breeders select for cattle with microbiomes that produce the least methane.
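The two analysis steps described above -- predicting host traits from the core microbiome with machine learning, and relating microbiome composition to host genotype with canonical correlation analysis -- can be sketched on simulated data as follows. The scikit-learn estimators, sample sizes and simulated signal are placeholders, not the study's methods or results.

```python
# Sketch of the two analysis ideas, on simulated data:
# (1) predict a host trait (methane output) from core-microbe abundances with
#     a machine learning regressor, and (2) relate microbiome composition to
#     host genotype with canonical correlation analysis (CCA).
# All numbers are random placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_cows, n_microbes, n_snps = 200, 30, 40
microbiome = rng.lognormal(size=(n_cows, n_microbes))   # core-microbe abundances
genotype = rng.integers(0, 3, size=(n_cows, n_snps))    # SNP dosages 0/1/2
methane = microbiome[:, :5].sum(axis=1) + rng.normal(0, 1, n_cows)

# (1) How well does the core microbiome predict methane emissions?
rf = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(rf, microbiome, methane, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean().round(2))

# (2) Is microbiome composition correlated with host genetics?
cca = CCA(n_components=2).fit(genotype, microbiome)
u, v = cca.transform(genotype, microbiome)
print("first canonical correlation:", np.corrcoef(u[:, 0], v[:, 0])[0, 1].round(2))
```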

Credit: 
American Association for the Advancement of Science (AAAS)

11% of destroyed moist tropical forests could be restored to boost climate, environment

WASHINGTON, DC (3 July, 2019)--In a peer-reviewed report released today, researchers identified more than 100 million hectares of lost lowland tropical rain forests--restoration hotspots--spread out across Central and South America, Africa and Southeast Asia that present the most compelling opportunities for restoration to overcome rising global temperatures, water pollution and shortages, and the extinction of plant and animal life. Brazil, Indonesia, Madagascar, India and Colombia have the largest accumulated area of restoration hotspots; six African countries--Rwanda, Uganda, Burundi, Togo, South Sudan, and Madagascar--are home to the areas presenting the best restoration opportunities on average.

"Restoring tropical forests is fundamental to the planet's health, now and for generations to come," said lead author Pedro Brancalion, from the University of São Paulo, Brazil. "For the first time, our study helps governments, investors and others seeking to restore global tropical moist forests to determine precise locations where restoring forests is most viable, enduring and beneficial. Restoring forests is a must do--and it's doable."

The 12 authors of the study, "Global restoration opportunities in tropical rainforest landscapes," appearing today in the journal Science Advances, used high-resolution satellite imagery and the latest peer-reviewed research on four forest benefits (biodiversity, climate change mitigation, climate change adaptation, and water security) and three aspects of restoration effort (cost, investment risk and the likelihood of restored forests surviving into the future) to assess and "score," in 1-square-kilometer blocks, all tropical lands worldwide that retained less than 90% of their forest cover.

Restoration hotspots are those lands that scored in the top 10%, meaning that restoring them would be the most beneficial and the least costly and risky.
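The scoring-and-thresholding step can be pictured as combining benefit and feasibility layers for each block and flagging the top decile. The pandas sketch below uses random values and an unweighted average purely for illustration; the study's actual layers, weights and scoring differ.

```python
# Sketch of the scoring-and-thresholding step: combine benefit and feasibility
# layers for each block and flag the top 10% as "hotspots".
# The blocks, weights and values here are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
blocks = pd.DataFrame({
    "biodiversity": rng.random(1000),
    "carbon": rng.random(1000),          # climate change mitigation
    "adaptation": rng.random(1000),
    "water": rng.random(1000),
    "cost": rng.random(1000),            # higher = more expensive
    "risk": rng.random(1000),
    "persistence": rng.random(1000),     # chance the restored forest survives
})

benefits = blocks[["biodiversity", "carbon", "adaptation", "water"]].mean(axis=1)
feasibility = (blocks["persistence"] + (1 - blocks["cost"]) + (1 - blocks["risk"])) / 3
blocks["score"] = (benefits + feasibility) / 2

blocks["hotspot"] = blocks["score"] >= blocks["score"].quantile(0.90)
print(blocks["hotspot"].sum(), "of", len(blocks), "blocks flagged as hotspots")
```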

The top 15 countries with the largest restoration hotspots were found across all the tropical forest biomes, or zones: three in the Neotropics, five in the Afrotropics, and seven in Indo-Malaysia and Australasia.

The five countries with the largest restoration hotspot by area are Brazil, Indonesia, India, Madagascar and Colombia.

The six countries with the highest mean score were found in Africa: Rwanda, Uganda, Burundi, Togo, South Sudan and Madagascar. "We were surprised to find such a concentration of highly ranked countries in a single continent," co-author Robin Chazdon said. "The study really highlights the high potential for successful rainforest restoration outcomes in these African countries."

Nearly 87% of the restoration hotspots were found within biodiversity conservation hotspots, areas that hold high concentrations of species found nowhere else, but are at high risk for deforestation.

Seventy-three percent of the restoration hotspots were found in countries that have made restoration commitments as part of the Bonn Challenge, a global effort to bring 150 million hectares of the world's deforested and degraded land into restoration by 2020, and 350 million hectares by 2030. "It's encouraging that so many hotspots are located in countries where restoring forests and landscapes is already a priority," said Brancalion.

In most cases, restoration hotspots overlap with fields and pastures currently in use by farmers. As a result, the study shows, restoring forests is most feasible on lands of low value for agricultural production. Alternatively, the researchers argue, restoration could be coupled with income-generating forms of production through, for example, enriching pastures with trees, harvesting forest-based products like rattan and growing coffee or cocoa beneath a forest canopy. Any decisions about changing land use must fully engage local communities, as restoration should complement rather than compete with food security and land rights. In other cases, these hotspots include abandoned, degraded farmlands or government lands.

"Restoration involves far more than simply planting trees," said Chazdon. "It starts with the need for mutually beneficial agreements with those currently using the land and doesn't end until forests host the rich diversity of plant and animal life that make them so awe-inspiring and valuable. But, fortunately, studies show it doesn't take long for the benefits of new forests to kick-in."

Consensus is emerging that forest restoration--together with the protection of natural, old-growth forests--is one of the most cost-effective and readily available solutions to current climate and environment woes. A statement signed by 40 scientists last year laid out the "five often overlooked reasons why limiting global warming requires protecting and sustainably managing the forests we have, and restoring the forests we've lost." The scientists stress that the world must focus on rapidly decreasing fossil fuel use and stopping deforestation, while seeking ways to increase carbon sinks. Ramping up restoration, they caution, will help meet climate goals, but it cannot supplant the urgent need to reduce emissions.

While some countries, most notably China and India, have already launched large-scale tree planting efforts with some success, these efforts are getting mixed reviews in terms of the quality of plantation cover and its value for protecting native species. In some cases, countries are establishing monoculture tree plantations--one species of tree planted over and over again--to meet restoration commitments. Experts caution, however, that a focus on protecting and restoring natural forests, not planting monoculture plantations, is essential to meeting climate and other co-benefits of restoration.

Brancalion added, "Pledges and agreements like the Bonn Challenge and the New York Declaration on Forests show that there is will to restore and protect forests. With the tools we have developed, countries, companies and other actors who have pledged to restore forests have the precise information they need to roll up their sleeves and dive into the difficult work of bringing our forests back. There are no shortcuts when it comes to forest restoration, but there is low-hanging fruit that we need to seize now, before it's too late."

Credit: 
Burness

Estimates of lost earnings from cancer deaths in US

Bottom Line: Cancer has a significant impact on the U.S. economy, in part because of lost productivity from premature deaths. This analysis estimated lost earnings for individuals ages 16 to 84 who died from cancer in 2015 by using data on cancer deaths, life expectancy and annual earnings. There were an estimated $94.4 billion in lost earnings due to cancer deaths in 2015, with large variation across the states. The study has several limitations, including that lost productivity was likely underestimated.
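A person-years calculation of the general kind used in such analyses sums, for each death, the expected annual earnings from age at death to an assumed retirement age, discounted to present value. The Python sketch below uses invented ages and earnings, a 3% discount rate and a retirement age of 65 purely as placeholders; it is not the authors' published method.

```python
# Sketch of a person-years approach to lost earnings: for each death, sum the
# expected annual earnings from age at death to an assumed retirement age,
# discounted to present value. Ages, earnings and the 3% rate are placeholders.
def lost_earnings(age_at_death, annual_earnings, retirement_age=65, discount=0.03):
    years = range(age_at_death, retirement_age)
    return sum(annual_earnings / (1 + discount) ** (y - age_at_death) for y in years)

deaths = [(45, 52000), (60, 48000), (38, 61000)]   # (age, expected yearly earnings)
total = sum(lost_earnings(age, pay) for age, pay in deaths)
print(f"estimated lost earnings: ${total:,.0f}")
```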

Authors: Farhad Islami, M.D., Ph.D., American Cancer Society, Atlanta, and coauthors

(doi:10.1001/jamaoncol.2019.1460)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Immune-boosting compound makes immunotherapy effective against pancreatic cancer

Pancreatic cancer is especially challenging to treat - only eight percent of patients are still alive five years after diagnosis. Chemotherapy and radiation therapy are of limited benefit, and even immunotherapy - which revolutionized treatment for other kinds of cancer by activating the body's immune system to attack cancer cells - has been largely ineffective because pancreatic tumors have ways to dampen the immune assault.

Now, researchers at Washington University School of Medicine in St. Louis and Rush University in Chicago have found a chemical compound that promotes a vigorous immune assault against the deadly cancer. Alone, the compound reduces pancreatic tumor growth and metastases in mice. But when combined with immunotherapy, the compound significantly shrank tumors and dramatically improved survival in the animals.

The findings, published July 3 in Science Translational Medicine, suggest that the immune-boosting compound could potentially make resistant pancreatic cancers susceptible to immunotherapy and improve treatment options for people with the devastating disease.

"Pancreatic cancer is a highly lethal disease, and we are in desperate need of new therapeutic approaches," said co-senior author David DeNardo, PhD, an associate professor of medicine and of pathology and immunology at Washington University School of Medicine. "In animal studies, this small molecule led to very marked improvements and was even curative in some cases. We are hopeful that this approach could help pancreatic cancer patients."

On paper, immunotherapies for pancreatic cancer seem like a good idea. The technique works by releasing a brake on specialized immune cells called T cells so they can attack the cancer. In the past, researchers working in the lab found they could release the brake and prod T cells into killing pancreatic cancer cells. But when doctors tried to treat people with pancreatic cancer using immunotherapies, fewer than five percent of patients improved.

This failure of immunotherapy in pancreatic cancer has puzzled scientists. But T cells aren't the only player in the immune assault on cancer. Myeloid cells, another kind of immune cell found in and around tumors, can either tamp down or ramp up the immune response. They tilt the playing field by releasing immune molecules that affect how many T cells are recruited to the tumor, and whether the T cells show up at the tumors activated and ready to kill, or suppressed and inclined to ignore the tumor cells. In pancreatic tumors, myeloid cells typically suppress other immune cells, undermining the effects of immunotherapy.

DeNardo, co-senior author Vineet Gupta, PhD, of Rush University, and colleagues realized that releasing the brake on T cells might not be enough to treat pancreatic cancer. Unleashing the power of immunotherapy might require also shifting the balance of myeloid cells toward those that activate T cells to attack.

The researchers identified a compound, called ADH-503, that interferes with the migration of myeloid cells. Normally, pancreatic tumors are teeming with myeloid cells that suppress the immune response. When the researchers gave the compound to mice with pancreatic cancer, the number of myeloid cells in and near the tumors dropped, and the remaining myeloid cells were of the kind that promoted, rather than suppressed, immune responses. This environment translated into greater numbers of cancer-killing T cells in the tumor, significantly slower tumor growth and longer survival.

Then, the researchers - including first author Roheena Panni, MD, resident in general surgery at Washington University and Barnes-Jewish Hospital, and co-author William Hawkins, MD, the Neidorff Family and Robert C. Packman Professor of Surgery at Washington University School of Medicine - investigated whether creating this same environment could make pancreatic tumors susceptible to standard immunotherapy. First, they treated mice with a so-called PD-1 inhibitor, a standard immunotherapy used to treat other kinds of cancer. Unsurprisingly, they saw no effect. But when the researchers gave the mice the immunotherapy in conjunction with ADH-503, the tumors shrank and the mice survived significantly longer. In some experiments, all the tumors disappeared within a month of treatment, and all the mice survived for four months, when the researchers stopped monitoring them. In comparison, all the untreated mice died within six weeks.

Gupta noted that while pancreatic cancer is the third leading cause of cancer-related death in the United States, only about three percent of clinical trials for cancer immunotherapies target pancreatic cancer.

"Unlocking the promise of immunotherapies for pancreatic cancer requires a new approach," Gupta said. "We believe these data demonstrate that targeting myeloid cells can help overcome resistance to immunotherapies."

The strategy of boosting antitumor immune activity by shifting the balance of myeloid cells improved the effectiveness of other pancreatic cancer therapies as well, the researchers said. Mice treated with chemotherapy or radiation therapy both fared significantly better when ADH-503 was added to the regimen.

"You can't make a one-to-one translation between animal studies and people, but this is very encouraging," DeNardo said. "More study is needed to understand if the compound is safe and effective in people, which is why this compound is going into phase I safety studies in people later this year at Washington University and other sites."

Credit: 
Washington University School of Medicine