
A person's perception of risk can tell us about their chances of opioid relapse

People in treatment for opioid addiction are more likely to relapse when they become more tolerant of risk, according to a study by researchers at Rutgers and other institutions. The findings can help clinicians better predict which patients are most vulnerable.

In the study, published in JAMA Psychiatry, researchers followed 70 people during their first seven months of treatment for opioid addiction - the period associated with the highest relapse and overdose risk.

Forty-six percent returned to opioid use during that time. Most relapses occurred when patients showed a heightened tolerance for risk-taking in situations where the odds of a decision paying off were not fully knowable, according to their performance in a computer game created for the study.

According to the National Institute on Drug Abuse, the relapse rate for substance use disorders is estimated to be between 40 percent and 60 percent.

"Although it is well known that people addicted to opioids cycle through periods of abstinence and use, we lack the tools needed to prospectively identify when these transitions are more likely to occur. Here, given that opioid use during treatment is quite risky, we wanted to examine whether a patient's tolerance for risky decisions is informative about their vulnerability to relapse," said Anna Konova, an assistant professor at Rutgers University Behavioral Health Care and Rutgers Robert Wood Johnson Medical School, and a faculty member in the Brain Health Institute.

Each patient completed up to 15 study visits over seven months, during which they had an opportunity to play the computer game for financial rewards. The game required patients to make decisions involving two types of risk: known risk, in which they had complete information about the likelihood that a decision would lead to a reward; and ambiguous risk, in which they did not have full information about the possible outcomes.
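The distinction between the two risk types can be made concrete with a small subjective-value sketch of the sort used in decision-science tasks. Everything below is a hypothetical illustration, not the study's actual game or model: the function form, the parameter names (`alpha` for risk attitude, `beta` for ambiguity attitude) and the numbers are all assumptions.

```python
# Hypothetical sketch of how tolerance for known vs. ambiguous risk can be
# modeled; parameters and values are illustrative, not from the study.

def subjective_value(amount, win_prob, ambiguity, alpha, beta):
    """Subjective value of a gamble.

    amount    -- potential monetary reward
    win_prob  -- stated probability of winning
    ambiguity -- fraction of the odds that is hidden (0 = fully known risk)
    alpha     -- risk attitude (<1 risk-averse, >1 risk-seeking)
    beta      -- ambiguity attitude (>0 means hidden odds lower the value)
    """
    effective_prob = win_prob - beta * ambiguity / 2.0
    return effective_prob * amount ** alpha

# The same 50/50 gamble valued with fully known odds vs. half-hidden odds:
known = subjective_value(10, 0.5, 0.0, alpha=0.8, beta=0.6)
ambiguous = subjective_value(10, 0.5, 0.5, alpha=0.8, beta=0.6)
# For this ambiguity-averse agent, hiding the odds lowers the gamble's value.
```

In the study's framing, growing tolerance for ambiguous risk would correspond to drifting parameter values over repeated visits, making poorly understood gambles look more attractive.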

The researchers measured the computer test results against clinical assessments of the patients' anxiety, craving, withdrawal and nonadherence to treatment. Opioid use was determined by random urine tests and self-reporting.

"Used in conjunction with clinical assessments, the computer model can be an important risk calculator, allowing clinicians in large, but short-staffed, treatment centers to allocate appropriate attention to those at greater risk for relapse and treatment failure," said Konova. "The goal is to eventually create a mobile app based on the game that people can play remotely, which could convey information about relapse risk in real time to the patient, clinician or caretaker."

This knowledge will allow clinicians to monitor vulnerable patients for changes that might affect their short-term and long-term vulnerability to relapse.

Credit: 
Rutgers University

Follicular lymphoma remission for 2+ years indicates disease-free status could be lifelong

ATLANTA --- People with follicular lymphoma, a slow-growing lymphatic-system cancer, who have been treated and are in remission for at least two years may no longer have what has been considered an incurable disease, according to highly sensitive testing; this means they may no longer need therapy or active follow-up.

That is the finding of a new study from researchers at Georgetown Lombardi Comprehensive Cancer Center, by Maryam Sarraf Yazdy, MD, Bruce Cheson, MD, and colleagues, that will be presented in a poster session at the annual American Society of Hematology (ASH) meeting in Orlando, Fla., at 6 pm ET on December 8, 2019.

Follicular lymphoma accounts for about a third of all non-Hodgkin's lymphomas. Approximately 20,000 people are diagnosed with the disease annually in the United States.

"While follicular lymphoma is not one of the more aggressive types of cancer we treat, the majority of patients continue to experience disease recurrence over many years and have to receive different types of therapy," said Maryam Sarraf Yazdy, MD, a hematologist/oncologist at MedStar Georgetown University Hospital and Georgetown Lombardi. "This disease has been considered incurable, but for some patients who have been disease-free for at least two years after remission, our pilot study gives hope that calling the disease incurable may no longer be accurate."

The study enrolled 68 people with follicular lymphoma at Georgetown. They had all undergone conventional treatments for their disease and had been in clinical remission for over two years. Twenty-five patients had biopsy samples that did not meet the study criteria; therefore, only 43 patient samples could be fully assessed.

As a first step in the researchers' process, patients' biopsy samples from the time of their initial diagnoses were examined for changes in their lymphoma cells with next-generation sequencing tests. These tests can now detect minute genetic alterations that are specific to follicular lymphoma. For the second step, a current sample of each patient's blood was evaluated for possible remaining lymphoma cells by searching for the specific genetic changes identified in the original biopsy samples. Of the 43 patients analyzed with next-generation sequencing, 38 showed no evidence of lymphoma in their blood.
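The two-step logic of the assay can be sketched as a simple set intersection: record the tumor-specific alterations found in the diagnostic biopsy, then check whether any of them appear in a later blood sample. This is only a schematic with invented variant labels; the real assay works on next-generation sequencing reads, not string sets.

```python
# Schematic of the two-step residual-disease check described above.
# Variant labels are invented for illustration.

def residual_disease_markers(biopsy_variants, blood_variants):
    """Tumor-specific variants from diagnosis still detectable in blood."""
    return set(biopsy_variants) & set(blood_variants)

tumor_markers = {"BCL2-IGH_fusion", "CREBBP_missense"}    # step 1: biopsy
remission_blood = {"common_benign_snp"}                   # step 2: blood draw
relapse_blood = {"BCL2-IGH_fusion", "common_benign_snp"}

no_evidence = residual_disease_markers(tumor_markers, remission_blood)
evidence = residual_disease_markers(tumor_markers, relapse_blood)
# An empty intersection means no tumor-specific marker was found in blood.
```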

The ability to identify which patients might be disease-free and might no longer need treatment is important because many of these patients had undergone numerous therapies, often for multiple relapses, and lived with constant concern about the possibility of their disease returning.

"More important than anything perhaps, is the lifting of the psychological burden these patients faced with a diagnosis of a presumed incurable disease," concluded Dr. Sarraf Yazdy. "This is a pilot study in a small number of patients with a short follow-up time. We need to do more work, study a larger number of patients, and monitor them for a longer time, but this is an important first step."

Credit: 
Georgetown University Medical Center

Too few hospitals have clinical decision support tools to calculate nutrition in NICU

Most neonatal intensive care units (NICUs) participating in the Children's Hospitals Neonatal Consortium are unable to reliably and consistently monitor caloric intake delivered to critically ill infants at risk for growth failure, according to a study published in the Journal of Perinatology. Managing optimal nutrition for preemies is a complex process, especially when the baby is transitioned from receiving nutrition intravenously to enteral (or through the gut) feeds. The study found low prevalence of fully automated clinical decision support systems used to calculate and adjust nutritional intake for premature infants.

"Delivery of appropriate amounts of calories, protein, fat and carbohydrates to premature infants in the NICU is associated with improved outcomes, including better growth and decreased risk of neurodevelopmental impairment," says lead author Gustave Falciglia, MD, MSCI, MSHQS, neonatologist at Ann & Robert H. Lurie Children's Hospital of Chicago and Assistant Professor of Neonatal-Perinatal Medicine at Northwestern University Feinberg School of Medicine. "We have the electronic health record but most of us still lack a fully automated system to track the baby's caloric intake. We are still doing manual calculations, which take time and are susceptible to error. Our findings highlight the pervasive need for clinical decision support to monitor and improve delivery of calories and nutrients to babies in our NICUs."

Dr. Falciglia and colleagues surveyed 34 regional level IV NICUs on the availability of clinical decision support systems to calculate the nutrition and fluids an infant received in the prior 24 hours and to estimate the nutrition and fluids the infant should receive in the next 24 hours. They found that more NICUs have clinical decision support to calculate fluid intake than caloric or nutrient intake.

"Fluid intake is much more straightforward to calculate than caloric needs, so it is not surprising that clinical decision support is more commonly available for this function," says Dr. Falciglia. "Caloric calculations involve many more factors and there is less consistency among NICUs in how nutrition calculations are approached. We need to establish and share best practices, in order to develop a standardized computerized solution."

Credit: 
Ann & Robert H. Lurie Children's Hospital of Chicago

Probiotics and prebiotics work differently in girls and boys according to piglet study

The team from the Universities of Bristol and Reading found that 28-day-old piglets produced very different levels of immune cells, antibodies and other immune-associated molecules depending on their sex, contradicting previous evidence suggesting that differences in immunity begin during puberty.

Dr Marie Lewis, principal investigator and Lecturer in Gut Immunology and Microbiology at the University of Reading, said: "Correct development of the immune system is essential in ensuring it responds appropriately to both harmful and harmless stimulation throughout life and this development, even during the first days of life, depends on your sex. Although we don't know why, we know that young girls tend to produce a more protective immune response to vaccination than boys. But what we did not expect to find is that young girls also appear to have a more regulated immune environment in their intestinal tissues than boys. This is important because around 70 per cent of the immune system is in the gut and this is also where its development is driven during early life, largely by the resident gut bacteria.

"Piglets are valuable pre-clinical models for human infants, especially for nutritional studies, and we also show for the first time that probiotics and prebiotics can have different effects on the immune system in male compared to female piglets. For example, the prebiotic inulin significantly increases the number of cells responsible for controlling immune responses, the regulatory T-cells, in male guts but not in female guts.

"The consequence of this study is that we need to rethink how we design, and analyse the data from, nutritional trials in youngsters. Currently, studies looking at the effectiveness of dietary supplements on the immune system assume that the same thing happens in boys and girls. But we show this is not the case and that sex may be influencing data on the effectiveness of probiotics and prebiotics in infanthood."

Mick Bailey, Professor of Comparative Immunology at Bristol Veterinary School, added: "The work raises some really important questions about why this happens - is it because the levels of the different sex hormones make the immune systems different almost as a side effect, even at this age, or is it because the immune and reproductive systems need to be fundamentally linked during early development?"

The paper found that probiotics and prebiotics work differently in male and female 28-day-old piglets. The team note that the effects of these nutritional interventions can be masked if males and females are analyzed together.

Dr Lewis continued: "This also means that treatments for immune disorders may need to be designed differently for infant girls and boys. In the future, we could find that specific probiotics or prebiotics are more beneficial for girls, whilst others could generate better health outcomes for boys. Given the underlying differences in immune development we identified between boys and girls, taking sex into account could provide a simple means to improve the effectiveness of pharmaceutics and other therapies which act on the immune system."

In a previous study in 2018, Dr Lewis found that how the immune system develops in the guts of pigs can be programmed very early on. The team found that piglets exposed to an extensive, outdoor farm during just the first day of life retained a regulatory immune environment in their guts even after 28 days of 'urban living'. This could be linked to the reduced incidences of allergy development which occurs in children raised on farms.

Credit: 
University of Bristol

Ancient worm reveals way to destroy toxic cells in Huntington's disease

Insights from their study may provide a novel therapeutic approach for diseases such as Huntington's and Parkinson's.

Associate Professor Roger Pocock, from the Monash Biomedicine Discovery Institute (BDI), and colleagues from the University of Cambridge led by Professor David Rubinsztein, found that microRNAs are important in controlling protein aggregates, proteins that have amassed due to a malfunction in the process of 'folding' that determines their shape.

Their findings were published in eLife today.

MicroRNAs, short strands of genetic material, are tiny but powerful molecules that regulate many different genes simultaneously. The scientists sought to identify particular microRNAs that are important for regulating protein aggregates and homed in on miR-1, which is found in low levels in patients with neurodegenerative diseases such as Parkinson's disease.

"The sequence of miR-1 is 100 per cent conserved; it's the same sequence in the Caenorhabditis elegans worm as in humans even though they are separated by 600 million years of evolution," Associate Professor Pocock said.

"We deleted miR-1 in the worm and looked at the effect in a preclinical model of Huntington's and found that when you don't have this microRNA there's more aggregation," he said. "This suggested miR-1 was important to remove Huntington's aggregates."

The researchers then showed that miR-1 helped protect against toxic protein aggregates by controlling the expression of the TBC-7 protein in worms. This protein regulates the process of autophagy, the body's way of removing and recycling damaged cells, which is crucial for clearing toxic proteins from cells.

"When you don't have miR-1, autophagy doesn't work correctly and you have aggregation of these Huntington's proteins in worms," Associate Professor Pocock said.

Professor Rubinsztein then conducted research which showed that the same microRNA regulates a related pathway to control autophagy in human cells.

"Expressing more miR-1 removes Huntington's aggregates in human cells," Associate Professor Pocock said.

"It's a novel pathway that can control these aggregation-prone proteins. As a potential means of alleviating neurodegenerative disease, it's up there," he said.

Additional work by Associate Professor Pocock's colleagues showed that when human cells are supplied with a molecule called interferon-β, the miR-1 pathway is upregulated, revealing a way of manipulating it.

He said the studies demonstrated the fundamental importance of discovery research. "We asked a fundamental biological question to dissect a molecular mechanism that now is shown to be really important for potential therapies."

The researchers have provisionally patented their findings and are in discussions with pharmaceutical companies about translating the research. They will further test it in preclinical models for Huntington's and Parkinson's disease.

Credit: 
Monash University

Reduced soil tilling helps both soils and yields

Agriculture degrades over 24 million acres of fertile soil every year, raising concerns about meeting the rising global demand for food. But a simple farming practice born from the 1930s Dust Bowl could provide a solution, according to new Stanford research. The study, published Dec. 6 in Environmental Research Letters, shows that Midwest farmers who reduced how much they overturned the soil - known as tilling - increased corn and soybean yields while also nurturing healthier soils and lowering production costs.

"Reduced tillage is a win-win for agriculture across the Corn Belt," said study lead author Jillian Deines, a postdoctoral scholar at Stanford's Center on Food Security and the Environment. "Worries that it can hurt crop yields have prevented some farmers from switching practices, but we found it typically leads to increased yields."

The U.S. - the largest producer of corn and soybeans worldwide - grows a majority of these two crops in the Midwest. Farmers plucked about 367 million metric tons of corn and 108 million metric tons of soybeans from American soil this past growing season, providing key food, oil, feedstock, ethanol and export value.

Monitoring farming from space

Farmers generally till the soil prior to planting corn or soybeans - a practice known to control weeds, mix nutrients, break up compacted dirt and ultimately increase food production over the short term. However, over time this method degrades soil. A 2015 report from the Food and Agriculture Organization of the United Nations found that in the past 40 years the world has lost a third of food-producing land to diminished soil. The demise of once fertile land poses a serious challenge for food production, especially with mounting pressures on agriculture to feed a growing global population.

In contrast, reduced tillage - also known as conservation tillage - promotes healthier soil management, reduces erosion and runoff and improves water retention and drainage. It involves leaving the previous year's crop residue (such as corn stalks) on the ground when planting the next crop, with little or no mechanical tillage. The practice is used globally on over 370 million acres, mostly in South America, Oceania and North America. However, many farmers fear the method could reduce yields and profits. Past studies of yield effects have been limited to local experiments, often at research stations, that don't fully reflect production-scale practices.

The Stanford team turned to machine learning and satellite datasets to address this knowledge gap. First, they identified areas of reduced and conventional tilling from previously published data outlining annual U.S. practices for 2005 to 2016. Using satellite-based crop yield models - which take into account variables such as climate and crop life-cycles - they also reviewed corn and soybean yields during this time. To quantify the impact of reduced tillage on crop yields, the researchers trained a computer model to compare changes in yields based on tillage practice. They also recorded elements such as soil type and weather to help determine which conditions had a larger influence on harvests.
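The comparison step can be illustrated with a toy calculation on synthetic numbers. The study itself trained a model on satellite-derived yields with climate and soil covariates; the simple grouping below is only a minimal stand-in for that analysis, and the field records are made up.

```python
# Toy version of the yield comparison: average percent difference between
# fields under reduced vs. conventional tillage. Synthetic data only.

def mean_yield_gain(records):
    """Percent yield gain of reduced over conventional tillage."""
    reduced = [r["yield"] for r in records if r["tillage"] == "reduced"]
    conventional = [r["yield"] for r in records if r["tillage"] == "conventional"]
    reduced_mean = sum(reduced) / len(reduced)
    conventional_mean = sum(conventional) / len(conventional)
    return 100.0 * (reduced_mean - conventional_mean) / conventional_mean

fields = [  # hypothetical per-field yields in metric tons per hectare
    {"tillage": "reduced", "yield": 11.2},
    {"tillage": "reduced", "yield": 10.9},
    {"tillage": "conventional", "yield": 10.7},
    {"tillage": "conventional", "yield": 10.6},
]
gain = mean_yield_gain(fields)  # a few percent, on the order the study found
```

The study's actual model additionally conditions on soil type and weather, which is what lets it attribute yield differences to tillage practice rather than to field conditions.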

Improved yields

The researchers calculated that corn yields improved an average of 3.3 percent and soybean yields 0.74 percent across fields managed with long-term conservation tillage in the nine states sampled. The additional tonnage would rank in the top 15 country totals worldwide for both crops. For corn, it comes to approximately 11 million additional metric tons, matching the 2018 output of South Africa, Indonesia, Russia or Nigeria. For soybeans, the added 800,000 metric tons ranks between Indonesia's and South Africa's country totals.

Some areas experienced yield increases of up to 8.1 percent for corn and 5.8 percent for soybeans. In other fields, yields declined by 1.3 percent for corn and 4.7 percent for soybeans. Water within the soil and seasonal temperatures were the most influential factors in yield differences, especially in drier, warmer regions. Wet conditions were also favorable to crops, except early in the season, when water-logged soils benefit from conventional tillage, which dries and aerates them.

"Figuring out when and where reduced tillage works best could help maximize the benefits of the technology and guide farmers into the future," said study senior author David Lobell, a professor of Earth system science in the School of Earth, Energy & Environmental Sciences and the Gloria and Richard Kushel Director of the Center on Food Security and the Environment.

It takes time to see the benefits of reduced tillage, as it works best under continuous implementation. According to the researchers' calculations, corn farmers won't see the full benefit until 11 years of the practice, and soybeans take twice as long for full yields to materialize. However, the approach also lowers costs by reducing the need for labor, fuel and farming equipment while sustaining fertile land for continuous food production. The study does show a small positive gain even during the first year of implementation, with higher gains accruing over time as soil health improves. According to the 2017 Agricultural Census, farmers appear to be getting on board with the long-term investment: close to 35 percent of cropland in the U.S. is now managed with reduced tillage.

"One of the big challenges in agriculture is achieving the best crop yields today without compromising future production. This research demonstrates that reduced tillage can be a solution for long-term crop productivity," Deines said.

Credit: 
Stanford University

Discovery of a new protein gives insight into a long-standing plant immunity mystery

image: Mai1 acts upstream of M3Kα and MKK2. Representative photographs of the cell death observed 5 days post-agroinfiltration of the cell death elicitors SlM3Kα or constitutive-active NtMKK2DD into N. benthamiana leaves that were silenced using the indicated VIGS constructs.

Image: 
Robyn Roberts et al.

When a plant senses an invading pathogen, it activates a molecular signaling cascade that switches on its defense mechanisms. One such mechanism involves sacrificing host cells to the pathogen. This tightly controlled process, called the cell death response, relies on plant proteins to ensure that sacrificial cells are killed only when the pathogen is attacking, so that only a few host cells die.

Tomatoes employ this method when they are invaded by a bacterial pathogen known as Pseudomonas syringae pv. tomato, which causes speck disease. Scientists understand how the tomato recognizes this pathogen and know many of the plant proteins that are involved in the signaling cascade, but until recently they did not know what linked these two processes, a mystery that has been around for decades.

In a recent paper published in Molecular Plant-Microbe Interactions, scientists introduce a protein, called Mai1, that supplies part of this missing link. They found that when they silenced the expression of Mai1, the plants could no longer defend themselves against pathogens through the cell death response. As a result, these plants were more susceptible to bacterial infection.

They also found that Mai1 directly interacts with a protein at the top of the signaling cascade and upregulates its activity, suggesting that Mai1 plays a key role in activating the cascade.

"Our research suggests that Mai1 has a central role in immunity that likely cannot be substituted by other proteins," according to first author Robyn Roberts. "Not only does this work give us better insight into how plants defend themselves on the molecular level, but it reveals a key protein that is broadly involved in immunity. It is possible that Mai1 could serve as a target for crop improvement in the future."

This research also showed that silencing Mai1 stunted the plants, leaving them with brittle leaves and heightened sensitivity to mild stress, including pesticide application. This underscores the importance of Mai1, suggesting that the protein is involved in plant growth and development as well as immunity.

Credit: 
American Phytopathological Society

Team finds link between vitamin A and brain response in Monarch butterflies

image: Texas A&M University biologist Christine Merlin studies Monarch butterflies to learn more about animal migration, the role of circadian clocks in regulating daily and seasonal animal physiology and behavior, and the evolution of the animal clockwork.

Image: 
Texas A&M University

Biologists at Texas A&M University are making strides in understanding biological clock function in several model organisms and translating these studies into broader implications for human health.

The Merlin Laboratory in the Texas A&M Department of Biology has found genetic evidence linking circadian clock genes and clock-regulated molecular pathways to the Monarch butterfly's uncanny ability to sense the changes in day length, or photoperiod -- an environmental cue that signals them to migrate and triggers the reproductive dormancy they exhibit in the process. Their work establishes a clear connection between clock genes and the vitamin A pathway within the brain of this iconic insect.

The Merlin Lab's study, published November 25 in the Proceedings of the National Academy of Sciences, not only provides genetic proof of the photoperiod-clock connection but also demonstrates for the first time that the clock regulates a critical vitamin A pathway necessary for seasonal responses.

"Nearly all organisms adapt to the seasons by adjusting their physiology and behavior to changes in day length, or photoperiod," says Texas A&M biologist and 2017 Klingenstein-Simons Fellow Christine Merlin.

"Despite decades of research, the molecular and genetic mechanisms by which changes in photoperiod are sensed and translated into seasonal changes in animal physiology and behavior have remained poorly understood. While much remains to be learned, our findings pave the way for understanding the mechanisms by which vitamin A operates in the brain to translate day length encoding into seasonal physiological and behavioral responses in animals.

"Given that seasonal changes associated with this pathway have also been reported in the mammalian brain, it is tantalizing to speculate that the function of vitamin A in animal photoperiodism may be evolutionarily conserved. If this turns out to be the case, our work in the Monarch could have implications for better understanding seasonal changes in the human brain that could lead to ailments such as seasonal depression."

For the past six years, Merlin's lab within the Texas A&M Center for Biological Clocks Research has been using the majestic Monarch as a model to study animal migration, the role of circadian clocks in regulating daily and seasonal animal physiology and behavior, and the evolution of the animal clockwork. Aided by CRISPR/Cas9 technology, her group already has succeeded in altering key biological clock-related genes in the Monarch in order to study their impact on daily circadian rhythms and seasonal migratory responses.

"Despite significant advances our lab has made in developing genetic tools to knock out virtually any gene in the Monarch genome, which has been key in this study to demonstrate the central importance of the vitamin A pathway in photoperiodic responses, the genetic toolbox in the Monarch is still far from rivaling those available in more conventional genetically tractable model organisms, such as Drosophila and the mouse," Merlin said.

One complication the Merlin lab had to overcome in the study is that vitamin A is necessary for visual function of the Monarch's compound eyes, meaning that their ninaB1 full-body knockouts would be rendered blind. As a fail-safe, Merlin's team had to find a non-genetic way to rule out loss of compound-eye function as the explanation for the lack of photoperiodic responses observed in these new mutant butterflies.

"We had to be creative, so we turned to arts and crafts experiments," Merlin said. "By painting the compound eyes of wild-type adult butterflies with black paint, we demonstrated that visual function was not necessary for photoperiodic responses, thereby supporting the idea that the vitamin A function in the brain and not the eyes is responsible for photoperiodic sensing and responses."

Merlin says the study raises interesting questions regarding the pathway's possible involvement in any number of intriguing scenarios, including the production of a deep-brain photoreceptor for photoperiodic sensing, the seasonal regulation of a retinoic acid-mediated transcriptional program, and/or the seasonal plasticity of the clock neuronal circuitry in the brain.

"Teasing these possibilities apart through the continued molecular and genetic dissection of this pathway in the Monarch will be necessary to increase our understanding of the mechanisms of action of vitamin A in photoperiodic responsiveness in the Monarch and animals in general," Merlin added.

Merlin credits 2015 Texas A&M biology graduate Samantha Iiams, currently a Ph.D. candidate in the Interdisciplinary Graduate Program in Genetics, for much of her lab's progress in this line of investigation. In addition to serving as first author for the team's PNAS paper, Iiams has received an impressive number of awards for her work forming the basis of this study -- most notably, the International Society for Research on Biological Rhythms' 2018 Patricia DeCoursey Excellence Award as well as several first-place poster prizes.

Credit: 
Texas A&M University

Hire more LGBTQ and disabled astronomers or risk falling behind, review finds

image: Professor Lisa Kewley, director of the ARC Centre of Excellence in All Sky Astrophysics (ASTRO 3D), in front of Australia's Reynolds telescope, which dates from 1927.

Image: 
ASTRO 3D

Ensuring research opportunities for indigenous, disabled and LGBTQ astronomers is essential if Australian research is to succeed in the new era of "mega-telescopes", a major analysis has found.

In a paper published in the journal Nature Astronomy, Professor Lisa Kewley, director of the ARC Centre of Excellence in All Sky Astrophysics (ASTRO 3D), finds that encouraging astronomers from marginalised communities will increase the chances of significant research discoveries.

"Studies show that increased diversity up to the highest levels of organisations, together with effective diversity management, leads to organisations outperforming their competition in innovation, productivity and profit because more ideas are produced," she says.

"These might be ideas for new experiments, products, or new ways to become more efficient or profitable."

Fresh approaches will be crucial if Australia is to fully exploit the potential of powerful new facilities soon to start operating. These include the Square Kilometre Array in Australia and South Africa, and the Giant Magellan and Extremely Large Telescopes, both in Chile.

Professor Kewley finds that recent programs to improve gender equality have met with success.

There are about 500 working astronomers in Australia. Women make up 27% of those with PhD degrees and 37% of those studying for a PhD or lesser degree. A decade-long plan for the field, published in 2016 by the Australian Academy of Science, recommends that women fill 33% of positions at all levels by 2025.

Although that target is still some way off, Professor Kewley describes progress as "striking".

"There has been a dramatic change in the culture of Australian astronomy," she says. "Diversity and inclusion across the sector are improving."

She cites an initiative called the Pleiades Awards, run by the Astronomical Society of Australia (ASA), as particularly important. Put into place in 2014, the scheme accords bronze, silver or gold ratings to institutions based on the participation of women at all levels. The idea is to move up through the scale.

"The broad uptake of Pleiades Awards is remarkable," she notes. "Institutions are not required to do so, and there is no financial incentive for receiving one."

Professor Kewley finds that many astronomy departments are moving beyond a focus on female participation to include active recruitment of indigenous, LGBTQ, disabled, and chronically ill scientists.

"There is a statistically significant correlation between greater levels of diversity in company leadership and a greater likelihood of outperforming the relevant industry peer group on key measures such as profit," she says.

"It is reasonable to infer that greater levels of diversity in astronomy organisations will also produce a greater likelihood of outperforming competition in astronomy key performance measures in major discoveries and advances."

Professor Cathryn Trott, a senior researcher at the International Centre for Radio Astronomy Research (ICRAR) based at Curtin University in Western Australia and current head of the ASA, agrees that the discipline is making great strides.

"Australia is a world-leader in tracking, promoting and rewarding progress in gender equity in astronomy, due to a suite of initiatives championed and celebrated across the community," she says.

She adds that the Pleiades Awards have been an effective catalyst for change.

"The experience in astronomy over the past five years has shown that these schemes can take the gender equity discussion from hidden in the academic corridors to openly discussed, debated and encouraged," she says.

Credit: 
ARC Centre of Excellence for All Sky Astrophysics in 3D (ASTRO 3D)

New tool for rapidly analyzing CRISPR edits reveals frequent production of unintended edits

Wilmington, DE, Dec. 6, 2019 - Amidst rising hopes for using CRISPR gene editing tools to repair deadly mutations linked to conditions like cystic fibrosis and sickle cell disease, a new study in the Nature journal Communications Biology describes an innovation that could accelerate this work by rapidly revealing unintended and potentially harmful changes introduced by the gene editing process.

"We've developed a new process for rapidly screening all of the edits made by CRISPR, and it shows there may be many more unintended changes to DNA around the site of a CRISPR repair than previously thought," said Eric Kmiec, Ph.D., director of ChristianaCare's Gene Editing Institute and the principal author of the study.

The study describes a new tool developed at the Gene Editing Institute that in just 48 hours can identify "multiple outcomes of CRISPR-directed gene editing," a process that typically required up to two months of costly and complicated DNA analysis.

Kmiec cautioned that the unintended changes revealed by their work involve "subtle mutations" to DNA around the immediate site of the genome targeted for repair. That's very different, he said, from the hotly debated concern about the risk of CRISPR causing "off-target" mutations by drifting far afield from the intended site and making random cuts across the genome.

"It's important to note that in all instances we were still seeing CRISPR achieve a fantastic level of successful repairs that would have been unimaginable even five years ago," added lead author Brett Sansbury. "But we saw a lot of other changes to DNA near the site of the repair that need to be better understood so that when we correct one problem, we're not creating another."

She said those changes included deletions, duplications and rearrangements of DNA code. And while the researchers believe the vast majority of these unintended edits may have no consequence for patients, it's important to identify them and determine which ones might pose a risk. For example, Kmiec said unintended changes to DNA code--which acts like software for determining how genes function--could instruct a gene to produce a harmful protein.

According to the study, "such information forms the basis for determining risk-benefit decisions surrounding the effectiveness of genetic engineering tools to treat human disease."

"CRISPR will probably never be perfect 100 percent of the time," Kmiec said. "But CRISPR tools are constantly improving. And if we can achieve a 70 or 80 percent rate of precision--and reveal and understand the importance of any changes that occur alongside that repair--that brings us much closer to safely using CRISPR to treat patients. We hope our new tool can help accelerate efforts to achieve that goal."

CRISPR stands for "clustered regularly interspaced short palindromic repeats." It is a defense mechanism found in bacteria that can recognize and slice up the DNA of invading viruses.

Scientists have learned how to modify this mechanism so it can be directed to "edit" specific sequences of DNA code, with a focus on repairing DNA mutations that cause deadly diseases. For example, there is work under way to use CRISPR to repair the genetic mutation that produces the abnormal red blood cells in patients with sickle cell disease and the mutation that causes the damaging buildup of mucus in patients with cystic fibrosis.

But Kmiec noted that most tools for analyzing CRISPR gene edits are best suited for verifying that the repair was successful, not for revealing alterations that may occur to nearby strands of DNA. He said going further and screening for these unintended edits has required extracting and analyzing an enormous amount of DNA code from a cell, a sort of needle-in-a-haystack process that can take up to two months. And even then, it might not capture all the changes. The study warns that as a result, when researchers report success in using CRISPR to repair malfunctioning genes, they "may inadvertently underreport the collateral activity of this remarkable technology."

Scientists at the Gene Editing Institute found a way around this problem by working with a system they have developed that performs gene edits on circular segments of DNA extracted from the cell, which are known as plasmids. The researchers found that working in a plasmid or "cell-free" system eliminated a lot of the complex biological activity within a cell that makes it hard to isolate the full array of DNA changes introduced by CRISPR.

In the study, they report that their system allowed them to "visualize the wide array of genetic modifications created through the process of CRISPR-directed gene editing in a straight-forward and simple fashion." Also, because the tool can screen outcomes of an edit quickly and affordably, the researchers note that it frees scientists to execute and screen multiple trial edits--many more than is practical with a cell-based system. And that would allow them to identify unintended mutations that may occur at relatively rare frequencies and thus would otherwise go unnoticed.

It also allows them to test different variations of CRISPR. When scientists use CRISPR tools to edit a gene, they can employ different enzymes (such as Cas9 or Cas12a) to do the actual cutting and often include something called a DNA "template" to act as a map to identify and repair damaged code. The new study found that the rate of "precise repair"--repairs that are accomplished without introducing unintended mutations--varied considerably depending on the enzyme and template employed, ranging from a low of 5 percent to a high of 64 percent.
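As a rough illustration of the arithmetic behind such a "precise repair" rate, the sketch below tallies edit outcomes for one hypothetical enzyme/template combination. The outcome categories and counts are invented for the example, not taken from the study's data.

```python
# Illustrative only: computing a "precise repair" rate from hypothetical
# sequencing outcome counts for one enzyme/template combination.

def precise_repair_rate(outcomes):
    """Fraction of recorded edit outcomes that are the intended repair."""
    total = sum(outcomes.values())
    if total == 0:
        raise ValueError("no edit outcomes recorded")
    return outcomes.get("precise_repair", 0) / total

# Hypothetical tallies; the categories mirror the article's list of
# unintended changes (deletions, duplications, rearrangements).
cas9_with_template = {
    "precise_repair": 64,
    "deletion": 18,
    "duplication": 9,
    "rearrangement": 9,
}

rate = precise_repair_rate(cas9_with_template)
print(f"precise repair rate: {rate:.0%}")  # prints "precise repair rate: 64%"
```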

The work to develop a better way to screen for CRISPR-induced mutations is part of a broader effort at the Gene Editing Institute that is producing groundbreaking advances in editing DNA plasmids extracted from human cells. The team already has used their "cell-free" approach to engineer multiple edits simultaneously. This work has led to a collaboration with a biotech firm to develop new approaches to personalized cancer care. A tool developed by the team at the Gene Editing Institute can rapidly reproduce, in a human DNA sample, the unique and complex genetic features of an individual patient's cancer tumor. And those samples can be used to screen multiple chemotherapies and other cancer drugs to design a treatment best suited for the individual patient.

Credit: 
Burness

Acupuncture reduces radiation-induced dry mouth for cancer patients

HOUSTON -- After receiving acupuncture treatment three days a week during the course of radiation treatment, head and neck cancer patients experienced less dry mouth, according to study results from researchers at The University of Texas MD Anderson Cancer Center.

The trial, with results published in JAMA Network Open, is the first randomized, placebo-controlled, Phase III trial to evaluate the use of acupuncture during radiation therapy to reduce the incidence and severity of radiation-induced xerostomia, or dry mouth. Acupuncture treatment has very few side effects and is relatively low cost compared with standard treatments such as medication and saliva substitutes. These results support a 2011 study that found symptoms improved up to six months after radiation treatment with concurrent acupuncture sessions.

"Dry mouth is a serious concern for head and neck cancer patients undergoing radiation therapy. The condition can affect up to 80% of patients by the end of radiation treatment," said the study's principal investigator, Lorenzo Cohen, Ph.D., professor of Palliative, Rehabilitation, and Integrative Medicine and director of the Integrative Medicine Program. "The symptoms severely impact quality of life and oral health, and current treatments have limited benefits."

Patients who received true acupuncture (TA) had significantly lower xerostomia scores than those given standard care control (SCC), and marginally lower scores than those given sham acupuncture (SA), with no differences between SA and SCC. One year after the end of radiation therapy, the incidence of clinically significant xerostomia was 35% in the TA group, 48% in the SA group and 55% in the SCC group.

The study included 339 head and neck cancer patients undergoing radiation treatment at MD Anderson and Fudan University Cancer Center in Shanghai between December 16, 2011 and July 7, 2015. The patients were divided into three groups. One group received true acupuncture, another group received sham acupuncture and the third group, the SCC group, received radiation and oral health education but no acupuncture. None had received acupuncture prior to participating in the study.

Patients assigned to either TA or SA received acupuncture three days a week on the same day as their radiation treatment, which lasted six to seven weeks. The sham procedure involved a real needle at a point not indicated for xerostomia, real needles at sham points and placebo needles at sham points.

Results were based on data derived from a self-report questionnaire. Patients completed the Xerostomia Questionnaire (XQ), an eight-item survey assessing symptoms of the condition. XQ scores under 30 corresponded to mild or no symptoms of xerostomia. The data was collected at baseline, at the end of radiotherapy, and three, six and 12 months after radiation treatment.
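A minimal sketch of how such threshold-based scoring might work is below. The assumption (not stated in the article) is that each of the eight XQ items is rated 0-10 and the summed total is rescaled to a 0-100 range; only the under-30 "mild or no symptoms" cutoff comes from the text.

```python
# Hypothetical XQ-style scoring: 8 items, each assumed rated 0-10,
# rescaled to 0-100. Only the <30 threshold is taken from the article.

def xq_score(item_ratings):
    """Rescale the summed item ratings to a 0-100 range."""
    if len(item_ratings) != 8:
        raise ValueError("the XQ has exactly 8 items")
    if any(not 0 <= r <= 10 for r in item_ratings):
        raise ValueError("each item is rated 0-10 under this sketch")
    return sum(item_ratings) / 80 * 100

def clinically_significant(score):
    # Scores under 30 correspond to mild or no symptoms (per the article).
    return score >= 30

ratings = [3, 2, 4, 1, 2, 3, 2, 3]  # sums to 20
score = xq_score(ratings)           # 20 / 80 * 100 = 25.0
print(score, clinically_significant(score))  # prints "25.0 False"
```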

TA resulted in significantly fewer and less severe dry mouth symptoms one year after treatment. The xerostomia score in the TA group was 26.6 vs 31.3 in the SA group and 34.8 in the SCC group.

The Acupuncture Expectancy Scale (AES) was used to measure the relationship between expectations related to acupuncture and clinical response. Patients completed the AES at baseline, after four acupuncture sessions and at the end of acupuncture treatment. There were no group differences or differences between sites.

"The evidence is to a point where patients should incorporate acupuncture alongside radiation treatment as a way to prevent the severity of dry mouth symptoms," said Cohen. "I think with this study we can add acupuncture to the list for the prevention and treatment of xerostomia, and the guidelines for the use of acupuncture in the oncology setting should be revised to include this important chronic condition."

A secondary analysis showed a significant difference between sites in response to placebo. The Chinese patients had little to no placebo response to sham acupuncture whereas the MD Anderson patients had a large placebo response, showing both forms of acupuncture worked. More studies are needed to understand these site differences, but it has been suggested that it could be due to the environment in which the acupuncture is delivered, cultural influences or the relationship between patient and practitioner.

Future studies will focus on ensuring acupuncture delivery is well controlled and will evaluate inconsistencies in response to sham acupuncture. Additional studies are needed to confirm the trial results and better understand the neurological mechanisms of acupuncture.

Credit: 
University of Texas M. D. Anderson Cancer Center

Empowering mucosal healing with an engineered probiotic

image: Inflammatory lesions destroy epithelial cells that function as a barrier between the inside of the gut (lumen) and the rest of the body (left). This loss of barrier function leads to a feedback cycle of worsening inflammation fueled by bacteria and other particles crossing the barrier. PATCH is a bioactive material synthesized by engineered probiotic bacteria that helps to maintain gut barrier function even in the presence of inflammatory insults, thereby helping to keep bacteria and other particulates in the lumen and ameliorating the symptoms of inflammation (right).

Image: 
Wyss Institute at Harvard University

(BOSTON) -- About 1.6 million people in the US alone currently have lifelong and incurable inflammatory bowel disease (IBD), including Crohn's disease and ulcerative colitis, and 70,000 new cases are diagnosed in the US each year. IBD patients suffer from pain, extreme discomfort, and many other symptoms caused by continuously relapsing and remitting inflammatory lesions in the layer of cells that lines the intestinal lumen (mucosa). The exact causes of IBD are still poorly understood, but it is clear that a misdirected immune system is at work, and that certain components of the microbial community in our gut, known as the intestinal microbiome, together with environmental factors, contribute to its destructive forces.

While anti-inflammatory drugs can dampen acute inflammation and antibiotics can fight local infections when IBD episodes flare up, their use also comes at a cost. Anti-inflammatory drugs can have severe side effects and antibiotics can disrupt the beneficial parts of the microbiome on which we depend for many of our body's functions. Importantly, there are no wound treatments available that could be applied to inflamed lesions directly from inside the gut lumen in order to speed up the healing process and minimize the use of those drugs.

Now, a research team at Harvard's Wyss Institute for Biologically Inspired Engineering led by Neel Joshi, Ph.D., has developed a living material approach that uses a strain of genetically engineered E. coli Nissle gut bacteria as a locally acting probiotic. The engineered bacteria produce a network of nanofibers that directly binds to mucus to fill inflamed areas like a patch, shielding them from gut microbes and environmental factors. This probiotic-based therapeutic strategy protected mice against the effects of colitis induced by a chemical agent and promoted mucosal healing. Their findings are reported in Nature Communications.

"With this 'living therapeutics' approach, we created multivalent biomaterials that are secreted by resident engineered bacteria on-site and attach to many mucus proteins at a time - firmly adhering to the viscous and otherwise moving mucus layer, which is a challenging thing to do," said Joshi. "The 'Probiotic Associated Therapeutic Curli Hybrids' (PATCH) approach, as we named it, creates a biocompatible, mucoadhesive coating that functions as a stable, self-regenerating BAND-AID® and provides biological cues for mucosal healing." Joshi presently is a Core Faculty member of the Wyss Institute and Associate Professor at Harvard's Paulson School of Engineering and Applied Sciences (SEAS), and will shortly be appointed as a Professor at Northeastern University in Boston.

In previous work, Joshi's group demonstrated that self-regenerating bacterial hydrogels firmly attached to mucosal surfaces ex vivo and, when given orally to mice, withstood the harsh pH and digestive conditions of the stomach and small intestine without affecting the health of the animals. To fabricate them, his team programmed a laboratory E. coli strain to synthesize and secrete a modified CsgA protein, which as part of E. coli's "curli" system assembles into long nanofibers at the outer surface of the bacteria. "To enable mucus adhesion, we fused CsgA to the mucus-binding domain of different human trefoil factors (TFFs), proteins that occur naturally in the intestinal mucosa and bind to mucins, the major mucus proteins present there. The secreted fusion proteins form a water-storing mesh with tunable hydrogel properties," said co-author Anna Duraj-Thatte, Ph.D., a postdoctoral fellow working with Joshi. "This turned out to be a simple and robust strategy to produce self-renewing, mucoadhesive materials with long residence times in the mouse intestinal tract."

In their new study, the team further built on these findings by introducing the machinery for producing one of the mucoadhesive hydrogels based on TFF3 into an E. coli Nissle strain that is a normal gut bacterium which can thrive in the colon and cecum sections of the intestinal tract affected by IBD, and is currently sold in many commercial probiotic formulations. "We found that the newly engineered Nissle bacteria, when given orally, also populated and resided in the intestinal tract, and that their curli fibers integrated with the intestinal mucus layer," said first-author Pichet Praveschotinunt, who is a graduate student mentored by Joshi.

"When we induced colitis in the colons of mice by orally administering the chemical dextran sodium sulfate, animals that had received the PATCH-generating E. coli Nissle strain by daily rectal administration starting three days prior to chemical treatment had significantly faster healing and lower inflammatory responses, which caused them to lose much less weight and recover faster compared to control animals," said Praveschotinunt. "Their colon epithelial mucosa displayed a more normal morphology and lower numbers of infiltrating immune cells."

Joshi and his team think that their approach could be developed as a companion therapy to existing anti-inflammatory, immuno-suppressant, and antibiotic therapies to help minimize patients' exposure to the drugs and potentially provide protection against IBD relapses.

"This powerful and simple approach could potentially impact the lives of thousands of patients with IBD for whom there is no disease-specific cure available. It also is a testament to the creativity and vision of the Wyss Institute's "Living Cellular Devices" initiative that engineers living cells to perform key therapeutic and diagnostic tasks in our bodies," said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS, the Vascular Biology Program at Boston Children's Hospital, and Professor of Bioengineering at SEAS.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Dendrites filtering neuron's excitement

image: Purkinje cell dendrites were measured using the patch-clamp method. Data showed that distal dendrites can filter out incoming signals to the soma.

Image: 
Kyoto University/Mindy Takamiya

Kyoto, Japan -- In mere milliseconds, trillions of chemical reactions ignite signals that travel across the billions of neurons in our brain. As we go through our daily lives and absorb new knowledge, these neurons begin to modify themselves and change their signaling properties.

However, the mechanisms by which signals are integrated into neurons to establish such flexibility, also known as plasticity, remain elusive.

Publishing in the Journal of Neuroscience, Gen Ohtsuki of Kyoto University's Hakubi Center reports that Purkinje cells -- the primary output neurons in the cerebellum -- have the ability to modulate and filter incoming signals. The findings bring new insight into the learning mechanisms of the cerebellum and the brain.

The cerebellum is a structure located at the base of the brain, and is known to play a vital role in motor control and cognitive function. Recent findings have even revealed its contributions to mental illnesses. One of the most distinctive features of Purkinje cells is their long, complex branches called dendrites.

It is thought that the plasticity of these Purkinje-cell dendrites is the basis for cerebellar learning. However, validation of this hypothesis was difficult due to the challenge of measuring signals within a single cell.

Fortunately, in a prior study, Ohtsuki succeeded in measuring the electrical activity on the dendrites of a single Purkinje cell using the patch-clamp method.

"To measure how electrical signals travel through the Purkinje cell membrane, I applied this method using rats and measured the spontaneous synaptic activity between the dendrite and the 'soma' or cell body," explains Ohtsuki.

What he found was that signals coming from dendrites far from the soma, known as distal dendrites, were not registered at the soma. This suggests the dendrites have a mechanism that limits electroconduction, and that individual branches can choose whether an input passes through or not. In fact, the same signals were registered when they came from proximal dendrites -- the ones closer to the soma.

After further analysis, it was found that these distal dendrites modulated their incoming signals through intrinsic plasticity associated with the down-regulation of a type of ion channel called the SK channel.

"One of the reasons this phenomenon had gone unnoticed is that similar experiments used cesium ions in the intracellular fluid, so the phenomenon itself could not be observed at all," states Ohtsuki. "The results reveal a new learning mechanism at the dendritic level."

He hopes to further verify these results and determine whether similar findings can be obtained with animals other than rodents, such as fish and reptiles, or higher mammals.

Ohtsuki concludes, "Studying these fundamental processes should help us understand the reasons for the mechanism of intelligence."

Credit: 
Kyoto University

Genetic typing of a bacterium with biotechnological potential

image: Tree of P. putida sequence types (STs). Squares indicate environmental isolates. Dashed lines indicate clonal complexes (relatively close STs).

Image: 
Kanazawa University

Pseudomonas putida is a bacterium occurring in soil, aquatic environments and plants. Although the virulence of P. putida -- the ability of the bacterium to infect its host and inflict disease -- is considered to be low, infection in severely ill patients can be lethal. P. putida strains (also called isolates) have been found in hospitals, e.g. in urine, blood or wound discharge from patients, and such clinical isolates have been found to display resistance to drugs. Now, Kohei Ogura from Kanazawa University and colleagues have performed gene sequencing for various P. putida isolates originating from both environmental and clinical sites.

Genetic typing of different P. putida strains makes it possible to determine which are the more virulent ones. This is important because P. putida has high biotechnological value. Indeed, P. putida is an ideal microbiological platform for 'metabolic engineering', in which selected biochemical processes within the cells of an organism are stimulated so that the cells produce more of a particular substance. (Examples of metabolic engineering include the industrial production of beer, wine and cheese.)

The researchers applied a technique known as multilocus sequence typing (MLST), a method used in molecular biology for the genetic typing of more than one locus -- a locus refers to the position on a chromosome where a specific gene is located.

The MLST technique is based on obtaining DNA sequences of several so-called 'housekeeping genes': genes that are needed for the maintenance of the basic functioning of a cell. In order to arrive at a valid MLST scheme, typically 100 isolates are required. Ogura and colleagues used 106 isolates, with 16 having an environmental origin and 90 coming from clinical sites. For the MLST scheme, the scientists used 8 housekeeping genes.
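MLST assigns a sequence type (ST) based on the combination of allele numbers an isolate carries across its typed loci: isolates with identical allele profiles share an ST. The sketch below shows that bookkeeping in simplified form; the isolate names and allele numbers are hypothetical, not part of the study's actual scheme.

```python
# Simplified ST assignment: each unique allele-number profile across the
# typed loci (8 housekeeping genes in the study) defines one sequence type.

def assign_sequence_types(profiles):
    """Map each isolate to an ST; identical allele profiles share an ST."""
    st_of_profile = {}
    assignments = {}
    next_st = 1
    for isolate, alleles in profiles.items():
        key = tuple(alleles)
        if key not in st_of_profile:
            st_of_profile[key] = next_st
            next_st += 1
        assignments[isolate] = st_of_profile[key]
    return assignments

# Hypothetical allele numbers at 8 loci for three isolates; clinical_2
# shares clinical_1's profile and therefore its ST.
profiles = {
    "clinical_1":      [1, 3, 2, 1, 4, 1, 2, 1],
    "clinical_2":      [1, 3, 2, 1, 4, 1, 2, 1],
    "environmental_1": [2, 3, 2, 1, 4, 1, 2, 5],
}
print(assign_sequence_types(profiles))
# prints {'clinical_1': 1, 'clinical_2': 1, 'environmental_1': 2}
```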

The scientists not only obtained the first MLST scheme for P. putida, they also were able to deduce that the studied bacterium isolates are clonal, meaning that they share common ancestry. At the same time, the researchers found that "our MLST scheme reflects the genetic diversity of P. putida group isolated from both clinical and environmental sites".

Credit: 
Kanazawa University

How saving the ozone layer in 1987 slowed global warming

image: This is a NASA image showing the ozone hole at its maximum extent for 2015.

Image: 
NASA Goddard Space Flight Center

The Montreal Protocol, an international agreement signed in 1987 to stop chlorofluorocarbons (CFCs) from destroying the ozone layer, now appears to be the first international treaty to successfully slow the rate of global warming.

New research published today in Environmental Research Letters has revealed that, thanks to the Protocol, today's global temperatures are considerably lower than they would otherwise have been. And by mid-century the Earth will be - on average - at least 1°C cooler than it would have been without the agreement. Mitigation is even greater in regions such as the Arctic, where the avoided warming will be as much as 3°C - 4°C.

"By mass, CFCs are thousands of times more potent greenhouse gases than CO2, so the Montreal Protocol not only saved the ozone layer but also mitigated a substantial fraction of global warming," said lead author of the paper, Rishav Goyal.

"Remarkably, the Protocol has had a far greater impact on global warming than the Kyoto Agreement, which was specifically designed to reduce greenhouse gases. Action taken as part of the Kyoto Agreement will only reduce temperatures by 0.12°C by the middle of the century - compared to a full 1°C of mitigation from the Montreal Protocol."

The findings were made inadvertently when the team set out to quantify how the Montreal Protocol had affected atmospheric circulation around Antarctica. To get their results, the researchers modelled global climate under two scenarios of atmospheric chemistry - one with, and one without the Montreal Protocol being enacted. They then extended these simulations into the future using conservative estimates for unmitigated CFC emissions - set to 3% growth per annum, much less than the observed CFC growth rates at the time of establishment of the Montreal Protocol. Their results therefore likely underestimate the actual impact of the international treaty to reduce CFCs.
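The unmitigated scenario's 3% per annum emissions growth compounds quickly, which helps explain why the avoided warming is so large. A back-of-the-envelope sketch, with an arbitrary baseline of 1.0 in 1987 (the paper's actual modelled emissions are not reproduced here):

```python
# Compounding growth in the hypothetical "no Montreal Protocol" scenario:
# CFC emissions growing at 3% per year from an arbitrary 1987 baseline.

def cfc_emissions(year, base_year=1987, base_emissions=1.0, growth=0.03):
    """Emissions compounding at `growth` per year from the base year."""
    return base_emissions * (1 + growth) ** (year - base_year)

# By mid-century, 3% annual growth alone multiplies emissions roughly 6-fold
# relative to the 1987 baseline.
for year in (1987, 2020, 2050):
    print(year, round(cfc_emissions(year), 2))
```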

The success of the Montreal Protocol in mitigating climate change is even more striking when focusing on regional domains. For example, warming of between 0.5°C - 1°C has already been avoided over North America, Africa and Eurasia. By midcentury avoided warming in some of these areas will be 1.5°C - 2°C and over the Arctic avoided warming will be as much as 3°C - 4°C.

The researchers also found that the Protocol avoided a substantial amount of ice melt, with the extent of sea ice around the Arctic during summer around 25% greater today than it would have been without any reduction in CFC emissions. The avoided warming over Greenland also suggests that the observed accelerating ice sheet melt there, and the associated sea level rise, have been reduced by the Protocol.

"Without any fanfare the Montreal Protocol has been mitigating global warming impacts for more than three decades, surpassing some treaties that were specifically aimed to ameliorate climate change impacts," said co-author Dr Martin Jucker.

Looking ahead, co-author Prof Matthew England said, "The success of the Montreal Protocol demonstrates superbly that international treaties to limit greenhouse gas emissions really do work; they can impact our climate in very favourable ways, and they can help us avoid dangerous levels of climate change.

"Montreal sorted out CFCs; the next big target has to be zeroing out our emissions of carbon dioxide."

Credit: 
University of New South Wales