Behavioral disorders in kids with autism linked to reduced brain connectivity

More than a quarter of children with autism spectrum disorder are also diagnosed with disruptive behavior disorders. For the first time, Yale researchers have identified a possible biological cause: a key mechanism that regulates emotion works differently in the brains of children who exhibit disruptive behavior.

The study appears in Biological Psychiatry: Cognitive Neuroscience and Neuroimaging.

"Disruptive behaviors such as aggression, irritability, and noncompliance are common in children with autism, and are among the main reasons for psychiatric treatment and even hospitalization," said Denis Sukhodolsky, senior author and associate professor in the Yale Child Study Center. "Yet, little is known about the biological underpinnings of behavioral problems in children with autism."

The first of its kind, the Yale study used fMRI scans conducted during an emotion perception task to compare the brain activity of autistic children who do and do not exhibit disruptive behavior. While in the scanner, the children were asked to view pictures of human faces that displayed calm or fearful expressions.

During the task, the researchers found reduced connectivity between the amygdala and ventrolateral prefrontal cortex -- a pathway critical to the regulation of emotion -- in the brains of children who exhibit disruptive behavior as compared to the brains of children who do not. "Reduced amygdala-ventrolateral prefrontal cortex functional connectivity was uniquely associated with disruptive behavior but not with severity of social deficits or anxiety, suggesting a distinct brain network that could be separate from core autism symptoms," explained Karim Ibrahim, first author and postdoctoral fellow in the Sukhodolsky lab.
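
Functional connectivity of this kind is typically quantified as the correlation between the fMRI time series of two brain regions. The short sketch below is a generic illustration, not the study's actual analysis pipeline; the region names and simulated data are hypothetical.

    import numpy as np

    def functional_connectivity(roi_a, roi_b):
        """Pearson correlation between two regional fMRI (BOLD) time series."""
        return np.corrcoef(roi_a, roi_b)[0, 1]

    # Illustrative use with simulated time series (200 scan volumes)
    rng = np.random.default_rng(0)
    amygdala = rng.standard_normal(200)                    # hypothetical amygdala signal
    vlpfc = 0.4 * amygdala + rng.standard_normal(200)      # partially coupled prefrontal signal
    print(functional_connectivity(amygdala, vlpfc))        # larger value = stronger coupling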

"This finding points to a brain mechanism of emotion dysregulation in children with autism and offers a potential biomarker for developing targeted treatments for irritability and aggression in autism," said Sukhodolsky.

Credit: 
Yale University

Brain wiring differences identified in children with conduct disorder

Behavioural problems in young people with severe antisocial behaviour - known as conduct disorder - could be caused by differences in the wiring that links the brain's emotional centres together, according to new research led by the University of Birmingham.

Conduct disorder affects around 1 in 20 children and teenagers and is one of the most common reasons for referral to child and adolescent mental health services. It is characterised by a wide range of antisocial or aggressive behaviours such as vandalism, weapon use and harm to others. It is often also associated with other disorders such as attention-deficit/hyperactivity disorder (ADHD), anxiety, or depression.

The exact causes of conduct disorder - thought to be an interaction between genetic and environmental factors - are not well understood, but scientists in the University's Centre for Human Brain Health and the Institute for Mental Health have found that there are distinctive differences in white matter pathways (the brain's structural wiring) among young people who have the condition.

The researchers investigated differences in the brain's structure between children with conduct disorder and a comparison group of typically-developing children without severe antisocial behaviour. The study included nearly 300 children aged between 9 and 18, with equal numbers of boys and girls.

Each volunteer underwent a brain scan using a magnetic resonance imaging (MRI) scanning technique called diffusion-tensor imaging to examine differences in white matter fibre tracts - which carry signals between different areas of the brain.
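
Diffusion-tensor imaging characterizes white matter microstructure with voxel-wise metrics derived from the diffusion tensor's eigenvalues, most commonly fractional anisotropy (FA). The snippet below shows the standard FA formula purely as an illustration; it is not necessarily the specific measure analyzed in this study.

    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        """Standard FA formula from the three diffusion-tensor eigenvalues."""
        lams = np.array([l1, l2, l3], dtype=float)
        md = lams.mean()                            # mean diffusivity
        num = np.sqrt(((lams - md) ** 2).sum())
        den = np.sqrt((lams ** 2).sum())
        return np.sqrt(1.5) * num / den             # 0 = isotropic, 1 = fully directional

    # Example: a fibre-like voxel (diffusion mostly along one axis) vs. a nearly isotropic one
    print(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3))   # high FA
    print(fractional_anisotropy(0.8e-3, 0.75e-3, 0.7e-3))  # low FA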

One of the largest differences identified by the team was in an area of the brain called the corpus callosum, the largest white matter fibre tract in the brain and a major pathway which connects the two hemispheres of the brain together. The MRI results suggested there was less branching along these fibres, so the connections between the left and right sides of the brain were less efficient in young people with conduct disorder as compared to the comparison group. Interestingly, the researchers found that boys and girls with conduct disorder showed the same structural abnormalities within this pathway in the brain.

The researchers also investigated whether certain antisocial behaviours, such as aggression, or personality traits, such as reduced empathy or guilt, were linked to the observed changes in brain structure. They found that the differences in the corpus callosum were linked to callous behaviour, including deficits in empathy and a disregard for other people's feelings.

Increasing our understanding of how the brain is wired differently in young people with conduct disorder is an important area of research because it may help clinicians to diagnose the condition more accurately and guide the development of effective interventions in the future.

"The differences that we see in the brains of young people with conduct disorder are unique in so much as they are different from the white matter changes that have been reported in other childhood conditions such as autism or ADHD," says Dr Jack Rogers, co-lead author on the study.

"Additionally we found that callous traits, such as reduced empathy and guilt, explained some of the white matter differences seen in youths with conduct disorder suggesting that these traits are important factors to consider when exploring differences in the brains of young people with conduct disorder".

Dr Stephane De Brito, also co-lead author, adds: "It can be really difficult to get a diagnosis for children with conduct disorder - partly because it is often obscured by other conditions, but also because it is frequently not seen as a genuine disorder. Increasing our understanding of what these structural differences look like in the brain might lead to more accurate diagnosis in the future, but also will help us develop and test interventions that can help children at a critical period of brain development."

Dr Graeme Fairchild, a Reader in the Department of Psychology at the University of Bath and a collaborator on the project, said: "This is the first large-scale study looking at white-matter pathways in the brains of girls and boys with conduct disorder. The results demonstrate that there are reliable differences in the connectivity of these pathways, and that these differ from those seen in other mental health conditions such as depression. It will be important to study whether these white matter changes cause conduct disorder by studying how the brain develops over time, and also whether these brain changes can be modified by psychological interventions."

Credit: 
University of Birmingham

BRB-seq: The quick and cheaper future of RNA sequencing

image: This is an illustration of the BRB-seq method.

Image: 
B. Deplancke/EPFL

RNA sequencing is a technique used to analyze gene expression across entire genomes. Today, such genome-wide expression analyses are a standard tool for genomic studies because they rely on high-throughput technologies, which themselves have become widely available.

Nonetheless, RNA sequencing is still expensive and time-consuming, because it first requires the costly preparation of an entire genomic library - the DNA pool generated from the RNA of cells - while the resulting data are also difficult to analyze. All this makes RNA sequencing cumbersome to run, and its adoption is not as widespread as it could be.

Some new approaches have come in to help, propelled by the revolution in single-cell transcriptomics, which uses what is known as "sample barcoding" or "multiplexing". Here, individual "barcode" sequences are added to each DNA fragment during library preparation so that each one can be identified and sorted before the analysis of the final data - meaning that this approach only requires a single library that contains multiple distinct samples or cells.

Barcoding reduces both cost and time, and this could extend to bulk RNA sequencing of large sets of samples. But there is still trouble with adapting and validating protocols for reliable and cheap profiling of bulk RNA samples - which is what we're faced with when trying to analyze the transcriptome of cells or tissues.
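
The key downstream step in any barcoding scheme is demultiplexing: sorting the pooled reads back to their samples using the barcode attached during library preparation. The sketch below illustrates that step in general terms; the barcode table and read layout are hypothetical and are not the BRB-seq protocol itself.

    from collections import defaultdict

    # Hypothetical barcode-to-sample assignments made during library preparation
    BARCODES = {"ACGT": "sample_1", "TGCA": "sample_2", "GATC": "sample_3"}
    BARCODE_LEN = 4

    def demultiplex(reads):
        """Group pooled sequencing reads by the barcode at the start of each read."""
        by_sample = defaultdict(list)
        for read in reads:
            barcode, insert = read[:BARCODE_LEN], read[BARCODE_LEN:]
            sample = BARCODES.get(barcode)          # reads with unknown barcodes are dropped
            if sample is not None:
                by_sample[sample].append(insert)
        return by_sample

    pooled = ["ACGTTTGGCAT", "TGCAGGATTCA", "ACGTCCAAGTT"]
    print({sample: len(reads) for sample, reads in demultiplex(pooled).items()})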

Now, scientists from the lab of Bart Deplancke at EPFL's Institute of Bioengineering have developed a novel approach called Bulk RNA Barcoding and sequencing (BRB-seq) which is 25 times less expensive than a conventional commercial RNA sequencing technology (Illumina's TruSeq).

Among its many advantages, BRB-seq is quick and preserves strand-specificity - a challenge in the field, since it requires knowing which DNA strand each RNA molecule was transcribed from. As such, BRB-seq offers a low-cost approach for performing transcriptomics on hundreds of RNA samples, which can increase the number of biological replicates (and therefore experimental accuracy) in a single run.

In terms of performance, the scientists found that BRB-seq can detect the same number of genes as "the gold standard" in the field, namely TruSeq Stranded mRNA, at the same sequencing depth and that the technique produces reliable data even with low-quality RNA samples. Moreover, it generates genome-wide transcriptomic data at a cost that is comparable to profiling four genes using RT-qPCR, which is currently a standard, but low-throughput method for measuring gene expression.

In a test, BRB-seq could generate ready-to-sequence genomic libraries for up to 192 samples a day, requiring only two hours of hands-on time. The technique is combined with a user-friendly pipeline for pre-processing and analyzing sequencing data, allowing result acquisition in a single day.

"Since its release, dozens of labs and companies have already contacted us to help them implement the BRB-seq approach," says Bart Deplancke. "Because of BRB-seq's low cost, these researchers realized that they could now analyze many more samples with the same budget, thus vastly increasing the scope and reproducibility of their experiments. We therefore anticipate that BRB-seq or a comparable approach will over the longer term become standard in any molecular biology lab and replace RT-qPCR as the first gene expression profiling option."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Can science writing be automated?

CAMBRIDGE, Mass. -- The work of a science writer, including this one, includes reading journal papers filled with specialized technical terminology, and figuring out how to explain their contents in language that readers without a scientific background can understand.

Now, a team of scientists at MIT and elsewhere has developed a neural network, a form of artificial intelligence (AI), that can do much the same thing, at least to a limited extent: It can read scientific papers and render a plain-English summary in a sentence or two.

Even in this limited form, such a neural network could be useful for helping editors, writers, and scientists scan a large number of papers to get a preliminary sense of what they're about. But the approach the team developed could also find applications in a variety of other areas besides language processing, including machine translation and speech recognition.

The work is described in the journal Transactions of the Association for Computational Linguistics, in a paper by Rumen Dangovski and Li Jing, both MIT graduate students; Marin Soljacic, a professor of physics at MIT; Preslav Nakov, a senior scientist at the Qatar Computing Research Institute, HBKU; and Mico Tatalovic, a former Knight Science Journalism fellow at MIT and a former editor at New Scientist magazine.

From AI for physics to natural language

The work came about as a result of an unrelated project, which involved developing new artificial intelligence approaches based on neural networks, aimed at tackling certain thorny problems in physics. However, the researchers soon realized that the same approach could be used to address other difficult computational problems, including natural language processing, in ways that might outperform existing neural network systems.

"We have been doing various kinds of work in AI for a few years now," Soljacic says. "We use AI to help with our research, basically to do physics better. And as we got to be more familiar with AI, we would notice that every once in a while there is an opportunity to add to the field of AI because of something that we know from physics -- a certain mathematical construct or a certain law in physics. We noticed that hey, if we use that, it could actually help with this or that particular AI algorithm."

This approach could be useful in a variety of specific kinds of tasks, he says, but not all. "We can't say this is useful for all of AI, but there are instances where we can use an insight from physics to improve on a given AI algorithm."

Neural networks in general are an attempt to mimic the way humans learn certain new things: The computer examines many different examples and "learns" what the key underlying patterns are. Such systems are widely used for pattern recognition, such as learning to identify objects depicted in photos.

But neural networks in general have difficulty correlating information from a long string of data, such as is required in interpreting a research paper. Various tricks have been used to improve this capability, including techniques known as long short-term memory (LSTM) and gated recurrent units (GRU), but these still fall well short of what's needed for real natural-language processing, the researchers say.

The team came up with an alternative system, which instead of being based on the multiplication of matrices, as most conventional neural networks are, is based on vectors rotating in a multidimensional space. The key concept is something they call a rotational unit of memory (RUM).

Essentially, the system represents each word in the text by a vector in multidimensional space -- a line of a certain length pointing in a particular direction. Each subsequent word swings this vector in some direction, represented in a theoretical space that can ultimately have thousands of dimensions. At the end of the process, the final vector or set of vectors is translated back into its corresponding string of words.
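
As a rough illustration of memory-as-rotation (a simplified sketch, not the authors' actual RUM equations), the code below keeps a unit-length "memory" vector and updates it by rotating it toward each incoming word vector, rather than by matrix multiplication alone; because rotations preserve length, the memory neither blows up nor fades away.

    import numpy as np

    def rotate_toward(memory, word_vec, angle=0.1):
        """Rotate `memory` toward `word_vec` by `angle` radians, inside the
        2-D plane spanned by the two vectors (the vector's norm is preserved)."""
        m = memory / np.linalg.norm(memory)
        w = word_vec - (word_vec @ m) * m           # component of word_vec orthogonal to memory
        if np.linalg.norm(w) < 1e-12:               # already aligned: nothing to rotate toward
            return m
        w /= np.linalg.norm(w)
        return np.cos(angle) * m + np.sin(angle) * w

    rng = np.random.default_rng(1)
    memory = rng.standard_normal(50)
    for _ in range(10):                             # "read" ten hypothetical word embeddings
        memory = rotate_toward(memory, rng.standard_normal(50))
    print(np.linalg.norm(memory))                   # stays at ~1.0 after every update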

"RUM helps neural networks to do two things very well," Nakov says. "It helps them to remember better, and it enables them to recall information more accurately."

After developing the RUM system to help with certain tough physics problems such as the behavior of light in complex engineered materials, "we realized one of the places where we thought this approach could be useful would be natural language processing," says Soljacic, recalling a conversation with Tatalovic, who noted that such a tool would be useful for his work as an editor trying to decide which papers to write about. Tatalovic was at the time exploring AI in science journalism as his Knight fellowship project.

"And so we tried a few natural language processing tasks on it," Soljacic says. "One that we tried was summarizing articles, and that seems to be working quite well."

The proof is in the reading

As an example, they fed the same research paper through a conventional LSTM-based neural network and through their RUM-based system. The resulting summaries were dramatically different.

The LSTM system yielded this highly repetitive and fairly technical summary: "Baylisascariasis," kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed "baylisascariasis," kills mice, has endangered the allegheny woodrat and has caused disease like blindness or severe consequences. This infection, termed "baylisascariasis," kills mice, has endangered the allegheny woodrat.

Based on the same paper, the RUM system produced a much more readable summary, and one that did not include the needless repetition of phrases: Urban raccoons may infect people more than previously assumed. 7 percent of surveyed individuals tested positive for raccoon roundworm antibodies. Over 90 percent of raccoons in Santa Barbara play host to this parasite.

Already, the RUM-based system has been expanded so it can "read" through entire research papers, not just the abstracts, to produce a summary of their contents. The researchers have even tried using the system on their own research paper describing these findings -- the paper that this news story is attempting to summarize.

Here is the new neural network's summary: Researchers have developed a new representation process on the rotational unit of RUM, a recurrent memory that can be used to solve a broad spectrum of the neural revolution in natural language processing.

It may not be elegant prose, but it does at least hit the key points of information.

Credit: 
Massachusetts Institute of Technology

How do we make moral decisions?

When it comes to making moral decisions, we often think of the golden rule: do unto others as you would have them do unto you. Yet, why we make such decisions has been widely debated. Are we motivated by feelings of guilt, where we don't want to feel bad for letting the other person down? Or by fairness, where we want to avoid unequal outcomes? Some people may rely on principles of both guilt and fairness and may switch their moral rule depending on the circumstances, according to a Radboud University - Dartmouth College study on moral decision-making and cooperation. The findings challenge prior research in economics, psychology and neuroscience, which is often based on the premise that people are motivated by one moral principle, which remains constant over time. The study was published recently in Nature Communications.

"Our study demonstrates that with moral behavior, people may not in fact always stick to the golden rule. While most people tend to exhibit some concern for others, others may demonstrate what we have called 'moral opportunism,' where they still want to look moral but want to maximize their own benefit," said lead author Jeroen van Baar, a postdoctoral research associate in the department of cognitive, linguistic and psychological sciences at Brown University, who started this research when he was a scholar at Dartmouth visiting from the Donders Institute for Brain, Cognition and Behavior at Radboud University.

"In everyday life, we may not notice that our morals are context-dependent since our contexts tend to stay the same daily. However, under new circumstances, we may find that the moral rules we thought we'd always follow are actually quite malleable," explained co-author Luke J. Chang, an assistant professor of psychological and brain sciences and director of the Computational Social Affective Neuroscience Laboratory (Cosan Lab) at Dartmouth. "This has tremendous ramifications if one considers how our moral behavior could change under new contexts, such as during war," he added.

To examine moral decision-making within the context of reciprocity, the researchers designed a modified trust game called the Hidden Multiplier Trust Game, which allowed them to classify decisions in reciprocating trust as a function of an individual's moral strategy. With this method, the team could determine which type of moral strategy a study participant was using: inequity aversion (where people reciprocate because they want to seek fairness in outcomes), guilt aversion (where people reciprocate because they want to avoid feeling guilty), greed, or moral opportunism (a new strategy that the team identified, where people switch between inequity aversion and guilt aversion depending on what will serve their interests best). The researchers also developed a computational, moral strategy model that could be used to explain how people behave in the game and examined the brain activity patterns associated with the moral strategies.
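
To see how such strategies can pull apart, consider the simplified utilities below (an illustrative sketch, not the authors' actual computational model): inequity aversion penalizes unequal payoffs, while guilt aversion penalizes giving a partner less than the partner expected, so the two rules can favor different splits of the same pot.

    def inequity_aversion_utility(my_payoff, partner_payoff, alpha=1.0):
        """Prefer equal outcomes: penalize the gap between the two payoffs."""
        return my_payoff - alpha * abs(my_payoff - partner_payoff)

    def guilt_aversion_utility(my_payoff, partner_payoff, partner_expectation, beta=1.0):
        """Avoid guilt: penalize giving the partner less than they expected to receive."""
        shortfall = max(partner_expectation - partner_payoff, 0)
        return my_payoff - beta * shortfall

    # Splitting 20 units when the partner only expected to receive 5 of them:
    # inequity aversion favors the 10/10 split, guilt aversion favors keeping 15.
    for keep in (10, 15):
        give = 20 - keep
        print(keep,
              inequity_aversion_utility(keep, give),
              guilt_aversion_utility(keep, give, partner_expectation=5))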

The findings reveal for the first time that unique patterns of brain activity underlie the inequity aversion and guilt aversion strategies, even when the strategies yield the same behavior. For the participants that were morally opportunistic, the researchers observed that their brain patterns switched between the two moral strategies across different contexts. "Our results demonstrate that people may use different moral principles to make their decisions, and that some people are much more flexible and will apply different principles depending on the situation," explained Chang. "This may explain why people that we like and respect occasionally do things that we find morally objectionable."

Credit: 
Dartmouth College

Firms are better off revealing their environmental practices, new research shows

image: This is Michel Magnan, professor of accountancy at the John Molson School of Business.

Image: 
Concordia University

Is honesty the best policy when it comes to being green?

It just might be, according to a new paper by Michel Magnan, a professor of accountancy at the John Molson School of Business.

In their article for Sustainability Accounting, Management and Policy Journal, Magnan and co-author Hani Tadros of Elon University in North Carolina looked at 78 US firms in environmentally sensitive industries from 1997 to 2010. They wanted to deepen their understanding of the driving forces behind the firms' disclosure of their environmental practices and management.

"There is tension out there," says Magnan, the Stephen A. Jarislowsky Chair in Corporate Governance. "Many people are skeptical and will adopt a cynical perspective regarding what corporations choose to disclose."

With public trust in business in general decline, it may be natural to assume that most firms are padding their numbers or choosing to obscure their environmental behaviour. But Magnan says he has found that this is not the case.

Many firms are keenly aware of growing environmental concerns among members of the public, including investors and consumers of their products. In response, some are quite literally cleaning up their act.

What is said vs. what is done

The researchers separated the firms they studied into two groups based on the data they collected, including public information and the firms' annual disclosure reports and regulatory filings.

The companies whose environmental performance scored positively when compared to existing government regulations (meaning they respected guidelines on pollution, emissions and so on) were designated "high performers." Those that did poorly were designated "low performers."

"High- and low-performing firms will adopt different patterns when it comes to disclosure," explains Magnan. "High performers will provide more information because they are doing well and want to convey that message to their various stakeholders. Poor performers, meanwhile, will try to manage impressions in some way."

The researchers paid close attention to the usefulness of the information firms disclosed. They preferred data that was objectively verifiable and quantitative -- they called that "hard" information. "Soft" information generally consisted of vague statements, unattached to specifics.

They found that high-performing corporations were more likely to disclose hard information because they could afford to be forthcoming. They were using their disclosure as a way of building trust and earning public goodwill, which pays dividends down the line.

"If more disclosure raises your market value, it makes sense," Magnan says.

Look for good, clean facts

With stakeholders paying more attention to environmental issues, Magnan says there is added pressure on firms to come clean on their environmental performance. He sees corporate culture heading in that direction already.

"Some firms will be more forthcoming because that is their governance model, and they feel that it is better to be forthcoming early on," he says. "The costs will be less, and it shows they are operating in good faith."

Companies that engage in practices designed to obfuscate, deny or lie about poor environmental performances are likely to suffer serious consequences, he adds.

"In the short run, that kind of behaviour may help you, but in the long run it may come back to hurt you. Everything becomes public at some point."

Credit: 
Concordia University

Study: Infamous 'death roll' almost universal among crocodile species

image: Paleosuchus palpebrosus, also known as Cuvier's dwarf caiman.

Image: 
Kent Vliet/University of Florida.

The iconic "death roll" of alligators and crocodiles may be more common among species than previously believed, according to a new study published in Ethology, Ecology & Evolution and coauthored by a researcher at the University of Tennessee, Knoxville.

Contrary to popular belief, crocodiles can't chew, so they use a powerful bite coupled with a full-bodied twisting motion--a death roll--to disable, kill, and dismember prey into smaller pieces. The lethal movement is characteristic of both alligators and crocodiles and has been featured in numerous movies and nature documentaries.

Until now, the death roll had only been documented in a few of the 25 living crocodilian species, but how many actually do it?

"We conducted tests in all 25 species, and 24 of them exhibited the behavior," said lead author Stephanie Drumheller-Horton, a paleontologist and adjunct assistant professor in the Department of Earth and Planetary Sciences at UT.

For the research, Drumheller-Horton teamed up with Kent Vliet from the University of Florida and Jim Darlington, curator of reptiles at the St. Augustine Alligator Farm.

It was previously believed that slender-snouted species, like the Indian gharial, didn't roll because their diets consist of small prey like fish, eaten whole.

But it turns out that feeding isn't the only time the animals might roll.

"Aggression between individual crocodylians can become quite intense, often involving bites and death rolls in establishing dominance or competition for females," Vliet said.

Paleosuchus palpebrosus, commonly called Cuvier's dwarf caiman, is the only species that did not perform a death roll under experimental conditions. "Although, it's also possible that they were just being uncooperative," said Darlington.

And the fossil ancestors of modern crocodiles? If they share a similar body plan and lifestyle with their modern counterparts, it's likely that they could death roll, too.

"Crocodile relatives have played the role of semi-aquatic ambush predator since the Age of Dinosaurs," said Drumheller-Horton.

Whether in the Northern Territory of Australia, a lake in the Serengeti, or a watering hole in the late Cretaceous, chances are that a patient predator is waiting in the water to surprise its next meal with a burst of speed, a powerful bite, and a spinning finish.

Credit: 
University of Tennessee at Knoxville

Antimicrobial paints have a blind spot

image: This is a scanning electron microscopy (SEM) image of Bacillus timonensis.

Image: 
Jinglin Hu/Northwestern University

EVANSTON, Ill. -- Antimicrobial paints offer the promise of extra protection against bacteria. But Northwestern University researchers caution that these paints might be doing more harm than good.

In a new study, the researchers tested bacteria commonly found inside homes on samples of drywall coated with antimicrobial, synthetic latex paints. Within 24 hours, all bacteria died except for Bacillus timonensis, a spore-forming bacterium. Most bacilli commonly inhabit soil, but many are also found in indoor environments.

"If you attack bacteria with antimicrobial chemicals, then they will mount a defense," said Northwestern's Erica Hartmann, who led the study. "Bacillus is typically innocuous, but by attacking it, you might prompt it to develop more antibiotic resistance."

Bacteria thrive in warm, moist environments, so most of them die anyway on indoor surfaces, which are dry and cold. This makes Hartmann question the need to use antimicrobial paints, which may only be causing bacteria to become stronger.

Spore-forming bacteria, such as Bacillus, protect themselves by falling dormant for a period of time. While dormant, they are highly resistant to even the harshest conditions. After those conditions improve, they reactivate.

"When it's in spore form, you can hit it with everything you've got, and it's still going to survive," said Hartmann, assistant professor of civil and environmental engineering in Northwestern's McCormick School of Engineering. "We should be judicious in our use of antimicrobial products to make sure that we're not exposing the more harmless bacteria to something that could make them harmful."

The study was published online on April 13 in the journal Indoor Air.

One problem with antimicrobial products -- such as these paints -- is that they are not tested against more common bacteria. Manufacturers test how well more pathogenic bacteria, such as E. coli or Staphylococcus, survive but largely ignore the bacteria that people (and the products they use) would more plausibly encounter.

"E. coli is like the 'lab rat' of the microbial world," Hartmann said. "It is way less abundant in the environment than people think. We wanted to see how the authentic indoor bacteria would respond to antimicrobial surfaces because they don't behave the same way as E. coli."

Credit: 
Northwestern University

Decline in measles vaccination is causing a preventable global resurgence of the disease

image: This is an illustration of the virus which causes measles.

Image: 
CDC/ Allison M. Maiuri, MPH, CHES

WHAT:
In 2000, measles was declared to be eliminated in the United States, when no sustained transmission of the virus was seen in this country for more than 12 months. Today, however, the United States and many other countries that had also eliminated the disease are experiencing concerning outbreaks of measles because of declines in measles vaccine coverage. Without renewed focus on measles vaccination efforts, the disease may rebound in full force, according to a new commentary in the New England Journal of Medicine by infectious diseases experts at the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, and the Penn State University College of Medicine's Milton S. Hershey Medical Center.

Measles is an extremely contagious illness transmitted through respiratory droplets and aerosolized particles that can remain in the air for up to two hours. Most often seen in young children, the disease is characterized by fever, malaise, nasal congestion, conjunctivitis, cough and a red, splotchy rash. Most people with measles recover without complications within a week. However, for infants, people with immune deficiencies, and other vulnerable populations, the consequences of a measles infection can be severe. Rare complications can occur, including pneumonia, encephalitis, other secondary infections, blindness and even death. Before the measles vaccine was developed, the disease killed between two and three million people annually worldwide. Today, measles still causes more than 100,000 deaths globally each year.

Measles can be prevented with a vaccine that is both highly effective and safe. Each complication and death related to measles is a "preventable tragedy that could have been avoided through vaccination," the authors write. Some people are reluctant to vaccinate their children based on widespread misinformation about the vaccine. For example, they may fear that the vaccine raises their child's risk of autism, a falsehood based on a debunked and fraudulent claim. A very small number of people have valid medical contraindications to the measles vaccine, such as certain immunodeficiencies, but almost everyone can be safely vaccinated.

When levels of vaccine coverage fall, the weakened umbrella of protection provided by herd immunity--indirect protection that results when a sufficiently high percentage of the community is immune to the disease--places unvaccinated young children and immunocompromised people at greater risk. This can have disastrous consequences with measles. The authors describe a case in which a single child with measles infected 23 other children in a pediatric oncology clinic, with a fatality rate of 21 percent.
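
The arithmetic behind that umbrella is the classical herd immunity threshold, roughly 1 - 1/R0, where R0 is the number of secondary cases a single infection produces in a fully susceptible population. Measles' R0 is commonly estimated at about 12 to 18, which is why coverage has to stay in the low-to-mid 90 percent range; the calculation below is illustrative and is not drawn from the commentary.

    def herd_immunity_threshold(r0):
        """Approximate fraction of a population that must be immune to block
        sustained transmission, under a simple homogeneous-mixing model."""
        return 1 - 1 / r0

    for r0 in (12, 15, 18):                 # commonly cited range for measles
        print(r0, round(herd_immunity_threshold(r0), 3))
    # prints ~0.917, ~0.933, ~0.944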

If vaccination rates continue to decline, measles outbreaks may become even more frequent, a prospect the authors describe as "alarming." This is particularly confounding, they note, since measles is one of the most easily prevented contagious illnesses. In fact, it is possible to eliminate and even eradicate the disease. However, they say, achieving this goal will require collective action on the part of parents and healthcare practitioners alike.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Cell-killing proteins suppress listeria without killing cells

image: After infection by Listeria, cells without RIPK3 proteins (top) showed greater bacterial replication than those with RIPK3 (bottom). Bar chart (right) shows replication over a 24-hour period.

Image: 
Kazuhito Sai, NC State University

New North Carolina State University research shows that key proteins known for their ability to prevent viral infections by inducing cell death can also block certain bacterial infections without triggering the death of the host cells.

Rather than killing host cells infected by Listeria in the gastrointestinal tract, the RIPK3 and MLKL proteins recognize the chemical composition of the bacteria and MLKL binds to it, preventing the spread of Listeria while keeping the host cells alive.

"While we've shown that these proteins take on a different function in intestinal epithelial cells than they do in immune cells, we're still not sure how or why this differentiation occurs," said Jun Ninomiya-Tsuji, professor of biological sciences and co-corresponding author of a paper describing the research.

The researchers, led by Kazuhito Sai, a toxicology research associate and co-corresponding author of the paper, first used human intestinal cells to show that RIPK3-deficient cells were infected by Listeria while cells with RIPK3 had few such infections. The researchers then used mice to see if Listeria could reach mouse livers by invading intestinal cells. They found many Listeria in RIPK3-deficient mice but few Listeria in normal mice.

They then showed that RIPK3 and a protein that works with it, MLKL, were activated by the presence of Listeria. This protein-pathway activation inhibited Listeria replication, showing that the proteins effectively blunted Listeria.

Next, and most surprisingly, the researchers showed that the activation of RIPK3 and MLKL by Listeria did not result in cell death. Instead, MLKL proteins bound themselves to Listeria, stopping its spread.

"These proteins induce cell death to prevent certain infections, particularly in immune cells," Sai said. "Inducing death of epithelial cells in the GI tract may cause removal of an important barrier to viruses and bacteria, so it's possible that these proteins recognize that killing these cells could make things worse instead of better."

Future research will attempt to understand how and why these proteins take different approaches - inducing cell death or not - to stave off bacteria in the GI tract, the researchers said.

Credit: 
North Carolina State University

Investigators incorporate randomized trial within dialysis care delivery

Highlights

The Time to Reduce Mortality in ESRD (TiME) trial was a large pragmatic trial demonstration project designed to determine the benefits of hemodialysis sessions that are longer than many patients currently receive.

The trial was conducted through a partnership between academic investigators and 2 large dialysis provider organizations using a highly centralized implementation approach.

Although the trial accomplished most of its demonstration project objectives, uptake of the intervention was insufficient to determine whether longer sessions improve outcomes.

Washington, DC (April 18, 2019) -- A recent clinical trial fully embedded into the routine delivery of care at dialysis facilities sought to determine if hemodialysis sessions that are longer than many patients in the United States currently receive can improve patients' health. Although the trial accomplished most of its objectives, uptake of the intervention was insufficient to determine whether longer sessions are beneficial. The findings, which appear in an upcoming issue of JASN, indicate that embedding trials into dialysis care will require more effective strategies for engaging clinicians and patients.

The trial's investigators had 2 goals: to develop approaches for embedding large randomized trials into the routine delivery of clinical care, and to determine whether patients benefit from hemodialysis sessions that are longer than usual. In the Time to Reduce Mortality in ESRD (TiME) trial, 266 dialysis facilities randomized to the intervention adopted a default hemodialysis session duration of at least 4.25 hours for new dialysis patients; those randomized to usual care had no trial-specified approach to duration. Trial implementation was highly centralized, with no on-site research personnel and complete reliance on clinically acquired data.

The team demonstrated that a trial embedded into clinical care delivery with no on-site research personnel could efficiently enroll a large number of participants using an opt-out approach to informed consent. (The trial enrolled 7,035 patients.) The trial was also able to obtain useful treatment and outcomes data from hundreds of medical facilities and monitor trial conduct and safety through a centralized approach.

The trial was discontinued at a median follow-up of 1.1 years because of an inadequate between-group difference in session duration. Average session duration was 216 minutes for the intervention group and 207 minutes for the usual care group. Investigators found no reduction in mortality or hospitalization rates for the intervention vs. usual care.
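
To put those numbers in context, the trial's default of 4.25 hours corresponds to 255 minutes, so the intervention arm averaged well below its own target and only marginally above usual care; the quick arithmetic below is illustrative, not an additional result from the paper.

    target_minutes = 4.25 * 60            # the trial's default session duration
    intervention, usual_care = 216, 207   # observed average session durations (minutes)
    print(target_minutes)                 # 255.0
    print(target_minutes - intervention)  # 39.0 minutes short of the default
    print(intervention - usual_care)      # 9-minute between-group difference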

"There is a pressing need for data from randomized trials to guide clinical practice in dialysis," said lead author Laura M. Dember, MD (University of Pennsylvania Perelman School of Medicine). "Pragmatic trials embedded in clinical care delivery have tremendous potential for efficiently producing evidence that is highly generalizable to the non-research setting; however, experience with this approach is limited. The TiME trial provides an important foundation for future pragmatic trials in dialysis as well as in other settings."

Credit: 
American Society of Nephrology

Asian nations in early tobacco epidemic: study

image: From left, Wei Zheng, MD, PhD, Jae Jeong Yang, PhD, Danxia Yu, PhD, and colleagues are studying smoking patterns and associated deaths in Asian countries.

Image: 
Photo by Susan Urmy

Asian countries are in the early stages of a tobacco smoking epidemic with habits mirroring those of the United States from past decades, setting the stage for a spike in future deaths from smoking-related diseases.

That's the conclusion of researchers from Vanderbilt-Ingram Cancer Center and the Vanderbilt Epidemiology Center after analyzing 20 prospective cohort studies from mainland China, Japan, South Korea, Singapore, Taiwan and India.

Using long-term follow-up data from those cohorts, the study -- published in JAMA Network Open -- is the largest investigation in Asian countries of birth cohort-specific and country- or region-specific smoking patterns and their association with deaths.

Future deaths are likely to echo the pattern that occurred in the United States as the popularity of smoking increased during and after World War II, which resulted in lung cancer mortality peaking around 1990, said Wei Zheng, MD, PhD, Anne Potter Wilson Professor of Medicine.

"There is about a 30-year gap or incubation period for the mortality to occur," said Zheng, the study's senior author. "Smoking takes about 20 or 30 years to have this full effect on lung cancer mortality."

Tobacco control interventions may be having an effect on the smoking epidemic in some countries or areas because male smokers in the most recent birth cohort tended to quit smoking at younger ages.

"Asian countries that are richer, like Japan, South Korea and urban China, are doing a better job with this than rural China, India and other places," said Danxia Yu, PhD, who is co-first author of the study along with Jae Jeong Yang, PhD, a visiting research fellow from the Seoul National University, South Korea.

The study calls for immediate action by all Asian countries to implement comprehensive tobacco control policies, including raising tobacco taxes, passing laws for smoke-free areas, banning tobacco advertising, requiring warning labels on tobacco products and providing help with quitting.

Older generations of Asians tended to start smoking later in life and smoke less than people in the United States did in past decades, Zheng said, but that behavior pattern is changing.

"Younger people in more recent cohorts started smoking at a younger age, and they smoked a lot more," Zheng said. "The deaths due to tobacco smoking also increased with this cohort."

The researchers classified the birth cohorts by decades, ranging from pre-1910 to 1950 or later. Smoking accounted for 12.5% of all-cause mortality in the pre-1920 birth cohort, 21.1% in the 1920s cohort and 29.3% for the cohort born in 1930 or later. Lung cancer deaths attributable to smoking, which were 56.6% among men in the pre-1920s cohort, increased to 68.4% for men born in 1930 or later.

The researchers also studied cohorts with more recent data for men and women born in later decades to analyze smoking habits.

The proportion of men in mainland China who have ever smoked has increased. Among Chinese men born in 1950 or later, 79.4% of those living in urban areas had smoked, as had 74.3% of those in rural areas. Historically, Japanese men have had the highest rate of ever having smoked.

Women in Asia have a much lower rate of smoking. The average percentage of women smokers across all 20 cohort studies was 7.8%, compared to 65.4% for men.

Asia will face a growing burden of smoking-related health problems unless urgent tobacco control policies are implemented, the authors concluded.

Credit: 
Vanderbilt University Medical Center

Disappearing bumblebee species under threat of extinction

The American Bumblebee - a species once more commonly seen buzzing around Southern Ontario - is critically endangered, according to a new study led by York University.

The study, published in the Journal of Insect Conservation, found that the native North American species, Bombus pensylvanicus, is facing imminent extinction from Canada, the highest and most at-risk classification before extinction. Many bumblebee species are rapidly declining across North America, yet they are important pollinators needed to grow Canada's crops, including apples, tomatoes, blueberries and legumes, as well as countless types of trees, shrubs, and wildflowers.

The researchers assessed the extinction risk of the American Bumblebee, ranking the risk much higher than a federal advisory committee's most recent assessment, which classifies the species' extinction risk as special concern.

"This species is at risk of extinction and it's currently not protected in any way despite the drastic decline," said Assistant Professor Sheila Colla, an expert in bees and endangered species in the Faculty of Environmental Studies.

"Now that we have assessed the extent of the decline and located where the remaining populations are, we can look more closely at threats and habitat requirements to design an effective conservation management plan so that this species does not disappear from Canada forever," said Colla, who co-authored and helped design the study.

Colla has been studying bumblebees in Southern Ontario since the mid-2000s. This study relies on the annual data that she and her fellow researchers have collected.

The study's research team - led by Victoria MacPhail, Colla's doctoral student, and including a scientist from the University of Vermont - used data from three sources. They analyzed Southern Ontario data from the citizen science program, Bumble Bee Watch, a collaboration of volunteers who submit bumblebee photos through a website or phone app for experts to identify. The researchers used the Bumble Bees of North America database to obtain records of bumblebee species in Ontario and Quebec dating back to the late-1800s. They also used their own field survey work which allowed them to evaluate the status of the species within its Canadian range, using the globally-recognized International Union for the Conservation of Nature (IUCN) Red List assessment criteria.

The researchers found that the American Bumblebee's area of occurrence has decreased by about 70 percent and its relative abundance fell by 89 percent from 2007-2016 compared to 1907-2006.

"This bumblebee species now has a reduced overall range," explained MacPhail. "It used to stretch from Windsor to Toronto, and all the way to Ottawa and into the Quebec area, but it is now only found in some core areas and has experienced a 37 percent decrease in overall range."

"It's now a rare sighting in Toronto," said MacPhail. "In terms of relative abundance, compared to other bees, you'd have to catch 1,000 bumblebees to find four of this species, and that compares to finding 37 bees in the past. You could walk out the door and win the lottery and find it, or you could be searching for years and not find any."

This study echoes Colla's previous findings with the critically endangered Rusty-patched Bumblebee, once found in Southern Ontario. The species has not been seen in Canada for about ten years and drastically declined towards extinction without receiving protection or conservation management.

"The American bumblebee is still found in areas throughout its Canadian range and immediate action may save it from the same fate as the Rusty-patched Bumblebee," said Colla.

Credit: 
York University

When the physics say 'don't follow your nose'

video: This robot is sniffing out the source of an ethanol leak, but it's being clever about doing it. Rather than just following the strongest scent, the robot is plugging measurements of concentration and airflow into a complex partial differential equation and then deciding where the most useful position to take another measurement is. By repeating this process, it can find an ethanol source in just a dozen or two tries in a complex environment with multiple sources.

Image: 
Reza Khodayi-mehr

Engineers at Duke University are developing a smart robotic system for sniffing out pollution hotspots and sources of toxic leaks. Their approach enables a robot to incorporate calculations made on the fly to account for the complex airflows of confined spaces rather than simply 'following its nose.'

"Many existing approaches that employ robots to locate sources of airborne particles rely on bio-inspired educated but simplistic guesses, or heuristic techniques, that drive the robots upwind or to follow increasing concentrations," said Michael M. Zavlanos, the Mary Milus Yoh and Harold L. Yoh, Jr. Associate Professor of Mechanical Engineering and Materials Science at Duke. "These methods can usually only localize a single source in open space, and they cannot estimate other equally important parameters such as release rates."

But in complex environments, these simplistic methods can send the robots on wild goose chases into areas where concentrations are artificially increased by the physics of the airflows, not because they're the source of the leak.

"If somebody is smoking outside, it doesn't take long to find them by just following your nose because there's nothing stopping the air currents from being predictable," said Wilkins Aquino, the Anderson-Rupp Professor of Mechanical Engineering and Materials Science at Duke. "But put the same cigarette inside an office and suddenly it becomes much more difficult because of the irregular air currents created by hallways, corners and offices."

In a recent paper published online in the IEEE Transactions on Robotics, Zavlanos, Aquino and newly minted PhD graduate Reza Khodayi-mehr instead take advantage of the physics behind these airflows to trace the source of an emission more efficiently.

Their approach combines physics-based models of the source identification problem with path planning algorithms for robotics in a feedback loop. The robots take measurements of contaminant concentrations in the environment and then use these measurements to incrementally calculate where the chemicals are actually coming from.

"Creating these physics-based models requires the solution of partial differential equations, which is computationally demanding and makes their application onboard small, mobile robots very challenging," said Khodayi-mehr. "We've had to create simplified models to make the calculations more efficient, which also makes them less accurate. It's a challenging trade-off."

Khodayi-mehr built a rectangular box with a wall nearly bisecting the space length-wise to create a miniature U-shaped hallway that mimics a simplified office space. A fan pumps air into the corridor at one end of the U and back out of the other, while gaseous ethanol is slowly leaked into one of the corners. Despite the simplicity of the setup, the air currents created within are turbulent and messy, creating a difficult source identification problem for any ethanol-sniffing robot to solve.

But the robot solves the problem anyway.

The robot takes a concentration measurement, fuses it with previous measurements, and solves a challenging optimization problem to estimate where the source is. It then figures out the most useful location to take its next measurement and repeats the process until the source is found.
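
A minimal, runnable sketch of that measure-estimate-replan loop appears below. It swaps the paper's PDE-based transport model for a toy distance-based one and simply drives toward the current best estimate instead of solving an optimal next-measurement problem, so it illustrates only the structure of the feedback loop, not the actual method.

    import numpy as np

    rng = np.random.default_rng(2)
    true_source = np.array([3.0, 1.0])            # hidden leak position in a toy 2-D world

    def concentration_at(point, source):
        """Toy forward model: concentration falls off with distance from the source.
        The real system solves an advection-diffusion PDE instead."""
        return 1.0 / (1.0 + np.sum((point - source) ** 2))

    def estimate_source(samples, candidates):
        """Pick the candidate location whose predicted concentrations best match
        all measurements taken so far (a least-squares fit over a coarse grid)."""
        errors = [sum((c - concentration_at(p, s)) ** 2 for p, c in samples)
                  for s in candidates]
        return candidates[int(np.argmin(errors))]

    grid = np.array([[x, y] for x in np.linspace(0, 4, 21) for y in np.linspace(0, 4, 21)])

    samples, position = [], np.array([0.0, 0.0])
    for step in range(12):
        reading = concentration_at(position, true_source) + 0.01 * rng.standard_normal()
        samples.append((position.copy(), reading))        # fuse with previous measurements
        best = estimate_source(samples, grid)              # re-estimate the source location
        # Move toward the current estimate before measuring again
        # (the real planner instead picks the most informative next location).
        position = position + 0.5 * (best - position)

    print("estimated source:", best, "true source:", true_source)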

"By combining physics-based models with optimal path planning, we can figure out where the source is with very few measurements," said Zavlanos. "This is because physics-based models provide correlations between measurements that are not accounted for in purely data-driven approaches, and optimal path planning allows the robot to select those few measurements with the most information content."

"The physics-based models are not perfect but they still carry way more information than just the sensors alone," added Aquino. "They don't have to be exact, but they allow the robot to make inferences based on what is possible within the physics of the airflows. This results in a much more efficient approach."

This complex series of problem solving isn't necessarily faster, but it's much more robust. It can handle situations with multiple sources, which is currently impossible for heuristic approaches, and can even measure the rate of contamination.

The group is still working to create machine-learning algorithms to make their models even more efficient and accurate at the same time. They're also working to extend this idea to programming a fleet of robots to conduct a methodical search of a large area. While they haven't tried the group approach in practice yet, they have published simulations that demonstrate its potential.

"Moving from a lab environment with controlled settings to a more practical scenario obviously requires addressing other challenges too," said Khodayi-mehr. "For example, in a real-world scenario we probably won't know the geometry of the domain going in. Those are some of the ongoing research directions we're currently working on."

"Model-Based Active Source Identification in Complex Environments." Reza Khodayi-mehr, Wilkins Aquino, Michael M. Zavlanos. IEEE Transactions on Robots, 2019.

Credit: 
Duke University

The Leukemia Atlas: researchers unveil proteins that signal disease

video: To rapidly accelerate research in leukemia and advance the hunt for treatments, Qutub provided the hallmarks in an online compendium where fellow researchers and oncologists worldwide can build from the resource and tools.

Image: 
Courtesy of UTSA

(San Antonio, April 17, 2019) -- Only about one in four people diagnosed with acute myelogenous leukemia (AML) survive five years after the initial diagnosis. To improve that survival rate, researchers at The University of Texas at San Antonio (UTSA) and the University of Texas MD Anderson Cancer Center created an online atlas to identify and classify protein signatures present at AML diagnosis.

The new protein classifications will help researchers and clinicians recommend better treatment and personalized medicine for patients suffering from this aggressive cancer, which occurs in the blood and bone marrow. The research is published in the April issue of Nature Biomedical Engineering.

Researcher Amina Qutub, an associate professor in the UTSA Department of Biomedical Engineering (who joined UTSA in 2018 from Rice University), and oncologist Steven M. Kornblau, a professor and practicing clinician in the Department of Leukemia at UT MD Anderson Cancer Center, examined the genetic, epigenetic and environmental diversity that occurs in cancerous cells due to AML. Analyzing proteomic screens of 205 patient biopsies obtained at MD Anderson Cancer Center, first author Chenyue Wendy Hu (then a graduate student at the Qutub Lab, now at Uber Technologies), Kornblau and Qutub developed a new computational method called MetaGalaxy to categorize the protein signatures into 154 different patterns based on their cellular functions and pathways.
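
As a rough illustration of the kind of computation involved (a generic clustering sketch, not the MetaGalaxy algorithm itself), the snippet below groups a small, simulated patient-by-protein expression matrix so that patients with similar protein signatures receive the same cluster label.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(3)
    # Hypothetical expression matrix: 12 patient samples x 8 proteins,
    # simulated from two different underlying "signatures".
    signature_a = rng.normal(0.0, 0.3, size=(6, 8)) + np.array([1, 1, 1, 0, 0, 0, 0, 0])
    signature_b = rng.normal(0.0, 0.3, size=(6, 8)) + np.array([0, 0, 0, 0, 0, 1, 1, 1])
    expression = np.vstack([signature_a, signature_b])

    # Hierarchical clustering on the correlation distance between patient profiles
    tree = linkage(expression, method="average", metric="correlation")
    labels = fcluster(tree, t=2, criterion="maxclust")
    print(labels)    # patients with similar protein signatures share a cluster label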

By approaching this challenge through the unique lens of developing a quantitative map for each leukemia patient from protein expression in their blood and bone marrow, rather than the standard lens of qualitative metrics and genetic risks alone, Qutub, Kornblau and their research collaborators will be able to more precisely categorize patients into risk groups and better predict their treatment outcomes.

To better understand the AML hallmarks at the proteomic (protein system) level and to share the results of their work with other researchers, the UTSA biomedical engineering professor and her team, including Hu and students Andrew Ligeralde (now at the University of California, Berkeley) and Allie Raybon (from the UTSA Department of Biomedical Engineering), built a web portal known as the Leukemia Proteome Atlas. Designed by Qutub's and Kornblau's teams with input from clinical collaborators worldwide, the online portal gives oncologists and cancer scientists the tools they need to investigate AML protein expression patterns from one patient to the next. It also provides investigators around the world with leads for new leukemia research and new computational tools.

Since many genetic mutations cannot be targeted, the proteomic profiling and target identification process used in this research study will accelerate the identification of therapeutic targets. It also propels researchers much closer to the development of personalized combination therapies for patients based on their unique protein signatures.

"Acute myelogenous leukemia presents as a cancer so heterogeneous that it is often described as not one, but a collection of diseases," said Qutub. "To decipher the clues found in proteins from blood and bone marrow of leukemia patients, we developed a new computer analysis - MetaGalaxy - that identifies molecular hallmarks of leukemia. These hallmarks are analogous to the way constellations guide navigation of the stars: they provide a map to protein changes for leukemia. Our 'hallmark' predictions are being experimentally tested through drug screens and can be 'programmed' into cells through synthetic manipulation of proteins. A next step to bring this work to the clinic and impact patient care is testing whether these signatures lead to the aggressive growth or resistance to chemotherapy observed in leukemia patients. At the same time, to rapidly accelerate research in leukemia and advance the hunt for treatments, we provide the hallmarks in an online compendium where fellow researchers and oncologists worldwide can build from the resource, tools and findings, LeukemiaAtlas.org."

Credit: 
University of Texas at San Antonio