
Mosquito love songs send mixed message about immunity

ITHACA, N.Y. - As mosquito-borne diseases pose risks for half the world's population, scientists have been releasing sterile or genetically modified male mosquitoes in attempts to suppress populations or alter their traits to control human disease.

But these technologies have been slow to spread, because they require modified males to mate successfully with wild females, and not enough research exists to explain which male traits females seek when choosing a mate.

Now, a new Cornell study of Aedes aegypti mosquitoes investigates how a mating cue called "harmonic convergence" might affect offspring immunity against parasites, bacteria and dengue virus. The findings have important implications for the trade-offs male mosquitoes make between investing energy in immunity and investing it in traits that affect mating success and fitness.

Previous research has shown that when choosing a mate, a female will use a male's ability to beat his wings at the same frequency as her own as a cue to his fitness and desirability. Other studies have also shown that the male offspring from pairs that harmonically converge are better able to achieve harmonic convergence themselves.

"We decided to look to see if the cues that a female responds to might correlate with the downstream genes that a male passes on to offspring that could protect them against parasites and pathogens," said Courtney Murdock, associate professor in the Department of Entomology in the College of Agriculture and Life Sciences, and senior author of the paper, "Sex, age, and parental harmonic convergence behavior affect the immune performance of Aedes aegypti offspring," published June 11 in Nature Communications Biology.

Christine Reitmeyer, a former postdoctoral researcher in Murdock's lab who is currently working at the Pirbright Institute in the United Kingdom, is the paper's first author.

In their study, the researchers separated harmonically converging pairs and non-converging pairs of Aedes aegypti, which transmit dengue, yellow fever, chikungunya and Zika viruses. Males and females of each group were housed together for up to four days. Though females avoid mating with non-converging males in the wild, the males will harass females into mating under these conditions.

The researchers examined a subset of male and female offspring from the convergent pairs and the non-convergent pairs. They then investigated the effects of pairings that resulted from harmonic convergence on three types of offspring immunity, as well as how these effects were altered by offspring sex, age and life-history stage.

They tested humoral melanization, a defense response in which an insect coats a pathogen or parasite in its gut with melanin to wall it off and prevent infection. They did this by injecting a tiny bead into each mosquito's midsection, then removing the bead and assessing whether it was uncoated, partially coated or fully coated.

They also injected mosquitoes with fluorescently labelled Escherichia coli bacteria and tested how well the bacteria grew after 24 hours. And they exposed the females (males don't feed on blood) to dengue virus and looked for its presence in tissues, especially the salivary glands, from which the virus is transmitted during a blood meal.

Overall, males were less capable of melanizing beads and resisting bacterial infections than females.

Additionally, male offspring from converging parents were significantly less able to melanize than male offspring from non-converging parents.

"I think this is because males from converging pairs tend to invest more energy in mating effort and the ability to harmonically converge than on immunity, at least when it comes to melanization," Murdock said.

At the same time, these males fended off bacterial infection better than their non-converging peers, at rates similar to females when they were young, but their defenses dropped rapidly as they aged.

With regard to dengue, females from converging parents showed higher immunity early in life, which waned with age to match that of offspring from non-converging parents. But overall, the amount of virus found in saliva was so low that the researchers concluded more study is needed.

"The bottom line is that if convergence is going to have an effect, it's going to affect male ability when it comes to immune response," Murdock said, "and then, the direction of the effect is probably going to be dependent upon the physiological costs associated with that immune response."

The findings also have implications for mosquito control, Murdock said. If modified males have introduced traits that make them undesirable to female mosquitoes in nature, these approaches to suppress or replace mosquito populations will not be sustainable in the field over time, she added.

Credit: 
Cornell University

Virus that causes COVID-19 can find alternate route to infect cells

Early in the COVID-19 pandemic, scientists identified how SARS-CoV-2, the virus that causes COVID-19, gets inside cells to cause infection. All current COVID-19 vaccines and antibody-based therapeutics were designed to disrupt this route into cells, which requires a receptor called ACE2.

Now, researchers at Washington University School of Medicine in St. Louis have found that a single mutation gives SARS-CoV-2 the ability to enter cells through another route - one that does not require ACE2. The ability to use an alternative entry pathway opens up the possibility of evading COVID-19 antibodies or vaccines, but the researchers did not find evidence of such evasion. However, the discovery does show that the virus can change in unexpected ways and find new ways to cause infection. The study is published June 23 in Cell Reports.

"This mutation occurred at one of the spots that changes a lot as the virus circulates in the human population," said co-senior author Sebla Kutluay, PhD, an assistant professor of molecular microbiology. "Most of the time, alternative receptors and attachment factors simply enhance ACE2-dependent entry. But in this case, we have discovered an alternative way to infect a key cell type -- a human lung cell -- and that the virus acquired this ability via a mutation that we know arises in the population. This is something we definitely need to know more about."

The finding was serendipitous. Last year, Kutluay and co-senior author M. Ben Major, PhD, the Alan A. and Edith L. Wolff Distinguished Professor of Cell Biology & Physiology, planned to study the molecular changes that occur inside cells infected with SARS-CoV-2. Most researchers study SARS-CoV-2 in primate kidney cells because the virus grows well in them, but Kutluay and Major felt it was important to do the study in lung or other cells similar to the ones that are naturally infected. To find more relevant cells capable of growing SARS-CoV-2, Kutluay and Major screened a panel of 10 lung and head-and-neck cell lines.

"The only one that was able to be infected was the one I had included as a negative control," Major said. "It was a human lung cancer cell line with no detectable ACE2. So that was a crazy surprise."

Kutluay, Major and colleagues -- including co-first authors and postdoctoral researchers Maritza Puray-Chavez, PhD, and Kyle LaPak, PhD, as well as co-authors Dennis Goldfarb, PhD, an assistant professor of cell biology & physiology and of medicine, and Steven L. Brody, MD, the Dorothy R. and Hubert C. Moog Professor of Pulmonary Diseases in Medicine, and a professor of radiology -- discovered that the virus they were using for experiments had picked up a mutation. The virus had originally been obtained from a person in Washington state with COVID-19, but as it was grown over time in the laboratory, it had acquired a mutation that led to a change of a single amino acid at position 484 in the virus's spike protein. SARS-CoV-2 uses spike to attach to ACE2, and position 484 is a hot spot for mutations. A variety of mutations at the same position have been found in viral variants from people and mice, and in virus grown in the lab. Some of the mutations found in virus samples taken from people are identical to the one Kutluay and Major found in their variant. The Alpha and Beta variants of concern have mutations at position 484, although those mutations are different.

"This position is evolving over time within the human population and in the lab," Major said. "Given our data and those of others, it is possible that the virus is under selective pressure to get into cells without using ACE2. In so many ways, it is scary to think of the world's population fighting a virus that is diversifying the mechanisms by which it can infect cells."

To determine whether the ability to use an alternative entry pathway allowed the virus to escape COVID-19 antibodies or vaccines, the researchers screened panels of antibodies and blood serum with antibodies from people who have been vaccinated for COVID-19 or recovered from COVID-19 infection. There was some variation, but in general, the antibodies and blood sera were effective against the virus with the mutation.

It is not yet clear whether the alternative pathway comes into play under real-world conditions when people are infected with SARS-CoV-2. Before the researchers can begin to address that question, they must find the alternative receptor that the virus is using to get into cells.

"It is possible that the virus uses ACE2 until it runs out of cells with ACE2, and then it switches over to using this alternative pathway," Kutluay said. "This might have relevance in the body, but without knowing the receptor, we cannot say what the relevance is going to be."

Major added, "That's where we're going right now. What is the receptor? If it's not ACE2, what is it?"

Credit: 
Washington University School of Medicine

Kiwi disease study finds closely related bacterial strains display different behaviors

image: Elodie Vandelle, Annalisa Polverari, Davide Danzi, Vanessa Maurizio, Alice Regaiolo, Maria Rita Puttilli, Teresa Colombo and Tommaso Libardi

Image: 
Elodie Vandelle

Over the last decade, severe outbreaks of bacterial canker have caused huge economic losses for kiwi growers, especially in Italy, New Zealand, and China, which are among the largest producers. Bacterial canker is caused by the bacterial pathogen Pseudomonas syringae pv. actinidiae (Psa) and more recent outbreaks have been particularly devastating due to the emergence of a new, extremely aggressive biovar called Psa3.

Due to its recent emergence, the molecular basis of Psa3's virulence is unknown, making it difficult to develop mitigation strategies. To address this gap, a group of scientists at the University of Verona and the University of Rome collaborated on a study comparing the behavior of Psa3 with less-virulent biovars to determine the basis of its pathogenicity.

They found that genes involved in bacterial signaling (the transmission of external stimuli within cells) were especially important, in particular the genes required for the synthesis and degradation of a small chemical signal called c-di-GMP, which suppresses the expression of virulence factors. Compared with other biovars, Psa3 produces very low levels of c-di-GMP, contributing to an immediate and aggressive phenotype at the onset of infection, before the plant can mount a defense response.

"It was exciting to discover this diversified arsenal of pathogenicity strategies among closely related bacterial strains that infect the same hosts but display different behaviors," said Elodie Vandelle, one of the scientists involved with this study. "Although their 'small' genomes mainly contain the same information, our research shows that bacterial populations within a pathovar are more complex than expected and their pathogenicity may have evolved throughout different strategies to attack the same host."

Their research highlights the importance of working on a multitude of real-life pathogenic bacterial strains to shed light on the diversity of virulence strategies. This approach can contribute to the creation of wider pathogenicity working models. In terms of kiwi production, Vandelle hopes their findings can help scientists develop new mitigation methods. In the long-term, their research could lead to the identification of key molecular switches responsible for the transition between high and low bacterial virulence phenotypes.

"This identification would allow, at industrial level, to develop new targeted strategies to control phytopathogenic bacteria, weakening their aggressiveness through switch control, instead of killing them," Vandelle explained. "This would avoid the occurrence of new resistances among bacterial communities, thus guaranteeing a sustainable plant protection."

Credit: 
American Phytopathological Society

Study shows potential dangers of sweeteners

New research has discovered that common artificial sweeteners can cause previously healthy gut bacteria to become diseased and invade the gut wall, potentially leading to serious health issues.

The study, published in the International Journal of Molecular Sciences, is the first to show the pathogenic effects of some of the most widely used artificial sweeteners - saccharin, sucralose, and aspartame - on two types of gut bacteria, E. coli (Escherichia coli) and E. faecalis (Enterococcus faecalis).

Previous studies have shown that artificial sweeteners can change the number and type of bacteria in the gut, but this new molecular research, led by academics from Anglia Ruskin University (ARU), has demonstrated that sweeteners can also make the bacteria pathogenic. It found that these pathogenic bacteria can attach themselves to, invade, and kill Caco-2 cells, which are epithelial cells that line the wall of the intestine.

It is known that bacteria such as E. faecalis that cross the intestinal wall can enter the bloodstream and congregate in the lymph nodes, liver and spleen, causing a number of infections including septicaemia.

This new study discovered that at a concentration equivalent to two cans of diet soft drink, all three artificial sweeteners significantly increased the adhesion of both E. coli and E. faecalis to intestinal Caco-2 cells, and differentially increased the formation of biofilms.

Bacteria growing in biofilms are less sensitive to antimicrobial treatment and are more likely to secrete toxins and express virulence factors, which are molecules that can cause disease.

Additionally, all three sweeteners caused the pathogenic gut bacteria to invade Caco-2 cells found in the wall of the intestine, with the exception of saccharin, which had no significant effect on E. coli invasion.

Senior author of the paper Dr Havovi Chichger, Senior Lecturer in Biomedical Science at Anglia Ruskin University (ARU), said: "There is a lot of concern about the consumption of artificial sweeteners, with some studies showing that sweeteners can affect the layer of bacteria which support the gut, known as the gut microbiota.

"Our study is the first to show that some of the sweeteners most commonly found in food and drink - saccharin, sucralose and aspartame - can make normal and 'healthy' gut bacteria become pathogenic. These pathogenic changes include greater formation of biofilms and increased adhesion and invasion of bacteria into human gut cells.

"These changes could lead to our own gut bacteria invading and causing damage to our intestine, which can be linked to infection, sepsis and multiple-organ failure.

"We know that overconsumption of sugar is a major factor in the development of conditions such as obesity and diabetes. Therefore, it is important that we increase our knowledge of sweeteners versus sugars in the diet to better understand the impact on our health."

Credit: 
Anglia Ruskin University

A detailed atlas of the developing brain

image: Microscopy image of the developing cerebral cortex, showing two types of neural progenitors (blue and green) and young developing neurons (red). The image represents one of the tissues used in the study.

Image: 
Arlotta laboratory, Harvard University

Researchers at Harvard University and the Broad Institute of MIT and Harvard have created a first detailed atlas of a critical region of the developing mouse brain, applying multiple advanced genomic technologies to the part of the cerebral cortex that is responsible for processing sensation from the body. By measuring how gene activity and regulation change over time, researchers now have a better understanding of how the cerebral cortex is built, as well as a brand new set of tools to explore how the cortex is affected in neurodevelopmental disease. The study is published in the journal Nature.

"We have had a long-standing interest in understanding the development of the mammalian cerebral cortex, as it is the seat of higher-order cognition and the part of the brain that has expanded and diversified the most during human evolution," said Paola Arlotta, the co-senior author of the study and the Golub Family Professor of Stem Cell and Regenerative Biology at Harvard University. "In this study, we looked at the cortex with a very fine lens, practically profiling all of its cells, one by one, every day of development. We catalogued changes in gene expression and regulation at an unprecedented level of temporal resolution to build a first single-cell-resolution molecular map of this amazing tissue. The map allowed us to extract first mechanistic principles governing how the cortex is built, and begin to decode how genetic abnormalities affect such highy controlled process in the embryo."

"In the developing brain, we have to consider three things: the types of cells that are present, where those cells are located, and at what stage they are in development. In addition, by identifying the drivers that direct this process in normal development, we can better understand what may go wrong in disease," said co-senior author Aviv Regev, who was a core institute member at the Broad Institute when the study began and is currently Head of Genentech Research and Early Development.

The researchers focused on the somatosensory cortex, which may serve as a model for other regions of the cerebral cortex because it contains cells representing all of its major classes. For every day of cortex development, the researchers analyzed the brain using multiple technologies at the single-cell level. They used RNA-seq to measure which genes are expressed, as well as spatial transcriptomics to measure where genes are expressed in the tissue. They also used ATAC-seq to measure which parts of the genome were accessible for regulation.

"These technologies allowed us to look at different modes of gene expression and how genes are regulating each other. By combining these three modalities, we have a stronger sense of which are the important genes for directing neuronal development, for example" said Daniela Di Bella, a postdoctoral fellow in the Arlotta lab and co-first author of the study.

For instance, it has been unclear exactly when the cortex's diversity of different neuron populations is established. "We found that the different flavors of neurons are decided during the neuron maturation process, rather than pre-established in their stem cells," Di Bella said.

The researchers also used their data to predict the underlying mechanism of how genetic mutation leads to defects in cortical development, finding which specific developmental steps are failing and which cells are being affected.

"We have created a uniquely comprehensive molecular atlas of the developing somatosensory cortex, and we are continuing to mine the data for more insights," said co-first author Ehsan Habibi. "Our goal is for our data to serve as a resource for the wider neuroscience community and inform how the field looks at brain development, both during normal and disease processes."

"These combined, extensive measurements provided us with a first dynamic view of the symphony of molecular events that unfold as this critical region of the brain is built in the embryo," said Arlotta, who is also an institute member in the Stanley Center for Psychiatric Research at the Broad Institute. "Researchers have been studying the process of development of the cerebral cortex for over a century, but the mechanistic events that govern how cells are made and how they interact to ultimately form functional circuit have remained elusive. As a field, we have historically looked at this complex developing tissue one cell type at a time, and investigated small numbers of genes for their roles in putting together pieces of this amazing puzzle. But the brain does not develop one cell type at the time -- it is truly a symphony in the sense that hundreds of cell types undergo development together, using ever-changing lanscapes of genes to form the adult tissue. Now imagine having for the first time the full 'recipe' of genes that any given cell class uses as its development unfolds. Imagine also gaining detailed knowledge of the 'codes of genes' that turn on or off as distinct lineages of cells separate from each other and get built. This type of overarching mechanistic knowledge offers an opportunity to study cortical development in a brand new way, looking at all cells and all genes. We never had information this complete before and I must admit that I stared at the data in awe, thinking about the type of discovery that it enables."

"Ten years ago, this study would not have been possible because the technologies either did not exist or were not mature enough yet," Regev said. "But with advances in single-cell and spatial transcriptomics, and new machine learning algorithms for large data analysis, we were able to map where cells develop, put those maps together, and watch development unfold like a movie over time. We could not only reconstruct the movie, but could also link that picture to a greater biological understanding of brain development. We hope this approach could one day help us better understand and treat diseases of the brain."

Arlotta added: "It is a pretty interesting movie -- one that I have looked forward to filming for most of my scientific career."

Credit: 
Harvard University

Caloric restriction alters microbiome, enhancing weight loss

image: Peter Turnbaugh, Ph.D.

Image: 
UCSF

Researchers at UCSF have found that extreme caloric restriction diets alter the microbiome in ways that could help with weight loss but might also result in an increased population of Clostridioides difficile, a pathogenic bacterium that can lead to severe diarrhea and colitis.

Such diets, which allow people only 800 calories per day in liquid form, are an effective approach to weight loss in people with obesity. The unexpected results of this study raise the question of how much the microbiome influences weight loss and which bacteria are significant in that process. The study appears in the June 23, 2021, issue of Nature.

"Our results underscore that the role of calories in weight management is much more complex than simply how much energy a person is taking in," said Peter Turnbaugh, PhD, an associate professor of microbiology and immunology and a senior author on the study. "We found that this very-low-calorie diet profoundly altered the gut microbiome, including an overall decrease in gut bacteria."

Turnbaugh and his research team leveraged a clinical trial looking at a very-low-calorie liquid diet. That trial, which resulted in successful weight loss for many of the participants, was led by Joachim Spranger, MD, a professor of endocrinology and metabolic diseases at Charité Universitätsmedizin in Berlin, and co-senior author on the Nature study.

To investigate the microbial connection between this very-low-calorie diet and shedding pounds, Spranger's team collected and sequenced fecal samples from 80 participants - all of whom were post-menopausal women - before and after the trial, which lasted 16 weeks. The team then worked with members of the Turnbaugh lab to analyze the data and to transplant the samples into mice that had been raised in sterile environments.

The researchers allowed these mice to continue eating the same amount of food and found, to their surprise, that the rodents that had received a transplant of the post-diet microbiome lost weight.

"We drove weight loss just by colonizing these mice with a different microbial community," Turnbaugh said.

The next step was to identify bacteria that could be responsible for the weight loss. To do that, Jordan Bisanz, PhD, a former postdoctoral fellow in the Turnbaugh lab and a first author on the study, sequenced the gut microbiomes of the test mice and compared them to those of control mice.

Bisanz discovered a bacterial factor behind changes in weight the team had observed: higher levels of C. difficile.

In the gut, C. difficile is associated with the process of fat metabolism. Initially, fats are digested with the help of bile acids. These bile acids are then broken down by bacteria other than C. difficile, producing what are called secondary bile acids, metabolites that keep the growth of C. difficile in check. In other words, people who are eating less, particularly less fat, produce less bile, which in turn leads to fewer secondary bile acids and less of a check on the population of C. difficile.

"Ordinarily we would predict increased inflammation or even colitis following an increase in C. difficile," said Turnbaugh, who is also an investigator at the Chan Zuckerberg Biohub, Curiously, when he and his team examined the mice, they found only mild inflammation. That lack of inflammation suggests that C. diff could have important effects on metabolism that are distinguishable from the bacterium's ability to drive severe intestinal disease.

At the same time, Turnbaugh notes, it's not at all certain what would happen if someone stayed on the diet for a longer period of time and whether that could result in a full-blown C. difficile infection, which can be life threatening if it gets out of control.

"Let's be clear; we are definitely not promoting C. difficile as a new weight loss strategy," said Turnbaugh. "We've got a lot of biology left to unpack here." The data raise a lot of interesting and unexplored questions about what role C. difficile is playing beyond the severe inflammatory conditions associated with it, he said.

It's important to understand whether diet-induced changes to C. difficile level are harmful in humans, and how the balance between different microbial species in the gut is affected by different dietary choices, Turnbaugh said. Ultimately, that knowledge could allow clinicians to add or remove specific microbes in a patient's gut to help maintain a healthy body weight.

"Multiple lines of research shows that the gut microbiome can either hinder or enhance weight loss," said Turnbaugh. "We want to better understand how common weight loss diets might impact the microbiome and what the downstream consequences are for health and disease."

Credit: 
University of California - San Francisco

Study: Environmental risks exacerbated for vulnerable populations in small towns

image: Benjamin Shirtcliff

Image: 
Iowa State University

AMES, Iowa -- A new study of small Iowa towns found that vulnerable populations within those communities face significantly more public health risks than statewide averages.

The study, published this week in PLOS ONE, a peer-reviewed open access journal, was led by Benjamin Shirtcliff, associate professor of landscape architecture at Iowa State University.

He focused on three Iowa towns - Marshalltown, Ottumwa and Perry - as a proxy for studying shifting populations in rural small towns, in particular how vulnerable populations in these towns are affected by their built environment (where people live and work) and by environmental risks. Shirtcliff wants to understand how small towns can prioritize investment in their built environment for vulnerable populations in the face of declining economic resources driven by population change.

The study found the three towns have significantly higher environmental exposures than state averages, including more exposure to diesel, air toxins, lead paint in older homes and proximity to potential chemical accidents.

These risks are exacerbated for and increase physical and mental stress on populations with social vulnerability (minority status, low-income, linguistic isolation, below high school education, and populations under age 5 and over 64), which are also significantly higher in the three small towns than state averages.

With the growth of industrialized agriculture over the past few decades, small towns' populations have shifted, creating "...what environmental justice advocates describe as a 'double jeopardy' of injustice where people with the fewest resources reside in low-income communities with high levels of environmental risk and are unable to defend against social threats like racism," according to the study.

Urban areas benefit from more green space, which would make it seem as if small towns surrounded by green landscapes would have greater benefits. That's not always the case, Shirtcliff says, due to the routine application of pesticides, fertilizers and other organic and inorganic toxins.

"There is a rural health paradox: These small towns may appear on the outside that they're healthier and safer, but the reality is that the metrics cities use are not really compatible," he said.

This exposes a knowledge gap in current research: Measures of environmental risk and design on vulnerable populations in urban areas are not comparable to those in small towns.

'Parallel communities'

Shirtcliff describes these small towns as having "parallel communities," or populations that rarely interact due to their opposing work and personal schedules, geography and language barriers.

This research began following one of Shirtcliff's design studios in Perry four years ago. During an interview with Jon Wolseth, assistant director of community and economic development with ISU Extension and Outreach, he and fellow researchers learned about the parallel communities in the town.

"When we think about public health these days, we think about viruses and epidemics," he said. "What's increasingly being supported through research is that the neighborhoods we live in have huge impacts on our mental and physical health."

As some Iowans move to more urban areas from small towns, the built environment they leave behind is sometimes neglected.

People in these towns now face new barriers to reporting and seeking care for poor health effects from their built environment. There is also sometimes an information barrier; for example, rural populations may not connect higher rates of asthma with the landscape.

"Although the influx of foreign-born workers and their families to small towns has enabled economic growth in the hands of a local few, the stability of small towns is fragile," according to the study. "A decline in local investment coupled with aging infrastructure is likely to impact the built environments in small towns, potentially compounding deleterious effects as vulnerable populations bring families and become established."

Shirtcliff puts a call out to the landscape architecture profession, which can sometimes focus on broad-reaching issues such as major parks and environmental remediation, to also focus their efforts on "the banal, everyday 'human environment' where a sidewalk, street tree, and crosswalk make a fundamental difference." Low-cost interventions such as these can counteract "a mounting public health crisis in small towns," he says.

Credit: 
Iowa State University

Language trade-off? No, bilingual children reliably acquire English by age 5

image: U.S.-born children who live in Spanish-speaking homes and who also are exposed to English from infancy tend to become English dominant by age 5.

Image: 
Florida Atlantic University

In the United States, more than 12 million children hear a minority language at home from birth. More than two-thirds hear English as well, and they reach school age with varying levels of proficiency in two languages. Parents and teachers often worry that acquiring Spanish will interfere with children's acquisition of English.

A first-of-its-kind study of U.S.-born children from Spanish-speaking families, led by researchers at Florida Atlantic University, finds that minority language exposure does not threaten the acquisition of English by children in the U.S. and that there is no trade-off between English and Spanish. Rather, children reliably acquire English, and their total language knowledge is greater to the degree that they also acquire Spanish.

Results of the study, published in the journal Child Development, show that children with the most balanced bilingualism were those who heard the most Spanish at home and who had parents with high levels of education in Spanish.

Importantly, these children did not have lower English skills than the English-dominant children. Children's level of English knowledge was independent of their level of Spanish knowledge. U.S.-born children who live in Spanish-speaking homes and who also are exposed to English from infancy tend to become English dominant by age 5 - but some more so than others.

The study, conducted in collaboration with The George Washington University, is the first to describe the outcome of early dual language exposure in terms of bilingual skill profiles that reflect the relations in the data between children's skill levels in their two languages. The study addresses the question of what level of English and Spanish skill can be expected in 5-year-old children who come from Spanish-speaking homes in which they also hear English in varying amounts.

"We found that early in development, children who hear two languages take a little longer to acquire each language than children who hear only one language; however, there is no evidence that learning two languages is too difficult for children," said Erika Hoff, Ph.D., lead author and a professor in the Department of Psychology within FAU's Charles E. Schmidt College of Science on the FAU Broward Campuses.

A key finding from the study is that low levels of proficiency in two languages at age 5 is not a typical outcome of exposure to two languages. Bilingual children who have weak skills in both languages at age 5 may have an underlying impairment or inadequate environmental support for language acquisition.

For the study, Hoff and co-authors Michelle K. Tulloch, a Ph.D. student in the Charles E. Schmidt College of Science, and Cynthia Core, Ph.D., an associate professor in the Department of Speech, Language and Hearing Sciences within the Columbian College of Arts and Sciences, The George Washington University, used an examiner-administered test to measure the English and Spanish expressive vocabulary of 126 U.S.-born 5-year-olds from Spanish-speaking families with one or two immigrant parents. The children had been exposed to Spanish since birth and had also heard English at home in varying amounts, either from birth or soon thereafter. The researchers also measured indicators of the children's language-learning ability.

Prior to this study, differences among bilingual children were described primarily in terms of dominance (English-dominant bilinguals, Spanish-dominant bilinguals) and balance, but that turns out not to be the only way in which bilinguals differ.

"Previous research has tended to treat bilingual children's development in each language as a separate outcome, rather than treating dual language skills as the single outcome of dual language exposure," said Hoff. "This approach not only fails to adequately capture the nature of children's dual language skills, it also leaves unaddressed the question of how the acquisition of one language is related to the acquisition of another."

Findings from this study suggest that dominance is not the same thing as proficiency. Bilinguals differ both in dominance and in total language knowledge. Teachers and clinicians cannot infer a bilingual child's language proficiency from that child's language dominance. There are balanced bilinguals at age 5 who have stronger English skills than some English-dominant bilinguals. Individual differences in dominance are significantly related to home exposure, although the function that relates exposure to dominance is biased toward English.

Balanced language exposure at home does not result in balanced proficiency; Spanish-dominant home exposure appears to be necessary. Individual differences in total language knowledge are significantly related to indicators of language-learning ability, measured in this study in terms of phonological memory and nonverbal intelligence.

Credit: 
Florida Atlantic University

Virtual training helps underserved middle schoolers hone social skills

Middle school, a time when children's brains are undergoing significant development, is often also a time of new challenges in navigating the social world. Recent research from the Center for BrainHealth at UT Dallas demonstrates the power of combining a virtual platform with live coaching to help students enhance their social skills and confidence in a low-risk environment.

In this study, BrainHealth researchers partnered with low-income public middle schools in Dallas. Using questionnaires, teachers recommended 90 students to participate in virtual training sessions, which also tested the teachers' ability to accurately identify students who are struggling socially. Importantly, participation was not limited to students with a clinical diagnosis.

Next, the team explored the efficacy of using Charisma™, a proprietary virtual platform for social training built on a video game platform whose effectiveness has been demonstrated in controlled trials but never before in a school setting.

At the end of the training, students submitted self-assessments of progress, and teachers submitted evaluations based on observations in the classroom. Both sources reported improvement in students' confidence, participation in the classroom, and ability to communicate with peers and teachers, among other benchmarks. The results appear in Frontiers in Education.

"The middle school years are a time of dynamic emotional and cognitive changes for students," said Maria Johnson, Director of Youth & Family Innovations at the Center for BrainHealth and lead author of the study. "We wanted to see if we could empower middle schoolers to improve their ability to communicate in the classroom and enhance self-assertion."

The study confirmed that teachers are reliable identifiers of students who are struggling socially. The study also validated the feasibility of using this virtual platform for social training in a public school setting. Both students and teachers reported that the social communication and assertion strategies were most beneficial.

With high demands for communication, cooperation and assertion, a middle school classroom is rich with social interaction as well as considerable challenges, including peer pressure, academic competition and social comparison among peers, which may result in decreased connectedness with classmates, teachers and school staff. The potential exists for problematic behaviors that might be misclassified and treated with punishment rather than support.

Johnson continued, "Our findings are significant because using our virtual social training platform can be a key to helping individuals with social challenges soar. Demonstrating the power of this tool in a public middle school setting can inform future education policy to promote social independence and resiliency at a high level."

Credit: 
Center for BrainHealth

Machine learning for solar energy is supercomputer kryptonite

Supercomputers could find themselves out of a job thanks to a suite of new machine learning models that produce rapid, accurate results using a normal laptop.

Researchers at the ARC Centre of Excellence in Exciton Science, based at RMIT University, have written a program that predicts the band gap of materials, including for solar energy applications, via freely available and easy-to-use software. Band gap is a crucial indicator of how efficient a material will be in a new solar cell.

Band gap predictions involve quantum and atomic-scale chemical calculations and are often made using density functional theory. Until now, this process has required hundreds of hours of costly supercomputer processing time, as well as complicated and expensive software.

To address this issue, the researchers trained a machine learning model using data generated from 250,000 previous supercomputer calculations. The results have been published in the Journal of Cheminformatics.

Significantly, while the program is capable of including multiple variables, it was found that just one factor, stoichiometry, contains - in almost all cases - enough information to accurately predict band gap. Stoichiometry describes the numerical proportions in which chemical components combine, like the relative amounts of ingredients in a recipe to bake a cake.
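
As a rough illustration of this approach (a sketch, not the group's released software), a small neural network can be trained to map stoichiometric fractions to band gaps taken from earlier density functional theory calculations. The dataset file name, column layout and network size below are assumptions.

```python
# Minimal sketch: predicting band gap from stoichiometry with a small neural net.
# Assumes a hypothetical CSV "band_gaps.csv" with one row per material,
# "frac_<element>" columns holding atomic fractions, and a "band_gap_eV"
# column taken from prior DFT calculations.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

data = pd.read_csv("band_gaps.csv")
X = data.filter(like="frac_").values   # stoichiometric fractions as features
y = data["band_gap_eV"].values         # DFT-computed band gaps as targets

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

print("Mean absolute error (eV):", mean_absolute_error(y_test, model.predict(X_test)))
```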

More work is needed to fully understand why stoichiometry alone proved to be so useful. But it raises the exciting prospect of lengthy supercomputer calculations no longer being required for some applications. The artificial neural network that powers the machine learning programs could one day be succeeded by a software program that performs a similar function to density functional theory, albeit with far more simplicity.

Lead author Carl Belle said: "If you want to do simulations but you need to have millions of dollars of supercomputing infrastructure behind you, you can't do it. If we can dig into why the stoichiometric configuration is so powerful, then it could mean that supercomputers are not needed to screen candidate materials, nor for accurate simulations. It could really open things up to a whole new group of scientists to use."

The machine learning program isn't limited to band gap. It can be used to predict the properties of many other materials for other contexts, and has been developed by a professional programmer, making it useful not only for scientists and academics but also for businesses and corporate applications.

"It's built to industry standard and it's designed to be collaborative," Carl said.

"The website has a fully relational database. It's got millions of records. It's all there and freely available to use. We're ready to go."

Credit: 
ARC Centre of Excellence in Exciton Science

Tuckered out: Early Antarctic explorers underfed their dogs

image: Dr Jill Haley, the study's lead author, holds a Spratt's dog cake fragment from Canterbury Museum's collection.

Image: 
Canterbury Museum

It's one of the iconic images of early Antarctic exploration: the heroic explorer sledging across the icy wastes towed by his trusty team of canine companions.

But new research analysing a century-old dog biscuit suggests the animals in this picture were probably marching on half-empty stomachs: early British Antarctic expeditions underfed their dogs.

In a paper just published in Polar Record, researchers from Canterbury Museum, Lincoln University and University of Otago in New Zealand analysed the history and contents of Spratt's dog cakes, the chow of choice for the canine members of early Antarctic expeditions.

Lead author, Canterbury Museum Curator of Human History Dr Jill Haley, has researched the lives of dogs in Antarctica and curated the Museum's 2018 exhibition Dogs in Antarctica: Tales from the Pack.

"The early explorers valued their dogs, not just for pulling sledges but for their companionship in the bleak isolation of Antarctica," she says.

"Our analysis of a partially crumbled Spratt's dog cake, one of four cared for by Canterbury Museum, found that the contents of the cakes weren't that different to modern dog biscuits. However, the quantity dogs were fed on the expeditions didn't provide enough fuel for their high-energy activities."

Pet food was a relatively new invention in the early twentieth century and was seen as superior to the older practices of feeding dogs table scraps or letting them scavenge for themselves.

Early polar explorers were particularly keen on Spratt's dog cakes because they were easy to transport, took no effort to prepare and did not perish.

The cakes were used on two Arctic polar expeditions before they were taken south by Captain Robert Falcon Scott's Discovery expedition (1901-1904). The expedition's 18 sledge dogs were fed the biscuits alongside dried fish from Norway; all the animals died after consuming rancid fish on a sledging expedition.

Perhaps wanting to avoid a repeat of this episode, the handlers on Scott's Terra Nova expedition (1910-1913) fed the animals on Spratt's alone. On rations of 0.3 kg of biscuits each per day the dogs became desperately hungry, even eating their own excrement. They recovered when seal meat was added to their diet.

Ernest Shackleton took Spratt's on his Nimrod (1907-1909) and Endurance (1914-1917) expeditions, where they were part of a doggy diet that also included seal meat, blubber, biscuits and pemmican, a high-energy mix of fat and protein.

University of Otago researchers Professor Keith Gordon, Dr Sara Fraser-Miller and Jeremy Rooney used laser-based analysis to determine the composition of the materials in the cake down to micron resolution, identifying a number of constituents including wheat, oats and bone.

Lincoln University Associate Professor of Animal Science Dr Craig Bunt compared the cakes with similar foods, including modern dog food, and calculated how many kilojoules of energy each biscuit would have provided.

To match the energy intake needed by modern sledge dogs, the dogs on the early Antarctic expeditions would have needed to eat between 2.6 and 3.2 kg of Spratt's dog cakes a day.

However, historic accounts suggest daily dog rations on some expeditions were only around 0.5 kg of biscuits and were sometimes as low as 0.3 kg.
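
A quick back-of-the-envelope check using the figures above (an illustrative sketch, not part of the published analysis, which assumes the energy content per kilogram of biscuit is fixed) shows the scale of the shortfall:

```python
# Rough shortfall estimate from the ration figures quoted above.
required_kg_per_day = (2.6, 3.2)  # biscuits needed to meet a sledge dog's energy needs
actual_kg_per_day = (0.3, 0.5)    # historic daily rations on some expeditions

worst = actual_kg_per_day[0] / required_kg_per_day[1]
best = actual_kg_per_day[1] / required_kg_per_day[0]
print(f"Rations supplied roughly {worst:.0%}-{best:.0%} of the dogs' energy needs.")
# -> roughly 9%-19%
```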

The researchers concluded that Spratt's dog cakes were probably a suitable complete food for dogs in Antarctica; dogs on the early expeditions just weren't fed enough of them.

Credit: 
Canterbury Museum

Study reveals formation mechanism of first carbon-carbon bond in MTO process

image: Revealing the complete process of first C-C bond formation in the MTO reaction, based on in situ NMR spectroscopic evidence and advanced ab initio molecular dynamics (AIMD) calculations

Image: 
DICP

A joint research team led by Prof. LIU Zhongmin, Prof. WEI Yingxu and Prof. XU Shutao from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences (CAS) revealed the mechanism underlying the formation of the first carbon-carbon (C-C) bond during the methanol-to-olefins (MTO) process.

This study was published in Chem on June 23.

Prof. ZHENG Anmin's group from Innovation Academy for Precision Measurement Science and Technology of CAS was also involved in the study.

The first C-C bond in the MTO process is formed at the initial stage of the reaction. There has been no direct way to elucidate the bond formation mechanism because of the difficulty of capturing the intermediate species.

"We investigated the direct C-C bond formation mechanism during the initial MTO reaction over HSSZ-13 zeolite with an 8-membered ring and a chabazite topological structure," said Prof. XU.

They detected the evolution of the organic species on the SSZ-13 catalyst surface during the methanol conversion. For the first time, they directly captured the surface ethoxy species (SES), the critical species containing the initial C-C bond under real MTO reaction conditions at the initial reaction stage.

Moreover, the researchers employed the ab initio molecular dynamics (AIMD) theoretical calculation simulation to predict and present the visualized and complete process of initial C-C bond formation starting from the C1 reactants and C1 intermediates.

Based on the experimental and theoretical evidence, they established the complete and feasible initial C-C bond formation routes, namely, surface methoxy species (SMS)/trimethoxyonium (TMO) mediated methanol/dimethyl ether (DME) activation with a synergistic effect from SMS and the negatively charged framework oxygen atoms to form SES.

"Our study not only sheds light on the controversial issue of the first C-C bond formation in the MTO process, but also enriches the fundamental theory of C1 catalytic chemistry," said Prof. WEI.

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy of Sciences

When the right hand becomes hypersensitive after an injury to the left

The study, led by Professor Elena Enax-Krumova, holder of the endowed professorship of the German Social Accident Insurance (DGUV), was published in the journal Neurology on 19 May 2021.

Nerve injuries: frequent complication after occupational accidents

Peripheral nerves are the nerves that lie outside the brain and spinal cord, running throughout the entire body. These bundles of nerve fibres can be damaged by blunt or sharp force trauma in accidents, as well as during surgery. Injuries to the peripheral nerves are a frequent complication, particularly after occupational accidents. Patients often suffer from motor and sensory disorders in the affected area of the body, which can lead to persistent complaints and impairments. The underlying mechanisms are not yet fully understood.

"In patients with shingles, it is known that sensory abnormalities can not only occur in the affected area but also on the opposite side of the body," explains Enax-Krumova. "We wanted to find out to what extent such contralateral changes occur also in unilateral nerve injuries and to what factors they are related. Together with other centres from the German Research Network on Neuropathic Pain (DFNS) and some other European centres, the Department of Neurology and the Department of Pain Medicine (temporary senior doctor: Dr. Dr. Andreas Schwarzer) at Bergmannsheil were involved in a Europe-wide research project.

To answer this question, data sets from a total of 424 patients were analysed. They all suffered from unilateral painful or painless peripheral nerve disorder (neuropathy), caused either by a peripheral nerve injury, a nerve root injury or shingles. In all of the participants, the unaffected side of the body was examined with regard to possible sensory changes. Following a standardised procedure, detection and pain perception for cold, warm, sharp and blunt stimuli were examined using quantitative sensory testing (QST).

Contralateral sensory abnormalities

Contralateral sensory abnormalities were frequent in patients with both painful and painless unilateral neuropathy, even on the unaffected side of the body. Reduced contralateral perception of temperature and light touch was interpreted as an indicator of a possible unfavourable central nervous response. In a subgroup of patients with increased pain sensitivity on the affected side of the body, contralateral hypersensitivity to sharp stimuli was also registered. According to the research team, this could indicate hypersensitivity of the central nervous system, called central sensitisation. So-called descending facilitation from the brainstem of pain processing in the spinal cord appears to be a clinically important mechanism of pain amplification. The changes that were found did not depend on disease duration. Pain intensity, underlying condition and the affected area of the body were associated with changes in only single parameters of the test battery.

"The results of this study show that the mechanisms of sensory abnormalities following a unilateral nerve injury appear to spread to the opposite side of the human body, both for painful and painless nerve injuries," summarises Elena Enax-Krumova. "Patients with signs of central sensitisation on the unaffected side represent a subgroup that needs to be investigated further with regard to both the precise underlying mechanisms and their response to specific treatment options." The authors of the study expect that such findings will enable personalized treatment approaches for neuropathic pain in the future.

Credit: 
Ruhr-University Bochum

Junk food relief in lockdown

image: Headshots of Dr Nicola Buckland (Sheffield University, UK), left, and Professor of Psychology Eva Kemps (Flinders University, South Australia), right.

Image: 
Sheffield University / Flinders University

Beware of those snack attacks. A new study in Appetite has confirmed the small luxuries, from sweets and chocolate to salty treats, have helped to lift our spirits - and kilojoule intake - during COVID-19 lockdowns.

Researchers in England and Australia have gathered evidence about similar experiences in the UK and Victoria, Australia to warn about the effect of extended pandemic lockdowns on our eating behaviours.

While time at home provides more opportunity for healthy food preparation, intake of high-energy-density (HED) foods has risen for some - presenting at-risk adults with the prospect of managing weight gain, the psychology researchers warn.

"The new stresses created by the pandemic appears to be associated with reported increases in overall savoury and sweet snack intake," says lead researcher Dr Nicola Buckland, who assessed dietary survey responses from 588 people during the first UK lockdown (May-June 2020).

Participants indicated whether their intake of tasty, calorific foods (e.g. chocolate, cake, ice cream, pizza) had changed during the lockdown. These are foods that people typically report to be difficult to resist and difficult to control intake of. Participants also completed questionnaires that measured individual eating styles.

"The findings showed that, in terms of dietary changes, not everyone responded the same way to the lockdown. Over half of the respondents (53%) reported increased snack intake, 26% reported decreased snack intake and 20% reported no changes to the amount of snacks they ate during the lockdown.

"When we looked at the participants' eating styles (based on responses to the questionnaires), we found that participants who scored low in the ability to control cravings were more likely to report increased snack intake."

Flinders University Professor of Psychology Eva Kemps says the first COVID-19 lockdown started in Victoria, Australia just after the UK research was conducted.

The 124 respondents in the Australian survey also reported changes in food intake and eating styles during the lockdown, as well as their perceived stress levels.

Similar to the UK findings, of the Australian respondents, 49% reported increased snack intake during the COVID-19 lockdown, and the remaining respondents reported either reduced snack intake (25%) or no change (26%).

"Increased snack intake was associated with higher levels of perceived stress, indicating that those who experienced higher levels of stress reported greater increases of sweet and savoury snack foods," Professor Kemps says.

"Also, similar to the UK survey, participants with low craving control were most likely to report increased snack intake."

The findings from the UK and Australian surveys show that for some people, especially those who have difficulty controlling food cravings, the COVID-19 lockdowns were a risky time period for increased food intake, the researchers say.

"Our findings support the use of strategies that can help people to manage their cravings, to minimise the risk of increased snack intake during the COVID-19 lockdowns."

Credit: 
Flinders University

Kit clashes affect performance in football matches, new study shows

The response times of footballers are slowed when part of the kit worn by both teams is the same colour, a new study shows.

The research from the University of York revealed that when players have any kit colour clash - either shirt or shorts - it takes them twice as long to find a fellow player on the pitch.

Study authors are calling for a change in the laws of the game or for clearer guidance.

Researchers used two experiments to investigate how kit variations affect the visual search for teammates. Their first experiment confirmed that players are slower to discriminate between teams when playing in crossed kits, for example red shirts-blue shorts versus blue shirts-red shorts.

The second experiment found that there was significant confusion when both teams wore the same coloured shorts, and that response times were quicker for both teams when there was no overlap in shorts colour.

First author Liam Burnell, a graduate of the Department of Psychology, said: "We found that where, for example, one team wears white tops and red shorts and the other wears red tops and white shorts, the search time for a fellow team player is affected.

"With crossed conditions also impairing the ability to discriminate two teams quickly and accurately, rules that prevent this from occurring may be beneficial in preventing misplaced passes, and incorrect refereeing decisions."

The study cites examples including the 2017 English Football League Cup Final where Manchester United played Southampton in which the players were wearing crossed kits of red and white. A Southampton goal was controversially disallowed when the assistant referee had to make an offside call.

Liam Burnell added: "This could have implications for the laws of the game, with current guidelines only prompting a change in kit when the referee believes kits to make teams indistinguishable. There is currently no requirement for teams to wear different shorts colours, and guidelines remain pretty vague as to what constitutes a kit clash."

The study also made some preliminary investigations into how different strip colours affect player detectability against stadium backgrounds, exploring whether Ferguson's infamous claim that his players couldn't see each other in their 1996 clash against Southampton could carry any truth.

Researchers also cited comments from the then Wales manager, Chris Coleman, before a World Cup qualifier against Austria, saying that no one wanted to play in grey as it made players hard to distinguish. Derek Adams, the then Plymouth Argyle manager, commented in 2017 on the team's dark green strip, reflecting on how they had changed the kit a little by bringing in white socks to help players, because the strip blended in with the seating and the grass.

Study supervisor, Professor Peter Thompson from the Department of Psychology said: "These reports suggest that strips including those that are grey and dark coloured do not 'pop-out' from the background, a problem which may be exacerbated when the team strip is close in colour to the stadium seating and particularly when playing in an empty stadium."

The research was conducted using graphic images from video games and alternating kit colours which were then transposed onto different backgrounds. Participants' response time in identifying the different teams was then monitored.

Credit: 
University of York