Culture

Is disability a risk factor for miscarriage?

image: Journal of Women's Health, published monthly, is a core multidisciplinary journal dedicated to the diseases and conditions that hold greater risk for or are more prevalent among women, as well as diseases that present differently in women.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, December 3, 2019--A new study compared the proportion of women with any cognitive, physical, or independent living disability who experienced a miscarriage during the previous 5-year period to the corresponding proportion among women without disabilities. Regardless of the type of disability, a greater proportion of women with a disability had a miscarriage, according to the study results published in Journal of Women's Health, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers. The full-text article is freely available on the Journal of Women's Health website through January 3, 2020.

The article, titled "Miscarriage Occurrence and Prevention Efforts by Disability Status and Type in the United States," was coauthored by Mekhala Dissanayake, MPH, Blair Darney, PhD, MPH, Aaron Caughey, MD, PhD, and Willi Horner-Johnson, PhD, of Oregon Health & Science University (Portland), Portland State University, and the National Institute of Public Health (Cuernavaca, Mexico).

The researchers analyzed data on 3,843 women in the National Survey of Family Growth and reported that women with disabilities were more likely to receive services to prevent miscarriage compared to women without disabilities. They also found that among women who had a miscarriage, only women with independent living disability were significantly more likely to have experienced two or more miscarriages compared to women without disabilities.

Susan G. Kornstein, MD, Editor-in-Chief of Journal of Women's Health and Executive Director of the Virginia Commonwealth University Institute for Women's Health, Richmond, VA, states: "The researchers found that higher proportions of pregnancies in women with disabilities ended in miscarriages compared to women without disabilities. Further research is needed to understand why this is true despite higher odds of receiving preventive services among women with disabilities."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

How does protein fit in your holiday diet or New Year's resolutions?

image: A new study by Purdue University nutrition scientists shows that eating more protein daily than what is recommended may benefit only a few - those who are actively losing weight by cutting calories or those strength training to build more lean muscle mass.

Image: 
Shaw Nielsen

While some diets load up on protein and other diets dictate protein sources, it can be hard to know what to consume while managing weight or during weight loss.

A new study by Purdue University nutrition scientists shows that eating more protein daily than what is recommended may benefit only a few - those who are actively losing weight by cutting calories or those strength training to build more lean muscle mass. This study also affirms that the recommended dietary allowance of 0.8 grams of protein per kilogram of body weight per day - or 0.36 grams per pound - is adequate for most people. For example, an adult who weighs 150 pounds should eat 54 grams of protein a day, which could come from three ounces of lean meat, three cups of dairy and one ounce of seeds or nuts.
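As a quick back-of-envelope illustration of the allowance described above (the calculation is ours, not part of the study), the RDA can be worked out for any body weight:

```python
# Illustrative sketch: the protein RDA of 0.8 g per kg of body weight
# per day, applied to a body weight given in pounds.

LB_PER_KG = 2.20462   # pounds per kilogram
RDA_G_PER_KG = 0.8    # recommended dietary allowance, g protein / kg / day

def protein_rda_grams(weight_lb: float) -> float:
    """Daily protein RDA in grams for a given body weight in pounds."""
    weight_kg = weight_lb / LB_PER_KG
    return weight_kg * RDA_G_PER_KG

# A 150-pound adult needs roughly 54 grams of protein per day.
print(round(protein_rda_grams(150)))  # -> 54
```

The 0.36 grams-per-pound shorthand quoted in the article gives the same answer (150 x 0.36 = 54).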

"But here is the hard part for consumers: These findings support that most adults who are consuming adequate amounts of protein may only benefit from moderately higher protein intake when they are purposefully trying to change their body composition such as when dieting or strength training. The results are not meant to encourage everyone to increase their protein intake in general," said Wayne Campbell, a professor of nutrition science, whose research integrates exercise physiology, geriatrics and nutrition, especially protein.

The study was led by Joshua L. Hudson, Purdue postdoctoral research associate, and it is published in Advances in Nutrition.

"This research uniquely assesses whether adults benefit from consuming more protein than the current recommended dietary allowance," Hudson said. "This research was not designed to assess whether or not adults would benefit from consuming more protein than they usually consume. This distinction is important because the recommended dietary allowance is the standard against which to assess nutrition adequacy; however, most adults consume more protein than what is recommended."

When people are in a neutral metabolic state - not losing weight or lifting weights - eating more protein does not influence their body composition any differently, including lean mass, which is consistent with the current recommended dietary allowances being adequate for generally healthy sedentary weight-stable people. This does not include adults with Type 2 diabetes.

"And that is important because there is so much encouragement, advertising and marketing for everyone to eat higher protein diets, and this research supports that, yes, under certain conditions, including strength training and weight loss, moderately more protein may be helpful, but that doesn't mean more is needed for everybody at all times," Hudson said.

More than 1,500 nutrition articles were screened across journal databases to identify 18 studies with 22 intervention groups and 981 participants that addressed this topic. The studies were selected based on specific factors including inclusion of healthy adults, protein intake, weight loss and physical activity. The sources of protein evaluated included lean and minimally processed meats, dairy, eggs, nuts, seeds and legumes.

"This research is clinically more important for women and especially older women who are known to typically consume lower amounts of protein and should be maintaining a healthy bodyweight and regularly strength training," Campbell said.

What do these findings mean for someone watching their weight during the holidays or planning New Year's resolutions?

"If you are going to start losing weight, don't cut back across all foods you usually consume, because you'll inadvertently cut back protein. Instead, work to maintain, or even moderately increase, protein-rich foods. Then, cut back on the carbs and saturated fat-containing foods," said Campbell, who studies how sources and amounts of protein - which is critical to building muscle mass - may be a part of adopting healthy eating patterns, including the Mediterranean diet and DASH diet.

These findings are general, and more evaluation is needed to determine how the effects vary with age and gender. This research does not apply to elite athletes or people who lost weight with bariatric surgery, nor does it relate to protein supplements.

No external funding was used for this study. Campbell's lab continues to study the influences of healthy eating patterns and diets with different amounts and sources of protein on changes in body composition and clinical health risk factors.

Credit: 
Purdue University

NASA's exoplanet-hunting mission catches a natural comet outburst in unprecedented detail

image: This animation shows an explosive outburst of dust, ice and gases from comet 46P/Wirtanen that occurred on September 26, 2018 and dissipated over the next 20 days. The images, from NASA's TESS spacecraft, were taken every three hours during the first three days of the outburst.

View animated GIF: https://www.nasa.gov/sites/default/files/thumbnails/image/wirtanen_outbu...

Enlarged GIF: https://www.nasa.gov/sites/default/files/thumbnails/image/wirtanen_outbu...

Image: 
Farnham et al./NASA

Using data from NASA's Transiting Exoplanet Survey Satellite (TESS), astronomers at the University of Maryland (UMD), in College Park, Maryland, have captured a clear start-to-finish image sequence of an explosive emission of dust, ice and gases during the close approach of comet 46P/Wirtanen in late 2018. This is the most complete and detailed observation to date of the formation and dissipation of a naturally occurring comet outburst. The team members reported their results in the November 22 issue of The Astrophysical Journal Letters.

"TESS spends nearly a month at a time imaging one portion of the sky. With no day or night breaks and no atmospheric interference, we have a very uniform, long-duration set of observations," said Tony Farnham, a research scientist in the UMD Department of Astronomy and the lead author of the research paper. "As comets orbit the Sun, they can pass through TESS' field of view. Wirtanen was a high priority for us because of its close approach in late 2018, so we decided to use its appearance in the TESS images as a test case to see what we could get out of it. We did so and were very surprised!"

"While TESS is a powerhouse for discovering planets orbiting nearby, bright stars, its observing strategy enables so much exciting additional science," said TESS project scientist Padi Boyd of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Since the TESS data are rapidly made public through NASA's Mikulski Archive for Space Telescopes (MAST), it's exciting to see scientists identifying which data are of interest to them, and then doing all kinds of additional serendipitous science beyond exoplanets."

Normal comet activity is driven by sunlight vaporizing the ices near the surface of the nucleus, and the outflowing gases drag dust off the nucleus to form the coma. However, many comets are known to experience occasional spontaneous outbursts that can significantly, but temporarily, increase the comet's activity. It is not currently known what causes outbursts, but they are related to conditions on the comet's surface. A number of potential trigger mechanisms have been proposed. One is a thermal event, in which a heat wave penetrates a pocket of highly volatile ices, causing the ice to rapidly vaporize and produce an explosion of activity. Another is a mechanical event, in which a cliff collapses, exposing fresh ice to direct sunlight. Thus, studies of outburst behavior, especially in the early brightening stages that are difficult to capture, can help us understand the physical and thermal properties of the comet.

Although Wirtanen came closest to Earth on December 16, 2018, the outburst occurred earlier in its approach, beginning on September 26, 2018. The initial brightening of the outburst occurred in two distinct phases, with an hour-long flash followed by a more gradual second stage that continued to grow brighter for another 8 hours. This second stage was likely caused by the gradual spreading of comet dust from the outburst, which causes the dust cloud to reflect more sunlight overall. After reaching peak brightness, the comet faded gradually over a period of more than two weeks. Because TESS takes detailed, composite images every 30 minutes, the team was able to view each phase in exquisite detail.

"With 20 days' worth of very frequent images, we were able to assess changes in brightness very easily. That's what TESS was designed for, to perform its primary job as an exoplanet surveyor," Farnham said. "We can't predict when comet outbursts will happen. But even if we somehow had the opportunity to schedule these observations, we couldn't have done any better in terms of timing. The outburst happened mere days after the observations started."

The team has generated a rough estimate of how much material may have been ejected in the outburst, about one million kilograms (2.2 million pounds), which could have left a crater on the comet of around 20 meters (about 65 feet) across. Further analysis of the estimated particle sizes in the dust tail may help improve this estimate. Observing more comets will also help to determine whether multi-stage brightening is rare or commonplace in comet outbursts.
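To see how the two figures above hang together, here is a rough sanity check (the assumptions are ours, not the paper's): if the ejected mass excavated a hemispherical crater in a nucleus with a bulk density of about 500 kg/m^3, a value often assumed for comet nuclei, the implied crater diameter lands near the ~20-meter estimate quoted above.

```python
import math

# Back-of-envelope check: crater size implied by the ejected-mass estimate,
# assuming a hemispherical crater and an assumed comet bulk density.

EJECTED_MASS_KG = 1.0e6   # ~1 million kg, from the estimate above
BULK_DENSITY = 500.0      # kg/m^3, assumed typical for comet nuclei

volume = EJECTED_MASS_KG / BULK_DENSITY             # excavated volume, m^3
# Hemisphere: V = (2/3) * pi * r^3  ->  r = (3V / (2*pi))^(1/3)
radius = (3.0 * volume / (2.0 * math.pi)) ** (1.0 / 3.0)
diameter = 2.0 * radius

print(f"{diameter:.0f} m")  # roughly 20 m across
```

The answer scales only with the cube root of mass over density, so even a factor-of-two error in the assumed density moves the diameter by only about 25 percent.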

TESS has also detected Wirtanen's dust trail for the first time. Unlike a comet's tail--the spray of gas and fine dust that follows behind a comet, growing as it approaches the sun--a comet's trail is a field of larger debris that traces the comet's orbital path as it travels around the sun. Unlike a tail, which changes direction as it is blown by the solar wind, the orientation of the trail stays more or less constant over time.

"The trail more closely follows the orbit of the comet, while the tail is offset from it, as it gets pushed around by the sun's radiation pressure. What's significant about the trail is that it contains the largest material," said Michael Kelley, an associate research scientist in the UMD Department of Astronomy and a co-author of the research paper. "Tail dust is very fine, a lot like smoke. But trail dust is much larger--more like sand and pebbles. We think comets lose most of their mass through their dust trails. When the Earth runs into a comet's dust trail, we get meteor showers."

While the current study describes initial results, Farnham, Kelley and their colleagues look forward to further analyses of Wirtanen, as well as other comets in TESS' field of view. "We also don't know what causes natural outbursts and that's ultimately what we want to find," Farnham said. "There are at least four other comets in the same area of the sky where TESS made these observations, with a total of about 50 comets expected in the first two years' worth of TESS data. There's a lot that can come of these data."

Credit: 
NASA/Goddard Space Flight Center

Tech startups gravitate toward cities with strong social networks, study finds

AUSTIN, Texas -- The presence of technology startups can drive economic growth for their home cities. So how can cities better appeal to entrepreneurs? A new study from the McCombs School of Business at The University of Texas at Austin shows that the connections cities can offer matter more than big money.

The research shows that today's successful tech entrepreneurs gravitate to areas where they have a strong social support network and can also obtain multiple rounds of relatively modest funding.

"If you're starting a restaurant, location matters and large funding matters," said Rajiv Garg, assistant professor of information, risk and operations management at the McCombs School. "But if you're building a mobile app, the cost is low and nobody cares where you're located, so the factors that influence a move are different."

To pinpoint those factors, Garg, along with John Sibley Butler, professor of management at McCombs, and Bryan Stephens, postdoctoral research associate at Duke University's Fuqua School of Business, conducted a study, "Social Networks, Funding, and Regional Advantages in Technology Entrepreneurship: An Empirical Analysis," published today in Information Systems Research.

The researchers used economic indicators made publicly available by the United States government, investment information from CrunchBase and PricewaterhouseCoopers, and data from LinkedIn to analyze 1,418 U.S. entrepreneurs who had secured funding for their firms during the past 10 years.

Surprisingly, the amount of money available in a location did not significantly affect a startup founder's decision about where to operate, Garg said.

Real estate and other costs of living are deterrents to starting firms in places such as Silicon Valley and New York City. Instead, today's entrepreneurs are gravitating to cities such as Austin, Seattle, San Diego and Denver, where costs may be lower. The entrepreneurs in the study were concerned with steady access to smaller sums -- with hopes for bigger sums after they bring in some profit and build a strong customer base.

"We found angel investors play a huge role, because they give small amounts of money but in much greater volume," Garg said. Equally attractive, he said, are venture capital firms that are willing to distribute, for example, $10 million of funding among 10 startup firms, versus directing all of the funds to one.

Researchers also observed the strong role of social networks in tech entrepreneurs' decisions about where to locate. The more LinkedIn connections they had in a city, the more likely they were to stay based there. This concentration of social connection had the same "sticky" effect for people who had not yet started their business, strongly predicting whether they would stay or move before their launch.

"We found that having more connections in a particular city not only pulled talent to that area, but it kept them there," Garg said. The draw of a strong social network is especially impressive, because the research revealed that tech entrepreneurs were less likely to move than other types of entrepreneurs.

So cities looking to attract entrepreneurial talent would do well to foster these social connections through events or organizations such as the South by Southwest megaconference and nationwide accelerator program TechStars. They also need to draw young entrepreneurs, just not too young. The target age for these entrepreneurs is early 30s, the research found: Entrepreneurs in this age range were the most successful, boasting youthful energy coupled with maturity and work experience.

"This is very important for cities that are shrinking or dying because of their location," Garg said. "They aren't in a great hub, but they could make themselves more appealing to tech entrepreneurs just by following a few simple steps."

Credit: 
University of Texas at Austin

Women wearing hijabs in news stories may be judged negatively

UNIVERSITY PARK, Pa. -- Women wearing a veil or headscarf in the United States may face harsher social judgment, according to a study by Penn State researchers, which found that, given the same information in a news story, some people may consider a woman wearing a headscarf more likely to have committed a crime.

The findings -- recently published in the journal Mass Communication and Society -- suggest potential obstacles that women who wear hijabs may face in the real world.

According to the researchers, the hijab has become a topic of intense debate over the past 20 years. Previous research found that while the hijab may be a symbol of religiosity to some, it may signal a threat to others. The current study sought to quantify the impact that media portrayals of veiled women have on news consumers.

"The influence of someone's view of a woman wearing a hijab can have major effects on that woman," said Colleen Connolly-Ahern, an associate professor at Penn State's Donald P. Bellisario College of Communications. "Through our findings, you can see what this could mean to a woman in a hijab if she is facing a jury."

The researchers found that a person's political identity was associated with how they reacted to a woman wearing a hijab. This was measured by comparing the gap -- called parochial empathy -- between how respondents feel about a group they identify with and a group they do not.
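The parochial-empathy gap described above is, at its core, a simple difference score. The sketch below is our own illustration of the idea; the variable names and ratings are hypothetical, not the study's actual instrument or data:

```python
# Illustrative sketch of a parochial-empathy score: the gap between
# empathy felt for an in-group member and an out-group member.
# (Hypothetical example, not the study's measure or data.)

def parochial_empathy(empathy_ingroup: float, empathy_outgroup: float) -> float:
    """Difference in empathy ratings; larger positive values mean
    empathy is reserved mostly for one's own group."""
    return empathy_ingroup - empathy_outgroup

# e.g., a respondent rating empathy for each woman on a 1-7 scale:
print(round(parochial_empathy(6.2, 3.8), 1))  # -> 2.4
```

A score near zero would indicate a respondent who empathizes with both women about equally.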

One thousand respondents from the United States took part in the study through an online panel. The participants watched videos of newscasts created by university video production professionals that featured stories about three individuals. Two of the stories were about a gardener and a contestant in a Miss Pennsylvania pageant, while the third was about a female college student suspected of contacting a terrorist group.

All participants watched identical versions of the stories about the gardener and beauty pageant contestant. However, each participant was randomly assigned to view one of four versions of the story about the student suspect, in which the newscast described the student as a U.S. citizen wearing a hijab, a U.S. citizen not wearing a hijab, a refugee wearing a hijab, or a refugee not wearing a hijab.

After viewing the newscasts, the participants were presented with both positive and negative hypothetical scenarios about both the beauty pageant contestant and the student suspect. For example, "she found a sentimental possession she thought she lost" and "she found a $20 bill on the street." The participants were then asked about how they felt about these scenarios, and their responses were used to calculate differences in empathy for the two women. They were also asked about their perceptions of the student's innocence.

The researchers found that participants who identified as Republican tended to show a significant difference in empathy toward the woman -- and lower levels of perceived innocence -- when she was wearing a headscarf. Republicans' "preexisting dispositions were evoked by the veil," according to the report.

"The visual seems to be the difference," Connolly-Ahern said. "Conservatives were more likely to see someone as innocent until proven guilty, but when a hijab was present, that all changed. So what is the cost of wearing a veil in America today? Even when the woman is identified as a U.S. citizen, she is treated as 'other' by some."

According to lead authors Connolly-Ahern and Lee Ahern, an associate professor at the Bellisario College, the study has several implications for journalists. Connolly-Ahern said the results indicate that reporters may want to consider the preexisting biases of their audiences when covering women wearing headscarves. These portrayals, she said, "have real consequences in how people respond to news stories."

According to the researchers, the media will often generalize Muslims by regularly showing footage of women in hijabs -- even though only a small number of Muslim-American women wear them. Similarly, research found that journalists will often reference a woman's headscarf, whether it was relevant to the story or not.

"Communicators have an opportunity and responsibility to improve these messages," said Connolly-Ahern, a senior research fellow at the Arthur W. Page Center for Integrity in Public Communication, which funded the study.

While past studies have described different ways women in headscarves have been portrayed in the media, this study takes the first step toward quantifying the impact of those messages. The researchers say their model could likely be applied to other situations -- like religion and education level -- and in other media types as well.

Credit: 
Penn State

A common drug could help restore limb function after spinal cord injury

Long-term treatment with gabapentin, a commonly prescribed drug for nerve pain, could help restore upper limb function after a spinal cord injury, new research in mice suggests.

In the study, mice treated with gabapentin regained roughly 60 percent of forelimb function in a skilled walking test, compared to restoration of approximately 30 percent of forelimb function in mice that received a placebo.

The drug blocks activity of a protein that has a key role in the growth process of axons, the long, slender extensions of nerve cell bodies that transmit messages. The protein stops axon growth at times when synapses form, allowing transmission of information to another nerve cell.

The research showed that gabapentin blocks the protein from putting on its brakes, effectively allowing axons to grow longer after injury.

"There is some spontaneous recovery in untreated mice, but it's never complete. The treated mice still have deficits, but they are significantly better," said senior author Andrea Tedeschi, assistant professor of neuroscience at The Ohio State University.

"This research has translational implications because the drug is clinically approved and already prescribed to patients," he said. "I think there's enough evidence here to reconsider how we use this drug in the clinic. The implication of our finding may also impact other neurological conditions such as brain injury and stroke."

The regained function in mice occurred after four months of treatment - the equivalent of about nine years in adult humans.

"We really have to consider that rebuilding neuronal circuits, especially in an adult central nervous system, takes time. But it can happen," said Wenjing Sun, research assistant professor of neuroscience at Ohio State and first author of the publication.

The study is published in the Journal of Clinical Investigation.

The spinal cord injury in these mice is located near the top of the spine. Humans with this type of injury generally lose enough sensation and movement to require assistance with daily living tasks.

After receiving gabapentin for four months, the treated mice were better able to move across a horizontal ladder and spread their forelimb toes than untreated mice. When the researchers used a special technique to silence neurons in the repair pathway they had targeted, there was no difference in functional recovery between treated and untreated mice.

"Now we can comfortably say that whatever we see in terms of structural and functional alterations of this motor pathway is really meaningful in promoting recovery in these mice," Tedeschi said.

Tedeschi noted that in this study, treatment with gabapentin occurred much earlier than is typical in human medicine, when it is prescribed to treat existing neuropathic pain and other neurological conditions.

"Gabapentin is given when the nervous system is already having issues associated with maladaptive plasticity that hinders normal function. We are giving it much, much earlier, when the nervous system may be more responsive to programming an adaptive repair process," he said.

A retrospective study of European medical data published in 2017 showed that individuals who had received anticonvulsants - gabapentin or a similar drug - early after spinal cord injury regained motor function. It was not a clinical trial, but the analysis showed an association between taking a class of drugs called gabapentinoids and regaining muscle strength.

Plenty of questions remain: how and when to adjust the amount of gabapentin used for treatment, and whether the drug could be combined with other interventions used to promote repair of an injured spinal cord at chronic stages. But testing the effectiveness of the drug in larger animal models is a logical next step prior to embarking on clinical trials, Tedeschi said.

"With all the evidence and mechanistic insight we provide, I feel like we are in a better situation to start planning a more translational type of research," he said. "It's the right time to try."

Tedeschi's research focuses on neurons in the corticospinal tract - specifically motor neurons that carry signals from the central nervous system to the body telling muscles to move. These cells are particularly important in controlling voluntary movement, which is impaired in cervical spinal cord injuries modeled in the study.

This work builds upon the recent discovery of the regulatory role of a neuronal receptor called alpha2delta2 in controlling axon growth ability. Tedeschi and colleagues have determined that alpha2delta2 facilitates synapse formation by putting on the brake for axon growth, an essential step during the development of the central nervous system.

The researchers discovered in the current study that after a cervical spinal cord injury, affected motor neurons above the spine increased the expression of this receptor, interfering with axons' ability to regrow. If axon repair doesn't go as expected and neuronal circuits are reorganized improperly, individuals with spinal cord injury may experience uncontrolled movement and pain.

"When neuronal circuits need to be rebuilt after injury, we need to down-regulate the expression of the receptor so axons can re-engage in an active growth program. And we found that it's doing exactly the opposite," said Tedeschi, also a member of Ohio State's Chronic Brain Injury Discovery Theme.

"Because this receptor can be pharmacologically blocked through administration of clinically approved drugs called gabapentinoids - for example, gabapentin and pregabalin - that's a very powerful target that you can modulate as long as you take the drug."

Credit: 
Ohio State University

UMD astronomers catch a natural comet outburst in unprecedented detail

image: This animation shows an explosive outburst of dust, ice and gases from comet 46P/Wirtanen that occurred on September 26, 2018 and dissipated over the next 20 days. The images, from NASA's TESS spacecraft, were taken every three hours during the first three days of the outburst.

Image: 
Farnham et al./NASA

University of Maryland astronomers have made the most complete and detailed observations to date of the formation and dissipation of a naturally occurring comet outburst. Using data from NASA's Transiting Exoplanet Survey Satellite (TESS), the researchers gained a clear start-to-finish image sequence of an explosive emission of dust, ice and gases during the close approach to Earth of comet 46P/Wirtanen in late 2018. The team members reported their results in the November 22, 2019 issue of The Astrophysical Journal Letters.

"TESS spends nearly a month at a time imaging one portion of the sky. With no day or night breaks and no atmospheric interference, we have a very uniform, long-duration set of observations," said Tony Farnham, a research scientist in the UMD Department of Astronomy and the lead author of the research paper. "As comets orbit the sun, they can pass through TESS' field of view. Wirtanen was a high priority for us because of its close approach in late 2018, so we decided to use its appearance in the TESS images as a test case to see what we could get out of it. We did so and were very surprised!"

According to Farnham, the TESS observations of comet Wirtanen were the first to capture all phases of a natural comet outburst, from beginning to end. He noted that three previous observations came close to recording the beginning of an outburst event. Observations of a 2007 outburst from comet 17P/Holmes began late, missing several hours of the initial brightening phase of the event. In 2017, observations of an outburst from comet 29P/Schwassmann-Wachmann 1 (SW1) concluded early, due to limitations on pre-scheduled observation time. And, while observations from the UMD-led Deep Impact mission captured an outburst from comet Tempel 1 in unprecedented detail in 2005, that outburst was not natural--it was created by the mission's impactor module. The current observations are also the first to capture the dissipation phase in its entirety, Farnham said.

Although Wirtanen came closest to Earth on December 16, 2018, the outburst occurred earlier in its approach, beginning on September 26, 2018. The initial brightening of the outburst occurred in two distinct phases, with an hour-long flash followed by a more gradual second stage that continued to grow brighter for another 8 hours. This second stage was likely caused by the gradual spreading of comet dust from the outburst, which causes the dust cloud to reflect more sunlight overall. After reaching peak brightness, the comet faded gradually over a period of more than two weeks. Because TESS takes detailed, composite images every 30 minutes, the team was able to view each phase in exquisite detail.

"With 20 days' worth of very frequent images, we were able to assess changes in brightness very easily. That's what TESS was designed for, to perform its primary job as an exoplanet surveyor," Farnham said. "We can't predict when comet outbursts will happen. But even if we somehow had the opportunity to schedule these observations, we couldn't have done any better in terms of timing. The outburst happened mere days after the observations started."

Farnham and his colleagues are also the first to observe Wirtanen's dust trail. Unlike a comet's tail--the spray of gas and fine dust that follows behind a comet, growing as it approaches the sun--a comet's trail is a field of larger debris that traces the comet's orbital path as it travels around the sun. Unlike a tail, which changes direction as it is blown by the solar wind, the orientation of the trail stays more or less constant over time.

"The trail more closely follows the orbit of the comet, while the tail is more offset from it, as it gets pushed around by the sun's radiation pressure. What's significant about the trail is that it contains the largest material," said Michael Kelley, an associate research scientist in the UMD Department of Astronomy and a co-author of the research paper. "Tail dust is very fine, a lot like smoke. But trail dust is much larger--more like sand and pebbles. We think comets lose most of their mass through their dust trails. When the Earth runs into a comet's dust trail, we get meteor showers."

While the current study describes initial results, Farnham, Kelley and their colleagues look forward to further analyses of Wirtanen, as well as other comets in TESS' field of view. The team has generated a rough estimate of how much material may have been ejected in the outburst (about 2.2 million pounds, or roughly one million kilograms, which could have left a crater close to 65 feet, or 20 meters, across), but further analysis of the estimated particle sizes in the dust tail may help improve this estimate. Observing more comets will also help to determine whether multi-stage brightening is rare or commonplace in comet outbursts.

"We also don't know what causes natural outbursts and that's ultimately what we want to find," Farnham said. "There are at least four other comets in the same area of the sky where TESS made these observations, with a total of about 50 comets expected in the first two years' worth of TESS data. There's a lot that can come of these data. We're still finding out the capabilities of TESS, so hopefully we'll have more to report on this comet and others very soon."

Credit: 
University of Maryland

Genomic gymnastics help sorghum plant survive drought

image: A new study led by UC Berkeley researchers reveals how sorghum crops alter the expression of their genes to adapt to drought conditions. Understanding how sorghum survives harsh conditions could help researchers design crops that are more resilient to climate change.

Image: 
UC Berkeley photo by Peggy Lemaux

Berkeley -- Scorching temperatures and parched earth are no match for the sorghum plant -- this cereal crop, native to Africa and Australia, will remain green and productive, even under conditions that would render other plants brown, brittle and barren.

A new study published this week in the journal Proceedings of the National Academy of Sciences provides the first detailed look at how the plant exercises exquisite control over its genome -- switching some genes on and some genes off at the first sign of water scarcity, and again when water returns -- to survive when its surroundings turn harsh and arid.

"Sorghum really is drought tolerant, and if we learn how it is able to be so drought-tolerant, then perhaps we can help some other plants survive in the same way," said Peggy Lemaux, a cooperative extension specialist in the University of California, Berkeley's Department of Plant and Microbial Biology and co-author of the paper.

The massive dataset, collected from 400 samples of sorghum plants grown during 17 weeks in open fields in California's Central Valley, reveals that the plant modulates the expression of a total of 10,727 genes, or more than 40% of its genome, in response to drought stress. Many of these changes occur within a week of the plant missing a weekly watering or after it is first watered after weeks of no watering.

The data was collected as part of the Epigenetic Control of Drought Response in Sorghum, or EPICON, project, a five-year, $12.3 million study into how the sorghum plant is able to survive the stress of drought. The EPICON study is run as a partnership between UC Berkeley researchers and scientists at UC Agriculture and Natural Resources (UC ANR) and the U.S. Department of Energy's Joint Genome Institute (JGI) and Pacific Northwest National Laboratory (PNNL).

To conduct the research, the team cultivated sorghum plants under three different watering conditions -- pre-flowering drought, post-flowering drought and controlled applications of water -- over three consecutive years at the UC Kearney Agricultural Research and Extension Center (KARE) in Parlier, California.

Each week during the growing season, members of the research team carefully harvested samples from the leaves and roots of selected plants and set up a mobile lab in the field where they could rapidly freeze the samples until they were processed for analysis. Then, researchers at JGI sequenced the RNA in each sample to create the transcriptome data, which reveals which of the plant's tens of thousands of genes are being transcribed and used to make proteins at particular times.

Finally, statisticians led by UC Berkeley statistics professor and study senior author Elizabeth Purdom parsed the massive transcriptome data set to pinpoint how gene expression changed as the plants grew and were subjected to drought or relief from drought conditions.
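A toy version of that kind of analysis can be sketched in a few lines. The code below is hypothetical and is not the EPICON pipeline: it simply flags genes whose expression under drought differs from the watered control by at least a two-fold change, using invented expression values and invented gene names.

```python
import math

# Illustrative sketch (not the EPICON pipeline): flag genes whose
# expression under drought differs from the watered control by at
# least a two-fold change. Gene names and counts are hypothetical.

def drought_responsive(control, drought, min_log2_fc=1.0):
    """Return genes whose |log2(drought/control)| >= min_log2_fc."""
    hits = {}
    for gene, c in control.items():
        d = drought.get(gene)
        if d is None or c <= 0 or d <= 0:
            continue  # skip genes missing from one condition
        log2_fc = math.log2(d / c)
        if abs(log2_fc) >= min_log2_fc:
            hits[gene] = round(log2_fc, 2)
    return hits

control = {"photosynthesis_A": 200, "fungal_symbiosis_B": 150, "housekeeping_C": 90}
drought = {"photosynthesis_A": 40, "fungal_symbiosis_B": 30, "housekeeping_C": 100}

print(drought_responsive(control, drought))
# → {'photosynthesis_A': -2.32, 'fungal_symbiosis_B': -2.32}
```

A real transcriptome analysis would of course add replicate-aware statistical testing and multiple-comparison correction before calling a gene drought-responsive; a plain fold-change cutoff is only the intuition.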

"We very carefully controlled the watering conditions, and we sampled over the entire developmental timeframe of sorghum, so [researchers] could actually use this data not only to study drought stress, but also to study plant development," Lemaux said.

The researchers noticed a few interesting patterns in the transcriptome data. First, they found that a set of genes known to help the plant foster symbiotic relationships with a type of fungus that lives around its roots was switched off in drought conditions. This set of genes exhibited the most dramatic changes in gene activity that they observed.

"That was interesting, because it hinted that the plants were turning off these associations [with fungi] when they were dry," said John Vogel, a staff scientist at JGI and co-author of the paper. "That meshed well with findings that showed that the abundance of these fungi around the roots was decreasing at the same time."

Second, they noticed that certain genes known to be involved with photosynthesis were also turned off in response to drought and turned up during drought recovery. While the team doesn't yet know why these changes might help the plant, they provide interesting clues for follow-up, Vogel said.

The data in the current paper show the plant's transcriptome under both normal conditions and drought conditions over the course of a single growing season. In the future, the team also plans to publish data from the other two years of the experiment, as well as proteomic and metabolomic data.

"People have really shied away from doing these types of experiments in the field and instead conduct them under controlled conditions in the laboratory or greenhouse. But I believe that the investment of time and resources that we put into it is going to pay off, in the quality of the answers that we get, in terms of understanding real-world drought situations," Lemaux said.

Credit: 
University of California - Berkeley

Click, click, cook: Online grocery shopping leaves 'food deserts' behind

New Haven, Conn. -- There's a new path out of the "food desert," and it's as close as the nearest Internet connection.

A Yale University analysis found that most people in "food deserts" in eight states would increase their access to healthy, nutritious food if they purchased groceries online and had the food delivered as part of the federal government's Supplemental Nutrition Assistance Program (SNAP).

The analysis showed that online grocery delivery systems already cover about 90% of food deserts -- places where access to healthy food is limited -- in the eight states: Alabama, Iowa, Maryland, Nebraska, New Jersey, New York, Oregon, and Washington.

"If you live in a food desert, online grocery delivery really stands out as a way to get healthy food that potentially can save your life," said Eric Brandt, M.D., a postdoctoral research fellow in the National Clinician Scholars Program at Yale and lead author of a study published online Dec. 2 in JAMA Network Open.

Earlier this year, SNAP began a pilot program in which clients had the option of buying food via online grocery delivery services. The program was established by the 2014 Farm Bill; it may be considered for national implementation after the pilot ends in 2021.

Brandt's inspiration for the study was a visit to an urban, East Coast neighborhood served only by small convenience stores. "I thought, 'One of the grocery store chains must deliver here -- wouldn't that be a better option than trying to build a new brick-and-mortar store nearby or change the way local bodegas are run?'"

Brandt then learned the latest Farm Bill had just such a program.

For his study, Brandt identified food deserts in eight states by working with data from the U.S. Department of Agriculture and the U.S. Census Bureau. He also made use of a database of all stores that both sold and delivered groceries purchased online in the eight states (including department stores and big-box retailers) and also accepted orders from SNAP clients.
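Conceptually, the coverage figure reduces to a set-overlap calculation. The sketch below is hypothetical (the tract IDs and store names are invented; the real study worked from USDA, Census, and retailer data): it computes the fraction of food-desert tracts reachable by at least one delivery service.

```python
# Hypothetical sketch of the coverage calculation: what fraction of
# food-desert census tracts fall inside at least one store's online
# delivery area? All tract IDs and store names are invented.

def delivery_coverage(food_desert_tracts, delivery_areas):
    """Fraction of food-desert tracts served by any delivery area.

    food_desert_tracts: set of tract IDs classified as food deserts
    delivery_areas: dict mapping store name -> set of tract IDs served
    """
    served = set().union(*delivery_areas.values()) if delivery_areas else set()
    covered = food_desert_tracts & served
    return len(covered) / len(food_desert_tracts)

tracts = {"T1", "T2", "T3", "T4", "T5"}
stores = {
    "chain_grocer": {"T1", "T2", "T3"},
    "big_box": {"T3", "T4", "T6"},
}
print(delivery_coverage(tracts, stores))  # 4 of 5 tracts covered → 0.8
```

The 90% figure reported in the study is this kind of ratio computed at scale, with delivery areas derived from which stores both sold groceries online and accepted SNAP orders.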

Brandt said the benefits of allowing SNAP families to buy healthy food online are far-reaching and wide-ranging. In the short term, they provide nutrients and nourishment that reduce obesity, boost energy, and help heal patients recovering from serious physical ailments; in the long term, they promote better eating habits and behaviors, which can lower the risk for serious illnesses.

"When I see patients who have had a heart attack, the cornerstone of their recovery is making better lifestyle choices," Brandt said. "Part of that has to do with the environment in which they live. It really influences the outcome."

Credit: 
Yale University

Citizen scientists deserve more credit, researchers argue

image: Listing indigenous citizen scientists as co-authors on a cane toad paper proved challenging.

Image: 
Wikimedia Commons

Citizen scientists should be included as authors on journal papers, researchers say.

In a paper published in the journal Trends in Ecology and Evolution, a team led by biologist Dr Georgia Ward-Fear from Macquarie University in Australia and Dr Greg Pauly from the Natural History Museum of Los Angeles argues that newfound respect for indigenous knowledge and changes in technology mean that non-professionals are taking greater roles in science work.

Regulations governing minimum qualifications for authorship in academic journals mean that such citizen scientists are usually excluded from credit for their work.

"Members of the general public have become pivotal contributors to research, resulting in thousands of scientific publications and measurable conservation impacts," says Dr Ward-Fear. "The question is: how should we credit that input?"

Many of the most influential science journals in the world, including Nature, Science and PLOS ONE, adhere to guidelines set out by the International Committee of Medical Journal Editors. These state that researchers can only be listed as authors if they made "substantial contributions" to the design of the project, the interpretation of the data, and the critical revision of the final version.

"However, there are some projects in which citizen scientists - through online species identification apps, for instance - contribute most, even all of the data," says Dr Pauly.

"Without that contribution the accredited scientists might not even be able to make a discovery - and yet they are not able to be listed as authors. This really undervalues their contributions and might make them reluctant to take part in similar research ever again."

To solve the problem, Dr Ward-Fear and colleagues suggest an approach that will simultaneously recognise the contribution of non-professionals while safeguarding the integrity of the existing system.

Citizen scientists could be credited as "group co-authors" - being collectively credited, for instance, as users of the online interface deployed to gather data.

A slightly different example is drawn from research previously conducted by Dr Ward-Fear and her Macquarie University co-author, Professor Rick Shine.

It concerned conservation research on toxic cane toads (Rhinella marina) and their imperilled predators, conducted in collaboration with the indigenous traditional owners of the region surveyed, who are known collectively as the Balanggarra Rangers.

"The team unquestionably wanted group co-authorship, but adding 'the Balanggarra Rangers' to the author list was difficult," says Dr Ward-Fear.

"We had to negotiate with editors and staff of two journals to make it happen, and even then in one instance the group was listed as 'B.Rangers', as if it was an individual person."

Dr Ward-Fear adds that refusal to properly credit contributors who possess valuable traditional skills and knowledge could be seen as discriminatory.

Professor Shine agrees.

"With a little flexibility we can recognise the contribution of everyone who plays a major role in research while still deterring scientific fraud," he says.

"We all have to accept that the nature of research is changing, with more citizen scientists taking part. It's part of the evolving social dimension of science practice, and we should celebrate it rather than stifle it."

Credit: 
Macquarie University

Machine learning that works like a dream

Tsukuba, Japan - Researchers at the University of Tsukuba have created a new artificial intelligence program for automatically classifying the sleep stages of mice that combines two popular machine learning methods. Dubbed "MC-SleepNet," the algorithm achieved accuracy rates exceeding 96% and high robustness against noise in the biological signals. The use of this system for automatically annotating data can significantly assist sleep researchers when analyzing the results of their experiments.

Scientists who study sleep often use mice as animal models to better understand the ways the activity in the brain changes during the various phases. These phases can be classified as awake, REM (rapid eye movement) sleep, and non-REM sleep. Previously, researchers who monitored the brainwaves of sleeping mice ended up with mountains of data that needed to be laboriously labeled by hand, often by teams of students. This represented a major bottleneck in the research.

Now, researchers at the University of Tsukuba have introduced a program for automatically classifying the stage of sleep that a mouse experienced based on its electroencephalogram (EEG) and electromyogram (EMG) signals, which record electrical activity in the brain and body, respectively. They combined two machine learning techniques, convolutional neural networks (CNN) and long short-term memory (LSTM) recurrent neural networks, to achieve accuracies that surpass those of the best existing automatic methods.

"Machine learning is an exciting new field of research with important applications that combine medicine with computer science. It allows us to automatically classify new data based on labeled examples," corresponding author Kazumasa Horie explains. This is especially valuable when the patterns to look for are not well known, as with sleep stages. In this way, the algorithm can "learn" how to make complex decisions without being explicitly programmed. In this project, the accuracy was very high because of the large dataset used: with over 4,200 biological signals, it was the largest dataset used in sleep research to date. Also, by implementing a CNN, the algorithm showed high robustness against individual differences and noise.

The main advance in this work was to divide the task between the two machine learning methods. First, a CNN was used to extract features of interest from the recordings of the electrical activity in the brain and body. These data were then passed to an LSTM to determine which features were most indicative of the sleep phase the mouse was experiencing. "We are optimistic that we can translate this work into classifying sleep stages in humans," senior author Hiroyuki Kitagawa says. In the meantime, this program can already speed up the work of researchers in the field of sleep, which may lead to a much clearer understanding of how sleep operates.
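As a rough structural sketch of that two-stage idea (not the MC-SleepNet model itself): below, a small 1-D convolution stands in for the CNN feature extractor, and a simple stateful rule that requires several consecutive low-amplitude epochs before switching labels stands in for the LSTM's memory across epochs. All signals, kernels, and thresholds are invented.

```python
# Structural sketch of the two-stage pipeline, NOT the MC-SleepNet model:
# stage 1 extracts a per-epoch feature with a 1-D convolution (standing in
# for the CNN); stage 2 carries state across epochs (standing in for the
# LSTM), so each label depends on the preceding epochs too.

def conv_feature(epoch_signal, kernel=(0.25, 0.5, 0.25)):
    """Convolve the signal with a small 1-D kernel, return mean amplitude."""
    k = len(kernel)
    smoothed = [
        sum(kernel[j] * epoch_signal[i + j] for j in range(k))
        for i in range(len(epoch_signal) - k + 1)
    ]
    return sum(abs(v) for v in smoothed) / len(smoothed)

def classify_stages(epochs, threshold=0.5, inertia=2):
    """Label each epoch 'wake' or 'sleep'; require `inertia` consecutive
    low-amplitude epochs before switching to 'sleep' (the recurrent state)."""
    labels, low_run = [], 0
    for ep in epochs:
        low_run = low_run + 1 if conv_feature(ep) < threshold else 0
        labels.append("sleep" if low_run >= inertia else "wake")
    return labels

# Synthetic EEG-like epochs: high amplitude (wake), then low (sleep).
wake_epoch = [1.0, 1.0, -1.0, -1.0] * 5
sleep_epoch = [0.1, 0.1, -0.1, -0.1] * 5

print(classify_stages([wake_epoch, sleep_epoch, sleep_epoch, sleep_epoch]))
# → ['wake', 'wake', 'sleep', 'sleep']
```

The real model learns its convolution filters and recurrent weights from labeled EEG/EMG data rather than using hand-picked thresholds, but the division of labor is the same: local pattern extraction first, temporal context second.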

Credit: 
University of Tsukuba

One in two homeless people may have experienced a head injury in their lifetime

People who are homeless experience a disproportionately high lifetime prevalence of traumatic brain injury (TBI), according to a new UBC-led study published today in The Lancet Public Health.

The meta-analysis--which looked at 38 studies published between 1995 and 2018--is the first to look at the prevalence of TBI in people who are homeless or in unstable housing situations.

The results suggest that one in two (53 per cent) homeless people experience a TBI, and one in four (25 per cent) experience a TBI that is moderate or severe.

After comparing their estimates to studies of the general population, the researchers estimate that the lifetime prevalence of TBI in people who are homeless and in unstable housing situations could potentially be up to four times higher than in the general population. Meanwhile, the lifetime prevalence of moderate or severe TBI in this population could be nearly 10 times higher than estimates in the general population.

Based on the data they analysed, the researchers were unable to determine whether TBI increased the risk of homelessness or whether homelessness increased the risk of TBI. While more research is needed to better understand the relationship, the researchers say the findings suggest that providing stable housing might lower the risk for TBI.

"More research is definitely needed. TBI is an underappreciated and significant factor in the health and functioning of this vulnerable group of people," says the study's senior author Dr. William Panenka, assistant professor in the UBC faculty of medicine, a member of the BC Provincial Neuropsychiatry Program at UBC and a part of the BC Mental Health and Substance Use Services Research Institute.

"I find it especially striking that we found such a high prevalence of moderate or severe TBI," says Jacob Stubbs, the study's lead author and a PhD student in Panenka's laboratory. "Our work emphasizes that healthcare workers be aware of the burden of TBI in this population, and how it relates to health and functioning."

TBI can range from a mild concussion to a severe head injury. It is caused by a blow to the head or body, a wound that breaks through the skull, a fall, or another injury that jars or shakes the brain causing bruising, swelling, or tearing of brain tissue.

With time, most people recover from a mild brain injury, but some people, especially those who have repeated or severe injuries, may have long-lasting problems with movement, learning, or speaking.

For their study, the researchers looked at 38 published papers from six high-income countries--Australia, Canada, Japan, South Korea, the UK, and the USA--which included people of any age who were either homeless, in unstable housing situations, or seeking services for homeless people.

They examined the number of new cases and existing cases of TBI, and the association between TBI and health or functioning outcomes.

Their findings suggest that TBI is consistently associated with poorer self-reported physical and mental health, suicidality and suicide risk, memory concerns, increased health service use and criminal justice system involvement.

The authors suggest a need for health-care workers to have increased awareness of the burden and associated effects of TBI in people who are homeless, noting that more comprehensive assessments of their health--including checking for a history of TBI--may help improve their care.

Credit: 
University of British Columbia

Micro implants could restore standing and walking

When Vivian Mushahwar first applied to grad school, she wrote about her idea to fix paralysis by rewiring the spinal cord.

It was only after she was accepted into a bioengineering program that the young electrical engineer learned her idea had actually prompted laughter.

"I figured, hey I can fix it, it's just wires," Mushahwar said. "Yeah, well, it's not just wires. So I had to learn the biology along the way."

It's taken Mushahwar a lot of work over two decades at the University of Alberta, but the Canada Research Chair in Functional Restoration is still fixated on the dream of helping people walk again. And thanks to an electrical spinal implant pioneered in her laboratory and work in mapping the spinal cord, that dream could become a reality in the next decade.

Because an injured spinal cord dies back, it's not simply a matter of reconnecting a cable. Three herculean feats are needed. You have to translate brain signals. You have to figure out and control the spinal cord. And you have got to get the two sides talking again.

People tend to think the brain does all the thinking, but Mushahwar says the spinal cord has built-in intelligence. A complex chain of motor and sensory networks regulates everything from breathing to bowels, while the brain stem's contribution is basically "go!" and "faster!" Your spinal cord isn't just moving muscles; it's giving you your natural gait.

Other researchers have tried different avenues to restore movement. By sending electrical impulses into leg muscles, it's possible to get people standing or walking again. But the effect is strictly mechanical and not particularly effective. Mushahwar's research has focused on restoring lower-body function after severe injuries using a tiny spinal implant. Hair-like electrical wires plunge deep into the spinal grey matter, sending electrical signals to trigger the networks that already know how to do the hard work.

In a new paper in Scientific Reports, the team showcases a map to identify which parts of the spinal cord trigger the hip, knees, ankles and toes, and the areas that put movements together. The work shows that these spinal maps are remarkably consistent across animal species, but further work is required before moving to human trials.

The implications of moving to a human clinical setting would be massive, but must follow further work that needs to be done in animals. Being able to control standing and walking would improve bone health, improve bowel and bladder function, and reduce pressure ulcers. It could help treat cardiovascular disease--the main cause of death for spinal cord patients--while bolstering mental health and quality of life. For those with less severe spinal injuries, an implant could be therapeutic, removing the need for months of gruelling physical therapy regimes that have limited success.

"We think that intraspinal stimulation itself will get people to start walking longer and longer, and maybe even faster," said Mushahwar. "That in itself becomes their therapy."

Progress can move at a remarkable pace, yet it's often maddeningly slow.

"There's been an explosion of knowledge in neuroscience over the last 20 years," Mushahwar said. "We're at the edge of merging the human and the machine."

Given the nature of incremental funding and research, a realistic timeline for this type of progress might be close to a decade.

Mushahwar is the director of the SMART Network, a collaboration of more than 100 U of A scientists and learners who intentionally break disciplinary silos to think of unique ways to tackle neural injuries and diseases. That has meant working with researchers like neuroscientist Kathryn Todd and biochemist Matthew Churchward, both in the psychiatry department, to create three-dimensional cell cultures that simulate the testing of electrodes.

The next steps are fine-tuning the hardware--miniaturizing an implantable stimulator--and securing Health Canada and FDA approvals for clinical trials. Previous research has tackled the problem of translating brain signals and intent into commands to the intraspinal implant; however, the first generation of the intraspinal implants will require a patient to control walking and movement. Future implants could include a connection to the brain.

It's the same goal Mushahwar had decades ago. Except now it's no longer a laughable idea.

"Imagine the future," Mushahwar said. "A person just thinks and commands are transmitted to the spinal cord. People stand up and walk. This is the dream."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Opioid overdose risk is high after medical treatment ends, study finds

People with opioid addiction face a high risk of overdose after ending treatment with the medication buprenorphine, even when treated for 18 months, a new study by researchers at Columbia University Vagelos College of Physicians and Surgeons has found.

Among people who were treated with buprenorphine continuously for 6 to 18 months, about 5% needed medical treatment for an opioid overdose in the 6 months after ending buprenorphine treatment. The true rate is likely higher as the study was unable to account for overdose events that did not present to healthcare settings.

"The rate at which individuals relapsed and overdosed after ending treatment was alarmingly high, suggesting that discontinuing buprenorphine is a life-threatening event," says Arthur Robin Williams, MD, MBE, assistant professor of clinical psychiatry at Columbia University Vagelos College of Physicians and Surgeons.

The study also found that the longer patients continued with treatment, the lower their risk of other types of adverse outcomes, suggesting that buprenorphine treatment may be most effective as a long-term therapy for those with opioid use disorder.

Are Opioid Users Getting Evidence-Based Care?

As the opioid crisis continues, increasing attention has focused on difficulties faced by an estimated 2.1 million patients with opioid use disorder in accessing evidence-based care.

Buprenorphine, which won FDA approval in 2002 to combat opioid addiction, is dispensed to almost one million individuals annually. However, an estimated 50% to 80% of patients discontinue the treatment within several weeks or months, and there are no formal guidelines on how long patients should continue taking the medication, although expert consensus supports indefinite use.

Further, many insurance plans limit treatment with buprenorphine to six months or require approvals to be reauthorized every year, and patients who are at risk for incarceration are frequently deprived of buprenorphine treatment while awaiting arraignment or sentencing.

"Many clinicians think they should prescribe buprenorphine only for time-limited periods, due to stigma and outdated beliefs that patients using medications for opioid use disorder are not in 'true recovery,'" says Williams. "Our paper is one of the first to look at the effect of long-term durations of buprenorphine treatment on subsequent outcomes."

Dangers of Stopping Buprenorphine Treatment

To understand whether the duration of buprenorphine treatment had an impact on outcomes after treatment ended, the researchers analyzed Medicaid claims data of nearly 9,000 adults (ages 18 to 64 years) across a handful of anonymously reporting states who remained in continual treatment for at least 6 months and for as many as 18 months.

Regardless of treatment duration, the researchers found that in the 6 months after treatment ended, approximately 1 in 20 individuals received treatment for an opioid overdose at least once.

Rates of new opioid prescriptions (~25%) and visits to the emergency room (~45%) remained high for all groups in the 6 months after ending buprenorphine treatment, especially among those with mental illness. Rates were significantly higher for those who stopped treatment after 6 months versus the 15-18 month cohort.

What the Study Means

Previous studies have shown that the risk of dying from an opioid overdose drops by as much as 70% during buprenorphine treatment. However, most patients relapse after they discontinue the medication.

The current study adds to a growing body of literature demonstrating that treatment with buprenorphine may be needed for several years, if not indefinitely, to reduce the risk of overdose and other adverse events.

"Patients and families need guidance, social support, and better coordination of care to help facilitate long-term maintenance with buprenorphine for opioid use disorder," Williams adds.

Credit: 
Columbia University Irving Medical Center

Anthracnose alert: How bacteria prime fifth-biggest global grain crop against deadly fungus

video: Sorghum anthracnose devastates crops of the drought- and heat-resistant cereal worldwide. Priming with rhizobacteria can boost the plant's resistance against a range of microbial attacks.

Professor Ian Dubery and Dr. Fidele Tugizimana from the University of Johannesburg's Centre for Plant Metabolomics decoded how priming enhances the 'security system' of plants for a much stronger, faster defense.

Using metabolomics and machine learning algorithms, they identified changes in the sorghum plant's chemical response to fungal attack. The low-cost approach can counter other pathogens in economically important food crops.

Image: 
Therese van Wyk, University of Johannesburg.

Anthracnose of Sorghum bicolor devastates crops of the drought- and heat-resistant cereal worldwide. Priming with rhizobacteria can boost the plants' resistance and tolerance against a wide range of adverse conditions such as microbial attacks.

University of Johannesburg researchers decoded how priming enhances the 'security system' of plants for a much stronger, faster defence.

Using metabolomics and machine learning algorithms, they identified changes in the sorghum plant's chemical response to fungal attack.

The low-cost approach can be used to counter other pathogens in economically important food crops.

Fungus modus operandi

The fungus Colletotrichum sublineolum sneaks up to its host in many ways. It may have been hanging around for years in the soil, on decaying plant matter or on equipment. It likes to pounce at the first rains in humid and warm conditions.

It enters the stomata, or "air vents," of the plant and doesn't wreck things at first. Rather, it multiplies inside the plant as a first priority. At this stage, it feeds on the plant without causing damage visible to the farmer.

But once the fungus has truly invaded its host, it switches from unwanted parasite to wholesale destroyer. As if a switch has been flicked, it starts demolishing the structural supports and cells of the plant. This way the fungus feeds its ravenous appetite and gets ready to procreate.

At this stage the devastating disease becomes visible on the outside of the plant. Sorghum anthracnose, or wilting disease, causes spots on the leaves and stems. The spots expand into lesions that can cover leaves completely.

The invaded plant doesn't stand a chance.

Unless friendly bacteria have teamed up with the plant beforehand. A mutual plant-bacteria interaction can switch on a plant's "security system", which can fend off the fungus, if it is sensitive, fast and strong enough.

Sorghum anthracnose is caused by the fungus Colletotrichum sublineolum, or CS fungus for short. The fungus is a picky predator, as most fungi are. It specializes in attacking sorghum plants of any age. Sometimes it favours plants closer to harvest. When it does attack, it can destroy entire fields of the grain, with crop yield losses of 70% or more.

Coping with climate

"In the era of climate change, we are expecting longer periods of drought and excessive heat. Crop plants will also have to produce during intermittent and more severe flooding. It is time to adapt our existing crop plants for the conditions approaching us," says Prof Ian Dubery. He is the Director of the Research Centre for Plant Metabolomics at the University of Johannesburg in South Africa.

Fifth biggest grain globally

The species of sorghum mostly planted for commercial or subsistence production is Sorghum bicolor. It is indigenous to Africa, and used for food, fodder and bio-fuel in many countries.

By annual volume, it is the fifth-biggest grain crop in the world. Sorghum is key to food security for subsistence farmers producing the grain.

The crop is known as great millet and guinea corn in West Africa, dura in Sudan, mtama in eastern Africa, jowar in India and gaoliang in China; while it is usually referred to as milo or milo-maize in the United States, according to the FAO. Other names include feterita, jwari, shallu, cholam, jola, dari, and solam.

Certain varieties are highly drought and heat resistant.

Costly defences

Some varieties of Sorghum bicolor are more resistant to the CS fungus than others. However, the most productive varieties tend to be less resistant to the CS fungus. In addition, activating and exercising that resistance comes at a price for both the plant and the farmer.

The harder the sorghum plant has to work at mustering its defences, the fewer seeds it will produce. A plant may defend itself well enough to keep healthy leaves and stems, yet end up with a much smaller yield. It may even die in the process.

Also, spraying fungicides is expensive and can affect the environment. So a more sustainable way of protecting sorghum would be preferable.

Priming for defence

Priming sorghum plants with friendly bacteria around their roots can make their leaves more resistant to attacks from the CS fungus. Biofertilizers containing these rhizobacteria are used commercially for sorghum and other crops.

In industry, these are called "plant-growth promoting rhizobacteria", or PGPR. Seeds can be coated with biofertilizers, and soil or plants can be sprayed with them.

But how and why priming works to defeat pathogens such as the CS fungus on cereal crops was unknown.

Plant metabolomics decodes a tougher security system

Prof Ian Dubery and Dr Fidele Tugizimana decoded how and why priming with rhizobacteria works on Sorghum bicolor. Tugizimana is a research associate at the Research Centre.

"Sorghum plants and the rhizobacteria they are primed with team up to get the plant 'security system' on higher alert. It also acts faster and with a stronger response against the attacking fungus," says Dubery.

When there are no fungal attacks, the bacteria living in the rhizosphere (the area around the roots of the plant) help the plant in many ways. For example, they make it easier for the plant to take up nutrients such as phosphates, and they fix nitrogen in the soil, making it more fertile.

In turn, the plant helps the bacteria by releasing chemicals that they need.

For peace and war

The researchers planted a sorghum variety in trays in their laboratory. Once the plants reached a height of 30 cm, they applied rhizobacteria to the roots to prime the plants. These bacteria live in the soil close to the roots, in the rhizosphere, where the plant releases chemicals for them to use.

What the researchers wanted to find out is how the "chemical communications" work in and around a healthy plant. They analysed the chemicals synthesized by the plant in its leaves, stems and roots, as well as bacteria-inoculated soil in the rhizosphere.

From this, they could build up a picture of how the plant roots and bacteria "talk" to each other. They could also unravel how the roots, stems and leaves "talk" to each other to support the beneficial relationship with the bacteria.

This gave them the metabolomic "signature" of a healthy plant primed with rhizobacteria.

They infected the other half of the plants with the CS fungus to see how badly it would affect that sorghum variety. Again, they analysed the metabolic "cocktails" in the leaves, stems, roots and bacteria-inoculated soil. This gave them the metabolomic signature of a primed, infected plant.

Big data analytics

They repeated the whole process for the same variety of sorghum, with one exception. They didn't prime the roots with bacteria. Now they could see how much weaker the 'security response' was without priming.

All of these biochemical analyses generated a huge amount of complex data, more than 200 gigabytes in volume. To make sense of it all, they employed big data analytics: the process of examining and mining large, complex datasets.

Along the way, they used techniques such as machine learning, chemometrics, multivariate statistical analyses and other mathematical methods. These allowed them to extract the relevant information, so that conclusions and hypotheses could be drawn and formulated with confidence.
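To give a flavour of what such a multivariate analysis looks like, here is a minimal sketch using principal component analysis (PCA), a standard chemometrics technique. The metabolite intensity matrix below is made up for illustration and is not data from the study; real metabolomics datasets have thousands of features per sample.

```python
import numpy as np

# Toy metabolite intensity matrix: 6 plant samples x 4 metabolite features.
# Values are illustrative only, not from the study.
X = np.array([
    [5.1, 0.2, 1.0, 3.3],   # primed, infected
    [4.9, 0.3, 1.1, 3.1],   # primed, infected
    [5.0, 0.1, 0.9, 3.4],   # primed, infected
    [1.2, 2.8, 3.9, 0.5],   # non-primed, infected
    [1.0, 3.0, 4.1, 0.6],   # non-primed, infected
    [1.1, 2.9, 4.0, 0.4],   # non-primed, infected
])

# Mean-centre each metabolite (standard preprocessing for PCA).
Xc = X - X.mean(axis=0)

# Principal component analysis via singular value decomposition.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s  # sample coordinates along the principal components

# Fraction of the total variance captured by each component.
explained = s**2 / np.sum(s**2)
print(f"PC1 explains {explained[0]:.0%} of the variance")

# With groups this cleanly separated, the first component splits them:
# primed samples cluster on one side, non-primed on the other.
primed, nonprimed = scores[:3, 0], scores[3:, 0]
print(primed.max() < nonprimed.min() or nonprimed.max() < primed.min())  # → True
```

In a real pipeline the same SVD-based computation scales to thousands of metabolite features, and libraries such as scikit-learn wrap it in ready-made PCA classes; the principle of projecting samples onto directions of greatest variance is identical.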

Changes in chemical defence

Now they could see what new "cocktails" the sorghum plants manufactured to defend themselves, and how, with the help of the priming bacteria, these cocktails were more diverse and concentrated than in healthy plants.

The researchers could also see what new "cocktail ingredients" the primed plants used when attacked by the CS fungus.

At one to three days after infection with the fungus, the primed plants produced several times higher quantities of plant hormones than non-primed plants, in particular hydroxyjasmonic acid-glucoside and zeatin.

The primed plants also synthesized significant quantities of the amino acids tyrosine and hydroxy-tryptophan, which non-primed plants made in only tiny quantities. They also produced more than three times as much tryptophan as usual.

At the same time, the primed plants ramped up production of lipids, especially phytosphingosine. In comparison, the non-primed plants produced only a tiny fraction of these lipids.

The primed plants radically cut back on producing (iso)flavonoids, especially apigenin and luteolin.

Decoding the security system

"We found that primed sorghum plants have more sensitive plant security systems. They switch these systems on sooner than they would without priming. The primed plants also responded better to fungal attack. They had much lower infection rates and reduced symptom development compared to non-primed plants," says Tugizimana.

Even at nine days after infection with the CS fungus, few primed sorghum plants showed symptoms, and those that did had only a few affected leaves. The lesions could be described as a localised hypersensitive response; none spread over the entire leaf surface, he adds.

Using metabolomics analyses, they worked out how the plant was able to defend itself.

"We found out how the interaction of the beneficial bacteria with sorghum plant roots modifies the plant's ability to defend itself. The primed sorghum plant changes how it apportions energy and redirects its metabolic pathways more to defence, rather than growth or seed production. In this way, it changes the composition of its protective chemicals to resist the fungus. This is how it starts making new 'cocktails' to enhance its chemical defences.

"The primed sorghum plant is more sensitive to fungal attack, and reacts more quickly and intensely. So we can say that plant-beneficial bacteria support the plant in launching a more efficient defence," says Tugizimana.

Low-cost sustainable approach for farming

The results pave the way for similar studies on countering pathogens on other economically important crop plants, says Dubery.

"Priming with rhizobacteria can make a susceptible plant more tolerant and a tolerant plant more resistant to disease. This means that priming, or pre-conditioning, can enhance crop yields and reduce the use of pesticides. It is a promising, sustainable and low-cost option to get more effective resistance in real-world farming conditions, where many pathogens threaten food crops," he adds.

Credit: 
University of Johannesburg