Beyond mere blueprints: Variable gene expression patterns and type 1 diabetes

video: The roles epigenetic factors play in the pathogenesis of type 1 diabetes.

Genetics plays a major role in determining a person's risk of developing type 1 diabetes, but environmental and lifestyle factors are also important. In an article recently published in Chinese Medical Journal, a team of researchers explores the interplay of genetic and environmental factors by summarizing the literature on type 1 diabetes and epigenetics -- the study of how gene expression patterns can be modified. These findings have important implications for treating type 1 diabetes.

Image: 
Chinese Medical Journal

Type 1 diabetes is a disorder in which the immune system inappropriately targets a class of cells in the pancreas known as β cells that produce the hormone insulin, which plays an important role in regulating blood glucose levels and the metabolism of carbohydrates, lipids, and proteins. The loss of insulin causes a range of health problems for people with type 1 diabetes, and patients become dependent on insulin injections for their survival. Even with insulin therapy, people with type 1 diabetes have shortened lifespans and are at an elevated risk of developing myriad complications.

Previous studies have identified numerous genetic risk factors for type 1 diabetes. One notable finding is the importance of the HLA region, a part of the human genome that contains multiple genes and explains approximately 40-50% of the genetic risk for type 1 diabetes. However, studies of identical twins have identified cases in which one twin develops type 1 diabetes while the other twin does not -- which indicates that genetic risk factors cannot fully explain the occurrence of type 1 diabetes. This finding is consistent with the known relevance of certain environmental factors. For example, most studies have found that breastfeeding and vitamin D consumption protect against type 1 diabetes and that cow's milk and the early introduction of gluten increase the risk of type 1 diabetes. Furthermore, microbes in the human gut play important roles in human health and digestion, and patients with type 1 diabetes often lack diversity in their gut microbiota.

To explain the risk factors for type 1 diabetes more fully, medical researchers have turned to the field of epigenetics, which studies how environmental and lifestyle factors can influence the expression of genes without altering the underlying DNA sequence. In recent years, various research groups have published studies confirming that epigenetic changes related to environmental conditions contribute to the development of type 1 diabetes. In a review article recently published in Chinese Medical Journal, a team of researchers from the Huazhong University of Science and Technology led by Drs. Cong-Yi Wang and Fei Xiong set out to summarize how epigenetic factors modulate the risks of type 1 diabetes. They also aimed to discuss the potential of these epigenetic factors to serve as markers for monitoring disease progression and as targets for therapeutics.

One important mechanism of epigenetics is DNA methylation, which refers to the presence of chemical tags called methyl groups on DNA. DNA methylation patterns influence whether and how strongly a gene is expressed, and past studies have identified numerous methylation sites that influence the risk of type 1 diabetes. Some of these methylation sites lie within the previously mentioned HLA region, which is highly relevant to type 1 diabetes. Other methylation sites affect the INS gene, which is second only to the HLA region in terms of influencing the risk of type 1 diabetes.

Another mechanism of epigenetics is chemical modification of histones, which are proteins around which DNA strands are wound. Several studies have reported abnormal histone modification patterns in patients with type 1 diabetes. These modifications may increase the risk of type 1 diabetes by influencing the expression of genes related to inflammation and immunity, and the elevated blood glucose levels associated with type 1 diabetes may also cause abnormal histone modification patterns.

Epigenetic effects can also be expressed in the form of noncoding RNAs, which are RNA molecules that have functional roles other than the standard role of providing instructions for protein synthesis. Noncoding RNAs are a diverse class of genetic molecules, and they can bind DNA, other RNA strands, and proteins. Through their various actions, they can promote or suppress the expression of certain genes. Recent studies have yielded evidence that noncoding RNAs can contribute to type 1 diabetes by influencing the immune system and causing β cell dysfunction.

Dr. Wang explains, "Given the relationship between epigenetic changes and type 1 diabetes, various epigenetic changes could serve as markers for disease progression and treatment effects or even as targets for future therapeutics. For example, noncoding RNAs can be measured noninvasively, while changes in DNA methylation levels and patterns for particular genes could indicate that a genetically predisposed person is developing type 1 diabetes." Furthermore, the researchers cite studies that have yielded evidence that drugs that alter DNA methylation could benefit patients with type 1 diabetes.

Dr. Xiong concludes, "Based on existing literature, it is clear that environmental insult-induced epigenetic changes modulate the expression of critical genes relevant to the initiation and progression of autoimmunity and β cell destruction and are therefore implicated in the development of type 1 diabetes." This information will be valuable to medical researchers who wish to develop new ways to predict the onset of type 1 diabetes, assess the condition's severity and progression, and provide patients with effective treatment options.

Credit: 
Cactus Communications

Health and socializing: Why people use mixed-reality sports platforms

New technologies allow users to do things like race their real bikes against other real people in a virtual world, and a new study outlines what motivates people to use these online platforms. The findings offer insights for future iterations of these technologies - and how to market them.

At issue are "mixed-reality sports": augmented reality platforms that incorporate virtual, online elements and real-world athletic endeavors. For example, Zwift is a platform that allows users to ride their real bicycles, but transfers their efforts to a virtual space depicting real-world courses - allowing them to race against other cyclists who are not physically present.

"We know that mixed-reality sports are attracting a lot of users," says Bill Rand, co-author of the paper and an associate professor of marketing in North Carolina State University's Poole College of Management. "We want to know what benefits people see in these technologies. What about risks? And how do those risks and benefits affect their actual use?

"This matters because once we understand why people are using, or not using, these technologies, we can figure out how to make the technologies for appealing for users - and also how to market them more effectively."

For this study, the researchers conducted a survey of 284 Zwift users in Germany, Austria and Switzerland. The survey collected data on each study participant's background, their motivations for using Zwift, any concerns they had about the platform, and the extent to which they felt they would continue using Zwift in the future. The researchers were then able to review each study participant's use of Zwift for 30 days after taking the survey. This design allowed the researchers to identify relationships between participants' motivations, perceived risks, expectations for using Zwift, and their actual use of the platform.

One of the things researchers found surprising was that users were simply not motivated by competing against other users within the game environment itself.

"The Zwift platform is designed specifically to enable competition, either informally amongst friends, or in formal races involving many competitors," Rand says. "However, we found that even the people who take part in the formal races are not strongly motivated by these in-game contests."

Instead, researchers found that four other drivers were associated with Zwift use: health consciousness; using Zwift to train for real-world competitions; socializing with others; and the ability to customize and upgrade their gaming experience by modifying their jerseys, "earning" access to new bike styles, and so on.

"To provide a more profound explanation of the quantitative results, we also conducted 14 interviews with platform users," says Daniel Westmattelmann, corresponding author of the paper, an assistant professor of sports management at the University of Münster and a former professional cyclist. "It was fascinating to see that even elite athletes who have won Tour de France stages, for example, are strongly motivated to use the platform more intensively because of customizing or socializing elements."

The researchers also found that study participants who had privacy concerns about Zwift engaged with the platform less frequently.

"There are a variety of these mixed-reality sports platforms out there, such as Peloton, with varying ratios of virtual elements to real-world elements," Rand says. "And this field is likely to grow. Our work gives us insight into what may be motivating participants on these platforms.

"For example, the ability to customize your avatar appears to be important. Social interaction and online communities are important. Health and fitness are important. Privacy concerns are important.

"Understanding the things that are important to users can help developers of next-generation mixed-reality sports technologies design more appealing products," Rand says. "And can help marketers determine which aspects of these products to highlight for consumers."

The paper, "Apart we ride together: The motivations behind users of mixed-reality sports," appears in the Journal of Business Research. The paper was co-authored by Jan-Gerrit Grotenhermen, Marius Sprenger and Gerhard Schewe of the University of Münster.

Credit: 
North Carolina State University

Mystery of Galaxy's Missing Dark Matter Deepens

image: This Hubble Space Telescope snapshot reveals an unusual "see-through" galaxy. The giant cosmic cotton ball is so diffuse and its ancient stars so spread out that distant galaxies in the background can be seen through it. Called an ultra-diffuse galaxy, this galactic oddball is almost as wide as the Milky Way, but it contains only 1/200th the number of stars as our galaxy. The ghostly galaxy doesn't appear to have a noticeable central region, spiral arms, or a disk. Researchers calculated a more accurate distance to the galaxy, named NGC 1052-DF2, or DF2, by using Hubble to observe about 5,400 aging red giant stars. Red giant stars all reach the same peak brightness, so they are reliable yardsticks to measure distances to galaxies. The research team estimates that DF2 is 72 million light-years from Earth. They say the distance measurement solidifies their claim that DF2 lacks dark matter, the invisible glue that makes up the bulk of the universe's contents. The galaxy contains at most 1/400th the amount of dark matter that the astronomers had expected. The observations were taken between December 2020 and March 2021 with Hubble's Advanced Camera for Surveys.

Image: 
SCIENCE: NASA, ESA, STScI, Zili Shen (Yale), Pieter van Dokkum (Yale), Shany Danieli (IAS) IMAGE PROCESSING: Alyssa Pagan (STScI)

When astronomers using NASA's Hubble Space Telescope uncovered an oddball galaxy that looked like it didn't have much dark matter, some thought the finding was hard to believe and looked for a simpler explanation.

Dark matter, after all, is the invisible glue that makes up the bulk of the universe's matter. All galaxies appear to be dominated by it; in fact, galaxies are thought to form inside immense halos of dark matter.

So, finding a galaxy lacking the invisible stuff is an extraordinary claim that challenges conventional wisdom. It would have the potential to upset theories of galaxy formation and evolution.

To bolster their original finding, first reported in 2018 (Dark Matter Goes Missing in Oddball Galaxy (hubblesite.org)), a team of scientists led by Pieter van Dokkum of Yale University in New Haven, Connecticut, followed up their initial study with a more robust Hubble look at the galaxy, named NGC 1052-DF2. Scientists refer to it simply as "DF2."

"We went out on a limb with our initial Hubble observations of this galaxy in 2018," van Dokkum said. "I think people were right to question it because it's such an unusual result. It would be nice if there were a simple explanation, like a wrong distance. But I think it's more fun and more interesting if it actually is a weird galaxy."

Determining the amount of the galaxy's dark matter hinges on accurate measurements of how far away it is from Earth.

If DF2 is as far from Earth as van Dokkum's team asserts, dark matter may account for only a few percent of the galaxy's total mass. The team's conclusion is based on the motions of the stars within the galaxy; their velocities are influenced by the pull of gravity. The researchers found that the mass of the observed stars accounts for essentially all of the galaxy's total mass, leaving little room for dark matter.
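
To make that logic concrete, here is a minimal sketch of a virial-style mass comparison. It is not the authors' analysis; the velocity dispersion, radius, and stellar mass below are illustrative placeholders. The point is only that when the dynamical mass inferred from stellar motions is no larger than the stellar mass itself, there is little room left for dark matter.

```python
# Illustrative sketch of the mass argument -- not the authors' actual
# analysis. A virial-style estimate of dynamical mass from stellar motions
# is compared with the stellar mass; all numbers are assumed placeholders.
G = 4.30e-6  # gravitational constant, kpc * (km/s)^2 / M_sun

sigma = 8.0    # assumed line-of-sight velocity dispersion, km/s
radius = 2.0   # assumed effective radius, kpc
m_stars = 2e8  # assumed stellar mass, M_sun

# M_dyn ~ k * sigma^2 * R / G, with k ~ 5 a common choice for
# dispersion-supported systems
m_dyn = 5 * sigma**2 * radius / G

dark_fraction = max(0.0, 1.0 - m_stars / m_dyn)
print(f"M_dyn ~ {m_dyn:.2e} M_sun; implied dark-matter fraction ~ {dark_fraction:.0%}")
```

With these placeholder values the dynamical mass comes out no larger than the stellar mass, so the implied dark-matter fraction is effectively zero, which is the shape of the team's argument.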

However, if DF2 were closer to Earth, as some astronomers claim, it would be intrinsically fainter and less massive. The galaxy, therefore, would need dark matter to account for the observed effects of the total mass.

A Better Yardstick
Team member Zili Shen, from Yale University, says that the new Hubble observations help them confirm that DF2 is not only farther from Earth than some astronomers suggest, but also slightly more distant than the team's original estimates.

The new estimate places DF2 at 72 million light-years from Earth, as opposed to the 42 million light-years reported by other independent teams. It is also farther than the original 2018 Hubble estimate of 65 million light-years.

The research team based its new result on long exposures with Hubble's Advanced Camera for Surveys, which provide a deeper view of the galaxy for finding a reliable yardstick to nail down the distance. They targeted aging red giant stars on the outskirts of the galaxy that all reach the same peak brightness in their evolution. Astronomers can use the stars' intrinsic brightness to calculate vast intergalactic distances. "Studying the brightest red giants is a well-established distance indicator for nearby galaxies," Shen explained.
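
The arithmetic behind such a standard candle is compact enough to sketch. In the illustrative calculation below, both magnitudes are assumed values chosen to land near the reported distance, not the team's measurements; only the distance-modulus relation itself is standard.

```python
# Minimal sketch of the standard-candle arithmetic behind tip-of-the-red-
# giant-branch (TRGB) distances. Both magnitudes below are illustrative
# assumptions, not the paper's measured values.
M_trgb = -4.0  # assumed absolute magnitude of the TRGB (I band, roughly)
m_obs = 27.7   # assumed apparent magnitude of the TRGB in the target galaxy

# Distance modulus relation: m - M = 5 * log10(d / 10 pc)
d_parsec = 10 ** ((m_obs - M_trgb + 5) / 5)
d_mly = d_parsec * 3.2616 / 1e6  # parsecs -> millions of light-years

print(f"distance ~ {d_parsec / 1e6:.1f} Mpc ~ {d_mly:.0f} million light-years")
```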

The more accurate Hubble measurements solidify the researchers' initial conclusion of a galaxy deficient in dark matter, team members say. So the mystery of why DF2 is missing most of its dark matter still persists.

"For almost every galaxy we look at, we say that we can't see most of the mass because it's dark matter," van Dokkum explained. "What you see is only the tip of the iceberg with Hubble. But in this case, what you see is what you get. Hubble really shows the entire thing. That's it. It's not just the tip of the iceberg, it's the whole iceberg."

The team's science paper has appeared in The Astrophysical Journal Letters.

A Stealthy Galaxy
DF2 is a giant cosmic cotton ball that van Dokkum calls a "see-through galaxy," where the stars are spread out. The galactic oddball is almost as wide as the Milky Way, but it contains only 1/200th the number of stars as our galaxy.

The ghostly galaxy doesn't appear to have a noticeable central region, spiral arms, or a disk. The team estimates that DF2 contains at most 1/400th the amount of dark matter that astronomers had expected. Based on the team's latest measurements, how the galaxy formed remains a complete mystery.

DF2 isn't the only galaxy devoid of dark matter. Shany Danieli of the Institute for Advanced Study in Princeton, New Jersey, used Hubble in 2020 to obtain an accurate distance to another ghostly galaxy, called NGC 1052-DF4 (or simply DF4), which apparently lacks dark matter, too.

The researchers think both DF2 and DF4 were members of a collection of galaxies. However, the new Hubble observations show that the two galaxies are 6.5 million light-years away from each other, farther apart than they first thought. It also appears that DF2 has drifted away from the grouping and is isolated in space.

Both galaxies were discovered with the Dragonfly Telephoto Array at the New Mexico Skies observatory.

"Both of them probably were in the same group and formed at the same time," Danieli said. "So maybe there was something special in the environment where they were formed."

The researchers are hunting for more of these oddball galaxies. Other teams of astronomers are searching, too. In 2020, a group of researchers uncovered 19 unusual dwarf galaxies they say are deficient in dark matter (Off the Baryonic Tully-Fisher Relation: A Population of Baryon-dominated Ultra-diffuse Galaxies - IOPscience). However, it will take uncovering many more dark matter-less galaxies to resolve the mystery.

Nevertheless, van Dokkum thinks finding a galaxy lacking dark matter tells astronomers something about the invisible substance. "In our 2018 paper, we suggested that if you have a galaxy without dark matter, and other similar galaxies seem to have it, that means that dark matter is actually real and it exists," van Dokkum said. "It's not a mirage."

Credit: 
NASA/Goddard Space Flight Center

Immune system protein may defend against deadly intestinal disease in babies

image: A study led by researchers at Washington University School of Medicine in St. Louis has identified a protein in the immune system that may protect babies from necrotizing enterocolitis, a leading cause of death among premature infants. Pictured is Misty Good, MD, MS, an assistant professor of pediatrics in the Division of Newborn Medicine at Washington University.

Image: 
Matt Miller/Washington University

The intestinal disease necrotizing enterocolitis is a leading cause of death among premature infants born in the U.S. and across the globe. Characterized by excessive inflammation that can cause tissue decay in the bowels, the disease provides a pathway for infectious and deadly bacteria to enter the bloodstream.

Despite four decades of research, effective treatments remain elusive, and mortality rates in babies who develop the disease have remained essentially unchanged, hovering at about 30%.

Now, a study led by researchers at Washington University School of Medicine in St. Louis has identified, in mice, a protein in the immune system that may protect babies from necrotizing enterocolitis (NEC) and lead to the development of new treatments.

The findings are published online June 15 in Cell Reports Medicine.

"Necrotizing enterocolitis is a serious, fast-acting condition that can lead to death within hours," said the study's senior author, Misty Good, MD, an assistant professor of pediatrics in the Division of Newborn Medicine. "We don't know why NEC happens, and we can try to treat it with antibiotics and surgical removal of the dead tissue; however, in severe cases, many babies will still die. No treatments stop the disease from progressing, but our hope is that the protein we've identified will change that."

The scientists focused on Interleukin-22 (IL-22), a protein that regulates immune responses and helps maintain a healthy gut microbiome in adults.

Over the years, research has suggested that IL-22 has a critical role in adult gastrointestinal diseases. Consequently, potential treatments involving IL-22 are being studied in COVID-19 illness, alcohol-induced liver disease, and graft-versus-host disease that develops after organ or bone marrow transplants. However, IL-22's role in newborns' intestines has been unclear.

To better understand the protein's role, the researchers created a mouse model to examine IL-22 signaling and production in healthy intestines and in intestines damaged by NEC. They analyzed IL-22 levels before and after birth and into adulthood, which for mice begins when they are weaned, at about 28 days old. In both the healthy and diseased intestines, the researchers documented low postnatal IL-22 production up until day 21, when production skyrocketed for the mice and continued into adulthood.

The researchers also studied tissue samples from preemies who did and did not develop NEC. The scientists found low levels of IL-22 in all of the intestinal samples. And in the babies who had developed NEC, an appropriate immune response had not been mounted in the intestines.

"Immune cells in the neonatal intestine have shown an inability to produce adequate amounts of IL-22 to control the progression of NEC," said Good, who treats patients at St. Louis Children's Hospital and is also co-program director of the university's Neonatal-Perinatal Medicine Fellowship. As a member of the scientific advisory council of the Necrotizing Enterocolitis Society, Good has led an effort involving seven medical centers that have developed a large biorepository of samples from infants affected by NEC.

Good surmised that immature intestines are associated with a lack of IL-22 production, a theory strengthened by the fact that premature infants weighing less than 3 pounds 5 ounces are most at risk for NEC. Typically, the more premature a baby is, the lower the baby's weight and the more undeveloped a baby's gastrointestinal immune system is. Harmful bacteria can then cross the gut barrier and activate the immune system. And because the immune system of preemies isn't fully developed, the result is an exaggerated inflammatory response that can cause tissue death.

The researchers' findings of low levels of IL-22 in neonatal tissues led to their next step: injecting the mice with IL-22. The protein aids in controlling inflammation while promoting regeneration of tightly packed cells lining the intestine. IL-22 can help strengthen the intestinal walls, creating a barrier in the gut that allows for nutrient absorption while preventing toxic or otherwise hostile microorganisms from seeping into the bloodstream.

"Interestingly, our work demonstrated that treatment with IL-22, in mice, protects the newborn intestine against damage caused by NEC," Good said. "Our study represents a substantial advance in understanding the role of IL-22 in early life and sets the stage for new ways to treat NEC in the future."

Credit: 
Washington University in St. Louis

Moffitt develops non-invasive approach to predict outcomes in lung cancer

TAMPA, Fla. (June 17, 2021) - Tests that analyze biomarkers are used during cancer management to guide treatment and provide information about patient prognosis. These tests are often performed on tissue biopsy samples that require invasive procedures and can lead to significant side effects. In a new article published in the Journal for ImmunoTherapy of Cancer, Moffitt Cancer Center researchers show that PET/CT images can be used to measure levels of the PD-L1 biomarker in non-small cell lung cancer (NSCLC) patients in a non-invasive manner and, in turn, predict a patient's response to therapy.

Checkpoint inhibitors, drugs used to reactivate the immune system by targeting the PD1/PD-L1 signaling pathway, are commonly used to treat patients with NSCLC. While this type of therapy has greatly improved patient outcomes, it only works for roughly half of this patient population. To avoid treating those who may not respond, checkpoint inhibitor therapy is often limited to patients whose surgical biopsy specimen is shown to express the PD-L1 biomarker. However, performing an invasive surgery is associated with inherent risks, and occasionally the biopsy sample is not adequate for diagnostic testing or the testing procedure itself may fail. Therefore, researchers are trying to develop alternative strategies to identify patients who should be treated with targeted agents such as checkpoint inhibitors.

Moffitt researchers wanted to take advantage of the capabilities of computer deep learning to develop a new framework to measure PD-L1 biomarker levels in NSCLC patients in a non-invasive manner. They chose to use features of PET/CT scan images, such as shape, size, pixel intensity and textures, to train computers to measure PD-L1 expression. They developed a deep learning score to predict PD-L1 expression and, after validation in different patient cohorts, were able to use their model to predict checkpoint inhibitor outcomes in NSCLC patients.
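
As a rough illustration of this kind of pipeline (and emphatically not the authors' deep-learning model), the sketch below trains a simple classifier on synthetic stand-ins for radiomic feature vectors and reads its predicted probability as a stand-in for the "deep learning score":

```python
# Highly simplified stand-in for the kind of pipeline described above -- NOT
# the authors' model. It trains a classifier on placeholder radiomic
# features (shape, intensity, texture summaries) to predict PD-L1
# positivity, using synthetic data throughout.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))  # placeholder radiomic feature vectors
# Placeholder PD-L1 labels, loosely tied to two of the features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# The predicted probability plays the role of the image-derived PD-L1 score
score = model.predict_proba(X_te)[:, 1]
print(f"validation AUC on synthetic data: {roc_auc_score(y_te, score):.2f}")
```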

"These data demonstrate the feasibility of using an alternative non-invasive approach to predict expression of PD-L1," said Matthew Schabath, Ph.D., associate member of the Cancer Epidemiology Department. "This approach could help physicians determine optimum treatment strategies for their patients, especially when tissue samples are not available or when common testing approaches for PD-L1 fail."

"This study is important, as it is the single largest multi-institutional radiomic study population of NSCLC patients to date treated with immunotherapy to predict PD-L1 status and subsequent treatment response using PET/CT scans," said Robert Gillies, Ph.D., chair of the Department of Cancer Physiology at Moffitt. "Because images are routinely obtained and are not subject to sampling bias per se, we propose that the individualized risk assessment information provided by these analyses may be useful as a future clinical decision support tool pending larger prospective trials."

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

Sulfur enhances carbon storage in the Black Sea

image: So-called rosette samplers are used to take water samples at different water depths. The Oldenburg team analyzed the distribution of dissolved organic matter in the Black Sea.

Image: 
© Nelli Sergeeva

The Black Sea is an unusual body of water: below a depth of 150 metres the dissolved oxygen concentration sinks to around zero, meaning that higher life forms such as plants and animals cannot exist in these areas. At the same time, this semi-enclosed sea stores comparatively large amounts of organic carbon. A team of researchers led by Dr Gonzalo V. Gomez-Saez and Dr Jutta Niggemann from the University of Oldenburg's Institute for Chemistry and Biology of the Marine Environment (ICBM) has now presented, in the scientific journal Science Advances, a new hypothesis as to why organic compounds accumulate in the depths of the Black Sea and other oxygen-depleted waters.

The researchers posit that reactions with hydrogen sulfide play an important role in stabilizing carbon compounds. "This mechanism apparently contributes to the fact that there is more than twice as much organic carbon in the waters of the Black Sea as in oxygen-rich marine areas," says Niggemann. "This provides a negative feedback in the climate system that could counteract global warming over geological periods."

In the Black Sea, which covers an area almost twice the size of France, conditions rarely found in other marine regions have prevailed for around 7,000 years: stable stratification largely prevents the mixing of surface and deep waters. The water in the upper 150 metres is low in salt and oxygen-rich, and comes mainly from rivers like the Danube. Below that, there is a layer of higher density saline water that flows into the Black Sea from the Mediterranean via the Bosporus.

"When you open a water sample from the deeper areas of the Black Sea, the smell of rotten eggs almost knocks you over," Niggemann says. On the surface, however, there is no indication that the Black Sea is a stagnant body of water in which, due to the lack of oxygen, bacteria produce foul-smelling hydrogen sulfide.

Hydrogen sulfide reacts with dissolved organic matter

As the new study shows, this highly reactive molecule binds with substances from a diverse group of carbonaceous materials that are present in every litre of seawater. These substances are known as dissolved organic matter (DOM) - a complex mixture of countless different molecules that are the product of decomposed organic matter or bacterial metabolic processes.

"We were able to show very clearly that hydrogen sulfide reacts with the extremely diluted organic matter directly in the water," Niggemann explains. The products of the reaction are potentially more durable than the starting materials and therefore accumulate in the water.

The team compared water samples from different locations in the Black Sea and other seas and rivers. Using various analytical methods, including the ultrahigh resolution mass spectrometer of the Marine Geochemistry research group at the University of Oldenburg, the researchers were able to characterize the dissolved organic matter in detail.

They found that almost a fifth of the organic molecules in the anoxic areas of the Black Sea contained sulfur - significantly more than in other seas. In addition, the team was able to establish that a high proportion of these compounds are only found in these areas, leading the researchers to conclude that the sulfur compounds form there through chemical reactions in the sulfidic water.

Negative feedback relevant on geological time scales

Given that huge amounts of carbon are stored in dissolved organic matter - the world's oceans contain roughly as much dissolved organic carbon as there is CO2 in the Earth's atmosphere - the results of this new study are also relevant for the climate. "The volume of ocean waters completely depleted of oxygen quadrupled between 1960 and 2010. Consequently, this sulfur-based mechanism of carbon storage could influence the chemistry of the oceans in the future," says Gomez-Saez, the lead author of the study. But this negative feedback is too weak to have a noticeable impact on climate change under the current conditions, he adds. In geological history, however, there have been several periods during which large areas of the oceans were oxygen-deficient. During these periods this effect could have contributed to long-term removal of carbon dioxide from the atmosphere.

The water samples from the Black Sea were taken during an expedition with the research vessel Maria S. Merian. In addition to the team from the ICBM, researchers from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) in Bremerhaven, the MARUM - Center for Marine Environmental Sciences of the University of Bremen, and the Max Planck Institute for Marine Microbiology in Bremen participated in the study.

Credit: 
University of Oldenburg

Biodiversity imperiled

Woodlands along streams and rivers are an important part of California's diverse ecology. They are biodiversity hotspots, providing various ecosystem services including carbon sequestration and critical habitat for threatened and endangered species. But our land and water use have significantly impacted these ecosystems, sometimes in unexpected ways.

A team of researchers, including two at UC Santa Barbara, discovered that some riparian woodlands are benefitting from water that humans divert for our own needs. Although it seems like a boon to these ecosystems, the artificial supply of water begets an unintended dependence on this bounty, threatening the long-term survival of natural forest communities. The paper, published in the Proceedings of the National Academy of Sciences, spotlights the need for changes in the way water is managed across the state.

"We need to be more intentional in incorporating ecosystem water needs when we manage water--both for aquatic organisms and species on land," said lead author Melissa Rohde, a groundwater scientist at The Nature Conservancy who led the research as a doctoral student at State University of New York College of Environmental Science and Forestry (SUNY-ESF). "These forest ecosystems are in a precarious state because we have disrupted the natural hydrologic processes that these plant species rely upon to support and sustain key life processes."

In California's Mediterranean climate, plants and animals have adapted to rely on precipitation and soil moisture recharge during the rainy winter and spring seasons for reproduction and growth during the typically dry summers. Once soil moisture is exhausted, tree species often found in stream corridors, such as willows, cottonwoods and oaks, typically use deeper groundwater. However, the researchers discovered the story was more complicated.

By analyzing five years of vegetation greenness data from satellite imagery, the authors found that in some cases, these ecosystems were affected by "subsidies of water" delivered via human regulation of rivers, agricultural canals and discharges from wastewater treatment plants. Altered streamside woodlands in the most arid regions of the state stayed greener longer into the dry season and were less responsive to changes in groundwater levels than natural ecosystems.
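
Vegetation greenness is commonly derived from satellite imagery with an index such as the Normalized Difference Vegetation Index (NDVI), computed per pixel from the red and near-infrared bands. The sketch below shows that standard formula on placeholder reflectance values; it does not reproduce the study's actual processing pipeline.

```python
import numpy as np

# Sketch of how vegetation "greenness" is commonly quantified from satellite
# imagery using NDVI. The arrays below stand in for the red and near-infrared
# reflectance bands of one scene; values are placeholders.
red = np.array([[0.08, 0.10], [0.12, 0.09]])  # placeholder red-band reflectance
nir = np.array([[0.40, 0.35], [0.20, 0.45]])  # placeholder near-infrared reflectance

ndvi = (nir - red) / (nir + red)  # ranges from -1 to 1; higher = greener canopy
print(ndvi.round(2))
```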

"Although this seems like a good news story -- trees benefit from anthropogenic water management -- there is an important caveat," said co-author Michael Singer, a researcher at UC Santa Barbara's Earth Research Institute and a professor at Cardiff University in the United Kingdom. "In channels and canals with severely altered flow regimes, there are few if any opportunities for these trees to spawn new offspring. This means that once these riparian woodlands die off, they will not be replaced through forest succession."

Many of the most-altered stream ecosystems are in California's Central Valley, the state's agricultural hub, which produces a third of the produce for the United States. Following the Gold Rush in the 1850s, massive human settlement led to clearing of 95% of the natural floodplain woodlands across the region. These isolated and restricted riparian, or streamside, forests now provide important habitat for threatened and endangered species like the California red-legged frog, Chinook salmon and Swainson's hawk.

As water is rerouted from rivers into canals to accommodate urbanization and the multibillion-dollar agricultural industry, it creates an artificially stable environment for riparian woodland ecosystems. This encourages a "live fast, die young" community that favors trees that peak and then decline within a few decades. Key ecosystem functions -- such as the regeneration of new forest stands and their development over time -- are being compromised by the extensive alterations to streamflow and to river channels, which are fixed in place and no longer create new floodplain areas where young trees can establish.

"We call these forests the 'living dead' because the forest floor is devoid of saplings and younger trees that can replace the mature trees when they die," Rohde said. This has repercussions related to habitat for endangered species, biodiversity, carbon sequestration and climate change.

"California is one of the most biodiverse regions in the world, containing more species than the rest of the United States and Canada combined," said Rohde. "In the midst of the sixth mass extinction, the long-term sustainability of California's river ecosystems and the preservation of the rare and endemic species that live within them now rely on the deliberate, coordinated management of resource and government agencies."

This study is part of a $2.5 million suite of projects that the collaborators at SUNY-ESF, UC Santa Barbara and Cardiff University have funded throughout the U.S. Southwest and France. The investigators also include UCSB geography professor Dar Roberts, one of the study's co-authors. The goal is to develop water stress indicators for dryland riparian forest ecosystems threatened by climate change and increasing human water demand.

Rohde and The Nature Conservancy will use the insights from the study to provide scientific guidance to California natural resource agencies for sustainably managing groundwater-dependent ecosystems throughout the state. As Singer pointed out, the findings pertain to the recent sustainable groundwater legislation passed in California. The Sustainable Groundwater Management Act requires all groundwater stakeholders to agree on sustainability targets for groundwater usage to support urban areas, agriculture, industry and ecology.

The research team used publicly available online data and Google Earth Engine, a freely accessible tool for analyzing data from satellites and other global spatial datasets. "Our methods and findings open up a whole new world of interdisciplinary research possibilities and ways that water practitioners can consider ecosystem water needs to achieve sustainable water management," Rohde said.

John Stella, a SUNY-ESF professor and principal investigator on the National Science Foundation grant that funded the study, characterized the work as "groundbreaking" for the way it "combined several big datasets in an innovative way to understand how climate and water management interact to put these sensitive ecosystems at risk."

"[The] findings are important for sustainably managing groundwater, not only throughout California, but in water-limited regions worldwide," Stella said. "By creatively harnessing and integrating these large environmental datasets, we can now answer resource management questions at a scale that was previously impossible."

Credit: 
University of California - Santa Barbara

Depression, tau deposits seen in subset of middle-aged persons

image: Mitzi Gonzales, Ph.D., of The University of Texas Health Science Center at San Antonio, is lead author of a study suggesting that middle-aged people with depressive symptoms who carry a genetic variation called apolipoprotein (APOE) ε4 may be more at risk to develop tau protein accumulations in the brain's emotion- and memory-controlling regions.

Image: 
UT Health San Antonio

SAN ANTONIO (June 17, 2021) -- Middle-aged people with depressive symptoms who carry a genetic variation called apolipoprotein (APOE) ε4 may be more at risk to develop tau protein accumulations in the brain's emotion- and memory-controlling regions, a new study by researchers from The University of Texas Health Science Center at San Antonio (UT Health San Antonio) and collaborating institutions suggests.

The Journal of Alzheimer's Disease published the findings in its June 2021 print issue. The research is based on depression assessments and positron emission tomography (PET) imaging conducted among 201 participants in the multigenerational Framingham Heart Study. The mean age of these participants was 53.

Decades before diagnosis

PET scans typically are conducted in older adults, so the Framingham PET studies in middle-aged persons are unique, said Mitzi M. Gonzales, PhD, lead author of the study and a neuropsychologist with the Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases, which is part of UT Health San Antonio.

"This gives us an interesting opportunity to study people at midlife and get a sense of factors that might be associated with protein accumulations in individuals who are cognitively normal," Dr. Gonzales said. "These persons are likely decades before any type of dementia diagnosis, if they are to develop dementia in the future."

No associations found with amyloid beta

Amyloid beta (amyloid-β) and tau are proteins that aggregate in the brains of people with Alzheimer's disease and also typically increase to a milder extent with normal aging. The study found no associations of depressive symptoms and depression with amyloid-β. The only association was with tau, and only in APOE ε4 variant carriers. About one-fourth of the participants (47 of 201) were ε4 carriers, by virtue of having at least one copy of the ε4 allele.

Having one copy of APOE ε4 increases the risk of developing Alzheimer's by as much as two- to threefold, but some people with this variant live into their 80s and 90s and never develop the disease. "It's important to keep in mind that just because a person is identified as carrying the APOE ε4 allele does not mean that he will develop dementia in the future," Dr. Gonzales said. "It just means that the risk is higher."

Depressive symptoms (and depression if symptoms were severe enough to reach that diagnostic threshold) were evaluated with the Center for Epidemiological Studies Depression Scale at the time of PET imaging, as well as eight years prior. Associations between depressive symptoms and depression with PET outcomes at both time points were evaluated, adjusting for age and sex.
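
A minimal sketch of this kind of covariate-adjusted association test is shown below. The data are synthetic placeholders, not Framingham measurements; the point is only the structure of the model, with a PET outcome regressed on the depressive score while adjusting for age and sex.

```python
# Illustrative sketch of testing an association while adjusting for age and
# sex, in the spirit of the analysis described above. All data are synthetic
# placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "depressive_score": rng.normal(10, 4, n),  # e.g., a CES-D-style total
    "age": rng.normal(53, 8, n),
    "sex": rng.integers(0, 2, n),              # 0/1 coding
})
# Placeholder PET outcome, loosely tied to the depressive score
df["tau_pet"] = 1.1 + 0.01 * df["depressive_score"] + rng.normal(0, 0.1, n)

model = smf.ols("tau_pet ~ depressive_score + age + sex", data=df).fit()
print(model.params["depressive_score"], model.pvalues["depressive_score"])
```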

Centers of emotion and cognition

The study showed associations between depressive symptoms and increased tau in two brain regions, the entorhinal cortex and amygdala. "These associations do not imply that the tau accumulation causes the depressive symptoms, or vice versa," Dr. Gonzales said. "We only note that both are present in the ε4 carriers."

The entorhinal cortex is important for memory consolidation and tends to be an area where protein deposition occurs early on, she noted. The amygdala, meanwhile, is considered the emotion center of the brain.

"Longitudinal studies are needed to further understand what is happening, but it is intriguing to think about the clinical significance of our findings in terms of cognition as well as emotional regulation," Dr. Gonzales said.

Credit: 
University of Texas Health Science Center at San Antonio

'Nanodecoy' therapy binds and neutralizes SARS-CoV-2 virus

video: "Nanodecoys" made from human lung spheroid cells (LSCs) can bind to and neutralize SARS-CoV-2, promoting viral clearance and reducing lung injury in a macaque model.

Image: 
Ke Cheng, NC State University

Nanodecoys made from human lung spheroid cells (LSCs) can bind to and neutralize SARS-CoV-2, promoting viral clearance and reducing lung injury in a macaque model of COVID-19. By mimicking the receptor that the virus binds to rather than targeting the virus itself, nanodecoy therapy could remain effective against emerging variants of the virus.

SARS-CoV-2 enters a cell when its spike protein binds to the angiotensin-converting enzyme 2 (ACE2) receptor on the cell's surface. LSCs - a natural mixture of lung epithelial stem cells and mesenchymal cells - also express ACE2, making them a perfect vehicle for tricking the virus.

"If you think of the spike protein as a key and the cell's ACE2 receptor as a lock, then what we are doing with the nanodecoys is overwhelming the virus with fake locks so that it cannot find the ones that let it enter lung cells," says Ke Cheng, corresponding author of the research. "The fake locks bind and trap the virus, preventing it from infecting cells and replicating, and the body's immune system takes care of the rest."

Cheng is the Randall B. Terry Jr. Distinguished Professor in Regenerative Medicine at North Carolina State University and a professor in the NC State/UNC-Chapel Hill Joint Department of Biomedical Engineering.

Cheng and colleagues from NC State and UNC-CH converted individual LSCs into nanovesicles, or tiny cell membrane bubbles with ACE2 receptors and other lung cell-specific proteins on the surface.

They confirmed that the spike protein did bind to the ACE2 receptors on the decoys in vitro, then used a fabricated SARS-CoV-2 mimic virus for in vivo testing in a mouse model. The decoys were delivered via inhalation therapy. In mice, the nanodecoys remained in the lungs for 72 hours after one dose and accelerated clearance of the mimic virus.

Finally, a contract research organization conducted a pilot study in a macaque model and found that inhalation therapy with the nanodecoys accelerated viral clearance, and reduced inflammation and fibrosis in the lungs. Although no toxicity was noted in either the mouse or macaque study, further study will be necessary to translate this therapy for human testing and determine exactly how the nanodecoys are cleared by the body.

"These nanodecoys are essentially cell 'ghosts,' and one LSC can generate around 11,000 of them," Cheng says. "Deploying millions of these decoys exponentially increases the surface area of fake binding sites for trapping the virus, and their small size basically turns them into little bite-sized snacks for macrophages, so they are cleared very efficiently."

The researchers point out three other benefits of the LSC nanodecoys. First, they can be delivered non-invasively to the lungs via inhalation therapy. Second, since the nanodecoys are acellular - there's nothing living inside - they can be easily preserved and remain stable longer, enabling off-the-shelf use. Finally, LSCs are already in use in other clinical trials, so there is an increased likelihood of being able to use them in the near future.

"By focusing on the body's defenses rather than a virus that will keep mutating we have the potential to create a therapy that will be useful long-term," Cheng says. "As long as the virus needs to enter the lung cell, we can keep tricking it."

The research appears in Nature Nanotechnology and was supported by the National Institutes of Health and the American Heart Association. Dr. Jason Lobo, pulmonologist at UNC-CH, is co-author of the paper.

Credit: 
North Carolina State University

Changing a 2D material's symmetry can unlock its promise

TROY, N.Y. -- Optoelectronic materials that are capable of converting the energy of light into electricity, and electricity into light, have promising applications as light-emitting, energy-harvesting, and sensing technologies. However, devices made of these materials are often plagued by inefficiency, losing significant useful energy as heat. To break the current limits of efficiency, new principles of light-electricity conversion are needed.

For instance, many materials that exhibit efficient optoelectronic properties are constrained by inversion symmetry, a physical property that limits engineers' control of electrons in the material and their options for designing novel or efficient devices. In research published today in Nature Nanotechnology, a team of materials scientists and engineers, led by Jian Shi, an associate professor of materials science and engineering at Rensselaer Polytechnic Institute, used a strain gradient to break that inversion symmetry, creating, for the first time, a novel optoelectronic phenomenon in the promising material molybdenum disulfide (MoS2).

To break the inversion symmetry, the team placed a vanadium oxide (VO2) wire underneath a sheet of MoS2. Molybdenum disulfide is a flexible material, Shi said, so it deformed its original shape to follow the curve of the VO2 wire, creating a gradient within its crystal lattice. Imagine what would happen if you placed a piece of paper over a pencil that was sitting on a table. The varied tension created in the paper is like the strain gradient formed in the MoS2 lattice.

That gradient, Shi said, breaks the material's inversion symmetry and allows electrons traveling within the crystal to be manipulated. The unique photo-response observed near the strain gradient allows a current to flow through the material. It's known as the flexo-photovoltaic effect, and it could be harnessed to design novel and/or high-efficiency optoelectronics.

"This is the first demonstration of such an effect in this material," Shi said. "If we have a solution that does not create heat during photon-electricity conversion, then the electronic devices or circuits could be improved."

Vanadium oxide is very sensitive to temperature, so the team was also able to demonstrate that the flexo-photovoltaic effect brought about temperature dependence at the site where the MoS2 and VO2 materials meet -- changing the lattice's gradient accordingly.

"This discovery suggests a novel principle that could be used for remote thermal sensing," said Jie Jiang, a postdoctoral research fellow in Shi's lab and the first author on this paper.

What the team was able to demonstrate here, Shi said, not only shows great promise for this material, but also suggests the potential of using such an approach in engineering other materials with favorable optoelectronic properties that are plagued by inversion symmetry.

Credit: 
Rensselaer Polytechnic Institute

Most comprehensive RNA-Atlas ever

The article 'The RNA Atlas expands the catalog of human non-coding RNAs', published today in Nature Biotechnology, is the result of more than five years of hard work to further unravel the complexity of the human transcriptome. Never before has such a comprehensive effort been undertaken to characterize all RNA molecules in human cells and tissues.

RNAs in all shapes and sizes

Our transcriptome is - analogous to our genome - the sum of all RNA molecules that are transcribed from the DNA strands that make up our genome. However, there's no one-to-one relationship with the latter. Firstly, each cell and tissue has a unique transcriptome, with varying RNA production and composition, including tissue-specific RNAs. Secondly, not all RNAs are transcribed from typical - protein coding - genes that eventually produce proteins. Many of our RNA molecules are not used as a template to build proteins, but originate from what once was called junk DNA: long sequences of DNA with unknown functions.

These non-coding RNAs (ncRNAs) come in all kinds of shapes and sizes: short, long, and even circular RNAs. Many of them even lack the tail of adenine molecules that is typical for protein-coding RNAs.

300 human cell and tissue types and three sequencing methods

"There have been other projects to catalogue our transcriptome but the RNA-Atlas project is unique because of the applied sequencing methods," says prof. Pieter Mestdagh from the Center for Medical Genetics at Ghent University. "Not only did we look at the transcriptome of as many as 300 human cell and tissue types, but most importantly, we did so with three complementary sequencing technologies, one aimed at small RNAs, one aimed at polyadenylated (polyA) RNAs and a technique called total RNA sequencing."

This last sequencing technology led to the discovery of thousands of novel non-coding RNA genes, including a novel class of non-polyadenylated single-exon genes and many new circular RNAs. By combining and comparing the results of the different sequencing methods, the researchers were able to define, for every measured RNA transcript, its abundance in the different cells and tissues, whether it has a polyA tail (it appears that for some genes this can differ from cell type to cell type), and whether it is linear or circular. Moreover, the consortium searched for and found important clues to the function of some of the ncRNAs. By looking at the abundance of different RNAs in different cell types, they found correlations that indicate regulatory functions, and could determine whether this regulation happens at the transcriptional level (by preventing or stimulating transcription of protein-coding genes) or post-transcriptionally (e.g., by breaking down RNAs).
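
The polyA call, for example, falls out of comparing a transcript's abundance across library types. The toy sketch below illustrates that logic with made-up transcripts, abundances, and an arbitrary ratio threshold; the consortium's actual statistical criteria are not reproduced here.

```python
import pandas as pd

# Toy sketch of the logic for calling a transcript polyadenylated: compare
# its abundance in a polyA-selected library with a total RNA library. The
# transcripts, abundances, and 0.5 ratio threshold are all illustrative.
df = pd.DataFrame({
    "transcript": ["gene_A", "gene_B", "circ_C"],
    "polyA_tpm": [120.0, 1.5, 0.2],    # abundance after polyA selection
    "total_tpm": [110.0, 40.0, 35.0],  # abundance in total RNA sequencing
})

ratio = df["polyA_tpm"] / df["total_tpm"]
df["call"] = ["polyadenylated" if r > 0.5 else "non-polyadenylated" for r in ratio]
print(df)
```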

An invaluable resource for biomedical science

All data, analyses and results (equivalent to a few libraries of information) are available for download and interrogation in the R2 web portal, enabling the community to implement this resource as a tool for exploration of non-coding RNA biology and function.

Prof. Pavel Sumazin of the Baylor College of Medicine: "By combining all data in one comprehensive catalogue, we have created a new valuable resource for biomedical scientists around the world studying disease processes. A better understanding of the complexity of the transcriptome is indeed essential to better understand disease processes and uncover novel genes that may serve as therapeutic targets or biomarkers. The age of RNA therapeutics is swiftly rising - we've all witnessed the impressive creation of RNA vaccines, and already the first medicines that target RNA are used in the clinic. I'm sure we'll see lots more of these therapies in the next years and decades."

Credit: 
Ghent University

Researchers propose methods for additive manufacturing quality control

Additive manufacturing offers an unprecedented level of design flexibility and expanded functionality, but the quality and process can drastically differ across production machines, according to Hui Yang, a professor of industrial engineering at Penn State. With applications in aerospace, health care and automotive industries with potential for mass customization, additive manufacturing needs quality management.

To address this concern, Yang and a team of researchers from Penn State, University of Nebraska--Lincoln and the National Institute of Standards and Technology (NIST) proposed the design, development and implementation of a new data-driven methodology for quality control in additive manufacturing. They published their work in the Proceedings of the IEEE.

"Like an ecosystem, we have people working in isolated efforts in different areas of additive manufacturing, and systems engineers can help connect the dots to provide a framework for quality management," Yang said. "Quality is indispensable, and if we design a system-level framework of quality management from the start, then we have higher quality and better productivity at less cost. Ultimately, everyone wants to do high-precision, high-end manufacturing, but if quality suffers at any step during production, you lose the competitive advantage needed for the global market. Leveraging data to control and ensure high quality products helps keep that advantage."

The team worked together to analyze various academic papers to deduce a six sigma framework of quality control for additive manufacturing, which led to their proposed systems engineering approach.

The method hinges on six sigma, a popular approach that uses data-driven tactics to eliminate defects, drive profits and improve quality of products. Through their detailed analysis, the team suggested that this five-step approach of defining, measuring, analyzing, improving and controlling can strengthen quality management when applied to additive manufacturing.
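
For the final "control" step, a data-driven implementation often takes the form of statistical process control. The sketch below computes classic 3-sigma control limits over a monitored build variable; the melt-pool temperature readings are synthetic placeholders, not data from the paper.

```python
import numpy as np

# Sketch of the "control" step of the six sigma cycle: a simple 3-sigma
# control chart over a monitored process variable. The melt-pool temperature
# readings below are synthetic placeholders.
rng = np.random.default_rng(42)
readings = rng.normal(loc=1650.0, scale=12.0, size=50)  # e.g., deg C per layer

center = readings.mean()
sigma = readings.std(ddof=1)
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

out_of_control = np.where((readings > ucl) | (readings < lcl))[0]
print(f"center={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}), flagged layers={out_of_control}")
```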

"Via the research we analyzed, we identified the critical challenges of additive manufacturing and where quality standards are lacking," Yang said. "For each step in the process, you need to identify the sticking points, which is where methods such as machine learning can come into play and help show an engineer or designer how to control the process to avoid defects."

Tim Simpson, Paul Morrow Professor in Engineering Design and Manufacturing and professor of mechanical engineering at Penn State, explained that such defects can become massive liabilities when considered in the context of mass-produced products.

"If your goal is to use additive manufacturing to make parts for a car or a plane, then that part better not fail," Simpson said.

He also noted the cost of failed parts can add up -- a failed metal build, he said, could "easily cost 10 to 20 thousand dollars and require multiple iterations along the way."

By looking for the quality gaps in the standards for mass-produced parts, Simpson said their proposed methodology is critical to ensure quality production with additive manufacturing for both high volume and custom products.

"Quality control processes and methods are established for mass production, where you make hundreds to millions of things," Simpson said. "Additive manufacturing enables customization, and the current quality control methods and accepted practices do not readily apply when you are only making one or a few of an item. We have to think differently to ensure highly quality parts."

Soundar Kumara, Allen E. Pearce and Allen M. Pearce Professor of Industrial Engineering at Penn State, noted that their review represents the state of the art in additive manufacturing technologies and can help researchers by providing a comprehensive understanding of the tools and techniques.

Credit: 
Penn State

Quaise Inc. drilling technology could allow geothermal to power the world

image: Simplified schematic of the Quaise drilling rig: (1) millimeter wave drilling components interfaced with conventional drilling rig at the surface, (2) conventional drilling from surface down to basement rock, (3) millimeter wave drilling from basement down to target depth. Source: Quaise Inc.

Image: 
Quaise Inc.

CAMBRIDGE, MA--Geothermal energy systems have the potential to power the world and become the leading technology for reducing greenhouse gas emissions if we can drill down far enough into the Earth to access the conditions necessary for economic viability and release the heat beneath our feet. Quaise Inc. is developing a potentially disruptive and completely unique drilling technology to make that happen.

That was the takeaway from a paper presented by Matt Houde of Quaise at the World Geothermal Congress (WGC) on June 15. Houde not only described the company's technology, which was pioneered at MIT, but also presented several calculations and a cost model showing its technical and economic feasibility.

Houde's coauthors are Quaise CEO Carlos Araque, Ken Oglesby of Impact Technologies LLC, and Paul Woskov of the MIT Plasma Science and Fusion Center (PSFC).

Accessing the Mother Lode

The mother lode of geothermal energy lies some 2 to 12 miles beneath the Earth's surface, where conditions are so extreme (temperatures are over 374 degrees C, or 704 degrees F) that water pumped there would become supercritical, a steam-like phase that most people aren't familiar with. (Familiar phases are liquid water, ice, and the vapor that makes clouds.) Supercritical water can carry some 5-10 times more energy than regular hot water, making it an extremely efficient energy source if it could be pumped above ground to turbines that convert it into electricity.
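
For a rough sense of scale, a back-of-envelope comparison (not a figure from the WGC paper) using rounded specific enthalpies from standard steam tables lands in the quoted range:

# Assumed, rounded steam-table values for illustration.
h_supercritical = 2580.0  # kJ/kg, water at roughly 25 MPa and 400 degrees C
h_hot_liquid = 420.0      # kJ/kg, liquid water at roughly 100 degrees C

print(f"energy carried per kilogram: {h_supercritical / h_hot_liquid:.1f}x")
# Prints about 6.1x, within the 5-10 times range quoted above.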

Today we can't access those conditions except in Iceland and other areas where they are relatively close to the surface. The number one problem: we can't drill down far enough. The drills used by the oil and gas industries can't withstand the formidable temperatures and pressures that are found miles down.

Houde began his talk with a quote from the Department of Energy's 2019 Geovision report, an analysis of the geothermal industry in the United States: "Supercritical resources can be found everywhere on Earth by drilling deep enough...Drilling to this depth is financially prohibitive with existing technology...Economic production of supercritical resources will require the development of entirely new classes of drilling technologies and methods."

Quaise is working to that end. The company's technique replaces the conventional drill bits that mechanically break up the rock with millimeter wave energy (a cousin of the microwaves many of us cook with). Those millimeter waves (MMWs) melt and then vaporize the rock to create ever deeper holes. The title of Houde's WGC talk: "Rewriting the Limits for Deep Geothermal Drilling: Direct Energy Drilling Using Millimeter Wave Technology."

"It sounds like sci-fi technology, but it's not," says Houde. "It is definitely real, and it's feasible and practical. It's just a matter of implementing it and validating it in the lab and in the field."

A Strong Foundation

Houde emphasizes that the Quaise approach is based on technology "that's already mature and commercialized," having been developed over decades for fusion energy research and for the oil and gas industries. Quaise is simply repurposing that technology for a different application.

For example, the MMW energy key to the technology is produced with a gyrotron machine and directed to its target (deep, hot rock) via waveguides. Both were developed over some 50 years of research into nuclear fusion as an energy source. The Quaise technique also takes advantage of conventional drilling technologies developed by the oil and gas industries. Quaise will still use these to drill down through surface layers to bedrock, which is what they were optimized for.

Then the system will switch to the MMW technology. The latter "simplifies everything downhole such that nothing is particularly sensitive to the high temperatures and pressures. That allows us to mitigate many of the issues we have with conventional mechanical rigs at these depths," Houde says.

Running the Numbers

Houde presented several calculations showing the technical feasibility of the Quaise approach. For example, he showed that the drilling rate even several miles into the Earth should be roughly the same as that for conventional geothermal drilling. Further, the Quaise MMW technology automatically melts the rock to create a strong glass "liner" that prevents the hole from collapsing and protects the waveguide. About six miles down, that liner would replace the cement casings currently used to protect the boreholes drilled mechanically nearer the surface. Doing away with the mechanical bit also eliminates other problems, such as the downtime spent retrieving a broken drill bit from miles underground.

Houde also presented calculations regarding the removal of the vaporized rock, which is done using existing compressor technology to pump a purge gas down into the hole along with the MMW energy. Think of the general setup as a straw within a larger straw. The energy and gas travel downhole through the inner straw where they eventually reach and vaporize the rock at the bottom. Then the gas carrying the vaporized rock, or particulate, travels back up to the surface through the space between the two straws. "Our calculations show that the particulate can be conveyed uphole with downhole pressures and flow rates that fall within bounds of existing compressors," Houde says.
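
To make that kind of mass-balance check concrete, here is an invented back-of-envelope version; every number in it (hole diameter, advance rate, rock density, solids loading) is an assumption for illustration, and none comes from Houde's calculations.

import math

bore_diameter = 0.20          # m, assumed hole diameter
penetration_rate = 5 / 3600   # m/s, an assumed advance rate of about 5 m/h
rock_density = 2700.0         # kg/m^3, typical of crystalline basement rock

# Mass of rock vaporized per second at that advance rate.
rock_mass_rate = rock_density * math.pi * (bore_diameter / 2) ** 2 * penetration_rate

# Purge gas needed if each kilogram of gas carries half a kilogram of particulate.
loading_ratio = 0.5           # kg particulate per kg gas, assumed
gas_mass_rate = rock_mass_rate / loading_ratio

print(f"rock removed: {rock_mass_rate * 3600:.0f} kg/h")
print(f"purge gas required: {gas_mass_rate:.2f} kg/s")

Under these assumptions the drilling produces a few hundred kilograms of particulate per hour, requiring gas flows on the order of a quarter kilogram per second, modest figures consistent with the conclusion Houde reports.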

A cost model of the economic feasibility of the Quaise approach is also promising. Houde notes that few geothermal wells have been drilled past ten kilometers (~six miles), but to get that far using conventional technology costs more than $5,000 per meter. The cost model indicates that MMW drilling could reach twice that depth at drilling costs of around $1,000 per meter.
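
The arithmetic behind that comparison is simple; plugging in the article's round numbers (an illustration only, not Quaise's full cost model):

conventional_cost_per_m = 5000   # USD per meter, conventional drilling to ~10 km
mmw_cost_per_m = 1000            # USD per meter, projected MMW drilling

print(f"conventional well to 10 km: ${conventional_cost_per_m * 10_000:,}")
print(f"MMW well to 20 km:          ${mmw_cost_per_m * 20_000:,}")
# Prints $50,000,000 versus $20,000,000: twice the depth for well under
# half the drilling cost, if the projections hold.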

What's Next?

Although experiments at MIT have shown the general feasibility of drilling with MMW energy, the technique must still be proved in the field. Quaise aims to do just that over the next few years out in the western United States, working in collaboration with Altarock, MIT's PSFC, Oak Ridge National Laboratory, Impact Technologies, and General Atomics.

The company's backers include the Advanced Research Projects Agency-Energy (Houde is project manager for the ARPA-E grant), The Engine at MIT, Vinod Khosla, and Collaborative Fund.


Credit: 
Science Communications

Anti-science, partisan tweets could flag an outbreak

In the realm of social media, anti-science views about COVID-19 align so closely with political ideology -- especially among conservatives -- that their predictability offers a strategy to help protect public health, a new USC study shows.

Resistance to science, including the efficacy of masks and vaccines, poses a challenge to conquering the coronavirus crisis. The goal of achieving herd immunity won't happen until society achieves consensus about science-based solutions.

The USC study's machine-learning-assisted analysis of social media communications offers policymakers and public health officials new tools to anticipate shifts in attitudes and respond proactively.

"We show that anti-science views are aligned with political ideology, specifically conservatism," said Kristina Lerman, lead author of the study and a professor at the USC Viterbi School of Engineering. "While that's not necessarily brand new, we discovered this entirely from social media data that gives detailed clues about where COVID-19 is likely to spread so we can take preventive measures."

The study was published in the Journal of Medical Internet Research.

Previous surveys and polls have shown a partisan gulf in views about COVID-19 as well as the costs and benefits of remedies. By contrast, the USC study examined public health attitudes based on tweets posted between Jan. 21, 2020, and May 1, 2020.

The researchers hand-labeled a sample of users along three dimensions -- liberal versus conservative, pro-science versus anti-science, and hardline versus moderate -- then trained machine-learning algorithms to classify the remaining users. They used geographical data to pare 115 million tweets worldwide down to 27 million tweets by 2.4 million users in the United States.
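
The general recipe -- label a seed set by hand, then let a trained classifier label everyone else -- can be sketched in a few lines. The snippet below is a minimal illustration with invented tweets and labels, using a standard text-classification baseline rather than the study's actual pipeline.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A tiny hand-labeled seed set (all examples invented).
seed_tweets = [
    "Masks and vaccines are backed by solid evidence",
    "Listen to the epidemiologists and flatten the curve",
    "The virus is a hoax, ignore the so-called experts",
    "Masks are tyranny and the case numbers are faked",
]
seed_labels = ["pro-science", "pro-science", "anti-science", "anti-science"]

# TF-IDF features plus logistic regression, a common baseline for text.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(seed_tweets, seed_labels)

# The trained model can then label the remaining users' tweets at scale.
print(model.predict(["Vaccines are backed by strong evidence"]))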

The researchers further parsed the data by demographics and geography and tracked it over the three-month study period. Aided by advanced computing techniques, this approach allowed near real-time, fine-grained monitoring of partisan and anti-science attitudes.

Assessing anti-science views can help tailor communication strategies and brace for outbreaks

What emerged is the ability to track public discourse around COVID-19 and compare it with epidemiological outcomes. For example, the researchers found that anti-science attitudes posted between January and April 2020 were high in some Mountain West and Southern states that were later hit with deadly COVID-19 surges.

In addition, the researchers were able to probe specific topics important to each group: anti-science conservatives were focused on political topics, including former President Trump's reelection campaign and QAnon conspiracies, while pro-science conservatives paid attention to global outbreaks of the virus and focused more on preventive measures to "flatten the curve." Researchers were able to track attitudes across time and geography to see how they changed. For example, to their surprise, they found that polarization on the topic of science went down over time.

Perhaps most encouraging, they discovered that, even in a highly polarized population, "the number of pro-science, politically moderate users dwarfs other ideological groups, especially anti-science groups." They said their results suggest most people are ready to accept scientific evidence and trust scientists.

The findings can also help policymakers and public health officials. If they see anti-science sentiment growing in one region of the country, they can tailor messages to mitigate distrust of science while also preparing for a potential disease outbreak.

"Now we can use social media data for science, to create spatial and temporal maps of public opinions along ideological lines, pro- and anti-science lines," said Lerman, a computer scientist and expert in mining social media for clues about human behavior at USC's Information Sciences Institute. "We can also see what topics are important to these segments of society, and we can plan proactively to prevent disease outbreaks from happening."

Credit: 
University of Southern California

Cases of serious parasitic disease on the rise in Alberta, Canada

image: Infectious diseases expert Stan Houston helped lead a new study showing that cases of the parasitic disease have been on the rise in Alberta since 2013. "We should be paying attention, but it's still a very rare disease," he says, noting that simple precautions like handwashing can help prevent it.

Image: 
Photo: Supplied

A rare parasitic infection imported from Europe continues to take root in Alberta, Canada. The province is now the North American hotspot for human alveolar echinococcosis (AE), which takes the form of a growth in the liver, causing serious and potentially deadly health complications.

A recently published review of known AE cases in Alberta found 17 instances of human AE diagnosed in the province between 2013 and 2020. Before the recent surge, only two cases of human AE had ever been confirmed in North America--one in Manitoba in 1928 and another in the U.S. state of Minnesota in 1977.

"This parasite has now become very widely established in the wild in the Prairies. It's been found in Saskatchewan and in B.C., but Alberta has had most of the cases of human disease," said University of Alberta infectious diseases expert Stan Houston, who helped lead the study.

"We have been having on average more cases every year. There's been a lull since COVID-19, but I'm suspicious it reflects a slowdown in testing during the pandemic and that we may soon see a surge again."

According to Houston, the strain of AE found in the Alberta cases has been identified by scientists at the Faculty of Veterinary Medicine in Calgary as having originally come from Europe, likely in dogs brought to the area.

The parasite takes the form of a tiny tapeworm in canines--typically foxes and coyotes, but potentially pet dogs--and is considered to be relatively harmless to them. When a rodent ingests parasite eggs from canine feces, it gets a different form of the disease and develops a tumour, or parasitic growth, in the liver, which kills it. When the rodent is eaten by a canine, the parasite takes the tapeworm form again.

"We humans are taking the place of the rodent in the life cycle when we accidentally consume microscopic parasite eggs--maybe in strawberries or lettuce from a garden where a coyote passed through, or possibly a dog if it is carrying the parasite," said Houston, also noting that a human could become infected by petting a dog that has microscopic traces of canine feces in its hair and then touching food or their mouth, accidentally ingesting the parasite's eggs.

The European strain of AE has been notably successful in Alberta, quickly spreading in the wild. Increased human contact with coyotes as they have become urbanized, along with the growing number of people whose immune systems are weakened by disease or therapy, is likely to contribute to the number of humans developing the disease.

"In coyotes in Calgary and in Edmonton, more than half have been found to be carrying this parasite. So the new strain seems to not only be more virulent when it affects humans, but it seems to be super-effective in wild hosts," said Houston.

"Diseases from animals have always been important to keep an eye on," he added. "The COVID-19 pandemic has again elevated awareness of the number and importance of human diseases that are transmitted from animals."

Of the 17 cases found in Alberta, 11 patients lived in rural areas, 14 of them owned dogs and six were immunocompromised individuals--of special interest as the disease progresses faster in patients whose immune systems have been suppressed.

The symptoms of AE can be difficult to spot. The infection typically has an incubation period of several years before a patient begins showing signs of illness. Almost half of the cases in Alberta were found incidentally when the patient was being tested for a different illness. It's often found after an ultrasound shows abnormalities in the liver, followed by an investigative biopsy. When symptoms occur, they can include nonspecific pain, jaundice, weakness and weight loss--many of the same signs that could be expected from a cancerous tumour in the liver.

"In the majority of cases that was the people's first thought when they saw the imaging, that it was cancer," said Houston. "The symptoms would be indistinguishable from many other diseases in the liver, hence the need for a biopsy diagnosis."

If found early enough, treatment can involve surgery to remove the mass from the liver. Because the infection is initially symptomless, the parasite often grows unchecked, and by the time it is found, about two-thirds of patients are inoperable. In those cases, lifelong antiparasitic drugs are the only option. The drugs won't kill the parasite but will prevent it from growing further. If left untreated, the parasite can kill its human host within 10 to 15 years.

Currently, the most useful drug for controlling AE is not licensed in Canada (although it is widely available in most of the rest of the world) and is obtainable only through a special process in which a physician applies to both the government and the manufacturer.

Avoiding the parasite comes down to good hygiene and simple precautions. Houston said it's a good idea to wash your hands after handling your dog, especially if you suspect it has eaten a rodent or spent time in a dog park or an area that coyotes frequent. He also recommends thoroughly washing produce that grows in or close to the ground, such as lettuce or mushrooms.

The researchers are now working on a new study examining samples of liver biopsies from patients in Alberta where cancer wasn't found, to look for possible previously unrecognized cases of AE.

"That would give us a better picture of what's going on, but more importantly, would give us a chance to give those patients appropriate therapy," said Houston.

"We should be paying attention, but it's still a very rare disease," he added. "People should keep that in perspective, adopt health behaviours and not obsess about this."

Credit: 
University of Alberta Faculty of Medicine & Dentistry