
Another way "good" cholesterol is good: combatting inflammation

DALLAS, April 12, 2021 -- Testing how well "good" cholesterol particles reduce inflammation may help predict who is at heightened risk to develop cardiovascular disease caused by narrowed arteries, according to research published today in the American Heart Association's flagship journal Circulation.

Levels of high-density lipoprotein (HDL) cholesterol, known as "good cholesterol," are already a standard part of formulas used to predict cardiovascular risk. A new test of the anti-inflammatory function of HDL seems to provide additional information that is independent of the quantity of HDL. If the results are confirmed in broader populations and a test is developed for clinical use, adding anti-inflammatory capacity to risk scores may improve risk prediction and help people take steps to protect themselves against heart disease.

"HDL are very complex particles with anti-atherosclerotic functions that are not reflected by measuring just the cholesterol quantity," said senior study author Uwe J.F. Tietge, M.D., Ph.D., professor and head of the division of clinical chemistry at the Karolinska Institute in Stockholm, Sweden. "Atherosclerosis [plaque build-up in the arteries] underlying cardiovascular disease is increasingly recognized as a disease with a strong inflammatory component, and a central biological function of HDL is to decrease inflammation."

This study is the first to test whether better anti-inflammatory function of HDL particles protects against heart attacks and other serious heart-related events.

Participants included 680 white adults (average age of 59, 70% male) living in the Netherlands who were part of a large population study that began in 1997. All were healthy when they enrolled in the study. From the larger study, researchers identified participants who had a first cardiovascular disease event before the end of follow-up. HDL particles were analyzed in 340 people who experienced a first fatal or non-fatal heart attack, were diagnosed with heart problems caused by narrowed heart arteries (ischemic heart disease) or required a procedure to open clogged coronary arteries during the median 10.5-year follow-up period. These participants were matched to a control group of 340 people of the same age (within 5 years), sex, smoking status and HDL cholesterol levels who had no cardiovascular events during follow-up.

Several lab tests were performed for all participants at enrollment, including measuring the ability of isolated HDL particles to decrease the inflammatory response of endothelial cells lining blood vessels (called the anti-inflammatory capacity). Researchers also measured C-reactive protein, a substance that rises when there is more inflammation throughout the body, and cholesterol efflux capacity, a laboratory assessment of how efficiently HDL can remove cholesterol from cells that resemble those found in plaque.

The researchers found:

HDL anti-inflammatory capacity was significantly higher in people who remained healthy (31.6%) than in those who experienced a cardiovascular event (27%);

The association of anti-inflammatory capacity with cardiovascular events was independent of the established biomarkers of HDL cholesterol and C-reactive protein levels, and was also independent of cholesterol efflux capacity;

For every 22% increase in the ability of HDL particles to suppress inflammation in endothelial cells, participants were 23% less likely to have a cardiovascular event during the next decade;

The amount of protection from increased HDL anti-inflammatory capacity was higher in women than in men; and

Risk prediction was improved by adding HDL anti-inflammatory capacity to the Framingham Risk Score, or by replacing HDL cholesterol levels with this new measure of HDL function.

"By using a novel research tool, our results provide strong support for the concept that plaque buildup in the arteries has an inflammatory component, and that the biological properties of HDL particles have clinical relevance to cardiovascular disease risk prediction," said Tietge.

Although the results raise intriguing possibilities for improved screening, they must first be confirmed in different populations. In addition, a simpler and ideally automated test for anti-inflammatory capacity would need to be developed, the researchers said.

"The HDL cholesterol level is a good, established, simple and cost-efficient CVD risk biomarker. Our results, however, demonstrate that the anti-inflammatory capacity or assays looking at HDL function in general have the potential to provide clinically relevant information beyond the static HDL cholesterol measurements that are currently used," Tietge said.

The findings also raise the possibility that medications to improve HDL anti-inflammatory capacity may be developed and used to lower heart disease risk.

Study limitations include that the study population was white and genetically similar, so the results may not be generalizable to other racial and ethnic groups. In addition, the researchers did not include stroke incidence in their analysis, so conclusions cannot be drawn about HDL and stroke.

Credit: 
American Heart Association

Prehistoric Pacific Coast diets had salmon limits

PULLMAN, Wash. - Humans cannot live on protein alone - not even the ancient indigenous people of the Pacific Northwest, whose diet was once thought to be almost all salmon.

In a new paper led by Washington State University anthropologist Shannon Tushingham, researchers document the many dietary solutions ancient Pacific Coast people in North America likely employed to avoid "salmon starvation," a toxic and potentially fatal condition brought on by eating too much lean protein.

"Salmon was a critical resource for thousands of years throughout the Pacific Rim, but there were a lot of foods that were important," said Tushingham the lead author of the paper published online on April 8 in the American Journal of Physical Anthropology. "Native people were not just eating salmon. There's a bigger picture."

Some archeologists have contended for years that prehistoric Northwest people had an "extreme salmon specialization," a theory primarily based on the amount of salmon bone found at archeological sites.

Tushingham and her co-authors argue that such a protein-intensive diet would be unsustainable. They point to nutritional studies and a global database of hunter-gatherer diets that indicate people have a dietary limit on lean protein of around 35%. While it can vary by individual, exceeding that ceiling can be physically debilitating within a few days and fatal within weeks. Early explorers in the U.S. West subsisting on lean wild game discovered this problem the hard way and called it "rabbit starvation" or "caribou sickness."
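To put that ceiling in concrete terms, the short sketch below converts it into a maximum daily protein intake. It assumes the roughly 35% limit refers to the share of daily calories and uses the approximate energy value of 4 kcal per gram of protein; the daily calorie figures are illustrative choices, not numbers taken from the paper.

```python
# Illustrative only: rough arithmetic behind a ~35% lean-protein ceiling.
# The daily energy needs and the 4 kcal/g energy value of protein are
# assumptions for the example, not figures from the paper.

KCAL_PER_G_PROTEIN = 4          # approximate energy density of protein
PROTEIN_CEILING = 0.35          # ~35% of total calories, per the studies cited

def max_daily_protein_grams(daily_kcal: float, ceiling: float = PROTEIN_CEILING) -> float:
    """Largest protein intake (grams/day) that stays under the ceiling."""
    return daily_kcal * ceiling / KCAL_PER_G_PROTEIN

if __name__ == "__main__":
    for kcal in (2000, 2500, 3000):
        print(f"{kcal} kcal/day -> ~{max_daily_protein_grams(kcal):.0f} g protein max")
```

At roughly 2,000 kcal a day, for instance, the ceiling works out to about 175 g of protein; everything beyond that would have to come from fat or carbohydrate sources such as acorns, oils or fatty fish.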

This toxic situation can apply to any lean meat, including salmon, Tushingham said. To avoid "salmon starvation", early Pacific Coast people had to find ways to get other nutrients, especially for children and nursing mothers who have even lower dietary thresholds for lean protein.

"There were ingenious nutritional and cultural solutions to the circumstances in the Northwest," said Tushingham. "Yes, salmon was important, but it wasn't that simple. It wasn't just a matter of going fishing and getting everything they needed. They also had to think about balancing their diet and making sure everybody could make it through the winter."

The researchers point to evidence in California that people offset stored salmon protein with acorns; in Oregon and Washington, they ate root crops like camas as well as more fat-heavy fish such as eulachon. Further north, where plants are more limited, communities often ate marine mammals with high fat content such as seals and walrus. In the far northern interior, where there are few plants and the salmon runs can go thousands of miles inland, this was particularly challenging. Lean dried salmon was an important food source, and people circumvented salmon starvation by trading for oil with coastal peoples or obtaining fat by processing bone marrow from caribou and elk.

The authors focus on the limits of salmon, which used to be considered a "prime mover" of Pacific Northwest populations, but their analysis also has implications for the study of historical human nutrition. If their argument is correct, it is unlikely that any human society was driven by the pursuit of protein alone, as diets had to be more complex.

"People try to come up with one 'paleo-diet,' but there was no one specific ideal diet," said Tushingham. There were nutritional baselines that they had to cover, and nutritional limits that they couldn't exceed. There were many good solutions. It depended on where you lived and the history of your community."

Credit: 
Washington State University

Research reveals household water consumption changes during lockdown

Cranfield University research using data from smart meters has found that household water consumption changed significantly after the start of the COVID-19 lockdown, shifting from predominantly higher usage early in the morning to multiple peaks and continued demand throughout the day.

The study used machine learning algorithms to analyse and identify patterns in hourly water consumption data from 11,528 households in the East of England from January to May 2020.

The research is the first of its kind in the UK to quantify network consumption and segment households into different behavioural clusters according to significant differences in usage patterns.
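As a rough illustration of how behavioural clusters of this kind can be derived from hourly smart-meter data, the sketch below groups synthetic household profiles with k-means. The synthetic data, the normalisation to daily shares and the use of scikit-learn's k-means with four clusters are assumptions for the example; the study's actual algorithms and preprocessing may differ.

```python
# Minimal sketch: segmenting households by the shape of their hourly water use.
# Synthetic data and k-means with four clusters are assumptions for
# illustration; the study's actual machine learning pipeline may differ.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy data: 200 households x 24 hourly readings (litres), each with a random peak hour.
n_households, n_hours = 200, 24
peaks = rng.integers(0, n_hours, size=n_households)
profiles = np.array([
    10 + 40 * np.exp(-0.5 * ((np.arange(n_hours) - p) / 2.0) ** 2)
    for p in peaks
])

# Normalise each household to its share of daily use, so clusters reflect the
# *pattern* of demand (early morning, evening, multiple peaks, ...) rather than volume.
shares = profiles / profiles.sum(axis=1, keepdims=True)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(shares)
for c in range(4):
    centroid = kmeans.cluster_centers_[c]
    print(f"cluster {c}: {np.sum(kmeans.labels_ == c)} households, "
          f"peak hour ~{int(centroid.argmax()):02d}:00")
```

Tracking how many households fall into each cluster week by week is then enough to see shifts like the lockdown move away from a single early-morning peak towards multiple peaks through the day.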

Key findings were that:

There was an overall increase in household consumption from March to May 2020 compared to the same period in 2019, with the gap opening as lockdown restrictions deepened;

A sharp increase (10% on the previous week) in consumption was recorded in the fourth week of March - the week of the COVID-19 lockdown - rising to 46% above the pre-lockdown average in the fourth week of May;

Four distinct clusters of household water consumers can be characterised by their unique patterns of hourly use: early morning, late morning, evening peak and multiple peak;

The multiple peak cluster experienced the most significant increase in the number of households during the lockdown period, with a 93% rise between the third and fourth weeks of March;

The early morning cluster experienced the sharpest decrease in the number of households during the lockdown period, with their share of relative consumption between 07:00 and 08:00 dropping significantly, from 40% to 20%.

Halidu Abu-Bakar, PhD researcher in the Cranfield Centre for Competitive Creative Design, Cranfield University, said: "The COVID-19 lockdown has instigated significant changes in household behaviour across a variety of categories including water consumption, which in the south and east regions of England is at an all-time high. The impact of the extended time people stayed at home under the lockdown and the ensuing changes in behaviour arising from this led to an increase in household water demand, exacerbating existing pressure on network water supply.

"Having knowledge of these patterns provides a solid framework for peak demand management and can help utility companies to forecast consumption, especially at unusual times such as pandemics, droughts and when there are seasonal variations."

Professor Leon Williams, Head of the Cranfield Centre for Competitive Creative Design, said: "Quality data driven research will provide the intelligence needed for water utilities to make strategic decisions."

Professor Stephen Hallett, Centre for Environmental and Agricultural Informatics, Cranfield University, said: "Water utility companies are increasingly searching for ways to understand the full nature of household water use, how to improve network demand forecasting and achieve effective water efficiency interventions. This data-driven characterisation of household clusters and understanding the impact of these unique patterns of behaviour on network demand can help in the design of demand forecasting and intervention that targets households on the basis of their shared cluster characteristics."

Credit: 
Cranfield University

The impact of chemotherapy on immune cells in the tumor microenvironment

Research from Queen Mary University of London has revealed novel insights into the effects of chemotherapy on the tumour microenvironment (TME). The study, published today in Cancer Immunology Research, a journal of the American Association for Cancer Research, found that chemotherapy enhances the anti-tumour actions of immune cells within the TME and their ability to support immune responses against cancer.

Cancers are not just a mass of cancerous cells, but are rogue organs made up of many different cell types, including cells that form connective tissue and blood vessels, and immune cells. These non-cancerous cells have been recruited and corrupted by the cancer to help it grow and spread, and constitute what is known as the TME.

When cancer cells are treated with chemotherapy, the cells within the TME are also affected, and previous research has shown that chemotherapy may activate immune cells within the TME to fight against the cancer. In this study, the team, led by Professor Fran Balkwill, investigated the effects of chemotherapy on immune cells called macrophages, which are associated with poor survival across a variety of cancer types. The study focused on high-grade serous ovarian cancer (HGSOC) - the most common type of ovarian cancer.

Chemotherapy switches immune cells to anti-tumour mode

By comparing biopsy samples taken from the omentum (most common site of cancer spread in HGSOC) of 26 patients prior to and after chemotherapy, the team found a significant reduction in the number of macrophages present in the tissues following treatment. Further investigations in samples from other HGSOC patients revealed that chemotherapy switched the remaining macrophages from a pro-tumour to an anti-tumour mode, which may stimulate the patient's immune response against the cancer.

The effects of chemotherapy on macrophages observed in patient samples were also seen in preclinical mouse models of HGSOC previously developed by the team, which recapitulate many aspects of the human omental TME.

As macrophages are associated with poor survival in cancer, the team took their research a step further to determine whether eliminating all macrophages from the TME after chemotherapy could prolong disease-free survival in the preclinical mouse models. To their surprise, removing all macrophages shortly after the completion of chemotherapy caused the mice to relapse more quickly. Following three doses of chemotherapy, the macrophages had switched to an anti-tumour mode, so eliminating them from the TME actually inhibited the immune response against the tumour and resulted in poorer survival outcomes.

The research was funded by Cancer Research UK, the Wellcome Trust and Wellbeing of Women.

Implications for ovarian cancer treatment

Chemotherapy and surgery are currently the main treatment options for patients with ovarian cancer. Although chemotherapy initially works well, many patients relapse as resistance to these drugs develops.

Professor Balkwill from Queen Mary University of London said: "This study enhances our understanding of the impact of chemotherapy on macrophages and other aspects of the immune response. As our work was driven by results first obtained in patient samples and we were able to replicate the findings in our mouse models, we could investigate hypotheses and obtain data that have translational significance.

"We now have effective preclinical models of treatment and relapse that can be used to help identify treatments that build upon the tentative immune response triggered by chemotherapy."

Ultimately, the team hope that their models could help to identify new drug combinations that harness the initial immune-boosting effects of chemotherapy to reduce the number of required chemotherapy doses, minimise toxic side effects and improve survival for patients with ovarian cancer.

Credit: 
Queen Mary University of London

Resilience against replay attacks in computer systems

From power grids and telecommunications to water supply and financial systems, digital data controls the infrastructure systems on which society relies. These complex, multi-tier systems depend on layered communications to accomplish their tasks - yet every point of contact becomes a potential target, every path of information a potential weak spot for malicious actors to attack.

A team of researchers from the University of Calabria in Italy has developed the first predictive control scheme that can help distributed networks with multiple agents not only identify these attacks but also protect against them. Their approach was published in IEEE/CAA Journal of Automatica Sinica (Volume 8, Issue 3, March 2021).

"Modern systems have an increasing complex structure due to the large number of interacting agents aligned to accomplish specific tasks in a distributed fashion," said paper author Giuseppe Franzè, associate professor of control engineering in the Department of Informatics, Modeling, Electronics and System Engineering, University of Calabria. "The key result of the paper is that model predictive control strategies, properly adapted to multi-agent configurations, can address difficult scenarios such as the presence of intrusions such as replay attacks."

Replay attacks are difficult to identify because the malicious actor uses information already in the system. By taking an account number or a permission string stolen from one transmission and using it on another agent - or even on the agent that originally received the transmission - the actor can gain access or incite a specific action.

Franzè and his team applied a "receding horizon" model, which allows the researchers to predict what the system will look like in the future. By understanding what the system should look like, the model can identify when something unexpected occurs, like the resending of information.

"The receding horizon property allows us to consider the same structure of the optimization at each next time instant," Franzè said. "This means that if a problem is solvable at the initial time instant the same occurs in the future."

Importantly, according to Franzè, the strategy also offers protection by allowing the system to encapsulate in the moment before the attack, preserving communications until the attack can be successfully blocked.

"This low-demand model predictive control scheme is an efficient way to address unknown scenarios where external malicious agents affect normal system operations," Franzè said.

Credit: 
Chinese Association of Automation

Balancing between build-up and break-down of bone

image: PTH binding to PTH receptor (PTH1R) upregulates SLPI expression. SLPI directly acts in osteoblasts to enhance bone formation by controlling gene expression. Additionally, SLPI promotes the adhesion of osteoblasts to neighboring osteoclasts, thereby increasing direct cell-cell contact. This indirect effect leads to activation of osteoblastic bone formation, and inhibition of osteoclastic bone resorption.

Image: 
Osaka University

Osaka, Japan - Despite what some people think, bone is not merely a passive component of the body. The skeleton is structurally dynamic and responds to life's physical stresses with continual equilibration between bone mass loss and reformation. This ensures healing and remodeling in tune with the ebb and flow of calcium and phosphorus in the bloodstream. Now, researchers at Osaka University have identified a molecule, secretory leukocyte protease inhibitor (SLPI), that helps mediate this critical balance and could be used in the development of new treatments for bone diseases such as osteoporosis.

Skeletal tissue changes are orchestrated primarily by parathyroid hormone (PTH), a regulator of blood calcium levels that is secreted by the parathyroid glands in the neck. PTH is known to have a dual effect on bone—its action is primarily catabolic, causing bone dissolution and removal. However, in small intermittent doses, PTH can also increase bone mass (anabolic). Though PTH has long been used for the clinical treatment of osteoporosis, the precise mechanism and pathways whereby PTH promotes bone formation are poorly understood.

The researchers looked at the interactions between cells that mediate bone formation (osteoblasts), cells that mediate bone loss (osteoclasts) and the functional role of SLPI in bone metabolism in vivo. Akito Morimoto, lead author, explains the research methodology of the new study published in Nature Communications: "We could establish that PTH highly upregulates the gene Slpi in osteoblasts in animal models. We analyzed the bone phenotype of experimental mice in which the gene was 'knocked out' and showed that genetic modification of Slpi prevented PTH from inducing bone formation. Moreover, Slpi induction in osteoblasts themselves increased their differentiation while promoting osteoblast-osteoclast contact which reduces bone loss activity." Furthermore, biomicroscopic imaging in living bone demonstrated that SLPI secreted outside the cells is essential for association between osteoblasts and osteoclasts and the cell-cell interactions that PTH mediates.

Corresponding author Junichi Kikuta summarizes their results. "Our findings clarify the roles of SLPI as a novel coupling factor and coordinator of bone remodeling for conservation of mass, strength and structural integrity. Not only does it promote bone formation by osteoblasts, it also attracts osteoclasts closer to osteoblasts to suppress bone loss."

"A clear understanding of the cellular networks and molecular pathways that mediate PTH anabolism will enhance clinical applicability of this drug," senior author Masaru Ishii explains. "Moreover, it may inform the development of innovative pharmacotherapies for managing osteoporosis and other intractable orthopedic diseases."

Credit: 
Osaka University

Earth's crust mineralogy drives hotspots for intraterrestrial life

image: DeMMO field team from left to right: Lily Momper, Brittany Kruger, and Caitlin Casar sampling fracture fluids from a DeMMO borehole installation

Image: 
©Matt Kapust

Below the verdant surface and organic rich soil, life extends kilometers into Earth's deep rocky crust. The continental deep subsurface is likely one of the largest reservoirs of bacteria and archaea on Earth, many forming biofilms - like a microbial coating of the rock surface. This microbial population survives without light or oxygen and with minimal organic carbon sources, and can get energy by eating or respiring minerals. Distributed throughout the deep subsurface, these biofilms could represent 20-80% of the total bacterial and archaeal biomass in the continental subsurface according to the most recent estimate. But are these microbial populations spread evenly on rock surfaces, or do they prefer to colonize specific minerals in the rocks?

To answer this question, researchers from Northwestern University in Evanston, Illinois, led a study to analyze the growth and distribution of microbial communities in deep continental subsurface settings. This work shows that the host rock mineral composition drives biofilm distribution, producing "hotspots" of microbial life. The study was published in Frontiers in Microbiology.

Hotspots of microbial life

To carry out this study, the researchers went 1.5 kilometers below the surface in the Deep Mine Microbial Observatory (DeMMO), housed within a former gold mine now known as the Sanford Underground Research Facility (SURF), located in Lead, South Dakota. There, below ground, the researchers cultivated biofilms on native rocks rich in iron and sulfur-bearing minerals. After six months, the researchers analyzed the microbial composition and physical characteristics of the newly grown biofilms, as well as their distribution, using microscopy, spectroscopy and spatial modelling approaches.

The spatial analyses conducted by the researchers revealed hotspots where the biofilm was denser. These hotspots correlate with iron-rich mineral grains in the rocks, highlighting mineral preferences for biofilm colonization. "Our results demonstrate the strong spatial dependence of biofilm colonization on minerals in rock surfaces. We think that this spatial dependence is due to microbes getting their energy from the minerals they colonize," explains Caitlin Casar, first author of the study.
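A minimal sketch of the kind of spatial comparison described, assuming segmented image masks for iron-rich grains and for biofilm are available, is shown below. The synthetic masks are placeholders, and the study's spatial modelling is more sophisticated than this simple coverage comparison.

```python
# Minimal sketch: is biofilm coverage higher over iron-rich grains?
# Synthetic masks stand in for segmented microscopy images; the study's
# actual spatial statistics go well beyond this comparison.
import numpy as np

rng = np.random.default_rng(1)
shape = (256, 256)

# Toy masks: which pixels are iron-rich mineral, and which carry biofilm.
iron_rich = rng.random(shape) < 0.2
biofilm = rng.random(shape) < np.where(iron_rich, 0.6, 0.1)  # denser on iron here

coverage_on_iron = biofilm[iron_rich].mean()
coverage_elsewhere = biofilm[~iron_rich].mean()
print(f"biofilm coverage on iron-rich grains: {coverage_on_iron:.2f}")
print(f"biofilm coverage elsewhere:           {coverage_elsewhere:.2f}")
print(f"enrichment factor: {coverage_on_iron / coverage_elsewhere:.1f}x")
```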

Future research

Altogether, these results demonstrate that host rock mineralogy is a key driver of biofilm distribution, which could help improve estimates of microbial distribution in Earth's deep continental subsurface. Such intraterrestrial studies could also inform other topics. "Our findings could inform the contribution of biofilms to global nutrient cycles, and also have astrobiological implications as these findings provide insight into biomass distributions in a Mars analog system," says Caitlin Casar.

Indeed, extraterrestrial life could exist in similar subsurface environments where the microorganisms are protected from both radiation and extreme temperatures. Mars, for example, has an iron and sulfur-rich composition similar to DeMMO's rock formations, which we now know are capable of driving the formation of microbial hotspots below-ground.

Credit: 
Frontiers

Foliar application boosts the zinc content of wheat grain by up to 50%

image: Researchers Antonio Sánchez, María del Carmen del Campillo and Vidal Barron working at the case study

Image: 
University of Cordoba

A team from the Department of Agronomy at the University of Cordoba has demonstrated, through field tests carried out over 8 agricultural seasons, that foliar feeding with zinc fertilizer increases the concentration of zinc in wheat more than applying it to the soil does.

Micronutrient deficiencies pose health problems for a third of the world's population. Worldwide, zinc deficits are more problematic in the rural areas of developing countries, where diets are largely limited to vegetable products grown in soils suffering from low nutrient availability. Biofortification, the process of bolstering the nutritional value of crops by increasing the concentration of vitamins and minerals in them, has arisen as a remedy for this problem.

In the search for solutions, the Edaphology Unit of the María de Maeztu Excellence Unit in the Department of Agronomy at the University of Cordoba (DAUCO), headed by the researcher Antonio R. Sánchez Rodríguez, has spent 8 years searching for the best biofortification strategy in terms of applying zinc to wheat grown in calcareous soils in southern Spain.

Between 2012 and 2019 this team tested different methods to biofortify wheat in 11 field trials in zinc-deficient soils. The effects of applying different doses of fertilizer to the soil (up to 10 kg per hectare) were evaluated, in addition to the results of applying different doses of zinc by spraying the plant in various phenological stages of the wheat's growth.

While application to the soil was not very effective, foliar application, or feeding, was shown to be a very efficient strategy to increase the zinc content in plants, "augmenting the concentration in grains up to 50%," says the researcher. That is, foliar application was much more effective, as better results were obtained with just a tenth of the product (1.28 kg per hectare) than when zinc was applied to the soil.

Taking into account the variety of wheat, this direct application to plants was more effective after the start of growth or during flowering.

Nourishing the plant itself, rather than the soil, was thus demonstrated to be an effective short-term way to tackle zinc deficiency in calcareous soils. In addition, if at some point wheat were purchased based on its nutritional content, growers could see increases in their profits.

This solution "is very valuable for places where there is no other source of zinc in diets, although it would entail adding another task to wheat cultivation, or combining it with the application of other phytosanitary treatments" notes Sánchez.

Predicting wheat yields after fertilization with zinc

On this project the team from the Edaphology Unit also looked for a soil indicator that would help predict the response of durum wheat to zinc fertilization, in terms of its yields. However, they found that conditions in the field make it very difficult to verify this parameter, and that a simple indicator cannot predict this response.

While at the laboratory level some indicators could be defined, in the field this task is difficult, since it is highly dependent on factors such as precipitation, and would require many more years of study.

Credit: 
University of Córdoba

Genes and immune cells predict immunotherapy success in bladder cancer

New York, NY (April 9, 2021) - Sets of genes associated with resistance to immunotherapy in patients with metastatic urothelial cancer of the bladder have been identified and validated by researchers at Mount Sinai. In a study published in Clinical Cancer Research, the team uncovered gene signatures representing adaptive immunity and pro-tumorigenic inflammation that were responsible for sensitivity or resistance to immune checkpoint inhibitors, drugs that help the body's immune system recognize and attack cancerous cells.

"These findings enabled us to identify potential biomarkers in patients who are less likely to respond favorably to immune checkpoint inhibitors, as well as new combination therapeutic approaches that might overcome such resistance in those patients," says senior author Matthew Galsky, MD, Professor of Medicine (Hematology and Medical Oncology), Icahn School of Medicine at Mount Sinai.

Significantly, the findings demonstrated that the balance between adaptive immunity and pro-tumorigenic inflammation in individual tumor microenvironments--reflected by these two gene signatures--best predicted response or resistance to immune checkpoint blockade. The researchers then identified specific cells in the tumor microenvironment associated with resistance to immune checkpoint blockade, and potential targets for therapies designed to overcome resistance.

For decades, standard treatment for metastatic urothelial cancer of the bladder has been platinum-based chemotherapy, though the landscape has changed dramatically in recent years with the advent of PD-1 and PD-L1 immune checkpoint inhibitors. This therapeutic breakthrough has had its limits, though: only 20 to 25 percent of patients with bladder cancer respond to treatment, which has set off an intense hunt by biomedical scientists for mechanisms of resistance.

"Using RNA sequencing data from two clinical trials, and single cell RNA sequencing data from a cohort of bladder tumors, we identified a subset of genes and immune cells associated with adaptive immunity and improved checkpoint inhibitor outcomes, and a subset associated with pro-tumorigenic inflammation and resistance to PD-1/PD-L1 blockade in patients with urothelial cancer," Dr. Galsky says.

The research is among the first to use both bulk and single-cell RNA sequencing of human bladder tumors to study resistance to immunotherapy. Bulk sequencing measures the combined gene expression of all the cells within a tumor, while single-cell sequencing--a technique increasingly important in cancer research--zeroes in on gene expression in each individual cell, which yields unprecedented knowledge of the complexity and heterogeneity of the cells that comprise tumors.

Through this combination of RNA sequencing, researchers learned, for example, that the balance of adaptive immunity and pro-tumorigenic inflammation within the tumor microenvironment can determine PD-1/PD-L1 resistance in urothelial cancer. Adaptive immunity is the body's ability to recognize and respond to specific foreign invaders, while pro-tumorigenic inflammation is a counterproductive response of the immune system that can ultimately fuel growth and progression of cancer.

"If the tumor microenvironment is weighted more toward adaptive immunity, there's a better chance of positive outcomes from immunotherapy," explains Dr. Galsky, who is Associate Director of Translational Research and Co-Director of the Bladder Cancer Center of Excellence at The Tisch Cancer Institute. "On the other hand, if the tumor microenvironment is leaning toward pro-tumorigenic inflammation, then PD-1/PD-L1 checkpoint inhibitors alone are unlikely to be successful, and new combination approaches may be needed."

Mount Sinai researchers coined the term "2IR Score" to measure that balance.
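As a rough sketch of what a signature-balance score of this kind can look like, the example below contrasts the average expression of an adaptive-immunity gene set against a pro-tumorigenic-inflammation gene set for each tumour sample. The gene lists and the simple difference-of-means formula are hypothetical placeholders; the published 2IR Score is defined in the paper and may be computed differently.

```python
# Minimal sketch of a signature-balance score: adaptive-immunity signature
# versus pro-tumorigenic-inflammation signature, per tumour sample.
# Gene lists and the difference-of-means formula are placeholders, not the
# published 2IR Score definition.
import numpy as np
import pandas as pd

# Hypothetical gene sets (illustrative only).
ADAPTIVE_GENES = ["CD8A", "GZMB", "IFNG"]
INFLAMMATION_GENES = ["IL6", "CXCL8", "PTGS2"]

def balance_score(expr: pd.DataFrame) -> pd.Series:
    """Per-sample score: z-scored adaptive signature minus inflammation signature.

    expr: genes x samples matrix of normalised expression values.
    """
    z = expr.sub(expr.mean(axis=1), axis=0).div(expr.std(axis=1), axis=0)
    adaptive = z.loc[z.index.intersection(ADAPTIVE_GENES)].mean()
    inflammation = z.loc[z.index.intersection(INFLAMMATION_GENES)].mean()
    return adaptive - inflammation   # higher -> tilted toward adaptive immunity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genes = ADAPTIVE_GENES + INFLAMMATION_GENES
    expr = pd.DataFrame(rng.normal(size=(len(genes), 4)),
                        index=genes, columns=[f"tumour_{i}" for i in range(4)])
    print(balance_score(expr).round(2))
```

In this framing, a high score corresponds to a microenvironment weighted toward adaptive immunity (more likely to respond to checkpoint blockade), while a low score reflects a tilt toward pro-tumorigenic inflammation.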

From its comprehensive RNA analysis, the team identified not just potential biomarkers to treatment resistance, but a specific subset of white blood cells known as myeloid phagocytic cells that are linked to pro-tumorigenic inflammation and, hence, resistance. As such, they serve as prospective targets for therapeutic approaches that combine immunotherapies like PD-1/PD-L1 blockade with drugs designed to overcome the resistance conferred by myeloid cells. Those novel combination strategies are now being incorporated into future clinical trials.

"Our research shows that a specific cellular state of myeloid cells underlying pro-tumorigenic inflammation account for resistance to immune checkpoint blockade in a very large percentage of patients with urothelial bladder cancer," Dr. Galsky says. "This is an important finding which we believe can lead to a better focus and direction for developing effective combination therapies - and not just for bladder cancer, but other types of tumors, as well".

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

US children, adolescents diagnosed with COVID-19

What The Study Did: In this observational study, data are used to assess the association of demographic and clinical characteristics with severe COVID-19 illness among hospitalized U.S. pediatric patients with COVID-19.

Authors: Alyson B. Goodman, M.D., of the COVID-19 Response Team at the Centers for Disease Control and Prevention in Atlanta, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.5298)

Editor's Note:  Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Genome analysis for sequence variants in SARS-CoV-2 among asymptomatic individuals in long-term care facility

What The Study Did: Genome analysis was performed on SARS-CoV-2 RNA from seven patients in a long-term care facility who were asymptomatic at the time of screening.

Authors: Baha Abdalhamid, M.D., Ph.D., of the University of Nebraska Medical Center in Omaha, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.7939)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Helping people understand adverse events associated with COVID-19 vaccinations

What The Study Did: This Viewpoint discusses potential associations between functional neurological disorder and COVID-19 vaccinations.

Authors: David L. Perez, M.D., M.MSc., of  Massachusetts General Hospital and Harvard Medical School in Boston, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamaneurol.2021.1042)

Editor's Note: The article includes conflicts of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Out-of-pocket health care expenses before, after Affordable Care Act

What The Study Did: Researchers analyzed changes in out-of-pocket health care expenses in the United States during the last two decades.

Authors: Amit Jain, M.D., of the Johns Hopkins University School of Medicine in Baltimore, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2021.5499)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

To nodulate or not? Uncovering how nitrate regulates gene expression in legumes

image: Legumes halt root nodule formation when plentiful amounts of nitrogen nutrients, such as nitrate, are present in the environment. Arrows indicate root nodule. Scale bar: 1 mm.

Image: 
University of Tsukuba

Tsukuba, Japan - Plants in the bean family (legumes) form nodules on their roots to take up nitrogen. Legumes will stop nodule production when nitrogen is plentiful (Figure 1), but precisely how nitrate presence controls nodule formation in these plants has been a mystery. Now, researchers from Japan have found that interactions between proteins and nitrate can induce and repress genes, controlling nodulation with potential applications in sustainable agriculture.

In a study published in April in The Plant Cell, a research team from the University of Tsukuba has shown that differences in the DNA-binding properties of the proteins that establish nodule development determine whether the symbiotic genes governing nodulation are turned on or off, and that this control of gene expression is induced by nitrate.

Until now, there was an incomplete understanding of the molecular activity determining how legumes stop nodulation in the presence of excess nitrate. Previous research identified transcription factors (proteins that help turn specific genes "on" or "off") involved with nodule formation, but that's just part of the story.

"Building on the previous identification of transcription factors for proteins (known as NLPs) involved in nodule inception, we sought to answer the question of how symbiotic gene expression facilitating nodulation is controlled by nitrate," says senior author of the study Professor Takuya Suzaki. "We tested specific NLPs and found that they have overlapping functions, causing nitrate-induced control of nodulation."

To examine these molecular interactions, the researchers analyzed RNA molecules and plant traits in the model legume Lotus japonicus. They found that some proteins have dual functions, acting as master regulators of nitrate-dependent gene expression. They also identified new protein binding sites and compared them to previously known ones. Their findings reveal basic principles of how NLPs regulate the transcription of symbiotic genes to inhibit nodulation in the presence of nitrate.

The research team emphasized additional questions. Some NLPs are found in cell nuclei in response to nitrate and stop nodule production, while others constantly aggregate in nuclei irrespective of nitrate levels. For the latter, it is unclear how they function exclusively in the presence of nitrate. The location of the NLPs in the cell matters because translation (when RNA is coded into proteins) happens in the cell's cytoplasm. If changes to proteins occur after the genetic code has been read (post-translational modifications), it could explain how these NLPs access protein-protein interactions and regulate genes.

"Uncovering how transcription factors influence gene expression has been a missing piece to the puzzle of understanding plant transcription regulation," Professor Suzaki explains. "Our discoveries bring us closer to knowing what is possible within these complex molecular relationships, but there is plenty left to untangle. Future research should aim to answer the question of how nodulation is regulated by other NLPs and in other plant species of interest."

Credit: 
University of Tsukuba

Antipsychotic drugs may have protective effect against COVID-19

image: Authors of the study

Image: 
Universidad de Sevilla

Two studies led by the Mental Health Unit of the Virgen del Rocio University Hospital and involving researchers from the US conclude that antipsychotic drugs could have a protective effect against SARS-CoV-2. Patients treated with these drugs therefore appear to have a lower risk of becoming infected, or suffer a milder form of the disease if they do become infected.

Thus, a first descriptive epidemiological study of a sample of 698 patients treated with antipsychotics at the Seville hospital revealed that antipsychotic drugs could provide protection against both infection and progression to severe Covid-19. "These are very interesting findings that reflect a clinical reality where we see few patients with severe COVID-19, despite the presence of various risk factors," says Manuel Canal Rivero, clinical psychologist and lead author of one of the two papers.

"The number of Covid-19 patients is lower than expected among this group of people and in cases where a proven infection does occur, the evolution is benign and does not reach a life-threatening clinical situation. These data as a whole seem to point to the protective effect of the medication," he adds.

Complementary to this study, the same research group has observed that many of the genes whose expression is altered by Covid-19 are significantly down-regulated by antipsychotic drugs, which are commonly used to treat diseases with psychotic symptoms. This finding was achieved by investigating the gene expression profile (indicator of activated biological processes) of Covid-19 patients (Wuhan cohort) and patients being treated with antipsychotic drugs (specifically, aripiprazole) from the cohort of the Early Phases of Psychosis Intervention Programme (PAFIP) initiated 20 years ago at the Marqués de Valdecilla University Hospital in Cantabria by Benedicto Crespo-Facorro, professor at the University of Seville and current director of the Mental Health Unit at the Virgen del Rocío University Hospital.

"In a striking way we have shown how antipsychotics reduce the activation of genes involved in many of the inflammatory and immunological pathways associated with the severity of Covid-19 infection," says the lead author of the second paper, Professor Crespo-Facorro. Furthermore, he stresses that "although this finding requires replication, the discovery could be very significant because the treatment of Covid-19 with drugs originally indicated for unrelated clinical situations, that is to say drug repositioning, has been shown to be an interesting source of effective treatments for Covid-19 patients".

Credit: 
University of Seville