
'Are we alone?' Study refines which exoplanets are potentially habitable

image: An artist's conception shows a hypothetical planet with two moons orbiting within the habitable zone of a red dwarf star.

Image: 
NASA/Harvard-Smithsonian Center for Astrophysics/D. Aguilar

Study is the first to include 3D chemistry to understand how a star's radiation heats or cools a rocky planet's atmosphere

Information will help astronomers know where to search for life elsewhere

Researchers find that only planets orbiting active stars lose water to vaporization

Some planets, previously believed to be habitable, receive too much UV radiation to sustain life

EVANSTON, Ill. -- In order to search for life in outer space, astronomers first need to know where to look. A new Northwestern University study will help astronomers narrow down the search.

The research team is the first to combine 3D climate modeling with atmospheric chemistry to explore the habitability of planets around M dwarf stars, which comprise about 70% of the total galactic population. Using this tool, the researchers have redefined the conditions that make a planet habitable by taking the star's radiation and the planet's rotation rate into account.

Among its findings, the Northwestern team, in collaboration with researchers at the University of Colorado Boulder, NASA's Virtual Planet Laboratory and the Massachusetts Institute of Technology, discovered that only planets orbiting active stars -- those that emit a lot of ultraviolet (UV) radiation -- lose significant water to vaporization. Planets around inactive, or quiet, stars are more likely to maintain life-sustaining liquid water.

The researchers also found that planets with thin ozone layers, which have otherwise habitable surface temperatures, receive dangerous levels of UV dosages, making them hazardous for complex surface life.

"For most of human history, the question of whether or not life exists elsewhere has belonged only within the philosophical realm," said Northwestern's Howard Chen, the study's first author. "It's only in recent years that we have had the modeling tools and observational technology to address this question."

"Still, there are a lot of stars and planets out there, which means there are a lot of targets," added Daniel Horton, senior author of the study. "Our study can help limit the number of places we have to point our telescopes."

The research will be published online Nov. 14 in the Astrophysical Journal.

Horton is an assistant professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences. Chen is a Ph.D. candidate in Northwestern's Climate Change Research Group and a NASA future investigator.

The 'Goldilocks zone'

To sustain complex life, planets need to be able to maintain liquid water. If a planet is too close to its star, then water will vaporize completely. If a planet is too far from its star, then water will freeze, and the greenhouse effect will be unable to keep the surface warm enough for life. This Goldilocks area is called the "circumstellar habitable zone," a term coined by Professor James Kasting of Penn State University.
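The location of this zone scales with stellar luminosity. A minimal sketch of the standard flux-based estimate is given below; the effective-flux thresholds used here are illustrative round numbers, not values from this study:

```python
import math

def habitable_zone_edges(luminosity_solar, s_inner=1.1, s_outer=0.36):
    """Estimate the habitable zone's inner and outer edges, in AU.

    A planet at distance d (AU) from a star of luminosity L (solar units)
    receives an effective flux S = L / d**2 in units of Earth's insolation,
    so the distance at which the flux equals S is d = sqrt(L / S).
    The default thresholds are illustrative assumptions only.
    """
    inner = math.sqrt(luminosity_solar / s_inner)
    outer = math.sqrt(luminosity_solar / s_outer)
    return inner, outer

# A dim M dwarf with ~2% of the Sun's luminosity has a habitable zone
# much closer in than Earth's orbit:
inner, outer = habitable_zone_edges(0.02)
```

This simple scaling shows why habitable-zone planets around M dwarfs orbit so tightly, and hence why their rotation rates and UV exposure matter so much.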

Researchers have been working to figure out how close is too close -- and how far is too far -- for a planet to sustain liquid water. In other words, they are looking for the habitable zone's "inner edge."

"The inner edge of our solar system is between Venus and Earth," Chen explained. "Venus is not habitable; Earth is."

Horton and Chen are looking beyond our solar system to pinpoint the habitable zones within M dwarf stellar systems. M dwarf planets, named for the small, cool, dim stars they orbit (M dwarfs, or "red dwarfs"), have emerged as frontrunners in the search for habitable planets because they are numerous and comparatively easy to find and investigate.

Crucial chemistry

Other researchers have characterized the atmospheres of M dwarf planets by using both 1D and 3D global climate models. These models also are used on Earth to better understand climate and climate change. Previous 3D studies of rocky exoplanets, however, have missed something important: chemistry.

By coupling 3D climate modeling with photochemistry and atmospheric chemistry, Horton and Chen constructed a more complete picture of how a star's UV radiation interacts with gases, including water vapor and ozone, in the planet's atmosphere.

In their simulations, Horton and Chen found that a star's radiation plays a deciding factor in whether or not a planet is habitable. Specifically, they discovered that planets orbiting active stars are vulnerable to losing significant amounts of water due to vaporization. This stands in stark contrast to previous research using climate models without active photochemistry.

The team also found that many planets in the circumstellar habitable zone could not sustain life due to their thin ozone layers. Despite having otherwise habitable surface temperatures, these planets' ozone layers allow too much UV radiation to pass through and penetrate to the ground. The level of radiation would be hazardous for surface life.

"3D photochemistry plays a huge role because it provides heating or cooling, which can affect the thermodynamics and perhaps the atmospheric composition of a planetary system," Chen said. "These kinds of models have not really been used at all in the exoplanet literature studying rocky planets because they are so computationally expensive. Other photochemical models studying much larger planets, such as gas giants and hot Jupiters, already show that one cannot neglect chemistry when investigating climate."

"It has also been difficult to adapt these models because they were originally designed for Earth-based conditions," Horton said. "To modify the boundary conditions and still have the models run successfully has been challenging."

'Are we alone?'

Horton and Chen believe this information will help observational astronomers in the hunt for life elsewhere. Instruments, such as the Hubble Space Telescope and James Webb Space Telescope, have the capability to detect water vapor and ozone on exoplanets. They just need to know where to look.

"'Are we alone?' is one of the biggest unanswered questions," Chen said. "If we can predict which planets are most likely to host life, then we might get that much closer to answering it within our lifetimes."

Credit: 
Northwestern University

Here's how you help kids crack the reading code

To help children learn to read earlier, one thing appears to be key: Learn the letters and sounds associated with the letters as early as possible. This may sound obvious, but another theory has suggested that children should first learn to read the letters in the context of words instead.

Charting each child's letter-sound knowledge can be helpful in supporting them further in the learning process as they begin school, says Professor Hermundur Sigmundsson at NTNU's Department of Psychology.

Sigmundsson, Greta Storm Ofteland, Trygve Solstad and Monika Haga collaborated on a recently published article in New Ideas in Psychology. Sigmundsson says the research team are among the first to clearly show the connection between learning the letters and sound correspondences and breaking the reading code.

"Since reading is the very foundation for acquiring other skills, it should be prioritized for the first few years of school," says Professor Sigmundsson.

Clear link

The connection between letter-sound knowledge and reading ability is clear, and letter knowledge is a good indicator of reading. On average, the children had to know 19 letters to crack the reading code, that is, to begin to read.

But it's not a given that you'll be able to read even if you know your letters. Reading or writing single letters is something completely different from putting those letters together into words that make sense. The individual letter variations can be huge.

Granted, the letters in Norwegian are pronounced quite consistently - especially compared to English - but they vary enough that children need time. The words "cough" or "light," for example, aren't necessarily pronounced the way you would think by just looking at the letters individually.

Children who have already cracked the reading code should have appropriate challenges to further develop their reading skills. These should be in the form of books that pique their interest. At the same time, youngsters who still haven't cracked the code should learn enough letters and letter sounds to start practicing putting words together.

Read to kids early - practice makes perfect

The research team studied 356 children aged 5 to 6 years for one year. Eleven per cent of the children could already read when they started school. By the end of the first school year, 27 per cent had not yet learned to read. Most of this group were boys, who also knew fewer letters when they started school.

"If you take out the 5 to 10 per cent who have dyslexia, the numbers could indicate that around one in five children gets too little practice or lacks motivation in their first school year," Sigmundsson says.

Girls are better at reading than boys from the outset. This difference continues throughout school, but it's important to remember that this is an average, and that parents of both boys and girls can do things to help their children.

Previous research from NTNU and elsewhere shows that you need to practice exactly what you want to be good at. Therefore, it is important that children are encouraged to become independent readers early. Parents should read to children to arouse their interest whenever possible.

What you read hardly matters, as long as the child finds it time well spent. As a bonus, children and parents enjoy a cosy time together.

Credit: 
Norwegian University of Science and Technology

Tailor-made carbon helps pinpoint hereditary diseases and correct medication dosage

image: The new methodology allows the experimental spectrum produced by X-ray spectroscopy to be separated into atomic-level data.

Image: 
Anja Aarva / Aalto University

Sensors manufactured with carbon-based materials can provide uniquely accurate and real-time information on hereditary diseases or the concentrations of drugs in the body. In addition to medicine, carbonaceous materials are used in batteries, solar cells and water purification.

Other elements, such as hydrogen and oxygen, are almost always present in carbon-based materials, which alters the materials' properties. Therefore, modifying materials for desired applications requires atomic-level knowledge on carbon surface structures and their chemistry. Researchers at Aalto University, the University of Cambridge, the University of Oxford and Stanford University have now taken a significant new step forward in describing the atomic nature of carbonaceous materials.

Detailed information on carbon surfaces can be obtained by X-ray spectroscopy, but the spectrum it produces is challenging to interpret because it summarises information from several local chemical environments of the surface. The researchers have developed a new systematic analysis method that uses machine learning to integrate the computational model (density functional theory) with the experimental results of the carbon sample. The new methodology allows the experimental spectrum produced by X-ray spectroscopy to be separated into atomic-level data.
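The core idea of such a decomposition can be sketched as fitting the measured spectrum with a weighted sum of computed reference spectra, one per local chemical environment. The sketch below is a generic illustration with synthetic Gaussian "spectra" and made-up peak positions and weights, not the study's actual method or data; a real analysis would also enforce non-negative weights and handle measurement noise:

```python
import numpy as np

# Hypothetical C 1s energy grid (eV) for the synthetic spectra.
energy = np.linspace(280.0, 300.0, 400)

def gaussian(center, width=0.8):
    """A normalized-shape Gaussian peak standing in for a reference spectrum."""
    return np.exp(-0.5 * ((energy - center) / width) ** 2)

# Computed (DFT-style) reference spectra for three illustrative
# environments, e.g. sp2 carbon, sp3 carbon, and C-O surface groups.
references = np.column_stack([gaussian(284.5), gaussian(285.5), gaussian(288.5)])

# A noise-free "experimental" spectrum mixed with known weights.
true_weights = np.array([0.6, 0.3, 0.1])
measured = references @ true_weights

# Decompose the measured spectrum into per-environment contributions.
weights, *_ = np.linalg.lstsq(references, measured, rcond=None)
```

In this clean synthetic case the fitted weights recover the mixing proportions exactly, which is the atomic-level information the experimental spectrum alone hides.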

'In the past, experimental results have been interpreted differently, based on varying literature references, but now we were able to analyse the results using only computational references. The new method gives us a much better understanding of carbon surface chemistry without human-induced bias,' says Anja Aarva, a doctoral student at Aalto University.

The new method expands knowledge of carbon-based materials

In the two-part study, the researchers first examined qualitatively how differently bonded carbon affects the formation of the experimental spectrum. They then fitted the measured spectrum with computational reference spectra to obtain a quantitative estimate of its composition, which allowed them to determine the atomic-level nature of the carbon sample. The new methodology is suitable for analysing the surface chemistry of various forms of carbon, such as graphene, diamond and amorphous carbon.

The study is a continuation of the work of Aalto University postdoctoral researcher Miguel Caro and Professor Volker Deringer of Oxford University, which extensively mapped the structure and reactivity of amorphous carbon. The study utilises machine learning methods developed by Professor Deringer and Professor Gabor Csányi of Cambridge University. Experimental measurements were carried out by Sami Sainio, an Aalto-based postdoctoral researcher at Stanford University.

'Next, we intend to use the methodology we have developed to predict, for example, what kind of carbon surface would be best for electrochemical identification of certain neurotransmitters, and then try to produce the desired surface. In this way, computational work would guide experimental work and not vice versa, as has typically been the case in the past,' Tomi Laurila, professor at Aalto University said.

Credit: 
Aalto University

Two cosmic peacocks show violent history of the Magellanic Clouds

video: A number of filamentary structures are formed at the same time after the collision. This simulation was performed by the supercomputer "ATERUI" operated by the National Astronomical Observatory of Japan.

Image: 
NAOJ/Inoue et al.

Two peacock-shaped gaseous clouds were revealed in the Large Magellanic Cloud (LMC) by observations with the Atacama Large Millimeter/submillimeter Array (ALMA). A team of astronomers found several massive baby stars in the complex filamentary clouds, which agrees well with computer simulations of giant collisions of gaseous clouds. The researchers interpret this to mean that the filaments and young stars are telltale evidence of violent interactions between the LMC and the Small Magellanic Cloud (SMC) 200 million years ago.

Astronomers know that stars are formed in collapsing clouds in space. However, the formation processes of giant stars, 10 times or more massive than the Sun, are not well understood because it is difficult to pack such a large amount of material into a small region. Some researchers suggest that interactions between galaxies provide a perfect environment for massive star formation. Due to the colossal gravity, clouds in the galaxies are stirred, stretched, and often collide with each other. A huge amount of gas is compressed in an unusually small area, which could form the seeds of massive stars.

A research team used ALMA to study the structure of dense gas in N159, a bustling star formation region in the LMC. Thanks to ALMA's high resolution, the team obtained a detailed map of the clouds in two sub-regions, N159E-Papillon Nebula and N159W South.

Interestingly, the cloud structures in the two regions look very similar: fan-shaped filaments of gas extending to the north with the pivots in the southernmost points. The ALMA observations also found several massive baby stars in the filaments in the two regions.

"It is unnatural that in two regions separated by 150 light-years, clouds with such similar shapes were formed and that the ages of the baby stars are similar," says Kazuki Tokuda, a researcher at Osaka Prefecture University and the National Astronomical Observatory of Japan. "There must be a common cause of these features. Interaction between the LMC and SMC is a good candidate."

In 2017, Yasuo Fukui, a professor at Nagoya University and his team revealed the motion of hydrogen gas in the LMC and found that a gaseous component right next to N159 has a different velocity than the rest of the clouds. They suggested a hypothesis that the starburst is caused by a massive flow of gas from the SMC to the LMC, and that this flow originated from a close encounter between the two galaxies 200 million years ago.

The pair of peacock-shaped clouds in the two regions revealed by ALMA fits nicely with this hypothesis. Computer simulations show that many filamentary structures are formed in a short time after a collision of two clouds, which also backs this idea.

"For the first time, we uncovered a link between massive star formation and galaxy interactions in very sharp detail," says Fukui, the lead author of one of the research papers. "This is an important step in understanding the formation process of massive star clusters in which galaxy interactions have a big impact."

Credit: 
National Institutes of Natural Sciences

Architecture of a bacterial power plant decrypted

image: Structure of the cytochrome bd oxidase. The experimental data are shown in gray and the derived molecular model is colored. The excision enlargement shows the area in which the three cytochromes are bound.

Image: 
Rudolf-Virchow-Zentrum / University of Würzburg

Both humans and many other creatures need oxygen to survive. In the conversion of nutrients into energy, oxygen is converted to water by an enzyme called an oxidase. This reaction represents the last step of the so-called respiratory chain.

While humans have only one type of these oxidases, the bacterial model organism Escherichia coli (E. coli) has three alternative enzymes available. In order to better understand why E. coli and other bacteria need multiple oxidases, Prof. Bettina Böttcher from the Rudolf Virchow Center in collaboration with Prof. Thorsten Friedrich (University of Freiburg) have determined the molecular structure of the cytochrome bd oxidase from E. coli. This type of oxidase is found only in bacteria and microbial archaea.

Bacteria have other types of oxidase

The eponymous cytochromes, two of type b and one of type d, are the key iron-containing groups that enable the oxidase's function. At cytochrome d, oxygen is bound and converted to water. The structure determination revealed that the architecture of cytochrome bd oxidase from E. coli is very similar to that of the enzyme from another bacterium, Geobacillus thermodenitrificans. "However, to our great surprise, we discovered that a cytochrome b and cytochrome d have changed positions and thus the site of oxygen conversion within the enzyme," reports Prof. Thorsten Friedrich.

The cause of this change could be that the cytochrome bd oxidase might fulfill a second function: in addition to the energy production, it can serve to protect against oxidative stress and stress by nitroxides. Particularly pathogenic bacterial strains show a high activity of cytochrome bd oxidase. Since humans do not have this type of oxidase, these results might furthermore provide important indications on the development of new antimicrobials that target the cytochrome bd oxidase of pathogens such as Mycobacteria.

Important for this success was the new high-performance electron microscope, which has been operated since 2018 under the direction of Prof. Böttcher at the Rudolf Virchow Center. "Cytochrome bd oxidase was a challenging sample for cryo-electron microscopy because it is one of the smallest membrane proteins whose structure has been determined with this technique," explains Prof. Bettina Böttcher.

Special features of this technique are extremely low temperatures down to minus 180 degrees Celsius and a resolution that moves in the order of atoms. It makes it possible to study biological molecules and complexes in solution that have been previously snap frozen and to reconstruct their three-dimensional structure. With a voltage of 300,000 volts, the microscope accelerates the electrons with which it "scans" the samples.

Credit: 
University of Würzburg

Alpine rock axeheads became social and economic exchange fetishes in the Neolithic

image: Alpine rock axehead found at Harras, Thuringia, from the Michelsberg Culture (c. 4300-2800 BCE).

Image: 
Juraj Lipták, State Office for Heritage Management and Archaeology Saxony-Anhalt.

Axeheads made of Alpine rocks carried strong social and economic symbolic meaning in the Neolithic, given their production and use value. Their resistance to friction and breakage, which permitted intense polishing and re-working of the rocks, gave these artefacts an elevated exchange value, key to the formation of long-distance exchange networks among communities of Western Europe, communities that had already begun to set the exchange value of a product according to the time and effort invested in producing it.

This is what a study led by a research group at the Universitat Autònoma de Barcelona (UAB) indicates in regards to the mechanical and physical parameters characterising the production, circulation and use of a series of rock types used in the manufacturing of sharp-edged polished artefacts in Europe during the Neolithic (5600-2200 BCE).

The objective of the study was to answer a long debated topic: the criteria by which Alpine rocks formed part of an unprecedented pan-European phenomenon made up of long-distance exchange networks, while others were only used locally. Was the choice based on economic, functional or perhaps subjective criteria? Stone axeheads were crucial to the survival and economic reproduction of societies in the Neolithic. Some of the rocks used travelled over 1000 kilometres from their Alpine regions to northern Europe, Andalusia in southern Spain and the Balkans.

This is the first study in the specialised literature to include comparative data obtained by testing the rocks' resistance to friction and breakage. These mechanical parameters have led to the definition of production and use values, which were then correlated with the distances and volumes of the rocks exchanged in order to obtain their exchange value. The results help explain the basic principles underlying the supply and distribution system of stone materials during the Neolithic in Western Europe, as well as its related economic logic.

"The reasons favouring the integration of specific rock types into these long-distance networks depended on a complex pattern of technological and functional criteria. This pattern was not solely based on economic aspects, their use value, but rather on the mechanical capacity to resist successive transformation processes, i.e. their production value, and remain unaltered throughout time", explains Selina Delgado-Raack, researcher at the Department of Prehistory, UAB, and first author of the article.

Supply System and Economic Logic

The study points to the diverging economic conception between the manufacturing of tools using other rocks and Alpine rock axeheads. Neolithic communities selected the most suitable raw materials available from all the resources in their region and knew each of their mechanical and physical characteristics. These tools normally travelled in a radius of 200 kilometres from where they originated and rarely went farther than 400-500 kilometres. Only Alpine rocks travelled further than those regional and economic limits.

"The circulation of these rocks at larger distances did not respond to a functional and cost-efficient logic, in which each agent takes into account the costs of manufacturing and transport when selecting the different rock types, all of them viable in being converted into fully functioning tools", indicates Roberto Risch, also researcher at the Department of Prehistory, UAB, and coordinator of the research. "It rather obeys the emergence of a very different economic reasoning, based on the ability to transform one material through ever greater amounts of work, something which many centuries later Adam Smith used to define the British economy of the 18th century. In the case of Alpine axeheads, their exceptional exchange value was due to the increase in manufacturing costs, a result of the intense polishing of these stones as they passed from one community to another".

A Primitive Form of Currency?

For the research team, the fact that Alpine axeheads are among the most commonly re-crafted and modified artefacts across different periods and regions of the Neolithic rules out their role as symbols of power or ceremonial elements. "The economic pattern points more towards a fetish object used in social and economic interactions among European communities with highly different socio-political organisations and orientations", Selina Delgado-Raack states.

The exceptional exchange value reached by some rock types, such as the omphacitites and jadeitites, leads the team to think that they may have been used as a primitive form of currency, although they admit that there is a need for more studies before this topic can be clarified.

Credit: 
Universitat Autonoma de Barcelona

Efficient, but not without help

HSE University economists analyzed which banks performed best on the Russian market from 2004 to 2015: state-owned, privately owned, or foreign-owned. They found that during stable economic and political periods, foreign-owned banks tend to take the lead, while during a crisis period, such as from 2008 to 2013, state-owned banks outperformed them.

Empirical studies have shown that efficiency in the banking sector generally correlates with a country's economic growth. A bank's efficiency is impacted by many factors, including its asset and liability structure, as well as its specialization (household deposits, securities, or investment banking). The type of ownership - public or private - also plays a role. Furthermore, depending on the overall macroeconomic environment, private or state-owned banks may perform better. This is the first time such an analysis has been carried out in Russia.

Market-leading banks focus on profit efficiency. State-owned banks, in addition to operational goals, also pursue social objectives, such as offering reduced-rate mortgages to military personnel and other eligible beneficiaries.

Foreign-owned private banks, by contrast, generally enter new markets with a single goal: profit generation. Russia used to be quite attractive in this regard, at least from 2004 to 2009, when the interest margin was almost four times higher than in Europe (8.02% vs. 2.24%).

The proportions of state-owned and foreign-owned banks in the sector's total assets changed considerably over the 11 years from 2004 to 2015. According to the Bank of Russia's data, the share of state-owned banks increased rapidly from 40% in 2004 to 52% in 2008 and 61.5% in 2015. The share of foreign-owned banks started out at 7.3% in 2004 and climbed to 18.7% in 2008. However, the global financial meltdown hit them harder than others, and by 2015 their share of assets had fallen to 13%. At the same time, the share of privately owned Russian banks also decreased significantly: standing above 50% in 2004, it fell below 30% by 2015.

Research Methods

Economists at HSE University analyzed reported data from 240 banks, which, in aggregate, represent 91% of total banking assets in Russia. They aimed to identify the factors explaining why the return on assets (ROA) of certain banks fell short of the optimal level.

The 'suspects' included the banks' ownership, as well as those indicators characterizing assets and liability structures and the risk profiles of loan portfolios. The researchers analyzed the share of household deposits in total deposits, the share of household loans in banks' loan portfolios, and the share of loan loss provisions to total loans. These particular indicators can characterize a bank's focus and credit risk level.

A bank's profit efficiency demonstrates how it manages its profits. As such, a lack of efficiency may show up in banking services, which might be too expensive or not popular among the bank's clients.

'To model the profit function, we relied on three factors: labour, capital, and deposits. We believed that for banks, deposits are the main resource,' explained Veronika Belousova, one of the study's authors, adding: 'The profit function included the cost of these resources (cost of labour = personnel expenses/total bank assets; cost of capital = operational costs/fixed assets; cost of deposits = interest paid on deposits/volume of deposits). Also, loans and securities were investigated as banking products. ROA was a dependent variable in regards to this function'.
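The input-price ratios quoted above can be computed directly from a bank's reported figures. A minimal sketch follows; the field names and sample numbers are hypothetical, not data from the study:

```python
def input_prices(bank):
    """Compute the three input prices used in the profit function,
    following the ratios quoted in the study."""
    return {
        "cost_of_labour": bank["personnel_expenses"] / bank["total_assets"],
        "cost_of_capital": bank["operational_costs"] / bank["fixed_assets"],
        "cost_of_deposits": bank["interest_paid"] / bank["deposit_volume"],
    }

# Illustrative figures for one bank (all in billions, hypothetical).
sample = {
    "personnel_expenses": 1.2,
    "total_assets": 100.0,
    "operational_costs": 0.9,
    "fixed_assets": 4.5,
    "interest_paid": 3.0,
    "deposit_volume": 60.0,
}
prices = input_prices(sample)
```

These per-bank input prices, together with loan and securities outputs, feed the profit function against which each bank's ROA is compared.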

In addition to these indicators, the researchers considered several additional factors, such as currency exchange rates, banks' capital adequacy (as risk buffer), their location and specialization, and the size of their branch networks.

The profit function was modeled for different time periods, taking into account crises experienced by the Russian economy: i.e., the 2004 liquidity crisis, the 2008-2010 economic meltdown, and the geopolitical crisis that started in spring 2014.

Outcomes

This empirical study indicates that the profit efficiency of state-owned banks during crisis periods was better than that of privately owned banks. The researchers note that this advantage, together with the fact that state-owned banks have more resources, helps them outperform competitors in the volume of household loans, in attracting payroll projects (e.g., from state-owned firms), and in offering payment services to their corporate clients. In addition, state-owned banks are able to reduce the share of less profitable interbank loans.

The introduction of Russia's deposit insurance system and the 2004 crisis forced many firms to transfer their accounts from private to state-owned banks. As a result, the former were unable to compete with the latter in pricing their services, which led to a decline in long-term, stable funding sources and, consequently, higher liquidity risks.
Today, state-owned banks lead in both household (68.4%) and corporate (72.8%) loans. Furthermore, interbank loans have begun taking a larger share in the loan portfolios of privately owned banks.

During the more 'peaceful' years (January 2004 to June 2008, and January 2014 to October 2015), foreign-owned banks turned out to be more profit-efficient (i.e., their ROAs were closer to the optimal level) on the Russian market. The researchers believe that one of their key advantages is better management practices, as instilled by their parent institutions. This concerns all areas of activity, from routine standardization of business processes and faster decision-making to the introduction of new technologies and unified risk assessment instruments.

However, during the crisis, from July 2008 to December 2013, state-owned banks performed better in terms of profit efficiency. According to the researchers, this happened due to their larger capital stocks and ability to replenish them. 'This situation is characteristic of countries where there are more monopolies and state banks in the economy,' the authors concluded.

Credit: 
National Research University Higher School of Economics

A step closer to cancer precision medicine

image: Figure 1. Left panel: ER-positive breast cancer patients with AGR2 up-regulation showed lower survival rates in the METABRIC study; Right panel: Acute myeloid leukaemia patients with SRGN up-regulation showed lower survival rates in the BeatAML study.

Image: 
University of Helsinki

Researchers from the Faculty of Medicine and the Institute for Molecular Medicine Finland (FIMM) at the University of Helsinki have developed a computational model, Combined Essentiality Scoring (CES), which enables accurate identification of essential genes in cancer cells for the development of anti-cancer drugs.

Why are the essential genes important in cancer?

Cancer is the leading cause of death worldwide. Cancer cells usually grow faster because certain genes are activated. Targeted therapies aim to inhibit genes that are activated only in cancer cells, thereby minimizing side effects on normal cells.

High-throughput genetic screening has been established for evaluating the importance of individual genes for the survival of cancer cells. Such an approach allows researchers to determine the so-called gene essentiality scores for nearly all genes across a large variety of cancer cell lines.

However, challenges with replicability of the estimated gene essentiality have hindered its use for drug target discovery.

"shRNA and CRISPR-Cas9 are the two common techniques used to perform high-throughput genetic screening. Despite improved quality control, the gene essentiality scores from these two techniques differ from each other on the same cancer cell lines," explains Wenyu Wang, first author of the study.

How can we do better?

To harmonize genetic screening data, the researchers proposed a novel computational method, Combined Essentiality Scoring (CES), which predicts cancer essential genes using information from shRNA and CRISPR-Cas9 screens as well as molecular features of cancer cells. The team demonstrated that CES detects essential genes with higher accuracy than existing computational methods. Furthermore, the team showed that two predicted essential genes correlated with poor prognosis in breast cancer and leukaemia patients, respectively, suggesting their potential as drug targets (Figure 1).
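The idea of blending two noisy essentiality estimates and conditioning on a molecular feature can be caricatured in a few lines. This is only an illustrative sketch: the actual CES model is more sophisticated, and the weights, scores, and the expression rule below are entirely hypothetical.

```python
# Hypothetical sketch of combining two gene essentiality estimates.
# All weights and numbers are invented for illustration; they are NOT
# the actual CES model from the University of Helsinki study.

def combined_essentiality(shrna_score, crispr_score, expressed,
                          w_shrna=0.4, w_crispr=0.6):
    """Blend shRNA and CRISPR-Cas9 essentiality scores; down-weight
    genes not expressed in the cell line, since an unexpressed gene
    is unlikely to be a useful drug target there."""
    blended = w_shrna * shrna_score + w_crispr * crispr_score
    return blended if expressed else blended * 0.1

# By the usual screening convention, more negative = more essential.
score = combined_essentiality(shrna_score=-1.2, crispr_score=-0.8,
                              expressed=True)
```

In the real method, the relative weighting and the use of molecular features are learned from data rather than fixed by hand.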

"Improving gene essentiality scoring is just a beginning. Our next aim is to predict drug-target interactions by integrating drug sensitivity and gene essentiality profiles. Given the ever-increasing volumes of functional screening datasets, we hope to extend our knowledge of drug target profiles that will eventually benefit drug discovery in personalized medicine," says Assistant Professor Jing Tang, corresponding author of the study.

Credit: 
University of Helsinki

Sociable crows are healthier -- new research

image: Carrion crows

Image: 
Photo by Dr Claudia Wascher, Anglia Ruskin University (ARU)

A new study has found that crows living in large social groups are healthier than crows that have fewer social interactions.

The research, led by Dr Claudia Wascher of Anglia Ruskin University (ARU), has been published this week in the journal Animal Behaviour.

Dr Wascher and her colleagues studied a population of captive carrion crows over a six-year period. They monitored the behaviour of the crows in different sized groups and measured friendship by ranking the birds using a sociality index.

At the same time, they studied the crows' droppings to measure the presence of coccidian oocysts, a gastrointestinal parasite that can pose an important health threat to birds.

Increased exposure to parasites and disease transmission is considered one of the major disadvantages of group living. This new study, however, shows the opposite effect.

The researchers found that crows with strong social bonds, living with more relatives, and in larger groups, excreted a significantly smaller proportion of droppings containing parasites than less sociable crows.
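The comparison described above boils down to contrasting the proportion of parasite-positive droppings between more and less sociable birds. The tiny sketch below illustrates that calculation only; the sociality index and all sample values are invented, not the study's actual six-year data.

```python
# Toy illustration of the health measure used in the crow study:
# the proportion of droppings containing coccidian oocysts.
# All data below are invented for illustration.

def parasite_prevalence(droppings):
    """Fraction of samples testing positive (1 = oocysts present)."""
    return sum(droppings) / len(droppings)

# Hypothetical sample series for a highly sociable and a less
# sociable crow (1 = positive sample, 0 = clean sample).
high_sociality_crow = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
low_sociality_crow = [1, 0, 1, 0, 1, 0, 0, 1, 0, 0]
```

The study's finding corresponds to the first prevalence being systematically lower than the second across the population, after accounting for group size and kinship.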

The study did not find a connection between health and a crow's dominance within its group, but found that male crows (33%) were slightly more likely to carry the parasite than females (28%).

Dr Wascher, Senior Lecturer in Biology at Anglia Ruskin University (ARU), said: "Crows are a highly social bird and we found that crows with the strongest social bonds excreted fewer samples containing coccidian oocyst, which is a common parasite in birds.

"It is a commonly-held belief that animals in larger groups are less healthy, as illness spreads from individual to individual more easily. We also know from previous studies that aggressive social interactions can be stressful for birds and that over time chronic activation of the physiological stress response can dampen the immune system, which can make individuals more susceptible to parasites.

"Therefore the results from our six-year study, showing a correlation between sociability and health, are significant. It could be that having close social bonds reduces stress levels in crows, which in turn makes them less susceptible to parasites.

"It could also be that healthier crows are more sociable. However, as many of the birds we studied were socialising within captive family groups, dictated by the number of crows within that family, we believe that social bonds in general affect the health of crows, and not vice versa."

Credit: 
Anglia Ruskin University

Researchers find new role for dopamine in gene transcription and cell proliferation

WASHINGTON (Nov. 14, 2019) - The dopamine D2 receptor has a previously unobserved role in modulating Wnt expression and control of cell proliferation, according to a new study from the George Washington University (GW) and the University of Pittsburgh. The research, published in Scientific Reports, could have implications for the development of new therapeutics across multiple disciplines including nephrology, endocrinology, and psychiatry.

Dopamine is traditionally studied in the central nervous system; however, it is increasingly implicated in regulating the functions of various other organs. This new study identifies a new role for dopamine signaling via the D2 receptor outside the brain - in controlling signaling through the Wnt/β-catenin pathway, in part through its effects on the expression of Wnt3a, a key Wnt receptor ligand.

Both dopamine and the Wnt/β-catenin signaling pathways are ubiquitous across organ systems and species. Wnt signaling is essential for development and cell proliferation, and is associated with a number of diseases from cancer to schizophrenia. However, little is known about the underlying mechanism regulating expression of Wnt3a, or the modulation of its activity.

"In our research, we found that the dopamine D2 receptor is a transcriptional regulator of Wnt signaling and this ability to modulate Wnt signaling is important for better understanding development of hypertension," said Prasad Konkalmatt, PhD, assistant research professor of medicine at the GW School of Medicine and Health Sciences and a first author on the study.

The research team focused the study on signaling in the kidneys and in the pancreas. More broadly, the study shows that dopamine receptors can act as regulators of gene transcription and that this signaling is important in controlling cell proliferation under healthy and disease conditions.

These results were unexpected, surprising the investigators by how well conserved this dopamine regulation was across species and organs. This work also showed for the first time that lithium, one of the most commonly used psychiatric medications today, strongly increases the expression of D2 receptors, providing a new mechanism of action for this drug.

"Our work opens the door to a new way of thinking about dopamine signaling and its regulation," said Zachary Freyberg, MD, PhD, assistant professor of psychiatry and cell biology at the University of Pittsburgh School of Medicine and a senior author on the study. "By providing a new mechanism for the actions of lithium, we can better understand how this medication works and make better medications in the future to treat bipolar disorder and to improve the lives of the millions of people living with this illness."

The investigators also discovered that a number of common gene polymorphisms associated with hypertension and renal injury control D2 receptor expression in renal cells. This discovery provides new mechanisms and drug discovery targets for hypertension and renal injury.

"Our findings have broad implications in terms of how we think about dopamine receptor signaling, especially given that the receptors are targets for diabetes and potentially for hypertension and renal injury," explained Ines Armando, PhD, associate research professor of medicine at the GW School of Medicine and Health Sciences and a senior author on the study. "Expanding our understanding of this unique signaling specific to individual patients offers the promise of more effective precision medicine."

Credit: 
George Washington University

MicroRNA comprehensively analyzed

image: Messenger RNA carries genetic information for protein synthesis, and microRNA plays a key role in the regulation of gene expression. Scientists from the Moscow Institute of Physics and Technology and the Research Centre for Medical Genetics have described the complex interactions between these two and other kinds of human RNA.

Image: 
@tsarcyanide/MIPT Press Office

Messenger RNA carries genetic information for protein synthesis, and microRNA plays a key role in the regulation of gene expression. Scientists from the Moscow Institute of Physics and Technology and the Research Centre for Medical Genetics have described the complex interactions between these two and other kinds of human RNA. The paper was published in Frontiers in Genetics.

What are microRNA and Argonaute proteins?

Ribonucleic acid, or RNA, is an essential molecule encoding genetic information in cells. Its three main types are called transfer, ribosomal, and messenger RNA. The latter is also known as mRNA and serves as an intermediary between DNA -- the gene repository -- and the protein molecules resulting from gene expression. mRNA is synthesized in the nucleus based on the DNA sequence. It is then transported out into the cytoplasm, where it becomes a template for protein synthesis. However, only about 2% of the RNA molecules produced by a cell serve as protein templates. Among the remaining ones is the so-called microRNA, which is 18 to 25 nucleotides long and plays an entirely different role.

MicroRNAs are a group of about 2,500 molecules identified in humans so far, which bind with proteins from the Argonaute (AGO) family and function together with them. The small microRNA-AGO complex binds to particular mRNA sites, and it is always the microRNA component of the complex that determines which region of which mRNA to bind. The Argonaute protein either blocks protein production from the mRNA or eliminates the mRNA by "cleaving" it. Therefore, if a microRNA-AGO complex engages a certain mRNA, the corresponding protein will no longer be produced. In this way, genes are effectively silenced by microRNA "capturing" mRNA and affecting gene expression.

While the interaction occurs between microRNA and mRNA, it is often seen as that between microRNA and the gene encoding the mRNA. This silencing is one of the numerous mechanisms for regulating gene expression. Cells employ them to control gene productivity by enabling or disabling genes to a varying degree. Misregulation of gene expression due to microRNA "malfunctions" can cause cancer and other pathologies.

Geneticists still do not fully understand the interactions between the two types of RNA. There are about 20,000 known human mRNAs and 2,500 microRNAs, but it remains unclear which of them actually bind to each other. The researchers previously showed that the computer programs predicting interactions between microRNA and mRNA do not work properly.

In their new research, the scientists combined experimental data on the amounts of mRNA and microRNA formed in a cell with data on the interactions between these two kinds of RNA for two types of human cells. The team explored the connection between the amount of a microRNA in a cell and its binding activity. One would expect the two quantities to be in direct proportion, but that proved not to be the case. The researchers also looked at how many pairs an mRNA formed, and whether with the same or with different microRNAs. Scientifically speaking, the geneticists explored the relation between the expression level and the binding activity for mRNA and microRNA. They also explored how the behavior of these pairs depends on cell type.

"Our research explores the interaction between microRNA and genes," said Olga Plotnikova, a PhD student at MIPT and one of the authors of the research. "A microRNA is a small noncoding RNA molecule that regulates gene expression. We published an article showing the imperfections of the computer programs used to predict the interactions between microRNA and genes. That's why we wanted to get the full picture of microRNA interactions: what binds with what and how.

"We analyzed the only two available papers on this subject that report the experimental data on the entire range of interactions between microRNA and genes, for two different human cell lines. Then we correlated the data with the results of other experiments, determining the expression level of mRNA and microRNA in the same cell lines," Plotnikova went on. "We showed that microRNA does not strongly regulate all genes and its regulation potential does not directly depend on its expression level. We also identified the differences between the microRNA interactions in the two cell lines."

Methods

The main problem with empirical research into microRNA interactions is the limitations of the available methods. One set of methods, known as reporter gene assays, can test one particular interaction per experiment. Another set, called cross-linking with immunoprecipitation (CLIP), enables researchers to identify the binding sites, but not the specific microRNAs associated with them. Cross-linking is typically used to determine the position of direct interactions between proteins and nucleic acids. It enables researchers to purify a specific protein-RNA complex so that the vast majority of contaminating RNA can be removed. It is thus possible to identify all the microRNA-mRNA binding sites without knowing which of the thousands of known microRNAs was involved in each interaction.

Two similar techniques have been developed recently: the CLASH and CLEAR-CLIP methods, which build upon CLIP. The problem is that they are very intricate and have only been used on two cancerous human cell lines: a kidney cell line and a liver cell line. The team also used the available data on the amounts of mRNA and microRNA formed in each of the cell lines (gene expression data). The scientists used data obtained in 79 CLIP experiments to identify the mRNA regions interacting with microRNAs. While these experiments establish the existence of the interactions, they do not reveal the exact microRNAs involved.

Research results

The researchers have provided an in silico proof that the data from improved CLIP experiments on the entire range of interactions between microRNA and genes for two different human cell lines are similar and can be compared. They demonstrated that most mRNA-microRNA complexes are formed by a small number of RNAs. For example, only 1%-2% of protein-encoding genes form more than 10 different interactions. Intriguing mRNAs that exhibit "sponge-like" properties were also found: these mRNAs bound a large number of microRNAs (more than 50) at different sites. What is more, the researchers found a group of microRNAs with two key features: they are weakly expressed yet have many interactions. This runs contrary to the expectation that the more a given microRNA is expressed, the more it should interact with various mRNAs.

The study also established a list of reliable microRNA-binding regions -- the places where mRNA and microRNA interact with each other. This led the team to create online software that determines whether a given position in the human genome coincides with a microRNA binding site. It helps to identify the disruption of microRNA binding and therefore of gene regulation, which means that it can explain genetically inherited diseases. The program could be used for analyzing the genome of patients. Mapping all the interactions between microRNA and human genes helps to reveal the molecular basis of congenital and acquired disorders.
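The lookup the online tool performs amounts to an interval query: does a given genomic position fall inside any known microRNA-binding region? A minimal sketch of that query, assuming sorted, non-overlapping regions (all coordinates below are invented; real regions come from the CLASH/CLEAR-CLIP data):

```python
# Minimal sketch of a binding-site lookup. Coordinates are invented
# for illustration; the actual tool uses the curated list of reliable
# microRNA-binding regions described in the study.
import bisect

# Non-overlapping (start, end) regions on one chromosome, sorted by start.
binding_sites = [(100, 120), (340, 365), (1200, 1225)]
starts = [s for s, _ in binding_sites]

def in_binding_site(pos):
    """True if pos lies within any (start, end) region, inclusive."""
    i = bisect.bisect_right(starts, pos) - 1
    return i >= 0 and binding_sites[i][0] <= pos <= binding_sites[i][1]
```

A mutation reported at a position for which this query returns true is a candidate for disrupted microRNA binding, and hence disrupted gene regulation.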

Credit: 
Moscow Institute of Physics and Technology

Sexual minorities continue to face discrimination, despite increasing support

UNIVERSITY PARK, Pa. -- Despite increasing support for the rights of people in the LGBTQ+ community, discrimination remains a critical and ongoing issue for this population, according to researchers.

In a recent study, researchers found that adults who identified as gay, lesbian or bisexual -- as well as people who reported same-sex attraction or same-sex sexual partners, referred to as sexual minorities -- experienced discrimination and victimization at different rates across age.

Cara Exten, assistant professor of nursing at Penn State, said the findings are a reminder that discrimination is still a significant issue for sexual minorities, which is key for policy, prevention and intervention.

"We conducted this study because we wanted to better understand discrimination experiences affecting sexual minority populations," Exten said. "We wanted to examine whether there were adults at particular ages who were more likely to have experienced discrimination in the past year -- and if so, what types of discrimination. We aimed to call attention to the continued high rates of discrimination that LGBTQ+ individuals are experiencing -- because we know that these experiences affect their health."

Collaborator Stephanie Lanza, professor of biobehavioral health and director of the Edna Bennett Pierce Prevention Research Center, noted that "a better understanding of recent experiences of discrimination among adults across a wide range of ages is necessary so that we can add to the national discourse on LGBTQ+ disparities in physical and mental health. Importantly, examining specific types of discrimination experienced by sexual minorities across age can indicate where there is greatest need for intervention -- both to support individuals and to address stigma more broadly."

According to the researchers, previous work has found that sexual minorities tend to experience poorer health than non-sexual minorities. Exten said that while sexual minorities are not inherently more vulnerable to health concerns, their experiences with anti-LGB stress, stigma, and discrimination across the life course may lead to poor and complicated health patterns.

"Research has linked discrimination and poor health outcomes among minorities, but we didn't have a clear picture of whether sexual minorities may be more or less vulnerable to experiencing discrimination at certain points during their life," Exten said. "We might, for example, find that older adults are more likely to experience discrimination in health care settings as they age, given that older adults are more likely to need medical care."

The researchers used data on 2,993 sexual minority adults between the ages of 18 and 65, gathered from a nationally representative study of U.S. citizens. Participants answered a questionnaire about how often they had experienced discrimination in the previous year due to being perceived as gay, lesbian or bisexual.

The survey asked whether participants had experienced six different forms of discrimination, which the researchers grouped into three categories: general discrimination, such as in public places like shops or restaurants; victimization, such as being called names, pushed or threatened; and healthcare discrimination, such as trouble obtaining healthcare due to sexual orientation, or discrimination during treatment.

After analyzing the data, the researchers found that 17% of participants had experienced some form of discrimination in the previous year. In total, 13% reported general discrimination, 12% reported victimization and 7% reported healthcare discrimination.

The researchers also broke down the data by age, gender, and sexual identity. In general, discrimination experiences were most common in early adulthood, with another increase in middle adulthood. Males were generally more likely to report having experienced anti-LGB discrimination and victimization in the last year. Healthcare discrimination peaked among individuals in their early 50s.

"The overall rates were quite high," Exten said. "This was particularly true in some subgroups of the community. Among 18-year-olds, one in five males experienced victimization in the past year. Experiencing victimization can be quite traumatic, and certainly acts as a stressor for these individuals. We hope these findings will be a call to action."

Exten said the findings -- recently published in the Journal of Homosexuality -- suggest the need for continued work in reducing discrimination.

"Reducing discrimination in the United States will require broad approaches within our communities, schools, workplaces, healthcare facilities, and families," Exten said. "It is critical that we continue to recognize that discrimination is happening and that we continue to work to develop more inclusive policies and spaces in our communities."

Credit: 
Penn State

FEFU scientists obtained new compounds with potential antitumor effect from sea sponge

image: A large part of the research related to the synthesis of the target compounds was carried out by Polina Smirnova, a Ph.D. student at the Department of Organic Chemistry of SNS FEFU.

Image: 
FEFU press office

Chemists from Far Eastern Federal University's School of Natural Sciences (SNS FEFU) developed a new method to synthesize biologically active derivatives of fascaplysin, a cytotoxic pigment from a sea sponge. For the first time, they obtained sufficient amounts of 3-bromofascaplysin and 3,10-dibromofascaplysin, which were known before but were not available for study. Based on these compounds, the scientists synthesized 14-bromoreticulatate and 14-bromoreticulatine, derivatives of the reticulatine alkaloids. The article was published in Marine Drugs.

A joint study by scientists from the Far Eastern Branch of the Russian Academy of Sciences and University Medical Center Hamburg-Eppendorf has shown that 14-bromoreticulatine, obtained for the first time at FEFU, selectively affects Pseudomonas aeruginosa, a bacterium resistant to many types of antibiotics. The new compound also has moderate cytotoxic activity, killing skin cancer, rectal cancer, and prostate cancer cells.

3,10-Dibromofascaplysin can suppress the metabolism of human prostate cancer cells at concentrations sevenfold lower than those that destroy their cell membranes. The substance thus has a wide range of potential applications in the selective treatment of malignant tumors, without adverse effects on healthy cells.

"It is important that, by modifying the original alkaloid structure, we were able to register a variety of its mechanisms of action at widely different concentrations. This opens up possibilities for creating new antitumor drugs with selective action based on fascaplysin. Modern anti-cancer drugs implement this important property insufficiently, which accounts for their high toxicity to the organism as a whole. The next generation of drugs will not affect patients' quality of life as badly," said Maxim Zhidkov, one of the authors of the study and associate professor of organic chemistry at SNS FEFU.

The author clarified that it is too early to talk about the development of a new drug. The scientists hope to synthesize a new series of fascaplysin derivatives based on the results of the study and to test their action in mice. The results of these prospective experiments will reveal how close the researchers have come to a new effective drug.

A large part of the research related to the synthesis of the target compounds was carried out by Polina Smirnova, a Ph.D. student at the Department of Organic Chemistry of SNS FEFU. Scientists from the G.B. Elyakov Pacific Institute of Bioorganic Chemistry of the Far Eastern Branch of the Russian Academy of Sciences, the Federal Scientific Center of the East Asia Terrestrial Biodiversity of the Far Eastern Branch of the Russian Academy of Sciences, and University Medical Center Hamburg-Eppendorf also took part in the study.

Fascaplysin was first extracted from the sea sponge Fascaplysinopsis sp. in 1988 and has been intensively studied ever since. This compound has a wide range of biological activity -- it strongly suppresses the growth of various types of tumor cells and has antibacterial, antifungal and even analgesic effects. However, the fascaplysin's high toxicity to healthy cells limits its use as a drug.

Earlier, FEFU scientists have shown that fascaplysin derivatives stimulate the death of glioblastoma multiforme cells -- the most aggressive type of brain cancer.

Credit: 
Far Eastern Federal University

Bottlebrushes rise up to control coatings

image: Rice University graduate student Hao Mei holds a plate with a pattern of bottlebrush polymers spelling "RICE." The microscopic polymers could give industry exquisite control over the properties of surface coatings.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (Nov. 14, 2019) - A microscopic polymer in the form of a common kitchen implement could give industry exquisite control over coatings.

Bottlebrush copolymers have long been a topic of study for Rafael Verduzco, a chemical and biomolecular engineer at Rice University's Brown School of Engineering. Now, he and his collaborators have developed models and methods to refine surface coatings to make them, for instance, more waterproof or more conductive.

The researchers discovered that bottlebrushes mixed with linear polymers tend to migrate to the top and bottom of a thin film as it dries. These films, as coatings, are ubiquitous in products, for instance as waterproof layers to keep metals from rusting or fabrics from staining.

When the migration happens, the linear polymers hold the center while the bottlebrushes are drawn to the air above or the substrate below. This, Verduzco said, effectively decouples the properties of the bulk coating from its exposed surfaces.

Computational models and experiments showed that variations in the bottlebrush itself could be used to control surface characteristics.

Bottlebrush polymers remain challenging to make in bulk, Verduzco said, but their potential uses are vast. Applications could include drug delivery via functionalized bottlebrushes that form micelles, lubricants, soft elastomers, anti-fouling filters and surfaces that heal themselves, he said.

The details appear in the American Chemical Society journal Macromolecules.

The Rice lab, with help from peers at the University of Tennessee, Knoxville; Oak Ridge National Laboratory and the University of Houston, characterized various bottlebrushes made of polystyrene and poly(methyl methacrylate) (aka PMMA) while studying what causes the polymers to migrate.

Resembling their macro kitchen cousins (as well as certain flowers), bottlebrushes consist of small polymer chains that radiate outward from a linear polymer rod. The bottlebrushes self-assemble in a solution, which can be manipulated to adjust their properties.

Coatings are ubiquitous, Verduzco said. "If we didn't have the right coatings, our materials would degrade quickly," he said. "They would react in ways we don't want them to. So coating a surface is usually a separate process; you make something and then you have to find a way to deposit a coating on top of it.

"What we're looking at is a kind of universal additive, a molecule you can blend with whatever you're making that will spontaneously go to the surface or the interface," he said. "That's how we ended up using bottlebrushes."

Bottlebrushes can be tuned by varying the number of side chains, their length or the length of the backbone polymer, Verduzco said. The side chains themselves can be of mixed type, and small molecules or proteins can be added to their end groups.

"The chemistry of these materials is advanced sufficiently that you can pretty much put just about any kind of polymer as one of these bristles on the side chain," he said. "You can put them in different order."

The researchers found entropic and enthalpic thermodynamics drove bottlebrushes almost completely away from the interior of the films and toward the interfaces as they dried. Even where linear polymers were designed to pair with the surface interface, the bottlebrushes still rose to the exposed surface.

Verduzco noted the findings were made possible by the time-of-flight secondary ion mass spectrometer acquired by Rice in 2018 through a National Science Foundation grant. The spectrometer allowed the researchers to characterize not only the surface of coatings by bombarding them with ions, but also how the coatings changed as microscopic layers were removed from the top down.

Credit: 
Rice University

Findings could identify aggressive breast cancers that will respond to immunotherapy

CHAPEL HILL — University of North Carolina Lineberger Comprehensive Cancer Center researchers have discovered a promising method to identify aggressive breast cancer tumors that will respond to drugs that unleash the immune system against cancer.

The U.S. Food and Drug Administration recently approved a treatment that combines an immunotherapy drug and chemotherapy for triple negative breast cancer, but not all cases of this aggressive form of breast cancer responded in clinical studies.

UNC Lineberger researchers discovered biological clues that could help identify which tumors might respond to the combination treatment.

Their findings, published in the journal Cell, were drawn from studies in mice and an analysis of data from six clinical trials. If confirmed in future studies, the insights could help guide patients to the right treatments, sparing them from those that are not effective. It also could lead to an approach to make the drugs work in cancers that don't initially respond.

"Potentially, we have a new biomarker to be used to figure out which triple negative breast cancer patients should be receiving immunotherapies," said UNC Lineberger's Charles M. Perou, PhD, the May Goldman Shaw Distinguished Professor in the UNC School of Medicine departments of genetics and pathology.

Triple negative breast cancer lacks the three receptors used to guide therapy in breast cancer patients: the estrogen receptor, the progesterone receptor, and the HER2/ERBB2 protein. It accounts for approximately 12 percent of invasive breast cancer cases in the United States and is more likely to affect younger women, as well as black women and people who carry a BRCA1 mutation. Without these three treatment targets, doctors have typically used chemotherapy, which bluntly attacks rapidly dividing cells in the body; even so, it can be effective for many patients with triple negative breast cancer.

The FDA approved the use of an immunotherapy drug, atezolizumab, in combination with chemotherapy for patients with triple negative breast cancer that has metastasized, or spread out of the breast. It was the first time the FDA had approved a treatment that included an immunotherapy for breast cancer.

Atezolizumab is a checkpoint inhibitor, a type of immunotherapy that "releases the brakes" on certain immune cells called T cells, freeing them to find and attack cancer cells. These treatments have been groundbreaking for patients with advanced melanoma, the deadliest form of skin cancer, and other cancers.

A New England Journal of Medicine study reported that the combined atezolizumab-chemotherapy regimen prolonged progression-free survival for some patients with triple negative breast cancer that tested positive for the PD-L1 protein. Forty-one percent of patients were PD-L1 positive, but even among them, not all tumors responded to the combination therapy.

To study why some triple negative breast cancers respond to immunotherapy and others do not, Perou and his colleagues created multiple mouse models of triple negative breast cancer to define the biological features of breast tumors that do respond to the treatment.

In order to answer this question, the mouse models had to respond to immunotherapies - which existing models did not. They genetically modified existing models so that they contained a high number of so-called "tumor antigens" - irregular proteins in the tumor that can trigger the immune system - so the altered models were highly responsive to immune treatments.

They then used genetic tools to analyze the biological features of the tumors that responded, and uncovered a "signature" of response using gene expression data.

The signature revealed clues about what could be causing the immunotherapies to work: patients whose cancer responded to treatment had a coordinated immune response to the cancer involving multiple types of immune cells, including both T cells, which can directly attack and kill tumor cells, and B cells, which make antibodies that may also attack tumors.

"The biomarker signifies a coordinated effort of the adaptive immune system," Perou said.

The researchers also showed that the gene expression pattern identified immunotherapy responses both in their mouse models and in analyses of data from human clinical trials. They showed they could potentially use it to identify breast cancers that will respond to chemotherapy, HER2-positive breast cancers that will respond to trastuzumab, and, importantly, human melanomas that will respond to immunotherapies.

"There are many types of immune cells, and they need to work together to mount an effective immune response, and this biomarker shows when B cells and T cells are working together," Perou said. "And whenever we see that, it predicts a better response -- whether it be immunotherapy in melanoma, HER2-targeted therapy in breast cancer patients or chemotherapy in triple negative breast cancer patients."

They believe the finding could identify both patients who would benefit from the therapy and those who would not.

The researchers also want to use their findings to discover new ways to boost treatment responses.

"Our work going forward right now is to look for ways to leverage B cells to improve treatment plans and regimens for breast cancer patients," said Dan Hollern, PhD, the first author of the study and a postdoctoral research associate at UNC Lineberger. "We definitely need to be looking at ways to activate B cells more effectively with our therapies."

The researchers also plan further studies of the biomarker to continue evaluating the signature in patients with triple negative breast cancer who are receiving the immunotherapy combination.

Credit: 
UNC Lineberger Comprehensive Cancer Center