Culture

Is human cooperativity an outcome of competition between cultural groups?

image: Carla Handley meeting with the Turkana community.

Image: 
Carla Handley

It may not always seem so, but scientists are convinced that humans are unusually cooperative. Unlike other animals, we cooperate not just with kith and kin, but also with genetically unrelated strangers. Consider how often we rely on the good behavior of acquaintances and strangers--from the life-saving services of firefighters and nurses to mundane activities like our morning commute and queueing at the airport check-in counter. Of course, we encounter people who cheat, disregard the welfare of others, and engage in cronyism and nepotism. But we tend to perceive these behaviors as deviant, whereas in most animal societies they are the norm.

The hotly contested issue is why we are the famed cooperators of the animal kingdom. The answer is thought to lie in some characteristic that is exaggerated in humans compared to other animals: language, intelligence, culture, large-game hunting, or our very needy children. Teasing apart how these traits influenced the evolution of cooperation has been challenging and has led to a proliferation of theories--and acrimonious debates--each emphasizing one or another of these characteristics.

A study by ASU researchers Carla Handley and Sarah Mathew published in Nature Communications provides insight into this issue by pinpointing how culture may have fueled our capacity to cooperate with strangers. The researchers empirically tested--and confirmed--predictions of a controversial theory referred to as cultural group selection theory. The idea is that culturally different groups compete, causing the spread of traits that give groups a competitive edge. Cooperation is exactly such a trait--costly for individuals, but advantageous for groups. Handley was a postdoctoral researcher with Arizona State University at the time of the study, and Mathew is an assistant professor in the ASU School of Human Evolution and Social Change and a research affiliate with the Institute of Human Origins.

During the second half of the 20th century, biologists famously discredited the idea that selection could act on groups. They found that, typically, groups are not different enough from each other for selection to act on them. Because individuals migrate, the composition of groups becomes similar over time. Group-beneficial behaviors like cooperation therefore lose out.

The above concern may not apply, however, to behaviors that are culturally, rather than genetically, transmitted. When people migrate, they can, through social learning, culturally acquire the behaviors that are popular in their new surroundings. Thus, cultural groups can remain different even if people move a lot. This means that selection can act on groups, and group-beneficial behaviors like cooperation can flourish.

"People have the intuition that being cultural helps us cooperate. What we are showing is that culture allows groups to be different, and therefore to compete. It is this group competition--ironically--that sculpted our cooperativity," said Mathew.

To evaluate this theory, Handley and Mathew examined cultural variation and cooperation among Kenyan pastoralists. They sampled 759 individuals from nine clans spanning four ethnic groups--the Borana, Rendille, Samburu and Turkana--all of whom practice semi-nomadic subsistence pastoralism in the arid savanna of northern Kenya. These groups compete intensely with one another for pasture, dry-season water wells and livestock, including through lethal cattle raids. The researchers found that, as predicted, cultural practices and beliefs varied substantially between populations. Ten to 20 percent of the observed cultural variation was between competing groups. In contrast, typically less than one percent of genetic variation is between groups. This indicated that there is potential for cultural group selection to occur.
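To make the "share of variation between groups" comparison concrete, it can be illustrated with a simple variance decomposition, much like an F_ST-style statistic. The minimal sketch below uses hypothetical survey responses, not the authors' data or code, and is only meant to show how such a fraction is computed.

```python
import numpy as np

# Hypothetical responses to one binary cultural-norm question (1 = endorses the norm),
# split by group. Illustrative numbers only, not the study's dataset.
groups = {
    "group_A": np.array([1, 1, 1, 0, 1, 1, 1, 0, 1, 1]),
    "group_B": np.array([0, 0, 1, 0, 0, 1, 0, 0, 0, 1]),
}

def between_group_share(groups):
    """Fraction of total trait variance that lies between groups (population variance)."""
    all_vals = np.concatenate(list(groups.values()))
    grand_mean = all_vals.mean()
    n_total = all_vals.size
    # Between-group component: each group's size times the squared deviation of its mean.
    between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups.values()) / n_total
    total = all_vals.var()  # equals between-group plus within-group variance
    return between / total

print(f"Share of variation between groups: {between_group_share(groups):.2f}")  # ~0.25 here
```

In the study's terms, a value in the 0.10-0.20 range for cultural traits, against well under 0.01 for genes, is what signals room for selection to act on cultural groups.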

Next, they examined who people cooperate with and found that cooperation is indeed directed towards cultural ingroup members. People feel obliged to cooperate with strangers, as long as those strangers share their cultural values, beliefs and norms. Such culturally parochial cooperation is to be expected if competition between cultural groups influenced the evolution of cooperation.

"This study is unique as an empirical field test of cultural group selection by examining how cooperation operates between and within four distinct ethnolinguistic groups," said Handley. "Taking a step back, it may also contribute to a reframing of conflict discourse in small-scale societies where cooperative action has been severely underrepresented and 'illegitimate' violence is regarded as the status quo."

The findings caution that although humans are hyper-cooperative, our evolved cooperative dispositions are still limited in their scale, which makes solving global-scale problems challenging. Innovative thinking is necessary if we are to save ourselves from climate change, pandemics, and of course, aliens.

Credit: 
Arizona State University

Publicly sharing a goal could help you persist after hitting failure

BINGHAMTON, N.Y. -- Publicly sharing a goal may help you persist after hitting a failure, but only if you care about what others think of you, according to new research from Binghamton University, State University of New York.

However, public announcements, such as Facebook posts about New Year's resolutions or weight loss targets, may only be motivating when there is immediate feedback after a failure and if there is a high incentive in reaching a goal.

"Everyone sets goals, and some people choose to make those goals public instead of keeping them private. Everyone also fails to meet goals from time to time," says Jenny Jiao, an assistant professor of marketing at Binghamton University's School of Management. "We were interested in finding out what happens after a failure."

Working with Catherine Cole, a professor of marketing at the University of Iowa's Tippie College of Business, Jiao studied the effects of goal publicity, failure feedback and goal incentives on goal persistence across three different studies.

In each study, subjects completed a task, learned that they had failed, and were then given another opportunity to complete the task, with variations to control for each of the effects the researchers wanted to test.

"When you hit a failure, virtually all of the effort you've put into your goal is now a sunk cost. You can't go back and try to fix what you've already done. You now only have two options - give up or keep trying," says Jiao.

The effect of goal publicity

Researchers found that publicly announcing your goal only affects those who care about what others think about them.

"If your public reputation is something you hold in high regard, then failing publicly is probably going to push you to not want to fail publicly again. There is a greater chance you're going to try hitting that goal again." says Jiao. 

However, Jiao says people who do not care too much about public perception aren't affected by the public or private nature of a goal after hitting failure.

"If you don't care as much about your reputation, then it's not going to matter if people know about your failure or not," she says. "If they think you are a failure, that is not going to bother you as much."

The effect of failure feedback

Researchers found that feedback plays an important role in goal persistence after a failure.

Jiao says those who receive feedback immediately after a failure are more likely to continue pursuing a goal than those who receive no feedback or delayed feedback.

"If someone gives you immediate feedback, you then start thinking about what you could've done better," says Jiao. "If that feedback is delayed, then you've probably found ways to justify your failure, and you're less likely to pick your goal back up."

The effect of incentives

Researchers found that incentives need to be perceived as high for someone to continue working towards a goal after hitting a failure.

"You may reassess a goal after failing and realize that it may not be worth the effort," says Jiao. "However, if there is a reward that you perceive as being very valuable, it's going to keep pushing you towards reaching that goal."

As a marketing researcher, Jiao hopes the study will help marketers understand what drives consumers to keep working towards a goal after hitting a failure. She says the implications could help companies understand how to structure rewards programs or loyalty cards.

"Some companies spend millions on marketing campaigns that encourage consumers to post goals on social media. This research shows that it's only effective for certain kinds of consumers," says Jiao. "It also shows the importance of immediate feedback and incentives if you want consumers to continue working towards those goals."

Credit: 
Binghamton University

Study identifies interaction site for serotonin type 3A and RIC-3 chaperone

image: TTUHSC's Michaela Jansen, Pharm.D., Ph.D., completed a study that looked at the receptor dysfunction associated with Alzheimer's disease, Parkinson's disease and several other serious neurological disorders.

Image: 
TTUHSC

To address the receptor dysfunction associated with several serious neurological diseases, Michaela Jansen, Pharm.D., Ph.D., from the Texas Tech University Health Sciences Center School of Medicine recently completed a study that provides novel insights into a protein-protein interaction that may one day lead to more effective treatments for these disorders. The study, "Delineating the site of interaction of the 5-HT3A receptor with the chaperone protein RIC-3," was recently published in Biophysical Journal.

Serotonin type 3A (5-HT3A) is a member of the protein superfamily known as pentameric ligand-gated ion channels (pLGIC). These channels, primarily located within the central and peripheral nervous systems, act as neurotransmitter receptors, a type of receptor that binds with neurotransmitters rather than other molecules and produces an electrical signal by managing ion channel activity. When they don't function properly, these proteins have been linked to Alzheimer's disease, Parkinson's disease, epilepsy, schizophrenia, alcohol addiction and myasthenia gravis, a chronic autoimmune disease that causes certain muscles to weaken.

Members of the pLGIC superfamily are assembled from five subunits, each of which consists of three domains: the extracellular domain (ECD), the transmembrane domain (TMD) and the intracellular domain (ICD).

In previous research, Jansen and her team showed that the ICD of 5-HT3A interacts with a chaperone protein known as Resistance to Inhibitors of Cholinesterase 3 (RIC-3). Chaperone proteins like RIC-3 help the subunits of pLGIC proteins, like 5-HT3A, assemble and function properly.

"For this study, we specifically looked at the serotonin-gated ion channel; it's a good model system because you have five times the same subunit within one channel, which makes it somewhat easier to study" Jansen said. "Clinically, it's important for drugs that, for example, are used to treat very severe nausea and vomiting during cancer treatment with chemotherapeutic agents. So we use this receptor a lot as a model system."

The ICD for 5-HT3A consists of 115 amino acids linked together in a peptide chain. Though her team had demonstrated previously that the ICD is required and sufficient for the chaperone protein to act, they didn't know which segment of amino acids along the ICD chain supported the interaction between the receptor protein 5-HT3A and the RIC-3 chaperone protein.

"With this study, we show that the very first segment, which consists of 24 amino acids, is essentially all that's needed for the interaction," Jansen said. "Interestingly, this segment contains a short alpha helix that we think is conserved across other members of the ion channel super family, so this will help us to apply what we learned here to several related channels."

Specifically, Jansen said, her team investigated which sites of the 5-HT3A and the RIC-3 have to fit together so that the machinery that leads to assembly can work. This is important because the number of receptors in the brain is disturbed in some diseases.

For example, the number of channels is often altered in Alzheimer's disease, so understanding how this protein-protein assembly works could help researchers design drugs that mimic the interaction. Jansen believes this could help correct, in a pharmacological way, the receptor numbers in the brain.

"This is important because if you know this part, and the structure of it is known, then you could say, 'OK, let's make a drug that binds to the surface of the segment,'" Jansen said. "This can help us with regulating receptor numbers for Alzheimer's disease; you have the lock, now you can design the key for it because you know the structure of this segment. This is what is needed for structure-guided drug design; you can conceptualize a small drug-like molecule and then investigate if that works to interfere with processes that are not functioning in certain disease states."

Having narrowed down the important role 5-HT3A plays in this protein-protein interaction, Jansen and her team will go back and similarly investigate RIC-3.

"When we do that, we'll more fully understand the two detailed parts that need to interact," Jansen said. "I think that would complement this study and be a good step forward."

Credit: 
Texas Tech University Health Sciences Center

General anesthesia in cesarean deliveries increases odds of postpartum depression by 54 percent

February 4, 2020 -- A new study shows that having general anesthesia in a cesarean delivery is linked with significantly increased odds of severe postpartum depression requiring hospitalization, thoughts of suicide or self-inflicted injury. The findings from research conducted at Columbia University Mailman School of Public Health and Columbia University Irving Medical Center are published online in Anesthesia and Analgesia, the journal of the International Anesthesia Research Society.

The study is the first to examine the effect of the mode of anesthesia for cesarean delivery on the risk of postpartum depression (PPD) and the possible protective effect of having regional anesthesia for cesarean delivery on maternal mental health compared with general anesthesia.

Postpartum depression in the U.S. has increased seven-fold in the past 15 years, and it now affects up to 1 in 7 women, yielding about 550,000 annual new cases.

"General anesthesia for cesarean delivery may increase the risk of postpartum depression because it delays the initiation of mother to infant skin-to-skin interaction and breastfeeding, and often results in more acute and persistent postpartum pain," said Jean Guglielminotti, MD, PhD, in the Department of Anesthesiology and the Department of Epidemiology at Columbia Mailman School, and first author. "These situations are often coupled with a new mother's dissatisfaction with anesthesia in general, and can lead to negative mental health outcomes."

The researchers used hospital discharge records of cesarean delivery cases performed in New York State hospitals between 2006 and 2013. Of the 428,204 cesarean delivery cases included in the analysis, 34,356 women (8 percent) had general anesthesia. Severe postpartum depression requiring hospitalization was recorded in 1,158 women (about 3 per 1,000), of whom 60 percent were identified during readmission to the hospital, approximately 164 days after discharge. Compared to regional anesthesia in cesarean delivery, general anesthesia was associated with 54 percent increased odds of postpartum depression; for suicidal thoughts or self-inflicted injury, the increase in odds was 91 percent.
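For readers unfamiliar with odds ratios, the sketch below shows how a "54 percent increase in odds" corresponds to an odds ratio of roughly 1.54. The counts are hypothetical and the calculation is unadjusted, unlike the study's models; it is meant only to illustrate the arithmetic.

```python
# Hypothetical 2x2 table, for illustration only; the study reports adjusted odds
# ratios from its own models, not these raw counts.
general = {"ppd": 40, "no_ppd": 9_960}      # deliveries under general anesthesia
regional = {"ppd": 260, "no_ppd": 99_740}   # deliveries under regional anesthesia

def odds(cases, non_cases):
    """Odds of the outcome within one exposure group."""
    return cases / non_cases

odds_ratio = odds(general["ppd"], general["no_ppd"]) / odds(regional["ppd"], regional["no_ppd"])

# Prints ~1.54, i.e. "54 percent increased odds" in the article's wording.
print(f"Odds ratio: {odds_ratio:.2f}")
```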

While general anesthesia is associated with the shortest decision-to-delivery interval in the case of an emergency delivery, there is no evidence that it improves outcomes for the baby, but there is mounting evidence that there can be adverse consequences for mothers, noted Dr. Guglielminotti.

"Our findings underscore the need to avoid using general anesthesia for cesarean delivery whenever possible, and to provide mental health screening, counseling, and other follow-up services to obstetric patients exposed to general anesthesia," said co-author Guohua Li, MD, DrPH, Finster Professor of Epidemiology and Anesthesiology.

Credit: 
Columbia University's Mailman School of Public Health

Study paints picture of marijuana use in pregnant women

image: Authors Ekaterina Burduli, Celestina Barbosa-Leiker, Olivia Brooks, and Crystal Lederhos (from left to right) meet to discuss their study findings. The research team also included Michael Orr and Maria Gartstein, who are not pictured.

Image: 
Photo by Cori Kogan, Washington State University Health Sciences Spokane

As the use of marijuana is legalized in an increasing number of U.S. states, the number of people who use the drug daily is on the rise. This upward trend also holds among women who are pregnant or breastfeeding, despite evidence that using marijuana could harm their babies.

Published in the Journal of Addiction Medicine, findings from a study conducted by a team of researchers at Washington State University Health Sciences Spokane delve deeper into pregnant women's use of marijuana, providing key insights that will help inform patient education efforts. Their study was conducted in Washington State and is the first study of its kind conducted in a state where marijuana is legal.

"We don't have all the research, but there's enough there to warrant saying that you should not use marijuana while pregnant," said lead author Celestina Barbosa-Leiker, citing prior research that suggests marijuana use during pregnancy is associated with increased risk of low birth weight, still birth, and cognitive and behavioral issues. "Yet, there's a group of women who are using marijuana that have these other chronic conditions, and we need to help them manage those."

An associate professor in the WSU College of Nursing and the vice chancellor for research at WSU Health Sciences Spokane, Barbosa-Leiker said that, based on their findings, pregnant women's health care providers should more fully consider patient history and engage in a shared decision-making process with them about their marijuana use. She suggested health care providers adopt a harm reduction approach focused on limiting marijuana use, rather than asking women to go cold turkey. In addition, she said it's important to train all health care staff to interact with patients in a way that minimizes stigma.

"Many of the moms we spoke to reported feeling incredibly stigmatized as soon as they reported that they were using marijuana," Barbosa-Leiker said. "The worst thing that could happen is that one of these moms feels so uncomfortable that she doesn't come back for prenatal care, which is detrimental to the health of the baby."

Five themes emerge from study

As part of their study, the WSU research team conducted personal interviews with 14 pregnant women and 5 women who had given birth within the past three months, all of whom reported using marijuana daily while pregnant. They asked the women questions related to their perceptions of the risk and benefits of using marijuana during pregnancy. From the participants' responses, they identified five common themes:

Participants reported using marijuana as a way to manage their health issues, from physical issues such as nausea, pain, and difficulty sleeping to psychological issues such as stress, anxiety, and trauma. Many made this decision not just for themselves, but also for their baby. One woman reported that using marijuana was the only way she could keep food down, providing critical nourishment to her baby. Others said it helped them reduce stress and anxiety and function better as a parent.

Many carefully weighed their decision to continue marijuana use during pregnancy and reevaluated their use through each phase of the pregnancy and after giving birth. One common reason why they used marijuana was to avoid using other medications they felt were more harmful to their baby, such as opioids, anti-nausea drugs, and anti-psychotic medications.

Pregnant women are getting mixed messages from their health care providers. Mostly their providers told them to stop using marijuana, but some were asked to limit their use. A few women reported not ever being asked about marijuana use or their provider not saying anything when they disclosed it, which surprised the researchers given national guidelines that direct health care providers to counsel pregnant women about the risks of using marijuana.

All participants said they wanted more information on the safety and effects of using marijuana while pregnant. When women felt that medical providers were not giving them enough information, they sought out advice from other sources, such as budtenders.

Legal considerations appear to be driving whether or not pregnant women disclose their marijuana use to health care providers, as well as their pattern of use during pregnancy. Fear of being reported to child protective services made some women decide to stop using toward the end of their pregnancy, when test results might have exposed their marijuana use.

Barbosa-Leiker said the study completely changed her perspective.

"Going into the study, I thought that showing these women the research about how it impacts their baby would make them change their behavior," she said. "Once I heard these women's stories of going through incredibly traumatic experiences and making very brave choices to keep themselves and their babies as healthy as possible, it made me realize that we need to do a better job of knowing patients' perspectives before we try to get them to adopt healthier behaviors," she said.

In addition to providing key insights that can help health care providers better focus their patient education efforts, Barbosa-Leiker said the study also highlights the need for more research to determine the long-term effects of marijuana use during pregnancy and breastfeeding.

Credit: 
Washington State University

Children's mental health is affected by sleep duration

image: These are the different parts of the brain affected by sleep.

Image: 
University of Warwick

Depression, anxiety, impulsive behaviour and poor cognitive performance in children are affected by the amount of sleep they get, researchers from the University of Warwick have found.

Sleep states are active processes that support reorganisation of brain circuitry. This makes sleep especially important for children, whose brains are developing and reorganizing rapidly.

In the paper 'Sleep duration, brain structure, and psychiatric and cognitive problems in children,' published in the journal Molecular Psychiatry, researchers Professor Jianfeng Feng, Professor Edmund Rolls, Dr. Wei Cheng and colleagues from the University of Warwick's Department of Computer Science and Fudan University examined the relationship between sleep duration and brain structure in 11,000 children aged 9-11 from the Adolescent Brain Cognitive Development dataset.

Measures of depression, anxiety, impulsive behaviour and poor cognitive performance in the children were associated with shorter sleep duration. Moreover, the depressive problems were associated with short sleep duration one year later.

Using a big-data analysis approach, the researchers found that lower volume in brain areas including the orbitofrontal cortex, prefrontal and temporal cortex, precuneus, and supramarginal gyrus was associated with shorter sleep duration.

Professor Jianfeng Feng, from the University of Warwick's Department of Computer Science comments:

"The recommended amount of sleep for children 6 to 12 years of age is 9-12 hours. However, sleep disturbances are common among children and adolescents around the world due to the increasing demand on their time from school, increased screen time use, and sports and social activities.

A previous study showed that about 60% of adolescents in the United States receive less than eight hours of sleep on school nights.

"Our findings showed that the behaviour problems total score for children with less than 7 hours sleep was 53% higher on average and the cognitive total score was 7.8% lower on average than for children with 9-11 hours of sleep. It highlights the importance of enough sleep in both cognition and mental health in children."

Professor Edmund Rolls from the University of Warwick's Department of Computer Science also commented:

"These are important associations that have been identified between sleep duration in children, brain structure, and cognitive and mental health measures, but further research is needed to discover the underlying reasons for these relationships."

Credit: 
University of Warwick

Chitosan-graft-Polyacrylamide tested as inhibitor of hydrate formation

image: The FT-IR Spectrum of chitosan-graft-polyacrylamide.

Image: 
Kazan Federal University

Currently, 90% of the hydrocarbon resources of the entire continental shelf of Russia are concentrated in the Arctic, including 70% on the shelf of the Barents and Kara Seas. Scientists understand that the shelf holds great promise for the future, and the technological basis needed for its development should be created now.

Since the Arctic zone has rather severe conditions and low temperatures, and in the case of the shelf also high pressure, specialists face difficulties that greatly affect the efficiency of oil production; one of these problems is hydrate formation. Gas hydrates are crystalline compounds of gases and water of variable composition. They look like snow or ice and have similar physical properties. They form upon contact of gas and water under certain thermobaric conditions, and the colder the climate, the more often the problem of hydrate formation has to be dealt with.

"The main means of combating hydrate formation are thermodynamic inhibitors, but they are required in large quantities, moreover, they contain environmentally harmful substances. In contrast, our reagent is devoid of these characteristics, and it can be used in a smaller volume. The low dosage of the reagent also reduces the burden on the environment. Our reagent allows you to slow down the temporary formation of hydrates and ensure the transportation of products in the right temperature range," said Mikhail Varfolomeev, Head of Ecooil Research Unit at Kazan Federal University.

A completely new type of reagent was created by KFU scientists from natural components. It is also less toxic than many of its predecessors; until now, nobody in the world has obtained a substance with a similar structure. In effect, with this development the Kazan scientists have created their own concept for obtaining new reagents, pursuing two goals at once: effective performance and no environmental damage. Toxic reagents are significantly inferior to those created on the basis of biodegradable materials.

"At this stage, laboratory tests are being conducted. And we hope to patent the developments in the near future in order to introduce them into the industry in the future. This process requires some time, and I think that due to the fact that the development of oil and gas in the Arctic is growing every year, our developments will be really in demand," Varfolomeev noted.

In the northern latitudes, hydrates have long been a problem: if a hydrate formation mode is established in a well or a pipeline, a hydrate plug is formed, which blocks the movement of gas or oil and leads to an accident. Another old problem associated with gas hydrates in the Arctic is frozen gas hydrates in permafrost, which begin to decompose during drilling and generate gas emissions, and this complicates the drilling process and sometimes leads to accidents at wells. Moreover, the further north the drilling rigs move, the more intensive these emissions become.

Gas hydrates are a fairly complex subject to study. Their research requires high-pressure equipment, and there are not many researchers with relevant expertise in Russia. Today we can say that the reagent created by Kazan researchers really has great prospects, being an extremely important component for the development of offshore fields in the Arctic.

Credit: 
Kazan Federal University

Heart muscle cells change their energy source during heart regeneration

image: Heart muscle cells in the border zone of the zebrafish heart (green), with all cell nuclei in blue.

Image: 
Hessel Honkoop and Fabian Kruse, ©Hubrecht Institute

Researchers from the group of Jeroen Bakkers at the Hubrecht Institute (KNAW) have found that the muscle cells in the heart of zebrafish change their metabolism, the way in which they generate energy, during heart regeneration. Unlike the human heart, the zebrafish heart can regenerate after injury. Studying the mechanisms of regeneration in zebrafish hearts may help to better understand why the human heart does not regenerate, and to find ways to stimulate regeneration after a heart attack in humans in the future. The results from this study were published in the scientific journal eLife on the 4th of February.

Heart attacks are a common cause of death in the Western world. During a heart attack, coronary arteries become occluded, decreasing the supply of oxygen and nutrients to the heart muscle cells served by those arteries. These low levels of essential nutrients lead to the death of heart muscle cells, which are cleared by the immune system and replaced by a permanent scar to heal the injury. The scar, although good for stabilizing the injured heart, is dysfunctional and leads to a loss of cardiac function. This results in a lower quality of life for affected patients and makes them more likely to die as a result of heart failure.

Border zone

Whereas mammals, such as humans, are unable to repair their heart with functional muscle cells after injury, several other animal species have the remarkable capacity to do this through a process called regeneration. One of these species is the zebrafish. As there are no stem cells present in the heart that could replenish the heart muscle cells, zebrafish use an alternative strategy to restore cardiac function. Surviving heart muscle cells in a region close to the injury site (the so-called border zone) are able to divide and repopulate the injured part of the heart with new muscle cells, replacing the initially formed scar within 3-6 months after injury. How zebrafish achieve this remains largely unknown but holds great potential to help human hearts regenerate after a heart attack.

Metabolism

To learn more about the processes underlying the successful regeneration of the heart in zebrafish, the researchers used a technique called single cell RNA sequencing on the heart muscle cells in the border zone. With this technique, they were able to investigate the activity of all possible genes in individual heart muscle cells in the border zone. The activity of genes told the researchers which processes occur in these individual cells that enable them to divide and thereby generate new heart muscle cells. The researchers discovered that the heart muscle cells in the border zone that start dividing are very similar to embryonic heart muscle cells. “A striking difference between the border zone heart muscle cells and the other heart muscle cells further away from the injury was that they had completely altered their metabolism,” says Hessel Honkoop (Hubrecht Institute). Instead of using fatty acids as a main energy source, the border zone heart muscle cells relied on sugars, in a pathway called glycolysis, to produce energy and building blocks for their cell division. This switch in energy source turned out to be very important for heart regeneration, since blocking glycolysis severely impaired the ability of the heart muscle cells to divide.

Dividing heart muscle cells

To investigate how this finding could ultimately lead to benefits for patients, the researchers looked at the mechanisms involved in this switch in metabolism. They found out that a signaling pathway in which the proteins Nrg1 and ErbB2 are involved can induce the metabolic switch during zebrafish heart regeneration. Increasing the activity of these proteins stimulates the heart muscle cells to divide in both zebrafish and mice, even without an injury in the heart. The researchers noticed that the increased cell division induced by Nrg1 and ErbB2 in mice was accompanied by the same metabolic switch they found in the zebrafish during heart regeneration. “We then blocked this metabolic switch in the dividing mouse heart muscle cells that were induced by Nrg1 and ErbB2,” says Honkoop. “When we saw that this also blocks division in these uninjured mouse heart muscle cells, we realized that the knowledge we were obtaining from the zebrafish is universal to other species.” Thereby, the researchers show that the zebrafish is a valuable tool to better understand heart regeneration and that a change in metabolism plays an important role in this process. More research is needed to show whether inducing this metabolic switch to glycolysis can help to restore cardiac function in people who have suffered from a heart attack.

Credit: 
Hubrecht Institute

Unlocking the secret of cell regulation

image: Researchers from the LIMES Institute of the University of Bonn in the laboratory.

Image: 
(c) Photo: Barbara Frommann/Uni Bonn

Ribonucleic acids (RNA) ensure that the blueprint in the cell nucleus is translated into vital proteins and that cell functions are regulated. However, little is known about the structure and function of particularly long RNAs, which consist of hundreds or thousands of building blocks. Chemists at the University of Bonn have now developed a new method for this purpose: They mark the complex molecules with tiny "flags" and measure the distances between them with a "molecular ruler". The results are published online in advance in the journal "Angewandte Chemie International Edition". The print version will be published shortly.

In living cells, everything follows a plan: The blueprints for all building and operating materials are stored in the cell nucleus. If, for example, a certain protein is required, the genetic information is read from the DNA and translated into ribonucleic acid (RNA). The RNA transmits the blueprint to the cell's "protein factories", the ribosomes. "However, more than 80 percent of ribonucleic acids are not involved in the production of proteins at all," says Dr. Stephanie Kath-Schorr from the LIMES Institute at the University of Bonn. This so-called "non-coding" RNA is probably involved in various regulatory processes in the cell.

Scientists would like to gain a much better understanding of the control processes that non-coding RNA is responsible for. "To do this, however, we must first understand the structures of ribonucleic acids and how they are folded," says Kath-Schorr. The spatial structure seems to have an important role in the function of RNA. It determines which molecules a certain RNA binds to and therefore triggers important processes in the cell.

A team of chemists from different institutes at the University of Bonn has now jointly developed a method to elucidate the structure and folding of particularly long RNA molecules. "Shorter RNAs can be examined using crystal structure analysis, but this method is very difficult to use when it comes to large and flexible ribonucleic acid complexes," explains first author Christof Domnick. The scientists were therefore looking for a new way to create RNAs consisting of several hundred or even thousands of building blocks.

"Flags" for marking

The scientists led by Dr. Stephanie Kath-Schorr first inserted two artificial letters, which do not occur in this form in nature, into a DNA sequence. In the subsequent transcription into RNA, these artificial letters served as a kind of "flag" to mark specific locations on the ribonucleic acid, which comprises several hundred building blocks.

The researchers used the PELDOR (pulsed electron-electron double resonance) method to measure the positions of the labels on the RNA. "The distance between the 'flags' can then be measured as if with a ruler at the molecular level," says Prof. Dr. Olav Schiemann from the Institute of Physical and Theoretical Chemistry at the University of Bonn. The markers can be placed at different locations on the ribonucleic acid and the distance between these flags can then be determined. This data is used to create an image of the structure and folding of the RNA.

"We have previously experimented with shorter RNAs and compared the results with theoretical simulations," said Kath-Schorr. "The correlation was very high and the method is therefore reliable." In future, the structure of long RNAs could also be recorded in three dimensions if the labeled ribonucleic acids are recorded from different perspectives.

Great application potential

"Our long-term goal is to measure RNA structures directly in the cell," said the biochemist. "But that's still a long way off." The basic method has great potential for application. For example, RNAs serve as important markers in cancer diagnostics. Kath-Schorr: "Our new method for the structure elucidation of long, non-coding ribonucleic acids can make an important contribution to a better understanding of cellular processes."

Credit: 
University of Bonn

New roles for DNA-packaging proteins

image: (A) Cartoon depiction of chromatin packaged into compact heterochromatin and loosely packed euchromatin. (B) The segregation of heterochromatin and euchromatin is mediated by liquid-liquid phase separation with linker histone H1. (C) Two-color fluorescent microscopy images of histone H1 and HP1α (a heterochromatin marker protein) showing co-localization of H1 and heterochromatin in liquid-like droplets within HeLa nuclei.

Image: 
IBS

How can human cells pack 3-meter-long DNA into their tiny nuclei and unpack it only where and when it is needed? This fascinating process is far from being completely understood.

Researchers at the Center for Soft and Living Matter, within the Institute for Basic Science (IBS, South Korea), made a significant advancement in our understanding of the organization of genomic DNA within the nucleus. These findings, published in Biophysical Journal, describe a new function of histones, which are proteins responsible for packaging genomic DNA and are involved in gene regulation.

DNA wraps around a complex of four core histones (H2A, H2B, H3, and H4), forming a structure known as the nucleosome core particle (NCP). Another histone, the linker histone H1, binds outside of the NCP and links the NCPs together to form a tightly packed structure. This packaging, however, varies during the lifetime of the cell. Cells capable of dividing spend most of their time in interphase, during which they are actively producing proteins and duplicating their DNA before undergoing cell division. During this stage, DNA in complex with histone proteins appears in a "loosened up" state. Its organization, however, is not as random as it might look: specific genes group together depending on their function and occupy specific locations within the nucleus. Genes that are active are grouped together in loosely packed domains called euchromatin, while genes that are silenced are partitioned into densely packed domains called heterochromatin.

Recently, a concept has emerged that formation of heterochromatin is driven by liquid-liquid phase separation (LLPS). LLPS is a physical phenomenon observed in vitro when oppositely charged polymers, mixed in solution, partition into droplets that have a high concentration of both polymers. In the last decade, this phenomenon has excited biologists as it is becoming more and more evident that LLPS plays an important role in forming membrane-less compartments as a way to organize the cell interior. The team realized that histones share many characteristics with other proteins that had been shown to undergo LLPS, namely they contain extended intrinsically disordered regions and have an excess of charged residues.

The researchers discovered that the linker histone H1 is involved in LLPS of heterochromatin in the human cell nucleus. "About three years ago, we had started experimenting with histones and DNA oligomers in vitro and were very excited to find that the DNA formed droplets with a mixture of histones, and that the histones were highly mobile within the droplets. In particular, H1 alone was the best at forming liquid droplets," says Anisha Shakya, lead and corresponding author of the study. "This seeded our interests to study H1 in cells; to test the hypothesis that H1 undergoes LLPS in cells, in particular in the context of heterochromatin formation, where it is found in elevated concentrations."

By culturing HeLa cells expressing fluorescently tagged H1, the research team imaged the spatial and temporal distribution of this protein during the interphase of the cell cycle. Fluorescence microscopy showed H1 localized in small puncta in the nucleus that would occasionally merge with each other: a sign that they are liquid-like. Other experiments confirmed that DNA was densely packed within these droplets and that they were enriched in the protein HP1α, indicating heterochromatin regions. Interestingly, in vitro, H1 was found to favorably undergo LLPS with nucleosomes of increasing lengths. This was unique to H1, showing that H1 could incorporate large segments of DNA within the droplets.

Another important finding was that out of the 4 core histones, only H2A formed droplets with DNA or nucleosomes. "The unique behavior of H2A was striking, as it meant that LLPS is not solely dependent on how much charge or structural disorder a protein has. Our results indicate that finer details of the protein structure, such as sequence and charge patterning, must be considered to predict phase behavior," says John T. King, co-corresponding author of the study. The demonstration that a core histone undergoes LLPS opens a new avenue to explore how LLPS could play a role in gene regulation and epigenetics, as histones are heavily modified by the cell to control gene expression.

The research team also explored the role of nucleotide triphosphates (NTPs) in facilitating LLPS. In particular, adenosine triphosphate (ATP), one of the cell's fuels, has been shown to contribute to the solubility of proteins and can facilitate LLPS. Indeed, the team demonstrated the essential role of ATP in promoting the formation of H1-chromatin droplets.

This work describes a new function of the histone proteins: driving LLPS with chromatin to segregate heterochromatin from euchromatin. We can imagine that many biological processes that control expression levels and biochemistry of these proteins can affect LLPS, and vice versa.

Credit: 
Institute for Basic Science

Authentic behavior at work leads to greater productivity, study shows

image: Chris Rosen, University of Arkansas

Image: 
University Relations / University of Arkansas

Matching behavior with the way you feel - in other words, not faking it - is more productive at work and leads to other benefits, according to a new study co-authored by Chris Rosen, management professor in the Sam M. Walton College of Business at the University of Arkansas.

Rosen helped design and write a study led by Allison Gabriel, associate professor of management and organizations at the University of Arizona. They published their findings in the Journal of Applied Psychology.

"We found that people who put forth effort to display positive emotions towards others at work - versus faking their feelings - receive higher levels of support and trust from co-workers," Rosen said. "These people also reported significantly higher levels of progress on work goals likely due to the support they received."

From surveys of more than 2,500 working adults in a variety of industries, including education, manufacturing, engineering and financial services, the researchers analyzed two types of emotion regulation people use at work: surface acting and deep acting. Surface acting involves faking positive emotions when interacting with others in the work environment. One might be frustrated or angry on the inside, but the external appearance disguises those feelings. Deep acting involves trying to change how one feels internally. With deep acting, individuals try to feel more positively in order to be more pleasant when interacting with others.

The researchers wanted to know if people regulated their emotions when interacting with co-workers, and, if so, why they chose to do this if there were no formal rules requiring them to do so. And then, what benefits, if any, did they receive from this effort?

The researchers identified four types of people who regulate their emotions with co-workers. Nonactors engage in negligible levels of surface and deep acting, low actors display slightly higher surface and deep acting, deep actors exhibit the highest levels of deep acting and low levels of surface acting, and regulators display high levels of surface and deep acting.

Nonactors were the smallest group in each study, and the other three groups were similar in size.

Regulators were driven by "impression management," which the researchers defined as strategic motives that include gaining access to resources or looking good in front of colleagues and supervisors. Deep actors were much more likely to be motivated by "prosocial" concerns, meaning they chose to regulate their emotions with co-workers to foster positive work relationships and be courteous.

Regulators - those who mixed high levels of surface and deep acting - experienced emotional exhaustion and fatigue, the researchers found, whereas deep actors - those who relied largely on deep acting - had improved well-being.

Credit: 
University of Arkansas

Green infrastructure provides benefits that residents are willing to work for, study shows

image: Green infrastructure such as rain gardens and green roofs can provide affordable and environmentally sound ways to manage stormwater in urban areas. A new study from the University of Illinois and Reed College shows that residents are willing to help maintain those features.

Image: 
Noelwah Netusil.

URBANA, Ill. - Urban areas face increasing problems with stormwater management. Impervious surfaces on roads and buildings cause flooding, which impacts the water quality of streams, rivers and lakes. Green infrastructure, including features such as rain barrels, green roofs, rain gardens, and on-site water treatment, can provide affordable and environmentally sound ways to manage precipitation.

However, green infrastructure is challenging to maintain, because it is decentralized across a city and requires constant maintenance and upkeep. One way city management can address those challenges is to rely on volunteers to help maintain such features.

A new study from the University of Illinois and Reed College aims to estimate the value people place on improved water quality and storm management, and how much time and money they are willing to contribute to enjoy those benefits.

The researchers presented respondents in Chicago and Portland, Oregon, with a series of hypothetical scenarios that described ways to reduce flooding, improve water quality, and strengthen aquatic habitats in local rivers and streams.

"Our research indicates that these environmental goods produced by green infrastructure have significant monetary value, and that people might be willing to volunteer a significant amount of time to help provide those goods," says Amy Ando, professor of agricultural and consumer economics at U of I, and one of the study's authors.

The paper, published in the Journal of Environmental Economics and Management, used what is called a choice-experiment survey. Respondents were provided with background information about stormwater management issues, then presented with different scenarios and asked to choose between them. The study included survey responses from 334 individuals in Chicago and 351 in Portland.

Ando and co-authors Catalina Londoño Cadavid, Noelwah Netusil, and Bryan Parthum found that people are willing to make considerable contributions both in terms of time and money. For example, improved water quality is estimated to be worth about $280 a year per household. If flooding is cut in half, that benefit is estimated to be worth $300 a year. These amounts indicate how much people would be willing to pay in fees or taxes to obtain those specific benefits.
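In choice experiments of this kind, willingness to pay for a non-market benefit is typically recovered as the ratio of the benefit's estimated utility coefficient to the cost coefficient. The sketch below uses made-up coefficients chosen only to illustrate the calculation; it is not the paper's model or its estimates.

```python
# Illustrative conditional-logit coefficients (hypothetical, not the study's estimates).
# Utility: U = b_quality*water_quality_improved + b_flood*flooding_halved + b_cost*annual_fee
b_quality = 0.84   # utility gain from improved water quality
b_flood = 0.90     # utility gain from cutting flooding in half
b_cost = -0.003    # disutility per dollar of annual household fee

def willingness_to_pay(b_attribute, b_cost):
    """Marginal willingness to pay = -(attribute coefficient) / (cost coefficient)."""
    return -b_attribute / b_cost

print(f"WTP for improved water quality: ${willingness_to_pay(b_quality, b_cost):.0f} per year")
print(f"WTP for halving flooding:       ${willingness_to_pay(b_flood, b_cost):.0f} per year")
```

With these illustrative coefficients the two figures come out at roughly the $280 and $300 per year per household reported in the article.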

The study also showed that people may be willing to spend a considerable amount of time working to support these environmental features, especially if it directly benefits their local community.

"We were surprised at the large stated willingness to volunteer that people indicated," Ando says. "For example, the average respondent was willing to spend 50 hours a year on an ambitious project to restore aquatic habitat to excellent condition and water quality to be swimmable."

Comparing the results from Chicago and Portland, the researchers found little difference in monetary values; however, Portland residents were much more willing to volunteer their time for environmental benefits. Ando notes that while the research does not address why that may be the case, Portland has extensive volunteer programs in place, so there may already be a strong culture of volunteering among residents.

Ando says the surveys were designed to reduce hypothetical bias, or the likelihood that people indicate higher values than they would actually contribute. However, those techniques are developed for estimations of money spent, not for volunteer hours, so it remains uncertain if respondents would actually work as much as they indicate.

"People are used to the idea that if there is a city fee you have to pay it. But volunteering is volunteering. You can't force people to work," she notes.

Still, even if respondents exaggerate or overestimate their willingness to work, the results indicate that green infrastructure is considered important enough to spur considerable interest in contributing both time and money.

"The results of our paper seem encouraging to cities, indicating that they might well be able to put together a network of people that could help with decentralized management of green infrastructure," Ando says. "It encourages them to think about systems of harnessing energy of community volunteers to help maintain green infrastructure that's put in place to provide some of those environmental benefits."

The research also indicated that willingness to volunteer may be driven in part by the direct utility people get from volunteering in their neighborhoods. Ando says this connection is important and should be a topic for future research.

Another takeaway from the study is the value of water quality for residents.

"Often, when cities are talking about green infrastructure, they're very focused on flood reduction. That was not actually the biggest value that we found. We found evidence that people place very high values on improving habitat for aquatic creatures in urban rivers and streams, and in reducing water pollution so the rivers and streams are more usable by people who live near them," Ando states.

"One of the implications of our research is that urban water managers should be focused on providing those benefits and not just worry about flood reduction," she concludes.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Two million Americans lost health coverage/access in Trump's first year: BU study

Over the course of 2017, positive trends in insurance coverage and healthcare access from the Affordable Care Act reversed, particularly for low-income residents of states that did not expand Medicaid.

A new Boston University School of Public Health (BUSPH) study finds that two million more Americans avoided health care because of inability to pay, and/or did not have health insurance, at the end of 2017 compared to the end of 2016.

Published in the February issue of Health Affairs, the study examines the period from 2011 to 2017, showing positive trends in healthcare coverage and access following implementation of the Affordable Care Act (ACA, also known as Obamacare), and a reversal of those trends when newly-elected President Trump and Congressional Republicans began working to dismantle the ACA.

"We hear a lot about the ACA being 'undermined.' While we found the ACA isn't unravelling, there are real consequences to some of the policies that have been put in place. We see that you have these policy changes that are affecting millions of peoples' ability to get insurance, and millions of people forgoing care because they can't afford it," says Mr. Kevin Griffith, a doctoral candidate at BUSPH and the study's lead author.

Griffith and colleagues used data on a nationally-representative sample of 2.2 million U.S. residents between the ages of 18 and 64 years old from the Centers for Disease Control and Prevention (CDC) Behavioral Risk Factor Surveillance System.

The researchers note that this did not give them the ability to directly analyze the causal effects of specific policies, but the quarterly data did allow them to see that trends reversed coinciding with these changes. "This is a time when additional states are implementing Medicaid expansion, and the economy's improving, so you wouldn't traditionally think that access would be declining," Griffith says.

The researchers note several policy changes in 2017 that could have had effects immediate enough to see within the same year, such as shortened enrollment periods, cuts in advertising and navigator funding, and reductions in payments to hospitals. They also note widespread confusion during the "repeal and replace" battle, when a quarter of Americans believed the ACA had been at least partially repealed.

The researchers estimated that uninsurance rates fell by 7.1 percentage points from 2013 to 2016 before rising by 1.2 points during 2017. After a similar downward trend, they found a 1.0-percentage-point increase in adults who avoided health care because of costs in 2017.

They found that low-income residents of states that did not expand Medicaid were the hardest hit by the reversal, while those affected in expansion states were mostly middle-income residents who were eligible for the exchanges. In non-expansion states, the decrease in insurance coverage and healthcare access was four to five times greater than in expansion states.

They also found that the gap in healthcare access between higher- and lower-income people shrank from 2013 to 2016 by about 8.5 percentage points in expansion and nonexpansion states. Then, from the fourth quarter of 2016 to the fourth quarter of 2017, the gap increased by 2.6 percentage points in nonexpansion states (a relative increase of 11 percent) but continued to decrease by another 1.0 point in expansion states (a relative decrease of 8 percent).

"Medicaid expansion seemed to be a really great way for states to insulate themselves from some of the damage of these federal policies," Griffith says. "For states considering Medicaid expansion, this shows that it's a good way to take care of your residents, even regardless of what's going on in Congress."

The researchers are now looking through 2017 into 2018 and beyond, to see how federal policy changes and more states expanding Medicaid have affected these trends. Griffith says the results of the 2017-focused study are likely an indicator of worse to come.

"We had this narrowing of disparities in access and coverage, but that's reversing," he says. "Since 2017, the split between white and black, between rich and poor, urban and rural, renters and homeowners--all of these disparities are getting wider again. That's concerning."

Credit: 
Boston University School of Medicine

"Taphonomy: Dead and Fossilized" fossil-finding board game is a success in classrooms

image: A new board game developed by Jackson School of Geosciences researchers teaches key lessons about the fossilization process.

Image: 
Rowan Martindale/ The University of Texas at Austin Jackson School of Geosciences

Becoming a fossil is the ultimate game of chance.

From the manner of death, to the place where the corpse is buried, to the transformational events that follow, it's a rare occurrence for a specimen to hit a combo that will get it into the fossil record. And only the luckiest of fossils become part of a scientific collection you would see in a museum.

Drawing inspiration straight from the source material, two researchers from The University of Texas at Austin have designed their own game of chance and skill - a board game that puts students in the role of time-traveling paleontologists - to teach key concepts about how fossils form. According to a study published in the Journal of Geoscience Education, the game is a useful tool in teaching the notoriously difficult subject of taphonomy, or how dead things become fossils.

The study found that 71 percent of students thought the game helped them learn about fossilization, and 66 percent of students who played the game thought it was fun.

"Rather than learning abstract concepts from a lecture or textbook, the students learn what promotes or hinders fossilization as they encounter these factors through game play," said UT Jackson School of Geosciences Assistant Professor Rowan Martindale. "Overall, students seemed to really like it, and many preferred the game to a regular lab."

Martindale co-designed the game and co-authored the study with Anna Weiss, who earned her Ph.D. from the Jackson School in 2019 and is now a postdoctoral researcher at the University of Belize.

The game - called "Taphonomy: Dead and Fossilized" - is available for free online and can be printed out on card stock.

Games have long been a way to help teach students new concepts, but they have rarely been used in the Earth sciences, according to the study. Weiss and Martindale thought that "gamifying" concepts that are usually presented in a lecture would be a way to boost student engagement with the material.

"I wanted students to get more from it than just memorizing the different types of fossilization," Weiss said. "From there we started to develop the idea of building a board game."

The game is based on a Jurassic fossil site in Canada where Martindale has conducted research. It follows the specimens on their journey to become fossils, illustrating important lessons on the factors that influence preservation and discovery.

For the study, the researchers enlisted the help of 760 students enrolled in undergraduate geosciences courses at 20 institutions during the 2018-2019 school year. The class sizes ranged from four to 252 students.

The study collected detailed feedback on the game and presents how opinions differ by student demographics, including gender, ethnicity, academic year and major. Unsurprisingly, geosciences students and students who played a lot of board games liked the game the most. Minority students and non-STEM majors were less excited about the game.

Michael Chiappone, an undergraduate student at the Jackson School, played the board game in Martindale's "Life Through Time" geosciences class. He said the game was not only a means to discovery, but also a way for players to learn through experimentation.

"The fact that you got to choose where your fossils went, and then had to stand back and watch the consequences play out, that's experimentation, which is not something that's just paleo specific, but for science as a whole," he said.

While the game is designed with undergraduates in mind, Martindale said it is flexible and adaptable to different players and settings. Martindale and her team are currently testing an abbreviated version of the game for high school students.

"I think we have managed to make something that is both fun and educational," Martindale said. "I hope to see people using it in their classes."

Credit: 
University of Texas at Austin

Clues to how hazardous space radiation begins

image: Using data from NASA's Parker Solar Probe, UNH researchers observe the sun's released plasma and energy building up particles from solar flares - highlighting a new phase of the energization process that leads to radiation hazards.

Image: 
NASA

DURHAM, N.H.-- Scientists at the University of New Hampshire have unlocked one of the mysteries of how particles from flares on the sun accumulate at early stages in the energization of hazardous radiation that is harmful to astronauts, satellites and electronic equipment in space. Using data obtained by NASA's Parker Solar Probe (PSP), researchers observed one of the largest events so far during the mission. These observations show how plasma that is released after a solar flare--a sudden flash of increased brightness--can accelerate and pile up energetic particles generating dangerous radiation conditions.

"We're getting some of the earliest observations from this mission to the sun on how the coronal mass ejection--the sun's release of plasma and energy--builds up particles released after solar flare events," said Nathan Schwadron, professor of physics in UNH's Space Science Center. "Because energetic particles are accelerated near the sun, by flying closer and getting a better look we are able to observe the beginning of the energization process and see them actually start to pile up like snow that piles up in front of a snowplow. Instead of an actual snowplow, it is the coronal mass ejections released from the sun that cause the buildup of this material in space."

In the study, recently published in The Astrophysical Journal Supplement Series, the researchers observed solar energetic particle events from April 18, 2019, to April 24, 2019, when two active regions near the sun's equator became highly unstable, releasing a number of flares followed by coronal mass ejections (CMEs). Scientists saw how the complex interplay between the flares, particle populations and CMEs caused the pre-accelerated particles created by these solar events to become trapped and pile up. The study highlights a new phase of the energization process that is critical for the formation of radiation hazards.

"We have known that these high-energy particles are energized in this region, but the missing link was how these particles buildup in the fronts of coronal mass ejections," said Schwadron. "It's like imagining a room filled with bouncing tennis balls and asking how did they get there? The particles become so highly energized that they move at almost the speed of light and, as a result, can pose hazards in the form of harmful radiation that cause health issues for astronauts and damage electronic equipment in space."

The PSP, which was launched by NASA in 2018, is on a seven-year mission to learn more about the sun, the solar wind and the origin of the energetic particles that can pose radiation hazards. The probe will fly to within 4 million miles of the sun's surface, closer than any previous spacecraft has flown, and will face formidable heat in the corona--the sun's atmosphere--to help scientists improve the forecasts for space weather that affect life on Earth. Onboard PSP, the Integrated Science Investigation of the Sun instrument suite is specially designed to measure the near-sun energetic particle environment and includes two instruments that were designed based on previous missions in which UNH was involved.

The University of New Hampshire inspires innovation and transforms lives in our state, nation, and world. More than 16,000 students from all 50 states and 71 countries engage with an award-winning faculty in top-ranked programs in business, engineering, law, health and human services, liberal arts and the sciences across more than 200 programs of study. As one of the nation's highest-performing research universities, UNH partners with NASA, NOAA, NSF and NIH, and receives more than $110 million in competitive external funding every year to further explore and define the frontiers of land, sea and space.

Credit: 
University of New Hampshire