
It pays to be nice to your employees, new study shows


BINGHAMTON, N.Y. - Want the best results out of your employees? Then be nice to them.

New research from Binghamton University, State University of New York finds that showing compassion to subordinates almost always pays off, especially when combined with the enforcement of clear goals and benchmarks.

"Being benevolent is important because it can change the perception your followers have of you," said Chou-Yu Tsai, an assistant professor of management at Binghamton University's School of Management. "If you feel that your leader or boss actually cares about you, you may feel more serious about the work you do for them."

Tsai and his fellow researchers wanted to determine how both the presence and lack of benevolence affects the job performance of followers.

Tsai partnered with Binghamton University colleagues Shelley Dionne, professor and associate dean of the School of Management, and Francis Yammarino, distinguished professor, as well as An-Chih Wang of China Europe International Business School, Seth Spain of Concordia University, Hsiao-Chi Ling of Kainan University, Min-Ping Huang of Yuan Ze University, Li-Fang Chou of National Cheng Kung University and Bor-Shiuan Cheng of National Taiwan University for the research.

They surveyed nearly 1,000 members of the Taiwanese military and almost 200 adults working full-time in the United States, and looked at the subordinate performance that resulted from three different leadership styles:

Authoritarianism-dominant leadership: Leaders who assert absolute authority and control, focused mostly on completing tasks at all costs with little consideration of the well-being of subordinates.

Benevolence-dominant leadership: Leaders whose primary concern is the personal or familial well-being of subordinates. These leaders want followers to feel supported and have strong social ties.

Classical paternalistic leadership: A leadership style that combines both authoritarianism and benevolence, with a strong focus on both task completion and the well-being of subordinates.

The researchers found that authoritarianism-dominant leadership almost always had negative results on job performance, while benevolence-dominant leadership almost always had a positive impact on job performance. In other words, showing no compassion to your employees doesn't bode well for their job performance, while showing compassion motivated them to be better workers.

They also found that classical paternalistic leadership, which combines both benevolence and authoritarianism, had just as strong an effect on subordinate performance as benevolence-dominant leadership. Tsai said the reason for this phenomenon may extend all the way back to childhood.

"The parent and child relationship is the first leader-follower relationship that people experience. It can become a bit of a prototype of what we expect out of leadership going forward, and the paternalistic leadership style kind of resembles that of a parent," Tsai said.

"The findings imply that showing personal and familial support for employees is a critical part of the leader-follower relationship. While the importance of establishing structure and setting expectations is important for leaders, and arguably parents, help and guidance from the leader in developing social ties and support networks for a follower can be a powerful factor in their job performance," Dionne said.

Because of the difference in work cultures between U.S. employees and members of the Taiwanese military, researchers were surprised that the results were consistent across both groups.

"The consistency in the results across different cultures and different job types is fascinating. It suggests that the effectiveness of paternalistic leadership may be more broad-based than previously thought, and it may be all about how people respond to leaders and not about where they live or the type of work they do," Yammarino said.

Tsai said his main takeaway for managers is to put just as much, or even more, emphasis on the well-being of their employees as on hitting targets and goals.

"Subordinates and employees are not tools or machines that you can just use. They are human beings and deserve to be treated with respect," said Tsai. "Make sure you are focusing on their well-being and helping them find the support they need, while also being clear about what your expectations and priorities are. This is a work-based version of 'tough love' often seen in parent-child relationships."

Credit: 
Binghamton University

New study finds unexpected link between immune cells and male/female differences

Researchers at the University of Maryland School of Medicine (UMSOM) have made a surprising discovery: during fetal development, a particular immune cell seems to play a key role in determining the male or female characteristics of the brain.

"This a totally new discovery," says Margaret McCarthy, professor and chairman of the UMSOM Department of Pharmacology. "Prior to this, we didn't know that these cells played a role in this process at all."

The study was published today in the Journal of Neuroscience.

Prof. McCarthy and her colleagues studied immune cells known as mast cells, which originate in the bone marrow but are found on body surfaces such as the skin, mouth, nose and eyes. They are also found on the outer surface of the brain, in a membrane known as the meninges. When activated, mast cells release a range of signaling molecules, including serotonin, histamine and other inflammatory substances. In general, they act as triggers for other immune system cells to respond to an injury or threat to the body.

"Mast cells are basically a signaling system, they release these substances, which signal to other immune cells to come and help out," says Prof. McCarthy.

At the same time, they also exist, in small numbers, in a specific area of the brain known as the preoptic area. The preoptic area contributes to the control of sexual motivation and parenting behavior, basic behaviors that occur in nearly all species. During development, between 10 and 70 mast cells exist in this area. This study found that in males there are more of the cells in this area than in females, typically about twice as many, and they are more actively releasing their signaling molecules, in particular histamine.

Surprisingly, the histamine released by the mast cells in males signals to another type of immune cell, microglia, instructing them to make prostaglandins, another inflammatory signaling molecule. In previous research, Prof. McCarthy's lab showed how prostaglandins induce the development of neural connections in the preoptic area.

In research on rats, Prof. McCarthy and her colleagues found this crucial development occurs in the first week of postnatal development, and plays a large role in determining differences between the male and female brains. She says the findings amazed her: "This one type of cell, and a very small number of these cells, is orchestrating this complex multicellular process to permanently change the circuitry of the brain to make it different in males and females."

In many animals, including both rats and humans, certain regions of male and female brains are quite different. Imaging studies in humans suggest that females tend to have more cross-hemisphere connections, while males tend to have more connections within each hemisphere.

There are also differences in the size of certain parts of the brain. Certain parts of the hypothalamus are larger in men than women. This divergence may play a role in determining sexual orientation. In gay men, this hypothalamic region is smaller than in heterosexual men; it is typically the same size as in heterosexual women.

On the cellular level male and female brains are also sometimes different. Males tend to have more dense synaptic connections in the preoptic area, while in other areas, females have more dense synaptic connections.

Prof. McCarthy has focused much of her work on the neuroscience of sex differences. In previous research she found sex and gender differences in levels of a protein associated with language acquisition and development. This finding may be associated with higher levels of communication among females in some species.

In previous research, she had found that other immune cells, known as microglia, appear to play a role in masculinization, in part through their production of prostaglandins, a neurochemical normally associated with illness. In recent years, scientists have increasingly realized that the immune system is integral to the development of the brain.

Prof. McCarthy and her colleagues are now doing additional research on the links between the immune system and brain sex differences. They will next focus on the role of histamine, one of the chemicals released by mast cells, to discover more about precisely what role it plays in the process.

Credit: 
University of Maryland School of Medicine

For the first time, a neural link between altruism and empathy toward strangers

Giving up a kidney to a stranger requires a certain sense of selflessness, what's come to be known in social science as extraordinary altruism. University of Pennsylvania psychologist Kristin Brethel-Haurwitz wanted to understand the connection between this trait and empathy, specifically empathy for distress emotions.

Using fMRI scans, Brethel-Haurwitz and colleagues from Georgetown University discovered that these altruistic kidney donors were more sensitive to a stranger's fear and pain than a control group, with activation happening in a brain region called the anterior insula, which is key for emotions like pain and disgust. This research, published in Psychological Science, is the first to show a clear link between real-world altruism and empathy for the pain of strangers.

"This can be hard to study in a lab because it's based on self-reporting and inherently, in that process, there may be biases," says Brethel-Haurwitz, a postdoctoral fellow in Penn's Department of Psychology in the School of Arts and Sciences. "So we took this population of real-world altruists, people who have donated a kidney to a stranger, to try to better understand their empathic process."

It was important for the researchers to get at what Brethel-Haurwitz calls "pure human altruism," a selfless act taken without expectation of anything in return. Donating a kidney is costly and painful, and as such, altruistic kidney donors often get pushback, not praise, for giving their organ to someone they don't know. Also, the process is often anonymous and nonreciprocal, meaning they may never know or meet the organ recipient. These factors made the group a strong population for such work.

For this study, Brethel-Haurwitz and colleagues recruited 57 people: 29 extraordinary altruists and, as the control, 28 healthy adults who had not donated a kidney. After answering a questionnaire to determine baseline empathy, each individual was matched with a stranger as a study partner and then completed a series of 90 task trials, 30 in each of three 12-minute blocks.

During the first two blocks, the participant viewed a live video feed of her partner receiving painful pressure to her right thumbnail while researchers monitored brain activity via functional magnetic resonance imaging (fMRI). In the third block, the participant personally experienced the thumbnail pressure while fMRI tracked brain function. To differentiate between neural activity related to pain and that related to fear, each trial had a period of anticipation (half the trials in each block were "safe," meaning participants knew no thumb pressure would occur, and half were "threat" trials with the potential for pain) followed by a period during which pain was administered or omitted.

Overlaying the two resulting fMRI brain scans--one made during the altruist's pain, the other while she observed someone else in pain--provided an unmistakable link between the selflessness trait and empathy.

"Prior research of ours has shown that these donors demonstrate more neural sensitivity to distress, specifically fear, in other individuals. The amygdala was more active when they viewed photos of people in fear, but there wasn't someone actually in distress in front of them," Brethel-Haurwitz explains. "Here, when the altruists are feeling pain and watching the pain of others, the neural activity matches pretty closely."

What's more, the results confirm the researchers' theory about the role of the anterior insula, a bilateral region of the brain considered a hub of neural activity. "It's thought to be a salience detector, so, when something important is happening, it's more likely to be active," Brethel-Haurwitz explains. "It's also been shown to activate in prior studies of empathy for pain, so we hypothesized it would come up here, though we weren't as certain we would see it for fear." Enhanced self-other overlap in the anterior insula in altruists for both pain and fear suggests that this region may respond more generally to distress-related emotions.

Next Brethel-Haurwitz plans to take her research in a new direction, working with Penn professor Joseph Kable on why selfish individuals make selfish decisions.

Work with the altruistic donors will continue at Georgetown, led by Abigail Marsh, Brethel-Haurwitz's former doctoral advisor.

"It's hard to get at any pure aspect of human behavior," Brethel-Haurwitz says. "But, once you do, you get closer to a greater understanding of what happens in the brain when people take certain emotion-driven actions."

Credit: 
University of Pennsylvania

Individuals with criminal records may stay in their jobs longer

In sales and customer service positions, employees with criminal records may stay in their jobs longer and be less likely to leave, according to a study published in the IZA Journal of Labor Policy.

Researchers at Northwestern University investigated the possible relationship between having a criminal record and job performance by evaluating data from employees in sales or customer service jobs in call centres in the US. They found employees with a criminal record stayed in their roles on average 19 days longer than those who did not have a criminal record.

Deborah Weiss, the corresponding author of the study said: "In sales and customer service positions, turnover is a major labor cost. Our study found that employees with criminal records had a longer tenure and were less likely to quit their jobs voluntarily than other workers. This finding suggests that individuals with a criminal record represent an untapped productivity pool."

The authors suggest that employees with a criminal record may stay in their jobs longer because they have fewer job prospects outside of their current role.

Deborah Weiss said: "Job applicants with criminal records are much less likely than others to receive an offer of employment. Six months after release from prison, 50 to 80 percent of the formerly incarcerated remain unemployed. Some of those who are offered employment may stay longer because they have no other options and others may feel a sense of loyalty or gratitude to an employer who has given them a second chance."

The researchers also found a 34% increased chance of misconduct among employees with a criminal record in sales jobs, but not in customer service jobs, suggesting that performance and tenure for employees with a criminal record may be better in customer service roles than in sales roles. Despite this higher misconduct rate, the authors suggest that sales employees with a criminal record may still be a good investment for employers: they estimated that hiring a worker with a criminal record for a sales job increased expected theft-related costs by about $43, while saving the same employer about $746 in turnover costs on that worker.
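Read as a back-of-the-envelope calculation, those two point estimates imply a sizable expected net saving per hire. The snippet below simply restates the article's figures; it is an illustration, not part of the study's analysis.

```python
# Per-hire expectation for a sales worker with a criminal record,
# using the two point estimates quoted in the study.
extra_theft_cost = 43    # added expected theft-related cost, in USD
turnover_savings = 746   # expected saving in turnover costs, in USD

net_saving = turnover_savings - extra_theft_cost
print(f"Expected net saving per hire: ${net_saving}")  # $703
```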

Deborah Weiss said: "Finding gainful employment for individuals with a criminal background is an important public priority: without such employment, reoffending is almost inevitable. While our study may not entirely dispel employers' fears that hiring applicants with a criminal record may carry risks, our findings suggest that there are unexploited opportunities to hiring applicants with a record in a way that makes sense both on efficiency and on moral grounds."

The researchers used data on 58,977 applicants hired for sales or customer service jobs in call centres in the US, collected by a hiring consultancy from May 2008 to January 2014. The authors evaluated possible associations between having a criminal record or not having a criminal record and job performance, misconduct and time spent in the job.

The authors caution that the study only evaluated data from those working in the sales and customer service jobs, which may limit the generalizability of the results outside of these positions. The observational nature of this study does not allow for conclusions about cause and effect.

Credit: 
BMC (BioMed Central)

Regrowing dental tissue with stem cells from baby teeth

Image: Stem cells extracted from baby teeth were able to regenerate dental pulp (shown, with fluorescent labeling) in young patients who had injured one of their adult teeth. (Credit: University of Pennsylvania)

Sometimes kids trip and fall, and their teeth take the hit. Nearly half of children suffer some injury to a tooth during childhood. When that trauma affects an immature permanent tooth, it can hinder blood supply and root development, resulting in what is essentially a "dead" tooth.

Until now, the standard of care has entailed a procedure called apexification that encourages further root development, but it does not replace the lost tissue from the injury and, even in a best-case scenario, causes root development to proceed abnormally.

New results of a clinical trial, jointly led by Songtao Shi of the University of Pennsylvania and Yan Jin, Kun Xuan, and Bei Li of the Fourth Military Medical University in Xi'an, China, suggest that there is a more promising path for children with these types of injuries: using stem cells extracted from the patient's baby teeth. The work was published in the journal Science Translational Medicine.

"This treatment gives patients sensation back in their teeth. If you give them a warm or cold stimulation, they can feel it; they have living teeth again," says Shi, professor and chair in the Department of Anatomy and Cell Biology in Penn's School of Dental Medicine. "So far we have follow-up data for two, two and a half, even three years and have shown it's a safe and effective therapy."

Shi has been working for a decade to test the possibilities of dental stem cells after discovering them in his daughter's baby tooth. He and colleagues have learned more about how these dental stem cells, officially called human deciduous pulp stem cells (hDPSC), work and how they could be safely employed to regrow dental tissue, known as pulp.

The Phase I trial, conducted in China, which has a research track for clinical trials, enrolled 40 children who had each injured one of their permanent incisors and still had baby teeth. Thirty were assigned to hDPSC treatment and 10 to the control treatment, apexification.

Those who received hDPSC treatment had tissue extracted from a healthy baby tooth. The stem cells from this pulp were allowed to reproduce in a laboratory culture, and the resulting cells were implanted into the injured tooth.

Upon follow-up, the researchers found that patients who received hDPSCs had more signs than the control group of healthy root development and thicker dentin, the hard part of a tooth beneath the enamel. Blood flow increased as well.

At the time the patients were initially seen, all had little sensation in the tissue of their injured teeth. A year following the procedure, only those who received hDPSCs had regained some sensation. Examining a variety of immune-system components, the team found no evidence of safety concerns.

As further support of the treatment's efficacy, the researchers had the opportunity to directly examine the tissue of a treated tooth when the patient reinjured it and had to have it extracted. They found that the implanted stem cells regenerated different components of dental pulp, including the cells that produce dentin, connective tissue, and blood vessels.

"For me the results are very exciting," Shi says. "To see something we discovered take a step forward to potentially become a routine therapy in the clinic is gratifying."

It is, however, just a first step. While using a patient's own stem cells reduces the chances of immune rejection, it's not possible in adult patients who have lost all of their baby teeth. Shi and colleagues are beginning to test the use of allogenic stem cells, or cells donated from another person, to regenerate dental tissue in adults. They are also hoping to secure FDA approval to conduct clinical trials using hDPSCs in the United States.

Eventually, they see even broader applications of hDPSCs for treating systemic disease, such as lupus, which Shi has worked on before.

"We're really eager to see what we can do in the dental field," Shi says, "and then building on that to open up channels for systemic disease therapy."

Credit: 
University of Pennsylvania

The Lancet: Dairy consumption linked to lower rates of cardiovascular disease and mortality

Dairy consumption of around three servings per day is associated with lower rates of cardiovascular disease and mortality, compared to lower levels of consumption, according to a global observational study of over 130,000 people in 21 countries, published in The Lancet.

In addition, the study found that people who consumed three servings of whole fat dairy per day had lower rates of mortality and cardiovascular disease compared to those who consumed less than 0.5 serving of whole fat dairy per day.

The findings are consistent with previous meta-analyses of observational studies and randomised trials, but stand in contrast to current dietary guidelines which recommend consuming 2-4 servings of fat-free or low-fat dairy per day, and minimising consumption of whole-fat dairy products for cardiovascular disease prevention.

Cardiovascular disease is the leading cause of mortality worldwide. The authors conclude that the consumption of dairy should not be discouraged and should even perhaps be encouraged in low-income and middle-income countries where dairy consumption is low.

"Our findings support that consumption of dairy products might be beneficial for mortality and cardiovascular disease, especially in low-income and middle-income countries where dairy consumption is much lower than in North America or Europe," says lead author Dr Mahshid Dehghan, McMaster University, Canada.

The Prospective Urban Rural Epidemiological (PURE) study included data from 136,384 individuals aged 35-70 years in 21 countries. Dietary intakes were recorded at the start of the study using country-specific validated food questionnaires. Participants were followed up for an average of 9.1 years. During this time, there were 6,796 deaths and 5,855 major cardiovascular events.

One standard serving of dairy was equivalent to a glass of milk at 244g, a cup of yoghurt at 244g, one slice of cheese at 15g, or a teaspoon of butter at 5g.

Dairy consumption was highest in North America and Europe (368g/day or above 4 servings of total dairy per day) and lowest in south Asia, China, Africa and southeast Asia (147, 102, 91 and 37g/day respectively - less than 1 serving of total dairy per day).

Participants were grouped into four categories: no dairy (28,674 people), less than 1 serving per day (55,651), 1-2 servings per day (24,423), and over 2 servings per day (27,636).

Compared to the no intake group, the high intake group (mean intake of 3.2 servings per day) had lower rates of total mortality (3.4% vs 5.6%), non-cardiovascular mortality (2.5% vs 4%), cardiovascular mortality (0.9% vs 1.6%), major cardiovascular disease (3.5% vs 4.9%), and stroke (1.2% vs 2.9%). There was no difference in the rates of myocardial infarction between the two groups (1.9% vs 1.6%).

Among those who consumed only whole-fat dairy, higher intake (mean intake of 2.9 servings of whole fat dairy per day) was associated with lower rates of total mortality (3.3% vs 4.4%) and major cardiovascular disease (3.7% vs 5.0%), compared to those who consumed less than 0.5 servings whole-fat dairy per day.

Higher intake of milk and yoghurt (above 1 serving per day) was associated with lower rates of the composite outcome, which combines total mortality and cardiovascular disease (milk: 6.2% vs 8.7%; yoghurt: 6.5% vs 8.4%), compared to no consumption. The differences in the composite outcome for butter and cheese were not significant as intake was lower than for milk and yoghurt.

The authors say that more research into why dairy might be associated with lower levels of cardiovascular diseases is now needed. The recommendation to consume low-fat dairy is based on the presumed harms of saturated fats on a single cardiovascular risk marker (LDL cholesterol). However, evidence suggests that some saturated fats may be beneficial to cardiovascular health, and dairy products may also contain other potentially beneficial compounds, including specific amino acids, unsaturated fats, vitamin K1 and K2, calcium, magnesium, potassium, and potentially probiotics. The effect of dairy on cardiovascular health should therefore consider the net effect on health outcomes of all these elements.

Limitations include that diets were self-reported. While multiple weighted food records may be more accurate, they require extensive training, motivation, awareness and literacy, which limits their practicality for such a large long-term study. The authors also note that diet was measured at baseline, and that changes in diet may have occurred over time. However, they add that the association between milk intake at three years of follow-up and cardiovascular disease was similar to the analyses using baseline information, suggesting that repeat measurements would be unlikely to alter the findings.

Writing in a linked Comment, Jimmy Chun Yu Louie (University of Hong Kong), and Anna M Rangan (University of Sydney) conclude that dairy dietary guidelines do not need to change just yet. They write: "The results from the PURE study seem to suggest that dairy intake, especially whole-fat dairy, might be beneficial for preventing deaths and major cardiovascular diseases. However, as the authors themselves concluded, the results only suggest the "consumption of dairy products should not be discouraged and perhaps even be encouraged in low-income and middle-income countries." It is not the ultimate seal of approval for recommending whole-fat dairy over its low-fat or skimmed counterparts. Readers should be cautious, and treat this study only as yet another piece of the evidence (albeit a large one) in the literature."

Credit: 
The Lancet

UK heart failure patients twice as likely to die as their Japanese counterparts

Heart failure is common, and becoming more so as populations age. It is the primary diagnosis in more than 80,000 admissions to hospital in the UK; more than 200,000 in Japan; and more than 1 million in the US.

It's thought that cultural differences may have a role in differences in death rates for heart failure around the globe. To look at this in more detail, the researchers compared the death rates of 894 heart failure patients admitted to hospital in the UK with those of 3,781 admitted to hospital in Japan.

To compare patients with a similar severity of heart failure, the researchers looked at the risk factors associated with a heightened risk of death in patients with the condition in previously published studies.

The factors most strongly associated with the risk of death were systolic blood pressure (the amount of pressure in the arteries when the heart muscle contracts) and blood levels of sodium, urea (a measure of protein turnover and kidney function) and creatinine (a measure of kidney function).

They then compared death rates in hospital, and 1, 3, and 6 months after admission.

Although both UK and Japanese patients were of similar age, UK patients had more severe heart failure, as judged by these risk factors. They were also more likely to have ischaemic heart disease (narrowed arteries) and COPD (chronic lung disease).

UK patients were much more likely to die at all the time points measured than were Japanese patients. Much of this difference could be attributed to British patients being sicker at the time of admission. The threshold for hospital admission in the UK seems to be higher than it is in Japan, note the researchers.

But even after accounting for observed differences in risk, British patients were more than twice as likely to have died at 6 months as patients in Japan.

This is an observational study, and as such, can't establish exactly why British patients fared so much worse than patients in Japan. Differences in the quality of care after discharge, attitudes to medical advice and taking medicines, lifestyle, diet or genes might all have influenced outcomes, suggest the researchers.

"Explaining the differences in outcome among countries, cultures and health services might provide insights that could improve care and outcome and inform healthcare policy decisions," they conclude.

Credit: 
BMJ Group

New insight on rotavirus mechanics could lead to improved treatments

Image: An artistic rendition of the rotavirus particle dissection process, performed with atomic force microscopy. (Credit: Scixel, http://scixel.es/, under the instructions of D. Luque and P. J. de Pablo)

Researchers have provided new insight on the mechanics of a virus that causes severe diarrhea and sickness in young children, according to a report published in eLife.

The study, from the Autonomous University of Madrid, Carlos III Health Institute and National Center for Biotechnology, Spain, could open up new avenues for developing effective treatments for rotavirus, which commonly infects children up to five years old. It is the first paper to detail the interplay between the function and mechanical properties of a 'multilayered' virus.

Virus particles enclose their genetic material in a protein shell designed to protect, shuttle and release the genome at the host cell. The structure of a virus particle therefore needs to be strong enough to protect the viral genome in environments outside the cell, and to withstand attacks from the host immune system, to ensure successful infection.

Many double-stranded RNA viruses, such as rotavirus, isolate their genome within a core shell that incorporates its own molecular machinery to allow the genome to replicate and spread. Some viruses take this a step further and build extra concentric protein layers that function in other ways, such as to help bind and penetrate their target cells.

"The complete particle of rotavirus is formed by three independent protein shells. This particle and the subviral particles containing one or two protein layers play distinct roles during infection," explains lead author Manuel Jiménez-Zaragoza, Research Assistant in the Department of Physics of Condensed Matter at the Autonomous University of Madrid. "We wanted to see how the interactions between the layers that define these different particles work together during the virus replication cycle."

Although previous studies have revealed how to purify two-layer protein particles, the authors of the current work have developed a novel way to purify single-layer particles, allowing them to be studied individually. After purifying these subviral particles, the team used a scanning probe system called atomic force microscopy, which involves using a small, sharp stylus to deform the virus particles. This allowed them to study the strength and stability of individual triple, double and single-layered particles.
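For context, the key quantity such indentation experiments usually extract is a stiffness, taken from the slope of the force-deformation curve in the linear regime. This is standard atomic force microscopy practice rather than a detail reported in the paper:

```latex
k_{\mathrm{shell}} = \frac{F}{\delta}
```

where F is the force applied by the stylus and delta is the resulting indentation; the stiffness, together with the force at which a shell breaks, characterizes the strength and stability of each particle type.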

They discovered a strong interaction between the external and middle layers, which they say is critical for the protection of the complete virus particle. Meanwhile, the interactions between the middle and inner layers help the virus copy its genome into messenger RNA within host cells, a process known as transcription.

"Our findings reveal how the biophysical properties of the three protein shells are fine-tuned to enable rotavirus to be carried among host cells," says senior author Pedro de Pablo, Associate Professor at the Autonomous University of Madrid. "We believe this could prove valuable in offering new venues for the development of novel antiviral strategies."

Credit: 
eLife

Stress linked to more advanced disease in some leukemia patients

Patients with chronic lymphocytic leukemia (CLL) who feel more stress also have more cancer cells in their blood and elevated levels of three other markers of more advanced disease.

A new study of 96 patients is the first to link stress with biological disease markers in patients with CLL.

"All four variables we measured are related to prognosis in CLL patients, so they have a lot of relevance," said Barbara L. Andersen, lead author of the study and professor of psychology at The Ohio State University.

"It's more evidence of the importance of managing stress in cancer patients."

The study appeared Aug. 1 in the journal Cancer.

CLL is the most common leukemia in adults, and accounts for about one-third of adult leukemia cases in the United States.

The study involved patients who were entering a trial at Ohio State's Arthur G. James Cancer Hospital for ibrutinib, now approved by the U.S. Food and Drug Administration. At the time of the study, the drug was in early trials to treat the disease. Data collection was done before patients received the first dose.

All patients completed a survey that measured their cancer-related stress. They were asked questions like how often they had intrusive thoughts about their cancer, how often they tried to avoid thinking about it and how often they felt jumpy and easily startled.

The researchers took blood samples and calculated the absolute lymphocyte count (ALC), a measure of healthy and malignant cells circulating in the blood. This measure is often elevated in patients with CLL and is used as a marker of disease severity. They also measured levels of eight different cytokines, proteins involved in the body's immune response. All of these cytokines can promote unhealthy levels of inflammation in patients with cancer.

Results showed that more stress in the patients was associated with a higher number of circulating cancerous cells and higher levels of three cytokines: tumor necrosis factor alpha, interleukin 16 and chemokine ligand 3 (CCL3).

CCL3 is a particular kind of cytokine called a chemokine. It helps facilitate the development of CLL cells in places like the spleen and lymph nodes, where leukemia cells are produced.

"Chemokines have not been used in studies like this before and it is a novel way of checking for the link between stress and disease," Andersen said.

Stress was linked to disease severity even after the researchers took into account several other important factors that also play a role in disease progression, including gender, the number of prior treatments and the presence of a genetic marker (del17p) that is associated with harder-to-treat CLL.

"The fact that stress shows an effect on CLL even after we controlled for other factors suggests it may be relevant to the course of CLL," Andersen said.

Why did the other five cytokines the researchers studied not show an effect in this study?

Andersen noted that this was the first study of its kind done with leukemia patients. Many of the other cytokines have been found to have effects in solid tumors and might not work the same way in blood cancers.

The researchers are continuing to follow these patients and will examine the relationship between stress and these same responses throughout treatment, Andersen said.

Credit: 
Ohio State University

Coral bleaching increases disease risk in threatened species

Image: Staghorn corals grown in Mote Marine Laboratory's underwater nursery in the Florida Keys, US. (Credit: Conor Goulding/Mote Marine Laboratory)

Bleaching events caused by rising water temperatures could increase mortality among a coral species already threatened by disease, says new research by Mote Marine Laboratory and Penn State, US, published in eLife.

The study on the species Acropora cervicornis, known as the staghorn coral, emphasizes the need for maintaining genetic diversity while at the same time increasing resilience within the species, as part of restoration efforts to help prevent further loss in the Florida region.

Once prevalent throughout the Florida Reef Tract, the staghorn coral has suffered substantial declines over the last several decades due to increasing ocean temperatures and disease outbreaks, with no evidence of natural recovery. The Florida Reef Tract is currently estimated to be worth over $6 billion to the state economy, providing over 70,000 jobs and attracting millions of tourists into Florida each year - but many of these ecosystem services will be lost if the living coral is not restored.

"With imminent threats to the staghorn coral, it is now the focus of restoration efforts throughout much of the Florida region, thanks to the existence of some coral genotypes that are more resilient to threats than others," says lead author Dr. Erinn Muller, Program Manager and Science Director of the Elizabeth Moore International Center for Coral Reef Research and Restoration at Mote Marine Laboratory, Florida. "However, there could be tradeoffs associated with these resilient traits, such as heat-tolerant corals being highly susceptible to disease infection.

"Previous studies showed there are certain staghorn genotypes resistant to white band disease. However, it is still unclear how high-water temperatures caused by climate change influence disease resistance and what role, if any, the algae that live and interact with the corals - their 'algal symbionts' - play in stress resistance. We therefore wanted to see what percentage of staghorn corals within the lower Florida Keys are disease resistant, and how this resistance changes during a warm-water event that leads to coral bleaching."

To do this, Muller and her team exposed the same staghorn coral genotypes to white band-diseased tissue before and during a coral bleaching event. They found that, in the absence of bleaching, around 25% of the population tested was resistant to the disease. However, when the corals were exposed to it during the bleaching event, their mortality rate doubled.

Interestingly, the team, which included researchers from Mote Marine Laboratory and Penn State, also found that two coral genotypes were resistant to the disease even while bleached. The level of bleaching within these genotypes was not related to disease susceptibility or their algal symbiont strain, suggesting there are no direct tradeoffs between their levels of heat tolerance and disease resistance.

"While we are working on reducing carbon dioxide emissions that cause climate change and ocean warming as fast as possible, our best chance at enhancing adaptation of corals and their symbionts to their warming environments is to promote genetic diversity of coral and symbiont populations," said Iliana Baums, Associate Professor of Biology at Penn State, an expert in coral molecular ecology.

Baums developed the genetic methods used in this research to fingerprint the genetic strains of symbionts associated with each of the coral colonies. Such high-resolution fingerprinting has not yet been commonly applied in coral experiments. It allowed the team to disentangle the responses of the coral host and the symbiont genotype to the multiple stressors in the experiment.

"Together, our findings show that the staghorn coral's susceptibility to temperature stress creates an increased risk in death from disease, and that only two of the genotypes tested may maintain or gain disease resistance under high temperatures," said Muller. "As recurring warming events may cause continued loss of these resistant genotypes, it is crucial that restoration efforts focus on maintaining high genetic diversity to help keep these corals alive in a warming climate."

Credit: 
eLife

Beyond deep fakes: Transforming video content into another video's style, automatically

Image: Researchers at Carnegie Mellon University have devised a way to automatically transform the content of one video into the style of another, making it possible to transfer the facial expressions of one person to video of another person, or even a cartoon character. (Credit: Carnegie Mellon University)

PITTSBURGH-- Researchers at Carnegie Mellon University have devised a way to automatically transform the content of one video into the style of another, making it possible to transfer the facial expressions of comedian John Oliver to those of a cartoon character, or to make a daffodil bloom in much the same way a hibiscus would.

Because the data-driven method does not require human intervention, it can rapidly transform large amounts of video, making it a boon to movie production. It can also be used to convert black-and-white films to color and to create content for virtual reality experiences.

"I think there are a lot of stories to be told," said Aayush Bansal, a Ph.D. student in CMU's Robotics Institute. Film production was his primary motivation in helping devise the method, he explained, enabling movies to be produced more quickly and cheaply. "It's a tool for the artist that gives them an initial model that they can then improve," he added.

The technology also has the potential to be used for so-called "deep fakes," videos in which a person's image is inserted without permission, making it appear that the person has done or said things that are out of character, Bansal acknowledged.

"It was an eye opener to all of us in the field that such fakes would be created and have such an impact," he said. "Finding ways to detect them will be important moving forward."

Bansal will present the method today at ECCV 2018, the European Conference on Computer Vision, in Munich. His co-authors include Deva Ramanan, CMU associate professor of robotics.

Transferring content from one video to the style of another relies on artificial intelligence. In particular, a class of algorithms called generative adversarial networks (GANs) has made it easier for computers to understand how to apply the style of one image to another, especially when the two have not been carefully matched.

In a GAN, two models are created: a discriminator that learns to detect what is consistent with the style of one image or video, and a generator that learns how to create images or videos that match a certain style. When the two work competitively -- the generator trying to trick the discriminator and the discriminator scoring the effectiveness of the generator -- the system eventually learns how content can be transformed into a certain style.
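To make that competition concrete, here is a minimal, self-contained PyTorch sketch of the two-model setup the article describes. The tiny fully connected networks and random "data" are placeholder assumptions for illustration; they are not the architecture used in the CMU work.

```python
import torch
import torch.nn as nn

# Toy generator and discriminator; real style-transfer systems use
# convolutional networks, but the adversarial logic is the same.
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
D = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    z = torch.randn(64, 16)     # input to the generator
    real = torch.randn(64, 8)   # stand-in for "target style" samples

    # Discriminator step: score real samples high, generated samples low.
    opt_D.zero_grad()
    loss_D = (bce(D(real), torch.ones(64, 1))
              + bce(D(G(z).detach()), torch.zeros(64, 1)))
    loss_D.backward()
    opt_D.step()

    # Generator step: try to make the discriminator score fakes as real.
    opt_G.zero_grad()
    loss_G = bce(D(G(z)), torch.ones(64, 1))
    loss_G.backward()
    opt_G.step()
```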

A variant, called cycle-GAN, completes the loop, much like translating English speech into Spanish and then back into English, and evaluating whether the twice-translated speech still makes sense. Using cycle-GAN to analyze the spatial characteristics of images has proven effective in transforming one image into the style of another.

That spatial method still leaves something to be desired for video, with unwanted artifacts and imperfections cropping up in the full cycle of translations. To mitigate the problem, the researchers developed a technique, called Recycle-GAN, that incorporates not only spatial, but temporal information. This additional information, accounting for changes over time, further constrains the process and produces better results.
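As a rough sketch of how the temporal constraint differs from the purely spatial one, the following compares a cycle-consistency loss with a recycle-style loss. The names (G_xy, G_yx, P_y) are illustrative assumptions following the general form described for Recycle-GAN, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def cycle_loss(x, G_xy, G_yx):
    # Spatial constraint (cycle-GAN): translating x to the other
    # domain and back should reproduce x.
    return F.l1_loss(G_yx(G_xy(x)), x)

def recycle_loss(x_frames, G_xy, G_yx, P_y):
    # Temporal constraint (Recycle-GAN): translate the earlier frames,
    # predict the *next* frame in the target domain, translate that
    # prediction back, and compare it with the true next source frame.
    y_frames = torch.stack([G_xy(x) for x in x_frames[:-1]])
    y_next = P_y(y_frames)    # temporal predictor in domain Y
    x_next = G_yx(y_next)     # map the predicted frame back to X
    return F.l1_loss(x_next, x_frames[-1])
```

Because the predictor must produce a plausible next frame, translations that flicker or drift over time incur a penalty that a purely spatial cycle loss would miss.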

The researchers showed that Recycle-GAN can be used to transform video of Oliver into what appears to be fellow comedian Stephen Colbert, and back into Oliver. Video of John Oliver's face can also be transformed into a cartoon character. Recycle-GAN copies not only facial expressions, but also the movements and cadence of the performance.

The effects aren't limited to faces, or even bodies. The researchers demonstrated that video of a blooming flower can be used to manipulate the image of other types of flowers. Or clouds that are crossing the sky rapidly on a windy day can be slowed to give the appearance of calmer weather.

Such effects might be useful in developing self-driving cars that can navigate at night or in bad weather, Bansal said. Obtaining video of night scenes or stormy weather in which objects can be identified and labeled can be difficult, he explained. Recycle-GAN, on the other hand, can transform easily obtained and labeled daytime scenes into nighttime or stormy scenes, providing images that can be used to train cars to operate in those conditions.

Credit: 
Carnegie Mellon University

NASA's SDO spots 2 lunar transits in space

Image: NASA/Goddard/SDO. View animation: https://www.nasa.gov/sites/default/files/thumbnails/image/2_2018-09-10_1...

On Sept. 9, 2018, NASA's Solar Dynamics Observatory, SDO, saw two lunar transits as the Moon passed in front of the Sun. A transit happens when a celestial body passes between a larger body and an observer. The first lunar transit lasted one hour, from 4:30 p.m. to 5:30 p.m. EDT, and obscured 92 percent of the Sun at the peak of its journey. The second transit happened several hours later, at 9:52 p.m., and lasted a total of 49 minutes, ending at 10:41 p.m. EDT. This transit obscured only 34 percent of the Sun at its peak.

Watch the movie here to see how, from SDO's perspective, the Moon appears to move in one direction and then reverse course to cross the Sun again. The Moon does not, of course, actually change direction, but it appears to do so from SDO's perspective because the spacecraft's orbit essentially catches up with and passes the Moon during the first transit.

Because the Moon does not have an atmosphere, when a lunar transit occurs no light from the Sun gets distorted, allowing for a distinct view of the Moon's surface. Although it looks smooth from far away, the surface of the Moon is rugged, sprinkled with craters, valleys and mountains.

SDO captured these images in a wavelength of extreme ultraviolet light that shows solar material heated to more than 10 million degrees Fahrenheit. Extreme ultraviolet light is typically invisible to the human eye, but satellites like SDO allow us to observe the swirling movement in the Sun's atmosphere visible only in these wavelengths.

Credit: 
NASA/Goddard Space Flight Center

New innovation improves the diagnosis of dizziness

Image: The new vibrating device improves the diagnosis of dizziness. (Credit: Johan Bodell/Chalmers University of Technology)

Half of over-65s suffer from dizziness and problems with balance. But some tests to identify the causes of such problems are painful and can risk hearing damage. Now, researchers from Chalmers University of Technology, Sweden, have developed a new testing device using bone conduction technology, that offers significant advantages over the current tests.

Hearing and balance have something in common, and for patients with dizziness this relationship is used to diagnose issues with balance. Commonly, a 'VEMP' test (vestibular evoked myogenic potentials) is performed: it uses loud sounds to evoke a reflex contraction in the neck and eye muscles, triggered by the vestibular system - the system responsible for our balance. The Chalmers researchers have now used bone-conducted sounds to achieve better results.

"We have developed a new type of vibrating device that is placed behind the ear of the patient during the test," says Bo Håkansson, a professor in the research group 'Biomedical signals and systems' at Chalmers. The vibrating device is small and compact in size, and optimised to provide an adequate sound level for triggering the reflex at frequencies as low as 250 Hz. Previously, no vibrating device has been available that was directly adapted for this type of test of the balance system.

In bone conduction transmission, sound waves are transformed into vibrations through the skull, stimulating the cochlea within the ear, in the same way as when sound waves normally go through the ear canal, the eardrum and the middle ear. Bo Håkansson has over 40 years of experience in this field and has previously developed hearing aids using this technology.

Half of over-65s suffer from dizziness, but the causes can be difficult to diagnose for several reasons. In 50% of those cases, dizziness is due to problems in the vestibular system. But today's VEMP methods have major shortcomings, and can cause hearing loss and discomfort for patients.

For example, the VEMP test uses very high sound levels, and may in fact cause permanent hearing damage itself. And, if the patient already suffers from certain types of hearing loss, it may be impossible to draw any conclusions from the test. The Chalmers researchers' new method offers significant advantages.

"Thanks to this bone conduction technology, the sound levels which patients are exposed to can be minimised. The previous test was like a machine gun going off next to the ear - with this method it will be much more comfortable. The new vibrating device provides a maximum sound level of 75 decibels. The test can be performed at 40 decibels lower than today's method using air conducted sounds through headphones. This eliminates any risk that the test itself could cause hearing damage," says postdoctoral researcher Karl-Johan Fredén Jansson, who made all the measurements in the project.

The benefits also include safer testing for children, and the possibility of diagnosing the origin of dizziness in patients whose hearing is impaired by chronic ear infections or congenital malformations of the ear canal and middle ear.

The vibrating device is compatible with standardised equipment for balance diagnostics in healthcare, making it easy to start using. The cost of the new technology is also estimated to be lower than the corresponding equipment used today.

A pilot study has been conducted and recently published. The next step is to conduct a larger patient study, under recently received ethical approval, in collaboration with Sahlgrenska University Hospital in Gothenburg, in which 30 participants with normal hearing will also be included.

Credit: 
Chalmers University of Technology

Change your diet to save both water and your health

Image: The potential impact on water resources of shifting to healthy vegetarian diets, visualised for 35,000 municipalities in France. The map has been adjusted to reflect the population size of each geographical entity. (Credit: European Union, 2018)

Shifting to a healthy diet is not only good for us, but it also saves a lot of precious fresh water, according to a new study by the JRC published in Nature Sustainability.

Compared to existing diets, the water required to produce our food could be reduced by 11-35% for healthy diets containing meat, by 33-55% for healthy pescetarian diets, and by 35-55% for healthy vegetarian diets.

Researchers compared these three diet patterns, defined by respective national dietary guidelines, to current actual food consumption, using available data from more than 43,000 areas in France, the UK and Germany.

They found that eating more healthily could substantially reduce the water footprint of people's diets, consistent across all the geographical entities analysed in the study.

The study is the most detailed nationwide food consumption-related water footprint ever made, taking into account socio-economic factors of food consumption, for existing and recommended diets.

Influences on the food we eat

The scientists also show how individual food consumption behaviours - and their related water footprints - depend strongly on socio-economic factors such as age, gender and education level.

They found interesting correlations between such factors and both the water footprint of specific foods and their resulting impact on overall water footprints.

For example, the study shows how in France, the water footprint of milk consumption decreases with age across the municipalities analysed.

Across London, they show a strong correlation between the water footprint of wine consumption and the percentage of the population of each area with a high education level.

Background

The water footprint is defined as the total volume of freshwater that is used to produce goods consumed, food in this particular case.
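In practice that definition reduces to a consumption-weighted sum over food groups. Here is a minimal sketch, with made-up footprint coefficients rather than the study's values:

```python
# Daily dietary water footprint as a consumption-weighted sum.
# The litres-per-kilogram values are illustrative placeholders,
# not the coefficients used in the JRC study.
water_per_kg = {"beef": 15400, "milk": 1000, "vegetables": 320}   # litres/kg
daily_intake_kg = {"beef": 0.05, "milk": 0.25, "vegetables": 0.30}

footprint = sum(kg * water_per_kg[food] for food, kg in daily_intake_kg.items())
print(f"{footprint:.0f} litres of water per person per day")  # 1116 litres
```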

The scientists used national dietary surveys to assess differences in food product group consumption between regions and socio-economic factors within regions.

The diet scenarios analysed in the study take into account total daily energy and protein requirements as well as maximum daily fat amounts.

They are based upon national dietary guidelines, in which for every food product group specific recommendations are given according to age and gender.

By downscaling national water footprints to the lowest possible administrative boundaries within a country, the scientists provide a useful tool for policy makers at various levels.

The methodology could also be applied to other footprint assessments - like the carbon, land or energy footprints related to food consumption.

Animal products - and especially meat - have a high water footprint.

The average European diet is characterised by overconsumption in general, particularly of animal products.

A healthy diet would contain less sugar, crop oils, meat and animal fats, and more vegetables and fruit.

Due to the numerous negative impacts of an intensive livestock production system on the planet's resources and ecosystems, as well as the growing demands of non-western countries for animal products, moving to a more resource-efficient (and healthier) vegetable-rich diet in the EU is a necessity.

Credit: 
European Commission Joint Research Centre

Immune cells destroy healthy brain connections, diminish cognitive function in obese mice

Image: Cope et al., JNeurosci (2018)

Obesity leads to cognitive impairment by activating microglial cells, which consume otherwise functional synapses in the hippocampus, according to a study of male mice published in JNeurosci. The research suggests that microglia may be a potential therapeutic target for one of the lesser known effects of this global health epidemic on the brain.

Nearly two billion adults worldwide are overweight, more than 600 million of whom are obese. In addition to increasing risk of conditions such as diabetes and heart disease, obesity is also a known risk factor for cognitive disorders including Alzheimer's disease. The cellular mechanisms that contribute to cognitive decline in obesity, however, are not well understood.

Elise Cope and colleagues replicated previous research by demonstrating that diet-induced obesity in mice impairs performance on cognitive tasks dependent on the hippocampus, causes loss of dendritic spines -- the neuronal protrusions that receive signals from other cells -- and activates microglia. Using genetic and pharmacological approaches to block microglial activity, the researchers established that microglia are causally linked to obesity-induced dendritic spine loss and cognitive decline. The results suggest obesity may drive microglia into a synapse-eating frenzy that contributes to the cognitive deficits observed in this condition.

Credit: 
Society for Neuroscience