
Common class of drugs linked to dementia even when taken 20 years before diagnosis

Image: Pill bottle. Credit: CDC [Public domain] via Wikimedia Commons

INDIANAPOLIS - The largest and most detailed study of the long-term impact of anticholinergic drugs, a class of drugs commonly prescribed in the United States and United Kingdom as antidepressants and incontinence medications, has found that their use is associated with increased risk of dementia, even when taken 20 years before diagnosis of cognitive impairment.

An international research team from the US, UK and Ireland analyzed more than 27 million prescriptions recorded in the medical records of 40,770 patients over age 65 diagnosed with dementia, comparing them with the records of 283,933 older adults without dementia.

The researchers found greater incidence of dementia among patients prescribed anticholinergic antidepressants, anticholinergic bladder medications and anticholinergic Parkinson's disease medications than among older adults who were not prescribed these drugs.

The incidence of dementia increased with greater exposure to anticholinergic medications.

"Anticholinergic Medication and Risk of Dementia: Case-control Study" is published in BMJ (formerly the British Medical Journal), an international peer-reviewed medical journal.

"Anticholinergics, medications that block acetylcholine, a nervous system neurotransmitter, have previously been implicated as a potential cause of cognitive impairment," said Regenstrief Institute and Indiana University Center for Aging Research investigator Noll Campbell, PharmD, MS, a co-author of the new BMJ study. "This study is large enough to evaluate the long-term effect and determine that harm may be experienced years before a diagnosis of dementia is made." Dr. Campbell is also an assistant professor of pharmacy practice at Purdue University College of Pharmacy.

"These findings make it clear that clinicians need to carefully consider the anticholinergic burden of their patients and weigh other options," said study co-author Malaz Boustani, M.D., MPH, a Regenstrief Institute and IU Center for Aging Research investigator. Dr. Boustani is the founder of the Indiana Clinical and Translational Science Institute's IU Center for Health Innovation and Implementation Science and the Richard M. Fairbanks Professor of Aging Research at IU School of Medicine.

"Physicians should review all the anticholinergic medications - including over-the-counter drugs - that patients of all ages are taking and determine safe ways to take individuals off anticholinergic medications in the interest of preserving brain health," Dr. Boustani said.

The study, which was led by the University of East Anglia and funded by the Alzheimer's Society, both in the UK, utilized data from the Clinical Practice Research Datalink which includes anonymized diagnosis, referral and prescription records for more than 11 million patients from 674 primary care practices across the UK. The data is broadly representative of the UK population in terms of age, sex and ethnicity.

"This research is really important because there are an estimated 350 million people affected globally by depression. Bladder conditions requiring treatment are estimated to affect over 13 percent of men and 30 percent of women in the UK and US," said study lead researcher George Savva, PhD, visiting researcher at University of East Anglia's School of Health Sciences.

"We don't know exactly how anticholinergics might cause dementia," said study co-author Chris Fox, MD, professor of clinical psychiatry at UEA's Norwich Medical School and a consultant psychiatrist. "Further research is needed to understand possible reasons for this link. In the meantime, I strongly advise patients with any concerns to continue taking their medicines until they have consulted their doctor or pharmacist."

Study co-author Ian Maidment, PhD, senior lecturer in clinical pharmacy at Aston University in the UK, said: "With many medicines having some anticholinergic activity, one key focus should be de-prescribing. Clinical staff, patients and carers need to work together collaboratively to limit the potential harm associated with anticholinergics."

Credit: 
Regenstrief Institute

Emergency treatment by older surgeons linked to slightly lower death rates

Patients undergoing emergency surgery who are treated by older surgeons (aged 60 or over) have slightly lower death rates in the first few weeks after their operation than patients treated by younger surgeons (aged less than 40) within the same hospital, finds a US study published by The BMJ today.

There was no evidence that death rates differ between male and female surgeons.

If the results are causal, the researchers say that for every 333 of these patients who undergo surgery in the US, one fewer death would occur if the quality of care were the same between younger and older surgeons.
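The "1 in 333" figure is a number needed to treat, derived from the absolute difference in mortality rates. A minimal sketch of that arithmetic, using the unadjusted rates the study reports (6.6% for surgeons under 40, 6.3% for surgeons 60 or over); the function name here is illustrative, not from the study:

```python
# Number needed to treat (NNT): how many patients would need to be treated
# by the better-performing group for one fewer death to occur, given a
# difference in mortality rates.

def number_needed_to_treat(rate_control: float, rate_treatment: float) -> float:
    """NNT = 1 / absolute risk reduction."""
    absolute_risk_reduction = rate_control - rate_treatment
    return 1.0 / absolute_risk_reduction

# Operative mortality: 6.6% for surgeons aged under 40, 6.3% for aged 60+.
nnt = number_needed_to_treat(0.066, 0.063)
print(round(nnt))  # 333
```

The study's own estimate comes from adjusted models rather than these raw rates, so this is only a back-of-the-envelope check on the reported figure.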

Despite strong interest in improving the quality of surgical care, the relationship between surgeon characteristics - especially age and sex - and patient outcomes is not well understood.

So a research team led by Yusuke Tsugawa at UCLA in California set out to investigate whether patient mortality differs based on the age and sex of surgeons.

They analysed the operative mortality rate (defined as death while in hospital or within 30 days of surgery) of Medicare patients aged 65-99 years who underwent one of 20 major emergency surgical procedures at US acute care hospitals between 2011 and 2014.

After adjusting for a range of patient, surgeon and hospital characteristics that could have affected the results, they compared operative mortality according to surgeon age and sex.

A total of 892,187 patients were treated by 45,826 surgeons with an overall operative mortality rate of 6.4% (56,803).

The researchers found that patient mortality was slightly lower for older surgeons than for younger surgeons within the same hospital (6.6% for surgeons aged less than 40, 6.5% for surgeons aged 40-49, 6.4% for surgeons aged 50-59, and 6.3% for surgeons aged 60 or over), but did not differ meaningfully between male and female surgeons.

When they analysed the data by both surgeon age and sex, patient mortality declined with surgeon age for both male and female surgeons, with female surgeons in their 50s showing the lowest operative mortality across all groups.

Operative mortality did not differ between male and female surgeons across levels of patient illness severity or for individual procedures. And there was no evidence that mortality differed by surgeon age or sex for non-emergency (elective) procedures.

Previously, the researchers found worse outcomes among patients treated by older hospital physicians, which they attributed to practice changes since training, and possibly poor adherence to guidelines. In contrast, these new findings suggest improved surgical skills with extra years in practice.

This is an observational study, so no firm conclusions can be drawn about cause and effect, and the findings may not be generalisable to other outcomes, such as patient experience or complication rates, they explain. Nevertheless, the study was large, and was able to account for a wide range of potentially influential factors.

As such, they conclude: "Our finding that younger surgeons have higher mortality suggests that more oversight and supervision early in a surgeon's career may be useful and at least warrants further investigation. Equivalent outcomes between male and female surgeons suggest that patients undergoing surgery receive high-quality care irrespective of surgeon sex."

In a linked editorial, Natalie Coburn and colleagues based in Toronto say the researchers "demonstrate clear variation in patient outcomes, identifying opportunities to improve care."

However, they warn that even objective measures are insufficient to address systemic bias. "We must learn to recognise and reduce the implicit biases that each of us inherently holds," they write. "Surgical care will improve faster when we embrace and foster teamwork, communication and diversity within our field."

Credit: 
BMJ Group

Consuming protein supplements with meals may work better for weight control

A new systematic review of available evidence appearing in Nutrition Reviews indicates that consuming protein supplements with meals may be more effective at promoting weight control than consuming supplements between meals in adults following a resistance training regimen.

It is well established that consuming dietary protein proximate to resistance-type exercise sessions promotes a positive net protein balance during post-exercise recovery. Protein supplements are available in ready-to-drink, powdered, and solid form and are marketed for different outcomes such as weight gain, weight loss, and weight management. However, for each outcome, the promoted timing of protein intake varies. Protein supplements designed to augment weight gain or support weight stability are promoted for consumption between meals. Protein supplements consumed either with meals or as meal replacements are often recommended to promote weight loss.

Consuming protein supplements between meals may decrease compensatory eating behaviors, thereby increasing energy intake and body weight. Conversely, adults undergoing a resistance training program who consume protein supplements twice daily with meals may compensate for the supplement by cutting back on the food they would otherwise choose to eat. Consequently, the timing of protein supplementation may be of particular importance, depending on the desired body weight and body composition outcome.

The impact of timing the consumption of protein supplements relative to meals has not previously been evaluated systematically. In the newly published review of the literature, the researchers investigated whether the existing research studies support consuming protein supplements between meals, vs. with meals, to differentially change body composition in adults who initiate resistance training regimens.

The researchers assessed 34 randomized controlled trials with 59 intervention groups. Comparing the intervention groups that consumed protein supplements with meals vs. between meals, respectively: 56% vs. 72% increased their body mass, 94% vs. 90% increased their lean mass, 87% vs. 59% reduced their fat mass, and 100% vs. 84% increased the ratio of lean to fat mass over time.

With-meal ingestion of protein was defined as consumption of a dietary protein-rich supplement immediately after a meal, with a meal, or as a high-protein meal replacement. Between-meal ingestion of protein was defined as consumption of a dietary protein supplement predominantly either very near a workout or at another time outside of meals.

The results from this systematic review provide novel information for people who consume protein supplements as part of their dietary pattern to promote body mass gain or to improve body composition through fat mass reduction. Consuming protein supplements with meals, rather than between meals, may be the more effective dietary strategy for improving resistance training-induced changes in body composition by reducing fat mass, which may be relevant for adults looking to improve their health status. Consuming protein supplements between meals, by contrast, may be more effective at increasing overall body mass.

Credit: 
Oxford University Press USA

Drinking baking soda could be an inexpensive, safe way to combat autoimmune disease

Image: Dr. Paul O'Connor, renal physiologist, in the lab at the Medical College of Georgia Department of Physiology at Augusta University. Credit: Phil Jones, Senior Photographer, Augusta University

AUGUSTA, Ga. (April 25, 2018) - A daily dose of baking soda may help reduce the destructive inflammation of autoimmune diseases like rheumatoid arthritis, scientists say.

They have some of the first evidence of how the cheap, over-the-counter antacid can encourage our spleen to instead promote an anti-inflammatory environment that could be therapeutic in the face of inflammatory disease, Medical College of Georgia scientists report in the Journal of Immunology.

They have shown that when rats or healthy people drink a solution of baking soda, or sodium bicarbonate, it triggers the stomach to make more acid to digest the next meal and prompts little-studied mesothelial cells sitting on the spleen to tell the fist-sized organ that there's no need to mount a protective immune response.

"It's most likely a hamburger not a bacterial infection," is basically the message, says Dr. Paul O'Connor, renal physiologist in the MCG Department of Physiology at Augusta University and the study's corresponding author.

Mesothelial cells line body cavities, like the one that contains our digestive tract, and they also cover the exterior of our organs to quite literally keep them from rubbing together. About a decade ago, it was found that these cells also provide another level of protection. They have little fingers, called microvilli, that sense the environment, and warn the organs they cover that there is an invader and an immune response is needed.

Drinking baking soda, the MCG scientists think, tells the spleen - which is part of the immune system, acts like a big blood filter and is where some white blood cells, like macrophages, are stored - to go easy on the immune response. "Certainly drinking bicarbonate affects the spleen and we think it's through the mesothelial cells," O'Connor says.

The conversation, which occurs with the help of the chemical messenger acetylcholine, appears to promote a landscape that shifts against inflammation, they report.

In the spleen, as well as the blood and kidneys, they found that after drinking water with baking soda for two weeks, the population of immune cells called macrophages shifted from primarily those that promote inflammation, called M1, to those that reduce it, called M2. Macrophages, perhaps best known for their ability to consume garbage in the body like debris from injured or dead cells, are early arrivers to a call for an immune response.

In the case of the lab animals, the problems were hypertension and chronic kidney disease, problems which got O'Connor's lab thinking about baking soda.

One of the many functions of the kidneys is balancing important compounds like acid, potassium and sodium. With kidney disease, there is impaired kidney function and one of the resulting problems can be that the blood becomes too acidic, O'Connor says. Significant consequences can include increased risk of cardiovascular disease and osteoporosis.

"It sets the whole system up to fail basically," O'Connor says. Clinical trials have shown that a daily dose of baking soda can not only reduce acidity but actually slow progression of the kidney disease, and it's now a therapy offered to patients.

"We started thinking, how does baking soda slow progression of kidney disease?" O'Connor says.

That's when the anti-inflammatory impact began to unfold as they saw reduced numbers of M1s and increased M2s in their kidney disease model after consuming the common compound.

When they looked at a rat model without actual kidney damage, they saw the same response. So the basic scientists worked with the investigators at MCG's Georgia Prevention Institute to bring in healthy medical students who drank baking soda in a bottle of water and also had a similar response.

"The shift from inflammatory to an anti-inflammatory profile is happening everywhere," O'Connor says. "We saw it in the kidneys, we saw it in the spleen, now we see it in the peripheral blood."

The shifting landscape, he says, is likely due to increased conversion of some of the proinflammatory cells to anti-inflammatory ones coupled with actual production of more anti-inflammatory macrophages. The scientists also saw a shift in other immune cell types, like more regulatory T cells, which generally drive down the immune response and help keep the immune system from attacking our own tissues. That anti-inflammatory shift was sustained for at least four hours in humans and three days in rats.

The shift ties back to the mesothelial cells and their conversations with our spleen with the help of acetylcholine. Part of the new information about mesothelial cells is that they are neuron-like, but not neurons, O'Connor is quick to clarify.

"We think the cholinergic (acetylcholine) signals that we know mediate this anti-inflammatory response aren't coming directly from the vagal nerve innervating the spleen, but from the mesothelial cells that form these connections to the spleen," O'Connor says.

In fact, when they cut the vagal nerve, a big cranial nerve that starts in the brain and reaches into the heart, lungs and gut to help control things like a constant heart rate and food digestion, it did not impact the mesothelial cells' neuron-like behavior.

The effect, it appears, was more local, because just touching the spleen had an impact.

When they removed or even just moved the spleen, it broke the fragile mesothelial connections and the anti-inflammatory response was lost, O'Connor says. In fact, when they only slightly moved the spleen as might occur in surgery, the previously smooth covering of mesothelial cells became lumpier and changed colors.

"We think this helps explain the cholinergic (acetylcholine) anti-inflammatory response that people have been studying for a long time," O'Connor says.

Studies are currently underway at other institutions that, much like vagal nerve stimulation for seizures, electrically stimulate the vagal nerve to tamp down the immune response in people with rheumatoid arthritis. While there is no known direct connection between the vagal nerve and the spleen - and O'Connor and his team looked again for one - the treatment also attenuates inflammation and disease severity in rheumatoid arthritis, researchers at the Feinstein Institute for Medical Research reported in 2016 in the journal Proceedings of the National Academy of Sciences.

O'Connor hopes drinking baking soda can one day produce similar results for people with autoimmune disease.

"You are not really turning anything off or on, you are just pushing it toward one side by giving an anti-inflammatory stimulus," he says, in this case, away from harmful inflammation. "It's potentially a really safe way to treat inflammatory disease."

The spleen also got bigger with baking soda consumption, the scientists think because of the anti-inflammatory stimulus it produces. Infection also can increase spleen size, and physicians often palpate the spleen when concerned about a big infection.

Other cells besides neurons are known to use the chemical communicator acetylcholine. Baking soda also interacts with acidic ingredients like buttermilk and cocoa in cakes and other baked goods to help the batter expand and, along with heat from the oven, to rise. It can also help raise the pH in pools, is found in antacids and can help clean your teeth and tub.

Credit: 
Medical College of Georgia at Augusta University

Stress hormones spike as the temperature rises

San Diego (April 25, 2018)--A new study in medical students finds that summer, not winter, is the season when people are most likely to have higher levels of circulating stress hormones. These counterintuitive findings contradict traditional concepts of the taxing physical toll of winter and the relaxed ease of summer. Researchers will present their findings today at the American Physiological Society (APS) annual meeting at Experimental Biology 2018 in San Diego.

Cortisol--often referred to as the "stress hormone" because it is released into the bloodstream during stressful situations--helps regulate the body's levels of sugar, salt and fluids. The hormone helps reduce inflammation and is essential for maintaining overall health. Cortisol levels are typically highest in the morning and gradually drop throughout the day. Levels are lower in the evening to maintain healthy sleeping patterns. Illness, lack of sleep and certain medications can push cortisol levels beyond their normal daily fluctuations. Researchers from Poznan University of Medical Sciences in Poland have now discovered seasonal patterns in the cortisol levels of medical students.

The research team studied a group of female medical students on two separate days in the winter and again on two days in the summer. The researchers took saliva samples every two hours during each testing period--a full 24-hour cycle--to measure levels of cortisol and markers of inflammation. During each testing session, the volunteers completed a lifestyle questionnaire about their sleep schedule, the type of diet they followed and their physical activity levels.

Previous studies on the seasonal variability of cortisol have shown inconsistent findings--possibly because participants were tested in their own homes and not in a uniform setting. In the current study, however, the research team found cortisol levels to be higher on the summer testing dates. Inflammation levels did not change significantly between seasons.

Dominika Kanikowska, of Poznan University of Medical Sciences, will present the poster "Daily and seasonal rhythms of interleukin 6 and cortisol levels in saliva and some lifestyle habits of medical students in Poland" on Wednesday, April 25, in the Sails Pavilion of the San Diego Convention Center.

Credit: 
American Physiological Society

Study examines denigration when people call a place a 's---hole'

By tracing the use of the word and hashtag 'shithole' on Twitter, researchers have examined who is engaged in the stigmatizing discourse of denigration, the types of place that are stigmatized, and the responses to stigmatized places.

In a Transactions of the Institute of British Geographers study, the majority of tweets were aimed at places where the tweeter was not from, a form of othering consistent with how territories are stigmatized by those in positions of power such as policymakers, politicians, and journalists. Also, an important and gendered minority of tweets were characterized by a 'cry for help' and powerlessness, where the stigma was aimed at their own places.

"As well as showing what people think about other places, our research showed how people talk about the places where they live and how a significant proportion--38% of the tweets we studied--maligned the place where they live," said lead author Alice Butler, of the University of Leeds, in the UK. "There was a significant gender difference in the way that men and women experience living in a place, and we also noted a tendency to try and separate place from self-identity."

Credit: 
Wiley

Drinking kefir may prompt brain-gut communication to lower blood pressure

Drinking kefir may have a positive effect on blood pressure by promoting communication between the gut and brain. Kefir is a fermented probiotic milk beverage known to help maintain the balance of beneficial bacteria in the digestive system. Researchers will present their findings today at the American Physiological Society (APS) annual meeting at Experimental Biology 2018 in San Diego.

Previous research has shown that an imbalance in the gut's colony of bacteria (microbiota) may cause high blood pressure in some people. Similarly, probiotics--live bacteria supplements that are beneficial to the digestive system--have been found to lower blood pressure, but the mechanisms by which this occurs are unclear.

A research team from Auburn University in Alabama, in collaboration with the University of Vila Velha in Brazil, studied three groups of rats to determine how kefir reduces high blood pressure (hypertension):

One group had hypertension and was treated with kefir ("treated").

One group had hypertension and was not treated ("untreated").

One group had normal blood pressure and was not treated ("control").

After nine weeks of kefir supplementation, the treated rats had lower levels of endotoxins (toxic substances associated with disruption in the cells), lower blood pressure and improved intestinal permeability when compared with the untreated group. Healthy intestines allow some substances to pass through, but generally act as a barrier to keep out harmful bacteria and other potentially dangerous substances. In addition, kefir supplementation restored the natural balance of four different bacteria in the gut and of an enzyme in the brain essential for normal nervous system function, suggesting that the nervous and digestive systems work together to reduce hypertension.

"Our data suggests that kefir antihypertensive-associated mechanisms involve gut microbiota-brain axis communication during hypertension," the researchers wrote.

Mirian Silva-Cutini, of Auburn University, will present "Probiotic kefir antihypertensive effects in spontaneously hypertensive rats involve central and peripheral mechanisms" on Wednesday, April 25, in the Sails Pavilion of the San Diego Convention Center.

Credit: 
Experimental Biology

Bento browser makes it easier to search on mobile devices

Image: This screengrab shows the Bento browser dashboard, with each search project represented by a colored square. The browser is designed to help perform searches on mobile devices. Credit: Carnegie Mellon University

PITTSBURGH--Searches involving multiple websites can quickly get confusing, particularly when performed on a mobile device with a small screen. A new web browser developed at Carnegie Mellon University now brings order to complex searches in a way not possible with conventional tabbed browsing.

The Bento browser, inspired by compartmentalized bento lunch boxes popular in Japan, stores each search session as a project workspace that keeps track of the most interesting or relevant parts of visited web pages. It's not necessary for a user to keep every site open to avoid losing information.

"With Bento, we're structuring the entire experience through these projects," said Aniket Kittur, associate professor in the Human-Computer Interaction Institute (HCII). The projects are stored for later use, can be handed off to others, or can be moved to different devices. "This is a new way to browse that eliminates the tab overload that limits the usefulness of conventional browsers."

Someone planning a trip to Alaska with a conventional browser, for instance, might create multiple tabs for each location or point of interest, as well as additional tabs for hotels, restaurants and activities. With Bento, users can identify pages they found useful, trash unhelpful pages and keep track of what they have read on each page. Bento also bundles the search result pages into task cards, such as accommodations, day trips, transportation, etc. The project could be shared with other people planning their own trips.

Kittur's research team will present a report on their mobile web browser at CHI 2018, the Conference on Human Factors in Computing Systems, April 21-26 in Montreal, Canada. A research version of the Bento Browser for iPhones is available for download from the App Store.

Mobile devices now initiate more web searches than do desktop computers. Yet the limitations of conventional browsers become more acute on mobile devices. Not only is screen size limited, but mobile users are more often interrupted and distracted and have more difficulty saving and organizing information, said Nathan Hahn, a Ph.D. student in HCII.

In user studies that compared Bento with the Safari browser, users said they preferred Bento in cases where they wanted to continue a search later and wanted to pick up where they left off. They also said Bento kept their searches better organized. Though most participants found it easier to learn how to use Safari, they found Bento more useful for finding pages and believed that Bento made their mobile searches more effective.

One goal was to design Bento to work in a way that complements the way the mind works.

"If we get a lot of people using it, Bento could serve as a microscope to study how people make sense of information," Kittur said, noting people who use the research version are asked to consent to their searches becoming part of the research data. "This might lead to a new type of artificial intelligence," he added.

Bento Browser is now a search app for iPhones, but its capabilities for organizing searches and helping people resume searches also could benefit people using desktop computers. To accommodate those users, Kittur's team is now preparing a Bento plug-in for the Chrome browser.

Joseph Chee Chang, a Ph.D. student in CMU's Language Technologies Institute, also is part of the Bento team and a co-author of the CHI paper. More information is available at https://bentobrowser.com/.

Credit: 
Carnegie Mellon University

Malaria study reveals gene variants linked to risk of disease

Many people of African heritage are protected against malaria by inheriting a particular version of a gene, a large-scale study has shown.

Another variation of the same gene can have the opposite effect, raising susceptibility to malaria - but it reduces the risk of other common childhood diseases, the study found.

The findings give new insights to explain why some children, but not all, develop severe malaria. The condition, which is spread by mosquito bites, kills half a million African children each year before they have had time to acquire immunity to the disease.

Scientists also revealed that the impact of both gene variations is linked to whether or not individuals carry a third genetic mutation. Their research, which highlights the complex links that can exist between genes in our DNA, helps explain inconsistencies in previous studies.

In a study of more than 5,000 Kenyan children, researchers from the University of Edinburgh with colleagues from Oxford, Kenya and Mali examined two variations, or mutations - known as Sl2 and McCb - in a gene called CR1. These gene variations, common in African populations, were believed to have evolved in response to malaria, but previous studies have been unable to confirm this.

By studying the complex relationships between the gene variants, researchers found that the Sl2 mutation in the CR1 gene protects against cerebral malaria and death - but only if children did not carry a third gene mutation, known as alpha-thalassaemia. A separate study showed that the Sl2 variant helped prevent clumps of cells called rosettes, found in severe malaria.

The other mutation in CR1, known as McCb, was found to be linked to increased risk of cerebral malaria and death, but to lower the risk of childhood diseases such as respiratory and gastrointestinal infections. The study, supported by Wellcome, was published in eLife.

Professor Alexandra Rowe, of the University of Edinburgh's School of Biological Sciences, who supervised the study, said: "Genes have complex effects on one another and on the chances of disease. These findings point towards avenues for further research in understanding factors that influence life-threatening illness in African children."

Dr Olivia Swann, of the School of Biological Sciences, who jointly led the study, said: "This study is a missing piece in the puzzle of how genes protect people against malaria. It also reminds us how complex this protection is and how hard it can be to untangle the influence of one gene from another."

Credit: 
University of Edinburgh

Feelings of ethical superiority can lead to workplace ostracism, social undermining: Study

Image: Matthew Quade, Ph.D., assistant professor of management in Baylor University's Hankamer School of Business. Credit: Baylor University Marketing & Communications

WACO, Texas (April 24, 2018) - Do you consider yourself more ethical than your coworker?

Caution! Your feelings of ethical superiority can cause a chain reaction that is detrimental to you, your coworker and your organization, according to Baylor University management research.

A new study published in the Journal of Business Ethics suggests that your feelings of ethical superiority can lead you to have negative emotions toward a "less ethical" coworker. Those negative emotions can be amplified if you also believe you do not perform as well as that coworker. And, furthermore, those negative emotions can lead to your mistreatment and/or ostracism (social exclusion) of that less ethical, higher-performing coworker.

"One way to think of this is that it is - and should be - concerning to us to believe that we are more ethical than our coworkers, especially if we do not perform as well as they do," said lead author Matthew Quade, Ph.D., assistant professor of management in Baylor University's Hankamer School of Business and an expert on workplace ethics and ostracism.

The research, Quade said, can help managers create better atmospheres and improve the bottom line.

"The managerial implication is that we need to create environments where ethics and performance are both rewarded," he said.

A total of 741 people, among them 310 employees ("focal employees") and an equal number of their coworkers ("comparison coworkers"), were surveyed for the study. Focal employees compared themselves with their coworkers based on two areas: perceived ethics and performance. Then they rated their levels of negative emotions (i.e., feelings of contempt, tension or disgust) toward those same comparison coworkers.

Results show that employees who believe they are more ethical than similar coworkers (i.e., those who hold similar positions and have similar educational backgrounds and tenure in the organization) feel negative emotions (i.e., contempt, disgust, stress, repulsion) when thinking about those coworkers. These negative emotions are amplified when the employees also believe they do not perform as well as those same coworkers.

In turn, the comparison coworkers rated how often they experienced social undermining (i.e., insults, spreading of rumors, belittling of ideas) and ostracism (i.e., ignored, avoided, shut out of conversations) from the focal employee.

Results also show that the negative emotions that the "more ethical, lower performing" employees experience may result in them behaving in unethical ways directed at their coworkers. Specifically, they become more likely to socially undermine and ostracize those "less ethical, higher performing" coworkers. All the study's results exist regardless of gender and any positive emotion the employees may experience as a result of believing they are more ethical.

Ultimately, such workplace scenarios pose a conundrum for managers, Quade said. On one hand, there is the ethical worker who doesn't perform as well. On the other hand, there's the less ethical worker who hits all the goals.

Who gets rewarded?

"If high performance is the result of questionable or unethical behavior, that combination should not be celebrated," the researchers wrote. "Instead, organizations should be cautious when rewarding and promoting performance within organizations, ensuring that they also consider the way the job is done from an ethical standpoint."

The ideal situation, the study reveals, is when high ethics and high performance are the norm - and employees are rewarded.

"Enhancing the ethical behavior of all employees should be an emphasis to attempt to remove some of the disparity that tends to exist between employees when it comes to their moral behavior at work," the researchers wrote.

Credit: 
Baylor University

Hospital patients are eager to play a role in tracking health data, researchers find

INDIANAPOLIS -- New research shows that patients in the hospital are eager to collaborate with clinicians to track their health data. Traditionally, clinicians have been the only ones who collect, track and reflect on that data.

The findings uncover new directions for researchers in the field of human-computer interaction, said Andrew Miller, one of the researchers and an assistant professor in the Department of Human-Centered Computing in the School of Informatics and Computing at IUPUI.

The researchers reported their findings in a paper, "Supporting Collaborative Health Tracking in the Hospital: Patients' Perspectives," that was published by the 2018 ACM CHI Conference on Human Factors in Computing Systems, the premier international conference of human-computer interaction.

Previous studies have shown that patients have a better care experience -- and a better health outcome -- when they are engaged in their care, compared to disengaged patients.

But tracking health data in the hospital isn't easy for patients. One challenge is that much of a patient's health information is presented verbally, so access to it depends on patients' alertness and their ability to recall what was said.

"The big idea for this paper was to consider bringing into the hospital the models, approaches and mechanisms that human-computer interaction research has generated for people outside of the hospital to track their physical activity, with devices like Fitbits, and help them manage chronic health conditions like diabetes with glucose monitors," Miller said.

The question, Miller said, was whether those types of devices would help people make sense of their health data when they were in the hospital.

The researchers first questioned patients and their caregivers about what kind of information they would want to track, Miller said.

"We then looked at these models from outside of the hospital to see where there is a match and where we could bring some of the insights from outside of the hospital into the hospital," Miller said. "We think there is substantial evidence in this study to show there is a real opportunity here, and it's something patients are interested in doing."

In fact, he said, researchers were surprised by the degree of interest among patients in tracking their health data. "There was a palpable sense in patients that this is actually something they could use right now."

At the same time, however, patients don't want to track the data by themselves, Miller said: "They're interested in partnering with their care team to get better and doing what they can to help."

Credit: 
Indiana University

Imagining a positive outcome biases subsequent memories

Imagining that a future event will go well may lead you to remember it more positively after it's over, according to findings from research published in Psychological Science, a journal of the Association for Psychological Science.

"Our results suggest that imagining an upcoming event can essentially 'color' your memory for that event once it comes to pass," says psychological scientist Aleea Devitt of Harvard University, first author on the study.

"Research has shown that healthy adults tend to have an unrealistically favorable outlook, and our studies suggest that one potential benefit of this optimism might be that we remember events in a more positive way, which could contribute to general well-being," she explains.

Daydreaming about the future is a common experience and many of the events we muse about eventually come to pass. Devitt and study coauthor Daniel L. Schacter, also of Harvard University, hypothesized that simulating a future event may produce a mental representation that ultimately competes with and alters memory for the event after it has happened.

In one experiment, the researchers presented 27 participants with a series of 12 randomly selected scenarios. For each scenario, participants imagined the event going well (or going poorly) and described it aloud for 3 minutes.

After a 15-minute break, the researchers told the participants to imagine that it was a year later and that they were going to learn how the events turned out. The participants then read brief narratives of the events, each of which contained some positive, negative, and neutral details.

On a subsequent recognition test 48 hours later, they saw 12 details (some positive and some negative) for each narrative, and indicated whether those details had appeared in the narrative.

Emotional content mattered: Participants incorrectly identified more positive details as "true" than they did negative details.

Importantly, how they imagined the event influenced what they remembered from the narrative later. Participants were more likely to mistakenly identify positive details as "true" if they had previously imagined the event going well. Imagining a negative outcome before learning how an event turned out did not seem to influence participants' memory for the event details.
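The recognition-test logic described above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors' analysis: the bias score simply subtracts the false-alarm rate for negative lure details from that for positive lures, and the response data below are hypothetical.

```python
# Illustrative sketch of a positivity-bias score from recognition responses.
# All details and responses here are hypothetical, not data from the study.

def false_alarm_rate(responses):
    """Fraction of non-presented (lure) details incorrectly endorsed as 'true'."""
    lures = [r for r in responses if not r["in_narrative"]]
    if not lures:
        return 0.0
    return sum(r["said_true"] for r in lures) / len(lures)

def positivity_bias(responses):
    """Positive-lure false-alarm rate minus negative-lure false-alarm rate."""
    pos = [r for r in responses if r["valence"] == "positive"]
    neg = [r for r in responses if r["valence"] == "negative"]
    return false_alarm_rate(pos) - false_alarm_rate(neg)

# Hypothetical responses for one participant: each lure detail records its
# emotional valence and whether the participant endorsed it as 'true'.
responses = [
    {"valence": "positive", "in_narrative": False, "said_true": True},
    {"valence": "positive", "in_narrative": False, "said_true": True},
    {"valence": "positive", "in_narrative": False, "said_true": False},
    {"valence": "negative", "in_narrative": False, "said_true": True},
    {"valence": "negative", "in_narrative": False, "said_true": False},
    {"valence": "negative", "in_narrative": False, "said_true": False},
]

print(round(positivity_bias(responses), 2))  # prints 0.33
```

A score above zero, as in this made-up example, would indicate the pattern the study reports: positive details are more likely than negative ones to be falsely remembered as part of the narrative.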

A second experiment produced similar results, showing that imagining a positive event, either in the future or the past, biased participants' subsequent recall toward positive details. Participants who imagined scenarios in a positive light also rated the actual event more positively when they took the recognition test.

"These results suggest that adopting an optimistic outlook can actually transfer to a rosier reflection once those upcoming experiences become part of the personal past," says Devitt.

Devitt and Schacter are now investigating whether a similar phenomenon occurs among healthy older adults, as research has shown that older adults tend to imagine events with less detail but also show an overall positive bias in attention and memory.

Credit: 
Association for Psychological Science

Getting a better look at living cells

image: Tolou Shokuhfar, associate professor of bioengineering at the University of Illinois at Chicago College of Engineering.

Image: 
UIC

Nanoscale-level imaging of living cells has become a reality in the past few years using transmission electron microscopy and sealed sample holders that keep cells alive in a liquid environment. But do the high-resolution images obtained using these tools truly reflect the structures and functions of cells, or do they show cells damaged by the high-intensity electron beam used in transmission electron microscopy?

"We really have had no way of knowing if what we see in images obtained through liquid cell transmission electron microscopy show the natural state of cells, or if the morphological changes we see are actually the result of radiation damage," said Tolou Shokuhfar, associate professor of bioengineering at the University of Illinois at Chicago College of Engineering.

Shokuhfar and colleagues describe a device, compatible with most transmission electron microscopes, that significantly reduces the exposure of live samples to the electron beam. They report their results in the journal Science Advances.

Transmission electron microscopy produces incredibly detailed images of cells that can show structures as small as one or two nanometers across. But for a long time, samples used in transmission electron microscopy had to be dead or frozen because the sample chamber of a transmission electron microscope is a vacuum.

The new field of liquid cell transmission electron microscopy emerged in recent years enabling scientists to study biological, chemical and materials science samples in their near-native environments. This is achieved by placing the sample in liquid inside a tiny sealed chamber that protects it from the high vacuum environment to allow dynamic imaging.

However, currently available devices that hold samples only allow a single chamber to be placed under the microscope at a time. "Because you place just one sample at a time under the microscope, you need to perform your pre-imaging focus and setting adjustment on that one sample," said Trevor Moser, a graduate student at Pacific Northwest National Laboratory in Richland, Washington, and a co-author on the paper. "By the time you are ready to take pictures, the sample has already been exposed to significant amounts of radiation, so you just never know if the pictures you get show the unaltered cell, or if what you see in the pictures is because of damage from the electron beam," continued Moser, who previously worked in Shokuhfar's lab.

The research team solved this problem by developing a device with 25 transparent windows rather than the single window that current sample holders provide. With more windows, the researchers can expose samples to less radiation: they get close to the settings and focus they need using one window, then switch to another window whose cells have not yet been exposed to the microscope's electron beam. Researchers still need to fine-tune the focus on samples in the 'fresh' window, but far fewer adjustments remain, significantly limiting total exposure to the electron beam before images are taken.
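The dose-saving rationale can be illustrated with back-of-the-envelope arithmetic. All numbers below are hypothetical illustration values, not measurements from the paper: the point is only that shifting the long coarse-setup phase onto a sacrificial window leaves the imaged cells with only the brief fine-focus dose.

```python
# Back-of-the-envelope sketch of the multi-window dose saving.
# Dose rates and times are hypothetical, not values from the paper.

dose_rate = 10.0        # assumed beam dose rate, arbitrary units per second
full_setup_time = 60.0  # s of focusing/adjustment on a single-window holder
fine_focus_time = 5.0   # s of residual fine focusing on a fresh window

# Single-window holder: the imaged cells absorb the entire setup dose.
single_window_dose = dose_rate * full_setup_time

# Multi-window holder: coarse setup happens on a sacrificial window, so the
# imaged cells absorb only the brief fine-focus dose.
multi_window_dose = dose_rate * fine_focus_time

print(single_window_dose, multi_window_dose)  # prints 600.0 50.0
```

Under these assumed numbers the pre-image dose to the cells actually photographed drops by more than an order of magnitude, which is the effect the team exploits.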

Next, the researchers proved that their device could prevent alteration of samples caused by overexposure to electron radiation. They imaged a bacterium called Cupriavidus metallidurans, a small single-celled organism that produces solid gold nanoparticles from aqueous gold tetrachloride, a potent heavy metal toxin to most organisms.

First, they imaged the bacteria after exposing them to increasing levels of radiation over the course of focusing and adjusting their settings before taking pictures. Then, they imaged a second batch of bacteria using their novel 25-window device. The resulting images showed significant differences.

"The images of cells exposed to higher levels of radiation were clearly different from cells imaged with no previous radiation exposure," said James Evans, a senior scientist at Pacific Northwest National Laboratory and a co-author on the paper. "This proves that damage caused by being in the electron beam too long can cause artifacts that can yield false information. We saw much more pristine, undamaged cells using our multi-chamber device."

Shokuhfar, a corresponding author on the paper, said the new device will also enable higher-fidelity imaging of nanoparticles using transmission electron microscopy. "Nanoparticles are also susceptible to damage from radiation, so this device will let us observe more accurately how nanoparticles grow and change under different conditions, which has applications in areas of new materials, nanoparticle interactions and medicine," she said.

Credit: 
University of Illinois Chicago

Rhythm crucial in drummed speech

image: The Amazonian Bora people mimic the rhythm of their language using drums.

Image: 
GAIAMEDIA/AEXCRAM

The human voice can produce rich and varied acoustic signals to transmit information, but this transmission normally has a reach of only about 200 metres. The Boras, an indigenous group of about 1,500 members residing in small communities in the Amazonian rainforest of Colombia and Peru, can extend this range by a factor of 100 by emulating Bora phrases in sequences of drumbeats. They do this with manguaré drums: pairs of wooden slit drums, each about two metres long, traditionally carved from single logs through burning. Each drum can produce two pitches, so a pair produces four in total.

Public announcement

The Boras use manguaré drums in two ways. One is the "musical mode", which is used to perform memorised drum sequences with little or no variation as part of rituals and festivals. The other is the "talking mode", which is used to transmit relatively informal messages and public announcements. "For example, the manguaré is used to ask someone to bring something or to come do something, to announce the outcome of non-alcoholic drinking competitions or the arrival of visitors", says Seifart of the former Department of Linguistics at the Max Planck Institute for Evolutionary Anthropology, where the major part of the now-published work was done. "In this mode, only two pitches are used and each beat corresponds to a syllable of a corresponding phrase of spoken Bora. The announcements contain on average 15 words and 60 drum beats."

Rhythm essential

The Boras use drummed Bora to mimic the tone and rhythm of their spoken language and to elaborate Bora phrases in order to overcome remaining ambiguities. "Rhythm turns out to be crucial for distinguishing words in drummed Bora", says Seifart. "There are four rhythmic units encoded in the length of pauses between beats. These units correspond to vowel-to-vowel intervals with different numbers of consonants and vowel lengths. The two phonological tones represented in drummed speech encode only a few lexical contrasts. Rhythm therefore appears to crucially contribute to the intelligibility of drummed Bora."

This, the researchers argue, provides novel evidence for the role of rhythmic structures composed of vowel-to-vowel intervals in the complex puzzle concerning the redundancy and distinctiveness of acoustic features embedded in speech.

Credit: 
Max Planck Institute for Evolutionary Anthropology

Landmark paper finds light at end of the tunnel for world's wildlife and wild places

image: A new paper finds that trends toward population stabilization, poverty alleviation, and urbanization are rewriting the future of biodiversity conservation in the 21st century, offering new hope for the world's wildlife.

Image: 
Cristian Samper

NEW YORK (April 23, 2018) - A new WCS paper published in the journal BioScience finds that the enormous trends toward population stabilization, poverty alleviation, and urbanization are rewriting the future of biodiversity conservation in the 21st century, offering new hope for the world's wildlife and wild places.

The paper, written by Eric Sanderson, WCS Senior Conservation Ecologist; Joe Walston, WCS Vice President for Field Conservation; and John Robinson, WCS Executive Vice President for Global Conservation, says that for the first time in the Anthropocene, the global demographic and economic trends that have resulted in unprecedented destruction of the environment are now creating the necessary conditions for a possible renaissance of nature.

Most people assume that the human population on Earth will always rise, but the authors point out that the demographic transition is already well underway: the rate of global population growth has been dropping since the 1960s. They cite new demographic research suggesting that the world population in 2100 could be as high as 12 billion or as low as 7 billion, fewer people than are alive today. The difference depends on actions we take today.

Good urbanization is key. Cities lead people to choose smaller families, and the increased income urbanites derive from working in town means that people can choose to conserve nature, not destroy it, through choices about what they buy and how they live.

These considerations lead the authors to suggest that within our generation, or the generation to follow, if society makes the right moves now, there could be possibilities for rewilding unimaginable to previous generations of conservationists.

They call their thinking "From Bottleneck to Breakthrough." Recognition of these massive demographic, economic and urbanization trends suggests that conservation will best succeed if we protect the world's threatened wildlife and wild places through the bottleneck; create safe, attractive, sustainable cities; encourage better consumer choices by pricing in the environmental benefits or harms of different resources and pollutants; and inspire all people and all institutions of the world to care for, rather than destroy, the natural bases of life on Earth.

Said lead author Eric Sanderson: "A light is appearing at the end of the tunnel, but for that light to be sunshine and not a train, it is critical that the world's nations act now."

Credit: 
Wildlife Conservation Society