Culture

Different strokes for different folks

Describing a toothpaste as "limited edition" or a Rolls Royce as a "best-seller" would sound off-key or even confusing to most consumers and could well steer them away from making a purchase. That's because, new research shows, individuals are inclined to have different mindsets depending on their consumption goals, and marketers should tailor their messages accordingly.

In "The Perfect Fit: The Moderating Role of Selling Cues on Hedonic and Utilitarian Product Types," three researchers - Gopal Das of the Indian Institute of Management, Amaradri Mukherjee of Portland State University, and Ronn J. Smith of the University of Arkansas - explore the impact of popularity versus scarcity cues and product types on consumers' perceptions of risk, product uniqueness, and purchase intentions. Their paper is forthcoming in the March issue of the Journal of Retailing.

In a series of studies, the authors probe how individuals' inclinations - whether they are more risk-averse or more aspirational - relate to how likely they are to buy a product. Sales cues like "limited edition" or "best-seller" play to these inclinations: "limited edition" connotes exclusivity to those seeking an achievement of sorts, while "best-seller" implies that a product is dependable. As such, these cues are important signals for how a product can be most effectively marketed.

In one study, undergraduate participants were surveyed to determine their particular focus, then randomly assigned to view one of two print ads for a digital SLR camera. One ad promoted the camera as a best-seller, the other as a limited edition. Dependability mattered little to individuals focused on achievement, whereas risk-averse individuals were almost twice as wary of buying the limited-edition camera. The reverse was true for the best-seller: risk-averse buyers preferred it by a factor of two over aspirational buyers, who were indifferent to that quality.

Savvy retailers, both online and bricks-and-mortar, might consider mining this insight via data analytics and loyalty programs that decode their customers' mindsets, the authors suggest: "When the selling cues are properly aligned with product types, it leads to better goal fulfillment of the shopper."

Credit: 
Journal of Retailing at New York University

Cannabis compound may help curb frequency of epileptic seizures

A naturally occurring compound found in cannabis may help to curb the frequency of epileptic seizures, suggests a review of the available evidence, published online in the Journal of Neurology Neurosurgery & Psychiatry.

But the evidence to date is confined to the treatment of children and teens whose epilepsy does not respond to conventional drugs, and rare and serious forms of the condition, caution the researchers.

Between 70 and 80 percent of people newly diagnosed with epilepsy manage to control their seizures completely using conventional drugs such as valproate and carbamazepine, but that still leaves 20 to 30 percent whose condition is unresponsive to these treatments.

Preliminary research suggests that naturally occurring compounds found in cannabis (cannabinoids) may dampen down convulsions. And one of these cannabinoids, cannabidiol or CBD for short, seems to show promise for curbing seizures.

To explore this in more depth, the researchers trawled research databases for relevant published and unpublished studies, published up to October 2017, on the potential impact of cannabinoids as an add-on to usual treatment for epileptic seizures.

Out of an initial haul of 91 studies they found six clinical trials (555 patients) and 30 observational studies (2865 patients) that were eligible for inclusion in their review.

All the participants, whose average age was 16, had rare forms of epilepsy that had not responded to usual treatment.

Pooled analysis of the clinical trial data showed that CBD was more effective than a dummy (placebo) drug at cutting seizure frequency by 50 percent or more, and improving quality of life.

CBD was also more effective than placebo at eradicating seizures altogether, although this was still rare.

But the risk of side effects such as dizziness and drowsiness, although small, was significantly elevated - 24 percent higher among those taking cannabidiol - and the risk of serious side effects was twice as high.
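As a rough illustration of how such relative risks are derived from event counts, here is a minimal sketch. The counts below are hypothetical stand-ins, not the trial data, and `risk_ratio` is a helper written for this example.

```python
# Illustrative only: hypothetical counts, not the actual pooled trial data.
def risk_ratio(events_treated, n_treated, events_placebo, n_placebo):
    """Relative risk of an event in the treatment arm versus the placebo arm."""
    return (events_treated / n_treated) / (events_placebo / n_placebo)

# Hypothetical: 62 of 250 CBD patients vs 50 of 250 placebo patients report
# dizziness or drowsiness -> risk ratio 1.24, i.e. a 24 percent higher risk.
rr_side_effects = risk_ratio(62, 250, 50, 250)

# Hypothetical: 20 vs 10 serious adverse events -> risk twice as high.
rr_serious = risk_ratio(20, 250, 10, 250)

print(round(rr_side_effects, 2), round(rr_serious, 2))  # 1.24 2.0
```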

Pooled data from 17 of the observational studies showed that seizure frequency dropped by at least 50 percent in just under half of the patients; in eight of these studies, seizures disappeared completely in nearly one in 10 patients (8.5%). Quality of life improved in half of the patients in 12 of the studies.

"Pharmaceutical grade CBD as adjuvant treatment in paediatric onset drug resistant epilepsy may reduce seizure frequency," conclude the researchers. "Existing [randomised controlled trial] evidence is mostly in paediatric samples with rare and severe epilepsy syndromes; [randomised controlled trials] examining other syndromes and cannabinoids are needed."

Credit: 
BMJ Group

Research finds little difference among diet plans' long-term effectiveness

WASHINGTON -- Whether you pick low-carb, low-fat or another diet plan, scientific research indicates each can help some people achieve modest long-term weight loss with potential improvement in health risks, according to the Scientific Statement the Endocrine Society issued today on managing obesity.

The authors found the Mediterranean Diet and DASH diet provide demonstrated benefits for reducing cardiovascular disease risk, and in lower-calorie versions may be beneficial for weight loss.

Given the number of diets, medications and surgical procedures available to treat obesity, the best approach for each individual depends on genetics, health and how well they can adhere to a particular regimen, the statement's authors concluded. Still, maintaining long-term weight loss remains challenging, and individuals with obesity should expect to regain weight when they stop treatment.

"The stigma around this disease makes it difficult to address obesity as a public health problem," said George A. Bray, M.D., of Louisiana State University's Pennington Biomedical Research Center in Baton Rouge, La., who chaired the task force that developed the Scientific Statement. "There often is a mismatch between the patient's cosmetic goals and what can realistically be achieved with diet and exercise. While a modest 5 percent to 10 percent weight loss can yield significant health benefits, that may not provide the cosmetic changes patients seek."

Obesity remains a worldwide public health issue. More than 1.9 billion adults worldwide meet the criteria for obesity or overweight, according to the World Health Organization.

Obesity is associated with and contributes to a shortened life span, type 2 diabetes, cardiovascular disease, some cancers, kidney disease, sleep apnea, osteoarthritis and other conditions. Weight loss can lower the risk of developing these conditions and improve health outcomes.

The statement's authors examined the latest scientific evidence on a variety of diets, commercial diet plans such as Weight Watchers, exercise, obesity medications and types of bariatric surgery. Based on a review of more than 400 studies and peer-reviewed articles on obesity, the experts found all of the weight loss interventions had a high degree of variability when it came to effectiveness.

"Individual weight loss approaches worked well for some people and not for others," Bray said. "Currently, we have limited genetic and other information to predict which intervention will work for a given individual. This demonstrates just how complex the problem of severe obesity is."

Surgical approaches tended to lead to greater and longer lasting weight loss than other treatment options, the authors found.

Many consumers turn to dietary supplements, which are not evaluated by the U.S. Food and Drug Administration (FDA). There is little scientific evidence to show these supplements can effectively support weight loss or even that they are safe. Having the FDA oversee dietary supplements and holding these products to higher safety and efficacy standards would benefit public health, according to the statement authors.

Recent studies have examined whether some individuals with a body-mass index (BMI) that meets the criteria for obesity can maintain healthy blood pressure, cholesterol levels, blood sugar and levels of fats in the blood called triglycerides. The statement authors concluded metabolically healthy obesity is likely a short-term state, and individuals who fit the criteria are likely to develop metabolic and cardiovascular problems over time.

"Effectively treating obesity is crucial if we are going to be able to address the devastating impact diabetes and cardiovascular disease have on public health," Bray said. "We are seeing promising research into diabetes medications linked to weight loss, the use of peptides to enhance weight loss, and improved techniques for modulating the way food moves through the digestive system and is absorbed into the body. As our scientific understanding of obesity continues to improve, we hope this will lead to the discovery of new treatment approaches."

Credit: 
The Endocrine Society

Supportive colleagues could be the key to health and fairness at work

The attitudes and behaviours of colleagues towards people returning to work from sick leave can have a big impact on whether or not a worker feels they are fairly treated by their organisation.

According to a new study by researchers at the University of East Anglia (UEA) and Stockholm University, there is a clear link between a person's health and their perceptions of fairness at work over time. The most significant factor in that link is the amount of support a worker feels he or she gets from colleagues.

The research, published in the Journal of Occupational Health Psychology, could have implications for how managers help employees return to work following a period of absence, or how they support those struggling to manage long-term health issues while at work.

According to the study, organisations should make sure they have well-designed routines for workers with health problems. However, organisations also need to take into consideration the wider work environment to ensure colleagues can offer social support while the individual settles back into work.

The research team, led by UEA's Norwich Business School, used data from a large-scale population survey that has been carried out in Sweden every two years since 2006. Their results showed a clear pattern: among 3,200 respondents, all in paid employment, there was an association between three health indicators - self-rated health, depressive symptoms and sickness absence - and social support at work over time. In turn, social support was also strongly linked to perceptions of fairness in decision-making at work over time.

"It's well recognised that the perception of being treated fairly at work has a positive impact on a person's well-being, but the reverse relationship - between an individual's health and their perceptions of justice - is less well understood," said Dr Constanze Eib, an expert in organisational behaviour at Norwich Business School, and lead author of the study.

"Our results show a strong association between feelings of unfairness and the amount of support perceived from colleagues. It could be that when you come back to work you still feel unwell or unhappy, and your co-workers might pick up on this and feel inclined to keep their distance.

"Added to that, they might have been picking up your work while you were away and all this might contribute to them showing you less concern. That can lead to feelings of being less included in workplace discussions, less valued, and a sense that you are not being treated fairly."

Dr Eib added: "If, on the other hand, you feel supported by colleagues, this will change how you behave at work, and what you think about your organisation. It comes down to managers really caring about their employees. They need to make sure they understand their workforce and can foster a supportive culture between colleagues - as well as taking steps to ensure procedures and decision-making processes are unbiased, robust and transparent."

Credit: 
University of East Anglia

Spicing it up: High school students may prefer seasoned veggies over plain

UNIVERSITY PARK, Pa. -- High school students prefer vegetables seasoned with herbs and spices, rather than plain veggies, according to Penn State researchers, who add this may lead to students liking and eating more vegetables, and result in less food waste in schools.

The researchers asked high school students to rate the taste of a variety of vegetables seasoned with either oil and salt or a blend of herbs and spices developed specifically for each vegetable. The majority of the students preferred the seasoned version of most of the veggies, despite the fact that many of the seasoning blends were new to them.

Kathleen Keller, associate professor of nutritional sciences, said that schools that serve vegetables that are more appealing may ultimately cut down on the amount of food that is thrown out.

"Despite the fact that many of the kids hadn't previously been exposed to a lot of different herbs and spices, our results showed that they liked and preferred the seasoned vegetables over the plain ones," Keller said. "I think that if schools were to implement these simple recipes, they might have more success than if they just serve vegetables with oil and salt or nothing at all."

Eating enough vegetables is essential for good health: Consuming a lot of veggies has been linked with a reduced risk of both heart disease and some cancers. But while the current Dietary Guidelines for Americans recommend that teens between the ages of 14 and 18 eat between two-and-a-half and four cups of vegetables a day, the researchers said only about 2 percent of American adolescents eat enough.

Schools in the United States that participate in the National School Lunch Program have to follow federal guidelines for what they serve for lunch, including offering a variety of vegetables throughout the week and limiting fat and salt. But offering vegetables doesn't guarantee students will eat them, and discarded vegetables can add up to a lot of wasted food.

"When we talked to the students, we learned the most important thing to them when it came to eating vegetables was the taste," Keller said. "For the school lunch food service workers, they were worried about sticking to the federal guidelines, food waste and making food kids will like without recipes to work from."

The researchers gave the students taste tests to measure how much they liked a variety of plain versus seasoned vegetables. The plain vegetables were prepared with a small amount of oil and salt, while the seasoned vegetables were prepared with a seasoning blend specifically developed for each vegetable. For example, the corn and peas were seasoned with a blend designed to mimic the flavor of nachos.

Taste tests were held for eight different vegetables, with about 100 students participating in each test. All recipes used either frozen or canned vegetables, to make them easy to replicate by the school's food service workers.

The researchers found that overall, the students preferred the seasoned version of the vegetables, except for the sweet potatoes. The seasoned versions of corn and peas, black beans and corn, cauliflower, broccoli, green beans, and carrots were all rated as better tasting than the plain versions.

Juliana Fritts, graduate student in food science, said in addition to being more appealing to students, another benefit was that the seasonings were developed specifically with school dietary guidelines in mind.

"The guidelines have limits for how much salt and fat can be used in each meal, so that's part of the reason why the school was serving the vegetables plain, to stay within those guidelines," Fritts said. "These blends and recipes were developed specifically within these guidelines, so schools don't have to worry if it will make them go over the fat and salt limits."

Keller said that in the future, the researchers will look at whether adding herbs and spices to the vegetables will result in the students eating more vegetables. She added that the results -- published in the journal Food Quality and Preference -- have broader implications for how parents feed their kids, even from an early age.

"I'm a big proponent of not being afraid to use herbs and spices, even with young kids or toddlers," Keller said. "Even with baby food you can add small amounts of herbs or spices, so they're getting used to the flavors of the family. If you're going to be using these things in the foods you make for your family later on, it doesn't make sense to avoid them when kids are little."

Credit: 
Penn State

Current deforestation pace will intensify global warming, study alerts

The global warming process may be even more intense than originally forecast unless deforestation can be halted, especially in the tropical regions. This warning has been published in Nature Communications by an international group of scientists.

"If we go on destroying forests at the current pace - some 7,000 km² per year in the case of Amazonia - in three to four decades, we'll have a massive accumulated loss. This will intensify global warming regardless of all efforts to reduce greenhouse gas emissions," said Paulo Artaxo, a professor at the University of São Paulo's Physics Institute (IF-USP).

Reaching the conclusion

The group reached this conclusion after mathematically reproducing the planet's current atmospheric conditions through computer modeling, using a numerical model of the atmosphere developed by the Met Office, the UK's national meteorological service.

The model included meteorological factors such as levels of aerosols, anthropogenic and biogenic volatile organic compounds (VOCs), ozone, carbon dioxide, and methane, along with other items that influence global temperature - surface albedo among them. Albedo is a measure of the reflectivity of a surface; applied to Earth, it indicates how much of the Sun's energy is reflected back into space. The fraction absorbed changes according to the type of surface.
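To make the role of albedo concrete, here is a minimal sketch using the classic zero-dimensional energy-balance estimate. The solar constant and albedo values are standard textbook figures, not parameters from the Met Office model used in the study.

```python
# Zero-dimensional energy balance: absorbed sunlight = emitted thermal radiation.
SOLAR_CONSTANT = 1361.0      # W/m^2, incoming solar irradiance at Earth
STEFAN_BOLTZMANN = 5.67e-8   # W/m^2/K^4

def equilibrium_temp_k(albedo):
    """Effective temperature at which absorbed sunlight balances emitted heat."""
    # Divide by 4 because sunlight hits a disc but Earth radiates as a sphere.
    absorbed = SOLAR_CONSTANT * (1.0 - albedo) / 4.0
    return (absorbed / STEFAN_BOLTZMANN) ** 0.25

print(round(equilibrium_temp_k(0.30), 1))  # Earth's mean albedo ~0.3 -> ~255 K
print(round(equilibrium_temp_k(0.28), 1))  # a darker planet is slightly warmer
```

A lower albedo means more energy absorbed and a higher equilibrium temperature, which is why changes to surface reflectivity feed directly into the radiative balance the article describes.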

The work, coordinated by University of Leeds (UK) researcher Catherine Scott, was also based on years of analyses and surveys of the functioning of tropical and temperate forests, the gases emitted by vegetation, and their impact on climate regulation. Collection of data regarding tropical forests was coordinated by Artaxo, as part of two Thematic Projects supported by the São Paulo Research Foundation - FAPESP: "GoAmazon: interactions of the urban plume of Manaus with biogenic forest emissions in Amazonia", and "AEROCLIMA: direct and indirect effects of aerosols on climate in Amazonia and Pantanal". Data on temperate forests were obtained in Sweden, Finland, and Russia. Collection was coordinated by Erik Swietlicki, a professor at Lund University in Sweden.

Understanding how tropical forests control temperature

"After adjusting the model to reproduce the current conditions of Earth's atmosphere and the rise in surface temperatures that has occurred since 1850, we ran a simulation in which the same scenario was maintained but all forests were eliminated," Artaxo said. "The result was a significant rise of 0.8 °C in mean temperature. In other words, today the planet would be almost 1 °C warmer on average if there were no more forests."

The study also showed that the difference observed in the simulations was due mainly to emissions of biogenic VOCs from tropical forests.

"When biogenic VOCs are oxidized, they give rise to aerosol particles that cool the climate by reflecting part of the Sun's radiation back into space," Artaxo said. "Deforestation means no biogenic VOCs, no cooling, and hence future warming. This effect was not taken into account in previous modeling exercises."

Temperate forests produce different VOCs with less capacity to give rise to these cooling particles, he added.

The article notes that forests cover almost a third of the planet's land area, far less than before human intervention began. Huge swathes of forest in Europe, Asia, Africa and the Americas have been cleared.

"It's important to note that the article doesn't address the direct and immediate impact of forest burning, such as emissions of black carbon [considered a major driver of global warming owing to its high capacity for absorbing solar radiation]. This impact exists, but it lasts only a few weeks. The article focuses on the long-term impact on temperature variation," Artaxo said.

Deforestation, he stressed, permanently affects the amount of aerosols and ozone in the atmosphere, changing the atmosphere's entire radiative balance.

"The urgent need to keep the world's forests standing is even clearer in light of this study. It's urgent not only to stop their destruction but also to develop large-scale reforestation policies, especially for tropical regions. Otherwise, the effort to reduce greenhouse gas emissions from fossil fuels won't make much difference," Artaxo said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

UH researchers uncover link between heart attacks and inflammatory bowel disease

CLEVELAND - University Hospitals Harrington Heart & Vascular Institute researchers Muhammad Panhwar, MD, and Mahazarin Ginwalla, MD, recently concluded a study of more than 22 million patients that suggests a strong connection between Inflammatory Bowel Disease (IBD) and the development of heart disease and heart attacks.

The study, "Risk of Myocardial Infarction in Patients with Inflammatory Bowel Disease," was unveiled at this year's American College of Cardiology meeting in Orlando, Fla. The study was one of five featured at the meeting.

Inflammation has long been recognized as playing a key role in the development of heart disease. IBD is an umbrella term for two chronic inflammatory conditions that affect the gastrointestinal tract - Ulcerative Colitis and Crohn's disease. The Centers for Disease Control estimates that nearly 3 million Americans have IBD, with 70,000 new cases emerging every year. While studies have shown a clear increased risk of heart disease in other chronic inflammatory conditions such as lupus and rheumatoid arthritis, this link is unclear in patients with IBD.

Drs. Panhwar and Ginwalla used myocardial infarction (heart attack) as an indicator for heart disease for their research. They used IBM's Explorys, which is a large database that aggregates electronic medical records from 26 healthcare systems nationwide.

The 3-year study concluded that in the more than 22 million patients who were assessed, heart attacks were almost twice as common in patients with IBD (5.9 percent in patients with IBD compared to 3.5 percent in patients without IBD). They also found that traditional risk factors for heart disease such as high blood pressure, diabetes, and smoking were also more prevalent in patients with IBD. After adjusting for age, race, sex, and traditional heart-disease risk factors, Dr. Panhwar and his colleagues found that the patients with IBD had about 23 percent higher odds of having a heart attack.
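The raw contrast behind these figures can be sketched from the reported prevalences alone. Note that the study's 23 percent figure is an odds ratio adjusted for age, race, sex, and risk factors, which this back-of-the-envelope calculation cannot reproduce; the `odds` helper below is written for this illustration.

```python
# Unadjusted comparison from the reported prevalences: 5.9% with IBD vs 3.5% without.
def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

p_ibd, p_no_ibd = 0.059, 0.035

prevalence_ratio = p_ibd / p_no_ibd           # "almost twice as common"
unadjusted_or = odds(p_ibd) / odds(p_no_ibd)  # raw odds ratio, before adjustment

print(round(prevalence_ratio, 2), round(unadjusted_or, 2))  # 1.69 1.73
```

The gap between this raw odds ratio (about 1.73) and the adjusted figure (about 1.23) reflects the fact that traditional risk factors such as hypertension, diabetes, and smoking were themselves more prevalent in the IBD group.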

Dr. Panhwar and colleagues also observed that the highest risk was in younger patients (less than 40 years of age), while IBD is commonly diagnosed between the ages of 15 and 30. Earlier age at diagnosis and female gender are associated with increased levels of inflammation and more aggressive, disabling disease.

"The disproportionately increased levels of inflammation in younger patients who may not otherwise have the traditional heart disease risk factors may explain the increased risk seen in these patients," said Dr. Panhwar. "Given that over 3 million people (~1.2 percent of the population) in the United States suffer from IBD, a large number of them may have heart disease that has gone under the radar. Our hope is that our study encourages more clinicians to screen these patients more aggressively for heart disease."

"Our study adds considerably to a growing set of literature highlighting the chronic inflammation in IBD as having a role in the development of cardiovascular disease," said Dr. Ginwalla, who is also Assistant Professor of Medicine at Case Western Reserve University School of Medicine. "Clinicians who care for patients with traditional cardiovascular risk factors who also have IBD should recognize IBD as a cardiovascular risk factor as well and treat it appropriately."

The study also highlights special populations that may be at particularly high risk such as young women and African-American patients who will benefit from closer monitoring and preventive screening.

Credit: 
University Hospitals Cleveland Medical Center

Shoddy safety/effectiveness evidence behind India's top selling diabetes drug combos

The evidence on which India's top selling drug combinations for diabetes have been approved for sale is shoddy, with the requisite trial data falling well short of the international standards set by the World Health Organization (WHO), finds the first study of its kind published in the online journal BMJ Global Health.

So poor are the data that not only could the health of patients with type 2 diabetes potentially be put at risk, but the findings also call into question the role of the multinational corporations behind the manufacture of these drug combos, say the researchers.

India is often referred to as 'the diabetes capital of the world', with 60 million of its people affected by type 2 diabetes. Two thirds of all diabetes drugs sold in India are what are known as fixed dose combinations, or FDCs for short, often containing metformin and one other drug.

The widespread use of FDCs runs counter to national and international guidelines which don't recommend their use for treating type 2 diabetes, because of the need for constant monitoring and adjustment of drug doses to maintain adequate blood glucose control.

Amid mounting national and international concerns about the drug regulatory system in India, where more than 100 new medicines are approved each year by the Central Drugs Standard Control Organization (CDSCO), the UK and US research team set out to scrutinise the clinical trial data for the top five best selling FDCs.

They combined data on these FDCs from 25 published and unpublished trials with information from a commercial drugs sales database (PharmaTrac) for the 12 months from November 2011 to October 2012.

The trial data were reviewed in the context of the four WHO standards set for FDC approvals in 2005. India has its own regulations for FDCs, but the WHO guidelines are more comprehensive and rigorous, say the researchers.

They comprise minimum requirements for the conduct of FDC clinical trials on size (several hundred to several thousand participants); duration (at least six months); design (whether the combination is better than the individual drugs); and side effects (the pros have to outweigh the cons).

None of the 25 trials met all four WHO criteria.

For example, only two trials had more than 500 participants; only 10 lasted at least six months; only one had been designed to show whether the combination was better than the individual drugs alone; no study assessed the balance between the pros and cons. And only three trials had been conducted in India.
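The four criteria and this tally can be made concrete with a hypothetical checklist. The thresholds below paraphrase the text rather than the WHO's formal wording, and `who_criteria_met` is a function invented for this illustration.

```python
# Hypothetical encoding of the four WHO standards described above.
def who_criteria_met(n_participants, months, superiority_design, benefits_outweigh_harms):
    """Return which of the four WHO criteria a trial satisfies."""
    return {
        "size": n_participants >= 500,            # several hundred to several thousand
        "duration": months >= 6,                  # at least six months
        "design": superiority_design,             # combo tested against individual drugs
        "risk_benefit": benefits_outweigh_harms,  # pros shown to outweigh cons
    }

# A trial with 320 participants over 3 months, no superiority comparison,
# and no risk-benefit assessment fails all four criteria:
result = who_criteria_met(320, 3, False, False)
print(all(result.values()))  # False
```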

The five top sellers accounted for 80 percent of all metformin FDC sales by value and volume. While all five had been given the green light by CDSCO, three had already been sold and marketed before they were submitted for regulatory approval.

What's more, scrutiny of the clinical trial data for these five FDCs revealed that not one provided any evidence of safety or effectiveness for the treatment of type 2 diabetes.

In fact, four of the five were on the list of FDCs banned by the Ministry of Health and Family Welfare in March 2016--a ban that was subsequently overturned by the Delhi High Court, a decision upheld by the Supreme Court in 2017, following extensive lobbying by the multinationals behind the manufacture of FDCs.

Eighteen of the 25 trials were sponsored by multinational corporations or conducted by pharmaceutical companies, prompting the researchers to question the independence of the approvals process.

"The poor quality of available published trials and their funding sources raise concerns about the motivation for conducting these trials and whether the sponsors are using them for seeding or marketing purposes to gain a foothold in country markets," write the researchers.

They go on to say that the CDSCO should go public on the data it used to approve these drugs, given the ostensible lack of evidence on their safety and effectiveness.

"If that evidence does not extend beyond the trials reviewed here, those FDCs should be banned immediately," they urge, adding that "legislation setting out clinical trial requirements for new drugs should be revised to prevent irrational FDCs from entering the market."

Credit: 
BMJ Group

Strict eating schedule can lower Huntington disease protein in mice

New research from the University of British Columbia suggests that following a strict eating schedule can help clear away the protein responsible for Huntington disease in mice.

Huntington disease (HD) is an inherited, progressive disorder that causes involuntary movements and psychiatric problems. Symptoms appear in adulthood and worsen over time. Children born to a parent with HD have a one in two chance of inheriting the disease, which is caused by a buildup of mutant huntingtin protein (mHTT).

In research published today, scientists stimulated autophagy--a process in which the cell cleans out debris and recycles cellular material such as proteins--by restricting access to food in mice with HD to a six-hour window each day. This led to significantly lower levels of mHTT in the brain.

"We know that specific aspects of autophagy don't work properly in patients with Huntington disease," said study lead author Dagmar Ehrnhoefer, who conducted the study while she was a researcher with the UBC Centre for Molecular Medicine and Therapeutics. "Our findings suggest that, at least in mice, when you fast, or eat at certain very regulated times without snacking in between meals, your body starts to increase an alternative, still functional, autophagy mechanism, which could help lower levels of the mutant huntingtin protein in the brain."

The researchers also uncovered a clue in the mystery of why mice expressing a modified form of the HD gene show no HD symptoms. The genetic modification prevents the mHTT protein from being cut, or cleaved, at a specific site. These mice had higher rates of autophagy than mice with regular, cleavable mHTT, indicating that the cleavage site is important for regulating autophagy.

While current therapeutic strategies to lower mHTT are targeted at the Huntington disease gene, this new research suggests that another potential treatment approach is to stimulate autophagy, either through diet or the development of therapies that target the cleavage site.

Dale Martin, study co-author who conducted the research while he was a postdoctoral research fellow with the UBC Centre for Molecular Medicine and Therapeutics, said the findings demonstrate that seemingly small lifestyle changes could have an impact on HD.

"HD is a devastating disease with no cure available at this time," said Martin. "More studies are needed, but perhaps something as simple as a modified dietary schedule could provide some benefit for patients and could be complementary to some treatments currently in clinical trials."

Credit: 
University of British Columbia

BODE may overestimate transplant benefit in COPD patients

Glenview, IL - COPD remains the leading indication for lung transplantation worldwide and accounts for one third of all lung transplants performed. To qualify for a lung transplant, patients undergo a rigorous evaluation designed to identify and exclude those with an excessive burden of comorbid conditions. The body mass index, airflow obstruction, dyspnea, and exercise capacity (BODE) score is used to inform prognostic considerations for potential lung transplant candidates. This scoring system is widely used but has yet to be validated in the context of lung transplantation. In a new study published in the journal CHEST®, researchers aimed to determine whether patients selected as transplant candidates survive longer than their BODE scores would predict.

Researchers performed a retrospective analysis of survival according to the BODE score for 4,000 COPD patients in the United Network for Organ Sharing database of lung transplant candidates. They compared this survival against that observed in the cohort of COPD patients in which the BODE score was originally validated.

They found that in models controlling for BODE score and incorporating lung transplantation as a competing end point, the risk of death was higher in the BODE validation cohort. In other words, patients selected as candidates for lung transplantation survive considerably longer than predicted by the commonly used prognostic estimates extrapolated from the BODE validation cohort. Results also indicated that nonrespiratory causes of death were more common in the nontransplant cohort, supporting the idea that the comorbid illnesses screened out by the transplant selection process contribute substantially to mortality.
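The competing-risks framing used here can be illustrated with a minimal cumulative-incidence estimator (the Aalen-Johansen approach, with death and transplantation as competing events). The code and toy data below are an illustrative sketch, not the study's actual analysis, which used regression models on the full registry.

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence for one cause.

    times  : event/censoring time for each patient (assumed distinct here;
             a production implementation must also handle tied times)
    events : 0 = censored, 1 = death, 2 = transplant (competing event)
    cause  : which event code to accumulate incidence for
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    overall_surv = 1.0   # Kaplan-Meier survival from *any* event
    cif = 0.0
    curve = []
    for i in order:
        if events[i] == cause:
            # increment incidence by P(still event-free) * hazard at this time
            cif += overall_surv * (1 / n_at_risk)
        if events[i] != 0:
            overall_surv *= 1 - 1 / n_at_risk
        n_at_risk -= 1
        curve.append((times[i], cif))
    return curve

# Toy cohort of five candidates: death at t=1, transplant at t=2,
# death at t=3, censored at t=4, death at t=5.
times = [1, 2, 3, 4, 5]
events = [1, 2, 1, 0, 1]
death_curve = cumulative_incidence(times, events, cause=1)
```

Treating transplantation as a competing event rather than as censoring matters: censoring transplanted patients would overstate the incidence of death on the waiting list.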

"Survival of patients with COPD who are considered candidates for lung transplantation is significantly better than would be predicted by extrapolation of survival from the cohort in which the BODE score was validated," said Dr. Robert Reed, key researcher. "This is likely due to a lower prevalence of comorbid conditions attributable to the lung transplant evaluation screening process."

Credit: 
American College of Chest Physicians

Social status influences infection risk and disease-induced mortality

image: These are hyenas at a clan communal den.

Image: 
Sarah Benhaiem/Leibniz-IZW

Spotted hyena cubs of high-ranking mothers have a lower probability of infection with canine distemper virus (CDV) and are less likely to die from it than cubs of low-ranking mothers. In subadults and adults, the picture is reversed: high-ranking females exhibit a higher infection probability than low-ranking females, whereas mortality is similar for both groups. These are the surprising results of a long-term study by scientists at the German Leibniz Institute for Zoo and Wildlife Research (IZW), who investigated how social status and age influence the risk of CDV infection and its consequences for survival. The results have just been published in the scientific journal "Functional Ecology".

Epidemics of contagious diseases spread through contact or proximity in social species, including humans. Spotted hyenas are highly social animals that live in structured clans with strict dominance hierarchies. Mothers keep their offspring together in the clan's communal den, akin to human schools and kindergartens, where diseases can easily spread. In 1993/1994 an epidemic of canine distemper virus swept through the Serengeti ecosystem in Tanzania and Kenya. The epidemic was caused by a novel, virulent CDV strain that was well adapted to spotted hyenas and other non-canid carnivores. During this epidemic, the hyenas' communal dens were hotspots of CDV infection, and many juveniles showed clinical signs of CDV infection or died from it.

The scientists used two decades of detailed demographic, social and infection data from individually recognized hyenas in three clans in the Serengeti National Park in Tanzania. "We followed 625 females and 816 males in total. In this long-term study we monitored their health status by recording the start and end of clinical signs of CDV and by collecting hundreds of samples, mainly saliva and faecal samples from known individuals to screen for the presence of CDV", says Dr Sarah Benhaiem, scientist at the Leibniz-IZW.

The results illustrate that two different mechanisms drive infection patterns in cubs and older animals. Cubs of high-ranking mothers were less likely to be infected and, once infected, less likely to die than cubs of low-ranking mothers. Cubs of high-ranking mothers are nursed more frequently than those of low-ranking mothers and can therefore allocate more energy to immune processes; this improved immunocompetence reduced both their risk of disease and their subsequent mortality. In contrast, high-ranking subadults and adult females were more likely to be infected than low-ranking ones, reflecting the more intense social life of high-ranking individuals. This increased chance of infection did not translate into increased adult mortality, however, because, in contrast to cubs and subadults, adult hyenas rarely die from CDV.

In many mammalian societies, such as baboons and spotted hyenas, high-ranking individuals have more social interactions with other members of their group. Elevated social contacts directly increase the risk of infection during an epidemic. This study shows for the first time that a high social status can mitigate the negative impact of a pathogen on mortality in younger age classes whereas a low social rank can hinder its transmission among subadults and adults in a free-ranging group-living mammal. "It would be interesting to investigate the impact of this epidemic at the population level, and to assess how high-ranking and low-ranking individuals contributed to the possible recovery of the Serengeti spotted hyena population", says Dr Marion East, one of the senior authors of the study. "To our knowledge, our study is the first to disentangle the importance of social processes for individual exposure and resource allocation to immune processes in a wildlife population", says Dr Lucile Marescot, scientist at the Leibniz-IZW.

Credit: 
Forschungsverbund Berlin

New test extends window for accurate detection of zika

Diagnosis of Zika infection is complex. Molecular tests are reliable only in the first two to three weeks after infection, while the virus is still circulating in the bloodstream. Antibody tests are confounded by the cross-reactivity of Zika antibodies with those raised against dengue, yellow fever, and Japanese encephalitis viruses after infection or vaccination. A new blood test called ZIKV-NS2B-concat ELISA is faster, less expensive, and extends the window of accurate detection from weeks to months after the onset of infection, giving clinicians a powerful new tool to screen for Zika throughout pregnancy.

The new Zika test is detailed in the scientific journal mBio and was developed by scientists at the Center for Infection and Immunity (CII) at Columbia University's Mailman School of Public Health and their colleagues at the University of California, Berkeley; Ministry of Health of Nicaragua; Walter Reed Army Institute of Research; Erasmus University Medical Centre; New York City Department of Health and Mental Hygiene; New York State Department of Health; and Roche Diagnostics.

To develop and evaluate the test, the researchers analyzed blood samples collected from children in the Nicaraguan Pediatric Dengue Cohort Study, all of whom had previously tested positive for Zika virus. Using a microarray, they identified a unique peptide sequence--a short section of amino acids--that binds with antibodies to Zika virus but not with antibodies to similar viruses like dengue, yellow fever, and Japanese encephalitis. Next, the researchers customized a low-cost testing technology called the enzyme-linked immunosorbent assay (ELISA) to work with the sequence, improving on current versions of the ELISA test, which use larger sections of proteins that bind to the virus. (The researchers recently used the same method to build the first multiplex test for tick-borne diseases.)

ZIKV-NS2B-concat ELISA is both highly specific and sensitive, with false-positive and false-negative rates of less than 5 percent in the two to three weeks after acute illness, even in people without symptoms. The test can process up to 200 samples in four hours, and the researchers anticipate its cost will be similar to that of other ELISA tests used in clinical settings.
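The false-positive and false-negative rates quoted above map directly onto the specificity and sensitivity figures usually reported for diagnostic assays. A quick sketch of that bookkeeping, using hypothetical counts rather than the study's data:

```python
def diagnostic_rates(tp, fp, tn, fn):
    """Standard confusion-matrix summary for a diagnostic test."""
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    fnr = fn / (tp + fn)           # false-negative rate = 1 - sensitivity
    fpr = fp / (fp + tn)           # false-positive rate = 1 - specificity
    return sensitivity, specificity, fnr, fpr

# Hypothetical panel: 100 true Zika cases and 100 uninfected controls,
# with 3 missed cases and 4 false alarms.
sens, spec, fnr, fpr = diagnostic_rates(tp=97, fp=4, tn=96, fn=3)
# Both error rates land below the 5 percent threshold described above.
```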

"Many people infected by Zika have only mild illness, or are unable to see a clinician in the early, acute phase of infection," says lead author Nischay Mishra, PhD, an associate research scientist at the Center for Infection and Immunity. "Our new test greatly extends the window during which an individual can be assessed with accuracy."

Infection with Zika virus during pregnancy raises risk for neurodevelopmental problems in the offspring, including fetal microcephaly in at least one in ten pregnancies. In adults, Zika can trigger Guillain-Barré syndrome, which causes the immune system to attack the nerves. Since the emergence of Zika virus in the Americas in 2015, 583,144 cases have been reported to World Health Organization, with costs estimated as high as $18 billion between 2015 and 2017. However, long-term costs will likely be much higher given the additional, as-yet-unknown complications from congenital infections.

"An affordable and accurate test for Zika virus is critical for public health," says senior author W. Ian Lipkin, MD, director of CII and John Snow Professor of Epidemiology at Columbia's Mailman School of Public Health. "Even absent symptoms of illness or evidence of birth defects, Zika may inflict long-term harm on the person infected or their offspring."

Credit: 
Columbia University's Mailman School of Public Health

Bacteria resistant to last-resort antibiotic, missed by standard tests

Emory microbiologists have detected "heteroresistance" to colistin, a last-resort antibiotic, in already highly resistant Klebsiella pneumoniae, a bacterium that causes blood, soft tissue and urinary tract infections.

The results are scheduled for publication in mBio.

David Weiss, PhD, director of the Emory Antibiotic Resistance Center, and his colleagues had previously observed heteroresistance to colistin in another group of bacteria, Enterobacter.

Carbapenem-resistant Enterobacteriaceae (CRE), which include Klebsiella, were listed as one of the top three urgent antibiotic resistant threats in 2013 by the Centers for Disease Control and Prevention. Various types of Klebsiella are estimated to be responsible for 10 percent of infections acquired in health care facilities.

"This is concerning because Klebsiella is a more common cause of infection than Enterobacter, and these isolates were carbapenem-resistant, which means that they might actually be treated with colistin," says Weiss, professor of medicine at Emory University School of Medicine and Emory Vaccine Center. "To our knowledge, this type of heteroresistant Klebsiella has not been observed in the United States before."

The first author of the paper is Immunology and Molecular Pathogenesis graduate student Victor Band. Co-authors include Sarah Satola, PhD, Eileen Burd, PhD, Monica Farley, MD and Jesse Jacob, MD. Burd is director of clinical microbiology lab at Emory University Hospital and Farley is director of the Department of Medicine's Division of Infectious Diseases. Weiss's lab is based at the Yerkes National Primate Research Center, Emory University.

The bacterial isolates came from urine samples from two patients in Atlanta-area hospitals as part of the nationwide Multi-site Gram-Negative Surveillance Initiative, part of the CDC-funded Emerging Infections Program.

Heteroresistance makes bacterial resistance to particular antibiotics harder to monitor. It is caused by a minor subpopulation of resistant bacteria that are genetically identical to the rest of the susceptible population.

The resistance of the bacterial isolates described in the mBio paper was not detectable with current diagnostic tests, although the resistant subpopulation could be seen by waiting an extra 24 hours for it to grow out. Maintaining colistin resistance at all times appears to be disadvantageous for the bacteria. Probing the mechanism of heteroresistance, Weiss and his colleagues identified a signature of colistin resistance in the pattern of genes turned on and off.

In a mouse model of peritonitis (body cavity infection), infection with the heteroresistant isolates was lethal and untreatable by colistin. Colistin is viewed as a last resort measure for bacterial infections that are resistant to other drugs, partly because it is poisonous to the kidneys.

"Clinical laboratories should consider testing for heteroresistance to colistin if this last-line antibiotic is required for treatment," the authors say. "However, the extra time required is a downside. Novel diagnostics that rapidly and accurately detect colistin heteroresistance are needed."

Credit: 
Emory Health Sciences

Study validates tool to assess mortality risk in older patients with orthopedic fractures

image: A new study led by Sanjit R. Konda, MD, assistant professor of orthopedic surgery, provides further validation of a predictive analytics software tool, developed by orthopedic trauma surgeons at NYU Langone, that has been shown to identify which middle-aged and elderly patients with an orthopedic fracture may face a greater mortality risk after surgery.

Image: 
NYU Langone Health

Analytic software developed by orthopedic trauma surgeons at NYU Langone Health accurately identifies which middle-aged and elderly patients face a greater mortality risk following surgery for an orthopedic fracture, according to a new study.

PersonaCARE is a unique predictive deep-learning software tool that uses an algorithm called the "Score for Trauma Triage in the Geriatric and Middle Aged (STTGMA)," which has been validated in previous research. In this latest study, investigators examined whether adding markers of frailty could improve risk assessment and enhance the software's ability to predict outcomes such as mortality, postsurgical complications, or other adverse events. After comparing the original and modified versions, researchers found no significant difference in predicting inpatient mortality.

These findings are being presented Tuesday, March 6, 2018, in poster session at the American Academy of Orthopaedic Surgeons (AAOS) 2018 Annual Meeting in New Orleans. The findings were also published in the December 2017 issue of Geriatric Orthopaedic Surgery & Rehabilitation.

"PersonaCARE can better inform health care providers and families on mortality risk after an orthopedic trauma, helping them collaboratively make better care decisions that are in the best interest of patients," says Sanjit R. Konda, MD, an assistant professor of orthopedic surgery in the division of trauma surgery at NYU School of Medicine and director of Geriatric Orthopedic Trauma at NYU Langone Health. "Geriatric inpatients with a higher mortality risk also typically incur additional resources and more than double the hospital costs of patients with lower mortality risk. Identifying potential problems early can help rein in these costs."

How the Study Was Conducted

Frailty is a syndrome of overall decline in health from illness or trauma, and is thought to be an important predictor of surgical outcomes in elderly patients. However, there is no consensus on the best tool for assessing frailty. Previous research has shown that geriatric trauma is a growing epidemic in the United States as "baby boomers" age; people are living longer and are more active than ever before, but as a result they are sustaining more injuries. Trauma has risen to become the seventh-leading cause of death among U.S. adults 65 years of age and older.

Konda and the trauma surgery team at NYU Langone Orthopedic Hospital created the PersonaCARE software to evaluate four major physiologic criteria: age, medical comorbidities, vital signs, and anatomic injuries. It calculates a score from zero to 100 to estimate a patient's mortality risk. Previous retrospective research by Konda's team, based on data collected at a level-one trauma center over a five-year period, provided evidence to support the algorithm's predictive abilities -- but questions remained about which frailty markers should be factored in to better assess patients' mortality risk.
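The article describes the score's inputs (age, comorbidities, vital signs, anatomic injuries) and its 0-100 range, but not the actual STTGMA weights. A minimal sketch of how such a clamped weighted score could be structured -- with entirely hypothetical coefficients, not the real algorithm:

```python
def triage_score(age, comorbidity_count, abnormal_vitals, injury_severity):
    """Hypothetical 0-100 triage score in the spirit of STTGMA.

    The real algorithm's coefficients are not given in the article;
    the weights below are illustrative only.
    age               : years
    comorbidity_count : number of major comorbid conditions
    abnormal_vitals   : number of abnormal vital signs on presentation
    injury_severity   : 0 (minor) to 10 (severe anatomic injury)
    """
    raw = (0.5 * age
           + 5.0 * comorbidity_count
           + 4.0 * abnormal_vitals
           + 3.0 * injury_severity)
    # clamp to the 0-100 range the article describes
    return max(0.0, min(100.0, raw))

# A frail 85-year-old with three comorbidities scores far higher than
# a healthy 56-year-old with the same injury.
high_risk = triage_score(85, 3, 2, 6)   # 42.5 + 15 + 8 + 18 = 83.5
low_risk = triage_score(56, 0, 0, 6)    # 28 + 18 = 46.0
```

The study's question -- whether extra frailty markers improve the score -- amounts to asking whether adding terms (disability, assistive device use, malnutrition) to such a sum changes its predictive accuracy.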

The new study examined 1,486 patients age 55 and over who presented to the emergency rooms of a level-one trauma center, a tertiary care referral center, and an orthopedic specialty hospital urgent care with an orthopedic or spine fracture between September 2014 and September 2016. Of these, 993 cases were considered "low-energy," such as a ground-level fall or a fall of two stairs or fewer, and 492 were "high-energy," equivalent to more serious traumas such as a fall from a greater height or a motor vehicle accident.

They calculated scores with the PersonaCARE software using the original algorithm and the modified version containing additional markers of frailty: existing disability; need of an assistive device to move around; and malnutrition.

There were 23 high-energy inpatient mortalities (4.7 percent of patients) and 20 low-energy inpatient mortalities (2.0 percent). With the original algorithm, a computational analysis of predictive accuracy yielded scores of 0.926 and 0.896 for the high-energy and low-energy cohorts, respectively. With the tool that included frailty indicators, the corresponding scores were 0.905 and 0.937 -- reflecting no meaningful statistical difference in predictive capability.
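Accuracy figures in this range for a binary outcome are typically areas under the ROC curve (the article does not name the metric, so this is an assumption). AUC has a simple rank interpretation: the fraction of (death, survivor) pairs in which the patient who died received the higher risk score. A self-contained sketch on made-up data:

```python
def auc(scores, died):
    """Mann-Whitney estimate of ROC AUC: the probability that a randomly
    chosen patient who died received a higher risk score than a randomly
    chosen survivor (ties count half)."""
    pos = [s for s, d in zip(scores, died) if d]
    neg = [s for s, d in zip(scores, died) if not d]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Made-up risk scores; higher should mean higher mortality risk.
scores = [12, 35, 40, 80]
died   = [0,  1,  0,  1]
print(auc(scores, died))  # 3 of 4 pairs ranked correctly -> 0.75
```

An AUC of 0.5 is chance-level ranking; values around 0.9, as reported for both cohorts, indicate strong discrimination.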

At NYU Langone Orthopedic Hospital, the PersonaCARE software is currently used for all hip fracture patients who are undergoing a hip replacement or operative fixation. Future research will look at the integration of this software with hospital accounting data and with the electronic medical record and what impact that might have on early intervention and care pathways.

"PersonaCARE can calculate a mortality risk upon hospital admission that can be used as a triage tool to guide older patients who face higher risks of complications and adverse events toward specialized pathways of care," says senior study author Kenneth A. Egol, MD, professor of orthopedic surgery and chief of the Division of Trauma Surgery at NYU Langone.

NYU Langone has developed a commercial version of the PersonaCARE software based on the algorithm used in the study.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Helmet use associated with reduced risk of cervical spine injury during motorcycle crashes

image: This is a bar graph showing the characterization and distribution of cervical spine injuries in helmeted & unhelmeted riders after motorcycle crashes.

Image: 
Copyright 2018 American Association of Neurological Surgeons

CHARLOTTESVILLE, Va. (MARCH 6, 2018). Despite claims that helmets do not protect the cervical spine during a motorcycle crash and may even increase the risk of injury, researchers from the University of Wisconsin Hospitals and Clinics in Madison found that helmet use during an accident lowers the likelihood of cervical spine injury (CSI), particularly fractures of the cervical vertebrae. These findings appear in a new article published today in the Journal of Neurosurgery: Spine: "Motorcycle helmets and cervical spine injuries: a 5-year experience at a Level 1 trauma center," written by Paul S. Page, MD, Zhikui Wei, MD, PhD, and Nathaniel P. Brooks, MD.

In Europe you're unlikely to find someone riding a motorcycle without a helmet; universal laws requiring motorcycle helmet use are applied throughout the European Union. In the United States, on the other hand, laws on helmet use vary from state to state, with some states requiring helmet use for all riders and others limiting the requirement to persons under the age of 18.

According to National Highway Traffic Safety Administration (NHTSA) estimates, helmets saved the lives of 1,859 motorcycle riders in 2016, and an additional 802 lives could have been saved if every motorcyclist had worn one. Wearing a helmet decreases the incidence and severity of traumatic brain injury during crashes. What, then, are the objections to universal laws requiring motorcycle helmet use?

Major reasons cited for not requiring helmets while riding a motorcycle include freedom of choice, avoiding any limitation on vision, and a perceived increased risk of receiving a cervical spine injury (CSI). This last reason is based on the belief that the added weight of a helmet might increase torque on the cervical spine.

The risk to the cervical spine is what this study addresses. Over the years there have been a variety of studies on helmet use and CSI in motorcycle crashes; a couple of reports indicated an increased risk of CSI among helmeted riders, while most studies found neither a protective effect nor a harmful biomechanical risk to the cervical spine. Page and colleagues hypothesized that helmet use is not associated with an increased risk of CSI during a motorcycle crash and may instead provide some protection to the wearer. In this paper the researchers provide case evidence to support their hypothesis.

The researchers reviewed the charts of 1061 patients who had been injured in motorcycle crashes and treated at a single Level 1 trauma center in Wisconsin between January 1, 2010, and January 1, 2015. Of those patients, 323 (30.4%) were wearing helmets at the time of the crash and 738 (69.6%) were not. (Wisconsin law does not require all riders to wear a helmet.)

At least one CSI was sustained by 7.4% of the riders wearing a helmet and 15.4% of those not wearing one; this difference in percentages is statistically significant (p = 0.001). Cervical spine fractures occurred more often in patients who were not wearing helmets (10.8% compared to 4.6%; p = 0.001), as did ligament injuries (1.9% compared with 0.3%; p = 0.04); again these differences are statistically significant. There were no significant differences between groups (helmeted vs. unhelmeted riders) with respect to other types of cervical spine injuries that were sustained: nerve root injury, cervical strain, or cord contusion.
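The significance claim for the headline comparison can be reproduced from the reported figures with a standard two-proportion z-test. The counts below are reconstructed from the percentages (7.4% of 323 helmeted riders is about 24 CSIs; 15.4% of 738 unhelmeted riders is about 114), so this is an approximate check rather than the authors' exact computation:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate counts from the article: ~24 CSIs among 323 helmeted
# riders versus ~114 among 738 unhelmeted riders.
z, p = two_proportion_z(24, 323, 114, 738)
# p comes out well below 0.05, consistent with the reported p = 0.001.
```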

In summary, Page and colleagues show that helmet use is associated with a significantly reduced likelihood of sustaining a CSI during a motorcycle crash, particularly fractures of the cervical vertebrae.

Although the study population is small, the authors believe the results provide additional evidence in support of wearing helmets to prevent severe injury in motorcycle crashes. When asked about the findings, Dr. Brooks stated, "Our study suggests that wearing a motorcycle helmet is a reasonable way to limit the risk of injury to the cervical spine in a motorcycle crash."

Credit: 
Journal of Neurosurgery Publishing Group