
1. ACP calls for policies that better support women and their families and improve health outcomes

Free Content-Paper: http://annals.org/aim/article/doi/10.7326/M17-3344

Free Content-Editorial: http://annals.org/aim/article/doi/10.7326/M18-1258

Note: B-roll and sound bites from ACP President, Ana Maria Lopez, MD, FACP, are available to view and download at http://www.dssimon.com/MM/ACP-womens-health/.

URLs go live when the embargo lifts

A new position paper from the American College of Physicians (ACP) examines the unique challenges women face within the U.S. health care system and calls for policies to better support them. The paper addresses a wide range of issues, such as support for paid family and medical leave; recommendations on policies to reduce domestic violence, sexual abuse and harassment; and participation in clinical trials. The paper also addresses access to coverage, including coverage for medically necessary reproductive services, and states ACP's opposition to policies that would create barriers to women's access to reproductive health services. The paper, "Women's Health Policy," is published in Annals of Internal Medicine.

ACP urges policymakers to strongly consider how to better integrate women's health needs over the course of their lifetimes. Ensuring that women have access to non-discriminatory health care coverage is essential to improving the overall health and well-being of women living in the U.S., and it is a longstanding goal of ACP.

This paper was developed by ACP's Health and Public Policy Committee, which addresses issues that affect the health care of the American public and the practice of internal medicine and its subspecialties. ACP's evidence-based public policy positions are based on reviewed literature and input from the ACP's Board of Governors, Board of Regents, additional ACP councils, and nonmember experts in the field.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview ACP president Ana Maria Lopez, MD, FACP, please contact Jacquelyn Blaser at jblaser@acponline.org or 202-261-4572.

2. Prediction tool helps tailor lung cancer screening to patient benefit and preferences

Abstract: http://annals.org/aim/article/doi/10.7326/M17-2561

Editorial: http://annals.org/aim/article/doi/10.7326/M18-1350

URLs go live when the embargo lifts

Utilizing a lung cancer risk prediction tool may help clinicians tailor lung cancer screening recommendations for eligible patients based on clinical benefit and personal preferences, as opposed to taking a one-size-fits-all approach to screening. The findings of a microsimulation study are published in Annals of Internal Medicine.

The National Lung Screening Trial found that annual screening with low-dose computed tomography (LDCT) substantially reduced mortality from lung cancer, the leading cause of cancer death in the United States. However, lung cancer screening can result in substantial harms and costs, making program implementation complicated. Patients make decisions about screening based on how they value the tradeoffs and risk. As such, many health systems are exploring how to implement LDCT screening programs that are effective and also take into consideration individual patient preferences.

Researchers from the University of Michigan Medical School created a simulation model, which used data from large randomized trials and could individualize estimates of risk and benefit, to examine the factors that determine when LDCT screening is preference-sensitive. The model found that moderate differences in preferences about the downsides of LDCT screening influenced whether screening was appropriate for some eligible persons (annual lung cancer risk less than 0.3 percent or life expectancy less than 10.5 years). For higher-risk persons with longer life expectancy (roughly 50 percent of the eligible population), the benefits of screening outweighed even highly negative views about screening. Those with higher risk and longer life expectancy also had a robust net benefit even in scenarios where false-positive rates were very high (i.e., a 60 percent rate of false-positive findings).

The web-based decision tool that incorporates the rules of thumb derived from the researchers' findings can be found at https://share.lungdecisionprecision.com/.
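The preference-sensitivity thresholds reported in the study amount to a simple rule of thumb, which can be sketched as follows. This is an illustration built only from the thresholds quoted above, not the authors' actual web-based tool, which is more sophisticated.

```python
# Sketch of the "rules of thumb" reported above. The two thresholds
# come from the text; the function name and interface are illustrative.
def is_preference_sensitive(annual_risk_pct, life_expectancy_yrs):
    """True when moderate preference differences may change the screening decision."""
    return annual_risk_pct < 0.3 or life_expectancy_yrs < 10.5

print(is_preference_sensitive(0.2, 15.0))  # True: lower-risk patient
print(is_preference_sensitive(0.8, 20.0))  # False: robust net benefit
```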

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Tanner J. Caverly, MD, MPH, please contact Kara Gavin at kegavin@med.umich.edu.

3. Researchers warn that drinking horsetail juice may lead to severe hyponatremia

Abstract: http://annals.org/aim/article/doi/10.7326/L18-0028

URLs go live when the embargo lifts

Researchers warn that drinking horsetail juice may lead to severe, acute hyponatremia, a condition in which the body's sodium level becomes very low, causing the cells to swell. A brief case report is published in Annals of Internal Medicine.

Horsetail is a perennial plant that has been used as an herbal remedy since at least Greek and Roman times. However, it may be associated with an unusual adverse event.

Researchers from Haga Hospital, The Hague, the Netherlands, described the case of a 32-year-old woman who presented to the emergency department feeling restless and anxious after consuming a drink made by crushing horsetail plants in a blender. After an initial discharge with instructions to return if her condition worsened, the patient came back to the emergency department with a headache, feeling lethargic, and acting strangely. Before she could be examined, she had a tonic-clonic seizure lasting approximately 30 seconds and terminating on its own.

The researchers believe this is the first reported case of such severe, acute hyponatremia after consuming horsetail juice. They advise clinicians to be aware of this potential adverse event in patients who have consumed even a small amount of the plant.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Chloe C.A. Grim, MD, MSc, please contact her directly at chloegrim@gmail.com.

Credit: 
American College of Physicians

Mongooses remember and reward helpful friends

image: A group of dwarf mongooses, in which different cooperative acts are exchanged even after a delay.

Image: 
Shannon Wild

Dwarf mongooses remember previous cooperative acts by their groupmates and reward them later, according to new work by University of Bristol researchers, published today in the journal Proceedings of the National Academy of Sciences USA.

Market trade was once considered the domain of humans but the exchange of goods and services is now widely recognised in other animals. What the new research shows is that mongooses have sufficient cognitive ability to quantify earlier acts of cooperation and to provide suitable levels of delayed rewards.

Senior author, Professor Andy Radford from Bristol's School of Biological Sciences, said: "Humans frequently trade goods and can track the amount they owe using memories of past exchanges. While nonhuman animals are also known to be capable of trading cooperative acts immediately for one another, more contentious is the possibility that there can be delayed rewards."

Lead author, Dr Julie Kern, also from Bristol, added: "There have been hardly any suitable experimental tests on wild animals, especially non-primates. By working with groups of dwarf mongooses habituated to our close presence, we could collect detailed observations and conduct experimental manipulations in natural conditions."

The study is the first to provide experimental evidence in a wild non-primate population for delayed contingent cooperation--providing a later reward to an individual for the amount of cooperation it has performed. It also offers convincing evidence of cross-commodity trading, whereby individuals reward one type of cooperative behaviour with a different cooperative act. In this case, grooming was traded for sentinel behaviour, which involves an individual adopting a raised position to look out for danger and warning foraging groupmates with alarm calls.

Dr Kern said: "We began by using detailed natural observations collected over many months to show that individuals who perform lots of sentinel duty also receive lots of grooming and are well-positioned in the group's social network. But, to prove a causal link, we needed to nail a tricky field experiment."

Professor Radford added: "Over three-hour periods when groups were foraging, we simulated extra sentinel behaviour by a subordinate group member using playbacks of its surveillance calls--vocalisations given to announce it is performing this duty. At the sleeping burrow that evening, we monitored all grooming events, especially those received by the individual who had had their sentinel contribution upregulated."

The researchers found some striking results. On days when an individual was perceived to conduct more sentinel duty, it received more evening grooming from groupmates than on control days (when its foraging calls had been played back during the preceding foraging session). Moreover, the individual who had had its sentinel contribution upregulated received more grooming than a control subordinate in the group.

Grooming has long been considered an important tradable commodity in social species, being used as a reward in various contexts. The new work shows that this grooming reward does not need to occur immediately after the relevant cooperative act; the increased grooming by mongooses took place at the end of the day when the mongooses had finished foraging and returned to their sleeping burrow.

Dwarf mongooses are Africa's smallest carnivore, living in cooperatively breeding groups of 5-30 individuals. The work was conducted as part of the Dwarf Mongoose Project which has studied habituated wild groups continuously since 2011. The study animals are individually marked with blonde hair dye, are trained to climb onto a balance scale to weigh themselves, and can be watched from a few feet away as they go about their natural behaviour in ecologically valid conditions.

Credit: 
University of Bristol

Homeless populations at high risk of developing cardiovascular disease

Among homeless individuals, cardiovascular disease remains one of the major causes of death due to challenges in predicting initial risk, limited access to health care and difficulties in long-term management, according to a review published today in the Journal of the American College of Cardiology.

In the U.S., roughly 550,000 people are homeless on any given night, and an estimated 2.3 million to 3.5 million people experience homelessness over the course of a year. The median age of the homeless population is 50 years, approximately 60 percent are male and 39 percent are African-American. These demographic groups experience high cardiovascular disease mortality rates, highlighting the need for proper prevention and treatment.

While the prevalence of hypertension and diabetes among homeless individuals is similar to that of the general population, it often goes untreated, leading to worse blood pressure and blood sugar control. Smoking remains the largest contributor to cardiovascular disease mortality in homeless populations, with an estimated 60 percent of ischemic heart disease deaths attributable to tobacco. According to the review, most homeless individuals want to quit smoking, yet quit rates are only one-fifth the national average.

Homeless populations are more likely to drink heavily and to have a history of cocaine use, both of which have been linked to congestive heart failure, atherosclerosis, heart attack and sudden cardiac death. Twenty-five percent of homeless people have a chronic mental illness, contributing to cardiovascular disease risk and complicating diagnoses by impacting motivation to seek care.

In this review, researchers note that none of the cardiovascular disease risk prediction models currently used in clinical practice have been validated in homeless populations, creating a gap in knowledge for the treatment of non-traditional cardiovascular disease risk factors.

"Clinicians need to make a concerted effort to overcome the logistical hurdles to treating and preventing cardiovascular disease in homeless populations," said Stephen W. Hwang, MD, MPH, director of the Centre for Urban Health Solutions of St. Michael's Hospital, and the review's lead author. "Half of homeless individuals don't have access to a consistent source of health care, making follow-up visits and lengthy diagnostic tests a challenge. It's our duty as health care providers to adjust our practices to provide the best possible care for these vulnerable patients."

The authors determined homeless patients are more likely to utilize the emergency department, contributing to a cycle of care focused on immediate needs rather than long-term management. Without health insurance and permanent housing, homeless patients struggle to adhere to medication that requires multiple doses per day.

"We need to apply evidence-based treatment guidelines for patients experiencing homelessness, and cardiologists can work with primary care providers to help achieve this goal," Hwang said.

Recent studies show anywhere from 44 to 89 percent of homeless individuals have cell phones. The review authors suggest that appointment reminders delivered via text message may enhance follow-up visits.

The treatment of homeless patients is complicated by limited access to care, poor medication adherence and inconsistent commitment to evidence-based treatment. The authors suggest that when a diagnosis of cardiovascular disease is confirmed in a homeless patient, clinicians consult a cardiologist about next steps in management and schedule regular follow-up visits to minimize the risk of loss of care. Practical, patient-centered care can ultimately deliver optimal cardiovascular outcomes.

Credit: 
American College of Cardiology

Paramedic-run health sessions in low-income apartments reduced number of 911 calls, improved health

A community-based health promotion program developed by McMaster University and offered by paramedics in low-income apartment buildings significantly reduced the number of 911 calls and improved quality of life for seniors, according to a randomized controlled trial published in CMAJ (Canadian Medical Association Journal).

Few studies exist on the impact of the new and rapidly evolving field of community paramedicine, which offers health care by paramedics outside of emergency visits.

Chronic diseases such as heart disease, diabetes and hypertension often cause older adults living at home to seek emergency care, leading to visits from paramedics. Seniors living in subsidized housing have higher death rates and poorer quality of life because of health issues.

The study looked at the impact of the Community Paramedicine at Clinic (CP@clinic), a weekly drop-in health promotion and prevention program for older adults run by trained paramedics in subsidized-housing buildings in Hamilton, Ontario. It compared buildings that received CP@clinic for one year, in addition to usual health care and wellness programs, with buildings that only received usual health care and nonparamedic wellness programs (control group). CP@clinic offers blood pressure, diabetes and falls assessments; identification of high-risk patients and referral to health care; health education and more. What sets this program apart from other paramedicine initiatives is the ongoing reporting back to family doctors, who can then reconnect with their patients.

In the buildings offering CP@clinic, there were significantly fewer emergency ambulance calls (3.1 calls per 100 units/month) compared with buildings that did not offer the clinics (3.99 calls per 100 units/month), which translates to 22% fewer calls. The clinics picked up undiagnosed hypertension in 36 participants (52.5%) and elevated blood pressure in 75 people (54.7%) with previously diagnosed hypertension. After attending CP@clinic, mean blood pressure for participants with hypertension dropped significantly.
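The 22% figure follows directly from the two call rates quoted; a quick check using the numbers in the text:

```python
# Percent reduction implied by the two ambulance-call rates above.
intervention_rate = 3.10  # calls per 100 units/month, CP@clinic buildings
control_rate = 3.99       # calls per 100 units/month, control buildings

reduction = (control_rate - intervention_rate) / control_rate
print(f"{reduction:.0%}")  # prints "22%"
```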

"The combination of risk factor improvements among participants were significant enough to show changes in participants' diabetes risk category, which implies that CP@clinic is having an impact in reducing participants' risk of developing diabetes," says Dr. Gina Agarwal, Department of Family Medicine, McMaster University, Hamilton, Ontario. "As well, several health-related quality-of-life areas improved in those who attended, such as ability to perform daily tasks and personal care, suggesting that these overall improvements may have led to the reduction in calls in the intervention group compared with the controls."

Building on its previous work, McMaster University ran the successful CP@clinic program with the Hamilton Paramedic Service in 2014 and 2015 in seniors' social housing buildings. The program is still being run by the Hamilton Paramedic Service in the city.

"We estimate that an average of 10-11 calls per 10 apartment units could be avoided each year with programs like this," says Dr. Agarwal. "We think that the difference in ambulance calls in the short term was due to improved health care access, linkage to health care resources and knowledge about when to access these services."

Expanding the CP@clinic program into other subsidized housing buildings across Canada could improve the health of older adults and increase efficiencies in the health care system.

"The CP@clinic can reduce the burden on emergency services -- saving resources for other important areas of health care," she says.

"Because paramedics initiate care for people in their own homes and communities, these health care professionals are well placed to recognize the unmet needs of the community-dwelling individuals they serve and to act proactively to support efforts to stem unnecessary use of emergency medical services," writes Michael Nolan, County of Renfrew, Ontario, with coauthors in a related commentary: http://www.cmaj.ca/lookup/doi/10.1503/cmaj.180642

"The trial further highlights the potential value of deploying a low-cost community paramedicine intervention in a high-risk social-housing setting, because it showed a significant difference in the number of ambulance calls between participants who received the intervention (i.e., attending the CP@clinic) and controls," write the authors.

The research study was funded by the Hamilton Academic Health Sciences Organization and the Canadian Institutes of Health Research (CIHR).

"Evaluation of a community paramedicine health promotion and lifestyle risk assessment program for older adults who live in social housing: a cluster randomized trial" is published May 28, 2018.

Credit: 
Canadian Medical Association Journal

New method for finding disease-susceptibility genes

image: Professor Dougu Nam and his research team in the School of Life Sciences at UNIST.

Image: 
UNIST

A new study affiliated with UNIST presents a novel statistical algorithm capable of identifying potential disease genes more accurately and cost-effectively. The algorithm is also a promising approach for identifying candidate disease genes because it works effectively with less genomic data and takes only a minute or two to produce results.

The work was carried out by Professor Dougu Nam and his research team in the School of Life Sciences at UNIST. Their findings were published in Nucleic Acids Research on March 19, 2018.

In the study, the research team presented the novel method and software GSA-SNP2 for pathway enrichment analysis of GWAS P-value data. According to the research team, GSA-SNP2 provides high power, decent type I error control and fast computation by incorporating the random set model and SNP-count adjusted gene score.
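The need for a SNP-count adjustment can be seen with a toy example. This is purely illustrative: the max-based score and the sample P values below are assumptions for exposition, not the GSA-SNP2 implementation.

```python
import math

# Toy gene score: the strongest SNP association per gene on the
# -log10(P) scale. Genes covered by more SNPs tend to get inflated
# best scores purely by chance; that bias is what a SNP-count
# adjusted gene score is designed to remove.
def best_snp_score(pvalues):
    return max(-math.log10(p) for p in pvalues)

gene_small = [0.04, 0.20, 0.50]    # gene covered by 3 SNPs
gene_large = [0.04] + [0.30] * 29  # gene covered by 30 SNPs, same best P

# The raw scores tie, even though gene_large had ten times as many
# chances to hit a small P value by luck alone.
print(best_snp_score(gene_small) == best_snp_score(gene_large))  # True
```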

"GSA-SNP2 is a powerful and efficient tool for pathway enrichment and network analysis of genome-wide association study (GWAS) summary data," says Professor Nam. "With this algorithm, we can easily identify new drug targets, thereby deepening our understanding of diseases and unlocking new therapies to treat them."

Each individual's genome is a unique combination of DNA sequences that play major roles in determining who we are. It accounts for all individual differences, including susceptibility to disease and diverse phenotypes. Such genetic variations among humans are known as single nucleotide polymorphisms (SNPs). SNPs that correlate with specific diseases could serve as predictive biomarkers to aid the development of new drugs. Through statistical analysis of GWAS summary data, it is possible to identify disease-associated SNPs.

Despite the astronomical amounts of money and time invested in the statistical analysis of SNP data, conventional SNP detection technologies have been unable to identify all possible SNPs. This is because most conventional methods for detecting SNPs are designed to strictly control false positives in the results. Therefore, even when tens of thousands of genomic samples and hundreds of thousands of SNPs are analyzed, the number of markers identified within a candidate disease gene often reaches only several tens.

"Although controlling false positive SNPs is needed for the correct interpretation of the results, too much filtering may hamper its usefulness in drug development," says Professor Nam. "Therefore, enhanced statistical power is essential to practical statistical algorithms."

The team aimed to develop an algorithm that improves statistical power while maintaining accurate control of false positives. To do this, they applied a monotone cubic spline trend curve to the gene scores, in a competitive pathway analysis approach originally developed for gene expression data.

In a comparative study using simulated and real GWAS data, GSA-SNP2 exhibited high power and best prioritized gold standard positive pathways compared with six existing enrichment-based methods and two self-contained methods. Based on these results, the difference between pathway analysis approaches was investigated and the effects of the gene correlation structures on the pathway enrichment analysis were also discussed. In addition, GSA-SNP2 is able to visualize protein interaction networks within and across the significant pathways so that the user can prioritize the core subnetworks for further studies.

According to the research team, GSA-SNP2 provides a greatly improved type I error control by using the SNP-count adjusted gene scores, while nevertheless preserving high statistical power. It also provides both local and global protein interaction networks in the associated pathways, and may facilitate integrated pathway and network analysis of GWAS data.


Credit: 
Ulsan National Institute of Science and Technology (UNIST)

Study finds that chewing gum while walking affects both physical and physiological functions, especially in middle-aged and elderly men

New research presented at this year's European Congress on Obesity (ECO) in Vienna, Austria (23-26 May) shows that chewing gum while walking increases heart rate and energy expenditure. The study was conducted by Dr Yuka Hamada and colleagues at Waseda University, Graduate School of Sport Sciences, Saitama, Tokyo, Japan.

Although there have been a number of studies which have examined the effect of chewing gum on physiological functions while at rest, none have focused specifically on how it impacts the body while walking, which is the basis for this study.

The authors recruited 46 male and female participants aged 21-69 to take part in two trials in random order. In one trial, individuals were given two pellets of gum (1.5 g and 3 kilocalories per pellet) to chew while walking at their natural pace for 15 minutes after a 1-hour rest period. The control trial involved the same 1-hour rest and 15-minute walk; however, participants were instead given a powder to ingest that contained the same ingredients as the gum but did not require chewing.

In each trial, resting heart rate, mean heart rate during walking, distance covered, and cadence (the rate at which steps were taken) were measured. Mean walking speed was calculated from the distance travelled during the 15 minutes, and stride length was estimated from the mean walking speed and mean step count. Total energy expenditure during the walk was estimated based on the mean walking speed and the body mass of each participant.
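The derived measures described above are straightforward to compute. The sketch below uses made-up sample values, not study data, and approximates energy expenditure with the ACSM walking equation, a common assumption here since the paper's exact method is not specified.

```python
# Derived walking measures as described above. All numbers are
# illustrative sample values, not data from the study.
distance_m = 1200.0   # distance covered during the 15-minute walk
steps = 1600          # total step count
body_mass_kg = 70.0
minutes = 15.0

speed = distance_m / minutes   # mean walking speed, m/min
stride = distance_m / steps    # estimated stride length, m

# One common approximation for level walking (ACSM walking equation):
# VO2 (ml/kg/min) = 3.5 + 0.1 * speed, with roughly 5 kcal per litre of O2.
vo2 = 3.5 + 0.1 * speed
kcal = vo2 * body_mass_kg / 1000 * 5 * minutes
print(speed, stride, round(kcal, 1))  # 80.0 0.75 60.4
```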

The study found that in all participants, the mean heart rate while walking as well as the change in heart rate from being at rest was significantly higher in the gum trial than in the control trial.

The team then performed stratified analyses by sex and age, separating the group into male and female, as well as young (39 and under) and middle-aged and elderly (40 and older). Both male and female participants in the gum trial had a significantly higher mean heart rate while walking and a larger change in heart rate; in males, there was also a significant increase in the distance walked and mean walking speed compared to the control trial (see p. 627 of the full paper).

While all ages experienced a significantly larger change in heart rate in the gum trial, middle-aged and elderly participants also had a significantly higher mean heart rate while walking compared to the control.

Combining these analyses to incorporate both sex and age showed that chewing gum had the greatest effect in middle-aged and elderly men who experienced a significant positive effect on distance walked, mean walking speed, mean step counts, mean heart rate while walking, change in heart rate, and total energy expenditure compared to the control trial.

The authors conclude: "Chewing gum while walking affects a number of physical and physiological functions in men and women of all ages. Our study also indicates that gum chewing while walking increased the walking distance and energy expenditure of middle-aged and elderly male participants in particular."

Credit: 
European Association for the Study of Obesity

Bumblebees confused by iridescent colors

image: This is a bumblebee landing on an iridescent target.

Image: 
Karin Kjernsmo

Iridescence is a form of structural colour which uses regular repeating nanostructures to reflect light at slightly different angles, causing a colour-change effect.

It is common in nature, from the dazzling blues of peacocks' feathers to the gem-like appearance of insects.

Although using bright flashy colours as camouflage may seem counterintuitive, researchers at the Bristol Camo Lab found that intense iridescence obstructs the bumblebee's ability to identify shape.

The eyes of bumblebees have been extensively studied by scientists and are very similar to those of other insects.

They can be used as a visual model for predatory insects such as wasps and hornets. When presented with different types of artificial flower targets rewarded with sugar water, the bees learned to recognise which shapes contained the sweet reward.

However, the bees found it much more difficult to discriminate between flower shapes when the targets were iridescent.

This study, which uses bumblebees as a model for predatory insect vision and cognition, is the first to show that iridescence can indeed deceive predators and make them overlook prey, in the same way that disruptive camouflage breaks up the otherwise recognisable outline of a prey animal.

The changing colours make the outline of the prey look completely different to the shape the predators are searching for.

The researchers concluded that iridescence produces visual signals which can confuse potential predators, and this may explain its widespread occurrence in nature.

Lead author Dr Karin Kjernsmo of the University of Bristol's School of Biological Sciences, said: "It's the first solid evidence we have that this type of colouration can be used in this way.

"Thus, if you are a visual predator searching for the specific shape of a beetle (or other prey animal), iridescence makes it difficult for predators to identify them as something edible. We are currently studying this effect using other visual predators, such as birds, as well. This is because birds are likely to be the most important predators of iridescent insects."

Links between iridescence and camouflage were first made over one hundred years ago by an American naturalist named Abbott Thayer, who is often referred to as "the father of camouflage".

He published a famous book on different types of camouflage, such as mimicry, shape disruption and dazzle, which is thought to have inspired the "Razzle Dazzle" painting of battleships during the First World War.

However, iridescence has been rather overlooked for the past century, as it is often assumed to be purely for attracting mates and displaying to other individuals.

The UK has several species of iridescent beetle, the largest of which is the Rose Chafer, whose superb green and gold colour-changing wing cases can commonly be spotted on flowers in grasslands during the summer.

Dr Kjernsmo added: "This study has wider implications for how we understand animal vision and camouflage - now when we see these shiny beetles we can know that their amazing colours have many more functions than previously thought."

Credit: 
University of Bristol

Study shows star-shaped bread popular with children and could encourage more healthy eating

New research on different colours and shapes of bread, presented at this year's European Congress on Obesity in Vienna, Austria (23-26 May), shows that star-shaped bread is particularly popular with young children and could help them make healthy food choices. The study is by Dr Marlies Wallner and Bianca Fuchs-Neuhold, Health Perception Lab, Institute of Dietetics and Nutrition, FH JOANNEUM University of Applied Sciences, Bad Gleichenberg, Austria, and colleagues.

Focusing on taste and product attractiveness might help children and their parents make healthier food choices. In children, increased fibre consumption is related to healthier eating and better health outcomes, whereas the opposite is true for high salt consumption. Therefore, in this study, a visually attractive whole-grain bread was developed based on published health recommendations for salt and fibre, and then children evaluated the bread based on its shape, symmetry and colour.

In the study, 38 children aged 6-10 years tested different types of the bread, which differed in shape, colour, symmetry and taste. Data were generated via eye-tracking (Tobii® X2-60 Eye-Tracker) and preference and acceptance testing. With eye-tracking it is possible to gain information on 'gazing' behaviour: spots of high interest, whether positive or negative, receive more attention than other areas. These data from areas of interest are quantitative and can be combined with data from acceptance testing.

Liking was measured on a scale from 1 (best) to 5 (worst) and was significantly better for pictures of the star-shaped versions (score 1.5) than for the square-shaped versions (score 2.0). The yellow-coloured versions of the bread (produced using turmeric) were chosen less often (18%) than the usual brown bread (82%), a preference also demonstrated by eye-tracking.

Although 45% of children said they preferred white bread and 53% don't eat whole-grain bread regularly, the acceptance of the star-shaped bread was high: 76% rated the bread good or very good.

The authors conclude: "Our study showed the children appeared to prefer natural brown bread colours rather than the bread coloured yellow using turmeric. We conclude that based on these results, children like an attractive child-oriented bread style, in this case a star-shape. Modifying healthy everyday foods in this manner to make them more attractive to children could help children make healthier food choices."

They add: "Our study group's main interest is in food preferences in children. Colour and shape are important factors in product development. Furthermore, we are collecting data in a pan-European study to find out which kind of texture is liked most among European children in 6 different countries."

Credit: 
European Association for the Study of Obesity

Indigenous communities moving away from government utilities

Indigenous communities are rejecting non-indigenous energy projects in favour of community-led sustainable energy infrastructure.

The switch has led to some improvements in economic and social development as well as capacity-building for self-governance, according to a study from the University of Waterloo.

"Many indigenous communities decided to take back control of their own energy production and not rely so heavily on government utilities," said Konstantinos Karanasios, lead researcher and PhD candidate at Waterloo's Faculty of Environment. "By building solar, wind and hydroelectric power projects, they have been able to develop at their own pace, realize their own vision for environmental sustainability and learn valuable lessons about how to build and manage infrastructure projects."

The study looked at 71 renewable energy projects, including wind, hydroelectric and solar power, installed between 1980 and 2016 in remote indigenous communities across Canada.

The small-scale projects examined demonstrated positive environmental and economic results, with an encouraging learning curve. Solar projects in remote indigenous communities grew from two in 2006 to 23 in 2012 and 53 in 2016.

"Projects like these offer a blueprint for future larger projects in remote communities across Canada," said Karanasios. "Furthermore, indigenous communities are showing all Canadians that community-led renewable energy projects can be successful and economically feasible."

Remote communities in Canada have long relied on non-renewable energy sources such as diesel fuel for electricity generation and economic development. Electricity generation from diesel is often associated with high carbon emissions, spills, leakages, and service-quality issues. Reliance on fossil fuels can also be unpredictable because of shifting governance regimes, fuel prices and carbon-emission policy, potentially restricting community development.

Credit: 
University of Waterloo

The big clean up after stress

image: This is a microscopic color image showing cells with normal (green dots) and abnormal (yellow dots) stress granules.

Image: 
Buchberger team

Toxic substances, nutrient shortage, viral infection, heat and many other events trigger stress responses in cells. In such cases, the affected cells launch a programme that tries to protect them against stress-related damage. They usually ramp down the production of endogenous proteins to save resources, which they later need to repair cell damage or to survive the stress conditions for some time.

Stress granules are a visible sign of such stress reactions: these small granules, consisting of numerous proteins and messenger RNAs, build up inside the cell when protein production is suspended. Once the stress is over, the cell resumes its regular operation and eliminates the stress granules. But if this clearance process does not work according to plan, serious consequences can arise.

Recent studies show that stress granules are suspected to at least contribute to two incurable neurodegenerative diseases: Amyotrophic lateral sclerosis (ALS), which causes muscle atrophy and lethal palsy in its final stages, and frontotemporal dementia (FTD), the second most common type of dementia in people under the age of 65.

Published in Molecular Cell

The scientists from the Biocenter of the University of Würzburg have now uncovered new details of the clearance process of stress granules. The study was headed by biochemist Professor Alexander Buchberger. The lead author is Ankit Turakhiya, a member of Research Training Group GRK2243 "Understanding Ubiquitylation: From Molecular Mechanisms to Disease". Other contributors were Professor Andreas Schlosser from the Rudolf Virchow Center of the University of Würzburg and Professor Kay Hofmann (University of Cologne). The scientists present the results of their research in the current issue of Molecular Cell.

"We were able to demonstrate that the ZFAND1 protein is necessary for the normal clearance of the stress granules. When ZFAND1 is absent, some granules cannot be dissolved and change their structure as a result. These abnormal stress granules then have to be disposed of by autophagy, the cellular waste collection service, in a complex process," Alexander Buchberger sums up the central result of the new study. However, ZFAND1 does not directly impact the elimination process. Instead, it recruits a special enzyme complex required to eliminate defective proteins, the so-called proteasome, bringing it together with the stress granules.

An unexpected discovery

Buchberger explains that they had been surprised to find that the proteasome plays such a prominent role in eliminating the stress granules. He says that until now researchers had assumed that defective proteins at stress granules are eliminated together with the latter by autophagy - an assumption the biochemists were able to correct in their study.

What may appear to be mere fundamental research with little practical relevance to the layperson is in fact highly relevant for medical research. "The accumulation of abnormal stress granules is considered to be a potential cause of neurodegenerative diseases," Buchberger explains. He therefore believes that it is vital to clarify how stress granules are formed and eliminated in order to better understand the pathogenesis of these diseases and find potential targets for treating them.

In a next step, Buchberger and his team are planning to analyse the composition of stress granules in more detail and to identify the defective proteins that need to be removed by the proteasome. Their overarching goal is to shed light on the regulatory processes involved in the creation and elimination of stress granules.

Credit: 
University of Würzburg

Using the K computer, scientists predict exotic "di-Omega" particle

image: An image of the di-Omega dibaryon.

Image: 
Keiko Murano

Based on complex simulations of quantum chromodynamics performed using the K computer, one of the most powerful computers in the world, the HAL QCD Collaboration, made up of scientists from the RIKEN Nishina Center for Accelerator-based Science and the RIKEN Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) program, together with colleagues from a number of universities, has predicted a new type of "dibaryon"--a particle that contains six quarks instead of the usual three. Studying how these particles form could help scientists understand the interactions among elementary particles in extreme environments such as the interiors of neutron stars or the early universe moments after the Big Bang.

Particles known as "baryons"--principally protons and neutrons--are composed of three quarks bound tightly together, with their charge depending on the types of quarks that make them up. A dibaryon is essentially a system of two baryons. There is one known dibaryon in nature: the deuteron, a deuterium (or heavy-hydrogen) nucleus that contains a proton and a neutron bound very lightly together. Scientists have long wondered whether other types of dibaryon could exist, but despite searches, no other has been found.

The group, in work published in Physical Review Letters, has now used powerful theoretical and computational tools to predict the existence of a "most strange" dibaryon, made up of two "Omega baryons" that contain three strange quarks each. They named it "di-Omega". The group also suggested a way to look for these strange particles through experiments with heavy ion collisions planned in Europe and Japan.

The finding was made possible by a fortuitous combination of three elements: better methods for making QCD calculations, better simulation algorithms, and more powerful supercomputers.

The first essential element was a new theoretical framework called the "time-dependent HAL QCD method": It allows researchers to extract the force acting between baryons from the large volume of numerical data obtained using the K computer.

The second element was a new computational method, the unified contraction algorithm, which allows much more efficient calculation of a system with a large number of quarks.

The third element was the advent of powerful supercomputers. According to Shinya Gongyo from the RIKEN Nishina Center, "We were very lucky to have been able to use the K computer to perform the calculations. It allowed fast calculations with a huge number of variables. Still, it took almost three years for us to reach our conclusion on the di-Omega."

Discussing the future, Tetsuo Hatsuda from RIKEN iTHEMS says, "We believe that these special particles could be generated by the experiments using heavy ion collisions that are planned in Europe and in Japan, and we look forward to working with colleagues there to experimentally discover the first dibaryon system outside of deuteron. This work could give us hints for understanding the interaction among strange baryons (called hyperons) and to understand how, under extreme conditions like those found in neutron stars, normal matter can transition to what is called hyperonic matter--made up of protons, neutrons, and strange-quark particles called hyperons, and eventually to quark matter composed of up, down and strange quarks."

Credit: 
RIKEN

Health labels may deter people from buying sugary drinks

Young adults are less likely to buy sugar-sweetened beverages that include health labels, particularly those with graphic warnings about how added sugar can lead to tooth decay, obesity, and type 2 diabetes.

The new research, being presented at this year's European Congress on Obesity (ECO) in Vienna, Austria (23-26 May), also indicates that individuals are more likely to choose healthier options if there is a Health Star Rating displayed on beverages. The Health Star Rating is the national front-of-pack labelling system in use in Australia and New Zealand.

Worldwide, the intake of sugar-sweetened beverages such as soft drinks, cordials, sports drinks, energy drinks, and sweetened waters is very high. Surveys indicate that at least half of Americans and a third of Australians drink at least one sugar-sweetened drink every day. Sweetened drinks are a major source of added sugar in the diet and there is growing concern about the health effects linked with high consumption such as type 2 diabetes, tooth decay, and cardiovascular disease. Many harmful products already carry warnings. Whether or not sugar-sweetened drinks should also display warning labels is hotly debated, and which labels might have the greatest impact is unclear.

To investigate this further, Professor Anna Peeters from Australia's Deakin University and colleagues conducted an online choice experiment to examine the drink choices of almost 1000 Australians aged 18-35 years. Participants were recruited using online platforms from four states in Australia, and represented a diverse range of socioeconomic status and education levels.

Participants were asked to imagine they were entering a shop or café, or approaching a vending machine, to purchase a drink. They were randomised to one of five groups and asked to choose one of 15 drinks, with sugary and non-sweetened options available. The drinks carried either no label (control group) or one of four label types: a graphic warning, a text warning, or sugar information (including the number of teaspoons of added sugar) on sugary drinks, or a Health Star Rating on all drinks. Alternatively, participants could select "no drink" if they no longer wanted to buy one.

Overall, participants were far less likely to select a sugary drink when a front-of-pack label was displayed compared to no label, regardless of their level of education, age, and socioeconomic background.

Graphic warning labels, which paired an image of decayed teeth with text indicating that consuming drinks with added sugar may contribute to tooth decay, type 2 diabetes, or obesity, appeared to have the greatest impact. Participants were 36% less likely to purchase a sugary drink carrying a graphic warning than one with no label.

Other front-of-pack labels were also effective: participants were 20% less likely to purchase sugary drinks that displayed the Health Star Rating and 18% less likely when the label displayed teaspoons of added sugar. Furthermore, participants were considerably (20%) more likely to choose healthier alternatives when Health Star Ratings were displayed than when drinks carried no label.

"Our findings highlight the potential of front-of-pack health labels, particularly graphic images and Health Star Ratings, to change consumer behaviour, reduce purchases of sugar-sweetened drinks, and help people to make healthier choices," Professor Peeters said.

"The question now is what kind of impact these labels could have on the obesity epidemic. While no single measure will reverse the obesity crisis, given that the largest source of added sugars in our diet comes from sugar-sweetened drinks, there is a compelling case for the introduction of front-of-pack labels on sugary drinks worldwide."

The researchers point out some limitations of the study, including that the study measured intended selections of drinks in an online setting, and that the labels would need to be tested on real-world purchases of sugary drinks. They also note that it would be important to test drink labels with adolescents as they are the highest consumers of sugary drinks in Australia.

Credit: 
European Association for the Study of Obesity

Unsubstantiated health claims widespread within weight loss industry

New research investigating the legality of on-pack nutrition and health claims routinely found on commercially available meal replacement shakes for sale in the UK reveals that more than three-quarters of these claims are unauthorised and do not comply with the EU Nutrition and Health Claims Regulation.

This is one of the first studies to analyse how on-pack claims stack up to current regulation, and how much consumers actually understand and are influenced by such claims, and is being presented at this year's European Congress on Obesity (ECO) in Vienna, Austria (23-26 May).

Britain is the most obese nation in Western Europe, with rates rising faster than any other developed country. The Organisation for Economic Co-operation and Development (OECD) estimates that almost two-thirds (63%) of UK adults are overweight or obese, making the UK weight-loss market a €26 billion business. In Europe, sales of meal replacement products are estimated to reach €940 million by 2020.

'Meal replacement for weight control' products are simple, convenient and low in calories, and there is good evidence that they are an effective option for weight loss and weight maintenance. Their nutritional composition, labelling, and nutrition and health claims are governed by EU legislation which seeks to ensure that the products are nutritionally sound and that any reported benefits are clear, accurate, and based on scientific evidence.

In this study, Dr Kelly Johnston and colleagues from LighterLife and King's College London in the UK analysed the nutrient composition, legal compliance, and consumer understanding of on-pack health and nutrition claims for all commercially available 'meal replacement for weight control' shakes sold in the UK in 2017. On-pack information was assessed for its compliance with composition, labelling, and nutrition and health claims in line with EU regulation.

The researchers found that only 10 of the brands provided enough information to demonstrate that they met all the EU compositional and labelling requirements, and that the majority of products did not meet the basic compositional criteria necessary to be called a 'meal replacement for weight control'.

Results also showed that more than 90% of products made at least one nutrition claim on pack and just over half made at least one health claim--yet 79% of these claims were not compliant with EU regulations. In order to gauge their understanding of on-pack nutrition and health claims, internet-based questionnaires were completed by volunteers who were currently or had recently been engaged with the LighterLife weight loss programme. Of 240 respondents (44 men), three quarters (75%) reported being on a diet within the last 6 months.

The claims most likely to be reported as understood by respondents included common weight-loss messages such as "low fat" (95%), "low calorie" (95%), and "high protein" (94%). In contrast, only around half of those questioned understood the claims "protects against chronic diseases" (48%) and "low GI" (53%), neither of which is authorised or should appear on pack.

Interestingly, despite a relatively high overall reported level of understanding of on-pack claims, the majority of claims were perceived as being false.

Dr Johnston concludes: "Manufacturers' misleading labelling is confusing consumers about the healthiness and nutritional quality of meal replacement shakes. Some of these claims are clearly exaggerated and many are simply untrue. What we see from this group of consumers is that they generally have false perceptions about the efficacy of such products. In other words, even if they understand the claims, they often don't believe what they are reading."

She adds: "This situation doesn't benefit anyone but what it does mean is that reputable providers are disadvantaged by unscrupulous parties who sell products not fit for purpose. This study highlights the need for better enforcement to ensure products for sale meet the legally required compositional and labelling criteria which will both protect consumers whilst ensuring fair market competition."

The authors note that they only looked at one type of product--single serve meal replacement shakes--which is not representative of the entire sector.

Credit: 
European Association for the Study of Obesity

Kaiser Permanente researchers develop new models for predicting suicide risk

Combining data from electronic health records with results from standardized depression questionnaires better predicts suicide risk in the 90 days following either mental health specialty or primary care outpatient visits, reports a team from the Mental Health Research Network, led by Kaiser Permanente research scientists.

The study, "Predicting Suicide Attempts and Suicide Death Following Outpatient Visits Using Electronic Health Records," conducted in five Kaiser Permanente regions (Colorado, Hawaii, Oregon, California and Washington), the Henry Ford Health System in Detroit, and the HealthPartners Institute in Minneapolis, was published today in the American Journal of Psychiatry.

Combining a variety of information from the past five years of people's electronic health records and answers to questionnaires, the new models predicted suicide risk more accurately than before, according to the authors. The strongest predictors include prior suicide attempts, mental health and substance use diagnoses, medical diagnoses, psychiatric medications dispensed, inpatient or emergency room care, and scores on a standardized depression questionnaire.

"We demonstrated that we can use electronic health record data in combination with other tools to accurately identify people at high risk for suicide attempt or suicide death," said first author Gregory E. Simon, MD, MPH, a Kaiser Permanente psychiatrist in Washington and a senior investigator at Kaiser Permanente Washington Health Research Institute.

In the 90 days following an office visit:

Suicide attempts and deaths among patients whose visits were in the highest 1 percent of predicted risk were 200 times more common than among those in the bottom half of predicted risk.

Patients with mental health specialty visits who had risk scores in the top 5 percent accounted for 43 percent of suicide attempts and 48 percent of suicide deaths.

Patients with primary care visits who had scores in the top 5 percent accounted for 48 percent of suicide attempts and 43 percent of suicide deaths.

This study builds on previous models in other health systems that used fewer potential predictors from patients' records. Using those models, people in the top 5 percent of risk accounted for only a quarter to a third of subsequent suicide attempts and deaths. More traditional suicide risk assessment, which relies on questionnaires or clinical interviews only, is even less accurate.
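The stratification metric reported above, the share of suicide attempts and deaths captured within a top percentile of predicted risk, can be illustrated with a short sketch. This is a toy illustration of the reporting metric only, not the study's actual model, predictors, or data:

```python
# Hypothetical sketch: rank visits by predicted risk score and compute what
# fraction of observed events falls within the top `top_fraction` of risk --
# the kind of capture statistic the study reports for its top 5 percent.

def share_of_events_in_top_risk(scores, events, top_fraction=0.05):
    """scores: predicted risk per visit; events: 1 if an event followed, else 0."""
    ranked = sorted(zip(scores, events), key=lambda pair: pair[0], reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    captured = sum(event for _, event in ranked[:cutoff])
    total = sum(events)
    return captured / total if total else 0.0

# Toy data: 20 visits, with events loosely concentrated at high scores.
scores = [0.9, 0.8, 0.7, 0.6, 0.5] + [0.1] * 15
events = [1, 1, 0, 1, 0] + [0] * 15
print(share_of_events_in_top_risk(scores, events, top_fraction=0.25))  # 1.0
```

A better model concentrates more of the events in the top fraction, which is why the new models' 43-48 percent capture at the top 5 percent improves on the quarter to a third achieved by earlier models.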

The new study involved seven large health systems serving a combined population of 8 million people in nine states. The research team examined almost 20 million visits by nearly 3 million people age 13 or older, including about 10.3 million mental health specialty visits and about 9.7 million primary care visits with mental health diagnoses. The researchers deleted information that could help identify individuals.

"It would be fair to say that the health systems in the Mental Health Research Network, which integrate care and coverage, are the best in the country for implementing suicide prevention programs," Dr. Simon said. "But we know we could do better. So several of our health systems, including Kaiser Permanente, are working to integrate prediction models into our existing processes for identifying and addressing suicide risk."

Suicide rates are increasing, with suicide accounting for nearly 45,000 deaths in the United States in 2016, 25 percent more than in 2000, according to the National Center for Health Statistics.

Other health systems can replicate this approach to risk stratification, according to Dr. Simon. Better prediction of suicide risk can inform decisions by health care providers and health systems. Such decisions include how often to follow up with patients, refer them for intensive treatment, reach out to them after missed or canceled appointments -- and whether to help them create a personal safety plan and counsel them about reducing access to means of self-harm.

Credit: 
Kaiser Permanente

Hot cars can hit deadly temperatures in as little as one hour

image: Cars can become deadly after one hour in the sun on a hot summer day.

Image: 
Graphic by Safwat Saleem/ASU

A lot can happen at 160 degrees Fahrenheit: eggs fry, Salmonella bacteria die, and human skin suffers third-degree burns. If a car is parked in the sun on a hot summer day, its dashboard can hit 160 degrees in about an hour. One hour is also about how long it can take for a young child trapped in a car to suffer heat injury or even die of hyperthermia.

Researchers from Arizona State University and the University of California at San Diego School of Medicine have completed a study to compare how different types of cars warm up on hot days when exposed to different amounts of shade and sunlight for different periods of time. The research team also took into account how these differences would affect the body temperature of a hypothetical 2-year-old child left in a vehicle on a hot day. Their study was published May 24 in the journal Temperature.

"Our study not only quantifies temperature differences inside vehicles parked in the shade and the sun, but it also makes clear that even parking a vehicle in the shade can be lethal to a small child," said Nancy Selover, an Arizona State climatologist and research professor in ASU's School of Geographical Sciences and Urban Planning.

From January through May 2018, six children have died after being left in hot cars in the United States. That number will go up. Annually in the U.S., an average of 37 children left in hot cars die from complications of hyperthermia - when the body warms to above 104 degrees and cannot cool down. More than 50 percent of cases of a child dying in a hot car involve a parent or caregiver who forgot the child in the car.

The findings

Researchers used six vehicles for the study: Two identical silver mid-size sedans, two identical silver economy cars, and two identical silver minivans. During three hot summer days with temperatures in the 100s in Tempe, Arizona, researchers moved the cars from sunlight to shade for different periods of time throughout the day. Researchers measured interior air temperature and surface temperatures throughout different parts of the day.

"These tests replicated what might happen during a shopping trip," Selover said. "We wanted to know what the interior of each vehicle would be like after one hour, about the amount of time it would take to get groceries. I knew the temperatures would be hot, but I was surprised by the surface temperatures."

For vehicles parked in the sun during the simulated shopping trip, the average cabin temperature hit 116 degrees in one hour. Dashboards averaged 157 degrees, steering wheels 127 degrees, and seats 123 degrees in one hour.

For vehicles parked in the shade, interior temperatures were closer to 100 degrees after one hour. Dashboards averaged 118 degrees, steering wheels 107 degrees, and seats 105 degrees after one hour.

The different types of vehicles tested warmed up at different rates, with the economy car warming faster than the mid-size sedan and minivan.

"We've all gone back to our cars on hot days and have been barely able to touch the steering wheel," Selover said. "But, imagine what that would be like to a child trapped in a car seat. And once you introduce a person into these hot cars, they are exhaling humidity into the air. When there is more humidity in the air, a person can't cool down by sweating because sweat won't evaporate as quickly."

Hyperthermia

A person's age, weight, existing health problems and other factors, including clothing, will affect how and when heat becomes deadly. Scientists can't predict exactly when a child will suffer a heatstroke, but most cases involve a child's core body temperature rising above 104 degrees for an extended period.

In the study, the researchers used data to model a hypothetical 2-year-old boy's body temperature. The team found that a child trapped in a car under the study's conditions could reach that temperature in about an hour if the car is parked in the sun, and in just under two hours if it is parked in the shade.

"We hope these findings can be leveraged for the awareness and prevention of pediatric vehicular heatstroke and the creation and adoption of in-vehicle technology to alert parents of forgotten children," said Jennifer Vanos, lead study author and assistant professor of climate and human health at U.C. San Diego.

Hyperthermia and heatstroke effects happen along a continuum, Vanos said. Internal injuries can begin at temperatures below 104 degrees, and some heatstroke survivors live with brain and organ damage.

Why memories fail

Forgetting a child in the car can happen to anyone, said Gene Brewer, an ASU associate professor of psychology. Brewer, who was not involved in the heat study, researches memory processes, and has testified as an expert witness in a court case involving a parent whose child died in a hot car.

"Often these stories involve a distracted parent," he said. "Memory failures are remarkably powerful, and they happen to everyone. There is no difference between gender, class, personality, race or other traits. Functionally, there isn't much of a difference between forgetting your keys and forgetting your child in the car."

Most people spend a lot of time on routine behaviors, doing the same activities over and over without thinking about them. For example, driving the same route to work, taking the children to daycare on Tuesdays and Thursdays, or leaving car keys in the same spot every day. When new information comes into those routines, such as a parent's daycare drop-off day suddenly changing or an emergency phone call from a boss on the way to work, that's when memory failures can occur.

"These cognitive failures have nothing to do with the child," Brewer said. "The cognitive failure happens because someone's mind has gone to a new place, and their routine has been disrupted. They are suddenly thinking about new things, and that leads to forgetfulness. Nobody in this world has an infallible memory."

Credit: 
Arizona State University