Is it ethical to use genealogy data to solve crimes?

Bioethicists suggest ethical considerations for forensic use of genetic data

Abstract: http://annals.org/aim/article/doi/10.7326/M18-1348

URLs go live when the embargo lifts

Despite the popularity of online genealogy services, it is unclear whether users understand that their genetic information is available for forensic purposes. Bioethicists from the National Institutes of Health (NIH) suggest a framework for ethical discussions about how and when genealogy data should be used for crime-solving. Their paper is published in Annals of Internal Medicine.

The use of genealogy data for criminal justice purposes made news when authorities arrested the suspected Golden State Killer, a serial rapist and murderer who terrorized California decades ago, after matching DNA from multiple crime scenes to DNA obtained from online genealogy data. While the arrest was celebrated, the case has raised questions about the ethics of using online genealogy data for solving crimes. Bioethicists from the NIH say that three interrelated topics should be considered when discussing the ethical use of online genealogy data: informed consent, privacy, and justice.

According to the authors, these ethical considerations are important because citizens have rights, and also because DNA evidence can be misused. The crime-solving potential of readily available DNA evidence is exciting, but it raises many issues that must be addressed.

Credit: 
American College of Physicians

Deciphering the language of cells using observation chambers

image: Microscopic image of a single cell, artificially colored, lying on billions of nanoholes.

Image: 
EPFL / this illustration will be used as the cover image of the forthcoming edition of Small

Like humans, cells of the same species each have a distinct "personality." When confronted with an external stimulus like a virus, they each secrete a different quantity of molecules and communicate with each other to a varying degree. Studies have already shown that two cells of the same type may not behave identically when subjected to the same treatment.

In their quest to learn more, researchers from EPFL, working in collaboration with RMIT University in Australia and the University of Lausanne, have come up with an optofluidic device that has a tiny chamber inside. The chamber is around one one-thousandth the size of a raindrop. A cell is placed in the chamber, and then researchers can observe it in real time without disrupting its environment. The quantity and type of the cell's chemical secretions can be monitored continuously. The device has been shown to work for 12 hours straight but could function much longer, offering researchers a powerful and innovative selection tool. The results have been published in Small.

Studying cells one by one

All cells function in their own complex way. Cancer cells, for example, produce various hormones and proteins in order to spread and to invade healthy tissue; immune cells respond to an infection or an intruder by secreting chemical mediators called cytokines that stimulate the immune system to fight the enemy. But what is the actual mechanism underlying each cell's behavior?

Numerous studies have been run on how groups of cells function, but we have precious little information on the behavior of individual cells. Compatible with traditional microscopes, the integrated and miniaturized device developed by the EPFL researchers offers a new way to gain insight into cell processes and communication. It also sets the stage for the development of new therapies to treat cancer and autoimmune diseases. "We could, for example, select the most effective immune cells to combat a given disease," says Hatice Altug, a co-author of the study and head of the Bionanophotonic Systems Laboratory at EPFL's School of Engineering.

Cells individually housed, fed and analyzed

The nanophotonic biosensor developed by the researchers is a glass slide coated with a thin gold film, perforated with billions of nanopores arranged in a precise pattern. A small chamber whose walls are made of porous membranes is placed above the slide. The chamber receives a steady flow of water and nutrients through tiny microfluidic channels. Temperature and humidity are carefully regulated. The device contains valves that let scientists insert a cell into the chamber, in which ligands or antibodies are positioned to recognize and capture specific molecules secreted by the cell.

A broadband light source shines on the chamber. Thanks to an optical phenomenon known as plasmon resonance, the nanopores let only one light-wave frequency, or color, through. When a cell secretes a molecule, it attaches to the antibodies, thereby changing the frequency transmitted by the nanopores. This is how minute amounts of specific molecules can be identified.
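
In signal-processing terms, the readout is a shift in the transmission peak of the nanohole array. The following Python sketch illustrates only that general principle; it is not the EPFL team's analysis code, and the Lorentzian line shape, wavelengths, and shift size are invented for illustration.

```python
# Minimal sketch: recovering a plasmonic resonance shift from a transmission
# spectrum. All numbers are illustrative, not from the EPFL device.
import numpy as np

wavelengths = np.linspace(600, 700, 2001)  # nm

def transmission(peak_nm, width_nm=10.0):
    """Toy Lorentzian transmission peak of a nanohole array."""
    return 1.0 / (1.0 + ((wavelengths - peak_nm) / (width_nm / 2.0)) ** 2)

baseline = transmission(peak_nm=650.0)       # before any secretion binds
after_binding = transmission(peak_nm=650.8)  # binding red-shifts the peak

def peak_position(spectrum):
    """Estimate the peak wavelength from a centroid around the maximum."""
    i = int(np.argmax(spectrum))
    window = slice(max(i - 50, 0), min(i + 51, len(spectrum)))
    w, s = wavelengths[window], spectrum[window]
    return float(np.sum(w * s) / np.sum(s))

shift = peak_position(after_binding) - peak_position(baseline)
print(f"resonance shift: {shift:.2f} nm")  # grows with bound molecules
```

Tracking such a shift across nanopore regions over time is, in essence, what lets the device follow secretions continuously and label-free.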

The researchers have used their new technique to study cytokine secretion levels in lymphoma cells. "Until now, the methods used to study individual cells have always required fluorophores," says Altug. "Yet these compounds interfere with the cells' functioning and make real-time studies impossible." Maria Soler, the study's co-lead author, adds: "In our device, each nanopore is a separate sensor. Cells can thus settle naturally anywhere in the chamber, and we can analyze them the same way."

There are numerous potential applications. "Our approach could be used to identify the most aggressive cancer cells in a tumor and decide exactly which treatment to administer to the patient," concludes Xiaokang Li, the study's co-lead author.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Most popular vitamin and mineral supplements provide no health benefit, study finds

image: The most commonly consumed vitamin and mineral supplements provide no consistent health benefit or harm, suggests a new study led by researchers at St. Michael's Hospital and the University of Toronto.

Image: 
St. Michael's Hospital

TORONTO, May 28, 2018 - The most commonly consumed vitamin and mineral supplements provide no consistent health benefit or harm, suggests a new study led by researchers at St. Michael's Hospital and the University of Toronto.

Published today in the Journal of the American College of Cardiology, the systematic review of existing data and single randomized controlled trials published in English from January 2012 to October 2017 found that multivitamins, vitamin D, calcium and vitamin C - the most common supplements - showed no advantage or added risk in the prevention of cardiovascular disease, heart attack, stroke or premature death. Generally, vitamin and mineral supplements are taken to add to nutrients that are found in food.

"We were surprised to find so few positive effects of the most common supplements that people consume," said Dr. David Jenkins*, the study's lead author. "Our review found that if you want to use multivitamins, vitamin D, calcium or vitamin C, it does no harm - but there is no apparent advantage either."

The study found folic acid alone and B-vitamins with folic acid may reduce cardiovascular disease and stroke. Meanwhile, niacin and antioxidants showed a very small effect that might signify an increased risk of death from any cause.

"These findings suggest that people should be conscious of the supplements they're taking and ensure they're applicable to the specific vitamin or mineral deficiencies they have been advised of by their healthcare provider," Dr. Jenkins said.

His team reviewed supplement data that included A, B1, B2, B3 (niacin), B6, B9 (folic acid), C, D and E; and β-carotene; calcium; iron; zinc; magnesium; and selenium. The term 'multivitamin' in this review was used to describe supplements that include most vitamins and minerals, rather than a select few.

"In the absence of significant positive data - apart from folic acid's potential reduction in the risk of stroke and heart disease - it's most beneficial to rely on a healthy diet to get your fill of vitamins and minerals," Dr. Jenkins said. "So far, no research on supplements has shown us anything better than healthy servings of less processed plant foods including vegetables, fruits and nuts."

Credit: 
St. Michael's Hospital

ACP calls for policies that better support women's health

1. ACP calls for policies that better support women and their families and improve health outcomes

Free Content-Paper: http://annals.org/aim/article/doi/10.7326/M17-3344

Free Content-Editorial: http://annals.org/aim/article/doi/10.7326/M18-1258

Note: B-roll and sound bites from ACP President, Ana Maria Lopez, MD, FACP, are available to view and download at http://www.dssimon.com/MM/ACP-womens-health/.

URLs go live when the embargo lifts

A new position paper from the American College of Physicians (ACP) examines the unique challenges women face within the U.S. health care system and calls for policies to better support them. The paper addresses a wide range of issues, such as support for paid family and medical leave, recommendations on policies to reduce domestic violence, sexual abuse and harassment, and participation in clinical trials. The paper also addresses access to coverage, including coverage for medically necessary reproductive services, and opposition to policies that would create barriers to their access to reproductive health services. The paper, "Women's Health Policy," is published in Annals of Internal Medicine.

ACP urges policymakers to strongly consider how to better integrate women's health needs over the course of their lifetime. Ensuring that women have access to non-discriminatory health care coverage is essential to improving the overall health and well-being of women living in the U.S., and a longstanding goal of ACP.

This paper was developed by ACP's Health and Public Policy Committee, which addresses issues that affect the health care of the American public and the practice of internal medicine and its subspecialties. ACP's evidence-based public policy positions are based on reviewed literature and input from the ACP's Board of Governors, Board of Regents, additional ACP councils, and nonmember experts in the field.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview ACP president Ana Maria Lopez, MD, FACP, please contact Jacquelyn Blaser at jblaser@acponline.org or 202-261-4572.

2. Prediction tool helps tailor lung cancer screening to patient benefit and preferences

Abstract: http://annals.org/aim/article/doi/10.7326/M17-2561

Editorial: http://annals.org/aim/article/doi/10.7326/M18-1350

URLs go live when the embargo lifts

Utilizing a lung cancer risk prediction tool may help clinicians tailor lung cancer screening recommendations for eligible patients based on clinical benefit and personal preferences, as opposed to taking a one-size-fits-all approach to screening. The findings of a microsimulation study are published in Annals of Internal Medicine.

The National Lung Screening Trial found that annual screening with low-dose computed tomography (LDCT) substantially reduced mortality from lung cancer, the leading cause of cancer death in the United States. However, lung cancer screening can result in substantial harms and costs, making program implementation complicated. Patients make decisions about screening based on how they value the tradeoffs and risk. As such, many health systems are exploring how to implement LDCT screening programs that are effective and also take into consideration individual patient preferences.

Researchers from the University of Michigan Medical School created a simulation model, which used data from large randomized trials and could individualize estimates of risk/benefit, to examine factors that influence when LDCT screening is preference-sensitive. The model found that moderate differences in preferences about the downsides of LDCT screening influenced whether screening was appropriate for some eligible persons (annual lung cancer risk less than 0.3 percent or life expectancy less than 10.5 years). For higher-risk persons with longer life expectancy (roughly 50 percent of the eligible population), the benefits of screening overcame even highly negative views about screening. Those with higher risk and longer life expectancy also had a robust net benefit even in scenarios where false-positive rates were very high (i.e., a 60 percent rate of false-positive findings).
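
As a purely hypothetical sketch, the thresholds quoted above can be read as a simple triage rule; the function below is not the authors' microsimulation model, and its names and cutoff behaviour are assumptions based only on the summary in this paragraph.

```python
# Hypothetical reading of the reported "rules of thumb" (not the authors'
# model): screening is preference-sensitive for lower-benefit patients and
# robustly beneficial for higher-benefit ones.
def screening_recommendation(annual_risk_pct: float,
                             life_expectancy_yrs: float) -> str:
    if annual_risk_pct < 0.3 or life_expectancy_yrs < 10.5:
        # Modest benefit: the decision hinges on how the patient weighs
        # downsides such as false positives and overdiagnosis.
        return "preference-sensitive: elicit patient preferences"
    # Higher risk and longer life expectancy: in the model, benefit held up
    # even against highly negative views about screening.
    return "recommend screening: robust net benefit"

print(screening_recommendation(0.2, 14.0))  # preference-sensitive
print(screening_recommendation(1.0, 20.0))  # recommend screening
```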

The web-based decision tool that incorporates the rules of thumb derived from the researchers' findings can be found at https://share.lungdecisionprecision.com/.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Tanner J. Caverly, MD, MPH, please contact Kara Gavin at kegavin@med.umich.edu.

3. Researchers warn that drinking horsetail juice may lead to severe hyponatremia

Abstract: http://annals.org/aim/article/doi/10.7326/L18-0028

URLs go live when the embargo lifts

Researchers warn that drinking horsetail juice may lead to severe, acute hyponatremia, a condition in which the body's sodium level becomes very low, causing the cells to swell. A brief case report is published in Annals of Internal Medicine.

Horsetail is a perennial plant that has been used as an herbal remedy since at least Greek and Roman times. However, it may be associated with an unusual adverse event.

Researchers from Haga Hospital, The Hague, the Netherlands, reported the case of a 32-year-old woman who presented to the emergency department feeling restless and anxious after consuming a drink made by crushing horsetail plants in a blender. After an initial discharge with instructions to return if her condition worsened, the patient came back to the emergency department with a headache, feeling lethargic, and acting strangely. Before she could be examined, she had a tonic-clonic seizure lasting approximately 30 seconds and terminating on its own.

The researchers believe this is the first case of such severe, acute hyponatremia after consuming horsetail juice. However, they advise clinicians to be aware of this potential adverse event in patients consuming even a small amount of the plant.

Media contact: For an embargoed PDF, please contact Lauren Evans at laevans@acponline.org. To interview the lead author, Chloe C.A. Grim, MD, MSc, please contact her directly at chloegrim@gmail.com.

Credit: 
American College of Physicians

Mongooses remember and reward helpful friends

image: Group of dwarf mongooses in which different cooperative acts are exchanged even with a delay in time.

Image: 
Shannon Wild

Dwarf mongooses remember previous cooperative acts by their groupmates and reward them later, according to new work by University of Bristol researchers, published today in the journal Proceedings of the National Academy of Sciences USA.

Market trade was once considered the domain of humans but the exchange of goods and services is now widely recognised in other animals. What the new research shows is that mongooses have sufficient cognitive ability to quantify earlier acts of cooperation and to provide suitable levels of delayed rewards.

Senior author, Professor Andy Radford from Bristol's School of Biological Sciences, said: "Humans frequently trade goods and can track the amount they owe using memories of past exchanges. While nonhuman animals are also known to be capable of trading cooperative acts immediately for one another, more contentious is the possibility that there can be delayed rewards."

Lead author, Dr Julie Kern, also from Bristol, added: "There have been hardly any suitable experimental tests on wild animals, especially non-primates. By working with groups of dwarf mongooses habituated to our close presence, we could collect detailed observations and conduct experimental manipulations in natural conditions."

The study is the first to provide experimental evidence in a wild non-primate population for delayed contingent cooperation--providing a later reward to an individual for the amount of cooperation it has performed. It also offers convincing evidence of cross-commodity trading, whereby individuals reward one type of cooperative behaviour with a different cooperative act. In this case, grooming was traded for sentinel behaviour, which involves an individual adopting a raised position to look out for danger and warning foraging groupmates with alarm calls.

Dr Kern said: "We began by using detailed natural observations collected over many months to show that individuals who perform lots of sentinel duty also receive lots of grooming and are well-positioned in the group's social network. But, to prove a causal link, we needed to nail a tricky field experiment."

Professor Radford added: "Over three-hour periods when groups were foraging, we simulated extra sentinel behaviour by a subordinate group member using playbacks of its surveillance calls--vocalisations given to announce it is performing this duty. At the sleeping burrow that evening, we monitored all grooming events, especially those received by the individual who had had their sentinel contribution upregulated."

The researchers found some striking results. On days when an individual was perceived to conduct more sentinel duty, it received more evening grooming from groupmates than on control days (when its foraging calls had been played back during the preceding foraging session). Moreover, the individual who had had its sentinel contribution upregulated received more grooming than a control subordinate in the group.

Grooming has long been considered an important tradable commodity in social species, being used as a reward in various contexts. The new work shows that this grooming reward does not need to occur immediately after the relevant cooperative act; the increased grooming by mongooses took place at the end of the day when the mongooses had finished foraging and returned to their sleeping burrow.

Dwarf mongooses are Africa's smallest carnivore, living in cooperatively breeding groups of 5-30 individuals. The work was conducted as part of the Dwarf Mongoose Project which has studied habituated wild groups continuously since 2011. The study animals are individually marked with blonde hair dye, are trained to climb onto a balance scale to weigh themselves, and can be watched from a few feet away as they go about their natural behaviour in ecologically valid conditions.

Credit: 
University of Bristol

Homeless populations at high risk to develop cardiovascular disease

Among homeless individuals, cardiovascular disease remains one of the major causes of death due to challenges in predicting initial risk, limited access to health care and difficulties in long-term management, according to a review published today in the Journal of the American College of Cardiology.

In the U.S., roughly 550,000 people are homeless on any given night, and an estimated 2.3 million to 3.5 million people experience homelessness over the course of a year. The median age of the homeless population is 50 years, approximately 60 percent are male and 39 percent are African-American. These demographic groups experience high cardiovascular disease mortality rates, highlighting the need for proper prevention and treatment.

While the prevalence of hypertension and diabetes among homeless individuals is similar to that of the general population, these conditions often go untreated, leading to worse blood pressure and blood sugar control. Smoking remains the largest contributor to cardiovascular disease mortality in homeless populations, with an estimated 60 percent of ischemic heart disease deaths attributable to tobacco. According to the review, most homeless individuals want to quit smoking, yet quit rates are only one-fifth the national average.

Homeless populations are more likely to drink heavily and to have a history of cocaine use, both of which have been linked to congestive heart failure, atherosclerosis, heart attack and sudden cardiac death. Twenty-five percent of homeless people have a chronic mental illness, contributing to cardiovascular disease risk and complicating diagnosis by affecting the motivation to seek care.

In this review, researchers note that none of the current cardiovascular disease risk prediction models used in clinical practice have been confirmed in homeless populations, creating a gap in knowledge for the treatment of non-traditional cardiovascular disease risk factors.

"Clinicians need to make a concerted effort to overcome the logistical hurdles to treating and preventing cardiovascular disease in homeless populations," said Stephen W. Hwang, MD, MPH, director of the Centre for Urban Health Solutions of St. Michael's Hospital, and the review's lead author. "Half of homeless individuals don't have access to a consistent source of health care, making follow-up visits and lengthy diagnostic tests a challenge. It's our duty as health care providers to adjust our practices to provide the best possible care for these vulnerable patients."

The authors determined homeless patients are more likely to utilize the emergency department, contributing to a cycle of care focused on immediate needs rather than long-term management. Without health insurance and permanent housing, homeless patients struggle to adhere to medication that requires multiple doses per day.

"We need to apply evidence-based treatment guidelines for patients experiencing homelessness, and cardiologists can work with primary care providers to help achieve this goal." Hwang said.

Recent studies show anywhere from 44 to 89 percent of homeless individuals have cell phones. The review authors suggest that appointment reminders delivered via text message may enhance follow-up visits.

The treatment of homeless patients is made difficult by limited access to care, poor medication adherence and inconsistent delivery of evidence-based treatment. The authors suggest that when a diagnosis of cardiovascular disease is confirmed in a homeless patient, clinicians should consult a cardiologist about next steps in management and schedule regular follow-up visits to minimize the risk of losing the patient to care. Practical, patient-centered care can ultimately deliver optimal cardiovascular outcomes.

Credit: 
American College of Cardiology

Paramedic-run health sessions in low-income apartments reduced number of 911 calls, improved health

A community-based health promotion program developed by McMaster University and offered by paramedics in low-income apartment buildings significantly reduced the number of 911 calls and improved quality of life for seniors, found a randomized controlled trial published in CMAJ (Canadian Medical Association Journal).

Few studies exist on the impact of the new and rapidly evolving field of community paramedicine, which offers health care by paramedics outside of emergency visits.

Chronic diseases such as heart disease, diabetes and hypertension often cause older adults living at home to seek emergency care, leading to visits from paramedics. Seniors living in subsidized housing have higher death rates and poorer quality of life because of health issues.

The study looked at the impact of the Community Paramedicine at Clinic (CP@clinic), a weekly drop-in health promotion and prevention program for older adults run by trained paramedics in subsidized-housing buildings in Hamilton, Ontario. It compared buildings that received CP@clinic for one year, in addition to usual health care and wellness programs, with buildings that only received usual health care and nonparamedic wellness programs (control group). CP@clinic offers blood pressure, diabetes and falls assessments; identification of high-risk patients and referral to health care; health education and more. What sets this program apart from other paramedicine initiatives is the ongoing reporting back to family doctors, who can then reconnect with their patients.

In the buildings offering CP@clinic, there were significantly fewer emergency ambulance calls (3.1 calls per 100 units/month) compared with buildings that did not offer the clinics (3.99 calls per 100 units/month); the difference, (3.99 - 3.1)/3.99, translates to roughly 22% fewer calls. The clinics picked up undiagnosed hypertension in 36 participants (52.5%) and elevated blood pressure in 75 people (54.7%) with previously diagnosed hypertension. After attending CP@clinic, mean blood pressure for participants with hypertension dropped significantly.

"The combination of risk factor improvements among participants were significant enough to show changes in participants' diabetes risk category, which implies that CP@clinic is having an impact in reducing participants' risk of developing diabetes," says Dr. Gina Agarwal, Department of Family Medicine, McMaster University, Hamilton, Ontario. "As well, several health-related quality-of-life areas improved in those who attended, such as ability to perform daily tasks and personal care, suggesting that these overall improvements may have led to the reduction in calls in the intervention group compared with the controls."

Building on its previous work, McMaster University ran the successful CP@clinic program with the Hamilton Paramedic Service in 2014 and 2015 in seniors' social housing buildings. The program is still being run by the Hamilton Paramedic Service in the city.

"We estimate that an average of 10-11 calls per 10 apartment units could be avoided each year with programs like this," says Dr. Agarwal. "We think that the difference in ambulance calls in the short term was due to improved health care access, linkage to health care resources and knowledge about when to access these services."

Expanding the CP@clinic program into other subsidized housing buildings across Canada could improve the health of older adults and increase efficiencies in the health care system.

"The CP@clinic can reduce the burden on emergency services -- saving resources for other important areas of health care," she says.

"Because paramedics initiate care for people in their own homes and communities, these health care professionals are well placed to recognize the unmet needs of the community-dwelling individuals they serve and to act proactively to support efforts to stem unnecessary use of emergency medical services," writes Michael Nolan, County of Renfrew, Ontario, with coauthors in a related commentary http://www.cmaj.ca/lookup/doi/10.1503/cmaj.180642

"The trial further highlights the potential value of deploying a low-cost community paramedicine intervention in a high-risk social-housing setting, because it showed a significant difference in the number of ambulance calls between participants who received the intervention (i.e., attending the CP@clinic) and controls," write the authors.

The research study was funded by the Hamilton Academic Health Sciences Organization and the Canadian Institutes of Health Research (CIHR).

"Evaluation of a community paramedicine health promotion and lifestyle risk assessment program for older adults who live in social housing: a cluster randomized trial" is published May 28, 2018.

Credit: 
Canadian Medical Association Journal

New method for finding disease-susceptibility genes

image: Professor Dougu Nam and his research team in the School of Life Sciences at UNIST.

Image: 
UNIST

A new study affiliated with UNIST has presented a novel statistical algorithm capable of identifying potential disease genes more accurately and cost-effectively. The algorithm is also considered a promising new approach for identifying candidate disease genes, as it works effectively with less genomic data and takes only a minute or two to produce results.

This breakthrough has been conducted by Professor Dougu Nam and his research team in the School of Life Sciences at UNIST. Their findings have been published in Nucleic Acids Research on March 19, 2018.

In the study, the research team presented the novel method and software GSA-SNP2 for pathway enrichment analysis of GWAS P-value data. According to the research team, GSA-SNP2 provides high power, decent type I error control and fast computation by incorporating the random set model and SNP-count adjusted gene score.

"GSA-SNP2 is a powerful and efficient tool for pathway enrichment and network analysis of genome-wide association study (GWAS) summary data," says Professor Nam. "With this algorithm, we can easily identify new drug targets, thereby deepening our understanding of diseases and unlock new therapies to treat it."

Each individual's genome is a unique combination of DNA sequences that play major roles in determining who we are. This accounts for all individual differences, including susceptibility to disease and diverse phenotypes. Such genetic variations among humans are known as single nucleotide polymorphisms (SNPs). SNPs that correlate with specific diseases could serve as predictive biomarkers to aid the development of new drugs. Through the statistical analysis of GWAS summary data, it is possible to identify the disease-associated SNPs.

Despite the astronomical amounts of money and time invested in the statistical analysis of SNP data, conventional SNP detection technologies have been unable to identify all possible SNPs. This is because most conventional methods for detecting SNPs are designed to strictly control false positives in the results. As a consequence, even when tens of thousands of genomic samples and hundreds of thousands of SNPs are analyzed, the number of markers identified within candidate disease genes often reaches only several tens.

"Although controlling false positive SNPs is needed for the correct interpretation of the results, too much filtering may hamper its usefulness in drug development," says Professor Nam. "Therefore, enhanced statistical power is essential to practical statistical algorithms."

The team aimed to develop an algorithm that improves statistical power while maintaining accurate control of false positives. To do this, they adjusted the gene scores using a monotone cubic spline trend curve fitted against the number of SNPs per gene, and applied a competitive pathway analysis approach originally developed for gene expression data.
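
As a rough, hypothetical sketch of that idea (not the authors' implementation), the Python code below builds gene scores from simulated GWAS p-values, removes a monotone spline trend over SNP counts, and scores a pathway with a competitive random-set z-test; every dataset and parameter here is invented for illustration.

```python
# Toy sketch of the GSA-SNP2 idea (illustrative only): gene scores from GWAS
# p-values are inflated for SNP-rich genes, so a monotone spline trend over
# SNP counts is removed before a competitive (random-set) z-test per pathway.
import numpy as np
from scipy.interpolate import PchipInterpolator  # monotonicity-preserving
from scipy.stats import norm

rng = np.random.default_rng(42)
n_genes = 5000
snp_counts = rng.integers(1, 200, size=n_genes)

# Gene score = -log10 of the best SNP p-value in the gene. Under the null,
# the minimum of k uniform p-values is Beta(1, k), so scores grow with k.
gene_score = -np.log10(rng.beta(1.0, snp_counts))

# Remove the SNP-count trend with a monotone spline through binned means.
edges = np.quantile(snp_counts, np.linspace(0.0, 1.0, 11))
bins = np.clip(np.searchsorted(edges, snp_counts), 1, 10) - 1
centers = np.array([snp_counts[bins == b].mean() for b in range(10)])
means = np.array([gene_score[bins == b].mean() for b in range(10)])
adjusted = gene_score - PchipInterpolator(centers, means)(snp_counts)

def pathway_p(scores, member_idx):
    """Competitive (random-set) z-test: pathway mean vs. genome-wide mean."""
    m = len(member_idx)
    z = (scores[member_idx].mean() - scores.mean()) / (scores.std() / np.sqrt(m))
    return norm.sf(z)  # one-sided p-value for enrichment

# A random 50-gene "pathway" should yield a roughly uniform p-value.
print(pathway_p(adjusted, rng.choice(n_genes, size=50, replace=False)))
```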

In a comparative study using simulated and real GWAS data, GSA-SNP2 exhibited high power and prioritized gold-standard positive pathways best among six existing enrichment-based methods and two self-contained methods. Based on these results, the differences between pathway analysis approaches were investigated, and the effects of gene correlation structures on pathway enrichment analysis were also discussed. In addition, GSA-SNP2 is able to visualize protein interaction networks within and across the significant pathways so that the user can prioritize the core subnetworks for further studies.

According to the research team, GSA-SNP2 provides greatly improved type I error control by using the SNP-count adjusted gene scores while preserving high statistical power. It also provides both local and global protein interaction networks in the associated pathways, and may facilitate integrated pathway and network analysis of GWAS data.

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Study finds that chewing gum while walking affects both physical and physiological functions, especially in middle-aged and elderly men

New research presented at this year's European Congress on Obesity (ECO) in Vienna, Austria (23-26 May) shows chewing gum while walking increases heart rate and energy expenditure. The study was conducted by Dr Yuka Hamada and colleagues at Waseda University, Graduate School of Sport Sciences, Saitama, Japan.

Although there have been a number of studies which have examined the effect of chewing gum on physiological functions while at rest, none have focused specifically on how it impacts the body while walking, which is the basis for this study.

The authors recruited 46 male and female participants aged 21-69 to take part in two trials in random order. In one trial, individuals were given two pellets of gum (1.5 g and 3 kilocalories per pellet) to chew while walking at their natural pace for 15 minutes after a 1-hour rest period. The control trial involved the same 1-hour rest and 15-minute walk; however, participants instead ingested a powder containing the same ingredients as the gum but requiring no chewing.

In each trial resting heart rate, mean heart rate during walking, distance covered, and cadence (rate at which they took steps) were measured. Mean walking speed was calculated from the distance travelled during the 15 minutes, and stride length was estimated from the mean walking speed and mean step count. Total energy expenditure during the walk was estimated based on the mean walking speed and the body mass of each participant.
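
For illustration only, the derived measures can be computed as follows; the input values below are invented, and the energy-expenditure formula is a generic walking approximation rather than necessarily the one used in the study.

```python
# Illustrative computation of the derived walking measures described above.
# All inputs are made up; the energy formula (~0.5 kcal per kg per km walked)
# is a common rough approximation, not the study's exact method.
distance_m = 1200.0   # distance covered during the 15-minute walk
duration_s = 15 * 60
steps = 1600          # total step count
body_mass_kg = 70.0

speed_m_s = distance_m / duration_s       # mean walking speed
step_length_m = distance_m / steps        # mean step length
cadence_steps_min = steps / (duration_s / 60)

energy_kcal = 0.5 * body_mass_kg * (distance_m / 1000.0)

print(f"speed {speed_m_s:.2f} m/s, step length {step_length_m:.2f} m, "
      f"cadence {cadence_steps_min:.0f} steps/min, ~{energy_kcal:.0f} kcal")
```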

The study found that in all participants, the mean heart rate while walking as well as the change in heart rate from being at rest was significantly higher in the gum trial than in the control trial.

The team then performed stratified analyses by sex and age, separating the group into male and female, as well as young (39 and under) and middle-aged and elderly (40 and older). Both male and female participants in the gum trial had a significantly higher mean heart rate while walking and a larger change in heart rate; in males, there was also a significant increase in the distance walked and mean walking speed compared to the control trial.

While all ages experienced a significantly larger change in heart rate in the gum trial, middle-aged and elderly participants also had a significantly higher mean heart rate while walking compared to the control.

Combining these analyses to incorporate both sex and age showed that chewing gum had the greatest effect in middle-aged and elderly men who experienced a significant positive effect on distance walked, mean walking speed, mean step counts, mean heart rate while walking, change in heart rate, and total energy expenditure compared to the control trial.

The authors conclude: "Chewing gum while walking affects a number of physical and physiological functions in men and women of all ages. Our study also indicates that gum chewing while walking increased the walking distance and energy expenditure of middle-aged and elderly male participants in particular."

Credit: 
European Association for the Study of Obesity

Bumblebees confused by iridescent colors

image: This is a bumblebee landing on an iridescent target.

Image: 
Karin Kjernsmo

Iridescence is a form of structural colour which uses regular repeating nanostructures to reflect light at slightly different angles, causing a colour-change effect.

It is common in nature, from the dazzling blues of a peacock's feathers to the gem-like appearance of insects.

Although using bright flashy colours as camouflage may seem counterintuitive, researchers at the Bristol Camo Lab found that intense iridescence obstructs the bumblebee's ability to identify shape.

The eyes of bumblebees have been extensively studied by scientists and are very similar to those of other insects.

They can be used as a visual model for predatory insects such as wasps and hornets. When presented with different types of artificial flower targets rewarded with sugar water, the bees learned to recognise which shapes contained the sweet reward.

However, the bees found it much more difficult to discriminate between flower shapes when the targets were iridescent.

This study, which uses bumblebees as a model for (predatory) insect vision and cognition, is the first to show that iridescence indeed has the potential to deceive predators and make them overlook prey, much as disruptive camouflage breaks up the otherwise recognisable outline of prey.

The changing colours make the outline of the prey look completely different to the shape the predators are searching for.

The researchers concluded that iridescence produces visual signals which can confuse potential predators, and this may explain its widespread occurrence in nature.

Lead author Dr Karin Kjernsmo of the University of Bristol's School of Biological Sciences, said: "It's the first solid evidence we have that this type of colouration can be used in this way.

"Thus, if you are a visual predator searching for the specific shape of a beetle (or other prey animal), iridescence makes it difficult for predators to identify them as something edible. We are currently studying this effect using other visual predators, such as birds as well. This because birds are likely to be the most important predators of iridescent insects."

The links between iridescence and camouflage were first made over one hundred years ago by an American naturalist named Abbott Thayer, who is often referred to as "the father of camouflage".

He published a famous book on different types of camouflage such as mimicry, shape disruption and dazzle, which is thought to have inspired the "Razzle Dazzle" painting of battleships during the First World War.

However, iridescence has been rather overlooked for the past century, as it is often assumed to be purely for attracting mates and displaying to other individuals.

The UK has several species of iridescent beetle, the largest of which is the Rose Chafer, whose superb green and gold colour-changing wing cases can commonly be spotted on flowers in grasslands during the summer.

Dr Kjernsmo added: "This study has wider implications for how we understand animal vision and camouflage - now when we see these shiny beetles we can know that their amazing colours have many more functions than previously thought."

Credit: 
University of Bristol

Study shows star-shaped bread is popular with children and could encourage healthier eating

New research on different colours and shapes of bread, presented at this year's European Congress on Obesity in Vienna, Austria (23-26 May), shows that star-shaped bread is particularly popular with young children and could help them make healthy food choices. The study is by Dr Marlies Wallner and Bianca Fuchs-Neuhold, Health Perception Lab, Institute of Dietetics and Nutrition, FH JOANNEUM University of Applied Sciences, Bad Gleichenberg, Austria, and colleagues.

Focusing on taste and product attractiveness might help children and their parents make healthier food choices. In children, increased fibre consumption is related to healthier eating and better health outcomes, whereas the opposite is true for high salt consumption. Therefore, in this study, a visually attractive whole-grain bread was developed based on published health recommendations for salt and fibre, and then children evaluated the bread based on its shape, symmetry and colour.

In the study, 38 children aged 6-10 years tested different types of the bread, which differed in shape, colour, symmetry and taste. Data were generated via eye-tracking (Tobii® X2-60 Eye-Tracker) and via preference and acceptance testing. Eye-tracking makes it possible to gain information on 'gazing' behaviour: areas of high interest, whether viewed positively or negatively, attract more attention than other areas. These data from areas of interest are quantitative and can be combined with data from the acceptance testing.

Liking was measured on a scale from 1 (best) to 5 (worst); the star-shaped versions were liked significantly more (mean score 1.5) than the square-shaped versions (mean score 2.0). The yellow-coloured versions of the bread (produced using turmeric) were chosen less often (18%) than the usual brown bread (82%), a preference also demonstrated by the eye-tracking data.

Although 45% of children said they preferred white bread and 53% don't eat whole-grain bread regularly, the acceptance of the star-shaped bread was high: 76% rated the bread good or very good.

The authors conclude: "Our study showed the children appeared to prefer natural brown bread colours rather than the bread coloured yellow using turmeric. We conclude that based on these results, children like an attractive child-oriented bread style, in this case a star-shape. Modifying healthy everyday foods in this manner to make them more attractive to children could help children make healthier food choices."

They add: "Our study group's main interest is in food preferences in children. Colour and shape are important factors in product development. Furthermore, we are collecting data in a pan-European study to find out which kind of texture is liked most among European children in 6 different countries."

Credit: 
European Association for the Study of Obesity

Indigenous communities moving away from government utilities

Indigenous communities are rejecting non-indigenous energy projects in favour of community-led sustainable energy infrastructure.

The switch has led to some improvements in economic and social development as well as capacity-building for self-governance, according to a study from the University of Waterloo.

"Many indigenous communities decided to take back control of their own energy production and not rely so heavily on government utilities," said Konstantinos Karanasios, lead researcher and PhD candidate at Waterloo's Faculty of Environment. "By building solar, wind and hydroelectric power projects, they have been able to develop at their own pace, realize their own vision for environmental sustainability and learn valuable lessons about how to build and manage infrastructure projects."

The study looked at 71 renewable energy projects, including wind, hydroelectric and solar power, installed between 1980 and 2016 in remote indigenous communities across Canada.

The small-scale projects examined demonstrated positive environmental and economic results and an encouraging learning curve. Solar projects in remote indigenous communities grew from two in 2006 to 23 in 2012 and 53 in 2016.

"Projects like these offer a blueprint for future larger projects in remote communities across Canada," said Karanasios. "Furthermore, indigenous communities are showing all Canadians that community-led renewable energy projects can be successful and economically feasible."

Remote communities in Canada have long relied on non-renewable energy such as diesel fuel for electricity generation and economic development. Energy production from diesel fuel is often associated with high carbon emissions, spills, leakages and service quality issues. Diesel supply can also be unpredictable owing to shifting governance regimes, fossil fuel prices and carbon emission policies, potentially restricting community development.

Credit: 
University of Waterloo

The big clean up after stress

image: This is a microscopic color image showing cells with normal (green dots) and abnormal (yellow dots) stress granules.

Image: 
Buchberger team

Toxic substances, nutrient shortage, viral infection, heat and many other events trigger stress responses in cells. In such cases, the affected cells launch a programme that tries to protect them against stress-related damage. They usually ramp down the production of endogenous proteins to save resources, which they later need to repair cell damage or to survive under the stress conditions for some time.

Stress granules are a visible sign of such stress reactions: The small granules consisting of numerous proteins and messenger RNAs build up inside the cell when protein production is suspended. Once the stress is over, the cell takes up its regular operation again and eliminates the stress granules. But if this clearance process does not work according to plan, serious consequences can arise.

Recent studies suggest that stress granules contribute to at least two incurable neurodegenerative diseases: amyotrophic lateral sclerosis (ALS), which causes muscle atrophy and fatal paralysis in its final stages, and frontotemporal dementia (FTD), the second most common type of dementia in people under the age of 65.

Published in Molecular Cell

The scientists from the Biocenter of the University of Würzburg have now uncovered new details of the clearance process of stress granules. The study was headed by biochemist Professor Alexander Buchberger. The lead author is Ankit Turakhiya, a member of Research Training Group GRK2243 "Understanding Ubiquitylation: From Molecular Mechanisms to Disease". Other contributors were Professor Andreas Schlosser from the Rudolf Virchow Center of the University of Würzburg and Professor Kay Hofmann (University of Cologne). The scientists present the results of their research in the current issue of Molecular Cell.

"We were able to demonstrate that the ZFAND1 protein is necessary for the normal clearance of the stress granules. When ZFAND1 is absent, some granules cannot be dissolved and change their structure as a result. These abnormal stress granules then have to be disposed of by autophagy, the cellular waste collection service, in a complex process," Alexander Buchberger sums up the central result of the new study. However, ZFAND1 does not directly impact the elimination process. Instead, it recruits a special enzyme complex required to eliminate defective proteins, the so-called proteasome, bringing it together with the stress granules.

An unexpected discovery

Buchberger explains that they had been surprised to find that the proteasome plays such a prominent role in eliminating the stress granules. Until now, researchers had assumed that defective proteins at stress granules are eliminated together with the granules themselves by autophagy - an assumption the biochemists were able to correct in their study.

What may appear to be mere fundamental research with little practical relevance to the layperson is in fact highly relevant for medical research. "The accumulation of abnormal stress granules is considered to be a potential cause of neurodegenerative diseases," Buchberger explains. He therefore believes that it is vital to clarify how stress granules are formed and eliminated in order to better understand the pathogenesis of these diseases and find potential targets for treating them.

In a next step, Buchberger and his team are planning to analyse the composition of stress granules in more detail and to identify the defective proteins that need to be removed by the proteasome. Their overarching goal is to shed light on the regulatory processes involved in the creation and elimination of stress granules.

Credit: 
University of Würzburg

Using the K computer, scientists predict exotic "di-Omega" particle

image: An image of the di-Omega dibaryon.

Image: 
Keiko Murano

Based on complex simulations of quantum chromodynamics performed using the K computer, one of the most powerful computers in the world, the HAL QCD Collaboration, made up of scientists from the RIKEN Nishina Center for Accelerator-based Science and the RIKEN Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) program, together with colleagues from a number of universities, has predicted a new type of "dibaryon"--a particle that contains six quarks instead of the usual three. Studying how such particles form could help scientists understand the interactions among elementary particles in extreme environments such as the interiors of neutron stars or the early universe moments after the Big Bang.

Particles known as "baryons"--principally protons and neutrons--are composed of three quarks bound tightly together, with their charge depending on the types of quarks that make them up. A dibaryon is essentially a system with two baryons. There is one known dibaryon in nature--the deuteron, a deuterium (or heavy-hydrogen) nucleus that contains a proton and a neutron that are very lightly bound. Scientists have long wondered whether there could be other types of dibaryons. Despite searches, no other dibaryon has been found.

The group, in work published in Physical Review Letters, has now used powerful theoretical and computational tools to predict the existence of a "most strange" dibaryon, made up of two "Omega baryons" that contain three strange quarks each. They named it "di-Omega". The group also suggested a way to look for these strange particles through experiments with heavy ion collisions planned in Europe and Japan.

The finding was made possible by a fortuitous combination of three elements: better methods for making QCD calculations, better simulation algorithms, and more powerful supercomputers.

The first essential element was a new theoretical framework called the "time-dependent HAL QCD method": It allows researchers to extract the force acting between baryons from the large volume of numerical data obtained using the K computer.

The second element was a new computational method, the unified contraction algorithm, which allows much more efficient calculation of a system with a large number of quarks.

The third element was the advent of powerful supercomputers. According to Shinya Gongyo from the RIKEN Nishina Center, "We were very lucky to have been able to use the K computer to perform the calculations. It allowed fast calculations with a huge number of variables. Still, it took almost three years for us to reach our conclusion on the di-Omega."

Discussing the future, Tetsuo Hatsuda from RIKEN iTHEMS says, "We believe that these special particles could be generated by the experiments using heavy ion collisions that are planned in Europe and in Japan, and we look forward to working with colleagues there to experimentally discover the first dibaryon system outside of deuteron. This work could give us hints for understanding the interaction among strange baryons (called hyperons) and to understand how, under extreme conditions like those found in neutron stars, normal matter can transition to what is called hyperonic matter--made up of protons, neutrons, and strange-quark particles called hyperons, and eventually to quark matter composed of up, down and strange quarks."

Credit: 
RIKEN

Health labels may deter people from buying sugary drinks

Young adults are less likely to buy sugar-sweetened beverages that include health labels, particularly those with graphic warnings about how added sugar can lead to tooth decay, obesity, and type 2 diabetes.

The new research, being presented at this year's European Congress on Obesity (ECO) in Vienna, Austria (23-26 May), also indicates that individuals are more likely to choose healthier options if there is a Health Star Rating displayed on beverages. The Health Star Rating is the national front-of-pack labelling system in use in Australia and New Zealand.

Worldwide, the intake of sugar-sweetened beverages such as soft drinks, cordials, sports drinks, energy drinks, and sweetened waters is very high. Surveys indicate that at least half of Americans and a third of Australians drink at least one sugar-sweetened drink every day. Sweetened drinks are a major source of added sugar in the diet and there is growing concern about the health effects linked with high consumption such as type 2 diabetes, tooth decay, and cardiovascular disease. Many harmful products already carry warnings. Whether or not sugar-sweetened drinks should also display warning labels is hotly debated, and which labels might have the greatest impact is unclear.

To investigate this further, Professor Anna Peeters from Australia's Deakin University and colleagues conducted an online choice experiment to examine the drink choices of almost 1000 Australians aged 18-35 years. Participants were recruited using online platforms from four states in Australia, and represented a diverse range of socioeconomic status and education levels.

Participants were asked to imagine they were entering a shop or café, or approaching a vending machine, to purchase a drink. They were randomised to one of five groups and asked to choose one of 15 drinks, with sugary and non-sweetened options available. The drinks carried either no label (control group) or one of four label types: a graphic warning, a text warning, or sugar information (including the number of teaspoons of added sugar) on the sugary drinks, or a Health Star Rating on all drinks. Alternatively, participants could select "no drink" if they no longer wanted to buy a drink.

Overall, participants were far less likely to select a sugary drink when a front-of-pack label was displayed compared to no label, regardless of their level of education, age, and socioeconomic background.

Graphic warning labels, which combined an image of decayed teeth with text indicating that drinks with added sugar may contribute to tooth decay, type 2 diabetes, or obesity, appeared to have the greatest impact. Participants were 36% less likely to purchase sugary drinks that carried a graphic warning compared to a drink with no label.

Other front-of-pack labels were also effective, with participants 20% less likely to purchase sugary drinks that included the Health Star Rating and 18% less likely for a label displaying teaspoons of added sugar. Furthermore, participants were 20% more likely to choose healthier alternatives when Health Star Ratings were displayed compared to no label.

"Our findings highlight the potential of front-of-pack health labels, particularly graphic images and Health Star Ratings, to change consumer behaviour, reduce purchases of sugar-sweetened drinks, and help people to make healthier choices," Professor Peeters said.

"The question now is what kind of impact these labels could have on the obesity epidemic. While no single measure will reverse the obesity crisis, given that the largest source of added sugars in our diet comes from sugar-sweetened drinks, there is a compelling case for the introduction of front-of-pack labels on sugary drinks worldwide."

The researchers point out some limitations of the study, including that the study measured intended selections of drinks in an online setting, and that the labels would need to be tested on real-world purchases of sugary drinks. They also note that it would be important to test drink labels with adolescents as they are the highest consumers of sugary drinks in Australia.

Credit: 
European Association for the Study of Obesity