Improving the signal-to-noise ratio in quantum chromodynamics simulations

Over the last few decades, the exponential increase in computing power, together with steady improvements in algorithms, has enabled theoretical particle physicists to perform ever more complex and precise simulations of fundamental particles and their interactions. There is a catch, however: as the number of lattice points in a simulation increases, it becomes harder to distinguish the signal of interest from the surrounding statistical noise. A new study by Marco Ce, a physicist based at the Helmholtz-Institut Mainz in Germany, recently published in EPJ Plus, describes a technique for simulating particle ensembles that are 'large' (at least by the standards of particle physics). This improves the signal-to-noise ratio, and thus the precision, of the simulation; crucially, it can also be used to model ensembles of baryons: the category of composite particles that includes the protons and neutrons making up atomic nuclei.

Ce's simulations employ a Monte Carlo algorithm: a generic computational method that relies on repeated random sampling to obtain numerical results. These algorithms have a wide variety of uses, and in mathematical physics they are particularly well suited to evaluating complicated integrals, and to modelling systems with many degrees of freedom.
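
To illustrate the general idea (a toy sketch, not Ce's lattice code), a Monte Carlo estimate of a simple one-dimensional integral takes only a few lines of Python; the integrand and sample count here are arbitrary choices for demonstration:

```python
import random

# Monte Carlo estimate of the integral of f(x) = x^2 over [0, 1]
# (exact value 1/3): average f at uniformly sampled points.
def mc_integral(f, n_samples=100_000):
    total = sum(f(random.random()) for _ in range(n_samples))
    return total / n_samples

print(mc_integral(lambda x: x * x))  # ~0.333; the error shrinks like 1/sqrt(n)
```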

More precisely, the type of Monte Carlo algorithm used here involves multi-level sampling. This means that the samples are taken with different levels of accuracy, which is less computationally expensive than methods in which the sampling accuracy is uniform. Multi-level Monte Carlo methods have previously been applied to ensembles of bosons (the class of particle that, self-evidently, includes the now famous Higgs particle), but not to the more complex fermions. This latter category includes electrons as well as baryons: all the main components of 'everyday' matter.
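
To make the multi-level idea concrete, here is a minimal multilevel Monte Carlo sketch in Python. The 'payoff' function and sample counts are invented for illustration and are unrelated to the lattice-QCD estimator in the paper; the point is the telescoping sum, which spends many samples on cheap, coarse levels and only a few on expensive, fine ones:

```python
import random

# Minimal multilevel Monte Carlo sketch (illustrative only). We estimate
# E[P] for a quantity P computed at increasingly fine "levels" of accuracy,
# using the telescoping identity E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].

def payoff(level, x):
    # Hypothetical stand-in: the bias shrinks as the level increases.
    return x + x / (2 ** level)

def mlmc_estimate(n_samples_per_level):
    estimate = 0.0
    for level, n in enumerate(n_samples_per_level):
        acc = 0.0
        for _ in range(n):
            x = random.gauss(1.0, 1.0)           # shared random input
            fine = payoff(level, x)              # fine evaluation
            coarse = payoff(level - 1, x) if level > 0 else 0.0
            acc += fine - coarse                 # level-l correction
        estimate += acc / n
    return estimate

# Many cheap coarse samples, few expensive fine ones; result ~1.25 here.
print(mlmc_estimate([100_000, 10_000, 1_000]))
```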

Ce concludes his study by noting that there are many other problems in particle physics where computation suffers from low signal-to-noise ratios, and which might benefit from this approach.

Credit: 
Springer

Women report skipping scientific conferences because of child care

image: Reshma Jagsi, M.D., D.Phil.

Image: 
University of Michigan Rogel Cancer Center

ANN ARBOR, Michigan -- For oncologists at the beginning of their careers, scientific conferences present an opportunity to network, share their research, gain new knowledge and advance their careers. But many women find themselves skipping these conferences because of family obligations, a new study finds.

Researchers surveyed 248 early career oncologists practicing at National Cancer Institute-designated cancer centers. Women were less likely than men to attend scientific meetings, although both genders noted that conferences were important to career advancement. Nearly half of women said having children interfered with attending meetings, while only a third of men did.

"Amid concerns about gender inequity in advancement in medicine, it is especially important to identify innovative and visible actions that target the mechanisms fueling inequity," says study author Reshma Jagsi, M.D., D.Phil., Newman Family Professor and deputy chair of radiation oncology at Michigan Medicine and director of the Center for Bioethics and Social Sciences in Medicine at the University of Michigan.

About three-quarters of both the men and the women surveyed had young children. But while 74% of women had a spouse employed full-time, only 45% of men did. And women reported spending about 10 hours more each week than men on parenting and domestic tasks. The study is published in JAMA Oncology.

"Our society continues to embrace a gendered division of domestic labor, whereby women bear the greater burden of responsibilities at home, even when they're highly committed to their careers. Facilitating work-life integration is essential, and this study provides concrete data to support this need," Jagsi says.

The authors cite onsite child care and women's networking venues as essential elements to improve access for women. Nearly three times as many women as men said having child care onsite at meetings would be extremely important and would help enable them to attend.

The annual American Society of Clinical Oncology meeting in June offered professional onsite child care for children 6 months to 12 years. Jagsi says this example shows that onsite child care is feasible.

"Women want to attend these meetings. They offer critical opportunities for leadership, networking, education, mentorship, scholarly dissemination and so much more," she says. "For our profession to access the full talent pool and reap the demonstrated benefits of diversity, we need to figure out ways to promote work-life integration."

Credit: 
Michigan Medicine - University of Michigan

What makes some people more receptive to the idea of being vaccinated against infectious disease?

London, July 18, 2019 - Fear, trust, and the likelihood of exposure are three leading factors that influence whether people are willing to be vaccinated against a virulent disease, according to a new study in the journal Heliyon, published by Elsevier.

Following the highly publicized 2014 outbreak of Ebola in Africa and anticipating the possibility of a future Ebola outbreak in the United States, a 2014 CNN/ORC poll asked a random sample of 1,018 adults if they would take an anti-Ebola vaccination if and when it became available. About half of the participants reported that they would, while half expressed hesitation or refusal, even if vaccination services for Ebola were available to them.

In the current study, investigators conducted a secondary analysis of that data to examine the factors contributing to vaccination receptivity vs. hesitation. They found that three factors primarily influenced receptivity: a general fear orientation; trust in government to contain a crisis; and the relative chance of being exposed to the pathogen. Interestingly, the effectiveness and safety of a vaccine itself was not among the factors influencing receptivity.

"Facing a raising number of epidemics that create public health dangers, our findings indicate that vaccine hesitancy is associated with social factors that are independent of the perceived effectiveness of vaccines. Willingness to take vaccination is positively associated with a generalized sense of fear, trust in the government's ability to control an outbreak of the disease, and expectation of a potential Ebola outbreak that is imminent and proximate," explains one of the study's investigators, Kent P. Schwirian, PhD, Emeritus Professor of Sociology, The Ohio State University, Columbus, OH, USA.

Professor Schwirian elaborated on how these three factors shape the willingness of half of the sample population to engage in the protective behavior of vaccination.

General Fear Orientation. Respondents expressed fear not only of being infected, but also more generally in terms of their outlook on life and how they perceive things are going overall in society today. The more than 60 percent who reported being somewhat or very scared about events in the US today were much more willing to consider an anti-Ebola vaccination than individuals who did not report this anxiety.

Trust in Government. People who expressed confidence in the US government's ability to prevent an Ebola outbreak were much more willing to take the anti-Ebola vaccine than individuals who lacked confidence in the government to do so.

Exposure Expectancy of an Ebola Outbreak. While approximately 80 percent of the respondents thought that it was somewhat or highly likely that an Ebola outbreak would happen fairly soon in the US, most people thought that the outbreak would not happen in their local community or family. However, the closer in proximity they thought the outbreak would be to them, the more willing they were to take the anti-Ebola vaccination.

Gustavo S. Mesch, PhD, Professor of Sociology and Rector, University of Haifa, Israel, the study's other investigator, recommends reexamining the research questions with more current data. "Our life and death struggle against lethal microbes is ultimately fought at the local level. Unless local hospitals and health care personnel are prepared to fight and ready to go, we are at a major disadvantage in attempting to save lives," he cautions. "Confirming what percentage of the population would opt in or out of vaccines, and the central role of trust in the government, would help public health officials plan their responses." He added that the results also revealed other factors that can be validated and explored, particularly the finding that older respondents were more likely to be receptive to vaccination, as were those with less formal education.

Vaccination is the primary public health response to the growing number of infectious diseases that threaten the world's population. At the same time, a growing anti-vax movement has spawned a small but vocal faction of the population, spreading hesitancy despite widespread evidence of vaccine efficacy and safety. The reluctance or refusal to be vaccinated or to have one's children vaccinated was identified by the World Health Organization as one of the top ten global health threats of 2019. Understanding the factors that contribute to vaccination compliance or hesitancy is vital for controlling disease outbreaks.

While currently Ebola is largely confined to Africa, other infectious diseases loom closer to the US, such as the measles outbreak in several US states. Against this backdrop, it is important to evaluate the public's willingness to participate in preventive public health measures and understand what distinguishes those who are willing to partake from those who are not.

Credit: 
Elsevier

New research identifies gene that hides cancer cells from immunotherapy

SEATTLE -- July 18, 2019 -- A team at Fred Hutchinson Cancer Research Center has identified a gene that could make immunotherapy treatments, specifically checkpoint inhibitors, work for a wider variety of cancer patients. The study, published today in Developmental Cell, found that when the DUX4 gene is expressed in cancer cells, it can prevent the cancer from being recognized and destroyed by the immune system.

The team, led by Drs. Robert Bradley and Stephen Tapscott, looked at the gene expression profiles of nearly 10,000 cancers from 33 different cancer types and discovered that DUX4, a gene mostly known for its link to a specific muscular dystrophy (facioscapulohumeral dystrophy, or FSHD), was consistently expressed in many different solid tumors, including cancers of the bladder, breast, lung, kidney and stomach. DUX4 prevented immune cells from recognizing the cancer cells, so that patients whose cancers expressed the gene were less likely to respond to immunotherapy. Because DUX4 is expressed in many cancers, blocking its activity might increase the success of immune checkpoint inhibitors.

"Immunotherapy can be incredibly powerful against previously untreatable cancers, but it isn't effective yet for most patients," said Bradley. "Understanding the mechanisms that prevent the immune system from identifying and attacking tumors is a first step toward finding cures for all cancer patients."

Tapscott, who has previously studied the role of DUX4 in early development and in FSHD muscular dystrophy, notes the findings are an example of how the rapid but regulated growth of early development can be reactivated in cancers as rampant, unregulated cell growth. DUX4 is normally expressed in early development, when embryonic cells need to evade detection by the maternal immune system.

"This study suggests that cancer cells express DUX4 to hijack a normal early developmental program that can suppress anti-cancer immune activity," said Tapscott.

Tapscott further notes there is no increased cancer risk in individuals with FSHD, which indicates the cancer cells are using DUX4 as a developmental tool to avoid the immune system, but not as a driver that causes cancer.

Bradley and Tapscott hope their work will eventually lead to the development of DUX4-targeted treatments that will enhance the success of immunotherapies for a broad range of cancers.

Credit: 
Fred Hutchinson Cancer Center

Biomaterial-delivered chemotherapy leads to long-term survival in brain cancer

Delivering a combination of chemotherapy drugs via a biodegradable paste during brain cancer surgery leads to long-term survival, researchers at the University of Nottingham have discovered.

In a new study published in Clinical Cancer Research, scientists found a significant survival benefit in rat models with brain tumours when a combination of two chemotherapy drugs (etoposide and temozolomide) was delivered using a biodegradable polymer called PLGA/PEG.

The research was carried out by experts from the Children's Brain Tumour Research Centre (CBTRC) at the University of Nottingham, in partnership with researchers from Johns Hopkins University in the USA.

Glioblastoma multiforme (GBM) is the most common and aggressive brain tumour, with a dismal average survival of 15 months from diagnosis, killing 3,500 people in the UK annually. This is despite treatment comprising surgery, radiotherapy and chemotherapy.

GBM treatment is limited by the inability of otherwise potentially effective drugs to penetrate the brain when delivered via the bloodstream or administered orally. This is mainly due to a structure surrounding the brain called the 'blood brain barrier', which functions to prevent toxins and infectious agents from entering the brain.

In a previous study in 2013, the research team, led by Dr. Ruman Rahman (Assistant Professor) and Dr. Stuart Smith (Clinical Associate Professor of Neurosurgery) at the CBTRC, discovered that PLGA/PEG could act as a delivery system for chemotherapy drugs.

The polymer formulation, which was originally designed to help mend broken bones, is made from two types of micro-particles called PLGA and PEG and has been developed and patented by leading tissue engineer Professor Kevin Shakesheff, based in the University's School of Pharmacy. A powder at room temperature, it can be mixed to a toothpaste-like consistency with the addition of water.

The paste can be applied to the brain cancer cavity created after removal of the bulk tumour during surgery. The paste then releases chemotherapy drugs into the brain, in so doing targeting the remaining cancer cells which cannot be safely removed by surgery and which cause the cancer to return.

The effect was further enhanced when combined with radiotherapy. Long-term survival was observed in over half the rat models, with laboratory tests confirming that the brains were clear of any cancer. Animals receiving this intervention survived for longer compared to those treated with current standard-of-care treatment offered to GBM patients.

The team's next step is to initiate an early phase clinical trial at Queen's Medical Centre, Nottingham.

Dr Rahman, Assistant Professor of Molecular Neuro-Oncology, said: "We are very pleased with this first demonstration that chemotherapy drugs delivered to the brain in this manner during surgery can lead to a considerable improvement in brain cancer survival. The results give us a realistic opportunity to consider this therapy for a human clinical trial."

Credit: 
University of Nottingham

How mammals' brains evolved to distinguish odors is nothing to sniff at

image: The image shows a section of the front part of the piriform cortex, an area of the brain involved in the sense of smell. The cortex layers are stained with fluorescent antibodies to better distinguish key differences. Layer 1 contains two separate sections; the layer closest to the black-colored surface (1a) is stained bright green, while the second part (1b) is stained orange. Layer 2 is stained white and contains a high density of neurons. Olfactory bulb neurons, important in smell processing, send signals to the branches of neurons in layer 1a. These neurons have cell bodies located in layer 2. Layer 2 neurons communicate with one another in layer 1b.

Image: 
Salk Institute

The world is filled with millions upon millions of distinct smells, but how mammals' brains evolved to tell them apart is something of a mystery.

Now, two neuroscientists from the Salk Institute and UC San Diego have discovered that at least six types of mammals--from mice to cats--distinguish odors in roughly the same way, using circuitry in the brain that's evolutionarily preserved across species.

"The study yields insights into organizational principles underpinning brain circuitry for olfaction in mammals that may be applied to other parts of the brain and other species," says Charles Stevens, distinguished professor emeritus in Salk's Neurobiology Laboratory and coauthor of the research published in the July 18, 2019 issue of Current Biology.

In brief, the study reveals that the size of each of the three components of the neural network for olfaction scales about the same for each species, starting with receptors in the nose that transmit signals to a cluster of neurons in the front of the brain called the olfactory bulb which, in turn, relays the signals to a "higher functioning" region for odor identification called the piriform cortex.

"These three stages scale with each other, with the relationship of the number of neurons in each stage the same across species," says Shyam Srinivasan, assistant project scientist with UC San Diego's Kavli Institute for Brain and Mind, and the paper's coauthor. "So, if you told me the number of neurons in the nose, I could predict the number in the piriform cortex or the bulb."

The current study builds on research by the same duo, published in 2018, which described how mouse brains process and distinguish odors using what's known as "distributed circuits." Unlike the visual system, for example, where information is transmitted in an orderly manner to specific parts of the visual cortex, the researchers discovered that the olfactory system in mice relies on a combination of connections distributed across the piriform cortex.

Following that paper, Stevens and Srinivasan sought to determine if the distributed neural circuitry revealed in mice is similar in other mammals. For the current work, the researchers analyzed mammal brains of varying sizes and types. Their calculations, plus previous studies over the past few years, were used to estimate brain volumes. Stevens and Srinivasan used a variety of microscopy techniques that let them visualize different types of neurons that form synapses (connections) in the olfactory circuitry.

"We couldn't count every neuron, so we did a survey," says Srinivasan. "The idea is that you take samples from different represented areas, so any irregularities are caught."

The new study revealed that the average number of synapses connecting each functional unit of the olfactory bulb (a glomerulus) to neurons in the piriform cortex is invariant across species.

"It was remarkable to see how these were conserved," says Stevens.

Specifically, identification of individual odors is linked to the strength and combination of firing neurons in the circuit. This can be likened to music from a piano, whose chords spring from the depression of multiple keys, or to the arrangement of letters that forms the words on this page.

"The discrimination of odors is based on the firing rate, the electric pulse that travels down the neuron's axon," says Srinivasan. "One odor, say for coffee, may elicit a slow response in a neuron while the same neuron may respond to chocolate at a faster rate."

This coding scheme for olfaction differs from those used in other parts of the brain.

"We showed that the connectivity parameters and the relationship between different stages of the olfactory circuit are conserved across mammals, suggesting that evolution has used the same design for the circuit across species, but just changed the size to fit the animals' environmental niche," says Stevens.

In the future, Stevens plans to examine other regions of the brain in search of other distributed circuits whose function is based on similar coding found in this study.

Srinivasan says he will focus on how noise or variability in odor coding determines the balance between discrimination and learning, explaining that the variability the duo is finding in their work might be a mechanism for distinguishing odors, which could be applied to making better machine learning or AI systems.

Credit: 
Salk Institute

Strong family relationships may help with asthma outcomes for children

EVANSTON, Ill. --- Positive family relationships might help youth to maintain good asthma management behaviors even in the face of difficult neighborhood conditions, according to a new Northwestern University study.

For children with asthma, neighborhood environmental conditions -- the role of allergens and pollutants, for example -- have long been known to play an important role, but less is known about how social conditions in the neighborhood might affect children's asthma.

In the study, researchers sought to test whether there are social factors that can buffer children from the negative effects of difficult neighborhood conditions, focusing on one particular factor they thought would be important in the lives of children -- whether they had positive and supportive family relationships.

"We found significant interactions between neighborhood conditions and family relationship quality predicting clinical asthma outcomes," said Edith Chen, professor of psychology in the Weinberg College of Arts and Sciences, faculty fellow at the Institute for Policy Research at Northwestern and lead author of the study. "When children lived in neighborhoods that were high in danger and disorder, the better their family relationships, the fewer symptoms and activity limitations they had, and the better their pulmonary function."

In contrast, Chen said, when children lived in neighborhoods that were lower in danger and disorder, their symptoms, activity limitations and pulmonary function were generally good, and the nature of their family relationships didn't really matter.

Using Google Street View, the researchers were able to take a virtual walk through each of the research participants' Chicago neighborhoods and code for indicators of neighborhood danger or disorder, including evidence of graffiti, rundown or abandoned cars, bars or grates on windows and doors, and abandoned or boarded-up homes. That gave them a more objective indicator of the level of neighborhood danger and disorder that a participant was likely experiencing on a daily basis when walking to and from home.

They then interviewed children about their family relationships and coded the amount of support, trust and conflict that was present, and measured a variety of asthma outcomes -- clinical, behavioral and biological -- in these children.

Chen said the research is important to the field of pediatrics because families often don't have options for moving out of neighborhoods that are challenging.

"If pediatricians can provide suggestions to families about how supportive relationships can help with managing their child's asthma, while at the same time still acknowledging the realities of the ongoing neighborhood difficulties that many of these families face, this might help families," Chen said.

"It's possible that when children have high-quality relationships with their family, family members are able to help their children prioritize asthma management, for example, perhaps by shielding them from neighborhood stressors in order to minimize the disruption to asthma routines," Chen said. "But this is speculative at this point, and future research could test this idea by implementing family or parenting interventions in youth with asthma who live in high-danger neighborhoods and examining their effects on childhood asthma outcomes."

Credit: 
Northwestern University

A dynamic genetic code based on DNA shape

image: Zα mutations from one parental chromosome map directly to phenotype as there is no transcript from the other parental chromosome to cover up defects in the ADAR protein produced.

Image: 
Alan Herbert

DNA has many different forms. Normally the two strands of DNA wind around each other in a right-handed spiral. However, another conformation, called Z-DNA, exists where the strands twist to the left. The function of Z-DNA has remained a mystery since its discovery. A newly published paper unambiguously establishes that the Z-conformation is key to regulating interferon responses involved in fighting viruses and cancer. The study analyzes families with variants in the Z-binding domain of the ADAR gene.

The peer-reviewed results published online in the European Journal of Human Genetics end the long-standing debate as to whether the unusual left-handed Z-conformation has any biological function. Z-DNA forms when right-handed B-DNA is unwound to make RNA. An analysis of genetic mutations in Mendelian families by Alan Herbert at InsideOutBio reveals that the Z-conformation regulates those type I interferon responses normally induced by viruses and tumors. The study confirms a biological role for the left-handed conformation in human disease and reveals that the human genome encodes genetic information using both shape and sequence. The two codes are overlapping, with three-dimensional shapes like Z-DNA and Z-RNA forming dynamically, altering the read-out of sequence information from linear, one-dimensional DNA chromosomal arrays.

One approach to understanding the biological role of Z-DNA has been to isolate proteins that bind specifically to the left-handed Z-DNA conformation and study their function. Alan Herbert and the late Alexander Rich led a team at MIT that identified the Zα domain, which binds very tightly to both Z-DNA and Z-RNA. X-ray studies revealed that the binding was specific for the Z-conformation without any sequence specificity. The co-crystals of Zα and Z-DNA allowed identification of key protein residues in their interaction.

The Zα domain is present in a double-stranded RNA editing enzyme called ADAR. ADAR edits double-stranded RNAs (dsRNAs) that usually form when an RNA transcript basepairs with itself. The enzyme changes adenosine to inosine, which is then read out as guanosine, changing both the information content of the RNA and its downstream processing, generating many different RNA products from a single transcript. Early studies suggested ADAR was involved in anti-viral interferon responses. However, most edited dsRNAs in a cell originate from repetitive Alu elements, fragments of non-coding RNA that colonized the human genome early in its evolution through a process of copy and paste. Recent studies show that suppression of such dsRNAs by ADAR editing is vital to the survival of many tumors.

The discovery of families with mutations in the ADAR gene has now revealed a biological function for the left-handed conformation. Families with loss-of-function ADAR variants over-produce interferon, leading to severe diseases such as Aicardi-Goutières syndrome (OMIM: 615010) and Bilateral Striatal Necrosis/Dystonia. In some families, due to the different ADAR variants inherited from each parent, only one parental chromosome is capable of making ADAR protein. In such families, it is possible to map a mutation directly to phenotype. Individuals with Zα ADAR variants that no longer bind the Z-conformation have impaired dsRNA editing and exaggerated dsRNA-induced interferon responses, confirming that the left-handed Z-conformation regulates these responses. The findings directly link Z-DNA to human disease and unambiguously establish a biological role for this alternative nucleic acid conformation.

The switch in shape from right-handed to left-handed DNA alters the readout from genes involved in the type I interferon pathway. Only a subset of sequences flip to form Z-DNA under physiological conditions. Their distribution within the genome is non-random. These flipons create phenotypic diversity by altering how genes generate RNA. They are subject to selection just like any other variation. The genomes that emerge encode genetic information in both shape and sequence with frequent overlap between the two different instruction sets.

Credit: 
InsideOutBio

Species on the move

image: The European bee-eater (Merops apiaster) is one of the species moving with climate change, traditionally breeding in Africa, southern and central Europe and East Asia -- they have reportedly bred in Nottinghamshire.

Image: 
(c) Elgollimoh, licensed under CC BY-SA 3.0

A total of 55 animal species in the UK have been displaced from their natural ranges or enabled to arrive for the first time on UK shores because of climate change over the last 10 years (2008-2018) - as revealed in a new study published today (18 July 2019) by scientists at international conservation charity ZSL (Zoological Society of London).

Making use of a previously overlooked source of data, the team turned to social media to search for rare species sightings. The researchers conducted searches both on Twitter and Google, attributing 10 out of the 55 species identified to people posting images online of the animals in unusual places.

The study, led by Dr Nathalie Pettorelli of ZSL's Institute of Zoology and published in the Journal of Applied Ecology, explains that, thanks to regular sightings from environmentalists, UK wildlife is among the most intensively monitored in the world, yet there is very little centralised tracking of species that, due to climate change, are arriving for the first time in the country or moving to places outside of their known UK range.

The analysis also considered UK Government environment reports as well as 111 scientific papers, leading to a total of 55 species (out of 39,029 species in the UK) being identified. The research focused solely on species which had established sustainable populations through natural, rather than human-assisted movement.

Little evidence was seen for any one group of animals showing resilience to the pressures of climate change, with invertebrates, mammals and birds all seemingly impacted by rising temperatures. Of the 55 species identified, 64% were invertebrates, and only one was formally classified as an invasive species - the leathery sea squirt (Styela clava).

The black bee fly (Anthrax anthrax), for example, arrived in the UK for the first time in 2016 and was reportedly found using a garden bug-hotel in Cambridgeshire. The Jersey tiger moth (Euplagia quadripunctaria), previously only seen around Jersey and the south coast of England, is now regularly sighted in London.

Bird species moving with climate change include the purple heron (Ardea purpurea) and tropical-looking European bee-eater (Merops apiaster) - identified by keen birdwatchers - which have been nesting in Kent and Nottinghamshire, quite a stretch from their natural breeding grounds in Africa, central and southern Europe and East Asia.

Seeking to understand the impact of these species, the study found that 24% of new species arriving or being displaced were cited as having negative impacts on ecological communities and human society. Damage to crops, biofouling, human disease spread and increased pressure on planning permissions were all regarded as negative impacts. Some positive impact was recorded, with a boost in tourism after sightings of a Eurasian nuthatch (Sitta europaea) in Scotland were reported in 2010 and 2018.

Dr Nathalie Pettorelli, lead author and Senior Research Fellow at ZSL's Institute of Zoology said: "We are currently massively unprepared for the climate-driven movement of species that is happening right now in the UK. As it stands, society is not ready for the redistribution of species, as current policies and agreements are not designed for these novel species and ecological communities - particularly if those species have no perceived value to society.

"Our results suggest that many species are on the move in the UK, and that we can expect a lot of changes in the type of nature we will have around us in the coming years. But the lack of an integrated national platform dedicated to tracking and communicating about species displaced by climate change is currently a hindrance to mitigating those potential ecological, economic and societal associated impacts."

Credit: 
Zoological Society of London

Over-claiming knowledge predicts anti-establishment voting

image: Logistic regression slopes and 95% confidence intervals of anti-establishment voting as a function of self-perceived understanding of the treaty, actual knowledge of the treaty, and general overclaiming.

Image: 
Van Prooijen, Jan-Willem; Krouwel, Andre (2019). Overclaiming Knowledge Predicts Anti-Establishment Voting. Social Psychological and Personality Science.

Washington, DC / Amsterdam, Netherlands - In light of the election and ballot victories of populist, anti-establishment movements, many people have been trying to better understand the behaviors and motivations of voters. Studying voter behavior on an EU treaty, social psychologists in the Netherlands found that knowledge overclaiming predicts anti-establishment voting, particularly at the radical right.

The results of their research are published in the journal Social Psychological and Personality Science.

"Politicians and citizens with strong anti-establishment views, including populist movements, often articulate their views with high confidence," notes van Prooijen. "This research puts that confidence into perspective and suggests that it may often be overconfidence."

Blaming the establishment apparently is a cognitively "easy" way of making sense of the problems that society faces, write Jan-Willem van Prooijen (VU Amsterdam) and Andre Krouwel (VU Amsterdam), coauthors of the study. They note that this occurs for both the political left and the political right, though it tends to appear stronger for the radical right.

Van Prooijen and Krouwel measured and analyzed voter knowledge and behavior before and after the April 6, 2016, Dutch referendum on a European Union (EU) treaty. The referendum asked voters to support or oppose a treaty establishing stronger political and economic connections between the EU and Ukraine.

Questions were sent to a panel of voters six weeks before the referendum; people were asked to rate their own understanding of the treaty, to answer factual questions about it, and to complete a survey on their political views. A total of 13,323 people completed the questionnaire.

Two days after the vote, Van Prooijen and Krouwel followed up with a second round of questions, asking whether people voted in the referendum and how they voted, with the results kept anonymous. This group consisted of 5,568 people from the original panel who had voted and 2,044 who had not.

Comparing the responses with voter behavior and political leanings, they found that each additional point of self-perceived knowledge of the treaty made an anti-establishment vote 1.62 times more likely. Each additional point of actual knowledge, by contrast, multiplied the odds of an anti-establishment vote by 0.85, i.e. reduced them by 15%.
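
For readers unfamiliar with these statistics, odds ratios such as 1.62 and 0.85 are the exponentiated coefficients of a logistic regression. The sketch below uses simulated data (not the study's panel data) to show how such coefficients are estimated and recovered:

```python
import numpy as np
import statsmodels.api as sm

# Simulated data only -- not the Dutch referendum panel. Odds ratios of
# 1.62 and 0.85 correspond to logit coefficients ln(1.62) and ln(0.85).
rng = np.random.default_rng(0)
n = 5000
self_perceived = rng.normal(size=n)   # self-perceived understanding
actual = rng.normal(size=n)           # actual knowledge
logit = np.log(1.62) * self_perceived + np.log(0.85) * actual
vote = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = anti-establishment

X = sm.add_constant(np.column_stack([self_perceived, actual]))
fit = sm.Logit(vote, X).fit(disp=0)
print(np.exp(fit.params[1:]))  # recovers odds ratios near [1.62, 0.85]
```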

"The study does not show that anti-establishment voters are somehow less intelligent, or less concerned with society," says van Prooijen. "Future research may reveal whether the discrepancy between self-perceived understanding and actual knowledge is due to being uninformed or due to being misinformed."

Credit: 
Society for Personality and Social Psychology

Rising CO2, climate change projected to reduce availability of nutrients worldwide

July 17, 2019 - One of the biggest challenges to reducing hunger and undernutrition around the world is to produce foods that provide not only enough calories but also enough of the necessary nutrients. New research finds that, over the next 30 years, climate change and increasing carbon dioxide (CO2) could significantly reduce the availability of critical nutrients such as protein, iron, and zinc, compared to a future without climate change. The total impacts of climate change shocks and elevated levels of CO2 in the atmosphere are estimated to reduce growth in global per capita nutrient availability of protein, iron, and zinc by 19.5%, 14.4%, and 14.6%, respectively.

"We've made a lot of progress reducing undernutrition around the world recently but global population growth over the next 30 years will require increasing the production of foods that provide sufficient nutrients," explained Senior Scientist at the International Food Policy Research Institute (IFPRI) and study co-author Timothy Sulser. "These findings suggest that climate change could slow progress on improvements in global nutrition by simply making key nutrients less available than they would be without it".

The study, "A modeling approach combining elevated atmospheric CO2 effects on protein, iron and zinc availability with projected climate change impacts on global diets," [LINK] was co-authored by an international group of researchers and published in the peer-reviewed journal, Lancet Planetary Health. The study represents the most comprehensive synthesis of the impacts of elevated CO2 and climate change on the availability of nutrients in the global food supply to date.

Using the IMPACT global agriculture sector model along with data from the Global Expanded Nutrient Supply (GENuS) model and two data sets on the effects of CO2 on nutrient content in crops, researchers projected per capita availability of protein, iron, and zinc out to 2050.

Improvements in technology and market effects are projected to increase nutrient availability over current levels by 2050, but these gains are substantially diminished by the negative impacts of rising concentrations of carbon dioxide.

While higher levels of CO2 can boost photosynthesis and growth in some plants, previous research has also found they reduce the concentration of key micronutrients in crops. The new study finds that wheat, rice, maize, barley, potatoes, soybeans, and vegetables are all projected to suffer nutrient losses of about 3% on average by 2050 due to elevated CO2 concentration.

The effects are not likely to be felt evenly around the world, however, and many countries currently experiencing high levels of nutrient deficiency are also projected to be more affected by lower nutrient availability in the future.

Nutrient reductions are projected to be particularly severe in South Asia, the Middle East, Africa South of the Sahara, North Africa, and the former Soviet Union--regions largely comprised of low- and middle-income countries where levels of undernutrition are generally higher and diets are more vulnerable to direct impacts of changes in temperature and precipitation triggered by climate change.

"In general, people in low- and middle-income countries receive a larger portion of their nutrients from plant-based sources, which tend to have lower bioavailability than animal-based sources," said Robert Beach, Senior Economist and Fellow at RTI International and lead author of the study.

This means that many people with already relatively low nutrient intake will likely become more vulnerable to deficiencies in iron, zinc, and protein as crops lose their nutrients. Many of these regions are also expected to see the largest population growth, and thus to require the greatest growth in nutrient availability.

The impact on individual crops can also have disproportionate effects on diets and health. Significant nutrient losses in wheat have especially widespread implications. "Wheat accounts for a large proportion of diets in many parts of the world, so any changes in its nutrient concentrations can have substantial impact on the micronutrients many people receive," added Beach.

Protein, iron, and zinc availability in wheat are projected to be reduced by up to 12% by 2050 in all regions. People will likely experience the largest decreases in protein availability from wheat in places where wheat consumption is particularly high, including the former Soviet Union, Middle East, North Africa, and eastern Europe.

In South Asia, where the population's iron intake already sits well below the recommended level--India exhibits the highest prevalence of anemia in the world--iron availability is projected to remain inadequate. What's more, elevated CO2 levels push the average availability of zinc in the region below the threshold of recommended nutrient intake.

Although the study's models were limited to 2050, Sulser added, "extending the analysis through the second half of this century, when climate change is expected to have even stronger impacts, would result in even greater reductions in nutrient availability."

Researchers also emphasized the need for further work to build upon their findings, including additional study of climate impacts on animal sources, such as poultry, livestock, and fisheries, crops' nutritional composition, nutrient deficiencies resulting from short-term climate shocks, and technologies that could mitigate reductions in nutrient availability.

Quantifying the potential health impacts for individuals also requires a consideration of the many factors beyond food consumption--including access to clean water, sanitation, and education--that influence nutrition and health outcomes.

"Diets and human health are incredibly complex and difficult to predict, and by reducing the availability of critical nutrients, climate change will further complicate efforts to eliminate undernutrition worldwide," Sulser noted.

Credit: 
International Food Policy Research Institute

Some pharmacists missing mark on therapeutic guidelines: QUT study

As the role of pharmacies in providing front-line public health services grows, a QUT study has raised concerns that some are not adhering to therapeutic guidelines when distributing pharmaceuticals.

Researchers from QUT's Business School and Faculty of Health School of Clinical Sciences (Pharmacy) combined to investigate the practices of 200-plus pharmacies in Brisbane in supplying the morning-after pill and treatments for conjunctivitis.

The results have just been published in JAMA Network Open, the open-access journal of the American Medical Association.

"The frequency in which both the overuse and underuse of drugs occurs is problematic worldwide and has significant flow-on effects for public health policy across the globe," said Professor Uwe Dulleck from QUT's School of Economics and Finance.

"A defining feature of modern health care policy is the push for more autonomy in self-treatment and management."

Professor Greg Kyle, from QUT's School of Clinical Sciences, said pharmacies now played a crucial role in the consumer health mix, prompting the Pharmaceutical Society of Australia in 2016 to argue pharmacists should play a larger part in reducing the burden of increasing health costs across the sector.

"This is an excellent concept but it has also highlighted the importance of understanding the interplay of the myriad factors relevant in diagnosis and treatment provision in such a setting to ensure effective and efficient outcomes for all parties," he said.

"In other words, consumers should be given the appropriate advice for over-the-counter medications in such transactions.

"It seems that is not always the case. Financial considerations may play a role but oversupply could also be due to pharmacists trying to deliberately reduce risk of adverse outcomes - ¬ for example, when the patient's GP is not available late at night or on a weekend.

"We can only observe that the supply decisions are not in line with the recommendations of the therapeutic guidelines."

Professor Dulleck said researchers from QUT's School of Clinical Sciences created mock scenarios to seek evidence on whether over-the-counter medication dispensing behaviour followed practice standards and guidelines.

Research assistants acted as mystery shoppers at pharmacies requesting the emergency hormonal contraceptive pill, which is considered effective if taken within 72 hours of sexual activity. There was only one option available for pharmacists in Australia at the time of the study.

"When our 'consumer' presented saying they were within the 72 hours, all the pharmacists adhered to the therapeutic guidelines," Professor Dulleck said.

"However, in the case of those who said it had been more than 72 hours, only one of every two followed the guidelines, with 47.7 per cent of pharmacists selling the pill despite there being limited evidence of its effectiveness in such a timeframe. Many also referred the research assistant to a doctor, but not all.

"Other mystery shoppers requested help for either viral or bacterial conjunctivitis - both of which respond to different treatments - the findings support the emergency contraceptive results.

"Data collected from the visits to 205 pharmacists in the wider Brisbane area showed only 57.6 per cent followed dispensing behaviour compliant with the protocol, while 31.3 per cent involved some form of overtreatment or overselling of medication."

Professor Dulleck said the study was of interest for researchers in health as well as those in economics.

"Pharmacists are experts in advising their customers on the right over-the-counter-medication," he said.

"Economists refer to goods and services of this kind as 'credence goods'. Understanding the behaviours of experts in such situation is of relevance to many parts of the economy including financial advice, home repairs, legal services and education.

"When consumers have to rely on an expert for advice, especially in health-related matters, the level of trust in relation to their credential is elevated."

Credit: 
Queensland University of Technology

Red wine's resveratrol could help Mars explorers stay strong, says Harvard study

Mars is about 9 months from Earth with today's tech, NASA reckons. As the new space race hurtles forward, Harvard researchers are asking: how do we make sure the winners can still stand when they reach the finish line?

Published in Frontiers in Physiology, their study shows that resveratrol substantially preserves muscle mass and strength in rats exposed to the wasting effects of simulated Mars gravity.

Space supplements

Out in space, unchallenged by gravity, muscles and bones weaken. Weight-bearing muscles are hit first and worst, like the soleus muscle in the calf.

"After just 3 weeks in space, the human soleus muscle shrinks by a third," says Dr. Marie Mortreux, lead author of the NASA-funded study at the laboratory of Dr. Seward Rutkove, Beth Israel Deaconess Medical Center, Harvard Medical School. "This is accompanied by a loss of slow-twitch muscle fibers, which are needed for endurance."

To allow astronauts to operate safely on long missions to Mars - whose gravitational pull is just 40% of Earth's - mitigating strategies will be needed to prevent muscle deconditioning.

"Dietary strategies could be key," says Dr. Mortreux, "especially since astronauts travelling to Mars won't have access to the type of exercise machines deployed on the ISS."

A strong candidate is resveratrol: a compound commonly found in grape skin and blueberries that has been widely investigated for its anti-inflammatory, anti-oxidative, and anti-diabetic effects.

"Resveratrol has been shown to preserve bone and muscle mass in rats during complete unloading, analogous to microgravity during spaceflight. So, we hypothesized that a moderate daily dose would help mitigate muscle deconditioning in a Mars gravity analogue, too."

Mars rats

To mimic Mars gravity, the researchers used an approach first developed in mice by Mary Bouxsein, PhD, also at Beth Israel Deaconess, in which rats were fitted with a full-body harness and suspended by a chain from their cage ceiling.

Thus, 24 male rats were exposed to normal loading (Earth) or 40% loading (Mars) for 14 days. In each group, half received resveratrol (150 mg/kg/day) in water; the others got just the water. Otherwise, they fed freely from the same chow.

Calf circumference and front and rear paw grip force were measured weekly, and at 14 days the calf muscles were analyzed.

Resveratrol to the rescue

The results were impressive.

As expected, the 'Mars' condition weakened the rats' grip and shrank their calf circumference, muscle weight and slow-twitch fiber content.

But incredibly, resveratrol supplementation almost entirely rescued front and rear paw grip in the Mars rats, to the level of the non-supplemented Earth rats.

What's more, resveratrol completely protected muscle mass (soleus and gastrocnemius) in the Mars rats, and in particular reduced the loss of slow-twitch muscle fibers. The protection was not complete, though: the supplement did not entirely rescue average soleus and gastrocnemius fiber cross-sectional area, or calf circumference.

As reported previously, resveratrol did not affect food intake or total body weight.

Perfecting the dose

Previous resveratrol research can explain these findings, says Dr. Mortreux.

"A likely factor here is insulin sensitivity.

"Resveratrol treatment promotes muscle growth in diabetic or unloaded animals, by increasing insulin sensitivity and glucose uptake in the muscle fibers. This is relevant for astronauts, who are known to develop reduced insulin sensitivity during spaceflight."

The anti-inflammatory effects of resveratrol could also help to conserve muscle and bone, and other anti-oxidant sources such as dried plums are being used to test this, adds Dr. Mortreux.

"Further studies are needed to explore the mechanisms involved, as well as the effects of different doses of resveratrol (up to 700 mg/kg/day) in both males and females. In addition, it will be important to confirm the lack of any potentially harmful interactions of resveratrol with other drugs administered to astronauts during space missions."

Credit: 
Frontiers

Salt regulations linked to 9,900 cases of cardiovascular disease and 1,500 cancer cases

A relaxation of UK industry regulation of salt content in food has been linked with 9,900 additional cases of cardiovascular disease, and 1,500 cases of stomach cancer.

Researchers from Imperial College London and the University of Liverpool analysed the salt intake of the population in England over thirteen years to compare the effect of changes in regulations on how much salt manufacturers can use in their products.

The team, who published their work in the Journal of Epidemiology and Community Health, found that since the regulations on industry had been relaxed in 2011, the national decline in salt intake has stalled.

Prior to 2011, salt intake was falling annually by 0.2g a day for men, and 0.12g a day for women. However, post 2011, when the regulations were relaxed, annual declines slowed to 0.11g per day for men, and 0.07g per day for women.
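
A back-of-the-envelope projection (an illustration using the decline rates above and an assumed 2011 baseline, not the authors' model) shows how the slower decline compounds over the study period:

```python
# Sketch only: projects men's daily salt intake from a hypothetical
# 2011 baseline of 9.0 g/day under the two observed annual declines.
baseline = 9.0          # g/day in 2011 (assumed for illustration)
years = 6               # 2011-2017

pre_2011_rate = 0.20    # g/day reduction per year before deregulation
post_2011_rate = 0.11   # g/day reduction per year after deregulation

counterfactual = baseline - pre_2011_rate * years   # 7.80 g/day by 2017
observed = baseline - post_2011_rate * years        # 8.34 g/day by 2017
print(observed - counterfactual)                    # ~0.54 g/day extra salt
```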

Between 2003-2010, the independent Food Standards Agency (FSA) closely monitored salt content and agreed on targets for salt reduction with industry. For instance, the FSA requested all food manufacturers reduce salt content in 85 categories by 10-20 per cent. This was accompanied by public health campaigns on how to reduce salt consumption, as well as pressure from the Government, who threatened further regulation. This policy has since been emulated in many countries worldwide.

However, in 2011 this policy was replaced by the Public Health Responsibility Deal. Here, the food industry set their own targets as part of voluntary pledges they could sign-up to and they were asked to report their progress to the Department of Health.

But the newer system lacked robust and independent target setting, monitoring, and enforcement, argues Dr Anthony Laverty, lead author of the research from Imperial's School of Public Health: "Evidence from around the world is now showing that mandatory approaches are much more effective than self- regulation by industry in reducing the amount of salt and sugar in our diet."

Salt intake has been linked to an increase in cardiovascular disease - including heart attacks and stroke - as well as to an increased risk of stomach cancers. Adults are advised to eat no more than 6g (one teaspoon) a day, or 3g a day for children. However, adults in England are thought to eat an average of 8g a day and two in three people consume too much salt - with most of this coming from processed foods such as bread, processed meats and ready meals.

The team estimated the number of cases of cardiovascular disease and stomach cancer associated with the slowing of reductions in daily salt intake seen under the change of policy, from 2011 to 2017. They found that an extra 9,900 cases of cardiovascular disease and 1,500 cases of stomach cancer could be attributable to the switch from the independent FSA-led policy to the Responsibility Deal.

The researchers also found this additional disease burden hit the UK economy, as well as the nation's health. They estimated that relaxing the food industry regulations cost the economy around £160 million from 2011-2017. That included the healthcare costs of extra heart attacks, strokes and cancer cases, and the loss of productivity due to workplace absences.

Without urgent and decisive action, the researchers predict even more avoidable heart attacks, strokes and cancer cases. Using a computer model, the team estimated future cases of cardiovascular disease and stomach cancer if the average reduction in salt intake continues at the current rate. They estimated an additional 26,000 cases of cardiovascular disease and 3,800 cases of stomach cancer between 2019 and 2025. This could cost the economy an additional £960 million over the next six years.

The team used data from the National Diet and Nutrition Survey and national salt surveys conducted between 2000 and 2013. These surveys included urine analysis of over 3,000 people, which showed in detail how much salt they were eating every day, and acted as a representative sample of the population. The team acknowledge that although the data has limitations, the findings confirm previous criticisms that the Responsibility Deal was not an effective tool to improve the nation's health.

Professor Martin O'Flaherty, co-author of the study from the University of Liverpool added: "We are eating too much salt. Previous research has shown three-quarters of salt in our diet is hidden in processed food such as bread, ready meals and soups. The FSA approach was one of the most robust strategies internationally. Our research shows that we now need an equally robust mandatory programme to accelerate salt intake reduction. This will require clear targets and penalties to ensure the food industry reduce salt content in foods. Softer, voluntary measures could generate additional heart attacks, strokes and cancer cases."

Credit: 
Imperial College London

The Lancet: Big Sugar and neglect by global health community fuel oral health crisis

Oral diseases present a major global public health burden, affecting 3.5 billion people worldwide, yet oral health has been largely ignored by the global health community, according to a new Lancet Series on Oral Health.

With a treat-over-prevent model, modern dentistry has failed to combat the global challenge of oral diseases, giving rise to calls for the radical reform of dental care

The burden of oral diseases is on course to rise as more people are exposed to the underlying risk factors of oral diseases, including sugar, tobacco and alcohol

Emerging evidence of the food, beverage, and sugar industry's influence on dental research and professional bodies raises fresh concern

Oral health has been isolated from traditional healthcare and health policy for too long, despite the major global public health burden of oral diseases, according to a Lancet Series on Oral Health, published today in The Lancet. Failure of the global health community to prioritise the global burden of oral health has led to calls from Lancet Series authors for the radical reform of dental care, tightened regulation of the sugar industry, and greater transparency around conflict of interests in dental research.

Oral diseases, including tooth decay, gum disease and oral cancers, affect almost half of the global population, with untreated dental decay the most common health condition worldwide. Lip and oral cavity cancers are among the top 15 most common cancers in the world. In addition to lower quality of life, oral diseases have a major economic impact on both individuals and the wider health care system.

In the United States (U.S.), accessing dental care presents a higher cost barrier than any other healthcare service, and the U.S. also has the highest dental expenditure of any country globally ($129.1 billion). [1]

The Lancet Series on Oral Health, led by University College London (UCL) researchers, brought together 13 academic and clinical experts from 10 countries, including the US, to better understand why oral diseases have persisted globally over the last three decades, despite scientific advancements in the field, and why prevalence has increased in low- and middle-income countries (LMIC) and among socially disadvantaged and vulnerable people, no matter where they live. [2]

A tipping point for global oral health

"Dentistry is in a state of crisis," said Professor Richard Watt, Chair and Honorary Consultant in Dental Public Health at UCL and lead author of the Series. "Current dental care and public health responses have been largely inadequate, inequitable, and costly, leaving billions of people without access to even basic oral health care. While this breakdown in the delivery of oral healthcare is not the fault of individual dental clinicians committed to caring for their patients, a fundamentally different approach is required to effectively tackle to the global burden of oral diseases." [3]

In high-income countries (HIC), dentistry is increasingly technology-focused and trapped in a treatment-over-prevention cycle, failing to tackle the underlying causes of oral diseases. Oral health conditions share many of the same underlying risk factors as non-communicable diseases, such as sugar consumption, tobacco use and harmful alcohol consumption.

Professor Robert J. Weyant, DMD, DrPH, Chair of the Department of Dental Public Health at the University of Pittsburgh, said: "The U.S. continues to struggle with how best to ensure access to affordable dental care for many individuals. This has led to ongoing suffering for many with oral disease and significant disparities in oral health for vulnerable populations such as poor families, ethnic minorities, and the elderly. The Affordable Care Act helped to expand access to dental care for millions, but many still remain unable to receive needed care, highlighting an urgent need for improvements in dental health policy." [4]

In middle-income countries the burden of oral diseases is considerable, but oral care systems are often underdeveloped and unaffordable to the majority. In low-income countries the current situation is most bleak, with even basic dental care unavailable and most disease remaining untreated.

Coverage for oral health care in LMIC is vastly lower than in HIC, with median estimates of 35% in low-income, 60% in lower-middle-income, 75% in upper-middle-income, and 82% in high-income countries.

Sugar, alcohol and tobacco industries fuel global burden

The burden of oral diseases is on course to rise as more people are exposed to the main risk factors of oral diseases. Sugar consumption, the underlying cause of tooth decay, is rising rapidly across many LMIC. While consumption of sugary drinks is highest in HIC, the growth in their sales in many LMIC is substantial. By 2020, Coca-Cola intends to spend US$12 billion on marketing its products across Africa [5]; by contrast, WHO's total annual budget was $4.4 billion (2017).

"The use of clinical preventive interventions such as topical fluorides to control tooth decay is proven to be highly effective, yet because it is seen as a 'panacea', it can lead to many losing sight of the fact that sugar consumption remains the primary cause of disease development." said Watt. "We need tighter regulation and legislation to restrict marketing and influence of the sugar, tobacco and alcohol industries, if we are to tackle the root causes of oral conditions."

Writing in a linked commentary, Cristin E Kearns of the University of California and Lisa A Bero of the University of Sydney raise additional concerns with the financial links between dental research organisations and the industries responsible for many of these risk factors.

"Emerging evidence of industry influence on research agendas contributes to the plausibility that major food and beverage brands could view financial relationships with dental research organisations as an opportunity to ensure a focus on commercial applications for dental caries interventions--eg, xylitol, oral hygiene instruction, fluoridated toothpaste, and sugar-free chewing gum--while deflecting attention from harm caused by their sugary products."

The Lancet Series authors argue that there is a pressing need for clearer and more transparent conflict-of-interest policies and procedures, and for restricting and clarifying the sugar industry's influence on dental research and oral health policy.

Radical reform of dentistry needed

Lancet Series authors have called for wholesale reform of the dental care model in five key areas:

1. Close the divide between dental and general healthcare

2. Educate and train the future dental workforce with an emphasis on prevention

3. Tackle oral health inequalities through a focus on inclusivity and accessibility

4. Take a stronger policy approach to address the underlying causes of oral diseases

5. Redefine the oral health research agenda to address gaps in LMIC knowledge

Dr Jocalyn Clark, an Executive Editor at The Lancet, said: "Dentistry is rarely thought of as a mainstream part of healthcare practice and policy, despite the centrality of the mouth and oral cavity to people's well-being and identity. A clear need exists for broader accessibility and integration of dental services into healthcare systems, especially primary care, and for oral health to have more prominence within universal health coverage commitments. Everyone who cares about global health should advocate to end the neglect of oral health."

APPENDIX OF KEY FACTS & STATISTICS

Oral disease: types and causes

The key oral health conditions include: dental caries (tooth decay), the localised destruction of dental hard tissues (enamel and dentine) by acidic by-products from the bacterial fermentation of free sugars; periodontal (gum) disease, a group of chronic inflammatory conditions that affect the tissues surrounding and supporting the teeth; and oral cancer, of which squamous cell carcinoma is the most common type.

The main cause of periodontal disease is poor oral hygiene leading to an accumulation of pathogenic microbial biofilm (plaque) at and below the gingival margin. Tobacco use is also an important independent risk factor for periodontal disease.

The major risk factors for oral cancers are tobacco use, alcohol consumption, and areca nut (betel quid) chewing. In many high-income countries (HIC), human papilloma virus (HPV) infection is responsible for a steep rise in the incidence of oropharyngeal cancers among young people.

Oral diseases can lower quality of life in many ways, including pain, infections, problems with eating and speaking, diminished confidence, and disruption to social, work, and school activities.

The global burden of oral disease

The most recent data from 2015 confirm that untreated caries in the permanent dentition remain the most common health condition globally (34·1%).

A 4% decrease in the number of prevalent cases of untreated dental caries occurred globally from 1990 (31,407 cases per 100,000) to 2017 (30,129 cases per 100,000).
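
As a quick arithmetic check, the quoted 4% follows directly from those two per-100,000 figures:

```python
# Verify the quoted percentage decrease from the per-100,000 prevalence figures.
cases_1990 = 31_407   # untreated caries per 100,000, 1990
cases_2017 = 30_129   # untreated caries per 100,000, 2017
decrease = (cases_1990 - cases_2017) / cases_1990 * 100
print(f"{decrease:.1f}% decrease")   # -> 4.1%, i.e. roughly the quoted 4%
```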

The global burden of untreated dental caries for primary and permanent dentition has remained relatively unchanged over the past 30 years.

Epidemiological evidence indicates that lifetime prevalence of dental caries has decreased in the past four decades, but this is mainly in HIC, with the most substantial decrease seen in 12-year-old children.

Data from 2018 show that oral cancer has the highest incidence among all cancers in Melanesia and south Asia among males, and is the leading cause of cancer-related mortality among males in India and Sri Lanka.

Inequalities in oral disease

Case-control studies showed a consistent association between low socioeconomic status and oral cancer in both LMIC and HIC, even after adjustment for behavioural confounders.

Extreme oral health inequalities exist for the most marginalised and socially excluded groups in societies, such as homeless people, prisoners, those with long-term disabilities, refugees, and indigenous groups, which serves as a classic example of a so-called 'cliff edge' of inequality.

Indigenous children, even in HIC (Australia, Canada, New Zealand, and USA), are particularly vulnerable, with the prevalence of early childhood caries ranging from 68% to 90%.

Prevention

WHO recommends that free sugars intake be restricted to less than 10% of total energy, and highlights that for further health benefits it should be reduced to below 5% of total energy; however, many countries do not meet these guidelines.
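
To make those thresholds concrete, here is a back-of-envelope conversion into grams per day, assuming an illustrative 2,000 kcal reference diet (the diet size is an assumption for illustration; the 4 kcal per gram energy content of sugar is standard):

```python
# Convert the WHO free-sugars thresholds into grams per day for an
# illustrative 2,000 kcal/day diet.
daily_energy_kcal = 2000      # illustrative reference diet, not a WHO figure
kcal_per_gram_sugar = 4       # standard energy content of sugar

for share in (0.10, 0.05):    # WHO's <10% guideline and <5% further target
    grams = daily_energy_kcal * share / kcal_per_gram_sugar
    print(f"{share:.0%} of energy = {grams:.0f} g of free sugars per day")
```

On those assumptions, the 10% guideline works out to about 50 g of free sugars a day, and the 5% target to about 25 g.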

While topical fluorides are proven clinical preventive agents, caries will still develop where free sugars intake exceeds 10% of total energy. Even where exposure to fluoride is optimal, evidence suggests that much lower levels of free sugars intake may still carry a risk of caries.

Credit: 
The Lancet