
Despite progress, only 3 African nations expected to meet global breastfeeding goal


Estimates indicate that more than 800,000 child deaths could be averted annually
with optimal breastfeeding practices, according to the World Health Organization

Detailed interactive maps illustrate where countries are advancing and falling behind

SEATTLE - Only three African countries are expected to meet the global target for exclusive breastfeeding, "an unparalleled source of nutrition for newborns and infants, no matter where they are born," according to a global health expert.

The three nations, Guinea-Bissau, Rwanda, and São Tomé and Príncipe, are singled out in a new study from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington School of Medicine. The study, published today in Nature Medicine in advance of World Breastfeeding Week, Aug. 1-7, finds areas of persistent low prevalence in countries that have made progress overall. Detailed maps accompanying the analysis reveal vulnerable populations, especially those living in rural areas and in extreme poverty.

However, researchers note that several nations, including Burundi, Rwanda, and parts of Ethiopia, Uganda, and Zambia, had among the highest rates of exclusive breastfeeding in both 2000 and 2017. Sudan had some of the "highest and most consistent rates of increase" toward the exclusive breastfeeding goal of the World Health Organization (WHO) - prevalence by 2025 of at least 50% nationwide. The Global Burden of Disease, the annual comprehensive health study, attributed 169,000 child deaths to lack of breastfeeding in 2017, more than half of them in sub-Saharan Africa. Moreover, according to the WHO, increasing breastfeeding to near-universal levels could save more than 800,000 lives every year, the majority being children under 6 months.

The paper examines breastfeeding prevalence down to the level of individual districts and municipalities and compares progress among 49 African nations. The paper is accompanied by an interactive visualization tool that allows users to compare prevalence of exclusive breastfeeding within and across countries, look at the rate of change over time, and see the probability of meeting WHO's goal by 2025.

The value of exclusive breastfeeding cannot be overstated, researchers say.

"Breastfeeding is an unparalleled source of nutrition for newborns and infants, no matter where they are born. If we are serious about ensuring that every infant is offered a healthy start in life, we need to know who isn't being reached with the support they need to breastfeed," said Dr. Ellen Piwoz of the Bill & Melinda Gates Foundation. "By illustrating where exclusive breastfeeding rates are falling behind, these maps are a powerful tool to help policymakers and practitioners examine and act on disparities within their countries."

In 2017, at least a two-fold difference in exclusive breastfeeding prevalence existed across districts in 53% of countries, a three-fold difference in 14% of countries, and a more than six-fold difference in Niger and Nigeria.

Exclusive breastfeeding refers to mothers using only breast milk to feed their children for the first six months, with medications, oral rehydration salts, and vitamins as needed. The practice provides essential nutrition and can prevent infection and disease, particularly in areas without access to clean water.

Credit: 
Institute for Health Metrics and Evaluation

Are American nurses prepared for a catastrophe? New study says perhaps not

On average, American colleges and universities with nursing programs offer about one hour of instruction in handling catastrophic situations such as nuclear events, pandemics, or water contamination crises, according to two recent studies coauthored by a nursing professor at the University of Tennessee, Knoxville.

"Events that can cause greater impact but are less likely to occur usually receive fewer training hours," said Roberta Lavin, executive associate dean and professor in UT's College of Nursing. Lavin is coauthor of the studies published in the Journal of Perinatal and Neonatal Nursing and Nursing Outlook.

The studies' results come from two surveys that were sent to all colleges and universities that offer nursing programs in the United States.

The surveys revealed that most students said they were not getting enough instruction in emergency response, while professors and lecturers said they were not prepared to teach how to offer care during and after catastrophic situations.

"Emergencies are not just the exact moment a disaster hits; it is also the aftermath. How do we evacuate a town? How do we carry out care for other chronic, sometimes life-lasting consequences that derive from these situations? That is the big challenge," said Lavin.

One study examined the management of Zika fever and water contamination crises and focused on nurses' preparedness to care for pregnant women and children, two populations that are often overlooked in emergency plans.

In addition to nursing schools, that same study also assessed the preparedness of Master of Public Health programs, medical schools, and Doctor of Osteopathy programs in America.

"Even though all accreditation standards require this type of preparation, we are not putting enough emphasis on it," said Lavin.

Lavin and her coauthors now are working to offer resources to help close that knowledge gap. One of the actions they are taking is to design educational modules for instructors to use in their classes. The units are licensed under Creative Commons and can be downloaded free of charge; users can adjust the courses to meet the needs of their communities.

"We are putting people out there to attend these emergencies, and we owe it to them to prepare them right," Lavin said.

Credit: 
University of Tennessee at Knoxville

Understanding the drivers of a shift to sustainable diets

One of the 21st century's greatest challenges is to develop diets that are both sustainable for the planet and good for our bodies. An IIASA-led study explored the major drivers of widespread shifts to sustainable diets using a newly developed computational model of population-wide behavioral dynamics.

High meat consumption - especially of red and processed meat - has been linked to poor health outcomes, including diabetes, heart disease, and various cancers. Livestock farming for meat production also has a massive environmental footprint. It contributes to deforestation to make room for livestock, leads to land and water degradation and biodiversity loss, and, given the meat industry's considerable methane emissions, produces as much greenhouse gas (GHG) as all the world's cars, trucks, and airplanes combined. Accordingly, several studies have demonstrated that diet change, especially lowering red meat consumption, can significantly contribute to the mitigation of climate change and environmental degradation, while also being conducive to better public health.

Previous studies on diet change scenarios involving lowered meat consumption, which were mostly based on stylized diets or average consumption values, showed promising results in terms of alleviating environmental degradation. If, for example, the world's average diet became flexitarian by 2050 - that is, if people limited their red meat consumption to one serving per week and white meat to half a portion per day - the GHG emissions of the agriculture sector would be reduced by around 50%. This may seem like an easy change to make, but research shows that, due to the scale of behavioral change required, most of these scenarios will be difficult to achieve. In their study published in Nature Sustainability, researchers from IIASA and the University of Koblenz-Landau explored the major behavioral drivers of widespread shifts to sustainable diets.

"The human behavior aspect of such large-scale diet changes has, to our knowledge, not been studied before in relation to the food system, although we need this information to understand how such a global change can be achieved. Our study covers this gap with a computational model of population-wide behavioral dynamics," explains Sibel Eker, a researcher in the IIASA Ecosystems Services and Management Program and lead author of the study.

Eker and her colleagues adapted the land use module of an integrated assessment model to serve as a platform from where the population dynamics of dietary changes and their environmental impacts could be explored. They drew on environmental psychology to mimic population dynamics based on prominent psychological theories and included factors such as income, social norms, climate risk perception, health risk perception, and the self-efficacy of individuals, while considering the heterogeneity of their age, gender, and education levels. They then ran the model in an exploratory fashion to simulate the dynamics of dietary shifts between eating meat and a predominantly plant-based diet at the global scale. This computational analysis allowed them to identify the main drivers of widespread dietary shifts.
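A population-level model of this kind can be sketched in a few lines. The toy simulation below is an illustration only, not the IIASA model: it tracks the fraction of the population on a predominantly plant-based diet, with adoption driven by risk perception, self-efficacy, and a social-norm term that strengthens as adoption spreads. All parameter names and values are hypothetical assumptions.

```python
# Minimal illustrative sketch (not the IIASA model) of population-wide
# diet dynamics. All parameters and values here are hypothetical.

def simulate(norm_weight, efficacy, risk_perception,
             years=30, steps_per_year=12, v0=0.05):
    """Return the fraction v of the population on a plant-based diet."""
    dt = 1.0 / steps_per_year
    v = v0
    for _ in range(years * steps_per_year):
        # Adoption grows with perceived risk and self-efficacy, plus a
        # social-norm signal that strengthens as more people adopt.
        adopt = (risk_perception * efficacy + norm_weight * v) * (1 - v)
        # A small constant relapse rate back to a meat-based diet.
        abandon = 0.05 * v
        v += dt * (adopt - abandon)
        v = min(max(v, 0.0), 1.0)  # keep the fraction in [0, 1]
    return v

weak_norms = simulate(norm_weight=0.1, efficacy=0.3, risk_perception=0.1)
strong_norms = simulate(norm_weight=0.6, efficacy=0.3, risk_perception=0.1)
```

In this sketch, strengthening the social-norm feedback raises long-run adoption far more than the same risk-perception level alone, mirroring the study's finding that norms and self-efficacy dominate risk perception.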

The results indicate that social norms - the unwritten rules of behavior that are considered acceptable in a group or society - along with self-efficacy are the key drivers of population-wide dietary shifts, playing an even more important role than both climate and health risk perception. The team also found that diet changes are particularly influenced by how quickly social norms spread in the young population and by the self-efficacy of women in particular. Focusing on the behavioral factors highlighted in this study could therefore help in designing policy interventions or communication campaigns, where community-building activities or empowering messages could be employed in addition to communicating information about the climate and health risks of meat consumption.

According to the researchers, to their knowledge, theirs is the first coupled model of climate, diet, and behavior. The modeling framework they developed is general and can be adapted to address new research questions, thus opening the door to many potential applications to explore connections between behavior, health, and sustainability. They plan to collect more data from sources like social media to quantify their model, as well as focus on specific cases where cultural values and traditions also play an important role in whether people are willing to adapt their behavior or not.

"We can use models to explore the social and behavioral aspects of climate change and sustainability problems in the same way as we explore the economic and environmental dimensions of our world. In this way, we get a better understanding of what works to steer the lifestyle changes required for sustainability and climate change mitigation. As lifestyle change is a key driver of climate change mitigation, this modeling exercise can be seen as an example of how we can integrate human behavior and lifestyle changes into integrated assessment models for a richer scenario exploration," concludes Eker.

Credit: 
International Institute for Applied Systems Analysis

More sensitive climates are more variable climates, research shows

A decade without any global warming is more likely to happen if the climate is more sensitive to carbon dioxide emissions, new research has revealed.

A team of scientists from the University of Exeter and the Centre for Ecology and Hydrology in the UK has conducted pioneering new research into why both surges and slowdowns of warming take place.

Using sophisticated climate models, the team, led by PhD student Femke Nijsse, discovered that models with climates more sensitive to CO2 concentrations also displayed larger variations of warming over a decade.

When combined with information from simulations without any carbon dioxide increases, the authors were able to assess the natural variability of each climate model.

The research is published this week in Nature Climate Change.

Femke Nijsse, from the University of Exeter, said: "We were surprised to see that even when we took into account that sensitive climate models warm more over the last decades of the 20th century, these sensitive models were still more likely to have short periods of cooling."

Climate sensitivity, which sits at the very heart of climate science, is the amount of global warming that takes place as atmospheric CO2 concentrations rise.

For many years, estimates have put climate sensitivity somewhere between 1.5°C and 4.5°C of warming for a doubling of pre-industrial CO2 levels.

The study found that cooling - or "hiatus" - decades were more than twice as likely around the turn of the century in high-sensitivity models (models that warm 4.5°C after doubling CO2) compared to low-sensitivity models (models that warm 1.5°C after doubling CO2).

Co-author Dr. Mark Williamson, a research fellow at Exeter, said: "This does not mean that the presence of a global warming slowdown at the beginning of the 21st century implies we live in a highly sensitive world.

"By looking at all decades together, we get a better picture and find observations are broadly consistent with a central estimate of climate sensitivity."

Ms Nijsse added: "We still don't exactly know how much the climate system will heat up, nor do we know exactly what the range of natural variability in trends will be over the coming decades. But our study shows that these risks should not be considered as separate."

The paper also studied the chance that a decade in the 21st century could warm by as much as the entire 20th century - a scenario that the research team call "hyperwarming".

Under a scenario where carbon dioxide emissions from fossil fuels continue to increase, the chance of hyperwarming is even more dependent on climate sensitivity than the long-term global warming trend.

Increasing the climate sensitivity by 50% from a central estimate of 3°C would increase the mean global warming to the end of this century by slightly less than 50%, but would increase the chance of a hyperwarming decade by more than a factor of ten.
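This disproportionate effect on the tail can be illustrated with a toy calculation (the numbers below are hypothetical, not the paper's): if decadal warming trends are roughly normally distributed, shifting the mean up by under 50% while the spread also grows multiplies the probability of an extreme decade many times over.

```python
from statistics import NormalDist

# Toy illustration (hypothetical numbers, not the study's): decadal
# warming trends modeled as normal distributions for a central and a
# higher climate sensitivity. More sensitive climates in the study both
# warm faster and vary more, so mean AND spread increase together.
central = NormalDist(mu=0.45, sigma=0.10)    # °C per decade
sensitive = NormalDist(mu=0.65, sigma=0.15)  # mean up ~44%, spread up 50%

threshold = 0.7  # an arbitrary "hyperwarming" decadal trend, in °C

p_central = 1 - central.cdf(threshold)
p_sensitive = 1 - sensitive.cdf(threshold)

# The mean rises by less than 50%, yet the probability of exceeding the
# threshold rises far more than tenfold.
ratio = p_sensitive / p_central
```

Because the threshold sits in the distribution's tail, a modest shift in the mean combined with extra variability produces a dramatic increase in exceedance probability, which is the qualitative point the study makes.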

Credit: 
University of Exeter

Plants defend against insects by inducing 'leaky gut syndrome'

image: Fall armyworms are pests of corn plants.

Image: 
Nick Sloff, Penn State

UNIVERSITY PARK, Pa.--Plants may induce "leaky gut syndrome"--permeability of the gut lining--in insects as part of a multipronged strategy for protecting themselves from being eaten, according to researchers at Penn State. By improving our understanding of plant defenses, the findings could contribute to the development of new pest control methods.

"We found that a combination of physical and chemical defenses in corn plants can disrupt the protective gut barriers of fall armyworms, creating opportunities for gut microbes to invade their body cavities," said Charles Mason, postdoctoral scholar in entomology. "This can cause septicemia, which can kill the insect, or simply trigger an immune response, which can weaken the insect."

The researchers reared fall armyworms in the laboratory and inoculated them with one of three types of naturally occurring gut bacteria. They fed the insects on one of three types of maize--one that is known to express enzymes that produce perforations in insect gut linings; one that is characterized by numerous elongated trichomes, or fine hairs that occur on the surface of the plant and help defend against herbivores; and one that has just a few short trichomes. The team used scanning electron microscopy to evaluate the impacts of the various bacteria and maize types on the integrity of the fall armyworms' gut linings.

The scientists found that the presence of all three types of gut bacteria decreased the ability of fall armyworm larvae to damage maize plants, especially when other defenses--such as elongated trichomes and enzymes, both of which can perforate gut linings--were present. However, the species of gut bacteria varied in the extent to which they weakened the insects. The results will appear in the July 22 issue of Proceedings of the National Academy of Sciences.

"Our results reveal a mechanism by which some plants use insects' gut microbiota against them in collaboration with their own defenses," said Mason.

Gary Felton, professor and head of the Department of Entomology, noted that the results should have broad significance towards understanding the ecological function of plant defenses.

"In the context of our study, disparate plant defenses, such as leaf trichomes and plant enzymes, all require certain gut microbes for their optimal defense against herbivores," he said. "Our results predict that the variation in the effectiveness of plant defenses in nature may be, in significant part, due to the variability observed in the microbial communities of insect guts."

The team said the results could help to inform the development of insect-resistant crops.

"It may be advantageous to 'stack' plant defenses that target the insect gut in order to create a 'leaky gut' that exposes the insect to microbial assaults on their immune system," said Mason.

Credit: 
Penn State

The early days of the Milky Way revealed

video: The universe 13,000 million years ago was very different from the universe we know today. It is understood that stars were forming at a very rapid rate, forming the first dwarf galaxies, whose mergers gave rise to the more massive present-day galaxies, including our own.

Image: 
Gabriel Pérez Díaz, SMM (IAC).

The universe 13,000 million years ago was very different from the universe we know today. It is understood that stars were forming at a very rapid rate, forming the first dwarf galaxies, whose mergers gave rise to the more massive present-day galaxies, including our own. However, the exact chain of events that produced the Milky Way was not known until now.

Exact measurements of position, brightness and distance for around a million stars of our galaxy within 6,500 light years of the sun, obtained with the Gaia space telescope, have allowed a team from the IAC to reveal some of its early stages. "We have analyzed, and compared with theoretical models, the distribution of colours and magnitudes (brightnesses) of the stars in the Milky Way, splitting them into several components: the so-called stellar halo (a spherical structure which surrounds spiral galaxies) and the thick disc (stars forming the disc of our Galaxy, but occupying a certain height range)," explains Carme Gallart, a researcher at the IAC and the first author of this article, which is published today in the journal Nature Astronomy.

Previous studies had discovered that the Galactic halo showed clear signs of being made up of two distinct stellar components, one dominated by bluer stars than the other. The movement of the stars in the blue component quickly allowed researchers to identify it as the remains of a dwarf galaxy (Gaia-Enceladus) which collided with the early Milky Way. However, the nature of the red population, and the epoch of the merger between Gaia-Enceladus and our Galaxy, were unknown until now.

"Analyzing the data from Gaia has allowed us to obtain the distribution of the ages of the stars in both components and has shown that the two are formed by equally old stars, which are older than those of the thick disc" says IAC researcher and co-author Chris Brook. But if both components were formed at the same time, what differentiates one from the other? "The final piece of the puzzle was given by the quantity of "metals" (elements which are not hydrogen or helium) in the stars of one component or the other" explains Tomás Ruiz Lara, an IAC researcher and another of the authors of the article. "The stars in the blue component have a smaller quantity of metals than those of the red component". These findings, with the addition of the predictions of simulations which are also analyzed in the article, have allowed the researchers to complete the history of the formation of the Milky Way.

Thirteen thousand million years ago, stars began to form in two different stellar systems which then merged: one was a dwarf galaxy which we call Gaia-Enceladus, and the other was the main progenitor of our Galaxy, some four times more massive and with a larger proportion of metals. Some ten thousand million years ago there was a violent collision between the more massive system and Gaia-Enceladus. As a result, some of the stars of the main progenitor, and those of Gaia-Enceladus, were set into chaotic motion and eventually formed the halo of the present Milky Way. After that there were violent bursts of star formation until 6,000 million years ago, when the gas settled into the disc of the Galaxy and produced what we know as the "thin disc".

"Until now all the cosmological predictions and observations of distant spiral galaxies similar to the Milky Way indicate that this violent phase of merging between smaller structures was very frequent," explains Matteo Monelli, a researcher at the IAC and a co-author of the article. The team has now been able to identify the specifics of this process in our own Galaxy, revealing the first stages of our cosmic history in unprecedented detail.

Credit: 
Instituto de Astrofísica de Canarias (IAC)

Characteristics in older patients associated with inability to return home after operation

WASHINGTON, D.C. (July 22, 2019): Older adults have a different physiology and unique set of needs that may make them more vulnerable to complications following a surgical procedure. The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) Geriatric Surgery Pilot Project has, for the first time, identified four factors in older patients that are associated with an inability to return home after an operation. The NSQIP Geriatric Surgery Pilot Project is unique in that it is the only specifically defined data set focused on outcomes for older surgical patients.

In presenting study results at the ACS Quality and Safety Conference 2019, concluding today in Washington, DC, researchers reported on geriatric-specific conditions among Geriatric Pilot Project patients that were associated with not living at home 30 days after surgery. This information can help surgeons advise patients about the possible effects of a surgical procedure on their lifestyle as well as their clinical outcomes before an operation. It also may guide hospital quality improvement programs to address pre- and postoperative conditions that may keep elderly surgical patients from returning home soon afterward.

"When surgeons speak with older patients about the decision to operate, we discuss complication rates and the risk of mortality. We don't usually talk about whether they will have the independence they had beforehand. In this study, we looked at the NSQIP data set to find factors that influence whether patients are living at home or require support for their functional needs in some kind of facility, such as a nursing home, 30 days after surgery. This information should help us make better preoperative decisions with our patients by allowing us to tell them about the impact a surgical procedure will have on their way of life," said study coauthor Ronnie Rosenthal, MD, FACS, co-principal investigator of the ACS-led Coalition for Quality in Geriatric Surgery (CQGS) and professor of surgery and geriatrics, Yale University School of Medicine, New Haven, CT.

The NSQIP Geriatric Surgery Pilot Project was created in 2014 to measure and improve the quality of surgical care for older Americans. The project measures preoperative variables and outcome measures that specifically target elderly patients, reflect the quality of their surgical care, and identify interventions that may improve their treatment and well-being.

"Hospitals may implement protocols that improve patient function or prevent postoperative problems that make it less likely for a patient to return home," said study co-author Lindsey Zhang, MD, MS, John A. Hartford Foundation James C. Thompson Clinical Scholar in Residence at ACS, and a general surgery resident at the University of Chicago Medical Center.

The researchers looked at 3,696 patients in the NSQIP Geriatric Surgery Pilot registry who had inpatient procedures between 2015 and 2017 and whose living location 30 days after surgery was known. Eighteen percent of these patients were still living in a care facility 30 days after surgical treatment. The four characteristics identified among these older patients were a history of a fall within the past year, preoperative malnutrition (defined as more than 10 percent unintentional weight loss), postoperative delirium, and a new or worsening pressure ulcer after surgery.

"This information empowers physicians to have a conversation with their older surgical patients about the possibility of a stay in an extended care facility, depending on patient characteristics and the nature of the operation they are about to undergo," Dr. Zhang said.

Because this study shows geriatric risk factors that appear to be associated with an extended stay in a care facility, its results may lead to quality improvement initiatives in a hospital. "Should we consider nutrition programs for patients with malnutrition or create programs to improve function for patients who have had a fall? Do we implement protocols in the postop period to prevent delirium and pressure ulcers? Will these steps lead to more patients going home after surgery? We can't say for sure, but these results provide strong evidence to say it's worth the effort for a hospital to address these issues," Dr. Zhang said.

On July 19, the ACS introduced the Geriatric Surgery Verification (GSV) Program by releasing the GSV standards for geriatric surgical care for hospitals to review prior to enrolling in this new surgical quality improvement program in late October. These standards address many key factors in geriatric surgery, including those that may delay an older patient's return home postoperatively.

Credit: 
American College of Surgeons

Are plant-based eating habits associated with lower diabetes risk?

Bottom Line: This study (called a systematic review and meta-analysis) combined the results of nine studies and examined the association between adherence to plant-based eating habits and risk of type 2 diabetes in adults. The analysis included 307,099 adults with 23,544 cases of type 2 diabetes. The authors report higher adherence to plant-based eating habits was associated with lower risk of type 2 diabetes, especially when only healthy plant-based foods, such as fruits, vegetables, whole grains, legumes and nuts, were included in the definition of plant-based. Limitations include that all of the studies analyzed were observational and dietary habits were self-reported.
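As a rough illustration of how such a meta-analysis combines studies (using made-up numbers, not the paper's data), per-study relative risks are typically pooled on the log scale with inverse-variance weights:

```python
import math

# Illustrative fixed-effect inverse-variance pooling of relative risks,
# the standard way a meta-analysis combines per-study estimates.
# The example inputs below are hypothetical, not the study's data.

def pool_relative_risks(rrs, ses):
    """rrs: per-study relative risks; ses: standard errors of log(RR)."""
    log_rrs = [math.log(rr) for rr in rrs]
    weights = [1.0 / se ** 2 for se in ses]  # precise studies weigh more
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# Four hypothetical studies, each reporting RR < 1 (lower diabetes risk):
rr, (ci_lo, ci_hi) = pool_relative_risks(
    rrs=[0.80, 0.75, 0.90, 0.85], ses=[0.08, 0.10, 0.12, 0.09])
```

Pooling on the log scale and weighting by inverse variance lets larger, more precise studies dominate the summary estimate, while the confidence interval narrows as studies accumulate. (Published meta-analyses like this one often use a random-effects variant that also accounts for between-study heterogeneity.)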

Authors: Qi Sun, M.D., Sc.D., of the Harvard T. H. Chan School of Public Health, Boston, and coauthors

(doi:10.1001/jamainternmed.2019.2195)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Overstuffed cancer cells may have an Achilles' heel

image: Aneuploid yeast cells on the left have difficulty drawing in fluorescent molecules, whereas the normal yeast cells on the right are able to rapidly draw them in.

Image: 
Rong Li and Hung-Ji Tsai

In a study using yeast cells and data from cancer cell lines, Johns Hopkins University scientists report they have found a potential weak spot among cancer cells that have extra sets of chromosomes, the structures that carry genetic material. The vulnerability, they say, is rooted in a common feature among cancer cells -- their high intracellular protein concentrations -- that make them appear bloated and overstuffed, and which could be used as possible new targets for cancer treatments.

"Scientists are now thinking more about targeting the biophysical properties of cancer cells to make them self-destruct," says Rong Li, Ph.D., Bloomberg Distinguished Professor of Cell Biology and Oncology at the Johns Hopkins University School of Medicine and of Chemical and Biomolecular Engineering at the Johns Hopkins Whiting School of Engineering.

Further research is planned to confirm the findings in animal and human cancer cells, says Li.

A report on the research, led by Li, is published in the June 6 issue of Nature.

The new experiments focused on a chromosome number abnormality known as aneuploidy. Normal human cells, for example, have a balanced number of chromosomes: 46 in all, or 23 pairs of different chromosomes. A cell with extra or missing copies of chromosomes is called aneuploid. Aneuploidy, Li says, is "the #1 hallmark of cancer" and is found in more than 90% of solid tumor cancer types.

When cells gain chromosomes, Li says, they also get an extra set of genes that produce more than the normal amount of protein that a cell makes. This excess can give cells growth abilities they normally wouldn't have, sometimes allowing them to overgrow and develop into a tumor.

Because aneuploid cells have unbalanced protein production, they have too many free-floating proteins that are not organized into a complex. This increases the protein concentration inside the cell compared to outside. To compensate for the increased concentration, the cells draw in water, a phenomenon that leads to hypo-osmotic stress.

"Aneuploid cells tend to be bigger and more swollen than cells with a balanced number of chromosomes," says Li.

Li, who is a member of the Johns Hopkins Kimmel Cancer Center, says she and her team set out to see if there was a common Achilles' heel among aneuploid cancer cells, one that would make a powerful strategic target for cancer treatment.

For the study, which took nearly five years to complete, Li and her colleagues, including first author and Johns Hopkins postdoctoral fellow Hung-Ji Tsai, Ph.D., looked at yeast cells, which have 16 chromosomes. In stressful environments, such as those with cold temperatures or inadequate nutrients, yeast cells adapt by altering the number of chromosomes, which allows them to survive better due to changes in the relative amounts of various proteins.

Li and Tsai looked at gene expression levels of thousands of aneuploid yeast cells compared with normal ones. Specifically, the scientists looked for gene expression changes that were shared among the aneuploid cells despite their differences in chromosome copy number. Among the aneuploid cells, the scientists found that gene expression was altered in about 4% of the genome compared with normal cells.

Next, the scientists compared the aneuploidy-associated gene expression with information from a database at Stanford University that contains changes in gene expression among normal yeast cells exposed to different stressful environments. They found that both the aneuploid cells and normal cells under hypo-osmotic stress share certain gene expression characteristics. They also share the problem of being bloated, affecting their ability to internalize proteins located on the cell membrane that regulate nutrient uptake.

Li's team continued its work to see if it could exploit aneuploid cells' difficulty in properly controlling the intake of nutrients. They screened the yeast genome and found a molecular pathway involving two proteins, ART1 and Rsp5, that regulates the cells' ability to draw in nutrients such as glucose and amino acids. When the scientists inactivated these proteins in the aneuploid yeast cells, the cells lacked the proper intracellular nutrient levels and were less able to grow.

The human equivalent of the molecular pathway involves proteins called arrestins and Nedd4.

"It's possible that we could find a treatment that targets this or another pathway that exploits the vulnerability common to aneuploid cancer cells," says Li.

Credit: 
Johns Hopkins Medicine

UNM scientists document late Pleistocene/early Holocene Mesoamerican stone tool tradition

image: UNM graduate student Paige Lynch conducting excavations at Mayahak Cab Pek in May 2019, part of ongoing UNM research into the earliest humans in the New World tropics.

Image: 
University of New Mexico

From the perspective of Central and South America, the peopling of the New World was a complex process lasting thousands of years and involving multiple waves of Pleistocene and early Holocene immigrants entering the Neotropics.

Paleoindian colonists entered the Neotropics, a region beginning in the humid rainforests of southern Mexico, before 13,000 years ago, bringing with them technologies developed for the environments and resources of North America.

As the ice age ended across the New World, people adapted more generalized stone tools to exploit changing environments and resources. In the Neotropics these changes would have been pronounced as patchy forests and grasslands gave way to broadleaf tropical forests.

In new research published recently in PLOS ONE, titled "Linking late Paleoindian stone tool technologies and populations in North, Central and South America," scientists from The University of New Mexico led a study in Belize to document the earliest indigenous stone tool tradition in southern Mesoamerica.

"This is an area of research for which we have very poor data regarding early humans, though this UNM-led project is expanding our knowledge of human behavior and relationships between people in North, Central and South America," said lead author Keith Prufer, professor from The University of New Mexico's Department of Anthropology.

This research, funded by grants from the National Science Foundation and the Alphawood Foundation, focuses on understanding the Late Pleistocene human colonization of the tropics in the broad context of global changes occurring at the end of the last ice age (ca. 12,000-10,000 years ago). The research suggests the tools are part of a human adaptation story in response to emerging tropical conditions in what is today called the Neotropics, a broad region south of the Isthmus of Tehuantepec in southern Mexico.

As part of the research, the team conducted extensive excavations at two rock shelter sites from 2014 to 2018. The excavation sites, located in the Bladen Nature Reserve, are almost 30 miles from the nearest road or modern human settlement, in a large undisturbed rainforest that is one of the best-protected wildlife refuges in Central America.

"We have identified and established an absolute chronology for the earliest stone tool types that are indigenous to Central America," said Prufer. "These have clear antecedents with the earliest known humans in both South America and North America, but appear to show more affinity with slightly younger Late Paleoindian toolkits in the Amazon and Northern Peru than with North America."

The research represents the first endogenous Paleoindian stone tool technocomplex recovered from well-dated stratigraphic contexts in Mesoamerica. These artifacts share multiple features with contemporary North and South American Paleoindian tool types. Once hafted, these bifaces appear to have served multiple functions, including cutting, hooking, thrusting, and throwing.

"The tools were developed at a time of technological regionalization reflecting the diverse demands of a period of pronounced environmental change and population movement," said Prufer. "Combined stratigraphic, technological, and population paleogenetic data suggests that there were strong ties between lowland neotropic regions at the onset of the Holocene."

These findings support previous UNM research suggesting strong genetic relationships between early colonists in Central and South America, following the initial dispersal of humans from Asia into the Americas via the Arctic prior to 14,000 years ago.

"We are partnering with Belizean conservation NGO Ya'axche Conservation Trust in our fieldwork to promote the importance of ancient cultural resources in biodiversity and protected areas management," said Prufer. "We spend a month every year camped out with no access to electricity, internet, phone or resupplies while we conduct excavations."

This field research involves several UNM graduate students in Archaeology and Evolutionary Anthropology as well as collaborators at Exeter University (UK) and Arizona State University. The analysis for this study was done in part at UNM's Center for Stable Isotopes, including its new radiocarbon preparation laboratories, as well as with co-authors at Penn State and UC Santa Barbara. The center is one of the anchors of UNM's interdisciplinary PAIS research and teaching facility.

The senior co-authors are world leaders in the study of early humans in the tropics and are committed to conservation efforts of cultural resources and regional biodiversity. Additionally, Prufer's long-term collaboration in indigenous Maya communities in the region was critical to the success of this project.

"This research suggests that further exploration of links between early humans living in the neotropics are needed to better understand how knowledge and technologies were shared, and will contribute to our understanding of processes that eventually led to the development of agriculture and sedentary communities," said Prufer. "Further studies on how these tools were used for food processing will be a key aspect of this research."

Credit: 
University of New Mexico

Serious falls are a health risk for adults under 65

New Haven, Conn. -- Adults who take several prescription medications are more likely to experience serious falls, say Yale researchers and their co-authors in a new study. This heightened risk can affect middle-aged individuals -- a population not typically viewed as vulnerable to debilitating or fatal falls, the researchers said.

To identify factors that put adults at risk for serious falls, the research team used patient data from the Veterans Aging Cohort Study (VACS), a national study of individuals who receive care through the Veterans Health Administration (VA). They identified 13,000 fall cases and compared them to controls of similar age, race, sex, and HIV status. The fall risk factors included prescription medication use, and alcohol and illegal drug use.

The researchers found that falls were a problem for middle-aged patients. "Providers typically think about falls in people over age 65. But these people were primarily in their 50s and falls were an important concern," said Julie Womack, lead author and associate professor at Yale School of Nursing.

The study also noted that the simultaneous use of multiple medications, known as polypharmacy, plays a significant role in serious falls among patients who are HIV positive and those who are not. The researchers examined HIV status because people treated for HIV take several medications and often at a younger age.

Medications that were associated with serious falls included those commonly used to treat anxiety and insomnia (benzodiazepines), as well as muscle relaxants and prescription opioids.

Another important finding is the role of alcohol and illegal drug use in falls, Womack said.

The study suggests that programs designed to prevent serious falls in older adults may need to be modified to address risks for middle-aged adults. "Fall risk factors are highly prevalent in the Baby Boomer generation more generally. The next step is to look at interventions for the middle aged," said Womack. Those interventions could address drinking and illicit drug use in addition to polypharmacy. "When we're thinking about fall prevention programs we have to think about alcohol and substance use. We need to help individuals cut back."

Reducing falls in middle-aged and older adults is vital because falls contribute to increased risk of injuries, hospitalizations, and death, said Womack.

Credit: 
Yale University

Antibiotics before liver transplants lead to better results

A UCLA-led research team has found that giving mice antibiotics for 10 days prior to a liver transplant leads to better liver function after the surgery.

After concluding the experiments in mice, the scientists analyzed data from liver transplants performed between October 2013 and August 2015 at the Ronald Reagan UCLA Medical Center, which revealed that the same phenomenon appears to hold true in humans. The statistics from human patients even demonstrated that people who were in worse health prior to their surgeries but received pre-surgery antibiotics fared better after their transplants than patients who were healthier prior to their surgeries but did not receive antibiotics.

The researchers concluded that the antibiotics inhibited bacteria that cause inflammation, which in turn can lead to organ rejection. Specifically, they found that in both mice and humans, treatment prior to a transplant reduced the damage that can occur when blood flow is restored to the liver after a period without oxygen, and it reduced inflammation and cell damage while accelerating the removal of damaged cells. As a result, liver function was better than in the mice and human patients who did not receive antibiotics before a transplant.

Humans carry trillions of bacteria, many of which are essential for health -- aiding in food digestion, for example. But other bacteria are linked to inflammatory bowel disease, cardiovascular disease, obesity, diabetes and even Parkinson's disease.

"So the idea behind this is to identify which bacteria is the good guy and which is the bad guy," said Dr. Jerzy Kupiec-Weglinski, the Paul I. Terasaki Professor of Surgery at the David Geffen School of Medicine at UCLA and the study's senior author.

The research is published in the peer-reviewed Journal of Clinical Investigation.

Studies in the past four years at the University of Chicago and University of Maryland have shown that in mice that received antibiotics before skin or heart transplants, those transplants lasted longer without rejection than in mice that did not receive antibiotics pre-surgery. From those studies, the UCLA researchers were able to zero in on some bacteria that appeared to help ensure more successful transplants.

In the mice that received antibiotics over the 10 days before transplants, the livers showed less damage -- deterioration caused by dead cells, for example -- than those in mice that were not given antibiotics before undergoing the same procedure.

"The livers in the mice that received antibiotics were protected against transplant damage, as well as rejection later on, because the antibiotics modulated their host microbiomes, which in turn stimulated cell protection," Kupiec-Weglinski said.

To further substantiate the effect of the antibiotics, the researchers then transplanted fecal matter from the untreated mice into those that had been given the medication. The mice that received the fecal transplants suffered inflammatory damage to their livers, despite the fact that they had been given antibiotics earlier in the experiment.

"That showed that antibiotic-mediated benefits clearly relate to the microbiota," Kupiec-Weglinski said.

The data on the UCLA patients covered 264 people who had received liver transplants -- 156 who, because they were sicker before their surgeries, received antibiotics for 10 or more days prior to the transplant, and 108 who were given antibiotics for less than 10 days, or not at all, prior to surgery.

"To our total surprise, livers functioned better after transplantation in those patients who were very sick and required prolonged antibiotic therapy," Kupiec-Weglinski said.

The researchers then narrowed their focus to human patients who had been given one specific antibiotic, rifaximin, prior to the transplants. (To pull together a larger sample size, the team analyzed data for transplants over a longer period of time than the original set of UCLA patients, from January 2013 through July 2016.) They found that in patients who received rifaximin, which stays in the bowel and has a low risk of inducing bacterial resistance, early liver failure was significantly delayed or stopped.

Although the combination of data from human patients and mice demonstrates the benefits of extended pre-transplant antibiotic treatment, the researchers wrote in the journal that the results were gleaned over a short period of time -- just 10 days or less after the transplants -- and longer-term studies are needed to strengthen the findings.

Kupiec-Weglinski also said the study opens the door to further research that could determine which microorganisms protect liver function after transplants, and which bacteria need to be "turned down" to limit their negative effects.

Credit: 
University of California - Los Angeles Health Sciences

NZ researchers call for gender binary in elite sports to be abandoned

image: This is Associate Professor Lynley Anderson.

Image: 
University of Otago

Existing gender categories in sport should perhaps be abandoned in favour of a more "nuanced" approach in the new transgender era, University of Otago researchers say.

The International Olympic Committee (IOC) guidelines that allow male-to-female transgender athletes to compete in the women's category at the elite level have raised significant debate since being introduced in 2015. The recent case of New Zealand weightlifter Laurel Hubbard, a transwoman who competed in the 2018 Commonwealth Games, has polarised opinions about the inclusion of transwomen in women's sport.

Bioethicist Associate Professor Lynley Anderson says that in discussing this topic we need to consider the principles of inclusion and fairness.

Associate Professor Anderson and Dr Taryn Knox from the Dunedin Bioethics Centre, together with Otago physiologist Professor Alison Heather, investigate the ethics and science around the IOC's decision in research published in the latest issue of the Journal of Medical Ethics.

They explain that the recent IOC guidelines allow transwomen to compete in the women's division if (amongst other things) their testosterone is held below 10 nmol/L.

Professor Heather says this is significantly higher than that of cis-women [whose sex and gender align as female].

"Science demonstrates that high adult levels of testosterone, as well as permanent testosterone effects on male physiology during in utero and early development, provides a performance advantage in sport and that much of this male physiology is not mitigated by the transition to a transwoman," she says.

Far from arguing that transwomen be excluded, the authors are in favour of a radical change to what they describe as "the outdated structure of the gender division currently used in elite sport".

They consider possible solutions in their research; however, some options value inclusion more than fairness and vice versa. The potential solutions include excluding transwomen from competing in the women's division, creating a third division for transwomen and intersex women, and calculating a handicap for transwomen based on their testosterone levels, similar to the system used in golf.

Their preferred option extends the handicap idea: a proposed algorithm that could account for a range of physical and social parameters, including physiological measures, gender identity, and possibly socioeconomic status.
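As an illustration only, a testosterone-based handicap of the kind the authors describe might be sketched as follows. The paper proposes such an algorithm in principle; no formula is published, so the reference level, weight, and scaling function here are entirely hypothetical:

```python
# Hypothetical sketch of a testosterone-based handicap.
# The reference level (1.5 nmol/L) and weight (0.05) are illustrative
# placeholders, not values from the study.

def handicap(testosterone_nmol_l: float,
             cis_female_reference: float = 1.5,
             weight: float = 0.05) -> float:
    """Return a multiplier in (0, 1] that shrinks as testosterone
    rises above a cis-female reference level."""
    excess = max(0.0, testosterone_nmol_l - cis_female_reference)
    return 1.0 / (1.0 + weight * excess)

def adjusted_score(raw_score: float, testosterone_nmol_l: float) -> float:
    """Scale a raw competition score by the handicap multiplier."""
    return raw_score * handicap(testosterone_nmol_l)
```

With these invented parameters, a raw score of 100 at 9.5 nmol/L (just under the IOC's 10 nmol/L ceiling) would be adjusted to roughly 71, while an athlete at the reference level would keep their full score. A real algorithm of the kind the authors envisage would also have to fold in the social parameters they mention, which resist this sort of simple arithmetic.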

Associate Professor Anderson says it is important to both extend and celebrate diversity while maintaining fairness for cis-women in sport.

"To be simultaneously inclusive and fair at the elite level some innovative thinking is required, rather than attempting to shoehorn people into either 'male' or 'female'.

"Perhaps the male/female binary should be reconsidered in favour of something more nuanced and useful?" she questions.

Credit: 
University of Otago

Early introduction of peanuts in babies to reduce allergy risk

Podcast post-embargo link: https://soundcloud.com/cmajpodcasts/181613-five

Worried about peanut allergies in children? A practice article in CMAJ (Canadian Medical Association Journal) outlines five things to know about early introduction of peanuts in infants to reduce the risk of peanut allergy.

Infants who are fed peanut protein regularly have a lower risk of peanut allergy.

To prevent peanut allergy, peanut protein (such as peanut butter or powdered peanut puffs) may be introduced at home for most babies between 4 and 6 months of age as one of the first foods.

Babies with severe eczema are more likely to have peanut allergy, and those with no or only mild eczema are best-suited for peanut introduction in the home.

Infants with risk factors for peanut allergy, such as severe eczema, egg allergy or both, should be seen by a specialist before peanut introduction.

To reduce the risk of peanut allergy, 8 grams of peanut protein (1 heaped teaspoon of peanut butter) should be eaten at least twice a week.

Credit: 
Canadian Medical Association Journal

More ED visits because of alcohol, 175% increase in 25- to 29-year-olds seeking care

New research shows dramatically rising visits to emergency departments (ED) related to alcohol, especially for women, with a 175% increase in alcohol-related visits from young people aged 25 to 29. The article, published in CMAJ (Canadian Medical Association Journal), shows increases in ED visits related to alcohol that are occurring much faster than overall ED usage.

"These increases are consistent with data showing increasing average weekly alcohol consumption in Ontario and higher rates of binge drinking across Canada during the study period, particularly in women," says lead author Dr. Daniel Myran, a family physician and public health resident at the University of Ottawa, Ottawa, Ontario.

The study included 765 346 ED visits because of alcohol by 480 611 people (32% women) between 2003 and 2016 in Ontario, Canada's largest province. Some findings:

Women who visited the ED due to alcohol were more likely to be under the legal drinking age of 19 years (17%) than men (9%).

The highest rates of alcohol-related visits were in women aged 15 to 24 and men aged 45 to 54.

In people aged 25 to 29, alcohol-related visits increased 240% in women and 145% in men.

ED visits for alcohol resulted in higher rates of hospital admission (13%) than general ED visits (10%).

Neighbourhoods in the lowest income bracket had more than twice the number of ED visits for alcohol compared with those in the highest income bracket.

"Since 2007, the rates of emergency department visits due to alcohol by women under legal drinking age has surpassed that of underage men," says Dr. Myran, who is also training at The Ottawa Hospital and Bruyère Research Institute in public health. "We need a better understanding of youth- and gender-specific risk factors for alcohol harms to curb these increases."

These findings are consistent with data showing increases in alcohol-related ED visits in the United States (47% between 2006 and 2014) and England (51% between 2002 and 2014). Data from the United States have also shown widening disparities in harms from alcohol between high- and low-income individuals. While data from this study also highlight the disproportionate health burden that alcohol causes on lower-income individuals, this disparity has not grown over time. The Canadian study differs in that there is less heavy drinking in lower-income groups in Canada than in the US, possibly because of policies that have prevented the sale of low-cost alcohol in Canada.

"There may be an increasing need for supports and services for people, especially young people, with high-risk alcohol consumption, particularly in light of recent changes to how alcohol is sold in Ontario, including making alcohol cheaper and easier to purchase," says Dr. Myran.

The study was funded by Bruyère Research Institute through the Big Data Research Program and by ICES, which is funded by the Ontario Ministry of Health and Long-Term Care. The ICES uOttawa subunit is also supported by The Ottawa Hospital Foundation and the University of Ottawa.

To minimize harms "the federal and provincial governments should employ a public health approach to maximize benefits and minimize harms," writes Dr. Sheryl Spithoff, Department of Family and Community Medicine, Women's College Hospital, Toronto, in a related commentary. "Alcohol should be available for sale only within licensed and strictly monitored facilities with limited hours. Taxes and price minimums should be used to reduce alcohol-related harms. The increase in tax revenues could be used to fund essential provincial programs."

Credit: 
Canadian Medical Association Journal