
Drilling deeper

Groundwater may be out of sight, but for the more than 100 million Americans who rely on it for their lives and livelihoods, it's anything but out of mind. Unfortunately, wells are going dry, and scientists are just beginning to understand the complex landscape of groundwater use.

Now, researchers at UC Santa Barbara have published the first comprehensive account of groundwater wells across the contiguous United States. They analyzed data from nearly 12 million wells throughout the country in records stretching back decades. Their findings appear in the journal Nature Sustainability.

In tackling the work, Debra Perrone and Scott Jasechko had a number of questions about groundwater use they wanted to address. First, they set out to determine both where in the country wells are located and what purposes they serve -- domestic, industrial or agricultural. They also wanted to track the depths of wells in different areas and test whether wells are being drilled deeper over time.

Focusing on regions known to depend on groundwater, such as California's Central Valley, the pair collected a wealth of information about different types of wells across the country. Groundwater is generally a matter of state management, so they had to cull their data from a variety of sources. "[That was] one of the biggest hurdles," said Perrone, an assistant professor in UC Santa Barbara's environmental studies department.

"It took us about four years to collect and quality-assure all these data sources," added Jasechko, an assistant professor based in the Bren School of Environmental Science & Management.

Scientists know that groundwater depletion is causing some wells to run dry. Where conditions are right, drilling new and deeper wells can stave off this issue, for those who can afford it. Indeed, Perrone and Jasechko found that new wells are getting deeper between 1.4 and 9.2 times as often as they are being drilled shallower.

What's more, the researchers found that 79% of areas they looked at showed well-deepening trends across a window spanning 1950 to 2015. Hotspots of this activity include California's Central Valley, the High Plains of southwestern Kansas, and the Atlantic Coastal Plain, among other regions.
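
The study's underlying code isn't reproduced here, but the kind of per-area trend test described above can be sketched in a few lines of Python. In this illustrative sketch, the file and column names (well_records.csv, area, year_constructed, depth_m) are hypothetical stand-ins, not the authors' actual data or method:

```python
# Illustrative sketch only -- not the authors' actual pipeline.
# Assumes a table of well records with hypothetical columns:
#   area, year_constructed (1950-2015), depth_m
import pandas as pd
from scipy.stats import linregress

wells = pd.read_csv("well_records.csv")  # hypothetical input file

def depth_trend(group):
    """Slope of construction depth vs. year for one area (m/year)."""
    fit = linregress(group["year_constructed"], group["depth_m"])
    return pd.Series({"slope": fit.slope, "p_value": fit.pvalue})

trends = wells.groupby("area").apply(depth_trend)
deepening_share = (trends["slope"] > 0).mean()
print(f"Share of areas with a deepening trend: {deepening_share:.0%}")
```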

"We were surprised how widespread deeper well drilling is," Jasechko said. News media had documented the trend in places like the Central Valley, but it is pervasive in many other parts of the country as well. This includes places like Iowa, where groundwater hasn't been studied as intensively, he noted.

The reasons for drilling deeper are varied, according to Perrone. For instance, people drill deeper to avoid contamination seeping into aquifers from the surface, or to access aquifers that have less stringent withdrawal regulations, she explained. Some people may drill deeper to source more water.

"What we're finding is that in places where water levels are declining, some people are drilling deeper, maybe to avoid having their primary water supply go dry," Perrone said. "Regardless of the reasons why Americans are drilling deeper, we suggest that deeper well drilling is an unsustainable stopgap to groundwater depletion."

Four major factors explain why deeper drilling won't solve water woes indefinitely. For starters, it costs more, and it requires more energy to pump water from deep underground compared to water closer to the surface.

Geology presents another challenge: Deeper strata are generally less conducive to groundwater extraction. And finally, groundwater tends to get saltier at depth, so at a certain point it becomes unusable if not treated. As a result, in many regions there's a floor to how deep we can productively drill for water.

This issue hits rural communities particularly hard. "Groundwater is a crucial resource for rural communities," Perrone said. "Our previous work found that rural groundwater wells are especially vulnerable to going dry." What's more, these communities often have less capacity to update their groundwater infrastructure. Groundwater is also important to the agricultural sector, which often relies on it for irrigation, especially during droughts, she added.

Deliberate groundwater governance and management have emerged as active areas of work in addressing this challenge. The idea is to become more conscientious about how groundwater is used, and to regulate and monitor those practices more effectively. Additional research is underway on managed aquifer recharge, namely encouraging water to percolate back underground. This could be normal surface water, flood water or treated water. And particularly dry areas are considering whether to increase their use of recycled water.

Researchers, practitioners and policymakers from around the world will discuss the challenges facing groundwater use as well as potential solutions at an upcoming conference on groundwater sustainability. Perrone is one of the lead organizers for the event, which will convene in Valencia, Spain this October.

This new paper provides additional context to one of Perrone and Jasechko's past studies -- completed with professors Grant Ferguson of the University of Saskatchewan and Jennifer McIntosh of the University of Arizona -- in which they found that the United States may have less usable groundwater than previously thought. It also ties into Perrone's work regarding groundwater policy across the U.S. In the future, she plans to look at the legal frameworks surrounding groundwater use. "My goal is to understand what types of laws are being passed in the western 17 states to manage groundwater withdrawals in more sustainable ways," she said.

Credit: 
University of California - Santa Barbara

Patients want physicians more involved in their health outside of the doctor's office

WHAT: A webinar to release new findings from a nationwide survey of physicians and consumers conducted by The Harris Poll on behalf of Samueli Integrative Health Programs that explores the following topics:

How consumers and physicians define self-care.

Whether physicians agree that self-care is a critical component of their patients' health.

The disconnect between what physicians and their patients say they want to discuss regarding self-care as part of the doctor-patient visit.

How Americans view and prioritize self-care within their day-to-day lives, and what they do to practice self-care.

Barriers that keep Americans and physicians from practicing self-care.

WHEN: Tuesday, July 23, 2019 at 12:00 p.m. ET

WHO: Wayne B. Jonas, MD, executive director of Samueli Integrative Health Programs and Doug Cavarocchi, deputy director of Samueli Integrative Health Programs

RSVP: Please copy and paste the following URL into your browser: https://attendee.gotowebinar.com/register/2814823833833651979

Members of the media interested in receiving the embargoed news release should contact Kathleen Petty at kpetty@thereisgroup.com or (202) 868-4013. The news release is embargoed until July 23, 2019 at 12:00 p.m.

BACKGROUND: Chronic diseases are the leading cause of death and disability in America and a leading driver of healthcare costs. Multiple studies show that behaviors such as smoking cessation, moderate alcohol use, healthy diet, regular exercise, and the social and emotional management of stress can prevent and even treat 70 percent of chronic disease. Thus, these self-care approaches are the cornerstone of health and good medicine. While a vast majority of physicians say self-care should be considered an essential part of a patient's overall health, it's not being discussed in the doctor's office and Americans aren't prioritizing it. In fact, there seems to be a miscommunication between physicians and their patients around discussions of self-care during patient visits - leaving this critical component of health unaddressed and chronic disease mismanaged.

Credit: 
The Reis Group

Researcher discovers how mosquitoes integrate vision and smell to track victims

image: LED mosquito flight simulator.

Image: 
Alex Crookshanks

Mosquitoes are smarter than people think.

Scientists have found that mosquitoes are changing their hunting routines in response to host cues. For example, in Africa, mosquitoes now recognize when people emerge from bednets in the morning and have begun hunting more often during the day than at night.

Virginia Tech researcher Clément Vinauger has discovered new neurobiology associated with mosquito vision and sense of smell that explains how Aedes aegypti mosquitoes track their victims.

Aedes aegypti mosquitoes spread dengue fever, chikungunya, Zika fever, Mayaro, and yellow fever viruses.

"Mosquitoes are impacting millions of people every year. I've been working to understand how mosquitoes navigate space and time. Analyzing how mosquitoes process information is crucial to figuring out how to create better baits and traps for mosquito control," said Vinauger, an assistant professor in the Department of Biochemistry in the College of Agriculture and Life Sciences at Virginia Tech.

While scientists understand a lot about mosquitoes' sense of smell and how they target CO2 exhalations to find their hosts, very little is known about how mosquitoes use vision.

Vinauger discovered that the interaction between the olfactory and visual processing centers of mosquitoes' brains is what helps these insects target their victims so accurately.

These findings were recently published in the journal Current Biology.

When mosquitoes encounter CO2, they become attracted to dark, visual objects, such as their hosts. What this new study shows is that CO2 affects the responses of neurons in mosquitoes' visual centers, helping them track visual objects with greater accuracy.

Vinauger and his research team were able to determine this by fitting the mosquitoes with tiny 3D-printed helmets, tethering them in an LED flight simulator, and exposing them to puffs of CO2.

"We monitored the mosquitoes' responses to visual and olfactory cues by tracking wingbeat frequency, acceleration, and turning behavior," said Vinauger.

Using calcium imaging of the mosquitoes' brains, the research team found that CO2 modulates mosquito neural responses to discrete visual stimuli.

In previous research, Vinauger also used imaging and neural recordings to show how responses in the olfactory centers were modulated by mosquitoes' previous experience, as they learned from swats and other attempts to throw them off our scent.

"The global strategy for management of mosquito-borne diseases involves controlling vector populations, to a large extent through insecticide application. However, mosquito-borne diseases are now resurgent, mostly because of rising insecticide resistance in populations. In this context, my research aims at closing the key knowledge gaps in our understanding of the mechanisms that allow mosquitoes to be such efficient disease vectors and, more specifically, to identify and characterize factors that modulate their host-seeking behavior," said Vinauger, who is also an affiliated faculty member of the Fralin Life Sciences Institute and the BIOTRANS program.

Vinauger's laboratory investigates circadian and pathogen-induced modulation of mosquito-host interactions, leveraging interdisciplinary tools from biochemistry, neuroscience, engineering, and chemical ecology to study how these modulations affect genes, neurons, and insect behavior.

Credit: 
Virginia Tech

Despite progress, only 3 African nations expected to meet global breastfeeding goal

Estimates indicate that more than 800,000 child deaths could be averted annually with optimal breastfeeding practices, according to the World Health Organization

Detailed interactive maps illustrate where countries are advancing and falling behind

SEATTLE - Only three African countries are expected to meet the global target for exclusive breastfeeding, "an unparalleled source of nutrition for newborns and infants, no matter where they are born," according to a global health expert.

The three nations, Guinea-Bissau, Rwanda, and São Tomé and Príncipe, are singled out in a new study from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington's School of Medicine. The study, published today in Nature Medicine in advance of World Breastfeeding Week, Aug. 1-7, finds areas of persistently low prevalence even in countries that have made progress overall. Detailed maps accompanying the analysis reveal vulnerable populations, especially those living in rural areas and in extreme poverty.

However, researchers note that several nations, including Burundi, Rwanda, and parts of Ethiopia, Uganda, and Zambia, had some of the highest levels of exclusive breastfeeding in both 2000 and 2017. Sudan had some of the "highest and most consistent rates of increase" toward the World Health Organization (WHO) exclusive breastfeeding goal - prevalence by 2025 of at least 50% nationwide. The Global Burden of Disease, the annual comprehensive health study, attributed 169,000 child deaths to lack of breastfeeding in 2017, more than half of them in sub-Saharan Africa. Moreover, according to the WHO, increasing breastfeeding to near-universal levels could save more than 800,000 lives every year, the majority being children under 6 months.

The paper examines breastfeeding prevalence down to the level of individual districts and municipalities and compares progress among 49 African nations. The paper is accompanied by an interactive visualization tool that allows users to compare prevalence of exclusive breastfeeding within and across countries, look at the rate of change over time, and see the probability of meeting WHO's goal by 2025.

The value of exclusive breastfeeding cannot be overemphasized.

"Breastfeeding is an unparalleled source of nutrition for newborns and infants, no matter where they are born. If we are serious about ensuring that every infant is offered a healthy start in life, we need to know who isn't being reached with the support they need to breastfeed," said Dr. Ellen Piwoz of the Bill & Melinda Gates Foundation. "By illustrating where exclusive breastfeeding rates are falling behind, these maps are a powerful tool to help policymakers and practitioners examine and act on disparities within their countries."

In 2017, at least a two-fold difference in exclusive breastfeeding prevalence existed across districts in 53% of countries, a three-fold difference in 14% of countries, and a more than six-fold difference in Niger and Nigeria.

Exclusive breastfeeding refers to mothers using only breast milk to feed their children for the first six months, with medications, oral rehydration salts, and vitamins as needed. The practice provides essential nutrition and can prevent infection and disease, particularly in areas without access to clean water.

Credit: 
Institute for Health Metrics and Evaluation

Are American nurses prepared for a catastrophe? New study says perhaps not

On average, American colleges and universities with nursing programs offer about one hour of instruction in handling catastrophic situations such as nuclear events, pandemics, or water contamination crises, according to two recent studies coauthored by a nursing professor at the University of Tennessee, Knoxville.

"Events that can cause greater impact but are less likely to occur, usually receive less training hours," said Roberta Lavin, executive associate dean and professor in UT's College of Nursing. Lavin is coauthor of the studies published in the Journal of Perinatal and Neonatal Nursing and Nursing Outlook.

The studies' results come from two surveys that were sent to all colleges and universities that offer nursing programs in the United States.

The surveys revealed that most students said they were not getting enough instruction in emergency response, while professors and lecturers said they were not prepared to teach how to offer care during and after catastrophic situations.

"Emergencies are not just the exact moment a disaster hits; it is also the aftermath. How do we evacuate a town? How do we carry out care for other chronic, sometimes life-lasting consequences that derive from these situations? That is the big challenge," said Lavin.

One study examined the management of Zika fever and water contamination crises and focused on nurses' preparedness to care for pregnant women and children, two populations that are often overlooked in emergency plans.

In addition to nursing schools, that same study also assessed the preparedness of Master of Public Health programs, medical schools, and Doctor of Osteopathy programs in America.

"Even though all accreditation standards require this type of preparation, we are not putting enough emphasis on it," said Lavin.

Lavin and her coauthors now are working to offer resources to help close that knowledge gap. One of the actions they are taking is to design educational modules for instructors to use in their classes. The units are licensed under Creative Commons and can be downloaded free of charge; users can adjust the courses to meet the needs of their communities.

"We are putting people out there to attend these emergencies, and we owe it to them to prepare them right," Lavin said.

Credit: 
University of Tennessee at Knoxville

Understanding the drivers of a shift to sustainable diets

One of the 21st century's greatest challenges is to develop diets that are both sustainable for the planet and good for our bodies. An IIASA-led study explored the major drivers of widespread shifts to sustainable diets using a newly developed computational model of population-wide behavioral dynamics.

High meat consumption - especially of red and processed meat - has been linked to poor health outcomes, including diabetes, heart disease, and various cancers. Livestock farming for meat production also has a massive environmental footprint. It contributes to deforestation to make room for livestock, leads to land and water degradation and biodiversity loss, and, given the meat industry's considerable methane emissions, contributes as much to greenhouse gas (GHG) emissions as all the world's cars, trucks, and airplanes combined. It is therefore unsurprising that several studies have demonstrated that diet change, especially lowering red meat consumption, can significantly contribute to the mitigation of climate change and environmental degradation, while also being conducive to better public health.

Previous studies on diet change scenarios involving lowered meat consumption, which were mostly based on stylized diets or average consumption values, showed promising results in terms of alleviating environmental degradation. If, for example, the world's average diet became flexitarian by 2050 - that is, if people limited their red meat consumption to one serving per week and white meat to half a portion per day - the GHG emissions of the agriculture sector would be reduced by around 50%. This sounds like an easy change to make, but research shows that, due to the scale of behavioral change required, most of these scenarios will be difficult to achieve. In their study published in Nature Sustainability, researchers from IIASA and the University of Koblenz-Landau explored the major behavioral drivers of widespread shifts to sustainable diets.

"The human behavior aspect of such large scale diet changes have to our knowledge not been studied before in relation to the food system, although we need this information to understand how such a global change can be achieved. Our study covers this gap based on a computational model of population-wide behavioral dynamics," explains Sibel Eker, a researcher in the IIASA Ecosystems Services and Management Program and lead author of the study.

Eker and her colleagues adapted the land use module of an integrated assessment model to serve as a platform from where the population dynamics of dietary changes and their environmental impacts could be explored. They drew on environmental psychology to mimic population dynamics based on prominent psychological theories and included factors such as income, social norms, climate risk perception, health risk perception, and the self-efficacy of individuals, while considering the heterogeneity of their age, gender, and education levels. They then ran the model in an exploratory fashion to simulate the dynamics of dietary shifts between eating meat and a predominantly plant-based diet at the global scale. This computational analysis allowed them to identify the main drivers of widespread dietary shifts.
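
The published model is far richer than anything a press release can convey, but the feedback structure it describes can be caricatured in a few lines. In the toy Python sketch below, every parameter value and functional form is invented for illustration; it is a sketch of the general idea, not the study's actual model:

```python
# Toy sketch of population-wide diet dynamics -- illustrative only.
# All parameters and functional forms are invented, not the study's.
import numpy as np

def simulate(years=50, norm_strength=0.8, self_efficacy=0.5,
             risk_perception=0.2, p0=0.05):
    """Fraction of the population eating predominantly plant-based.

    Switching toward a plant-based diet grows with the social norm
    (how common the diet already is), weighted by self-efficacy;
    climate/health risk perception adds a smaller constant push.
    """
    p = np.empty(years)
    p[0] = p0
    for t in range(1, years):
        social_norm = norm_strength * p[t - 1]   # imitation term
        adopt = self_efficacy * (social_norm + risk_perception) * (1 - p[t - 1])
        abandon = 0.05 * p[t - 1]                # slow backsliding
        p[t] = np.clip(p[t - 1] + adopt - abandon, 0.0, 1.0)
    return p

# Strong norms plus self-efficacy move the population far more than
# heightened risk perception alone -- the study's qualitative finding.
print(simulate(norm_strength=0.9, self_efficacy=0.8)[-1])
print(simulate(norm_strength=0.1, self_efficacy=0.1, risk_perception=0.4)[-1])
```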

The results indicate that social norms - the unwritten rules of behavior that are considered acceptable in a group or society - along with self-efficacy are the key drivers of population-wide dietary shifts, playing an even more important role than either climate or health risk perception. The team also found that diet changes are particularly influenced by how quickly social norms spread among the young population and by the self-efficacy of women in particular. Focusing on the behavioral factors highlighted in this study could therefore be helpful in the design of policy interventions or communication campaigns, where community-building activities or empowering messages could be employed in addition to communicating information about the climate and health risks of meat consumption.

According to the researchers, to their knowledge, theirs is the first coupled model of climate, diet, and behavior. The modeling framework they developed is general and can be adapted to address new research questions, thus opening the door to many potential applications to explore connections between behavior, health, and sustainability. They plan to collect more data from sources like social media to quantify their model, as well as focus on specific cases where cultural values and traditions also play an important role in whether people are willing to adapt their behavior or not.

"We can use models to explore the social and behavioral aspects of climate change and sustainability problems in the same way as we explore the economic and environmental dimensions of our world. In this way, we get a better understanding of what works to steer the lifestyle changes required for sustainability and climate change mitigation. As lifestyle change is a key driver of climate change mitigation, this modeling exercise can be seen as an example of how we can integrate human behavior and lifestyle changes into integrated assessment models for a richer scenario exploration," concludes Eker.

Credit: 
International Institute for Applied Systems Analysis

More sensitive climates are more variable climates, research shows

A decade without any global warming is more likely to happen if the climate is more sensitive to carbon dioxide emissions, new research has revealed.

A team of scientists from the University of Exeter and the Centre for Ecology & Hydrology in the UK has conducted pioneering new research into why both surges and slowdowns of warming take place.

Using sophisticated climate models, the team, led by PhD student Femke Nijsse, discovered that models in which the climate is more sensitive to CO2 concentrations also display larger decade-to-decade variations in warming.

By combining this with information from simulations without any carbon dioxide increases, the authors were able to assess the natural variability of each climate model.

The research is published this week in Nature Climate Change.

Femke Nijsse, from the University of Exeter, said: "We were surprised to see that even when we took into account that sensitive climate models warm more over the last decades of the 20th century, these sensitive models were still more likely to have short periods of cooling."

Climate sensitivity, which sits at the very heart of climate science, is the amount of global warming that takes place as atmospheric CO2 concentrations rise.

For many years, estimates have put climate sensitivity somewhere between 1.5-4.5°C of warming for a doubling of pre-industrial CO2 levels.

The study found that cooling - or "hiatus" - decades were more than twice as likely around the turn of the century in high-sensitivity models (models that warm 4.5°C after doubling CO2) than in low-sensitivity models (models that warm 1.5°C after doubling CO2).

Co-author Dr. Mark Williamson, a research fellow at Exeter, said: "This does not mean that the presence of a global warming slowdown at the beginning of the 21st century implies we live in a highly sensitive world.

"By looking at all decades together, we get a better picture and find observations are broadly consistent with a central estimate of climate sensitivity"

Ms Nijsse added: "We still don't exactly know how much the climate system will heat up, nor do we know exactly what the range of natural variability in trends will be over the coming decades. But our study shows that these risks should not be considered as separate."

The paper also studied the chance that a decade in the 21st century could warm by as much as the entire 20th century - a scenario that the research team call "hyperwarming".

Under a scenario where carbon dioxide emissions from fossil fuels continue to increase, the chance of hyperwarming is even more dependent on climate sensitivity than the long-term global warming trend.

Increasing the climate sensitivity by 50% from a central estimate of 3°C would increase the mean global warming to the end of this century by slightly less than 50%, but would increase the chance of a hyperwarming decade by more than a factor of ten.
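
The release reports this asymmetry without the underlying arithmetic, but a toy Gaussian model makes the intuition concrete. Assume, purely for illustration, that a decade's warming trend is normally distributed with a mean and spread that both scale with sensitivity, and call a decade "hyperwarming" if it reaches roughly 1°C, about the 20th century's total warming; none of these numbers come from the paper:

```python
# Toy illustration of why tail risks grow much faster than means.
# All numbers are invented for illustration; none are from the paper.
from scipy.stats import norm

def decade_stats(sensitivity):
    """Toy model: mean decadal trend and its variability both scale
    linearly with climate sensitivity (deg C per decade)."""
    mean = 0.08 * sensitivity
    sd = 0.07 * sensitivity                         # more sensitive -> more variable
    return mean, norm.sf(1.0, loc=mean, scale=sd)   # P(trend >= 1 C/decade)

m3, p3 = decade_stats(3.0)      # central sensitivity estimate
m45, p45 = decade_stats(4.5)    # 50% higher sensitivity

# Here the mean rises by exactly 50% (this toy scales linearly; in the
# real models the mean response is slightly sublinear), but the fixed
# 1 C threshold sits in the tail, where probability responds far faster.
print(f"Mean trend: {m3:.2f} -> {m45:.2f} C/decade")
print(f"Hyperwarming probability rises {p45 / p3:.0f}-fold")
```

The exact factor depends entirely on the assumed numbers; the robust point is that a fixed threshold lies in the tail of the distribution, where shifts in mean and spread multiply probabilities rather than merely adding to them.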

Credit: 
University of Exeter

Plants defend against insects by inducing 'leaky gut syndrome'

image: Fall armyworms are pests of corn plants.

Image: 
Nick Sloff, Penn State

UNIVERSITY PARK, Pa.--Plants may induce "leaky gut syndrome"--permeability of the gut lining--in insects as part of a multipronged strategy for protecting themselves from being eaten, according to researchers at Penn State. By improving our understanding of plant defenses, the findings could contribute to the development of new pest control methods.

"We found that a combination of physical and chemical defenses in corn plants can disrupt the protective gut barriers of fall armyworms, creating opportunities for gut microbes to invade their body cavities," said Charles Mason, postdoctoral scholar in entomology. "This can cause septicemia, which can kill the insect, or simply trigger an immune response, which can weaken the insect."

The researchers reared fall armyworms in the laboratory and inoculated them with one of three types of naturally occurring gut bacteria. They fed the insects on one of three types of maize--one that is known to express enzymes that produce perforations in insect gut linings; one that is characterized by numerous elongated trichomes, or fine hairs that occur on the surface of the plant and help defend against herbivores; and one that has just a few short trichomes. The team used scanning electron microscopy to evaluate the impacts of the various bacteria and maize types on the integrity of the fall armyworms' gut linings.

The scientists found that the presence of all three types of gut bacteria decreased the ability of fall armyworm larvae to damage maize plants, especially when other defenses--such as elongated trichomes and enzymes, both of which can perforate gut linings--were present. However, the species of gut bacteria varied in the extent to which they weakened the insects. The results will appear in the July 22 issue of Proceedings of the National Academy of Sciences.

"Our results reveal a mechanism by which some plants use insects' gut microbiota against them in collaboration with their own defenses," said Mason.

Gary Felton, professor and head of the Department of Entomology, noted that the results should have broad significance towards understanding the ecological function of plant defenses.

"In the context of our study, disparate plant defenses, such as leaf trichomes and plant enzymes, all require certain gut microbes for their optimal defense against herbivores," he said. "Our results predict that the variation in the effectiveness of plant defenses in nature may be, in significant part, due to the variability observed in the microbial communities of insect guts."

The team said the results could help to inform the development of insect-resistant crops.

"It may be advantageous to 'stack' plant defenses that target the insect gut in order to create a 'leaky gut' that exposes the insect to microbial assaults on their immune system," said Mason.

Credit: 
Penn State

The early days of the Milky Way revealed

video: The universe 13,000 million years ago was very different from the universe we know today. It is understood that stars were forming at a very rapid rate, forming the first dwarf galaxies, whose mergers gave rise to the more massive present-day galaxies, including our own.

Image: 
Gabriel Pérez Díaz, SMM (IAC).

The universe 13,000 million years ago was very different from the universe we know today. It is understood that stars were forming at a very rapid rate, forming the first dwarf galaxies, whose mergers gave rise to the more massive present-day galaxies, including our own. However, the exact chain of events that produced the Milky Way was not known until now.

Exact measurements of the position, brightness and distance of around a million stars of our galaxy within 6,500 light years of the sun, obtained with the Gaia space telescope, have allowed a team from the IAC to reveal some of its early stages. "We have analyzed, and compared with theoretical models, the distribution of colours and magnitudes (brightnesses) of the stars in the Milky Way, splitting them into several components: the so-called stellar halo (a spherical structure which surrounds spiral galaxies) and the thick disc (stars forming the disc of our Galaxy, but occupying a certain height range)," explains Carme Gallart, a researcher at the IAC and the first author of the article, which is published today in the journal Nature Astronomy.

Previous studies had discovered that the Galactic halo showed clear signs of being made up of two distinct stellar components, one dominated by bluer stars than the other. The movement of the stars in the blue component quickly allowed the team to identify it as the remains of a dwarf galaxy (Gaia-Enceladus) that collided with the early Milky Way. However, the nature of the red population, and the epoch of the merger between Gaia-Enceladus and our Galaxy, were unknown until now.

"Analyzing the data from Gaia has allowed us to obtain the distribution of the ages of the stars in both components and has shown that the two are formed by equally old stars, which are older than those of the thick disc" says IAC researcher and co-author Chris Brook. But if both components were formed at the same time, what differentiates one from the other? "The final piece of the puzzle was given by the quantity of "metals" (elements which are not hydrogen or helium) in the stars of one component or the other" explains Tomás Ruiz Lara, an IAC researcher and another of the authors of the article. "The stars in the blue component have a smaller quantity of metals than those of the red component". These findings, with the addition of the predictions of simulations which are also analyzed in the article, have allowed the researchers to complete the history of the formation of the Milky Way.

Thirteen thousand million years ago, stars began to form in two different stellar systems which then merged: one was a dwarf galaxy which we call Gaia-Enceladus, and the other was the main progenitor of our Galaxy, some four times more massive and with a larger proportion of metals. Some ten thousand million years ago there was a violent collision between the more massive system and Gaia-Enceladus. As a result, some of its stars, along with those of Gaia-Enceladus, were set into chaotic motion and eventually formed the halo of the present Milky Way. After that there were violent bursts of star formation until 6,000 million years ago, when the gas settled into the disc of the Galaxy, producing what we know as the "thin disc".

"Until now all the cosmological predictions and observations of distant spiral galaxies similar to the Milky Way indicate that this violent phase of merging between smaller structures was very frequent" explains Matteo Monelli, a researcher at the IAC and a co-author of the article. Now we have been able to identify the specificity of the process in our own Galaxy, revealing the first stages of our cosmic history with unprecedented detail.

Credit: 
Instituto de Astrofísica de Canarias (IAC)

Characteristics in older patients associated with inability to return home after operation

WASHINGTON, D.C. (July 22, 2019): Older adults have a different physiology and unique set of needs that may make them more vulnerable to complications following a surgical procedure. The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) Geriatric Surgery Pilot Project has, for the first time, identified four factors in older patients that are associated with an inability to return home after an operation. The NSQIP Geriatric Surgery Pilot Project is unique in that it is the only specifically defined data set focused on outcomes for older surgical patients.

In presenting study results at the ACS Quality and Safety Conference 2019, concluding today in Washington, DC, researchers reported on geriatric-specific conditions among Geriatric Pilot Project patients that were associated with not living at home 30 days after surgery. This information can help surgeons advise patients about the possible effects of a surgical procedure on their lifestyle as well as their clinical outcomes before an operation. It also may guide hospital quality improvement programs to address pre- and postoperative conditions that may keep elderly surgical patients from returning home soon afterward.

"When surgeons speak with older patients about the decision to operate, we discuss complication rates and the risk of mortality. We don't usually talk about whether they will have the independence they had beforehand. In this study, we looked at the NSQIP data set to find factors that influence whether patients are living at home or require support for their functional needs in some kind of facility, such as a nursing home, 30 days after surgery. This information should help us make better preoperative decisions with our patients by allowing us to tell them about the impact a surgical procedure will have on their way of life," said study coauthor Ronnie Rosenthal, MD, FACS, co-principal investigator of the ACS-led Coalition for Quality in Geriatric Surgery (CQGS) and professor of surgery and geriatrics, Yale University School of Medicine, New Haven, CT.

The NSQIP Geriatric Surgery Pilot Project was created in 2014 to measure and improve the quality of surgical care for older Americans. The project measures preoperative variables and outcome measures that specifically target elderly patients, reflect the quality of their surgical care, and identify interventions that may improve their treatment and well-being.

"Hospitals may implement protocols that improve patient function or prevent postoperative problems that make it less likely for a patient to return home," said study co-author Lindsey Zhang, MD, MS, John A. Hartford Foundation James C. Thompson Clinical Scholar in Residence at ACS, and a general surgery resident at the University of Chicago Medical Center.

The researchers looked at 3,696 patients in the NSQIP Geriatric Surgery Pilot registry who had inpatient procedures between 2015 and 2017 and whose living location 30 days after surgery was known. Eighteen percent of these patients were still living in a care facility 30 days after surgical treatment. The four characteristics identified among these older patients were a history of falls within the past year, preoperative malnutrition (defined as more than 10 percent unintentional weight loss), postoperative delirium, and a new or worsening pressure ulcer after surgery.

"This information empowers physicians to have a conversation with their older surgical patients about the possibility of a stay in an extended care facility, depending on patient characteristics and the nature of the operation they are about to undergo," Dr. Zhang said.

Because this study shows geriatric risk factors that appear to be associated with an extended stay in a care facility, its results may lead to quality improvement initiatives in a hospital. "Should we consider nutrition programs for patients with malnutrition or create programs to improve function for patients who have had a fall? Do we implement protocols in the postop period to prevent delirium and pressure ulcers? Will these steps lead to more patients going home after surgery? We can't say for sure, but these results provide strong evidence to say it's worth the effort for a hospital to address these issues," Dr. Zhang said.

On July 19, the ACS introduced the Geriatric Surgery Verification (GSV) Program by releasing the GSV standards for geriatric surgical care for hospitals to review prior to enrolling in this new surgical quality improvement program in late October. These standards address many key factors in geriatric surgery, including those that may delay an older patient's return home postoperatively.

Credit: 
American College of Surgeons

Are plant-based eating habits associated with lower diabetes risk?

Bottom Line: This study (called a systematic review and meta-analysis) combined the results of nine studies and examined the association between adherence to plant-based eating habits and risk of type 2 diabetes in adults. The analysis included 307,099 adults with 23,544 cases of type 2 diabetes. The authors report higher adherence to plant-based eating habits was associated with lower risk of type 2 diabetes, especially when only healthy plant-based foods, such as fruits, vegetables, whole grains, legumes and nuts, were included in the definition of plant-based. Limitations include that all of the studies analyzed were observational and dietary habits were self-reported.
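
For readers unfamiliar with the method, the pooling step of a meta-analysis can be illustrated with a minimal inverse-variance sketch in Python. The nine relative risks and standard errors below are hypothetical placeholders, not the estimates from the studies this analysis combined:

```python
# Minimal fixed-effect meta-analysis sketch -- the per-study relative
# risks (RRs) and standard errors below are hypothetical, not the
# nine studies' actual estimates.
import numpy as np

log_rr = np.log([0.77, 0.82, 0.70, 0.88, 0.75, 0.80, 0.85, 0.72, 0.79])
se = np.array([0.08, 0.10, 0.12, 0.09, 0.11, 0.07, 0.10, 0.13, 0.09])

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)      # weighted mean of log-RRs
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled RR: {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```

Published meta-analyses of observational studies typically use a more elaborate random-effects weighting to allow for between-study heterogeneity; the inverse-variance idea is the same.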

Authors: Qi Sun, M.D., Sc.D., of the Harvard T. H. Chan School of Public Health, Boston, and coauthors

(doi:10.1001/jamainternmed.2019.2195)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Overstuffed cancer cells may have an Achilles' heel

image: Aneuploid yeast cells on the left have difficulty drawing in fluorescent molecules, whereas the normal yeast cells on the right are able to rapidly draw them in.

Image: 
Rong Li and Hung-Ji Tsai

In a study using yeast cells and data from cancer cell lines, Johns Hopkins University scientists report they have found a potential weak spot among cancer cells that have extra sets of chromosomes, the structures that carry genetic material. The vulnerability, they say, is rooted in a common feature of cancer cells -- their high intracellular protein concentration -- that makes them appear bloated and overstuffed, and that could serve as a possible new target for cancer treatments.

"Scientists are now thinking more about targeting the biophysical properties of cancer cells to make them self-destruct," says Rong Li, Ph.D., Bloomberg Distinguished Professor of Cell biology and Oncology at the Johns Hopkins University School of Medicine and of Chemical and Biomolecular Engineering at the Johns Hopkins Whiting School of Engineering.

Further research is planned to confirm the findings in animal and human cancer cells, says Li.

A report on the research, led by Li, is published in the June 6 issue of Nature.

The new experiments focused on a chromosome number abnormality known as aneuploidy. Normal human cells, for example, have a balanced number of chromosomes: 46 in all, or 23 pairs of different chromosomes. A cell with extra or missing copies of chromosomes is called aneuploid. Li says, "Aneuploidy is the #1 hallmark of cancer," and it is found in more than 90% of solid tumor cancer types.

When cells gain chromosomes, Li says, they also get an extra set of genes that produce more than the normal amount of protein that a cell makes. This excess can give cells growth abilities they normally wouldn't have, sometimes allowing them to overgrow and develop into a tumor.

Because aneuploid cells have unbalanced protein production, they have too many free-floating proteins that are not organized into a complex. This increases the concentration inside the cell compared to outside. To compensate, the cells draw in water, a phenomenon that leads to hypo-osmotic stress.

"Aneuploid cells tend to be bigger and more swollen than cells with a balanced number of chromosomes," says Li.

Li, who is a member of the Johns Hopkins Kimmel Cancer Center, says she and her team set out to see if there was a common Achilles' heel among aneuploid cancer cells, one that would make a powerful strategic target for cancer treatment.

For the study, which took nearly five years to complete, Li and her colleagues, including first author and Johns Hopkins postdoctoral fellow Hung-Ji Tsai, Ph.D., looked at yeast cells, which have 16 chromosomes. In stressful environments, such as those with cold temperatures or inadequate nutrients, yeast cells adapt by altering the number of chromosomes, which allows them to survive better due to changes in the relative amounts of various proteins.

Li and Tsai looked at gene expression levels of thousands of aneuploid yeast cells compared with normal ones. Specifically, the scientists looked for gene expression changes that were shared among the aneuploid cells despite their differences in chromosome copy number. Among the aneuploid cells, the scientists found that gene expression was altered in about 4% of the genome compared with normal cells.
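
The comparison described here can be pictured with a small sketch: given log fold-changes for many aneuploid samples relative to matched normal controls, one flags genes that shift in the same direction in nearly all samples, regardless of which chromosomes are in excess. The dimensions, threshold, and planted signal below are all invented for illustration, not the authors' pipeline:

```python
# Sketch of finding a shared aneuploidy expression signature.
# Illustrative only: data, shapes, and threshold are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_genes = 40, 6000            # hypothetical dimensions
# log2 fold-change of each aneuploid sample vs. a euploid control
lfc = rng.normal(0.0, 0.5, size=(n_samples, n_genes))
lfc[:, :240] += 1.0                      # plant a toy shared shift in 4% of genes

# A gene joins the shared signature if it moves the same way in
# nearly all samples, whatever chromosomes each sample has gained.
frac_up = (lfc > 0).mean(axis=0)
shared = (frac_up > 0.9) | (frac_up < 0.1)

print(f"{shared.mean():.1%} of genes form a shared signature")
```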

Next, the scientists compared the aneuploidy-associated gene expression with information from a database at Stanford University that contains changes in gene expression among normal yeast cells exposed to different stressful environments. They found that both the aneuploid cells and normal cells under hypo-osmotic stress share certain gene expression characteristics. They also share the problem of being bloated, affecting their ability to internalize proteins located on the cell membrane that regulate nutrient uptake.

Li's team continued its work to see if it could exploit aneuploid cells' vulnerability in properly controlling the intake of nutrients. They screened the yeast genome and found a molecular pathway involving two proteins, ART1 and Rsp5, that regulates the cells' ability to draw in nutrients such as glucose and amino acids. When the scientists inactivated these proteins in the aneuploid yeast cells, the cells lacked proper intracellular nutrient levels and were less able to grow.

The human equivalent of the molecular pathway involves proteins called arrestins and Nedd4.

"It's possible that we could find a treatment that targets this or another pathway that exploits the vulnerability common to aneuploid cancer cells," says Li.

Credit: 
Johns Hopkins Medicine

UNM scientists document late Pleistocene/early Holocene Mesoamerican stone tool tradition

image: UNM graduate student Paige Lynch conducting excavations at Mayahak Cab Pek in May 2019, part of ongoing UNM research into the earliest humans in the New World tropics.

Image: 
University of New Mexico

From the perspective of Central and South America, the peopling of the New World was a complex process lasting thousands of years and involving multiple waves of Pleistocene and early Holocene period immigrants entering into the Neotropics.

Paleoindian colonists arrived in waves, entering the Neotropics - a region that begins in the humid rainforests of southern Mexico - before 13,000 years ago, and brought with them technologies developed for the environments and resources of North America.

As the ice age ended across the New World, people adapted more generalized stone tools to exploit changing environments and resources. In the Neotropics these changes would have been pronounced as patchy forests and grasslands gave way to broadleaf tropical forests.

In new research published recently in PLOS One, titled "Linking late Paleoindian stone tool technologies and populations in North, Central and South America," scientists from The University of New Mexico led a study in Belize to document the earliest indigenous stone tool tradition in southern Mesoamerica.

"This is an area of research for which we have very poor data regarding early humans, though this UNM-led project is expanding our knowledge of human behavior and relationships between people in North, Central and South America," said lead author Keith Prufer, professor from The University of New Mexico's Department of Anthropology.

This research, funded by grants from the National Science Foundation and the Alphawood Foundation, focuses on understanding the Late Pleistocene human colonization of the tropics in the broad context of the global changes occurring at the end of the last ice age (ca. 12,000-10,000 years ago). The research suggests the tools are part of a story of human adaptation to emerging tropical conditions in what is today called the Neotropics, a broad region south of the Isthmus of Tehuantepec in southern Mexico.

As part of the research, the team conducted extensive excavations at two rock shelter sites from 2014 to 2018. The excavation sites, located in the Bladen Nature Reserve, are almost 30 miles from the nearest road or modern human settlement, in a large undisturbed rainforest that is one of the best-protected wildlife refuges in Central America.

"We have identified and established an absolute chronology for the earliest stone tool types that are indigenous to Central America," said Prufer. "These have clear antecedents with the earliest known humans in both South America and North America, but appear to show more affinity with slightly younger Late Paleoindian toolkits in the Amazon and Northern Peru than with North America."

The research represents the first endogenous Paleoindian stone tool technocomplex recovered from well-dated stratigraphic contexts in Mesoamerica. These artifacts share multiple features with contemporary North and South American Paleoindian tool types. Once hafted, these bifaces appear to have served multiple functions for cutting, hooking, thrusting, or throwing.

"The tools were developed at a time of technological regionalization reflecting the diverse demands of a period of pronounced environmental change and population movement," said Prufer. "Combined stratigraphic, technological, and population paleogenetic data suggests that there were strong ties between lowland neotropic regions at the onset of the Holocene."

These findings support previous UNM research suggesting strong genetic relationships between early colonists in Central and South America, following the initial dispersal of humans from Asia into the Americas via the arctic prior to 14,000 years ago.

"We are partnering with Belizean conservation NGO Ya'axche Conservation Trust in our fieldwork to promote the importance of ancient cultural resources in biodiversity and protected areas management," said Prufer. "We spend a month every year camped out with no access to electricity, internet, phone or resupplies while we conduct excavations."

This field research involves several UNM graduate students in Archaeology and Evolutionary Anthropology as well as collaborators at Exeter University (UK) and Arizona State University. The analysis for this study was done in part at UNM's Center for Stable Isotopes, as well as with co-authors at Penn State and UC Santa Barbara. At UNM this involved the new radiocarbon preparation laboratories, part of the Center for Stable Isotopes, one of the anchors of UNM's interdisciplinary PAIS research and teaching facility.

The senior co-authors are world leaders in the study of early humans in the tropics and are committed to conservation efforts of cultural resources and regional biodiversity. Additionally, Prufer's long-term collaboration in indigenous Maya communities in the region was critical to the success of this project.

"This research suggests that further exploration of links between early humans living in the neotropics are needed to better understand how knowledge and technologies were shared, and will contribute to our understanding of processes that eventually led to the development of agriculture and sedentary communities," said Prufer. "Further studies on how these tools were used for food processing will be a key aspect of this research."

Credit: 
University of New Mexico

Serious falls are a health risk for adults under 65

New Haven, Conn. -- Adults who take several prescription medications are more likely to experience serious falls, say Yale researchers and their co-authors in a new study. This heightened risk can affect middle-aged individuals -- a population not typically viewed as vulnerable to debilitating or fatal falls, the researchers said.

To identify factors that put adults at risk for serious falls, the research team used patient data from the Veterans Aging Cohort Study (VACS), a national study of individuals who receive care through the Veterans Health Administration (VA). They identified 13,000 fall cases and compared them to controls of similar age, race, sex, and HIV status. The fall risk factors included prescription medication use, and alcohol and illegal drug use.

The researchers found that falls were a problem for middle-aged patients. "Providers typically think about falls in people over age 65. But these people were primarily in their 50s and falls were an important concern," said Julie Womack, lead author and associate professor at Yale School of Nursing.

The study also noted that the simultaneous use of multiple medications, known as polypharmacy, plays a significant role in serious falls among patients who are HIV positive and those who are not. The researchers examined HIV status because people treated for HIV take several medications, often beginning at a younger age.

Medications that were associated with serious falls included those commonly used to treat anxiety and insomnia (benzodiazepines), as well as muscle relaxants and prescription opioids.

Another important finding is the role of alcohol and illegal drug use in falls, Womack said.

The study suggests that programs designed to prevent serious falls in older adults may need to be modified to address risks for middle-aged adults. "Fall risk factors are highly prevalent in the Baby Boomer generation more generally. The next step is to look at interventions for the middle aged," said Womack. Those interventions could address drinking and illicit drug use in addition to polypharmacy. "When we're thinking about fall prevention programs we have to think about alcohol and substance use. We need to help individuals cut back."

Reducing falls in middle-aged and older adults is vital because falls contribute to increased risk of injuries, hospitalizations, and death, said Womack.

Credit: 
Yale University

Antibiotics before liver transplants lead to better results

A UCLA-led research team has found that giving mice antibiotics for 10 days prior to a liver transplant leads to better liver function after the surgery.

After concluding the experiments in mice, the scientists examined data from liver transplants performed between October 2013 and August 2015 at the Ronald Reagan UCLA Medical Center, which revealed that the same phenomenon appears to hold true in humans. The statistics from human patients even demonstrated that people who were in worse health prior to their surgeries but received pre-surgery antibiotics fared better after their transplants than patients who were healthier prior to their surgeries but did not receive antibiotics.

The researchers concluded that the antibiotics inhibited bacteria that cause inflammation, which in turn can lead to organ rejection. Specifically, they found that in both mice and humans, the treatment prior to a transplant reduced the damage that can occur when blood flow is restored to the liver after a period without oxygen, and it reduced inflammation and cell damage while accelerating the removal of damaged cells. As a result, liver function was better than in the mice and human patients who did not receive antibiotics before a transplant.

Humans carry trillions of bacteria, many of which are essential for health -- aiding in food digestion, for example. But other bacteria are linked to inflammatory bowel disease, cardiovascular disease, obesity, diabetes and even Parkinson's disease.

"So the idea behind this is to identify which bacteria is the good guy and which is the bad guy," said Dr. Jerzy Kupiec-Weglinski, the Paul I. Terasaki Professor of Surgery at the David Geffen School of Medicine at UCLA and the study's senior author.

The research is published in the peer-reviewed Journal of Clinical Investigation.

Studies in the past four years at the University of Chicago and University of Maryland have shown that in mice that received antibiotics before skin or heart transplants, those transplants lasted longer without rejection than in mice that did not receive antibiotics pre-surgery. From those studies, the UCLA researchers were able to zero in on some bacteria that appeared to help ensure more successful transplants.

In the mice that received antibiotics over the 10 days before transplants, the livers showed less damage -- deterioration caused by dead cells, for example -- than those in mice that were not given antibiotics before undergoing the same procedure.

"The livers in the mice that received antibiotics were protected against transplant damage, as well as rejection later on, because the antibiotics modulated their host microbiomes, which in turn stimulated cell protection," Kupiec-Weglinski said.

To further substantiate the effect of the antibiotics, the researchers then transplanted fecal matter from the untreated mice into those that had been given the medication. The mice that received the fecal transplants suffered inflammatory damage to their livers, despite the fact that they had been given antibiotics earlier in the experiment.

"That showed that antibiotic-mediated benefits clearly relate to the microbiota," Kupiec-Weglinski said.

The data on the UCLA patients covered 264 people who had received liver transplants -- 156 who, because they were sicker before their surgeries, received antibiotics for 10 or more days prior to the transplant, and 108 who were given antibiotics for fewer than 10 days, or not at all, prior to surgery.

"To our total surprise, livers functioned better after transplantation in those patients who were very sick and required prolonged antibiotic therapy," Kupiec-Weglinski said.

The researchers then narrowed their focus to human patients who had been given one specific antibiotic, rifaximin, prior to their transplants. (To pull together a larger sample size, the team analyzed data for transplants over a longer period than the original set of UCLA patients, from January 2013 through July 2016.) They found that in patients who received rifaximin, which stays in the bowel and has a low risk of inducing bacterial resistance, early liver failure was significantly delayed or stopped.

Although the combination of data from human patients and mice demonstrates the benefits of extended pre-transplant antibiotic treatment, the researchers wrote in the journal that the results were gleaned over a short period of time -- just 10 days or less after the transplants -- and longer-term studies are needed to strengthen the findings.

Kupiec-Weglinski also said the study opens the door to further research that could determine which microorganisms protect liver function after transplants, and which bacteria need to be "turned down" to limit their negative effects.

Credit: 
University of California - Los Angeles Health Sciences