Physicians slow to use effective new antibiotics against superbugs

PITTSBURGH, Aug. 26, 2019 - New, more effective antibiotics are being prescribed in only about a quarter of infections by carbapenem-resistant Enterobacteriaceae (CRE), a family of the world's most intractable drug-resistant bacteria, according to an analysis by infectious disease and pharmaceutical scientists at the University of Pittsburgh School of Medicine and published today by the journal Open Forum Infectious Diseases.

This sluggish uptake of such high-priority antibiotics prompted the researchers to call for an examination of clinical and pharmaceutical stewardship practices across U.S. hospitals, as well as behavioral and economic factors, to see if the trend can be reversed before lackluster sales lead the pharmaceutical industry to stop developing much-needed antibiotics.

"The infectious diseases community spent the past decade saying, 'We need new antibiotics, this is a top priority,' and now we're at risk of sounding like the boy who cried wolf," said lead author Cornelius J. Clancy, M.D., associate professor of medicine and director of the mycology program and XDR Pathogen Laboratory in Pitt's Division of Infectious Diseases. "We have a responsibility to learn why it takes so long for antibiotics to be adopted into practice and figure out what we need to do to ensure the best antibiotics quickly reach the patients who desperately need them."

The U.S. Centers for Disease Control and Prevention has classified CRE as an urgent threat and calls them the "nightmare bacteria." The World Health Organization and the Infectious Diseases Society of America have designated CRE as highest-priority pathogens for the development of new antibiotics. At the time of those declarations, polymyxins were the first-line antibiotics against CRE, even though they failed to work in about half the cases and carried a significant risk of damaging the kidneys.

Since 2015, five antibiotics against CRE have gained U.S. Food and Drug Administration (FDA) approval: ceftazidime-avibactam, meropenem-vaborbactam, plazomicin, eravacycline and imipenem-relebactam. Studies, including those conducted at UPMC, have shown that the first three of these antibiotics are significantly more effective at fighting CRE and less toxic than polymyxins (eravacycline and imipenem-relebactam are still too new for conclusive data).

Clancy and his colleagues surveyed hospital-based pharmacists in the U.S. to gauge their knowledge of the new antibiotics and their willingness to use them. The drugs were classified as the "first-line" choice against CRE blood infections by 90% of the pharmacists, pneumonia by 87%, intra-abdominal infections by 83% and urinary tract infections by 56%.

"Clearly hospital-based pharmacists are aware of these antibiotics and believe they are the best choice for the vast majority of CRE infections," said Clancy.

But when the team estimated the number of CRE infections nationwide and used national prescription data to calculate the proportions of old vs. new antibiotics used to treat those infections, they found that from February 2018 through January 2019, the new antibiotics were used only about 23% of the time. Their use likely started to exceed that of polymyxins only in December 2018, nearly four years after the first of the new antibiotics was approved by the FDA. Even after accounting for CRE infections in which new antibiotics might not be first-choice agents, the team found that use was only about 35% of what was expected based on positioning by hospital-based pharmacists.
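To make the reported figures concrete, here is a minimal sketch of the observed-versus-expected arithmetic. Only the 23% and roughly 35% values come from the article; the expected first-line share is a hypothetical placeholder chosen for illustration.

```python
# Sketch of the observed-vs-expected calculation; illustrative numbers only.

new_drug_share = 0.23   # reported: share of CRE treatment courses using new antibiotics
expected_share = 0.66   # hypothetical: share of CRE infections where the new drugs
                        # would be first-line after exclusions (not from the study)

ratio = new_drug_share / expected_share
print(f"Actual use was about {ratio:.0%} of expected")  # ~35%, as reported
```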

Allergan and The Medicines Company, developers of two of the new antibiotics, have sought to exit the antimicrobial field since introducing their drugs because of insufficient returns on investment. Achaogen declared bankruptcy months after attaining FDA approval for a third new antibiotic.

The researchers suggest several reasons for the slow uptake of the new antibiotics, starting with cost. A 14-day course of the new antibiotics costs between $13,230 and $15,070, compared to $305 to $784 for the old drugs.

"Cost is a limitation, but I'm not convinced it is the sole cause of our findings," said Clancy. "Clinicians may not be prescribing the new drugs due to concerns about accelerating antibiotic-resistance or because initial studies on their effectiveness were relatively small. We need to get at the root causes of the disconnect between what the doctors prescribe and what the pharmacists we surveyed believe they should be prescribing, and then find a solution."

Credit: 
University of Pittsburgh

A lack of background knowledge can hinder reading comprehension

The purpose of going to school is to learn, but students may find certain topics difficult to understand if they don't have the necessary background knowledge. This is one of the conclusions of a research article published in Psychological Science, a journal of the Association for Psychological Science.

"Background knowledge plays a key role in students' reading comprehension -- our findings show that if students don't have sufficient related knowledge, they'll probably have difficulties understanding text," says lead researcher Tenaha O'Reilly of Educational Testing Service (ETS)'s Center for Research on Human Capital in Education. "We also found that it's possible to measure students' knowledge quickly by using natural language processing techniques. If a student scores below the knowledge threshold, they'll probably have trouble comprehending the text."

Previous research has shown that students who lack sufficient reading skills, including decoding and vocabulary, fare poorly relative to their peers. But the research of O'Reilly and ETS colleagues Zuowei Wang and John Sabatini suggests that a knowledge threshold may also be an essential component of reading comprehension.

The researchers examined data from 3,534 high-school students at 37 schools in the United States. The students completed a test that measured their background knowledge on ecosystems. For the topical vocabulary section of the test, the students saw a list of 44 words and had to decide which were related to the topic of ecosystems. They also completed a multiple-choice section that was designed to measure their factual knowledge.

Then, after reading a series of texts on the topic of ecosystems, the students completed 34 items designed to measure how well they understood the texts. These comprehension items tapped into their ability to summarize what they had read, recognize opinions and incorrect information, and apply what they had read to reason more broadly about the content.

The researchers used a statistical technique called broken-line regression -- often used to identify an inflection point in a data set -- to analyze the students' performance.

The results revealed that a background-knowledge score of about 33.5, or about 59% correct, functioned as a performance threshold. Below this score, background knowledge and comprehension were not noticeably correlated; above the threshold score, students' comprehension appeared to increase as their background knowledge increased.
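For readers unfamiliar with the technique, here is a minimal sketch of broken-line regression on synthetic data; the score range, noise level and data are illustrative stand-ins, not the study's data or exact model.

```python
# Broken-line regression sketch: fit two joined line segments and choose the
# breakpoint that minimizes squared error.
import numpy as np

def fit_broken_line(x, y, candidates):
    best = None
    for c in candidates:
        # basis: intercept, slope below c, extra slope above c (hinge term)
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ beta) ** 2))
        if best is None or sse < best[0]:
            best = (sse, c, beta)
    return best  # (error, breakpoint, coefficients)

rng = np.random.default_rng(0)
knowledge = rng.uniform(0, 57, 500)   # synthetic background-knowledge scores
comprehension = np.where(knowledge > 33.5, 0.02 * (knowledge - 33.5), 0.0) \
    + rng.normal(0, 0.05, knowledge.size)
_, breakpoint, _ = fit_broken_line(knowledge, comprehension,
                                   candidates=np.linspace(10, 50, 81))
print(f"estimated threshold ≈ {breakpoint:.1f}")  # near 33.5 by construction
```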

Additional results indicated that the pattern could not be fully explained by the level of students' knowledge on a different topic -- what mattered was their background knowledge of ecosystems.

The researchers found that students' ability to identify specific keywords was a fairly strong predictor of whether they would perform above or below the threshold. That is, correctly identifying ecosystems, habitat, and species as topically relevant was more strongly linked with students' comprehension than was identifying bioremediation, densities, and fauna.

The findings underscore the importance of reaching a basic level of knowledge for reading and comprehending texts across different subjects:

"Reading isn't just relevant to English Language Arts classes but also to reading in the content areas," says O'Reilly. "The Common Core State Standards highlight the increasing role of content area and disciplinary reading. We believe that the role of background knowledge in students' comprehension and learning might be more pronounced when reading texts in the subject areas."

The researchers plan to explore whether a similar kind of knowledge threshold emerges in other topic areas and domains; they note that it will be important to extend the research by focusing on diverse measures and populations.

If the pattern holds, the findings could have important applications for classroom teaching, given the availability of knowledge assessments that can be administered without taking valuable time away from instruction.

"If we can identify whether a given student does not have sufficient knowledge to comprehend a text, then teachers can provide background material -- for example, knowledge maps -- so that students have a context for the texts they are about to read," O'Reilly concludes..

Credit: 
Association for Psychological Science

Researchers propose method to balance user experience and cloud cost

For an online gamer, lag is the worst. The gamer watches, telling the avatar to move to avoid another player's attack, but the avatar does nothing. Then, suddenly, the avatar executes all of the commands, rapid fire. It was listening; it just took too long for the commands to process.

Researchers at the New Jersey Institute of Technology (NJIT) have now developed a method to help avoid this aggravating issue. They published their results in IEEE/CAA Journal of Automatica Sinica, a joint publication of the IEEE and the Chinese Association of Automation.

According to Dr. Qiang Fan from the Department of Electrical and Computer Engineering at NJIT, the problem comes down to something called end-to-end delay. This is the time it takes for information to travel from a source to a destination across a network -- for example, from a gamer's command to the avatar actually acting -- and the stakes can go well beyond the irritation of game lag.

"End-to-end delay is a significant metric for service performance," Fan said. "A long end-to-end delay is unbearable for various delay-sensitive applications, such as autonomous vehicles, augmented reality and virtual reality."

In autonomous vehicles, it's a critical problem. A delay between the source and the destination could result in an accident.

To address this issue, Fan and Nirwan Ansari, co-author and Distinguished Professor of Electrical and Computer Engineering at NJIT, proposed a fix using cloudlets. These are basically tiny versions of the cloud.

"The cloud is a centralized data center that offloads users' tasks via the Internet," Fan said, noting that this usually expediates the commands while reducing the amount of energy users consume in processing. "However, the cloud is usually remotely located and far away from its users."

In comparison, cloudlets live on the edge of a user's network and only address commands from the designated user or users, depending on how many are within the network. They can significantly improve a network's service performance for a limited number of users, but they can be costly if more are needed.

Each cloudlet must be hosted by a server, a considerable expense. If a cloudlet serves more than one user, the expense per user drops, but the end-to-end delay can increase.

Fan and Ansari aimed to find a balance between cost and acceptable delay. They developed an algorithm that assesses how the location and capacity of each cloudlet can best handle user requests to achieve an optimal balance.
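The article does not spell out the algorithm, but the tradeoff it optimizes can be sketched in a few lines: score each candidate set of cloudlet sites by server cost plus a delay penalty, and keep the cheapest. The cost constants, toy delays and brute-force search below are illustrative assumptions, not the authors' method.

```python
# Brute-force sketch of cloudlet placement balancing deployment cost vs. delay.
from itertools import combinations

SERVER_COST = 100.0   # hypothetical cost of hosting one cloudlet server
DELAY_WEIGHT = 5.0    # hypothetical weight converting delay into cost units

def placement_score(sites, users, delay):
    # each user connects to its lowest-delay chosen cloudlet site
    total_delay = sum(min(delay[u][s] for s in sites) for u in users)
    return SERVER_COST * len(sites) + DELAY_WEIGHT * total_delay

def best_placement(candidate_sites, users, delay, max_cloudlets):
    best = None
    for k in range(1, max_cloudlets + 1):
        for sites in combinations(candidate_sites, k):
            score = placement_score(sites, users, delay)
            if best is None or score < best[0]:
                best = (score, sites)
    return best

# toy example: delay[u][s] in milliseconds for 4 users and 3 candidate sites
delay = [[5, 20, 40], [25, 4, 30], [35, 22, 6], [18, 9, 27]]
print(best_placement(range(3), range(4), delay, max_cloudlets=2))
```

A real deployment would replace the exhaustive search with the kind of optimization the authors propose, since the number of candidate placements grows combinatorially.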

"The proposed cloudlet placement scheme has jointly considered the deployment cost and service performance," Fan said, referring to the outcome of the researchers' simulations. "Cloudlet providers can flexibly balance the cost and performance by adjusting their deployment plans based on their practical requirements."

Credit: 
Chinese Association of Automation

Australian researchers reveal new insights into retina's genetic code

image: This is a graphic representation of the retinal atlas

Image: 
Centre for Eye Research Australia

Australian scientists have led the development of the world's most detailed gene map of the human retina, providing new insights which will help future research to prevent and treat blindness.

The retina is the latest part of the human body, and the first part of the eye, to be mapped as part of the Human Cell Atlas Project - a global project to create reference maps of all human cells to better understand, diagnose and treat disease.

It is also the first time an Australian group has contributed to the project.

The study, led by Dr Raymond Wong from the Centre for Eye Research Australia and University of Melbourne, Dr Samuel Lukowski from the Institute for Molecular Bioscience at the University of Queensland and Associate Professor Joseph Powell from the Garvan Institute of Medical Research, is published in the European Molecular Biology Organization (EMBO) Journal.

Dr Wong says the study provides unprecedented insights into the genetic signals of cells in the retina - the thin layer of cells at the back of the eye that sense light and send messages to the brain via the optic nerve to enable us to see.

The group examined the complex genetic sequences behind more than 20,000 individual cells to develop a profile of all major cell types in the retina and the genes they 'express' to function normally.

Cells mapped include photoreceptors which sense light and allow people to see, the retinal ganglion cells which transmit messages to the brain along the optic nerve and other cells which support the function and stability of the retina.
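The article does not detail the analysis pipeline, but a typical single-cell RNA-seq workflow for building such a cell-type profile looks roughly like the following scanpy sketch; the input file and parameter choices are placeholders, not the authors' exact settings.

```python
# Generic single-cell clustering workflow (scanpy), sketched for illustration.
import scanpy as sc

adata = sc.read_10x_mtx("retina_counts/")     # hypothetical ~20,000-cell count matrix
sc.pp.filter_cells(adata, min_genes=200)      # basic quality filtering
sc.pp.normalize_total(adata, target_sum=1e4)  # sequencing-depth normalization
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable]
sc.pp.pca(adata)
sc.pp.neighbors(adata)                        # k-nearest-neighbor graph in PCA space
sc.tl.leiden(adata)                           # graph-based clustering into cell types
sc.tl.rank_genes_groups(adata, "leiden")      # genes each cluster 'expresses' most,
                                              # e.g. photoreceptor or ganglion markers
```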

"By creating a genetic map of the human retina, we can understand the factors that enable cells to keep functioning and contribute to healthy vision,'' says Dr Wong.

"It can also help us understand the genetic signals that cause a cell to stop functioning, leading to vision loss and blindness.''

Associate Professor Powell says the retinal cell atlas will benefit researchers investigating Inherited Retinal Diseases, which occur when genetic 'mistakes' cause retinal cells to stop functioning, leading to vision loss and blindness.

"More than 200 genes are known to be associated with retinal diseases and having a detailed gene profile of individual retinal cell types, will help us study how those genes impact on different kinds of cells.

"This understanding is the first step to better identifying what causes disease and ultimately developing treatments.''

Dr Wong says the atlas will also help scientists conducting research in the emerging area of cell therapy - which could replace faulty retinal cells with new ones developed from induced pluripotent stem cells in the lab.

"The retinal cell atlas will give scientists a clear benchmark to assess the quality of the cells derived from stem cells to determine whether they have the correct genetic code which will enable them to function.''

Dr Lukowski says the research offers 'extraordinary potential'.

"We can now build upon this atlas of healthy cells with those from other retinal diseases and across different stages of human development, which will provide the community with powerful tools for disease prediction," he says.

According to Associate Professor Powell, cutting-edge cellular genomics technology will transform our understanding of health and disease.

"Cellular genomics is allowing us to see the human body at a higher resolution than ever before. The insights that researchers worldwide can gain from this atlas present an entirely new way to approach treatment and prevent eye disease."

Credit: 
University of Melbourne

Spikes in handgun purchases after high-profile events linked to more firearm injuries

image: This is a heat map of excess firearm purchases in California following 2012 presidential election.

Image: 
UC Davis Violence Prevention Research Program

Spikes in handgun purchases in 2012 after Sandy Hook and the re-election of President Obama have been linked to a 4% increase in firearm injury in California, a UC Davis Violence Prevention Research Program (VPRP) study has found.

The UC Davis School of Medicine study, to be published August 25 in Injury Epidemiology, assessed the sharp rise in handgun purchasing across 499 California cities and estimated whether the additional handguns increased fatal and non-fatal injuries in these communities. It is the first study to use a direct measure of handgun purchasing to link firearm purchases with subsequent firearm-related harm and to assess impact on firearm injury.

"We estimate there were 36,142 more-than-expected handgun acquisitions in California from the election through the 6-week period following the Sandy Hook school shooting," said Hannah Laqueur, co-author of the study and an assistant professor of emergency medicine at UC Davis. "This represents an increase of more than 55 percent over expected volume during this 11-week period."

The researchers found that cities with greater increases in the rate of handgun purchasing were more likely to see an increase in the rate of firearm injury.

"We estimated a 4% increase in injuries in the year following the two events over the entire state," said Rose Kagawa, co-author of the study and an assistant professor of emergency medicine at UC Davis. "This is an important increase in the total number of people injured: approximately 290 additional firearm injuries in the state."

Though the firearm purchasing spike statewide was substantial, it accounted for less than 10% of annual handgun acquisitions. It also is only a tiny fraction of the more than 30 million estimated privately owned firearms in California, the authors said.

"But even marginal increases in handgun prevalence may translate to more injuries," Kagawa said.

Links between firearm ownership and firearm harm

Firearm ownership is a known risk factor for firearm harm. The prevalence of firearm ownership has been associated with higher firearm homicide and suicide rates.

For the study, the research team assessed firearms purchases in California cities with a population of 10,000 or more and used a forecasting model to predict expected handgun purchases after the 2012 election. They estimated the spike in handgun purchases as the difference between actual handgun acquisitions, as recorded in California's Dealer Record of Sales, and expected acquisition based on the model. They tracked firearm fatalities using death records from the California Department of Public Health Vital Records and non-fatal injuries using hospital and emergency room visits gathered by the Office of Statewide Health Planning and Development. The data were tallied at the zip code level and attributed to corresponding cities.
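In outline, the excess-purchasing estimate is the gap between observed purchases and a counterfactual forecast fitted to pre-event data. The linear-trend baseline and synthetic weekly counts below are illustrative stand-ins for the study's forecasting model and the Dealer Record of Sales data.

```python
# Sketch of an observed-minus-expected excess estimate on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
weeks = np.arange(104)                        # two years of weekly purchase counts
observed = 5000 + 3.0 * weeks + rng.normal(0, 150, weeks.size)
observed[93:] *= 1.55                         # simulated 11-week spike at the end

pre_event = weeks < 93                        # fit the baseline before the spike
trend = np.polyfit(weeks[pre_event], observed[pre_event], deg=1)
expected = np.polyval(trend, weeks)

excess = (observed[93:] - expected[93:]).sum()
print(f"excess purchases over the 11-week window ≈ {excess:,.0f}")
```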

"With the increasing rates of firearm purchases in the U.S. over the last decades and 2017 marking a 20-year high in firearm-related deaths, it is important to gain a deeper understanding of the relationship between firearm acquisition and harm to develop effective prevention strategies," Laqueur said.

Credit: 
University of California - Davis Health

Skin patch could painlessly deliver vaccines, cancer medications in one minute

image: A new microneedle patch delivers medication to melanomas within one minute (ruler is in centimeters).

Image: 
Celestine Hong and Yanpu He

SAN DIEGO, Aug. 25, 2019 -- Melanoma is a deadly form of skin cancer that has been increasing in the U.S. for the past 30 years. Nearly 100,000 new cases of melanoma are diagnosed every year, and 20 Americans die every day from it, according to the American Academy of Dermatology. Now, researchers have developed a fast-acting skin patch that efficiently delivers medication to attack melanoma cells. The device, tested in mice and human skin samples, is an advance toward developing a vaccine to treat melanoma and has widespread applications for other vaccines.

The researchers will present their findings today at the American Chemical Society (ACS) Fall 2019 National Meeting and Exposition. ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features more than 9,500 presentations on a wide range of science topics.

"Our patch has a unique chemical coating and mode of action that allows it to be applied and removed from the skin in just a minute while still delivering a therapeutic dose of drugs," says Yanpu He, a graduate student who helped develop the device. "Our patches elicit a robust antibody response in living mice and show promise in eliciting a strong immune response in human skin."

Topical ointments can deliver medications to the skin, but they penetrate only a small distance through it. While syringes are an effective drug delivery mode, they can be painful. Syringes can also be inconvenient for patients, leading to noncompliance.

Microneedle patches, prepared with a layer-by-layer (LbL) coating method, are one easy, pain-free way to administer treatment. With the LbL process, researchers coat a surface with molecules of alternating positive and negative charge. For a robust drug film to form on the surface of the patch, adjacent layers must be strongly attracted to each other and to the microneedle. "But this attraction makes the entire film very 'sticky,'" He notes. "Past methods, which have retained this 'sticky' nature, can take up to 90 minutes for a sufficient amount of drug to leave the patch and enter the skin."

Paula T. Hammond, Ph.D., along with her graduate students He, Celestine Hong and other colleagues at the Massachusetts Institute of Technology (MIT), devised a way around this problem. They designed a new pH-responsive polymer with two parts. "The first part contains amine groups that are positively charged at the pH at which we make the microneedles, but that become neutral at the pH of skin," He says. "The second part contains carboxylic acid groups with no charge when the microneedles are made, but which become negatively charged when the patch is applied to the skin, so there is an overall change in charge from positive to negative." While sticky negative-positive-negative layers are still required for LbL film construction, the team's patch quickly switches to repelling negative-negative-negative layers when placed on skin. After the microneedles pierce the skin and implant the LbL drug film beneath the skin, the drug leaves the patch quickly.

Using chicken ovalbumin as a model antigen, the team vaccinated mice with their patches, and compared the results with intramuscular and subcutaneous injections. The microneedle treatment produced nine times the antibody level compared to intramuscular injections (e.g., used for flu shots) and 160 times the antibody level compared to subcutaneous injections (e.g., used for measles vaccines). They also saw efficient immune activation in surgical samples of human skin.

"Our patch technology could be used to deliver vaccines to combat different infectious diseases," Hammond says. "But we are excited by the possibility that the patch is another tool in the oncologists' arsenal against cancer, specifically melanoma."

To make a melanoma vaccine, the researchers developed an antigen that includes a marker frequently overexpressed by melanoma cells, as well as an adjuvant, which creates a generalized danger signal for the immune system and boosts its response. Then, they tested different LbL microneedle film arrangements of antigen and adjuvant in immune cells derived from mice. From these experiments, the researchers identified the optimal LbL microneedle structure that appears to activate immune cells directly accessible in the skin. In living mice, these cells could, in turn, migrate to the lymphatic system and recruit other immune cells to attack the melanoma tumor. The researchers now plan to test the patches on melanoma tumors in mice.

"We are using low-cost chemistry and a simple fabrication scheme to transform vaccination," Hammond says. "Ultimately, we want to get a device approved and on the market."

Credit: 
American Chemical Society

New study: Migrating mule deer don't need directions

image: Mule deer move across a sagebrush-covered basin in western Wyoming. New University of Wyoming research shows that deer navigate in spring and fall mostly by using their knowledge of past migration routes and seasonal ranges.

Image: 
Joe Riis

How do big-game animals know where to migrate across hundreds of miles of vast Wyoming landscapes year after year?

Among scientists, there are two schools of thought. The first is that animals use local cues within their vicinity to determine where to migrate. Animals might move up to areas with greener forage -- often termed green-wave surfing -- or move down from mountains with deeper snow. The second idea is that animals develop memory of the landscape where they live and then use that information to guide their movements.

Recent research from the University of Wyoming has found that memory explains much of deer behavior during migration: Mule deer navigate in spring and fall mostly by using their knowledge of past migration routes and seasonal ranges.

The study found that the location of past years' migratory route and summer range had 2-28 times more influence on a deer's choice of a migration path than environmental factors such as tracking spring green-up, autumn snow depth or topography.

"These animals appear to have a cognitive map of their migration routes and seasonal ranges, which helps them navigate tens to hundreds of miles between seasonal ranges," says the lead author of the paper, Jerod Merkle, assistant professor and Knobloch Professor in Migration Ecology and Conservation in the Department of Zoology and Physiology at UW.

The findings recently were published in Ecology Letters, a leading journal within the field of ecology. Co-authors of the paper included Hall Sawyer, with Western EcoSystems Technology Inc.; Kevin Monteith and Samantha Dwinnell, with UW's Haub School of Environment and Natural Resources; Matthew Kauffman, with the U.S. Geological Survey Wyoming Cooperative Fish and Wildlife Research Unit at UW; and Gary Fralick, with the Wyoming Game and Fish Department.

Scientists had long presumed that migratory behavior was dictated by availability of food resources and other external factors. Where you find resources, you will find species that exploit them, the theory went.

The UW team found it is not that simple. Without the intrinsic factor of landscape memory to guide deer between seasonal ranges, the long-distance corridors of western Wyoming's Green River Basin, for example -- exceeding 300 miles round-trip in some cases -- would not exist in their present form.

"It appears that green-wave surfing helps them determine when to move within a kind of 'map' in their brain," Merkle says. "The timing of spring green-up determines when an animal should migrate, but spatial memory determines where to migrate."

The finding has important conservation implications. Because landscape memory so strongly underlies mule deer migratory behavior, the loss of a migratory population also will destroy the herd's collective mental map of how to move within a landscape, making it very difficult to restore lost migration routes. Patches of potential habitat likely will go unused.

"This is yet another study that makes clear that animals must learn and remember how to make these incredible journeys," say Kauffman, who leads the Wyoming Cooperative Fish and Wildlife Research Unit, where the research was conducted. "This is critical for conservation, because it tells us that, to conserve a migration corridor, we need to conserve the specific animals who have the knowledge necessary to make the journey."

The study bolsters the findings of a 2018 paper in the journal Science by a UW-led team that found translocated bighorn sheep and moose with no knowledge of the landscape can take anywhere from several decades to a century to learn how to migrate to vacant habitats.

Similarly, strategies such as off-site restoration or mitigation may be unsuccessful if restored habitats are not "discovered" and integrated into the memory of individuals.

The study further makes a case that biologists will not be able to successfully predict migration corridors -- or optimally manage populations -- based on environmental information or range quality alone. Managers will find it difficult to evaluate potential conservation actions without directly gathering movement data, crucial information that reveals the migration knowledge that animals carry around in their heads.

Moreover, the research shows that migrants can obtain greater forage benefits during spring migration using memory of a vast landscape, compared to migrants that rely simply on foraging cues in their local area.

This suggests that the migratory routes we see today are optimized across generations for green-wave surfing in large landscapes. These learned migration corridors are not readily discoverable by animals if they cannot access the memories established by past generations.

Credit: 
University of Wyoming

Evolution designed by parasites

In research on interactions between parasites and hosts, substantial attention has been devoted to the methods parasitic organisms use to control host behavior. In "Invisible Designers: Brain Evolution Through the Lens of Parasite Manipulation," published in the September 2019 issue of The Quarterly Review of Biology, Marco Del Giudice explores an overlooked aspect of this relationship by systematically discussing the ways in which parasitic behavior manipulation may encourage the evolution of protective mechanisms in the host's nervous and endocrine systems. Examining this evolutionary history, Del Giudice investigates the hypothetical methods hosts may have adopted to counteract attempts at behavioral hijacking.

Parasites, such as viruses, insects, helminths, and bacteria, seek to manipulate host behavior for numerous reasons. Parasitic organisms may induce behavioral changes in order to increase their chances of transmission from one host to another. In a similar vein, parasites may disrupt a host's normal neural functioning to prompt the organism to travel to an environment that is more hospitable for the parasite or more conducive for reproduction. Host bodies are sometimes co-opted and utilized as safe environments for the development of the parasite's offspring.

The means by which parasites attempt to alter host behavior also vary. Parasites may use an immunological approach by disrupting responses in an organism's immune system. A more direct option may be to employ neuropharmacological manipulation by secreting substances that interfere with the host's neurotransmitters. Parasites may also take the genomic/proteomic route by changing gene expression.

Expanding beyond the motivations and tactics of parasites, Del Giudice posits that attacks from these biochemical mechanisms place significant pressure on the nervous system to adapt and develop countermeasures. Drawing upon previous literature and real-world examples, the article proposes four categories of potential host countermeasures. Analyzing this taxonomy, Del Giudice argues that when encountering manipulation, hosts may prevent parasites from bypassing the brain's protective barrier, force parasites to work harder and release greater amounts of neuroactive substances, make signals more complex, or strengthen the brain's ability to endure disturbances.

Elaborating further on these countermeasures, the author considers the potential evolutionary constraints and the associated "robustness-fragility tradeoffs." Although countermeasures may increase complexity and deter parasites, new adaptations may simultaneously weaken another section of the system and provide alternative targets for parasites.

Taking into account the proposed effectiveness of the mechanisms discussed, the article suggests studying host-parasite interactions could aid in furthering neuroscience and psychopharmacology research. By studying adaptations to parasitic attacks, neuroscientists could gain additional perspective into how continuous evolutionary battles produce inefficiency or how once-critical mechanisms over time serve unrelated functions. In psychopharmacology, the biochemical methods parasites use mimic those of pharmacological drugs. Researchers could analyze the ways brain structures reject pharmacological interventions by parasites and, in turn, use that information to make psychoactive drugs more effective for patients.

"The unrelenting pressure exerted by parasites must have shaped the evolution of nervous and endocrine systems at all levels, with important consequences even for animals that are not (or no longer) manipulation targets. If this is true, many aspects of neurobiology are destined to remain mysterious or poorly understood until parasites--the brain's invisible designers--are finally included in the picture," Del Giudice writes.

Credit: 
University of Chicago Press Journals

Think declining mental sharpness 'just comes with age'? Think again, say experts

image: Founded in 1942, the American Geriatrics Society (AGS) is a nationwide, not-for-profit society of geriatrics healthcare professionals that has--for more than 75 years--worked to improve the health, independence, and quality of life of older people. Our nearly 6,000 members include geriatricians, geriatric nurses, social workers, family practitioners, physician assistants, pharmacists, and internists. The Society provides leadership to healthcare professionals, policymakers, and the public by implementing and advocating for programs in patient care, research, professional and public education, and public policy. For more information, visit AmericanGeriatrics.org.

Image: 
(C) 2019, American Geriatrics Society (AGS)

Declining mental sharpness "just comes with age," right? Not so fast, say geriatrics researchers and clinicians gathered at a prestigious 2018 conference hosted by the American Geriatrics Society (AGS) with support from the National Institute on Aging (NIA). In a report published in the Journal of the American Geriatrics Society (JAGS), attendees of a conference for the NIA's Grants for Early Medical/Surgical Specialists Transition into Aging Research (GEMSSTAR) program describe how increasing evidence shows age-related diseases--rather than age itself--may be the key cause of cognitive decline. And while old age remains a primary risk factor for cognitive impairment, researchers believe future research--and sustained funding--could illuminate more complex, nuanced connections between cognitive health, overall health, and how we approach age.

"We've long been taught that cognitive issues are 'just part of aging,'" explains Christopher R. Carpenter, MD, MSc, who helped coordinate the conference. "But contemporary medical research shows how bodily changes that lead to diseases like dementia appear long before the symptoms we associate with 'old age.' This begs the question: Is it really age that causes cognitive decline, or is it ultimately the diseases we now associate with age--in large part because we see them with increasing frequency now that we live longer? That's what we wanted to tackle coming together for this meeting."

Hosted by the AGS and NIA in 2018 as the third conference in a three-part series for GEMSSTAR scholars, the NIA "U13" conference brought together NIA experts and more than 100 scholars, researchers, and leaders representing 13 medical specialties to explore experiences with cognitive impairment across health care. Conference findings, published in JAGS (DOI: 10.1111/jgs.16093), detail early thinking on the two-way relationship between cognitive health and the health of other organ systems, as well as opportunities for moving science and practice forward.

According to attendees, several themes emerged:

Researchers and clinicians from across health care noted the critical relationship between two of their top concerns: dementia and delirium (the medical term for abrupt, rapid-onset confusion or an altered mental state, which affects millions of older adults annually). Research now suggests delirium and dementia are reciprocal risk factors, with cases of one raising the risk for the other. Thus, prevention of delirium may offer an unprecedented opportunity to prevent or lessen future cognitive decline.

Still, as one of the conference attendees noted, "[T]he brain is not an island." Because the conference focused on the impact of cognitive impairment across specialties, a critical focal point for scholars was the complex, bi-directional relationship between cognition and the rest of the body. Cognitive impairments can serve as indicators or influencers in the course of other diseases and conditions. For example, cognitive impairment is perhaps "the strongest independent predictor" of hospital readmission and mortality for older people living with heart failure.

As the field progresses, however, a major barrier remains: a dearth of research owing to the exclusion of potential study participants who are cognitively impaired. Though obtaining informed consent (the term used to describe a person's willingness to participate in a study after confirming they understand all the possible risks and benefits) remains challenging, researchers pointed to data showing that willingness to participate remains high. That willingness, coupled with suggestions for tailoring consent safeguards to the types of studies and potential participants, holds promise for protecting against exploitation while continuing to move cutting-edge care principles forward.

As the GEMSSTAR conference attendees concluded, "The aging of the U.S. population and the growing burden of dementia make this an area of critical research focus...[U]nderstanding and addressing cognitive health and its relationship with the health of other organ systems will require multidisciplinary team science...[and new] study designs..."

Credit: 
American Geriatrics Society

Suicide and self-harm risk nearly triple in people with restless legs syndrome

Restless legs syndrome was associated with a nearly tripled risk of suicide and self-harm in a new study led by Penn State researchers.

Using big data, the researchers found that people with restless legs syndrome (RLS) had a 2.7-fold higher risk of suicide or self-harm, even when the researchers controlled for such conditions as depression, insomnia, diabetes and others.

The study was published today (Aug. 23) in the Journal of the American Medical Association (JAMA) Network Open.

Xiang Gao, associate professor of nutritional sciences and director of the Nutritional Epidemiology Lab at Penn State, said that as suicide rates rise in the United States, the findings suggest that physicians should pay special attention to the mental health of patients with RLS.

"Our study suggests that restless legs syndrome isn't just connected to physical conditions, but to mental health, as well," Gao said. "And, with RLS being under-diagnosed and suicide rates rising, this connection is going to be more and more important. Clinicians may want to be careful when they're screening patients both for RLS and suicide risk."

According to the researchers, RLS affects approximately five percent of the U.S. population, causing an uncomfortable feeling in a person's legs that results in an urge to move them, often during the night. While the exact cause of RLS is unknown, previous research has found an association between RLS and iron deficiency, as well as low levels of dopamine in the brain.

Gao said that while RLS has been linked with a higher chance of mortality in the past, scientists do not know why. Previous research has found associations between RLS and a greater risk for hypertension or heart attack, suggesting a possible cardiovascular component. But, some studies have also found links between RLS and depression and thoughts of suicide.

"I've wanted to explore a potential connection between RLS and suicide for more than 10 years, but because both RLS and suicide rates are low from a data perspective, it wasn't possible," Gao said. "But, when I moved here to Penn State, I gained access to a data set with more than 200 million people, so it gave us power to finally test this hypothesis."

The researchers used data from the Truven Health MarketScan national claims from 2006 to 2014, including 24,179 people who had been diagnosed with RLS and 145,194 people who did not have RLS. All participants were free of suicide and self-harm at the baseline of the study.

After analyzing the data, the researchers found that people who had restless legs syndrome had a 2.7-fold higher chance of suicide or self-harm than people who did not. The risk did not decrease even when the researchers controlled for such factors as depression, sleep disorders and common chronic diseases.

"After controlling for these factors, we still didn't see the association decrease, meaning RLS could still be an independent variable contributing to suicide and self-harm," said Muzi Na, Broadhurst Career Development Professor for the Study of Health Promotion and Disease Prevention at Penn State. "We still don't know the exact reason, but our results can help shape future research to learn more about the mechanism."

In the future, the researchers said additional studies will need to be done to replicate and confirm the findings.

Credit: 
Penn State

New approaches to heal injured nerves

image: Dietmar Fischer (on the left) and Marco Leibiger investigate new mechanisms that enable the regeneration of injured nerves.

Image: 
RUB, Kramer

Injuries to nerve fibers in the brain, spinal cord, and optic nerves usually result in functional losses, as the nerve fibers are unable to regenerate. A team from the Department of Cell Physiology at Ruhr-Universität Bochum (RUB) led by Professor Dietmar Fischer has deciphered new mechanisms that enable the regeneration of such fibers. This could open up new treatment approaches for brain, optic nerve, and spinal cord injuries. The researchers report these results in the journal Communications Biology on 23 August 2019.

Intervention into protein has desirable and undesirable effects

The brain, spinal cord, and optic nerves are referred to collectively as the central nervous system. The nerve fibers, called axons, are unable to grow back following injury, meaning that damage is permanent. "It is possible to partially restore the regenerative capacity of nerve cells in the central nervous system by eliminating the inhibiting protein PTEN," explains Dietmar Fischer. "However, a knockout of this kind also triggers many different reactions in the cells at the same time, which often lead to cancer." As a result, the direct inhibition of this protein is not suitable for therapeutic approaches in humans. What's more, the originally postulated mechanism underlying the renewed regenerative capacity following PTEN knockout could not be confirmed in further studies, causing the researchers to seek alternative explanations.

Only the positive effects allowed

While investigating this as-yet unclear mechanism, the Bochum-based researchers were able to show for the first time that PTEN knockout significantly inhibits an enzyme called glycogen synthase kinase 3, GSK3 for short. This enzyme, in turn, blocks another protein called collapsin response mediator protein 2, CRMP2. This means that the PTEN knockout prevents CRMP2 from being inhibited by GSK3. "If we directly prevent this second step, i.e., stop the inhibition of CRMP2, we can also achieve the regeneration-promoting effect in a more specific manner," explains Dietmar Fischer. The activation of CRMP2 itself is not known to have any carcinogenic effect.

Approaches for new medications

"Although we have so far only shown these effects in genetically modified mice and using gene therapy approaches, these findings open up various possibilities for the development of new drug approaches," explains the neuropharmacologist Dietmar Fischer. Further studies in his department are investigating these options.

Credit: 
Ruhr-University Bochum

Caregivers of people with dementia are losing sleep

image: This is Baylor University sleep researcher Chenlu Gao.

Image: 
Matthew Minard/Baylor University

Caregivers of people with dementia lose between 2.5 and 3.5 hours of sleep weekly due to difficulty falling asleep and staying asleep -- a negative for themselves and potentially for those who receive their care, Baylor University researchers say.

But the good news is that simple, low-cost interventions can improve caregivers' sleep and functioning.

The researchers' analysis of 35 studies with data from 3,268 caregivers -- "Sleep Duration and Sleep Quality in Caregivers of Patients with Dementia" -- is published in JAMA Network Open, a publication of the American Medical Association.

Informal caregiving for a person with dementia is akin to adding a part-time but unpaid job to one's life, with family members averaging 21.9 hours of caregiving per week, according to Alzheimer's Association estimates.

"Losing 3.5 hours of sleep per week does not seem much, but caregivers often experience accumulation of sleep loss over years," said lead author Chenlu Gao, a doctoral candidate of psychology and neuroscience in Baylor's College of Arts & Sciences. "Losing 3.5 hours of sleep weekly on top of all the stress, grief and sadness can have a really strong impact on caregivers' cognition and mental and physical health. But improving caregivers' sleep quality through low-cost behavioral interventions can significantly improve their functions and quality of life."

Chronic stress is associated with short sleep and poor-quality sleep. Nighttime awakenings by a patient with dementia also can contribute to disturbed sleep in caregivers, researchers said.

"With that extra bit of sleep loss every night, maybe a caregiver now forgets some medication doses or reacts more emotionally than he or she otherwise would," said co-author Michael Scullin, Ph.D., director of Baylor's Sleep Neuroscience and Cognition Laboratory and assistant professor of psychology and neuroscience at Baylor.

"Caregivers are some of the most inspiring and hardest-working people in the world, but sleep loss eventually accumulates to a level that diminishes one's vigilance and multi-tasking."

Notably better sleep was observed in caregivers after such simple behaviors as getting more morning sunlight, establishing a regular and relaxing bedtime routine and taking part in moderate physical exercise.

In the United States, 16 million family caregivers provide long-term care for dementia patients. Dementia affects some 50 million adults globally, a number expected to increase to 131 million by 2050, according to the World Alzheimer Report. The global annual cost is nearing $1 trillion, largely due to patients' loss of independence because of problems with eating, bathing and grooming, incontinence and memory loss.

For the analysis, researchers searched articles in peer-reviewed journals and books addressing caregivers, sleep, dementia and Alzheimer's disease, published through June 2018. Those studies measured sleep quality and quantity by monitoring brain electrical activity, body movements and self-reporting by caregivers.

The difference in time and quality of sleep was significant when caregivers were compared to non-caregivers in the same age range and to the recommended minimum of seven hours of sleep nightly for adults. Researchers also analyzed sleep-quality changes tied to interventions such as daytime exercise, avoiding coffee or tea past late afternoon, not drinking alcohol at night and getting more sunlight in the morning.
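At its core, a meta-analysis like this pools per-study estimates with inverse-variance weights. The sketch below shows the mechanics on three made-up study estimates; none of the numbers come from the 35 analyzed studies.

```python
# Fixed-effect inverse-variance pooling, the basic meta-analytic calculation.
import numpy as np

effects = np.array([-25.0, -35.0, -20.0])   # hypothetical per-study differences
                                            # in caregiver sleep (minutes/night)
se = np.array([8.0, 10.0, 6.0])             # hypothetical standard errors

weights = 1.0 / se**2                       # more precise studies count for more
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect = {pooled:.1f} ± {1.96 * pooled_se:.1f} minutes/night")
```

Published meta-analyses typically use a random-effects variant that also models between-study heterogeneity.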

Researchers noted that four theories about sleep in dementia caregivers have emerged in studies:

The controversial "sleep need" view that older adults need less sleep than younger ones. If so, caregivers should report less sleep time but without changes in perceived sleep quality.

The "empowerment view," which argues that caregiving is a positive, enriching experience, and so sleep quality should be unchanged or even improved.

The "environmental stressor view," which holds that the caregiving is so stressful and unpredictable that caregivers would be unable to change their routine in such a way to benefit their sleep.

The "coping" view that health problems may be driven by unhealthy responses to stress, such as increased alcohol use and less exercise, while interventions should be associated with better sleep.

Baylor researchers' analysis found that caregivers slept less and perceived their sleep quality to be worse. That means they were not simply adapting to less sleep -- or "needing" less of it. Importantly, caregivers could improve their sleep through behavioral changes, as expected by the "coping" view of caregiving.

"Given the long-term, potentially cumulative health consequences of poor-quality sleep, as well as the rising need for dementia caregivers worldwide, clinicians should consider sleep interventions not only for the patient but also for the spouse, child or friend who will be providing care," Gao said.

Credit: 
Baylor University

Scientists use a new method to track pollution from cooking

image: Stir-frying and deep-frying, features of Chinese cooking, produce high concentrations of organic aerosols

Image: 
Yao He

Cooking organic aerosol (COA) is one of the most important primary sources of pollution in urban environments. There is growing evidence that exposure to cooking oil fumes is linked to lung cancer. Currently, the most effective method to identify and quantify COA is through positive matrix factorization of organic aerosol (OA) mass spectra from aerosol mass spectrometer measurements. However, for the widely used low-mass-resolution aerosol chemical speciation monitor (ACSM), it is often challenging to separate COA from traffic-related organic aerosol (HOA) due to the similarity of their unit mass resolution spectra.

Recently, Prof. Yele Sun and his team at the Institute of Atmospheric Physics, Chinese Academy of Sciences, found that black carbon (BC) is a good tracer for separating HOA and COA. By applying the BC tracer method to several datasets from the megacities of Beijing and Nanjing, they found that COA contributed 15-27% of total organic aerosol in summer, and more than 10% even during the heating period, when coal combustion emissions rise significantly. COA is also an important contributor to OA in urban areas globally, on average accounting for 15-20%. Their studies suggest that air quality improvements in developing countries could benefit substantially from the reduction of cooking emissions.
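The article does not give the exact formulation, but a BC tracer decomposition typically assumes traffic emits organic aerosol and black carbon in a roughly fixed ratio, so traffic OA can be scaled from measured BC and subtracted to isolate cooking OA. The ratio and readings in this sketch are illustrative placeholders, not the team's values.

```python
# Illustrative BC-tracer split of primary organic aerosol into HOA and COA.
HOA_TO_BC = 1.2   # hypothetical traffic HOA/BC mass ratio

def estimate_coa(primary_oa, bc):
    """Estimate cooking OA (ug/m3) from total primary OA and black carbon."""
    hoa = HOA_TO_BC * bc               # traffic-related OA inferred from BC
    return max(primary_oa - hoa, 0.0)  # remainder attributed to cooking

print(estimate_coa(primary_oa=8.0, bc=2.5))   # -> 5.0 ug/m3 of cooking OA
```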

"Considering that ACSM has been increasingly deployed worldwide for routine measurements of aerosol particle composition, our study might have significant implications for better source apportionment of OA and exposure studies in the future." Said Prof. Sun.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Brain's astrocytes play starring role in long-term memory

image: Salk scientists (from left) António Pinto-Duarte and Terrence Sejnowski discover that astrocytes, long considered sideline players in the brain, are required to establish long-lasting memories in mice.

Image: 
Salk Institute

LA JOLLA--(August 22, 2019) Star-shaped cells called astrocytes help the brain establish long-lasting memories, Salk researchers have discovered. The new work adds to a growing body of evidence that astrocytes, long considered to be merely supportive cells in the brain, may have more of a leading role. The study, published in the journal GLIA on July 26, 2019, could inform therapies for disorders in which long-term memory is impaired, such as traumatic brain injury or dementia.

"This is an indication that these cells are doing a lot more than just helping neurons maintain their activity," says Professor Terrence Sejnowski, head of Salk's Computational Neurobiology Laboratory and senior author of the new work. "It suggests that they're actually playing an important role in how information is transmitted and stored in the brain."

The brain's neurons rely on speedy electrical signals to communicate throughout the brain and release neurotransmitters, but astrocytes instead generate calcium signals and release substances known as gliotransmitters, some of them chemically similar to neurotransmitters. The classical view was that astrocytes' function was mostly to provide support to the more active neurons, helping transport nutrients, clean up molecular debris, and hold neurons in place. Only more recently have researchers found that astrocytes might play other, more active roles in the brain through the release of gliotransmitters, but these roles remain largely mysterious.

In 2014, Sejnowski, Salk postdoctoral researcher António Pinto-Duarte and their colleagues showed that disabling the release of gliotransmitters in astrocytes turned down a type of electrical rhythm known as a gamma oscillation, important for cognitive skills. In that study, when the researchers tested the learning and memory skills of mice with disabled astrocytes, they found deficits that were restricted to their capacity to discriminate novelty.

In the new study, Sejnowski's team looked for the first time at the longer-term memory of mice with disrupted astrocytes. They used genetically engineered animals lacking the type 2 inositol 1,4,5-trisphosphate receptor (IP3R2), which astrocytes rely on to release calcium for communication.

The researchers tested the mice with three different types of learning and memory challenges, including interacting with a novel object and finding the exit in a maze. In each case, mice lacking IP3R2 showed the same ability to learn as normal mice. Moreover, when tested in the 24-48 hours after each initial learning process, the mice with disrupted astrocytes could still retain the information--finding their way through the maze, for example. The results were in line with what had been seen in prior studies.

However, when the group waited another 2 to 4 weeks and retested the trained mice, they saw large differences; the mice missing the receptor performed much worse, making more than twice as many errors when completing the maze.

"After a few-weeks delay, normal mice actually performed better than they did right after training, because their brain had gone through a process of memory consolidation," explains António Pinto-Duarte, who is the lead author of the new paper. "The mice lacking the IP3R2 receptor performed much worse."

The result marks the first time that defects in astrocytes have been linked to deficits in memory consolidation or remote memory.

The process of memory consolidation in the brain is known to involve several mechanisms affecting neurons. One of those mechanisms is thought to rely on an optimal adjustment of the strength of communication between neurons through long-term potentiation, by which that strength increases, and long-term depression, by which some of these connections weaken. Sejnowski and Pinto-Duarte showed that although the mice without IP3R2, and thus with reduced astrocyte activity, had no problems with the former, they exhibited significant deficits in the latter, suggesting that astrocytes may play a role specifically in the long-term depression of the connections between neurons.

"The mechanism of long-term depression of neurons is not as well studied or understood," says Sejnowski. "And this tells us we should be looking at how astrocytes are connected to the weakening of these neural connections."

The researchers are already planning future studies to better understand the pathways by which astrocytes affect the long-term depression of neuronal communication and memory in general.

"The long-term payout here is that if we better understand these pathways, we may be able to develop ways to manipulate memory consolidation with drugs," says Sejnowski.

Credit: 
Salk Institute

Biophysicists discovered how 'Australian' mutation leads to Alzheimer's disease

image: A hereditary mutation offers clues to the molecular mechanisms that may lead both to early onset Alzheimer's disease and to its age-related form.

Image: 
Elena Khavina and @tsarcyanide, MIPT Press Office

A team of scientists from the Moscow Institute of Physics and Technology (MIPT) and Shemyakin-Ovchinnikov Institute of Bioorganic Chemistry (IBCh RAS) studied one hereditary genetic mutation to discover general molecular mechanisms that may lead both to early onset of Alzheimer's disease and to the form of the disease caused by age-related changes in human body. Understanding these mechanisms is necessary for developing new targeted treatments for this neurodegenerative disease that is becoming ever more widespread across the developed countries' aging populations. The study findings were published in ACS Chemical Biology.

Dementia is a syndrome involving deterioration in memory, thinking, behavior, and the ability to perform everyday activities. Alzheimer's disease is the most common form of dementia and may contribute to 60-70% of cases, according to a WHO fact sheet. This makes dementia a public health priority, with substantial funds allocated to fight it by both governments and pharmaceutical companies. Prominent politicians such as Margaret Thatcher and Ronald Reagan were afflicted with Alzheimer's disease in their later years. Alzheimer's disease is most common in people over the age of 65, but people aged 40 or even younger are sometimes diagnosed with it as well. Approximately 10-15% of early onset cases are caused by inherited predisposition. Integrated studies of hereditary, or "familial," mutations may give researchers a clue about key mechanisms of Alzheimer's disease pathogenesis, in particular its initial steps.

Alzheimer's disease is associated with accumulation of pathogenic amyloid-β peptides into amyloid plaques within brain tissue. These peptides are short (about 40 amino acids) fragments of the amyloid precursor protein (APP), which spans the membrane of brain cells. APP is cleaved by various enzymes as part of neuron activity. The sequential cleavage of the "large" APP protein (whose biological function is still not fully understood) by β- and γ-secretase enzymes produces amyloid-β peptides, which in small amounts are probably necessary for sustaining brain functions. However, γ-secretase cuts the APP chain (within neuron membranes) into consecutive fragments of slightly varying length, thus producing relatively "pathogenic" and "non-pathogenic" forms of amyloid-β peptides. The main pathogenic form consists of 42 amino acid residues (Aβ42), while the less pathogenic form consists of 40 residues (Aβ40). The Aβ42/Aβ40 ratio in healthy humans is low, standing at approximately one to nine. A higher Aβ42/Aβ40 ratio indicates an excessive production of Aβ42, which leads to the neurodegenerative disorder. Researchers are currently testing a hypothesis that amyloid-β peptides are active participants in the innate immunity of the human nervous system and that their increased production may be caused by various inflammations and brain injuries. At the same time, many familial mutations associated with early onset of Alzheimer's disease have been found in the transmembrane (TM) domain of APP.

This research aimed to study the "Australian" familial mutation (L723P) within the APP TM domain, which causes early onset of Alzheimer's disease. The scientists compared the structural-dynamic behavior of the mutant APP TM domain against the wild type with the aid of protein engineering, high-resolution nuclear magnetic resonance (NMR), and computer simulations. NMR spectroscopy was used to compare the wild-type APP peptide with its mutant on such parameters as the "helicity" of the polypeptide chain, its bending and flexibility, and its accessibility to lipids and water molecules. The researchers discovered that the L723P mutation causes local melting of the last turn of the APP TM domain helix and also straightens and stabilizes the domain in the center of the lipid membrane. Apart from that, the mutation increases the accessibility of the domain to water molecules, which shifts the "frame" of its cleavage by γ-secretase, thus switching between alternative ("pathogenic" and "non-pathogenic") cleavage cascades. This leads to a growing Aβ42/Aβ40 ratio and overall concentration of amyloid-β within brain tissue.

Eduard Bocharov, a senior researcher at the Laboratory for Aging and Age-Related Neurodegenerative Diseases at MIPT and the Laboratory of Biomolecular NMR Spectroscopy at IBCh RAS, commented:

"It goes without saying that this study touches upon just a few of causes for the multifactorial disorder that is Alzheimer's disease. The molecular mechanisms of its pathogenesis are being researched in numerous laboratories all over the world. In particular, a special attention in paid to studying the "key player" -- the amyloid precursor protein, as well as its sequential cleavage by secretases within neuron membranes. We described a cascade of events happening within and around the cell membrane as APP is cut by γ-secretase enzyme complex. We have thus used a single "Australian" mutation to reveal molecular mechanisms behind the pathogenesis that may lead both to early onset of Alzheimer's and the age-related form of the disease."

The study findings suggest a straightforward mechanism of Alzheimer's disease pathogenesis associated with the impact of "Australian" mutation on the structural-dynamic behavior of APP TM domain. This is what leads to the pathological cleavage of APP by secretases and the increased accumulation of pathogenic amyloid-β around neurons. Worth noting is the fact that age-related onset of Alzheimer's disease can be explained by similar mechanisms, where the effect of mutation is replaced by the impact of local environmental factors, such as oxidative stress or membrane lipid composition including cholesterol saturation. A detailed understanding of the molecular mechanisms regulating generation of amyloidogenic peptides is essential for development of novel treatment strategies targeted at the primary stage of the Alzheimer's disease pathogenesis.

Credit: 
Moscow Institute of Physics and Technology