
Australian researchers reveal new insights into retina's genetic code

Image: A graphic representation of the retinal atlas. (Credit: Centre for Eye Research Australia)

Australian scientists have led the development of the world's most detailed gene map of the human retina, providing new insights which will help future research to prevent and treat blindness.

The retina is the latest part of the human body and the first part of the eye to be mapped as part of the Human Cell Atlas Project - a global project to create reference maps of all human cells to better understand, diagnose and treat disease.

It is also the first time an Australian group has contributed to the project.

The study, led by Dr Raymond Wong from the Centre for Eye Research Australia and the University of Melbourne, Dr Samuel Lukowski from the Institute for Molecular Bioscience at the University of Queensland and Associate Professor Joseph Powell from the Garvan Institute of Medical Research, is published in the European Molecular Biology Organization (EMBO) Journal.

Dr Wong says the study provides unprecedented insights into the genetic signals of cells in the retina - the thin layer of cells at the back of the eye that sense light and send messages to the brain via the optic nerve to enable us to see.

The group examined the complex genetic sequences behind more than 20,000 individual cells to develop a profile of all major cell types in the retina and the genes they 'express' to function normally.
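The kind of per-cell-type gene profile described above can be illustrated with a toy computation: given an expression matrix (cells by genes) and a cluster label for each cell, the profile of a cell type is simply the average expression of each gene within that cluster. This is only a minimal sketch; the matrix values below are invented, and the marker genes (RHO for rods, OPN1SW for cones, RBPMS for ganglion cells) merely stand in for the thousands of genes the actual study measured across more than 20,000 cells.

```python
import numpy as np

# Toy expression matrix: rows = cells, columns = genes. All numbers are
# invented for illustration; real single-cell RNA-seq data would span
# ~20,000 cells and thousands of genes.
genes = ["RHO", "OPN1SW", "RBPMS"]  # example marker genes
expression = np.array([
    [9.1, 0.2, 0.1],
    [8.7, 0.1, 0.3],
    [0.2, 7.9, 0.2],
    [0.1, 0.3, 6.5],
    [0.3, 8.2, 0.1],
])
# One cluster label per cell, e.g. produced by a clustering algorithm:
# 0 = rod photoreceptor, 1 = cone photoreceptor, 2 = retinal ganglion cell
labels = np.array([0, 0, 1, 2, 1])

def cell_type_profiles(expr, labels):
    """Mean expression of every gene within each cell cluster."""
    return {int(c): expr[labels == c].mean(axis=0) for c in np.unique(labels)}

profiles = cell_type_profiles(expression, labels)
for cluster, profile in sorted(profiles.items()):
    # The most strongly expressed gene characterizes each cell type.
    print(cluster, genes[int(np.argmax(profile))])
```

On this toy data, each cluster's top gene recovers the expected marker: rods express RHO most strongly, cones OPN1SW, and ganglion cells RBPMS.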

Cells mapped include photoreceptors which sense light and allow people to see, the retinal ganglion cells which transmit messages to the brain along the optic nerve and other cells which support the function and stability of the retina.

"By creating a genetic map of the human retina, we can understand the factors that enable cells to keep functioning and contribute to healthy vision,'' says Dr Wong.

"It can also help us understand the genetic signals that cause a cell to stop functioning, leading to vision loss and blindness.''

Associate Professor Powell says the retinal cell atlas will benefit researchers investigating inherited retinal diseases, which occur when genetic 'mistakes' cause retinal cells to stop functioning, leading to vision loss and blindness.

"More than 200 genes are known to be associated with retinal diseases, and having a detailed gene profile of individual retinal cell types will help us study how those genes affect different kinds of cells.

"This understanding is the first step to better identifying what causes disease and ultimately developing treatments.''

Dr Wong says the atlas will also help scientists conducting research in the emerging area of cell therapy - which could replace faulty retinal cells with new ones developed from induced pluripotent stem cells in the lab.

"The retinal cell atlas will give scientists a clear benchmark to assess the quality of the cells derived from stem cells to determine whether they have the correct genetic code which will enable them to function.''

Dr Lukowski says the research offers 'extraordinary potential'.

"We can now build upon this atlas of healthy cells with those from other retinal diseases and across different stages of human development, which will provide the community with powerful tools for disease prediction," he says.

According to Associate Professor Powell, cutting-edge cellular genomics technology will transform our understanding of health and disease.

"Cellular genomics is allowing us to see the human body at a higher resolution than ever before. The insights that researchers worldwide can gain from this atlas present an entirely new way to approach treatment and prevent eye disease."

Credit: University of Melbourne

Spikes in handgun purchases after high-profile events linked to more firearm injuries

Image: Heat map of excess firearm purchases in California following the 2012 presidential election. (Credit: UC Davis Violence Prevention Research Program)

Spikes in handgun purchases in 2012 after Sandy Hook and the re-election of President Obama have been linked to a 4% increase in firearm injury in California, a UC Davis Violence Prevention Research Program (VPRP) study has found.

The UC Davis School of Medicine study, to be published August 25 in Injury Epidemiology, assessed the sharp rise in handgun purchasing across 499 California cities and estimated whether the additional handguns increased fatal and non-fatal injuries in these communities. It is the first study to use a direct measure of handgun purchasing to link firearm purchases with subsequent firearm-related harm and to assess the impact on firearm injury.

"We estimate there were 36,142 more-than-expected handgun acquisitions in California from the election through the 6-week period following the Sandy Hook school shooting," said Hannah Laqueur, co-author of the study and an assistant professor of emergency medicine at UC Davis. "This represents an increase of more than 55 percent over expected volume during this 11-week period."
The researchers found that cities with greater increases in the rate of handgun purchasing were more likely to see an increase in the rate of firearm injury.

"We estimated a 4% increase in injuries in the year following the two events over the entire state," said Rose Kagawa, co-author of the study and an assistant professor of emergency medicine at UC Davis. "This is an important increase in the total number of people injured: approximately 290 additional firearm injuries in the state."

Though the firearm purchasing spike statewide was substantial, it accounted for less than 10% of annual handgun acquisitions. It also is only a tiny fraction of the more than 30 million estimated privately owned firearms in California, the authors said.

"But even marginal increases in handgun prevalence may translate to more injuries," Kagawa said.

Links between firearm ownership and firearm harm

Firearm ownership is a known risk factor for firearm harm. The prevalence of firearm ownership has been associated with higher firearm homicide and suicide rates.

For the study, the research team assessed firearms purchases in California cities with a population of 10,000 or more and used a forecasting model to predict expected handgun purchases after the 2012 election. They estimated the spike in handgun purchases as the difference between actual handgun acquisitions, as recorded in California's Dealer Record of Sales, and expected acquisition based on the model. They tracked firearm fatalities using death records from the California Department of Public Health Vital Records and non-fatal injuries using hospital and emergency room visits gathered by the Office of Statewide Health Planning and Development. The data were tallied at the zip code level and attributed to corresponding cities.
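The excess-purchase calculation described above can be sketched in a few lines: a forecast model supplies the expected weekly purchases, and the "spike" is the sum of actual minus expected over the window of interest. The sketch below uses a deliberately naive seasonal baseline (the same week's average over prior years) rather than the study's actual forecasting model, and every figure is invented for illustration.

```python
# Sketch of an excess-purchase estimate: expected weekly purchases come
# from a naive seasonal baseline (same-week average over prior years);
# the spike is the sum of (actual - expected). All numbers are invented.

def expected_from_history(history_by_year, week):
    """Naive seasonal forecast: average of the same week in prior years."""
    return sum(year[week] for year in history_by_year) / len(history_by_year)

# Weekly handgun purchases for three prior years (11-week window), invented:
prior_years = [
    [500, 510, 495, 505, 500, 498, 502, 507, 499, 501, 503],
    [520, 515, 530, 510, 525, 518, 522, 516, 519, 521, 517],
    [540, 545, 538, 542, 539, 541, 544, 537, 540, 543, 536],
]
# Observed purchases in the 11 weeks after the triggering events, invented:
observed = [700, 820, 900, 880, 760, 850, 910, 870, 800, 790, 780]

excess = sum(
    obs - expected_from_history(prior_years, w)
    for w, obs in enumerate(observed)
)
total_expected = sum(
    expected_from_history(prior_years, w) for w in range(len(observed))
)
pct_over_expected = excess / total_expected

print(round(excess), round(100 * pct_over_expected, 1))
```

With these invented numbers the excess comes to 3,334 purchases, about 58% over the expected volume; the study's real figures (36,142 excess acquisitions, more than 55% over expected) were derived the same way, as actual minus model-expected purchases.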

"With the increasing rates of firearm purchases in the U.S. over the last decades and 2017 marking a 20-year high in firearm-related deaths, it is important to gain a deeper understanding of the relationship between firearm acquisition and harm to develop effective prevention strategies," Laqueur said.

Credit: University of California - Davis Health

Skin patch could painlessly deliver vaccines, cancer medications in one minute

Image: A new microneedle patch delivers medication to melanomas within one minute (ruler is in centimeters). (Credit: Celestine Hong and Yanpu He)

SAN DIEGO, Aug. 25, 2019 -- Melanoma is a deadly form of skin cancer that has been increasing in the U.S. for the past 30 years. Nearly 100,000 new cases of melanoma are diagnosed every year, and 20 Americans die every day from it, according to the American Academy of Dermatology. Now, researchers have developed a fast-acting skin patch that efficiently delivers medication to attack melanoma cells. The device, tested in mice and human skin samples, is an advance toward developing a vaccine to treat melanoma and has widespread applications for other vaccines.

The researchers will present their findings today at the American Chemical Society (ACS) Fall 2019 National Meeting and Exposition. ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features more than 9,500 presentations on a wide range of science topics.

"Our patch has a unique chemical coating and mode of action that allows it to be applied and removed from the skin in just a minute while still delivering a therapeutic dose of drugs," says Yanpu He, a graduate student who helped develop the device. "Our patches elicit a robust antibody response in living mice and show promise in eliciting a strong immune response in human skin."

Topical ointments can impart medications to the skin, but they can only penetrate a small distance through it. While syringes are an effective drug delivery mode, they can be painful. Syringes can also be inconvenient for patients, leading to noncompliance.

Microneedle patches, prepared with a layer-by-layer (LbL) coating method, are one easy, pain-free way to administer treatment. With the LbL process, researchers coat a surface with molecules of alternating positive and negative charge. For a robust drug film to form on the surface of the patch, every adjacent layer must be strongly attracted to each other and also to the microneedle. "But this attraction makes the entire film very 'sticky,'" He notes. "Past methods, which have retained this 'sticky' nature, can take up to 90 minutes for a sufficient amount of drug to leave the patch and enter the skin."

Paula T. Hammond, Ph.D., along with her graduate students He, Celestine Hong and other colleagues at the Massachusetts Institute of Technology (MIT), devised a way around this problem. They designed a new pH-responsive polymer with two parts. "The first part contains amine groups that are positively charged at the pH at which we make the microneedles, but that become neutral at the pH of skin," He says. "The second part contains carboxylic acid groups with no charge when the microneedles are made, but which become negatively charged when the patch is applied to the skin, so there is an overall change in charge from positive to negative." While sticky negative-positive-negative layers are still required for LbL film construction, the team's patch quickly switches to repelling negative-negative-negative layers when placed on skin. After the microneedles pierce the skin and implant the LbL drug film beneath the skin, the drug leaves the patch quickly.

Using chicken ovalbumin as a model antigen, the team vaccinated mice with their patches, and compared the results with intramuscular and subcutaneous injections. The microneedle treatment produced nine times the antibody level compared to intramuscular injections (e.g., used for flu shots) and 160 times the antibody level compared to subcutaneous injections (e.g., used for measles vaccines). They also saw efficient immune activation in surgical samples of human skin.

"Our patch technology could be used to deliver vaccines to combat different infectious diseases," Hammond says. "But we are excited by the possibility that the patch is another tool in the oncologists' arsenal against cancer, specifically melanoma."

To make a melanoma vaccine, the researchers developed an antigen that includes a marker frequently overexpressed by melanoma cells, as well as an adjuvant, which creates a generalized danger signal for the immune system and boosts its response. Then, they tested different LbL microneedle film arrangements of antigen and adjuvant in immune cells derived from mice. From these experiments, the researchers identified the optimal LbL microneedle structure that appears to activate immune cells directly accessible in the skin. In living mice, these cells could, in turn, migrate to the lymphatic system and recruit other immune cells to attack the melanoma tumor. The researchers now plan to test the patches on melanoma tumors in mice.

"We are using low-cost chemistry and a simple fabrication scheme to transform vaccination," Hammond says. "Ultimately, we want to get a device approved and on the market."

Credit: American Chemical Society

New study: Migrating mule deer don't need directions

Image: Mule deer move across a sagebrush-covered basin in western Wyoming. New University of Wyoming research shows that deer navigate in spring and fall mostly by using their knowledge of past migration routes and seasonal ranges. (Credit: Joe Riis)

How do big-game animals know where to migrate across hundreds of miles of vast Wyoming landscapes year after year?

Among scientists, there are two camps of thought. First is that animals use local cues within their vicinity to determine where to migrate. Animals might move up to areas with greener forage -- often termed green-wave surfing -- or move down from mountains with deeper snow. The second idea is that animals develop memory of the landscape where they live and then use that information to guide their movements.

Recent research from the University of Wyoming has found that memory explains much of deer behavior during migration: Mule deer navigate in spring and fall mostly by using their knowledge of past migration routes and seasonal ranges.

The study found that the location of past years' migratory route and summer range had 2-28 times more influence on a deer's choice of a migration path than environmental factors such as tracking spring green-up, autumn snow depth or topography.

"These animals appear to have a cognitive map of their migration routes and seasonal ranges, which helps them navigate tens to hundreds of miles between seasonal ranges," says the lead author of the paper, Jerod Merkle, assistant professor and Knobloch Professor in Migration Ecology and Conservation in the Department of Zoology and Physiology at UW.

The findings recently were published in Ecology Letters, a leading journal within the field of ecology. Co-authors of the paper included Hall Sawyer, with Western EcoSystems Technology Inc.; Kevin Monteith and Samantha Dwinnell, with UW's Haub School of Environment and Natural Resources; Matthew Kauffman, with the U.S. Geological Survey Wyoming Cooperative Fish and Wildlife Research Unit at UW; and Gary Fralick, with the Wyoming Game and Fish Department.

Scientists had long presumed that migratory behavior was dictated by availability of food resources and other external factors. Where you find resources, you will find species that exploit them, the theory went.

The UW team found it is not that simple. Without the intrinsic factor of landscape memory to guide deer between seasonal ranges, the long-distance corridors of western Wyoming's Green River Basin, for example -- exceeding 300 miles round-trip in some cases -- would not exist in their present form.

"It appears that green-wave surfing helps them determine when to move within a kind of 'map' in their brain," Merkle says. "The timing of spring green-up determines when an animal should migrate, but spatial memory determines where to migrate."

The finding has important conservation implications. Because landscape memory so strongly underlies mule deer migratory behavior, the loss of a migratory population also will destroy the herd's collective mental map of how to move within a landscape, making it very difficult to restore lost migration routes. Patches of potential habitat likely will go unused.

"This is yet another study that makes clear that animals must learn and remember how to make these incredible journeys," says Kauffman, who leads the Wyoming Cooperative Fish and Wildlife Research Unit, where the research was conducted. "This is critical for conservation, because it tells us that, to conserve a migration corridor, we need to conserve the specific animals who have the knowledge necessary to make the journey."

The study bolsters the findings of a 2018 paper in the journal Science by a UW-led team that found translocated bighorn sheep and moose with no knowledge of the landscape can take anywhere from several decades to a century to learn how to migrate to vacant habitats.

Similarly, strategies such as off-site restoration or mitigation may be unsuccessful if restored habitats are not "discovered" and integrated into the memory of individuals.

The study further makes a case that biologists will not be able to successfully predict migration corridors -- or optimally manage populations -- based on environmental information or range quality alone. Managers will find it difficult to evaluate potential conservation actions without directly gathering movement data, crucial information that reveals the migration knowledge that animals carry around in their heads.

Moreover, the research shows that migrants can obtain greater forage benefits during spring migration using memory of a vast landscape, compared to migrants that rely simply on foraging cues in their local area.

This suggests that the migratory routes we see today are optimized across generations for green-wave surfing in large landscapes. These learned migration corridors are not readily discoverable by animals if they cannot access the memories established by past generations.

Credit: University of Wyoming

Evolution designed by parasites

While analyzing interactions between parasites and hosts, a substantial amount of research has been devoted to studying the methods parasitic organisms use to control host behavior. In "Invisible Designers: Brain Evolution Through the Lens of Parasite Manipulation," published in the September 2019 issue of The Quarterly Review of Biology, Marco Del Giudice explores an overlooked aspect of this relationship by systematically discussing the ways in which parasitic behavior manipulation may encourage the evolution of mechanisms in the host's nervous and endocrine systems. Examining this evolutionary history, Del Giudice investigates the hypothetical methods hosts may have adopted to counteract attempts at behavioral hijacking.

Parasites, such as viruses, insects, helminths, and bacteria, seek to manipulate host behavior for numerous reasons. Parasitic organisms may induce behavioral changes in order to increase their chances of transmission from one host to another. In a similar vein, parasites may disrupt a host's normal neural functioning to prompt the organism to travel to an environment that is more hospitable for the parasite or more conducive for reproduction. Host bodies are sometimes co-opted and utilized as safe environments for the development of the parasite's offspring.

The means by which parasites attempt to alter host behavior also vary. Parasites may use an immunological approach by disrupting responses in an organism's immune system. A more direct option may be to employ neuropharmacological manipulation by secreting substances that interfere with the host's neurotransmitters. Parasites may also take the genomic/proteomic route by changing gene expression.

Expanding beyond the motivations and tactics of parasites, Del Giudice posits that attacks from these biochemical mechanisms place significant pressure on the nervous system to adapt and develop countermeasures. Drawing upon previous literature and real-world examples, the article proposes four categories of potential host countermeasures. Analyzing this taxonomy, Del Giudice argues that when encountering manipulation, hosts may prevent parasites from bypassing the brain's protective barrier, force parasites to work harder and release greater amounts of neuroactive substances, make signals more complex, or strengthen the brain's ability to endure disturbances.

Elaborating further on these countermeasures, the author considers the potential evolutionary constraints and the associated "robustness-fragility tradeoffs." Although countermeasures may increase complexity and deter parasites, new adaptations may simultaneously weaken another section of the system and provide alternative targets for parasites.

Taking into account the proposed effectiveness of the mechanisms discussed, the article suggests studying host-parasite interactions could aid in furthering neuroscience and psychopharmacology research. By studying adaptations to parasitic attacks, neuroscientists could gain additional perspective into how continuous evolutionary battles produce inefficiency or how once-critical mechanisms over time serve unrelated functions. In psychopharmacology, the biochemical methods parasites use mimic those of pharmacological drugs. Researchers could analyze the ways brain structures reject pharmacological interventions by parasites and, in turn, use that information to make psychoactive drugs more effective for patients.

"The unrelenting pressure exerted by parasites must have shaped the evolution of nervous and endocrine systems at all levels, with important consequences even for animals that are not (or no longer) manipulation targets. If this is true, many aspects of neurobiology are destined to remain mysterious or poorly understood until parasites--the brain's invisible designers--are finally included in the picture," Del Giudice writes.

Credit: University of Chicago Press Journals

Think declining mental sharpness 'just comes with age'? Think again, say experts

Image: (C) 2019, American Geriatrics Society (AGS)

Declining mental sharpness "just comes with age," right? Not so fast, say geriatrics researchers and clinicians gathered at a prestigious 2018 conference hosted by the American Geriatrics Society (AGS) with support from the National Institute on Aging (NIA). In a report published in the Journal of the American Geriatrics Society (JAGS), attendees of a conference for the NIA's Grants for Early Medical/Surgical Specialists Transition into Aging Research (GEMSSTAR) program describe how increasing evidence shows age-related diseases--rather than age itself--may be the key cause of cognitive decline. And while old age remains a primary risk factor for cognitive impairment, researchers believe future research--and sustained funding--could illuminate more complex, nuanced connections between cognitive health, overall health, and how we approach age.

"We've long been taught that cognitive issues are 'just part of aging,'" explains Christopher R. Carpenter, MD, MSc, who helped coordinate the conference. "But contemporary medical research shows how bodily changes that lead to diseases like dementia appear long before the symptoms we associate with 'old age.' This begs the question: Is it really age that causes cognitive decline, or is it ultimately the diseases we now associate with age--in large part because we see them with increasing frequency now that we live longer? That's what we wanted to tackle coming together for this meeting."

Hosted by the AGS and NIA in 2018 as the third conference in a three-part series for GEMSSTAR scholars, the NIA "U13" conference brought together NIA experts and more than 100 scholars, researchers, and leaders representing 13 medical specialties to explore experiences with cognitive impairment across health care. Conference findings, published in JAGS (DOI: 10.1111/jgs.16093), detail early thinking on the two-way relationship between cognitive health and the health of other organ systems, as well as opportunities for moving science and practice forward.

According to attendees, several themes emerged:

Researchers and clinicians from across health care noted the critical relationship between two of their top concerns: dementia and delirium (the medical term for abrupt, rapid-onset confusion or an altered mental state, which affects millions of older adults annually). Research now suggests delirium and dementia are mutually reinforcing risk factors, with each raising the risk of the other. Thus, preventing delirium may offer an unprecedented opportunity to prevent or lessen future cognitive decline.

Still, as one of the conference attendees noted, "[T]he brain is not an island." Because the conference focused on the impact of cognitive impairment across specialties, a critical focal point for scholars was the complex, bi-directional relationship between cognition and the rest of the body. Cognitive impairments can serve as indicators or influencers in the course of other diseases and conditions. For example, cognitive impairment is perhaps "the strongest independent predictor" of hospital readmission and mortality for older people living with heart failure.

As the field progresses, however, a major barrier remains: a dearth of research owing to the exclusion of potential study participants who are cognitively impaired. Though obtaining informed consent (the term used to describe a person's willingness to participate in a study after confirming they understand all the possible risks and benefits) remains challenging, researchers pointed to data showing that willingness to participate remains high. That willingness, coupled with suggestions for tailoring consent safeguards to the types of studies and potential participants, holds promise for protecting against exploitation while continuing to move cutting-edge care principles forward.

As the GEMSSTAR conference attendees concluded, "The aging of the U.S. population and the growing burden of dementia make this an area of critical research focus...[U]nderstanding and addressing cognitive health and its relationship with the health of other organ systems will require multidisciplinary team science...[and new] study designs..."

Credit: American Geriatrics Society

Suicide and self-harm risk nearly triple in people with restless legs syndrome

Restless legs syndrome was associated with a nearly tripled risk of suicide and self-harm in a new study led by Penn State researchers.

Using big data, the researchers found that people with restless legs syndrome (RLS) had a 2.7-fold higher risk of suicide or self-harm, even when the researchers controlled for conditions such as depression, insomnia and diabetes.

The study was published today (Aug. 23) in JAMA Network Open.

Xiang Gao, associate professor of nutritional sciences and director of the Nutritional Epidemiology Lab at Penn State, said that as suicide rates rise in the United States, the findings suggest that physicians should pay special attention to the mental health of patients with RLS.

"Our study suggests that restless legs syndrome isn't just connected to physical conditions, but to mental health, as well," Gao said. "And, with RLS being under-diagnosed and suicide rates rising, this connection is going to be more and more important. Clinicians may want to be careful when they're screening patients both for RLS and suicide risk."

According to the researchers, RLS affects approximately five percent of the U.S. population, causing an uncomfortable feeling in a person's legs resulting in the urge to move them, often during the night. While the exact cause of RLS is unknown, previous research has found an association between RLS and iron deficiency, as well as low levels of dopamine in the brain.

Gao said that while RLS has been linked with a higher chance of mortality in the past, scientists do not know why. Previous research has found associations between RLS and a greater risk for hypertension or heart attack, suggesting a possible cardiovascular component. But, some studies have also found links between RLS and depression and thoughts of suicide.

"I've wanted to explore a potential connection between RLS and suicide for more than 10 years, but because both RLS and suicide rates are low from a data perspective, it wasn't possible," Gao said. "But, when I moved here to Penn State, I gained access to a data set with more than 200 million people, so it gave us power to finally test this hypothesis."

The researchers used data from the Truven Health MarketScan national claims database from 2006 to 2014, covering 24,179 people who had been diagnosed with RLS and 145,194 people who did not have RLS. No participants had a record of suicide attempts or self-harm at baseline.

After analyzing the data, the researchers found that people with restless legs syndrome had 2.7 times the risk of suicide or self-harm of people who did not. The risk did not decrease even when the researchers controlled for factors such as depression, sleep disorders and common chronic diseases.
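The unadjusted version of a comparison like this is a simple risk ratio: the event rate in the exposed group divided by the rate in the unexposed group. The sketch below uses invented event counts paired with the study's reported group sizes; the published 2.7-fold estimate came from models adjusted for depression, sleep disorders and other conditions, not from a raw two-by-two table like this one.

```python
# Unadjusted risk-ratio sketch for a cohort comparison. Event counts are
# invented for illustration; only the group sizes match the study.

def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk in the exposed group divided by risk in the unexposed group."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical: 130 events among 24,179 people with RLS,
# 290 events among 145,194 people without RLS.
rr = risk_ratio(130, 24_179, 290, 145_194)
print(round(rr, 2))
```

Note that a risk ratio of 2.7 means the exposed group's risk is 270% of the baseline risk, i.e. a 170% increase, which is why "2.7-fold higher risk" and "270 percent higher chance" are easy to conflate in coverage of studies like this one.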

"After controlling for these factors, we still didn't see the association decrease, meaning RLS could still be an independent variable contributing to suicide and self-harm," said Muzi Na, Broadhurst Career Development Professor for the Study of Health Promotion and Disease Prevention at Penn State. "We still don't know the exact reason, but our results can help shape future research to learn more about the mechanism."

In the future, the researchers said additional studies will need to be done to replicate and confirm the findings.

Credit: Penn State

New approaches to heal injured nerves

Image: Dietmar Fischer (left) and Marco Leibiger investigate new mechanisms that enable the regeneration of injured nerves. (Credit: RUB, Kramer)

Injuries to nerve fibers in the brain, spinal cord, and optic nerves usually result in functional losses, as the nerve fibers are unable to regenerate. A team from the Department of Cell Physiology at Ruhr-Universität Bochum (RUB), led by Professor Dietmar Fischer, has deciphered new mechanisms that enable the regeneration of such fibers. This could open up new treatment approaches for injuries of the brain, optic nerve, and spinal cord. The researchers report their results in the journal Communications Biology on 23 August 2019.

Intervention into protein has desirable and undesirable effects

The brain, spinal cord, and optic nerves are referred to collectively as the central nervous system. The nerve fibers, called axons, are unable to grow back following injury, meaning that damage is permanent. "It is possible to partially restore the regenerative capacity of nerve cells in the central nervous system by eliminating the inhibiting protein PTEN," explains Dietmar Fischer. "However, a knockout of this kind also triggers many different reactions in the cells at the same time, which often lead to cancer." As a result, the direct inhibition of this protein is not suitable for therapeutic approaches in humans. What's more, the originally postulated mechanism underlying the renewed regenerative capacity following PTEN knockout could not be confirmed in further studies, causing the researchers to seek alternative explanations.

Only the positive effects allowed

While investigating this as-yet unclear mechanism, the Bochum-based researchers were able to show for the first time that PTEN knockout significantly inhibits an enzyme called glycogen synthase kinase 3, GSK3 for short. This enzyme, in turn, blocks another protein called collapsin response mediator protein 2, CRMP2. This means that the PTEN knockout prevents CRMP2 from being inhibited by GSK3. "If we directly prevent this second step, i.e., stop the inhibition of CRMP2, we can also achieve the regeneration-promoting effect in a more specific manner," explains Dietmar Fischer. The activation of CRMP2 itself is not known to have any carcinogenic effect.

Approaches for new medications

"Although we have so far only shown these effects in genetically modified mice and using gene therapy approaches, these findings open up various possibilities for the development of new drug approaches," explains the neuropharmacologist Dietmar Fischer. Further studies in his department are investigating these options.

Credit: 
Ruhr-University Bochum

Caregivers of people with dementia are losing sleep

image: This is Baylor University sleep researcher Chenlu Gao.

Image: 
Matthew Minard/Baylor University

Caregivers of people with dementia lose between 2.5 and 3.5 hours of sleep weekly due to difficulty falling asleep and staying asleep -- a negative for themselves and potentially for those who receive their care, Baylor University researchers say.

But the good news is that simple, low-cost interventions can improve caregivers' sleep and functioning.

The researchers' analysis of 35 studies with data from 3,268 caregivers -- "Sleep Duration and Sleep Quality in Caregivers of Patients with Dementia" -- is published in JAMA Network Open, a publication of the American Medical Association.

Informal caregiving for a person with dementia is akin to adding a part-time but unpaid job to one's life, with family members averaging 21.9 hours of caregiving a week, according to Alzheimer's Association estimates.

"Losing 3.5 hours of sleep per week does not seem much, but caregivers often experience accumulation of sleep loss over years," said lead author Chenlu Gao, a doctoral candidate of psychology and neuroscience in Baylor's College of Arts & Sciences. "Losing 3.5 hours of sleep weekly on top of all the stress, grief and sadness can have a really strong impact on caregivers' cognition and mental and physical health. But improving caregivers' sleep quality through low-cost behavioral interventions can significantly improve their functions and quality of life."

Chronic stress is associated with short sleep and poor-quality sleep. Nighttime awakenings by a patient with dementia also can contribute to disturbed sleep in caregivers, researchers said.

"With that extra bit of sleep loss every night, maybe a caregiver now forgets some medication doses or reacts more emotionally than he or she otherwise would," said co-author Michael Scullin, Ph.D., director of Baylor's Sleep Neuroscience and Cognition Laboratory and assistant professor of psychology and neuroscience at Baylor.

"Caregivers are some of the most inspiring and hardest-working people in the world, but sleep loss eventually accumulates to a level that diminishes one's vigilance and multi-tasking."

Notably better sleep was observed in caregivers after such simple behaviors as getting more morning sunlight, establishing a regular and relaxing bedtime routine and taking part in moderate physical exercise.

In the United States, 16 million family caregivers provide long-term care for dementia patients. Dementia affects some 50 million adults globally, a number expected to increase to 131 million by 2050, according to the World Alzheimer Report. The global annual cost is nearing $1 trillion, largely due to patients' loss of independence because of problems with eating, bathing and grooming, incontinence and memory loss.

For the analysis, researchers searched articles in peer-reviewed journals and books addressing caregivers, sleep, dementia and Alzheimer's disease, published through June 2018. Those studies measured sleep quality and quantity by monitoring brain electrical activity, body movements and self-reporting by caregivers.

Caregivers' sleep duration and quality were significantly worse than those of non-caregivers in the same age range, and fell short of the recommended minimum for adults of seven hours nightly. Researchers also analyzed changes in sleep quality associated with interventions such as daytime exercise, avoiding coffee or tea past late afternoon, not drinking alcohol at night and getting more sunlight in the morning.

Researchers noted that four theories about sleep in dementia caregivers have emerged in studies:

The controversial "sleep need" view that older adults need less sleep than younger ones. If so, caregivers should report less sleep time but without changes in perceived sleep quality.

The "empowerment view," which argues that caregiving is a positive, enriching experience, and so sleep quality should be unchanged or even improved.

The "environmental stressor view," which holds that the caregiving is so stressful and unpredictable that caregivers would be unable to change their routine in such a way to benefit their sleep.

The "coping" view that health problems may be driven by unhealthy responses to stress, such as increased alcohol use and less exercise, while interventions should be associated with better sleep.

Baylor researchers' analysis found that caregivers slept less and perceived their sleep quality to be worse, meaning they were not simply adapting to - or no longer "needing" - as much sleep. Importantly, caregivers could improve their sleep through behavioral changes, as predicted by the "coping" view of caregiving.

"Given the long-term, potentially cumulative health consequences of poor-quality sleep, as well as the rising need for dementia caregivers worldwide, clinicians should consider sleep interventions not only for the patient but also for the spouse, child or friend who will be providing care," Gao said.

Credit: 
Baylor University

Scientists use a new method to track pollution from cooking

image: Stir-frying and deep-frying, features of Chinese cooking, produce high concentrations of organic aerosols

Image: 
Yao He

Cooking organic aerosol (COA) is one of the most important primary sources of pollution in urban environments. There is growing evidence that exposure to cooking oil fumes is linked to lung cancer. Currently, the most effective method to identify and quantify COA is through positive matrix factorization of OA mass spectra from aerosol mass spectrometer measurements. However, for the widely used low mass resolution aerosol chemical speciation monitor (ACSM), it is often challenging to separate COA from traffic-related organic aerosol (HOA) due to the similarity of their unit mass resolution spectra.

Recently, Prof. Yele Sun and his team at the Institute of Atmospheric Physics, Chinese Academy of Sciences, found that black carbon (BC) is a good tracer for separating HOA and COA. By applying the BC tracer method to several datasets from the megacities of Beijing and Nanjing, they found that COA contributed 15-27% of total organic aerosol in summer, and more than 10% even during the heating period, when coal combustion emissions are significantly enhanced. COA is also an important contributor to OA in urban areas globally, accounting for 15-20% on average. Their studies suggest that air quality improvements in developing countries could benefit substantially from the reduction of cooking emissions.

"Considering that ACSM has been increasingly deployed worldwide for routine measurements of aerosol particle composition, our study might have significant implications for better source apportionment of OA and exposure studies in the future." Said Prof. Sun.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Brain's astrocytes play starring role in long-term memory

image: Salk scientists (from left) António Pinto-Duarte and Terrence Sejnowski discover that astrocytes, long considered sideline players in the brain, are required to establish long-lasting memories in mice.

Image: 
Salk Institute

LA JOLLA--(August 22, 2019) Star-shaped cells called astrocytes help the brain establish long-lasting memories, Salk researchers have discovered. The new work adds to a growing body of evidence that astrocytes, long considered to be merely supportive cells in the brain, may have more of a leading role. The study, published in the journal GLIA on July 26, 2019, could inform therapies for disorders in which long-term memory is impaired, such as traumatic brain injury or dementia.

"This is an indication that these cells are doing a lot more than just helping neurons maintain their activity," says Professor Terrence Sejnowski, head of Salk's Computational Neurobiology Laboratory and senior author of the new work. "It suggests that they're actually playing an important role in how information is transmitted and stored in the brain."

The brain's neurons rely on speedy electrical signals and the release of neurotransmitters to communicate throughout the brain; astrocytes instead generate calcium signals and release substances known as gliotransmitters, some of them chemically similar to neurotransmitters. The classical view was that astrocytes mostly provided support to the more active neurons, helping transport nutrients, clean up molecular debris and hold neurons in place. Only more recently have researchers found that astrocytes might play other, more active roles in the brain through the release of gliotransmitters, but these remain largely mysterious.

In 2014, Sejnowski, Salk postdoctoral researcher António Pinto-Duarte and their colleagues showed that disabling the release of gliotransmitters in astrocytes turned down a type of electrical rhythm known as a gamma oscillation, important for cognitive skills. In that study, when the researchers tested the learning and memory skills of mice with disabled astrocytes, they found deficits that were restricted to their capacity to discriminate novelty.

In the new study, Sejnowski's team looked for the first time at the longer-term memory of mice with disrupted astrocytes. They used genetically engineered animals lacking the type 2 inositol 1,4,5-trisphosphate receptor (IP3R2), which astrocytes rely on to release calcium for communication.

The researchers tested the mice with three different types of learning and memory challenges, including interacting with a novel object and finding the exit in a maze. In each case, mice lacking IP3R2 showed the same ability to learn as normal mice. Moreover, when tested in the 24-48 hours after each initial learning process, the mice with disrupted astrocytes could still retain the information--finding their way through the maze, for example. The results were in line with what had been seen in prior studies.

However, when the group waited another 2 to 4 weeks and retested the trained mice, they saw large differences; the mice missing the receptor performed much worse, making more than twice as many errors when completing the maze.

"After a few-weeks delay, normal mice actually performed better than they did right after training, because their brain had gone through a process of memory consolidation," explains António Pinto-Duarte, who is the lead author of the new paper. "The mice lacking the IP3R2 receptor performed much worse."

The result is the first time that defects in astrocytes have been linked to defects in memory consolidation or remote memory.

The process of memory consolidation in the brain is known to involve several mechanisms affecting neurons. One of those mechanisms is thought to rely on an optimal adjustment of the strength of communication between neurons, through long-term potentiation, by which that strength increases, and long-term depression, by which some of these connections weaken. Sejnowski and Pinto-Duarte showed that although the mice lacking IP3R2, with their reduced astrocyte activity, had no problems with the former, they exhibited significant deficits in the latter, suggesting that astrocytes may play a role specifically in the long-term depression of connections between neurons.

"The mechanism of long-term depression of neurons is not as well studied or understood," says Sejnowski. "And this tells us we should be looking at how astrocytes are connected to the weakening of these neural connections."

The researchers are already planning future studies to better understand the pathways by which astrocytes affect the long-term depression of neuronal communication and memory in general.

"The long-term payout here is that if we better understand these pathways, we may be able to develop ways to manipulate memory consolidation with drugs," says Sejnowski.

Credit: 
Salk Institute

Biophysicists discovered how 'Australian' mutation leads to Alzheimer's disease

image: A team of scientists from MIPT and IBCh RAS studied a hereditary mutation to uncover molecular mechanisms that may lead both to early-onset Alzheimer's disease and to the age-related form of the disease.

Image: 
Elena Khavina and @tsarcyanide, MIPT Press Office

A team of scientists from the Moscow Institute of Physics and Technology (MIPT) and the Shemyakin-Ovchinnikov Institute of Bioorganic Chemistry (IBCh RAS) studied a hereditary genetic mutation to uncover general molecular mechanisms that may lead both to early onset of Alzheimer's disease and to the form of the disease caused by age-related changes in the human body. Understanding these mechanisms is necessary for developing new targeted treatments for this neurodegenerative disease, which is becoming ever more widespread across the developed world's aging populations. The study findings were published in ACS Chemical Biology.

Dementia is a syndrome involving deterioration in memory, thinking, behavior and the ability to perform everyday activities. Alzheimer's disease is the most common form of dementia and may contribute to 60-70% of cases, according to a WHO fact sheet. This makes dementia a public health priority, with substantial funds allocated to fight it by both governments and pharmaceutical companies. Prominent politicians such as Margaret Thatcher and Ronald Reagan were afflicted with Alzheimer's disease in their later years. The disease is most common in people over the age of 65, but people aged 40 or even younger are sometimes diagnosed with it as well. Approximately 10-15% of early onset cases are caused by an inherited predisposition. Integrated studies of hereditary, or "familial," mutations may give researchers a clue about key mechanisms of Alzheimer's disease pathogenesis, in particular its initial steps.

Alzheimer's disease is associated with the accumulation of pathogenic amyloid-β peptides into amyloid plaques within brain tissue. These peptides are short (about 40 amino acids) fragments of the amyloid precursor protein (APP), which spans the membrane of brain cells. APP is cleaved by various enzymes as part of normal neuron activity. The sequential cleavage of the "large" APP protein (whose biological function is still not fully understood) by β- and γ-secretase enzymes produces amyloid-β peptides, which in small amounts are probably necessary for sustaining brain functions. However, γ-secretase cuts the APP chain (within neuron membranes) into consecutive fragments of slightly varying length, producing relatively "pathogenic" and "non-pathogenic" forms of amyloid-β peptides. The main pathogenic form consists of 42 amino acid residues (Aβ42), while the less pathogenic form consists of 40 residues (Aβ40). In healthy humans the Aβ42/Aβ40 ratio is low, standing at approximately one to nine; a higher ratio indicates an excessive production of Aβ42, which leads to the neurodegenerative disorder. Researchers are currently testing the hypothesis that amyloid-β peptides are active participants in the innate immunity of the human nervous system and that their increased production may be caused by various inflammations and brain injuries. At the same time, many familial mutations associated with early onset of Alzheimer's disease have been found in the transmembrane (TM) domain of APP.

This research studied the "Australian" familial mutation (L723P) within the APP TM domain, a cause of early onset of Alzheimer's disease. The scientists examined the structural-dynamic behavior of the mutant APP TM domain against the wild type with the aid of protein engineering, high-resolution nuclear magnetic resonance (NMR) spectroscopy and computer simulations. NMR spectroscopy was used to compare the wild-type APP peptide with its mutant on such parameters as the "helicity" of the polypeptide chain, its bending and flexibility, and its accessibility to lipids and water molecules. The researchers found that the L723P mutation causes local melting of the last turn of the APP TM domain helix and also straightens and stabilizes the domain in the center of the lipid membrane. The mutation also increases the domain's accessibility to water molecules, which shifts the "frame" of its cleavage by γ-secretase, switching between the alternative ("pathogenic" and "non-pathogenic") cleavage cascades. This increases the Aβ42/Aβ40 ratio and the overall concentration of amyloid-β within brain tissue.

Eduard Bocharov, a senior researcher at the Laboratory for Aging and Age-Related Neurodegenerative Diseases at MIPT and the Laboratory of Biomolecular NMR Spectroscopy at IBCh RAS, commented:

"It goes without saying that this study touches upon just a few of causes for the multifactorial disorder that is Alzheimer's disease. The molecular mechanisms of its pathogenesis are being researched in numerous laboratories all over the world. In particular, a special attention in paid to studying the "key player" -- the amyloid precursor protein, as well as its sequential cleavage by secretases within neuron membranes. We described a cascade of events happening within and around the cell membrane as APP is cut by γ-secretase enzyme complex. We have thus used a single "Australian" mutation to reveal molecular mechanisms behind the pathogenesis that may lead both to early onset of Alzheimer's and the age-related form of the disease."

The study findings suggest a straightforward mechanism of Alzheimer's disease pathogenesis associated with the impact of the "Australian" mutation on the structural-dynamic behavior of the APP TM domain. This is what leads to the pathological cleavage of APP by secretases and the increased accumulation of pathogenic amyloid-β around neurons. Notably, the age-related onset of Alzheimer's disease can be explained by similar mechanisms, where the effect of the mutation is replaced by the impact of local environmental factors, such as oxidative stress or membrane lipid composition, including cholesterol saturation. A detailed understanding of the molecular mechanisms regulating the generation of amyloidogenic peptides is essential for developing novel treatment strategies that target the primary stage of Alzheimer's disease pathogenesis.

Credit: 
Moscow Institute of Physics and Technology

Child death rate linked to hospital preparedness for pediatric emergencies

PITTSBURGH, Aug. 23, 2019 - Critically ill children brought to hospital emergency departments that are ill-prepared to care for pediatric emergencies have more than three times the odds of dying compared to those brought to hospitals well-equipped to care for them, according to an analysis led by University of Pittsburgh and University of California-Los Angeles physician-scientists.

The findings, published today in the journal Pediatrics, are the first to provide evidence from multiple states linking the readiness of hospital emergency departments to care for critically ill or injured children with outcomes, and could guide a variety of policy responses.

"Pediatric care requires specialized equipment, training and protocols to provide the best care to children. Obtaining that kind of preparedness is costly and time-consuming," said senior author Jeremy Kahn, M.D., M.S., professor in the Department of Critical Care Medicine at Pitt's School of Medicine and the Department of Health Policy and Management at Pitt's Graduate School of Public Health. "Our study suggests that efforts to better prepare hospitals to care for pediatric emergencies save lives."

Kahn and his colleagues obtained data on 20,483 critically ill patients age 18 or younger who were brought to emergency departments at 426 hospitals in Florida, Iowa, Massachusetts, Nebraska and New York. They then cross-referenced patient outcomes with the "pediatric readiness" of each hospital's emergency department.

Pediatric readiness is indicated by a score assigned following assessment by the National Pediatric Readiness Project, a quality improvement effort of several federal government and non-profit advocacy organizations. Hospitals receive higher scores based on several factors, including whether they have equipment designed for use on children, pediatric-specific protocols for medical procedures and care, and educational programming to keep clinicians up-to-date on the latest guidelines in pediatric care. The standardized readiness score ranges from 0 to 100.

The team divided the hospitals into four groups based on their pediatric readiness score, with the lowest quartile's scores ranging from 29.6 to 59.3, and the highest from 88.2 to 99.9. Hospitals in the lowest quartile had a pediatric mortality rate for critically ill children of 11.1%, compared to 3.4% for the highest quartile.
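The "more than three times the odds" figure quoted at the top of this report can be sanity-checked from these two quartile mortality rates. This is a rough, unadjusted calculation for illustration; the published analysis adjusts for patient characteristics, so its reported odds ratio may differ:

```python
# Quick sanity check: convert the quartile mortality rates quoted in the
# press release into an unadjusted odds ratio. Illustrative only.

def odds(p: float) -> float:
    """Convert a probability into odds, p / (1 - p)."""
    return p / (1.0 - p)

mortality_lowest_readiness = 0.111   # 11.1% in the least-prepared quartile
mortality_highest_readiness = 0.034  # 3.4% in the best-prepared quartile

odds_ratio = odds(mortality_lowest_readiness) / odds(mortality_highest_readiness)
print(f"Unadjusted odds ratio: {odds_ratio:.2f}")  # roughly 3.5
```

An odds ratio near 3.5 is consistent with the article's "more than three times the odds of dying" framing.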

"Our findings indicate that it matters which hospital a critically ill or injured child is brought to in an emergency," said co-author Jennifer Marin, M.D., M.Sc., an emergency physician at UPMC Children's Hospital of Pittsburgh and associate professor of pediatrics and emergency medicine in Pitt's School of Medicine. "A hospital's pediatric readiness should be a factor in determining to which hospital a critically ill child should be transported."

There likely isn't one perfect solution to the disparity in outcomes, noted lead author Stefanie Ames, M.D., M.S., a pediatrician specializing in critical care medicine at UCLA Mattel Children's Hospital and assistant professor in the Division of Pediatric Critical Care at UCLA David Geffen School of Medicine.

"Should we focus only on improving the pediatric readiness of all hospitals, potentially investing money and resources in hospitals that rarely see children? Or should we do more to direct pediatric emergencies to hospitals well-equipped to care for them, potentially increasing transport times?" she asked. "Some combination will likely be needed and potential solutions also could incorporate telemedicine and processes to promote quick recognition and transfer of pediatric emergencies to more prepared hospitals."

Credit: 
University of Pittsburgh

Your heart's best friend: Dog ownership associated with better cardiovascular health

ROCHESTER, Minn. -- Owning a pet may help maintain a healthy heart, especially if that pet is a dog, according to the first analysis of data from the Kardiozive Brno 2030 study. The study examines the association of pet ownership -- specifically dog ownership -- with cardiovascular disease risk factors and cardiovascular health. The results are published in Mayo Clinic Proceedings: Innovations, Quality & Outcomes.

The study first established baseline health and socio-economic information on more than 2,000 subjects in the city of Brno, Czech Republic, from January 2013 through December 2014. Follow-up evaluations are scheduled at five-year intervals until 2030.

In the 2019 evaluation, the study looked at 1,769 subjects with no history of heart disease and scored them based on Life's Simple 7 ideal health behaviors and factors, as outlined by the American Heart Association: body mass index, diet, physical activity, smoking status, blood pressure, blood glucose and total cholesterol.

The study compared the cardiovascular health scores of pet owners overall to those who did not own pets. Then it compared dog owners to other pet owners and those who did not own pets.

"In general, people who owned any pet were more likely to report more physical activity, better diet and blood sugar at ideal level," says Andrea Maugeri, Ph.D., a researcher with the International Clinical Research Center at St. Anne's University Hospital in Brno and the University of Catania in Catania, Italy. "The greatest benefits from having a pet were for those who owned a dog, independent of their age, sex and education level."

The study demonstrates an association between dog ownership and heart health, which is in line with the American Heart Association's scientific statement on the benefits of owning a dog in terms of physical activity, engagement and reduction of cardiovascular disease risk.

Dr. Maugeri says that the study findings support the idea that people could adopt, rescue or purchase a pet as a potential strategy to improve their cardiovascular health as long as pet ownership led them to a more physically active lifestyle.

Francisco Lopez-Jimenez, M.D., chair of the Division of Preventive Cardiology at Mayo Clinic in Rochester, says that having a dog may prompt owners to go out, move around and play with their dog regularly. Owning a dog also has been linked to better mental health in other studies and less perception of social isolation -- both risk factors for heart attacks. Dr. Lopez-Jimenez is a senior investigator of this study.

Credit: 
Mayo Clinic

Big brains or big guts: Choose one

image: The ptarmigan is a small-brained bird that thrives in colder, high latitude regions. A global study in the journal Nature Communications compares more than 2,000 birds and finds that, in highly variable environments, birds tend to have either larger or smaller brains relative to their body size.

Image: 
Trevor Fristoe

Big brains can help an animal mount quick, flexible behavioral responses to frequent or unexpected environmental changes. But some birds just don't need 'em.

A global study comparing 2,062 birds finds that, in highly variable environments, birds tend to have either larger or smaller brains relative to their body size. Birds with smaller brains tend to use ecological strategies that are not available to big-brained counterparts. Instead of relying on grey matter to survive, these birds tend to have large bodies, eat readily available food and make lots of babies.

The new research from biologists at Washington University in St. Louis appears Aug. 23 in the journal Nature Communications.

"The fact is that there are a great many species that do quite well with small brains," said Trevor Fristoe, formerly a postdoctoral researcher at Washington University, now at the University of Konstanz in Germany.

"What's really interesting is that we don't see any middle ground here," Fristoe said. "The resident species with intermediate brain size are almost completely absent from high latitude (colder and more climatically variable) environments. The species that don't go all in on either of the extreme strategies are forced to migrate to more benign climates during the winter."

"Having a large brain is typically associated with strong energetic demands and a slower life-history," said Carlos Botero, assistant professor of biology in Arts & Sciences and co-author of the paper. "Free from these constraints, species with small brains can exhibit traits and lifestyles that are never seen in larger-brained ones.

"What we found is that alternative ecological strategies that either increase or decrease investments in brain tissue are equally capable of coping with the challenges of living in high-latitude environments," he said.

Because the brain is such a costly organ to develop and maintain, biologists have long been interested in understanding how large brain size -- in all species -- could have evolved.

One hypothesis is based around the idea that one of the main advantages of possessing a big brain is that it allows for a high degree of behavioral flexibility. With flexibility comes the ability to respond to different conditions -- such as wide swings in temperature, or changes in food availability.

The so-called cognitive buffer hypothesis is not the only possible explanation for the evolution of brain size -- but it is an important and influential one.

Relative brain size is a measure of the size of the brain as compared to the body -- think: an ostrich's brain might be much bigger than a chickadee's brain, but so is the ostrich's body. Predictably, the global distribution of relative brain size of birds follows a bell curve, with most species landing squarely in the middle, and only a handful of outliers with relatively large or relatively small brains.
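In comparative studies like this one, "relative brain size" is commonly computed as the residual from a log-log (allometric) regression of brain mass on body mass: a positive residual means a larger brain than expected for that body size. The sketch below illustrates the idea; the species names and masses are invented for illustration, not data from the study:

```python
# Sketch of computing relative brain size as the residual of a log-log
# regression of brain mass on body mass. Masses are hypothetical.
import math

# hypothetical (body mass in g, brain mass in g) for a few species
species = {
    "species_A": (20.0, 1.2),
    "species_B": (300.0, 4.0),
    "species_C": (6000.0, 30.0),
}

xs = [math.log(body) for body, _ in species.values()]
ys = [math.log(brain) for _, brain in species.values()]

# ordinary least-squares fit of log(brain) on log(body)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# positive residual: relatively large brain; negative: relatively small
residuals = {
    name: math.log(brain) - (intercept + slope * math.log(body))
    for name, (body, brain) in species.items()
}
for name, r in residuals.items():
    print(f"{name}: residual {r:+.3f}")
```

On this measure, an ostrich and a chickadee can be compared fairly despite their vastly different body sizes, which is the point the article makes.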

Previous studies had found general trends towards larger relative brain sizes in higher latitudes, where conditions are more variable -- consistent with the cognitive buffer hypothesis. Fristoe and Botero's new study is different because it looks at the full distribution of brain sizes across environments, allowing them to test whether different sizes are over- or under-represented.

Excluding contributions from migrants -- the birds that live in polar or temperate environments only during more favorable times of the year -- the researchers found that at high latitudes, bird brain size appears to be bimodal. This morphological pattern means that bird brains are significantly more likely to be relatively large, or relatively small, compared to body size.

What was going on here? Fristoe, born in Alaska, had a few ideas.

In fact, Fristoe suggests that the Alaska state bird, the ptarmigan, might be a good poster child for the small-brained species. Endearing though she is -- with her plushy bosom, feathered feet and unusual chuckling call -- she's not exactly known for her smarts. The ptarmigan can, however, chow down on twigs and willow leaves with the best of them.

"In our paper, we find that small-brained species in these environments employ strategies that are unachievable with a large brain," Fristoe said. "First, these species are able to persist by foraging on readily available but difficult to digest resources such as dormant plant buds, the needles of conifers, or even twigs.

"These foods can be found even during harsh winter conditions, but they are fibrous and require a large gut to digest," he said. "Gut tissue, like brain tissue, is energetically demanding, and limited budgets mean that it is challenging to maintain a lot of both.

"We also found that these species have high reproductive rates, producing many offspring every year," Fristoe said. "This would allow their populations to recover from high mortality during particularly challenging conditions. Because big-brained species tend to invest more time in raising fewer offspring, this is a strategy that is not available to them."

In other words, maybe big brains are not all that.

"Brains are not evolving in isolation -- they are part of a broader suite of adaptations that help organisms be successful in their lives," Botero said. "Because of trade-offs between different aspects of that total phenotype, we find that two different lineages may respond to selection from environmental oscillations in completely different ways.

"Given that our own species uses its brain to cope with these changes, it is not really surprising that biologists, ourselves included, have historically exhibited a bias toward thinking about environmental variability as a force that drives the expansion of brain size," Botero said. "But the interesting thing that we find here is that when we take a broader view, we realize that other strategies also work -- and remarkably, the alternative here involves making a brain actually smaller!"

Credit: 
Washington University in St. Louis