
Brain cell membranes' lipids may play big role in Alzheimer's progression

image: Researchers report on the significant role lipids may play in regulating C99, a protein within the amyloid pathway, and the progression of Alzheimer's disease.

Image: 
Amanda Dyrholm Stange, Jenny Pin-Chia Hsu, Lisbeth Ravnkilde, Nils Berglund, and Birgit Schiøtt

WASHINGTON, June 15, 2021 -- Alzheimer's disease is predominant in elderly people, but the way age-related changes to lipid composition affect the regulation of biological processes is still not well understood. Links between lipid imbalance and disease have been established, in which lipid changes increase the formation of amyloid plaques, a hallmark of Alzheimer's disease.

This imbalance inspired researchers from Aarhus University in Denmark to explore the role of lipids comprising the cellular membranes of brain cells.

In Biointerphases, by AIP Publishing, the researchers report on the significant role lipids may play in regulating C99, a protein within the amyloid pathway, and disease progression. Lipids have been mostly overlooked from a therapeutic standpoint, likely because their influence in biological function is not yet fully understood.

Toxic amyloid plaques are formed within the brain when a series of enzymes cleave the protein APP, which sits within the neuronal cell membrane, to form C99, which in turn is cleaved to release the amyloid-beta peptide that can form plaques.

Both C99 and APP can protect themselves from cleavage by forming homodimers, proteins composed of two identical polypeptide chains. The interaction between C99 molecules is regulated by the lipids that make up the membrane in which the protein sits.

"We showed that a change in the cholesterol content of the neuronal cell membrane can change how the C99 dimerizes," said Amanda Dyrholm Stange, one of the authors. "Our work suggests age-related changes to cholesterol content in the membrane weakens the C99-C99 interaction, which consequently decreases the 'protective' effect of the dimerization process, leading to the hypothesis of why more toxic amyloid-beta peptides are released in the elderly."

Therapeutics for Alzheimer's disease currently "have a very high failure rate, with no therapeutics developed for a very long period of time, so a novel strategy is desperately needed," said co-author Nils Anton Berglund. "Attempting to modulate the composition of the lipid membrane would be an entirely new class of Alzheimer's disease therapeutics but also immensely challenging to achieve without side effects."

The researchers postulate that shifting the strategy away from targeting proteins and toward targeting the lipid concentration of membranes may be worthwhile.

"We hope our work will lead the pharmaceutical/biotechnology sector to choose lipid modulation as a means for targeting in drug development, because these changes in lipid composition are linked not just to Alzheimer's but a large host of diseases -- from diabetes to cardiovascular disease," said co-author Birgit Schiøtt. "We also hope it will lead to more research and funding toward understanding the fundamental science behind the possible regulatory roles of lipids."

Credit: 
American Institute of Physics

Common cold combats COVID-19

Exposure to rhinovirus, the most frequent cause of the common cold, can protect against infection by the virus that causes COVID-19, Yale researchers have found.

In a new study, the researchers found that the common respiratory virus jump-starts the activity of interferon-stimulated genes, early-response molecules in the immune system which can halt replication of the SARS-CoV-2 virus within airway tissues infected with the cold.

Triggering these defenses early in the course of COVID-19 infection holds promise for preventing or treating the infection, said Ellen Foxman, assistant professor of laboratory medicine and immunobiology at the Yale School of Medicine and senior author of the study. One way to do this is by treating patients with interferons, immune system proteins that are also available in drug form.

"But it all depends upon the timing," Foxman said.

The results were published June 15th in the Journal of Experimental Medicine.

Previous work showed that at the later stages of COVID-19, high interferon levels correlate with worse disease and may fuel overactive immune responses. But recent genetic studies show that interferon-stimulated genes can also be protective in cases of COVID-19 infection.

Foxman's lab wanted to study this defense system early in the course of COVID-19 infection.

Since earlier studies by Foxman's lab showed that common cold viruses may protect against influenza, the team decided to study whether rhinoviruses would have the same beneficial impact against the COVID-19 virus. For the study, they infected lab-grown human airway tissue with SARS-CoV-2 and found that for the first three days, viral load in the tissue doubled about every six hours. However, replication of the COVID-19 virus was completely stopped in tissue that had previously been exposed to rhinovirus. When antiviral defenses were blocked, SARS-CoV-2 could replicate in airway tissue previously exposed to rhinovirus.

The same defenses slowed down SARS-CoV-2 infection even without rhinovirus, but only if the infectious dose was low, suggesting that the viral load at the time of exposure makes a difference in whether the body can effectively fight the infection.

The researchers also studied nasal swab samples from patients diagnosed close to the start of infection. They found that the virus typically grew rapidly for the first few days of infection, doubling about every six hours as seen in the lab, before host defenses kicked in; in some patients the virus grew even faster.
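As a rough back-of-the-envelope illustration (not a calculation from the study), a doubling time of about six hours implies roughly a 4,000-fold rise in viral load over three unchecked days:

```python
def fold_increase(hours: float, doubling_time_hours: float = 6.0) -> float:
    """Fold increase in viral load after `hours` of unchecked exponential growth."""
    return 2 ** (hours / doubling_time_hours)

# Three days (72 hours) of doubling every six hours:
print(fold_increase(72))  # 2**12 = 4096-fold
```

Under this simple model, even a half-day head start for the virus multiplies the load fourfold, which is consistent with the article's emphasis on the timing of any interferon intervention.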

"There appears to be a viral sweet spot at the beginning of COVID-19, during which the virus replicates exponentially before it triggers a strong defense response," Foxman said.

Interferon treatment holds promise but it could be tricky, she said, because it would be mostly effective in the days immediately after infection, when many people exhibit no symptoms. In theory, interferon treatment could be used prophylactically in people at high risk who have been in close contact with others diagnosed with COVID-19. Trials of interferon in COVID-19 are underway, and so far show a possible benefit early in infection, but not when given later.

These findings may help explain why at times of year when colds are common, rates of infections with other viruses such as influenza tend to be lower, Foxman said. There are concerns that as social distancing measures ease, common cold and flu viruses -- which have been dormant over the past year -- will come back in greater force. Interference among respiratory viruses could be a mitigating factor, creating an "upper limit" on the degree to which respiratory viruses co-circulate, she said.

"There are hidden interactions between viruses that we don't quite understand, and these findings are a piece of the puzzle we are just now looking at," Foxman said.

Credit: 
Yale University

Can encroachment benefit hotel franchisees?

Researchers from University of Texas at Dallas and Emory University published a new paper in the Journal of Marketing that examines the issue of encroachment in the hotel industry.

The study, forthcoming in the Journal of Marketing, is titled "Can Encroachment Benefit Hotel Franchisees?" and is authored by Tongil "TI" Kim and Sandy Jap.

For decades, the issue of encroachment, or adding an outlet in proximity to existing outlets, has been contentious. A new outlet increases competition for customers, causing concerns for franchisors and franchisees that the existing outlet's sales will be cannibalized. Franchising is a key means for growth and market expansion for many companies. These organizations account for as much as $890 billion or 50% of all retail sales across 75 industries in the US (approximately 3% of the US gross domestic product). Therefore, encroachment is a big issue.

A new study in the Journal of Marketing suggests that cannibalization does not always have to be the case. Using five years of hotel sales data, an experiment, and a series of simulations, the research team finds that adding a new outlet in markets with few same-brand outlets can modestly benefit existing locations. Specifically, the researchers find that revenues can increase by an average of 1.7% (a 2.3% improvement in profits) - in other words, a sunny side to an issue that has historically devolved into conflict.

Importantly, this positive impact does not happen when the new outlet is a different franchise brand. In other words, brands matter. Kim adds, "We also find that younger brands, cross-brands (e.g., Hyatt Place and Hyatt House), and brand bookings through online travel agencies benefit from having more versus fewer locations." Customers appreciate and value additional outlets of a brand (up to a point), as well as when the brands are new or when they share a moniker with other franchise locations of the same brand. Customers also value multiple same-brand options when booking through websites like Expedia or Kayak.

This means that when looking to expand, franchisors should prioritize markets with few same brand outlets, newer brands in their portfolio, cross brands, and online travel agency sales channels. These potentially represent win-win options in the conflict over encroachment.

Most of the work around network expansion has focused on the legalities of this process, codes of conduct, expectations, and compliance and monitoring practices. This research suggests that there may be alternate ways of tackling this thorny problem.

Credit: 
American Marketing Association

Compounds derived from hops show promise as treatment for common liver disease

image: Hops.

Image: 
Oregon State University

CORVALLIS, Ore. - Research by Oregon State University suggests a pair of compounds originating from hops can help thwart a dangerous buildup of fat in the liver known as hepatic steatosis.

The findings, published today in eLife, are important because the condition affects roughly one-fourth of people in the United States and Europe. While heavy drinking is often associated with liver problems, people with little or no history of alcohol use comprise that 25%, which is why their illness is known as non-alcoholic fatty liver disease, or NAFLD.

Resistance to insulin, the hormone that helps control blood sugar levels, is a risk factor for NAFLD, as are obesity, a high-fat diet and elevated levels of fat in the blood. The liver helps the body process nutrients and also acts as a filter for the circulatory system, and too much fat in the liver can lead to inflammation and liver failure.

In a mouse-model study, Oregon State researchers led by Adrian Gombart showed that the compounds xanthohumol and tetrahydroxanthohumol, abbreviated to XN and TXN, can mitigate diet-induced accumulation of fat in the liver.

XN is a prenylated flavonoid produced by hops, the plant that gives beer its flavor and color, and TXN is a hydrogenated derivative of XN.

In the study, 60 mice were randomly assigned to one of five groups - low-fat diet, high-fat diet, high-fat diet supplemented by XN, high-fat diet supplemented by more XN, and high-fat diet supplemented by TXN.

The scientists found that TXN helped put the brakes on the weight gain associated with a high-fat diet and also helped stabilize blood sugar levels, both factors in thwarting the buildup of fat in the liver.

"We demonstrated that TXN was very effective in suppressing the development and progression of hepatic steatosis caused by diet," said Gombart, professor of biochemistry and biophysics in the OSU College of Science and a principal investigator at the Linus Pauling Institute. "TXN appeared to be more effective than XN perhaps because significantly higher levels of TXN are able to accumulate in the liver, but XN can slow progression of the condition as well, at the higher dose."

The mechanism behind the compounds' effectiveness involves PPARγ, a nuclear receptor protein - one that regulates gene expression. PPARγ controls glucose metabolism and the storage of fatty acids, and the genes it activates stimulate the creation of fat cells from stem cells.

XN and TXN act as "antagonists" for PPARγ - they bind to the protein without sending it into action, unlike a PPARγ agonist, which would activate it as well as bind to it. The upshot of antagonism in this case is less fat collecting in the liver.

"Activated PPARγ in the liver stimulates storage of lipids, and our data suggest that XN and TXN block activation and greatly reduce expression of the genes that promote lipid storage in the liver," Gombart elaborated. "These findings are consistent with studies that show weaker PPARγ agonists are more effective at treating hepatic steatosis than strong agonists. In other words, lower PPARγ activation in the liver may be beneficial."

TXN was better at accumulating in the liver than XN, which may explain why it was more effective in reducing lipids, but the difference in tissue accumulation is not fully understood.

"It may be because XN is metabolized by the host and its gut microbiota more than TXN is, but additional studies are needed to figure that out," Gombart said. "Also, while XN and TXN are effective preventative approaches in rodents, future studies need to determine if the compounds can treat existing obesity in humans. But our findings suggest antagonism of PPARγ in the liver is a logical approach to prevent and treat diet-induced liver steatosis and related metabolic disorders, and they support further development of XN and TXN as low-cost therapeutic compounds."

Credit: 
Oregon State University

EHRs not meeting the challenges of primary care according to new study

image: A new study, published in the journal Human Factors, finds that electronic health records (EHRs) are not rising to the challenges faced by primary care physicians because EHRs have not been designed or tailored to their specific needs.

Image: 
Regenstrief Institute

INDIANAPOLIS - Much needs to be accomplished during the short time a primary care physician sees a patient. A new study from researchers at the U.S. Department of Veterans Affairs, Regenstrief Institute and IUPUI reports that electronic health records (EHRs) are not rising to the challenges faced by primary care physicians because EHRs have not been designed or tailored to their specific needs. The study, a review and analysis of research on the topic conducted from 2012 to 2020, recommends a human factors approach for the design or redesign of EHR user interfaces.

"The human mind can do many things well. Digesting vast amounts of patient information while multitasking in time-constrained situations exposes a limitation. EHR technology should be able to complement or enhance physicians' abilities in these scenarios," said Regenstrief Institute Research Scientist April Savoy, PhD, who led the new study. "But current EHRs are overloading primary care physicians with information in disparate files and folders rather than presenting comprehensive, actionable data in a context that gives meaning.

"Technology needs to adapt to humans' needs, abilities, and limitations in healthcare delivery as it has in other domains. You can get the most advanced technology available - the fastest car, the smartest cell phone -- but if it is not useful or if usability fails, users should not be forced to change their approach or work. The technology should be redesigned. Similarly, EHRs should be redesigned to improve situation awareness for busy primary care physicians and support their tasks including reviewing patient information, care coordination, and shared decision-making."

Dr. Savoy is a health services researcher and human factors engineer. She notes that it can be easier for consumers to search online and order a pair of shoes in a desired size, color and style, than for primary care clinicians to order a specialty consult or medication refill. When interacting with EHRs, primary care physicians are typically faced with numerous impediments. For example, they are forced to navigate through multiple systems and tabs to find information, increasing redundancy and decreasing efficiency. EHRs' lack of desired features ranges from advanced features such as interoperability to simple features such as auto-save, which are often default capabilities or added conveniences for online shopping.

She adds that EHRs have been tailored for specialists, operating rooms and hospitals, but there has been a lack of attention, tailoring and design to fit the specific needs of the primary care physician, whose effective decision-making is grounded in perception and comprehension of a patient's dynamic situation. For example, a primary care physician's decision to deprescribe (stop) a medication could be informed by a single measurement or by trends in a patient's blood pressure or cholesterol levels and the other medications taken over a month. This type of information has implications for the patient's future health trajectory.

"Electronic health records' support for primary care physicians' situation awareness: Metanarrative review" is published online ahead of print in Human Factors, the Journal of the Human Factors and Ergonomics Society. Authors, in addition to Dr. Savoy, are Himalaya Patel, PhD; Daniel R. Murphy, M.D., MBA; Ashley N.D. Meyer, PhD; Jennifer Herout, PhD; Hardeep Singh, M.D., MPH, all with the VA. The study was funded by the Human Factors Engineering Directorate in the Office of Health Informatics, U.S. Department of Veterans Affairs (VA).

To understand the extent to which EHRs support primary care physicians, the authors reviewed and analyzed studies describing EHR workflow misalignments, usability issues and communication challenges. Significant difficulties were reported related to obtaining clinical information from EHRs. Lab results and care plans were often incomplete, untimely or irrelevant. The study also included review of common clinical decisions and tasks related to care management of adult patients that are typically not supported by clinical decision support tools such as whether to start palliative care, predicting quality of life and recovery time, and tracking progress toward patients' stated goals.

The authors conducted a metanarrative analysis which is more inclusive and open ended than a metanalysis. They found that primary care physicians' experiences using EHRs often included redundant interaction and information overload, which they note could be remediated by incorporating user-centered design principles into future EHR design, development and evaluation.

Credit: 
Regenstrief Institute

Let there be light! New tech to revolutionize night vision

image: Dr Rocio Camacho Morales says the researchers have made the "invisible, visible".

Image: 
Jamie Kidston, The Australian National University

Researchers from The Australian National University (ANU) have developed new technology that allows people to see clearly in the dark, revolutionising night-vision.

The first-of-its-kind thin film, described in a new article published in Advanced Photonics, is ultra-compact and one day could work on standard glasses.

The researchers say the new prototype tech, based on nanoscale crystals, could be used for defence, as well as making it safer to drive at night or walk home after dark.

The team also says the work of police and security guards - who regularly employ night vision - will be easier and safer, reducing the chronic neck injuries caused by current bulky night-vision devices.

"We have made the invisible visible," lead researcher Dr Rocio Camacho Morales said.

"Our technology is able to transform infrared light, normally invisible to the human eye, and turn this into images people can clearly see - even at distance.

"We've made a very thin film, consisting of nanometre-scale crystals, hundreds of times thinner than a human hair, that can be directly applied to glasses and acts as a filter, allowing you to see in the darkness of the night."

The technology is extremely lightweight, cheap and easy to mass produce, making it accessible to everyday users.

Currently, high-end infrared imaging tech requires cryogenic freezing to work and is costly to produce. The new tech works at room temperature.

Dragomir Neshev, Director of the ARC Centre of Excellence for Transformative Meta-Optical Systems (TMOS) and ANU Professor in Physics, said the new tech used meta-surfaces, or thin films, to manipulate light in new ways.

"This is the first time anywhere in the world that infrared light has been successfully transformed into visible images in an ultra-thin screen," Professor Neshev said.

"It's a really exciting development and one that we know will change the landscape for night vision forever."

The new tech has been developed by an international team of researchers from TMOS, ANU, Nottingham Trent University, UNSW and European partners.

Mohsen Rahmani, the Leader of the Advanced Optics and Photonics Lab in Nottingham Trent University's School of Science and Technology, led the development of the nanoscale crystal films.

"We previously demonstrated the potential of individual nanoscale crystals, but to exploit them in our everyday life we had to overcome enormous challenges to arrange the crystals in an array fashion," he said.

"While this is the first proof-of-concept experiment, we are actively working to further advance the technology."

Credit: 
Australian National University

Liver cancer call for help

image: Flinders University Professor Alan Wigg, Head of Hepatology and Liver Transplant Medicine Unit at the Southern Adelaide Local Health Network in South Australia.

Image: 
Flinders University

Rising numbers of liver cancer cases in Aboriginal and Torres Strait Islander communities have led experts at Flinders University to call for more programs, including mobile liver clinics and ultrasound services in rural and remote Australia.

The Australian study, just published in the international Lancet journal EClinicalMedicine, reveals that the survival difference was largely accounted for by factors other than Indigenous status - including rurality, comorbidity burden and lack of curative therapy.

The study of liver cancer, or Hepatocellular carcinoma (HCC), included 229 Indigenous and 3587 non-Indigenous HCC cases in South Australia, Queensland and the Northern Territory.

"The major finding was important differences in cofactors for HCC between Indigenous and non-Indigenous patients, with Indigenous patients more frequently having multiple cofactors for HCC such as hepatitis B, diabetes and alcohol misuse," says Flinders University Professor Alan Wigg, who led the investigation.

While cancer care is difficult to deliver to remote Australia, he says HCC is preventable with surveillance.

"What is needed is a culturally appropriate model of care in rural communities that screens for liver disease and identifies at-risk patients," says Professor Wigg, who is also Head of the Hepatology and Liver Transplant Medicine Unit at the Southern Adelaide Local Health Network in South Australia.

"At-risk patients need regular six-monthly high-quality liver ultrasound surveillance, which supports a model of care involving mobile liver clinics using fibro-scans and liver ultrasound."

This latest research confirms that both incidence and mortality are about 2.6-fold higher in Indigenous compared with non-Indigenous Australians.

Liver disease is the sixth most common cause of death, and HCC the second most common cause of cancer death, in Indigenous Australians.

Other important associations with HCC in Indigenous Australians included a higher comorbidity burden, lower socioeconomic status, younger age at onset, higher proportion of females and poorer five-year survival rates.

"Our study shows the majority of HCC cases in Indigenous Australians occurred in patients living outside of metropolitan areas. This knowledge can help address the problem of liver disease and HCC and inform the design of effective interventions to reduce the morbidity and mortality from these diseases," says co-author Professor Patricia Valery, who leads the QIMR Berghofer Medical Research Institute's Cancer and Chronic Disease research group.

"This suggests that lower access to care may be contributing to poorer survival in these patients.

"The study findings highlight that there is still more work to do on interventions to reduce Indigenous mortality from liver disease and HCC," she says.

Credit: 
Flinders University

From symmetry to asymmetry: The two sides of life

image: Nuclei are the toughest intracellular organelles. When aligned, they act like structural pillars supporting the structure of the gut and influencing asymmetrical changes in shape during its development.

Image: 
Osaka University

Osaka, Japan - On the outside, animals often appear bilaterally symmetrical with mirror-image left and right features. However, this balance is not always reflected internally, as several organs such as the lungs and intestines are left-right (LR) asymmetrical. Researchers at Osaka University, using an innovative technique for imaging movement of cell nuclei in living tissue, have determined the patterns of nuclear alignment responsible for LR-asymmetrical shaping of internal organs in the developing embryo.

Embryogenesis involves complex genetic and molecular processes that transform a single-celled zygote into a complete, living individual with multiple functional axes, including the LR axis. A long-standing conundrum of Developmental Biology is the breaking of LR symmetry in the developing embryo to initiate lateralization (dominance of one side over the other) of organs and other structures of the body along the LR axis. Mechanisms that induce LR asymmetry are well known in vertebrates; however, the biomechanics in invertebrates remain uncertain.

The research team focused on the development of the intestine, in particular the anterior midgut (AMG) of the Drosophila (fruit fly) as an appropriate model to study LR-asymmetric development in invertebrates. "Proper organ development often requires nuclei to move to a specific position within the cell," first author Dongsun Shin explains. "Our previous studies had revealed the molecular signaling pathways as well as the proteins responsible for the biomechanical control of asymmetric development, but the dynamics had not yet been determined."

To meet the challenge of dynamically tracking nuclei in living tissue, the researchers innovated a new method to investigate nuclear migration. Using a confocal laser scanning microscope, they obtained three-dimensional (3D) time lapse videos of immunostained Drosophila midgut at the appropriate embryonal stage. Then, applying mathematical modeling and computational imaging techniques they generated 3D-surface animated models of nuclear migration in the visceral muscles.

Through these 3D time-lapse movies, the researchers identified the initial LR-symmetrical distribution of AMG muscle nuclei along the antero-posterior (front-to-back) axis, termed 'proper nuclear positioning.' They also vividly imaged 'collective nuclear behavior,' in which crowded nuclei actively rearranged their relative positions while maintaining LR symmetry. This symmetric initial 'proper nuclear positioning' and 'collective nuclear behavior' are responsible for the subsequent LR-asymmetric development of the AMG. In contrast, experiments in genetically modified Drosophila embryos demonstrated that when the nuclei aligned with LR asymmetry, the subsequent LR-asymmetric development of the AMG was lost. Nuclei are known to be the toughest intracellular organelles. Based on these results, the researchers therefore speculated that collectively aligned nuclei may act like structural pillars, supporting the structure of the gut and influencing LR-asymmetrical changes in shape during its development.

Senior author Kenji Matsuno explains the potential of their findings: "This new method of tracking moving nuclei in vital tissue has helped clarify the role of nuclear alignment in asymmetrical shaping of internal organs during normal development. In addition, our findings are expected to be applied to controlling the shape of regenerative organs. This knowledge could potentially shape future research into organ regeneration which may have applications such as growing artificial organs to model disease mechanisms."

Credit: 
Osaka University

Fungal spores from 250-year-old collections given new lease of life

Echoing through history by reviving fungal specimens originally preserved and described a flabbergasting quarter of a millennium ago by the "Father of Modern Taxonomy," Carl Linnaeus, this study highlights the untapped potential of museum collections in modern research programmes. The results have just been published in the renowned Cell Press journal iScience.

The "desert coprinus" fungus Podaxis has fascinated scientists and explorers for centuries, yet the genus has been the subject of relatively little research. These large mushrooms thrive in hostile, mostly species-free environments, and while they occur seasonally and unpredictably in deserts and on termite mounds, researchers face a problem common to many biologists: where do we find them? The researchers from the Department of Biology turned to an unconventional sampling location: museum collections. By requesting fungal spores from various collections, including the Linnean Society of London and the Natural History Museum of Denmark, they were able to assemble more than 200 specimens from every continent except Antarctica. The specimens ranged in age from 2 to 250 years old.

Specimen from the South African National Collection of Fungi in Pretoria. Photo: Benjamin Schantz-Conlon

Given the finding that fungal spores can still grow after 2-5 years in a museum, the researchers tested the limit for their revival. Eventually they succeeded in germinating and growing two Podaxis specimens collected in the 1770s and classified by Linnaeus in Uppsala. These results reveal an extraordinary capacity for Podaxis spores to remain viable through extended periods of drought and suggest that they can remain dormant in the environment for centuries before germinating once conditions allow.

- "It was really incredible to have these fungi growing in our lab, which we knew had been handled by a scientist as important as Linnaeus, who founded the system of naming species. It allowed us to perform experiments and produce genomes of a quality that would have been impossible with dried specimens," explains postdoc and first author Benjamin Schantz-Conlon of the Department of Biology, University of Copenhagen. He continues: "It was very interesting to examine the adaptations allowing Podaxis to survive under extreme conditions, including in herbarium collections, where samples have traditionally been treated with mercury as a pesticide."

The researchers used the specimens to ask whether free-living Podaxis species growing in deserts were genomically and physiologically different from species growing on termite mounds. The results indicated that the association with termites gave rise to smaller genome sizes and a reduced tolerance to stressful conditions.

- "These findings suggest that Podaxis living in association with termites are experiencing a relaxed selection pressure and a potential protection from competition and exposure to stressors in the environment", says corresponding author Michael Poulsen, professor at the Department of Biology.

Podaxis growing on a termite mound. Photo: Z. Wilhelm de Beer
Previous research has shown there is an overlap between tolerance to extreme conditions such as deserts and pathogenicity. By comparing the transition from a free-living state in a desert to a symbiotic state within a termite mound, the researchers hoped to learn more about the evolution of fungi that shift to associate with hosts, including pathogens.

- "While Podaxis living in an obligate association with termites exhibited relaxed selection, we also found some Podaxis which could survive both on termite mounds and free-living in deserts. In this case, we saw little genomic or physiological difference between them and the fully free-living Podaxis, suggesting the adaptations for life in the desert may facilitate the initial colonization of termite mounds; something which is also seen in opportunistic pathogens."

An immense resource of knowledge is stored in museum collections, and we should work to ensure that these specimens can be used to answer important scientific questions in the future.

Credit: 
University of Copenhagen - Faculty of Science

Obesity and hypertension: Researchers discover novel mechanisms

image: Blood vessels in the transparent brain of an obese mouse model

Image: 
Helmholtz Zentrum München / Tim Gruber

Hypertension is a widespread comorbidity in patients with obesity that greatly increases the risk of mortality and disability. In recent years, researchers have found that a high-calorie diet increases the density of blood vessels (hypervascularization) in the hypothalamus - an important "eating control" area in our brain. Researchers hypothesized that elevated levels of the hormone leptin are associated with a higher risk of developing hypertension. However, the exact mechanisms that contribute to the dense growth of blood vessels in the hypothalamus were unknown.

New research conducted by Cristina García-Cáceres' research group at Helmholtz Zentrum München has now revealed that obese mice do not increase the amount of blood vessels in the hypothalamus when they lack the hormone leptin. Leptin is produced by adipose tissue, is involved in the control of hunger and satiety, and plays an important role in the regulation of fat metabolism in humans and mammals.

Once the researchers increased the hormone leptin in these mice, certain brain cells, the astrocytes, boosted the production of a specific growth factor. This growth factor, in turn, promoted vessel growth. The result was an increased number of vessels in the hypothalamus (and no other brain region). The scientists thus demonstrated that leptin is mainly responsible for the increased concentration of vessels in the hypothalamus and that this process is mediated via astrocytes.

"We provide a paradigm shift in our understanding of how the hypothalamus controls blood pressure in obesity," explains first author Tim Gruber. "While previous research has focused primarily on neurons, our research highlights the new role of astrocytes, historically assumed less relevant than neurons, in controlling blood pressure".

Looking into the future, according to study leader Cristina García-Cáceres, one important question remains: How exactly do astrocytes communicate with neurons? "We have started to answer this question using in vivo real-time imaging of astrocyte-neuron circuit function in the hypothalamus," the researcher says.

Credit: 
Helmholtz Munich (Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH))

Cosmic rays: Coronal mass ejections and cosmic ray observations at Syowa Station in the Antarctic

Solar activity, such as coronal mass ejections (CMEs), causes geomagnetic storms: temporary disturbances of the Earth's magnetosphere. Geomagnetic storms can affect GPS positioning, radio communication, and power transmission systems. Solar eruptions also emit radiation, which can cause satellite failures, radiation exposure of aircraft crews, and hazards to space activity. It is therefore important to understand space weather phenomena and their impact on the Earth.

Space weather research via continuous ground-based observation of cosmic rays is mainly conducted using data from neutron monitors and multi-directional muon detectors. Since space weather phenomena unfold on short, days-long timescales, it is effective to track changes in the flow of cosmic rays over several hours, which requires an all-sky monitor of cosmic rays. Among muon detectors, the Global Muon Detector Network (GMDN) has been observing space weather phenomena since 2006; among neutron monitors, the Spaceship Earth project constitutes a similar observation network and plays the role of an all-sky monitor. Until now, observations by neutron monitors and muon detectors have been performed independently, and space weather research has progressed on that basis.

In February 2018, Professor Chihiro Kato of Shinshu University took the lead in starting simultaneous neutron monitor and muon detector observations at Syowa Station in the Antarctic, in order to acquire data bridging the two types of observation. In the polar regions, unlike at low latitudes, a neutron monitor and a muon detector can observe cosmic rays coming from the same direction, because the deflection by the geomagnetic field is weaker there. This is why Syowa Station was selected as the observation point.

In August 2018, the Syowa muon detector and neutron monitor observed a small fluctuation in the cosmic ray (CR) count resembling a Forbush decrease. The research group, which includes researchers from Shinshu University and the National Institute of Polar Research, found a curious cosmic-ray density variation during this event by analyzing GMDN data.

In a CME event, a huge amount of coronal material is released into interplanetary space together with a bundle of solar magnetic field called a magnetic flux rope (MFR). The MFR expands as it moves through interplanetary space. The CR density inside it is low because it originates as coronal material. When the Earth enters the MFR, the CR count at ground level decreases. This is called a Forbush decrease.

Normally, when an MFR arrives at Earth, the CR density observed at ground level decreases rapidly and then gradually recovers toward its original level while the Earth is inside the MFR. In this event, however, the CR density exceeded the original level before the Earth exited the MFR.

This event attracted researchers' interest because 1) solar activity was near its minimum and the scale of the event itself was small, 2) it nevertheless caused a disproportionately large geomagnetic storm, and 3) a high-speed solar wind stream was catching up with the MFR and was expected to interact with it.

By analyzing the GMDN and solar plasma data, the researchers concluded that the high-speed solar wind caused the unusual enhancement of the CR density by locally compressing the rear part of the MFR.

Cosmic ray observation data are closely related not only to space weather research but also to atmospheric phenomena such as sudden stratospheric warming, and are expected to be used in a wide range of fields in the future. The cosmic ray observation data from Syowa Station, including the August 2018 event that was the subject of this research, are published on the website and updated daily:
http://polaris.nipr.ac.jp/~cosmicrays/

Credit: 
Shinshu University

Newly developed ion-conducting membrane improves performance of alkaline-zinc iron flow battery

image: Selective ion transport and hydroxide ion transport in LDHs.

Image: 
DICP

The alkaline zinc-iron flow battery (AZIFB) is well suited for stationary energy storage applications due to its advantages of high open-circuit voltage, low cost, and environmental friendliness. However, it suffers from zinc dendrite formation and accumulation, and from a relatively low operating current density.

Recently, a research group led by Prof. LI Xianfeng from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences (CAS) developed a layered double hydroxide (LDH) membrane with high hydroxide conductivity and ion selectivity for the alkaline zinc-iron flow battery.

The study was published in Nature Communications on June 7.

In order to enhance the operating current density of the AZIFB, the researchers added LDH nanomaterials into the AZIFB and designed a high-performance LDH-based composite membrane. High selectivity and superb hydroxide ion conductivity were achieved by combining the well-defined interlayer gallery with a strong hydrogen-bond network along the 2D surfaces.

They identified that the surface -OH groups of the LDH layers could assist the conduction of OH- by promoting proton transfer from a water molecule to the original OH-.

Thanks to its high ionic conductivity, the LDH-based membrane enabled the AZIFB to operate at 200 mA cm-2 with an energy efficiency of 82.36%.

"This study offers a new insight to design and manufacture high-performance membranes for AZIFB," said Prof. LI.

Credit: 
Dalian Institute of Chemical Physics, Chinese Academy of Sciences

Novel calibration procedure for super-resolution brain imaging

image: Multi-photon STED microscopy enhanced by adaptive optics captures the fine details of neuronal dendrites.

Image: 
Bancelin et al.

Light--like all waves--can bend around the corners of obstacles found along its path. Because of this phenomenon, called diffraction, it is impossible to focus light onto a spot that is smaller than half its wavelength. In other words, the highest resolution one can theoretically achieve using an optical microscope is approximately 250 nm, a barrier called the diffraction limit. Unfortunately, this resolution is not enough for observing fine cellular structures, such as those found in neurons.
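The half-wavelength figure above comes from the standard Abbe diffraction limit, d ≈ λ / (2·NA), where NA is the objective's numerical aperture. A minimal sketch (the example values are illustrative, not from the article):

```python
# Abbe diffraction limit: smallest resolvable distance is roughly the
# wavelength divided by twice the numerical aperture of the objective.
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Approximate lateral resolution limit, in nanometers."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green light (~500 nm) with NA = 1.0 gives the ~250 nm barrier quoted above:
print(abbe_limit_nm(500, 1.0))  # 250.0
# A high-NA oil-immersion objective (NA ~ 1.4) only improves this modestly:
print(abbe_limit_nm(500, 1.4))  # ~179 nm
```

Even with the best conventional optics, the limit stays well above the size of fine neuronal structures, which motivates the super-resolution techniques described next.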

Over more than a century, microscopists were hamstrung by this classic barrier until the invention of super-resolution fluorescence microscopy. One particularly powerful approach was developed in the late 1990s and coined stimulated-emission depletion (STED) microscopy. This technique requires the target sample to contain fluorophores, which are compounds that absorb light at one wavelength and then re-emit it at a longer one. In the simplest version of STED microscopy, fluorophores are excited in a circular spot by irradiation with a diffraction-limited focused laser. Then, a donut-shaped portion around the spot is irradiated with less-energetic light--the depletion beam--which switches off the fluorescence by the process of stimulated emission. Thus, the net effect is that only the fluorophores in the center of the donut re-emit photons, and because that area can be made arbitrarily small, this allows for super-resolution microscopy.
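The "arbitrarily small" donut center follows a square-root scaling law that is standard in the STED literature (not stated in the article): d ≈ λ / (2·NA·√(1 + I/I_sat)), where I is the depletion-beam intensity and I_sat the fluorophore's saturation intensity. A hedged sketch:

```python
import math

# Standard STED resolution scaling: raising the depletion intensity I
# relative to the saturation intensity I_sat shrinks the effective spot
# below the ordinary diffraction limit.
def sted_resolution_nm(wavelength_nm, numerical_aperture, depletion_ratio):
    """depletion_ratio = I / I_sat; a ratio of 0 recovers the Abbe limit."""
    return wavelength_nm / (
        2.0 * numerical_aperture * math.sqrt(1.0 + depletion_ratio))

# No depletion beam: ordinary diffraction-limited resolution (~214 nm here).
print(sted_resolution_nm(600, 1.4, 0))
# Strong depletion beam: the effective spot shrinks far below the limit.
print(sted_resolution_nm(600, 1.4, 50))  # ~30 nm
```

In practice the attainable ratio I/I_sat is limited by photodamage and, as the study below shows, by aberration of the depletion beam deep inside tissue.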

Although STED microscopy was a true breakthrough for observing the morphology of live neurons at higher resolution, there is still room for improvement. In a recent study published in Neurophotonics, a team of scientists led by Dr. U. Valentin Nägerl from Université de Bordeaux developed a simple yet effective calibration method that allows for more precise STED imaging at higher tissue depths. Their approach is based on analyzing and correcting for one of the main sources of systematic error in STED microscopy for biological samples: spherical aberration of the depletion beam.

When imaging a tissue sample at depths higher than 40 μm, the depletion beam suffers various types of defocusing and degradation (aberration) and loses its carefully crafted shape, which is essential to the STED method. Spherical aberration is the biggest offender and was the one the researchers targeted. Their strategy was to first prepare a brain tissue phantom sample, a gel-based proxy with a refractive index similar to that of the actual brain. This phantom sample contained homogenously dispersed fluorophores and gold nanoparticles, which allowed the team to clearly visualize and quantify how the shape of the depletion beam got distorted as it penetrated deeper. Then, they calculated the necessary pre-adjustments that should be made to the depletion beam according to tissue depth so that its final shape more closely matches the ideal one. The adjustments were made using adaptive optics, which is a technology originally developed by astronomers to improve telescopic images that suffer from aberrations caused by the earth's atmosphere.

Once the shape of the depletion beam had been calibrated according to the phantom tests, the scientists proceeded to image live neural tissue. They compared the results of regular STED microscopy, corrected STED microscopy, and two-photon microscopy--a technique that is specifically adjusted for deep tissue imaging. The results were quite convincing: corrected STED images captured the fine details of deeper neural dendrites much better than standard STED images. "Using our calibration strategy, we could measure neuronal structures as small as 80 nm at a depth of 90 μm inside biological tissue and obtain a 60 percent signal increase after correction for spherical aberration," says Nägerl.

Ji Yi, professor of biomedical engineering at Johns Hopkins University, remarks, "Super-resolution microscopy has been primarily applied for thin specimens, such as single layer cells, where light scattering is negligible. The team led by Valentin Nägerl implemented adaptive optics in a two-photon stimulated emission depletion microscopy (2P-STED), and achieved 80 nm resolution of imaging neuron dendritic spines through 90 microns of brain tissue. This is noteworthy because super-resolution is hard to maintain in thicker tissue--particularly given the highly scattering quality of brain tissue." Yi explains that the advance will facilitate study of neural activities and interactions.

Considering this novel calibration process is robust, straightforward to implement, and relatively inexpensive, it could be easily incorporated into standard laboratory practices to obtain better results with STED microscopes, so long as the prepared phantom sample matches the optical properties of the biological specimen. In this regard, Nägerl states: "Our approach is not limited to brain samples; it could be adapted to other tissues with known and relatively homogeneous refractive indices, as well as other types of preparations, even potentially in the intact, live mouse brain."

Credit: 
SPIE--International Society for Optics and Photonics

Main gland in hormonal system ages due to process that can potentially be slowed down

image: Stem cell biologist Hugo Vankelecom (KU Leuven) and his colleagues have discovered that the pituitary gland in mice ages as the result of an age-related form of chronic inflammation. It may be possible to slow down this process or even partially repair it. Vankelecom and his colleagues studied the pituitary of mice, so further research is required to demonstrate whether their findings also apply to humans.

Image: 
© KU Leuven - Emma Laporte (first co-author of the study). Image created with BioRender.

Stem cell biologist Hugo Vankelecom (KU Leuven) and his colleagues have discovered that the pituitary gland in mice ages as the result of an age-related form of chronic inflammation. It may be possible to slow down this process or even partially repair it. The researchers have published their findings in PNAS.

The pituitary gland is a small, globular gland located underneath the brain that plays a major role in the hormonal system, explains Professor Hugo Vankelecom from the Department of Development and Regeneration at KU Leuven. "My research group discovered that the pituitary gland ages as a result of a form of chronic inflammation that affects tissue and even the organism as a whole. This natural process usually goes unnoticed and is referred to as 'inflammaging' -- a contraction of inflammation and ageing. Inflammaging has previously been linked to the ageing of other organs." Due to the central role played by the pituitary, its ageing may contribute to the reduction of hormonal processes and hormone levels in our body - as is the case with menopause, for instance.

The study also provides significant insight into the stem cells in the ageing pituitary gland. In 2012, Vankelecom and his colleagues showed that a prompt reaction of these stem cells to injury in the gland leads to repair of the tissue, even in adult animals. "As a result of this new study, we now know that stem cells in the pituitary do not lose this regenerative capacity when the organism ages. In fact, the stem cells are only unable to do their job because, over time, the pituitary becomes an 'inflammatory environment' as a result of the chronic inflammation. But as soon as the stem cells are taken out of this environment, they show the same properties as stem cells from a young pituitary."

Chance of recovery?

This insight opens up a number of potential therapeutic avenues: would it be possible to reactivate the pituitary? This wouldn't just involve slowing down hormonal ageing processes, but also repairing the damage caused by a tumour in the pituitary, for example. "No fewer than one in every 1,000 people is faced with this kind of tumour -- which causes damage to the surrounding tissue -- at some point. The quality of life of many of these patients would be drastically improved if we could repair this damage. We may be able to do so by activating the stem cells already present -- for which our present study also provides new indications -- or even by transplanting cells. That said, these new treatment options are not quite around the corner just yet, as the step from fundamental research to an actual therapy can take years to complete. For the time being, our study sets out a potential direction for further research."

The study also suggests another interesting avenue: the use of anti-inflammatory drugs to slow down pituitary ageing or rejuvenate an ageing pituitary. "Several studies have shown that anti-inflammatory drugs may have a positive impact on some ageing organs. No research has yet been performed on this effect in relation to the pituitary."

From mice to humans

Vankelecom and his colleagues studied the pituitary of mice, so further research is required to demonstrate whether their findings also apply to humans. Vankelecom comments: "Mice have a much greater regeneration capacity than humans. They can repair damaged teeth, for instance, while humans have lost this ability over the course of their evolution. Regardless, there are plenty of signs suggesting that pituitary processes in mice and humans are similar, and we have recent evidence to hand that gene expression in the pituitaries of humans and mice is very similar. As such, it is highly likely that the insights we gained will equally apply to humans."

Credit: 
KU Leuven

Predicting the evolution of a pandemic

image: An extended epidemic model that accounts for uncertainty and the latest data can better predict the evolution of pandemics.

Image: 
© 2021 KAUST; Ivan Gromicho

The inclusion of biological uncertainty and the latest case data can significantly improve the prediction accuracy of standard epidemiological models of virus transmission, new research led by KAUST and the Kuwait College of Science and Technology (KCST) has shown.

Modern mathematical epidemic models have been tested like never before during the COVID-19 pandemic. These models use mathematics to describe the various biological and transmission processes involved in an epidemic. However, when such factors are highly uncertain, such as during the emergence of a new virus like COVID-19, the predictions can be unreliable.

"The susceptible-exposed-infected-recovered model, SEIR, is a standard mathematical approach for forecasting the spread of an epidemic in a population," says Rabih Ghostine, formerly of KAUST and now at KCST. "This model is based on several assumptions, such as homogeneous mixing of the population and the omission of migration, births or deaths from causes other than the epidemic. The parameters in the traditional SEIR model also do not allow for quantification of uncertainty, being single values reflecting the modeler's best guess."

"We wanted to develop a robust mathematical model that takes into account such uncertainties and incorporates epidemic data in order to enhance forecasting accuracy," explained Ghostine.

Ghostine, along with KAUST's Ibrahim Hoteit and fellow researchers, developed an extended SEIR model comprising seven compartments: susceptible, exposed, infectious, quarantined, recovered, deceased and vaccinated. They then added uncertainty definitions and a data assimilation process to drive progressive improvement of the model.

"Our data assimilation approach exploits new incoming observations to calibrate the model with recent information in order to continuously provide improved predictions, and also to estimate uncertainties," says Ghostine. "This is a popular framework in the atmospheric and ocean research communities and is at the basis of all operational weather and ocean modeling."

The model uses an "ensemble" approach, in which a set of predictions is generated across different parameter uncertainties. This ensemble is then integrated forward in time to forecast the future state. A correction step is performed to update the forecast with the latest data. Validation using real data for Saudi Arabia showed the model to provide reliable forecasts for up to 14 days in advance.

"Mathematical models can play an important role in understanding and predicting COVID-19 transmission as well as provide crucial information to policymakers to implement appropriate measures and efficient strategies to control the pandemic spread and mitigate its impact," says Hoteit. "Our method, which we developed to simulate the COVID-19 spread in Saudi Arabia, can also be applied to forecast the spread of any pandemic in a population."

Credit: 
King Abdullah University of Science & Technology (KAUST)