Culture

New study reveals previously unseen star formation in the Milky Way

Astronomers using two of the world's most powerful radio telescopes have made a detailed and sensitive survey of a large segment of our home galaxy -- the Milky Way -- detecting previously unseen tracers of massive star formation, a process that dominates galactic ecosystems. The scientists combined the capabilities of the National Science Foundation's Karl G. Jansky Very Large Array (VLA) and the 100-meter Effelsberg Telescope in Germany to produce high-quality data that will serve researchers for years to come.

Stars with more than about ten times the mass of our Sun are important components of the Galaxy and strongly affect their surroundings. However, understanding how these massive stars are formed has proved challenging for astronomers. In recent years, this problem has been tackled by studying the Milky Way at a variety of wavelengths, including radio and infrared. This new survey, called GLOSTAR (Global view of the Star formation in the Milky Way), was designed to take advantage of the vastly improved capabilities that an upgrade project completed in 2012 gave the VLA to produce previously unobtainable data.

GLOSTAR has excited astronomers with new data on the birth and death processes of massive stars, as well as on the tenuous material between the stars. The GLOSTAR team of researchers has published a series of papers in the journal Astronomy & Astrophysics reporting initial results of their work, including detailed studies of several individual objects. Observations continue and more results will be published later.

The survey detected telltale tracers of the early stages of massive star formation, including compact regions of hydrogen gas ionized by the powerful radiation from young stars, and radio emission from methanol (wood alcohol) molecules that can pinpoint the location of very young stars still deeply shrouded by the clouds of gas and dust in which they are forming.

The survey also found many new remnants of supernova explosions -- the dramatic deaths of massive stars. Previous studies had found fewer than a third of the expected number of supernova remnants in the Milky Way. In the region it studied, GLOSTAR more than doubled the number found using the VLA data alone, with more expected to appear in the Effelsberg data.

"This is an important step to solve this longstanding mystery of the missing supernova remnants," said Rohit Dokara, a Ph.D. student at the Max Planck Institute for Radio Astronomy (MPIfR) and lead author on a paper about the remnants.

The GLOSTAR team combined data from the VLA and the Effelsberg telescope to obtain a complete view of the region they studied. The multi-antenna VLA -- an interferometer -- combines the signals from widely-separated antennas to make images with very high resolution that show small details. However, such a system often cannot also detect large-scale structures. The 100-meter-diameter Effelsberg telescope provided the data on structures larger than those the VLA could detect, making the image complete.
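
Combining interferometer and single-dish images is often done in the Fourier domain, where the single dish supplies the large-scale (low spatial frequency) information the interferometer misses. Below is a minimal toy sketch of this "feathering" idea; the array sizes, weighting function, and crossover value are illustrative assumptions, not the GLOSTAR pipeline.

```python
import numpy as np

def feather(interferometer_img, single_dish_img, crossover=0.15):
    """Toy feathering: blend two images in the Fourier domain.

    Low spatial frequencies (large-scale structure) are taken from the
    single dish; high spatial frequencies (fine detail) come from the
    interferometer. `crossover` is the fractional radius in Fourier
    space where the weighting switches over (an illustrative value).
    """
    n = interferometer_img.shape[0]
    fi = np.fft.fftshift(np.fft.fft2(interferometer_img))
    fs = np.fft.fftshift(np.fft.fft2(single_dish_img))

    # Radial spatial-frequency coordinate, normalized by image size
    y, x = np.indices((n, n))
    r = np.hypot(x - n // 2, y - n // 2) / n

    # Smooth weight: ~1 at low frequencies, ~0 at high frequencies
    w_low = 1.0 / (1.0 + np.exp((r - crossover) / 0.02))

    combined = w_low * fs + (1.0 - w_low) * fi
    return np.real(np.fft.ifft2(np.fft.ifftshift(combined)))

# Example: a broad plateau (seen only by the single dish) plus a
# compact source (seen only by the interferometer)
n = 64
y, x = np.indices((n, n))
broad = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 400.0)   # large-scale
compact = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 4.0)   # small-scale
img = feather(compact, broad)
```

Real pipelines (e.g. the `feather` task in radio-astronomy packages) additionally account for each telescope's beam response; the sketch only captures the core frequency-blending idea.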

"This clearly demonstrates that the Effelsberg telescope is still very crucial, even after 50 years of operation," said Andreas Brunthaler of MPIfR, project leader and first author of the survey's overview paper.

Visible light is strongly absorbed by interstellar dust, which radio waves readily penetrate. Radio telescopes are therefore essential to revealing the dust-shrouded regions in which young stars form.

The results from GLOSTAR, combined with other radio and infrared surveys, "offer astronomers a nearly complete census of massive star-forming clusters at various stages of formation, and this will have lasting value for future studies," said team member William Cotton, of the National Radio Astronomy Observatory (NRAO), who is an expert in combining interferometer and single-telescope data.

"GLOSTAR is the first map of the Galactic Plane at radio wavelengths that detects many of the important star formation tracers at high spatial resolution. The detection of atomic and molecular spectral lines is critical to determine the location of star formation and to better understand the structure of the Galaxy," said Dana Falser, also of NRAO.

The initiator of GLOSTAR, the MPIfR's Karl Menten, added, "It's great to see the beautiful science resulting from two of our favorite radio telescopes joining forces."

Credit: 
National Radio Astronomy Observatory

New insight on the reproductive evolution of land plants

image: Jörg Becker, GREEN-IT member, formerly at the IGC and now a PI at ITQB NOVA

Image: 
Mette Haagen Marcussen, DTU Bioengineering

Around 470 million years ago, plants began to colonize terrestrial surfaces. The first examples had a small axis terminated by a structure capable of forming spores, much like modern mosses. The appearance of plant organs mediated the explosive radiation of land plants, which shaped the surface of our planet and allowed the establishment of terrestrial animal life.

However, evolving such a diversity of organs, such as roots, leaves, or immobile gametes, requires coordinated genetic changes: the rise of new genes, the repurposing of existing genetic material, and the development of new regulatory programs. In a study published in Nature Plants, a consortium spanning Europe - including the team led by Jörg Becker, GREEN-IT member and PI of the Plant Reproduction and Evolution Laboratory, formerly at the IGC and now at ITQB NOVA - the United States, and Singapore introduced an atlas that compiles gene expression data from ten different species of land plants - the largest collection to date. Through detailed analysis of the collected data, the team set out to identify novel and missing components involved in the formation of sex organs and cells. "Comparing data from such different species allowed us to distinguish genes that are important for the reproduction of all land plants from those that only matter for flowering plants", explains Jörg.

The comparative analysis of the atlases revealed that a large portion of gene expression remained unchanged throughout evolution when looking at equivalent organs from different species. The data also showed that the establishment of organs relies heavily on the repurposing of existing genetic material. "We saw that many groups of genes appeared long before the corresponding organ, and this tells us that they emerged through the repurposing of genetic material that already existed", explains the researcher.

The team also looked for patterns in the development of female and male gametes. "We were interested in comparing the first land plants, which have swimming sperm and need water for their reproduction, with plants with non-swimming sperm, which sits inside the pollen grain and does not depend on water for its mobility", says Jörg. The team found that, in contrast with female gametes, male gametes presented a high number of conserved genes, indicating that male reproduction appears to be more specialized than female reproduction. Among them were proteins that regulate gene expression - transcription factors - and proteins responsible for transferring phosphate to other proteins - kinases - both potentially important for the making and function of pollen.

This work also allowed the establishment of the EVOREPRO database, a user-friendly online tool that allows the browsing and comparative analysis of the genomic and transcriptomic data derived from samples across thirteen members of the plant kingdom. This database may be a valuable resource for further studies and validation of key genes involved in organogenesis and land plant reproduction.

Plants are our greatest source of food and materials, and understanding how they function is essential to counteract the current strains that plague this resource and to find long-term sustainable solutions. "Knowing the genes important for the development and function of a specific organ gives us an indication of which genes to manipulate to improve its function. We are looking for those that make sperm and egg development more heat resistant, for example, and for ways to overcome the fertilization barriers between different plant species to obtain superior quality hybrids," concludes Jörg.

Credit: 
Instituto de Tecnologia Química e Biológica António Xavier da Universidade NOVA de Lisboa ITQB NOVA

Unlocking genetic clues behind aortic aneurysm

A new study increases knowledge of the genetics behind aortic aneurysm, a disease that can spark life-threatening events like aortic dissections and ruptures.

University of Michigan Health-led researchers compared blood samples from more than 1,300 people who had a thoracic aortic aneurysm with more than 18,000 control samples, in partnership with U-M's Cardiovascular Health Improvement Project and its Michigan Genomics Initiative.

"After examining nearly the entire human genome for genetic changes that increase risk of aneurysm, we discovered a new change in the genetic code of a transcription factor, which means it controls many other genes," explained co-corresponding author Cristen Willer, Ph.D., a professor of cardiovascular medicine, internal medicine, human genetics and computational medicine and bioinformatics at University of Michigan Health.

Then, Willer's team collaborated with Minerva Garcia-Barrio, Ph.D., an assistant professor of internal medicine, to examine the role this gene played in smooth muscle cells, a component of the aorta.

"We examined this gene in human cells and discovered that the transcription factor we identified is a key factor that gives instructions to cells about when to die and replenish," said co-lead author Tanmoy Roychowdhury, Ph.D., a research fellow in the Division of Cardiovascular Medicine.

The authors say future research might investigate whether slowing down this apoptosis, or cellular death, in aortic aneurysm could reduce the risk of a person experiencing an aortic dissection or rupture in the future.

Credit: 
Michigan Medicine - University of Michigan

Scientists reverse age-related memory loss in mice

Scientists at Cambridge and Leeds have successfully reversed age-related memory loss in mice and say their discovery could lead to the development of treatments to prevent memory loss in people as they age.

In a study published today in Molecular Psychiatry, the team show that changes in the extracellular matrix of the brain - 'scaffolding' around nerve cells - lead to loss of memory with ageing, but that it is possible to reverse these using genetic treatments.

Recent evidence has emerged of the role of perineuronal nets (PNNs) in neuroplasticity - the ability of the brain to learn and adapt - and to make memories. PNNs are cartilage-like structures that mostly surround inhibitory neurons in the brain. Their main function is to control the level of plasticity in the brain. They appear at around five years old in humans, and turn off the period of enhanced plasticity during which the connections in the brain are optimised. Then, plasticity is partially turned off, making the brain more efficient but less plastic.

PNNs contain compounds known as chondroitin sulphates. Some of these, such as chondroitin 4-sulphate, inhibit the action of the networks, inhibiting neuroplasticity; others, such as chondroitin 6-sulphate, promote neuroplasticity. As we age, the balance of these compounds changes, and as levels of chondroitin 6-sulphate decrease, so our ability to learn and form new memories changes, leading to age-related memory decline.

Researchers at the University of Cambridge and University of Leeds investigated whether manipulating the chondroitin sulphate composition of the PNNs might restore neuroplasticity and alleviate age-related memory deficits.

To do this, the team looked at 20-month-old mice - considered very old - and using a suite of tests showed that the mice exhibited deficits in their memory compared to six-month-old mice.

For example, one test involved seeing whether mice recognised an object. The mouse was placed at the start of a Y-shaped maze and left to explore two identical objects at the ends of its two arms. After a short while, the mouse was once again placed in the maze, but this time one arm contained a new object, while the other contained a copy of the familiar object. The researchers measured the amount of time the mouse spent exploring each object to see whether it had remembered the object from the previous task. The older mice were much less likely to remember the object.
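
Performance in novel-object tests of this kind is commonly summarized with a discrimination index comparing time spent on the novel versus the familiar object. The sketch below shows that standard calculation; the function name and example times are illustrative, not the study's actual data.

```python
def discrimination_index(time_novel, time_familiar):
    """Discrimination index for a novel-object recognition test.

    Ranges from -1 to 1: positive values mean the animal explored the
    novel object more (i.e. it remembered the familiar one); values
    near 0 mean no preference, suggesting impaired memory.
    """
    total = time_novel + time_familiar
    if total == 0:
        return 0.0  # no exploration at all
    return (time_novel - time_familiar) / total

# Illustrative numbers (seconds of exploration): a young mouse
# preferring the novel object, an aged mouse showing no preference
young = discrimination_index(30.0, 10.0)   # -> 0.5
aged = discrimination_index(21.0, 19.0)    # -> 0.05
```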

The team treated the ageing mice with a 'viral vector', a virus capable of restoring the level of chondroitin 6-sulphate in the PNNs, and found that this completely restored memory in the older mice to a level similar to that seen in the younger mice.

Dr Jessica Kwok from the School of Biomedical Sciences at the University of Leeds said: "We saw remarkable results when we treated the ageing mice with this treatment. The memory and ability to learn were restored to levels they would not have seen since they were much younger."

To explore the role of chondroitin 6-sulphate in memory loss, the researchers bred mice that had been genetically manipulated such that they were only able to produce low levels of the compound, to mimic the changes of ageing. Even at 11 weeks, these mice showed signs of premature memory loss. However, increasing levels of chondroitin 6-sulphate using the viral vector restored their memory and plasticity to levels similar to those of healthy mice.

Professor James Fawcett from the John van Geest Centre for Brain Repair at the University of Cambridge said: "What is exciting about this is that although our study was only in mice, the same mechanism should operate in humans - the molecules and structures in the human brain are the same as those in rodents. This suggests that it may be possible to prevent humans from developing memory loss in old age."

The team have already identified a potential drug, licensed for human use, that can be taken by mouth and inhibits the formation of PNNs. When this compound is given to mice and rats it can restore memory in ageing and also improves recovery in spinal cord injury. The researchers are investigating whether it might help alleviate memory loss in animal models of Alzheimer's disease.

The approach taken by Professor Fawcett's team - using viral vectors to deliver the treatment - is increasingly being used to treat human neurological conditions. A second team at the Centre recently published research showing their use for repairing damage caused by glaucoma and dementia.

Credit: 
University of Cambridge

Parkinson's disease: How lysosomes become a hub for the propagation of the pathology

Over the last few decades, neurodegenerative diseases have become one of the top 10 global causes of death. Researchers worldwide are making a strong effort to understand the pathogenesis of neurodegenerative diseases, which is essential to develop efficient treatments against these incurable diseases. However, our knowledge of the basic molecular mechanisms underlying this pathogenesis is still lacking. A team of researchers has now uncovered the role of lysosomes in the spread of Parkinson's disease.

The accumulation of misfolded protein aggregates in affected brain regions is a common hallmark shared by several neurodegenerative diseases (NDs). Mounting evidence in cellular and animal models highlights the capability of different misfolded proteins to be transmitted and to induce the aggregation of their endogenous counterparts, a process called "seeding". In Parkinson's disease, the second most common ND, misfolded α-synuclein (α-syn) proteins accumulate in fibrillar aggregates within neurons. These accumulations are named Lewy bodies.

α-syn fibrils spread through TNTs inside lysosomes

In 2016, a team of researchers from the Institut Pasteur (Paris) and the French National Centre for Scientific Research (in French: CNRS, Centre national de la recherche scientifique) demonstrated that α-syn fibrils spread from donor to acceptor cells through tunneling nanotubes (TNTs). They also found that these fibrils are transferred through TNTs inside lysosomes. "TNTs are actin-based membrane channels allowing the transfer of several cellular components including organelles between distant cells. Lysosomes are organelles normally devoted to the degradation and recycling of toxic/damaged cell material" describes Chiara Zurzolo, head of the Membrane Traffic and Pathogenesis Unit at the Institut Pasteur.

α-syn fibrils can modify lysosome shape and permeability to allow seeding and diffusion

Following this original discovery, researchers have now shed some light on how lysosomes participate in the spreading of α-syn aggregates through TNTs. "By using super-resolution and electron microscopy, we found that α-syn fibrils affect the morphology of lysosomes and impair their function in neuronal cells. We demonstrated for the first time that α-syn fibrils induce the peripheral redistribution of the lysosomes, thus increasing the efficiency of α-syn fibrils' transfer to neighbouring cells," continues Chiara Zurzolo. They also showed that α-syn fibrils can permeabilize the lysosomal membrane, impairing the degradative function of lysosomes and allowing the seeding of soluble α-syn, which occurs mainly in those lysosomes. Thus, by impairing lysosomal function, α-syn fibrils block their own degradation in lysosomes, which instead become a hub for the propagation of the pathology.

This discovery contributes to the elucidation of the mechanism by which α-syn fibrils spread through TNTs, while also revealing the crucial role of lysosomes, working as a Trojan horse for both seeding and propagation of disease pathology. This information can be exploited to establish novel therapies to target these incurable diseases.

Credit: 
Institut Pasteur

Interaction identified between SARS-CoV-2 and unusual RNA structures in human cells

Replication of SARS-CoV-2, the virus responsible for COVID-19, depends on a series of interactions between viral proteins and different cellular partners such as nucleic acids (DNA or RNA). Characterizing these interactions is crucial to elucidate the process of viral replication and identify new drugs for treating COVID-19.

An interdisciplinary consortium of scientists from the Institut Pasteur, the Ecole Polytechnique, the Institut Curie, Inserm, the CNRS and the universities of Paris, Paris-Saclay, Bordeaux and Toulouse have demonstrated a specific interaction between a domain of a SARS-CoV-2 protein (Nsp3) and unusual DNA or RNA structures known as G-quadruplexes or G4s. "Using a broad range of experimental approaches, we characterized this interaction and revealed that this Nsp3 protein domain has a clear G4 propensity. We also showed that G4 ligands [chemical compounds that bind with G4s] prevent this interaction," explains Marc Lavigne, a scientist in the Department of Virology at the Institut Pasteur and coordinator of the G4-Covid19 project. These results were recently published in Nucleic Acids Research.

Potential therapeutic application patented

Alongside this study, some G4 ligands were developed by the co-authors* of the article. In a cellular system reproducing SARS-CoV-2 infection, the Institut Pasteur (Marc Lavigne, Hélène Munier-Lehmann, Jeanne Chiavarelli and Björn Meyer) demonstrated that these ligands, which prevent interaction between the SARS-CoV-2 protein and the G4 structure, exhibit antiviral activity.

These results pave the way for the use of G4-binding molecules as potent antiviral compounds (European patent 20 306 606.3 and Institut Pasteur DI 2020-59) and confirm the choice to target host-virus interactions in antiviral strategies.

The project received financial support from the Institut Pasteur (via the exceptional COVID-19 program, funded in part by donations), the French National Research Agency (ANR-Flash-Covid) and own funding from the various laboratories involved. It also involved the participation of several of the Institut Pasteur's technological platforms (PF-BMI, PF-3PR and PF-CCB).

Credit: 
Institut Pasteur

Characterized drugs show unexpected effects

image: Applying morphological and proteome profiling, a group of already characterized substances was identified that modulates cholesterol homeostasis.

Image: 
MPI of Molecular Physiology

When Alexander Fleming discovered a mould on a culture plate overgrown with bacteria in 1928, he did not expect to find one of the most widely used active substances: penicillin. Accidental discoveries and the identification of active ingredients from traditional remedies, such as the morphine of the opium poppy, have shaped the discovery of new medicines for a long time.

Modern drug discovery - from chance to system

Since then, major developments in chemistry and molecular biology have enabled a systematic and targeted search for potential active substances in modern drug discovery. First, advances in organic and especially combinatorial chemistry made it possible to produce huge substance libraries and test them for their pharmacological effect in high-throughput assays. Subsequently, technological advances such as the sequencing of the human genome and the development of new methods in molecular biology enabled the identification of disease-related cellular processes and their molecular key players. This paved the way towards modern drug discovery, where large libraries of molecules are screened in a high-throughput manner for their influence on relevant target molecules, mostly proteins. Identified substances, so-called hits, are then chemically optimized into lead structures that are effective in small doses and are well absorbed and distributed in the body.

No effect without side effects

This target-based drug development is very successful in identifying new drug candidates that prevent the target proteins from functioning or interacting with other proteins. However, potential drug candidates are rarely specific and very often also act on related proteins that have a similar function or structure. "It is not uncommon that an initially promising drug candidate unexpectedly shows serious side effects in a later phase of its longstanding development, thus limiting or even preventing its clinical use," says Slava Ziegler.

In search of unknown bioactivities

In order to track down possible side effects during drug development, potential drug candidates are screened in assays for their effect on known protein classes, biological processes and certain cellular properties. However, these tests can only reflect the expected bioactivity since the number of known target molecules in the cell is limited. So-called profiling approaches now offer the possibility of detecting a wider spectrum of activity. These unbiased tests investigate the influence on hundreds of cellular or genetic parameters recorded in a substance's profile that is compared with profiles of reference substances with known effects.

When drug profiles match

In their latest study, the group of Herbert Waldmann and Slava Ziegler combined two of these profiling approaches to identify bioactive substances from a library of about 15,000 natural product-inspired molecules and compared them with the profiles of known, active compounds. Applying the Cell Painting Assay, in which functional areas of the cell are stained and then microscopically examined for changes, a large cluster of substances with similar profiles was identified. However, it was not possible to predict the mode of action of the cluster, since the associated reference compounds had various activities and target molecules. With a subsequent search using proteome profiling, in which the quantities and thus the regulation of thousands of proteins were examined, the researchers were able to narrow down the cluster to a common activity - the modulation of cholesterol homeostasis - an unexpected biological activity for most of the reference substances in the cluster.
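
At its core, matching a compound's profile against reference profiles reduces to a similarity computation over feature vectors. The sketch below illustrates that idea with Pearson correlation between toy morphological profiles; the feature values, compound names, and 0.8 cutoff are invented for illustration - the actual Cell Painting analysis uses hundreds of features and more sophisticated clustering.

```python
import numpy as np

def profile_similarity(a, b):
    """Pearson correlation between two feature profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def matching_references(query, references, threshold=0.8):
    """Names of reference compounds whose profile correlates with
    the query above `threshold` (an illustrative cutoff)."""
    return [name for name, prof in references.items()
            if profile_similarity(query, prof) >= threshold]

# Toy 5-feature profiles (invented numbers, not real assay data)
refs = {
    "ref_lysosomotropic": [2.0, -1.0, 0.5, 1.5, -0.5],
    "ref_unrelated":      [-1.0, 2.0, -0.5, 0.0, 1.0],
}
query = [1.8, -0.9, 0.6, 1.4, -0.4]
hits = matching_references(query, refs)
```

A query compound is assigned to a cluster when its profile correlates strongly with the references there; an unexplained cluster, as in the study, then calls for a second, orthogonal profile such as the proteome data.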

Two birds with one stone: identifying new bioactivities and side effects

But how can substances with very different target molecules trigger the same effect? The researchers revealed that most of the compounds in the cluster accumulate in the lysosome, an organelle where cholesterol is stored temporarily for its further function in the cell. The lysosome has a lower pH value than the rest of the cell, and this is crucial for the functioning of the lysosomal digestive enzymes that process foreign and the cell's own biomolecules. In the lysosome, the substances from the cluster described increase the pH value and thus disrupt the function of this organelle and, in particular, the cholesterol balance of the cell. The fact that the compounds accumulate in the lysosome is not due to a specific target molecule in the lysosome but to their chemical and physical properties, which they have obtained through their structural optimization for improved solubility.

"Interestingly, disturbed cholesterol balance has already been linked to some drugs on the market, such as antipsychotics," notes Tabea Schneidewind, first author of the study. "With the combination of the two search strategies, we can kill two birds with one stone: unveil unknown side effects and identify new active substances and modes of action," says Slava Ziegler.

Targeting cholesterol homeostasis could possibly also disturb SARS-CoV-2 infections

Influencing cholesterol homeostasis seems to be a common feature of many compounds and should be taken into account when evaluating side effects of active substances. However, the observed activity is not per se undesired. "Currently, drugs and compounds with known modes of action are being intensively studied for inhibition of SARS-CoV-2 infection of host cells, and many compounds of our cluster have been identified to suppress this process. Interestingly, membrane cholesterol and thus proper cholesterol homeostasis are crucial for SARS-CoV-2 infection, as shown in several studies. Our data most likely explain the reason for the activity of these compounds against the virus: they alter cholesterol biosynthesis and localization in cells, which impairs coronavirus infection", says Slava Ziegler.

Credit: 
Max Planck Institute of Molecular Physiology

Rensselaer-designed platform could enable personalized immunotherapy

TROY, N.Y. -- An innovative testing platform that more closely mimics what cancer encounters in the body may allow for more precise, personalized therapies by enabling the rapid study of multiple therapeutic combinations against tumor cells. The platform, which uses a three-dimensional environment to more closely mirror a tumor microenvironment, is demonstrated in research published in Communications Biology.

"This whole platform really gives us a way to optimize personalized immunotherapy on a rapid, high throughput scale," said Jonathan Dordick, Institute Professor of chemical and biological engineering and member of the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer Polytechnic Institute, who led this research. "You can imagine somebody having cancer, and you quickly biopsy the tumor and then you use this biochip platform to identify very quickly -- within a day or two -- what specific treatment modality might be ideally suited against a particular cancer."

Of particular interest to researchers is the behavior of a specific type of immune cell known as natural killer (NK) cells, which seek out cancer or viruses within the body, bind to their receptors, and excrete an enzyme meant to kill the unwanted cells. The platform studied in this paper allows researchers to compare what happens when the NK cells are left to fight tumor cells on their own versus how they behave when an antibody or cancer drug, or a combination of the two, is added.

The platform is a small two-piece plastic chip that's about the size of a microscope slide. One side of the sandwich chip contains 330 tiny pillars upon which researchers can place an extracellular matrix, made of a gel-like substance, which mimics the mechanical environment of a tumor cell. When cancer cells are placed inside this gel-like structure, they're encouraged to grow into a spheroid shape, much as they would inside the body. The second piece contains 330 microwells within which NK cells can be added in suspension -- much as they would flow, untethered, inside the body.

At Rensselaer, Dordick collaborated with Seok-Joon Kwon, senior research scientist in CBIS, and Sneha Gopal, who recently received her Ph.D. based, in part, on this study. The Rensselaer team collaborated with researchers from Konyang University and Medical & Bio Decision Company Ltd. To test this platform, researchers studied two types of breast cancer cells, as well as pancreatic cancer cells, with various combinations of NK cells, two monoclonal antibodies, and an anti-cancer chemotherapy drug.

"You can screen very quickly to determine what combinations of NK cells, antibodies, and chemotherapeutic drugs target the cancer cells within the spheroid geometry," Dordick said. "What really is amazing is we see very significant differences between what happens in that spheroid, within the slots of the chip, versus what would happen in a more traditional two-dimensional cell culture that's often used in the screening."

In the spheroid design, for instance, the chemotherapy drug paclitaxel had little effect on the three types of cancer cells on its own, whereas in a traditional two-dimensional system, Dordick said, the drug may appear to do well. It performed dramatically better when it was combined with both NK cells and an antibody.

"This platform moves researchers closer to personalized medicine," said Deepak Vashishth, director of CBIS. "This work conducted by Professor Dordick and his research group is an excellent example of how we, at Rensselaer, are providing a new angle to human health by developing new approaches at the intersection of engineering and life sciences to enhance cures for diseases such as cancer."

To further the potential use of this tool, Dordick said that it must be tested on a wide range of cancer types, including a tumor microenvironment that consists of multiple different types of cells. In the future, he envisions that the platform has the potential to identify combination therapies that work best against a patient's specific cancer, enabling the identification and delivery of personalized immunotherapy.

Credit: 
Rensselaer Polytechnic Institute

Excess coffee: A bitter brew for brain health

It's a favourite first order of the day, but while a quick coffee may perk us up, new research from the University of South Australia shows that too much could be dragging us down, especially when it comes to brain health.

In the largest study of its kind, researchers have found that high coffee consumption is associated with smaller total brain volumes and an increased risk of dementia.

Conducted by UniSA's Australian Centre for Precision Health at SAHMRI together with a team of international researchers*, the study assessed the effects of coffee on the brain among 17,702 UK Biobank participants (aged 37-73), finding that those who drank more than six cups of coffee a day had a 53 per cent increased risk of dementia.

Lead researcher and UniSA PhD candidate, Kitty Pham, says the research delivers important insights for public health.

"Coffee is among the most popular drinks in the world. Yet with global consumption being more than nine billion kilograms a year, it's critical that we understand any potential health implications," Pham says.

"This is the most extensive investigation into the connections between coffee, brain volume measurements, the risks of dementia, and the risks of stroke - it's also the largest study to consider volumetric brain imaging data and a wide range of confounding factors.

"Accounting for all possible permutations, we consistently found that higher coffee consumption was significantly associated with reduced brain volume - essentially, drinking more than six cups of coffee a day may be putting you at risk of brain diseases such as dementia and stroke."

Dementia is a degenerative brain condition that affects memory, thinking, behaviour and the ability to perform everyday tasks. About 50 million people are diagnosed with the syndrome worldwide. In Australia, dementia is the second leading cause of death, with an estimated 250 people diagnosed each day.

Stroke is a condition where the blood supply to the brain is disrupted, resulting in oxygen starvation, brain damage and loss of function. Globally, one in four adults over the age of 25 will have a stroke in their lifetime. Data suggests that 13.7 million people will have a stroke this year with 5.5 million dying as a result.

Senior investigator and Director of UniSA's Australian Centre for Precision Health, Professor Elina Hyppönen, says while the news may be a bitter brew for coffee lovers, it's all about finding a balance between what you drink and what's good for your health.

"This research provides vital insights about heavy coffee consumption and brain health, but as with many things in life, moderation is the key," Prof Hyppönen says.

"Together with other genetic evidence and a randomised controlled trial, these data strongly suggest that high coffee consumption can adversely affect brain health. While the exact mechanisms are not known, one simple thing we can do is to keep hydrated and remember to drink a bit of water alongside that cup of coffee.

"Typical daily coffee consumption is somewhere between one and two standard cups of coffee. Of course, while unit measures can vary, a couple of cups of coffee a day is generally fine.

"However, if you're finding that your coffee consumption is heading up toward more than six cups a day, it's about time you rethink your next drink."

Credit: 
University of South Australia

Possible link between late-term births and better academic outcomes, study suggests

This release has been removed upon request of the submitting institution. Please contact Jennifer Forbes, 732-788-8301, jenn.forbes@rwjms.rutgers.edu for more information.

Credit: 
Rutgers University

Pathogens get comfy in designer goo

image: Rice University bioengineers have developed hydrogels of various stiffness to see if they are more hospitable to intestinal cells and bacteria in lab experiments. The hydrogels proved far better at supporting cultures than traditional glass and plastic slides.

Image: 
Rice University/Baylor College of Medicine

HOUSTON -- (July 22, 2021) -- Researchers who want bacteria to feel right at home in the laboratory have put out a new welcome mat.

Rice University bioengineers and Baylor College of Medicine scientists looking for a better way to mimic intestinal infections that cause diarrhea and other diseases have built and tested a set of hydrogel-based platforms to see if they could make both transplanted cells and bacteria comfy.

As a mechanical model of intestinal environments, the lab's soft, medium and hard polyethylene glycol (PEG) hydrogels were far more welcoming to the cells that normally line the gut than the glass and plastic usually used by laboratories. These cells can then host bacteria like Escherichia coli that are sometimes pathogenic. The ability to study their dynamics under realistic conditions can help scientists find treatments for the maladies they cause.

The researchers found strong correlation between the stiffness of hydrogels, which mimic intestinal mucus, and how well a diarrhea-causing strain of E. coli adhered to and aggregated atop the epithelial cells that normally line the intestines. They reported that softer hydrogels promoted "significantly greater bacterial adhesion," which they attribute to mucus and other extracellular matrix components expressed by the cells.

The study led by bioengineer Jane Grande-Allen of Rice's Brown School of Engineering and Anthony Maresso at Baylor, which appears in Acta Biomaterialia, proved the gels' value in experiments involving the soft interface between organs and microbial pathogens.

The Estes lab at Baylor built its model cultures using enteroids, constructs of intestinal cell cultures that scientists use to understand how epithelial cells respond to infectious invaders. Enteroids can incorporate a variety of cells found in the gut, but before Rice's hydrogels, they were grown on platforms that did not easily mimic the squishy tissues in host bodies.

"Tissue culture plastic is the standard for growing cells, but it's a really artificial environment because it's so rigid," Grande-Allen said. "It's hard plastic, like a glass slide. Certain cells grow just fine on tissue culture plastic, but they're not consistently easy to infect that way.

"This is the case with the cells in the jejunum, where most nutrient absorption happens in the small intestine," she said. "Our collaborators obtain human intestinal tissues from biopsies and bariatric surgeries to make enteroids, but the enteroids derived from jejunal cells had been difficult to infect with this pathogenic E. coli."

In Grande-Allen's lab, enteroid cells were grown on top of the hydrogel substrates. After a time, the researchers in Maresso's lab added bacteria and found that the enteric E. coli clustered easily on the intestinal cells grown on the medium and soft gels, but not on glass slides or stiff hydrogels.

All of the hydrogel-cultured enteroids showed significant enrichment of gene and signaling pathways related to epithelial differentiation, cell junctions and adhesions, extracellular matrix and mucins compared to those cultured on rigid surfaces.

The Rice lab reported its successful development of hydrogels for enteroid use in a previous paper with the Baylor researchers. "Getting the cells to adhere and spread on the hydrogels was tricky, which is why we wrote the methods paper," Grande-Allen said.

"But with that coating approach established, the hydrogel underneath could have a range of different stiffnesses," she said. "That was the variable in the new paper, and we were floored to find the effect that it had on bacterial adhesion.

"In general, stiffness and its effect on bacteria is rather understudied," Grande-Allen said. "Others have reported that bacteria grown directly on hydrogels prefer stiffer gels, and that finding will help to study biofilms. But here, our focus was trying to mimic the infectious disease process that actually happens in the gut, so we needed to involve the epithelial cells."

Grande-Allen said the hydrogels will be used to study other types of diarrhea-causing bacteria, including patient-specific isolates, but in the near term said her lab will look at the combined effect of stiffness and shear stress on bacterial adhesion to enteroids.

Credit: 
Rice University

What makes a market transaction morally repugnant?

Many people find it morally impermissible to put kidneys, children, or doctorates on the free market. But what makes a market transaction morally repugnant in the eyes of the public? And which transactions trigger the strongest collective disapproval? Researchers from the Max Planck Institute for Human Development and the Robert Koch Institute have addressed these questions. Their findings, published in Cognition, offer new entry points for policy interventions.

Would you be willing to sell a kidney or be paid to spend time on a date? If not, then you are not alone. Many people find the idea of selling and buying human organs, children, sex, or doctorates morally repugnant. But what are the psychological mechanisms behind these feelings? Which aspects of a transaction do people find most repugnant? A research team from the Max Planck Institute for Human Development and the Robert Koch Institute has investigated these questions.

"Our aim was to uncover the psychological drivers of people's feelings of repugnance towards such transactions," says Christina Leuker, lead author and researcher at the Robert Koch Institute, and associate research scientist in the Center for Adaptive Rationality at the Max Planck Institute for Human Development. "Once we know what makes a market transaction morally repugnant in the eyes of the public, we are in a better position to predict how people might respond to novel transactions, such as those arising from technological advances in the field of human genetic engineering."

To shine a light on the psychology of repugnance, the researchers conducted two online surveys, in which a total of 1,554 respondents judged 51 market transactions in terms of their repugnance and 21 other characteristics. These included the extent to which the transaction triggers anger or disgust, is harmful to society, affects the dignity of the seller, or leaves people open to exploitation.

The researchers found similar patterns of repugnance judgments across the respondents in both studies. Three transactions--selling rights to hunt endangered animals, selling brides, and selling voting rights--triggered the strongest collective disapproval. Moreover, the authors were able to identify five aspects that appear to underlie feelings of repugnance. One was moral outrage: The more moral outrage a transaction triggers, the more disgust and anger people feel, the less empathy they have for those engaged in the transaction, and the more harmful they think the transaction is to society.

The four other drivers of repugnance identified by the research team were the extent to which people want a transaction to be regulated; the extent to which a transaction's worth can or cannot be translated into a monetary value; the extent to which the transaction may exploit disadvantaged individuals; and the extent to which sellers are exposed to unknown risks or are unable to fully anticipate the consequences of the transaction.
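The release does not spell out the statistical procedure used to identify these five drivers, but dimensionality-reduction techniques such as principal component analysis are the standard tool for pulling a handful of latent factors out of many correlated ratings. A minimal, purely illustrative sketch on a mock ratings matrix (the data below are random numbers shaped like the study's respondent-by-characteristic judgments, not the actual survey responses):

```python
import numpy as np

# Illustrative only: a mock matrix of 1,554 respondents rating
# 21 characteristics, NOT the real survey data.
rng = np.random.default_rng(0)
ratings = rng.normal(size=(1554, 21))

def latent_components(X, n_components=5):
    """Extract the top latent dimensions of a ratings matrix via PCA:
    center the columns, then take the leading right-singular vectors.
    Each returned row is one candidate latent 'driver' loading."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt[:n_components]

components = latent_components(ratings)
```

On real data, each component's loadings would show which of the 21 judged characteristics (anger, disgust, perceived harm, and so on) cluster together into a single underlying driver such as moral outrage.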

Analyses showed that the degree of moral outrage was also a good predictor of the desire for regulation: Many transactions that triggered strong moral outrage were also characterized by a strong perceived need for regulation.

The researchers highlight that their approach can provide new entry points for policy interventions. "Transactions that prompt similar degrees of public repugnance may do so for very different reasons--and this has implications for policy interventions," says Ralph Hertwig, Director of the Center for Adaptive Rationality at the Max Planck Institute for Human Development. "For instance, if the main driver of repugnance toward a transaction is that it leaves disadvantaged individuals open to exploitation, an effective policy response may be geared toward protecting those who are vulnerable. If unknown risk is the main cause of repugnance, the policy may instead focus on reducing and clearly communicating potential risks."

The researchers also identified mismatches between the judged repugnance of a transaction and its current legal status. "For instance, UK respondents considered carbon emissions trading and selling permits to shoot rare animals highly repugnant, yet both are legal in their country," says Christina Leuker. "Such mismatches may be grounds for policy makers to reevaluate those transactions."

Credit: 
Max Planck Institute for Human Development

Cell-analysis technique could combat tuberculosis

ITHACA, N.Y. - A new method that analyzes how individual immune cells react to the bacteria that cause tuberculosis could pave the way for new vaccine strategies against this deadly disease, and provide insights into fighting other infectious diseases around the world.

The cutting-edge technologies were developed in the lab of Dr. David Russell, the William Kaplan Professor of Infection Biology in the Department of Microbiology and Immunology in the College of Veterinary Medicine, and detailed in new research published in the Journal of Experimental Medicine on July 22.

For years, Russell's lab has sought to unravel how Mycobacterium tuberculosis (Mtb), the bacteria that cause tuberculosis, infect and persist in their host cells, which are typically immune cells called macrophages.

The lab's latest innovation combines two analytical tools that each target a different side of the pathogen-host relationship: "reporter" Mtb bacteria that glow different colors depending on how stressed they are in their environment; and single-cell RNA sequencing (scRNA-seq), which yields RNA transcripts of individual host macrophage cells.

"For the first time ever, Dr. Davide Pisu in my lab combined these two approaches to analyze Mtb-infected immune cells from an in vivo infection," Russell said.

After infecting mice with the fluorescent reporter Mtb bacteria, Russell's team was able to gather and flow-sort individual Mtb-infected macrophages from the mouse lung. The researchers then determined which macrophages promoted Mtb growth (sporting happy, red-glowing bacteria) or contained stressed Mtb unlikely to grow (unhappy, green-glowing bacteria).

Next, they took the two sorted, infected macrophage populations and ran them through single-cell RNA sequencing analysis, thereby generating transcriptional profiles of each individual host cell in both populations.

When the scientists compared the macrophage single cell sequencing data with the reporter bacteria phenotype, they found an almost perfect one-to-one correlation between the fitness status of the bacterium and the transcriptional profile in the host cell. Macrophages that housed unhappy green bacteria also expressed genes that were known to discourage bacterial growth, while those with happy red bacteria expressed genes known to promote bacterial growth.

Scientific experiments rarely play out so nicely.

"What absolutely stunned us is how well it worked," Russell said. "When Davide Pisu showed me the analysis I nearly fell off my chair."

Normally, phenotypes and transcriptional profiles are two characteristics that seldom come together in a perfect match, and this almost never happens from in vivo data.

This near-perfect matchup revealed new nuances.

"While our previous results identifying the resident alveolar macrophages (AM) as permissive and the blood monocyte-derived recruited macrophages (IM) as controlling Mtb infection were correct in a broad sense, we found, unsurprisingly, that this was an oversimplification," Russell said.

There was variation even within these two different macrophage types: Some AM cells controlled Mtb growth while some IMs were allowing bacterial expansion. The team found that comparable subsets of immune cells were present in both human and mouse lung samples.

An additional step in the study was to look at whether the responses of AM and IM cells to Mtb were epigenetically controlled--meaning that the cells' traits are due to certain genes being turned off or on by molecular switches. This could explain how the two sets of macrophages respond differently. Using a read-out of a cell's epigenetic landscape, they found that this was the case.

"The analysis showed that when these cells are exposed to Mtb or the vaccine strain--through infection or vaccination, respectively--their epigenetic programming has a major influence over whether their response leads to disease control or progression," Russell said.

Armed with this body of new information, the Russell lab plans to hit the ground running to identify novel therapeutics. "We're going to begin by screening libraries of known epigenetic inhibitor compounds to see which ones might be useful in modifying the immune response," Russell said.

If they do find promising compounds - ones that push macrophages towards an anti-Mtb behavior - they could potentially be used in combination with vaccines to assist a patient's immune system in protecting against tuberculosis.

The finding lays a foundation for more powerful studies on how pathogens affect individual cells, allowing for a holistic examination of the system.

"This is a roadmap that lets us look across an entire population of cells and see how a single perturbation impacts the cells across that population," Russell said. "We can test for drug efficacy in in vivo infection without any preconceived limitation on how compounds may function."

This approach is extremely flexible and could be used in the study of any intracellular pathogen, including viruses, and is readily applicable to any animal challenge model.

Credit: 
Cornell University

Alpha variant spread via 'super-seeding' event in UK: Oxford research

The rapid spread of the Alpha variant of COVID-19 resulted from biological changes in the virus and was enhanced by large numbers of infected people 'exporting' the variant to multiple parts of the UK, in what the researchers call a 'super-seeding' event.

Results of the largest phylogeographic analysis ever conducted, published today in the journal Science, map the spread of the variant (also known as lineage B.1.1.7) from its origins in Kent and Greater London in November 2020 to all but five counties in Wales, Scotland, Northern Ireland and England by 19 January.

Dr Moritz Kraemer, lead author on the study and Branco Weiss Research Fellow in Oxford's Department of Zoology, says, 'At the beginning of December 2020 the epicentre of COVID-19 transmission in England shifted rapidly from the North West and North East to London and the South East, as the Alpha variant took hold.

'As people travelled from London and the South East to other areas of the UK they 'seeded' new transmission chains of the variant. This continued as a national 'super-seeding' event which did not start to slow until early January. Although travel was curbed after restrictions were introduced on 20 December, this was compensated for by the continued exponential growth in Alpha variant cases.'

The rapid spread of the Alpha variant across the UK led to initial reports that it could be up to 80% more transmissible than the original strain. This study, published today by researchers at universities including Oxford, Northeastern and Edinburgh, shows mobility significantly affected its spread and early growth rates. According to the researchers, this highlights the need for epidemiologists to work closely with virologists and geneticists to rapidly create accurate transmissibility estimates for new variants.

Professor Oliver Pybus, lead researcher of the Oxford Martin Programme on Pandemic Genomics, explains, 'Estimates of Alpha's transmission advantage over previous strains were initially 80%, but declined through time. We found Alpha's emergence was a combination of virus genetic changes and transient epidemiological factors. An initial wave of Alpha variant export to places in England with low rates of infection, from the massive outbreak in Kent and Greater London, explains why at first it spread so fast.

'The Alpha variant does contain genetic changes which make it more transmissible. It is likely the Alpha variant was 30% to 40% more transmissible than the initial strain. And the early estimates were higher because we did not know how much its growth was exacerbated by human mobility and by how many contacts different groups of people have. Crucially, as more variants emerge and spread in other countries worldwide, we must be careful to account for these phenomena when evaluating the intrinsic transmissibility of new variants.'

Verity Hill, co-author and researcher from the University of Edinburgh, expands, 'The Alpha variant began by spreading mostly within London and the South East, even during the November lockdown in England. Once this was lifted, it spread rapidly across the country, as human movement increased significantly. Our ability to trace the origins of the Alpha variant back to a point source in the South East of England has important implications for how new variants arise and how they will spread across the UK.'

Dr Samuel V. Scarpino, lead researcher from the Network Science Institute at Northeastern University and External Faculty at the Santa Fe Institute, highlights the importance of integrative pathogen surveillance systems, 'Only by integrating high-resolution genomic, case, testing, and aggregated, anonymous mobility data were we able to identify the drivers of Alpha variant emergence and spread in the UK.

'Uncovering the mechanisms of B.1.1.7 emergence allows governments to respond more effectively and advances our scientific understanding of epidemics. The challenge now is to build similar surveillance systems globally. Equitable, ethical data systems will be critical for ending this pandemic and preventing future ones.'

Dr Kraemer concludes, 'As new variants emerge we expect they will spread significantly before travel restrictions are put in place, as likely happened with the Delta variant. Given the scale of its current outbreak, it seems probable that the UK is now an important exporter of the Delta variant across Europe and some other parts of the world.

'The UK has decided to ease its restrictions because of our high vaccination rates and a confidence that we have protected the most vulnerable people in society. But that's not the case in most other countries and the Delta variant could be starting this process again elsewhere, highlighting the urgent need for faster and equitable distribution of vaccines worldwide.'

Credit: 
University of Oxford

Soft skin patch could provide early warning for strokes, heart attacks

image: This soft, stretchy skin patch uses ultrasound to monitor blood flow to organs like the heart and brain.

Image: 
Nature Biomedical Engineering

Engineers at the University of California San Diego developed a soft and stretchy ultrasound patch that can be worn on the skin to monitor blood flow through major arteries and veins deep inside a person's body.

Knowing how fast and how much blood flows through a patient's blood vessels is important because it can help clinicians diagnose various cardiovascular conditions, including blood clots; heart valve problems; poor circulation in the limbs; or blockages in the arteries that could lead to strokes or heart attacks.

The new ultrasound patch developed at UC San Diego can continuously monitor blood flow--as well as blood pressure and heart function--in real time. Wearing such a device could make it easier to identify cardiovascular problems early on.

A team led by Sheng Xu, a professor of nanoengineering at the UC San Diego Jacobs School of Engineering, reported the patch in a paper published July 16 in Nature Biomedical Engineering.

The patch can be worn on the neck or chest. What's special about the patch is that it can sense and measure cardiovascular signals as deep as 14 centimeters inside the body in a non-invasive manner. And it can do so with high accuracy.

"This type of wearable device can give you a more comprehensive, more accurate picture of what's going on in deep tissues and critical organs like the heart and the brain, all from the surface of the skin," said Xu.

"Sensing signals at such depths is extremely challenging for wearable electronics. Yet, this is where the body's most critical signals and the central organs are buried," said Chonghe Wang, a former nanoengineering graduate student in Xu's lab and co-first author of the study. "We engineered a wearable device that can penetrate such deep tissue depths and sense those vital signals far beneath the skin. This technology can provide new insights for the field of healthcare."

Another innovative feature of the patch is that the ultrasound beam can be tilted at different angles and steered to areas in the body that are not directly underneath the patch.

This is a first in the field of wearables, explained Xu, because existing wearable sensors typically only monitor areas right below them. "If you want to sense signals at a different position, you have to move the sensor to that location. With this patch, we can probe areas that are wider than the device's footprint. This can open up a lot of opportunities."

How it works

The patch is made up of a thin sheet of flexible, stretchable polymer that adheres to the skin. Embedded on the patch is an array of millimeter-sized ultrasound transducers. Each is individually controlled by a computer--this type of array is known as an ultrasound phased array. It is a key part of the technology because it gives the patch the ability to go deeper and wider.

The phased array offers two main modes of operation. In one mode, all the transducers can be synchronized to transmit ultrasound waves together, which produces a high-intensity ultrasound beam that focuses on one spot as deep as 14 centimeters in the body. In the other mode, the transducers can be programmed to transmit out of sync, which produces ultrasound beams that can be steered to different angles.
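The steered mode described above follows the standard phased-array delay law: tilting the beam by an angle θ means firing each element slightly later than its neighbor, with a delay proportional to d·sin(θ)/c. A minimal sketch of that delay calculation (the 12-element row, 0.5 mm pitch, and tissue sound speed here are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical parameters for illustration only.
NUM_ELEMENTS = 12          # one row of the 12 x 12 array
PITCH_M = 0.5e-3           # assumed element spacing
SOUND_SPEED_M_S = 1540.0   # typical speed of sound in soft tissue

def steering_delays(angle_deg, num_elements=NUM_ELEMENTS,
                    pitch=PITCH_M, c=SOUND_SPEED_M_S):
    """Per-element firing delays (seconds) that tilt a linear array's
    beam by `angle_deg` from the array normal. Successive elements fire
    progressively later so their wavefronts add up along the steered
    direction."""
    theta = np.radians(angle_deg)
    delays = np.arange(num_elements) * pitch * np.sin(theta) / c
    return delays - delays.min()  # shift so the earliest firing is at t=0

delays = steering_delays(30.0)
```

Setting the angle to zero gives identical (zero) delays, i.e. all elements fire together as in the synchronized mode; focusing at a specific depth would additionally require curvature in the delay profile.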

"With the phased array technology, we can manipulate the ultrasound beam in the way that we want," said Muyang Lin, a nanoengineering Ph.D. student at UC San Diego who is also a co-first author of the study. "This gives our device multiple capabilities: monitoring central organs as well as blood flow, with high resolution. This would not be possible using just one transducer."

The phased array consists of a 12 by 12 grid of ultrasound transducers. When electricity flows through the transducers, they vibrate and emit ultrasound waves that travel through the skin and deep into the body. When the ultrasound waves penetrate through a major blood vessel, they encounter movement from red blood cells flowing inside. This movement changes or shifts how the ultrasound waves echo back to the patch--an effect known as Doppler frequency shift. This shift in the reflected signals gets picked up by the patch and is used to create a visual recording of the blood flow. This same mechanism can also be used to create moving images of the heart's walls.
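The Doppler mechanism described above is captured by the standard ultrasound Doppler equation, f_d = 2·f0·v·cos(θ)/c, where f0 is the transmit frequency, v the blood velocity, θ the angle between beam and flow, and c the speed of sound in tissue. A small sketch (the 2 MHz transmit frequency below is a hypothetical example, not the patch's actual operating frequency):

```python
import math

SOUND_SPEED_M_S = 1540.0  # typical speed of sound in soft tissue

def doppler_shift(f0_hz, velocity_m_s, angle_deg=0.0, c=SOUND_SPEED_M_S):
    """Frequency shift (Hz) of an ultrasound echo from blood moving at
    `velocity_m_s`, with `angle_deg` between the beam and the flow.
    The factor of 2 accounts for the round trip: the wave travels to
    the moving red blood cells and is reflected back to the transducer."""
    return 2.0 * f0_hz * velocity_m_s * math.cos(math.radians(angle_deg)) / c

# E.g., a hypothetical 2 MHz beam and 0.5 m/s flow along the beam axis:
shift = doppler_shift(2e6, 0.5)
```

The sign of the shift distinguishes flow toward the probe (upshift) from flow away from it (downshift), which is how a velocity-resolved recording of blood flow can be built up from the reflected signals.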

A potential game changer in the clinic

For many people, blood flow is not something that is measured during a regular visit to the physician. It is usually assessed after a patient shows some signs of cardiovascular problems, or if a patient is at high risk.

The standard blood flow exam itself can be time consuming and labor intensive. A trained technician presses a handheld ultrasound probe against a patient's skin and moves it from one area to another until it's directly above a major blood vessel. This may sound straightforward, but results can vary between tests and technicians.

Since the patch is simple to use, it could solve these problems, said Sai Zhou, a materials science and engineering Ph.D. student at UC San Diego and co-author of the study. "Just stick it on the skin, then read the signals. It's not operator dependent, and it poses no extra work or burden to the technicians, clinicians or patients," he said. "In the future, patients could wear something like this to do point of care or continuous at-home monitoring."

In tests, the patch performed as well as a commercial ultrasound probe used in the clinic. It accurately recorded blood flow in major blood vessels such as the carotid artery, which is an artery in the neck that supplies blood to the brain. Having the ability to monitor changes in this flow could, for example, help identify if a person is at risk for stroke well before the onset of symptoms.

The researchers point out that the patch still has a long way to go before it is ready for the clinic. Currently, it needs to be connected to a power source and benchtop machine in order to work. Xu's team is working on integrating all the electronics on the patch to make it wireless.

Credit: 
University of California - San Diego