
Communication: A key tool for citizen participation in science

Researchers from Pompeu Fabra University (Barcelona, Spain) have analysed the way citizen science is practised in Spain. The paper, produced by Carolina Llorente and Gema Revuelta of UPF's Science, Communication and Society Studies Centre (CCS-UPF), and Mar Carrió of the University's Health Sciences Educational Research Group (GRECS), has been published in the Journal of Science Communication (JCOM).

Based on the study, the authors put forward a series of recommendations to improve how citizen participation in science is carried out. Firstly, they suggest stepping up training in how to assess these initiatives, or creating multidisciplinary teams covering a broad range of knowledge areas to organise this kind of assessment. They also highlight the importance of keeping the activities' reproducibility in mind.

The aims of the study were to identify citizen science participants in Spain, define what role communication plays and analyse to what extent the key requirements for undertaking citizen science are integrated in its practice. This is the first time this aspect has been studied in Spain.

In this qualitative study, the researchers analysed 16 interviews with coordinators of science activities that involve the active participation of citizens in different phases of a research project. This participation could consist of providing opinions, collaborating in data collection, interpreting results and/or evidence-based decision-making.

The results indicate a largely strategic selection of participants: groups of people with traits that make them ideal for certain projects (such as neighbours, patients or public administration staff). The authors also highlight captive audiences, such as school students: people who take part in activities without having volunteered and who have no choice in the matter.

The researchers underline communication as a key tool for successful practice. Gema Revuelta explains that "efforts must be spent in the conceptualisation phase to identify the participants, the best strategies for ensuring their participation and the expected level of commitment for the project".

The selection of a strategic public is essential for an activity of this nature to work properly. Although most interviewees made reference to this, some continue to view the public as a single entity. Carolina Llorente stresses that "it makes no sense to identify the public as the 'general public' in this kind of activity. Efforts need to be made when designing the activity to identify which specific groups should participate".

They also analysed the level of integration of five key elements of a citizen science activity: the findings, the level of participant contribution, participation assessment, the reproducibility of the activity, and the training of participants and facilitators. Of particular importance here is training in specific skills suited to the level of citizen participation, and the need to train the teams responsible for organising the activities.

Researcher Carolina Llorente explains that the resulting insight into how citizen science is being performed "gives us a starting point for proposing improvement strategies to incentivise this way of doing research".

Credit: 
Universitat Pompeu Fabra - Barcelona

Worms learn how to optimize foraging by switching their response to social cues

image: An image of a Caenorhabditis elegans (C. elegans) worm.

Image: 
Adapted from work by Arne Hendriks (CC BY 2.0 - https://creativecommons.org/licenses/by/2.0)

Researchers have shown how worms learn to optimise their foraging activity by switching their response to pheromones in the environment, according to a report published today in eLife.

The findings are an important advance in the field of animal behaviour, providing new insights on how sensory cues are integrated to facilitate foraging and navigation.

Foraging for food is one of the most critical yet challenging activities for animals, as food is often patchily distributed and other animals are trying to find and consume the same resources.

An important consideration is how long to stay and exploit a food patch before moving on to find another. Leaving incurs the cost of exploring a new territory, whereas staying put means feeding in a patch where resources are depleting. So how do animals know when to leave?

Natural habitats are usually full of chemical cues, such as pheromones from other individuals, and it is thought that pheromones help animals orientate their search for food. This means they need to know whether pheromones point to an abundant food resource, or an already exploited one. To acquire this knowledge, they need to learn from experience.

"It has been shown that bumblebees learn from the positive or negative association with pheromones acquired from their most recent feeding experience, but whether this is the case for other animals is unclear," explains first author Martina Dal Bello, Postdoctoral Associate at the Massachusetts Institute of Technology (MIT), Cambridge, US. "We wanted to investigate the foraging behaviours of Caenorhabditis elegans worms because we know that they can evaluate population density inside food patches using pheromones, and that both pheromones and food availability control when the worms decide to move on to a new food resource."

The team started with a test to assess the worms' patch-leaving behaviour. Worms fed on a small patch of bacteria for about five hours, equidistant from two spots: one a blend of pheromones and the other a non-pheromone control. The worms left the original patch at very different times; those that left early were more likely to go to the pheromone spot, whereas those that left late avoided the pheromones.

Next, they wanted to test whether these behaviours provided any survival benefits, so they developed a mathematical model to calculate the benefit to the worms of changing their preference for pheromones in a patchy environment. In the model, worms in an already occupied food patch have three choices: remain in their food patch, switch to another occupied food patch, or disperse away to an unoccupied patch. The model showed that the strategy that maximises the food eaten by a worm aligns with what they observed in the previous experiment. A fraction of worms switch at the beginning, leaving the food patch before it is depleted and following pheromones to reach another occupied patch. Then, once the food patches are depleted, all worms will disperse to find other food sources. These late leavers avoid other depleted patches by reversing their preference for pheromones, because they now associate pheromones with a depleted food patch.
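
To see why leaving early can pay off, consider a minimal toy model of the trade-off. This is an illustrative sketch, not the model from the eLife paper; the patch yield, depletion rate and travel cost below are all invented:

```python
# Toy patch-leaving model: a worm feeds on a patch whose yield decays
# as it is eaten; leaving costs `travel` non-feeding steps, after which
# the worm arrives at a fresh patch. All parameters are invented.
T, travel, decay = 100, 10, 0.96   # total steps, travel cost, depletion

def total_intake(leave_at):
    """Total food eaten over T steps if the worm leaves once, at `leave_at`."""
    eaten, food, t = 0.0, 1.0, 0
    while t < T:
        if t == leave_at:
            t += travel            # searching: nothing is eaten
            food = 1.0             # arrive at a fresh, undepleted patch
            continue
        eaten += food
        food *= decay              # the patch depletes as the worm feeds
        t += 1
    return eaten

best = max(range(T), key=total_intake)
print(f"best time to leave: step {best}; intake {total_intake(best):.1f} "
      f"vs {total_intake(T):.1f} if the worm never leaves")
```

Even this crude setup produces an interior optimum: leaving while food remains, despite the travel cost, beats staying put.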

So how do the worms know to reverse their preference for pheromones? To find out, the researchers kept worms in four sets of conditions: one with food and pheromones; one with pheromones but without food; one with food but without pheromones; and one without any food or pheromones. They then monitored how the worms reacted to a blend of pheromones. As anticipated, worms that spent time in the presence of both food and pheromones moved towards the pheromone blend, whereas they avoided it if they were used to the environment with pheromones but without food. As with bumblebees, the worms learn to prefer pheromones based on their recent positive or negative experience with foraging.

"Our study explains why worms tend to leave food patches at different times," explains senior author Jeff Gore, Professor of Physics at MIT. "Those that leave early are exposed to pheromones when food is abundant and have a positive association with these cues, which leads them to seek them out again. By contrast, worms that leave later when food is sparse associate pheromones with being famished, and avoid pheromones when they move away from the patch. Taken together, these results show that worms learn to adapt to sensory cues in their environment to optimise food intake during foraging."

Credit: 
eLife

Lab analysis finds near-meat and meat not nutritionally equivalent

DURHAM, N.C. -- Plant-based meat substitutes taste and chew remarkably like real beef, and the 13 items listed on their nutrition labels (vitamins, fats and protein) make them seem essentially equivalent.

But a Duke University research team's deeper examination of the nutritional content of plant-based meat alternatives, using a sophisticated analytical approach known as 'metabolomics,' shows they're as different as plants and animals.

Meat-substitute manufacturers have gone to great lengths to make the plant-based product as meaty as possible, including adding leghemoglobin, an iron-carrying molecule from soy, and red beet, berries and carrot extracts to simulate bloodiness. The texture of near-meat is thickened by adding indigestible fibers like methyl cellulose. And to bring the plant-based meat alternatives up to the protein levels of meat, they use isolated plant proteins from soy, peas, and other plant sources. Some meat-substitutes also add vitamin B12 and zinc to further replicate meat's nutrition.

However, many other components of nutrition do not appear on the labels, and that's where the products differ widely from meat, according to the study, which appears this week in Scientific Reports.

The metabolites that the scientists measured are building blocks of the body's biochemistry, crucial to the conversion of energy, signaling between cells, building structures and tearing them down, and a host of other functions. There are expected to be more than 100,000 of these molecules in biology and about half of the metabolites circulating in human blood are estimated to be derived from our diets.

"To consumers reading nutritional labels, they may appear nutritionally interchangeable," said Stephan van Vliet, a postdoctoral researcher at the Duke Molecular Physiology Institute who led the research. "But if you peek behind the curtain using metabolomics and look at expanded nutritional profiles, we found that there are large differences between meat and a plant-based meat alternative."

The Duke Molecular Physiology Institute's metabolomics core lab compared 18 samples of a popular plant-based meat alternative to 18 grass-fed ground beef samples from a ranch in Idaho. The analysis of 36 carefully cooked patties found that 171 out of the 190 metabolites they measured varied between beef and the plant-based meat substitute.

The beef contained 22 metabolites that the plant substitute did not. The plant-based substitute contained 31 metabolites that meat did not. The greatest distinctions occurred in amino acids, dipeptides, vitamins, phenols, and types of saturated and unsaturated fatty acids found in these products.
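
As a rough illustration of the kind of per-metabolite comparison behind these counts, one can measure each metabolite in every sample, test each for a group difference, and count how many differ. The simulated data, effect sizes, and the Welch t-test with a Bonferroni cut-off below are assumptions for illustration, not the paper's exact statistical pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_samples, n_metabolites = 18, 190     # 18 patties per product, as in the study
beef = rng.lognormal(mean=0.0, sigma=0.5, size=(n_samples, n_metabolites))
shift = np.zeros(n_metabolites)
shift[:170] = 1.0                      # simulate most metabolites differing
plant = rng.lognormal(mean=shift, sigma=0.5, size=(n_samples, n_metabolites))

# Welch's t-test per metabolite on log abundances, Bonferroni-corrected.
t, p = stats.ttest_ind(np.log(beef), np.log(plant), axis=0, equal_var=False)
print(f"{np.sum(p < 0.05 / n_metabolites)} of {n_metabolites} simulated metabolites differ")
```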

Several metabolites known to be important to human health were found either exclusively or in greater quantities in beef, including creatine, spermine, anserine, cysteamine, glucosamine, squalene, and the omega-3 fatty acid DHA. "These nutrients have potentially important physiological, anti-inflammatory, and/or immunomodulatory roles," the authors said in the paper.

"These nutrients are important for our brain and other organs including our muscles" van Vliet said. "But some people on vegan diets (no animal products), can live healthy lives - that's very clear." Besides, the plant-based meat alternative contained several beneficial metabolites not found in beef such as phytosterols and phenols.

"It is important for consumers to understand that these products should not be viewed as nutritionally interchangeable, but that's not to say that one is better than the other," said van Vliet, a self-described omnivore who enjoys a plant-heavy diet but also eats meat. "Plant and animal foods can be complementary, because they provide different nutrients."

He said more research is needed to determine whether there are short-term or long-term effects of the presence or absence of particular metabolites in meat and plant-based meat alternatives.

Credit: 
Duke University

Biochemical pathway to skin darkening holds implications for prevention of skin cancers

BOSTON - A skin pigmentation mechanism that can darken human skin as a natural defense against ultraviolet (UV)-associated cancers has been discovered by scientists at Massachusetts General Hospital (MGH). Mediating the biological process is an enzyme, NNT, which plays a key role in the production of melanin (a pigment that protects the skin from harmful UV rays) and whose inhibition through a topical drug or ointment could potentially reduce the risk of skin cancers. The study was published online in Cell.

"Skin pigmentation and its regulation are critically important because pigments confer major protection against UV-related cancers of the skin, which are the most common malignancies found in humans," says senior and co-corresponding author David Fisher, MD, PhD, chief of the Department of Dermatology at MGH. "Darker-pigmented individuals are better protected from cancer-causing UV radiation by the light-scattering and antioxidant properties of melanin, while people with the fairest and lightest skin are at highest risk of developing skin cancers."

Through their laboratory work with skin from humans and animal models, the MGH researchers mimicked the natural protection that exists in people with dark pigments. In the process, they gained a fuller understanding of the biochemical mechanisms involved, along with their drivers, and how they might be influenced by a topical agent independent of UV radiation, sun exposure, or genetics.

"We had assumed that the enzymes that make melanin by oxidizing the amino acid tyrosine in the melanosome (the synthesis and storage compartment of the cell) are largely regulated by gene expression," explains Fisher. They were surprised to learn, however, that the amount of melanin being produced is in large part regulated by a much different chemical mechanism, one that can ultimately be traced to an enzyme in the mitochondria, the inner chamber of the cell, with the ability to alter skin pigmentation.

That enzyme is nicotinamide nucleotide transhydrogenase, or NNT. Researchers found that topical application of small-molecule inhibitors of NNT darkened human skin, and that mice with decreased NNT function displayed increased fur pigmentation. To test their discovery, they challenged the skin with UV radiation and found that the skin with darker pigments was indeed protected from DNA damage inflicted by ultraviolet rays.

"We're excited by the discovery of a distinct pigmentation mechanism because it could pave the way, after additional studies and safety assessments, for a new approach to skin darkening and protection by targeting NNT," says Elisabeth Roider, MD, previously an investigator with MGH, and lead author and co-corresponding author of the study. "The overarching goal, of course, is to improve skin cancer prevention strategies and to offer effective new treatment options to the millions of people suffering from pigmentary disorders."

Credit: 
Massachusetts General Hospital

Fecal transplant plus fibre improves insulin sensitivity in severely obese

A transplant of healthy gut microbes followed by fibre supplements benefits patients with severe obesity and metabolic syndrome, according to University of Alberta clinical trial findings published today in Nature Medicine.

Patients who were given a single-dose oral fecal microbial transplant followed by a daily fibre supplement were found to have better insulin sensitivity and higher levels of beneficial microbes in their gut at the end of the six-week trial. Improved insulin sensitivity allows the body to use glucose more effectively, reducing blood sugar.

"They were much more metabolically healthy," said principal investigator Karen Madsen, professor of medicine in the Faculty of Medicine & Dentistry and director of the Centre of Excellence for Gastrointestinal Inflammation and Immunity Research.

"These patients were on the best known medications (for metabolic syndrome) and we could improve them further, which shows us there is an avenue for improvement by targeting these different pathways in the microbiome."

Sixty-one patients with a body mass index of 40 or higher completed the double-blind, randomized trial. Recruited from the bariatric surgery waitlist in Edmonton, all had metabolic syndrome, a condition that includes insulin resistance, high blood glucose, high blood pressure and other complications. It can eventually lead to diabetes.

The microbiome is all of the micro-organisms (bacteria, viruses, protozoa and fungi) found in the gastrointestinal tract. People with various diseases are known to have altered microbial contents. It is not fully understood whether microbiome changes cause disease or whether disease causes changes in the gut, but it is likely a bit of both, Madsen said. It is known that replacing unhealthy bacteria with healthy bacteria can lead to improved health.

Fecal transplants, which contain microbes from healthy stool donors, are currently used extensively for treating Clostridium difficile (C. difficile) bacterial infections, and research is underway to test their usefulness in treating other illnesses such as inflammatory bowel disease, mental health conditions and metabolic disorders.

"We know that the gut microbiome affects all of these processes--inflammation, metabolism, immune function," said Madsen, who is a member of the Women and Children's Health Research Institute and is one of the University of Alberta leads for the national Microbiome Research Core (IMPACTT).

"The potential for improving human health through the microbiome is immense," Madsen said. "We are only scratching the surface at the moment."

This is the first study to show that oral delivery of fecal transplantation is effective in patients with obesity-related metabolic syndrome.

A previous study done in Europe on a small number of male patients with obesity and metabolic syndrome had shown promising results, but the transplants in that study were given through an invasive endoscopy (a tube down the throat) and the patients had milder disease.

The fecal microbial transplants in this study were from four lean, healthy donors, and were taken by mouth in a single dose of about 20 capsules prepared in a U of A lab. The capsules have no taste or odour.

The fibre supplements following the transplant were key to the success, Madsen said.

"When you transplant beneficial microbes, you need to feed them to keep them around," Madsen explained. "If you give a new microbe and you don't feed it, if you continue to eat a diet of processed foods and no fibre, then that microbe will likely die."

Our bodies do not naturally produce the enzymes needed to break down fibre, but fibre is exactly what healthy bacteria in the microbiome need to live; hence the supplements. The team experimented with fermentable fibre (the kind found in beans, which produces gas) and non-fermentable fibre (essentially cellulose, found in whole grains).

"Non-fermentable fibre can change gut motility--how fast things move through--as well as acting as a bulking and binding agent that can change levels of bile acids, which could help explain our results," Madsen explained.

Madsen said the next step will be to do a longer study with more participants in multiple centres to learn how the transplant/fibre combination works and to monitor for changes in medication requirements, weight loss and other indicators. If results continue to show benefit, she said the pills could be available as a potential therapy within five years.

While scientists continue to narrow down which bacteria are the most beneficial for us, Madsen recommends we support the health of our own gut microbiome by eating fewer processed foods and more foods that contain fibre, such as whole grains, fruits and vegetables.

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Why men take more risks than women

Researchers from HSE University and the Max Planck Institute for Human Cognitive and Brain Sciences have discovered how the brain's theta rhythm is linked to gender differences in attitudes to risk. In an article published in the journal Frontiers in Neuroscience (https://www.frontiersin.org/articles/10.3389/fnins.2021.608699/full), the researchers address which processes can be explained by knowing this connection.

By transmitting signals, the brain's neurons generate electromagnetic fields. The sheer number of neurons makes these fields strong enough to be recorded on the surface of the head using magneto- and electroencephalography. The resulting recording of the brain's electrical activity is divided into frequency bands, or brain rhythms, which are designated by Greek letters. For each rhythm, we know in which parts of the brain it is generated, the state in which a person exhibits it (e.g., at rest or during tasks) and the processes it may be associated with.

Existing research suggests that many differences in behaviour, including attitudes toward risk, can be at least partly explained by individual characteristics of brain activity. On average, women are known to take risks less frequently than men, and experiments have shown a correlation between willingness to take risks and differences in the strength of right and left frontal lobe theta rhythms (frontal theta asymmetry). However, these studies either included only or mostly women, and it remains unclear whether the asymmetry of theta rhythms actually contributes to gender differences in risk-taking.
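
For readers unfamiliar with the measure, frontal theta asymmetry is typically computed as the difference in log theta-band power between right and left frontal sensors. The sketch below shows the general recipe; the channel roles, the 4-8 Hz band edges and the simulated signals are illustrative assumptions rather than the study's actual pipeline:

```python
import numpy as np
from scipy.signal import welch

fs = 500                                    # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
left = np.sin(2 * np.pi * 6 * t) + rng.normal(size=t.size)          # F3-like
right = 1.3 * np.sin(2 * np.pi * 6 * t) + rng.normal(size=t.size)   # F4-like

def theta_power(x, lo=4.0, hi=8.0):
    """Integrated power spectral density in the theta band (4-8 Hz)."""
    f, pxx = welch(x, fs=fs, nperseg=2 * fs)
    band = (f >= lo) & (f <= hi)
    return np.trapz(pxx[band], f[band])

# A common asymmetry index: log(right) - log(left); positive values
# indicate stronger right-frontal theta.
asymmetry = np.log(theta_power(right)) - np.log(theta_power(left))
print(f"frontal theta asymmetry: {asymmetry:.3f}")
```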

The authors of the new article set three objectives:

The first was to determine whether there is a correlation between risk attitudes and frontal theta asymmetry in a sample with more or less equal numbers of male and female subjects.

The second was to test whether the combined strength of the theta rhythms of both frontal lobes is associated with behaviour under uncertainty (there is already evidence for this).

The third was to determine whether the neuronal oscillations generated in the anterior cingulate cortex (an area of the brain involved in error monitoring and possibly linked to gender differences in decision-making) correlate with risk-taking.

Thirty-five people took part in the experiment; of these, 15 participants were women. Each participant underwent magnetoencephalography and took three tests measuring risk-taking and impulsivity. The first test involved selecting a number of boxes (out of 100), each of which offered a monetary reward, but if one of the selected boxes contained a 'bomb', the participants lost all their earnings. Each participant was given 30 attempts. The second and third tests were questionnaires: the Barratt Impulsiveness Scale showed how a person assessed his or her own ability to plan and exhibit self-control, while the Domain-Specific Risk-Taking Scale (DOSPERT) showed how willingly a person agreed to a particular risk-taking action and how he or she assessed the possible gains and losses arising from it.

In the boxes test, men showed a higher risk appetite than women (an average of 48 boxes opened versus 40; on their first try, participants chose fewer boxes: 44 and 31 out of 100, respectively). Of the questionnaires, only the DOSPERT Benefits scale yielded a similar result (men are more optimistic about the positive outcome of a risky venture); the other tests showed no gender differences. The frontal theta asymmetry was not significantly related to the number of boxes selected in the sample as a whole; a positive correlation was evident only among women. The strength of the frontal theta rhythms (and especially the oscillations localized in the anterior cingulate cortex) correlated with results of the game, as well as with subjective assessments of benefits and losses from risky behaviours.
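
To put the 48-versus-40 figures in context: if one assumes a single bomb hidden uniformly among the 100 boxes and an all-or-nothing payout (an assumption; the article does not spell out the exact payoff rule), the risk-neutral optimum is to open 50 boxes, so the men's average of 48 sits near the reward-maximizing choice while 40 is more conservative. A minimal check:

```python
# Risk-neutral optimum for the boxes task, assuming one bomb placed
# uniformly among 100 boxes and an all-or-nothing payout. These rules
# are an assumption for illustration; the article does not give the
# exact payoff scheme.
def expected_reward(n, boxes=100, reward_per_box=1.0):
    """Open n boxes: collect n rewards unless the bomb is among them."""
    p_safe = (boxes - n) / boxes        # chance the bomb stays unopened
    return n * reward_per_box * p_safe

best_n = max(range(101), key=expected_reward)
print(best_n, expected_reward(best_n))  # -> 50, 25.0
```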

Thus, the researchers suggest that individual variability in the strength of theta rhythms in the anterior cingulate cortex is related to gender differences in assessing the consequences of risky actions and, consequently, attitudes toward risk. It is likely that both the activity of this brain region and risk-taking are influenced by levels of hormones such as testosterone.

'Gender differences in weighing of the potential consequences of decisions may not only affect risk-taking, but also reflect a more fundamental process of emotional responsiveness to environmental stimuli. We speculate that such differences related to hormonal regulation may also influence the prevalence of depression, anxiety and other clinical conditions among women, and we will continue to explore this topic,' concluded Maria Azanova, the lead author of the article.

Credit: 
National Research University Higher School of Economics

From eyebrow beans to 'lost' rice: community seedbanks are protecting China's crops

image: Farmer varieties in Wangjinzhuang village, Hebei province.

Image: 
Qiubi

"Plant a hundred kinds of crops"

Wangjinzhuang village is nestled amongst the steep slopes of the South Taihang Mountains in Hebei Province, China. To prosper in the northern climate, the villagers have developed a tried-and-true strategy: "using the land to plant a hundred kinds of crops and not rely on the sky". Their fields contain red millet, white sorghum, purple and green eyebrow beans, and yellow radishes. Having survived for over a thousand years, this agrobiodiversity is a vibrant cornerstone of the village's agricultural heritage that is too precious to lose.

In an effort to combat dwindling crop diversity across China (the Ministry of Agriculture found that of 11,590 grain crop varieties planted in the country in 1956, only 3,271 varieties remained in 2014), the government has bolstered its system of national genebanks, plus issued recent policy recommendations. These are making positive steps towards large-scale conservation; however, there has been relatively little attention given to the role of the country's 260 million farmers who have saved, used, and contributed to the evolution of diverse, local crops for centuries.

The services provided by China's community seedbanks have been documented, possibly for the first time, in an article recently published in Frontiers in Sustainable Food Systems. Twenty-seven seedbanks were surveyed to understand their ability to meet a wide range of needs, with positive implications for climate resilience, improved farmer livelihoods, and increased food security.

Seedbanks the Chinese way

The seedbank at Wangjinzhuang village, one of the case studies covered by the article, has grown quickly since its establishment in 2019 by a local farmers' association. Now run by 43 members, 26 of whom are women, the seedbank holds viable samples of over a hundred crop varieties, including essential grains such as millets and 82 traditional varieties. Members have organized multiplication plots and stipulate that for every 1 kg of seeds withdrawn, 1.5 kg must be returned. This, combined with the guidance of plant breeders and a farmer field school, ensures a future supply of seeds that continue to evolve to meet local environmental conditions.

Elsewhere, in Jiangsu Province (a region with 6,000 years of rice paddy culture), economic development and large-scale agriculture have depleted many aquatic crops. But, spurred by increased consumer demand for sustainable and healthy foods, farmers have begun to turn back towards more diverse traditional rice varieties such as Suyunuo, an aromatic sticky rice that had been abandoned for over two decades. This 'lost' crop diversity is being reintroduced at an organic farm backed up by community seed banking. The community seedbank facilitates farmer-to-farmer exchange of seed and brings new diversity to the area through samples obtained from a regional public genebank. However, farmers realized that they no longer knew how to cultivate Suyunuo for the best results. Only by collaborating with public research institutes is it possible for farmers to re-adapt the crop to match new environmental conditions and consumers' interests.

Article author Dr. Yiching Song from the Chinese Academy of Sciences spearheaded the Farmer Seeds Network, a national initiative that organized many seedbanks. She reflects: "Community seedbanks encourage seed and knowledge exchange within and among rural communities, between rural communities and the formal conservation and seed sectors; and add value to local crop diversity through new linkages with markets and cities."

Growing seedbanks across China and beyond

Researchers emphasize the need for policies to recognize the complementary role of community seedbanks within the national conservation system and standardize processes for seed storage and benefit-sharing. Dr. Song notes that a formal system of incentives and rewards would "encourage farmer communities to establish community seedbanks and work together with plant breeders and other researchers to take care of our country's rich agrobiodiversity."

Further support to develop seedbanks can come through organized training. Ronnie Vernooy is a scientist at the Alliance of Bioversity International and CIAT, which was part of the establishment of China's first-ever community seedbank in 2010. Since then, he says, "The Farmer Seeds Network, using our training handbooks developed for facilitators and farmers, has done remarkable work enabling farmers to open new community seedbanks across China. This is an important and exciting step in building more resilient seed systems."

Takeaways:
* China's wealth of crop biodiversity has been steadily disappearing.
* This can be addressed by community seedbanks, which build on a long tradition of farmer seed saving.
* Policy support and training can ensure that seedbanks are best equipped to conserve biodiversity and provide additional value for farmers.

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

About half of people living with HIV have coronary artery plaque despite low cardiac risk

BOSTON - Significant amounts of atherosclerotic plaque have been found in the coronary arteries of people with HIV, even in those considered by traditional measures to be at low-to-moderate risk of future heart disease, according to a study published in JAMA Network Open.

This finding emerged from the global REPRIEVE (Randomized Trial to Prevent Vascular Events in HIV) study, in which Massachusetts General Hospital (MGH) is playing a key coordinating role. Researchers found that the higher-than-expected levels of plaque could not be attributed simply to traditional cardiovascular disease risk factors like smoking, hypertension, and lipids in the blood, but were independently related to increased arterial inflammation and immune system activation.

"While we know that people living with HIV who are receiving antiretroviral therapy are at increased risk of coronary artery disease, our understanding of the mechanisms behind this phenomenon have been very limited," says Steven Grinspoon, MD, chief of the MGH Metabolism Unit and co-principal investigator of REPRIEVE. "The latest findings from REPRIEVE expand our knowledge and provide important insights that set the stage for further studies to identify effective plaque reduction or prevention strategies, such as the possible use of statin medicines."

REPRIEVE is the largest study of cardiovascular disease among people living with HIV, having enrolled 7,700 participants at more than 100 sites in 12 countries around the world, in collaboration with the AIDS Clinical Trials Group (ACTG). The newly published results are from a subset of the overall trial, consisting of 755 individuals between the ages of 40 and 75 enrolled in 31 sites across the United States. This is the largest study to assess plaque levels in the arteries of people with HIV who have no known heart disease and are eligible for primary cardiovascular prevention. The study used coronary CT angiography to assess plaque and then correlate results with blood samples that measured inflammation and immune activation.

The MGH-led study found that 49 percent of participants had plaque in their coronary arteries. While significant narrowing of the arteries was rare, nearly a quarter had plaque which the researchers considered "vulnerable," that is, at risk for potential future cardiovascular problems. "The prevalence of plaque found in people with HIV was striking, though the number of lesions was limited in most people and only a portion could be explained by traditional risk factors," says co-author Michael Lu, MD, MPH, co-director of the MGH Cardiovascular Imaging Research Center. "We learned that the plaque burden was also associated with higher levels of arterial inflammation and immune system activation independent of traditional risk scores."

Two biomarkers, which the researchers hypothesized could reflect premature cardiovascular disease among people with HIV, enabled them to assess these nontraditional cardiovascular risk factors: interleukin 6 (IL-6), associated with immune system activation, and LpPLA2, associated with arterial inflammation. "It was particularly notable to observe the increased levels of IL-6 in relationship to plaque among relatively healthy people with HIV, inasmuch as immune system activation may have damaging effects on the vessels of the heart over time," notes Grinspoon.

In addition to helping researchers better understand the mechanisms of cardiovascular risk in people with HIV, the two biomarkers will be evaluated in the next phase of REPRIEVE for their ability to predict major events, such as heart attacks and strokes. That ongoing research will also investigate the potential of statin therapy to reduce lipid levels (its primary therapeutic target) as well as plaque and markers of inflammation.

"We know that cardiovascular disease is occurring among people with HIV at approximately twice the rate of people without the disease," says Grinspoon, "which is why REPRIEVE is so critical in terms of discovering new ways to mitigate those risks so that people with HIV can be assured healthy and full lives."

Credit: 
Massachusetts General Hospital

Novel coronavirus infects and replicates in salivary gland cells

image: Electron microscope image showing novel coronavirus inside salivary glands

Image: 
Bruno Matuck/USP

In Brazil, researchers at the University of São Paulo’s Medical School (FM-USP) have discovered that SARS-CoV-2 infects and replicates in the salivary glands.

Analysis of samples from three types of salivary gland obtained during a minimally invasive autopsy procedure performed on patients who died from complications of COVID-19 at Hospital das Clínicas, FM-USP’s hospital complex, showed that tissues specializing in producing and secreting saliva serve as reservoirs for the novel coronavirus.

The study was supported by FAPESP and reported in an article published in the Journal of Pathology.

The researchers said the discovery helps explain why the virus is so abundant in saliva and has enabled scientists to develop saliva-based diagnostic tests for COVID-19.

“This is the first report of a respiratory virus’s capacity to infect and replicate in salivary glands. Until now it was thought that only viruses that cause highly prevalent diseases such as herpes used salivary glands as reservoirs. The discovery may help explain why SARS-CoV-2 is so infectious,” Bruno Fernandes Matuck, a PhD candidate at USP’s Dental School and first author of the article, told Agência FAPESP.

A previous study by the same group had already demonstrated the presence of RNA from SARS-CoV-2 in the periodontal tissue of patients who died from COVID-19 (more at: agencia.fapesp.br/35675/).

Because SARS-CoV-2 is highly infectious compared with other respiratory viruses, the researchers hypothesized that it may replicate in cells of the salivary glands and hence be present in saliva without coming into contact with nasal and lung secretions. Prior research had detected ACE2 receptors in salivary gland ducts. The spike protein of SARS-CoV-2 binds to ACE2 in order to invade and infect cells. More recently, other research groups have conducted studies in animals showing that receptors besides ACE2, such as transmembrane serine protease 2 (TMPRSS2) and furin, both of which are present in salivary glands, are also targets of SARS-CoV-2.

To test this hypothesis in humans, ultrasound-guided autopsies were performed on 24 patients who died from COVID-19, with a mean age of 53, to extract tissue samples from the parotid, submandibular and minor salivary glands.

The tissue samples were submitted to molecular analysis (RT-PCR), which detected the presence of the virus in more than two-thirds of them. Immunohistochemistry (a form of immunostaining in which antibodies bind to the antigen in the tissue sample, a dye is activated, and the antigen can then be seen under a microscope) also demonstrated the presence of the virus in the tissue. Finally, examination under an electron microscope detected not just the presence of the virus but also its replication in cells and the type of organelle it uses to replicate.

“We observed several viruses clustering in salivary gland cells, which showed that they were replicating there. They weren’t in these cells passively,” Matuck said.

The mouth as direct point of entry

The researchers now plan to see whether the mouth can be a direct point of entry for SARS-CoV-2, given that ACE2 and TMPRSS2 are found in various parts of the oral cavity, as well as in gum tissue and oral mucosa. In addition, the mouth has a larger contact area than the nasal cavity, which is widely considered the virus's main point of entry.

“We’re going to partner with researchers at the University of North Carolina in the United States to map the distribution of these receptors in the mouth and quantify viral replication in oral tissues,” said Luiz Fernando Ferraz da Silva, a professor at FM-USP and principal investigator for the project.

“The mouth could be a viable medium for the virus to enter the body directly,” Matuck said.

Another idea is to find out whether older people have more ACE2 receptors in their mouths than younger people, given the decrease in salivary secretion with age. Nevertheless, the researchers found a high viral load even in older patients, who have less salivary tissue.

“These patients had almost no salivary tissue, almost only fatty tissue. Even so, viral load was relatively high,” Matuck said.

The article “Salivary glands are a target for SARS-CoV-2: a source for saliva contamination” (doi: 10.1002/path.5679) by Bruno Fernandes Matuck, Marisa Dolhnikoff, Amaro Nunes Duarte-Neto, Gilvan Maia, Sara Costa Gomes, Daniel Isaac Sendyk, Amanda Zarpellon, Nathalia Paiva de Andrade, Renata Aparecida Monteiro, João Renato Rebello Pinho, Michele Soares Gomes-Gouvêa, Suzana C.O.M. Souza, Cristina Kanamura, Thais Mauad, Paulo Hilário Nascimento Saldiva, Paulo H. Braz-Silva, Elia Garcia Caldini and Luiz Fernando Ferraz da Silva is at: onlinelibrary.wiley.com/doi/10.1002/path.5679.

Journal: The Journal of Pathology

DOI: 10.1002/path.5679

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Scientists warn on the harmful implications of losing Indigenous and local knowledge systems

image: Across the vast majority of our planet, the historical and current land-uses of Indigenous Peoples and local communities, together with their interwoven practices and knowledge systems, are essential for sustaining our planet's biodiversity.

Image: 
Joan de la Malla

Five Simon Fraser University scholars are among international scientists sounding an alarm over the "pervasive social and ecological consequences" of the destruction and suppression of the knowledge systems of Indigenous Peoples and local communities.

Their paper, published today in the Journal of Ethnobiology, draws on the knowledge of 30 international Indigenous and non-Indigenous co-authors, and highlights 15 strategic actions to support the efforts of Indigenous Peoples and local communities in sustaining their knowledge systems and ties to lands.

Study co-lead, SFU archaeology professor Dana Lepofsky, says, "We worked hard to find a balance between discussing the threats to Indigenous and local knowledge and highlighting how Indigenous Peoples and local communities are taking action to turn around these threats. Around the world, Indigenous Peoples and local communities are celebrating, protecting, and revitalizing their knowledge systems and practices.

"As scientists, policymakers, and global citizens, we need to support these efforts in our professional activities, in the policies of our governmental agencies, and in our personal choices."

The authors summarize how the knowledge systems and practices of Indigenous Peoples and local communities play fundamental roles in safeguarding the biological and cultural diversity of our planet. They also document how this knowledge is being lost at alarming rates, with dramatic social and ecological consequences.

"Although Indigenous and local knowledge systems are inherently adaptive and remarkably resilient, their foundations have been and continue to be compromised by colonial settlement, land dispossession, and resource extraction," says study co-lead Álvaro Fernández-Llamazares, a post-doctoral researcher from the University of Helsinki, Finland. "The ecological and social impacts of these pressures are profound and widespread."

The paper is part of the "Scientists' Warning to Humanity" series, which highlights threats to humanity caused by climate change, biodiversity loss and other global changes.

Credit: 
Simon Fraser University

mRNA vaccines slash risk of COVID-19 infection by 91% in fully vaccinated people

People who receive mRNA COVID-19 vaccines are up to 91 percent less likely to develop the disease than those who are unvaccinated, according to a new nationwide study of eight sites, including Salt Lake City. For those few vaccinated people who do still get an infection, or "breakthrough" cases, the study suggests that vaccines reduce the severity of COVID-19 symptoms and shorten its duration.

Researchers say these results are among the first to show that mRNA vaccination benefits even those individuals who experience breakthrough infections.

"One of the unique things about this study is that it measured the secondary benefits of the vaccine," says Sarang Yoon, D.O., a study co-author, assistant professor at the University of Utah Rocky Mountain Center for Occupational and Environmental Health (RMCOEH), and principal investigator of the RECOVER (Research on the Epidemiology of SARS-CoV-2 in Essential Response Personnel) study in Utah.

The study, published online in the New England Journal of Medicine, builds on preliminary data released by the Centers for Disease Control and Prevention (CDC) in March.

The study was designed to measure the risks and rates of infection among those on the front lines of the pandemic.

"We gave these vaccines to some of the highest risk groups in this country--doctors, nurses, and first responders," Yoon says. "These are the people who are getting exposure to the virus day in and day out, and the vaccine protected them against getting the disease. Those who unfortunately got COVID-19 despite being vaccinated were still better off than those who didn't."

The study found that mRNA COVID-19 vaccines were:

91% effective in reducing risk for infection once participants were "fully" vaccinated, two weeks after the second dose.

81% effective in reducing risk for infection after "partial" vaccination, two weeks after the first dose but before the second dose was given.
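
For context on how such figures are derived: effectiveness is typically one minus the ratio of infection rates per unit of person-time in the vaccinated versus unvaccinated groups. The sketch below reuses the study's case counts (16 vaccinated, 156 unvaccinated, reported below) but pairs them with invented person-time denominators chosen only so the arithmetic lands near 91%; it illustrates the formula, not the study's actual calculation:

```python
# Vaccine effectiveness as one minus the incidence-rate ratio. The case
# counts match the article; the person-day denominators are invented.
def vaccine_effectiveness(cases_vax, person_days_vax,
                          cases_unvax, person_days_unvax):
    rate_vax = cases_vax / person_days_vax
    rate_unvax = cases_unvax / person_days_unvax
    return 1.0 - rate_vax / rate_unvax

ve = vaccine_effectiveness(16, 200_000, 156, 170_000)
print(f"VE = {ve:.0%}")   # ~91% with these illustrative denominators
```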

The HEROES-RECOVER network recruited 3,975 participants at eight sites. In addition to Salt Lake City, sites included Miami, Florida; Temple, Texas; Portland, Oregon; Duluth, Minnesota; and Phoenix and Tucson, as well as other areas in Arizona. Participants submitted samples for COVID-19 testing on a weekly basis for 17 weeks between Dec. 13, 2020 and April 10, 2021. Participants also reported weekly whether they had COVID-19-like symptoms, including fever, shortness of breath, and loss of taste and smell.

Only 204 (5%) of the participants eventually tested positive for SARS-CoV-2, the virus that causes COVID-19. Of these, 156 were unvaccinated, 32 had an indeterminate vaccine status, and 16 were fully or partially vaccinated. The fully or partially vaccinated participants who developed breakthrough infections had milder symptoms than those who were unvaccinated:

Presence of fever was reduced by 58 percent among vaccinated people with breakthrough infections.

Days spent sick in bed were reduced by 60 percent among those who developed a breakthrough infection.

The number of days the virus was detectable was reduced by 70 percent among those with breakthrough infections, from 8.9 days to 2.7 days.

The three people who were hospitalized were all unvaccinated, meaning that no one who developed a breakthrough infection was hospitalized.

These findings also suggest that fully or partially vaccinated individuals who get COVID-19 might be less likely to spread the virus to others. The researchers found that study participants who had been fully or partially vaccinated when infected had 40% less detectable virus in the nose, and shed detectable virus for six fewer days, compared with those who were unvaccinated.

Overall, the researchers conclude the study's findings support the CDC's recommendation to get fully vaccinated as soon as possible.

"I hope these findings reassure the public that mRNA COVID-19 vaccines are safe and protect us from this severe disease," Yoon says.

The RECOVER study is ongoing, and results from future phases will help determine how long COVID-19 vaccines protect against infection and the real-world effectiveness of newer vaccines. A new study will test the same questions in children 12 and older who are now eligible to receive the COVID-19 vaccination. The research will also investigate how well COVID-19 vaccines protect against new variants now circulating in the U.S., including the highly transmissible Delta variant of SARS-CoV-2.

Credit: 
University of Utah Health

Still waiting at an intersection? Banning certain left turns helps traffic flow

When traffic is clogged at a downtown intersection, there may be a way to reduce some of the congestion: Eliminate a few left turns.

According to Vikash Gayah, associate professor of civil engineering at Penn State, well-placed left-turn restrictions in certain busy intersections could loosen many of the bottlenecks that hamper traffic efficiency. He recently created a new method that could help cities identify where to restrict these turns to improve overall traffic flow.

"We have all experienced that feeling of getting stuck waiting to make a left turn," Gayah said. "And if you allow these turns to have their own green arrow, you have to stop all other vehicles, making the intersection less productive. Left turns are also where you find the most severe crashes, especially with pedestrians. Our idea is to get rid of these turns when we can to create safer and more efficient intersections."

By selectively restricting left turns, but not banning them entirely, drivers may simply need to find alternate routes to their destinations in certain areas, Gayah said. Some may be required to travel a few extra blocks, but Gayah believes more efficient traffic flow through busy intersections offsets the additional distance.

For urban planners, he added, determining where to place the restrictions is a balancing act between intersection productivity and increased travel lengths. With so many restriction possibilities to consider, finding the most efficient layout may prove difficult.

"For example, if you just have 16 intersections to consider, each with a choice to allow or not allow left turns, that is already 65,000 different configurations," Gayah said. "It gets even more complicated when you consider that traffic flows from one intersection to the next, so decisions depend on one another. There ends up being so many possible answers that we can never find the best one."

Gayah's new method relies on heuristic algorithms, which use shortcuts to find solutions that come close to, but are not guaranteed to be, optimal.

"We make a guess, we learn from that guess, and then we make better guesses," he said. "Over time, we can get really, really close to the best answer."

In a study published in Transportation Research Record, Gayah combined two existing heuristic algorithms to create a new hybrid approach. The first, a population-based incremental learning (PBIL) algorithm, randomly sampled potential configurations and recognized the patterns of high-performing options. Next, a Bayesian optimization algorithm analyzed this new set of high performers to identify how restrictions were affecting traffic at adjacent intersections. Bayesian optimization starts from initial information about the problem and updates it over time, as new information is learned, to attain a solution that is close to optimal, though not necessarily perfect. The algorithm then applied this knowledge of traffic dynamics to find more efficient solutions.

"Instead of starting the Bayesian optimization with a random guess, we fed it with the best guesses from the PBIL," Gayah said. "The first method creates the starting point, and the second refines it."

Gayah tested the hybrid method through a simulated, square network in a variety of scenarios, finding that all three methods (PBIL, Bayesian optimization and the hybrid) identified configurations that led to more efficient traffic patterns than a layout with zero restrictions. However, in simulations with more realistic settings, the hybrid method proved to be the most effective.

According to Gayah, the most efficient configurations tended to ban left turns in the middle of the city and allow them more often on the periphery. While the method was applied to a generalized network, the results can serve as a starting point for real-world traffic networks, with the algorithms customizable on a city-by-city basis.

"The grid network is the most generalizable and not specific to any city," Gayah said. "I cannot take the best configuration for New York and apply it to San Francisco, but this generalized approach could be configured for any network with a little bit of coding."

Credit: 
Penn State

UT Southwestern scientists closing in on map of the mammalian immune system

image: Bruce Beutler, M.D.

Image: 
UT Southwestern Medical Center

Using artificial intelligence, UT Southwestern scientists have identified thousands of genetic mutations likely to affect the immune system in mice. The work is part of one Nobel laureate's quest to find virtually all such variations in mammals.

"This study identifies 101 novel gene candidates with greater than 95% chance of being required for immunity," says Bruce Beutler, M.D., director of the Center for the Genetics of Host Defense (CGHD) and corresponding author of the study published this week in the Proceedings of the National Academy of Sciences. "Many of these candidates we have already verified by re-creating the mutations or knocking out the genes." Lead author Darui Xu, a computational biologist at CGHD, wrote the software.

"We've developed software called Candidate Explorer (CE) that uses a machine-learning algorithm to identify chemically induced mutations likely to cause traits related to immunity. The software determines the probability that any mutation we've induced will be verified as causative after further testing," Beutler says. His discovery of an important family of receptors that allow mammals to quickly sense infection and trigger an inflammatory response led to the 2011 Nobel Prize in Physiology or Medicine.

"The purpose of CE is to help researchers predict whether a mutation associated with a phenotype (trait or function) is a truly causative mutation. CE has already helped us to identify hundreds of genes with novel functions in immunity. This will improve our understanding of the immune system so that we can find new ways to keep it robust, and also know the reason it sometimes falters," says Beutler, Regental Professor, and professor of immunology and internal medicine at UT Southwestern.

"CE provides a score that tells us the likelihood that a particular mutation-phenotype association will be verified for cause and effect if we re-create the mutation or knock out the gene," he says.

CE examines 67 features of the primary genetic mapping data to arrive at an estimate of the likelihood of causation. For some mutations, causation is very clear; for others, less so. Over time, the program "learns" from experiments in which researchers re-create the mutation in a fresh pedigree and verify or exclude the hypothesis of causation. All mutations are made available to the scientific community through a public repository, and the data supporting causation are viewable within the Candidate Explorer program on the CGHD website, Mutagenetix.

The team used CE to evaluate about 87,000 mutation/trait associations that passed the initial statistical threshold for candidacy. The traits examined were flow cytometry data collected on circulating immune cells of third-generation mutant mice. In this screen, Candidate Explorer ranked a total of 2,336 mutations in 1,279 genes as good or excellent candidates for causation of traits, the team reports.
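
The paper's description suggests the general shape of such a tool: a supervised classifier trained on mapping features of past associations, labeled by whether re-creating the mutation verified causation. Everything in the sketch below (the synthetic data, the labels, and the choice of gradient boosting) is an assumption for illustration; it is not the actual Candidate Explorer code:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_assoc, n_features = 5000, 67           # 67 mapping features, as in CE
X = rng.normal(size=(n_assoc, n_features))
# Synthetic labels: verification loosely driven by a few features.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n_assoc) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)

# The predicted probability plays the role of a CE-style score: the
# chance an association will verify if the mutation is re-created in
# a fresh pedigree.
scores = clf.predict_proba(X_te)[:, 1]
print("mean predicted verification probability:", scores.mean().round(3))
```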

Beutler adds that this work is part of a research program he set out on nearly a decade ago to identify every mutation that may affect the mouse immune system.

"We've now worked through about 60% of the genome and have identified thousands of genes - hundreds of them novel - that participate in immunity in the mouse," he says. "The vast majority of these also contribute to human immunity."

Beutler adds that the mouse and human genomes are very similar in size and gene content. "Almost all mouse genes have a human counterpart, and vice versa," he says.

Credit: 
UT Southwestern Medical Center

New method to identify dirt on criminals can lead to prosecution

image: Map of Canberra, showing location of reference samples. Location of test (blinded) samples in blue.

Image: 
Patrice De Caritat

Scientists have taken the first steps in developing a new method of identifying the movements of criminals using chemical analysis of soil and dust found on equipment, clothing and cars. The locating system allows police or security services to match soil remnants found on personal items to regional soil samples, to either implicate or eliminate presence at a crime scene. The work, recently published, is presented as a Keynote Lecture at the Goldschmidt Geochemistry Conference.

Dr Patrice de Caritat, Principal Research Scientist at Geoscience Australia, Australia's public sector geoscience organisation, said:

"We've done the first trials to see if geochemical analysis could narrow down a search area. We took a 260 km2 area of North Canberra and divided it into cells (squares) of 1 km x 1 km, and sampled the soil in each cell. We were then given 3 samples from within the survey area, and asked to identify which grid cells they came from. This was a 'blind' experiment, in other words we did not know where the samples came from until the end of the experiment. For comparison, Manhattan Island is around 60 km2, so that shows that we looked at a pretty big area".

Using these methods, they were able to eliminate 60% of the territory under investigation.

Dr de Caritat said "Much of forensics is about elimination, so being able to rule out 60% of an area is a substantial contribution toward successfully locating a sample. You can reduce the time, risk and investment of the ongoing investigation. The more parameters we look at, the more accurate the system is. We have reached 90% detection in some cases, although we think that would involve too many factors for real-world crime detection".
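
A minimal sketch of the elimination logic: standardize each grid cell's multi-element profile, measure how far the questioned sample sits from every cell, and rule out the most dissimilar cells. The data, the Euclidean distance measure and the cut-off below are invented for illustration; the actual work uses compositional data analysis across several instrument types:

```python
import numpy as np

rng = np.random.default_rng(7)
n_cells, n_elements = 260, 20            # 260 one-km² cells, 20 analytes
cells = rng.normal(size=(n_cells, n_elements))      # reference profiles
questioned = cells[137] + rng.normal(scale=0.3, size=n_elements)

# Standardize, then use Euclidean distance as a simple dissimilarity.
mu, sd = cells.mean(axis=0), cells.std(axis=0)
d = np.linalg.norm((cells - mu) / sd - (questioned - mu) / sd, axis=1)

threshold = np.percentile(d, 40)         # keep the closest 40% of cells
candidates = np.flatnonzero(d <= threshold)
print(f"eliminated {n_cells - candidates.size} of {n_cells} cells; "
      f"true cell retained: {137 in candidates}")
```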

The team used a range of analytical instrumentation (Fourier Transform InfraRed Spectroscopy, X-Ray Fluorescence, Magnetic Susceptibility and Mass Spectrometry) to compare the three blind samples to the previously collected samples.

Dr de Caritat, who is also Adjunct Professor at the National Centre for Forensic Studies at the University of Canberra, said:

"This shows that our systems work, and that we have a potential new tool for criminal and intelligence investigations. It's the next stage which is potentially most interesting. Most developed countries have existing soil databases*, used for such things as mineral exploration or land use decision support. We're plugging our methods into these databases to see if we can locate samples from the database information, rather than needing to collect samples specifically for each investigation.

"Conventional soil analysis has already been used in Australia to identify and prosecute criminals. For example, soil analysis was used to identify the movements of a man who carried out a sexual assault on a young girl in Adelaide. There are several such examples. We now want to take this further".

Dr de Caritat worked with the Australian Federal Police in 2017-18, when he helped the force develop its capability to analyse soils for forensic location. He said:

"Geoscience Australia is now working with the Australian Federal Police, the University of Adelaide, Flinders University and the University of Canberra on a Defence Department project to incorporate environmental DNA (e.g. from local plants) and X-Ray Diffraction mineralogy into the soil and dust location system**".

Commenting, Professor Jennifer McKinley (Queen's University Belfast) said: "The breakthrough in Dr de Caritat's work is that it integrates robust compositional data analysis of the multivariate geochemical data into forensic geoscience and applies this in an innovative way to forensic soil provenance".

Credit: 
Goldschmidt Conference

How fish got their spines

image: Two males of the cichlid fish Astatotilapia burtoni, the model organism used to study spine and soft-ray development in Höch et al.

Image: 
Joost Woltering

In the movie "A Fish Called Wanda", the villain Otto effortlessly gobbles up all the occupants of Ken's fish tank. Reality, however, is more daunting: at least one unfortunate fan who re-enacted this scene was hospitalized with a fish stuck in his throat. It was also a painful lesson in ichthyology (the scientific study of fishes), namely that the defense of some fishes consists of needle-sharp fin spines.

Two types of fin elements

Indeed, many fish species possess two types of fin elements: "ordinary" soft fin rays, which are blunt and flexible and primarily serve locomotion, and fin spines, which are sharp and heavily ossified. Because fin spines make the fish less edible, they offer a strong evolutionary advantage. With over 18,000 members, the spiny-rayed fish are the most species-rich fish lineage. These fishes even evolved separate "spiny fins" consisting of spines only. The evolution of fin spines is therefore considered a major factor in the diversity and evolutionary success of fishes.

In the study, published in PNAS, researchers at the University of Konstanz show how fin spines arise during embryonic development and how they could have evolved from ancestral soft-rays independently in different fish lineages. The team was led by Dr Joost Woltering, who works in the laboratory of Professor Axel Meyer together with his PhD student and the study's first author, Rebekka Höch. The study focuses on a model species for the spiny-rayed fish, the cichlid Astatotilapia burtoni, which possesses well-developed soft-rayed and spiny fin parts.

Different developmental genes for spines and soft-rays

As a first step, the team determined the genetic profiles of soft-ray and spiny fins during embryonic development. "What became clear from these first experiments was that a set of genes that we already knew from fin and limb development becomes differently activated in spines and soft-rays," says Rebekka Höch. These genes correspond to so-called master regulator genes and are known to determine morphology in the axial and the limb skeleton. In the fish fins, these genes appear to provide a genetic code that determines whether the emerging fin elements will develop looking like a spine or like a soft-ray.

Soft-rays can change into spines and vice versa

Next, the team identified genetic pathways that switch on these master regulator genes and that determine their activity at different positions across the fins. "Importantly, we were able to address the roles of these pathways using chemical tools, so-called inhibitors and activators, as well as the 'gene scissors' CRISPR/Cas9 and thereby test how spiny and soft-rayed fin domains are established during development," says Joost Woltering, assistant professor in the Department of Biology at the University of Konstanz and senior author of the study.

In their experiments, the scientists were able to alter the number of spines or soft-rays in the fins. This effect was most striking when the so-called BMP (bone morphogenetic protein) signaling was modulated. "We did not only see changes in the activation of the master regulatory genes, but we also observed so-called homeotic transformations, in which soft-rays had become spines, or the other way around, spines had turned into soft-rays," Joost Woltering explains.

An additional observation was that in these fish not only the morphology of the fin elements changed, but also the accompanying fin coloration. "Male cichlids have bright yellow spots on their fins, but these are restricted to the soft-rayed part. What we observed was that when a soft-ray changed into a spine, the fin also lost the yellow spots at this position," says Joost Woltering. This observation shows that in spiny-rayed fish, spines and soft-rays are integrated parts of a larger developmental module that determines a number of the visible features of the fins.

The same principle in different fish lineages

As the puzzle came together, the team realized that a deeply conserved patterning system had been redeployed during the evolution of the spiny fin. "In fact, the genetic code that determines the fin domain where spines will appear is also active in fins that do not have spines. This indicates that an ancestral genetic pattern was redeployed for making spines," says Rebekka Höch.

With this newly gained insight in mind, the authors set out to investigate fin patterning in catfish, a group whose members have independently evolved fin spines. Indeed, the genetic code identified for spines in the cichlid matched that of the catfish spines. Although some differences exist between the spiny fish species, altogether this suggests a deeply conserved fin pattern that is relied on to make spines whenever evolutionary selection favors them.

The next steps

For its future research the team will focus on the genes that act downstream of the identified spine and soft-ray control genes to find out how exactly they alter fin morphology by controlling ossification and cellular growth pathways. "In the end we want to gain a better understanding of how new anatomical structures arise that make some species more successful than others, and how this contributed to the incredible evolutionary diversity of the fish lineages," concludes Joost Woltering.

Credit: 
University of Konstanz