
Surrey builds AI to find anti-ageing chemical compounds

The University of Surrey has built an artificial intelligence (AI) model that identifies chemical compounds that promote healthy ageing - paving the way towards pharmaceutical innovations that extend a person's lifespan.

In a paper published in the Nature Portfolio journal Scientific Reports, a team of chemists from Surrey built a machine learning model based on information from the DrugAge database to predict whether a compound can extend the life of Caenorhabditis elegans - a translucent worm that shares a similar metabolism to humans. The worm's shorter lifespan gave the researchers the opportunity to see the impact of the chemical compounds.

The AI singled out three classes of compounds that have an 80 per cent chance of increasing the lifespan of C. elegans:

flavonoids (anti-oxidant pigments found in plants that promote cardiovascular health),

fatty acids (such as omega-3), and

organooxygens (compounds that contain carbon-to-oxygen bonds, such as alcohol).
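
The paper's exact pipeline is not reproduced here, but a minimal sketch of this kind of compound classifier - assuming a hypothetical file of molecular descriptors with DrugAge-derived labels (the file name, feature columns and model choice below are illustrative, not the Surrey team's published method) - might look like this:

    # Minimal sketch of a lifespan-extension classifier. Illustrative only:
    # the file name, feature columns and model choice are assumptions, not
    # the published Surrey pipeline.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("drugage_descriptors.csv")   # one row per compound (hypothetical file)
    X = df.drop(columns=["compound", "extends_lifespan"])
    y = df["extends_lifespan"]                    # 1 = extended C. elegans lifespan

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)

    model = RandomForestClassifier(n_estimators=500, random_state=0)
    model.fit(X_train, y_train)

    # Probability that an unseen compound extends lifespan; a 0.8 threshold
    # mirrors the "80 per cent chance" cut quoted above.
    probs = model.predict_proba(X_test)[:, 1]
    print(classification_report(y_test, (probs >= 0.8).astype(int)))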

Sofia Kapsiani, co-author of the study and final year undergraduate student at the University of Surrey, said:

"Ageing is increasingly being recognised as a set of diseases in modern medicine, and we can apply the tools of the digital world, such as AI, to help slow down or protect against ageing and age-related diseases. Our study demonstrates the revolutionary ability of AI to aid the identification of compounds with anti-ageing properties."

Dr Brendan Howlin, lead author of the study and Senior Lecturer in Computational Chemistry at the University of Surrey, said:

"This research shows the power and potential of AI, which is a speciality of the University of Surrey, to drive significant benefits in human health."

Credit: 
University of Surrey

Cattle losing adaptations to environment, MU researchers find

image: Over the course of generations, cattle are losing the genetic adaptations that help them thrive in specific environments.

Image: 
University of Missouri.

As a fourth-generation cattle farmer, Jared Decker knows that cattle suffer from health and productivity issues when they are taken from one environment -- which the herd has spent generations adapting to -- to a place with a different climate, a different elevation or even different grass. But as a researcher at the University of Missouri, Decker also sees an opportunity to use science to solve this problem, both to improve the welfare of cattle and to plug a leak in a nearly $50 billion industry in the U.S.

"When I joined MU in 2013, I moved cattle from a family farm in New Mexico to my farm here in Missouri," said Decker, an associate professor and Wurdack Chair in Animal Genetics at the College of Agriculture, Food and Natural Resources. "New Mexico is hot and dry, and Missouri is also hot but has much more humidity. The cattle certainly didn't do as well as they did in New Mexico, and that spurred me to think about how we could give farmers more information about what their animals need to thrive."

In a new study published today in PLOS Genetics, Decker and his team have uncovered evidence showing that cattle are losing important environmental adaptations, losses the researchers attribute to a lack of genetic information available to farmers. After examining genetic material stretching back to the 1960s, they identified specific DNA variations associated with adaptations that could one day be used to create DNA tests for cattle -- tests that could tell farmers whether their cattle are suited for one environment or another.

"We can see that, for example, historically cows in Colorado are likely to have adaptations that ease the stress on their hearts at high altitudes," Decker said. "But if you bring in bulls or semen from a different environment, the frequency of those beneficial adaptations is going to decrease. Over generations, that cow herd will lose advantages that would have been very useful to a farmer in Colorado."

Decker's team, including then-doctoral student Troy Rowan, analyzed six decades' worth of bovine DNA data from tests of cryo-preserved semen provided by cattle breed associations. They found that over time, while genes associated with higher productivity and fertility improved due to careful selection by farmers, many genes connected to environmental adaptations have faded.

Decker noted this is not the fault of farmers, given that there is currently no cost-effective genetic test they can use to determine whether their cattle are suitable for a particular environment. In other words, the study demonstrates a need for user-friendly cattle DNA tests that can look for the specific adaptations identified in the study. These adaptations include resistance to vasoconstriction -- a narrowing of the blood vessels that occurs at high elevations and puts undue stress on the heart -- resistance to a toxin in grass that can also cause vasoconstriction, and tolerance for high heat or humidity, all of which tend to recede over generations when cattle are removed from the associated environments.

"Sometimes, natural and artificial selection are moving in the same direction, and other times there is a tug of war between them," Decker said. "Efficiency and productivity have vastly improved in the last 60 years, but environmental stressors are never going to go away. Farmers need to know more about the genetic makeup of their herd, not only for the short-term success of their farm, but for the success of future generations."

The first broadly adopted genetic test for cattle was invented at the University of Missouri in 2007, and Decker and Rowan hope to tell the next chapter of that story. Both grew up on farms and share a passion for using research to help farmers balance America's farming traditions with the need for environmentally friendly business practices.

"As a society, we must produce food more sustainably and be good environmental stewards," Decker said. "Making sure a cow's genetics match their environment makes life better for cattle and helps farmers run efficient and productive operations. It's a win-win."

Credit: 
University of Missouri-Columbia

Early-life social connections influence gene expression, stress resilience

image: Hyena mom licking her cub in Kenya's Masai Mara National Reserve.

Image: 
Kay E. Holekamp

Having friends may not only be good for the health of your social life, but also for your actual health--if you're a hyena, that is. Strong social connections and greater maternal care early in life can influence molecular markers related to gene expression in DNA and future stress response, suggests a new University of Colorado Boulder study of spotted hyenas in the wild.

Researchers found that more social connection and maternal care during a hyena's cub and subadult, or "teenage," years corresponded with lower adult stress hormone levels and fewer modifications to DNA, including near genes involved in immune function, inflammation and aging. 

Published this week in Nature Communications, the study is one of the first to examine the association between early-life social environments and later effects on markers of health and stress response in wild animals.

"This study supports this idea that, yes, these early experiences do matter. They seem to have an effect at the molecular level and future stress response--and they're persistent," said lead author Zach Laubach, a postdoctoral fellow in ecology and evolutionary biology.

As far back as the 1950s and 60s, laboratory research has drawn associations between early life experiences in rodents, primates and humans and behavioral and physiological differences later in life. One landmark study published in 2004 also showed that the offspring of rats who got licked and groomed more by their mothers had less DNA methylation in a gene involved in regulating stress response. This kick-started the desire for more evidence that early life experiences could be related to patterns of modification in genes that influence stress and health.

One of the missing pieces in the past 20 years of research has been the ability to study this relationship in wild animals.

Enter the Masai Mara Hyena Project. Launched by co-authors Kay E. Holekamp and Laura Smale of Michigan State University in the 1980s, the project has collected more than 30 years of uninterrupted data on hyena populations in Maasai Mara National Reserve in Kenya. With this invaluable resource for studying animal behavior, evolution and conservation, the researchers have been able to utilize generations of data on individually known animals to draw connections between their interactions, behaviors and biological markers.

"Being able to measure behavior, physiology and molecular markers from the same population has allowed us to dig deeper into the possible mechanisms," said Laubach, who has been working with data from this project for nearly a decade.

Healthy stress response

Hyenas are ideal for such research as they are devoted mothers, have a strict social hierarchy and follow a consistent timeline for raising their cubs. Instead of giving birth to larger litters, they typically have one or two cubs at a time. Soon after birth, the cubs move into a communal den, where they are integrated into their peer group. For the next year, they still nurse and their mother licks and grooms them, but after that the cubs start to wander out of the den and, like teenagers, learn to start making their way in the world.

The researchers found that the more socially connected hyenas were during their teenage years, the lower their baseline stress hormone levels were later in life. This generally indicates a healthy stress response: Stress hormones can be elevated in an appropriate situation--like when being chased by a lion or a higher-ranking hyena--and when nothing's happening, levels of stress hormones remain low.

"So if you have more friends as a subadult, essentially, you have lower stress hormone levels as an adult," said Laubach. "This suggests that the type, timing and mechanisms that link these early life experiences with stress seem to be important not only in controlled laboratory settings but also in the wild, where animals are subject to natural variation."

In general, hyenas, like other vertebrates, benefit from the effects of stress hormones (e.g. cortisol) mobilizing energy, increasing their heart rate and shutting down non-essential functions, like digestion or reproduction, when escaping a dangerous situation. However, there are significant physical drawbacks to these processes occurring chronically, day after day in humans or other animals as the result of chronic stressors. That's why having a healthy stress response is so critical.

"We need these stress hormones because they are critical to a variety of basic biological functions," said Laubach. "And in the right context, like when escaping a predator, they can save your life. But when elevated chronically, these hormones can be detrimental to your health," said Laubach.

Time travel through DNA

The researchers also wanted to find out whether the relationship between early-life social experiences and later-life stress response is mediated by molecular mechanisms.

To do this, Laubach and his co-authors measured the care and social interaction each animal received in early life and analyzed the associations with certain modifications to its DNA later in life. These modifications can, through a process known as DNA methylation, end up changing the expression of certain genes, which can, in turn, affect an animal's physiology or behavior.

The researchers found that the maternal care hyenas received during their first year of life, as well as their social connections after den independence, corresponded to differences in DNA methylation levels.

"This echoes a growing body of epidemiological work which studies how the timing of an exposure affects a health outcome. The idea is that, as an organism develops, there are certain points in time, often referred to as sensitive periods, when an exposure has a larger and a more persistent effect than if that exposure occurred at a later point in time," said Laubach.

Credit: 
University of Colorado at Boulder

Perfecting collagen production in osteogenesis imperfecta

image: Meenal Mehrotra, assistant professor in the department of pathology and laboratory medicine at MUSC Health and principal investigator on the paper, points to a mutation in the collagen gene as the major cause of the weak bones that come with osteogenesis imperfecta.

Image: 
Anne Thompson, MUSC Health

Most people can expect to break close to two bones in their lifetime, but those with osteogenesis imperfecta -- also known as brittle bone disease -- can break hundreds of bones before they even hit puberty. And while healthy bones break from a hard fall or a bad car wreck, bones affected by brittle bone disease can break with no apparent cause at all.

Classified as a rare disease, osteogenesis imperfecta, or OI, affects six to seven of every 100,000 live births and can range in severity depending on the specific mutation. And while there are currently few treatment options and no cure, Meenal Mehrotra, M.D., Ph.D., and her lab recently published promising findings in the journal Stem Cells that address this gap.

Mehrotra, assistant professor in the department of pathology and laboratory medicine at MUSC Health and principal investigator on the paper, points to a mutation in the collagen gene as the major cause of the weak bones that come with OI. "Collagen is the matrix, or backbone, of our bones," said Mehrotra. "And when it's mutated, our bones become very brittle and break very easily."

In addition to building bones and holding them together, collagen is a protein found in blood vessels, organs, corneas and teeth, which means brittle bone disease can affect a person's eyesight and even their hearing, not just their bones. Unfortunately, current treatment options focus on the symptoms of OI rather than the cause, and the medications in use can have long-term side effects.

For her study, Mehrotra went back to the beginning -- she looked at osteoblasts, the cells that produce collagen. "Mutated osteoblasts produce mutated collagen," she said. But if she can replace the osteoblasts, she can change the collagen produced and thus the bone that stems from it.

Osteoblasts and osteoclasts are critical to bone growth and remodeling. As a dynamic tissue with constant turnover, bone requires consistent communication between its cells. Osteoblasts form new bone, and osteoclasts reabsorb old or damaged bone; both functions are needed to keep bone strong. By replacing damaged or mutated osteoblasts with healthy ones, Mehrotra and her lab offer an option with a therapeutic promise. While current results show effectiveness in animal models, Mehrotra hopes to translate her results to people in the future.

Current dogma points to mesenchymal stem cells (MSCs) as the origin for osteoblasts, but MSC transplantation has not resulted in long-term success as a treatment option. Hematopoietic stem cells (HSCs) usually give rise to blood cells and osteoclasts, but Mehrotra hypothesized that they could give rise to osteoblasts too.

And that idea led her to investigate HSC transplantation in mice with brittle bone disease.

Not all biologists agree that HSCs can give rise to both blood cells and bone cells, so Mehrotra considers her hypothesis controversial. "There have been few studies on this topic in the past," she said. "It's not that people haven't speculated on this before, but so many scientists disagree each time it is discussed that it has not been studied in detail."

Mehrotra spent much of her paper detailing how she confirmed that only HSCs were responsible for the results of her study. Using a clonal population of HSCs from mice that expressed green fluorescent protein (GFP) only in their osteoblasts, Mehrotra and her lab conducted lineage tracing studies. These studies established hematopoietic stem cells as cells that can differentiate into osteoblasts, which then brought about clinical improvements in mice with OI.

This method of clonal HSC transplantation was developed by Mehrotra's mentor Makio Ogawa, M.D., Ph.D., who was a professor in the pathology department at MUSC Health before retiring. His method of ensuring that the clonal population was only of HSCs and did not contain any MSCs allowed Mehrotra to be confident in her results.

To transplant the characterized HSCs, Mehrotra's lab first irradiated the mice to remove their existing bone marrow and make room for healthier cells. The new HSCs were then transplanted through an IV into the mice, where they gave rise to collagen-producing osteoblasts. Mehrotra found that this process replaced about 20 to 40% of the osteoblasts in the long bones of the mice, a higher replacement rate than is seen with MSC transplantation and one that points to long-term success.

"We think this has potential as a curative therapy," Mehrotra said. "By replacing the abnormal osteoblasts with normal ones that can then secrete normal collagen, we're aiming for a therapy that could result in an actual cure."

More research is needed before the technique can become a therapeutic option for patients, but Mehrotra says this is a promising start. By replacing osteoblasts, she offers a therapeutic opportunity that is not dependent on a specific mutation but simply on a mutation being present, which means it could potentially be translated to other forms of OI and other bone diseases in the future.

Mehrotra hopes to study HSC transplantation as an auto-transplantation strategy moving forward. There are two significant forms of transplantation: auto-transplantation and allotransplantation. In auto-transplantation, cells or organs are removed from one person and then transplanted back into that same person. In allotransplantation, the cells or organs come from another person, which carries the risk of rejection or graft-versus-host disease. Mehrotra proposes that by using CRISPR-Cas9 technology, scientists can correct the collagen mutation in the OI patient's HSCs, before transplanting these corrected HSCs back into the same patient's system, where they can form normal, collagen-producing osteoblasts.

By performing this study, Mehrotra addresses the controversy surrounding the potential of HSCs giving rise to osteoblasts. She hopes more researchers will consider the promise of hematopoietic stem cells in bone diseases, and she hopes to continue helping patients with debilitating bone diseases like OI lead a higher quality of life.

The research was supported by the National Institutes of Health.

Credit: 
Medical University of South Carolina

Spotted: An exoplanet with the potential to form moons

image: This image shows wide (left) and close-up (right) views of the moon-forming disk surrounding PDS 70c, a young Jupiter-like planet nearly 400 light-years away. The close-up view shows PDS 70c and its circumplanetary disk center-front, with the larger circumstellar ring-like disk taking up most of the right-hand side of the image. The star PDS 70 is at the center of the wide-view image on the left.

Two planets have been found in the system, PDS 70c and PDS 70b, the latter not being visible in this image. They have carved a cavity in the circumstellar disk as they gobbled up material from the disk itself, growing in size. In this process, PDS 70c acquired its own circumplanetary disk, which contributes to the growth of the planet and where moons can form. This disk is as large as the Sun-Earth distance and has enough mass to form up to three satellites the size of the Moon.

Image: 
ALMA (ESO/NAOJ/NRAO)/Benisty et al.

Cambridge, MA - Astronomers at the Center for Astrophysics | Harvard & Smithsonian have helped detect the clear presence of a moon-forming region around an exoplanet -- a planet outside of our Solar System. The new observations, published Thursday in The Astrophysical Journal Letters, may shed light on how moons and planets form in young stellar systems.

The detected region is known as a circumplanetary disk, a ring-shaped area surrounding a planet where moons and other satellites may form. The observed disk surrounds exoplanet PDS 70c, one of two giant, Jupiter-like planets orbiting a star nearly 400 light-years away. Astronomers had found hints of a "moon-forming" disk around this exoplanet before but since they could not clearly tell the disk apart from its surrounding environment, they could not confirm its detection -- until now.

"Our work presents a clear detection of a disk in which satellites could be forming," says Myriam Benisty, a researcher at the University of Grenoble and the University of Chile who led the research using the Atacama Large Millimetre/submillimetre Array (ALMA). "Our ALMA observations were obtained at such exquisite resolution that we could clearly identify that the disk is associated with the planet and we are able to constrain its size for the first time."

With the help of ALMA, Benisty and the team found the disk diameter is comparable to the Sun-to-Earth distance and has enough mass to form up to three satellites the size of the Moon.

"We used the millimeter emission from cool dust grains to estimate how much mass is in the disk and therefore, the potential reservoir for forming a satellite system around PDS 70c," says Sean Andrews, a study co-author and astronomer at the Center for Astrophysics (CfA).

The results are key to finding out how moons arise.

Planets form in dusty disks around young stars, carving out cavities as they gobble up material from this circumstellar disk to grow. In this process, a planet can acquire its own circumplanetary disk, which contributes to the growth of the planet by regulating the amount of material falling onto it. At the same time, the gas and dust in the circumplanetary disk can come together into progressively larger bodies through multiple collisions, ultimately leading to the birth of moons.

But astronomers do not yet fully understand the details of these processes. "In short, it is still unclear when, where, and how planets and moons form," explains ESO Research Fellow Stefano Facchini, also involved in the research.

"More than 4,000 exoplanets have been found until now, but all of them were detected in mature systems. PDS 70b and PDS 70c, which form a system reminiscent of the Jupiter-Saturn pair, are the only two exoplanets detected so far that are still in the process of being formed," explains Miriam Keppler, researcher at the Max Planck Institute for Astronomy in Germany and one of the co-authors of the study.

"This system therefore offers us a unique opportunity to observe and study the processes of planet and satellite formation," Facchini adds.

PDS 70b and PDS 70c, the two planets making up the system, were first discovered using ESO's Very Large Telescope (VLT) in 2018 and 2019 respectively, and their unique nature means they have been observed with other telescopes and instruments many times since.

These latest high resolution ALMA observations have now allowed astronomers to gain further insights into the system. In addition to confirming the detection of the circumplanetary disk around PDS 70c and studying its size and mass, they found that PDS 70b does not show clear evidence of such a disk, indicating that it was starved of dust material from its birth environment by PDS 70c.

An even deeper understanding of the planetary system will be achieved with ESO's Extremely Large Telescope (ELT), currently under construction on Cerro Armazones in the Chilean Atacama desert.

"The ELT will be key for this research since, with its much higher resolution, we will be able to map the system in great detail," says co-author Richard Teague, a co-author and Submillimeter Array (SMA) fellow at the CfA.

In particular, by using the ELT's Mid-infrared ELT Imager and Spectrograph (METIS), the team will be able to look at the gas motions surrounding PDS 70c to get a full 3D picture of the system.

Credit: 
Center for Astrophysics | Harvard & Smithsonian

COVID-19: Patients with malnutrition may be more likely to have severe outcomes

Adults and children with COVID-19 who have a history of malnutrition may have an increased likelihood of death and the need for mechanical ventilation, according to a study published in Scientific Reports.

Malnutrition hampers the proper functioning of the immune system and is known to increase the risk of severe infections for other viruses, but the potential long-term effects of malnutrition on COVID-19 outcomes are less clear.

Louis Ehwerhemuepha and colleagues investigated associations between malnutrition diagnoses and subsequent COVID-19 severity, using medical records for 8,604 children and 94,495 adults (older than 18 years) who were hospitalised with COVID-19 in the United States between March and June 2020. Patients with a diagnosis of malnutrition between 2015 and 2019 were compared to patients without.

Of the 520 (6%) children with severe COVID-19, 39 (7.5%) had a previous diagnosis of malnutrition. Of the 8,084 children with mild COVID-19, 125 (1.5%) had such a diagnosis and 7,959 (98.45%) did not. Among adults, 453 (4%) of the 11,423 (11%) with severe COVID-19 had a previous diagnosis of malnutrition, while of the 83,072 adults with mild COVID-19, 1,557 (1.8%) had one and 81,515 (98.13%) did not.
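
As a back-of-the-envelope check on these raw counts, the unadjusted odds ratios can be computed directly (illustrative arithmetic only; the study's reported estimates are adjusted for covariates such as age and sex):

    # Unadjusted odds ratios from the counts quoted above (illustrative only;
    # the published analysis adjusts for covariates).
    def odds_ratio(sev_mal, sev_none, mild_mal, mild_none):
        return (sev_mal / sev_none) / (mild_mal / mild_none)

    children = odds_ratio(39, 520 - 39, 125, 7959)        # severe vs mild, children
    adults = odds_ratio(453, 11423 - 453, 1557, 81515)    # severe vs mild, adults
    print(f"children OR = {children:.1f}, adults OR = {adults:.1f}")
    # -> children OR = 5.2, adults OR = 2.2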

Children older than five and adults aged 18 to 78 years with previous diagnoses of malnutrition were found to have higher odds of severe COVID-19 than those with no history of malnutrition in the same age groups. Children younger than five and adults aged 79 or above were found to have higher odds of severe COVID-19 if they were not malnourished compared to those of the same age who were malnourished. In children, this may be due to having less medical data for those under five, according to the authors. The risk of severe COVID-19 in adults with and without malnutrition continued to rise with age above 79 years.

The authors suggest that public health interventions for those at highest risk of malnutrition may help mitigate the higher likelihood of severe COVID-19 in this group.

Credit: 
Scientific Reports

MRI, clear cell likelihood score correlate with renal mass growth rate

image: Coronal T2-weighted single shot fast spin echo and coronal T1-weighted fat-saturated spoiled gradient echo acquired during corticomedullary phase--ccLS5 lesion outlined red for clarity.

Image: 
American Roentgen Ray Society (ARRS), American Journal of Roentgenology (AJR)

Leesburg, VA, July 22, 2021--According to ARRS' American Journal of Roentgenology (AJR), the standardized non-invasive clear cell likelihood score (ccLS)--derived from MRI--correlates with the growth rate of small renal masses (cT1a, up to 4 cm) under active surveillance.

Extracted from clinical reports, "the ccLS scores the likelihood that the small renal mass represents clear cell renal cell carcinoma, from 1 (very unlikely) to 5 (very likely)," explained corresponding author Ivan Pedrosa from the University of Texas Southwestern Medical Center at Dallas. "Small renal masses with lower ccLS may be considered for active surveillance, whereas small renal masses with higher ccLS may warrant earlier intervention."

Pedrosa and colleagues' retrospective study included consecutive small renal masses assigned a ccLS on clinical MRI examinations performed between June 2016 and November 2019 at an academic tertiary-care medical center or its affiliated safety net hospital system. Tumor size measurements were extracted from available prior and follow-up cross-sectional imaging examinations, through June 2020.

Among 389 small renal masses in 339 patients (198 men, 141 women; median age, 65 years) on active surveillance that were assigned a ccLS on clinical MRI examinations, those with ccLS4-5 grew faster (9% diameter, 29% volume yearly) than those with ccLS1-2 (5% diameter yearly), a statistically significant difference.

Noting that the lack of validated imaging markers to characterize biologic aggressiveness of small renal masses hinders medical decision making among available initial management strategies, "growth is associated with ccLS in small renal masses," the authors of this AJR article reiterated, "with higher ccLS correlating with faster growth."

Credit: 
American Roentgen Ray Society

Structural biology provides long-sought solution to innate immunity puzzle

image: Sulfatides bind to the mouse cell surface TLR4/MD-2 receptors to activate downstream signaling pathways that result in inflammation. The middle figure shows a cutaway view of MD-2 to reveal the sulfatide molecules (yellow) within.

Image: 
UT Southwestern Medical Center

DALLAS - July 20, 2021 - UT Southwestern researchers report the first structural confirmation that endogenous - or self-made - molecules can set off innate immunity in mammals via a pair of immune cell proteins called the TLR4/MD-2 receptor complex. The work has wide-ranging implications for finding ways to treat and possibly prevent autoimmune diseases such as multiple sclerosis and antiphospholipid syndrome.

The TLR4/MD-2 receptor complex is well known for its role in the body's response to infection by gram-negative bacteria. Its role in autoimmunity had long been suspected, although direct proof was lacking. The team, led by Nobel Laureate Bruce Beutler, M.D., director of the Center for the Genetics of Host Defense (CGHD), identified lipids called sulfatides that can activate the innate immunity sensor TLR4, located on a cell's membrane. Beutler's discovery of the genes behind the TLR4 receptor and its role in the body's earliest response to infection - innate immunity - led to his 2011 Nobel Prize in Physiology or Medicine.

Beutler is corresponding author of the study published this week in the Proceedings of the National Academy of Sciences that used X-ray crystallography to confirm how sulfatides bind to the receptor complex. Lead author Lijing Su, Ph.D., a CGHD assistant professor with a secondary appointment in biophysics, conducted the X-ray crystallography at UT Southwestern's Structural Biology Core Facility and at Argonne National Laboratory in Illinois.

"For many years, the question of whether endogenous - or self - molecules can activate innate immune receptors has been an important one," says Beutler, a professor of immunology and internal medicine. "Scientists had observed that our own nucleic acids can activate TLRs 3, 7, 8, and 9, causing inflammation and autoimmunity. Many endogenous ligands for TLR4, most of them proteins, have been proposed. This is the first study to substantiate the existence of such a TLR4 ligand, meaning a molecule that fits into the receptor, by structural studies."

The team's structural studies of mouse TLR4/MD-2 in complex with sulfatides gave a detailed look at how sulfatides bind to the U-shaped side of the receptor complex in order to activate it. That binding sets off biological pathways that lead to the body's inflammatory response.

The study includes some observations about differences in the way the receptor responds in mice and humans. It also raises new and important questions about how the chemical makeup of individual sulfatides might affect the way they interact with the receptor complex to activate or suppress the immune response.

"Our work demonstrates that these, or perhaps other endogenous lipids, may indeed trigger activation of TLR4," Beutler says, adding that TLR4 usually acts as a sensor of lipopolysaccharide (a lipid plus sugar molecule) - also known as endotoxin - that resides on gram-negative bacteria. TLR4-LPS binding is implicated in sepsis, a potentially deadly condition in which the immune system goes into overdrive in response to infection.

Su adds that she and others in the Beutler lab previously reported that TLR4 and its co-receptor MD-2 can be activated by neoseptin-3, a synthetic small molecule created in collaboration with the laboratory of Dale Boger, Ph.D., at The Scripps Research Institute that shares no structural similarity with the natural microbial ligand, LPS.

"Our crystal structure of mouse TLR4?MD-2 in complex with neoseptin-3 revealed that this receptor complex might accommodate multiple small molecules rather than a big molecule like LPS," Su explains. "This result led us to look for natural lipids that might bind and activate TLR4?MD-2 signaling. Among early candidates were phosphoceramides, but these failed to activate the receptor. Structural features of sulfatides, and their great abundance in some tissues, led us to test them instead, and we confirmed that some sulfatides do indeed activate TLR4."

Credit: 
UT Southwestern Medical Center

New map shows where millions of UK residents struggle to access food

In one out of every six local authorities, rates of hunger are more than 150 per cent (one and a half times) the national average. Shockingly, in one in 10 local authorities, the rate is almost double, according to new research by the University of Sheffield.

Researchers at the University of Sheffield Institute for Sustainable Food modelled data from the Food Foundation, which surveyed people across the UK, and for the first time were able to identify food insecurity at a local authority scale. Local authority percentages reveal the marked variation in levels of food insecurity between local areas - a variation that national and even regional averages hide.

According to data from the Food Foundation, in January 2021, 4.2 per cent of adults across the UK reported that during the previous month they had been hungry but unable to eat at least once, but the problem is much worse in some places with nearly one out of every ten adults going hungry.

This new analysis of the national data collected during the pandemic goes further to assess the problem at a local authority level and breaks down experiences of food insecurity into three distinct groups, mapping them for the first time:

Those who are hungry include people who indicated that they were hungry but were unable to eat food because they could not afford it, or were unable to access food in the previous month.

Those who are struggling to access food, include those who may have sought help within the last month with access to food, have cut back on meals and healthy foods to stretch tight budgets, or indicated that they struggled to access food in some way. In some places the rate is as high as 28 per cent of adults.

Those who worry about food insecurity or being able to continue to supply adequate food for their household. These people may be just about managing but could slip into food insecurity as a result of an unexpected crisis.
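
In code, the three-way grouping above could be operationalised along the following lines (the survey field names are hypothetical; the Food Foundation questionnaire uses its own wording):

    # Hedged sketch of the three-group classification; field names are
    # hypothetical stand-ins for the actual survey questions.
    def classify(response: dict) -> str:
        if response.get("went_hungry_past_month"):            # hungry but unable to eat
            return "hungry"
        if (response.get("sought_food_help")
                or response.get("cut_back_on_meals")
                or response.get("struggled_to_access_food")):
            return "struggling"
        if response.get("worried_about_food_supply"):
            return "worried"
        return "food secure"

    print(classify({"cut_back_on_meals": True}))              # -> struggling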

Data, represented on a map of the UK, identify which local authorities are most affected within all four nations. The map shows it is in England where the areas with the highest and lowest rates of food insecurity are located. At one end of the scale, the majority of local authorities in Yorkshire and the Humber are in the top 20 per cent of local authorities with the greatest percentages of adults going hungry, whereas in the East of England, the majority of local authorities are in the 20 per cent that have the lowest percentages.

Across the Sheffield City Region, both Rotherham and Barnsley are in the 20 per cent of local authorities with the highest percentages of adults who were hungry at least once in January. While Rotherham, Barnsley and Doncaster are all in the 20 per cent of local authorities with the highest percentages of adults who were most likely to have struggled to access food.

The area worst hit by food insecurity is Wycombe, where an estimated 14 per cent of people are hungry and nearly 30 per cent struggle to access food. The area also has a high estimate for the number of people worried about having enough food (22 per cent). It is followed closely by Hull, with 13 per cent of people going hungry and more than one in five adults struggling to access food. The locality with the least hunger, struggle or worry is St. Albans.

Food insecurity is the inability to consistently afford, access and utilise the food needed to maintain good health and wellbeing. The problem has become well known in recent years due to the rise in reporting on food bank usage in the UK; the Food Foundation has been tracking household food insecurity UK-wide, showing more people living with the daily effects of poor diets and limited food access.

Typically, measures of moderate and severe food insecurity use three indicators: skipping meals for a whole day or more; not having enough food and going hungry; and shrinking or skipping meals. Some 7.4 per cent of adults reported one or more of these experiences in the month running up to January 2021.

In this study, the second indicator forms the "hungry" measure. The "struggling to access food" measure includes those who skipped or shrank meals, those who indicated that they sought help when they were food insecure, and those who gave a reason for not having enough food.

The worry measure is not generally included in other estimates of food insecurity; however, this group is likely to be at increased risk of food insecurity. Its members may struggle to include healthy food in their diets while bearing the mental strain of trying to make their budgets stretch.

Dr Megan Blake from the University of Sheffield Institute for Sustainable Food, who collaborated on the work, said: "This new map, for the first time, makes visible the patterns of food insecurity across the UK. While no one should have to go hungry, struggle to get or worry about having enough food, in some places it is at proportions that are especially shocking, particularly as we are a wealthy country. If we are going to recover from Covid-19 we must address this problem."

This new analysis of the data shows how not everyone living with food insecurity experiences hunger on the same level, making apparent that an alarmingly large number of people also actively plan to go without food to make ends meet, or worry about doing so. The burden of these forms of food insecurity also immediately threatens people's health and wellbeing.

For some, living on the edge of food insecurity may mean an event such as a car or boiler breakdown or an unexpected illness, could be the tipping point that squeezes their food budget, and they have to plan to eat less, or skip meals altogether.

This in turn can be linked to higher rates of illness and obesity due to people buying cheaper foods, which store longer and are more filling, over healthier choices.

Kris Gibbon-Walsh, from FareShare - the UK's national network of charitable surplus food redistributors, and the Emergencies Partnership, said: "The best way to become food secure is to have the health and money to go to the shops, buy the nutritious food you need for your family and the knowledge to cook it. People should not need to rely on charity to access food, however charities can do a huge amount of good work with food to support their communities, bring people together and stop them becoming food insecure in the first place.

"Having a map of food insecurity in the UK allows FareShare and other food organisations to understand the geographical implications of where we send our food and make decisions, not only about where the food goes, but also how the local foodscape can best support its community out of food insecurity."

Anna Taylor, Executive Director, Food Foundation said: "We've known from tracking food insecurity at a national level during the pandemic that Covid-19 has pushed households across the UK further into hardship and forced a newly vulnerable segment of society to seek help for the first time.

"Local authorities have played a leading role in strengthening charitable food provision for the vulnerable during the crisis and many now have a good understanding of levels of need in their area. But this new study makes clear the variation in food access and vulnerability to food insecurity from one local authority to the next across the UK. It should provide a valuable resource to inform and drive targeted policy action at the local level in response to the data to eradicate food insecurity and deliver on the levelling up agenda."

Dr Megan Blake added: "We hope this clear breakdown of the data will be a useful resource for local authorities and the government to use to address the challenges facing all people living with food insecurity, and that help can be tailored and targeted to those communities who need it, as the answer is not as simple as opening more food banks."

"Food insecurity is undermining our chances of recovery after Covid-19. We need to urgently address this issue that pervades so many of our communities. No one should have to be hungry."

Credit: 
University of Sheffield

Journey from smoking to vaping variable - Otago academics

Persistence may be the key when quitting smoking using an electronic nicotine delivery system (ENDS), commonly known as vaping, a University of Otago study found.

Researchers found people attempting to switch from cigarettes to ENDS reported highly varied smoking and ENDS use. They recommend people persist in their attempts to transition away from smoking, even if their progress feels slow and uncertain.

Lead author Associate Professor Tamlin Conner, of the Department of Psychology, says, although people may plan to use ENDS exclusively instead of cigarettes, making the switch is not always straightforward.

"We found that dual use of ENDS and cigarettes was very common, suggesting that people tested but had difficulty successfully substituting ENDS for smoking," she says.

For the study, published this month in International Journal of Environmental Research and Public Health, the researchers recruited 45 people who wanted to quit smoking, gave each participant an ENDS device, and prompted participants to report their ENDS use and smoking each day for up to 20 weeks using a smartphone survey.

The researchers examined how participants' patterns of smoking and ENDS use changed over this extended period. The daily diary survey was part of a wider mixed qualitative and quantitative study of smoking to ENDS transitions funded by the Royal Society of New Zealand Marsden Fund to Principal Investigator Professor Janet Hoek and her collaborators.

Aided by innovative data visualisations created by co-authors Dr Jimmy Zeng and Vicky He, the research team found considerable variety in how people used their ENDS.

The most common behaviour pattern was dual use of cigarettes and ENDS, then exclusive ENDS use and back to dual use. A smaller group reported moving from dual use to exclusive smoking, and often back to dual use. Another small group reported moving between abstinence and different ENDS use and smoking behaviours.

"Not any one person's journey to quit smoking is the same. Some people take up exclusive vaping relatively quickly in several weeks, others take longer from 12 - 20 weeks, or do not transition at all and continue smoking. People who wish to switch to exclusive vaping should view this variability as typical and should persevere in their efforts to switch," Associate Professor Conner says.

Professor Hoek believes this variability has implications for smoking cessation programmes.

"Classifying behaviours using point-prevalence categories, like achieving 'smoke free' status after 12 weeks, may be simplistic, especially during the early stages of an ENDS-assisted quit attempt. Cessation support programmes involving ENDS may need to run for longer than the conventional 12-weeks to ensure people have access to support during what could be an extended period of movement between smoking and ENDS use," she explains.

Associate Professor Conner also wanted to remind people that ENDS are not a risk-free alternative to smoking cigarettes.

"Although vaping can assist some people in quitting cigarettes, people should regard their use as a transition phase and aim to quit vaping when they think their risk of relapse back to smoking is very low."

Credit: 
University of Otago

Longer stays in refugee camps increase cases of acute mental illness

A new quantitative study suggests people seeking asylum are more likely to experience mental health deterioration as they spend more time living in refugee camps, backing up qualitative evidence from aid organisations.

The research, co-authored by Dr Francisco Urzua from the Business School (formerly Cass) alongside practitioners from Moria Medical Support (MMS) and academics from Universidad del Desarrollo, Chile, and the University of Amsterdam, the Netherlands, measured the incidence of acute mental health crises arising from extended stays in the Moria refugee camp on the Greek island of Lesbos.

Key findings from the study include:

Acute mental health crises were significantly linked with the length of time somebody stayed in the Moria refugee camp: the longer a refugee stayed in the camp, the more likely they were to suffer a mental health crisis.

A 10 per cent increase in the number of days spent in the camp led to a 3.3 per cent increase in the chances of a refugee suffering a mental health crisis - a significant factor given an average length of stay of 70.6 days.

Refugees of Iranian, Iraqi and Syrian ethnicity were most significantly affected by longer stays in the refugee camps, with male refugees more likely to experience incidences of acute mental health crises than women as time spent in the refugee camp increased.
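
Expressed as an elasticity, the second finding above amounts to the following (a plain reading of the headline numbers, not the paper's own model specification):

    % Elasticity implied by the headline numbers (a reading, not the study's model):
    \[
      \varepsilon = \frac{\%\Delta \Pr(\text{crisis})}{\%\Delta(\text{days in camp})}
                  = \frac{3.3\%}{10\%} \approx 0.33
    \]
    % At the mean stay of 70.6 days, a 10% increase is roughly 7 extra days,
    % raising the probability of a crisis by 3.3% of its baseline value
    % (a relative, not an absolute, increase).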

The study used three months of anonymised data from MMS, a transitory clinic that offered night-time medical services to the island at the time, between January and April 2018. This included Psychological First Aid (PFA) and psychiatric crisis management, with patient data on age, gender, ethnicity and length of stay in the camp.

An acute mental health crisis is defined as a case of somebody harming themselves through a non-accidental, self-inflicted wound, attempting suicide such that hospital care is required, or experiencing a state of unease marked by anxiety, nervous agitation or undirected aggression.

Dr Urzua said that results supported prior claims about the quality of life in refugee camps, and that actions should be taken to safeguard inhabitants throughout the asylum process.

"The EU-Turkey deal of 2016 has seen camp populations multiply in size, but adequate mental health care provisions have not been expanded or improved in equal measure," Dr Urzua said.

"Our study expands upon existing qualitative evidence that the prolonged system of asylum has detrimental effects on mental health, brought on by poor living conditions of refugee camps.

"This mental health deterioration not only affects the individuals themselves but also has significant repercussions for fellow refugees with increased physical violence and the destabilisation of an often close-knit social environment, which in turn affects the mental wellbeing of others. Furthermore, the implications of deteriorating mental health most likely continue even after release, which makes it harder for refugees to integrate into a new society.

"It is clear to see from our study and prior anecdotal evidence that mental health in these camps is a serious problem, and it is imperative that policymakers from across Europe take action and uphold the 1951 Geneva Refugee Convention to protect the rights and wellbeing those awaiting and granted asylum."

Dr Willemine van de Wiel, doctor and coordinator at Moria Medical Support, said more needed to be done to improve conditions at the Moria camp and others in the northern hemisphere.

"During our time on the Island of Lesbos, my overwhelming feeling was frustration at the conditions in the camp - a sentiment shared by many seasoned NGO-workers.

"In our experience, refugees are better off in many camps in the global south in terms of safety, housing, access to food, sanitation and medical services.

"I hope this research adds to public awareness about the psychological impact of life in these camps and inspires the development of a more humane asylum process."

Credit: 
City St George’s, University of London

Researchers automate brain MRI image labelling, more than 100,000 exams labelled in under 30 minutes

Researchers from the School of Biomedical Engineering & Imaging Sciences at King's College London have automated the labelling of brain MRI images, needed to teach machine learning image recognition models, by deriving important labels from radiology reports and accurately assigning them to the corresponding MRI examinations. Now, more than 100,000 MRI examinations can be labelled in less than half an hour.

Published in European Radiology, this is the first study allowing researchers to label complex MRI image datasets at scale.

The researchers say it would take years to manually label more than 100,000 MRI examinations.

Deep learning typically requires tens of thousands of labelled images to achieve the best possible performance in image recognition tasks. This represents a bottleneck to the development of deep learning systems for complex image datasets, particularly MRI which is fundamental to neurological abnormality detection.

Senior author, Dr Tom Booth from the School of Biomedical Engineering & Imaging Sciences at King's College London said: "By overcoming this bottleneck, we have massively facilitated future deep learning image recognition tasks and this will almost certainly accelerate the arrival into the clinic of automated brain MRI readers. The potential for patient benefit through, ultimately, timely diagnosis, is enormous."

Dr Booth said their validation was uniquely robust. As well as evaluating model performance on unseen radiology reports, they evaluated model performance on unseen images.

"While this might seem obvious, this has been challenging to do in medical imaging because it requires an enormous team of expert radiologists. Fortunately, our team is a perfect synthesis of clinicians and scientists," Dr Booth said.

Lead Author, Dr David Wood from the School of Biomedical Engineering & Imaging Sciences said: "This study builds on recent breakthroughs in natural language processing, particularly the release of large transformer-based models such as BERT and BioBERT which have been trained on huge collections of unlabeled text such as all of English Wikipedia, and all PubMed Central abstracts and full-text articles; in the spirit of open-access science, we have also made our code and models available to other researchers to ensure that as many people benefit from this work as possible."
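
As a rough illustration of how such a transformer classifier labels a report at inference time (the checkpoint path and example label below are placeholders; the team's released code and models remain the authoritative reference):

    # Sketch of transformer-based report labelling. The model path and label
    # set are placeholders, not the released King's College London models.
    from transformers import pipeline

    classifier = pipeline("text-classification", model="path/to/finetuned-biobert")

    report = "MRI head: mature cortical infarct in the left MCA territory."
    print(classifier(report))
    # e.g. [{'label': 'infarct', 'score': 0.97}]  (labels depend on fine-tuning)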

The authors say that while one barrier has now been overcome, further challenges will be, firstly, to perform the deep learning image recognition tasks which also have multiple technical challenges; and secondly, once this is achieved, to ensure the developed models can still perform accurately across different hospitals using different scanners.

Dr Booth said: "This study was possible thanks to a very broad team of experts who are working on these challenges. There is a huge base of supporting organisers and facilitators who are equally important in delivering this research. Obtaining clean data from multiple hospitals across the UK is an important step to overcome the next challenges. We are running an NIHR portfolio adopted study across the UK to prospectively collect brain MRI data for this purpose."

Credit: 
King's College London

Scientists come up with new method for simultaneous processing of different types of waste

image: Deputy Head of the Department of Functional Nanosystems and High-Temperature Materials Yuri Konyukhov with a lab employee.

Image: 
Sergey Gnuskov/NUST MISIS

An international research team has come up with an innovative method for metal recovery from industrial waste. The new method allows the simultaneous recovery of multiple metals from waste oxides in a single process. This novel route will lower the burden on waste storage facilities, with significant contributions to the economic and environmental sustainability of industrial waste management. The study was published in the Journal of Environmental Management. This work is the first in a series of studies aimed at developing cost-effective and environmentally sustainable solutions for industrial waste recycling.

Some of the major industries, such as coal- and biomass-based power generation, the iron and steel sector, aluminium production and water treatment, are known to produce huge amounts of waste rich in aluminium and iron oxides: fly ash from the combustion of coal and biomass, mill scales, red mud, biochar (the char coproduct from the thermochemical processing of biomass, utilized as a soil amendment and/or carbon sequestration agent) and water treatment residues. Produced in quantities ranging from hundreds to billions of tonnes, these wastes cause immense disposal issues. Improper disposal can leach metals into the environment, resulting in serious ecological damage and adverse effects on humans. However, current waste management methods are economically unviable and environmentally unsustainable.

These industrial wastes can be a valuable secondary metals resource, scientists believe. A group of researchers from NUST MISIS, the University of New South Wales, Plekhanov Russian University of Economics, Universidad Andres Bello, Institute of Minerals and Materials Technology under the Council of Scientific and Industrial Research of India have developed a new technology that allows the simultaneous recovery of multiple metals from waste oxides in a single process, which, in turn, helps reduce the costs of waste processing.

The scientists used carbothermal reduction to extract metals from industrial wastes rich in oxides of iron, aluminium, silicon and other metals.

"Simply put, we used carbon and high temperatures to extract metals from the oxides in the waste. Key innovation of this study was to lower the reduction temperature for alumina, thus making it possible to recover it simultaneously along with iron and silicon that have lower reduction temperature," noted Yuri Konyhov, Deputy Head of the Department of Functional Nanosystems and High-Temperature Materials at NUST MISIS.

This novel approach could significantly enhance economic and environmental sustainability of managing industrial waste as it allows mixing various types of waste together and processing of large amounts of waste, the researchers believe.

Credit: 
National University of Science and Technology MISIS

Oral and general health associations using machine learning prediction algorithms

Alexandria, Va., USA - Muthuthanthrige Cooray, Tohoku University, Sendai, Japan, presented the oral session "Oral and General Health Associations Using Machine Learning Prediction Algorithms" at the virtual 99th General Session & Exhibition of the International Association for Dental Research (IADR), held in conjunction with the 50th Annual Meeting of the American Association for Dental Research (AADR) and the 45th Annual Meeting of the Canadian Association for Dental Research (CADR), on July 21-24, 2021.

General health and oral health are conventionally treated as separate entities within healthcare delivery; however, most general health and oral health problems share common risk factors, and both affect overall well-being. This study investigated the robustness of the association between general health and oral health using machine learning prediction.

Analyses included 19,862 participants aged 65 years or older from the 2016 Japanese Gerontological Evaluation Study. The XGBoost machine learning algorithm was used to predict self-rated oral health using general health-related predictors (frailty, psychological status, comorbidity) and self-rated general health using oral health-related predictors (poor occlusion, chewing difficulty, and dry mouth). Age, sex, household income and smoking were added as common predictors for both models. Predictors were selected based on the literature and data availability.
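
A minimal sketch of this bidirectional set-up, assuming a hypothetical flat extract of the survey (the file name, column names and evaluation metric below are illustrative, not the study's code):

    # Sketch of the two XGBoost models described above (file, column names and
    # metric are illustrative assumptions; outcome columns coded 0/1).
    import pandas as pd
    import xgboost as xgb
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("jages_2016_extract.csv")    # hypothetical flat extract

    common = ["age", "sex", "household_income", "smoking"]
    general = ["frailty", "psychological_status", "comorbidity"]
    oral = ["poor_occlusion", "chewing_difficulty", "dry_mouth"]

    # Model 1: general-health predictors -> poor self-rated oral health
    auc_oral = cross_val_score(xgb.XGBClassifier(eval_metric="logloss"),
                               df[general + common], df["poor_oral_health"],
                               scoring="roc_auc").mean()

    # Model 2: oral-health predictors -> poor self-rated general health
    auc_general = cross_val_score(xgb.XGBClassifier(eval_metric="logloss"),
                                  df[oral + common], df["poor_general_health"],
                                  scoring="roc_auc").mean()

    print(f"oral->general AUC {auc_general:.2f} vs general->oral AUC {auc_oral:.2f}")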

The prevalence of poor self-rated oral health (28.6%) was higher than that of poor self-rated general health (12.4%). Among those with poor self-rated oral health, 20.6% also reported poor self-rated general health, whereas 47.7% of those with poor self-rated general health also reported poor self-rated oral health.

A robust association existed in both directions - from general health to oral health and from oral health to general health - as demonstrated by accurate prediction models. Oral health appeared to have a higher capacity for predicting general health than general health had for predicting oral health. General health and oral health factors should be considered collectively when planning healthcare for older adults.

View this oral session in the IADR General Session Virtual Experience Platform.

Credit: 
International Association for Dental, Oral, and Craniofacial Research

New algorithm flies drones faster than human racing pilots

image: A drone flying through smoke to visualize the complex aerodynamic effects.

Image: 
Robotics and Perception Group, University of Zurich

To be useful, drones need to be quick. Because of their limited battery life they must complete whatever task they have - searching for survivors on a disaster site, inspecting a building, delivering cargo - in the shortest possible time. And they may have to do it by going through a series of waypoints like windows, rooms, or specific locations to inspect, adopting the best trajectory and the right acceleration or deceleration at each segment.

Algorithm outperforms professional pilots

The best human drone pilots are very good at doing this and have so far always outperformed autonomous systems in drone racing. Now, a research group at the University of Zurich (UZH) has created an algorithm that can find the quickest trajectory to guide a quadrotor - a drone with four propellers - through a series of waypoints on a circuit. "Our drone beat the fastest lap of two world-class human pilots on an experimental race track", says Davide Scaramuzza, who heads the Robotics and Perception Group at UZH and the Rescue Robotics Grand Challenge of the NCCR Robotics, which funded the research.

"The novelty of the algorithm is that it is the first to generate time-optimal trajectories that fully consider the drones' limitations", says Scaramuzza. Previous works relied on simplifications of either the quadrotor system or the description of the flight path, and thus they were sub-optimal. "The key idea is, rather than assigning sections of the flight path to specific waypoints, that our algorithm just tells the drone to pass through all waypoints, but not how or when to do that", adds Philipp Foehn, PhD student and first author of the paper.

External cameras provide position information in real-time

The researchers had the algorithm and two human pilots fly the same quadrotor through a race circuit. They employed external cameras to precisely capture the motion of the drones and - in the case of the autonomous drone - to give real-time information to the algorithm on where the drone was at any moment. To ensure a fair comparison, the human pilots were given the opportunity to train on the circuit before the race. But the algorithm won: all its laps were faster than the human ones, and the performance was more consistent. This is not surprising, because once the algorithm has found the best trajectory it can reproduce it faithfully many times, unlike human pilots.

Before commercial applications, the algorithm will need to become less computationally demanding, as it currently takes up to an hour for a computer to calculate the time-optimal trajectory for the drone. Also, at the moment, the drone relies on external cameras to compute where it is at any moment. In future work, the scientists want to use onboard cameras. But the demonstration that an autonomous drone can in principle fly faster than human pilots is promising. "This algorithm can have huge applications in package delivery with drones, inspection, search and rescue, and more," says Scaramuzza.

Credit: 
University of Zurich