Culture

Study finds "thriving gap" between students who attend high school remotely vs. in person

Washington/Philadelphia, July 14, 2021--New research finds that high school students who attended school remotely during the COVID-19 pandemic suffered socially, emotionally, and academically compared with those who attended in person.

The study was published today in Educational Researcher (ER) by researchers Angela L. Duckworth, Tim Kautz, Amy Defnet, Emma Satlof-Bedrick, Sean Talamas, Benjamin Lira, and Laurence Steinberg. ER is a peer-reviewed journal of the American Educational Research Association.

"Many news stories have reported on individual stories of teenagers who have suffered from anxiety, depression, and other mental health challenges during the pandemic," said lead author Duckworth, a professor at the University of Pennsylvania and the founder and CEO of Character Lab. "This study gives some of the first empirical evidence of how learning remotely has affected adolescent well-being."

The study found a social, emotional, and academic "thriving gap" between students who had been attending school in person and their counterparts who had been attending remotely. The greater suffering of students attending school remotely held up when controlling for how students were faring on the same dimensions prior to the pandemic. Though not enormous in magnitude, the thriving gap was consistent across gender, race/ethnicity, and socioeconomic status--and even small effects are noteworthy when they impact millions of individuals.

On a 100-point scale, in-person students were rated higher than remote students on levels of social well-being (77.2 versus 74.8), emotional well-being (57.4 versus 55.7), and academic well-being (78.4 versus 77.3).
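
To make the study's "controlling for" step concrete, the sketch below (Python, on synthetic data, not the study's dataset or model) regresses a follow-up well-being score on a remote-schooling indicator while adjusting for each student's pre-pandemic baseline; the coefficient on the indicator plays the role of the adjusted thriving gap.

```python
# Illustrative only: synthetic data standing in for the study's surveys.
# October well-being is regressed on schooling mode plus the February baseline,
# so the "remote" coefficient estimates the gap net of pre-pandemic differences.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 6500                                           # sample size like the study's
baseline = rng.normal(75, 10, n)                   # pre-pandemic score (0-100 scale)
remote = rng.integers(0, 2, n)                     # 1 = attended remotely
followup = baseline + rng.normal(0, 5, n) - 2.4 * remote  # hypothetical gap of 2.4

df = pd.DataFrame({"followup": followup, "baseline": baseline, "remote": remote})
fit = smf.ols("followup ~ remote + baseline", data=df).fit()
print(fit.params["remote"])                        # recovers roughly -2.4
```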

"Notably, the thriving gap was larger among students in 10th through 12th grades than it was among ninth graders," said Steinberg, a professor at Temple University.

"As policymakers gear up for national tutoring and remediation programs--which we agree are urgent priorities--we must recognize that our nation's students are not just lagging as performers, they are suffering as people," Duckworth said. "Meeting their intrinsic psychological needs--for social connection, for positive emotion, and authentic intellectual engagement--is a challenge that cannot wait."

As part of an ongoing research partnership with Orange County Public Schools, a large and demographically diverse public school district in Florida, the study authors had already administered the Character Lab Student Thriving Index--a confidential survey assessing students' current social, emotional, and academic experience--to over 6,500 students in February of 2020, just before the pandemic shut down schools.

Several months later, families in this district were offered the option of remote versus in-person classes for the 2020-21 school year. Two thirds of the students in the sample ended up attending school remotely, and one third attended school in person. Regardless of whether they were learning from home or attending classes at school, the same students completed the Student Thriving Index again in October 2020.

To capture social well-being, the survey included questions about fitting in at school, whether there was an adult in their school to whom they could turn for support or advice, and whether there was an adult in their school who always wanted them to do their best. For emotional well-being, teens reported how often they were feeling happy, relaxed, and sad, as well as how they were feeling overall about their life. And for academic well-being, the survey asked how interesting teens found their classes, how important they found it to do well in their classes, and how confident they were that they could succeed in their classes if they tried.

Credit: 
American Educational Research Association

Oncotarget: CEA as a blood-based biomarker in anal cancer

image: Frequency of elevated CEA according to disease status of SCCA.

Image: 
Correspondence to - Van K. Morris - vkmorris@mdanderson.org

Oncotarget published "CEA as a blood-based biomarker in anal cancer," which reported that mean carcinoembryonic antigen (CEA) levels, compared across subgroups by clinical status at the time of presentation to the authors' institution, were highest among patients with metastatic squamous cell carcinoma of the anal canal (SCCA) to visceral organs; however, this finding was not statistically significant by ANOVA.

By clinical subgroup, the percentage of patients with an abnormally elevated CEA was highest in those with metastatic disease to lymph nodes, followed by recurrent/unresectable SCCA and metastatic SCCA to visceral organs, and the difference between groups was statistically significant.

Using RECIST criteria for tumor progression and disease response, the mean change in CEA for patients with progression was an increase of 19 ng/mL, compared with a change of -7.3 ng/mL in those with disease response.

The authors likewise assessed whether CEA levels were associated with survival outcomes for all patients with metastatic SCCA, and found no correlation between CEA and likelihood of survival in a ROC analysis.

Despite interesting patterns of abnormally high CEA in SCCA patients with advanced disease, and correlation of increased CEA with disease progression, CEA is not associated with survival outcomes in SCCA, and is not a clinically relevant biomarker in this disease.

Dr. Van K. Morris from The University of Texas MD Anderson Cancer Center said, "Squamous cell carcinoma of the anal canal (SCCA) is a rare cancer of the anogenital tract, with an estimated incidence of about 8,500 new cases and 1,350 deaths in the U.S. in 2020, comprising 2-3% of all gastrointestinal malignancies."

Routine, readily available blood-based markers are often utilized in the clinical management of patients with solid tumors across a variety of clinical settings.

For example, trends in biomarkers such as carcinoembryonic antigen, carbohydrate antigen 19-9, prostate-specific antigen and carbohydrate antigen 125 can be monitored serially over time for patients with colorectal cancer, pancreatic cancer, prostate cancer, and ovarian cancer, respectively, as a surrogate for changes in the amount of tumor present.

To date, no blood-based biomarker for tracking responses to HPV-associated cancers is readily available to clinical oncologists for routine use.

Among anal cancer patients, one series examined 106 patients with early-stage SCCA treated definitively with chemoradiation and did not find clinical utility in the measurement of CEA in this subset of patients with anal cancer.

Since no blood-based biomarkers are currently available in a CLIA-certified laboratory for the routine management of SCCA, we performed a retrospective, single-institution study to correlate serum CEA levels with clinical and pathologic outcomes in patients across all stages and presentations of SCCA.

The Morris Research Team concluded in their Oncotarget Research Output, "we report the largest series to describe CEA as a serum biomarker for patients with metastatic SCCA. Our findings may not provide definitive support for the use of a routinely used blood-based assay for management of patients with SCCA and should guide clinicians in seeking alternative approaches for tracking responses to treatment in this disease. Nonetheless, novel approaches with serum biomarkers are needed for patients with this rare but increasingly diagnosed malignancy."

Credit: 
Impact Journals LLC

New system for tracking macaws emphasizes species' conservation needs

New data on macaw movements gathered by The Macaw Society at the Texas A&M University College of Veterinary Medicine & Biomedical Sciences (CVMBS) has the potential to greatly improve conservation strategies for the scarlet macaw, as well as for similar species of large parrots.

While the overall conservation status of the scarlet macaw is listed as "least concern" by the International Union for Conservation of Nature, the species is declining across much of Central America and in other parts of its range in South America. The species also shares its habitat with numerous endangered species and influences the ecosystems in which it lives.

The Macaw Society's paper, recently published in Avian Conservation & Ecology, is the result of a long-term research study of the ecology and conservation of macaws and other parrots in Peru's Tambopata National Reserve.

Historically, the migratory movements of large parrots and macaws have largely remained a mystery because of the difficulty of tracking them over the long distances they travel. The recent publication, describing discoveries made by satellite tracking of individual birds over large areas, sheds some light on this mystery.

The research team -- consisting of associate professor Donald Brightsmith; adjunct associate professor Janice Boyd; Elizabeth Hobson from the University of Cincinnati; and Charles Randel from the Southwestern Wildlife Survey in California -- used ARGOS satellite telemetry (orbiting satellites that detect signals emitted from a transmitter attached to an animal) to track six scarlet macaws and four blue-and-yellow macaws over a period of eight years.

They found that both macaw species had very large home ranges, consisting of thousands of hectares (with 1 hectare equaling 2.471 acres), and often traveled 20 to 40 km (approximately 12 to 25 miles) per day. Individuals of both species moved up to 160 km (99 miles) during the periods of low food availability, likely searching for areas with dense patches of food trees.

Credit: 
Texas A&M University

How can biomedical research data be made interoperable?

The concept of interoperability describes the ability of different systems to communicate. This is a major challenge in biomedical research, and in particular in the field of personalised medicine, which is largely based on the compilation and analysis of numerous datasets. The COVID-19 pandemic, for instance, has shown that even when the technical, legal and ethical constraints are lifted, data remain difficult to analyse because of semantic ambiguities.

Under the auspices of the Swiss Personalized Health Network (SPHN) and in close collaboration with representatives from all five Swiss university hospitals and eHealth Suisse, a team of scientists from the University of Geneva (UNIGE) and the University Hospitals of Geneva (HUG), together with the SIB Swiss Institute of Bioinformatics and the Lausanne University Hospital (CHUV), has developed the strategy for a national infrastructure adopted by all Swiss university hospitals and academic institutions. This pragmatic strategy is based on the development of a common semantic framework that does not aim to replace existing standards, but to use them in a synergistic and flexible way according to the needs of the research and the partners involved. The implementation of this strategy, which has already started, marks a crucial step in stimulating research and innovation for a truly personalised medicine in Switzerland. Read more in the journal JMIR Medical Informatics.

Personalised medicine is based on the exploitation and analysis of large quantities of data, whether genomic, epidemiological or from medical imaging, to extract meaning. To do this, it is essential to cross-reference and aggregate mutually intelligible data, even when they come from very different sources.

With this in mind, the Swiss government created the Swiss Personalized Health Network (SPHN) in 2017, an initiative placed under the leadership of the Swiss Academy of Medical Sciences, in collaboration with the SIB Swiss Institute of Bioinformatics, that aims to promote the use and exchange of health data for research. "Despite major investments over the past decade, there are still major disparities", says Christian Lovis, director of the Department of Radiology and Medical Informatics at the UNIGE Faculty of Medicine and head of the Division of Medical Information Sciences at the HUG. "This is why we wanted, with our partners and the SPHN, to propose a strategy and common standards that are flexible enough to accommodate all kinds of current and future databases."

A three-pillar strategy

We communicate using three main standards: the meaning we give to things, because we must agree on a common basis in order to understand each other; a technical standard, the sound with which we speak; and finally, the organisation of meaning and sound into sentences and grammar to structure communication in an intelligible way. "In terms of data, it's the same thing," explains Christophe Gaudet-Blavignac, a researcher in the team led by Christian Lovis. "You have to agree on a semantics to represent conceptually what has to be communicated. Then we need a compositional language to combine these meanings with all the freedom required to express everything that needs to be expressed. And finally, depending on the projects and research communities involved, this will be 'translated' as needed into data models, which are as numerous as the languages spoken in the world."

"Our aim has therefore been to unify vocabularies so that they can be communicated in any grammar, rather than creating a new vocabulary from scratch that everybody would have to learn anew", says Christian Lovis. "In this sense, the Swiss federalism is a huge advantage: it has forced us to imagine a decentralised strategy, which can be applied everywhere. The constraint has therefore created the opportunity to develop a system that works despite local languages, cultures and regulations." This makes it possible to apply specific data models for only the last step to be adapted to the formats required by a particular project -- the Food and Drug Administration (FDA) format in the case of collaboration with an American team, for example, or any other specific format used by a particular country or research initiative. This constitutes a guarantee of mutual understanding and a huge time saving.

No impact on data protection

However, data interoperability does not mean systematic data sharing. "The banking world, for example, has long since adopted global interoperability standards," stresses Christophe Gaudet-Blavignac. "A simple IBAN can be used to transfer money from any account to any other. However, this does not mean that anyone, be they individuals, private organisations or governments, can know what is in these accounts without a strict legal framework." Indeed, a distinction must be made between the instruments that create interoperability and their implementation, on the one hand, and the regulatory framework that governs their accessibility, on the other.

Strategy implementation

This strategy has been implemented stepwise in Switzerland since the middle of 2019, in the framework of the Swiss Personalized Health Network. "Swiss university hospitals are already following the proposed strategy to share interoperable data for all multicentric research projects funded by the SPHN initiative", reports Katrin Crameri, director of the Personalized Health Informatics Group at SIB in charge of the SPHN Data Coordination Centre. Further, some hospitals are starting to implement this strategy beyond the SPHN initiative.

Credit: 
Université de Genève

Obstacles on the racetrack of life

image: You can imagine transcription as an obstacle race in which the RNA polymerase has to overcome a number of hurdles. It is particularly difficult for it if the "rider" - the protein SPT6 - is missing.

Image: 
Sandy Westermann

The coronavirus pandemic has ensured that the term "mRNA" is now widely known beyond laboratories and lecture halls. However, the molecule is much more than an important component of a successful vaccine against the SARS-CoV-2 virus. "mRNAs are a central component of all living things on our planet. Without them, life as we know it would not function," says Elmar Wolf.

Wolf is a professor of tumour systems biology at the Department of Biochemistry and Molecular Biology at the University of Würzburg. With his research team, he has now deciphered new details about the formation of mRNA that provide novel insights into how a fundamental process inside cells works: transcription. The team presents the results of their research in the current issue of Molecular Cell.

Information becomes protein

Transcription: anyone who remembers their biology lessons knows that this is the process by which the genetic information in DNA is transcribed into messenger RNA - or, as scientists like to call it, mRNA. Only the mRNA is capable of transmitting the information from the genetic material of the DNA in the nucleus of the cell to the sites of protein biosynthesis outside the nucleus. "The mRNA composition thus decides how the cells of our body look and how they function," Wolf says.

The transcription process from DNA to mRNA sounds relatively simple: "You can think of transcription as an obstacle race. The RNA polymerase starts the reading process at the beginning of the gene, then moves through the entire gene and finally reaches the finish line," Wolf explains. If the polymerase makes it through to the end, the mRNA has been produced. Scientists have long known that a lot can go wrong in this process. After all, many genes are a long "race track" with plenty of obstacles.

Polymerase fails in difficult places

In order to better understand what happens at the molecular level during the race, Wolf and his team took a close look at the process of transcription. "We studied an important component of the RNA polymerase: the protein SPT6," explains Wolf. The question they explored is: "Is SPT6 important for the process of transcription and - if so - in what way?"

What do scientists do when they want to learn about the function of a protein? They remove it from the cells and see what happens. That's exactly what Wolf and his team did. The result was quite clear: "Interestingly, RNA polymerase starts making mRNA even in the absence of SPT6," Wolf says. But then it regularly gets stuck in difficult places - you could say that it falls over an obstacle.

New picture of transcription

This failure has two consequences that have a negative impact on cell function: On the one hand, hardly any RNA polymerase makes it to the destination, which is why hardly any mRNA is produced. On the other hand, the gene itself is also affected. "Without SPT6, the polymerase destroys the obstacles and the racetrack, which is why functional RNA polymerases are then unable to find their way," says Wolf. Thus, it is clear that the SPT6 protein is a central element in the production of mRNA in cells.

With these findings, the researchers are helping to shed more light on the process of transcription: "Until now, scientists had assumed that the only thing that mattered for mRNA production was how many RNA polymerases started transcription," Wolf says. Thanks to the results that have now been published, it is now clear that by no means all RNA polymerases that start the transcription process actually make it to the end of the gene and that the protein SPT6 is essential for this arrival.

Credit: 
University of Würzburg

A star in a distant galaxy blew up in a powerful explosion, solving an astronomical mystery

image: Hubble Space Telescope color composite of the electron-capture supernova 2018zd and the host starburst galaxy NGC 2146

Image: 
NASA/STScI/J. DePasquale; Las Cumbres Observatory

Dr. Iair Arcavi, a Tel Aviv University researcher at the Raymond and Beverly Sackler Faculty of Exact Sciences, participated in a study that discovered a new type of stellar explosion - an electron-capture supernova. While such supernovae have been theorized for 40 years, real-world examples have been elusive. They arise from the explosions of stars 8-9 times the mass of the sun. The discovery also sheds new light on the thousand-year mystery of the supernova of A.D. 1054, which was seen by ancient astronomers before eventually becoming the Crab Nebula that we know today.

A supernova is the explosion of a star following a sudden imbalance between two opposing forces that shaped the star throughout its life. Gravity tries to contract every star. Our sun, for example, counterbalances this force through nuclear fusion in its core, which produces pressure that opposes the gravitational pull. As long as there is enough nuclear fusion, gravity will not be able to collapse the star. Eventually, however, nuclear fusion will stop, just as a car runs out of gas, and the star will collapse. For stars like the sun, the collapsed core is called a white dwarf. The material in white dwarfs is so dense that quantum forces between electrons prevent further collapse.

For stars 10 times more massive than our sun, however, electron quantum forces are not enough to stop the gravitational pull, and the core continues to collapse until it becomes a neutron star or a black hole, accompanied by a giant explosion. In the intermediate mass range, the electrons are squeezed (or more accurately, captured) onto atomic nuclei. This removes the electron quantum forces, and causes the star to collapse and then explode.

Historically, there have been two main supernova types. One is a thermonuclear supernova -- the explosion of a white dwarf star after it gains matter in a binary star system. These white dwarfs are the dense cores of ash that remain after a low-mass star (one up to about 8 times the mass of the sun) reaches the end of its life. The other main type is a core-collapse supernova, in which a massive star -- one more than about 10 times the mass of the sun -- runs out of nuclear fuel and its core collapses, creating a black hole or a neutron star. Theoretical work suggested that electron-capture supernovae would occur on the borderline between these two types of supernovae.

That's the theory that was developed in the 1980s by Ken'ichi Nomoto of the University of Tokyo and others. Over the decades, theorists have formulated predictions of what to look for in an electron-capture supernova. The stars should lose a lot of mass of a particular composition before exploding, and the supernova itself should be relatively weak, have little radioactive fallout, and produce neutron-rich elements.

The new study, published in Nature Astronomy, focuses on the supernova SN 2018zd, discovered in 2018 by Japanese amateur astronomer Koichi Itagaki. Dr. Iair Arcavi, of the astrophysics department at Tel Aviv University, also took part in the study. This supernova, located in the galaxy NGC 2146, has all of the properties expected from an electron-capture supernova that were not seen in any other supernova. In addition, because the supernova is relatively nearby - only 31 million light years away - the researchers were able to identify the star in pre-explosion archival images taken by the Hubble Space Telescope. Indeed, the star itself fits the predictions for the type of star that should explode as an electron-capture supernova, and is unlike the stars that were seen to explode as the other types of supernovae.

While some supernovae discovered in the past had a few of the indicators predicted for electron-capture supernovae, only SN 2018zd had all six - a progenitor star that fits within the expected mass range, strong pre-supernova mass loss, an unusual chemical composition, a weak explosion, little radioactivity, and neutron-rich material. "We started by asking 'what's this weirdo?'" said Daichi Hiramatsu of the University of California Santa Barbara and Las Cumbres Observatory, who led the study. "Then we examined every aspect of SN 2018zd and realized that all of them can be explained in the electron-capture scenario."

The new discoveries also illuminate some mysteries of one of the most famous supernovae of the past. In A.D. 1054 a supernova happened in our own Milky Way Galaxy, and according to Chinese and Japanese records, it was so bright that it could be seen in the daytime and cast shadows at night. The resulting remnant, the Crab Nebula, has been studied in great detail, and was found to have an unusual composition. It was previously the best candidate for an electron-capture supernova, but this was uncertain partly because the explosion happened nearly a thousand years ago. The new result increases the confidence that the historic 1054 supernova was an electron-capture supernova.

"It's amazing that we can shed light on historical events in the Universe with modern instruments," says Dr. Arcavi. "Today, with robotic telescopes that scan the sky in unprecedented efficiency, we can discover more and more rare events which are critical for understanding the laws of nature, without having to wait 1000 years between one event and the next."

Credit: 
Tel-Aviv University

Experts tackle modern slavery in Greek strawberry fields using satellite technology

image: Worker accommodation in Nea Manolada, Greece, lacking kitchens, toilets, and heating

Image: 
Ioannis Kougkoulos, Kornilia Hatzinikolaou

A consortium of modern slavery experts, led by the University of Nottingham, has assisted the Greek government in tackling a humanitarian crisis unfolding in the strawberry fields of southern Greece.

Using satellite technology to identify migrant settlements - a technique pioneered by the university's Rights Lab - and working with the Greek authorities, the experts then developed a decision model with which they could prioritise the victims at highest risk.

Leading the study, the Rights Lab combined different data sources and methods to build a set of criteria measuring the extent of labour exploitation in a settlement. The academics then validated these criteria with a government agency and a Non-Governmental Organisation (NGO) involved in fighting labour exploitation.

By combining earth observation data with operations management techniques, this method has been successfully used to address labour exploitation in areas where strawberries are harvested.

This approach is a world-first in the humanitarian sector, with the study, funded by the Economic and Social Research Council (ESRC), published in the journal Production and Operations Management.

The strawberry fields of Nea Manolada have been in the human rights spotlight since May 2013, when three local field guards shot and injured 30 Bangladeshi migrant workers. In March 2017, the European Court of Human Rights (ECtHR) ruled that the workers had been subjected to forced labour and that Greece had violated Article 4 of the European Convention on Human Rights by not preventing the human trafficking of irregular migrant workers. Following this high-stakes ruling by the Court, the Greek government was mandated to ramp up its fight against labour exploitation.

Dr Ioannis Kougkoulos led the study while at the Rights Lab. He said: "The use of seasonal workers, the relatively low level of skills required, a strong reliance on outsourcing and agent-based recruitment of workers increase the likelihood of labour exploitation. Forced migration caused by crises around the world exacerbates this phenomenon. Refugees and migrants often live in illegality and experience serious financial distress, which puts them at high risk of becoming victims of labour exploitation.

"Governments are responsible for ensuring equal treatment for migrant and national workers on their territory, and to protect migrants from being employed under substandard working conditions."

Dr Doreen Boyd, a co-author, Rights Lab Associate Director, and Professor of Earth Observation at the University of Nottingham, led the ESRC grant that supported this work. She said: "We have demonstrated how remote sensing data enables the identification and location of informal settlements of workers in potential situations of labour exploitation over a large geographic area (140 km²). Identifying these settlements from the ground would require driving around the entire study area in search of possible settlements, which would be costly and ineffective, since many settlements are not visible from the road.

"Our approach can be replicated in other labour-intensive agricultural activities where cheap labour is abundant, such as the Italian tomato fields or tobacco-producing regions in Argentina. Future studies could extend our approach to different applications in humanitarian operations, for example, to study migration flows, by combining remote sensing with a decision-making tool such as Multi-Criteria Decision Analysis for identifying and assessing risks of settlements of forcibly displaced persons in highly fluid conflict situations, like the South Sudan or the Democratic Republic of Congo."

In the paper, the researchers report that fighting labour exploitation in the agricultural sector requires time-intensive fieldwork, as it involves visiting suspected farms and informal worker settlements, which governments and humanitarian organisations often lack resources to do.

Using remote sensing, a form of satellite technology, for real-time data collection allowed the academics to overcome one of the major challenges of research in humanitarian operations, namely the difficulty of accessing data due to safety and logistical issues limiting access to the field.

Once areas of potential exploitation had been identified through satellite imagery, these settlements were investigated and verified - known as 'ground-truthing'. On the ground, the inspection teams collected data from each settlement using questionnaires to address all criteria required for the decision analysis model that the academics had prepared.

Next, the academics used Multi-Criteria Decision Analysis (MCDA), a recognised decision-making method in the operations sector, to formalise and address the problem of competing decision objectives - a common characteristic of humanitarian operations.

Each settlement was then ranked using the MCDA model to help the government and humanitarian organisations intervene in the top-priority settlements (starting with the riskiest settlement and moving toward the less risky) and allocate resources to the most vulnerable migrant workers to improve their living conditions.
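
A minimal sketch of the weighted-sum flavor of MCDA ranking follows; the settlements, criteria, and weights are hypothetical placeholders, not the paper's validated model.

```python
# Minimal weighted-sum MCDA sketch. Each settlement is scored on normalized
# risk criteria (0 = no concern, 1 = worst observed); higher totals rank first.
criteria_weights = {"no_sanitation": 0.4, "no_heating": 0.25, "overcrowding": 0.35}

settlements = {
    "settlement_A": {"no_sanitation": 1.0, "no_heating": 0.8, "overcrowding": 0.6},
    "settlement_B": {"no_sanitation": 0.4, "no_heating": 0.2, "overcrowding": 0.9},
    "settlement_C": {"no_sanitation": 0.7, "no_heating": 1.0, "overcrowding": 0.3},
}

def risk_score(scores):
    """Aggregate one settlement's criteria into a single priority score."""
    return sum(criteria_weights[c] * v for c, v in scores.items())

ranked = sorted(settlements, key=lambda s: risk_score(settlements[s]), reverse=True)
for s in ranked:
    print(s, round(risk_score(settlements[s]), 2))   # intervene from the top down
```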

Credit: 
University of Nottingham

Solar radio signals could be used to monitor melting ice sheets

image: The experimental setup and test site at Store Glacier, Greenland. Researchers conceptualized a battery-powered receiver with an antenna placed on the ice that can measure ice thickness using the sun's radio waves. (Image credit: Sean Peters)

Image: 
Sean Peters

The sun provides a daunting source of electromagnetic disarray - chaotic, random energy emitted by the massive ball of gas arrives at Earth across a wide spectrum of radio frequencies. But in that randomness, Stanford researchers have discovered the makings of a powerful tool for monitoring ice and polar changes on Earth and across the solar system.

In a new study, a team of glaciologists and electrical engineers shows how radio signals naturally emitted by the sun can be turned into a passive radar system for measuring the depth of ice sheets, and reports a successful test on a glacier in Greenland. The technique, detailed in the journal Geophysical Research Letters on July 14, could lead to a cheaper, lower-power and more pervasive alternative to current methods of collecting data, according to the researchers. The advance may offer large-scale, prolonged insight into melting ice sheets and glaciers, which are among the dominant causes of sea-level rise threatening coastal communities around the world.

A sky full of signals

Airborne ice-penetrating radar - the primary current means for collecting widespread information about the polar subsurface - involves flying airplanes containing a high-powered system that transmits its own "active" radar signal down through the ice sheet. The undertaking is resource-intensive, however, and only provides information about conditions at the time of the flight.

By contrast, the researchers' proof of concept uses a battery-powered receiver with an antenna placed on the ice to detect the sun's radio waves as they travel down to Earth, through the ice sheet and to the subsurface. In other words, instead of transmitting its own signal, the system uses naturally occurring radio waves that are already traveling down from the sun, a nuclear-powered transmitter in the sky. If this type of system were fully miniaturized and deployed in extensive sensor networks, it would offer an unprecedented look at the subsurface evolution of Earth's quickly changing polar conditions, the researchers say.

"Our goal is to chart a course for the development of low-resource sensor networks that can monitor subsurface conditions on a really wide scale," said lead study author Sean Peters, who conducted research for the study as a graduate student at Stanford and now works at the MIT Lincoln Laboratory. "That could be challenging with active sensors, but this passive technique gives us the opportunity to really take advantage of low-resource implementations."

A random advantage

In addition to visible and other kinds of light, the sun is constantly emitting radio waves across a wide, random spectrum of frequencies. The researchers used this chaos to their advantage: They recorded a snippet of the sun's radio emissions, which is like an endless song that never repeats, then listened for that unique signature in the echo created when the solar radio waves bounce off the bottom of an ice sheet. Measuring the delay between the original recording and the echo allows them to calculate the distance between the surface receiver and the floor of the ice sheet, and thus its thickness.

In their test on Store Glacier in West Greenland, the researchers computed an echo delay time of about 11 microseconds, which maps to an ice thickness of about 3,000 feet - a figure that matches measurements of the same site recorded from both ground-based and airborne radar.
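
A toy sketch of the signal processing involved, using synthetic noise in place of real solar recordings: cross-correlating the received echo with the recorded snippet recovers the delay, and a textbook value for the speed of radio waves in ice (about 1.68 x 10^8 m/s, an assumption here rather than a figure from the paper) converts that delay into thickness.

```python
import numpy as np

# Toy passive-radar sketch with synthetic data (not the paper's processing chain).
# A random "snippet" stands in for the sun's broadband radio noise; its echo off
# the ice bottom is delayed, attenuated, and buried in receiver noise.
fs = 400e6                            # sample rate, Hz (illustrative)
n = 16_000                            # kept small so np.correlate stays fast
rng = np.random.default_rng(1)
snippet = rng.normal(size=n)          # direct solar signal recorded at the surface

true_delay = 11e-6                    # seconds, like the Store Glacier echo
d = int(true_delay * fs)              # 4,400 samples
echo = np.zeros(n)
echo[d:] = 0.1 * snippet[: n - d]     # attenuated, delayed copy of the snippet
echo += rng.normal(scale=0.5, size=n) # add background noise

corr = np.correlate(echo, snippet, mode="full")
lag = corr.argmax() - (n - 1)         # lag of echo relative to snippet, in samples
delay = lag / fs

v_ice = 1.68e8                        # approximate radio wave speed in ice, m/s
print(f"delay = {delay * 1e6:.2f} µs -> thickness = {v_ice * delay / 2:.0f} m")
```

Run as written, the script should recover the 11-microsecond delay and report a thickness near 920 meters, consistent with the roughly 3,000 feet measured at Store Glacier.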

"It's one thing to do a bunch of math and physics and convince yourself something should be possible - it's really something else to see an actual echo from the bottom of an ice sheet using the sun," said senior author Dustin Schroeder, an assistant professor of geophysics at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth).

From Jupiter to the sun

The idea of using passive radio waves to collect geophysical measurements of ice thickness was initially proposed by study co-author Andrew Romero-Wolf, a researcher with NASA's Jet Propulsion Laboratory, as a way of investigating Jupiter's icy moons. As Schroeder and Romero-Wolf worked together with others on a mission, it became clear that radio waves generated by Jupiter itself would interfere with their active ice-penetrating radar systems. At one point, Romero-Wolf realized that instead of a weakness, Jupiter's erratic radio emissions might actually be a strength, if they could be turned into a source for probing the subsurface of the moons.

"We started discussing it in the context of Jupiter's moon Europa, but then we realized it should work for observing Earth's ice sheets too if we replace Jupiter with the sun," Schroeder said.

From there, the research team undertook the task of isolating the sun's ambient radio emissions to see if they could be used to measure ice thickness. The method involved bringing a subset of the sun's 200- to 400-megahertz radio frequency band above the noise of other celestial bodies, processing massive amounts of data and eliminating man-made sources of electromagnetism like TV stations, FM radio and electronic equipment.

While the system only works when the sun is above the horizon, the proof-of-concept opens the possibility of adapting to other naturally occurring and man-made radio sources in the future. The co-authors are also still pursuing their original idea of applying this technique to space missions by harnessing the ambient energy emitted by other astronomical sources like the gas giant Jupiter.

"Pushing the frontiers of sensing technology for planetary research has enabled us to push the frontiers of sensing technology for climate change," Schroeder said. "Monitoring ice sheets under climate change and exploring icy moons at the outer planets are both extremely low-resource environments where you really need to design elegant sensors that don't require a lot of power."

Credit: 
Stanford University

Lean and mean: Building a multifunctional pressure sensor with 3D printing technology

image: A group of scientists from South Korea develops a novel, multi-directional pressure sensor coupled with a temperature sensor, using a low-cost 3D printing technology that is scalable to the large-scale production of smart robotic systems.

Image: 
DGIST

The treatment of many medical issues, like abnormal gait and muscular disorders, requires accurate sensing of applied pressure. In this regard, flexible pressure sensors that are simple, lightweight, and low-cost have garnered considerable attention. These sensors are designed and manufactured through "additive manufacturing," more commonly called "3D printing," using conductive polymer composites as their building blocks.

However, all 3D-printed pressure sensors developed so far are limited to sensing applied forces along a single direction. This is hardly enough for real-world applications, in which forces can be applied along various angles and directions. Moreover, the electrical resistance of most conductive polymers varies with temperature and must be compensated for accurate pressure sensing.

In a study published in Composites Part B: Engineering, a group of scientists led by Prof. Hoe Joon Kim from Daegu Gyeongbuk Institute of Science and Technology, South Korea, have addressed this issue with a newly designed multi-axis pressure sensor coupled with a temperature-sensing component that overcomes the limitations of conventional sensors. "Our multi-axis pressure sensor successfully captures the readings even when tilted forces are applied. Moreover, the temperature-sensing component can calibrate the resistance shift with temperature changes. In addition, the scalable and low-cost fabrication process is fully compatible with commercial 3D printers," explains Prof. Kim.

The scientists first prepared the printable conductive polymer using multi-walled carbon nanotubes (MWCNTs) and polylactic acid (PLA). Next, they 3D-printed the sensor body from a commercial elastomer and the sensing elements from the MWCNTs/PLA composite filament. The sensor is based on a bumper structure with a hollow trough beneath it and employs three pressure-sensing elements for multi-axis pressure detection, plus a temperature-sensing element for calibration of resistance. By evaluating the response of each pressure-sensing element, the sensor could determine both the magnitude and the direction of the applied force. This bumper structure, when installed in a 3D-printed flip-flop and a hand gripper, enabled clear distinction between distinct human motions and gripping actions.
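
One hypothetical way to picture the reconstruction step follows; the geometry, drift coefficient, and least-squares model in this sketch are illustrative assumptions, not the calibration procedure from the paper.

```python
import numpy as np

# Hypothetical sketch: three pressure elements spaced at 120° around the bumper
# each respond to the force component along their axis; a temperature element
# supplies a resistance correction before the force vector is reconstructed.
ANGLES = np.deg2rad([0.0, 120.0, 240.0])           # element orientations
AXES = np.stack([np.cos(ANGLES), np.sin(ANGLES)])  # 2 x 3 in-plane unit axes

def compensate(dR_over_R, temp_c, alpha=0.001, t_ref=25.0):
    """Remove an (assumed linear) temperature drift from a resistance reading."""
    return dR_over_R - alpha * (temp_c - t_ref)

def reconstruct_force(readings, temp_c):
    """Least-squares estimate of the in-plane force from three element readings."""
    corrected = np.array([compensate(r, temp_c) for r in readings])
    force, *_ = np.linalg.lstsq(AXES.T, corrected, rcond=None)
    magnitude = np.linalg.norm(force)
    direction = np.degrees(np.arctan2(force[1], force[0]))
    return magnitude, direction

# A tilted push mostly along the first element's axis, measured at 30 °C:
print(reconstruct_force([0.020, 0.004, 0.004], temp_c=30.0))
```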

The scientists are thrilled about the future prospects of their 3D-printed sensor. "The proposed 3D printing technology has a wide range of applications in energy, biomedicine, and manufacturing. With the incorporation of the proposed sensing elements in robotic grippers and tactile sensors, the detection of multi-directional forces along with temperature could be achieved, heralding the onset of a new age in robotics," comments an excited Prof. Kim.

Indeed, those are some interesting consequences to look forward to!

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Study: Idea sharing increases online learner engagement

image: Online learning engagement can be increased by nearly one-third by simply prompting students to share course ideas rather than personal details in the form of icebreakers and social introductions, said Unnati Narang, a professor of business administration at the Gies College of Business and co-author of the research.

Image: 
Photo by Gies College of Business

CHAMPAIGN, Ill. -- Sharing ideas in an online learning environment has a distinct advantage over sharing personal details in driving learner engagement in massive open online courses, more commonly known as MOOCs, says new research co-written by a University of Illinois Urbana-Champaign expert who studies the intersection of marketing and digital environments.

Online learning engagement can be increased by nearly one-third by simply prompting students to share course ideas in a discussion forum rather than having them share information about their identity or personal motivations for enrolling, said Unnati Narang, a professor of business administration at the Gies College of Business.

With less than 10% of online learners completing courses, and less than 5% participating in course discussions, there's a stark need for online learning platforms to identify and employ strategies that can enhance student engagement, Narang said.

"Engagement levels have tended to be really low in online classrooms simply because students may not ever get the chance to get to know each other in the way they do in an in-person, face-to-face classroom," she said. "A lot of those elements are, quite obviously, lacking in the online learning environment."

Initially, online platforms placed a lot of emphasis on having discussion forums to engage students. But over time, those efforts tended to fizzle out, Narang said.

"Even if a student is posting something, it may never be read by a classmate or by the instructor, which can really demotivate students who are trying to engage in the material," she said.

To determine how to increase learner engagement, Narang and her co-authors analyzed more than 12,000 discussion forum postings during an 18-month period and conducted a field experiment involving more than 2,000 learners in a popular online course offered by a large U.S. university.

"We randomly nudged students to either share something personal about themselves or ideas related to the course," she said. "We thought we were going to see an increase in engagement thanks to the social aspects of identity sharing because there's so much emphasis on it in face-to-face classes for icebreakers and social introductions."

The results indicated that asking learners to share ideas related to the course had a stronger effect on their video consumption and assessment completion, according to the paper.

"We found that the idea of sharing knowledge outperforms identity sharing as well as the control condition of not sharing anything," Narang said. "Across diverse metrics of learner engagement and performance, we found that what learners share plays a big role in enhancing the online learning environment, and they tended to perform 30% better in terms of how many videos they consumed, how many assessments they completed and how they scored on assessments. So there's a distinct advantage to idea sharing in online pedagogy."

For educators in an era of increased online learning due to the COVID-19 pandemic, the implications of what the researchers dubbed the "idea advantage" are clear: identity sharing tends to be superficial and brief, so it's better to push students to engage with the course content and their ideas about what they're studying, Narang said.

"Just very basic getting-to-know-you introductions that instructors make in a physical classroom - who are you, where you're from, etc. - doesn't really translate into the online learning environment," she said. "There's just too much anonymity to successfully do that when you're in a virtual classroom. The idea posts, on the other hand, tend to be much more elaborate and well-articulated. Students put more time and effort into crafting their answers. On average, an idea-sharing post was 66 words long. But an identity-sharing post tended to be roughly half as long. Students were clearly more invested in ideas than trying to make friends in the online learning environment, thus why the idea advantage is so strong."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Study highlights need to replace 'ancestry' in forensics with something more accurate

image: Skulls in the lab of Ann Ross at NC State University. Ross is a biological anthropologist and forensic science researcher.

Image: 
Marc Hall, NC State University

A new study finds forensics researchers use terms related to ancestry and race in inconsistent ways, and calls for the discipline to adopt a new approach to better account for both the fluidity of populations and how historical events have shaped our skeletal characteristics.

"Forensic anthropology is a science, and we need to use terms consistently," says Ann Ross, corresponding author of the study and a professor of biological sciences at North Carolina State University. "Our study both highlights our discipline's challenges in discussing issues of ancestral origin consistently, and suggests that focusing on population affinity would be a way forward."

Race is a social construct - there's no scientific basis for it. Population affinity, in the context of forensic anthropology, is determined by the skeletal characteristics associated with groups of people. Those characteristics are shaped by historic events and forces such as gene flow, migration, and so on. What's more, these population groups can be very fluid.

In practical terms, this means that race can be wildly misleading in a forensic context. For example, a missing person may have been listed as Black on their driver's license because of their skin color. But their skeletal remains may not indicate they were of African descent, because their bone structure may reflect other aspects of their ancestry.

"Like many disciplines, forensic anthropology has been coming to terms with issues regarding race," Ross says. "Some people in the discipline want to do away completely with assessing an individual's place of origin. Others say that conventional approaches still have value in helping to identify human remains.

"In this paper, we are recommending a third path. This study is focused on finding ways to evaluate human variation that give us valuable information in forensic and anthropological contexts, but that avoid clinging to the use of outdated defaults such as race."

In one part of the study, the researchers looked at all of the papers published in the Journal of Forensic Sciences between 2009 and 2019 that referenced ancestry, race or related terms. The goal of this content analysis was to determine if the terms were being used consistently within the field. And they were not.

"The Journal of Forensic Sciences is the flagship journal for forensic sciences in the U.S., and even there we found inconsistencies in how our field uses these terms," Ross says. "Inconsistent terminology opens the door to confusion, misunderstanding and misuse within the discipline."

In a second part of the study, the researchers used geometric morphometric data and spatial analysis methods to evaluate the validity of terms such as "European" or "African" to describe the ancestral origin of human remains.

Altogether, the researchers evaluated nine datasets, comprising data on 397 people. The datasets were of human remains collected in Chile, Colombia, Cuba, Guatemala, Panama, Puerto Rico, Peru, Spain and a population of enslaved Africans that had been buried in Cuba. All of the remains, except for those of the enslaved Africans, were from the 20th or 21st centuries.

"Regarding the data we have on the remains of enslaved Africans, we want to acknowledge the value that data collected from such samples can contribute to discussions of human variation, while also noting that the history and ethics of human skeletal collections, in general, is often dubious," Ross says. "Such body harvesting all too often occurred under the umbrella of scientific racism, without the permission of the deceased or next of kin, and disproportionately targeted marginalized populations."

In their review of recent papers, the researchers found that forensics experts often still referred to remains as being of African, Asian or European origin.

"But our analysis of these nine datasets shows that this approach is wrong, because it's not that simple," Ross says.

"Let's use Panama as an example," says Ross, who is from Panama. "There have been huge movements of people into this area from all over the world over the past 500 years: indigenous peoples who predate colonialism, colonizers from Europe, slaves from Africa, immigrants from Asia. The contemporary remains we see in Panama reflect all of those influences."

Ross also noted that the analysis of the nine datasets highlighted a flaw in the contemporary idea of "clines." The idea of clines is basically that, while there are changes from one group of people to another, populations that are geographically close to each other are more similar than populations that are geographically distant. However, the researchers found that this assumption can be misleading.

For example, Panama and Colombia share a border, but very different historical forces have acted on Panama and Colombia in recent centuries - so the skeletal characteristics of remains from those two countries are much less similar than one would anticipate.

"All of this is important for multiple reasons, such as taking meaningful steps to reduce racism in our field, and ensuring that we are communicating clearly with each other within the discipline," Ross says. "It is also important because marginalized people are most often the people whose remains go unidentified. Labeling them as 'Hispanic' or 'Black' is misleading. We, as forensic anthropologists, need to change the way we think about origin. We need to begin thinking about physical markers in the context of population affinity and how we can use that to both communicate clearly and to help understand who we are seeing when we work with unidentified remains. We need to ensure that we are not contributing - even inadvertently - to structural inequities and racism.

"This also means that we are faced with a wide range of new research questions. As a field, much of our work has focused on looking at data from the remains of historic populations. I think we need to begin doing more work that can help us better understand the ways in which historical events have helped to shape the skeletal characteristics of modern populations."

Credit: 
North Carolina State University

Preventing lung cancer's unwelcome return

image: Among non-small cell lung tumors, 15% are composed mostly of EGFR-sensitive cells, which can be killed with EGFR inhibitor treatments. The nuclei of these EGFR-sensitive cells are stained in blue and the cell surfaces in red.

Image: 
Sordella lab/CSHL, 2021

When a doctor gives a patient antibiotics for a bacterial infection, they usually require the patient to finish the entire treatment, even when symptoms go away. This is to ensure the drugs kill off any remaining bacteria. Cold Spring Harbor Laboratory (CSHL) Visiting Scientist Raffaella Sordella investigated a similar problem that occurs in some lung cancers.

Approximately 15% of non-small cell lung cancers have a mutation in a growth receptor called EGFR, causing tumor cells to grow uncontrollably. Researchers developed an effective drug that inhibits EGFR and kills cancer cells, but the tumor grows back later. Sordella wanted to understand the molecular mechanisms behind this relapse and how to prevent it.

Sordella and her team discovered that a small percentage of drug-resistant cancer cells were already present before treatment. Instead of relying on EGFR, these cells are dependent on another gene (AXL) for survival. Furthermore, they observed that the cells could transition between these drug-sensitive and drug-resistant "states." When patients finish EGFR treatment, random modifications constantly occur in the remaining cells, causing both types of cells to grow back.
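
A toy two-state model makes the regrowth logic concrete. All rates in the sketch below are invented for illustration, not measured in the study: the inhibitor kills only EGFR-dependent cells, AXL-dependent cells keep growing, and slow random switching between states reseeds both populations once treatment ends.

```python
# Toy two-state dynamics (hypothetical rates): EGFR-dependent cells die under
# the inhibitor, AXL-dependent cells do not, and cells slowly switch states,
# so the tumor regrows with both types present after treatment.
def simulate(days, egfr=1_000_000.0, axl=100.0, drug_on=True):
    growth, kill = 0.05, 0.30          # per-day growth; drug kill of EGFR cells
    to_axl, to_egfr = 0.001, 0.001     # per-day state-switching rates
    for _ in range(days):
        egfr_new = egfr * (1 + growth - (kill if drug_on else 0.0))
        axl_new = axl * (1 + growth)
        # small bidirectional transitions between the two states
        egfr = egfr_new - to_axl * egfr_new + to_egfr * axl_new
        axl = axl_new - to_egfr * axl_new + to_axl * egfr_new
        egfr, axl = max(egfr, 0.0), max(axl, 0.0)
    return egfr, axl

print("on drug 100 days:", simulate(100))   # EGFR cells collapse, AXL cells expand
print("then off drug 100 days:", simulate(100, *simulate(100), drug_on=False))
```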

Sordella and her team worked with clinicians at Northwell Health and former CSHL Professor Gregory Hannon, now at the Cancer Research UK Cambridge Institute. Hannon's research focuses on microRNA, a molecule that regulates cells by managing transcribed (copied) genes. Sordella explains:

"The genome is like a library. So when you have to do a recipe to bake something, you go there, you transcribe your recipe, you take it out from the library, you go in the kitchen. What these microRNAs do, they intercept all the recipes that are getting out from your library. And then, they decide whether this is a recipe that the cell should care about or not. So they are what they call 'gatekeepers' of a cell state."

The researchers discovered that a certain microRNA called miR335 determines the "state" of the cancer cell. If the cancer cell loses miR335, a cascade of events is triggered that allows cells to use the alternative AXL pathway; the cells are not killed by drugs that target EGFR. These drug-resistant cells survive and eventually, the tumor grows back.

Understanding how resistance arises in lung cancer is key to figuring out how to eliminate a tumor. Sordella hopes that these findings could help develop treatments to wipe out both AXL- and EGFR-dependent cells from the start.

Credit: 
Cold Spring Harbor Laboratory

Recent study identifies 11 candidate genetic variants for Alzheimer's disease

LEXINGTON, Ky. (July 13, 2021) -- A recently published study co-authored by University of Kentucky Sanders-Brown Center on Aging researcher Justin Miller, Ph.D., identifies 11 rare candidate variants for Alzheimer's disease. Researchers found 19 different families in Utah that suffered from Alzheimer's disease more frequently than would be expected.

Miller, an assistant professor in the UK College of Medicine, was a co-first author for the study published in the journal Alzheimer's & Dementia. The work was started at another university; however, some of the computational work was done after Miller arrived at UK in March.

For the study, genetic sequencing was conducted on two cousins from each of the 19 families. Miller says they then identified genetic variants that were shared between both cousins.

"We then used a series of filtering criteria to identify rare genetic variants that were most likely contributing to the excess Alzheimer's disease in each family," he said.

Researchers found 11 rare genetic variants spanning 10 genes, including previously unknown variants in two known Alzheimer's disease risk genes.

"Identifying people with increased risk for Alzheimer's disease before they become symptomatic may lead to earlier and more effective interventions," Miller said. "Additionally, our methodology for analyzing high-risk pedigrees can be used to prioritize rare genetic variants that likely contribute to disease."

Miller says that while this discovery will not immediately impact patient care, the team believes identifying genetic variants associated with the disease is the first step toward identifying potential drug targets for developing therapeutics.

This work was funded by the National Institutes of Health, the Huntsman Cancer Institute, Brigham Young University, the University of Utah, the National Cancer Institute, the BrightFocus Foundation, and the National Heart, Lung, and Blood Institute, and was conducted in collaboration with Brigham Young University, the University of Utah, and the Alzheimer's Disease Genetics Consortium.

Research reported in this publication was supported by the National Institute on Aging of the National Institutes of Health under Award Numbers RF1AG054052 and U01AG052411. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.


Credit: 
University of Kentucky

Mechanical stimuli significantly influence organ growth

video: Through time-resolved observation of the cells, a research team at the Technical University of Munich was able to investigate the complex interactions between the organoid cells and the surrounding collagen matrix in detail. By expanding in the direction of movement and then contracting again, the cells generate forces that deform the surrounding collagen matrix, making it possible for the organoid to organize the direction of its own further growth.

Image: 
Benedikt Buchmann / TUM

In addition to chemical factors, mechanical influences play an important role in the natural growth of human organs such as the kidneys, lungs, and mammary glands--and also in the development of tumors. Now a research team at the Technical University of Munich (TUM) has investigated the process in detail using organoids, three-dimensional model systems of such organs produced in the laboratory.

Grown in the laboratory, organoids exhibit properties similar to those of actual body tissue. They offer science new opportunities to simulate and investigate processes of organ growth that could not be observed in the simplified two-dimensional model systems used in the past.

Using mammary gland organoids to analyze the complex interactions of cells with the surrounding tissue, scientists at the Technical University of Munich, Helmholtz Zentrum München, and Ruhr-Universität Bochum have shown that the growth of gland tissue in the human breast is significantly influenced by the mechanical properties of the surrounding collagen network.

Integrated dynamic development process

The organoids grown by the team form branched glandular ducts whose structure and organization very closely resemble those of the human mammary gland. During the growth process, the individual organoid branches invade the surrounding collagen matrix.

"Starting with a single stem cell, in just 14 days these organoids form a complex, branched, three-dimensional structure consisting of several thousand cells. This is absolutely fascinating," says Andreas Bausch, Professor for Cellular Biophysics at TU Munich and head of the research group.

The research team used time-resolved microscopy to observe the growing structures over the course of several days, successfully monitoring the dynamic development process in detail. They discovered that organoid growth is substantially dictated by collective movements of the cells.

By expanding in the direction of movement and then contracting again, the cells generate forces so strong that they deform the surrounding collagen matrix, making it possible for the organoid to independently organize the direction of its own further growth.

Stable collagen 'cage'

"This is made possible by the mechanical plasticity of the collagen," says Benedikt Buchmann, lead author of the research team's study. "When the individual cells move back and forth collectively they produce such tension that the cells of a branch can deform the collagen matrix."

The overall process results in the formation of a mechanically stable collagen 'cage' which ultimately surrounds the growing branch. This collagen cage then controls the further generation of tension, the growth of the branches and the plastic deformation of the matrix.

These findings provide the basis for using this model system to investigate more complex processes, such as the first steps of metastasis or interactions with other cell types. Intensive research is now underway to determine whether this self-organization mechanism also occurs in other organs.

Credit: 
Technical University of Munich (TUM)

The impact of COVID-19 on food-shopping behavior for food-insecure populations

The COVID-19 pandemic changed just about every aspect of normal life, including how we bought food.

While grocery stores remained open as essential businesses and thrived financially throughout the pandemic, this prosperity did not translate into a consistent and sufficient food supply for many customers. Researchers have found that, on average, people went to the grocery store less frequently and spent more per trip during the pandemic.

Ran Xu, professor of allied health sciences in the College of Agriculture, Health, and Natural Resources, was interested in seeing whether this trend applied to people who are food-insecure. COVID-19 exacerbated food insecurity for many: pandemic-related job loss and other factors led to an increase in overall rates of food insecurity.

"Because of how COVID-19 hit the economy, more people were suddenly food-insecure, and we needed more research on that," Xu says.

Xu and collaborators recently published a paper in Public Health evaluating how perceived risk aversion, resource scarcity, and consumers' food security status affected food procurement behaviors during this moment of national strife. They found that, like food-secure individuals, food-insecure individuals made fewer grocery shopping trips out of concern about contracting COVID-19. Unlike food-secure individuals, however, they did not increase spending per trip.

"We think this is a serious issue that shows that COVID-19 impacts different populations differently," Xu says. "The findings we have are worrisome."

The researchers focused on food-insecure individuals who have considerable financial difficulty in procuring food.

They measured food insecurity using two items from the USDA's longer food insecurity survey: they asked respondents whether they worried their food would run out before they had money to buy more, and whether the food they bought just didn't last and they didn't have money to get more.

Then the researchers evaluated participants' food-shopping behaviors, such as the types of stores they patronized, the frequency of trips, and average spending on food, and compared these measures with respondents' shopping behavior before the pandemic.
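
A minimal sketch of that before/after comparison (the DataFrame, column names, and numbers are invented; this is not the study's analysis code):

```python
# Invented toy data illustrating the pre/post comparison by food-security status.
import pandas as pd

df = pd.DataFrame({
    "food_secure":  [True, True, False, False],
    "trips_before": [8, 6, 7, 9],      # monthly grocery trips before the pandemic
    "trips_during": [4, 3, 4, 5],      # monthly trips during the pandemic
    "spend_before": [60, 70, 45, 40],  # average dollars per trip before
    "spend_during": [95, 110, 44, 42], # average dollars per trip during
})

df["trip_change"] = df["trips_during"] - df["trips_before"]
df["spend_change"] = df["spend_during"] - df["spend_before"]

# Both groups cut trips, but only the food-secure group raises spend per trip.
print(df.groupby("food_secure")[["trip_change", "spend_change"]].mean())
```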

Their results showed that, of the 2,500 respondents from around the country, food-secure individuals tended to spend more per trip in order to stockpile food, reduce their potential COVID-19 exposure, and prepare for food shortages. Food-insecure individuals, whose budgets and resources were much more constrained, could not prepare in the same way and, despite also making fewer trips, did not increase spending per trip.

The team conducted the study in May 2020, during the height of the pandemic in the U.S.

These findings show that the pandemic exacerbated the disparity between food-secure and food-insecure people.

Food insecurity has serious health consequences. A lack of reliable access to nutritious foods contributes to a host of diseases, including diabetes and cardiovascular disease.

"Food has everything to do with our health," Xu says. "Food insecurity adds another layer to that."

Credit: 
University of Connecticut