Culture

Triple-negative breast cancer among black women in the US varies by birthplace

A new study finds that the prevalence of triple-negative breast cancer among black women with breast cancer in the United States varies substantially by birthplace. The prevalence of triple-negative breast cancer was highest among U.S.-born and Western African-born black women, followed by Caribbean-born and Eastern African-born black women. The study is published in the journal Cancer, and its findings suggest that the typical notion of a higher proportional burden of triple-negative breast cancer among black women is not generalizable to all women of African descent.

Triple-negative breast cancer--breast cancer that is negative for the estrogen receptor, progesterone receptor, and human epidermal growth factor receptor 2--occurs infrequently, tends to be aggressive, and has fewer treatment options. It is approximately twice as common among black women as among white women in the U.S., both as a proportion of breast cancers and in incidence rates, and is often considered one contributor to lower breast cancer survival among black patients.

Black populations in the United States are diverse, comprising people born in the U.S. as well as immigrants from various countries. Rapidly growing numbers of immigrants from different national and social backgrounds during the most recent three or four decades have reshaped the overall black population in the United States. In 2013, about 9% of the black population was documented as being born outside the United States, with approximately one-half born in the Caribbean, 35% born in Sub-Saharan Africa, and 9% born in Central and South America.

It is also notable that higher levels of within-population genetic diversity have been reported among persons who self-identify as black than among those in other racial groups. Still, nativity and geographic origin among black women have seldom been examined, even though nativity-related differences may improve the understanding of the etiologic heterogeneity of breast cancer.

To learn more, investigators led by Hyuna Sung, Ph.D. of the American Cancer Society examined the prevalence of triple-negative and hormone receptor-negative breast cancer (negative for estrogen receptor and progesterone receptor) among black women in the National Program of Cancer Registries and U.S. Cancer Statistics. The authors identified 65,211 non-Hispanic black women who were diagnosed with invasive breast cancer from 2010 through 2015 and who were recorded as being born in the United States, East Africa, West Africa, or the Caribbean.

They found that, compared with U.S.-born black women, the prevalence of triple-negative breast cancer was 13% lower among Caribbean-born women and 46% lower among Eastern African-born women (prevalence rate ratios of 0.87 and 0.54, respectively).
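For readers unfamiliar with the measure, a prevalence rate ratio is simply the prevalence of a subtype in one group divided by the prevalence in a reference group. The short Python sketch below illustrates the calculation using made-up counts chosen only to roughly reproduce the reported ratios; it is not the study's data, and the published estimates were additionally adjusted for factors such as age.

```python
# Toy prevalence-rate-ratio calculation with made-up counts -- NOT the study's data.

def prevalence(subtype_cases, total_cases):
    """Fraction of breast cancer cases that are triple-negative."""
    return subtype_cases / total_cases

# Hypothetical counts: (triple-negative cases, all breast cancer cases).
groups = {
    "US-born": (2400, 10000),
    "Caribbean-born": (1050, 5000),
    "East African-born": (390, 3000),
}

reference = prevalence(*groups["US-born"])
for name, counts in groups.items():
    ratio = prevalence(*counts) / reference
    print(f"{name}: prevalence {prevalence(*counts):.0%}, "
          f"prevalence rate ratio vs. US-born {ratio:.2f}")
```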

“It is not clear what risk factors conferred by birthplace are associated with subtype prevalence,” said Dr. Sung. “However, the similarity in breast cancer subtype prevalence between U.S.-born and Western-African–born blacks, contrasted against the differences with Eastern-African–born blacks, may in part reflect shared ancestry-related risk factors.”

The authors conclude that "presenting breast tumor subtype in black women as a single category is not reflective of the diverse black populations in the nation," and that their study "calls for a concerted effort for more complete collection of birthplace information in cancer registries...."

Credit: 
American Cancer Society

Studies show vapor and THPs cause minimal teeth, skin and wallpaper staining

Studies by scientists at British American Tobacco have shown that aerosol from potentially reduced-risk products (PRRPs), such as vapour and tobacco heating products (THPs), causes significantly less staining to tooth enamel, skin, cloth and wallpaper than the smoke from conventional cigarettes does.

The study results are presented today at the Global Forum on Nicotine in Warsaw, Poland.

PRRPs do not involve combustion; the vapour and aerosol they produce are less complex and contain significantly lower levels of certain toxicants than cigarette smoke. Vaping devices and THPs also do not produce a sidestream aerosol, resulting in reduced odour on consumers' hands and clothes, and lower environmental exposure for bystanders, as compared to conventional cigarettes.

It is well known that cigarette smokers can develop stains that discolour tooth enamel. Although this staining is often called nicotine staining, it is actually caused by the tar in cigarette smoke. Cigarette smoke can also stain wallpaper, skin, and other materials.

BAT's scientists assessed the impact of aerosols from PRRPs. In the series of studies, a reference cigarette (3R4F), a THP (BAT's glo), and two innovative vapour products were assessed. To assess the staining levels, a wide range of materials were used, including wallpaper samples, porcine skin samples, and bovine enamel blocks.

In order to mimic conditions in the mouth, the enamel blocks were first incubated with saliva to allow the formation of a pellicle layer, a protective protein film that is normally present on teeth. They were then assessed before, during, and after exposure using a standard technique for assessing toothpaste or teeth-whitening agents.

Assessing staining of the wallpaper and material samples required modifying BAT's cell culture chamber to allow the samples to be attached and exposed. To assess skin staining, porcine skin samples were incubated with the particulate matter isolated from the smoke and aerosols.

The results were remarkable -- exposure of tooth enamel, skin, wallpaper and material samples to aerosols from vapour products and THPs did not cause staining (levels of staining were comparable to untreated controls).

"A lack of combustion and significantly reduced emissions from glo as compared to conventional cigarettes mean there is less material to deposit and odour to linger. Again, this reflects consideration for others by those using the glo product," said John McAughey, BAT Principal Scientist for aerosol science.

These results show that switching completely from cigarettes to vapour products or THPs may offer cosmetic and social benefits for consumers. "These benefits around social consideration and personal hygiene are really resonating with users," said Senior Scientist Annette Dalrymple, who presented the results at the conference.

"The data generated from this study clearly show that the vapour product and THP assessed caused minimal discoloration -- very promising for consumers. However, further studies are required to understand the long-term effect on teeth staining and oral health when smokers switch to using PRRPs."

Credit: 
R&D at British American Tobacco

Gut microbes eat our medication

image: Balskus has not only identified a species of bacteria responsible for consuming the Parkinson's drug levodopa but also figured out how to stop the microbe's meal.

Image: 
Kris Snibbe/Harvard Staff Photographer; Harvard University

The first time Vayu Maini Rekdal manipulated microbes, he made a decent sourdough bread. At the time, young Maini Rekdal, and most people who head to the kitchen to whip up a salad dressing, pop popcorn, ferment vegetables, or caramelize onions, did not consider the crucial chemical reactions behind these concoctions.

Even more crucial are the reactions that happen after the plates are clean. When a slice of sourdough travels through the digestive system, the trillions of microbes that live in our gut help the body break down that bread to absorb the nutrients. Since the human body cannot digest certain substances--all-important fiber, for example--microbes step up to perform chemistry no human can.

"But this kind of microbial metabolism can also be detrimental," said Maini Rekdal, a graduate student in the lab of Professor Emily Balskus and first-author on their new study published in Science. According to Maini Rekdal, gut microbes can chew up medications, too, often with hazardous side effects. "Maybe the drug is not going to reach its target in the body, maybe it's going to be toxic all of a sudden, maybe it's going to be less helpful," Maini Rekdal said.

In their study, Balskus, Maini Rekdal, and their collaborators at the University of California, San Francisco describe one of the first concrete examples of how the microbiome can interfere with a drug's intended path through the body. Focusing on levodopa (L-dopa), the primary treatment for Parkinson's disease, they identified which species, among the trillions of microbes in the gut, is responsible for degrading the drug and how to stop this microbial interference.

Parkinson's disease attacks nerve cells in the brain that produce dopamine, without which the body can suffer tremors, muscle rigidity, and problems with balance and coordination. L-dopa delivers dopamine to the brain to relieve symptoms. But only about 1 to 5% of the drug actually reaches the brain.

This number--and the drug's efficacy--varies widely from patient to patient. Since the introduction of L-dopa in the late 1960s, researchers have known that the body's enzymes (tools that perform necessary chemistry) can break down L-dopa in the gut, preventing the drug from reaching the brain. So, the pharmaceutical industry introduced a new drug, carbidopa, to block unwanted L-dopa metabolism. Taken together, the two drugs seemed to work.

"Even so," Maini Rekdal said, "there's a lot of metabolism that's unexplained, and it's very variable between people." That variance is a problem: Not only is the drug less effective for some patients, but when L-dopa is transformed into dopamine outside the brain, the compound can cause side effects, including severe gastrointestinal distress and cardiac arrhythmias. If less of the drug reaches the brain, patients are often given more to manage their symptoms, potentially exacerbating these side effects.

Maini Rekdal suspected microbes might be behind the L-dopa disappearance. Since previous research showed that antibiotics improve a patient's response to L-dopa, scientists speculated that bacteria might be to blame. Still, no one had identified which bacterial species might be culpable, or how and why they eat the drug.

So, the Balskus team launched an investigation. The unusual chemistry--L-dopa to dopamine--was their first clue.

Few bacterial enzymes can perform this conversion. But, a good number bind to tyrosine--an amino acid similar to L-dopa. And one, from a food microbe often found in milk and pickles (Lactobacillus brevis), can accept both tyrosine and L-dopa.

Using the Human Microbiome Project as a reference, Maini Rekdal and his team hunted through bacterial DNA to identify which gut microbes had genes to encode a similar enzyme. Several fit their criteria; but only one strain, Enterococcus faecalis (E. faecalis), ate all the L-dopa, every time.

With this discovery, the team provided the first strong evidence connecting E. faecalis and the bacteria's enzyme (PLP-dependent tyrosine decarboxylase or TyrDC) to L-dopa metabolism.

And yet, a human enzyme can and does convert L-dopa to dopamine in the gut, the same reaction carbidopa is designed to stop. Then why, the team wondered, does the E. faecalis enzyme escape carbidopa's reach?

Even though the human and bacterial enzymes perform the exact same chemical reaction, the bacterial one looks just a little different. Maini Rekdal speculated that carbidopa may not be able to penetrate the microbial cells or the slight structural variance could prevent the drug from interacting with the bacterial enzyme. If true, other host-targeted treatments may be just as ineffective as carbidopa against similar microbial machinations.

But the cause may not matter. Balskus and her team already discovered a molecule capable of inhibiting the bacterial enzyme.

"The molecule turns off this unwanted bacterial metabolism without killing the bacteria; it's just targeting a non-essential enzyme," Maini Rekdal said. This and similar compounds could provide a starting place for the development of new drugs to improve L-dopa therapy for Parkinson's patients.

The team might have stopped there. But instead, they pushed further to unravel a second step in the microbial metabolism of L-dopa. After E. faecalis converts the drug into dopamine, a second organism converts dopamine into another compound, meta-tyramine.

To find this second organism, Maini Rekdal left behind his mother dough's microbial masses to experiment with a fecal sample. He subjected its diverse microbial community to a Darwinian game, feeding dopamine to hordes of microbes to see which prospered.

Eggerthella lenta won. These bacteria consume dopamine, producing meta-tyramine as a by-product. This kind of reaction is challenging, even for chemists. "There's no way to do it on the bench top," Maini Rekdal said, "and previously no enzymes were known that did this exact reaction."

The meta-tyramine by-product may contribute to some of the noxious L-dopa side effects; more research needs to be done. But, apart from the implications for Parkinson's patients, E. lenta's novel chemistry raises more questions: Why would bacteria adapt to use dopamine, which is typically associated with the brain? What else can gut microbes do? And does this chemistry impact our health?

"All of this suggests that gut microbes may contribute to the dramatic variability that is observed in side effects and efficacy between different patients taking L-dopa," Balskus said.

But this microbial interference may not be limited to L-dopa and Parkinson's disease. Their study could shepherd additional work to discover exactly who is in our gut, what they can do, and how they can impact our health, for better or worse.

Credit: 
Harvard University

Downward head tilt can make people seem more dominant

image: Stimuli used in Study 1 (top row) and Study 2 (middle and bottom rows). From left to right, the poses illustrate downward head tilts, neutral head angles, and upward head tilts. In all images, targets posed with neutral facial expressions (i.e., no facial-muscle movement).

Image: 
Psychological Science

We often look to people's faces for signs of how they're thinking or feeling, trying to gauge whether their eyes are narrowed or widened, whether the mouth is turned up or down. But findings published in the June 2019 issue of Psychological Science, a journal of the Association for Psychological Science, show that facial features aren't the only source of this information--we also draw social inferences from the head itself.

"We show that tilting one's head downward systematically changes the way the face is perceived, such that a neutral face--a face with no muscle movement or facial expression--appears to be more dominant when the head is tilted down," explain researchers Zachary Witkower and Jessica Tracy of the University of British Columbia. "This effect is caused by the fact that tilting one's head downward leads to the artificial appearance of lowered and V-shaped eyebrows--which in turn elicit perceptions of aggression, intimidation, and dominance."

"These findings suggest that 'neutral' faces can still be quite communicative," Witkower and Tracy add. "Subtle shifts of the head can have profound effects on social perception, partly because they can have large effects on the appearance of the face."

Although researchers have investigated how facial muscle movements, in the form of facial expressions, correlate with social impressions, few studies have specifically examined how head movements might play a role. Witkower and Tracy designed a series of studies to investigate whether the angle of head position might influence social perception, even when facial features remain neutral.

In one online study with 101 participants, the researchers generated variations of avatars with neutral facial expressions and one of three head positions: tilted upward 10 degrees, neutral (0 degrees), or tilted downward 10 degrees.

The participants judged the dominance of each avatar image, rating their agreement with statements including "This person would enjoy having control over others" and "This person would be willing to use aggressive tactics to get their way."

The results showed that participants rated the avatars with downward head tilt as more dominant than those with neutral or upward-tilted heads.

A second online study, in which 570 participants rated images of actual people, showed the same pattern of results.

Additional findings revealed that the portion of the face around the eyes and eyebrows is both necessary and sufficient to produce the dominance effect. That is, participants rated downward-tilted heads as more dominant even when they could only see the eyes and eyebrows; this was not true when the rest of the face was visible and the eyes and eyebrows were obscured.

Two more experiments indicated that the angle of the eyebrows drove this effect--downward-tilted heads had eyebrows that appeared to take more of a V shape, even though the eyebrows had not moved from a neutral position, and this was associated with perceptions of dominance.

"In other words, tilting the head downward can have the same effect on social perceptions as does lowering one's eyebrows--a movement made by the corrugator muscle, known as Action Unit 4 in the Facial Action Coding System--but without any actual facial movement," say Witkower and Tracy. "Head tilt is thus an 'action unit imposter' in that it creates the illusory appearance of a facial muscle movement where none in fact exists."

Given these intriguing results, the researchers are continuing to investigate the influence of head tilt on social perception, exploring whether the effects might extend beyond perceptions of dominance to how we interpret facial expressions of emotion.

Ultimately, Witkower and Tracy note, these findings could have practical implications for our everyday social interactions:

"People often display certain movements or expressions during their everyday interactions, such as a friendly smile or wave, as a way to communicate information. Our research suggests that we may also want to consider how we hold their head during these interactions, as subtle head movements can dramatically change the meaning of otherwise innocuous facial expressions."

Credit: 
Association for Psychological Science

Early-season hurricanes result in greater transmission of mosquito-borne infectious disease

image: Researchers from Georgia State and Arizona State University developed a mathematical model to study the impact of heavy rainfall events (HREs).

Image: 
Georgia State University

The timing of a hurricane is one of the primary factors influencing its impact on the spread of mosquito-borne infectious diseases such as West Nile Virus, dengue, chikungunya and Zika, according to a study led by Georgia State University.

Researchers from Georgia State and Arizona State University developed a mathematical model to study the impact of heavy rainfall events (HREs) such as hurricanes on the transmission of vector-borne infectious diseases in temperate areas of the world, including the southern coastal U.S. In the aftermath of this type of extreme weather event, the mosquito population often booms in the presence of stagnant water. At the same time, the breakdown of public and private health infrastructure can put people at increased risk of infection. The study, which was published in Philosophical Transactions of the Royal Society B, found that the risk of a disease outbreak is highest if the HRE occurs early in the transmission season, or the period of time when mosquitos are able to pass on the virus to humans.

According to the study, an HRE that occurs on July 1 results in 70 percent fewer disease cases compared to an HRE that occurs on June 1.

"Mosquitos are very sensitive to temperature not only in terms of their ability to survive and reproduce, but also in their ability to infect individuals," said Gerardo Chowell, professor of mathematical epidemiology in the School of Public Health and lead author of the study. "The warmer it is, the faster an infected mosquito will be able to transmit the virus. Considering that mosquitos have an average lifespan of less than two weeks, that temperature difference can have a dramatic effect on disease outbreaks."

Population displacement can also affect the spread of vector-borne disease in a few ways, the researchers found. When people opt to leave the area, it reduces the number of local infections, while potentially increasing the number of infections elsewhere. However, those individuals who are not displaced during an HRE may be at higher risk because standard measures to combat mosquito breeding (such as removing pools of stagnant water) are neglected when fewer people remain in the area. And as people move into a disaster area to offer emergency relief -- or when they return after the event -- the number of local infections rises.

"Since mosquito-borne diseases tend to be spread by the movement of people rather than the movement of mosquitoes, disaster-induced movements of people can shift where and when outbreaks occur," said Charles Perrings, professor in the School of Life Sciences at Arizona State University and a co-author of the study.

Chowell notes that as HREs become more frequent in the southern U.S. and other tropical areas there's a need to develop further quantitative tools to assess how these disasters can affect the risk of disease transmission.

"Our team will now focus on improving methods to quantify the number of people that actually leave during a hurricane, how quickly they leave and when they return," he says. "We are also looking at additional hurricanes to study the impact of different displacement patterns."

Credit: 
Georgia State University

The formative years: Giant planets vs. brown dwarfs

video: Animation showing the 617 observations conducted during GPIES from November 2014 to April 2019 (right) and the location of the stars in the southern sky (left). Open circles indicate systems (like 51 Eri at 2 o'clock) that have been visited multiple times. Stars indicated by a red dot have a disk of material. Blue dots are planetary systems (with at least one planet). Brown dots are binary systems with a brown dwarf.

Image: 
P. Kalas, D. Savransky, R. De Rosa and GPIES.

Based on preliminary results from a new Gemini Observatory survey of 531 stars with the Gemini Planet Imager (GPI), it appears more and more likely that large planets and brown dwarfs have very different roots.

The GPI Exoplanet Survey (GPIES), one of the largest and most sensitive direct imaging exoplanet surveys to date, is still ongoing at the Gemini South telescope in Chile. "From our analysis of the first 300 stars observed, we are already seeing strong trends," said Eric L. Nielsen of Stanford University, who is the lead author of the study, published in The Astronomical Journal.

In November 2014, GPI Principal Investigator Bruce Macintosh of Stanford University and his international team set out to observe almost 600 young nearby stars with the newly commissioned instrument. GPI was funded with support from the Gemini Observatory partnership, with the largest portion from the US National Science Foundation (NSF). The NSF, and the Canadian National Research Council (NRC; also a Gemini partner), funded researchers participating in GPIES.

Imaging a planet around another star is a difficult technical challenge possible with only a few instruments. Exoplanets are small, faint, and very close to their host star -- distinguishing an orbiting planet from its star is like resolving the width of a dime from several miles away. Even the brightest planets are ten thousand times fainter than their parent star. GPI can see planets up to a million times fainter, much more sensitive than previous planet-imaging instruments. "GPI is a great tool for studying planets, and the Gemini Observatory gave us time to do a careful, systematic survey," said Macintosh.
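To put those comparisons in numbers, the Python sketch below works out the angle subtended by a dime seen from an assumed distance of five miles (the article says only "several miles") and restates the quoted brightness contrasts; the specific distance is an illustrative assumption.

```python
import math

# Angle subtended by a dime at an assumed distance of 5 miles
# (the article says only "several miles").
dime_diameter_m = 0.0179           # a US dime is about 17.9 mm across
distance_m = 5 * 1609.34           # 5 miles in metres

angle_arcsec = math.degrees(dime_diameter_m / distance_m) * 3600
print(f"Angle subtended by the dime: {angle_arcsec:.2f} arcseconds")   # ~0.46"

# Brightness contrasts quoted in the article.
print(f"Brightest planets: about 1 part in {10_000:,} of the star's light")
print(f"GPI sensitivity:   down to about 1 part in {1_000_000:,}")
```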

GPIES is now coming to an end. From the first 300 stars, GPIES has detected six giant planets and three brown dwarfs. "This analysis of the first 300 stars observed by GPIES represents the largest, most sensitive direct imaging survey for giant planets published to date," added Macintosh.

Brown dwarfs are more massive than planets, but not massive enough to fuse hydrogen like stars. "Our analysis of this Gemini survey suggests that wide-separation giant planets may have formed differently from their brown dwarf cousins," Nielsen said.

The team's paper advances the idea that massive planets form due to the slow accumulation of material surrounding a young star, while brown dwarfs come about due to rapid gravitational collapse. "It's a bit like the difference between a gentle light rain and a thunderstorm," said Macintosh.

"With six detected planets and three detected brown dwarfs from our survey, along with unprecedented sensitivity to planets a few times the mass of Jupiter at orbital distances well beyond Jupiter's, we can now answer some key questions, especially about where and how these objects form," Nielsen said.

This discovery may answer a longstanding question as to whether brown dwarfs -- intermediate-mass objects -- are born more like stars or planets. Stars form from the top down by the gravitational collapse of large primordial clouds of gas and dust, while planets are thought -- but have not been confirmed -- to form from the bottom up by the assembly of small rocky bodies that then grow into larger ones, a process also termed "core accretion."

"What the GPIES team's analysis shows is that the properties of brown dwarfs and giant planets run completely counter to each other," said Eugene Chiang, professor of astronomy at the University of California Berkeley and a co-author of the paper. "Whereas more massive brown dwarfs outnumber less massive brown dwarfs, for giant planets the trend is reversed: less massive planets outnumber more massive ones. Moreover, brown dwarfs tend to be found far from their host stars, while giant planets concentrate closer in. These opposing trends point to brown dwarfs forming top-down, and giant planets forming bottom-up."

More Surprises

Of the 300 stars surveyed thus far, 123 are at least 1.5 times more massive than our Sun. One of the most striking results of the GPI survey is that all hosts of detected planets are among these higher-mass stars -- even though it is easier to see a giant planet orbiting a fainter, more Sun-like star. Astronomers have suspected this relationship for years, but the GPIES survey has unambiguously confirmed it. This finding also supports the bottom-up formation scenario for planets.

One of the study's greatest surprises has been how different other planetary systems are from our own. Our Solar System has small rocky planets in the inner parts and giant gas planets in the outer parts. But the very first exoplanets discovered reversed this trend, with giant planets skimming closer to their stars than does moon-sized Mercury. Furthermore, radial-velocity studies -- which rely on the fact that a star experiences a gravitationally induced "wobble" when it is orbited by a planet -- have shown that the number of giant planets increases with distance from the star out to about Jupiter's orbit. But the GPIES team's preliminary results, which probe still larger distances, have shown that giant planets become less numerous farther out.

"The region in the middle could be where you're most likely to find planets larger than Jupiter around other stars," added Nielsen, "which is very interesting since this is where we see Jupiter and Saturn in our own Solar System." In this regard, the location of Jupiter in our own Solar System may fit the overall exoplanet trend.

But a surprise from all exoplanet surveys is how intrinsically rare giant planets seem to be around Sun-like stars, and how different other solar systems are. The Kepler mission discovered far more small and close-in planets -- two or more "super-Earth" planets per Sun-like star, densely packed into inner solar systems much more crowded than our own. Extrapolation of simple models suggested GPI would find a dozen giant planets or more, but it only saw six. Putting it all together, giant planets may be present around only a minority of stars like our own.

In January 2019, GPIES observed its 531st, and final, new star, and the team is currently following up the remaining candidates to determine which are truly planets and which are distant background stars impersonating giant planets.

The next-generation telescopes -- such as NASA's James Webb Space Telescope and WFIRST mission, the Giant Magellan Telescope, Thirty Meter Telescope, and Extremely Large Telescope -- should be able to push the boundaries of study, imaging planets much closer to their star and overlapping with other techniques, producing a full accounting of giant planet and brown dwarf populations from 1 to 1,000 AU.

"Further observations of additional higher mass stars can test whether this trend is real," said Macintosh, "especially as our survey is limited by the number of bright, young nearby stars available for study by direct imagers like GPI."

Background:

GPI is specifically designed to search for planets and brown dwarfs around other stars, using a mask known as a coronagraph to partially block a star's light. Together with adaptive optics correcting for turbulence in the Earth's atmosphere and advanced image processing, researchers can search the star's neighborhood for Jupiter-like exoplanets and brown dwarfs up to a million times fainter than the host star.

In our Solar System, Jupiter is the largest planet, being about 318 times as massive as the Earth and lying about five times farther from the Sun than does the Earth. Brown dwarfs range from 13 to 90 times the mass of Jupiter; and while they can be up to a tenth the mass of the Sun, they lack the nuclear fusion in their core to burn as a star -- so they lie somewhere between a diminutive star and a super-planet.

An early success of GPIES was the discovery of 51 Eridani b in December 2014, a planet about two-and-a-half times more massive than Jupiter, that orbits its star beyond the distance that Saturn orbits our own Sun. The host star, 51 Eridani, is just 97 light-years away, and is only 26 million years old (nearby and young, by astronomy standards). The star had been observed by multiple planet-imaging surveys with a variety of telescopes and instruments, but its planet was not detected until GPI's superior instrumentation was able to suppress the starlight enough for the planet to be visible.

GPIES also discovered the brown dwarf HR 2562 B, which is at a separation similar to that between the Sun and Uranus, and is 30 times more massive than Jupiter.

Most exoplanets discovered thus far, including those found by NASA's Kepler spacecraft, are found via indirect methods, such as observing a dimming in the star's light as the orbiting planet eclipses its parent star, or by observing the star's wobble as the planet's gravity tugs on the star. These methods have been very successful, but they only probe the central regions of planetary systems. Those regions outside the orbit of Jupiter, where the giant planets are in our Solar System, are usually out of their reach. GPI, however, endeavors to directly detect planets in this parameter space by taking a picture of them alongside their parent stars.

The Gemini results support those from these other techniques, including a recent study of exoplanets discovered by the radial velocity method that found the most likely separation for a giant planet around Sun-like stars is about 3 AU. The finding that brown dwarfs occur with a frequency of only about 1%, independent of stellar mass, is also consistent with previous results from direct imaging surveys.

Credit: 
Association of Universities for Research in Astronomy (AURA)

Salmonella resistant to antibiotics of last resort found in US

Researchers from North Carolina State University have found a gene that gives Salmonella resistance to antibiotics of last resort in a sample taken from a human patient in the U.S. The find is the first evidence that the gene mcr-3.1 has made its way into the U.S. from Asia.

There are more than 2,500 known serotypes of Salmonella. In the U.S., Salmonella enterica 4,[5],12:i:- ST34 is responsible for a significant percentage of human illnesses. The drug resistance gene in question - known as mcr-3.1 - gives Salmonella resistance to colistin, the drug of last resort for treating infections caused by multidrug-resistant Salmonella.

"Public health officials have known about this gene for some time," says Siddhartha Thakur, professor and director of global health at NC State and corresponding author of the research. "In 2015, they saw that mcr-3.1 had moved from a chromosome to a plasmid in China, which paves the way for the gene to be transmitted between organisms. For example, E. coli and Salmonella are in the same family, so once the gene is on a plasmid, that plasmid could move between the bacteria and they could transmit this gene to each other. Once mcr-3.1 jumped to the plasmid, it spread to 30 different countries, although not - as far as we knew - to the U.S."

Thakur's lab is one of several labs nationwide participating in epidemiological surveillance for resistant strains of Salmonella. The lab generates whole genome sequences from Salmonella samples every year as part of routine monitoring for the presence of antimicrobial-resistant bacteria. When veterinary medicine student Valerie Nelson and Ph.D. student Daniel Monte did genome sequencing on 100 clinical human stool samples taken from the southeastern U.S. between 2014 and 2016, they discovered that one sample contained the resistant mcr-3.1 gene. The sample came from a person who had traveled to China two weeks prior to becoming ill with a Salmonella infection.
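Surveillance pipelines of this kind rely on dedicated bioinformatics tools and curated resistance-gene databases; as a rough illustration of the idea, the Python sketch below scans assembled genome sequences for a short marker from a resistance gene. The directory, file names and marker sequence are hypothetical.

```python
# Greatly simplified illustration of screening genome assemblies for a
# resistance-gene marker. Real surveillance uses dedicated tools and curated
# databases; the directory and the marker sequence below are hypothetical.

from pathlib import Path

MARKER = "ATGCCTAGCCTTTTCAAAGTGATC"   # hypothetical fragment standing in for mcr-3.1

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, chunks = None, []
    for line in Path(path).read_text().splitlines():
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:].strip(), []
        else:
            chunks.append(line.strip())
    if header is not None:
        yield header, "".join(chunks)

def reverse_complement(seq):
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

for assembly in Path("assemblies").glob("*.fasta"):   # hypothetical directory
    for contig, seq in read_fasta(assembly):
        if MARKER in seq or reverse_complement(MARKER) in seq:
            print(f"{assembly.name}: marker found on contig {contig}")
```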

"This project proved the importance of ongoing sequencing and surveillance," says Nelson. "The original project did not involve this gene at all."

"The positive sample was from 2014, so this discovery definitely has implications for the spread of colistin-resistant Salmonella in the U.S.," Thakur says. "Our lab will continue to try and fill in these knowledge gaps."

The research appears in the Journal of Medical Microbiology and was supported by the National Institutes of Health/Food and Drug Administration (award number 5U 18FD006194-02). Monte and Nelson are first author and co-author, respectively. Prior to his global health role, Thakur was associate director of the emerging infectious diseases program at NC State's Comparative Medicine Institute.

Credit: 
North Carolina State University

New model more accurately predicts choices in classic decision-making task

image: Which door will you choose? New model helps predict what choices people make in the Iowa Gambling Task by focusing on the 'exploratory strategies' they use.

Image: 
dil/unsplash.

A new mathematical model that predicts which choices people will make in the Iowa Gambling Task, a task used for the past 25 years to study decision-making, outperforms previously developed models. Romain Ligneul of the Champalimaud Center for the Unknown in Portugal presents this research in PLOS Computational Biology.

The Iowa Gambling Task presents a subject with four virtual card decks, each containing a different mix of cards that can win or lose fake money. Without being told which decks are more valuable, the subject then picks cards from the decks as they please. Most healthy people gradually learn which decks are more valuable and choose to pick cards only from those decks.

Earlier studies have used Iowa Gambling Task data to build mathematical models that can predict people's card-picking choices. However, building such models is computationally challenging, and previously developed models do not account for the exploratory strategies people use in the task.

In reviewing previously collected data from 500 subjects, Ligneul found that healthy people tend to cycle through the four decks and pick one card from each, especially at the beginning of the task. He then incorporated this behavior, termed sequential exploration, into a new mathematical model that also accounts for the well-known reward-maximizing behaviors people exhibit in the task.
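As a rough idea of what such a model looks like, the Python sketch below combines a standard reward-learning rule with a simple bonus for decks that have not been sampled recently. It is a generic illustration of the two ingredients described above, not Ligneul's published model, and all parameter values are assumed.

```python
import math
import random

# Generic sketch of a value-learning model of the Iowa Gambling Task with a
# simple "sequential exploration" bonus for decks that have not been sampled
# recently. All parameter values are assumptions chosen for illustration.

N_DECKS = 4
ALPHA = 0.2   # learning rate for deck values
BETA = 2.0    # softmax inverse temperature (choice determinism)
PHI = 1.0     # weight of the exploration bonus

values = [0.0] * N_DECKS        # learned expected payoff per deck
last_pick = [-10] * N_DECKS     # trial on which each deck was last chosen

def choose(trial):
    # The exploration bonus grows with the time since a deck was last sampled,
    # so recently ignored decks become temporarily more attractive.
    scores = [values[d] + PHI * math.tanh(0.1 * (trial - last_pick[d]))
              for d in range(N_DECKS)]
    weights = [math.exp(BETA * s) for s in scores]
    r, cumulative = random.random() * sum(weights), 0.0
    for deck, w in enumerate(weights):
        cumulative += w
        if r <= cumulative:
            return deck
    return N_DECKS - 1

def update(deck, payoff, trial):
    values[deck] += ALPHA * (payoff - values[deck])   # delta-rule learning
    last_pick[deck] = trial

# Toy task: decks differ only in mean payoff (arbitrary illustrative values).
mean_payoffs = [-0.5, -0.25, 0.25, 0.5]
for t in range(100):
    deck = choose(t)
    update(deck, random.gauss(mean_payoffs[deck], 1.0), t)

print("Learned deck values:", [round(v, 2) for v in values])
```

In a model of this general form, varying the weight given to the exploration bonus is one way to capture individual differences, such as the decline in sequential exploration with age reported in the study.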

Ligneul found that his new model outperforms earlier models in predicting people's card-picking choices. He also found that sequential exploration behaviors seem to decline as subjects get older, perhaps because of neurological changes typically associated with aging.

"This study provides a mathematical method to disentangle our drive to explore the environment and our drive to exploit it," Ligneul says. "It appears that the balance of these two drives evolves with aging."

The new model and findings could help refine insights gleaned from the Iowa Gambling Task. It could also improve understanding of learning and decision-making disruptions that are associated with aging and various neuropsychiatric conditions, such as addiction, impulsive disorders, brain injury, and more.

Credit: 
PLOS

Encouraging critically necessary blood donation among minorities

image: Better community education and communication are critical for increasing levels of blood donation among minorities, according to a study by researchers at Georgia State University and Georgia Southern University.

Image: 
Georgia State University

Better community education and communication are critical for increasing levels of blood donation among minorities, according to a study by researchers at Georgia State University and Georgia Southern University.

Nursing associate professor Regena Spratling in the Byrdine F. Lewis College of Nursing and Health Professions at Georgia State and her colleagues in the Georgia Southern University School of Public Health conducted the first systematic literature review of research on barriers to and facilitators of blood donation among minorities.

The research found that medical mistrust is a significant barrier to blood donation among minorities. More significant for healthcare providers is the lack of explanation given to minority donors when they are turned down as donors. For example, potential donors found to have low hemoglobin may believe that this permanently bars them from giving blood, when in fact they may be eligible later if they eat a healthy diet and drink plenty of fluids. Better education by healthcare providers working with these donors can reduce this barrier, researchers said.

Knowing a blood transfusion recipient made minorities more likely to donate, the researchers found. In many minority communities, donating blood for a friend, family, church or community member is positively viewed. Cultural or community ties are linked closely to blood donation. Giving blood to benefit one's community was a primary motivator.

A higher prevalence of blood-based, hereditary diseases, such as sickle cell disease and thalassemia, is found among minorities. These diseases increase the need for blood products in minority populations. Blood from donors with similar backgrounds reduces the likelihood of severe transfusion complications, because compatibility depends on subtle blood-group factors that go beyond the familiar A, B, AB and O blood types and positive or negative Rh factor.

The researchers reviewed nearly four dozen articles in peer-reviewed journals on blood donation with corresponding data on donors. Half of the articles appeared in publications focused on blood transfusion. The remainder were in related journals. Very few articles in nursing or broader healthcare journals focused on blood donations in specific race and ethnic populations. The researchers found the lack of widespread discussion of low minority blood donation was a primary barrier to solving the problem.

Credit: 
Georgia State University

Using prevalent technologies and 'Internet of Things' data for atmospheric science

image: Wireless communication links, social networks and smartphones as examples of data-generating sources that can be harnessed for environmental monitoring.

Image: 
Noam David

The use of prevalent technologies and crowdsourced data may benefit weather forecasting and atmospheric research, according to a new paper authored by Dr. Noam David, a Visiting Scientist at the Laboratory of Associate Professor Yoshihide Sekimoto at the Institute of Industrial Science, The University of Tokyo, Japan. The paper, published in Advances in Atmospheric Sciences, reviews a number of research works on the subject and points to the potential of this innovative approach.

Specialized instruments for environmental monitoring are often limited as a result of technical and practical constraints. Existing technologies, including remote sensing systems and ground-level tools, may suffer from obstacles such as low spatial representativity (in situ sensors, for example) or lack of accuracy when measuring near the Earth's surface (satellites). These constraints often limit the ability to carry out representative observations and, as a result, the capacity to deepen our existing understanding of atmospheric processes. Multi-purpose systems and IoT (Internet of Things) technologies have become increasingly distributed as they are embedded into our environment. As they become more widely deployed, these technologies generate unprecedented data volumes with immense coverage, immediacy and availability. As a result, a growing opportunity is emerging to complement state-of-the-art monitoring techniques with the large streams of data produced. Notably, these resources were originally designed for purposes other than environmental monitoring and are naturally not as precise as dedicated sensors. Therefore, they should be treated as complementary tools and not as a substitute. However, in the many cases where dedicated instruments are not deployed in the field, these newly available 'environmental sensors' can provide information that is often invaluable.

Smartphones, for example, contain weather-sensitive sensors, and recent works indicate the ability to use the data collected by these devices on a multisource basis to monitor atmospheric pressure and temperature. Data shared openly on social networks can provide vital environmental information reported by thousands of 'human observers' directly from an area of interest. Wireless communication links that transmit data between cellular base stations serve as an additional example. Weather conditions affect the signal strength on these links, and this effect can be measured; as a result, the links can be utilized as an environmental monitoring facility. A variety of studies on the subject point to the ability to monitor rainfall and other hydrometeors, including fog, water vapor and dew, and even the precursors of air pollution, using the data generated by these systems.
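For the microwave-link example, the usual starting point is the empirical power-law relation between rain-induced attenuation and rain rate. The Python sketch below inverts that relation for an assumed link; the coefficients and link parameters are illustrative assumptions (they depend on frequency and polarization), not values from the paper.

```python
# Toy retrieval of path-averaged rain rate from the attenuation measured on a
# commercial microwave link, using the empirical power law A = a * R**b * L.
# The coefficients and link parameters below are illustrative assumptions,
# not values from the paper.

A_DB = 6.0        # assumed rain-induced attenuation along the link, in dB
LINK_KM = 4.0     # assumed link length, in km
a, b = 0.4, 0.95  # assumed power-law coefficients for a link in the tens of GHz

# Specific attenuation (dB/km), then invert k = a * R**b for the rain rate R (mm/h).
k = A_DB / LINK_KM
rain_rate = (k / a) ** (1.0 / b)
print(f"Estimated path-averaged rain rate: {rain_rate:.1f} mm/h")
```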

Notably, the data from these new 'sensors' could be assimilated into high-resolution numerical prediction models, and thus may lead to improvements in forecasting capabilities. Put to use, this novel approach could provide the groundwork for developing new early-warning systems against natural hazards, and generate a variety of products necessary for a wide range of fields. The contribution to public health and safety as a result of these could potentially be of significant value.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Handgun licensing more effective at reducing gun deaths than background checks alone

A new white paper from the Johns Hopkins Center for Gun Policy and Research at the Johns Hopkins Bloomberg School of Public Health concludes that of the approaches used by states to screen out prohibited individuals from owning firearms, only purchaser licensing has been shown to reduce gun homicides and suicides. Purchaser licensing is currently used by nine states and Washington, D.C.

The white paper, "The Impact of Handgun Purchaser Licensing Laws on Gun Violence," and an accompanying infographic explain that states generally use three approaches to screen out prohibited individuals from purchasing firearms: 1: the minimum that federal law requires--mandatory background checks for sales from a licensed dealer; 2: comprehensive background check requirements that also cover private-party transfers of firearms; 3: a background check for all firearm transfers as a complement to a licensing or permit system. Some states with comprehensive background checks or firearm purchaser licensing limit these requirements to transfers of handguns.

"Licensing differs from a standard background check in important ways, and the purpose of issuing this white paper and infographic is to clear up confusion about the efficacy of these laws," says the report's lead author, Cassandra Crifasi, PhD, MPH, deputy director of the Johns Hopkins Center for Gun Policy and Research and assistant professor in the Bloomberg School's Department of Health Policy and Management. "Comprehensive background checks are a necessary component of any system designed to keep guns from prohibited persons, but they are insufficient to reduce firearm-related deaths without a complementary system of purchaser licensing."

In general, states with licensing require prospective gun buyers to apply for a license with a state or local law enforcement agency, pass a background check, often submit fingerprints, and, in some cases, show evidence of gun safety training. States with licensing typically have more thorough processes for checking backgrounds, allow law enforcement more time to conduct those checks, or have mandatory waiting periods.

In contrast, federal law requires a prospective buyer to undergo a background check only when purchasing a firearm from a licensed dealer. Other key differences among the three approaches are highlighted in the report's companion infographic.

To date, available research shows that states with comprehensive background checks that are not part of a licensing system experience fewer guns diverted to criminal use. However, research to date has not shown that background checks alone lead to significant reductions in gun-related deaths.

In comparison, earlier research from the report authors showed that when Missouri repealed its handgun purchaser licensing law in 2007, homicides rose an estimated 17 to 27 percent through 2016. A separate study found the repeal was associated with a 16 percent increase in firearm suicides through 2012. In contrast, when Connecticut enacted a handgun licensing law in 1995 to supplement its universal background check policy, the state experienced a 40 percent decrease in gun homicides and a 15 percent reduction in gun suicides over the first ten years the law was in effect.

"The most likely reasons we see impacts on firearm homicides and suicides for licensing and not for comprehensive background checks witout licensing center on the more direct interface between prospective purchasers and law enforcement and more robust systems for background checks," says report co-author Daniel Webster, ScD, MPH, director of the Johns Hopkins Center for Gun Policy and Research and Bloomberg Professor of American Health at the Bloomberg School. "These procedures may deter individuals who might otherwise make impulsive decisions to acquire a gun to hurt themselves or others."

The white paper posits that one reason advocacy organizations have pushed policymakers to adopt comprehensive background checks (versus licensing) is because of their broad appeal: Polls consistently find that over 85 percent of U.S. adults support comprehensive background checks with no difference between gun owners and non-gun owners.

Yet, says Crifasi, national public opinion surveys also show that three-quarters of adults support laws requiring handgun purchasers to obtain a license from a law enforcement agency, with support among gun owners at 60 percent.

"Given this level of support among the population, including gun owners, and the robust body of evidence on the effectiveness of licensing laws, policymakers should consider licensing as a key strategy to reduce gun violence in the communities they serve," Crifasi says.

In addition to Washington, D.C., the nine states with licensing requirements are Connecticut, Hawaii, Illinois, Iowa, Maryland, Massachusetts, New Jersey, New York and North Carolina.

In the 2018-2019 legislative session, gun purchaser licensing bills were introduced in Oregon, Delaware and Minnesota. Last year, both the U.S. House of Representatives and the U.S. Senate introduced measures that would incentivize states to adopt handgun licensing laws.

Credit: 
Johns Hopkins Bloomberg School of Public Health

New quantum dot microscope shows electric potentials of individual atoms

image: Image from a scanning tunnelling microscope (STM, left) and a scanning quantum dot microscope (SQDM, right). Using a scanning tunnelling microscope, the physical structure of a surface can be measured on the atomic level. Quantum dot microscopy can visualize the electric potentials on the surface at a similar level of detail -- a perfect combination.

Image: 
Copyright: Forschungszentrum Jülich / Christian Wagner

A team of researchers from Jülich in cooperation with the University of Magdeburg has developed a new method to measure the electric potentials of a sample at atomic accuracy. Using conventional methods, it was virtually impossible until now to quantitatively record the electric potentials that occur in the immediate vicinity of individual molecules or atoms. The new scanning quantum dot microscopy method, which was recently presented in the journal Nature Materials by scientists from Forschungszentrum Jülich together with partners from two other institutions, could open up new opportunities for chip manufacture or the characterization of biomolecules such as DNA.

The positive atomic nuclei and negative electrons of which all matter consists produce electric potential fields that superpose and compensate each other, even over very short distances. Conventional methods do not permit quantitative measurements of these small-area fields, which are responsible for many material properties and functions on the nanoscale. Almost all established methods capable of imaging such potentials are based on the measurement of forces that are caused by electric charges. Yet these forces are difficult to distinguish from other forces that occur on the nanoscale, which prevents quantitative measurements.

Four years ago, however, scientists from Forschungszentrum Jülich discovered a method based on a completely different principle. Scanning quantum dot microscopy involves attaching a single organic molecule - the "quantum dot" - to the tip of an atomic force microscope. This molecule then serves as a probe. "The molecule is so small that we can attach individual electrons from the tip of the atomic force microscope to the molecule in a controlled manner," explains Dr. Christian Wagner, head of the Controlled Mechanical Manipulation of Molecules group at Jülich's Peter Grünberg Institute (PGI-3).

The researchers immediately recognized how promising the method was and filed a patent application. However, practical application was still a long way off. "Initially, it was simply a surprising effect that was limited in its applicability. That has all changed now. Not only can we visualize the electric fields of individual atoms and molecules, we can also quantify them precisely," explains Wagner. "This was confirmed by a comparison with theoretical calculations conducted by our collaborators from Luxembourg. In addition, we can image large areas of a sample and thus show a variety of nanostructures at once. And we only need one hour for a detailed image."

The Jülich researchers spent years investigating the method and finally developed a coherent theory. The reason for the very sharp images is an effect that permits the microscope tip to remain at a relatively large distance from the sample, roughly 2-3 nanometres - unimaginable for a normal atomic force microscope.

In this context, it is important to know that all elements of a sample generate electric fields that influence the quantum dot and can therefore be measured. The microscope tip acts as a protective shield that dampens the disruptive fields from areas of the sample that are farther away. "The influence of the shielded electric fields thus decreases exponentially, and the quantum dot only detects the immediate surrounding area," explains Wagner. "Our resolution is thus much sharper than could be expected from even an ideal point probe."
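A toy comparison illustrates why exponential screening sharpens the picture: an unscreened potential falls off only as 1/r, whereas a screened one decays exponentially with distance. The Python sketch below, with an assumed screening length and tip height, is an illustration of the stated effect, not a model of the actual instrument.

```python
import math

# Toy comparison of how the contribution of a surface charge at lateral
# distance x falls off for an unscreened 1/r potential versus an
# exponentially screened one. Values are assumptions for illustration only.

TIP_HEIGHT_NM = 2.5       # quantum dot roughly 2-3 nm above the surface
SCREENING_NM = 1.0        # assumed effective screening length

def unscreened(x_nm):
    r = math.hypot(x_nm, TIP_HEIGHT_NM)
    return 1.0 / r

def screened(x_nm):
    r = math.hypot(x_nm, TIP_HEIGHT_NM)
    return math.exp(-r / SCREENING_NM) / r

ref_u, ref_s = unscreened(0.0), screened(0.0)
for x in (0.0, 2.0, 5.0, 10.0):
    print(f"x = {x:4.1f} nm: unscreened {unscreened(x)/ref_u:6.3f}, "
          f"screened {screened(x)/ref_s:8.5f}")
# The screened contribution of laterally distant charges dies away orders of
# magnitude faster, which is the effect described in the quote above.
```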

The Jülich researchers owe the speed at which the complete sample surface can be measured to their partners from Otto von Guericke University Magdeburg. Engineers there developed a controller that helped to automate the complex, repeated sequence of scanning the sample. "An atomic force microscope works a bit like a record player," says Wagner. "The tip moves across the sample and pieces together a complete image of the surface. In previous scanning quantum dot microscopy work, however, we had to move to an individual site on the sample, measure a spectrum, move to the next site, measure another spectrum, and so on, in order to combine these measurements into a single image. With the Magdeburg engineers' controller, we can now simply scan the whole surface, just like using a normal atomic force microscope. While it used to take us 5-6 hours for a single molecule, we can now image sample areas with hundreds of molecules in just one hour."

There are some disadvantages as well, however. Preparing the measurements takes a lot of time and effort. The molecule serving as the quantum dot for the measurement has to be attached to the tip beforehand - and this is only possible in a vacuum at low temperatures. In contrast, normal atomic force microscopes also work at room temperature, with no need for a vacuum or complicated preparations.

And yet, Prof. Stefan Tautz, director at PGI-3, is optimistic: "This does not have to limit our options. Our method is still new, and we are excited for the first projects so we can show what it can really do."

There are many fields of application for quantum dot microscopy. Semiconductor electronics is pushing scale boundaries in areas where a single atom can make a difference for functionality. Electrostatic interaction also plays an important role in other functional materials, such as catalysts. The characterization of biomolecules is another avenue. Thanks to the comparatively large distance between the tip and the sample, the method is also suitable for rough surfaces - such as the surface of DNA molecules, with their characteristic 3D structure.

Credit: 
Forschungszentrum Juelich

Sensing food textures is a matter of pressure

image: Von Frey Hairs

Research team member Nicole Etter, assistant professor of communication sciences and disorders in the College of Health and Human Development, trained graduate students on the team to administer tactile pressure tests she developed on participants' tongues using the Von Frey Hairs, shown here.

Image: 
Nicole Etter / Penn State

Food's texture affects whether it is eaten, liked or rejected, according to Penn State researchers, who say some people are better at detecting even minor differences in consistency because their tongues can perceive particle sizes.

That is the key finding of a study conducted in the Sensory Evaluation Center in the College of Agricultural Sciences by a cross-disciplinary team that included both food and speech scientists specializing in sensory perception and behavior. The research included 111 volunteer tasters who had their tongues checked for physical sensitivity and then were asked their perceptions about various textures in chocolate.

"We've known for a long time that individual differences in taste and smell can cause differences in liking and food intake -- now it looks like the same might be true for texture," said John Hayes, associate professor of food science. "This may have implications for parents of picky eaters since texture is often a major reason food is rejected."

The perception of food texture arises from the interaction of a food with mechanoreceptors in the mouth, Hayes noted. It depends on neural impulses carried by multiple nerves. Despite being a key driver of the acceptance or rejection of foods, he pointed out, oral texture perception remains poorly understood relative to taste and smell, two other sensory inputs critical for flavor perception.

One argument is that texture typically is not noticed when it is within an acceptable range, but that it is a major factor in rejection if an adverse texture is present, explained Hayes, director of the Sensory Evaluation Center. For chocolate specifically, oral texture is a critical quality attribute, with grittiness often being used to differentiate bulk chocolate from premium chocolates.

"Chocolate manufacturers spend lots of energy grinding cocoa and sugar down to the right particle size for optimal acceptability by consumers," he said. "This work may help them figure out when it is good enough without going overboard."

Researchers tested whether there was a relationship between oral touch sensitivity and the perception of particle size. They used a device called Von Frey Hairs to gauge whether participants could discriminate between different amounts of force applied to their tongues.

When participants were split into groups based on pressure-point sensitivity -- high and low acuity -- there was a significant relationship between chocolate-texture discrimination and pressure-point sensitivity for the high-acuity group at the center of the tongue. However, a similar relationship was not seen for data from the lateral edge of the tongue.
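
As a rough illustration of this kind of analysis - not the Penn State team's code or data - the sketch below performs a median split on synthetic pressure-sensitivity scores and tests the association with texture-discrimination scores within each group:

```python
# Illustrative sketch with synthetic data only: median split into high- and
# low-acuity groups, then a within-group correlation test. The effect size,
# noise level and scoring scales are assumptions, not the study's values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 111                                                    # tasters in the study
sensitivity = rng.normal(0, 1, n)                          # assumed pressure-point scores
discrimination = 0.4 * sensitivity + rng.normal(0, 1, n)   # assumed texture scores

high = sensitivity >= np.median(sensitivity)               # high-acuity group
for label, mask in [("high acuity", high), ("low acuity", ~high)]:
    r, p = stats.pearsonr(sensitivity[mask], discrimination[mask])
    print(f"{label}: r = {r:.2f}, p = {p:.3f}")
```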

The chocolate texture-detection experiments used both manipulated chocolates produced in a pilot plant in the Rodney A. Erickson Food Science Building and two commercially produced chocolates. Because chocolate is a semi-solid suspension of fine particles from cocoa and sugar dispersed in a continuous fat base, Hayes explained, it is an ideal food for the study of texture.

"These findings are novel, as we are unaware of previous work showing a relationship between oral pressure sensitivity and ability to detect differences in particle size in a food product," Hayes said. "Collectively, these findings suggest that texture-detection mechanisms, which underpin point-pressure sensitivity, likely contribute to the detection of particle size in food such as chocolate."

Research team member Nicole Etter, assistant professor of communication sciences and disorders in the College of Health and Human Development, trained students on the team to administer tactile pressure tests she developed on participants' tongues using the Von Frey Hairs. As a speech therapist, she explained, her interest in the findings -- recently published in Scientific Reports -- was different from that of the food scientists.

"The overarching purpose of my work is to identify how we use touch sensation -- the ability to feel our tongue move and determine where our tongue is in our mouth -- to behave," she said. "I'm primarily interested in understanding how a patient uses sensation from their tongue to know where and how to move their tongue to make the proper sound."

However, in this research, Etter said she was trying to determine whether individual tactile sensations on the tongue relate to the ability to perceive or identify the texture of food -- in this case, chocolate. And she focused on another consideration, too.

"An important aspect of speech-language pathology is helping people with feeding and swallowing problems," she said. "Many clinical populations -- ranging from young children with disabilities to older adults with dementia -- may reject foods based on their perception of texture. This research starts to help us understand those individual differences."

This study sets the stage for follow-on cross-disciplinary research at Penn State, Etter believes. She plans to collaborate with Hayes and the Sensory Evaluation Center on studies that go beyond chocolate and include older, perhaps less-healthy participants, to judge older people's ability to experience oral sensations and to explore food-rejection behavior that may have serious health and nutrition implications.

Credit: 
Penn State

Research reveals liquid gold on the nanoscale

image: Shape changes in Au nanoclusters, indicating cluster surface melting at high temperatures. Images of two individual clusters containing 561 and 2530 atoms are shown.

Image: 
Swansea University.

The research, published in Nature Communications, set out to answer a simple question - how do nanoparticles melt? Although this question has been a focus of researchers for the past century, it is still an open problem - the initial theoretical models describing melting date back around 100 years, and even the most relevant models are some 50 years old.

Professor Richard Palmer, who led the team based at the University's College of Engineering, said of the research: "Although melting behaviour was known to change on the nanoscale, the way in which nanoparticles melt was an open question. Given that the theoretical models are now rather old, there was a clear case for us to carry out our new imaging experiments to see if we could test and improve these theoretical models."

The research team used gold in their experiments as it acts as a model system for noble and other metals. The team arrived at their results by imaging gold nanoparticles, with diameters ranging from 2 to 5 nanometres, using an aberration-corrected scanning transmission electron microscope. Their observations were later supported by large-scale quantum mechanical simulations.

Professor Palmer said: "We were able to prove the dependence of the melting point of the nanoparticles on their size and for the first time see directly the formation of a liquid shell around a solid core in the nanoparticles over a wide region of elevated temperatures, in fact for hundreds of degrees.

"This helps us to describe accurately how nanoparticles melt and to predict their behaviour at elevated temperatures. This is a science breakthrough in a field we can all relate to - melting - and will also help those producing nanotech devices for a range of practical and everyday uses, including medicine, catalysis and electronics."

Credit: 
Swansea University

New economic study shows combination of SNAP and WIC improves food security

AMES, Iowa - Forty million Americans, including 6.5 million children, are food insecure, according to the U.S. Department of Agriculture, which means they do not have enough food for an active, healthy life.

Many rely on the Supplemental Nutrition Assistance Program (SNAP) - the largest food assistance program for low-income families - to help make ends meet. Still, 51.2 percent of households receiving SNAP benefits, commonly known as food stamps, were food insecure in 2016.

Given the extent of food insecurity, a team of Iowa State University economists developed a methodology to analyze potential redundancies between SNAP and the Special Supplemental Nutrition Program for Women, Infants and Children (WIC), the third-largest food assistance program in the U.S. Their research, published in the Southern Economic Journal, provides evidence that the programs are in fact complementary, not redundant. They found that participating in both SNAP and WIC, compared with SNAP alone, increases food security by at least 2 percentage points and potentially by as much as 24 percentage points.

"Our findings can help policymakers design more efficient programs to meet food needs," said Helen Jensen, ISU professor emeritus of economics. "We know low-income families often participate in more than one food assistance program, and we find the combination of SNAP and WIC helps reduce food insecurity for participating households."

Challenges of measuring program effects

The programs are similar, but serve different needs. WIC covers specific foods to meet the nutritional needs of pregnant women and new mothers as well as infants and young children. Participants also receive nutrition counseling and referrals for health services, such as prenatal programs. In comparison, eligible households can use SNAP benefits to buy most food items. All households included in the study were potentially eligible for both programs, but they chose whether or not to participate.

This "self-selection" is one reason it is difficult for researchers to ascertain whether a program causes a change in food insecurity. WIC and SNAP benefits are not randomly assigned, so any differences in food security outcomes between participants and nonparticipants could be due to actual causal impacts of the programs or unobserved differences between households that apply for benefits and those that do not.

If households at greatest risk of becoming food insecure are most likely to apply - for example, in the case of a job loss - it might falsely appear the programs are ineffective in alleviating food insecurity, the researchers said. In fact, while participants may be less food secure than eligible nonparticipants, participants may still be more food secure than they would have been in a world without the programs.
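
A toy simulation makes the point concrete. In the sketch below (the enrollment rule, effect size and sample size are illustrative assumptions, not the ISU methodology), households at higher underlying risk are more likely to enroll, so a naive comparison of participants with nonparticipants makes a genuinely beneficial program look harmful:

```python
# Toy selection-bias simulation: the program truly lowers food insecurity by
# 10 points, yet the naive participant-vs-nonparticipant gap comes out positive
# because higher-risk households enroll more often.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
risk = rng.uniform(0, 1, n)                        # latent risk of food insecurity
enroll = rng.uniform(0, 1, n) < 0.2 + 0.6 * risk   # higher-risk households enroll more
true_effect = 0.10                                 # assumed causal reduction for enrollees

p_insecure = np.clip(risk - true_effect * enroll, 0, 1)
insecure = rng.uniform(0, 1, n) < p_insecure

naive_gap = insecure[enroll].mean() - insecure[~enroll].mean()
print(f"naive participant minus nonparticipant gap: {naive_gap:+.3f}")  # positive: looks harmful
print(f"true causal effect: {-true_effect:+.3f}")                       # actually beneficial
```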

Another challenge for researchers is that households are known to systematically underreport benefits, often because they don't want to admit they are receiving government assistance.

"For these reasons, traditional econometric methods lead to misleading estimates," said Oleksandr Zhylyevskyy, associate professor of economics. "With that in mind, we developed a methodology that allows us to more accurately measure the true effects of WIC and SNAP."

The researchers applied their methodology to data from the USDA's National Household Food Acquisition and Purchase Survey, or FoodAPS, which provides self-reported household participation in SNAP and WIC as well as validated data for SNAP participation. The study included 460 households that were income-eligible for both programs; each was surveyed for one week.

On average, these households were families of four with two children, one under the age of 6. The average monthly income was about $1,600. More than 75 percent rented a home or apartment, 26 percent did not own or lease a vehicle and 11 percent had used a food pantry within the past 30 days.

FoodAPS matched survey responses about SNAP participation with official administrative records to identify response errors, but no similar verification was available for WIC. The ISU researchers say the new methodology was specifically designed to handle this type of scenario in which researchers can corroborate answers for some survey questions, but not others.

"Our goal was to strike a balance between making assumptions that are weak enough to be credible, but strong enough to be informative," said Brent Kreider, professor of economics. "Policymakers may ask whether these programs actually work or merely increase government spending without reducing food insecurity. We find WIC helps even when SNAP is already in place."

Credit: 
Iowa State University