
The case for retreat in the battle against climate change

image: A new paper led by A.R. Siders of the University of Delaware argues that, in some cases, communities should plan to retreat from flood-, fire- or hurricane-prone areas and adjust their building codes and services accordingly.

Image: 
NOAA National Environmental Satellite, Data, and Information Service (NESDIS)

When it comes to climate change, moving people and development away from at-risk areas can be viewed, not as a defeat, but as a smart strategy that allows communities to adapt and thrive.

That's the case for carefully planned "managed retreat" made by three environmental researchers in an article published Aug. 22 in the Policy Forum section of the journal Science. The article was written by lead author A.R. Siders of the University of Delaware, with co-authors Miyuki Hino and Katharine J. Mach of Stanford University and the University of Miami.

"We need to stop picturing our relationship with nature as a war," said Siders, who is a core faculty member of UD's Disaster Research Center and an assistant professor of public policy and administration and of geography.

"We're not winning or losing; we're adjusting to changes in nature. Sea levels rise, storms surge into flood plains, so we need to move back."

Moving away from coastal and other endangered areas usually occurs after disaster strikes, she said, with emergency evacuations and their aftermath often handled inefficiently and haphazardly. Instead, the researchers argue that retreating from those areas should be done thoughtfully, with planning that is strategic as well as managed.

"Retreat is a tool that can help achieve societal goals like community revitalization, equity and sustainability if it is used purposefully," Siders said. "People sometimes see retreat as defeatist, but I see it as picking your battles."

In the Science paper, the researchers point out that retreat is a difficult and complex issue for many reasons, including the short-term economic gains of coastal development, subsidized insurance rates and disaster recovery costs, and people's attachment to the place where they live and to the status quo. Also, when disaster strikes, the more affluent residents are more able to relocate, often leaving behind those who don't have the financial resources to move.

"No matter the circumstances, moving is hard," Hino said. "People have chosen where to live for a reason, and it is often difficult to find a place to move to that meets all their social, cultural and financial requirements.

"One major challenge with retreat is that we're so focused on getting people out of harm's way, we miss the chance to help them move to opportunity."

The researchers take the long view, noting that retreat may be the answer to climate change in some areas, but it may not be a step that's necessary this year or even this decade.

"The challenge is to prepare for long-term retreat by limiting development in at-risk areas," they write, and making plans for further action based on responding to specific triggers and constantly monitoring and evaluating conditions.

"The story of retreat as a climate response is just beginning," Mach said. "Retreat is compelling because it brings together so many aspects of how societies work, what individuals are trying to achieve and what it takes to ensure preparedness and resilience in a changing climate."

The paper makes note of a variety of areas where additional work is needed, including coordination of various levels of government and support for relocation assistance programs. First, Siders said, communities must identify which areas they most want to protect and how to encourage and assist relocation.

"Managed retreat needs to be embedded in larger conversations and social programs," she said. "Retreat can't be just about avoiding risk. It needs to be about moving toward something better."

Credit: 
University of Delaware

Smartphone app makes parents more attuned to their babies' needs, research shows

University of York researchers have designed an app to help new parents become more 'tuned in' to what their babies are thinking and feeling.

The app, called BabyMind, prompts the parent to think about things from their baby's perspective and to consider what is going on in their baby's mind at a specific point each day. It also provides parents with accurate information on babies' psychological development.

Smartphone apps targeted at parents and parents-to-be are increasingly popular with users, but there is a lack of evidence on how effective they are at improving parenting behaviour.

Professor Elizabeth Meins, from the University of York's Department of Psychology, said, "There are thousands of parenting apps available, but we don't know whether any of them actually have a positive impact on parenting.

"There are many advantages of using apps as a means of intervention--they're low-cost, easy to use and already integrated into people's lives--but we wanted to establish whether an app can have a demonstrable effect on the quality of parent-baby interaction".

The team recruited a group of mothers who started using the app as soon as their babies were born and followed up with them when their babies were six months of age.

The researchers observed the parents playing with their babies and assessed how attuned they were to their babies' thoughts and feelings. The mothers were compared against a control group of mothers with 6-month-olds who had not used the app. Compared with the control group mothers, the app users were significantly more attuned to their babies' thoughts and feelings.

The study included a wide age range of mothers, and the results for teenage mothers were particularly interesting. Professor Meins explained, "Previous research has shown that teenage mothers show less attunement to their babies' thoughts and feelings compared with mothers in their mid-twenties or older. Our study showed that young mothers who had used BabyMind were just as attuned as the older mothers who'd used the app.

"Even more impressive was the fact that the young app users were more attuned to their babies than the older mothers who had not used the BabyMind app. This suggests that using our app is associated with younger motherhood no longer being a disadvantage".

NICE data show that uptake of antenatal and postnatal services is poor in younger parents, so this research may be useful in designing interventions to support families in hard-to-reach populations.

Credit: 
University of York

196,000 youth lose health insurance coverage in past 3 years; yet upsides remain

The national implementation of the Affordable Care Act (ACA) in 2014 was associated with gains in health insurance coverage for youth, but some of those gains have reversed during the past three years, according to findings published this month in Academic Pediatrics from researchers at the Dornsife School of Public Health at Drexel University.

Using National Health Interview Survey data, the researchers found that although the overall percentage of uninsured youth decreased by almost two percentage points between the 2011-2013 and 2014-2015 periods, it rose by 0.3 percentage points in 2016-2018 to 5.1 percent, a loss of coverage for 196,000 youth and a total of 3.7 million youth without health insurance.

"Efforts in 2017 and 2018 to reduce advertising budgets, cease payments of cost sharing reductions and other efforts could have undermined the stability of insurance markets and public perceptions of insurance availability," said lead author Alex Ortega, PhD, a professor of health policy and chair of Health Management and Policy. "Insurance declines could be attributed to those actions."

Despite this reversal, the researchers see two positive metrics: diminishing racial disparities in access to care and continued growth in the number of well visits among American youth.

"There are provisions of the ACA aimed at advancing population health through improving access to and quality of health care and implementing cost-saving measures," said Ortega. "These include measures focused on making health care more efficient, equitable and effective."

Despite the overall drop in coverage from 2014-2015 to 2016-2018, the percentage of those under the age of 18 in the United States who had a well-child visit in the past year rose. In 2011-2013, 81 percent of youth had a well-child visit during the previous year. In 2016-2018, that figure rose to 85 percent.

The researchers note that the ACA requires health insurers to cover preventive visits without cost to the patient, so youth can access essential preventive services, such as immunizations, regardless of insurance coverage.

"For youth, it appears, that these provisions are increasing the number of well child visits," Ortega said.

Coverage of Latino youth has been lower than it is among non-Latino white or black youth since before the ACA became law in 2010, but the disparities gap has narrowed a bit in recent years - coverage for Latino youth rose as it declined for white and black youth in 2016-2018. The authors speculate that information about the availability of the ACA's insurance tax credits and cost sharing reductions may be slowly reaching immigrant populations and individuals with limited English proficiency.

Credit: 
Drexel University

The Paleozoic diet: Why animals eat what they eat

image: Insects are a group in which feeding on plants increases rates of species proliferation, including among the butterflies and moths, which are almost all herbivorous.

Image: 
Daniel Stolte/UANews

What an animal eats is a fundamental aspect of its biology, but surprisingly, the evolution of diet had not been studied across the animal kingdom until now. Scientists at the University of Arizona report several unexpected findings from taking a deep dive into the evolutionary history of more than one million animal species and going back 800 million years, when the first animals appeared on our planet.

The study, published in the journal Evolution Letters, revealed several surprising key insights:

Many species living today that are carnivorous, meaning they eat other animals, can trace this diet back to a common ancestor more than 800 million years ago.

A plant-based, or herbivorous, diet is not the evolutionary driver of new species that it was believed to be.

Closely related animals tend to share the same dietary category - plant-eating, meat-eating, or both. This finding implies that switching between dietary lifestyles is not something that happens easily or often over the course of evolution.

Cristian Román-Palacios, Joshua Scholl and John Wiens, all with the Department of Ecology and Evolutionary Biology at the UA, scoured the literature for data on the dietary habits of more than a million animal species, from sponges to insects and spiders to housecats. A species was classified as carnivorous if it feeds on other animals, fungi or protists (single-celled eukaryotic organisms, many of which live on bacteria). Species were classified as herbivorous if they depend on land plants, algae or cyanobacteria for food, and omnivorous if they eat a mixture of carnivorous and herbivorous diets.
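For readers who want the classification rule in concrete form, here is a minimal sketch of that scheme in Python; the food-category labels passed in are hypothetical examples, not records from the study's dataset.

```python
# A minimal sketch of the paper's diet classification rule; inputs are
# hypothetical food categories, not entries from the study's dataset.
CARNIVORE_FOODS = {"animals", "fungi", "protists"}
HERBIVORE_FOODS = {"land plants", "algae", "cyanobacteria"}

def classify_diet(foods):
    """Return a dietary category from a set of food-source labels."""
    eats_animals = bool(foods & CARNIVORE_FOODS)
    eats_plants = bool(foods & HERBIVORE_FOODS)
    if eats_animals and eats_plants:
        return "omnivore"
    if eats_animals:
        return "carnivore"
    if eats_plants:
        return "herbivore"
    return "unclassified"

print(classify_diet({"animals"}))                 # carnivore
print(classify_diet({"land plants", "animals"}))  # omnivore
```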

The scientists then mapped the vast dataset of animal species and their dietary preferences onto an evolutionary tree built from DNA-sequence data to untangle the evolutionary relationships between them.

"Ours is the largest study conducted so far that examines the evolution of diet across the whole animal tree of life," said doctoral student Román-Palacios, lead author of the paper. "We addressed three highly-debated and fundamental questions in evolutionary biology by analyzing a large-scale dataset using state-of-the-art methods."

All species can be classified according to their evolutionary relationships, a concept known as phylogeny. Organisms are grouped into taxa, which define their interrelationships across several levels. For example, cats and dogs are different species but belong to the same order (carnivores). Similarly, horses and camels belong to a different order (ungulates). Both orders, however, are part of the same class (mammals). At the highest level, animals are classified into phyla. Examples of animal phyla are arthropods (insects, crustaceans, spiders, scorpions and the like), mollusks (snails, clams and squid fall into this phylum), and chordates, which include all animals with a backbone, including humans.

The survey suggests that across animals, carnivory is most common, accounting for 63% of species. Another 32% are herbivorous, while humans belong to a small minority, just 3%, of omnivorous animals.

The researchers were surprised to find that many of today's carnivorous species trace this diet back all the way to the base of the animal evolutionary tree, more than 800 million years ago, predating the oldest known fossils that paleontologists have been able to assign to animal origins with certainty.

"We don't see that with herbivory," said Wiens, professor of ecology and evolutionary biology and corresponding author of the study. "Herbivory seems to be much more recent, so in our evolutionary tree, it appears more frequently closer to the tips of the tree."

So if the first animal was a carnivore, what did it prey on?

The authors suggest the answer might lie with protists, including choanoflagellates: tiny, single-celled organisms considered to be the closest living relatives of the animals. Living as plankton in marine and freshwater, choanoflagellates are vaguely reminiscent of miniature versions of the shuttlecock batted back and forth during a game of badminton. A funnel-shaped collar of "hairs" surrounds a whip-like appendage called a flagellum whose rhythmic beating sucks a steady stream of water through the collar, filtering out bacteria and detritus that is then absorbed and digested. It is possible that the common ancestor of today's animals was a creature very similar to a choanoflagellate.

"The ancient creature that is most closely related to all animals living today might have eaten bacteria and other protists rather than plants," Wiens said.

Turning to a plant-based diet, on the other hand, happened much more frequently over the course of animal evolution.

Herbivory has traditionally been seen as a powerful catalyst for the origin of new species - an often-cited example is the insects, which, with an estimated 1.5 million described species, are the most diverse group among the arthropods. Many new species of flowering plants appeared during the Cretaceous period, about 130 million years ago, and the unprecedented diversity of flowers is widely thought to have coincided with an increase in insect species taking advantage of the newly available floral bounty.

"This tells us that what we see in insects doesn't necessarily apply to other groups within the animal kingdom," Wiens said. "Herbivory may go hand in hand with new species appearing in certain taxa, but it clearly is not a universal driver of new species."

The study also revealed that omnivorous ("eating everything") diets popped up rarely over the course of 800 million years of animal evolution, hinting at the possible explanation that evolution prefers specialists over generalists.

"You can be better at doing what you do if that is all you do," Wiens said. "In terrestrial vertebrates, for example, eating a diet of leaves often requires highly modified teeth and a highly modified gut. The same goes for carnivory. Nature generally seems to avoid the dilemma of being a jack-of-all-trades and master of none, at least for diets."

This need for specialization might explain why omnivores, such as humans, are rare, according to the authors. It might also explain why diets have often gone unchanged for so long.

"There is a big difference between eating leaves all the time and eating fruits every now and then," Wiens said. "The specializations required to be an efficient herbivore or carnivore might explain why the two diets have been so conserved over hundreds of millions of years."

Credit: 
University of Arizona

What's killing sea otters? Scientists pinpoint parasite strain

image: A southern sea otter swims off the coast of Moss Landing in California.

Image: 
Trina Wood/UC Davis

Many wild southern sea otters in California are infected with the parasite Toxoplasma gondii, yet the infection is fatal for only a fraction of sea otters, which has long puzzled the scientific community. A study from the University of California, Davis, identifies the parasite's specific strains that are killing southern sea otters, tracing them back to a bobcat and feral domestic cats from nearby watersheds.

The study, published this week in the journal Proceedings of the Royal Society B, marks the first time a genetic link has been clearly established between the Toxoplasma strains in felid hosts and parasites causing fatal disease in marine wildlife.

The study builds on years of work by a consortium of researchers led by the UC Davis School of Veterinary Medicine's Karen C. Drayer Wildlife Health Center and the California Department of Fish and Wildlife (CDFW). The scientists were called upon in the late 1990s to help decipher the mystery when Toxoplasma caused deaths in sea otters along the California coast.

"This is decades in the making," said corresponding author Karen Shapiro, an associate professor with the UC Davis School of Veterinary Medicine and its One Health Institute. "We now have a significant link between specific types of the parasite and the outcome for fatal toxoplasmosis in sea otters. We are actually able to link deaths in sea otters with wild and feral cats on land."

FROM LAND TO SEA

Wild and domestic cats are the only known hosts in which Toxoplasma forms egg-like stages, called oocysts, which are shed in their feces. Shapiro led the initial effort to show how oocysts accumulate in kelp forests and are taken up by snails, which are eaten by sea otters.

For this study, the authors characterized Toxoplasma strains for more than 100 stranded southern sea otters examined by the CDFW between 1998 and 2015. CDFW Veterinary Pathologist Melissa Miller assessed the otters for Toxoplasma as a primary or contributing cause of death. The scientists compared pathology data with the parasite strains found in sea otters and nearby wild and domestic cats to identify connections between the disease-causing pathogen and its hosts.

The study's results highlight how infectious agents like Toxoplasma can spread from cat feces on land to the sea, leading to detrimental impacts on marine wildlife.

CLOSELY WATCHED

Southern sea otters are among the most intensely studied marine mammals in California because they are a threatened species and an iconic animal for the state. They live within just a few hundred meters of the coastline, allowing for close observation that enables a wealth of scientific data.

Previous research showed that up to 70 percent of stranded southern sea otters were infected with Toxoplasma, yet the infection becomes fatal for only a fraction of them. Decades of detailed investigations by CDFW and UC Davis have confirmed that infection by land-based protozoan parasites such as Toxoplasma and the related parasite Sarcocystis neurona are common causes of illness and death for southern sea otters.

Shapiro notes that Toxoplasma can also affect other wildlife species, but there is more robust data for the otters.

"Toxoplasma is one heavily studied pathogen that we care about, but there are many other viruses and bacteria that are on land and being flushed to the ocean that we probably aren't aware of yet," Shapiro said.

WHAT CAN BE DONE?

People can help reduce the spread of Toxoplasma by keeping their cats inside and disposing of cat feces in a bag in the trash - not outdoors or in the toilet, because wastewater treatment is not effective at killing oocysts.

Outdoor cats that feed on wild rodents and birds are likely to become infected with Toxoplasma because the parasite is commonly present in the tissues of these prey animals.

Oocysts shed in cat feces on land get washed into waterways with rainfall, and prior research identified freshwater outflow as a key source of Toxoplasma exposure for southern sea otters.

Wetlands, forests and grasslands naturally serve to shield watersheds and oceans from pollutants, including oocysts. Preserving and restoring wetlands and natural areas, managing stormwater runoff, and replacing pavement with permeable surfaces can reduce contamination and minimize pathogens entering the water.

Credit: 
University of California - Davis

Cracking a decades-old test, researchers bolster case for quantum mechanics

image: Researchers created entangled photon pairs and distributed the two photons of each pair to two measurement stations in opposite directions. At each measurement station, a telescope received the photons from the selected cosmic radiation source, which is at least 11 light-years from Earth. The cosmic photon detection signals generate random bits for measurement setting choices for the loophole-free Bell test. In this experiment, the researchers closed detection and locality loopholes, and pushed the time constraint to rule out local hidden variable models to 11 years before the experiment.

Image: 
Ming-Han Li, USTC, Shanghai

WASHINGTON -- In a new study, researchers demonstrate creative tactics to get rid of loopholes that have long confounded tests of quantum mechanics. With their innovative method, the researchers were able to demonstrate quantum interactions between two particles spaced more than 180 meters (590 feet) apart while eliminating the possibility that shared events during the past 11 years affected their interaction.

A paper explaining these results will be presented at the Frontiers in Optics + Laser Science (FIO + LS) conference, held 15-19 September in Washington, D.C., U.S.A.

Quantum phenomena are being explored for applications in computing, encryption, sensing and more, but researchers do not yet fully understand the physics behind them. The new work could help advance quantum applications by improving techniques for probing quantum mechanics.

A test for quantum theories

Physicists have long grappled with different ideas about the forces that govern our world. While theories of quantum mechanics have gradually overtaken classical mechanics, many aspects of quantum mechanics remain mysterious. In the 1960s, physicist John Bell proposed a way to test quantum mechanics known as Bell's inequality.

The idea is that two parties, nicknamed Alice and Bob, make measurements on particles that are located far apart but connected to each other via quantum entanglement.

If the world were indeed governed solely by quantum mechanics, these remote particles would be governed by a nonlocal correlation through quantum interactions, such that measuring the state of one particle affects the state of the other. However, some alternate theories suggest that the particles only appear to affect each other, but that in reality they are connected by other hidden variables following classical, rather than quantum, physics.

Researchers have conducted many experiments to test Bell's inequality. However, experiments can't always be perfect, and there are known loopholes that could cause misleading results. While most experiments have strongly supported the conclusion that quantum interactions exist, these loopholes still leave a remote possibility that researchers could be inadvertently affecting hidden variables, thus leaving room for doubt.
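To make the logic concrete, here is a minimal sketch (not the experiment's own analysis) of the CHSH form of Bell's inequality, using the textbook quantum correlation for a spin-singlet pair. Any local hidden-variable model keeps the combination S between -2 and 2, while quantum mechanics predicts a magnitude of 2*sqrt(2).

```python
import numpy as np

def E(a, b):
    # Quantum prediction for the correlation of a spin-singlet pair
    # measured along directions a and b (standard textbook result).
    return -np.cos(a - b)

# Measurement settings (in radians) that maximise the CHSH combination.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

# One standard CHSH combination; local hidden variables require |S| <= 2.
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2), exceeding the classical bound
```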

Closing loopholes

In the new study, the researchers demonstrate ways to close those loopholes and add to the evidence that quantum mechanics governs the interactions between the two particles.

"We realized a loophole-free Bell test with the measurement settings determined by remote cosmic photons. Thus we verified the completeness of quantum mechanics with high-confidence probability," said Ming-Han Li of the University of Science and Technology of China, who is lead author on the paper.

Their experimental setup includes three main components: a device that periodically sends out pairs of entangled photons and two stations that measure the photons. These stations are Alice and Bob, in the parlance of Bell's inequality. The first measurement station is 93 meters (305 feet) from the photon pair source and the second station is 90 meters (295 feet) away in the opposite direction.

The entangled photons travel through single mode optical fiber to the measurement stations, where their polarization state is measured with a Pockels cell and the photons are detected by superconducting nanowire single-photon detectors.

In designing their experiment, the researchers sought to overcome three key problems: the idea that loss and noise make detection unreliable (the detection loophole), the idea that any communication that affects Alice's and Bob's measurement choices makes the measurement cheatable (the locality loophole), and the idea that a measurement-setting choice that is not "truly free and random" makes the result able to be controlled by a hidden cause in the common past (the freedom-of-choice loophole).

To address the first problem, Li and his colleagues demonstrated that their setup achieved a sufficiently low level of loss and noise by comparing measurements made at the start and end of the photon's journey. To address the second, they built the experimental setup with space-like separation between the events of measurement setting choice. To address the third, they based their measurement-setting choices on cosmic photon behavior from 11 years earlier, which offers high confidence that nothing in the particles' shared past - for at least the past 11 years - created a hidden variable affecting the outcome.
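As a back-of-envelope illustration of why the locality loophole imposes such tight timing (using the distances quoted above, not the paper's own timing budget): light needs well under a microsecond to cross the full separation between the two stations, so setting choices and detections must be completed within that window to remain space-like separated.

```python
# Rough light-crossing time between the two measurement stations,
# using the 93 m + 90 m distances quoted above (an illustration only).
c = 299_792_458.0            # speed of light in m/s
separation_m = 93 + 90       # stations lie in opposite directions from the source
print(f"{separation_m / c * 1e9:.0f} ns")  # ~610 ns
```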

Combining theoretically calculated predictions with experimental results, the researchers were able to demonstrate quantum interactions between the entangled photon pairs with a high degree of confidence and fidelity. Their experiment thus provides robust evidence that quantum effects, rather than hidden variables, are behind the particles' behavior.

Credit: 
Optica

WPI mathematician is helping NASA spacecraft travel faster and farther

image: Randy Paffenroth, associate professor of mathematical sciences at WPI, holds a carbon-based sheet known as Miralon®, which is used in a variety of spacecraft applications.

Image: 
Worcester Polytechnic Institute

Worcester, Mass. - Aug. 22, 2019 - By combining cutting-edge machine learning with 19th-century mathematics, a Worcester Polytechnic Institute (WPI) mathematician is working to make NASA spacecraft lighter and more damage tolerant by developing methods to detect imperfections in carbon nanomaterials used to make composite rocket fuel tanks and other spacecraft structures.

Randy Paffenroth, associate professor of mathematical sciences, computer science, and data science, has a multi-part mission in this research project. Using machine learning, neural networks, and an old mathematical tool, he has developed an algorithm that significantly enhances the resolution of the density scanning systems used to detect flaws in carbon nanotube materials. Higher-resolution scans provide more accurate images of the material's uniformity (a nine-fold "super resolution"), detecting imperfections in Miralon® materials--a strong, lightweight, flexible nanomaterial produced by Nanocomp Technologies, Inc.

Miralon® yarns, which can be as thin as a human hair, can be wrapped around structures like rocket fuel tanks, giving them the strength to withstand high pressures. Imperfections and variations in thickness can cause weak spots in the yarn and the resulting composite. Paffenroth, with a team of graduate students, is analyzing data from the manufacturing process to help ensure a more consistent end product.

Nanocomp uses a modified commercial "basis weight" scanning system that scans the nanomaterial for mass uniformity and imperfections, creating a visual image of density; Paffenroth and his team are using machine learning to train algorithms to increase the resolution of the images, allowing the machine to detect more minute variations in the material. They have developed a unique mathematical "compressed sensing / super resolution" algorithm that has increased the resolution by nine times.

Built with the Python programming language and based on an artificial neural network, the algorithm was "trained" on thousands of sets of nanomaterial images in which Paffenroth had already identified and located flaws. He essentially gave the algorithm a series of practice tests where he already knew the answers (known as "ground truth"). Then, he gave it other tests without the answers. "I give it a sheet of material. I know the imperfections going in but the algorithm doesn't. If it finds those imperfections, I can trust its accuracy," said Paffenroth.

To make the machine learning algorithm more effective at making a high-resolution image out of a low-resolution image, he combined it with the Fourier Transform, a mathematical tool devised in the early 1800s that can be used to break down an image into its individual components.

"We take this fancy, cutting-edge neural network and add in 250-year-old mathematics and that helps the neural network work better," said Paffenroth. "The Fourier Transform makes creating a high-resolution image a much easier problem by breaking down the data that makes up the image. Think of the Fourier Transform as a set of eyeglasses for the neural network. It makes blurry things clear to the algorithm. We're taking computer vision and virtually putting glasses on it.

"It's exciting to use this combination of modern machine learning and classic math for this kind of work," he added.

Paffenroth's work is funded by an $87,353 grant WPI received from Nanocomp Technologies, a New Hampshire-based subsidiary of Huntsman Corporation that makes advanced carbon-nanotube materials for aerospace, defense, and the automotive industry. WPI is a sub-contractor to Nanocomp, which received an $8.1 million contract from NASA to advance its carbon nanotube sheets and yarns.

Miralon® has already been proven in space. For instance, it was wrapped around structural supports in NASA's Juno probe orbiting the planet Jupiter to help solve a challenging problem with vibration damping and static discharge. NASA has also used Miralon® nanomaterials to make and test prototypes of new carbon composite pressure vessels, the precursors to next-generation rocket fuel tanks. NASA spacecraft will need that added strength and durability as they travel farther from home and deeper into space.

As part of its current NASA contract, Nanocomp is trying to make Miralon® yarns that are three times stronger, and the work by Paffenroth's team is a big part of making that happen.

"Randy is helping us achieve this goal of tripling our strength by improving the tools in our toolbox so that we can make stronger, better, next-generation materials to be used in space applications," said Bob Casoni, Quality Manager at Nanocomp. "If NASA needs to build a new rocket system strong enough to get to Mars and back, it has a big set of challenges to face. Better materials are needed to allow NASA to design rockets that can go farther, faster and survive longer."

Casoni noted that with the higher resolution from WPI's algorithm, Nanocomp can see patterns and variations in its materials that it could not see before.

"We can not only pick up features, but we also have a better idea of the magnitude of those features," he said. "Before, it was like seeing a blurry satellite image. You might think you're seeing the rolling hills of Pennsylvania, but with better resolution you see it's really Mount Washington or the Colorado Rockies. It's pretty amazing stuff."

And with better measurement tools, Nanocomp also will be able to improve its manufacturing process by testing whether changes in factors like temperature, tension control, pressure, and flow rates create better materials. "We can use better measurements to optimize our ultimate product performance," said Casoni. "Randy is helping us understand our manufacturing process better. He's doing his "magic math" to help us better understand variations in our product. The uniformity of that material plays a big part in its ultimate strength."

Paffenroth and his team will also develop algorithms to be used in active feedback control systems to predict how good a particular piece of material will be as it's first being made, helping to ensure a more consistent end product. The algorithm analyzes the properties measured at the beginning of the manufacturing run to effectively predict the properties at the end of the run, including mechanical properties and length of run.

"We can use machine learning to predict that Nanocomp won't get a useful length of material out of a particular production run," said Paffenroth. "It helps them with waste. If they can tell in the first few meters of the run that there will be a problem, they can stop and start over. The Holy Grail of process engineering is that the more you understand about your process, the better your process is."

WPI will present its findings on Aug. 25 at the 2019 International Conference on Image, Video Processing and Artificial Intelligence in Shanghai, China.

Credit: 
Worcester Polytechnic Institute

A single gene determines whether a fly has a good sense of sight or a good sense of smell

image: This figure shows heads from D. melanogaster (top) and D. pseudoobscura (bottom) with different proportions of eyes and antennae.

Image: 
A. Ramaekers and N. Grillenzoni

Trade-offs in the sizes of visual and olfactory organs are a common feature of animal evolution, but the underlying genetic and developmental mechanisms have not been clear. A study publishing August 22 in the journal Developmental Cell reveals that a single DNA variant that affects the timing of sensory organ development in fruit flies could explain the size trade-off between eyes and antennae, potentially providing a quick route to behavioral changes and adaptation.

Because the affected gene, eyeless/Pax6, is conserved across invertebrates and vertebrates, including humans, the discovery could represent a general mechanism for sensory organ size trade-offs across the animal kingdom.

The senses animals rely on have been shaped through evolution to better navigate and exploit the environment. As a result, even closely related species living in different ecological niches show variation in the sizes and shapes of their sensory structures. In arthropods such as fruit flies, trade-offs between the size of the eyes and of the antennae, where most olfactory organs are located, are pervasive.

"What we demonstrate is that there are consequences to subtle changes in the conserved mechanisms that govern how these sense organs develop," says senior study author Bassem Hassan of Institut du Cerveau et de la Moelle épinière (ICM). "What this means more broadly is that one cannot fully understand how genetic variation and morphological variation relate to each other without understanding the developmental processes that translate the former into the latter."

To examine the underlying mechanisms, Hassan and first author Ariane Ramaekers of Institut du Cerveau et de la Moelle épinière (ICM) combined comparative analyses of different fruit fly strains and species with developmental, molecular, and genome-editing approaches. Specifically, the authors focused on a structure called the eye-antennal imaginal disc (EAD), which consists of an eye field and a non-eye field and gives rise to all external head sensory organs during fruit fly development.

Hassan and Ramaekers found that the eye field is proportionally larger in Drosophila pseudoobscura (D. pse.) compared to Drosophila melanogaster (D. mel.), corresponding to a 35% increase in the number of ommatidia--small units that make up the insect compound eye. Similarly, the eye field is proportionally larger in the D. mel. strain called Canton-S compared with the D. mel. strain Hikone-AS, corresponding to a 12.5% increase in ommatidia number.

"These data suggest that, despite 17 to 30 million years of separated evolution between the two species groups, ommatidia number variation between D. mel. and D. pse. and between two D. mel. strains share a common developmental logic," Hassan says.

To search for genetic causes of eye size variation, the researchers next examined DNA sequences to which transcription factors bind to regulate the expression of the neighboring eyeless/Pax6 gene. They found a single-nucleotide variant--a G>A substitution--at a binding site that differentiates the small-eye subspecies from the large-eye subspecies. The G allele in the small-eye subspecies is predicted to produce a higher-affinity binding site, resulting in greater repression of eyeless/Pax6 expression than with the A allele in the large-eye subspecies.

Additional analyses showed that this variant occurs in natural fruit fly populations, and the A allele corresponds to both more ommatidia and smaller antennal width among different laboratory strains. Using CRISPR/Cas9 to introduce the A allele into a G-homozygous stock, the researchers demonstrated that the G>A substitution causes an increase in the number of ommatidia.

"We were surprised by the simplicity of the mechanism of sensory trade-offs that we identified: To vary sensory organ size--in this case, eye versus antennae--it suffices to slightly vary the expression of a single gene," Ramaekers says. "It was particularly satisfactory to find that this gene, called Pax6, is the same one that builds the eye in all animals, including humans. We were also surprised that what matters to generate a trade-off is to change when, rather than where, Pax6 ends up being expressed. To make the eye bigger or smaller, it is sufficient to slightly speed up or slow down the subdivision of the head primordium into eye versus non-eye territories."

For the authors, the findings raise several intriguing questions. For example, it's not yet clear how the single-nucleotide variant changes the timing of Pax6 expression and sensory organ development. Moreover, it's possible that controlling the timing of the expression of key developmental genes could be a general rule for changing the size of a tissue or organ. Another interesting question is whether sensory brain regions are affected by changes in the relative sizes of sensory organs.

Credit: 
Cell Press

New method classifies brain cells based on electrical signals

image: MIT Professor Earl Miller

Image: 
The Picower Institute for Learning and Memory

For decades, neuroscientists have relied on a technique for reading out electrical "spikes" of brain activity in live, behaving subjects that tells them very little about the types of cells they are monitoring. In a new study, researchers at the University of Tuebingen and MIT's Picower Institute for Learning and Memory demonstrate a way to increase their insight by distinguishing four distinct classes of cells from that spiking information.

The advance offers brain researchers the chance to better understand how different kinds of neurons are contributing to behavior, perception and memory, and how they are malfunctioning in cases of psychiatric or neurological diseases. Much like mechanics can better understand and troubleshoot a machine by watching how each part works as it runs, neuroscientists, too, are better able to understand the brain when they can tease apart the roles different cells play while it thinks.

"We know from anatomical studies that there are multiple types of cells in the brain and if they are there, they must be there for a reason," said Earl Miller, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT, and co-senior author of the paper in Current Biology. "We can't truly understand the functional circuitry of the brain until we fully understand what different roles these different cell types might play."

Miller collaborated with the Tuebingen-based team of lead author Caterina Trainito, Constantin von Nicolai and Professor Markus Siegel, co-senior author and a former postdoc in Miller's lab, to develop the new way to wring more neuron type information from electrophysiology measurements. Those measures track the rapid voltage changes, or spikes, that neurons exhibit as they communicate in circuits, a phenomenon essential for brain function.

"Identifying different cell types will be key to understand both local and large-scale information processing in the brain," Siegel said.

Four is greater than two

At best, neuroscientists have so far only been able to determine from electrophysiology whether a neuron was excitatory or inhibitory. That's because they only analyzed the difference in the width of the spike. The typical amount of data in an electrophysiology study - spikes from a few hundred neurons - only supported that single degree of distinction, Miller said.

But the new study could go farther because it derives from a dataset of recordings from nearly 2,500 neurons. Miller and Siegel gathered the data years ago at MIT from three regions in the cortex of animals that were performing experimental tasks integrating perception and decision making.

"We recognized the uncommonly rich resource at our disposal," Siegel said.

Thus, the team decided to put the dataset through a battery of sophisticated statistical and computational tools to analyze the waveforms of the spikes. Their analysis showed that the waveforms could actually be sorted along two dimensions: how quickly the waveform ranges between its lowest and highest voltage ("trough-to-peak duration"), and how quickly the voltage changes again afterward, returning from the peak to the normal level ("repolarization time"). Plotting those two factors against each other neatly sorted the cells into four distinct "clusters." Not only were the clusters evident across the whole dataset, but individually within each of the three cortical regions, too.
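For a concrete sense of this kind of analysis, here is a minimal sketch of clustering neurons by those two waveform features. The feature values are simulated and the mixture-model approach is an illustration, not the authors' exact statistical pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated per-neuron waveform features (milliseconds):
# column 0 = trough-to-peak duration, column 1 = repolarization time.
rng = np.random.default_rng(1)
features = np.vstack([
    rng.normal(loc=[0.2, 0.30], scale=0.03, size=(600, 2)),  # narrow-spiking group
    rng.normal(loc=[0.4, 0.35], scale=0.04, size=(600, 2)),
    rng.normal(loc=[0.6, 0.50], scale=0.04, size=(600, 2)),
    rng.normal(loc=[0.8, 0.70], scale=0.05, size=(600, 2)),  # broad-spiking group
])

# Fit mixtures with 1-6 components and keep the one with the lowest BIC,
# letting the data themselves indicate how many clusters are present.
models = [GaussianMixture(n_components=k, random_state=0).fit(features)
          for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(features))
labels = best.predict(features)
print("components favoured by BIC:", best.n_components)  # 4 for these toy data
```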

For the distinction to have any meaning, the four classes of cells would have to have functional differences. To test that, the researchers decided to sort the cells out based on other criteria such as their "firing rate" (how often they spike), whether they tend to fire in bursts and how variable their intervals are between spikes - all factors in how they participate in and influence the circuits they are connected in. Indeed, the cell classes remained distinct by these measures.

In yet another phase of analysis the cell classes also remained distinguishable as the researchers watched them respond to the animals perceiving and processing visual stimulation. But in this case, they saw the cells play different roles in different regions. A class 1 cell, for example, might respond differently in one region than it did in another.

"These cell types are truly different cell types that have different properties," Miller said. "But they have different functions in different cortical areas because different cortical areas have different functions."

New research capability

In the study the authors speculate about which real neuron types their four mathematically defined classes most closely resemble, but they don't yet offer a definitive determination. Still, Miller said the finer-grained distinctions the study draws are enough to make him want to reanalyze old neural spiking data to see what new things he can learn.

One of Miller's main research interests is the nature of working memory - our ability to hold information like directions in mind while we make use of it. His research has revealed that it is enabled by a complex interplay of brain regions and precisely timed bursts of neural activity. Now he may be able to figure out how different classes of neurons play specific roles in specific regions to endow us with this handy mental ability.

And both Miller's and Siegel's labs are particularly interested in different brain rhythms, which are abundant in the brain and likely play a key role for orchestrating communication between neurons. The new results open a powerful new window for them to unravel which role different neuron classes play for these brain rhythms.

Credit: 
Picower Institute at MIT

Microbiology: Atacama Desert microbes may hold clues to life on Mars

image: Picture taken at one of the sites inspected in the hyperarid core of the Atacama Desert. Note the Petri plates with different growth media glinting in the sunlight at the right.

Image: 
Professor Carlos González-Silva

Microbial life on Mars may potentially be transported across the planet on dust particles carried by wind, according to a study conducted in the Atacama Desert in North Chile, a well-known Mars analogue. The findings are published in Scientific Reports.

Armando Azua-Bustos and colleagues investigated whether microbial life could move across the Atacama Desert on wind-driven dust particles. They sought to determine where these microorganisms originate, which may have implications for microbial life in extreme environments.

The authors collected 23 bacterial and eight fungal species from three sampling sites across two regions of the Atacama traversing its hyperarid core, which in addition to its extreme aridity is known for having highly saline/oxidizing soils and extremely high UV radiation. Only three of the species were shared among transects, suggesting that there are different airborne ecosystems in different parts of the desert. Bacterial and fungal species identified from the samples included Oceanobacillus oncorhynchi, a bacterium first described in aquatic environments, and Bacillus simplex, which originates from plants. These observations indicate that microbes may arrive at the hyperarid core from the Pacific Ocean and the Coastal Range of the desert.

The authors found that microbial cells collected in the morning tended to come from nearby areas, whereas in the afternoon, marine aerosols and microbial life on dust particles were carried by the wind from remote locations. This finding suggests that microbial life is able to move efficiently across the driest and most UV-irradiated desert on Earth. The authors speculate that potential microbial life on Mars might spread in a similar way.

Credit: 
Scientific Reports

Memory T cells shelter in bone marrow, boosting immunity in mice with restricted diets

WHAT:
Even when taking in fewer calories and nutrients, humans and other mammals usually remain protected against infectious diseases they have already encountered. This may be because memory T cells, which are located throughout the body and are required to maintain immune responses to infectious agents, take shelter in the bone marrow when food is scarce, according to scientists at the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health. Their study in mice, published online today in Cell, also found that animals undergoing dietary restriction were better protected against tumors and bacterial infections than animals with unrestricted diets.

Researchers led by Yasmine Belkaid, Ph.D., chief of the Metaorganism Immunity Section in NIAID's Division of Intramural Research, previously observed that fat tissue harbors memory T cells in mice. They investigated whether this phenomenon helped preserve immune memory when calorie intake was reduced. To find out, they restricted the diet of mice previously given full access to food. While receiving less food, the mice had fewer memory T cells in their lymphoid tissues, where these cells normally linger, and more of them in the bone marrow, which became enriched with fat tissue.

Investigators then evaluated how well memory T cells performed when mice ate less. While eating freely, mice were infected with the bacterium Yersinia pseudotuberculosis. After the mice developed immunological memory, researchers restricted the diets of some of the mice for up to four weeks before again exposing all the mice to Y. pseudotuberculosis. Mice with restricted diets had more robust memory T cell responses and were better protected from illness. The researchers repeated this experiment using a vaccine that trains immune cells to fight melanomas and found that memory T cells were more protective against tumors in mice receiving less food.

Though this phenomenon has yet to be studied in humans, the findings suggest how the immune system may have evolved to help mammals survive periods of limited food availability while keeping their immunity intact. These results in lab animals cannot be extrapolated to dietary advice for people. However, these insights may one day help clinicians improve immunotherapy for cancers and other diseases by optimizing nutrition.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Early life on Earth limited by enzyme

image: A 1946 watercolor of Anabaena cylindrica, a filamentous cyanobacterium, by G. E. Fogg, FRS.

Image: 
G. E. Fogg, FRS

The enzyme nitrogenase can be traced back to the universal common ancestor of all cells more than four billion years ago.

Found only in bacteria today, nitrogenase is nevertheless essential for the production of oxygen from water in photosynthesis, making it instrumental in how aquatic bacteria produced Earth's first molecular oxygen 2.5 billion years ago.

"For half of Earth's 4.6 billion year existence, the atmosphere contained only carbon dioxide and nitrogen, with no oxygen, but this changed when cyanobacteria, also known as blue-green algae, started producing the first oxygen using nitrogenase. This led to the Great Oxidation Event," explained study author Professor John Allen (UCL Genetics, Evolution & Environment).

"But instead of rising steadily, atmospheric oxygen levels stabilised at 2% by volume for about two billion years before increasing to today's level of 21%. The reasons for this have been long debated by scientists and we think we've finally found a simple yet robust answer."

A study, published today in Trends in Plant Sciences by researchers from UCL, Queen Mary University of London and Heinrich-Heine-Universität Düsseldorf, proposes for the first time that atmospheric oxygen produced using nitrogenase blocked the enzyme from working.

This negative feedback loop prevented further oxygen from being made and initiated a long period of stagnation in Earth's history about 2.4 billion years ago.

Lasting nearly two billion years, the Proterozoic Eon saw very little change in the evolution of life, ocean and atmosphere composition and climate, leading some to call it the 'boring billion'.

"There are many ideas about why atmospheric oxygen levels stabilised at 2% for such an incredibly long period of time, including oxygen reacting with metal ions, but remarkably, the key role of nitrogenase has been completely overlooked," said study co-author Professor William Martin (Heinrich-Heine-Universität Düsseldorf).

"Our theory is the only one that accounts for the global impact on the production of oxygen over such a sustained period of time and explains why it was able to rise to the levels we see today, fuelling the evolution of life on Earth."

The team says that the negative feedback loop ended only when plants conquered land about 600 million years ago.

When land plants emerged, their oxygen-producing cells in leaves were physically separated from the nitrogenase-containing cells in soil. This separation allowed oxygen to accumulate without inhibiting nitrogenase.

This theory is supported by evidence in the fossil record that shows cyanobacteria had begun to protect nitrogenase in dedicated cells called heterocysts about 408 million years ago, once oxygen levels were already increasing from photosynthesis in land plants.

"Nitrogenase is essential for life and the process of photosynthesis as it fixes nitrogen in the air into ammonia, which is used to make proteins and nucleic acids," said co-author Mrs Brenda Thake (Queen Mary University of London).

"We know from studying cyanobacteria in laboratory conditions that nitrogenase ceases to work at higher than 10% current atmospheric levels, which is 2% by volume, as the enzyme is rapidly destroyed by oxygen. Despite this being known by biologists, it hasn't been suggested as a driver behind one of Earth's great mysteries, until now."

Credit: 
University College London

Maximum mass of lightest neutrino revealed using astronomical big data

Neutrinos come in three flavours made up of a mix of three neutrino masses. While the differences between the masses are known, little information was available about the mass of the lightest species until now.

It's important to better understand neutrinos and the processes through which they obtain their mass as they could reveal secrets about astrophysics, including how the universe is held together, why it is expanding and what dark matter is made of.

First author, Dr Arthur Loureiro (UCL Physics & Astronomy), said: "A hundred billion neutrinos fly through your thumb from the Sun every second, even at night. These are very weakly interactive ghosts that we know little about. What we do know is that as they move, they can change between their three flavours, and this can only happen if at least two of their masses are non-zero."

"The three flavours can be compared to ice cream where you have one scoop containing strawberry, chocolate and vanilla. Three flavours are always present but in different ratios, and the changing ratio-and the weird behaviour of the particle-can only be explained by neutrinos having a mass."

The concept that neutrinos have mass is relatively new: the 1998 discovery earned Professor Takaaki Kajita and Professor Arthur B. McDonald the 2015 Nobel Prize in Physics. Even so, the Standard Model used by modern physics has yet to be updated to assign neutrinos a mass.

The study, published today in Physical Review Letters by researchers from UCL, Universidade Federal do Rio de Janeiro, Institut d'Astrophysique de Paris and Universidade de Sao Paulo, sets an upper limit for the mass of the lightest neutrino for the first time. The particle could technically have no mass as a lower limit is yet to be determined.

The team used an innovative approach to calculate the mass of neutrinos by using data collected by both cosmologists and particle physicists. This included using data from 1.1 million galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS) to measure the rate of expansion of the universe, and constraints from particle accelerator experiments.

"We used information from a variety of sources including space- and ground-based telescopes observing the first light of the Universe (the cosmic microwave background radiation), exploding stars, the largest 3D map of galaxies in the Universe, particle accelerators, nuclear reactors, and more," said Dr Loureiro.

"As neutrinos are abundant but tiny and elusive, we needed every piece of knowledge available to calculate their mass and our method could be applied to other big questions puzzling cosmologists and particle physicists alike."

The researchers used the information to prepare a framework in which to mathematically model the mass of neutrinos and used UCL's supercomputer, Grace, to calculate the maximum possible mass of the lightest neutrino to be 0.086 eV (95% CI), which is equivalent to 1.5 x 10^-37 kg. They calculated that the sum of the three neutrino masses has an upper bound of 0.26 eV (95% CI).
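As a quick sanity check of the unit conversion quoted above (not part of the study's analysis), the 0.086 eV bound can be translated into kilograms via E = mc^2:

```python
# Convert the 0.086 eV upper bound on the lightest neutrino mass to kilograms.
eV_to_joule = 1.602176634e-19   # exact, by definition of the electronvolt
c = 299_792_458.0               # speed of light in m/s

mass_kg = 0.086 * eV_to_joule / c**2
print(f"{mass_kg:.2e} kg")      # ~1.5e-37 kg, matching the figure quoted above
```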

Second author, PhD student Andrei Cuceu (UCL Physics & Astronomy), said: "We used more than half a million computing hours to process the data; this is equivalent to almost 60 years on a single processor. This project pushed the limits for big data analysis in cosmology."

The team say that understanding how neutrino mass can be estimated is important for future cosmological studies such as DESI and Euclid, which both involve teams from across UCL.

The Dark Energy Spectroscopic Instrument (DESI) will study the large scale structure of the universe and its dark energy and dark matter contents to a high precision. Euclid is a new space telescope being developed with the European Space Agency to map the geometry of the dark Universe and evolution of cosmic structures.

Professor Ofer Lahav (UCL Physics & Astronomy), co-author of the study and chair of the UK Consortiums of the Dark Energy Survey and DESI, said: "It is impressive that the clustering of galaxies on huge scales can tell us about the mass of the lightest neutrino, a result of fundamental importance to physics. This new study demonstrates that we are on the path to actually measuring the neutrino masses with the next generation of large spectroscopic galaxy surveys, such as DESI, Euclid and others."

Credit: 
University College London

Researchers reveal plant defense toolkit and insights for fighting crop diseases

image: Detlef Weigel, Director at the Max Planck Institute for Developmental Biology, at work with Arabidopsis specimens.

Image: 
Joerg Abendroth/Max Planck Institute for Developmental Biology

At an unprecedented scale, researchers have now catalogued the array of surveillance tools that plants use to detect disease-causing microbes across an entire species.

Representing a major advance for plant biology, the findings have important implications for the management of dangerous crop diseases, which pose significant threats to food security.

Like animals, plants rely on immune systems to help them respond to attack by pathogenic microbes such as bacteria and fungi. A crucial layer of the immune system is formed by proteins called Nucleotide-binding Leucine-rich Repeat receptors (NLRs), which work together in combinations to detect the ever-changing array of microbes in the environment.

Despite progress in understanding how these receptors work together, key questions remained: What is the full spectrum of NLRs produced by plants? How much variation in NLRs is there within a typical plant species? What range is needed for plants to protect themselves?

An international team supported by the 2Blades Foundation through a grant from the Gordon and Betty Moore Foundation set out to piece together the NLR repertoire of the plant equivalent of the lab rat, a mustard-family species called Arabidopsis. The effort was carried out in the laboratories of Detlef Weigel at the Max Planck Institute for Developmental Biology in Germany, Jeffery Dangl of the Howard Hughes Medical Institute and the University of North Carolina, and Jonathan Jones at The Sainsbury Laboratory in the United Kingdom. Researchers from the University of Bath, Colorado State University, and the Center for Research in Agricultural Genomics (Barcelona) also contributed to the study, which is published today in the journal Cell.

The team sampled Arabidopsis plants, which come under attack by pathogens of all stripes, from populations across Europe, North America, and Asia. Using state-of-the-art sequencing technologies, they analyzed the genetic diversity of the plants' NLRs. They found, as anticipated, that NLR repertoires in plants differed between locations, likely due to different evolutionary pressures from regional pathogens. But the team was surprised to find an upper limit on NLR gene numbers across the species.

As Weigel described the findings, "While the diversity is perhaps more limited than anticipated, it is the mixing of genes that makes each individual uniquely resistant to a different spectrum of pathogens."

The study will also change the way researchers approach the subject in future, says Jones.

"One of the remarkable conclusions that emerge from this study is that so many populations need to be sequenced to define the full immune system repertoire of any plant," he said. "Gone are the days when a single reference sequence is sufficient to reveal the secrets of a species; it is now clear that we need to understand the genetic diversity of a species in order to understand its immune system."

The research has major implications for our understanding of agriculture and plant evolution. "This work will help drive new NLR receptor function discovery for developing disease-resistant crops and also guide the analysis of their evolution across the plant kingdom," said Dangl.

Coordination and support for the study was provided by the 2Blades Foundation, an international organization dedicated to understanding plant disease and advancing durable crop disease resistance. Through its programs, 2Blades has already demonstrated in field trials that protection against serious unmanaged diseases can be expanded in crops by introducing resistance genes from related plant species. The fundamental knowledge of NLRs from the study will further inform strategies in its on-going programs, as well as the field of plant-microbe interactions at large. This study was funded in part by the Gordon and Betty Moore Foundation through Grant GBMF4725 to the 2Blades Foundation.

Credit: 
John Innes Centre

Yale researchers detect unreported Zika outbreak

New Haven, Conn. -- Researchers at the Yale School of Public Health (YSPH) have detected a large unreported Zika outbreak that occurred in Cuba during 2017, a year after Zika outbreaks peaked throughout the Americas.

Unreported outbreaks present a risk of silently spreading viruses to other parts of the world. They also highlight the need for alternative detection methods when there is a lack of access to reliable local case reporting, the researchers said.

The study, published online in the journal Cell, was a large collaborative effort with Scripps Research, Florida Gulf Coast University, the Florida Department of Health, and other institutions.

Towards the end of 2016, as the Zika epidemic in the Americas appeared to be waning, lead author Nathan Grubaugh, an assistant professor of epidemiology, and other researchers involved in the study became interested in whether hidden outbreaks were occurring.

"Accurate case detection is not always possible in many places for a variety of biological and socioeconomic reasons," said Grubaugh. "So, we constructed a framework using infections diagnosed from travelers, travel patterns, and virus genomics to detect outbreaks in the absence of local data." Virus genomics sequences the genetic code of a virus, which helped scientists determine how the viruses in Cuba were related to those appearing throughout the Americas.

Using travel-related Zika cases reported by the Florida Department of Health, the team discovered a spike in cases among people returning from Cuba in 2017. Based on that spike, they estimated that the outbreak in Cuba likely involved 1,000 to 20,000 Zika cases, even though only 187 cases were officially reported in 2016 and none in 2017.
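The release does not give the estimation formula, but the intuition of scaling traveler diagnoses up to local incidence can be sketched as follows. Every number and name in the snippet is an illustrative placeholder rather than data or code from the study, which combined traveler cases with travel patterns and virus genomics.

```python
# Toy sketch of scaling travel-associated cases to local incidence
# (illustrative placeholders throughout; not the authors' published model).

travel_cases = 70              # hypothetical infections diagnosed in returning travelers
travelers = 4_000_000          # hypothetical number of visitors during the outbreak period
avg_trip_days = 7              # hypothetical average length of stay
local_population = 11_200_000  # approximate population of Cuba
outbreak_days = 365            # hypothetical duration of local transmission

# Infection risk per person-day of exposure, inferred from returning travelers.
risk_per_person_day = travel_cases / (travelers * avg_trip_days)

# Scale that per-day risk to residents exposed over the whole outbreak.
estimated_local_cases = risk_per_person_day * local_population * outbreak_days
print(f"{estimated_local_cases:,.0f}")  # ~10,000 with these placeholder inputs
```

With these placeholder inputs the result falls inside the 1,000 to 20,000 range quoted above, though only by construction; the sketch ignores many complications a real analysis must address.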

"A big question was: Why did the outbreak in Cuba occur a year after others in the region?" said Chantal Vogels, a postdoctoral fellow at YSPH. "We investigated virus genomics, travel patterns, and models of mosquito transmission, and found that the outbreak in Cuba may have been delayed by an effective mosquito control campaign."

The Cuban government launched a large mosquito control campaign to prevent local Zika virus transmission as outbreaks peaked in the Americas. Researchers believe that while that initial effort was effective, local control eventually waned, leading to the delayed Zika outbreak in Cuba reported in the current study.

With Zika virus transmission still occurring in countries such as Angola, Thailand, and India, the analytical approach used by the research team to identify the Cuban outbreak may help uncover other unreported outbreaks around the world.

"While this study was about Zika, the analytical framework we developed can easily be applied to help identify hidden outbreaks of other diseases that are hard to detect under existing local surveillance systems," said Grubaugh.

Zika virus first made headlines in 2015 when it was detected in Salvador, Brazil, and was associated with severe birth defects, such as microcephaly. As the health impacts of Zika are still not fully understood, monitoring where outbreaks occur is paramount, the researchers noted.

Credit: 
Yale University