Culture

The Paleozoic diet: Why animals eat what they eat

image: Insects are a group in which feeding on plants increases rates of species proliferation, including among the butterflies and moths, which are almost all herbivorous.

Image: 
Daniel Stolte/UANews

What an animal eats is a fundamental aspect of its biology, but surprisingly, the evolution of diet had not been studied across the animal kingdom until now. Scientists at the University of Arizona report several unexpected findings from taking a deep dive into the evolutionary history of more than one million animal species and going back 800 million years, when the first animals appeared on our planet.

The study, published in the journal Evolution Letters, revealed several surprising key insights:

Many species living today that are carnivorous, meaning they eat other animals, can trace this diet back to a common ancestor more than 800 million years ago.

A plant-based, or herbivorous, diet is not the evolutionary driver for new species that it was believed to be.

Closely related animals tend to share the same dietary category - plant-eating, meat-eating, or both. This finding implies that switching between dietary lifestyles is not something that happens easily and often over the course of evolution.

Cristian Román-Palacios, Joshua Scholl and John Wiens, all with the Department of Ecology and Evolutionary Biology at the UA, scoured the literature for data on the dietary habits of more than a million animal species, from sponges to insects and spiders to housecats. A species was classified as carnivorous if it feeds on other animals, fungi or protists (single-celled eukaryotic organisms, many of which live on bacteria), as herbivorous if it depends on land plants, algae or cyanobacteria for food, and as omnivorous if it eats a mixture of carnivorous and herbivorous diets.
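The three-way classification rule described above can be sketched as a small function. The food-type labels and the function itself are illustrative only, not the authors' actual pipeline:

```python
# Hypothetical sketch of the diet-classification rules described in the
# article; category labels are illustrative, not the authors' code.

CARNIVOROUS_FOODS = {"animals", "fungi", "protists"}
HERBIVOROUS_FOODS = {"land plants", "algae", "cyanobacteria"}

def classify_diet(foods):
    """Classify a species by the set of food types it is recorded eating."""
    animal_side = bool(foods & CARNIVOROUS_FOODS)
    plant_side = bool(foods & HERBIVOROUS_FOODS)
    if animal_side and plant_side:
        return "omnivorous"
    if animal_side:
        return "carnivorous"
    if plant_side:
        return "herbivorous"
    return "unclassified"

print(classify_diet({"animals"}))                 # carnivorous
print(classify_diet({"land plants", "animals"}))  # omnivorous
```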

The scientists then mapped the vast dataset of animal species and their dietary preferences onto an evolutionary tree built from DNA-sequence data to untangle the evolutionary relationships between them.

"Ours is the largest study conducted so far that examines the evolution of diet across the whole animal tree of life," said doctoral student Román-Palacios, lead author of the paper. "We addressed three highly debated and fundamental questions in evolutionary biology by analyzing a large-scale dataset using state-of-the-art methods."

All species can be classified according to their evolutionary relationships, a concept that is known as phylogeny. Organisms are grouped into taxa, which define their interrelationships across several levels. For example, cats and dogs are different species but belong to the same order (carnivores). Horses and camels, meanwhile, belong to a different order (ungulates). Both orders, however, are part of the same class (mammals). On the highest level, animals are classified into phyla. Examples of animal phyla are arthropods (insects, crustaceans, spiders, scorpions and the like), mollusks (snails, clams and squid fall into this phylum), and chordates, which include all animals with a backbone, including humans.

The survey suggests that across animals, carnivory is most common, accounting for 63% of species. Another 32% are herbivorous, while humans belong to a small minority, just 3%, of omnivorous animals.

The researchers were surprised to find that many of today's carnivorous species trace this diet back all the way to the base of the animal evolutionary tree, more than 800 million years, predating the oldest known fossils that paleontologists have been able to assign to animal origins with certainty.

"We don't see that with herbivory," said Wiens, professor of ecology and evolutionary biology and corresponding author of the study. "Herbivory seems to be much more recent, so in our evolutionary tree, it appears more frequently closer to the tips of the tree."

So if the first animal was a carnivore, what did it prey on?

The authors suggest the answer might lie with protists, including choanoflagellates: tiny, single-celled organisms considered to be the closest living relatives of the animals. Living as plankton in marine and fresh water, choanoflagellates are vaguely reminiscent of miniature versions of the shuttlecock batted back and forth during a game of badminton. A funnel-shaped collar of "hairs" surrounds a whip-like appendage called a flagellum, whose rhythmic beating sucks a steady stream of water through the collar, filtering out bacteria and detritus that are then absorbed and digested. It is possible that the common ancestor of today's animals was a creature very similar to a choanoflagellate.

"The ancient creature that is most closely related to all animals living today might have eaten bacteria and other protists rather than plants," Wiens said.

Turning to a plant-based diet, on the other hand, happened much more frequently over the course of animal evolution.

Herbivory has traditionally been seen as a powerful catalyst for the origin of new species - an often-cited example is the insects, with an estimated 1.5 million described species the most diverse group among the arthropods. Many new species of flowering plants appeared during the Cretaceous period, about 130 million years ago, and the unprecedented diversity of flowers is widely thought to have coincided with an increase in insect species taking advantage of the newly available floral bounty.

"This tells us that what we see in insects doesn't necessarily apply to other groups within the animal kingdom," Wiens said. "Herbivory may go hand in hand with new species appearing in certain taxa, but it clearly is not a universal driver of new species."

The study also revealed that omnivorous ("eating everything") diets arose rarely over the course of 800 million years of animal evolution, suggesting that evolution favors specialists over generalists.

"You can be better at doing what you do if that is all you do," Wiens said. "In terrestrial vertebrates, for example, eating a diet of leaves often requires highly modified teeth and a highly modified gut. The same goes for carnivory. Nature generally seems to avoid the dilemma of being a jack-of-all-trades and master of none, at least for diets."

This need for specialization might explain why omnivores, such as humans, are rare, according to the authors. It might also explain why diets have often gone unchanged for so long.

"There is a big difference between eating leaves all the time and eating fruits every now and then," Wiens said. "The specializations required to be an efficient herbivore or carnivore might explain why the two diets have been so conserved over hundreds of millions of years."

Credit: 
University of Arizona

What's killing sea otters? Scientists pinpoint parasite strain

image: A southern sea otter swims off the coast of Moss Landing in California.

Image: 
Trina Wood/UC Davis

Many wild southern sea otters in California are infected with the parasite Toxoplasma gondii, yet the infection is fatal for only a fraction of sea otters, which has long puzzled the scientific community. A study from the University of California, Davis, identifies the parasite's specific strains that are killing southern sea otters, tracing them back to a bobcat and feral domestic cats from nearby watersheds.

The study, published this week in the journal Proceedings of the Royal Society B, marks the first time a genetic link has been clearly established between the Toxoplasma strains in felid hosts and parasites causing fatal disease in marine wildlife.

The study builds on years of work by a consortium of researchers led by the UC Davis School of Veterinary Medicine's Karen C. Drayer Wildlife Health Center and the California Department of Fish and Wildlife (CDFW). The scientists were called upon in the late 1990s to help decipher the mystery when Toxoplasma caused deaths in sea otters along the California coast.

"This is decades in the making," said corresponding author Karen Shapiro, an associate professor with the UC Davis School of Veterinary Medicine and its One Health Institute. "We now have a significant link between specific types of the parasite and the outcome for fatal toxoplasmosis in sea otters. We are actually able to link deaths in sea otters with wild and feral cats on land."

FROM LAND TO SEA

Wild and domestic cats are the only known hosts in which the parasite forms egg-like stages, called oocysts, which are shed in their feces. Shapiro led the initial effort to show how oocysts accumulate in kelp forests and are taken up by snails, which are eaten by sea otters.

For this study, the authors characterized Toxoplasma strains for more than 100 stranded southern sea otters examined by the CDFW between 1998 and 2015. CDFW Veterinary Pathologist Melissa Miller assessed the otters for Toxoplasma as a primary or contributing cause of death. The scientists compared pathology data with the parasite strains found in sea otters and nearby wild and domestic cats to identify connections between the disease-causing pathogen and its hosts.

The study's results highlight how infectious agents like Toxoplasma can spread from cat feces on land to the sea, leading to detrimental impacts on marine wildlife.

CLOSELY WATCHED

Southern sea otters are among the most intensely studied marine mammals in California because they are a threatened species and an iconic animal for the state. They live within just a few hundred meters of the coastline, allowing for close observation that enables a wealth of scientific data.

Previous research showed that up to 70 percent of stranded southern sea otters were infected with Toxoplasma, yet the infection becomes fatal for only a fraction of them. Decades of detailed investigations by CDFW and UC Davis have confirmed that infection by land-based protozoan parasites such as Toxoplasma and the related parasite Sarcocystis neurona are common causes of illness and death for southern sea otters.

Shapiro notes that Toxoplasma can also affect other wildlife species, but there is more robust data for the otters.

"Toxoplasma is one heavily studied pathogen that we care about, but there are many other viruses and bacteria that are on land and being flushed to the ocean that we probably aren't aware of yet," Shapiro said.

WHAT CAN BE DONE?

People can help reduce the spread of Toxoplasma by keeping their cats inside and disposing of cat feces in a bag in the trash - not outdoors or in the toilet, because wastewater treatment is not effective at killing oocysts.

Outdoor cats that feed on wild rodents and birds are likely to become infected with Toxoplasma because the parasite is commonly present in the tissues of these prey animals.

Oocysts shed in cat feces on land get washed into waterways with rainfall, and prior research identified freshwater outflow as a key source of Toxoplasma exposure for southern sea otters.

Wetlands, forests and grasslands naturally serve to shield watersheds and oceans from pollutants, including oocysts. Preserving and restoring wetlands and natural areas, managing stormwater runoff, and replacing pavement with permeable surfaces can reduce contamination and minimize pathogens entering the water.

Credit: 
University of California - Davis

Cracking a decades-old test, researchers bolster case for quantum mechanics

image: Researchers created entangled photon pairs and distributed the two photons of each pair to two measurement stations in opposite directions. At each measurement station, a telescope received the photons from the selected cosmic radiation source, which is at least 11 light-years from Earth. The cosmic photon detection signals generate random bits for measurement setting choices for the loophole-free Bell test. In this experiment, the researchers closed detection and locality loopholes, and pushed the time constraint to rule out local hidden variable models to 11 years before the experiment.

Image: 
Ming-Han Li, USTC, Shanghai

WASHINGTON -- In a new study, researchers demonstrate creative tactics to get rid of loopholes that have long confounded tests of quantum mechanics. With their innovative method, the researchers were able to demonstrate quantum interactions between two particles spaced more than 180 meters (590 feet) apart while eliminating the possibility that shared events during the past 11 years affected their interaction.

A paper explaining these results will be presented at the Frontiers in Optics + Laser Science (FIO + LS) conference, held 15-19 September in Washington, D.C., U.S.A.

Quantum phenomena are being explored for applications in computing, encryption, sensing and more, but researchers do not yet fully understand the physics behind them. The new work could help advance quantum applications by improving techniques for probing quantum mechanics.

A test for quantum theories

Physicists have long grappled with different ideas about the forces that govern our world. While theories of quantum mechanics have gradually overtaken classical mechanics, many aspects of quantum mechanics remain mysterious. In the 1960s, physicist John Bell proposed a way to test quantum mechanics known as Bell's inequality.

The idea is that two parties, nicknamed Alice and Bob, make measurements on particles that are located far apart but connected to each other via quantum entanglement.

If the world were indeed governed solely by quantum mechanics, these remote particles would be governed by a nonlocal correlation through quantum interactions, such that measuring the state of one particle affects the state of the other. However, some alternate theories suggest that the particles only appear to affect each other, but that in reality they are connected by other hidden variables following classical, rather than quantum, physics.

Researchers have conducted many experiments to test Bell's inequality. However, experiments can't always be perfect, and there are known loopholes that could cause misleading results. While most experiments have strongly supported the conclusion that quantum interactions exist, these loopholes still leave a remote possibility that researchers could be inadvertently affecting hidden variables, thus leaving room for doubt.
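The most widely tested form of Bell's inequality, the CHSH version, makes this concrete: any local hidden-variable theory bounds a particular combination of measurement correlations by 2, while quantum mechanics predicts values up to 2√2. A minimal sketch using the textbook spin-singlet correlation (the photon-polarization version used in experiments like this one differs only by a factor of two in the angles):

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin singlet measured
    # along directions at angles a and b (radians).
    return -math.cos(a - b)

# Measurement settings that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2          # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination: local hidden variables require |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.828, i.e. 2*sqrt(2), exceeding the classical bound of 2
```

Experiments like the one described here measure this combination directly; observing |S| > 2, with the loopholes closed, rules out the local hidden-variable alternatives.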

Closing loopholes

In the new study, Li and his colleagues demonstrate ways to close those loopholes and add to the evidence that quantum mechanics governs the interactions between the two particles.

"We realized a loophole-free Bell test with the measurement settings determined by remote cosmic photons. Thus we verified the completeness of quantum mechanics with high-confidence probability," said Ming-Han Li of the University of Science and Technology of China, who is lead author on the paper.

Their experimental setup includes three main components: a device that periodically sends out pairs of entangled photons and two stations that measure the photons. These stations are Alice and Bob, in the parlance of Bell's inequality. The first measurement station is 93 meters (305 feet) from the photon pair source and the second station is 90 meters (295 feet) away in the opposite direction.

The entangled photons travel through single mode optical fiber to the measurement stations, where their polarization state is measured with a Pockels cell and the photons are detected by superconducting nanowire single-photon detectors.

In designing their experiment, the researchers sought to overcome three key problems: the idea that loss and noise make detection unreliable (the detection loophole), the idea that any communication that affects Alice's and Bob's measurement choices makes the measurement cheatable (the locality loophole), and the idea that a measurement-setting choice that is not "truly free and random" makes the result able to be controlled by a hidden cause in the common past (the freedom-of-choice loophole).

To address the first problem, Li and his colleagues demonstrated that their setup achieved a sufficiently low level of loss and noise by comparing measurements made at the start and end of the photon's journey. To address the second, they built the experimental setup with space-like separation between the events of measurement setting choice. To address the third, they based their measurement-setting choices on cosmic photon behavior from 11 years earlier, which offers high confidence that nothing in the particles' shared past - for at least the past 11 years - created a hidden variable affecting the outcome.

Combining theoretically calculated predictions with experimental results, the researchers were able to demonstrate quantum interactions between the entangled photon pairs with a high degree of confidence and fidelity. Their experiment thus provides robust evidence that quantum effects, rather than hidden variables, are behind the particles' behavior.

Credit: 
Optica

WPI mathematician is helping NASA spacecraft travel faster and farther

image: Randy Paffenroth, associate professor of mathematical sciences at WPI, holds a carbon-based sheet known as Miralon®, which is used in a variety of spacecraft applications.

Image: 
Worcester Polytechnic Institute

Worcester, Mass. - Aug. 22, 2019 - By combining cutting-edge machine learning with 19th-century mathematics, a Worcester Polytechnic Institute (WPI) mathematician is working to make NASA spacecraft lighter and more damage tolerant by developing methods to detect imperfections in carbon nanomaterials used to make composite rocket fuel tanks and other spacecraft structures.

Randy Paffenroth, associate professor of mathematical sciences, computer science, and data science, has a multi-part mission in this research project. Using machine learning, neural networks, and an old mathematical equation, he has developed an algorithm that significantly enhances the resolution of the density scanning systems used to detect flaws in carbon nanotube materials. The higher-resolution scans provide more accurate images of the material's uniformity (a ninefold "super resolution" improvement), detecting imperfections in Miralon® materials--a strong, lightweight, flexible nanomaterial produced by Nanocomp Technologies, Inc.

Miralon® yarns, which can be as thin as a human hair, can be wrapped around structures like rocket fuel tanks, giving them the strength to withstand high pressures. Imperfections and variations in thickness can cause weak spots in the yarn and the resulting composite. Paffenroth, with a team of graduate students, is analyzing data from the manufacturing process to help ensure a more consistent end product.

Nanocomp uses a modified commercial "basis weight" scanning system that scans the nanomaterial for mass uniformity and imperfections, creating a visual image of density; Paffenroth and his team are using machine learning to train algorithms to increase the resolution of the images, allowing the machine to detect more minute variations in the material. They have developed a unique mathematical "compressed sensing / super resolution" algorithm that has increased the resolution by nine times.

Built with the Python programming language and based on an artificial neural network, the algorithm was "trained" on thousands of sets of nanomaterial images in which Paffenroth had already identified and located flaws. He essentially gave the algorithm a series of practice tests where he already knew the answers (known as "ground truth"). Then, he gave it other tests without the answers. "I give it a sheet of material. I know the imperfections going in but the algorithm doesn't. If it finds those imperfections, I can trust its accuracy," said Paffenroth.

To make the machine learning algorithm more effective at making a high-resolution image out of a low-resolution image, he combined it with the Fourier Transform, a mathematical tool devised in the early 1800s that can be used to break down an image into its individual components.
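As a toy illustration of that decomposition (not the team's actual algorithm), a Fourier transform separates a smooth density profile from a sharp, localized defect, which shows up as energy spread across the high-frequency components:

```python
import numpy as np

# Toy 1-D "scan line": a smooth density variation plus a narrow defect.
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
signal = np.sin(x)      # slow, smooth variation in material density
signal[128] += 1.0      # a sharp, localized imperfection

spectrum = np.fft.fft(signal)  # decompose into frequency components

# The smooth background lives almost entirely in the lowest frequency
# bin, while the sharp defect spreads energy across high frequencies.
low = np.abs(spectrum[1])               # fundamental of the background
high = np.abs(spectrum[64:192]).mean()  # typical high-frequency energy
print(low > high)  # True: the two components are cleanly separated
```

Working on the components separately is what makes the super-resolution problem easier for the neural network, in the "eyeglasses" sense Paffenroth describes below.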

"We take this fancy, cutting-edge neural network and add in 250-year-old mathematics and that helps the neural network work better," said Paffenroth. "The Fourier Transform makes creating a high-resolution image a much easier problem by breaking down the data that makes up the image. Think of the Fourier Transform as a set of eyeglasses for the neural network. It makes blurry things clear to the algorithm. We're taking computer vision and virtually putting glasses on it.

"It's exciting to use this combination of modern machine learning and classic math for this kind of work," he added.

Paffenroth's work is funded by an $87,353 grant WPI received from Nanocomp Technologies, a New Hampshire-based subsidiary of Huntsman Corporation that makes advanced carbon-nanotube materials for aerospace, defense, and the automotive industry. WPI is a sub-contractor to Nanocomp, which received an $8.1 million contract from NASA to advance its carbon nanotube sheets and yarns.

Miralon® has already been proven in space. For instance, it was wrapped around structural supports in NASA's Juno probe orbiting the planet Jupiter to help solve a challenging problem with vibration damping and static discharge. NASA has also used Miralon® nanomaterials to make and test prototypes of new carbon composite pressure vessels, the precursors to next generation rocket fuel tanks. NASA spacecraft will need that added strength and durability as they travel farther from home and deeper into space.

As part of its current NASA contract, Nanocomp is trying to make Miralon® yarns that are three times stronger, and the work by Paffenroth's team is a big part of making that happen.

"Randy is helping us achieve this goal of tripling our strength by improving the tools in our toolbox so that we can make stronger, better, next-generation materials to be used in space applications," said Bob Casoni, Quality Manager at Nanocomp. "If NASA needs to build a new rocket system strong enough to get to Mars and back, it has a big set of challenges to face. Better materials are needed to allow NASA to design rockets that can go farther, faster and survive longer."

Casoni noted that with the higher resolution from WPI's algorithm, Nanocomp can see patterns and variations in its materials that it couldn't see before.

"We can not only pick up features, but we also have a better idea of the magnitude of those features," he said. "Before, it was like seeing a blurry satellite image. You might think you're seeing the rolling hills of Pennsylvania, but with better resolution you see it's really Mount Washington or the Colorado Rockies. It's pretty amazing stuff."

And with better measurement tools, Nanocomp also will be able to improve its manufacturing process by testing whether changes in factors like temperature, tension control, pressure, and flow rates create better materials. "We can use better measurements to optimize our ultimate product performance," said Casoni. "Randy is helping us understand our manufacturing process better. He's doing his "magic math" to help us better understand variations in our product. The uniformity of that material plays a big part in its ultimate strength."

Paffenroth and his team will also develop algorithms to be used in active feedback control systems to predict how good a particular piece of material will be as it's first being made, helping to ensure a more consistent end product. The algorithm analyzes the properties measured at the beginning of the manufacturing run to effectively predict the properties at the end of the run, including mechanical properties and length of run.

"We can use machine learning to predict that Nanocomp won't get a useful length of material out of a particular production run," said Paffenroth. "It helps them with waste. If they can tell in the first few meters of the run that there will be a problem, they can stop and start over. The Holy Grail of process engineering is that the more you understand about your process, the better your process is."

WPI will present its findings on Aug. 25 at the 2019 International Conference on Image, Video Processing and Artificial Intelligence in Shanghai, China.

Credit: 
Worcester Polytechnic Institute

A single gene determines whether a fly has a good sense of sight or a good sense of smell

image: This figure shows heads from D. melanogaster (top) and D. pseudoobscura (bottom) with different proportions of eyes and antennae.

Image: 
A. Ramaekers and N. Grillenzoni

Trade-offs in the sizes of visual and olfactory organs are a common feature of animal evolution, but the underlying genetic and developmental mechanisms have not been clear. A study publishing August 22 in the journal Developmental Cell reveals that a single DNA variant that affects the timing of sensory organ development in fruit flies could explain the size trade-off between eyes and antennae, potentially providing a quick route to behavioral changes and adaptation.

Because the affected gene, eyeless/Pax6, is conserved across invertebrates and vertebrates, including humans, the discovery could represent a general mechanism for sensory organ size trade-offs across the animal kingdom.

The senses animals rely on have been shaped through evolution to better navigate and exploit the environment. As a result, even closely related species living in different ecological niches show variation in the sizes and shapes of their sensory structures. In arthropods such as fruit flies, trade-offs between the size of the eyes and of the antennae, where most olfactory organs are located, are pervasive.

"What we demonstrate is that there are consequences to subtle changes in the conserved mechanisms that govern how these sense organs develop," says senior study author Bassem Hassan of Institut du Cerveau et de la Moelle épinière (ICM). "What this means more broadly is that one cannot fully understand how genetic variation and morphological variation relate to each other without understanding the developmental processes that translate the former into the latter."

To examine the underlying mechanisms, Hassan and first author Ariane Ramaekers, also at ICM, combined comparative analyses of different fruit fly strains and species with developmental, molecular, and genome-editing approaches. Specifically, the authors focused on a structure called the eye-antennal imaginal disc (EAD), which consists of an eye field and a non-eye field and gives rise to all external head sensory organs during fruit fly development.

Hassan and Ramaekers found that the eye field is proportionally larger in Drosophila pseudoobscura (D. pse.) compared to Drosophila melanogaster (D. mel.), corresponding to a 35% increase in the number of ommatidia--small units that make up the insect compound eye. Similarly, the eye field is proportionally larger in the D. mel. strain called Canton-S compared with the D. mel. strain Hikone-AS, corresponding to a 12.5% increase in ommatidia number.

"These data suggest that, despite 17 to 30 million years of separated evolution between the two species groups, ommatidia number variation between D. mel. and D. pse. and between two D. mel. strains share a common developmental logic," Hassan says.

To search for genetic causes of eye size variation, the researchers next examined the DNA sequences that transcription factors bind to in order to regulate the expression of the neighboring eyeless/Pax6 gene. They found a single-nucleotide variant--a G>A substitution--at one such binding site that differentiates the small-eye subspecies from the large-eye subspecies. The site carrying the G allele in the small-eye subspecies is predicted to bind its transcription factor more strongly, resulting in greater repression of eyeless/Pax6 expression than with the A allele in the large-eye subspecies.

Additional analyses showed that this variant occurs in natural fruit fly populations, and the A allele corresponds to both more ommatidia and smaller antennal width among different laboratory strains. Using CRISPR/Cas9 to introduce the A allele into a G-homozygous stock, the researchers demonstrated that the G>A substitution causes an increase in the number of ommatidia.

"We were surprised by the simplicity of the mechanism of sensory trade-offs that we identified: To vary sensory organ size--in this case, eye versus antennae--it suffices to slightly vary the expression of a single gene," Ramaekers says. "It was particularly satisfactory to find that this gene, called Pax6, is the same one that builds the eye in all animals, including humans. We were also surprised that what matters to generate a trade-off is to change when, rather than where, Pax6 ends up being expressed. To make the eye bigger or smaller, it is sufficient to slightly speed up or slow down the subdivision of the head primordium into eye versus non-eye territories."

For the authors, the findings raise several intriguing questions. For example, it's not yet clear how the single-nucleotide variant changes the timing of Pax6 expression and sensory organ development. Moreover, it's possible that controlling the timing of the expression of key developmental genes could be a general rule for changing the size of a tissue or organ. Another interesting question is whether sensory brain regions are affected by changes in the relative sizes of sensory organs.

Credit: 
Cell Press

New method classifies brain cells based on electrical signals

image: MIT Professor Earl Miller

Image: 
The Picower Institute for Learning and Memory

For decades, neuroscientists have relied on a technique for reading out electrical "spikes" of brain activity in live, behaving subjects, one that tells them very little about the types of cells they are monitoring. In a new study, researchers at the University of Tuebingen and MIT's Picower Institute for Learning and Memory demonstrate a way to increase their insight by distinguishing four distinct classes of cells from that spiking information.

The advance offers brain researchers the chance to better understand how different kinds of neurons are contributing to behavior, perception and memory, and how they are malfunctioning in cases of psychiatric or neurological diseases. Much like mechanics can better understand and troubleshoot a machine by watching how each part works as it runs, neuroscientists, too, are better able to understand the brain when they can tease apart the roles different cells play while it thinks.

"We know from anatomical studies that there are multiple types of cells in the brain and if they are there, they must be there for a reason," said Earl Miller, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT, and co-senior author of the paper in Current Biology. "We can't truly understand the functional circuitry of the brain until we fully understand what different roles these different cell types might play."

Miller collaborated with the Tuebingen-based team of lead author Caterina Trainito, Constantin von Nicolai and Professor Markus Siegel, co-senior author and a former postdoc in Miller's lab, to develop the new way to wring more neuron type information from electrophysiology measurements. Those measures track the rapid voltage changes, or spikes, that neurons exhibit as they communicate in circuits, a phenomenon essential for brain function.

"Identifying different cell types will be key to understand both local and large-scale information processing in the brain," Siegel said.

Four is greater than two

At best, neuroscientists have so far been able to determine from electrophysiology only whether a neuron was excitatory or inhibitory. That's because they analyzed only the difference in the width of the spike. The typical amount of data in an electrophysiology study - spikes from a few hundred neurons - only supported that single degree of distinction, Miller said.

But the new study could go farther because it derives from a dataset of recordings from nearly 2,500 neurons. Miller and Siegel gathered the data years ago at MIT from three regions in the cortex of animals who were performing experimental tasks that integrated perception and decision making.

"We recognized the uncommonly rich resource at our disposal," Siegel said.

Thus, the team decided to put the dataset through the wringer of sophisticated statistical and computational tools to analyze the waveforms of the spikes. Their analysis showed that the waveforms could actually be sorted along two dimensions: how quickly the waveform ranges between its lowest and highest voltage ("trough to peak duration"), and how quickly the voltage changes again afterward, returning from the peak to the normal level ("repolarization time"). Plotting those two factors against each other neatly sorted the cells into four distinct "clusters." Not only were the clusters evident across the whole dataset, but individually within each of the three cortical regions, too.
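The two-feature sorting described above amounts to a standard clustering problem. The sketch below is a hypothetical illustration, not the authors' analysis code: a minimal k-means groups synthetic (trough-to-peak duration, repolarization time) pairs standing in for the recorded neurons, with all numbers and variable names invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic (duration, repolarization) feature pairs in ms: four
# well-separated groups standing in for the four waveform classes.
true_means = np.array([[0.2, 0.3], [0.2, 0.9], [0.7, 0.3], [0.7, 0.9]])
features = np.vstack(
    [m + 0.05 * rng.standard_normal((100, 2)) for m in true_means]
)

def kmeans(x, init, iters=50):
    """Minimal k-means: alternate nearest-center assignment and mean update."""
    centers = x[init].copy()
    for _ in range(iters):
        # Assign each waveform to its nearest cluster center.
        labels = ((x[:, None, :] - centers) ** 2).sum(-1).argmin(axis=1)
        # Move each center to the mean of the waveforms assigned to it.
        centers = np.array(
            [x[labels == j].mean(axis=0) for j in range(len(centers))]
        )
    return labels, centers

# Seed one center per synthetic group (indices 0, 100, 200, 300).
labels, centers = kmeans(features, init=[0, 100, 200, 300])
```

With features this well separated, the four recovered centers land close to the generating means, mirroring how plotting the two waveform measures against each other yields four distinct clusters.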

For the distinction to have any meaning, the four classes of cells would have to have functional differences. To test that, the researchers sorted the cells by other criteria, such as their "firing rate" (how often they spike), whether they tend to fire in bursts, and how variable the intervals between their spikes are - all factors in how they participate in and influence the circuits to which they are connected. Indeed, the cell classes remained distinct by these measures.

In yet another phase of analysis the cell classes also remained distinguishable as the researchers watched them respond to the animals perceiving and processing visual stimulation. But in this case, they saw the cells play different roles in different regions. A class 1 cell, for example, might respond differently in one region than it did in another.

"These cell types are truly different cell types that have different properties," Miller said. "But they have different functions in different cortical areas because different cortical areas have different functions."

New research capability

In the study the authors speculate about which real neuron types their four mathematically defined classes most closely resemble, but they don't yet offer a definitive determination. Still, Miller said the finer-grained distinctions the study draws are enough to make him want to reanalyze old neural spiking data to see what new things he can learn.

One of Miller's main research interests is the nature of working memory - our ability to hold information like directions in mind while we make use of it. His research has revealed that it is enabled by a complex interplay of brain regions and precisely timed bursts of neural activity. Now he may be able to figure out how different classes of neurons play specific roles in specific regions to endow us with this handy mental ability.

And both Miller's and Siegel's labs are particularly interested in brain rhythms, which are abundant in the brain and likely play a key role in orchestrating communication between neurons. The new results open a powerful new window onto the roles different neuron classes play in these brain rhythms.

Credit: 
Picower Institute at MIT

Microbiology: Atacama Desert microbes may hold clues to life on Mars

image: Picture taken at one of the sites inspected in the hyperarid core of the Atacama Desert. Note the Petri plates with different growth media glinting in the sunlight at right.

Image: 
Professor Carlos González-Silva

Microbial life on Mars may potentially be transported across the planet on dust particles carried by wind, according to a study conducted in the Atacama Desert in northern Chile, a well-known Mars analogue. The findings are published in Scientific Reports.

Armando Azua-Bustos and colleagues investigated whether microbial life could move across the Atacama Desert on wind-driven dust particles. They sought to determine where these microorganisms originate, which may have implications for microbial life in extreme environments.

The authors collected 23 bacterial and eight fungal species from three sampling sites across two regions of the Atacama traversing its hyperarid core, which in addition to its extreme aridity is known for having highly saline/oxidizing soils and extremely high UV radiation. Only three of the species were shared among transects, suggesting that there are different airborne ecosystems in different parts of the desert. Bacterial and fungal species identified from the samples included Oceanobacillus oncorhynchi, a bacterium first described in aquatic environments, and Bacillus simplex, which originates from plants. These observations indicate that microbes may arrive at the hyperarid core from the Pacific Ocean and the Coastal Range of the desert.

The authors found that microbial cells collected in the morning tended to come from nearby areas, whereas in the afternoon, marine aerosols and microbial life on dust particles were carried by the wind from remote locations. This finding suggests that microbial life is able to move efficiently across the driest and most UV-irradiated desert on Earth. Potential microbial life on Mars, the authors speculate, may spread in a similar way.

Credit: 
Scientific Reports

Memory T cells shelter in bone marrow, boosting immunity in mice with restricted diets

WHAT:
Even when taking in fewer calories and nutrients, humans and other mammals usually remain protected against infectious diseases they have already encountered. This may be because memory T cells, which are located throughout the body and are required to maintain immune responses to infectious agents, shelter in the bone marrow when food is scarce, according to scientists at the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health. Their study in mice, published online today in Cell, also found that animals undergoing dietary restriction were better protected against tumors and bacterial infections than animals with unrestricted diets.

Researchers led by Yasmine Belkaid, Ph.D., chief of the Metaorganism Immunity Section in NIAID's Division of Intramural Research, previously observed that fat tissue harbors memory T cells in mice. They investigated whether this phenomenon helped preserve immune memory when calorie intake was reduced. To do so, they restricted the diet of mice previously given full access to food. While receiving less food, the mice had fewer memory T cells in their lymphoid tissues, where the cells normally linger, and more in the bone marrow, which became enriched with fat tissue.

Investigators then evaluated how well memory T cells performed when mice ate less. While eating freely, mice were infected with the bacterium Yersinia pseudotuberculosis. After the mice developed immunological memory, researchers restricted the diets of some of the mice for up to four weeks before again exposing all the mice to Y. pseudotuberculosis. Mice with restricted diets had more robust memory T cell responses and were better protected from illness. The researchers repeated this experiment using a vaccine that trains immune cells to fight melanomas and found that memory T cells were more protective against tumors in mice receiving less food.

Though this phenomenon has yet to be studied in humans, the findings suggest how the immune system may have evolved to help mammals survive periods of limited food availability while keeping their immunity intact. These results in lab animals cannot be extrapolated to dietary advice for people. However, these insights may one day help clinicians improve immunotherapy for cancers and other diseases by optimizing nutrition.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Early life on Earth limited by enzyme

image: A 1946 watercolor of Anabaena cylindrica, a filamentous cyanobacterium, by G. E. Fogg, FRS.

Image: 
G. E. Fogg, FRS

The enzyme nitrogenase can be traced back to the universal common ancestor of all cells more than four billion years ago.

Found only in bacteria today, nitrogenase is nevertheless essential for the production of oxygen from water in photosynthesis, making it instrumental in how aquatic bacteria produced Earth's first molecular oxygen 2.5 billion years ago.

"For half of Earth's 4.6 billion year existence, the atmosphere contained only carbon dioxide and nitrogen, with no oxygen, but this changed when cyanobacteria, also known as blue-green algae, started producing the first oxygen using nitrogenase. This led to the Great Oxidation Event," explained study author Professor John Allen (UCL Genetics, Evolution & Environment).

"But instead of rising steadily, atmospheric oxygen levels stabilised at 2% by volume for about two billion years before increasing to today's level of 21%. The reasons for this have been long debated by scientists and we think we've finally found a simple yet robust answer."

A study, published today in Trends in Plant Science by researchers from UCL, Queen Mary University of London and Heinrich-Heine-Universität Düsseldorf, proposes for the first time that the atmospheric oxygen whose production depended on nitrogenase in turn blocked the enzyme from working.

This negative feedback loop prevented further oxygen from being made and initiated a long period of stagnation in Earth's history about 2.4 billion years ago.

Lasting nearly two billion years, the Proterozoic Eon saw very little change in the evolution of life, ocean and atmosphere composition and climate, leading some to call it the 'boring billion'.

"There are many ideas about why atmospheric oxygen levels stabilised at 2% for such an incredibly long period of time, including oxygen reacting with metal ions, but remarkably, the key role of nitrogenase has been completely overlooked," said study co-author Professor William Martin (Heinrich-Heine-Universität Düsseldorf).

"Our theory is the only one that accounts for the global impact on the production of oxygen over such a sustained period of time and explains why it was able to rise to the levels we see today, fuelling the evolution of life on Earth."

The team says that the negative feedback loop ended only when plants conquered land about 600 million years ago.

When land plants emerged, the oxygen-producing cells in their leaves were physically separated from nitrogenase-containing cells in the soil. This separation allowed oxygen to accumulate without inhibiting nitrogenase.

This theory is supported by evidence in the fossil record that shows cyanobacteria had begun to protect nitrogenase in dedicated cells called heterocysts about 408 million years ago, once oxygen levels were already increasing from photosynthesis in land plants.

"Nitrogenase is essential for life and the process of photosynthesis as it fixes nitrogen in the air into ammonia, which is used to make proteins and nucleic acids," said co-author Mrs Brenda Thake (Queen Mary University of London).

"We know from studying cyanobacteria in laboratory conditions that nitrogenase ceases to work at higher than 10% current atmospheric levels, which is 2% by volume, as the enzyme is rapidly destroyed by oxygen. Despite this being known by biologists, it hasn't been suggested as a driver behind one of Earth's great mysteries, until now."

Credit: 
University College London

Maximum mass of lightest neutrino revealed using astronomical big data

Neutrinos come in three flavours made up of a mix of three neutrino masses. While the differences between the masses are known, little information was available about the mass of the lightest species until now.

It's important to better understand neutrinos and the processes through which they obtain their mass as they could reveal secrets about astrophysics, including how the universe is held together, why it is expanding and what dark matter is made of.

First author, Dr Arthur Loureiro (UCL Physics & Astronomy), said: "A hundred billion neutrinos fly through your thumb from the Sun every second, even at night. These are very weakly interactive ghosts that we know little about. What we do know is that as they move, they can change between their three flavours, and this can only happen if at least two of their masses are non-zero."

"The three flavours can be compared to ice cream where you have one scoop containing strawberry, chocolate and vanilla. Three flavours are always present but in different ratios, and the changing ratio-and the weird behaviour of the particle-can only be explained by neutrinos having a mass."

The concept that neutrinos have mass is a relatively new one with the discovery in 1998 earning Professor Takaaki Kajita and Professor Arthur B. McDonald the 2015 Nobel Prize in Physics. Even so, the Standard Model used by modern physics has yet to be updated to assign neutrinos a mass.

The study, published today in Physical Review Letters by researchers from UCL, Universidade Federal do Rio de Janeiro, Institut d'Astrophysique de Paris and Universidade de Sao Paulo, sets an upper limit for the mass of the lightest neutrino for the first time. The particle could technically have no mass as a lower limit is yet to be determined.

The team used an innovative approach to calculate the mass of neutrinos by using data collected by both cosmologists and particle physicists. This included using data from 1.1 million galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS) to measure the rate of expansion of the universe, and constraints from particle accelerator experiments.

"We used information from a variety of sources including space- and ground-based telescopes observing the first light of the Universe (the cosmic microwave background radiation), exploding stars, the largest 3D map of galaxies in the Universe, particle accelerators, nuclear reactors, and more," said Dr Loureiro.

"As neutrinos are abundant but tiny and elusive, we needed every piece of knowledge available to calculate their mass and our method could be applied to other big questions puzzling cosmologists and particle physicists alike."

The researchers used the information to prepare a framework in which to mathematically model the mass of neutrinos and used UCL's supercomputer, Grace, to calculate the maximum possible mass of the lightest neutrino to be 0.086 eV (95% CI), which is equivalent to 1.5 x 10^-37 kg. They calculated that the three neutrino flavours together have an upper bound of 0.26 eV (95% CI).
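The unit conversion quoted above can be reproduced with standard constants via E = mc^2. This is a back-of-the-envelope check, not the authors' analysis pipeline:

```python
# Converting the reported 0.086 eV upper bound on the lightest neutrino
# mass into kilograms, using exact SI constants.
eV_to_J = 1.602176634e-19   # joules per electronvolt (exact, SI)
c = 2.99792458e8            # speed of light in m/s (exact)

m_eV = 0.086                # upper bound on the lightest neutrino mass, eV
m_kg = m_eV * eV_to_J / c**2   # E = m c^2  =>  m = E / c^2

# Matches the ~1.5 x 10^-37 kg figure quoted in the text.
print(f"{m_kg:.2e} kg")
```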

Second author, PhD student Andrei Cuceu (UCL Physics & Astronomy), said: "We used more than half a million computing hours to process the data; this is equivalent to almost 60 years on a single processor. This project pushed the limits for big data analysis in cosmology."

The team say that understanding how neutrino mass can be estimated is important for future cosmological studies such as DESI and Euclid, which both involve teams from across UCL.

The Dark Energy Spectroscopic Instrument (DESI) will study the large scale structure of the universe and its dark energy and dark matter contents to a high precision. Euclid is a new space telescope being developed with the European Space Agency to map the geometry of the dark Universe and evolution of cosmic structures.

Professor Ofer Lahav (UCL Physics & Astronomy), co-author of the study and chair of the UK consortia of the Dark Energy Survey and DESI, said: "It is impressive that the clustering of galaxies on huge scales can tell us about the mass of the lightest neutrino, a result of fundamental importance to physics. This new study demonstrates that we are on the path to actually measuring the neutrino masses with the next generation of large spectroscopic galaxy surveys, such as DESI, Euclid and others."

Credit: 
University College London

Researchers reveal plant defense toolkit and insights for fighting crop diseases

image: Detlef Weigel, Director at the Max Planck Institute for Developmental Biology, at work with Arabidopsis specimens.

Image: 
Joerg Abendroth/Max Planck Institute for Developmental Biology

At an unprecedented scale, researchers have now catalogued the array of surveillance tools that plants use to detect disease-causing microbes across an entire species.

Representing a major advance for plant biology, the findings have important implications for the management of dangerous crop diseases, which pose significant threats to food security.

Like animals, plants rely on immune systems to help them respond to attack by pathogenic microbes such as bacteria and fungi. A crucial layer of the immune system is formed by proteins called Nucleotide-binding Leucine-rich Repeat receptors (NLRs), which work together in combinations to detect the ever-changing array of microbes in the environment.

Despite progress in understanding how these receptors work together, key questions remained: What is the full spectrum of NLRs produced by plants? How much variation in NLRs is there within a typical plant species? What range is needed for plants to protect themselves?

An international team supported by the 2Blades Foundation through a grant from the Gordon and Betty Moore Foundation set out to piece together the NLR repertoire in the plant equivalent of the lab rat, a mustard-family species called Arabidopsis. The effort was carried out in the laboratories of Detlef Weigel, at the Max Planck Institute for Developmental Biology in Germany, Jeffery Dangl, of the Howard Hughes Medical Institute and the University of North Carolina, and Jonathan Jones at The Sainsbury Laboratory in the United Kingdom. Researchers from the University of Bath, Colorado State University, and the Center for Research in Agricultural Genomics (Barcelona) also contributed to the study, which is published today in the journal Cell.

The team sampled Arabidopsis plants, which come under attack by pathogens of all stripes, from populations across Europe, North America, and Asia. Using state-of-the-art sequencing technologies, they analyzed the genetic diversity of the plants' NLRs. They found, as anticipated, that NLR repertoires in plants differed between locations, likely due to different evolutionary pressures from regional pathogens. But the team was surprised to find an upper limit on NLR gene numbers across the species.

As Weigel described the findings, "While the diversity is perhaps more limited than anticipated, it is the mixing of genes that makes each individual uniquely resistant to a different spectrum of pathogens."

The study will also change the way researchers approach the subject in future, says Jones.

"One of the remarkable conclusions that emerge from this study is that so many populations need to be sequenced to define the full immune system repertoire of any plant," he said. "Gone are the days when a single reference sequence is sufficient to reveal the secrets of a species; it is now clear that we need to understand the genetic diversity of a species in order to understand its immune system."

The research has major implications for our understanding of agriculture and plant evolution. "This work will help drive new NLR receptor function discovery for developing disease-resistant crops and also guide the analysis of their evolution across the plant kingdom," said Dangl.

Coordination and support for the study was provided by the 2Blades Foundation, an international organization dedicated to understanding plant disease and advancing durable crop disease resistance. Through its programs, 2Blades has already demonstrated in field trials that protection against serious unmanaged diseases can be expanded in crops by introducing resistance genes from related plant species. The fundamental knowledge of NLRs from the study will further inform strategies in its on-going programs, as well as the field of plant-microbe interactions at large. This study was funded in part by the Gordon and Betty Moore Foundation through Grant GBMF4725 to the 2Blades Foundation.

Credit: 
John Innes Centre

Yale researchers detect unreported Zika outbreak

New Haven, Conn. -- Researchers at the Yale School of Public Health (YSPH) have detected a large unreported Zika outbreak that occurred in Cuba during 2017, a year after Zika outbreaks peaked throughout the Americas.

Unreported outbreaks present a risk of silently spreading viruses to other parts of the world. They also highlight the need for alternative detection methods when there is a lack of access to reliable local case reporting, the researchers said.

The study, published online in the journal Cell, was a large collaborative effort with Scripps Research, Florida Gulf Coast University, the Florida Department of Health, and other institutions.

Towards the end of 2016, as the Zika epidemic in the Americas appeared to be waning, lead author Nathan Grubaugh, an assistant professor of epidemiology, and other researchers involved in the study became interested in whether hidden outbreaks were occurring.

"Accurate case detection is not always possible in many places for a variety of biological and socioeconomic reasons," said Grubaugh. "So, we constructed a framework using infections diagnosed from travelers, travel patterns, and virus genomics to detect outbreaks in the absence of local data." Virus genomics sequences the genetic code of a virus, which helped scientists determine how the viruses in Cuba were related to those appearing throughout the Americas.

Using travel-related Zika cases reported by the Florida Department of Health, the team discovered a spike in cases from people returning from Cuba in 2017. They estimated that the outbreak in Cuba should have resulted in 1,000 to 20,000 Zika cases, even though only 187 cases were reported in 2016 and none in 2017.

"A big question was: Why did the outbreak in Cuba occur a year after others in the region?" said Chantal Vogels, a postdoctoral fellow at YSPH. "We investigated virus genomics, travel patterns, and models of mosquito transmission, and found that the outbreak in Cuba may have been delayed by an effective mosquito control campaign."

The Cuban government launched a large mosquito control campaign to prevent local Zika virus transmission as outbreaks peaked in the Americas. Researchers believe that while that initial effort was effective, local control eventually waned, leading to the delayed Zika outbreak in Cuba reported in the current study.

With Zika virus transmission still occurring in countries such as Angola, Thailand, and India, the analytical approach used by the research team to identify the Cuban outbreak may help uncover other unreported outbreaks around the world.

"While this study was about Zika, the analytical framework we developed can easily be applied to help identify hidden outbreaks of other diseases that are hard to detect under existing local surveillance systems," said Grubaugh.

Zika virus first made headlines in 2015 when it was detected in Salvador, Brazil and was associated with severe birth defects, such as microcephaly. As the health impacts of Zika are still not fully understood, monitoring where outbreaks are occurring is paramount, researchers noted.

Credit: 
Yale University

Here's how early humans evaded immunodeficiency viruses

image: The tetherin protein (green) on the surface of monkey, ape and human cells inhibits the release of SIV virions from the cell (left). SIV overcomes this restriction by expressing the protein Nef (yellow), which down-regulates tetherin by tying it to the protein AP-2 (purple), channeling it to be destroyed. This tight Nef binding is absent in humans as a result of a mutation in tetherin. The inability of SIV to destroy human tetherin was one of the major barriers to crossover of SIV to humans.

Image: 
UC Berkeley image by Cosmo Buffalo

For hundreds of thousands of years, monkeys and apes have been plagued by simian immunodeficiency virus (SIV), which still devastates primate groups in Africa.

Luckily, as humans evolved from these early primates, we picked up a mutation that made us immune from SIV -- at least until the early 20th century, when the virus evolved to get around our defenses, giving rise to human immunodeficiency virus (HIV) and an AIDS pandemic that today affects an estimated 38 million people worldwide.

University of California, Berkeley, researchers have now discovered how that long-ago human mutation interfered with SIV infection, a finding that could provide clues for the development of new therapies to thwart HIV and similar viral infections.

"The main importance for this paper is that it tells us what was one of the last major barriers before the crossover to humans happened," said James Hurley, a UC Berkeley professor of molecular and cell biology. "The current paper is an archeological look at how this happened."

The barrier was a mutation in human cells that blocked SIV from forcing these cells to shed thousands more copies of the virus. As a result, humans could not re-infect one another.

This genetic mutation interfered with the ability of an SIV protein to tightly bind two human proteins and send them for destruction within the cell, instead of fighting the virus. The researchers used cryo-electron microscopy, or cryoEM, to determine the structure of this protein complex and discovered that the mutation so effectively disrupted the protein binding sites that it took SIV a long time to find a work-around.

"The binding site involved is structurally very complex, so essentially it is not possible to adapt to it once the tight binding is lost. The virus had to invent a completely different way to do the same thing, which took a long time in evolution," Hurley said. "This conferred an advantage on our prehistoric ancestors: From chimps on down, every primate was susceptible to SIV, but humans were immune. That gave humans probably a grace period of tens to hundreds of thousands of years to develop without having to deal with this disease. I tend to think that really gave a leg up to humans in early evolution."

Though the SIV virus -- in this case, from a monkey called the sooty mangabey, the source of the less virulent HIV-2 strain in humans -- differs in several ways from the HIV strains that afflict humans, the findings could pinpoint targets for drugs as researchers look for "functional" cures for AIDS. These would be one-time treatments that prevent flare-ups of the disease, even if the virus remains in the body.

"The overall strategy in our lab is to try to find regions in the structures of human proteins that are attacked by viruses, but are not needed for normal purposes by the host, so that a drug can be designed to attack that region," Hurley said. "The virus will typically respond by mutating, which means it evolves drug resistance, but this new finding suggests that with the right point of attack, it could take SIV or HIV, in some cases, tens of thousands of years of evolution to catch up."

The work will be published in the Sept. 11 issue of the journal Cell Host & Microbe and was posted online Aug. 22.

Sooty mangabeys

SIV and HIV, which are lentiviruses, are hard to root out of the body because they insert their DNA into the genomes of host cells, where it sits like a ticking time bomb, ready at any moment to revive, take over the host cell's machinery to make copies of itself and send out thousands of these copies - called virions - to infect other cells.

These virions are formed when the newly copied viral DNA wraps itself in a piece of the host cell's membrane and buds off, safely ensconced in a bubble until it can reinfect.

Because budding is an important step in the spread of many viruses, primates long ago evolved natural defenses, including proteins on the surface of cells that staple the budding virions to the cell and prevent them from leaving. As they accumulate, the immune system recognizes these unbudded virions as abnormal and destroys the whole cell, virus and all.

In monkey, ape and human cells, the staple is called tetherin, because it tethers the budding virion to the cell membrane.

In the constant arms race between host and pathogen, SIV evolved a countermeasure that exploits another normal cell function: the recycling system. Cells routinely remove proteins from their surface, constantly taking up and recycling tetherin when there is no sign that it is needed to fight an invading virus. They do this by dimpling the membrane inward to form a little bubble inside the cell, capturing tetherin and other surface proteins in this vesicle and then digesting all the contents, tetherin included.

SIV's countermeasure was to produce a protein, called Nef, that revs up the recycling of tetherin, even during an infection. This enables virions to bud off and search for new victims.

Hurley and project scientist Xuefeng "Snow" Ren found that Nef forms a tight wedge between tetherin and a protein in the vesicle called AP-2, preventing tetherin from escaping the vesicle and dooming it to recycling.

"Nef is a bridge between AP2 and tetherin to recruit them into endocytosis, dragging the tetherin into the vesicle," Ren said. "So it tricks our own cells' machinery for getting rid of stuff we don't want into getting rid of stuff the virus doesn't want."

The five amino acids that humans lost in the tetherin protein -- the mutation that gave humans immunity against SIV -- loosened the binding between tetherin, Nef and AP-2, which allowed tetherin to escape recycling. This blocked the crossover of zoonotic virus transmission, Ren said, because the structural rearrangement was so extensive that SIV couldn't fix it by simple mutations in Nef.

SIV developed a new trick

Some variants of SIV did eventually find a way around this hurdle, however. At some point, a few SIVs acquired a second protein, Vpu, to do what Nef also did -- wedge itself between proteins to cement connections helpful to the virus. At some point, perhaps a hundred years ago, this strain of SIV moved into humans from chimpanzees, and a slight mutation in Vpu reignited the recycling of tetherin in humans, unleashing what we know today as group M HIV-1, the most virulent form of HIV worldwide.

"There were probably many crossovers into humans that failed, but eventually, some hunter in Africa, perhaps in the course of butchering a chimp, was exposed to the blood, and the virus then acquired an additional mutation, a small step that turned SIV into HIV," Hurley said.

Next up, Hurley, Ren and their colleagues plan to use cryoEM to determine the structure of the three-protein complex in gorilla variants of SIV, which evolved into the O strain of HIV-1, a less virulent strain that originated in the African country of Cameroon.

Credit: 
University of California - Berkeley

Climate change will alter waves along half the world's coast

New research finds that a warming planet will also alter ocean waves along more than 50% of the world's coastlines. This research, published in Nature Climate Change, has significant implications for coastal flooding and erosion.

As part of the Coordinated Ocean Wave Climate Project, ten research organisations, including the National Oceanography Centre (NOC), collaborated to compare a range of global wave models under a variety of future climate scenarios, to determine how waves might change in the future.

NOC scientist Dr Lucy Bricheno, who contributed to this study, said "The National Oceanography Centre is proud to be involved in this worldwide collaboration in wave science: this paper helps us to understand and quantify the future changes in global wind-wave climate".

While they identified some differences between the models, they found that if the 2°C Paris Agreement target is met, changes in wave patterns are likely to stay within the bounds of natural climate variability.

However, in a business-as-usual climate, where warming continues in line with current trends, the models agreed the planet is likely to see significant changes in wave conditions along 50% of the world's coasts, although these changes varied by region.

For example, if the climate warms by more than 2°C beyond pre-industrial levels, southern Australia is likely to see longer, more southerly waves that could alter the stability of the coastline. For the UK coast, the mean wave height is projected to decrease by about 10% by the end of the century under the most extreme global warming scenario.

Some areas will see the height of waves remain the same, but their wavelength or frequency will change. This can result in changes in the force exerted on the coast and any infrastructure there, and in some cases lead to increased wave-driven flooding.

Similarly, climate-change-induced alterations to the direction of waves can change how much sand they move along the coast. Infrastructure built on the coast, or offshore, is sensitive to these different characteristics of waves.

NOC scientist Professor Judith Wolf, also a co-author of the study, said "It is important to understand changes in the wave climate under climate change scenarios because waves are what cause damage to coastal defences and infrastructure, and erosion of natural coasts, beaches and ecosystems. They also contribute to increasing flood levels through wave setup, run-up and overtopping."

The overarching pattern emerging from this study is that robust changes in projected mean wave heights are seen in some areas, with increases in the Southern Ocean and the tropical eastern Pacific, but decreases in the North Atlantic Ocean and portions of the northern Pacific Ocean. These changes are consistent with a relatively uniform decrease in projected surface wind speeds over the northern hemisphere extra-tropical storm belt, partly driven by the polar amplification of climate change.
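The distinction drawn above, between projected changes that exceed natural variability and those that do not, can be illustrated with a simple signal-to-noise check. The sketch below is a hedged illustration only, not the study's actual method: the function name, the threshold `k`, and the numbers are all hypothetical.

```python
import statistics

def robust_change(model_changes, natural_variability_std, k=1.0):
    """Flag a projected change as robust if the multi-model mean change
    exceeds k standard deviations of natural climate variability.
    (Illustrative criterion only; the study's definition may differ.)"""
    mean_change = statistics.mean(model_changes)
    return abs(mean_change) > k * natural_variability_std

# Hypothetical projected changes in mean wave height (m) from four models
southern_ocean = [0.12, 0.15, 0.10, 0.14]   # models agree on an increase
north_atlantic = [0.01, -0.02, 0.00, 0.01]  # changes lost in the noise

print(robust_change(southern_ocean, natural_variability_std=0.05))  # True
print(robust_change(north_atlantic, natural_variability_std=0.05))  # False
```

Under this kind of criterion, a region like the Southern Ocean, where models agree on a sizeable change, registers as robust, while small or conflicting changes do not.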

Previous research has looked at the way waves have shaped our coasts in the past, which is used as a guide to understanding past sea levels. However, this research has often assumed that while sea levels might change, wave conditions have stayed the same, and the same assumption is often made when considering how climate change will influence future coastlines. Importantly, climate change can alter waves both through changing wind patterns and through sea-level rise changing the water depth at the coast.

Credit: 
National Oceanography Centre, UK

Researchers find genetic links to child obesity across diverse ethnic groups

image: Struan F.A. Grant, PhD, is a director of the Center for Spatial and Functional Genomics at Children's Hospital of Philadelphia

Image: 
Children's Hospital of Philadelphia

An international team of researchers who analyzed data across multiple ethnicities has produced the largest genetic study to date associated with common childhood obesity. The Early Growth Genetics (EGG) Consortium discovered a robust new signal, fine-mapped previously reported genetic variants, and added to evidence that genetic influences on obesity operate across the lifespan.

"As we continue to deepen our research into the genetics of obesity, this knowledge is bringing us closer to pinpointing specific causal genes and how they function in giving rise to obesity," said lead investigator Struan F.A. Grant, PhD, a director of the Center for Spatial and Functional Genomics (CSFG) at Children's Hospital of Philadelphia (CHOP). "That detailed knowledge will help guide researchers toward developing more effective treatments." Grant co-led the new study with Vincent W. V. Jaddoe, MD, PhD, of Erasmus University Medical Center in Rotterdam, Netherlands.

Obesity has increasingly received public attention as a major public health issue, in light of the condition's rising prevalence among children--now higher than 20 percent in the U.S. Obese adolescents tend to have a higher risk of mortality as adults. Although environmental factors, such as food choices and sedentary habits, contribute to rising rates of obesity in childhood, scientists have found strong evidence of genetic influences as well.

The current research appeared online July 5, 2019 in Human Molecular Genetics. It extends the work of a 2012 collaborative study, also led by Grant, that identified gene variants linked to common childhood obesity. That study, also from the EGG Consortium, was a meta-analysis focused on children of European ancestry only.

The new study included more diverse populations. The EGG Consortium scientists performed a meta-analysis of 30 genome-wide association studies comprising 13,000 cases and 15,600 controls, all from individuals of European, African, North and South American and East Asian ancestry. A replication study then covered a subset of samples of 1,888 cases and 4,689 controls from European and North/South American cohorts.

The scientists found a strong, novel variant associated with childhood obesity, closest to the METTL15 gene, and confirmed 18 variants previously linked to childhood obesity or body mass index (BMI). The team also used fine-mapping analyses to narrow down potential causal variants at four different locations to fewer than 10 specific single-base changes (single-nucleotide polymorphisms, or SNPs).

Among the study's co-investigators was Hakon Hakonarson, MD, PhD, director of the Center for Applied Genomics (CAG) at CHOP and a long-term collaborator of Grant. Hakonarson contributed a large quantity of diverse samples from the CAG to the study. Hakonarson said, "Obesity is becoming such an alarming health problem in children that we need to scale up translational efforts to develop innovative therapies."

"Follow-up functional studies," said Grant, "will be needed to connect potential causal variants with specific effector genes, to better understand the biology of obesity and offer insights into potential treatments." He added that the genetic roots of obesity and BMI are very similar in both children and adults, and that future treatments may reflect those commonalities.

Credit: 
Children's Hospital of Philadelphia