Culture

'Native advertising' builds credibility, not perceived as 'tricking' visitors

CATONSVILLE, MD, December 2, 2019 - The concept of "native advertising" has existed for as long as advertisements have been designed to resemble the editorial content in newspapers and magazines. As the Internet emerged and became a powerful force, native advertising evolved with it, leading some in recent years to worry that native advertising, which mimics non-advertising content, could deceive web site visitors.

This concern served as the foundation for new research in the INFORMS journal Marketing Science (Editor's note: The source of this research is INFORMS) which sought to determine more precisely how native advertising is perceived and received by web site users.

The study, to be published in the December edition of the INFORMS journal Marketing Science, is titled "Sponsorship Disclosure and Consumer Deception: Experimental Evidence from Native Advertising in Mobile Search." It is authored by Navdeep S. Sahni and Harikesh S. Nair of Stanford University.

"We found little evidence that native advertising 'tricks' Internet users into clicking on sponsored content and driving those users directly to the advertisers," said Nair. "Instead, we found that Internet users seem to view native ads as advertisements, and they use the content to deliberately evaluate those advertisers."

The researchers studied native advertising at a mobile restaurant-search platform. They analyzed various formats of paid-search advertising, and the extent to which those ads were disclosed to over 200,000 users.

According to industry standards and certain regulations as instituted by the U.S. Federal Trade Commission (FTC), organizations behind native advertising are required to clearly indicate when content is paid advertising, or "sponsored content."

"One of the interesting findings of the research is that while native advertising benefits advertisers, we see no evidence of consumers getting deceived," said Sahni. "More to the point, users who see a native advertisement continue with their product search; they're more likely to later click on the advertiser's organic listings and make a purchase. In effect, consumers often follow a process of conducting their own due diligence incorporating the information they receive through native advertising."

Credit: 
Institute for Operations Research and the Management Sciences

Study finds common cold virus can infect the placenta

Researchers have shown that a common cold virus can infect cells derived from human placentas, suggesting that it may be possible for the infection to pass from expectant mothers to their unborn children.

The study, published in the journal PLOS ONE, was led by Dr. Giovanni Piedimonte, professor of pediatrics and vice president of research at Tulane University.

"This is the first evidence that a common cold virus can infect the human placenta," Piedimonte said. "It supports our theory that when a woman develops a cold during pregnancy, the virus causing the maternal infection can spread to the fetus and cause a pulmonary infection even before birth."

During pregnancy, the placenta acts as a gatekeeper, providing essential nourishment from a mother to a developing fetus while filtering out potential pathogens. Scientists are discovering that the barrier isn't as impenetrable as once believed, with recent studies showing how viruses such as Zika can slip through its defenses.

Using donated placentas, researchers isolated the three major cell types found in placentas -- cytotrophoblast, stroma fibroblasts and Hofbauer cells -- and exposed them in vitro to the respiratory syncytial virus (RSV), which causes the common cold. While the cytotrophoblast cells supported limited viral replication, the other two types were significantly more susceptible to infection.

For example, the Hofbauer cells survived infection and allowed the virus to replicate inside them. As Hofbauer cells travel within the placenta, researchers suspect they could act as a Trojan horse and transmit the virus into the fetus.

"These cells don't die when they're infected by the virus, which is the problem," Piedimonte said. "When they move into the fetus, they are like bombs packed with virus. They don't disseminate the virus around by exploding, which is the typical way, but rather transfer the virus through intercellular channels."

Researchers suspect RSV could attack lung tissue within the fetus, causing an infection that may predispose offspring to developing asthma in childhood. Piedimonte plans to launch a clinical study at Tulane to further test the theory.

Credit: 
Tulane University

Smarter strategies

Though small and somewhat nondescript, quagga and zebra mussels pose a huge threat to local rivers, lakes and estuaries. Thanks to aggressive measures to prevent contamination, Santa Barbara County's waters have so far been clear of the invasive mollusks, but stewards of local waterways, reservoirs and water recreation areas remain vigilant to the possibility of infestation by these and other non-native organisms.

Now, UC Santa Barbara-based research scientist Carolynn Culver and colleagues at UCSB's Marine Science Institute are adding to this arsenal of prevention measures with a pair of studies that appear in a special edition of the North American Journal of Fisheries Management. They focus on taking an integrated approach to the management of aquatic invasive species as the state works to move beyond its current toxic, water quality-reducing methods.

"With integrated pest management you're looking for multiple ways to manipulate vulnerabilities of a pest, targeting different life stages with different methods in a combined way that can reduce the pest population with minimal harm to people and the environment," said Culver, an extension specialist with California Sea Grant who also holds an academic appointment at Scripps Institution of Oceanography. "Often there is concentrated effort on controlling one part of the life cycle, like removing adults--which are easier to see--without thinking about the larvae that are out there."

Could hungry fish fight invasive mussels?

In one study, Culver and her colleagues explored whether certain species of sunfish could be used as a biological control method to help manage invasive freshwater mussels in Southern California lakes.

The quagga mussel and closely related zebra mussel are two of the most devastating aquatic pests in the United States. The small freshwater mussels grow on hard surfaces such as water pipes, and can cause major problems for water infrastructure. They can also negatively impact ecosystems and fisheries by feeding on microscopic plants and animals that support the food web. First detected in North America in the 1980s, they appeared in California in 2007. Managing these mussels is estimated to have cost billions of dollars since their introduction into the U.S.

Culver has worked closely with lake and reservoir managers in California to help them prepare for and respond to mussel invasions. This research was needed, she said, because many of the control methods long used elsewhere were developed for water facilities and involve chemical applications or toxic coatings. Such methods can't readily be used in California bodies of water that serve as drinking water sources or are home to endangered species that the chemicals could harm -- which covers the majority of California locations with mussel infestations. In San Diego, for instance, rapid colonization of the reservoirs by these mussels caused docks and buoys to sink, but conventional, toxic methods of controlling them were a cause for concern.

"Commonly used mussel control methods are problematic for San Diego reservoirs since they are primary water supply reservoirs," said study co-author Dan Daft, a water production superintendent and biologist with the city of San Diego, who found that biocontrol methods were both effective and ecologically sound for sensitive water sources.

The study found that when one species of sunfish, bluegill, was penned up in an area where mussels occur, it could significantly reduce microscopic larvae and newly settled young mussels on surfaces within the pen, and on the pen itself. This method could be one key piece of an integrated pest management strategy, and provides a new, non-chemical method for targeting early life stages of the mussels, which are hard to detect.

"Essentially you can put these fish to work in specific areas where mussels occur," Culver said.

The researchers studied two species of sunfish resident in many infested Southern California lakes, bodies of water that are human-built and nearly all of which serve as water supplies. Although not native to California, the sunfish were stocked into these man-made reservoirs. According to the researchers, the methods could be applied to other predatory species in different places, but no other good candidates were available where they were working.

"It's important to point out that we don't support introducing non-native species," Culver said.

A better way to clean your boat

The other study assessed an integrated management framework that Culver and colleagues had developed to manage biofouling -- the growth of organisms such as algae, barnacles and other aquatic plants and animals that settle on hard surfaces such as piers, pilings and boat hulls -- while balancing both boat operations and ecosystem health. The paper describes how, when applied as part of an integrated framework, a combination of non-toxic methods can help maintain clean boats without the use of toxic paints and coatings that are increasingly regulated due to their environmental impacts.

"Controlling the growth of these organisms is critical for boat maintenance, because they create drag that slows vessels, reduces fuel efficiency, and makes boats harder to steer," said co-author Leigh Johnson, coastal advisor emerita with UC Cooperative Extension and former California Sea Grant Extension advisor. Johnson was instrumental in initiating the research and bringing attention to the need for a balanced biofouling control management approach. "However," she added, "the methods used to control fouling on boats can impact water quality and increase transport of invasive species so it is important to consider all of these issues when deciding how to maintain a clean hull."

The primary method of controlling biofouling around the world has long been toxic antifouling paints. But there are growing concerns about the impacts of currently used copper-based paints on water quality, and many countries and US states, including California and Washington, have set standards to reduce the copper levels and leaching rates of antifouling paints. These actions, however, increase the risks of moving biofouling invasive species from place to place, including vulnerable ecosystems, such as the islands off the coast of California.

In this study, researchers tested a variety of hull coatings, California-based hull cleaning practices, and conditions in various California harbors, to identify methods that could be used in combination to control biofouling.

They found that although copper-based paints were effective when first applied, they lost effectiveness fairly quickly, and that non-native species tended to accumulate first on the toxic coatings -- sometimes within just a few months. The team also showed that frequent, minimally abrasive, in-water hull cleaning was effective and did not cause an increase in fouling as reported for other hull cleaning practices. Their documentation of the time of year when different organisms were attaching to surfaces also helped to illustrate how adjusting the timing and frequency of hull cleaning could help increase its effectiveness.

Results from the study, along with other research findings, informed the development of an integrated pest management framework that boaters can adapt to different regions and specific needs.

"It's not a one-size-fits-all approach -- it's adaptive," Culver said. "Boaters can tailor it to local environments, regulations and boating patterns, and it can be applied in areas where toxic paints have been restricted, as well as where they continue to be used. It can help to keep boat hulls clean, while reducing impacts on water quality and transport of invasive species -- three issues that often are not considered together."

Culver and her colleagues have provided information to boat owners, resource managers, and regulators about applications of this integrated approach. There also has been interest, she said, in using the technique to inform biofouling management guidance and regulations in California and elsewhere.

Credit: 
University of California - Santa Barbara

Breathing? Thank volcanoes, tectonics and bacteria

image: This figure illustrates how inorganic carbon cycles through the mantle more quickly than organic carbon, which contains very little of the isotope carbon-13. Both inorganic and organic carbon are drawn into Earth's mantle at subduction zones (top left). Due to different chemical behaviors, inorganic carbon tends to return through eruptions at arc volcanoes above the subduction zone (center). Organic carbon follows a longer route, as it is drawn deep into the mantle (bottom) and returns through ocean island volcanoes (right). The differences in recycling times, in combination with increased volcanism, can explain isotopic carbon signatures from rocks that are associated with both the Great Oxidation Event, about 2.4 billion years ago, and the Lomagundi Event that followed.

Image: 
Image by J. Eguchi/University of California, Riverside

HOUSTON -- (Dec. 2, 2019) -- Earth's breathable atmosphere is key for life, and a new study suggests that the first burst of oxygen was added by a spate of volcanic eruptions brought about by tectonics.

The study by geoscientists at Rice University offers a new theory to help explain the appearance of significant concentrations of oxygen in Earth's atmosphere about 2.5 billion years ago, something scientists call the Great Oxidation Event (GOE). The research appears this week in Nature Geoscience.

"What makes this unique is that it's not just trying to explain the rise of oxygen," said study lead author James Eguchi, a NASA postdoctoral fellow at the University of California, Riverside who conducted the work for his Ph.D. dissertation at Rice. "It's also trying to explain some closely associated surface geochemistry, a change in the composition of carbon isotopes, that is observed in the carbonate rock record a relatively short time after the oxidation event. We're trying to explain each of those with a single mechanism that involves the deep Earth interior, tectonics and enhanced degassing of carbon dioxide from volcanoes."

Eguchi's co-authors are Rajdeep Dasgupta, an experimental and theoretical geochemist and professor in Rice's Department of Earth, Environmental and Planetary Sciences, and Johnny Seales, a Rice graduate student who helped with the model calculations that validated the new theory.

Scientists have long pointed to photosynthesis -- a process that produces waste oxygen -- as a likely source for increased oxygen during the GOE. Dasgupta said the new theory doesn't discount the role that the first photosynthetic organisms, cyanobacteria, played in the GOE.

"Most people think the rise of oxygen was linked to cyanobacteria, and they are not wrong," he said. "The emergence of photosynthetic organisms could release oxygen. But the most important question is whether the timing of that emergence lines up with the timing of the Great Oxidation Event. As it turns out, they do not."

Cyanobacteria were alive on Earth as much as 500 million years before the GOE. While a number of theories have been offered to explain why it might have taken that long for oxygen to show up in the atmosphere, Dasgupta said he's not aware of any that have simultaneously tried to explain a marked change in the ratio of carbon isotopes in carbonate minerals that began about 100 million years after the GOE. Geologists refer to this as the Lomagundi Event, and it lasted several hundred million years.

One in a hundred carbon atoms is the isotope carbon-13; the other 99 are carbon-12. This 1-to-99 ratio is well documented in carbonates that formed before and after Lomagundi, but those formed during the event have about 10% more carbon-13.
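As a back-of-the-envelope illustration (using only the figures stated above, not values from the paper), the enrichment amounts to a simple shift in the raw isotope ratio:

```python
# Baseline composition: about 1 carbon-13 atom for every 99 carbon-12 atoms.
baseline_ratio = 1 / 99  # 13C/12C before and after the Lomagundi Event

# Carbonates formed during the Lomagundi Event carry roughly 10% more
# carbon-13 relative to carbon-12 than that baseline.
lomagundi_ratio = baseline_ratio * 1.10

print(f"baseline  13C/12C ratio: {baseline_ratio:.5f}")
print(f"Lomagundi 13C/12C ratio: {lomagundi_ratio:.5f}")
```

Geochemists normally report such shifts in delta notation relative to a reference standard, but the raw ratio makes the roughly 10% enrichment easy to see.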

Eguchi said the explosion in cyanobacteria associated with the GOE has long been viewed as playing a role in Lomagundi.

"Cyanobacteria prefer to take carbon-12 relative to carbon-13," he said. "So when you start producing more organic carbon, or cyanobacteria, then the reservoir from which the carbonates are being produced is depleted in carbon-12."

Eguchi said people tried using this to explain Lomagundi, but timing was again a problem.

"When you actually look at the geologic record, the increase in the carbon-13-to-carbon-12 ratio actually occurs up to tens of millions of years after oxygen rose," he said. "So then it becomes difficult to explain these two events through a change in the ratio of organic carbon to carbonate."

The scenario Eguchi, Dasgupta and Seales arrived at to explain all of these factors is:

A dramatic increase in tectonic activity led to the formation of hundreds of volcanoes that spewed carbon dioxide into the atmosphere.

The climate warmed, increasing rainfall, which in turn increased "weathering," the chemical breakdown of rocky minerals on Earth's barren continents.

Weathering produced a mineral-rich runoff that poured into the oceans, supporting a boom in both cyanobacteria and carbonates.

The organic and inorganic carbon from these wound up on the seafloor and was eventually recycled back into Earth's mantle at subduction zones, where oceanic plates are dragged beneath continents.

When sediments remelted into the mantle, inorganic carbon, hosted in carbonates, tended to be released early, re-entering the atmosphere through arc volcanoes directly above subduction zones.

Organic carbon, which contained very little carbon-13, was drawn deep into the mantle and emerged hundreds of millions of years later as carbon dioxide from island hotspot volcanoes like Hawaii.

"It's kind of a big cyclic process," Eguchi said. "We do think the amount of cyanobacteria increased around 2.4 billion years ago. So that would drive our oxygen increase. But the increase of cyanobacteria is balanced by the increase of carbonates. So that carbon-12-to-carbon-13 ratio doesn't change until both the carbonates and organic carbon, from cyanobacteria, get subducted deep into the Earth. When they do, geochemistry comes into play, causing these two forms of carbon to reside in the mantle for different periods of time. Carbonates are much more easily released in magmas and are released back to the surface at a very short period. Lomagundi starts when the first carbon-13-enriched carbon from carbonates returns to the surface, and it ends when the carbon-12-enriched organic carbon returns much later, rebalancing the ratio."

Eguchi said the study emphasizes the importance of the role that deep Earth processes can play in the evolution of life at the surface.

"We're proposing that carbon dioxide emissions were very important to this proliferation of life," he said. "It's really trying to tie in how these deeper processes have affected surface life on our planet in the past."

Dasgupta is also the principal investigator on a NASA-funded effort called CLEVER Planets that is exploring how life-essential elements might come together on distant exoplanets. He said better understanding how Earth became habitable is important for studying habitability and its evolution on distant worlds.

"It looks like Earth's history is calling for tectonics to play a big role in habitability, but that doesn't necessarily mean that tectonics is absolutely necessary for oxygen build up," he said. "There might be other ways of building and sustaining oxygen, and exploring those is one of the things we're trying to do in CLEVER Planets."

Credit: 
Rice University

Unexpected viral behavior linked to type 1 diabetes in high-risk children

image: This is the enterovirus Coxsackievirus B3.

Image: 
Electron microscopy image of Coxsackievirus B3 courtesy of Varpu Marjomäki Laboratory, University of Jyväskylä, and Minna Hankaniemi, Tampere University, Finland

Tampa, FL (Dec. 2, 2019) -- New results from The Environmental Determinants of Diabetes in the Young (TEDDY) study show an association between prolonged enterovirus infection and the development of autoimmunity to the insulin-producing pancreatic beta-cells that precedes type 1 diabetes (T1D). Notably, researchers also found that early adenovirus C infection seemed to confer protection from autoimmunity. The full findings were published Dec. 2 in Nature Medicine.

Viruses have long been suspected to be involved in the development of T1D, an autoimmune condition, although past evidence has not been consistent enough to prove a connection. Investigators from the University of South Florida Health (USF Health) Morsani College of Medicine, Baylor College of Medicine, and other institutions studied samples available through the TEDDY study, the largest prospective observational cohort study of newborns with increased genetic risk for T1D, to address this knowledge gap. TEDDY studies young children in the U.S. (Colorado, Georgia/Florida, and Washington State) and in Europe (Finland, Germany, and Sweden).

"Years of research have shown that T1D is complex and heterogeneous, meaning that more than one pathway can lead to its onset," said lead author Kendra Vehik, PhD, MPH, an epidemiologist and professor with the USF Health Informatics Institute. "T1D is usually diagnosed in children, teens and young adults, but the autoimmunity that precedes it often begins very early in life."

"T1D occurs when the immune system destroys its own insulin-producing beta cells in the pancreas. Insulin is a hormone that regulates blood sugar in the body. Without it, the body cannot keep normal blood sugar levels, causing serious medical complications," said coauthor Richard Lloyd, PhD, professor of molecular virology and microbiology at Baylor College of Medicine.

In the current study, Vehik and her colleagues studied the virome, that is, all the viruses in the body. They analyzed thousands of stool samples collected from hundreds of children followed from birth in the TEDDY study, looking to identify a connection between the viruses and the development of autoimmunity against insulin-producing beta cells. The enterovirus Coxsackievirus has been implicated in T1D before, but the current results provide a completely new way to make the connection, by identifying specific viruses shed in the stool. The investigators were surprised to find that a prolonged infection of more than 30 days, rather than a short infection, was associated with autoimmunity.

"This is important because enteroviruses are a very common type of virus, sometimes causing fever, sore throat, rash or nausea. A lot of children get them, but not everybody that gets the virus will get T1D," Vehik said. "Only a small subset of children who get enterovirus will go on to develop beta cell autoimmunity. Those whose infection lasts a month or longer will be at higher risk."

A prolonged enterovirus infection might be an indicator that autoimmunity could develop.

Beta cells of the pancreas express a cell surface protein that helps them talk to neighboring cells. This protein has been adopted by the virus as a receptor molecule to allow the virus to attach to the cell surface. The investigators discovered that children who carry a particular genetic variant in this virus receptor have a higher risk of developing beta cell autoimmunity.

"This is the first time it has been shown that a variant in this virus receptor is tied to an increased risk for beta cell autoimmunity," Vehik said. Ultimately, this process leads to the onset of T1D, a life-threatening disease that requires life-long insulin injections to treat.

Another discovery was that the presence in early life of adenovirus C, a virus that can cause respiratory infections, was associated with a lower risk of developing autoimmunity. It remains to be investigated whether having adenovirus C in early life would protect from developing beta cell autoimmunity. Adenoviruses use the same beta cell surface receptor as Coxsackievirus B, which may offer one clue to explain this connection, although further research is needed to fully understand the details.

Other factors that affect autoimmunity and the development of T1D are still unknown, but the TEDDY study is working to identify them. The researchers seek to gain insights into the exposures that trigger T1D by studying samples taken before autoimmunity developed, starting when the TEDDY participants were 3 months old. Such findings could identify approaches to potentially prevent or delay the disease.

"Taking it all together, our study provides a new understanding of the roles different viruses can play in the development of beta cell autoimmunity linked to T1D, and suggests new avenues for intervention that could potentially prevent T1D in some children," Lloyd said.

Credit: 
University of South Florida (USF Health)

This 'fix' for economic theory changes everything from gambles to Ponzi schemes

Whether we decide to take out that insurance policy, buy Bitcoin, or switch jobs, many economic decisions boil down to a fundamental gamble about how to maximize our wealth over time. How we understand these decisions is the subject of a new perspective piece in Nature Physics that aims to correct a foundational mistake in economic theory.

According to author Ole Peters (London Mathematical Laboratory, Santa Fe Institute), people's real-world behavior often "deviates starkly" from what standard economic theory would recommend. Take the example of a simple coin toss: Most people would not gamble on a repeated coin toss where a heads would increase their net worth by 50%, but a tails would decrease it by 40%.

"Would you accept the gamble and risk losing at the toss of a coin 40% of your house, car and life savings?" Peters asks, echoing a similar objection raised by Nicolas Bernoulli in 1713.

But early economists would have taken that gamble, at least in theory. In classical economics, the way to approach a decision is to consider all possible outcomes, then average across them. So the coin toss game seems worth playing: an equal probability of a 50% gain and a 40% loss averages out to an expected gain of 5% per round.

Why people don't choose to play the game, seemingly ignoring the opportunity to gain a steady 5%, has been explained psychologically -- people, in the parlance of the field, are "risk averse". But according to Peters, these explanations don't really get to the root of the problem, which is that the classical "solution" lacks a fundamental understanding of the individual's unique trajectory over time.

Instead of averaging wealth across parallel possibilities, Peters advocates an approach that models how an individual's wealth evolves along a single path through time. In a disarmingly simple example, he randomly multiplies the player's total wealth by either 150% or 60% depending on the coin toss. That player lives with the gain or loss of each round, carrying it with them to the next turn. As the play time increases, Peters' model reveals an array of individual trajectories. They all follow unique paths. And in contrast to the classical conception, all paths eventually plummet downward. In other words, the approach reveals a fray of exponential losses where the classical conception would show a single exponential gain.
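The contrast between the two averages can be sketched in a few lines of code. This is an illustrative simulation, not taken from the paper: the 150%/60% multipliers follow the example above, while the player and round counts are arbitrary choices.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

PLAYERS = 2000  # parallel trajectories (the "ensemble")
ROUNDS = 1000   # coin tosses each player lives through

# Each toss multiplies wealth by 1.5 (heads) or 0.6 (tails).
# Ensemble view: the expected multiplier per round is
#   0.5 * 1.5 + 0.5 * 0.6 = 1.05, a 5% "average" gain.
# Time view: the typical per-round growth factor is
#   (1.5 * 0.6) ** 0.5 ≈ 0.949, about a 5% loss per round.
final_wealth = []
for _ in range(PLAYERS):
    w = 1.0
    for _ in range(ROUNDS):
        w *= 1.5 if random.random() < 0.5 else 0.6
    final_wealth.append(w)

losers = sum(1 for w in final_wealth if w < 1.0)
print(f"share of trajectories that lost money: {losers / PLAYERS:.4f}")
```

Despite the positive expected value, virtually every simulated trajectory ends below its starting wealth -- Peters' point that what matters is the single path an individual follows through time, not the average over parallel possibilities.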

Encouragingly, people seem to intuitively grasp the difference between these two dynamics in empirical tests. The perspective piece describes an experiment conducted by a group of neuroscientists led by Oliver Hulme, at the Danish Research Center for Magnetic Resonance. Participants played a gambling game with real money. On one day, the game was set up to maximize their wealth under classical, additive dynamics. On a separate day, the game was set up under multiplicative dynamics.

"The crucial measure was whether participants would change their willingness to take risks between the two days," explains the study's lead author David Meder. "Such a change would be incompatible with classical theories, while Peters' approach predicts exactly that."

The results were striking: When the game's dynamics changed, all of the subjects changed their willingness to take risks, and in doing so were able to approximate the optimal strategy for growing their individual wealth over time.

"The big news here is that we are much more adaptable than we thought we were," Peters says. "These aspects of our behavior we thought were neurologically imprinted are actually quite flexible."

"This theory is exciting because it offers an explanation for why particular risk-taking behaviors emerge, and how these behaviors should adapt to different circumstances. Based on this, we can derive novel predictions for what types of reward signals the brain should compute to optimize wealth over time," says Hulme.

Peters' distinction between averaging possibilities and tracing individual trajectories can also inform a long list of economic puzzles -- from the equity premium puzzle to measuring inequality to detecting Bernie Madoff's Ponzi scheme.

"It may sound obvious to say that what matters to one's wealth is how it evolves over time, not how it averages over many parallel states of the same individual," writes Andrea Taroni in a companion Editorial in Nature Physics. "Yet that is the conceptual mistake we continue to make in our economic models."

Credit: 
Santa Fe Institute

Whaling and climate change led to 100 years of feast or famine for Antarctic penguins

image: A Gentoo and Chinstrap penguin standing on guano-covered rocks at a breeding colony along the Antarctic Peninsula. Image courtesy of Rachael Herman.

Image: 
Rachael Herman, Louisiana State University, Stony Brook University

BATON ROUGE - New research reveals how penguins have dealt with more than a century of human impacts in Antarctica and why some species are winners or losers in this rapidly changing ecosystem.

Michael Polito, assistant professor in LSU's Department of Oceanography & Coastal Sciences, and his co-authors published their findings in the Proceedings of the National Academy of Sciences, available Monday, Dec. 2.

"Although remote, Antarctica has a long history of human impacts on its ecosystems and animals. By the early to mid-1900s, humans had hunted many of its seals and whales nearly to extinction. Seal and whale populations are now recovering, but decades of climate change and a growing commercial fishing industry have further degraded the environment," Polito said.

Polito co-led a team of researchers from Louisiana State University, University of Rhode Island, University of Oxford, University of California Santa Cruz, and the University of Saskatchewan with the goal of understanding how human interference in Antarctic ecosystems during the past century led to booms and busts in the availability of a key food source for penguins: Antarctic krill.

"Antarctic krill is a shrimp-like crustacean that is a key food source for penguins, seals, and whales. When seal and whale populations dwindled due to historic over-harvesting, it is thought to have led to a surplus of krill during the early to mid-1900s. In more recent times, the combined effects of commercial krill fishing, anthropogenic climate change, and the recovery of seal and whale populations are thought to have drastically decreased the abundance of krill," Polito said.

In this study, the team determined the diets of chinstrap and gentoo penguins by analyzing the nitrogen stable isotope values of amino acids in penguin feathers collected during explorations of the Antarctic Peninsula during the past century.

"We've all heard the adage, 'You are what you eat.' All living things record a chemical signal of the food they eat in their tissues. We used the stable isotope values of penguin feathers as a chemical signal of what penguins were eating over the last 100 years," said Kelton McMahon, co-lead author and assistant professor at the University of Rhode Island.

Because humans have never commercially harvested penguins, Polito and colleagues expected that changes in their diets and populations would mirror shifts in krill availability. The team focused their research on chinstrap and gentoo penguins because chinstrap penguins have had severe population declines and gentoo penguin populations have increased in the Antarctic Peninsula over the past half century.

"Given that gentoo penguins are commonly thought of as climate change winners and chinstrap penguins as climate change losers, we wanted to investigate how differences in their diets may have allowed one species to cope with a changing food supply while the other could not," said Tom Hart, co-author and researcher at the University of Oxford.

The team found that both penguin species primarily fed on krill during the krill surplus in the early to mid-1900s that was caused by seal and whale harvesting. In contrast, during the latter half of the past century, gentoo penguins increasingly showed an adaptive shift from strictly eating krill to including fish and squid in their diets, unlike the chinstrap penguins that continued to feed exclusively on krill.

"Our results indicate that historic harvesting and recent climate change have altered the Antarctic marine food web over the past century. Moreover, the differing diet and population responses we observed in penguins indicate that species such as chinstrap penguins, with specialized diets and a strong reliance on krill, will likely continue to do poorly as climate change and other human impacts intensify," Polito said.

The authors predict that the Antarctic Peninsula Region will remain a hotspot for climate change and human impacts during the next century, and they believe their research will be beneficial in predicting which species are likely to fare poorly and which will withstand--or even benefit from--future changes.

According to McMahon, "By understanding how past ecosystems respond to environmental change, we can improve our predictions of future responses and better manage human-environment interactions in Antarctica."

Credit: 
Louisiana State University

Scientists build a 'Hubble Space Telescope' to study multiple genome sequences

A new tool that simultaneously compares 1.4 million genetic sequences can classify how species are related to each other at far larger scales than previously possible. Described today in Nature Biotechnology by researchers from the Centre for Genomic Regulation in Barcelona, the technology can reconstruct how life has evolved over hundreds of millions of years and makes important inroads for the ambition to understand the code of life for every living species on Earth.

Protecting Earth's biodiversity is one of the most urgent global challenges of our times. To steward the planet for all life forms, humanity must understand the way animals, fungi, bacteria and other organisms have evolved and how they interact amongst millions of other species. Sequencing the genome of life on Earth can unlock previously unknown secrets that yield fresh insights into the evolution of life, while bringing new foods, drugs and materials that pinpoint strategies for saving species at risk of extinction.

The most common way scientists study these relationships is by using Multiple Sequence Alignments (MSA), a tool that can be used to describe the evolutionary relationships of living organisms by looking for similarities and differences in their biological sequences, finding matches among seemingly unrelated sequences and predicting how a change at a specific point in a gene or protein might affect its function. The technology underpins so much biological research that the original study describing it is one of the most cited papers in history.
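As a toy illustration of the principle behind MSA-based tree-building (not the CRG tool itself), the sketch below scores pairwise identity between pre-aligned sequences and turns the scores into the kind of distance matrix that tree-building methods such as neighbour joining consume. The sequences and species names are entirely made up:

```python
from itertools import combinations

def pairwise_identity(a: str, b: str) -> float:
    """Fraction of matching, non-gap positions between two pre-aligned sequences."""
    assert len(a) == len(b), "sequences must come from the same alignment"
    matches = sum(x == y and x != '-' for x, y in zip(a, b))
    return matches / len(a)

# Hypothetical aligned fragments: each row is one species' gene in the alignment.
alignment = {
    "human": "ATGCC-TGA",
    "mouse": "ATGCCATGA",
    "fly":   "ATGAC-TCA",
}

# Distance = 1 - identity; a matrix like this is the usual input for
# distance-based phylogeny methods (e.g. neighbour joining).
distances = {
    (s1, s2): round(1 - pairwise_identity(alignment[s1], alignment[s2]), 2)
    for s1, s2 in combinations(alignment, 2)
}
print(distances)
```

Real MSA software optimizes the alignment itself across all sequences at once, which is the computationally hard part that limits current tools to roughly 100,000 sequences.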

"We currently use multiple sequence alignments to understand the family tree of species evolution," says Cédric Notredame, a researcher at the Centre for Genomic Regulation in Barcelona and lead author of the study. "The bigger your MSA, the bigger the tree and the deeper we dig into the past and find how species appeared and separated from each other.

"What we've made lets us dig ten times deeper than what we've been able to do before, helping us to see hundreds of millions of years into the past. Our technology is essentially a time machine that tells us how ancient constraints influenced genes in a way that resulted in life as we know today, much like how the Hubble Space Telescope observes things that happened millions of years ago to help us understand the Universe we live in today."

Researchers can use MSA to understand how certain species of plants have evolved to be more resistant to climate change, or how particular genetic mutations make a species vulnerable to extinction. By studying a living organism's evolutionary history, scientists may come up with and test new ideas to stave off the collapse of entire ecosystems.

Technological advances have made sequencing cheaper than ever before, resulting in increasingly large datasets with more than a million sequences for scientists to analyse. Some ambitious endeavours, like the Earth BioGenome Project, may run to the tens of millions. Researchers have not been able to take full advantage of these enormous datasets because current MSAs cannot analyse more than 100,000 sequences with accuracy.

To evaluate the scale-up potential of MSA, the authors of the paper used Nextflow, a cloud-computing software developed in-house at the Centre for Genomic Regulation. "We spent hundreds of thousands of hours of computation to test our algorithm's effectiveness," says Evan Floden, a researcher at the CRG who also led on developing the tool. "My hope is that in combining high-throughput instrumentation readouts with high-throughput computation, science will usher in an era of vastly improved biological understanding, ultimately leading to better outcomes for consumers, patients and our planet as a whole."

"There is a vast amount of 'dark matter' in biology, code we have yet to identify in the unexplored parts of the genome that is untapped potential for new medicines and other benefits we can't fathom," concludes Cédric. "Even seemingly inconsequential organisms may play a pivotal role in furthering human health and that of our planet, such as the discovery of CRISPR in archaea. What we have built is a new way of finding the needles in the haystack of life's genomes."

Credit: 
Centre for Genomic Regulation

A new therapeutic target against diseases caused by lipid accumulation in cells

image: The study is led by Carles Enrich and Carles Rentero, lecturers at the unit of Cell Biology in the Department of Biomedicine of the Faculty of Medicine and Health Sciences at the UB and the CELLEX Biomedical Research Center (IDIBAPS-UB).

Image: 
UNIVERSITY OF BARCELONA

Researchers from the University of Barcelona (UB) and the August Pi i Sunyer Biomedical Research Institute (IDIBAPS) have found a new molecular mechanism involved in regulating the movement of cholesterol within cells, a process essential for proper cell function.

The study, published in the journal Cellular and Molecular Life Sciences, also identifies the protein Annexin A6 (AnxA6) as a key factor in this regulation and as a potential therapeutic target against diseases caused by the accumulation of cholesterol and other lipids in endosomes, such as Niemann-Pick disease type C1, a rare genetic disease with no cure that causes hepatic damage and a type of dementia.

The study is led by Carles Enrich and Carles Rentero, lecturers at the unit of Cell Biology in the Department of Biomedicine of the Faculty of Medicine and Health Sciences at the UB and the CELLEX Biomedical Research Center (IDIBAPS-UB). This is the result of six years of research and a collaboration with Thomas Grewal, from the University of Sydney; Elina Ikonen, from the University of Helsinki, and the research group on Lipids and Cardiovascular Pathology of the Biomedical Research Institute at Hospital Sant Pau.

Study with the CRISPR/Cas9 editing technology

Cholesterol is essential to the organization of membranes and also modulates vesicular trafficking, both basic mechanisms of cell function. To coordinate and regulate this balance, or cholesterol homeostasis, cells have developed a molecular machinery that is not yet fully understood. "Understanding these mechanisms is very important for treating diseases in which cholesterol and other lipids accumulate and cause serious physiological alterations in the liver, spleen and especially the nervous system", note Carles Enrich and Carles Rentero.

One such disease is Niemann-Pick type C1, caused by a mutation in the NPC1 gene that leads to the accumulation of cholesterol inside the cell's endosomes. To study this mechanism, the researchers used the CRISPR/Cas9 genetic editing technique to block a molecule, the AnxA6 protein, in cells with the phenotype of the disease. Blocking AnxA6 released the endosomal cholesterol, revealing the essential role of this protein in regulating cholesterol transfer.

Increasing membrane contact sites

The results of the study also show that this release occurred thanks to a significant increase in membrane contact sites (MCS), nanometric structures that can be observed by electron microscopy. According to the authors, these membrane contact sites are scarce in the cells of affected patients; silencing AnxA6 therefore induces the formation of MCS, counteracts the effect of the NPC1 gene mutation and redirects cholesterol towards other cell compartments, restoring normal cell function.

"These results could help treat the clinical impact of cholesterol accumulation in Niemann-Pick disease and in about a dozen other diseases, among them several types of cancer (pancreatic, prostate, breast), in which lipid metabolism plays a fundamental role", note the researchers.

A new paradigm in the study of the cholesterol cell transport

The involvement of membrane contact sites in cholesterol transport is a pioneering result in this field, since until now researchers thought that lipid transport was carried out by vesicles and a type of specialized protein. "We still know little about the functioning and dynamics of membrane contact sites, but this study, in line with other recent work, shows that MCS are a new paradigm for understanding the regulation, transport and homeostasis of lipids, cholesterol and calcium", conclude the researchers.

Credit: 
University of Barcelona

Monkeys inform group members about threats -- following principles of cooperation

image: Male mangabey monkey looking warily in the direction of the snake.

Image: 
A. Mielke/ MPI f. Evolutionary Anthropology

Cooperation - working together or exchanging services for the benefit of everyone involved - is a vital part of human life and contributes to our success as a species. Often, rather than helping specific others, we work for the good of the community, because this helps our friends and family who are part of the group, or because we share in the benefits with everyone else. However, even though the whole group can benefit when people work together, not everyone might be willing to contribute equally. One way for humans to cooperate is by exchanging information: from gossiping and storytelling to teaching and news reporting, we rely on some individuals possessing knowledge and sharing this knowledge for the greater good.

Like humans, many non-human primates live in close-knit social groups where individuals cooperate to their mutual benefit. As in our own species, information can be an important commodity for them: primates use a variety of calls to let each other know where they are going and whether they have found food. One of the most important messages to transmit is the presence of a threat: whenever a leopard or eagle has been spotted, calling for backup can help confuse predators or fight them off. While these calls help others in the group, they also clearly benefit the caller. This is different when the threat remains in one spot: many snakes, especially vipers, do not seem to actively hunt monkeys, but if stepped on, they can still bite and kill. However, once a monkey knows where the snake is, she herself is usually safe; giving a loud call to tell everyone about the snake costs time, and potentially exposes her to other dangers. Hence not all monkeys in a group will call out. So, why do some individuals call when they detect a threat that is not dangerous to themselves anymore?

Researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, made use of the reaction of sooty mangabeys to snakes to understand how this monkey species cooperates by sharing information. Mangabeys live in the tropical Taï National Park, Côte d'Ivoire, in large groups (often consisting of more than 100 members), where individuals often cannot see everything that is happening to other group members, including their kin and friends. While mangabeys often find venomous snakes in the forest, it is hard to predict and film these encounters, so the researchers created realistic snake models out of mesh and papier-mâché and hid them where the mangabeys would pass. By filming the reaction of all the monkeys that would see the snake, the team could identify who called, when, and how often. This way, they hoped to understand whether monkeys called simply because they were scared of the snake, to show off how fearless they are, to warn their friends and family, or whether they share information widely across the group when it was likely that others who followed them did not know about the threat.

Over the course of a year, the researchers worked with a wild mangabey community within the context of the Taï Chimpanzee Project, conducting two to four experiments per month. Using snake models and up to five different camera angles, they were able to get detailed recordings of the behavior of each mangabey that came close enough to see the snake. All group members are used to human observers and cameras, and their family ties, friendships, and dominance relations are known, enabling the researchers to analyze in detail whether the presence and arrival of specific group members would prompt alarm calls. Typically, after monkeys find a snake, they will stick around and observe it for a time, while other individuals follow them and also inspect the snake. Between the first and last individual to encounter the threat, there is a line of monkeys that could call; however, not all of them do, and the question is whether the ones that call differ from the ones that do not.

"What surprised us, in both natural snake encounters and in our experiments, was how different individuals reacted to the threat: most individuals did not show strong reactions unless they almost stepped on the snake, and they would usually just call once or twice and then move on. On the other hand, we had a small number of individuals who called almost every time they saw a snake", says Alex Mielke, lead author of the study. "When we look at all experiments, though, a clear pattern emerges: mangabeys did not call specifically for their kin or friends, or ignorant group members. Individuals called when few others were around the snake or nobody had called in a while, effectively broadcasting the location of the snake to the general public when there is a chance that the information gets lost. This creates a system where no individual has to invest too much - one or two calls are enough before moving on - but because the threat is regularly re-advertised, the danger for following individuals is removed."

These results showcase how the social system an animal lives in changes how information needs to be transmitted: In mangabeys, groups are large but all group members travel together, so an individual in the front of the group who spots a snake and calls out can probably assume that their sister in the back of the group will also get the message. The information gets re-iterated by those between them. In chimpanzees, where the community splits into subgroups while moving around, the same research group in Leipzig could show that individuals wait close to the snake and inform arriving group members who are potentially ignorant of the danger.

The behavior of the mangabeys also illuminates how cooperation could have evolved in a group-setting. Even though the monkeys do not individually warn their kin and allies, if they find a snake and pass this information on to the rest of the group they can rely on others to contribute the same way, so that all group members (including family) are informed and hence less likely to get injured or killed by the snake. Similar mechanisms can be important in other situations where primates work together to achieve a goal, when defending their territory or acquiring food together. This study thus further illuminates the extent to which cooperation shapes social behavior in our primate cousins.

Credit: 
Max Planck Institute for Evolutionary Anthropology

New clues about the origins of familial forms of Amyotrophic lateral sclerosis

image: These are human cells producing aggregates of the protein Sod1 (in green).

Image: 
Aline Brasil, Elis Eleutherio, and Tiago Outeiro

A team led by Brazilian researcher Elis Eleutherio, professor at the Federal University of Rio de Janeiro, in partnership with Tiago Outeiro, at University of Goettingen, Germany, made important progress in understanding the conformation and accumulation of certain proteins involved in lateral amyotrophic sclerosis (ALS).

"We believe protein accumulation is an important hallmark of ALS, and we still do not understand why the protein misbehaves and aggregates during the disease", explains Prof. Elis Eleutherio.

Amyotrophic lateral sclerosis (ALS) is a progressive and devastating neurodegenerative disorder affecting 1 to 3 individuals in 100,000, and is more prevalent in people between 55 and 75 years of age. The disease primarily affects a population of neurons known as 'motor neurons'. Patients suffer from irreversible motor paralysis and become incapable of speaking, swallowing, or breathing as the disease progresses.

Most ALS cases are sporadic, with no defined genetic origin; only a minority are familial, with known associated genetic alterations. Certain familial forms of ALS (fALS) are associated with alterations in the gene encoding a protein known as superoxide dismutase 1 (Sod1), which change the folding and function of the protein.

The study, published in the journal Proceedings of the National Academy of Sciences (PNAS), allowed scientists to understand the interaction between the normal and the mutant protein, which not only alters protein accumulation in the cell but also impairs the function of the Sod1 protein, thus contributing to the development of the disease. For the group, this discovery opens new perspectives for the treatment of ALS.

Sod1 is a protein that plays a role, among others, in protecting our cells against oxidative damage. In some ALS cases, altered Sod1 protein accumulates inside neuronal cells and, researchers believe, damages the neurons, leading to their death. Importantly, normal Sod1 protein, present in sporadic cases of ALS, can also misfold and accumulate, suggesting this is a central problem in ALS.

In the study, the researchers used simple experimental models, such as baker's yeast, used to make beer, wine and bread, and human cells, to better understand the context of protein aggregation in the disease. They also used a strategy that mimics the genetic context of fALS, in which most patients carry one normal copy of the Sod1 gene and one copy carrying a genetic alteration. "In patients, we think that the presence of a mutant copy of Sod1 alters the behavior of the normal copy", explains Dr. Aline Brasil, the first author of the study.

"By taking advantage of novel genetic manipulation tools, and powerful molecular imaging approaches, that enable the direct visualization of protein complexes in the cell (a technique known as BiFC), we were able to detect 'hetero-complexes' formed by normal and abnormal (mutant) Sod1 protein", said Prof. Tiago Outeiro, leader of the German team that participated in the study.

The research opens novel perspectives for therapeutic intervention, which the authors hope to continue to explore in the near future, such as the specific removal of the mutant Sod1 protein.

"In a time when the scientific and education system in Brazil suffers from uncertainty, it is important to demonstrate that we can be competitive and conduct research that will contribute to society and may, ultimately, help change the lives of those affected by such devastating diseases", Prof. Elis Eleutherio concludes.

Credit: 
Instituto Nacional de Ciência e Tecnologia de Biologia Estrutural e Bioimagem (INBEB)

Supermarkets and child nutrition in Africa

image: Malnourished: Malnutrition is a widespread problem in Africa.

Image: 
E M Meemken

Hunger and undernutrition are still widespread problems in Africa. At the same time, overweight, obesity, and related chronic diseases are also on the rise. Recent research suggested that the growth of supermarkets contributes to obesity in Africa, because supermarkets tend to sell more processed foods than traditional markets. However, previous studies only looked at data from adults. New research shows that supermarkets are not linked to obesity in children, but that they instead contribute to reducing child undernutrition. The results were recently published in the journal Global Food Security.

For the research, agricultural and food economists from the University of Göttingen in Germany collected data from over 500 randomly selected children in Kenya over a period of three years. The data show that children from households with access to a supermarket are significantly better nourished than children in the reference group. Purchase of food in a supermarket has particularly positive effects on child growth and height, even after controlling for age, income, and other relevant factors. The most widely used indicator of chronic child undernutrition is child "stunting" which means impaired growth and development as shown by low height for their age.
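Stunting is conventionally assessed with a height-for-age z-score: the child's height compared against a reference median and standard deviation for their age, with a score below -2 counting as stunted. A minimal sketch of that arithmetic follows; the reference values here are illustrative placeholders, not actual WHO growth-standard tables:

```python
def height_for_age_z(height_cm: float, ref_median_cm: float, ref_sd_cm: float) -> float:
    """Height-for-age z-score: how many SDs a child's height is from the reference median."""
    return (height_cm - ref_median_cm) / ref_sd_cm

def is_stunted(z: float) -> bool:
    # Conventional cut-off: stunting is a height-for-age z-score below -2.
    return z < -2

# Hypothetical reference values for a 3-year-old (illustrative, not WHO tables).
z = height_for_age_z(height_cm=87.0, ref_median_cm=96.1, ref_sd_cm=3.8)
print(round(z, 2), is_stunted(z))
```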

"At first, we were surprised about the results, because it is often assumed that supermarkets in Africa primarily sell unhealthy snacks and convenience foods", says Dr Bethelhem Legesse Debela, the study's first author. "But our data show that households using supermarkets also consume healthy foods such as fruits and animal products more regularly." Professor Matin Qaim, the leader of the research project adds: "Not all processed foods are automatically unhealthy. Processing can improve the hygiene and shelf-life of foods. Poor households in Africa in particular often have no regular access to perishable fresh produce."

The findings clarify that modernization of the food retail sector can have multilayered effects on nutrition, which need to be analyzed in the local context. The United Nations pursues the goal of eradicating global hunger in all its forms by 2030. According to the study authors, "this can only be achieved when we better understand the complex relations between economic growth, nutrition, and health and identify and implement locally-adapted policies".

Credit: 
University of Göttingen

Bushmeat may breed deadly bacteria

image: Many people in Sub-Saharan Africa regularly consume bushmeat, up to two-to-five times per week.

Image: 
Robab Katani, Penn State

UNIVERSITY PARK, Pa. -- People who eat wildebeests, wart hogs and other wild African animals may be at risk for contracting potentially life-threatening diseases, according to an international team of researchers. The team analyzed samples of bushmeat -- meat derived from wildlife -- in the Western Serengeti in Tanzania and identified several groups of bacteria, many of which contain the species that cause diseases such as anthrax, brucellosis and Q fever.

"Many people in Sub-Saharan Africa regularly consume bushmeat, up to two-to-five times per week, which means that millions of people could be exposing themselves to these dangerous pathogens," said Robab Katani, assistant research professor of global health, Huck Institutes of the Life Sciences, Penn State. "And the number is growing. Bushmeat consumption and trade has been increasing because of growing food insecurity, low cost compared to other meat products, and perceived medicinal value, among other things."

The problem is not confined to Africa either, she added.

"Bushmeat is smuggled illegally into the U.S. and Western Europe on a daily basis," she said. "For example, Charles de Gaulle airport in France intercepts five tons per week. This practice puts even more people at risk for contracting dangerous bacterial diseases."

To quantify the risk associated with eating and handling bushmeat, the researchers first needed to identify the bacteria present in the meat. They obtained 56 fresh and processed bushmeat tissue samples from the predominant large herbivores -- including buffalo, zebra and giraffe -- of the Serengeti National Park and surrounding areas. They collected these samples within three ecologically distinct regions, called Bunda, Serengeti and Tarime, within the Serengeti ecosystem. Using a broad genetic sequencing technique, called 16S rRNA sequencing, they analyzed the microbiomes -- all of the microorganisms -- present in each sample.

The team found 27 different groups -- called phyla -- of bacteria in the samples, with Firmicutes, Proteobacteria, Cyanobacteria and Bacteroidetes being the most abundant. All of these groups contain both pathogenic and non-pathogenic species. Within those phyla the researchers detected DNA signatures of bacteria within the genera Bacillus, Brucella and Coxiella, which contain the species that cause anthrax, brucellosis and Q fever, respectively. The team's findings on the microbiome analysis of the samples appear today (Dec. 2) in Scientific Reports.
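Once 16S rRNA reads have been taxonomically classified, abundance figures like the 78 percent Clostridial share reported below are essentially relative read counts per taxon. A minimal sketch of that final tallying step, using entirely hypothetical read assignments (real pipelines such as QIIME handle the classification itself):

```python
from collections import Counter

# Hypothetical phylum assignments for classified 16S rRNA reads from one sample.
reads = (
    ["Firmicutes"] * 50
    + ["Proteobacteria"] * 25
    + ["Cyanobacteria"] * 15
    + ["Bacteroidetes"] * 10
)

counts = Counter(reads)
total = sum(counts.values())

# Relative abundance: each phylum's share of all classified reads in the sample.
relative_abundance = {phylum: n / total for phylum, n in counts.most_common()}
print(relative_abundance)
```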

"Anthrax, brucellosis and Q fever can be deadly if left untreated," said Vivek Kapur, professor of microbiology and infectious diseases, Huck Distinguished Chair in Global Health and associate director of the Huck Institutes of Life Sciences, Penn State. "Antibiotics work, but most people don't have access to them. We've encountered many people who've tragically lost a family member to these otherwise preventable diseases."

The researchers also found a particularly high prevalence of bacteria in the genus Clostridium, whose species cause diseases like botulism and tetanus. In fact, the microbiomes of wildebeest collected during the dry season comprised more than 78 percent Clostridial species.

"Understanding which bacteria are present in bushmeat is necessary to establishing a plan to help curb outbreaks of these dangerous diseases," said Kapur. "Our data suggest the presence of certain disease-causing species, and our objective now is to use species-level analyses to fine-tune our focus on specific pathogens and accurately assess and mitigate the associated risk of disease outbreaks. Ultimately, our goal is also to help build capabilities for rapid diagnosis and risk mitigation in the countries of origin to address these risks before they become a problem globally."

Credit: 
Penn State

Significant developments in gamut mapping for the film industry

image: The range of colours that a device can reproduce is called a gamut. A very common and convenient way of describing colours is to ignore their luminance component and represent only the chromatic content in a 2D plane known as the xy CIE chromaticity diagram.

Image: 
UPF

Particularly in the film industry, the rapid development of display technologies has created an urgent need to develop fast, automatic gamut mapping algorithms. An article published on 14 November in the advanced online edition of the IEEE Transactions on Pattern Analysis and Machine Intelligence presents significant progress in this area.

A screen's gamut is the set of colours it can reproduce; screens with wider gamuts can present more vivid and intense colours. Gamut mapping is the process of adapting the colours of the content to the gamut of the screen, so as to fully exploit the colour palette of the display device on which the content is shown while preserving the artistic intent of the creator of the original content.
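As a rough, hypothetical sketch of the gamut reduction idea (not the authors' perception-based method), the snippet below moves an out-of-gamut xy chromaticity along the line toward the white point until it falls inside a target gamut triangle, here the Rec.709 primaries:

```python
def sign(p, a, b):
    # 2D cross product: which side of edge a-b the point p lies on.
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_gamut(p, tri):
    """True if chromaticity p lies inside (or on) the gamut triangle tri."""
    a, b, c = tri
    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def compress_toward_white(p, tri, white=(0.3127, 0.3290)):
    """Scale an out-of-gamut point toward the white point (bisection on the
    scale factor) until it lies inside the target triangle."""
    if in_gamut(p, tri):
        return p
    lo, hi = 0.0, 1.0  # 0 = white point, 1 = original point
    for _ in range(50):
        mid = (lo + hi) / 2
        q = (white[0] + mid * (p[0] - white[0]), white[1] + mid * (p[1] - white[1]))
        if in_gamut(q, tri):
            lo = mid
        else:
            hi = mid
    return (white[0] + lo * (p[0] - white[0]), white[1] + lo * (p[1] - white[1]))

# Rec.709 primaries as the target gamut (xy chromaticities).
rec709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
saturated_green = (0.17, 0.80)  # outside Rec.709
mapped = compress_toward_white(saturated_green, rec709)
print(in_gamut(mapped, rec709))
```

Desaturating toward white like this preserves hue but flattens saturation differences, which is exactly the kind of artefact the perceptually based GMAs discussed below aim to avoid.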

The goal of gamut mapping for film is to develop gamut mapping algorithms (GMAs) that reproduce the original content of the film as faithfully as possible while respecting the artist's vision, an important feature that any GMA must have to be adopted by the film industry. "Therefore, for this study we performed psychophysical experiments to compare the performance of the proposed GMA with other methods in film conditions using a digital film projector (Barco-DP-1200 [75]) and a large projection screen", the authors point out in their article.

Software that mimics the neural processes of the human visual system

"In this paper, we propose a new framework based on biological visual models. Our method both reduces and extends the gamut, is of low computational complexity, produces results that are free from artefacts, and outperforms the most advanced methods according to psychophysical tests", explain the authors Syed Waqas Zamir, researcher at the Inception Institute of Artificial Intelligence, Abu Dhabi (UAE) and PhD from UPF (2017), and Javier Vázquez-Corral and Marcelo Bertalmío, researchers at the Department of Information and Communication Technologies (DTIC) at UPF.

In this paper, the authors present the details of a new method based on neural models that come from scientific knowledge about human vision. As Bertalmío, coordinator of the Image Processing for Enhanced Cinematography (IP4EC) research group, explains, "instead of working on the hardware, improving lenses and sensors, we resort to the latest knowledge of neuroscience and the existing models of visual perception to develop software methods that mimic the neural processes of the visual system, applying these methods to the images captured by a regular camera".

"Our experiments also highlight the limitations of existing objective metrics for the gamut mapping problem and provide solutions", they add.

Credit: 
Universitat Pompeu Fabra - Barcelona

Pharmacy service will save NHS £651 million

A research team from the University of Manchester, the University of Nottingham, and UCL evaluating a service delivered by pharmacists since 2011 has calculated that it will save the English NHS around £651 million.

They also show it will give patients around 278,700 additional quality-adjusted life years (QALYs), a long-term measure of disease burden used by health economists.
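As a purely arithmetical illustration of what a quality-adjusted life year measures (not the study's actual economic model), a QALY is time lived weighted by a health-state quality score between 0 (death) and 1 (perfect health):

```python
def qalys(years: float, utility: float) -> float:
    """QALYs = time lived x quality weight (1.0 = perfect health, 0 = death)."""
    return years * utility

# Illustrative only: an intervention that raises utility from 0.75 to 0.80
# over 10 years yields half an extra QALY per patient.
gain = qalys(10, 0.80) - qalys(10, 0.75)
print(gain)
```

Health economists then compare such QALY gains against intervention costs to judge value for money.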

Since the inception of the New Medicine Service (NMS), the team say community pharmacists in over 12,000 pharmacies have delivered 5.7 million consultations between 2011 and 2018.

The NMS works with patients who are prescribed medicines for asthma and COPD, high blood pressure, or Type 2 diabetes, or who are taking anticoagulant therapies such as warfarin.

The NMS, in which pharmacists follow up patients with a telephone call after 7 to 14 days, and then again two to three weeks later, aims to support people taking a new medicine prescribed by their doctor.

The NMS came about after the discovery by psychologists working with the research team that the decision to adhere to a medicine is often made in the first 2 weeks of it being prescribed.

The team developed the ideas behind the NMS in the late 1990s, and were influential in the decision by the Department of Health to start the scheme.

The team of pharmacists, GPs, patients, policymakers, health economists and health services researchers, ran a trial of the NMS, with 504 people in 46 pharmacies.

Their paper in 2016 showed that 11% more patients adhered to their medication regimen after 10 weeks when they used the NMS.

The present study, published in BMJ Quality and Safety, also followed up the same patients after 26 weeks and showed that an extra 9% still stuck to their regimen when they used the NMS.

However, because 66 people dropped out of the study after 10 weeks, the figures were not statistically significant. The cost to the NHS of paying community pharmacists to deliver NMS (£25) was absorbed by small reductions in other NHS contact-related costs.

The economic evaluation concluded that in the long-term, this improvement in adherence would still lead to reduced overall costs to the NHS and improved long-term health for patients.

Lead researcher from The University of Manchester Professor Rachel Elliott said: "The New Medicine Service has proved to be a simple, deliverable intervention which helps patients and saves the NHS money.

"The NMS workload had been absorbed into busy community pharmacists' daily routines alongside existing responsibilities with no extra resources or evidence of reduction in other responsibilities.

"It's not always easy for doctors to determine if their patients are sticking to their drugs regimen.

"As health care professionals, we sometimes underestimate the problems patients face around their medicines. Patients often decide to stop taking their pills when they see no difference in their symptoms, experience side effects, have found information from other sources such as the internet, or can't afford prescription charges.

Professor Elliott added: "The results of this longer-term follow-up suggest NMS helps people when the medicine is started, and some effect lasts for quite a long time. However, reviewing medicines-taking, for example every six months, is probably needed to continue the support patients need around taking medicines.

"And we think clinical pharmacists, now often based in primary care doctor's practices may be able to integrate NMS, and follow-up support, into their role.

"In addition to that, there are other medicines which we know patients are less likely to adhere to: for example, from talking to patient groups we know that mental health medicines, eyedrops and statins could also be candidates for the NMS."

Dr Boyd, co-project lead from the University of Nottingham, said: "The way patients access healthcare is changing. This work highlights the valuable contribution pharmacists make in protecting valuable NHS budget and improving outcomes for patients."

Credit: 
University of Manchester