Culture

Families continue to enjoy TV together -- but potentially ruin it for each other

TV companies battling to preserve the shared experience of scheduled TV viewing in an era of 24/7 streaming and personalised viewing need more than binge-watching contracts and no-sleeping agreements to keep customers.

Recently, Netflix introduced a binge-watching contract for couples and families to regulate the way they watch TV together. However, new research from Lancaster University, the University of Warwick and Relational Economics Ltd suggests streaming and subscription TV providers need to consider several other factors to ensure their services provide value to customers.

The study of UK households with digital and satellite TV subscriptions - held by 60% of UK households in 2018 - is published in the Journal of Business Research.

Lead author Dr Helen Bruce, of Lancaster University Management School's Marketing Department, said: "From our research, we found families value more than just watching TV together, though the ability to do so - and to customise those experiences - remains extremely important, and a key reason why families continue to spend often significant sums of money each month on TV subscriptions.

"In fact, our research shows that families who work together to choose which TV packages (and elements thereof) to have within their home, who learn together how to use the ever-evolving technologies, who plan what and when they will watch together, and who meaningfully discuss their viewing experiences, will ultimately derive more value from their subscription.

"However, value can be destroyed where family members don't or can't master the various technologies, and where family members don't have equal ownership and control over the TV and its benefits.

"Perhaps more importantly for TV subscription companies trying to maintain a position within a household, value can be destroyed where the actions of one family member are detrimental to others. For instance, a person might disrupt family viewing by talking loudly, delete recorded shows that someone else wanted to watch, or make disparaging comments about another party's tastes in TV shows.

"Firms need to think about how they can facilitate collaboration among families in their use of subscription TV. For example, there is the potential to use technologies such as Alexa to identify areas of value destruction and to intervene - for instance, by detecting when one person regularly talks during a certain programme, and setting up a recording, so nothing is missed."

Dr Bruce added: "Service providers need to provide resources that are easily integrated into consumers' lives, as well as providing reliability and quality. They also need to respond to common problems, where patterns of behaviour which cause difficulties - and thus a loss of value - are repeated across users."

Credit: 
Lancaster University

Premature mortality is partly predicted by city neighbourhood

image: Neighbourhood equivalent in cigarettes smoked per person per year.

Image: 
Stephanie Melles, Ryerson University

We know that our environment affects our health. More specifically, it's understood that exposure to pollution and access to medical care play important supporting roles in maintaining health and wellness. A new in-depth study from Ryerson University assesses the link between premature mortality and a combination of environmental, health, socioeconomic and demographic characteristics within Toronto's 140 neighbourhoods and reaches some noteworthy conclusions, both anticipated and surprising.

First, the anticipated: authors Luckrezia Awuor, an Environmental Applied Science and Management graduate student, and Stephanie Melles, Assistant Professor in the Department of Chemistry and Biology in the Faculty of Science, determined that premature mortality in City of Toronto neighbourhoods was predicted by a combination of unhealthy environments and embedded socioeconomic imbalances.

"It's an ongoing concern that neighbourhoods with fewer trees, lower uptake rates in cancer screening programs, higher levels of pollution and lower total income levels best predict increased mortality rates," says Melles. "It's also the case that visible minorities and Indigenous peoples are most at risk of living in neighbourhoods with higher premature mortality rates. Though we expected these results, they emphasize a persistent issue of social injustice."

Put another way, residents of wealthy neighbourhoods have lower rates of premature mortality owing in part to greater tree cover and higher rates of cancer screening. Neighbourhoods with reduced cancer screening rates, more pollution and fewer trees - also often situated close to industrial areas - tend to be lower income. In addition, walkable neighbourhoods in closer proximity to industrial pollution had higher rates of premature mortality despite higher access to health providers.

Interestingly, higher levels of traffic-related ultrafine particulates and industrial carcinogens and non-carcinogens did not always correlate with higher rates of premature mortality. This is where the surprises came in. Larger suburbs with higher pollution levels than downtown neighbourhoods showed a decreased rate of premature mortality of about 17 deaths per 100,000. This decrease is equivalent to smoking 125 fewer cigarettes per year.

"This contradicts other published findings," says Melles. "Generally speaking, greater pollution correlates with greater premature mortality. In this case, the suburban neighbourhoods also had less walkability, which may minimize the health impact. It's also possible that the differences between where a person lives and where they work play a role in their overall exposure to pollutants. We have other hypotheses that were beyond the scope of our study. But we were surprised."

Also surprising: some Toronto neighbourhoods with major highways connecting the city and those along the shoreline with good tree cover, extensive green spaces and no greater measured pollution than other downtown areas showed above average rates of premature mortality. The area that contains the Rouge National Park was one of them.

"We have some unexplained variations in the outcomes," says Awuor. "This tells us that we are missing a variable we haven't yet identified. Why do shoreline neighbourhoods fare worse than others? Is there some additional exposure to undetermined air or water pollutants that cross Lake Ontario? The higher rates of premature mortality are correct, but the explanations aren't there."

The authors point to one reason they can't yet pinpoint precise explanations for what they call "residual neighbourhood patterns," which are those below or above expected rates of premature mortality: an insufficient number of air quality monitoring stations. With only four stations in the city, it's not possible to fully understand where and when citizens are being exposed to carcinogens and ultrafine particulates.

"We need to collect more air quality data to create a more accurate picture of exposure," says Awuor. "We also need more extensive environmental policies for better tree cover and greener spaces. And we need new approaches to promoting cancer screening programs in lower income neighbourhoods. These three advances would improve the lifespan of all Toronto residents and, in particular, visible minorities and Indigenous people, who tend to live in the least green and most polluted neighbourhoods."

Credit: 
Ryerson University - Faculty of Science

Moderate to heavy drinking during pregnancy alters genes in newborns, mothers

image: Researchers looked for alcohol-induced DNA changes in pregnant women and their children. They found changes to two genes -- POMC, which regulates the stress-response system, and PER2, which influences the body's biological clock -- in women who drank moderate to high levels of alcohol during pregnancy and in children who had been exposed to those levels of alcohol in the womb.

Image: 
Syani Mukherjee/Rutgers University-New Brunswick

Mothers who drink moderate to high levels of alcohol during pregnancy may be changing their babies' DNA, according to a Rutgers-led study.

"Our findings may make it easier to test children for prenatal alcohol exposure - and enable early diagnosis and intervention that can help improve the children's lives," said lead author Dipak K. Sarkar, a Distinguished Professor and director of the Endocrine Program in the Department of Animal Sciences at Rutgers University-New Brunswick.

The study by Sarkar and scientists in a Collaborative Initiative on Fetal Alcohol Spectrum Disorders is in the journal Alcoholism: Clinical and Experimental Research.

Building on an earlier Rutgers-led study that found binge and heavy drinking may trigger long-lasting genetic change in adults, the researchers sought alcohol-induced DNA changes in 30 pregnant women and 359 children.

They found changes to two genes - POMC, which regulates the stress-response system, and PER2, which influences the body's biological clock - in women who drank moderate to high levels of alcohol during pregnancy and in children who had been exposed to those levels of alcohol in the womb.

Heavy drinking in women is defined as four or more drinks per occasion on at least five occasions in a month; moderate drinking as about three drinks per occasion.

"Our research may help scientists identify biomarkers - measurable indicators such as altered genes or proteins - that predict the risks from prenatal alcohol exposure," Sarkar said.

Fetal alcohol spectrum disorders can include physical or intellectual disabilities as well as behavioral and learning problems. While there is no cure, early intervention treatment services can improve a child's development, according to the U.S. Centers for Disease Control and Prevention, which says there is no known safe amount of alcohol to drink while pregnant.

The study also found that infants exposed to alcohol in the womb - which passes from the mother's blood through the umbilical cord - had increased levels of cortisol, a potentially harmful stress hormone that can suppress the immune system and lead to ongoing health issues.

Credit: 
Rutgers University

Uric acid pathologies shorten fly lifespan, highlighting need for screening in humans

image: This image shows a micro-CT scan of Drosophila melanogaster with diminished expression of urate oxidase reducing the enzymatic degradation of the aggregation-prone purine intermediate uric acid. Radiolucent structures are shown in red. Radiopaque objects are shown in yellow and demonstrate the formation of uric acid concretions, i.e. events of ectopic biomineralization resembling human uric acid kidney stones. The reduced expression of urate oxidase in Drosophila resulted in elevated uric acid levels, accumulation of concretions in the excretory system, and shortening of lifespan when reared on diets containing high levels of yeast extract. The image replacing the letter "D" in the word uric acid represents a concretion dissected from the same fly.

Image: 
Sven Lang, PhD

Few people get their level of uric acid, a breakdown product of metabolism, measured in their blood. Based on Buck research published August 15 in PLOS Genetics, it might be time to rethink that, given that 20 percent of the population have elevated levels of uric acid, increasing their risk for gout, kidney stones, metabolic syndrome, obesity, diabetes and early death.

"People can be at genetic risk for high uric acid and not know it," says Pankaj Kapahi, PhD, Buck professor and senior author of the paper. "Medical practitioners haven't been paying sufficient attention to uric acid and perhaps they should." Kapahi thinks uric acid should be included in routine check-ups similar to those done for cholesterol and blood glucose. "Uric acid levels often go up with age and it's important for longevity," he explains. "Gout is also associated with premature mortality in humans. Conversely, studies from the Institute for Aging Research led by Nir Barzilai at the Albert Einstein College of Medicine show that many centenarians are genetically predisposed to having low levels of uric acid."

While humans lost the gene for metabolizing uric acid about 15 million years ago, most species, including insects, kept it. In this study, researchers in the Kapahi lab "humanized" fruit flies by knocking down the urate oxidase gene. The altered flies built up uric acid in their bodies only when triggered by a diet rich in purines, but showed no deleterious effect under dietary restriction. In humans, a high-purine diet is linked to excessive intake of alcohol and red meat; sugary beverages are a further driver of uric acid build-up.

Scientists discovered that the insulin-like signaling (ILS) pathway, which animals use to sense nutrients, plays a role in regulating uric acid levels. Suppressing the ILS pathway lowered uric acid in the genetically altered animals, providing potential targets for new drug development. Additionally, the conserved role of the ILS pathway in modulating uric acid levels was supported by a human gene study which identified variations in two ILS genes associated with serum uric acid levels or gout, the most common inflammatory arthritis.

Sven Lang, a former postdoctoral fellow in the Kapahi lab who led the research, found that a high-purine diet shortened the lifespan of the humanized flies by 48 percent, while suppressing the ILS pathway prevented the rise in uric acid levels even when the flies ate the high-yeast diet. The team also found that an increase in free radicals generated by an enzyme called NADPH oxidase (NOX) mediated kidney stone formation and early mortality in the flies. "We were able to inhibit the increase in free radicals using the common antioxidant vitamin C, which reduced the burden of kidney stones and improved survival in the animals," said Lang, who now has his own laboratory at the Department of Medical Biochemistry and Molecular Biology at Saarland University in Homburg, Germany.

"Changes in diet are not always sufficient to bring down levels of uric acid, so it's important to track it and make sure that patients who need preventive drug treatment get it," said study co-author Marshall Stoller, MD, head of the urinary stone division at the University of California, San Francisco's Department of Urology. "In this research we used fruit flies to recapitulate what happens in humans and we discovered new drug targets. It's our hope that we'll be able to utilize these targets to develop new drugs for the many diseases linked to uric acid accumulation."

Credit: 
Buck Institute for Research on Aging

Ancient feces reveal how 'marsh diet' left Bronze Age Fen folk infected with parasites

image: Microscopic eggs of fish tapeworm (left), giant kidney worm (centre), and Echinostoma worm (right) from the Must Farm excavation. Black scale bar represents 20 micrometres.

Image: 
Marissa Ledger

New research published today in the journal Parasitology shows how the prehistoric inhabitants of a settlement in the freshwater marshes of eastern England were infected by intestinal worms caught from foraging for food in the lakes and waterways around their homes.

The Bronze Age settlement at Must Farm, located near what is now the fenland city of Peterborough, consisted of wooden houses built on stilts above the water. Wooden causeways connected islands in the marsh, and dugout canoes were used to travel along water channels.

The village burnt down in a catastrophic fire around 3,000 years ago, with artefacts from the houses preserved in mud below the waterline, including food, cloth, and jewellery. The site has been called "Britain's Pompeii".

Also preserved in the surrounding mud were waterlogged "coprolites" - pieces of human faeces - that have now been collected and analysed by archaeologists at the University of Cambridge. They used microscopy techniques to detect ancient parasite eggs within the faeces and surrounding sediment.

Very little is known about the intestinal diseases of Bronze Age Britain. The one previous study, of a farming village in Somerset, found evidence of roundworm and whipworm: parasites spread through contamination of food by human faeces.

The ancient excrement of the Anglian marshes tells a different story. "We have found the earliest evidence for fish tapeworm, Echinostoma worm, and giant kidney worm in Britain," said study senior author Dr Piers Mitchell of Cambridge's Department of Archaeology.

"These parasites are spread by eating raw aquatic animals such as fish, amphibians and molluscs. Living over slow-moving water may have protected the inhabitants from some parasites, but put them at risk of others if they ate fish or frogs."

Disposal of human and animal waste into the water around the settlement likely prevented direct faecal pollution of the fenlanders' food, and so prevented infection from roundworm - the eggs of which have been found at Bronze Age sites across Europe.

However, water in the fens would have been quite stagnant, due in part to thick reed beds, leaving waste accumulating in the surrounding channels. Researchers say this likely provided fertile ground for other parasites to infect local wildlife, which - if eaten raw or poorly cooked - then spread to village residents.

"The dumping of excrement into the freshwater channel in which the settlement was built, and consumption of aquatic organisms from the surrounding area, created an ideal nexus for infection with various species of intestinal parasite," said study first author Marissa Ledger, also from Cambridge's Department of Archaeology.

Fish tapeworms can reach 10m in length, and live coiled up in the intestines. Heavy infection can lead to anaemia. Giant kidney worms can reach up to a metre in length. They gradually destroy the organ as they become larger, leading to kidney failure. Echinostoma worms are much smaller, up to 1cm in length. Heavy infection can lead to inflammation of the intestinal lining.

"As writing was only introduced to Britain centuries later with the Romans, these people were unable to record what happened to them during their lives. This research enables us for the first time to clearly understand the infectious diseases experienced by prehistoric people living in the Fens," said Ledger.

The Cambridge team worked with colleagues at the University of Bristol's Organic Chemistry Unit to determine whether coprolites excavated from around the houses were human or animal. While some were human, others were from dogs.

"Both humans and dogs were infected by similar parasitic worms, which suggests the humans were sharing their food or leftovers with their dogs," said Ledger.

Other parasites that infect animals were also found at the site, including pig whipworm and Capillaria worm. It is thought that they originated from the butchery and consumption of the intestines of farmed or hunted animals, but probably did not cause humans any harm.

The researchers compared their latest data with previous studies on ancient parasites from both the Bronze Age and Neolithic. Must Farm tallies with the trend of fewer parasite species found at Bronze Age compared with Neolithic sites.

"Our study fits with the broader pattern of a shrinking of the parasite ecosystem through time," said Mitchell. "Changes in diet, sanitation and human-animal relationships over millennia have affected rates of parasitic infection." He points out, however, that infections from the fish tapeworm found at Must Farm have seen a recent resurgence due to the popularity of sushi, smoked salmon and ceviche.

"We now need to study other sites in prehistoric Britain where people lived different lifestyles, to help us understand how our ancestors' way of life affected their risk of developing infectious diseases," added Mitchell.

Credit: 
University of Cambridge

Canadian researchers find 'silent' strokes common after surgery

image: Dr. PJ Devereaux is a senior scientist at the Population Health Research Institute of McMaster University and Hamilton Health Sciences.

Image: 
Photo courtesy McMaster University/Hamilton Health Sciences.

Hamilton, ON (August 15, 2019) - Canadian researchers have discovered that covert - or 'silent' - strokes are common in seniors after they have elective, non-cardiac surgery and double their risk of cognitive decline one year later.

While an overt stroke causes obvious symptoms, such as weakness in one arm or speech problems that last more than a day, a covert stroke is not obvious except on brain scans, such as MRI. Each year, approximately 0.5 per cent of the 50 million people aged 65 years or older worldwide who have major, non-cardiac surgery will suffer an overt stroke, but until now little was known about the incidence or impacts of silent stroke after surgery.

The results of the NeuroVISION study were published today in The Lancet.

"We've found that 'silent' covert strokes are actually more common than overt strokes in people aged 65 or older who have surgery," said Dr. PJ Devereaux, co-principal investigator of the NeuroVISION study. Dr. Devereaux is a cardiologist at Hamilton Health Sciences (HHS), professor in the departments of health research methods, evidence, and impact, and medicine at McMaster University, and a senior scientist at the Population Health Research Institute of McMaster University and HHS.

Dr. Devereaux and his team found that one in 14 people over age 65 who had elective, non-cardiac surgery had a silent stroke, suggesting that as many as three million people in this age category globally suffer a covert stroke after surgery each year.
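The scale of that estimate follows from simple arithmetic on the figures quoted in the release (the 50 million surgical patients, 0.5 per cent overt-stroke rate and one-in-14 covert-stroke rate all come from the text above; this sketch is purely illustrative):

```python
# Figures quoted in the release (not independently verified here):
surgeries_per_year = 50_000_000   # people aged 65+ having major non-cardiac surgery annually
overt_stroke_rate = 0.005         # ~0.5% suffer an overt stroke
covert_stroke_rate = 1 / 14       # NeuroVISION: one in 14 had a covert stroke

overt = surgeries_per_year * overt_stroke_rate
covert = surgeries_per_year * covert_stroke_rate

print(f"Overt strokes per year:  ~{overt:,.0f}")    # ~250,000
print(f"Covert strokes per year: ~{covert:,.0f}")   # ~3,571,429
```

The raw extrapolation comes to roughly 3.6 million covert strokes a year, which the release conservatively rounds down to "as many as three million".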

NeuroVISION involved 1,114 patients aged 65 years and older from 12 centres in North and South America, Asia, New Zealand, and Europe. All patients received an MRI within nine days of their surgery to look for imaging evidence of silent stroke. The research team followed patients for one year after their surgery to assess their cognitive capabilities. They found that people who had a silent stroke after surgery were more likely to experience cognitive decline, perioperative delirium, overt stroke or transient ischaemic attack within one year, compared to patients who did not have a silent stroke.

"Over the last century, surgery has greatly improved the health and the quality of life of patients around the world," said Dr. Marko Mrkobrada, an associate professor of medicine at University of Western Ontario and co-principal investigator for the NeuroVISION study. "Surgeons are now able to operate on older and sicker patients thanks to improvements in surgical and anesthetic techniques. Despite the benefits of surgery, we also need to understand the risks."

"Vascular brain injuries, both overt and covert, are more frequently being detected, recognized and prevented through research funded by our Institute and CIHR," says Dr. Brian Rowe, scientific director of the Institute of Circulatory and Respiratory Health, Canadian Institutes of Health Research (CIHR). "The NeuroVISION Study provides important insights into the development of vascular brain injury after surgery, and adds to the mounting evidence of the importance of vascular health on cognitive decline. The results of NeuroVISION are important and represent a meaningful discovery that will facilitate tackling the issue of cognitive decline after surgery."

Credit: 
McMaster University

Survey data suggests widespread bullying by superiors in medical residency training

Using questionnaire answers from thousands of internal medicine residents, primarily from U.S. training programs, a research team at Johns Hopkins Medicine says it has added to the evidence that bullying of medical trainees is fairly widespread. Bullying affects about 14% of medical trainees overall, and is notably more prevalent among foreign-born trainees.

In a report on the findings, published Aug. 13 in JAMA, the researchers say residents who reported feeling repeatedly harassed by superiors described burnout, depression and other ill health effects resulting from the bullying.

Moving forward, the researchers hope the study will alert residency training program directors to the rates of bullying within their programs and move them to take the necessary action to create a safer, more supportive learning environment.

"We hope our study will further raise awareness among educational leaders of just how ubiquitous bullying is, and that it will encourage them to do more to prevent it," says Scott Wright, M.D., professor of medicine and director of general internal medicine at Johns Hopkins Bayview Medical Center.

Previous studies, Wright says, have documented varying rates of trainee bullying -- defined as more than one episode of verbal, physical, sexual or other types of harassment from a person in a position of power or authority -- ranging from 10% to 48% depending on country and level of training in medical education. A recent survey study of internal medicine residency training program directors in the Journal of Graduate Medical Education by Wright and his team found that only 31% were aware of bullying taking place within their programs.

In a bid to learn more about the prevalence of bullying and its health consequences on medical trainees, the researchers studied more than 24,000 internal medicine residents, mostly in the United States. These trainees completed a five-question survey about bullying in 2016, which was attached to the end of the Internal Medicine In-Training Examination (IM-ITE) -- an exam given by the American College of Physicians to residents to assess their personal progress every year.

Of the 21,212 who completed the survey and allowed their answers to be used for research, 55.7% were male and 68.8% said English was their primary language.

Residents were asked whether they were bullied during residency, the type of bullying, whether they sought help and whether there were consequences for their health.

A total of 2,876 internal medicine residents reported being bullied since the beginning of their residency training, an overall rate of about 13.6%. Women were bullied at a rate of 14.4%, and men were bullied at a rate of 12.9%.
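The headline rate can be reproduced directly from the raw counts given above (counts from the release; this is a minimal sanity check, not part of the study's analysis):

```python
respondents = 21_212   # residents who completed the survey and consented to research use
bullied = 2_876        # residents reporting bullying since the start of training

overall_rate = bullied / respondents
print(f"Overall bullying rate: {overall_rate:.1%}")   # 13.6%, matching the reported figure
```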

Significantly, Wright noted, more than 40% of the residents who reported being bullied spoke a native language other than English. Results also showed that compared with U.S. residency training programs, internal medicine residents training at international programs had approximately 60% higher odds of experiencing bullying.

When it came to performance on the internal medicine knowledge exam, rates of bullying increased among those with lower exam scores. Residents who scored in the top third were bullied at a rate of 12.3%, while those in the middle or bottom third were bullied significantly more often, at rates of 13.8% and 14.4%, respectively.

In the survey, residents were given a list of nine choices about personal or professional consequences that stemmed from perceived bullying, including "none of the above." The most prevalent consequence they reported was feeling burned out, which was acknowledged by 57% of the respondents, followed by worsened performance (39%) and depression (27%). The other consequences of bullying reported by trainees included change in weight, alcohol use, improved performance and illicit drug use.

Some 62 residents reported that they left the program as a result of bullying. Wright explains that burnout among residents can have many causes, including fatigue, stress and the intense learning curves that residents face. But he says his team's findings affirm that bullying contributes significantly to the problem for many, and it's a factor that can be entirely prevented.

Wright cautioned that the rate of bullying reported in this study was subject to the residents' interpretation of the definition of bullying. Besides bullying, trainees may be exposed to less egregious microaggressions, which weren't measured by this study.

Credit: 
Johns Hopkins Medicine

Study: Non-invasive electrical stimulation alters blood flow in brain tumors

BOSTON - In a first-of-its-kind study, neurologists at Beth Israel Deaconess Medical Center (BIDMC) tested the use of non-invasive electrical stimulation as a novel therapeutic approach to brain tumors. In an experiment published in Science Advances, the scientists - led by Emiliano Santarnecchi, PhD, principal investigator at the Berenson-Allen Center for Non-Invasive Brain Stimulation at BIDMC - demonstrated that applying low-intensity electrical stimulation to the brains of patients with tumors resulted in decreased blood flow within tumors while leaving the rest of the brain unchanged. Although further study is needed, the findings suggest that a series of such treatments could modify tumor growth and progression.

"Many patients with brain tumors have limited therapeutic options, most of which come with severe side effects," said Santarnecchi, who is also an Assistant Professor of Neurology at Harvard Medical School. "Our study found that electrical stimulation resulted in a significant reduction of blood flow to the tumor, with no changes in the rest of the brain. Given the safety profile of non-invasive brain stimulation, the ease of its application, its relatively low cost and the possibility of combining it with other drug-based therapies, our findings might lead to non-invasive therapeutic options for patients with brain tumors."

Non-invasive brain stimulation - also known as transcranial electrical stimulation (tES) - is a 20-year-old, FDA-approved technology that is used to treat a number of psychiatric conditions, including treatment-resistant depression. Low-intensity electrical fields are delivered to the brain through the skull via electrodes placed on the scalp. While many researchers continue to investigate the safe and relatively low-cost treatment's potential to treat psychiatric disorders and to enhance cognitive skills such as memory and concentration, this study represents the first time tES has been tested in patients with brain tumors.

Santarnecchi and colleagues recruited 50 patients with brain tumors willing to participate. Given the delicate patient population, only eight participants were able to complete the trial - six with glioblastoma, an aggressive tumor that originates in the brain, and two with metastatic tumors that originated in the lung.

The patients received transcranial stimulation while their brain blood flow was monitored using magnetic resonance imaging (MRI) technology. Based on prior research showing a reduction of blood flow in bodily tumors exposed to electric stimulation, Santarnecchi and team predicted that tES would have a similar effect on the brain tumors. However, the team was surprised to see significant reductions in blood flow to the tumors after a single tES session. No changes in blood flow or activity in the rest of the brain were observed, and none of the patients reported any adverse effects.

"This technique still requires further testing to optimize frequency and duration of treatment and to fully personalize protocols for individual patients," said Santarnecchi. "Future studies will investigate the impact of repeated tES sessions, evaluate the potential combination of tES with other cancer therapies and assess tES in other forms of brain tumors."

Credit: 
Beth Israel Deaconess Medical Center

Genetic redundancy aids competition among bacteria in symbiosis with squid

image: Adult Hawaiian bobtail squid. New research shows that genetic redundancy aids competition among luminescent bacteria that colonize the squid's light organ.

Image: 
Andrew Cecere

The molecular mechanism used by many bacteria to kill neighboring cells has redundancy built into its genetic makeup, which could allow for the mechanism to be expressed in different environments. Some strains of luminescent bacteria that compete to colonize the light organs of the Hawaiian bobtail squid kill nearby cells of different bacterial strains using the "type VI secretion system (T6SS)." Researchers at Penn State and the University of Wisconsin-Madison have now shown that the genomes of these bacteria contain two copies of a gene required for T6SS and that the system still works when either copy of the gene is disabled, but not both.

A paper describing the research, which provides insight into the molecular mechanisms of competition among bacteria in a microbiome, appears online in the Journal of Bacteriology.

"Many organisms, including humans, acquire bacteria from their environment," said Tim Miyashiro, assistant professor of biochemistry and molecular biology at Penn State and the leader of the research team. "These bacteria can contribute to functions within the host organism, like how our gut bacteria help us digest food. We are interested in the dynamic interactions among bacterial cells and between bacteria and their host to better understand these mutually beneficial symbiotic relationships."

Cells of a bioluminescent species of marine bacteria, Vibrio fischeri, take up residence in the light organ of a newly hatched bobtail squid. At night, the bacteria produce a blue glow that researchers believe obscures the squid's silhouette and helps protect it from predators. The light organ has pockets, or crypts, in the squid's skin that provide nutrients and a safe environment for the bacteria.

"When the squid hatches, it doesn't yet have any bacteria in its light organ," said Miyashiro. "Bacteria in the environment quickly colonize the squid's light organ. Each crypt can be occupied by several different strains initially. Some of these different bacterial strains can coexist, but others--bacteria that use the T6SS system--kill cells of other strains."

In the lab, researchers can control which bacterial strains are present in the environment of the hatching squid and study their interactions. A strain of V. fischeri referred to as FQ-A001 uses the T6SS system. If this strain occupies a crypt with cells of a strain that doesn't use the T6SS system, it can kill those cells and completely take over the crypt. The T6SS system, which is encoded in the genome of many bacteria as a cluster of genes, works almost like a hypodermic needle that can inject toxins into neighboring cells.

"We disabled a gene in the T6SS cluster called hcp to study its role in FQ-A001's ability to kill other bacteria," said Miyashiro. "The hcp gene codes for a protein that makes up part of the structure of the T6SS system and also is one of the molecules secreted by the system to kill other cells. We were therefore surprised that the T6SS system still functioned in these bacteria."

The researchers then located another copy of the hcp gene outside of the T6SS gene cluster in the genome of FQ-A001 that coded for an identical protein. Disabling this copy of the gene, called hcp1, also had no impact on the bacteria's ability to kill other cells. But when the researchers disabled both copies of the hcp gene, the T6SS system could no longer function and the FQ-A001 bacteria could coexist with other strains in the light organ of a bobtail squid. The researchers could also rescue T6SS function by inducing Hcp expression in FQ-A001 bacteria with both of its natural copies of hcp disabled.

"The two copies of the hcp gene seem to be completely functionally redundant," said Miyashiro. "Knowing this helps us better understand the molecular mechanisms that underlie the establishment of the symbiotic relationships between bacteria and their hosts."

Credit: 
Penn State

New drug targets early instigator of Alzheimer's disease

image: DYR219 is a powerful new drug developed by Travis Dunckley and his colleagues and described in the new study. Its strength lies in the fact that it can target both of the leading pathologies associated with Alzheimer's disease: plaques (caused by the protein amyloid beta) and tangles (caused by the tau protein).

Further, DYR219's activity can occur early in the progression of the disease, before the formation of these pathologies, offering a better chance of preventing the advance of Alzheimer's and its destruction of cognitive function.

DYR219 inhibits a kinase known as DYRK1. In Alzheimer's and other neurodegenerative diseases, DYRK1 phosphorylates both the tau and amyloid beta proteins--key steps in the formation of plaques and tangles in the brain.

Image: 
Graphic by Shireen Dooling

Over a hundred years after they were first identified, two ominous signposts of Alzheimer's disease (AD) remain central topics of research--both formed by sticky accumulations of protein in the brain. Amyloid beta solidifies into senile plaques, which congregate in the extracellular spaces of nerve tissue, while tau protein creates tangled forms crowding the bodies of neurons.

Plaques and tangles, considered classic hallmarks of AD, have been the objects of fierce debate, sustained research and many billions of dollars in drug development. Yet therapeutic efforts to target these pathologies, which are consistently associated with cognitive decline in both humans and animal models, have met with dispiriting failure.

Travis Dunckley, a researcher at the ASU-Banner Neurodegenerative Disease Research Center, and Christopher Hulme, DPhil, a medicinal chemist at the Arizona Center for Drug Discovery based at the UA College of Pharmacy, are exploring a small molecule drug known as DYR219. The promising therapy, while still in the experimental stages, may succeed where other treatments have failed, and could be effective against a range of neurodegenerative illnesses in addition to Alzheimer's.

Rather than directly attacking the visible hallmarks of AD, namely the plaques and tangles caused by the disease's relentless progression, the new drug acts by inhibiting an early pathway believed to be critical in the formation of both plaques and tangles.

Dunckley says that targeting the early-stage events leading to plaque and tangle formation represents an important advance in the field. "If you can block that process early, you can delay the downstream aggregation and formation of the pathologies."

By preventing or delaying the development of AD pathologies, DYR219 or a similar drug may halt the progression of Alzheimer's in its tracks, before it damages the brain beyond repair.

The new small molecule acts by inhibiting DYRK1, a particular neuroactive enzyme known as a kinase. Researchers like Dunckley and Hulme have been studying DYRK1 and exploring its crucial importance not only in Alzheimer's disease but a broad range of neurodegenerative maladies.

The new study recently appeared in the journal Molecular Neurobiology.

Two faces of DYRK1

Although the activity of DYRK1 is believed to be a key factor in the formation of plaques and tangles, it is vital to the brain during early embryonic development, where it is involved in a host of processes, including signaling pathways linked with cell growth and proliferation, as well as the differentiation of cells into mature neurons and the formation of dendritic spines essential for the transmission of nerve impulses.

In the mature brain, however, DYRK1's activities can turn hostile, initiating pathologies associated with Alzheimer's, dementia with Lewy bodies and Parkinson's disease. The dysfunction of DYRK1 is also a central feature of Down Syndrome. Patients with this disorder are highly prone to developing AD early in life, often in their 40s or 50s.

The DYRK1 kinase carries out its harmful role in the brain through a process known as phosphorylation. When DYRK1 encounters a protein known as APP (amyloid precursor protein), it attaches a cluster of oxygen and phosphorus atoms, known as a phosphate group. DYRK1 also phosphorylates tau.

Too much phosphorylation of these critical proteins can have disastrous effects in the brain. The hyperphosphorylation of APP is believed to increase the formation of amyloid plaques, while tau hyperphosphorylation leads to neurofibrillary tangles. Inhibition of these processes could interrupt the sequence of events leading to plaque and tangle formation and block or delay the onset of AD.

"The reason I'm excited about this, especially in the face of a lot of the recent high profile clinical trial failures, is that this is really a different approach to treating the disease," Dunckley says, noting that previous efforts to target plaques and tangles directly have failed to provide any benefit to cognitive function. "What we're trying to do is restore the normal phosphorylation of APP and tau, so that you don't get those downstream pathologies."

Working upstream

In earlier research, Dunckley, Hulme and colleagues showed that using a small molecule drug to inhibit DYRK1 in hybrid mice bred to develop AD-like symptoms reduced the load of amyloid plaque in their brains and improved cognitive performance.

The new study explores early DYRK1 inhibition as a potential preventive measure against Alzheimer's, with impressive results. "We showed a robust and significant delay in the onset of amyloid and tau pathology," Dunckley says.

Researchers speculate that one reason anti-plaque and anti-tangle therapies have shown promise in mice yet consistently failed in humans is the nature of disease progression in the two very different brains. In hybrid mice, plaques and tangles can develop quickly, before AD has caused significant neurodegeneration and cell loss in the brain. Treating plaques and tangles in this case can help the remaining healthy neurons resume normal function. In human AD, however, plaques and tangles are typically accompanied by advanced neuronal devastation. It's simply too late in the course of the disease to derive any benefit from targeting the amyloid and tau pathologies alone.

Connection with Down Syndrome

Inhibition of DYRK1 has also shown promise in the treatment of Down Syndrome. The DYRK1 gene is localized on chromosome 21, in the Down Syndrome critical region. Overexpression of DYRK1 appears intimately involved with the learning defects characteristic of this disease and its inhibition has been shown to improve cognitive performance in mice.

Dunckley believes a DYRK1 inhibitor like the one described in the new study could first be used to treat pathology and cognitive impairment in Down Syndrome patients, before its eventual application for Alzheimer's.

Those living with Down Syndrome carry a gene defect on chromosome 21 that allows for rapid and definitive diagnosis. The fact that this pool of patients will go on to develop AD with high probability makes them ideal subjects for clinical trials involving DYRK1-inhibiting drugs. Such an approach promises to avoid the pitfalls currently involved in testing preventive treatments for Alzheimer's disease, which would need to be administered years or even decades before the onset of symptoms in patients of uncertain prognosis.

Targeting an enigmatic killer

The ability of DYRK1 inhibitors to halt or significantly delay both major AD-associated pathologies caused by amyloid beta and tau offers renewed hope for effective treatment of Alzheimer's and may hold the key to addressing other devastating afflictions linked to hyperphosphorylation by DYRK1.

Hulme expresses excitement about rapid advancements in this area. "A challenging in-house design effort driven by several UA graduate students over the last seven years, most recently Christopher Foley, has successfully unearthed newer drugs that are incredibly selective, much more stable and much more potent," he says. "If such drugs deliver on their early promise, they may eventually be used as a common prophylactic against neurodegenerative diseases, perhaps like current medications for the prevention of heart disease."

The pressing need for an effective AD therapy could not be more acute. Dementia currently affects nearly 50 million people, with a new case arising somewhere in the world every three seconds. The majority have Alzheimer's disease, the most common form of dementia, which accounts for around 75% of cases. Barring major advances in treatment, the number of cases is projected to skyrocket to 131.5 million by mid-century.

On a more hopeful note, because AD is primarily a disease of old age, it has been estimated that a therapy capable of delaying the onset by just five years would cut the number of cases globally by half. The research outlined in the current study offers an innovative approach to this urgent medical crisis.

Credit: 
Arizona State University

The risk of death from yellow fever can be detected sooner

image: A study has identified markers capable of predicting mortality in patients with symptoms of yellow fever, potentially helping to prevent the development of severe conditions.

Image: 
Bernardo Portella / Arca Fiocruz

Approximately ten out of every 100 people who are bitten by mosquitoes infected with the yellow fever virus develop symptoms of the disease. Although most infected people never develop the disease, it is lethal to approximately 40% of those who do.

Outbreaks of yellow fever have occurred in Brazil in recent years despite the existence of a safe and efficacious vaccine since 1938. Even after more than a century of research, scientists and physicians were unaware of any specific clinical and laboratory predictors of mortality that could help prioritize admission to hospital intensive care units, which would be extremely useful given that patients with yellow fever often deteriorate rapidly.

"Many patients admitted to the health services with a diagnosis of yellow fever aren't initially severely ill. They may even walk into the hospital. However, within days they can become very sick indeed, and a fairly significant proportion die," said Esper Kallás, Full Professor in the Department of Infectious and Parasitic Diseases at the University of São Paulo's Medical School (FM-USP) in Brazil.

"We had no previous knowledge of markers that could be used by medical teams to assess each patient's prognosis and identify those most likely to develop severe illness to treat them accordingly, improving the probability of recovery," Kallás said.

These markers have now been identified. They are described in an article in The Lancet Infectious Diseases published by Kallás and 19 other researchers affiliated with FM-USP, the university's Tropical Medicine Institute (IMT-USP), the São Paulo Emílio Ribas Infectology Institute (IIER-SP), and DASA S.A., a Brazilian medical diagnostics firm (owner of the Delboni Auriemo and Lavoisier laboratory chains).

The study was supported by São Paulo Research Foundation - FAPESP as part of a project led by Ester Sabino, director of IMT-USP and a professor in FM-USP's Department of Infectious and Parasitic Diseases.

The main aim of the study was to identify predictors of mortality measurable during hospitalization. The researchers focused on patients with suspected yellow fever admitted to the Hospital das Clínicas, FM-USP's general hospital, and to IIER-SP, both in São Paulo city, during the 2018 outbreak of the disease.

Between January 11 and May 10, 2018, 118 and 113 patients with suspected yellow fever were admitted to the Hospital das Clínicas and IIER-SP, respectively. After screening to confirm the diagnosis, 76 patients (68 men and eight women) with yellow fever were included in the study.

The diagnosis was based on detectable yellow fever virus RNA in the blood (74 patients) or yellow fever virus confirmed in autopsy reports (two patients). Of the 76 patients, 27 (36%) died within 60 days of hospital admission.

"Infection by yellow fever virus was confirmed using real-time PCR [polymerase chain reaction] testing of blood samples taken at admission or from tissue during autopsy. We performed whole-genome sequencing on the yellow fever viruses obtained from the infected individuals and assessed demographic, clinical and lab test data at admission. We investigated whether any of these measures correlated with patient death," Kallás said.

Disease markers

The researchers found that the older the patient, the more severe the yellow fever illness tended to be. "This is intuitive. It makes sense that older people suffer more and are likely to have a worse prognosis. There's a high probability that the condition in elderly patients will worsen," Kallás said.

A high neutrophil count, high hepatic aspartate aminotransferase (AST) and high viral load at hospital admission were also significantly associated with mortality. Neutrophils are white blood cells (or polymorphonuclear leukocytes) that play an essential role in the innate immune system.

All 11 (100%) of the patients with neutrophil counts of 4,000 cells per microliter (µL) or greater and viral loads of 5.1 log10 copies/mL (approximately 125,000 copies of the virus per milliliter of blood) or greater died, compared with only three (11%) of the 27 patients with neutrophil counts below 4,000 cells per µL and viral loads below 5.1 log10 copies/mL.
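The joint cutoff above is simple arithmetic on two numbers. A minimal sketch follows; the function names (`log10_to_copies`, `high_risk`) are ours, not the paper's, and this is a unit-conversion illustration rather than a clinical tool. It expresses neutrophil counts in cells per microliter, the usual clinical unit.

```python
# Illustrative risk stratification using the cutoffs reported in the study.
NEUTROPHIL_CUTOFF = 4000   # cells per microliter
VIRAL_LOAD_CUTOFF = 5.1    # log10 copies per milliliter

def log10_to_copies(log10_load):
    """Convert a log10 viral load into copies per milliliter."""
    return 10 ** log10_load

def high_risk(neutrophils, viral_load_log10):
    """True when both markers are at or above the study's cutoffs."""
    return neutrophils >= NEUTROPHIL_CUTOFF and viral_load_log10 >= VIRAL_LOAD_CUTOFF

# 10**5.1 is roughly 125,900, matching the "approximately 125,000" above
print(round(log10_to_copies(5.1)))
print(high_risk(4500, 5.4))   # both markers elevated
print(high_risk(3000, 4.8))   # both below cutoff
```

The conversion shows why a seemingly small jump on the log10 scale matters: each whole unit is a tenfold increase in circulating virus.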

"The organism may be trying to combat something other than the yellow fever virus," Kallás said. "Our hypothesis is that the multiplication of the virus in intestinal cells releases gut bacteria into the bloodstream. This may be the factor that activates the immune system and increases neutrophil production. Another possibility is that the patient's immune system becomes unbalanced and the condition is made more severe as a result."

Increased viral load was one of the markers of severity identified by the study. "As in the case of advancing age, it seems logical that the patient's prognosis worsens as the viral load increases, but it's the first time anyone has verified this in a scientific study," Kallás said.

On the other hand, the researchers found that yellowish skin color was not a marker of severity at the time of hospital admission.

"Yellowish skin color resulting from the destruction of liver cells by the virus appears only in the most severe cases. In our study, it wasn't present at hospitalization in any of the patients who later died," Kallás said.

According to Sabino, the study represents significant progress by enabling physicians "in the event of an outbreak of yellow fever such as the one now under way in Brazil, the worst in decades, to screen patients on arrival at health services to identify those who could potentially develop severe illness. Early admission to an intensive care unit increases the chances of survival."

Early diagnosis

Despite decades of research on yellow fever, markers of mortality had never been identified, even for patients in the most developed countries. "The major yellow fever epidemics that occurred in the advanced industrialized countries, which have more medical and scientific resources to identify such markers, happened decades ago, mostly before development of the vaccine, which began to be tested 80 years ago, before World War Two," Kallás said.

In 2017, at the onset of the latest outbreak of yellow fever in Brazil, Kallás, Sabino and collaborators conducted a follow-up study of patients with dengue, chikungunya and Zika in an attempt to predict the transmission and distribution of these diseases, all of which are caused by arboviruses (arthropod-borne viruses).

"When the first signs of a yellow fever outbreak were detected, we quickly realized that we were in an ideal position to add yellow fever to the scope of our research with the aim of identifying factors that predicted the severity of the disease. Collaboration between IIER-SP and FM-USP's hospital was crucial to making this contribution possible," Kallás said.

The identification of prognostic markers can help clinicians prioritize admission to intensive care units, as a patient's condition often deteriorates swiftly.

"Put yourself in the shoes of a doctor examining a patient who's been diagnosed with yellow fever and has just been hospitalized," he said. "Until now, there was no way of knowing the patient's probable prognosis. The medical staff had to wait and see whether the patient's condition would deteriorate, and this typically happens very fast. Our findings will help clinical staff understand what happens to patients. A patient with all the markers of severity at the moment of admission runs the greatest risk of dying, so staff will be able to set priorities and allocate patients to intensive care rapidly."

At the same time, resource allocation can be improved to prioritize laboratory tests that will be able to determine more accurately whether the patients are likely to get better or worse.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Greener, faster and cheaper way to make patterned metals for solar cells and electronics

image: Dr Ross Hatton examining a patterned metal.

Image: 
University of Warwick

Patterning metals for electronics and solar cells can be slow and expensive, and can involve toxic chemicals.

Scientists from the Department of Chemistry at the University of Warwick have developed a way to make patterned films of silver and copper (the two most conductive metals) using cheap organofluorine compounds and without using toxic chemicals.

This method is more sustainable and potentially much cheaper because it uses an extremely thin printed layer of organofluorine to prevent metal deposition, so metal is only deposited where it is needed.

It can be used to make electrodes for flexible solar panels, next-generation sensors and low-emissivity glass.

An innovative way to pattern metals has been discovered by scientists in the Department of Chemistry at the University of Warwick, which could make the next generation of solar panels more sustainable and cheaper.

Silver and copper are the most widely used electrical conductors in modern electronics and solar cells. However, conventional methods of patterning these metals to make the desired pattern of conducting lines are based on selectively removing metal from a film by etching using harmful chemicals or printing from costly metal inks.

Scientists from the Department of Chemistry at the University of Warwick, have developed a way of patterning these metals that is likely to prove much more sustainable and cheaper for large scale production, because there is no metal waste or use of toxic chemicals, and the fabrication method is compatible with continuous roll-to-roll processing.

The work is reported in the paper 'Selective deposition of silver and copper films by condensation coefficient modulation' published as an advanced article on 13th August in the journal Materials Horizons.

Thanks to £1.15 million in funding from the UK Engineering and Physical Sciences Research Council, Dr Ross Hatton and Dr Silvia Varagnolo have discovered that silver and copper do not condense onto extremely thin films of certain highly fluorinated organic compounds when the metal is deposited by simple thermal evaporation.

Thermal evaporation is already widely used on a large scale to make the thin metal film on the inside of crisp packets, and organofluorine compounds are already commonplace as the basis of non-stick cooking pans.

The researchers have shown that the organofluorine layer need only be 10 billionths of a metre (10 nanometres) thick to be effective, so only tiny amounts are needed.

This unconventional approach also leaves the metal surface uncontaminated, which Hatton believes will be particularly important for next-generation sensors, which often require uncontaminated patterned films of these metals as platforms onto which sensing molecules can be attached.

To help address the challenges posed by climate change, there is a need for colour-tuneable, flexible and lightweight solar cells that can be produced at low cost, particularly for applications where conventional rigid silicon solar cells are unsuitable, such as in electric cars and semi-transparent solar cells for buildings.

Solar cells based on thin films of organic, perovskite or nano-crystal semiconductors all have potential to meet this need, although they all require a low cost, flexible transparent electrode. Hatton and his team have used their method to fabricate semi-transparent organic solar cells in which the top silver electrode is patterned with millions of tiny apertures per square centimetre, which cannot be achieved by any other scalable means directly on top of an organic electronic device.

Dr Hatton from the Department of Chemistry at the University of Warwick comments:

"This innovation enables us to realise the dream of truly flexible, transparent electrodes matched to needs of the emerging generation of thin film solar cells, as well as having numerous other potential applications ranging from sensors to low-emissivity glass."

Credit: 
University of Warwick

Breakthrough in understanding of magnetic monopoles could signal new technologies

A breakthrough in understanding how the quasi-particles known as magnetic monopoles behave could lead to the development of new technologies based on moving magnetic monopoles rather than electric charges.

Researchers at the University of Kent applied a combination of quantum and classical physics to investigate how magnetic atoms interact with each other to form composite objects known as 'magnetic monopoles'.

Basing the study on materials known as spin ices, the team showed how the 'hop' of a monopole from one site in the crystal lattice of a spin ice to the next can be achieved by flipping the direction of a single magnetic atom.

Although in theory at low temperatures the magnetic atoms do not have enough energy to do this, the team found that as a monopole arrives in a lattice site, it induces changes in the fields acting on the magnetic atoms surrounding it which enable them to 'tunnel' through the energy barrier.

Dr Quintanilla of the University's School of Physical Sciences said: 'We found evidence that this mysterious low-temperature hopping is achieved through quantum tunnelling: a phenomenon that allows a quantum object to overcome an obstacle which would, according to the classical laws of physics, require more energy than the system has available to it.

'We showed that the magnetic atoms forming a monopole experience fields that are transverse to their own, which in turn induce the tunnelling. We compute the monopole hopping rates resulting from this scenario and find them to be broadly consistent with available observations.'

The researchers suggest that this better understanding of monopole motion in spin ice materials may enable future technologies based on moving magnetic monopoles, rather than electric charges.

Credit: 
University of Kent

Exercise associated with benefit to patients with advanced colorectal cancer

image: Dr. Meyerhardt is the clinical director of the Gastrointestinal Cancer Center at Dana-Farber Cancer Institute and the senior author of the study.

Image: 
Dana-Farber Cancer Institute

First study to examine the association of physical activity with patient survival in advanced, metastatic colorectal cancer

Even low-intensity exercise was associated with a nearly 20 percent reduction in the risk of disease progression or death

BOSTON - Patients with metastatic colorectal cancer who engaged in moderate exercise while undergoing chemotherapy tended to have delayed progression of their disease and fewer severe side effects from treatment, according to the results of a new study.

Even low-intensity exercise, such as walking four or more hours a week, was associated with a nearly 20 percent reduction in cancer progression or death over the course of the six-year study, said researchers from Dana-Farber Cancer Institute and Brigham and Women's Hospital, reporting in the Journal of Clinical Oncology. The analysis hinted at a possible lengthening of survival in patients who reported greater physical activity, but the data were not statistically significant.

"What we found was that people who engaged in some type of physical activity had a statistically significant improvement in progression-free survival (PFS)," said Jeffrey A. Meyerhardt, MD, MPH, of Dana-Farber, senior author of the study. "While there may also be an impact on overall survival, it was not statistically significant - and should be studied further."

"Physically active patients in our study also appeared to tolerate chemotherapy better," said Brendan Guercio, MD, first author of the study which was conducted while he was working as a hospitalist at Brigham and Women's. "Total physical activity equivalent to 30 or more minutes of moderate daily activity was associated with a 27 percent reduction in severe treatment-related toxicities."

Previous studies have found that regular exercise can reduce the risk of disease recurrence and death from colon cancer that has not metastasized to other parts of the body. Researchers say this is the first study to examine associations of physical activity with survival in advanced, metastatic colorectal cancer. Patients participated in a phase 3 study of chemotherapy for advanced colorectal cancer conducted by the Alliance for Clinical Trials in Oncology and sponsored by the National Cancer Institute. Within a month after beginning treatment, patients were invited to complete a questionnaire about their average physical activity over the previous two months. The final analysis included 1,218 patients. While the data are significant, further research with a randomized prospective trial will help validate the results, the researchers said.

Based on the patients' descriptions, researchers quantified their physical activity in terms of metabolic equivalent task (MET)-hours per week - a standard measure used in research studies of exercise. Vigorous activity was defined as any activity requiring six or more METs, such as running, biking, tennis, skiing, or lap swimming. Non-vigorous activities included walking, climbing stairs, or yoga.
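The MET-hour bookkeeping described above can be sketched in a few lines. The MET values here are typical compendium figures chosen for illustration (the study's exact assignments are not given in this article), and the function name is hypothetical:

```python
# Sketch of MET-hour accounting. One MET-hour = doing a 1-MET activity
# for one hour; weekly totals sum (MET value x hours) over activities.
MET_VALUES = {
    "walking": 3.0,   # non-vigorous (< 6 METs)
    "yoga": 2.5,
    "running": 9.8,   # vigorous (>= 6 METs)
    "tennis": 7.3,
}

def met_hours_per_week(activities):
    """Sum MET-hours for a list of (activity, hours_per_week) pairs."""
    return sum(MET_VALUES[name] * hours for name, hours in activities)

# Walking four hours a week at ~3 METs gives 12 MET-hours -
# well short of the study's 18 MET-hour high-activity group.
weekly = met_hours_per_week([("walking", 4)])
print(weekly)
print(weekly >= 18)
```

Under these assumed values, the "walking four or more hours a week" group in the article corresponds to roughly 12 or more MET-hours per week.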

Analysis of the data revealed a statistically significant difference in PFS - the length of time after the patient completed the questionnaire before the cancer progressed or the person died. The difference in PFS was almost 20 percent in favor of those who exercised more.

Meyerhardt added that the findings "help justify encouraging patients to exercise and referring patients to physical therapists or programs like the YMCA LIVESTRONG program that does small group training for patients with cancer."

The analysis also found that patients who engaged in 18 or more MET-hours per week of activity had a 15 percent improvement in overall survival (time to death from any cause) compared with patients who engaged in less than three MET-hours per week of activity. However, that difference was not statistically significant, meaning it could have resulted from chance.

Credit: 
Dana-Farber Cancer Institute

Improved sewage treatment has increased biodiversity over past 30 years

image: The River Ray in Wiltshire is downstream from Swindon's large wastewater treatment plant.

Image: 
CEH

A higher standard of wastewater treatment in the UK has been linked to substantial improvements in a river's biodiversity over the past 30 years, ensuring a welcome success story for wildlife, say scientists.

The Centre for Ecology & Hydrology analysed data from the regular monitoring of both chemicals and invertebrates in the River Ray in Wiltshire by the Environment Agency and its predecessors between 1977 and 2016. This Thames tributary is downstream from Swindon's large wastewater treatment plant.

The Defra-funded study found that, since 1991, there has been a steady increase in both the diversity and abundance of freshwater invertebrates, which play a vital and varied role in an ecosystem's food chain.

The water is cleaner due to a reduction in ammonia (a chemical present in human sewage that is potentially toxic to animals) plus an increase in oxygen levels (as a result of less organic matter being discharged into the river).

The findings, published in the journal Environmental Toxicology & Chemistry, echo other research indicating that biodiversity has increased in many rivers across the UK. This latest analysis, which carefully examines four decades of chemistry and invertebrate data, offers an explanation for why this has happened.

Professor Andrew Johnson of the Centre for Ecology & Hydrology, who led the study, explains: "There was a marked increase in the diversity and abundance of freshwater invertebrates on the River Ray immediately after 1991 and there has been a steady improvement since then. Therefore, we have identified Thames Water's investment in improved treatment to comply with the EU Urban Wastewater Directive, which was adopted that year, as the crucial turning point."

In practice, the implementation of the directive in the UK [3] meant many large wastewater treatment plants, including that at Swindon, had to switch to the 'activated sludge' process, which is a more efficient way of dealing with large amounts of sewage water than the older 'trickling filter' system.

Professor Johnson says the previously lower standard of wastewater treatment, exacerbated by the increase in sewage produced by rapidly growing populations of towns like Swindon, had a negative impact on biodiversity. Many UK rivers had a low abundance of invertebrates and fish back in the 1960s, 70s and 80s.

Professor Johnson adds: "The fact there has been a continual increase in biodiversity in the Ray despite it being a small river taking the entire treated wastewater of a large town of 200,000 residents is extremely encouraging. It indicates that even for rivers with a very high wastewater content their fortunes can be turned around.

"It is wonderful to think that people walking by the Ray are seeing returning species of damselflies and caddisflies that they weren't seeing 30 years ago."

The privatisation of the UK water industry in 1989 meant the new water companies could borrow money to invest in their treatment plants to comply with higher standards, overseen by independent regulators. The industry spent £26 billion on various wastewater improvements between 1990 and 2015, according to Defra figures.

The European Union had been considering the merits of charcoal filtering as an additional, final stage of the wastewater treatment process to remove pharmaceuticals - a measure that could cost a total of £30 billion to introduce at treatment plants in England and Wales [4].

However, the analysis by CEH found the trial introduction of charcoal filtering at Swindon between 2008 and 2014 resulted in no significant increase in the diversity and abundance of freshwater invertebrates, over and above the existing improving trend.

Credit: 
UK Centre for Ecology & Hydrology