The long-term effects of disasters on seniors with diabetes: evidence from Hurricanes Katrina and Rita

TAMPA, Fla. (September 23, 2019) -- Older individuals and those with chronic conditions are especially at risk following natural disasters. Researchers at the University of South Florida (USF) investigated the short- and long-term effects of Hurricanes Katrina and Rita on older individuals with diabetes. They found those who lived in areas impacted by the 2005 hurricanes had a 40% higher one-month mortality rate than those who lived in unaffected counties. The increased risk persisted even ten years later, at which point the affected individuals had a 6% higher mortality rate.

The study, published in Diabetes Care, focused on 170,328 people aged 65-99 with type 1 or type 2 diabetes. The sample was created by gathering data on Medicare beneficiaries who lived in Louisiana, Mississippi, Texas and Alabama from the end of 2004 through at least August 2005, when Hurricane Katrina made landfall. Individuals were assigned to the affected group if they lived in counties designated for "Individual Assistance," a classification used by the Federal Emergency Management Agency (FEMA) for the counties most affected by disasters. Researchers followed the individuals through December 2014 and cross-referenced their mortality status using the National Death Index.

"We found that generally the difference in mortality between the affected and unaffected groups dissipated over time," said Troy Quast, PhD, associate professor of health economics in the USF College of Public Health. "However, we found that this trend did not apply to those in the affected group who moved to a different county after the hurricanes."

While their analysis could not identify specific factors that caused the mortality difference between the affected and unaffected groups, Quast noted that, after disasters, people with diabetes may face disrupted access to health care providers, damaged or lost medications, and difficulty monitoring glucose levels. In earlier research, Quast found that seniors with diabetes who were impacted by the storms were less likely to obtain routine tests and blood screens and had higher hospitalization rates.

Quast stated that the finding that those in the affected group who moved experienced sustained higher mortality is especially noteworthy. He suggested that additional research could help determine whether the difference arose because people who remained in their homes typically had a stronger support system and provider care network, because the choice to move itself led to worse health outcomes, or because of some combination of the two factors. Such research could provide guidance to officials after future disasters.

Credit: 
University of South Florida

Affordable Care Act slashed the uninsured rate among people with diabetes

MADISON, WIS. - The Affordable Care Act (ACA) provided health insurance for an estimated 1.9 million people with diabetes, according to a newly published study.

In 2009 and 2010, 17 percent of adults under the age of 65 who had diabetes were uninsured. After the ACA took effect, that percentage declined by 12 percentage points overall, and by 27 percentage points among those with low income.

Coverage gains were particularly strong among people whose diabetes was undiagnosed. In 2009 and 2010, approximately one in four adults under age 65 with undiagnosed diabetes lacked health insurance coverage. After the ACA was implemented, the uninsured rate in this group dropped by 17 percentage points to eight percent.
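These declines are measured in percentage points (pp), i.e. absolute changes in the uninsured rate rather than relative ones. Taking "one in four" as approximately 25 percent, the arithmetic behind the quoted rates is:

$$17\% - 12\ \text{pp} = 5\% \ \ \text{(all adults with diabetes)}, \qquad 25\% - 17\ \text{pp} = 8\% \ \ \text{(undiagnosed)}$$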

Rebecca Myerson, assistant professor of population health sciences at the University of Wisconsin School of Medicine and Public Health, was the principal author of the study, which was conducted while she was on the faculty at the School of Pharmacy and the Schaeffer Center for Health Policy and Economics at the University of Southern California. The findings were published in the journal Diabetes Care.

The research team analyzed information from 11 years of the National Health and Nutrition Examination Survey (NHANES), which gathers nationally representative data on the civilian population. The biennial survey includes biomarkers such as HbA1c, a measure of blood-sugar control. Using the NHANES data allowed the researchers to identify those with undiagnosed diabetes.

The sample used in the study included 2,400 US citizens (ages 26-64) with diabetes, defined as an HbA1c level of 6.5 percent or greater or a diagnosis of diabetes by a health-care professional. Prior studies have found that about one-third of all U.S. adults with diabetes do not know they have it.
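For readers who want the inclusion rule stated precisely, here is a minimal sketch in Python; the function and field names are ours, not the study's, and the actual NHANES variables are coded differently.

```python
def meets_study_definition(hba1c_percent: float, diagnosed: bool) -> bool:
    """Study definition of diabetes: HbA1c >= 6.5% or a professional diagnosis."""
    return hba1c_percent >= 6.5 or diagnosed

# Undiagnosed diabetes is captured by the biomarker alone:
print(meets_study_definition(7.1, diagnosed=False))  # True
print(meets_study_definition(5.9, diagnosed=False))  # False
```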

"Insurance coverage can change the health trajectory of people with diabetes by providing access to diagnosis and treatment," said Myerson. "But just as importantly, increasing coverage rates can also enhance health equity, because people with undiagnosed or untreated diabetes disproportionately belong to underserved groups."

The researchers estimate that, of the 1.9 million people with diabetes who gained coverage under the ACA, 1.2 million had low income (defined in the study as below 138 percent of the federal poverty level).

Credit: 
University of Wisconsin-Madison

Synchronised or independent neurons: this is how the brain encodes information

image: Individual neurons do not always act as separate units but participate in processes where they display similar electrical activity. Thanks to an innovative approach that combines predictive mathematical models and laboratory experiments, a study by SISSA has shed light on several mechanisms behind this phenomenon.

Image: 
Colin Behrens on Pixabay

"Like a book in which the single pages are not all different but carry small portions of common text, or like a group of people who whistle a very similar tune": this is how our brain cells work, say scientists. It is the phenomenon of "co-relation", in which individual neurons do not always act as independent units in receiving and transmitting information but as a group of individuals with similar and simultaneous actions. Observing the electrical activity of these cells in the laboratory, together with the use of computerized mathematical models, a group of researchers led by Professor Michele Giugliano of SISSA has shed light for the first time on the cellular mechanisms behind these correlations. In the study, the scientists examined excitatory neurons, those intended to promote the electrical activity of other neurons, and inhibitory neurons, intended to suppress their activity. "Our discovery tells us that excitatory cells tend to prefer individuality and to reduce the redundancy of their own messages, while the inhibitory cells act together as one. This allows us to add a new piece to the understanding of how neurons organize information in the brain. The information is always represented by the electrical activity of groups of cells "explains Professor Giugliano. The study, which has seen the involvement of SISSA of Trieste and the University of Antwerp, Belgium, and of Pittsburgh, USA, has just been published in The Journal of Neuroscience.

Predicting the function of neurons with mathematics

The study, the authors explain, was carried out with a combination of mathematical models designed to predict the electrical behaviour of neurons and direct observations of neurons in the laboratory. This integrated approach is valuable for several reasons. Giugliano explains, "Accompanying an experimental discovery with a theory makes its impact stronger. Furthermore, as our models are the simplest possible, we can focus our understanding on the biological mechanisms rather than describing only their effects. This approach is behind the current progress in neuroscience and opens up very interesting fronts in brain research."

Understanding the neural code used by the brain

The redundancy in the electrical activity of neurons in the cerebral cortex, also called co-variability, has been known for some time. The similarity in behaviour concerns both the inputs exchanged among neurons and their outputs, namely the responses to incoming messages. What is not fully known, says Giugliano, "is how similar inputs generate similar outputs, in terms of cellular mechanisms. Therefore clarifying these mechanisms and exploring how different types of cells participate in this phenomenon is a fundamental step in fully understanding the complex circuits of the brain." He continues, "Studying the biophysical characteristics of this phenomenon, we have noticed that the excitatory neurons tend to discourage the redundancy of their outputs, perhaps because their action must be unambiguous and more informative, considering that their messages are outgoing from the cortex. The inhibitory neurons, instead, do the opposite. These cells tend to work together, using this redundancy to synchronise and amplify their effects."

The most accurate experimental validation of a mathematical theory

"We are still unable to fully explain the role of correlations in the brain, but we have certainly discovered that these two categories of neurons must be seen in a new light: they are not identical." However, there is more. Giugliano concludes, "This research is the most accurate experimental validation of a simple mathematical theory, which describes how the electrical activity of neurons emerges and varies over time. This also represents a significant achievement of this study".

Credit: 
Scuola Internazionale Superiore di Studi Avanzati

Karla crater confirmed to be an impact structure

The Karla crater, one of about 150 large impact structures on Earth, is situated near the border between the Republic of Tatarstan and the Chuvash Republic, about 163 kilometers from Kazan Federal University.

Previous research at the site took place in the 1980s. Now, Russian and French scientists have conducted new investigations.

"During our field research near Buinsk, Tatarstan, the impact nature of the Karla structure was confirmed. A number of paleomagnetic, petromagnetic and geochemical samples were collected," said project leader Natalia Bezaeva (invited employee of Kazan Federal University).

Junior Research Associate Dilyara Kuzina added, "The project as a whole is dedicated to studying magnetic properties of space objects, such as craters and tektites."

Credit: 
Kazan Federal University

Green tea could hold the key to reducing antibiotic resistance

Scientists at the University of Surrey have discovered that a natural antioxidant commonly found in green tea can help eliminate antibiotic resistant bacteria.

The study, published in the Journal of Medical Microbiology, found that epigallocatechin gallate (EGCG) can restore the activity of aztreonam, an antibiotic commonly used to treat infections caused by the bacterial pathogen Pseudomonas aeruginosa.

P. aeruginosa is associated with serious respiratory tract and bloodstream infections and in recent years has become resistant to many major classes of antibiotics. Currently a combination of antibiotics is used to fight P. aeruginosa.

However, these infections are becoming increasingly difficult to treat, as resistance to last-line antibiotics is being observed.

To assess the synergy of EGCG and aztreonam, researchers conducted in vitro tests to analyse how the two agents interacted with P. aeruginosa, individually and in combination. The Surrey team found that the combination of aztreonam and EGCG was significantly more effective at reducing P. aeruginosa numbers than either agent alone.

This synergistic activity was also confirmed in vivo using Galleria mellonella (Greater Wax Moth larvae), with survival rates being significantly higher in those treated with the combination than those treated with EGCG or aztreonam alone. Furthermore, minimal to no toxicity was observed in human skin cells and in Galleria mellonella larvae.

Researchers believe that in P. aeruginosa, EGCG may facilitate increased uptake of aztreonam by increasing permeability in the bacteria. Another potential mechanism is EGCG's interference with a biochemical pathway linked to antibiotic susceptibility.

Lead author Dr Jonathan Betts, Senior Research Fellow in the School of Veterinary Medicine at the University of Surrey, said:

"Antimicrobial resistance (AMR) is a serious threat to global public health. Without effective antibiotics, the success of medical treatments will be compromised. We urgently need to develop novel antibiotics in the fight against AMR. Natural products such as EGCG, used in combination with currently licenced antibiotics, may be a way of improving their effectiveness and clinically useful lifespan."

Professor Roberto La Ragione, Head of the Department of Pathology and Infectious Diseases in the School of Veterinary Medicine at the University of Surrey, said:

"The World Health Organisation has listed antibiotic resistant Pseudomonas aeruginosa as a critical threat to human health. We have shown that we can successfully eliminate such threats with the use of natural products, in combination with antibiotics already in use. Further development of these alternatives to antibiotics may allow them to be used in clinical settings in the future."

Credit: 
University of Surrey

New AI app predicts climate change stress for farmers in Africa

video: PlantVillage Nuru can predict near-term crop productivity for farmers in Africa and may help them protect their staple crops in the face of climate warming.

Image: 
David Hughes, Penn State

UNIVERSITY PARK, Pa. -- A new artificial intelligence (AI) tool available for free in a smartphone app can predict near-term crop productivity for farmers in Africa and may help them protect their staple crops -- such as maize, cassava and beans -- in the face of climate warming, according to Penn State researchers. The team will unveil the new tool -- which will work with their existing AI assistant, called "PlantVillage Nuru" -- to coincide with the United Nations Climate Action Summit to be held today (Sept. 23) at the U.N. Headquarters in New York City.

"Hundreds of millions of African farmers are already suffering from the effects of climate change," said David Hughes, associate professor of entomology and biology. "For example, earlier this year, which has been the hottest year on record, Mozambique was hit with two cyclones, both among the strongest ever recorded in East Africa. They caused almost $1 billion in damages and destroyed nearly 80 percent of staple crops throughout the region. They also changed rainfall patterns across East Africa, which further affected the crops."

Hughes added that the majority of these farmers remain unprepared for the climate change impacts that are yet to come. For instance, many of the more than 95 percent of African farmers who rely solely on rain to irrigate their crops will be unable to deal with the increasing drought conditions that are expected, he said.

PlantVillage Nuru is an existing AI assistant that is being used across Africa to diagnose crop diseases. The researchers have rigorously tested the performance of their machine-learning models with locally sourced smartphones in the typical high light and temperature settings of an African farm. In these tests, the app was shown to be twice as good as human experts at making accurate diagnoses, and it increased the ability of farmers to discover problems on their own farms.

Now PlantVillage Nuru can draw in data from the United Nations' WaPOR (Water Productivity through Open access of Remotely sensed derived data) portal, a database that integrates 10 years' worth of satellite-derived data from NASA and computes relevant metrics for crop productivity given the available water. PlantVillage Nuru also incorporates weather forecast data, a soil dataset for Africa, and the United Nations Crop Calendar, which is a series of algorithms on adaptive measures that can be taken under certain conditions. The PlantVillage AI tool incorporates tens of thousands of data points across Africa, with hundreds more being collected every day. All of these data are freely available to the global community, which can work collectively to improve the AI.

Specifically, the AI assistant can integrate diverse data streams to provide information about, for example, the drought tolerance of crops and which crops are suitable for which areas. In addition, the app offers advice that could help farmers learn about climate-resilient crop varieties, affordable irrigation methods, and flood-mitigation and soil-conservation strategies, among other best practices.

Although the tool is smartphone-based, it can be accessed through a webpage to inform diverse stakeholders. In Kenya, the PlantVillage AI tool informs messages that are then sent out by SMS to phones across the country. According to Hughes, the team hopes that over time, and with the right support, it will be able to extend that service to all of Africa, potentially helping millions of farmers prepare for climate warming.

"Our goal is to nudge behavioral changes that will help farmers prepare their farms to be climate ready," said Hughes. "There are proactive behaviors, such as planting for increased crop diversity, promoting soil moisture conservation and engaging in water harvesting, that are known to increase resiliency. Our AI tool is in the early stages, but it will get better over time and with more training. We are releasing it now so we can kick-start the necessary collaboration we need to help African farmers adapt to climate change. As the African proverb says: 'If you want to go fast, go alone. If you want to go far, go together.' Climate change means we must act together to help those most in need."

Credit: 
Penn State

West African camera survey details human pressures on mammals in protected areas

ANN ARBOR--When University of Michigan wildlife ecologist Nyeema Harris started her multiyear camera survey of West African wildlife, she sought to understand interactions between mammals and people in protected areas such as national parks.

She expected those interactions to include lots of poaching. Instead, livestock grazing and the gathering of forest products were among the most common human-related activities her cameras captured, while poaching was actually the rarest.

"The common narrative in conservation, particularly in the tropics and African savannas, is the persistent threat of poaching and bushmeat hunting on wildlife and environmental integrity," said Harris, an assistant professor in the U-M Department of Ecology and Evolutionary Biology and director of the department's Applied Wildlife Ecology Lab, which is known as the AWE lab.

"Therefore, we expected to document heavy poaching activity in our camera survey," she said. "Instead, livestock grazing was the dominant human pressure, and the gathering of forest products--such as wood, grasses and fruit--was the primary human activity that we observed in these protected areas.

"Historically, the impetus for protected areas was exclusively to promote species persistence and to maintain biological processes and unique landscape features. More recently, contributions or benefits from protected areas have extended beyond an environmental perspective to a more inclusive social consideration."

Harris' study is the first wildlife camera survey in the West African countries of Burkina Faso and Niger. It documented human pressures on mammal communities in three national parks that are part of the largest protected area complex in West Africa, the W-Arly-Pendjari, or WAP.

Harris said the study's findings highlight the need to incorporate livestock husbandry into management plans for the WAP, which is a UNESCO World Heritage site. One possibility might be the creation of travel corridors for livestock herders to minimize impacts on wild mammals.

"As the first camera study in Burkina Faso and Niger, we documented extensive human activities with social benefits co-occurring within the fragile mammal community of West Africa," Harris and her colleagues wrote in a paper published online July 29 in the journal Conservation Letters.

"Our findings constitute a crucial step in shifting the WAP complex from the singular and arguably outdated mandate of nature conservation to a more dynamic coupled human-natural ecosystem approach of sustainable, integrated management."

West Africa is a region high in both biodiversity and environmental challenges. Deforestation is rapidly reducing wildlife habitat throughout the region, and urban development is expected to exacerbate the problem in the future. Furthermore, climate projections for the region predict contractions in the distribution of savanna and the loss of tree cover in coastal areas.

The U-M-led camera survey was conducted between 2016 and 2018 in three national parks established in 1954: Arly National Park, Park W-Burkina Faso and Park W-Niger. Harris' research team included government and university partners in Burkina Faso and Niger, as well as early career professionals from those countries and U-M students.

To assess human pressure on, and interactions with, mammal communities in the parks, the researchers strapped motion-triggered digital cameras to the trunks of trees in the parks, with a total of 119 camera stations deployed across the study.

The study yielded more than a million images, making it the largest non-invasive wildlife study for the region. The images included pictures of 33 wild mammal species, as well as shots of people and their domesticated animals: cattle, goats, donkeys and dogs.

Hooved animals (ungulates) represented 72-85% of the wild mammal species across the surveys, while carnivores represented 7-17%. The cheetah was the rarest species photographed, and seven other animals of conservation concern also triggered the camera shutters, including leopards, hippos, red-fronted gazelles and critically endangered West African lions.

Harris and her colleagues defined human pressure on wild mammal behavior as either direct or indirect. People appeared in more than 3,300 of the photos and were placed into one of four categories of direct human pressure based on their observed activities: gathering, herding, poaching and transiting/recreation. People carrying harvested animals, fishing nets, guns or other weapons were categorized as poachers.

Photos showing cattle, goats, donkeys or dogs--but without any people in the frame--were categorized as indirect evidence of human pressure on wild mammals.

To gauge the impact of direct and indirect human activity on wild mammal behavior, Harris and her colleagues compared the number of wild mammal photos captured in places where people were known to be present to the "triggering activity" in places without any people.
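In outline, that comparison can be sketched as follows; the records and numbers here are invented for illustration, and the study's actual analysis used more sophisticated statistical models.

```python
from collections import defaultdict

# Hypothetical camera-station summaries: (station_id, humans_detected, wild_mammal_triggers)
stations = [
    ("cam01", True, 40),
    ("cam02", True, 35),
    ("cam03", False, 90),
    ("cam04", False, 75),
]

# Average wild-mammal triggering activity per station, stratified by human presence.
totals = defaultdict(lambda: [0, 0])  # stratum -> [trigger_sum, station_count]
for _, humans, triggers in stations:
    key = "people present" if humans else "people absent"
    totals[key][0] += triggers
    totals[key][1] += 1

for stratum, (trigger_sum, n) in totals.items():
    print(f"{stratum}: {trigger_sum / n:.1f} wild-mammal triggers per station")
```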

Overall, they found reduced activity of wild mammals in places where people or their domesticated animals were present.

Harris and her colleagues had expected that carnivores would be most sensitive to human pressure and most likely to change their behavior in places where people were present. Surprisingly, ungulates were the most sensitive group, a result the scientists attribute to high levels of cattle grazing throughout the WAP complex and the resulting competition for food between livestock and wild ungulates.

Interestingly, human pressures did not impact the diversity (i.e., the number of species) or the composition (i.e., the types of species) of mammal communities in the WAP complex.

Credit: 
University of Michigan

Faults' hot streaks and slumps could change earthquake hazard assessments

image: The ancient Israeli city of Susita was destroyed in 749 AD. Fallen columns all pointing in the same direction indicate the damage was due to an earthquake on the Dead Sea transform fault. The earthquake history on this fault is one of the world's longest.

Image: 
Seth Stein

Phoenix, Arizona, USA: For more than a century, a guiding principle in seismology has been that earthquakes recur at semi-regular intervals according to a "seismic cycle." In this model, strain that gradually accumulates along a locked fault is completely released in a large earthquake. Recently, however, seismologists have realized that earthquakes often occur in clusters separated by gaps, and one research group now argues that the probability of a tremor's recurrence depends upon whether a cluster is ongoing -- or over.

On Monday, 23 Sept. 2019, at the GSA Annual Meeting in Phoenix, Seth Stein, the Deering Professor of Geological Sciences at Northwestern University, will present a new model that he and his co-authors believe better explains the complexity of the "supercycles" that have been observed in long-term earthquake records. "One way to think about this is that faults have hot streaks -- earthquake clusters -- as well as slumps -- earthquake gaps -- just like sports teams," says Stein.

In the traditional concept of the seismic cycle, Stein explains, the likelihood of a large earthquake depends solely upon the amount of time that has elapsed since the most recent large tremor reset the system. In this simple case, he says, the fault has only a "short-term memory."

"The only thing that matters," says Stein, "is when the last big earthquake was. The clock is reset every time there's a big event."

But this model is not realistic, he argues. "We would never predict the performance of a sports team based on how they performed during their previous game," says Stein. "The rest of the season is likely to be much more useful."

Geologists sometimes see long-term patterns in paleoseismic records that the seismic-cycle model can't explain. In these cases, says Stein, "Not all the accumulated strain has been released after one big earthquake, so these systems have what we call 'long-term memories.'"

To get a sense of how a system with Long-Term Fault Memory would function, the researchers sampled 1,300-year windows -- a period of time for which geologists might reasonably have a record available -- from simulated 50,000-year paleoseismic records. The results indicate that earthquake recurrence intervals looked very different depending upon which 1,300-year window the scientists examined.

Because there are random elements involved, says Stein, there are windows when the recurrence intervals appear to be periodic, and other times when they look clustered. "But the fault hasn't changed its properties," he says. Eventually, the model predicts that the earthquakes will release much of the accumulated strain, at which point the system will reset and the fault's "streak" will end.

According to this Long-Term Fault Memory model, the probability of an earthquake's occurrence is controlled by the strain stored on the fault. This depends on two parameters: the rate at which strain accumulates along the fault, and how much strain is released after each big earthquake. "The usual earthquake-cycle model assumes that only the last quake matters," says Stein, "whereas in the new model, earlier quakes have an effect, and this history influences the probability of an earthquake in the future." After a big quake, he says, there can still be lots of strain left, so the fault will be on a hot streak. Eventually, however, most of the strain is released and the fault goes into a slump.
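A toy simulation conveys the idea. Under the stated assumptions, strain accumulates at a constant rate, each large quake releases only a fixed fraction of the stored strain, and the chance of a quake in any year grows with the strain on the fault. The functional form and parameter values below are illustrative guesses, not the authors' model.

```python
import random

random.seed(1)

RATE = 1.0          # strain accumulated per year (arbitrary units)
RELEASE_FRAC = 0.7  # fraction of stored strain released by each big quake
SCALE = 500.0       # strain level at which quakes become likely

def simulate(years=50_000):
    """Return the years in which large earthquakes occur."""
    strain, quakes = 0.0, []
    for year in range(years):
        strain += RATE
        # Quake probability rises with stored strain. Because each quake
        # releases only part of the strain, history matters: long-term memory.
        if random.random() < strain / (strain + SCALE):
            quakes.append(year)
            strain *= 1.0 - RELEASE_FRAC
    return quakes

quakes = simulate()
gaps = [b - a for a, b in zip(quakes, quakes[1:])]
print(f"{len(quakes)} quakes; mean gap {sum(gaps) / len(gaps):.0f} years")

# What a palaeoseismologist effectively sees: one 1,300-year window.
window = [q for q in quakes if 10_000 <= q < 11_300]
print(f"{len(window)} events in this particular 1,300-year window")
```

Re-running with different windows or different seeds yields stretches that look periodic and stretches that look clustered, even though the fault's parameters never change, which is the behaviour described above.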

Ultimately, says Stein, the earthquake hazard depends upon whether a fault is in a slump or a streak. "Depending on which of those assumptions you make," he says, "you can get the earthquake probability much higher or much lower."

Seismologists have not yet come up with a compelling way to determine whether a fault is -- or is not -- in a cluster. As a result, says Stein, "There's a much larger uncertainty in estimates of the probability of an earthquake than people have been wanting to admit."

Credit: 
Geological Society of America

What color were fossil animals?

video: HKU palaeontologists evaluate fossil color reconstruction methods to propose new study framework.

Image: 
The University of Hong Kong

Dr Michael Pittman of the Vertebrate Palaeontology Laboratory, Department of Earth Sciences, The University of Hong Kong led an international study with his PhD student Mr Arindam Roy that evaluates fossil colour reconstruction methods to propose a new study framework that improves and expands current practice. The paper was recently published in the journal Biological Reviews.

"People are fascinated by the colour and pattern of dinosaurs and other extinct animals because these aspects can tell you so much about an animal. Just think of a zebra and a peacock. We evaluated everything we know about fossil and modern animal colour and used that knowledge to propose a framework to improve how we reconstruct fossil colour in the future." said Dr Pittman.

Colour and patterns are critical to understanding the life, ecology, physiology and behaviour of animals. These colours are produced when light interacts with pigments and the structure of animal tissue. Common naturally-occurring animal pigments include melanin, carotenoids, porphyrins, pterins, flavins and psittacofulvins, which produce colours ranging from black and grey to yellow, orange and green (Figure 1).

Feathered dinosaur fossils instrumental to understanding the origin of birds were the first animal fossils to yield evidence of melanin, the colour pigment we also have in our eyes and hair (Figures 2, 3). In the last ten years, colour patterns have been reconstructed in over 30 fossil animals including birds, non-avialan dinosaurs and mammals, providing a unique opportunity to test ecological and behavioral hypotheses that were previously out of reach. Unfortunately, our knowledge of other pigments is scarce in the fossil record as these non-melanin pigments are more difficult to fossilise. This incomplete knowledge and the lack of a standard study approach have been prevailing challenges to the reconstruction of colour in fossil animals.

Co-author Dr Evan Saitta of the Field Museum of Natural History, Chicago, USA said, "We are in the golden age of multidisciplinary techniques in palaeontology. This is the first comprehensive study that not only critically evaluates the currently available methods, but also provides a reliable and repeatable framework that covers all vertebrate pigment systems not just melanin alone."

The new palaeocolour reconstruction framework proposed by Dr Michael Pittman, Mr Arindam Roy and their international team (Figure 4) comprises four main steps: (1) map the known or suspected extent of preserved colour and patterns in the specimen; (2) search for pigment-bearing microstructures using electron microscopy (e.g. microstructure shape can be used to identify melanin-based colours like black, grey and brown); (3) if melanin-based colours are not detected, use high-end chemical analysis techniques to detect biomarkers of other pigments; and (4) use reconstructed colours and patterns to test fundamental hypotheses related to animal physiology, ecology and behaviour. The new framework overcomes past challenges by incorporating the chemical signatures of different pigments, large- and small-scale anatomical details visible in fossils, and the potential for different pigments to fossilise. This framework provides background context for the evolution of colour-producing mechanisms and is expected to encourage future efforts to reconstruct colour in more fossil animals, including non-dinosaur reptiles and mammals.

Mr Roy, the study's first author and a Hong Kong PhD Fellow said, "I am really excited by the course we have charted in this review study as I will be tackling many of the key issues we identified during my PhD studies at HKU."

Credit: 
The University of Hong Kong

Scientists decode DNA of coral and all its microscopic supporters

image: UQ's Dr Steven Robbins said the research may aid in the revival of the world's embattled coral reefs.

Image: 
Anna Roik

Scientists have seen for the first time how corals collaborate with other microscopic life to build and grow.

A study led by The University of Queensland and James Cook University reveals at the DNA level how coral interacts with partners like algae and bacteria to share resources and build healthy, resilient coral.

UQ's Dr Steven Robbins said the research may aid in the revival of the world's embattled coral reefs.

"Symbiotic relationships are incredibly important for thriving corals," Dr Robbins said.

"The mostly striking example of this is coral bleaching, where corals expel their algal symbiotic partners at higher-than-normal water temperatures," he said.

"As algae make up the bulk of the coral's food through photosynthesis, the coral will die if temperatures don't cool enough to allow symbiosis to re-establish.

"It's possible that equally important interactions are happening between corals and their bacteria and single-cell microorganisms (archaea), but we just don't know.

"To properly manage our reefs, we need to understand how these relationships work, and genomics is one of our best tools.

"Genomics allows us to look at each organism's entire library of genes, helping us work out how coral symbiotic relationships might support coral health."

In the research, the scientists took a sample of the coral Porites lutea from a reef near Orpheus Island, just north of Townsville.

In the lab, they separated the coral animal, its algal partner and all associated microbes, sequencing each organism's DNA.

"Once we've sequenced the genomes, we use computer algorithms to look at the entire library of genes that each organism has to work with," Dr Robbins said.

"This allows us to answer questions like, 'what nutrients does the coral need, but not make itself?'"

Associate Professor David Bourne, from JCU and the Australian Institute of Marine Science (AIMS), said that having high-quality genomes for a coral and its microbial partners is hugely important.

"Large advances in human medicine have been achieved in the past 20 years since the human genome was decoded," he said.

"Over the next 20 years, similar knowledge of corals and how they function will emerge - this data set represents a foundation for that.

"For the first time, we now have the genomes of a large number of the microbes that make up this coral, which is incredibly important for their survival.

"It's truly ground-breaking - this is the blueprint for coral and their symbiotic communities."

The researchers hope the research may help imperilled coral reefs globally.

"Our coral reefs support incredible diversity and when we lose reefs, we lose far more than corals," Dr Robbins said.

"There are many threats to coral, but climate change is the most existential for our reefs.

"In 2016 and 2017, nearly 50 per cent of all corals on the Great Barrier Reef died, and we don't see this trajectory reversing if carbon emissions remain at current levels.

"But, as scientists we can try to understand what makes corals tick to devise ways to make them more resilient, and we're delighted to have added to that body of knowledge."

Credit: 
University of Queensland

Do the costs of cancer drugs receive enough attention?

A recent analysis from Canada found that information on health-related quality of life is often not collected for investigational cancer drugs or used to calculate the balance of costs and benefits of these drugs when they are submitted for reimbursement, according to findings published early online in CANCER, a peer-reviewed journal of the American Cancer Society.

Both the effectiveness and the expense of a medication are important in determining its value and whether the cost will be reimbursed. This is often considered in terms of the cost per quality-adjusted life-year (QALY). One QALY equates to one year in perfect health.

In Canada, recommendations for reimbursement come from the Canadian Agency for Drugs and Technologies in Health, and specifically from its pan-Canadian Oncology Drug Review (pCODR) group. While Canada has no explicit upper threshold for the cost of each QALY gained, a common standard of $50,000 Canadian dollars per QALY is often used. In other words, a new technology associated with a cost per QALY of less than $50,000 Canadian dollars is likely to be reimbursed.
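The decision statistic behind that threshold is the incremental cost-effectiveness ratio (ICER): the extra cost of the new drug divided by the extra QALYs it delivers. A minimal sketch with invented numbers:

```python
def icer(extra_cost: float, extra_qalys: float) -> float:
    """Incremental cost-effectiveness ratio: dollars per QALY gained."""
    return extra_cost / extra_qalys

# Hypothetical drug: CAD 60,000 more than standard care, 0.5 extra QALY.
ratio = icer(60_000, 0.5)
threshold = 50_000  # the common Canadian benchmark cited above
print(f"${ratio:,.0f} per QALY;", "below" if ratio <= threshold else "above", "the common threshold")
```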

Clinical trials were not originally designed to address reimbursement decisions, but trial results are currently being used for this purpose. In addition, trials have not traditionally reported on data that are meaningful to patients. To determine whether recent cancer drug trials collect such information, Adam Raymakers, PhD, of the Canadian Centre for Applied Research in Cancer Control, and his colleagues reviewed drug manufacturers' submissions to pCODR between 2015 and 2018. They looked to see whether information on health-related quality of life was collected alongside cancer drug trials and used to calculate QALYs in analyses submitted to pCODR for reimbursement recommendations.

Among the 43 submissions that were evaluated by pCODR, the gain in QALYs in most submissions was small, and in almost two-thirds (65 percent) of cases, the submitter's best estimate of cost-effectiveness of the drug was in excess of $100,000 per QALY. More than half (56 percent) of submissions did not include original data on health-related quality of life, with most relying instead on evidence from previous studies.

"It is important to bring attention to the idea that when drug companies/manufacturers are talking about improvements from new and expensive drugs, they might not actually be meaningful improvements or they may not be improvements that are valued by patients. Patients and the public should understand that it can often be the case that these drugs might confer little to no meaningful benefit, at a substantial cost," said Dr. Raymakers. "If drug prices continue to rise, and are to be reimbursed by insurance companies or publicly funded systems, these drugs must offer benefit relative to their costs. Benefit should not be an abstract measure but rather one that is valued by patients."

Credit: 
Wiley

Study confirms Monterey Bay Aquarium surrogate-reared sea otters helped restore threatened population

image: Rescued southern sea otter pup 327 with surrogate mother at Monterey Bay Aquarium

Image: 
Monterey Bay Aquarium

The population of threatened southern sea otters in Elkhorn Slough, an estuary in Central California, has made a significant comeback as a result of Monterey Bay Aquarium's Sea Otter Program. A newly-published study in Oryx--The International Journal of Conservation documents 15 years of research showing how the program helped restore the population in the coastal estuary, with surrogate-reared otters and their descendants accounting for more than 50 percent of observed population growth during that period. The study's findings also demonstrate the potential benefits of reintroducing otters into other California estuaries where otter populations once thrived.

"If otters do great things, and there are places missing otters, and we now have a way to change that, why wouldn't we want to do it? Let's fix this," said Dr. Kyle Van Houtan, chief scientist at Monterey Bay Aquarium.

Monterey Bay Aquarium began placing rescued sea otters in Elkhorn Slough as part of its collaboration with state and federal authorities to restore the threatened southern sea otter population. The species was nearly hunted to extinction in California during the fur trade in the 18th and 19th centuries. The state's population has slowly increased to just over 3,000 sea otters between Santa Cruz and Santa Barbara, but the otters have yet to return to their full historical range from Alaska, down the coast of California to Mexico's Baja California.

Monterey Bay Aquarium is the only facility with a sea otter surrogacy program, in which non-releasable females raise rescued pups for return to the wild. The program enabled the aquarium team to conduct this first case study using the method to boost a wild population. The aquarium chose Elkhorn Slough for the study because it offered relatively abundant prey resources and accessible vantage points and waterways from which to monitor released otters. In addition, the location already supported a small population of male sea otters. Study data were based on the releases of 37 sea otter pups rescued by the Sea Otter Program between 2002 and 2016. These otters were cared for using the aquarium's resident sea otters as surrogate mothers, and then released as juveniles into Elkhorn Slough.

Key findings of the study include:

- Surrogate-reared otters and their wild offspring accounted for more than half of Elkhorn Slough's sea otter population growth during the 15 years of the study.
- Surrogate-reared otters survived and reproduced at rates comparable to their wild-reared kin.
- Despite all pups originally stranding in open-coast habitats, 90 percent of surrogate-reared females and 80 percent of males remained in Elkhorn Slough after release.
- The release of surrogate-reared juveniles into coastal estuaries may be an effective way to rebuild sea otter populations in those waters and restore ecosystem health.

Aquarium researchers say the young sea otters were able to establish their new home in Elkhorn Slough because they were ecologically naïve to the estuary.

"They just hadn't been alive long enough to establish any kind of home range. In many cases, they probably stranded the same day they were born," said Karl Mayer, sea otter field response coordinator at Monterey Bay Aquarium and lead author of the study.

Van Houtan and Mayer point to the case of one of the rescued pups -- sea otter 327 -- as emblematic of the study. Otter 327 was orphaned at three days old after washing ashore at Morro Strand State Beach in 2005. She was brought to the aquarium and raised by a surrogate mother, learning how to groom and forage, along with the other skills necessary to survive in the wild. She was ultimately released into Elkhorn Slough, where she has since given birth and reared her own pups in the wild.

The increased sea otter population in Elkhorn Slough also brought many ecosystem benefits, documented in other studies, which helped restore the degraded estuary -- a vital spawning habitat for many fish species and an important part of the Pacific Flyway for migratory birds. In particular, sea otter predation on various species of crabs allowed eelgrass beds and the ecological communities they support to recover and thrive.

"There are many locations along the California coast where sea otters were not only historically present, but would also benefit from having otters return there," explains Van Houtan. "We're just starting to see the extensive and positive impacts associated with a growing and healthy otter population."

The aquarium is now evaluating Morro Bay as a potential site along the Central Coast for the reintroduction of sea otters. Morro Bay currently supports a small population of resident otters, and the area is within the species' current existing range. Like Elkhorn Slough before the return of sea otters, its eelgrass beds are in poor condition.

"Trying to duplicate the program's success in Morro Bay makes sense as a next step," said Mayer. "We want to show that this is not just unique to Elkhorn Slough."

Credit: 
Monterey Bay Aquarium

Industry has unduly influenced TV advertising regs on restricting unhealthy kids' foods

Industry has unduly influenced the regulations for TV advertising of unhealthy foods to children, likely weakening legislation in this area, argue doctors in an analysis, published in the online journal BMJ Open.

The UK broadcast regulator, Ofcom, which has a duty to protect commercial broadcast interests, shouldn't therefore lead on what is essentially a public health issue, they conclude.

They base their conclusions on a thematic analysis of the responses from interested parties (stakeholders) to the public consultation on proposals to strengthen the rules on TV advertising of foods aimed at children, which eventually took effect in January 2009.

This consultation, which was led by the UK broadcast regulator in 2006-07, received 1136 responses, 139 of which came from advertisers; broadcasters; campaigners; food manufacturers, retailers, and industry representatives; politicians; and public health doctors/advocates.

The analysis focused on these responses, as those from the general public broadly supported restrictions without addressing specific issues of implementation.

The authors wanted to explore how the development of public health policy is influenced by different stakeholders, and would have liked to analyse the responses to the 2016 public consultation on the sugar tax levied on the soft drinks industry. But the UK government has not made these publicly available.

The original proposals would have restricted the volume of advertising for all foods, applied only to children's channels, and covered 4 to 9 year olds. Any restrictions were to have started in April 2007.

Most of the 139 responses argued that restrictions should apply to foods high in fat, sugar, and salt, rather than a blanket ban, and that volume curbs would make no difference. Ofcom agreed.

Public health doctors/advocates and campaigners also argued that the legal and medical definition of a child in the UK is anyone under the age of 16, and that teens are equally susceptible to marketing tactics, however media savvy they are. Ofcom agreed.

But respondents disagreed as to whether restrictions should apply only to children's channels.

Public health clinicians/advocates and campaigners said that children would also watch adult TV, and recommended an outright ban on all unhealthy food advertising before the 9 pm 'watershed.'

Broadcasters and advertisers feared this would disproportionately affect advertising revenues, impinge on adult viewing, and would have only marginal public health benefits if introduced.

Ofcom accepted this argument, despite not having consulted specifically on this point, and despite its own research showing that such a move would reduce children's exposure to adverts for foods high in fat, sugar and salt by 82%.

Public health doctors and campaigners said that restrictions should be imposed as soon as possible. But children's channels argued that they should be allowed a transitional period as they expected to take a financial hit. Ofcom agreed.

The authors conclude that although concessions were made to the public health camp, "ultimately, industry arguments appeared to hold more sway...Ofcom appeared to believe that the commercial impact of the regulation of advertising should carry greatest weight, even when the aim of the regulation was to protect children's health."

They point out that Ofcom retains direct responsibility for advertising scheduling policy under current legislation. "This then begs the question of whether a governmental body with a duty to protect broadcasting interests should be leading on public health legislation," they write.

"This conflict between Ofcom's duties to the public and to broadcasters may have resulted in eventual restrictions that did not appear to alter the level of exposure of children to [foods high in fat, sugar and salt] advertising," they explain.

While the regulator considered the balance between commercial and public interests...it didn't "appear to consider the cost to the economy of poor health that could stem from a lack of appropriate restrictions."

This analysis concerns only one consultation, and there may be other ways in which public health interests can exert influence over policy, acknowledge the authors.

Nevertheless, the issue of TV advertising of unhealthy foods is highly politically sensitive and at the top of the public health strategy agenda for tackling obesity, they point out. Yet it seems vulnerable to commercial interests, they suggest.

Credit: 
BMJ Group

Chromosomal changes implicated in disease linked to social and economic disadvantage

Chromosomal changes implicated in disease are linked to social and economic disadvantage, finds a study of 473 families, published online in the Journal of Medical Genetics.

The findings add to the evidence on the biological factors that may be involved in disadvantage, say the researchers.

Chromosomes are the thread-like structures found in the nuclei of all living cells. They carry genetic information in the form of genes.

It has long been thought that genetic conditions may affect the social and economic status of patients and their families, but few studies have tested this theory, say the researchers.

Socioeconomic status--the measure of a person or family's social and economic standing--is based on several factors, including household income, education, and employment. It is strongly implicated in health and disease.

The researchers focused on a particular type of genetic change implicated in disease. The changes, known as pathogenic copy number variants, describe extra or missing sections of genetic material.

They studied the anonymised results of chromosome tests, performed between 2010 and 2017, mostly from children suspected of having a genetic condition. These were taken from a database of more than 17,000 DNA samples at the Manchester Centre for Genomic Medicine.

In all, 473 cases of chromosomal changes implicated in disease were found for which information on residential postcode and heritability was also known.

The postcodes were then mapped to the Index of Multiple Deprivation (IMD), an official measure that ranks every area of England by level of deprivation.

This showed that people with these particular changes tended to live in more deprived areas.

When the results were divided into those changes inherited from a parent and those which had arisen spontaneously, they found that people with the inherited changes were more likely to live in areas of even greater deprivation.
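The shape of that comparison can be sketched as below; the case records are invented, and in England's IMD a lower rank means a more deprived area.

```python
from statistics import median

# Hypothetical cases: (change_inherited, IMD rank of home area; lower = more deprived)
cases = [
    (True, 4_500), (True, 7_800), (True, 3_100),
    (False, 14_200), (False, 9_900), (False, 12_100),
]

inherited = [rank for inh, rank in cases if inh]
spontaneous = [rank for inh, rank in cases if not inh]

# The study's pattern: inherited changes cluster in more deprived areas.
print("median IMD rank, inherited changes:  ", median(inherited))
print("median IMD rank, spontaneous changes:", median(spontaneous))
```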

This is an observational study, and as such, can't establish cause. But the findings show that even in the absence of a medical diagnosis, chromosomal changes implicated in disease may have a discernible effect on socioeconomic status, suggest the researchers.

"Lower socioeconomic status in families with medically relevant inherited pathogenic and likely pathogenic [genetic changes] with milder phenotype [lower risk of disease] could therefore be due to cumulative multigenerational consequences of these subclinical effects," they write.

Credit: 
BMJ Group

Numbers limit how accurately digital computers model chaos

image: "The Great Floating Point Wave" in homage to Hokusai's "The Great Wave Off Kanagawa"

Image: 
P V Coveney, H S C Martin & Charu G

The study, published today in Advanced Theory and Simulations, shows that digital computers cannot reliably reproduce the behaviour of 'chaotic systems', which are widespread. This fundamental limitation could have implications for high performance computing (HPC) and for applications of machine learning to HPC.

Professor Peter Coveney, Director of the UCL Centre for Computational Science and study co-author, said: "Our work shows that the behaviour of the chaotic dynamical systems is richer than any digital computer can capture. Chaos is more commonplace than many people may realise and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact. Ultimately, computers can't simulate everything."

The team investigated the impact of using floating-point arithmetic - a method standardised by the IEEE and used since the 1950s to approximate real numbers on digital computers.

Digital computers use only rational numbers, ones that can be expressed as fractions. Moreover, the denominator of these fractions must be a power of two, such as 2, 4, 8 or 16. There are infinitely more real numbers that cannot be expressed this way.

In the present work, the scientists used all four billion of these single-precision floating-point numbers that range from plus to minus infinity. The fact that the numbers are not distributed uniformly may also contribute to some of the inaccuracies.

First author, Professor Bruce Boghosian (Tufts University), said: "The four billion single-precision floating-point numbers that digital computers use are spread unevenly, so there are as many such numbers between 0.125 and 0.25, as there are between 0.25 and 0.5, as there are between 0.5 and 1.0. It is amazing that they are able to simulate real-world chaotic events as well as they do. But even so, we are now aware that this simplification does not accurately represent the complexity of chaotic dynamical systems, and this is a problem for such simulations on all current and future digital computers."
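The uneven spacing is straightforward to verify. Single precision allots the same number of representable values to each "binade" (each interval between consecutive powers of two), so the gap between neighbouring numbers doubles every time the magnitude doubles. A short check, assuming NumPy is available:

```python
import numpy as np

def count_float32_in(lo: float, hi: float) -> int:
    """Count float32 values in [lo, hi) via their ordered bit patterns."""
    a, b = np.array([lo, hi], dtype=np.float32).view(np.int32)
    return int(b - a)

# Each binade holds the same count (8,388,608 values), as the quote describes.
for lo, hi in [(0.125, 0.25), (0.25, 0.5), (0.5, 1.0)]:
    print(f"[{lo}, {hi}): {count_float32_in(lo, hi):,} representable values")

# The absolute gap between neighbouring values doubles with magnitude:
print(np.spacing(np.float32(0.25)), np.spacing(np.float32(0.5)))
```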

The study builds on the work of Edward Lorenz of MIT whose weather simulations using a simple computer model in the 1960s showed that tiny rounding errors in the numbers fed into his computer led to quite different forecasts, which is now known as the 'butterfly effect'.

The team compared the known mathematical reality of a simple one-parameter chaotic system called the 'generalised Bernoulli map' to what digital computers would predict if every one of the available single-precision floating-point numbers were used.

They found that, for some values of the parameter, the computer predictions are totally wrong, whilst for other choices the calculations may appear correct, but deviate by up to 15%.
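The generalised Bernoulli map iterates x -> beta * x mod 1. The beta = 2 case shows how badly things can go: the true dynamics wander over the whole interval forever, but in binary floating point each doubling simply shifts bits out of the significand, so every single-precision trajectory collapses to exactly zero within a few dozen steps. A minimal illustration (our own sketch, not the authors' code):

```python
import numpy as np

def bernoulli_orbit(x0: float, beta: float = 2.0, steps: int = 60):
    """Iterate x -> beta * x mod 1 in single precision."""
    x = np.float32(x0)
    orbit = [float(x)]
    for _ in range(steps):
        x = np.float32(beta * x % 1.0)
        orbit.append(float(x))
    return orbit

orbit = bernoulli_orbit(0.1234567)
print(orbit[:4])   # early iterates of the doubling map
print(orbit[-4:])  # the orbit has collapsed to exactly 0.0
```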

The authors say these pathological results would persist even if double-precision floating-point numbers were used, of which there are vastly more to draw on.

"We use the generalised Bernoulli map as a mathematical representation for many other systems that change chaotically over time, such as those seen across physics, biology and chemistry," explained Professor Coveney. "These are being used to predict important scenarios in climate change, in chemical reactions and in nuclear reactors, for example, so it's imperative that computer-based simulations are now carefully scrutinised."

The team say that their discovery has implications for the field of artificial intelligence, when machine learning is applied to data derived from computer simulations of chaotic dynamical systems, and for those trying to model all kinds of natural processes.

More research is needed to examine the extent to which the use of floating-point arithmetic is causing problems in everyday computational science and modelling and, if errors are found, how to correct them.

Professor Bruce Boghosian and Dr Hongyan Wang are at Tufts University, Medford, Massachusetts, United States (Dr Wang now works at Facebook in Seattle). Professor Peter Coveney of UCL is speaking at an event at the Science Museum tomorrow on the future of quantum computing.

Credit: 
University College London