Culture

Undervalued wilderness areas can cut extinction risk in half

image: The research showed that some wilderness areas, such as those surrounding Madidi National Park in the Bolivian Amazon, play an extraordinary role in their respective regional contexts; their loss would drastically reduce the probability of persistence of biodiversity.

Image: 
Rob Wallace/WCS

Wilderness areas - long known for their intrinsic conservation value - are far more valuable for biodiversity than previously believed, and if conserved, could cut the world's extinction risk in half, according to a new study published in the journal Nature.

Wilderness areas - where human impact has been absent or minimal - are dwindling. The latest maps show that over 3 million square kilometres (1.15 million square miles) of wilderness - an area the size of India - have been destroyed since the 1990s, and that less than 20 percent of the world can still be called wilderness. Many of these areas lie outside national parks and other protected areas. Until now, the direct benefits of wilderness for stopping species extinction were largely unknown.

The study took advantage of BILBI, a new global biodiversity modelling infrastructure developed at CSIRO that provides fine-scale estimates of the probability of species loss around the globe, and integrated it with the latest human footprint map generated by the University of Queensland (UQ), the University of Northern British Columbia and WCS. Through this collaboration, the scientists demonstrated that many wilderness areas today are critical to preventing the loss of terrestrial species in many parts of the world.

"Wilderness areas clearly act as a buffer against extinction risk: the risk of species loss is over twice as high for biological communities found outside wilderness areas. But wilderness habitat makes an even larger contribution. Some species can occur both inside and outside wilderness, and this habitat is essential to support the persistence of many species that otherwise live in degraded environmental conditions," said Moreno Di Marco of CSIRO Land and Water, the lead author of the study.

The research showed the differing roles of wilderness around the world, with some areas playing an extraordinary role in their respective regional contexts, where their loss would drastically reduce the probability of persistence of biodiversity. Examples of such areas include parts of the Arnhem Land in Australia (covered by several indigenous protected areas), areas surrounding Madidi National Park in the Bolivian Amazon, forests in southern British Columbia (which are only partly protected), and savannah areas inside and outside the Zemongo reserve in the Central African Republic.

Said the paper's senior author, James Watson of WCS and UQ: "This research provides the evidence for how essential it is that the global conservation community specifically target protecting Earth's remaining wilderness. These places are being decimated, and beyond being crucial for saving biodiversity, they are essential for abating climate change and for regulating biogeochemical and water cycles, and they ensure the long-term bio-cultural connections of indigenous communities."

The authors argue that a strategic expansion of the global protected-area estate is needed to preserve the irreplaceable wilderness areas that are most at risk, alongside national land-use legislation and the enforcement of business standards for stopping industrial footprints within intact ecosystems. The value of wilderness can no longer be understated in the international biodiversity agenda if nations are truly committed to achieving the Sustainable Development Goals.

WCS has protected the planet's most critical natural strongholds for over a century, leading with an effective science-driven model that is shaped and championed by communities.

Credit: 
Wildlife Conservation Society

Planned roads would be 'dagger in the heart' for Borneo's forests and wildlife

Malaysia's plans to create a Pan-Borneo Highway will severely degrade one of the world's most environmentally imperilled regions, says a research team from Australia and Malaysia.

"This network of highways will cut through some of the last expanses of intact forest in Borneo, greatly increasing pressures from loggers, poachers, farmers and oil-palm plantations," said Professor Bill Laurance, project leader from James Cook University in Australia.

"This would be a nightmare for endangered species such as the Bornean orangutan, clouded leopard and dwarf elephant," said Professor Laurance.

The study focused on new planned highways in the Malaysian state of Sabah, in the north of Borneo, the world's third-largest island.

"Some of the planned highways are relatively benign, but several are flat-out dangerous," said Dr Sean Sloan, lead author of the study and also from James Cook University. "The worst roads, in southern Sabah, would chop up and isolate Sabah's forests from the rest of those in Borneo."

"Slicing up the forests is toxic for large animals, such as elephants, bearded pigs and sun bears, that must migrate seasonally to find enough food or otherwise face starvation," said Professor Laurance.

The new roads would also bisect protected areas in northern Borneo, making them vulnerable to illegal poachers and encroachers, say the researchers.

"Borneo already has many 'empty forests' as a result of severe poaching," said co-author Dr Mohammed Alamgir from James Cook University. "The poachers follow the roads into forests and then use snares and automatic rifles to kill the larger animals."

Road planners in Sabah have suggested that wildlife could use bridge-underpasses beneath roads to move and migrate, but the research team says this is unrealistic.

"When you build a new road you typically get a lot of forest destruction and fires, along with poaching, and that means vulnerable wildlife will largely avoid the area," said Dr Alamgir.

"Relying on underpasses to reduce road impacts is like trying to treat cancer with a band-aid," he said.

"This is the opposite of sustainable development," said Professor Laurance. "Clearly, several of the planned roads would be like plunging a dagger into the heart of Borneo's endangered forests and wildlife."

Credit: 
James Cook University

Rethinking scenario logic for climate policy

Current scenarios used to inform climate policy have a weakness: they typically focus on reaching specific climate goals in 2100, an approach that may encourage risky pathways with long-term negative effects. A new IIASA-led study presents a novel scenario framework that focuses on capping global warming at a maximum level, with either temperature stabilization or reversal thereafter.

Scenarios can be seen as stories of possible futures that allow the description of factors that are difficult to quantify, such as those that influence the interconnected energy-economy-environment system. They are useful in terms of providing a way to explore how the future could evolve, and how today's decisions could affect longer-term systemic outcomes. The resulting information is commonly used by policymakers to inform decisions around issues like the measures we should take to limit global warming to well-below 2°C and preferably even 1.5°C relative to preindustrial levels, as stipulated in the Paris Agreement.

According to the authors of the study published in Nature, however, the vast majority of these scenarios do not actually provide answers to these questions. They explain that the scientific community has been creating scenarios that try to hit a climate target in 2100 without regard for how much warming we would be committing to over the coming decades, or whether that climate target was temporarily exceeded. This contrasts strongly with the near-term climate change impacts and decisions that policymakers and society actually care about. This dissonance became clear to the authors through their involvement as lead authors in the scientific assessments of the Intergovernmental Panel on Climate Change. The current approach has also resulted in a large literature of pathways that favor risky strategies with little emissions reduction in the near term, along with a strong reliance on CO2 removal in the second half of the century. In their study, the researchers resolve this issue and present a new scenario logic in which these important societal value judgments have to be explicitly decided on.

"Once we became aware that virtually all scenarios that are used to inform climate policy are biased towards risky pathways that for no good reason put a disproportionally large burden on younger generations, we started looking for a new logic that could resolve the issue. We were looking for a new way of designing long-term climate change mitigation scenarios that are more closely aligned with the intentions of the UN Paris Agreement on Climate Change," explains Joeri Rogelj, a senior researcher with the IIASA Energy Program and lead author of the study.

The study draws on insights from physical science to propose a new simple mitigation scenario framework that focuses on capping global warming at a specific maximum level with either temperature stabilization or reversal thereafter. It makes intergenerational trade-offs regarding the timing and stringency of mitigation action an explicit design criterion and provides a framework in which future CO2 removal deployment can be explored independently of variations in desired climate outcomes in light of social, technological, or ethical concerns. In this regard, the authors focus on three key issues: the time by which global CO2 emissions become net zero, the total amount of CO2 emitted until then, and the amount of CO2 that is annually removed from the atmosphere by human activities in the far future.

The authors note, however, that achieving net-zero CO2 emissions is not yet sufficient to meet the emission reduction requirements spelled out in the Paris Agreement, which calls for a balance between sinks and sources of all greenhouse gases. The study's proposed scenario logic will allow modelers to translate geophysical and political science insights into a quantitative framework, and it defines how models that simulate the energy-economy-environment system can be used to determine climate change mitigation scenarios in line with current policy discussions. The staged design of the new scenario framework also allows researchers to explore mitigation investment decisions at various points in time, where choices at one time could influence the possibilities available at others.

"This new logic illustrates in a much clearer way how the choices we make as a society about emissions reductions in the next two to three decades determine the maximum level of warming, as well as our reliance on CO2 removal to return global warming to safer levels in the longer term. Interestingly, this logic also shows that once a transformation to a carbon neutral society is achieved, annual investments in the energy sector are pretty much equal to those expected for a world in which we didn't tackle climate change at all. Additional climate change mitigation investments thus play a role in the next decades, but don't have to be a continuous burden on future generations," concludes Rogelj.

Credit: 
International Institute for Applied Systems Analysis

3D virtual reality models help yield better surgical outcomes

Los Angeles - A UCLA-led study has found that using three-dimensional virtual reality models to prepare for kidney tumor surgeries resulted in substantial improvements, including shorter operating times, less blood loss during surgery and a shorter stay in the hospital afterward.

Previous studies involving 3D models have largely asked qualitative questions, such as whether the models gave the surgeons more confidence heading into the operations. This is the first randomized study to quantitatively assess whether the technology improves patient outcomes.

The 3D model provides surgeons with a better visualization of a person's anatomy, allowing them to see the depth and contour of the structure, as opposed to viewing a two-dimensional picture.

The study was published in JAMA Network Open.

"Surgeons have long theorized that using 3D models would result in a better understanding of the patient anatomy, which would improve patient outcomes," said Dr. Joseph Shirk, the study's lead author and a clinical instructor in urology at the David Geffen School of Medicine at UCLA and at the UCLA Jonsson Comprehensive Cancer Center. "But actually seeing evidence of this magnitude, generated by very experienced surgeons from leading medical centers, is an entirely different matter. This tells us that using 3D digital models for cancer surgeries is no longer something we should be considering for the future -- it's something we should be doing now."

In the study, 92 people with kidney tumors at six large teaching hospitals were randomly placed into two groups. Forty-eight were in the control group and 44 were in the intervention group.

For those in the control group, the surgeon prepared for surgery by reviewing the patient's CT or MRI scan only. For those in the intervention group, the surgeon prepared for surgery by reviewing both the CT or MRI scan and the 3D virtual reality model. The 3D models were reviewed by the surgeons from their mobile phones and through a virtual reality headset.

"Visualizing the patient's anatomy in a multicolor 3D format, and particularly in virtual reality, gives the surgeon a much better understanding of key structures and their relationships to each other," Shirk said. "This study was for kidney cancer, but the benefits of using 3D models for surgical planning will translate to many other types of cancer operations, such as prostate, lung, liver and pancreas."

Credit: 
University of California - Los Angeles Health Sciences

Extinct human species gave modern humans an immunity boost

image: This map shows the frequency of the Denisovan TNFAIP3 gene variant in modern human populations of Island South East Asia and Oceania; it is found to be common east of the Wallace Line.

Image: 
Owen Siggs

Findings from the Garvan Institute of Medical Research show modern humans acquired a gene variant from Denisovans that heightened their immune reactions, indicating adaptation of the immune system to a changing environment.

The breakthrough study, published in Nature Immunology, is the first to demonstrate a single DNA sequence variant from an extinct human species that changes the activity of the modern human immune system.

The Denisovans - an extinct human species related to Neanderthals - interbred with modern humans ~50,000 years ago during the migrations of modern humans from Africa to what is now Papua New Guinea and Australia. Today, up to 5% of the genome of people indigenous to Papua New Guinea is derived from Denisovans.

The Garvan study reveals that modern humans acquired a gene variant from Denisovans that increases a range of immune reactions and inflammatory responses - including reactions that protect humans from disease-causing microbes.

"Our study indicates that the Denisovan gene variant heightens the inflammatory response in humans," says co-senior author Associate Professor Shane Grey, who heads the Transplantation Immunology Laboratory at Garvan.

"Previous research has found collections of gene variants from extinct human species that appear to have provided an advantage to humans living at high altitudes or to resist viruses, but has been unable to pinpoint which, if any, were actually functional," he adds. "This study is the first to identify a single, functional variant, and suggests that it also had an evolutionary benefit on the human immune system."

Discovering an immune switch

Harmful versions of a gene called TNFAIP3 have long been associated with the overactive immunity in autoimmune conditions, including inflammatory bowel diseases, arthritis, multiple sclerosis, lupus, psoriasis and type 1 diabetes. The TNFAIP3 gene codes for a protein called A20 that helps 'cool' the immune system by reducing immune reactions to foreign molecules and microbes.

As part of a collaboration between Garvan, the Sydney Children's Hospital, Randwick, the Children's Hospital at Westmead, and the Clinical Immunogenomics Research Consortium of Australasia (CIRCA), the researchers analysed the genomes of families in which one child presented with a severe and unusual autoimmune or inflammatory condition.

"Four separate families had the same DNA variant in the TNFAIP3 gene, changing one amino acid in the A20 protein from an isoleucine to a leucine (I207L)", says Professor Goodnow, Executive Director of the Garvan Institute and co-senior author of the study. "However, the presence of this variant in healthy family members indicated it was not sufficient to cause inflammatory disease on its own."

The researchers extracted immune cells from the families' blood samples, and found that, in cell culture, they produced a stronger inflammatory response than the immune cells of other individuals.

Tracing back immunity

Using datasets made available through the Simons Genome Diversity Project, the Indonesian Genome Diversity Project, Massey University, and the Telethon Kids Institute, which includes genome sequence data on hundreds of diverse human populations, co-first author and Flinders University senior researcher Dr Owen Siggs investigated the worldwide distribution of the TNFAIP3 variant.

The I207L variant carried by the Sydney families was absent from most populations but common in indigenous populations east of the Wallace Line, a deep ocean trench passing between Bali and Lombok and separating Asian fauna to the west from Australian fauna to the east. The I207L variant was common in people throughout Oceania, including people with Indigenous Australian, Melanesian, Maori and Polynesian ancestry.

"The fact that this rare version of the gene was enriched in these populations, and displayed genetic signatures of positive selection, means it was almost certainly beneficial for human health," says Associate Professor Grey.

The team also discovered the I207L variant in the genome sequence of an extinct human species, extracted from a 50,000-year-old finger bone of a Denisovan girl, found inside the Denisova cave in the Altai Mountains of Siberia. "Making that connection was extremely exciting," says Dr Siggs.

The I207L variant was present in two copies in the Denisovan girl but absent from Neanderthal remains from the same cave, indicating that the immunity-enhancing gene variant arose after the divergence of the Denisovan and Neanderthal lineages ~400,000 years ago.

Dialling up the immune system

To investigate the Denisovan gene variant's effects on the immune system, co-first author Dr Nathan Zammit replicated the I207L variant in a mouse model. "When exposed to a pathogenic Coxsackie virus strain - a virus which was originally isolated from a fatal case of human infant infection - mice with the Denisovan variant had stronger immune reactions and resisted the infection better than mice without the Denisovan gene," Dr Zammit explains.

"Our study indicates that the Denisovan variant, and others like it, act on a 'temperature control' dial in the immune system, turning up the temperature to change how we respond to different microbes," says Professor Goodnow.

"It was previously thought that A20, a gene that's central to the immune system, is binary - either it's switched on or off," adds Associate Professor Grey. "We've found it in fact tunes us as individuals to optimal 'Goldilocks points' in between - where immune reactions are neither too hot nor too cold - and that blows the field wide open."

Credit: 
Garvan Institute of Medical Research

Early maternal anemia tied to intellectual disability, ADHD and autism

image: From left: Renee Gardner, project coordinator at the Department of Public Health Sciences at Karolinska Institutet, with student Aline Wiegersma.

Image: 
Photo: Ulf Sirborn

The timing of anemia--a common condition in late pregnancy--can make a big difference for the developing fetus, according to research at Karolinska Institutet published in JAMA Psychiatry. The researchers found a link between early anemia and increased risk of autism, ADHD and intellectual disability in children. Anemia discovered toward the end of pregnancy did not have the same correlation. The findings underscore the importance of early screening for iron status and nutritional counselling.

An estimated 15-20 percent of pregnant women worldwide suffer from iron deficiency anemia, a lowered ability of the blood to carry oxygen that is often caused by a lack of iron. The vast majority of anemia diagnoses are made toward the end of pregnancy, when the rapidly growing fetus takes up a lot of iron from the mother.

In the current study, the researchers examined what impact the timing of an anemia diagnosis had on the fetus' neurodevelopment, in particular if there was an association between an earlier diagnosis in the mother and the risk of intellectual disability (ID), autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD) in the child.

Overall, very few women are diagnosed with anemia early in pregnancy. In this study of nearly 300,000 mothers and more than half a million children born in Sweden between 1987 and 2010, less than 1 percent of all mothers were diagnosed with anemia before the 31st week of pregnancy. Among the 5.8 percent of mothers who were diagnosed with anemia, only 5 percent received their diagnosis early on.
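The prevalence figures above can be cross-checked with a line of arithmetic. As a rough sketch using the rounded percentages quoted in the text:

```python
# Consistency check on the prevalence figures reported in the study
# (rounded percentages as stated in the press release).
anemia_rate = 0.058      # share of mothers diagnosed with anemia at any point
early_fraction = 0.05    # share of those diagnosed before week 31

early_anemia_rate = anemia_rate * early_fraction
print(f"share of all mothers with early anemia: {early_anemia_rate:.3%}")
# roughly 0.29%, consistent with "less than 1 percent of all mothers"
```
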

The researchers found that children born to mothers with anemia diagnosed before the 31st week of pregnancy had a somewhat higher risk of developing autism and ADHD, and a significantly higher risk of intellectual disability, than children of healthy mothers and of mothers diagnosed with anemia later in pregnancy. Among children of the early-anemic mothers, 4.9 percent were diagnosed with autism compared to 3.5 percent of children born to healthy mothers; 9.3 percent were diagnosed with ADHD compared to 7.1 percent; and 3.1 percent were diagnosed with intellectual disability compared to 1.3 percent of children of non-anemic mothers.

After considering other factors such as income level and maternal age, the researchers concluded that the risk of autism in children born to mothers with early anemia was 44 percent higher than in children of non-anemic mothers, the risk of ADHD was 37 percent higher, and the risk of intellectual disability was 120 percent higher. Even when compared to their siblings, children exposed to early maternal anemia were at higher risk of autism and intellectual disability. Importantly, anemia diagnosed after the 30th week of pregnancy was not associated with a higher risk for any of these conditions.
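As a back-of-the-envelope check on those adjusted estimates, the unadjusted risk ratios can be recomputed from the raw percentages reported in the study (a sketch only; the published 44, 37 and 120 percent figures come from models that additionally adjust for income level, maternal age and other covariates):

```python
# Unadjusted risk ratios from the raw percentages reported in the study.
# The published estimates are adjusted, so these ratios only approximate them.
rates = {
    # condition: (% in children of early-anemic mothers,
    #             % in children of non-anemic mothers)
    "autism": (4.9, 3.5),
    "ADHD": (9.3, 7.1),
    "intellectual disability": (3.1, 1.3),
}

for condition, (exposed, unexposed) in rates.items():
    ratio = exposed / unexposed
    print(f"{condition}: unadjusted risk ratio {ratio:.2f} "
          f"({(ratio - 1) * 100:.0f}% higher)")
```

The unadjusted ratios come out to roughly 1.40, 1.31 and 2.38, in the same ballpark as the adjusted 44, 37 and 120 percent increases, with the covariate adjustment accounting for the differences.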

"A diagnosis of anemia earlier in pregnancy might represent a more severe and long-lasting nutrition deficiency for the fetus," says Renee Gardner, project coordinator at the Department of Public Health Sciences at Karolinska Institutet and the study's lead researcher. "Different parts of the brain and nervous system develop at different times during pregnancy, so an earlier exposure to anemia might affect the brain differently compared to a later exposure."

The researchers also noted that early anemia diagnoses were associated with infants being born small for gestational age while later anemia diagnoses were associated with infants being born large for gestational age. Babies born to mothers with late-stage anemia are typically born with a good iron supply unlike babies born to mothers with early anemia.

Although the researchers couldn't disentangle anemia caused by iron deficiency from anemia caused by other factors, iron deficiency is by far the most common cause of anemia. The researchers say the findings could be the result of iron deficiency in the developing brain and may thus support a protective role for iron supplementation in maternity care. The researchers emphasize the importance of early screening for iron status and nutritional counselling but note that more research is needed to find out if early maternal iron supplementation could help reduce the risk of neurodevelopmental disorders in children.

Adult women typically need 15 mg of iron per day, though needs may increase later in pregnancy. Since excessive iron intake can be toxic, pregnant women should discuss their iron intake with their midwife or doctor.

Credit: 
Karolinska Institutet

Low-cost blood pressure drug improves brain function in individuals with autism

image: This is David Beversdorf, MD, professor of radiology, neurology and psychology at the University of Missouri School of Medicine.

Image: 
University of Missouri Health Care

Drugs known as "beta-blockers" have been used since the 1960s as a low-cost, safe and effective means to lower heart rate and control blood pressure. But now researchers from the University of Missouri School of Medicine and the MU Thompson Center for Autism and Neurodevelopmental Disorders have discovered that one such drug, propranolol, could provide cognitive and social benefits for those living with autism spectrum disorder.

One in 59 children in the United States has been diagnosed with a form of autism spectrum disorder, according to the Centers for Disease Control and Prevention. The signs of autism begin in early childhood and can affect individuals differently. However, many with autism share similar symptoms, including difficulties with social communication. That is the core symptom researchers targeted with a pilot study to look at how this drug affected processing of language in the brain.

"Propranolol is used for test anxiety and performance anxiety, so we suspected it might help with social anxiety," said supervising investigator David Beversdorf, MD, professor of radiology, neurology and psychology at the MU School of Medicine and the Thompson Center. "I'd been studying its cognitive advantages and found some interesting benefits in language areas that prove difficult for those with autism. That's why we started this imaging study to understand its effects, and we're finding benefits involving both language and social interaction in single dose pilot studies."

The study involved 13 individuals with autism spectrum disorder and 13 without the disorder, with a mean age of 22.5 years. Each participant completed three MRI brain-imaging sessions after taking either a placebo, the beta-blocker propranolol or the beta-blocker nadolol--which is similar to propranolol except that it does not cross the blood-brain barrier--before being asked to name as many items as possible that belonged in a particular word category during the MRI screening. The work was led by John Hegarty, PhD, who completed it as part of his doctorate in the Interdisciplinary Neuroscience Program at the University of Missouri. In the autism group, Beversdorf's team discovered that propranolol improved performance on the word generation test compared to placebo, and the MRI results revealed that the drug altered regions of the brain associated with word processing and improved task-specific information processing.

"One of the interesting things we found in the autism group was the excessive connectivity in the frontal parietal control network--which affects how your brain allocates resources to other regions--became more similar to the levels of the non-autism group once propranolol was introduced," Beversdorf said. "It's an indicator as to why this drug may prove helpful."

Beversdorf's team is already working on a larger study involving propranolol. They've secured a federal grant from the Department of Defense (DOD) to examine the benefits of the drug on a larger and younger population of autism patients. Treating autism is challenging because of the many subtypes and factors that contribute to the disorder, so this study will monitor factors that might predict who will respond best to the drug.

"It's important to recognize that different individuals are going to respond differently to each approach or medication," Beversdorf said. "It's critical to identify who is going to respond to individual therapies so treatment can be tailored to each patient. We need continued support to do this."

Credit: 
University of Missouri-Columbia

Study quantifies impact of NCI-sponsored trials on clinical cancer care

image: A study shows that 82 of 182 phase 3 clinical trials led by SWOG or by other NCTN groups with SWOG participation were 'practice influential'.

Image: 
National Cancer Institute

A new study shows that nearly half of phase 3 cancer clinical trials carried out by the National Cancer Institute (NCI)-sponsored SWOG Cancer Research Network, one of five groups in NCI's National Clinical Trials Network (NCTN), were associated with clinical care guidelines or new drug approvals. NCI is part of the National Institutes of Health.

The analysis was published in JAMA Network Open and conducted by researchers affiliated with SWOG from several institutions around the country. The study suggests that NCTN trials add value regardless of whether findings were positive or negative. In addition, the authors calculated the cost of running NCTN trials and found that the cost of a U.S. Food and Drug Administration (FDA) approval from an NCTN trial was much less than the cost of an FDA approval from a trial run by pharmaceutical companies.

"We found that the NCTN program contributes clinically meaningful, cost-effective evidence to guide care of cancer patients," said Joseph Unger, Ph.D., a health services researcher and biostatistician for SWOG at the Fred Hutchinson Cancer Research Center, Seattle, and lead author of the study. "These trials are largely funded by the public, which is getting good value for their investment."

The researchers used data from 182 phase 3 trials enrolling 148,028 patients between 1980 and 2017. These included trials that were led by SWOG or that were led by other NCTN groups with SWOG participation. According to the analysis, 82 of the 182 trials, or 45%, were found to be "practice influential," meaning that they influenced cancer care, either by being reflected in the National Comprehensive Cancer Network (NCCN) clinical guidelines or by being associated with a new drug approval by the FDA. Of those 82 practice-influential trials, 70 influenced NCCN guidelines, six influenced new FDA drug approvals, and six influenced both.

"Federally funded cancer treatment trials fill an important gap in clinical research by seeking answers to treatment questions that might not otherwise be explored," said James Doroshow, M.D., director of NCI's Division of Cancer Treatment and Diagnosis, which oversees the NCTN. "This study sheds light on the critical role these trials have in guiding clinical cancer treatment, whether the findings from the trials are positive or negative."

In fact, the influence of negative trial results on cancer care seen in this study surprised the researchers. Of the 82 practice-influential trials identified, 35, or 43%, had negative findings, with nearly half of those 35 trials reaffirming standard of care compared with experimental therapies being tested in the trials. Such negative findings signal to the oncology community which new, and potentially expensive, drugs are not effective. Negative trials can also reveal harmful side effects caused by experimental therapies.

The researchers also sought to estimate the costs of the trials in the study and looked at differences in costs of getting FDA approvals between the publicly funded trials in the study and privately funded trials conducted by pharmaceutical companies, biotech firms, and other industry funders.

They estimated that the total federal investment supporting the trials in the study was $1.36 billion. Spread across the 182 trials, that works out to an average of $7.5 million per completed phase 3 trial, $16.6 million per practice-influential trial, and $123.6 million per new drug approval. In a review of 10 studies of the cost of new drug approvals by industry, the researchers found that the mean inflation-adjusted cost for a single new drug approval was $1.73 billion.
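The per-trial figures above follow directly from the stated totals. A minimal back-of-the-envelope check (note the count of new drug approvals, roughly 11, is implied by the reported per-approval cost rather than stated in the text):

```python
# Sanity-check the per-trial cost figures reported by the study.
# Inputs are the totals stated in the text; the approval count is
# inferred from the $123.6M per-approval figure, not stated directly.

total_federal_investment = 1.36e9   # dollars, total NCTN investment
n_trials = 182                      # completed phase 3 trials
n_practice_influential = 82         # trials that influenced care

cost_per_trial = total_federal_investment / n_trials
cost_per_influential = total_federal_investment / n_practice_influential
implied_approvals = total_federal_investment / 123.6e6

print(f"per completed trial: ${cost_per_trial / 1e6:.1f}M")            # $7.5M
print(f"per practice-influential trial: ${cost_per_influential / 1e6:.1f}M")  # $16.6M
print(f"implied new drug approvals: {implied_approvals:.0f}")          # 11
```

Dividing the industry mean of $1.73 billion per approval by the NCTN's $123.6 million gives the roughly 14-fold cost gap that underlies the study's value-for-taxpayers argument.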

The study authors wrote that this kind of cost comparison is imperfect because pharmaceutical company trials can be more expensive, in part because of regulatory costs. Still, this comparison highlights the value of the NCTN program for taxpayers and the patients and families that benefit, according to Dr. Unger.

"The take-home message from the study is that NCTN studies provide a lot of clinically meaningful evidence for patients that influences their care routinely and does so at a relatively cost-effective level," he said. "It's important that people appreciate just how valuable these trials are in terms of benefit to patients with cancer."

Credit: 
NIH/National Cancer Institute

Searching for the characteristics of award-winning wine

image: Pictured here: Carolyn Ross from Washington State University.

Image: 
WSU

Award-winning wines tend to be more complex, and the best have high ethanol and sugar levels.

That's the finding of a recent paper in the Journal of Wine Research from Washington State University scientists, working with a colleague at the University of Lisbon in Portugal.

The researchers wanted to know what characteristics were prevalent in the wines that won the top awards at an international wine competition.

To find out, they crunched several years of data from the Mundus Vini Challenge, which is held twice a year in Germany.

Their analysis shows large wine challenges tend to favor wines with high ethanol and sugar levels. Flavors often associated with sweetness, including exotic fruits in white wines and dried fruit and spiciness in reds, also increase the chances of winning top prizes.

Conversely, white wines with notes of acidity and astringency, and red wines with green/vegetative and red berry notes, tended not to receive the top awards.

But simply making the wines sweeter, or less vegetal, may not make an award-winning wine.

"Complexity and harmony are hard to define," said Carolyn Ross, WSU professor in the School of Food Science and an author on the paper. "According to the data, you may want to add more exotic fruits, or spiciness. But that may have an impact on the broader attributes of the wine. The fact remains it will always be very impressive to make a wine that wins an award at a prestigious competition."

Previous research has looked at factors like pH level or acidity of award-winning wines, but the complexity of those characteristics made the results hard to quantify simply.

This new data breakdown helped scientists find more specific characteristics, Ross said.

Wine awards can have a huge impact on marketing, so competition at prestigious international events is fierce.

"Some people will decide between two different wines just because one has an award sticker on it," Ross said. "There's a major positive impact for a winery."

Credit: 
Washington State University

The life aquatic made clear with freshwater lens

image: This is a clear liquid optical chamber (CLOC) fixed to a baited remote underwater video (BRUV) system.

Image: 
Swansea University

A Swansea University doctoral student has found a way to view the life of plants and animals in murky waters - by using a lens of freshwater.

Robyn Jones, a PhD student in the College of Science has been testing the lens - a clear liquid optical chamber (CLOC) as a way to improve underwater visibility for scientists researching the flora and fauna that live around our shallow coastal seas. Typically, this equipment will be targeted at assessing fish communities living around marine renewable structures, such as offshore wind, tidal, and wave energy developments, where the waters are typically cloudy.

Robyn said: "It has long been a challenge for scientists to find a way to study biodiversity in these cloudy waters in a way that does not disrupt these sensitive and complex habitats. While underwater cameras do help improve our knowledge of the coastal environment, the main drawback is their restricted use in low visibility environments. However, it is vital for us to be able to examine these areas as they are commonly considered critically important for biodiversity, fisheries, energy and increasingly ecosystem services."

To meet this challenge, Robyn and the team at Ocean Ecology and SEACAMS added a clear liquid optical chamber (CLOC) to a baited remote underwater video (BRUV) system. While similar chambers have previously been used by the marine surveying industry to assess habitats on the deep-water seabed, adding the CLOC to the BRUV system as a way to assess the movement of fish is new.

The system was tested in both controlled and field conditions with results showing a drastic improvement in visibility - with scientists able to identify fish to species level. Dr Richard Unsworth, Swansea University supervisor on the project, said: "Our study showed that by adding the CLOC to a conventional BRUV system, a better level of underwater visibility was possible, which can be used to understand difficult marine and freshwater environments such as muddy estuaries and mangroves."

Robyn said: "With increases in marine renewable developments globally, there is a need to have a simple, reliable, safe, and repeatable method of monitoring the wildlife communities living around these developments. This CLOC-BRUV system will allow scientists around the world to closely monitor these environments while minimising the risk of damage to the seabed infrastructure."

This collaborative partnership with marine consultancy company Ocean Ecology Ltd has enabled the new technology to be tested experimentally in both the field and the lab in order to demonstrate its validity to government environmental regulators.

Ross Griffin at Ocean Ecology stated "This research is a perfect example of industry and academia working together to develop a novel methodology that can be applied in the real world. It's particularly exciting as it presents an approach to better assess the poorly studied marine biological communities found in low visibility environments not just in the UK but globally".

Credit: 
Swansea University

Genetically tailored instruction improves songbird learning

Some recent research suggests that educational achievement can be predicted based on differences in our genes. But does this really mean that genes set limits on an individual's academic potential? Or do these findings just reflect how standardized educational systems reward certain inborn learning styles and aptitudes at the expense of others?

A new UC San Francisco study conducted in songbirds supports the second interpretation, showing that what at first appear to be genetic constraints on birds' song learning abilities could be largely eliminated by tailoring instruction to better match the birds' inborn predispositions.

Education researchers have long advocated for tailoring classroom instruction to the specific learning styles of different students. However, carefully controlled studies showing the benefits of this approach have been inconclusive.

"Untangling the influences of genes and experience on educational achievement in humans is extremely challenging," said Michael Brainard, PhD, a professor of physiology and psychiatry and Howard Hughes Medical Institute investigator in the UCSF Center for Integrative Neuroscience. "The advantage of studying this kind of learning in songbirds is that in our experiments we can carefully control both the genetic background of individual birds and the instruction that they receive."

Male Bengalese finches learn to sing early in life by mimicking their fathers' songs. This results in unique family variants of the species' song being passed down generation-to-generation from fathers to sons. For example, some bird families tend to be slower-than-average crooners while others prefer jauntier up-tempo melodies. Brainard and other researchers have long studied this apparent "cultural learning" as a model of how human children learn language and other complex behaviors from their parents.

When David Mets, PhD, joined Brainard's lab after completing a doctorate in genetics, he wanted to ask a different kind of question: How do genetic predispositions and early life experiences combine to generate an individual's behavior?

In a 2018 study, Mets and Brainard had shown that differences in song tempo between Bengalese finch nests are at least partly genetic: young birds tend to sing at the same tempo as their fathers, even if they have never heard their fathers' song. In their new study, published September 10, 2019 in eLife, Brainard and Mets turned to the question of learning aptitude. They observed that some young birds easily pick up the song of an adult "tutor," while others struggle to match the structure of the songs they hear. Were these apparent differences in learning aptitude also genetic, or was there something more subtle going on?

"Having discovered that there were genetically determined biases in song tempo that differ across families and are heritable, we became interested in the idea that if we could understand these biases, we might be able to harness them to influence learning outcomes," Mets said.

To answer this question, the researchers exposed young birds, which had never heard their father's song, to a computerized tutoring program that played a synthetic, experimentally controllable version of the species' typical song.

The researchers first had the computerized tutoring program present all the birds with a "one-size-fits-all" tutor song that captured the average song structure and tempo found in the Brainard lab's finch colony. They found that only birds from families that had long preferred to sing at this average, intermediate tempo were able to learn this "standardized" song effectively, while birds with a family history of singing faster or slower songs weren't able to pick it up accurately.

In contrast, when the researchers presented birds with a synthesized tutor song tailored to their genetic background -- slower-tempo songs for birds from slow-singing nests, medium tempo for birds from medium-singing nests, and higher tempo for birds from fast-singing nests -- all the birds proved capable of accurately learning the song.

Strikingly, birds from slow-song families, who had performed the worst at picking up the structure of the colony's average-tempo song, did just as well as average-tempo birds -- and better than fast-tempo birds -- when the groups were presented with songs that matched their family histories.

The results suggest that much of what initially seemed like genetically driven differences in learning ability were actually explained by mismatches between birds' genetic predispositions and their early life experiences: Birds who appeared to be worse learners when they were tutored with the average song of the colony turned out to be fully capable of learning well when presented with a stimulus tailored to their family background.

"In this study, we were able to demonstrate the importance of matching instruction to genetics using the simple songs of birds," Mets said. "We think that similar interactions between genetic predispositions and early life experience are likely to be equally important for complex human behaviors."

Brainard, also a member of the UCSF Weill Institute for Neurosciences and the Kavli Institute for Fundamental Neuroscience, added, "Almost everyone agrees that complex traits like learning ability are shaped by both genes and experience, but what's not very widely appreciated is that nature and nurture don't just add up independently -- they interact. We see this in our songbirds where it's the right match between inborn predispositions and early life experience that determines learning outcomes. Understanding the impact of these gene-experience interactions is critical to avoid misinterpretations of human genetic studies."

Mets and Brainard are now pursuing the specific genetic variants that drive differences in finch families' predisposition to learn faster or slower songs, as part of the lab's overarching goal to understand how genes and experience together shape the brain circuits underlying complex behaviors, which ultimately underlie an animal's individuality.

Authors: Mets and Brainard are the sole authors, and co-corresponding authors on the new study.

Credit: 
University of California - San Francisco

Tailored 'cell sheets' to improve post-operative wound closing and healing

image: Tailored cell sheet transplantation.

Image: 
Yokohama National University

Scientists have designed a new method for post-operative wound closing and healing that is both fast and effective. This strategy revolves around engineered "cell sheets" - or layers of skin-based cells. The procedure culminates in a wound dressing that is custom made for a specific cut or lesion that can be used to effectively treat open skin areas after surgeries.

The findings were published in Scientific Reports on July 18th and have the potential to address one of the greatest challenges of post-operative procedures: wound closing and adhesion.

Post-operative care refers to the successful adhesion and closing of wounds created during surgeries. Following operations, wounds have to be closed properly to avoid complications.

Cell sheet engineering was proposed more than a decade ago and has demonstrated successful outcomes in clinical trials for treatment of the esophagus, periodontal tissue, heart, and cornea. The cell sheets that have been engineered to date, while sturdy, take a long time to make, which is not optimal if they are to be placed on open wounds after surgery. Furthermore, these cell sheets have only been effective on flat surfaces, which rules out a majority of surgeries performed on parts of the body that are not flat, such as the intestines.

"We start out with a scan of the surgical site, then design and print a 3D mold of the surface that needs to be covered," says corresponding author Junji Fukuda, a professor at the Faculty of Engineering, Yokohama National University, Yokohama, Japan. "This surface is then coated with a thin gold layer before cells are seeded and grown on the gold-plated mold. The gold coating speeds up the ultimate removal of the layer of cells that creates the cell sheet, by using our original electrochemical approach."

The scientists tested the method in a mouse model, creating 3D molds of the sites that had to be covered, growing cell sheets on top of those molds, and successfully transplanting the engineered sheets directly into the animals. This ability to transplant cell sheets onto arbitrary surfaces is a great advance over previous cell sheets, which were only effective on flat areas.

In their future studies, the researchers plan to perform experiments on large animal disease models. "We hope to prove that this approach is beneficial to the treatment of postoperative adhesion and occlusion," the authors comment. "Our ultimate goal is to integrate this approach to endoscopic surgery," they add.

Credit: 
Yokohama National University

Dartmouth study examines prevalence of screening for social needs

A new study from The Dartmouth Institute for Health Policy and Clinical Practice, published this week in JAMA Network Open, finds that most U.S. physician practices and hospitals report screening patients for at least one social need, a trend that is expected to increase, and that practices caring for disadvantaged patients report higher screening rates.

In recent years, the link between patients' social needs, health outcomes, and costs has become increasingly recognized and advocated for by stakeholders across the nation's medical communities. But little has been known about the extent to which these screenings have been incorporated into patient care.

To help make this determination, researchers conducted a cross-sectional study using national survey data to assess the prevalence of screening among physician practices and hospitals for five social needs prioritized by the Centers for Medicare & Medicaid Services: food insecurity, housing instability, utility needs, transportation needs, and experience with interpersonal violence.

Responses from 2,190 physician practices and 739 hospitals were collected between June 2017 and August 2018. As part of the study, the researchers examined how screening efforts varied by organizational characteristics, including participation in reform efforts. They also identified major barriers to linking medical and social care that were reported by physicians and hospitals.

The new data showed that the majority of U.S. physician practices and hospitals were screening patients for at least one social need, with about 24 percent of hospitals and 16 percent of physician practices reporting screening for all five social needs. Among both hospitals and practices, screening for interpersonal violence was most common, while screening for utility needs was least common.

Federally qualified health centers, as well as physician practices that participated in bundled payments, primary care improvement models, and Medicaid accountable care organizations, were more likely to report screening than other practices. Academic medical centers were significantly more likely than nonacademic medical centers to report screening patients for all five social needs.

In general, the majority of physician practices and hospitals identified the lack of financial or staffing resources, time, and incentives as the main barriers to adopting initiatives that address social needs.

"Given the current focus on social needs from state and federal policymakers, payers, and providers, it seems likely that pressure on physicians and hospitals to identify and begin addressing patients' social needs will continue," says Taressa Fraze, PhD, a research scientist at The Dartmouth Institute for Health Policy and Clinical Practice and lead author on the study.

"We believe that systematic use of screening is a required first step to attend to social needs and improve health," says Fraze. "Addressing resource barriers, such as time, information, and money may be a key element in supporting physicians and hospitals in their efforts to screen patients for these important needs."

Credit: 
The Geisel School of Medicine at Dartmouth

NASA's wide view of major hurricane Humberto's massive Atlantic 'tail'

image: On Sept. 18, the MODIS instrument that flies aboard NASA's Aqua satellite provided these stitched together images of Hurricane Humberto in the Atlantic Ocean. Although Humberto's eye was just about 490 miles (785 km) west-southwest of Bermuda at the time of the Aqua overpass, the thick band of thunderstorms east of the center stretched beyond eastern Canada!

Image: 
NASA Worldview

NASA's Aqua satellite, one of a fleet of NASA satellites that provide data for research, captured a visible image of Major Hurricane Humberto and its very long "tail" of thunderstorms stretching past eastern Canada.

Hurricanes are the most powerful weather event on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

On Sept. 18, the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Aqua satellite passed over the northern Atlantic Ocean and gathered images of Hurricane Humberto. Humberto is such a large storm that the Aqua satellite had to make two Earth orbits to capture the entire storm. At NASA's Goddard Space Flight Center in Greenbelt, Maryland, those images were put together to form a complete picture of Humberto.

Although Humberto's eye was just about 490 miles (785 km) west-southwest of Bermuda at the time of the Aqua overpass, the thick band of thunderstorms east of the center stretched beyond eastern Canada!

On Sept. 19, the National Hurricane Center (NHC) said, "Humberto's satellite presentation continues to be outstanding with a large ragged eye and surrounded by deep convection."

The NHC said, "At 8 a.m. EDT (1200 UTC), the eye of Hurricane Humberto was located by satellite near latitude 31.8 degrees north and longitude 68.9 degrees west. Humberto has increased its forward speed and is moving toward the east-northeast near 16 mph (26 kph). This general motion with an additional increase in forward speed is expected through early Thursday, followed by a northeastward to north-northeastward motion through Friday."

Maximum sustained winds are near 115 mph (185 kph) with higher gusts.  Humberto is a category 3 hurricane on the Saffir-Simpson Hurricane Wind Scale.  Some fluctuations in intensity are likely during the next day or so, but Humberto should remain a powerful hurricane through early Thursday.  A steady weakening trend should begin later on Thursday, Sept. 19.

Hurricane-force winds extend outward up to 60 miles (95 km) from the center and tropical-storm-force winds extend outward up to 175 miles (280 km).  The estimated minimum central pressure is 951 millibars.

On the NHC forecast track, the core of Humberto is expected to pass just to the northwest and north of Bermuda later tonight.

Credit: 
NASA/Goddard Space Flight Center

Supportive relationships in childhood lead to longer lives

(Boston)--After years of generalized theories and hypotheses, research has finally pinpointed certain aspects of childhood experience linked to people living longer.

Individuals raised in families with higher socioeconomic status were more optimistic in midlife, and in turn, lived longer. Those who experienced more psychosocial stressors, such as parental death, frequent moves and harsh discipline, tended to encounter more stressful life events in midlife, and had greater risk of dying.

Prior research has shown that adverse childhood experiences are associated with higher mortality risk. However, the effects appear to be driven by a small proportion of individuals who experienced multiple "hits" of severe stressors, such as physical abuse and domestic violence. Little is known about the potential effects of milder but more common stressors and the potential benefits of favorable childhood experiences on longevity. How different aspects of childhood experiences come to influence life span has rarely been studied. These questions are addressed in a new study in the journal Psychology and Aging.

The study involved 1,042 men who had been followed since 1961 in the Normative Aging Study. Three aspects of childhood experience (socioeconomic status, psychosocial stressors, and presence of close relationships) were assessed at study entry and in 1995. Optimism, life satisfaction, stressful life events and negative affect in midlife were assessed from 1985-91. Mortality status was tracked through 2016.

A key finding was that men who recalled having more childhood stressors also tended to experience more stressors as adults, and in turn, had greater risk of dying. For example, when comparing men who had five versus one childhood psychosocial stressors, those with more childhood stressors had a three percent greater risk of dying that was attributable to having more adulthood stressors. These findings suggest that a continuous pattern of stressor exposure from childhood to midlife may act as a precursor to reduced lifespan.

The researchers also looked at whether and how favorable aspects of childhood experience may contribute to longevity. In particular, men raised in families with higher socioeconomic status tended to report higher levels of optimism and life satisfaction in midlife, and in turn, were more likely to live longer. These findings suggest that optimism and life satisfaction are resilience pathways that convey the benefits of childhood socioeconomic resources onto longer lives.

"Our findings offer novel evidence on unique and shared pathways linking specific dimensions of early life experiences to longevity," said corresponding author Lewina Lee, PhD, clinical research psychologist at the National Center for PTSD at VA Boston and assistant professor of psychiatry at Boston University School of Medicine. "We hope that our research will stimulate further work to identify and intervene on factors which lie on the pathways linking childhood experiences to later-life health."

Credit: 
Boston University School of Medicine