Culture

Depression and binge-drinking more common among military partners

New research from King's College London suggests that depression and binge-drinking are more common among the female partners of UK military personnel than among comparable women outside the military community.

Researchers from the King's Centre for Military Health Research at the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) collected data from 405 women in military families with at least one child; such families represent around a third of the military population.

The researchers used a screening tool for depression, rather than a diagnosis from a psychiatrist, and women reporting frequent symptoms were considered to have probable depression. Drinking behaviours were also recorded through a self-reported screening tool.

7% of military partners met criteria for probable depression, compared to 3% of women from the general population

9.7% of military partners reported episodes of weekly, daily or almost daily binge-drinking, compared to 8.9% from the general population

After controlling for other factors linked to poor alcohol behaviours, the researchers found military partners were twice as likely to binge-drink as women in the general population.
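The 'twice as likely' figure is an adjusted comparison rather than a raw percentage. As a rough illustration of how such an adjusted odds ratio can be estimated - a minimal sketch on simulated data with hypothetical variable names, not the authors' actual model or dataset:

```python
# Illustrative only: estimating an adjusted odds ratio for binge-drinking while
# controlling for other factors. Simulated data and hypothetical covariates --
# not the study's actual model, variables, or dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "military_partner": rng.integers(0, 2, n),   # 1 = partner of serving personnel
    "age": rng.normal(35, 8, n),                 # example covariate
    "smoker": rng.integers(0, 2, n),             # example covariate
})
# Simulate an outcome whose log-odds depend on group membership and covariates.
log_odds = -2.0 + 0.7 * df["military_partner"] + 0.02 * (df["age"] - 35) + 0.4 * df["smoker"]
df["binge_drinking"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

# Logistic regression; exponentiating the group coefficient gives the odds
# ratio for binge-drinking adjusted for the other covariates in the model.
model = smf.logit("binge_drinking ~ military_partner + age + smoker", data=df).fit(disp=0)
print(np.exp(model.params["military_partner"]))
```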

Overall, military partners reported consuming alcohol less frequently than women in the general population but reported binge-drinking more often. Binge-drinking was significantly higher when families were separated for more than 2 months due to deployment.

Military families experience various unique challenges, such as frequently moving location and the stress and separation caused by deployment. The researchers say binge-drinking may reflect poor coping strategies used by military partners during the long absences of serving personnel from the family home.

Lead researcher Dr Rachael Gribble from the IoPPN says: 'While the majority of families cope well with the added pressures of military life, the additional challenges faced by military families may explain the additional mental health needs and higher rates of binge-drinking we found among military partners. More research is needed to help find out more about what contributes to depression and problematic drinking in this population.'

The researchers say that binge-drinking represents an important public health issue for the military community. They urge development of campaigns to reduce alcohol use in military families, suggesting that programmes which successfully tackle dangerous drinking among Service personnel could be extended to their partners.

Senior researcher Professor Nicola Fear from the IoPPN says: 'Our results indicate that healthcare professionals should be attuned to the impact military life can have on the mental health and wellbeing of family members. There are lots of support options available for military families out there, but these are not always easily accessible.'

This research was published in the European Journal of Psychotraumatology and was funded by the Economic and Social Research Council and the Army Families Federation. It is the first UK-based study to look at the mental health and well-being of women in relationships with members of the UK Armed Forces.

A spokesperson for the Army Families Federation, the independent voice of Army families, said: 'Isolation, separation and mobility can all impact on Service families' mental health and emotional wellbeing. Research in these areas helps organisations working with Service families to better understand how they can be supported. We welcome the conclusion of this research by King's College London that available support could be better signposted for military partners.'

Credit: 
King's College London

Is overall screen time associated with academic performance in kids, teens?

Bottom Line: Overall screen time wasn't associated with the academic performance of children and adolescents in this observational study. The research, a systematic review and meta-analysis, consisted of a review of 58 studies from 23 countries (involving 480,000 participants ages 4 to 18) and a meta-analysis that combined the results of 30 of those studies, involving 106,000 participants. The studies examined time or frequency of computer, internet, mobile phone, television, video game and overall screen media use, as well as academic performance, including composite scores, language and mathematics. While the authors report that the amount of time spent on screens overall wasn't associated with academic performance, more time spent watching television and playing video games was associated with poorer academic performance. Previous research has produced conflicting findings about the association between screen media use and academic performance. A limitation of this research is that causal inferences can't be made. The findings of this report suggest that education and public health professionals should consider supervision and reduced time spent on screens.
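The 'meta-analysis' step combines estimates from individual studies into a single pooled estimate. A minimal, purely illustrative sketch of the basic idea - fixed-effect, inverse-variance pooling on made-up effect sizes, not the authors' actual method or data:

```python
# Illustrative fixed-effect, inverse-variance meta-analysis on made-up numbers.
# Real meta-analyses (including this one) typically use richer models such as
# random effects; this only shows how per-study estimates are pooled.
import numpy as np

# Hypothetical per-study effect sizes for the screen time / academic
# performance association, with their standard errors.
effects = np.array([-0.10, 0.02, -0.15, -0.05, 0.00])
std_errs = np.array([0.04, 0.06, 0.05, 0.03, 0.07])

weights = 1.0 / std_errs**2                          # inverse-variance weights
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled effect = {pooled:.3f}, 95% CI +/- {1.96 * pooled_se:.3f}")
```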

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Mireia Adelantado-Renau, M.Sc., University Jaume I, Castellon, Spain, and coauthors

(doi:10.1001/jamapediatrics.2019.3176)

Editor's Note: The article contains funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Loss of automatic reenrollment option associated with enrollment decrease in California marketplace

What The Study Did: This research letter analyzed enrollment data from California's health insurance marketplace, Covered California. The study authors report that losing the option to automatically reenroll, because some insurers exited the marketplace, was associated with a decrease in enrollment.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Coleman Drake, Ph.D., of the University of Pittsburgh, and David M. Anderson, M.S., of the Duke Margolis Center for Health Policy, Durham, North Carolina.

(doi:10.1001/jamainternmed.2019.3717)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Does being younger than classmates increase likelihood of childhood depression, ADHD, intellectual disability?

What The Study Did: This observational study included 1 million children in the United Kingdom and looked at the association between children who are younger than their classmates and the likelihood of depression, attention-deficit/hyperactivity disorder and intellectual disability.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Jeremy P. Brown, M.Sc., of the London School of Hygiene and Tropical Medicine, is the corresponding author.

(doi:10.1001/jamapediatrics.2019.3194)

Editor's Note: The article contains conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

New CRISPR class expands genetic engineering toolbox

image: Illustrations representing the components of the common dCas9 system (top) and the Cascade system (bottom)

Image: 
Gersbach Lab

DURHAM, N.C.-- Biomedical engineers at Duke University have used a previously unexplored CRISPR technology to accurately regulate and edit genomes in human cells.

With this new approach, the researchers hope to dramatically expand the CRISPR-based tools available to biomedical engineers, opening up a new and diverse frontier of genome engineering technologies.

In a study appearing on Sept. 23 in Nature Biotechnology, Charles Gersbach, the Rooney Family Associate Professor of Biomedical Engineering at Duke, and Adrian Oliver, a post-doctoral fellow in the Gersbach lab who led the project, describe how they successfully harnessed Class 1 CRISPR systems to turn target genes on and off and edit the epigenome in human cells for the first time.

CRISPR-Cas is a defense system in which bacteria use RNA molecules and CRISPR-associated (Cas) proteins to target and destroy the DNA of invading viruses. The discovery of this phenomenon and the repurposing of the molecular machinery set off a genome-editing revolution as researchers learned how to wield the tool to specifically target and edit DNA in human cells.

CRISPR-Cas9, the most commonly used genome editing tool today, is categorized as a Class 2 CRISPR system. Class 2 systems are less common in the bacterial world, but they are theoretically simpler to work with, as they rely on only one Cas protein to target and cleave DNA.

Class 1 systems are not so simple, relying on multiple proteins working together in a complex called Cascade (CRISPR-associated complex for antiviral defense) to target DNA. After binding, Cascade recruits a Cas3 protein that cuts the DNA.

"If you were to look at the individual CRISPR systems of all the bacteria in the world, nearly 90 percent are Class 1 systems," said Gersbach. "CRISPR-Cas biology is an incredible source for biotechnology tools, but until recently everyone has only been looking at a small slice of the pie."

To demonstrate the capabilities of the Class 1 system, Oliver attached gene activators to specific sites along a type I E. coli Cascade complex and targeted the system to bind gene promoters, which regulate gene expression levels. Because she did not include the Cas3 protein in the experiment, there was no cutting of the DNA and no change to the underlying DNA sequence. The experiment showed that the Cascade activator not only binds to the correct site and turns up the levels of the target gene, but does so with accuracy and specificity comparable to CRISPR-Cas9.

Oliver repeated the process using type I Cascade complexes from an additional bacterial strain that was particularly robust in working at a variety of target sites. She also showed that the activator domain could be swapped for a repressor to turn target genes off. Again, the researchers noted accuracy and specificity comparable to CRISPR-Cas9 methods.

"We have found Cascade's structure to be remarkably modular, allowing for a variety of sites to attach activators or repressors, which are great tools for altering gene expression in human cells," Oliver said. "The flexible nature of Cascade makes it a promising genome engineering technology."

Gersbach and Oliver were encouraged to investigate the more complicated Class 1 CRISPR systems by their collaborators at nearby North Carolina State University, Professors Rodolphe Barrangou and Chase Beisel, who is now at the Helmholtz Centre for Infection Research in Germany. Barrangou is a microbiologist who has studied the natural biology of diverse CRISPR defense mechanisms for nearly two decades, and Beisel is a chemical engineer who has worked with Barrangou on engineering microorganisms with Class 1 CRISPR systems. They were both curious whether Gersbach's lab could use these systems in human cells, as it had done with Cas9.

"This work and the resulting technologies are a fantastic example of how collaboration across disciplines and across universities in the North Carolina Research Triangle can be highly innovative and productive" says Barrangou, the Todd R. Klaenhammer Distinguished Professor in Probiotics Research at North Carolina State University.

Now, the team is optimistic that their study, and the related work of others in the field, will incentivize new research into Class 1 CRISPR systems.

"The purpose of this project was to explore the diversity of CRISPR systems," said Gersbach. "There have been thousands of papers about CRISPR-Cas9 in the last decade, and yet we're constantly learning new things about it. With this study we're applying that mindset to the other 90% of what's out there."

So far, the team has shown that these Class 1 systems are comparable to CRISPR-Cas9 in terms of accuracy and application. As they consider future directions, they are curious to explore how these systems differ from their Class 2 counterparts, and how these differences could prove useful for biotechnology applications.

The team is also interested in studying how Class 1 systems could address general challenges for CRISPR-Cas research, especially issues that complicate potential therapeutic applications, like immune responses to Cas proteins and concurrently using multiple types of CRISPR for different genome engineering functions.

"We know CRISPR could have a big impact on human health," said Gersbach. "But we're still at the very beginning of understanding how CRISPR is going to be used, what it can do, and what systems are available to us. We expect that this new tool will enable new areas of genome engineering."

Credit: 
Duke University

Promoting Earth's legacy delivers local economic benefits

image: Shiprock, a volcanic neck named for its resemblance to a ship's silhouette, is a popular geotourism attraction in northwestern New Mexico. Photo taken Nov. 29, 2006.

Image: 
Credit Bowie Snodgrass. CC BY 2.0 via Wikimedia Commons, https://commons.wikimedia.org/wiki/File:Shiprock.snodgrass3.jpg

Phoenix, Arizona, USA: For iconic landscapes such as Grand Canyon or the Appalachian Mountains, geological features are an integral part of their appeal. Yet despite the seeming permanence of cliffs, caves, fossils, and other geological highlights, these features are surprisingly vulnerable to damage or destruction. Across the U.S., there is a growing awareness that America's geological resources represent a common heritage that needs to be preserved--and that doing so can yield considerable economic and societal benefits.

The notion of a shared geological record is central to the concept of geoheritage: the idea that people, landscapes, and the processes that have formed -- and continue to shape -- our planet are interconnected. Ways to protect and promote America's geoheritage, and the benefits of doing so, will be the focus of two sessions of talks presented tomorrow at the Geological Society of America's Annual Meeting in Phoenix.

"The only evidence of Earth's long history is the rock record. This can vary dramatically from place to place, so it's crucial to conserve this legacy," says Tom Casadevall, scientist emeritus at the U.S. Geological Survey and chair of the National Academy of Sciences-sponsored U.S. Geoheritage and Geoparks advisory group. "Geoheritage sites are crucial for advancing scientific and public knowledge about important topics like natural hazards, the evolution of life, and our nation's energy and mineral supplies."

In the U.S., sites of geological significance are protected at a variety of management levels and administered by numerous federal land-management agencies, including the National Park Service, the Bureau of Land Management, and the U.S. Forest Service, as well as state, tribal, and local entities. "During the last decade, there has been more and more interest in developing geoheritage sites," says Casadevall. "I have seen first-hand the economic, educational, and social benefits that can be derived from geology-related tourism, and I believe more American communities could benefit from this approach."

State geological surveys are playing a lead role in developing and promoting geology-related sites and educational programs across the country, according to Casadevall. In Florida, the state geologist has the authority to designate state geologic sites deemed important for scientific study as well as public understanding of the state's geological history. Four such sites have been designated to date.

In the central Appalachians, West Virginia University, the West Virginia Geological and Economic Survey, and the U.S. Geological Survey have developed an Appalachian Geo-STEM camp where high school students can engage in geoscience through outdoor adventure education activities. The university is also working with three southern counties to create an Appalachian Geopark that showcases the region's coal, caves, rivers, and other natural attributes and how these underpin the local culture.

Many state surveys also distribute educational materials, develop geo-tours, post blogs, and help catalog geosite attributes to guide tourism development. "State geological surveys play a vital role in translating the geological origins of interesting features into terms that non-scientists can understand," says Nelia Dunbar, New Mexico's state geologist.

In Texas, the Bureau of Economic Geology has begun several educational initiatives, including the Texas GeoSign Project, to promote geoheritage in the Lone Star State. The Arizona Geological Survey is currently cataloging more than 1,500 unpublished geologic and mining documents related to the Santa Cruz Valley National Heritage Area, which was established earlier this year. And the New Mexico Bureau of Geology & Mineral Resources is developing "e-materials" to help curious visitors understand the stories behind the state's beautiful scenery and the valuable resources, like turquoise, that are so closely intertwined with New Mexico's rich cultural history.

"Growing awareness of the power of the 'geoheritage' approach has provided us with a pathway to increase awareness of the links between geology and human history," says Dunbar. "We look forward to increasing our reach and relevance through this new direction."

Credit: 
Geological Society of America

Grand ideas, global reverberations: Grand Canyon at its 6 millionth anniversary

image: Grand Canyon on its six millionth anniversary. Photo taken 3 June 2010.

Image: 
Credit National Park Service

Phoenix, Arizona, USA: Etched onto the steep walls of Arizona's 6,000-foot-deep, 277-mile-long Grand Canyon are clues that chronicle the sweeping changes the region has experienced during the past two billion years. The canyon's colorful layers narrate tales of ancient environments come and gone, from lofty mountain ranges and tropical seas to a Saharan-scale desert that once stretched across much of western North America.

The Grand Canyon was carved by the Colorado River, a ribbon of life-giving water that flows through the center of a desert wilderness. It was down this uncharted river that naturalist John Wesley Powell, a one-armed Civil War veteran, and his crew plunged in 1869 when they rafted through the Grand Canyon in what has been called "one of the most daring journeys in American history."

In commemoration of the 150th anniversary of this remarkable expedition, as well as Grand Canyon National Park's 100th anniversary, four sessions at the Geological Society of America's Annual Meeting in Phoenix will highlight the unparalleled role the Grand Canyon plays in advancing scientific discoveries, promoting geoscience research and education, and inspiring the millions of people who visit it each year.

On Monday, 23 Sept. 2019, a keynote session will cover geoscience research, education, and the human connections to the Grand Canyon, an "important but often overlooked space between new scientific research and its societal importance," says co-convener Karl Karlstrom, a University of New Mexico geologist. "These important milestones prompt us to reflect back, to take stock of the present, and also to look forward to the next 100 years."

When geologists look back, says Karlstrom, they really look back -- so much so that he and the other conveners, including Steven Semken from Arizona State University, Eleanour Snow from the U.S. Geological Survey, and Laura Crossey from the University of New Mexico, added a "six millionth" anniversary to the session title.

Current research suggests that was when the Colorado River stitched together several preexisting canyons into an integrated drainage that flowed along the river's current course from the Colorado Plateau to the Gulf of California. "Grand Canyon itself is geologically young when compared to the nearly two-billion-year-old rocks at its bottom," says Karlstrom, "so the conveners added the six millionth geologic anniversary to help put our human time scales into geoperspective."

Keynote speakers will include two Native Americans, Navajo Jason Nez and Ophelia Watahomigie-Corliss, a member of the Havasupai Tribal Council. Watahomigie-Corliss will explain why the centennial year is not a celebration for members of her tribe, and how the changes they have endured as a result of the national park's founding impact them to the present day. Karlstrom hopes these talks will offer "a perspective that mixes some realism, some hope, and direction for improved future partnership."

Two additional oral sessions, one on Monday afternoon and a second on Tuesday morning, 24 September, plus a Wednesday, 25 September, poster session, will consider the Grand Canyon within a broader regional context and cover some of the numerous ongoing scientific debates regarding the Colorado Plateau and Rocky Mountain region--and their global implications.

One of the current debates revolves around the origin of the Great Unconformity, a 1.3-billion-year gap in the Grand Canyon's rock record that Powell recognized. This feature is unusual, says Karlstrom, in that it is the only such gap that appears to be global in its distribution.

Recent research suggests the Great Unconformity encompasses multiple episodes of erosion, each with a different cause. These appear to include the construction and breakup of a supercontinent, a "snowball Earth" episode during which the planet was completely frozen, and "a major flooding of the continent by advancing seas that was (somehow) related to one of the most interesting explosions in animal evolution in Earth history," says Karlstrom.

All four sessions will feature presentations that highlight the importance of the Grand Canyon for advancing geoscience research. These include short-term management of water-related issues, such as providing drinking water for the national park's six million annual visitors, as well as managing the new river ecosystem created by the network of dams placed on the Colorado River. Recent research on Grand Canyon rocks has also revealed new insights into the formation of the North American continent around 1.8-1.7 billion years ago as well as the explosion in the diversity of animal life that occurred about 650-550 million years ago.

Many of these advances, says Karlstrom, have had global reverberations, assuring that the influence of this iconic canyon will extend well beyond its next big set of anniversaries. "The Grand Canyon will continue to be at the forefront of geoscience research, public education, and resource management and sustainability," he says.

Credit: 
Geological Society of America

Marijuana use among US adults with, without medical conditions

What The Study Did: National survey data was used in this study to examine how common marijuana use was among adults with and without medical conditions.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Hongying Dai, Ph.D., of the University of Nebraska Medical Center in Omaha, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.11936)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Researchers find way to kill pathogen resistant to antibiotics

image: Nagoya University researchers and colleagues in Japan have demonstrated a new strategy in fighting antibiotics resistance: the use of artificial haem proteins as a Trojan horse to selectively deliver antimicrobials to target bacteria, enabling their specific and effective sterilization. The technique killed 99.9% of Pseudomonas aeruginosa, a potentially deadly, antibiotic-resistant bacterium present in hospitals. This image shows a solution of the extracellular heme acquisition system protein A (HasA) with gallium phthalocyanine (Left) and the results of sterilization of Pseudomonas aeruginosa and Escherichia coli treated with HasA-bound gallium phthalocyanine by irradiation with near-infrared light (Right).

Image: 
Osami Shoji

Pseudomonas aeruginosa is a dangerous bacterium that causes infections in hospital settings and in people with weakened immune systems. It can cause blood infections and pneumonia, while severe infections can be deadly. Highly resistant to antibiotic treatment, P. aeruginosa is one of the most critical pathogens urgently requiring alternative treatment strategies, according to the World Health Organization.

This bacterium is one of many that have evolved a system that allows them to acquire difficult-to-access iron from the human body. Iron is essential for bacterial growth and survival, but in humans, most of it is held within the 'haem' complex of haemoglobin. To get hold of it, P. aeruginosa and other bacteria secrete a protein, called HasA, which latches onto haem in the blood. This complex is recognized by a membrane receptor on the bacterium called HasR, permitting haem entry into the bacterial cell, while HasA is recycled to pick up more haem.

Bioinorganic chemist Osami Shoji of Nagoya University and collaborators have found a way to hijack this 'haem acquisition system' for drug delivery. They developed a powder formed of HasA and the pigment gallium phthalocyanine (GaPc), which, when applied to a culture of P. aeruginosa, was consumed by the bacteria.

"When the pigment is exposed to near-infrared light, harmful reactive oxygen species are generated inside the bacterial cells," explains Shoji. When tested, over 99.99% of the bacteria were killed following treatment with one micromolar of HasA with GaPc and ten minutes of irradiation.

The strategy also worked on other bacteria with the HasR receptor on their membranes, but not on ones without it.

The haem acquisition system is so essential to these bacteria's survival that it is not expected to change, making it unlikely the bacteria will develop resistance to this drug strategy, the researchers believe.

"Our findings support the use of artificial haem proteins as a Trojan horse to selectively deliver antimicrobials to target bacteria, enabling their specific and effective sterilization, irrespective of antibiotic resistance," the team reports in their study.

Credit: 
Nagoya University

Rethinking how cholesterol is integrated into cells

image: 1) Live-Cell Imaging of Sterol Transport to the Yeast Vacuole; 2) Eukaryotic sterol membrane integration; 3) Sterol Affinity of NPC2 and NTD and Sterol Transfer Assays

Image: 
Bjørn Panyella Pedersen, Aarhus University

Most people have heard of "cholesterol levels" and the dangers of high blood cholesterol, which is one of the main causes of cardiovascular disease. But besides the harmful side effects of high cholesterol, cholesterol is an essential component of all cells and fundamental to a host of important functions of the body. Hormones like estrogen and testosterone are made from cholesterol, for example.

It has been known for a long time that cholesterol is transported around the body in the blood as small particles consisting of fat and protein. In the body's cells, these particles are broken down and cholesterol is released and integrated as part of the cell. Although this process is essential, not just for humans, but for all animals and plants, surprisingly little is known about how cholesterol is actually incorporated into the cells after the breakdown of these particles.

In recent years, interest in how cholesterol is integrated and incorporated - and not least how this process is regulated - has grown tremendously. This is partly due to the huge pharmaceutical potential in regulating this process, as shown with blockbuster drugs such as Zetia, which regulates cholesterol uptake from food. In addition, it has been shown that many viruses, including Ebola, use the same process to infect cells.

During the past five years, researchers from Aarhus University have collaborated with researchers from the University of Southern Denmark and the University of Leeds to investigate how cholesterol is incorporated into cells, using biophysical and structural biological methods. The results have led to a groundbreaking insight into the process and to a new model for how cholesterol is integrated and incorporated that fundamentally changes our prior understanding of the process.

The results have just been published in the world-leading journal Cell.

Credit: 
Aarhus University

Open Medicare data helps uncover potential hidden costs of health care

BLOOMINGTON, Ind. -- An interdisciplinary team of Indiana University scientists studying Medicare data has found an association between health care industry payments to medical providers for non-research expenses and what these providers charge for medical services -- shedding new light on potential hidden costs to the public.

Their findings, published Sept. 20 in Nature Communications, demonstrate that medical providers receiving higher amounts of industry payments tend to bill higher drug and medical costs. Specifically, they found that a 10 percent increase in industry payments to medical providers is associated with 1.3 percent higher medical costs and 1.8 percent higher drug costs.

For example, a $25 increase in annual industry payments to a typical medical provider would be associated with approximately $1,100 higher medical costs and $100 higher drug costs.
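Those percentages describe an elasticity-style (percentage-on-percentage) association. The sketch below shows the arithmetic only, under an assumed log-log relationship and made-up baseline values; it is not the paper's actual model or data:

```python
# Illustrative arithmetic for an elasticity-style association: if a 10% rise in
# industry payments corresponds to 1.3% higher medical costs (1.8% for drugs),
# a log-log relationship implies an elasticity of log(1.013) / log(1.10).
# Assumed functional form and hypothetical baseline values only.
import math

elasticity_medical = math.log(1.013) / math.log(1.10)   # ~0.135
elasticity_drug = math.log(1.018) / math.log(1.10)      # ~0.187

def predicted_cost(baseline_cost, baseline_payment, new_payment, elasticity):
    """Predicted billed cost after a payment change under a log-log association."""
    return baseline_cost * (new_payment / baseline_payment) ** elasticity

# Hypothetical provider: $50 in annual industry payments, $100,000 in billed costs.
print(predicted_cost(100_000, 50, 75, elasticity_medical))   # payments rise 50%
print(predicted_cost(20_000, 50, 75, elasticity_drug))       # drug costs, same rise
```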

"Let's be clear here, we should not find such an association," said Jorge Mejia, co-author on the paper and an assistant professor of operations and decision technologies at the IU Kelley School of Business. "Our findings raise the possibility that medical providers may be unduly influenced by payments from the healthcare industry."

It's important to note that an association shows that two variables appear to change at the same time, whereas causality implies that one variable causes another variable to change. This study does not prove causality, which the researchers said would be difficult to do with secondary data.

Jorge Mejia's co-authors were Amanda Mejia, assistant professor in the Department of Statistics, and Franco Pestilli, associate professor in the Department of Psychological and Brain Sciences, both at the IU College of Arts and Sciences.

Amanda Mejia said the team controlled for several key variables to rule out the possibility of other drivers of the association between industry payments and medical costs.

"We found that the association was still there after taking into account the size of the practice, its location and drug prescribing levels," she said.

Pestilli said the large Medicare data sets that the researchers used were made openly available as part of the 2010 Affordable Care Act.

"Our research capitalized on such openly shared data," Pestilli said. "We demonstrate the value of open data in providing society with critical insights on hidden costs that can be addressed at the policy level."

But Jorge Mejia said transparency alone is not enough to fix these hidden costs. That's why studies like this one are important; they help interpret the data so the public can better understand what it means.

"As a society, we have had the potential for quantifying and qualifying the influence of the industry on our medical costs," he said. "However, we have not done so. For example, we are just discovering the extent to which certain health care companies may be involved in the current opioid crisis in the U.S. We need tools to guide patients and consumers with all the data that is available."

To help achieve this goal, Jorge Mejia said he hopes Medicare will make it easier for researchers and the public to quantify the effect of the payments received by medical providers by adding the national physician identifier (NPI) to their Open Payments data set. Additionally, he hopes the research team's findings will start a conversation about how to communicate this information to consumers.

"We have energy efficiency scorecards for appliances, cars and many consumer products," Jorge Mejia said. "How can the public understand whether their physician is close to the health care industry? Instead of making this about whether it's good or bad, I'd like to kickstart a conversation about how information can be delivered in a simple way. Let's put patients in the driver's seat."

The researchers have several follow-up projects in progress, including one that aims to investigate how industry payments may drive future medical costs, which would bring them one step closer to establishing a causal relationship between payments and costs.

Credit: 
Indiana University

Weathering Antarctic storms -- Weather balloon data boost forecasting skill

image: This is a photograph showing radiosonde observation at Dome Fuji Station in Antarctica. The person in the photo is Dr. Konosuke Sugiura, a co-author of the study.

Image: 
Taichi Ito

Observational data from radiosondes deployed in Antarctica improve the forecasting accuracy for severe Antarctic cyclones, according to a Japanese research team led by the Kitami Institute of Technology, Hokkaido, Japan.

In parts of the Earth that are very sparsely populated, such as the Antarctic, direct observational weather data can be hard to come by, and with Antarctica's extreme climate, failure to accurately predict severe weather can easily become deadly. The team conducted a study that focused on the impacts of these data on forecasting an extreme cyclonic event, and the findings have been accepted and published as early view in Advances in Atmospheric Sciences.

With advancements in satellite technology and computer modeling, forecasting of storms and other weather events is constantly improving. However, accurate forecasts are not based on satellite data alone - they still rely on direct measurements taken at the surface and in the atmosphere. Direct measurements of the atmosphere can be obtained by deploying weather balloons equipped with radiosondes, devices that collect and transmit information about variables such as altitude, temperature, humidity, and wind speed.

The research team looked at the importance of weather radiosonde data in predicting severe weather events over Antarctica and the surrounding Southern Ocean. "We investigated the impact of including additional radiosonde observations from both the research vessel Shirase over the Southern Ocean and from the Dome Fuji Station in Antarctica on forecasting using an atmospheric general circulation model," explains lead author Kazutoshi Sato, an assistant professor at the Kitami Institute of Technology, Japan.

The researchers conducted a forecast experiment that focused on an unusually strong Antarctic cyclonic event that occurred from late December 2017 to early January 2018. Two datasets, one that included the additional radiosonde data and one that excluded those data, were used as the initial values. Only the experiment that included the radiosonde observations successfully captured the cyclone's central pressure, wind speed, and moisture transport 2.5 days in advance. These results clearly show that, even for operational weather forecast centers, collecting radiosonde observation data is important for improving the forecasting accuracy of Antarctic cyclones.
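Forecast impact in experiments of this kind is typically judged by comparing each experiment's forecast errors against verifying observations or analyses. A purely illustrative sketch of that comparison, using synthetic numbers rather than the study's model output:

```python
# Illustrative evaluation of two forecast experiments (with and without the
# extra radiosonde data) against verification values. Synthetic central-pressure
# numbers only -- not output from the study's forecast system.
import numpy as np

def rmse(forecast, verification):
    """Root-mean-square error of a forecast against verifying values."""
    return float(np.sqrt(np.mean((forecast - verification) ** 2)))

verification = np.array([960.0, 955.0, 950.0, 948.0])         # "observed" pressure (hPa)
with_radiosondes = np.array([962.0, 957.0, 953.0, 950.0])     # experiment including extra data
without_radiosondes = np.array([970.0, 968.0, 965.0, 963.0])  # experiment excluding extra data

print("RMSE with extra radiosondes:   ", rmse(with_radiosondes, verification))
print("RMSE without extra radiosondes:", rmse(without_radiosondes, verification))
```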

However, the sparsity of observations in the Antarctic remains a problem. "Even with the assimilation of the additional radiosonde observations," says co-author Jun Inoue, an associate professor of polar science at the National Institute of Polar Research, part of the Inter-University Research Institute Corporation Research Organization of Information and Systems (ROIS) in Tokyo, Japan, "the experiment was unable to forecast the development of the cyclone four days in advance. That leaves a great deal of room for improvement." In a project called the 'Year of Polar Prediction', many Antarctic stations have deployed additional radiosondes, providing an opportunity to further investigate the impact of the resulting data on weather forecasting in Antarctica.

To provide more accurate weather forecasts, Inoue noted that new additional observation systems need to be developed in the future. Improving severe weather forecasting in Antarctica will continue to be a priority, as the lives of researchers and other personnel in the region may depend on it.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Oil futures volatility and the economy

The drone strike on Saudi Arabia's oil infrastructure has highlighted the fragile and interconnected relationship between crude oil supply and the global economy, with new research bringing these economic ties into greater focus.

"We shouldn't underestimate the importance of geopolitical events in the oil market, as it has the power to impend the stability of our financial world," says University of Technology Sydney Finance researcher Dr Christina Sklibosios Nikitopoulos.

"On 16 September 2019 the oil market witnessed one of the highest intraday moves, with a 15% increase in Brent oil prices and an 14.7% increase in US WTI oil futures. Oil price spikes are seen as a recession barometer, but it is not just price but also volatility that matters," she says.

In a recently released paper, Dr Nikitopoulos, with colleagues Dr Boda Kang from Lacima Group, and Finance Professor Marcel Prokopczuk from Leibniz University Hannover, examined the connections between oil futures volatility and the global economy.

They looked at 30 years of data to discover economic determinants of oil futures volatility over the short, medium and long-term. These included oil-sector variables, financial variables and macroeconomic conditions.

The research revealed how deeply integrated crude oil markets have become with financial markets.

"Investors increasingly regard commodities as an alternative asset class to equities or bonds, and crude oil derivatives are the most actively traded commodity," says Dr Nikitopoulos.

Oil futures started trading in 1983, and options in 1986, and since then the market has experienced explosive growth. Daily trading volume has leapt from 21,997 contracts in 2012 to 1.6 million in 2016 and this week surpassed 2 million.

"Our study highlighted the importance of risk premiums in this market, and revealed that credit spreads play a significant role in determining short-term and medium-term variation in oil futures prices," she says.

In the bond market, the term structure - the rates at which people can borrow or lend over different periods - is seen as an important economic signal: an upward-sloping yield curve signals growth, while a downward-sloping curve signals recession.

Term structures in oil markets can be seen in a similar light, where contango (where the futures price of a commodity is higher than the spot price) or backwardation (where the spot or cash price of a commodity is higher than the forward price) provide an economic signal.
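For readers unfamiliar with the terms, the sketch below classifies a futures curve as in contango or backwardation by comparing deferred futures prices with the spot price; the prices are invented for illustration:

```python
# Illustrative check of whether a crude-oil futures curve is in contango
# (deferred prices above spot) or backwardation (deferred prices below spot).
# Prices are made up for the example.
def curve_shape(spot: float, futures_by_maturity: list[float]) -> str:
    """Classify a futures curve relative to the spot price."""
    if all(price > spot for price in futures_by_maturity):
        return "contango"
    if all(price < spot for price in futures_by_maturity):
        return "backwardation"
    return "mixed"

# Example: spot at $62 with the next three maturities priced below it,
# consistent with an expected near-term supply shortfall.
print(curve_shape(62.0, [60.5, 59.8, 59.2]))  # -> "backwardation"
```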

Dr Nikitopoulos says the expected supply shortfall following the drone strike would cause oil futures markets to remain in backwardation for a while.

The researchers found that, after 2004 (the beginning of the financialisation of the commodity markets), hedging pressure and the VIX (an equities market volatility index), along with credit spreads, industrial production and the US dollar index, were all drivers of short-term volatility.

"This supports the notion of volatility spill-overs between equity and commodity markets, which has strengthened in the past 10 years, says Dr Nikitopoulos.

"It also supports the notion that oil volatility acts as a recession barometer, and fears about the impact of oil shocks on financial stability are justified," she says.

Medium-term volatility was consistently related to open interest (a measure of trading activity) and credit spreads, while oil sector variables such as inventory and consumption had a measurable impact after 2004 due to structural changes in the economy and the oil sector.

Dr Nikitopoulos argues that because oil futures volatility is a product of interaction between the oil-sector and the economy, there is a need for mutually consistent policies.

"Oil markets should be the focus of global discussions by policy makers, not just individual decisions from the US Commodity Futures Trading Commission or OPEC," she says.

"Crude oil futures volatility plays an important role in the global economy and has significant implications for market participants - from oil producers and institutional investors, to traders and market regulators.

"And while the US economy can manage this most recent oil shock with its own shale oil production and opening of strategic reserves, it is global markets like Australia that suffer the most through an increase in fuel costs," she says.

Credit: 
University of Technology Sydney

Engineered bacterial biofilms immobilizing nanoparticles enable diverse catalytic applications

image: Diverse catalytic applications of tunable functional E. coli biofilms with anchored nano-objects. (a) The biofilm-anchored Au NPs enable the recyclable catalytic reduction of the toxic p-nitrophenol (PNP) into the harmless p-aminophenol (PAP). (b) The biofilm-anchored heterogeneous nanostructures (Au NPs/Cd0.9Zn0.1S QDs) photo-catalyze the degradation of organic dyes to low-toxic products based on facile light-induced charge separation. (c) The biofilm-anchored quantum dots coupled with engineered strain enable photo-induced hydrogen production. Electrons are transferred from QDs to hydrogenase using methyl viologen (MV) as a mediator.

Image: 
©Science China Press

Nano-scale objects (1 - 100 nm) are desirable nano-catalysts, featuring more catalytically active sites thanks to their higher surface-area-to-volume ratios. Their nano-scale nature, however, brings attendant challenges, such as leakage of nano-catalysts into the ambient environment and difficulties in reusing nano-catalysts over repeated reaction cycles. A major strategy for addressing these challenges has been the immobilization of nano-objects on various substrates via a variety of technological approaches. However, inorganic and bio-derived or bio-inspired substrates lack "biology-only" attributes like self-regeneration, cellular-growth-based scalability, and the ability of cells to biosynthesize complex enzymes, substrates, co-enzymes, or other required reagents or reaction components in situ. Moreover, studies that have immobilized nano-objects directly on cell surfaces have reported damage to the cells.

The Zhong group in the Materials and Physical Biology Division at ShanghaiTech University has made a major conceptual advance in developing a new abiotic/biotic interface for integrating and immobilizing nanoscale objects with living cells for catalysis. Briefly, they showed how engineered amyloid monomers expressed, secreted and assembled in the extracellular matrix of living Escherichia coli (E. coli) biofilms can be harnessed to anchor functional nano-scale catalysts, creating highly efficient, scalable, tunable, and reusable living catalyst systems. In proof-of-concept studies, they demonstrated three catalytic systems: biofilm-anchored gold nanoparticles that degrade the pollutant p-nitrophenol, biofilm-anchored hybrid Cd0.9Zn0.1S quantum dots (QDs) and gold nanoparticles that efficiently degrade organic dyes, and biofilm-anchored CdSeS@ZnS QDs in a dual bacterial strain semi-artificial photosynthesis system for hydrogen production. As their studies reveal, the extracellular matrix in biofilms provides an ideal milieu for interfacing and anchoring nano-objects for direct catalysis and for their integration with the metabolism of living cells: even after multiple rounds of reactions, the nano-catalysts remained robustly anchored to the biofilms and the E. coli cells were still alive for easy regeneration. Importantly, such an approach opens up the extremely powerful and unique attributes of living systems.

There is a large diversity of bacterial biofilms with different functionalities in nature, and this study thus lays the conceptual foundation for coupling the uniquely dynamic properties and capacities of these living materials with highly reactive nanoparticles to solve challenges in bioremediation, bioconversion, and energy. The work should spur further research into more efficient and industrially important reaction systems built by integrating more intricate biofilm/inorganic hybrid catalytic systems.

Credit: 
Science China Press

Surface melting causes Antarctic glaciers to slip faster towards the ocean

image: Surface meltwater draining through the ice and beneath Antarctic glaciers is causing sudden and rapid accelerations in their flow towards the sea, according to new research.

This is the first time scientists have found that melting on the surface impacts the flow of glaciers in Antarctica.

Image: 
Google Earth

Study shows for the first time a direct link between surface melting and short bursts of glacier acceleration in Antarctica

During these events, Antarctic Peninsula glaciers move up to 100 per cent faster than average

Scientists call for these findings to be accounted for in sea level rise predictions

Surface meltwater draining through the ice and beneath Antarctic glaciers is causing sudden and rapid accelerations in their flow towards the sea, according to new research.

This is the first time scientists have found that melting on the surface impacts the flow of glaciers in Antarctica.

Using imagery and data from satellites alongside regional climate modelling, scientists at the University of Sheffield have found that meltwater is causing some glaciers to move at speeds 100 per cent faster than average (up to 400m per year) for a period of several days multiple times per year.

Glaciers move downhill due to gravity via the internal deformation of ice, and basal sliding - where they slide over the ground beneath them, lubricated by liquid water.

The new research, published today in Nature Communications, shows that accelerations in Antarctic Peninsula glaciers' movements coincide with spikes in snowmelt. This association occurs because surface meltwater penetrates to the ice bed and lubricates glacier flow.

The scientists expect that as temperatures continue to rise in the Antarctic, surface melting will occur more frequently and across a wider area, making it an important factor in determining the speed at which glaciers move towards the sea.

Ultimately, they predict that glaciers on the Antarctic Peninsula will behave like those in present-day Greenland and Alaska, where meltwater controls the size and timing of variations in glacier flow across seasons and years.

The effects of such a major shift in Antarctic glacier melt on ice flow have not yet been incorporated into the models used to predict the future mass balance of the Antarctic Ice Sheet and its contribution to sea level rise.

Dr Jeremy Ely, Independent Research Fellow at the University of Sheffield's Department of Geography and author of the study, said: "Our research shows for the first time that surface meltwater is getting beneath glaciers in the Antarctic Peninsula - causing short bursts of sliding towards the sea 100% faster than normal.

"As atmospheric temperatures continue to rise, we expect to see more surface meltwater than ever, so such behaviour may become more common in Antarctica.

"It's crucial that this factor is considered in models of future sea level rise, so we can prepare for a world with fewer and smaller glaciers."

Pete Tuckett, who made the discovery while studying for his Masters in Polar and Alpine Change at the University of Sheffield, said: "The direct link between surface melting and glacier flow rates has been well documented in other regions of the world, but this is the first time we have seen this coupling anywhere in Antarctica.

"Given that atmospheric temperatures, and hence surface melt rates, in Antarctica are predicted to increase, this discovery could have significant implications for future rates of sea level rise."

Credit: 
University of Sheffield