Culture

Does being younger than classmates increase likelihood of childhood depression, ADHD, intellectual disability

What The Study Did: This observational study included 1 million children in the United Kingdom and examined the association between being younger than one's classmates and the likelihood of depression, attention-deficit/hyperactivity disorder and intellectual disability.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Jeremy P. Brown, M.Sc., of the London School of Hygiene and Tropical Medicine, is the corresponding author.

(doi:10.1001/jamapediatrics.2019.3194)

Editor's Note: The article contains conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

New CRISPR class expands genetic engineering toolbox

image: Illustrations representing the components of the common dCas9 system (top) and the Cascade system (bottom)

Image: 
Gersbach Lab

DURHAM, N.C.-- Biomedical engineers at Duke University have used a previously unexplored CRISPR technology to accurately regulate and edit genomes in human cells.

With this new approach, the researchers hope to dramatically expand the CRISPR-based tools available to biomedical engineers, opening up a new and diverse frontier of genome engineering technologies.

In a study appearing on Sept. 23 in Nature Biotechnology, Charles Gersbach, the Rooney Family Associate Professor of Biomedical Engineering at Duke, and Adrian Oliver, a post-doctoral fellow in the Gersbach lab who led the project, describe how they successfully harnessed Class 1 CRISPR systems to turn target genes on and off and edit the epigenome in human cells for the first time.

CRISPR-Cas is a defense system in which bacteria use RNA molecules and CRISPR-associated (Cas) proteins to target and destroy the DNA of invading viruses. The discovery of this phenomenon and the repurposing of the molecular machinery set off a genome-editing revolution as researchers learned how to wield the tool to specifically target and edit DNA in human cells.

CRISPR-Cas9, the most commonly used genome editing tool today, is categorized as a Class 2 CRISPR system. Class 2 systems are less common in the bacterial world, but they are theoretically simpler to work with, as they rely on only one Cas protein to target and cleave DNA.

Class 1 systems are not so simple, relying on multiple proteins working together in a complex called Cascade (CRISPR-associated complex for antiviral defense) to target DNA. After binding, Cascade recruits a Cas3 protein that cuts the DNA.

"If you were to look at the individual CRISPR systems of all the bacteria in the world, nearly 90 percent are Class 1 systems," said Gersbach. "CRISPR-Cas biology is an incredible source for biotechnology tools, but until recently everyone has only been looking at a small slice of the pie."

To demonstrate the capabilities of the Class 1 system, Oliver attached gene activators to specific sites along a type I E. coli Cascade complex and targeted the system to bind gene promoters, which regulate gene expression levels. Because she did not include the Cas3 protein in the experiment, there was no cutting of the DNA and no change to the underlying DNA sequence. The experiment showed that the Cascade activator not only binds the correct site and turns up the levels of the target gene, but does so with accuracy and specificity comparable to CRISPR/Cas9.

Oliver repeated the process using type I Cascade complexes from an additional bacterial strain that was particularly robust in working at a variety of target sites. She also showed that the activator domain could be swapped for a repressor to turn target genes off. Again, the researchers noted accuracy and specificity comparable to CRISPR/Cas9 methods.

"We have found Cascade's structure to be remarkably modular, allowing for a variety of sites to attach activators or repressors, which are great tools for altering gene expression in human cells," Oliver said. "The flexible nature of Cascade makes it a promising genome engineering technology."

Gersbach and Oliver were encouraged to investigate the more complicated Class 1 CRISPR systems by their collaborators at nearby North Carolina State University, Professors Rodolphe Barrangou and Chase Beisel, who is now at the Helmholtz Centre for Infection Research in Germany. Barrangou is a microbiologist who has studied the natural biology of diverse CRISPR defense mechanisms for nearly two decades, and Beisel is a chemical engineer who has worked with Barrangou on engineering microorganisms with Class 1 CRISPR systems. Both were curious whether Gersbach's lab could use these systems in human cells, as it had done with Cas9.

"This work and the resulting technologies are a fantastic example of how collaboration across disciplines and across universities in the North Carolina Research Triangle can be highly innovative and productive" says Barrangou, the Todd R. Klaenhammer Distinguished Professor in Probiotics Research at North Carolina State University.

Now, the team is optimistic that their study, and the related work of others in the field, will incentivize new research into Class 1 CRISPR systems.

"The purpose of this project was to explore the diversity of CRISPR systems," said Gersbach. "There have been thousands of papers about CRISPR-Cas9 in the last decade, and yet we're constantly learning new things about it. With this study we're applying that mindset to the other 90% of what's out there."

So far, the team has shown that these Class 1 systems are comparable to CRISPR-Cas9 in terms of accuracy and application. As they consider future directions, they are curious to explore how these systems differ from their Class 2 counterparts, and how these differences could prove useful for biotechnology applications.

The team is also interested in studying how Class 1 systems could address general challenges for CRISPR-Cas research, especially issues that complicate potential therapeutic applications, like immune responses to Cas proteins and concurrently using multiple types of CRISPR for different genome engineering functions.

"We know CRISPR could have a big impact on human health," said Gersbach. "But we're still at the very beginning of understanding how CRISPR is going to be used, what it can do, and what systems are available to us. We expect that this new tool will enable new areas of genome engineering."

Credit: 
Duke University

Promoting earth's legacy delivers local economic benefits

image: Shiprock, a volcanic neck named for its resemblance to a ship's silhouette, is a popular geotourism attraction in northwestern New Mexico. Photo taken Nov. 29, 2006.

Image: 
Credit Bowie Snodgrass. CC BY 2.0 via Wikimedia Commons, https://commons.wikimedia.org/wiki/File:Shiprock.snodgrass3.jpg

Phoenix, Arizona, USA: For iconic landscapes such as Grand Canyon or the Appalachian Mountains, geological features are an integral part of their appeal. Yet despite the seeming permanence of cliffs, caves, fossils, and other geological highlights, these features are surprisingly vulnerable to damage or destruction. Across the U.S., there is a growing awareness that America's geological resources represent a common heritage that needs to be preserved--and that doing so can yield considerable economic and societal benefits.

The notion of a shared geological record is central to the concept of geoheritage: the idea that people, landscapes, and the processes that have formed -- and continue to shape -- our planet are interconnected. Ways to protect and promote America's geoheritage, and the benefits of doing so, will be the focus of two sessions of talks presented tomorrow at the Geological Society of America's Annual Meeting in Phoenix.

"The only evidence of Earth's long history is the rock record. This can vary dramatically from place to place, so it's crucial to conserve this legacy," says Tom Casadevall, scientist emeritus at the U.S. Geological Survey and chair of the National Academy of Sciences-sponsored U.S. Geoheritage and Geoparks advisory group. "Geoheritage sites are crucial for advancing scientific and public knowledge about important topics like natural hazards, the evolution of life, and our nation's energy and mineral supplies."

In the U.S., sites of geological significance are protected at a variety of management levels and administered by numerous federal land-management agencies, including the National Park Service, the Bureau of Land Management, and the U.S. Forest Service, as well as state, tribal, and local entities. "During the last decade, there has been more and more interest in developing geoheritage sites," says Casadevall. "I have seen first-hand the economic, educational, and social benefits that can be derived from geology-related tourism, and I believe more American communities could benefit from this approach."

State geological surveys are playing a lead role in developing and promoting geology-related sites and educational programs across the country, according to Casadevall. In Florida, the state geologist has the authority to designate state geologic sites deemed important for scientific study as well as public understanding of the state's geological history. Four such sites have been designated to date.

In the central Appalachians, West Virginia University, the West Virginia Geological and Economic Survey, and the U.S. Geological Survey have developed an Appalachian Geo-STEM camp where high school students can engage in geoscience through outdoor adventure education activities. The university is also working with three southern counties to create an Appalachian Geopark that showcases the region's coal, caves, rivers, and other natural attributes and how these underpin the local culture.

Many state surveys also distribute educational materials, develop geo-tours, post blogs, and help catalog geosite attributes to guide tourism development. "State geological surveys play a vital role in translating the geological origins of interesting features into terms that non-scientists can understand," says Nelia Dunbar, New Mexico's state geologist.

In Texas, the Bureau of Economic Geology has begun several educational initiatives, including the Texas GeoSign Project, to promote geoheritage in the Lone Star State. The Arizona Geological Survey is currently cataloging more than 1,500 unpublished geologic and mining documents related to the Santa Cruz Valley National Heritage Area, which was established earlier this year. And the New Mexico Bureau of Geology & Mineral Resources is developing "e-materials" to help curious visitors understand the stories behind the state's beautiful scenery and the valuable resources like turquoise that are so closely intertwined with New Mexico's rich cultural history.

"Growing awareness of the power of the 'geoheritage' approach has provided us with a pathway to increase awareness of the links between geology and human history," says Dunbar. "We look forward to increasing our reach and relevance through this new direction."

Credit: 
Geological Society of America

Grand ideas, global reverberations: Grand Canyon at its 6 millionth anniversary

image: Grand Canyon on its six millionth anniversary. Photo taken 3 June 2010.

Image: 
Credit National Park Service

Phoenix, Arizona, USA: Etched onto the steep walls of Arizona's 6,000-foot-deep, 277-mile-long Grand Canyon are clues that chronicle the sweeping changes the region has experienced during the past two billion years. The canyon's colorful layers narrate tales of ancient environments come and gone, from lofty mountain ranges and tropical seas to a Saharan-scale desert that once stretched across much of western North America.

The Grand Canyon was carved by the Colorado River, a ribbon of life-giving water that flows through the center of a desert wilderness. It was down this uncharted river that naturalist John Wesley Powell, a one-armed Civil War veteran, and his crew plunged in 1869 when they rafted through the Grand Canyon in what has been called "one of the most daring journeys in American history."

In commemoration of the 150th anniversary of this remarkable expedition, as well as Grand Canyon National Park's 100th anniversary, four sessions at the Geological Society of America's Annual Meeting in Phoenix will highlight the unparalleled role the Grand Canyon plays in advancing scientific discoveries, promoting geoscience research and education, and inspiring the millions of people who visit it each year.

On Monday, 23 Sept. 2019, a keynote session will cover geoscience research, education, and the human connections to the Grand Canyon, an "important but often overlooked space between new scientific research and its societal importance," says co-convener Karl Karlstrom, a University of New Mexico geologist. "These important milestones prompt us to reflect back, to take stock of the present, and also to look forward to the next 100 years."

When geologists look back, says Karlstrom, they really look back -- so much so that he and the other conveners, including Steven Semken from Arizona State University, Eleanour Snow from the U.S. Geological Survey, and Laura Crossey from the University of New Mexico, added a "six millionth" anniversary to the session title.

Current research suggests that six million years ago the Colorado River stitched together several preexisting canyons into an integrated drainage that flowed along the river's current course from the Colorado Plateau to the Gulf of California. "Grand Canyon itself is geologically young when compared to the nearly two-billion-year-old rocks at its bottom," says Karlstrom, "so the conveners added the six millionth geologic anniversary to help put our human time scales into geoperspective."

Keynote speakers will include two Native Americans, Navajo Jason Nez and Ophelia Watahomigie-Corliss, a member of the Havasupai Tribal Council. Watahomigie-Corliss will explain why the centennial year is not a celebration for members of her tribe, and how the changes they have endured as a result of the national park's founding impact them to the present day. Karlstrom hopes these talks will offer "a perspective that mixes some realism, some hope, and direction for improved future partnership."

Two additional oral sessions, one on Monday afternoon and a second on Tuesday morning, 24 September, plus a Wednesday, 25 September, poster session, will consider the Grand Canyon within a broader regional context and cover some of the numerous ongoing scientific debates regarding the Colorado Plateau and Rocky Mountain region--and their global implications.

One of the current debates revolves around the origin of the Great Unconformity, a 1.3-billion-year gap in the Grand Canyon's rock record that Powell recognized. This feature is unusual, says Karlstrom, in that it is the only such gap that appears to be global in its distribution.

Recent research suggests the Great Unconformity encompasses multiple episodes of erosion, each with a different cause. These appear to include the construction and breakup of a supercontinent, a "snowball Earth" episode during which the planet was completely frozen, and "a major flooding of the continent by advancing seas that was (somehow) related to one of the most interesting explosions in animal evolution in Earth history," says Karlstrom.

All four sessions will feature presentations that highlight the importance of the Grand Canyon for advancing geoscience research. These include short-term management of water-related issues, such as providing drinking water for the national park's six million annual visitors, as well as managing the new river ecosystem created by the network of dams placed on the Colorado River. Recent research on Grand Canyon rocks has also revealed new insights into the formation of the North American continent around 1.8-1.7 billion years ago as well as the explosion in the diversity of animal life that occurred about 650-550 million years ago.

Many of these advances, says Karlstrom, have had global reverberations, assuring that the influence of this iconic canyon will extend well beyond its next big set of anniversaries. "The Grand Canyon will continue to be at the forefront of geoscience research, public education, and resource management and sustainability," he says.

Credit: 
Geological Society of America

Marijuana use among US adults with, without medical conditions

What The Study Did: National survey data was used in this study to examine how common marijuana use was among adults with and without medical conditions.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Hongying Dai, Ph.D., of the University of Nebraska Medical Center in Omaha, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.11936)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Researchers find way to kill pathogen resistant to antibiotics

image: Nagoya University researchers and colleagues in Japan have demonstrated a new strategy in fighting antibiotics resistance: the use of artificial haem proteins as a Trojan horse to selectively deliver antimicrobials to target bacteria, enabling their specific and effective sterilization. The technique killed 99.9% of Pseudomonas aeruginosa, a potentially deadly, antibiotic-resistant bacterium present in hospitals. This image shows a solution of the extracellular heme acquisition system protein A (HasA) with gallium phthalocyanine (Left) and the results of sterilization of Pseudomonas aeruginosa and Escherichia coli treated with HasA-bound gallium phthalocyanine by irradiation with near-infrared light (Right).

Image: 
Osami Shoji

Pseudomonas aeruginosa is a dangerous bacterium that causes infections in hospital settings and in people with weakened immune systems. It can cause blood infections and pneumonia, while severe infections can be deadly. Highly resistant to antibiotic treatment, P. aeruginosa is one of the most critical pathogens urgently requiring alternative treatment strategies, according to the World Health Organization.

This bacterium is one of many that have evolved a system that allows them to acquire difficult-to-access iron from the human body. Iron is essential for bacterial growth and survival, but in humans, most of it is locked up within the 'haem' complex of haemoglobin. To get hold of it, P. aeruginosa and other bacteria secrete a protein, called HasA, which latches onto haem in the blood. This complex is recognized by a membrane receptor on the bacterium called HasR, permitting haem entry into the bacterial cell, while HasA is recycled to pick up more haem.

Bioinorganic chemist Osami Shoji of Nagoya University and collaborators have found a way to hijack this 'haem acquisition system' for drug delivery. They developed a powder formed of HasA and the pigment gallium phthalocyanine (GaPc), which, when applied to a culture of P. aeruginosa, was consumed by the bacteria.

"When the pigment is exposed to near-infrared light, harmful reactive oxygen species are generated inside the bacterial cells," explains Shoji. When tested, over 99.99% of the bacteria were killed following treatment with one micromolar of HasA with GaPc and ten minutes of irradiation.

The strategy also worked on other bacteria with the HasR receptor on their membranes, but not on ones without it.

The haem acquisition system is so essential to these bacteria's survival that it is not expected to change, making it unlikely the bacteria will develop resistance to this drug strategy, the researchers believe.

"Our findings support the use of artificial haem proteins as a Trojan horse to selectively deliver antimicrobials to target bacteria, enabling their specific and effective sterilization, irrespective of antibiotic resistance," the team reports in their study.

Credit: 
Nagoya University

Rethinking how cholesterol is integrated into cells

image: 1) Live-Cell Imaging of Sterol Transport to the Yeast Vacuole; 2) Eukaryotic sterol membrane integration; 3) Sterol Affinity of NPC2 and NTD and Sterol Transfer Assays

Image: 
Bjørn Panyella Pedersen, Aarhus University

Most people have heard of "cholesterol levels" and the dangers of high blood cholesterol, one of the main causes of cardiovascular disease. But harmful as high levels can be, cholesterol is also an essential component of all cells and fundamental to a host of the body's important functions. Hormones like estrogen and testosterone are made from cholesterol, for example.

It has been known for a long time that cholesterol is transported around the body in the blood as small particles consisting of fat and protein. In the body's cells, these particles are broken down and cholesterol is released and integrated as part of the cell. Although this process is essential, not just for humans, but for all animals and plants, surprisingly little is known about how cholesterol is actually incorporated into the cells after the breakdown of these particles.

In recent years, interest in how cholesterol is integrated and incorporated - and not least how this process is regulated - has grown tremendously. This is partly due to the huge pharmaceutical potential in regulating this process, as shown with blockbuster drugs such as Zetia, which regulate cholesterol uptake from food. In addition, it has been shown that many viruses, including Ebola, use the same process to infect cells.

During the past five years, researchers from Aarhus University have collaborated with researchers from the University of Southern Denmark and the University of Leeds to investigate how cholesterol is incorporated into cells, using biophysical and structural biological methods. The results have led to a groundbreaking insight into the process and to a new model for how cholesterol is integrated and incorporated that fundamentally changes our prior understanding of the process.

The results have just been published in the world-leading journal Cell.

Credit: 
Aarhus University

Open Medicare data helps uncover potential hidden costs of health care

BLOOMINGTON, Ind. -- An interdisciplinary team of Indiana University scientists studying Medicare data has found an association between health care industry payments to medical providers for non-research expenses and what these providers charge for medical services -- shedding new light on potential hidden costs to the public.

Their findings, published Sept. 20 in Nature Communications, demonstrate that medical providers receiving higher amounts of industry payments tend to bill higher drug and medical costs. Specifically, they found that a 10 percent increase in industry payments to medical providers is associated with 1.3 percent higher medical costs and 1.8 percent higher drug costs.

For example, a $25 increase in annual industry payments to a typical medical provider would be associated with approximately $1,100 higher medical costs and $100 higher drug costs.
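The reported figures behave like a constant elasticity: costs scale by roughly 1.3% for every 10% rise in payments. A minimal sketch of that arithmetic is below; the elasticity comes from the article, but the baseline payment and cost levels are assumed purely for illustration and do not come from the study.

```python
# Sketch of the elasticity arithmetic behind the reported association.
# The elasticity (~0.13, i.e. 1.3% cost change per 10% payment change)
# is from the article; the baseline levels below are hypothetical.

def implied_cost_change(baseline_payments, payment_increase,
                        baseline_costs, elasticity=0.13):
    """Cost change associated with a payment increase under constant elasticity."""
    pct_payment_change = payment_increase / baseline_payments  # e.g. 0.10 for +10%
    return baseline_costs * elasticity * pct_payment_change

# Assumed baselines: $250/year in industry payments, $85,000/year in billed costs.
delta = implied_cost_change(baseline_payments=250.0, payment_increase=25.0,
                            baseline_costs=85_000.0)
print(f"associated medical-cost change: ${delta:,.0f}")
```

Under these assumed baselines, a $25 payment increase is a 10% rise, implying about $1,100 in higher medical costs, which matches the order of magnitude the authors describe.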

"Let's be clear here, we should not find such an association," said Jorge Mejia, co-author on the paper and an assistant professor of operations and decision technologies at the IU Kelley School of Business. "Our findings raise the possibility that medical providers may be unduly influenced by payments from the healthcare industry."

It's important to note that an association shows that two variables appear to change at the same time, whereas causality implies that one variable causes another variable to change. This study does not prove causality, which the researchers said would be difficult to do with secondary data.

Jorge Mejia's co-authors were Amanda Mejia, assistant professor in the Department of Statistics, and Franco Pestilli, associate professor in the Department of Psychological and Brain Sciences, both at the IU College of Arts and Sciences.

Amanda Mejia said the team controlled for several key variables to rule out the possibility of other drivers of the association between industry payments and medical costs.

"We found that the association was still there after taking into account the size of the practice, its location and drug prescribing levels," she said.

Pestilli said the large Medicare data sets that the researchers used were made openly available as part of the 2010 Affordable Care Act.

"Our research capitalized on such openly shared data," Pestilli said. "We demonstrate the value of open data in providing society with critical insights on hidden costs that can be addressed at the policy level."

But Jorge Mejia said transparency alone is not enough to fix these hidden costs. That's why studies like this one are important; they help interpret the data so the public can better understand what it means.

"As a society, we have had the potential for quantifying and qualifying the influence of the industry on our medical costs," he said. "However, we have not done so. For example, we are just discovering the extent to which certain health care companies may be involved in the current opioid crisis in the U.S. We need tools to guide patients and consumers with all the data that is available."

To help achieve this goal, Jorge Mejia said he hopes Medicare will make it easier for researchers and the public to quantify the effect of the payments received by medical providers by adding the National Provider Identifier (NPI) to their Open Payments data set. Additionally, he hopes the research team's findings will start a conversation about how to communicate this information to consumers.

"We have energy efficiency scorecards for appliances, cars and many consumer products," Jorge Mejia said. "How can the public understand whether their physician is close to the health care industry? Instead of making this about whether it's good or bad, I'd like to kickstart a conversation about how information can be delivered in a simple way. Let's put patients in the driver's seat."

The researchers have several follow-up projects in progress, including one that aims to investigate how industry payments may drive future medical costs, which would bring them one step closer to establishing a causal relationship between payments and costs.

Credit: 
Indiana University

Weathering Antarctic storms -- Weather balloon data boost forecasting skill

image: This is a photograph showing radiosonde observation at Dome Fuji Station in Antarctica. The person in the photo is Dr. Konosuke Sugiura, a co-author of the study.

Image: 
Taichi Ito

Observational data from radiosondes deployed in Antarctica improve the forecasting accuracy for severe Antarctic cyclones, according to a Japanese research team led by the Kitami Institute of Technology, Hokkaido, Japan.

In parts of the Earth that are very sparsely populated, such as the Antarctic, direct observational weather data can be hard to come by, and with Antarctica's extreme climate, failure to accurately predict severe weather can easily become deadly. The team conducted a study that focused on the impacts of these data on forecasting an extreme cyclonic event, and the findings have been accepted and published as early view in Advances in Atmospheric Sciences.

With advancements in satellite technology and computer modeling, forecasting of storms and other weather events is constantly improving. However, accurate forecasts are not based on satellite data alone - they still rely on direct measurements taken at the surface and in the atmosphere. Direct measurements of the atmosphere can be obtained by deploying weather balloons equipped with radiosondes, devices that collect and transmit information about variables such as altitude, temperature, humidity, and wind speed.

The research team looked at the importance of weather radiosonde data in predicting severe weather events over Antarctica and the surrounding Southern Ocean. "We investigated the impact of including additional radiosonde observations from both the research vessel Shirase over the Southern Ocean and from the Dome Fuji Station in Antarctica on forecasting using an atmospheric general circulation model," explains lead author Kazutoshi Sato, an assistant professor at the Kitami Institute of Technology, Japan.

The researchers conducted a forecast experiment that focused on an unusually strong Antarctic cyclonic event that occurred from late December 2017 to early January 2018. Two datasets, one that included the additional radiosonde data and one that excluded those data, were used as the initial values. Only the experiment that included the radiosonde observations successfully captured the cyclone's central pressure, wind speed, and moisture transport 2.5 days in advance. These results show that, even for operational weather forecast centers, collecting radiosonde observations is important for improving the accuracy of Antarctic cyclone forecasts.

However, the sparsity of observations in the Antarctic remains a problem. "Even with the assimilation of the additional radiosonde observations," says co-author Jun Inoue, an associate professor of polar science at the National Institute of Polar Research, part of the Inter-University Research Institute Corporation Research Organization of Information and Systems (ROIS) in Tokyo, Japan, "the experiment was unable to forecast the development of the cyclone four days in advance. That leaves a great deal of room for improvement." In a project called the 'Year of Polar Prediction', many Antarctic stations have deployed additional radiosondes to provide an opportunity to further investigate the impact of the resulting data on weather forecasting in Antarctica.

To provide more accurate weather forecasts, Inoue noted that new additional observation systems need to be developed in the future. Improving severe weather forecasting in Antarctica will continue to be a priority, as the lives of researchers and other personnel in the region may depend on it.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Oil futures volatility and the economy

The drone strike on Saudi Arabia's oil infrastructure has highlighted the fragile and interconnected relationship between crude oil supply and the global economy, with new research bringing these economic ties into greater focus.

"We shouldn't underestimate the importance of geopolitical events in the oil market, as it has the power to impend the stability of our financial world," says University of Technology Sydney Finance researcher Dr Christina Sklibosios Nikitopoulos.

"On 16 September 2019 the oil market witnessed one of the highest intraday moves, with a 15% increase in Brent oil prices and an 14.7% increase in US WTI oil futures. Oil price spikes are seen as a recession barometer, but it is not just price but also volatility that matters," she says.

In a recently released paper, Dr Nikitopoulos, with colleagues Dr Boda Kang from Lacima Group, and Finance Professor Marcel Prokopczuk from Leibniz University Hannover, examined the connections between oil futures volatility and the global economy.

They looked at 30 years of data to discover economic determinants of oil futures volatility over the short, medium and long-term. These included oil-sector variables, financial variables and macroeconomic conditions.

The research revealed how deeply integrated crude oil markets have become with financial markets.

"Investors increasingly regard commodities as an alternative asset class to equities or bonds, and crude oil derivatives are the most actively traded commodity," says Dr Nikitopoulos.

Oil futures started trading in 1983, and options in 1986, and since then the market has experienced explosive growth. Daily trading volume has leapt from 21,997 contracts in 2012 to 1.6 million in 2016 and this week surpassed 2 million.

"Our study highlighted the importance of risk premiums in this market, and revealed that credit spreads play a significant role in determining short-term and medium-term variation in oil futures prices," she says.

In the bond market, the term structure - the rates at which people can borrow or lend over different periods - is seen as an important economic signal: an upward-sloping yield curve signals growth, while a downward-sloping one signals recession.

Term structures in oil markets can be seen in a similar light, where contango (where the futures price of a commodity is higher than the spot price) or backwardation (where the spot or cash price of a commodity is higher than the forward price) provide an economic signal.
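The contango/backwardation distinction described above comes down to a simple price comparison. The sketch below, using entirely hypothetical prices, shows the classification rule; it is an illustration of the definitions, not anything from the study itself.

```python
# Illustrative sketch (hypothetical prices): classifying an oil futures
# term structure as contango or backwardation by comparing the spot
# price with a later-dated futures price.

def term_structure(spot: float, futures: float) -> str:
    """Label the curve: futures above spot is contango,
    spot above futures is backwardation."""
    if futures > spot:
        return "contango"
    if futures < spot:
        return "backwardation"
    return "flat"

# Hypothetical example: after a supply shock, near-term scarcity can push
# the spot price above deferred futures, i.e. backwardation.
print(term_structure(spot=68.0, futures=62.5))  # backwardation
print(term_structure(spot=55.0, futures=58.0))  # contango
```

In practice traders compare a whole strip of futures maturities rather than a single spot/futures pair, but the sign of the slope carries the same signal.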

Dr Nikitopoulos says the expected supply shortfall following the drone strike would cause oil futures markets to remain in backwardation for a while.

The researchers found that, after 2004 (the beginning of the financialisation of commodity markets), hedging pressure, the VIX (an equity market volatility index), credit spreads, industrial production and the US dollar index were all drivers of short-term volatility.

"This supports the notion of volatility spill-overs between equity and commodity markets, which has strengthened in the past 10 years," says Dr Nikitopoulos.

"It also supports the notion that oil volatility acts as a recession barometer, and fears about the impact of oil shocks on financial stability are justified," she says.

Medium-term volatility was consistently related to open interest (a measure of trading activity) and credit spreads, while oil sector variables such as inventory and consumption had a measurable impact after 2004 due to structural changes in the economy and the oil sector.

Dr Nikitopoulos argues that because oil futures volatility is a product of interaction between the oil-sector and the economy, there is a need for mutually consistent policies.

"Oil markets should be the focus of global discussions by policy makers, not just individual decisions from the US Commodity Futures Trading Commission or OPEC," she says.

"Crude oil futures volatility plays an important role in the global economy and has significant implications for market participants - from oil producers and institutional investors, to traders and market regulators.

"And while the US economy can manage this most recent oil shock with its own shale oil production and opening of strategic reserves, it is countries like Australia that suffer the most through increased fuel costs," she says.

Credit: 
University of Technology Sydney

Engineered bacterial biofilms immobilizing nanoparticles enable diverse catalytic applications

image: Diverse catalytic applications of tunable functional E. coli biofilms with anchored nano-objects. (a) The biofilm-anchored Au NPs enable the recyclable catalytic reduction of the toxic p-nitrophenol (PNP) into the harmless p-aminophenol (PAP). (b) The biofilm-anchored heterogeneous nanostructures (Au NPs/Cd0.9Zn0.1S QDs) photo-catalyze the degradation of organic dyes to low-toxic products based on facile light-induced charge separation. (c) The biofilm-anchored quantum dots coupled with engineered strain enable photo-induced hydrogen production. Electrons are transferred from QDs to hydrogenase using methyl viologen (MV) as a mediator.

Image: 
©Science China Press

Nano-scale objects (1-100 nm) are desirable catalysts because their high surface-area-to-volume ratios expose more catalytically active sites. Their nano-scale nature, however, brings attendant challenges, such as leakage of nano-catalysts into the ambient environment and difficulty in reusing them over repeated reaction cycles. A major strategy for addressing these challenges has been the immobilization of nano-objects on various substrates via a variety of technological approaches. However, inorganic and bio-derived or bio-inspired substrates lack "biology-only" attributes like self-regeneration, cellular-growth-based scalability, and the ability of cells to biosynthesize complex enzymes, substrates, co-enzymes, or other required reagents or reaction components in situ. Moreover, studies that have immobilized nano-objects directly on cell surfaces have reported damage to the cells.

The Zhong group from the Materials and Physical Biology Division, at ShanghaiTech University has made a major conceptual advance in developing a new abiotic/biotic interface towards the integration and immobilization of nanoscale objects with living cells for catalysis. In brief, they showed how engineered amyloid monomers expressed, secreted and assembled in the extracellular matrix of living Escherichia coli (E. coli) biofilms can be harnessed to anchor functional nano-scale catalysts to make highly efficient, scalable, tunable, and reusable living catalyst systems. In their proof-of-concept studies, they demonstrated three simple catalytic systems: biofilm-anchored gold nanoparticles to degrade the pollutant p-nitrophenol, biofilm-anchored hybrid Cd0.9Zn0.1S quantum dots (QDs) and gold nanoparticles to efficiently degrade organic dyes, and biofilm-anchored CdSeS@ZnS QDs in a dual bacterial strain semi-artificial photosynthesis system for hydrogen production. As revealed in their studies, the extracellular matrix in biofilms indeed provides an ideal milieu for interfacing and anchoring nano-objects for direct catalysis and for their integration with the metabolism of living cells: even after multiple rounds of reactions, nano-catalysts were still robustly anchored to biofilms and the E. coli cells were still alive for easy regeneration. Importantly, such an approach opens up access to the powerful and unique attributes of living systems.

Nature offers a large diversity of bacterial biofilms with different functionalities, and this study thus lays the conceptual foundation for coupling the uniquely dynamic properties and capacities of these living materials with highly reactive nanoparticles to solve challenges in bioremediation, bioconversion, and energy. The work should spur further research into more efficient and industrially important reaction systems built by integrating more intricate biofilm/inorganic hybrid catalytic systems.

Credit: 
Science China Press

Surface melting causes Antarctic glaciers to slip faster towards the ocean

image: Surface meltwater draining through the ice and beneath Antarctic glaciers is causing sudden and rapid accelerations in their flow towards the sea, according to new research.

Image: 
Google Earth

Study shows for the first time a direct link between surface melting and short bursts of glacier acceleration in Antarctica

During these events, Antarctic Peninsula glaciers move up to 100 per cent faster than average

Scientists call for these findings to be accounted for in sea level rise predictions

Surface meltwater draining through the ice and beneath Antarctic glaciers is causing sudden and rapid accelerations in their flow towards the sea, according to new research.

This is the first time scientists have found that melting on the surface impacts the flow of glaciers in Antarctica.

Using imagery and data from satellites alongside regional climate modelling, scientists at the University of Sheffield have found that meltwater is causing some glaciers to move at speeds 100 per cent faster than average (up to 400m per year) for a period of several days multiple times per year.

Glaciers move downhill under gravity via two mechanisms: the internal deformation of ice, and basal sliding, where they slide over the ground beneath them, lubricated by liquid water.

The new research, published today in Nature Communications, shows that accelerations in Antarctic Peninsula glaciers' movements coincide with spikes in snowmelt. This association occurs because surface meltwater penetrates to the ice bed and lubricates glacier flow.

The scientists expect that as temperatures continue to rise in the Antarctic, surface melting will occur more frequently and across a wider area, making it an important factor in determining the speed at which glaciers move towards the sea.

Ultimately, they predict that glaciers on the Antarctic Peninsula will behave like those in present-day Greenland and Alaska, where meltwater controls the size and timing of variations in glacier flow across seasons and years.

The effects of such a major shift in Antarctic glacier melt on ice flow have not yet been incorporated into the models used to predict the future mass balance of the Antarctic Ice Sheet and its contribution to sea level rise.

Dr Jeremy Ely, Independent Research Fellow at the University of Sheffield's Department of Geography and author of the study, said: "Our research shows for the first time that surface meltwater is getting beneath glaciers in the Antarctic Peninsula - causing short bursts of sliding towards the sea 100% faster than normal.

"As atmospheric temperatures continue to rise, we expect to see more surface meltwater than ever, so such behaviour may become more common in Antarctica.

"It's crucial that this factor is considered in models of future sea level rise, so we can prepare for a world with fewer and smaller glaciers."

Pete Tuckett, who made the discovery while studying for his Masters in Polar and Alpine Change at the University of Sheffield, said: "The direct link between surface melting and glacier flow rates has been well documented in other regions of the world, but this is the first time we have seen this coupling anywhere in Antarctica.

"Given that atmospheric temperatures, and hence surface melt rates, in Antarctica are predicted to increase, this discovery could have significant implications for future rates of sea level rise."

Credit: 
University of Sheffield

Untapped resource, or greenhouse gas threat, found below rifting axis off Okinawa coast

image: Researchers at Kyushu University have located a large gas reservoir below an axis of rifting based on an automated method for deriving seismic pressure wave velocity from seismic reflection data. The reservoir can be seen in this two-dimensional seismic velocity mapping, which spans a depth of about 3.5 km below sea level and a distance of about 6.5 km, as a dark-blue area of low velocity within green areas of higher velocity. Identification of the reservoir was possible because of the greatly enhanced resolution provided by the automated technique compared to the manual analysis methods used to date. Depending on the nature of the gas, which is likely mainly carbon dioxide, methane, or a mixture of the two, this reservoir found in the Okinawa Trough could be a potential natural resource or an environmental concern.

Image: 
Takeshi Tsuji, Kyushu University

Analyzing reflections of seismic pressure waves by the subseafloor geology off southwestern Japan, researchers at Kyushu University have found the first evidence of a massive gas reservoir where the Earth's crust is being separated. Depending on its nature, the trapped gas could be a potential untapped natural resource or a source of greenhouse gases waiting to escape, raising the need for awareness of similar reservoirs around the world.

While the ocean can seem calm on the surface, the ocean depths can experience intense thermal activity as hot magma seeps from locations where the Earth's upper layers are being pulled apart--a process called rifting. In such areas, elevated levels of carbon dioxide and methane gas can be present in the water, possibly escaping from magma or being produced by microbial organisms or the interaction of organic-rich sediment with hot water.

In a new study published in Geophysical Research Letters, researchers from Kyushu University's International Institute for Carbon-Neutral Energy Research (I2CNER) now report that some of these gases may actually get trapped underground, leading to the existence of a massive gas reservoir beneath the axis along which rifting is occurring in the Okinawa Trough.

To find the reservoir, the researchers analyzed measurements of how geological structures reflect seismic pressure waves generated by an acoustic source carried by a boat to the study area. Applying an automated calculation technique to this seismic data, they were able to create a two-dimensional map of the velocities at which the pressure waves travel through the ground with a much higher resolution than previous manual techniques.

"Seismic pressure waves generally travel more slowly through gases than through solids," explains study co-author Andri Hendriyana. "Thus, by estimating the velocity of seismic pressure waves through the ground, we can identify underground gas reservoirs and even get information on how saturated they are. In this case, we found low-velocity pockets along the rifting axis near Iheya North Knoll in the middle of the Okinawa Trough, indicating areas filled with gas."
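The logic Hendriyana describes - gas-charged zones show up as anomalously slow cells in the velocity map - can be caricatured with a simple threshold scan. The sketch below is purely illustrative (the grid values and the 2.0 km/s cutoff are hypothetical), not the automated technique the Kyushu team actually used.

```python
# Illustrative sketch only (not the authors' method): flagging possible
# gas-filled zones in a 2-D seismic P-wave velocity map by marking cells
# whose velocity falls well below the surrounding background. All values
# here are hypothetical.

def flag_low_velocity(grid, threshold):
    """Return (row, col) indices of cells below the velocity threshold."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if v < threshold]

# Hypothetical velocities in km/s: water-saturated sediment ~2.5-3.0,
# while a gas-charged pocket appears as anomalously slow cells.
velocity = [
    [2.8, 2.9, 2.7],
    [2.6, 1.6, 1.7],   # low-velocity pocket
    [2.9, 2.8, 2.7],
]
print(flag_low_velocity(velocity, threshold=2.0))  # [(1, 1), (1, 2)]
```

The real analysis infers a continuous velocity model from seismic reflection travel times before any such interpretation step, which is where the resolution gain of the automated method matters.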

At this stage, the researchers are still not sure if the reservoirs are mainly filled with carbon dioxide or methane. If methane, the gas could be a potential natural resource. However, both carbon dioxide and methane contribute to the greenhouse effect, so the rapid, uncontrolled release of either gas from such a large reservoir could have significant environmental implications.

"While many people focus on greenhouse gases made by humans, a huge variety of natural sources also exist," says corresponding author Takeshi Tsuji. "Large-scale gas reservoirs along a rifting axis may represent another source of greenhouse gases that we need to keep our eyes on. Or, they could turn out to be a significant natural resource."

As for how the gas is trapped, one possibility is that layers of impermeable sediment such as clay could prevent the gas from escaping porous underlying layers of materials such as pumice. Based on the flow of heat around the study area, the researchers think another possibility is that a low-permeability cap of methane hydrate--a methane-containing ice--acts as the lid.

"Zones like the one we investigated are not uncommon along rifts, so I expect that similar reservoirs may exist elsewhere in the Okinawa Trough as well as other sediment-covered continental back-arc basins around the world," explains Tsuji.

Credit: 
Kyushu University

Why is the brain disturbed by harsh sounds?

image: Smooth and rough sounds activate different brain networks. While smooth sounds induce responses mainly in the 'classical' auditory system, rough sounds activate a wider brain network involved in processing aversion and salience.

Image: 
© UNIGE

Why do the harsh sounds emitted by alarms or human shrieks grab our attention? What is going on in the brain when it detects these frequencies? Neuroscientists from the University of Geneva (UNIGE) and Geneva University Hospitals (HUG), Switzerland, have been analysing how people react when they listen to a range of different sounds, the aim being to establish the extent to which repetitive sound frequencies are considered unpleasant. The scientists also studied the areas inside the brain that were stimulated when listening to these frequencies. Surprisingly, their results - which are published in Nature Communications - showed that, in addition to the conventional sound-processing circuit, cortical and sub-cortical areas involved in processing salience and aversion are also recruited. This is a first, and it explains why the brain goes into a state of alert on hearing this type of sound.

Alarm sounds, whether artificial (such as a car horn) or natural (human screams), are characterised by repetitive sound fluctuations, usually at frequencies between 40 and 80 Hz. But why were these frequencies selected to signal danger? And what happens in the brain to hold our attention to such an extent? Researchers from UNIGE and HUG played 16 participants repetitive sounds with repetition rates between 0 and 250 Hz, spacing the repetitions progressively closer together, in order to define the frequencies that the brain finds unbearable. "We then asked participants when they perceived the sounds as being rough (distinct from each other) and when they perceived them as smooth (forming one continuous and single sound)," explains Luc Arnal, a researcher in the Department of Basic Neurosciences in UNIGE's Faculty of Medicine.

Based on the responses of participants, the scientists were able to establish that the upper limit of sound roughness is around 130 Hz. "Above this limit," continues Arnal, "the frequencies are heard as forming only one continuous sound." But why does the brain judge rough sounds to be unpleasant? In an attempt to answer this question, the neuroscientists asked participants to listen to different frequencies, which they had to rate on a scale of 1 to 5, 1 being bearable and 5 unbearable. "The sounds considered intolerable were mainly between 40 and 80 Hz, i.e. in the range of frequencies used by alarms and human screams, including those of a baby," says Arnal. Since these sounds are perceptible from a distance, unlike a visual stimulus, capturing attention through them is crucial from a survival perspective. "That's why alarms use these rapid repetitive frequencies to maximise the chances that they are detected and gain our attention," says the researcher. In fact, when the repetitions are spaced less than about 25 milliseconds apart, the brain cannot anticipate and therefore suppress them. It remains constantly on alert and attentive to the stimulus.
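The 25-millisecond figure follows directly from the repetition rates quoted above: a rate of f Hz means one repetition every 1000/f milliseconds. A minimal worked example of that arithmetic:

```python
# Quick arithmetic behind the 25 ms figure: a repetition rate of f Hz
# means one repetition every 1000/f milliseconds, so the 40-80 Hz
# "rough" band corresponds to gaps of 12.5-25 ms - inside the window
# the brain reportedly cannot anticipate and suppress.

def repetition_interval_ms(frequency_hz: float) -> float:
    """Interval between successive repetitions, in milliseconds."""
    return 1000.0 / frequency_hz

for f in (40, 80, 130):
    print(f"{f} Hz -> {repetition_interval_ms(f):.1f} ms between repetitions")
# 40 Hz sits exactly at the 25.0 ms boundary, 80 Hz at 12.5 ms, and the
# 130 Hz roughness limit at about 7.7 ms, where the sound fuses into one.
```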

Harsh sounds fall outside the conventional auditory system

The researchers then attempted to find out what actually happens in the brain: why are these harsh sounds so unbearable? "We used an intracranial EEG, which records brain activity inside the brain itself in response to sounds," explains Pierre Mégevand, a neurologist and researcher in the Department of Basic Neurosciences in the UNIGE Faculty of Medicine and at HUG.

When the sound is perceived as continuous (above 130 Hz), the auditory cortex in the upper temporal lobe is activated. "This is the conventional circuit for hearing," says Mégevand. But when sounds are perceived as harsh (especially between 40 and 80 Hz), they induce a persistent response that additionally recruits a large number of cortical and sub-cortical regions that are not part of the conventional auditory system. "These sounds engage the amygdala, hippocampus and insula in particular, all areas related to salience, aversion and pain. This explains why participants experienced them as being unbearable," says Arnal, who was surprised to learn that these regions were involved in processing sounds.

This is the first time that sounds between 40 and 80 Hz have been shown to mobilise these neural networks, although the frequencies have been used for a long time in alarm systems. "We now understand at last why the brain can't ignore these sounds," says Arnal. "Something particular happens at these frequencies, and there are also many illnesses that show atypical brain responses to sounds at 40 Hz. These include Alzheimer's, autism and schizophrenia." The neuroscientists will now investigate the networks stimulated by these frequencies to see whether it could be possible to detect these illnesses early by stimulating the circuit activated by the sounds.

Credit: 
Université de Genève

Sponge-like action of circular RNA aids heart attack recovery, Temple-led team discovers

(Philadelphia, PA) - The human genetic blueprint is like a string of code. To follow it, the code, or DNA, is transcribed into shorter strings of RNA. While some of these shorter strings carry instructions for making proteins - the functional units of cells - most RNA is not involved in protein production. Among these noncoding RNAs are the recently discovered circular RNAs, so-named because of their unusual ring shape (most other RNAs are linear).

Circular RNAs, like other noncoding RNAs, were thought to be nonfunctional, but recent evidence suggests otherwise. Circular RNAs may in fact act like sponges to "soak up," or bind, other molecules, including microRNAs and proteins, and now, new work by researchers at the Lewis Katz School of Medicine at Temple University (LKSOM) and colleagues supports this idea. They describe, for the first time, a circular RNA that fills a critical role in tissue repair after heart attack, thanks to its ability to soak up harmful molecules.

The study was published online September 20 in the journal Nature Communications.

"We discovered that a circular RNA known as circFndc3b, when added therapeutically to the injured heart after surgically induced heart attack in mice, enhances cardiac repair and helps restore heart function," explained Raj Kishore, PhD, Professor of Pharmacology and Medicine and Director of the Stem Cell Therapy Program in the Center for Translational Medicine at LKSOM and senior investigator on the new report. "We attributed these effects of circFndc3b to its ability to function like a 'sponge,' binding a protein called FUS that mediates cell death and reduces vascular growth, which hinders heart tissue repair."

Dr. Kishore and colleagues focused their investigation on circFndc3b after finding that this particular circular RNA was significantly decreased in the heart in mice that had experienced a heart attack. "This observation led us to wonder whether the change in circFndc3b expression meant that it was important functionally in the heart," Dr. Kishore said.

To investigate this possibility, a gene product inducing circFndc3b overexpression was injected into the hearts of mice after heart attack. Subsequent examination showed that within eight weeks of injection, treated mice experienced gains in heart function and in survival compared to their untreated counterparts. There was also evidence within heart tissue that new blood vessels had started to form, greatly aiding the tissue repair process.

The findings offer exciting insight into circular RNAs and the significance of their potential role as molecular sponges that limit the activity of damaging molecules. "CircFndc3b specifically soaked up an RNA binding protein that suppresses blood vessel formation," Dr. Kishore explained. "In doing so, it made way for new vessels to grow."

Dr. Kishore and colleagues are now in the process of developing a large animal model to further investigate the therapeutic potential of circFndc3b. The team also wants to begin analyzing plasma samples from patients just after heart attack to investigate whether specific circulating RNAs could serve as biomarkers for heart disease or injury and to get a better sense of their clinical significance.

Credit: 
Temple University Health System