
Satellite study proves global quantum communication will be possible

Researchers in Italy have demonstrated the feasibility of quantum communications between high-orbiting global navigation satellites and a ground station, with an exchange at the single photon level over a distance of 20,000 km.

The milestone experiment proves the feasibility of secure quantum communications on a global scale, using the Global Navigation Satellite System (GNSS). It is reported in full today in the journal Quantum Science and Technology.

Co-lead author Dr Giuseppe Vallone is from the University of Padova, Italy. He said: "Satellite-based technologies enable a wide range of civil, scientific and military applications like communications, navigation and timing, remote sensing, meteorology, reconnaissance, search and rescue, space exploration and astronomy.

"The core of these systems is to safely transmit information and data from orbiting satellites to ground stations on Earth. Protection of these channels from a malicious adversary is therefore crucial for both military and civilian operations.

"Space quantum communications (QC) represent a promising way to guarantee unconditional security for satellite-to-ground and inter-satellite optical links, by using quantum information protocols as quantum key distribution (QKD)."

The team's results show the first exchange of a few photons per pulse between two different satellites in the Russian GLONASS constellation and the Space Geodesy Centre of the Italian Space Agency.

Co-lead author Professor Paolo Villoresi said: "Our experiment used the passive retro-reflectors mounted on the satellites. By estimating the actual losses of the channel, we can evaluate the characteristics of both a dedicated quantum payload and a receiving ground station.

"Our results prove the feasibility of QC from GNSS in terms of achievable signal-to-noise ratio and detection rate. Our work extends the limit of long-distance free-space single-photon exchange. The longest channel length previously demonstrated was around 7,000 km, in an experiment using a Medium-Earth-Orbit (MEO) satellite that we reported in 2016."

Although high-orbit satellites pose a formidable technological challenge, because losses over the optical channel grow steeply with distance, Professor Villoresi explained the team's reasoning for focusing on high-orbiting satellites in their study.

He said: "The high orbital speed of low earth orbit (LEO) satellites is very effective for the global coverage but limits their visibility periods from a single ground station. On the contrary, using satellites at higher orbits can extend the communication time, reaching few hours in the case of GNSS.

"QC could also offer interesting solutions for GNSS security for both satellite-to-ground and inter-satellite links, which could provide novel and unconditionally secure protocols for the authentication, integrity and confidentiality of exchanged signals."

Dr Giuseppe Bianco, Director of the Space Geodesy Centre of the Italian Space Agency and a co-author, said: "The single photon exchange with a GNSS satellite is an important result from both scientific and application perspectives. It fits perfectly in the Italian roadmap for Space Quantum Communications, and it is the latest achievement of our collaboration with the University of Padua, which has been progressing steadily since 2003."

Credit: 
IOP Publishing

High sodium intake may contribute to increased heart-disease deaths in China

DALLAS, Dec. 19, 2018 -- Nearly a fifth of heart disease deaths among adults aged 25-69 in a large Chinese province in 2011 may be attributable to high-sodium diets. Reducing salt intake in the region could potentially save thousands of lives, according to new research in the Journal of the American Heart Association, the Open Access Journal of the American Heart Association/American Stroke Association.

Researchers assessed the impact of dietary sodium with data from the Shandong-Ministry of Health Action on Sodium and Hypertension (SMASH) program. They found that nearly 20 percent of deaths caused by cardiovascular disease among adults aged 25 to 69 in Shandong in 2011 could be attributable to the systolic blood pressure-raising effect of high-sodium diets. That figure is much higher than the worldwide average (9.5 percent) thought to be attributable to the blood pressure-raising effects of sodium.
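"Attributable" here has a precise epidemiological meaning, usually computed as a population attributable fraction (PAF). Below is a minimal sketch of the standard Levin formula with hypothetical prevalence and relative-risk values, chosen only so the output lands near the article's 20 percent figure; they are not numbers from the SMASH analysis.

```python
# Population attributable fraction (PAF), Levin's formula.
# p: prevalence of the exposure (high-sodium diet); rr: relative risk
# of cardiovascular death given the exposure. Both values below are
# hypothetical placeholders, not SMASH estimates.
def paf(p: float, rr: float) -> float:
    """Fraction of deaths that would not occur without the exposure."""
    return p * (rr - 1) / (p * (rr - 1) + 1)

print(f"PAF = {paf(p=0.8, rr=1.3):.1%}")  # ~19.4%
```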

"The burden of cardiovascular disease attributable to a high sodium diet is extreme but preventable and measures to reduce salt intake are urgently recommended," said Shiwei Liu, Ph.D., study senior author and epidemiology professor at the Chinese Center for Disease Control and Prevention in Beijing. "Sodium intake is high in China, mainly from home cooking, eating out and pickled foods, especially in northern China such as Shandong province."

Unlike in Western countries such as the United States, where more than 70 percent of sodium comes from processed, prepackaged and restaurant foods, about 76 percent of dietary sodium in China comes from home cooking. In Shandong, residents have a higher intake of dietary salt and a higher rate of adults with high blood pressure than China's national average.

Researchers used blood pressure values for 13,272 SMASH participants aged 25-69, sodium intake measured from 24-hour urine excretions in 1,769 adults in the program, and death rates for the province. They estimated that 16,100 deaths from cardiovascular diseases in adults aged 25-69 were attributable to higher sodium intake. That number includes 5,600 deaths from ischemic heart disease and 9,000 from stroke.

Researchers estimated that if sodium intake were reduced from the 2011 Shandong baseline of 12,500 milligrams per day (mg/d) to 3,500 mg/d, as many as 8,800 deaths from cardiovascular disease could potentially be averted.

SMASH started in 2011 as the first campaign on sodium reduction in Shandong, which has a population of nearly 100 million people. Based on its success, similar programs have been established in other regions of the country, Liu said.

Researchers said results of the study can raise public awareness and provide quantitative evidence to effectively evaluate the program when it ends. And the methodology used can be applied to similar programs conducted in China and other countries.

A limitation of the study is that the quantitative effects of high sodium intake on blood pressure, and the relative risks of cardiovascular disease associated with systolic blood pressure, were obtained from global studies. Because those pooled data came mainly from white populations, the estimates may not represent the true effect in Shandong or in the Chinese population as a whole.

"High intake of sodium is harmful, and the potential benefits of reducing sodium intake are considerable," Liu said. "People should control the consumption of salt in their life, including cooking at home and eating out. Medical professionals and other healthcare providers should help their patients understand that high sodium consumption is one of the most important risks that can lead to cardiovascular disease," she said.

American Heart Association volunteer expert Lawrence J. Appel, M.D., M.P.H., noted: "High sodium intake is a global public health problem and most people, regardless of their nationality, can benefit from eating less sodium."

Appel is chair of the American Heart Association's Sodium Reduction Taskforce and the director of the Welch Center for Prevention, Epidemiology and Clinical Research.

"While strategies to lower sodium differ by country based on dietary sources of sodium and other factors, the benefits of reduced sodium intake are substantial and should be broadly implemented," he said.

The American Heart Association recommends that people:

Choose foods with less sodium and prepare foods with little or no salt.

Aim to eat no more than 2,300 mg of sodium per day.

Reducing daily intake to 1,500 mg is desirable because it can lower blood pressure even further.

If you can't meet these goals right now, even reducing sodium intake by 1,000 mg/d can benefit blood pressure.

Credit: 
American Heart Association

Rabbit gene helps houseplant detoxify indoor air

Our homes are supposed to be safe havens from the outside world. However, studies have shown that household air is more polluted than either office or school air, exposing children and home workers to higher levels of carcinogens than the general population. Now, researchers have made a genetically modified houseplant that can efficiently remove at least two toxins from the air. They report their results in ACS' journal Environmental Science & Technology.

Indoor air often contains volatile organic compounds such as formaldehyde, benzene and chloroform. These toxins come from many sources, including cooking, showering, furniture and smoking. House plants can remove some toxins from the air, but they aren't very efficient: A homeowner would need more than 20 plants to remove formaldehyde from a typical room, researchers estimate. Stuart Strand and colleagues wondered if introducing a mammalian gene called CYP2E1 to a common houseplant, pothos ivy (Epipremnum aureum), would boost the plant's detoxifying potential. This gene encodes cytochrome P450 2E1, an enzyme that breaks down a wide range of volatile organic compounds found in the home.

The team introduced rabbit CYP2E1 to the ivy's genome and injected benzene or chloroform gas into closed vials that contained growing plants. After 3 days, the concentrations of these compounds in the vials had dropped dramatically, and by 8 days, chloroform was barely detectable. In contrast, the compounds' concentrations in vials containing unmodified ivy or no plants did not change. The researchers estimate that a hypothetical biofilter made of the genetically modified plants would deliver clean air at rates comparable to commercial home particulate filters.

Credit: 
American Chemical Society

Study finds dinosaurs battled overheating with nasal air-conditioning

image: In Panoplosaurus (left), the nasal passages were longer than the skull itself, and in Euoplocephalus (right) they were almost twice as long as the skull, as well as coiled up the snout.

Image: 
Jason Bourke, Ph.D., NYITCOM at A-State

JONESBORO, AR; December 19, 2018--Researchers have long wondered how gigantic, heavily armored dinosaurs, such as the club-tailed ankylosaurs that lived in sweltering climates, avoided overheating. While their large bodies were adept at retaining heat, their sheer size created a heat-shedding problem that would have put them at risk of overheating, even on cloudy days. In the absence of a protective cooling mechanism, the delicate neural tissue of their brains could be damaged by the hot blood from the core of their bodies.

Now, as seen in the December 19 issue of PLOS ONE, researchers, led by a paleontologist from New York Institute of Technology College of Osteopathic Medicine at Arkansas State University (NYITCOM at A-State), have posed a new theory--the dinosaurs had an intricate cooling system in their snouts.

"The large bodies of many dinosaurs must have gotten very hot in warm Mesozoic climates, and we'd expect their brains to adapt poorly to these conditions. With that in mind, we wanted to see if there were ways to protect the brain from 'cooking.' It turns out the nose may be the key, and likely housed a 'built-in air conditioner,'" said Jason Bourke, Ph.D., assistant professor of basic sciences, NYITCOM at A-State, and lead author of the study.

According to the researchers, smell may be a primary function of the nose, but noses are also important heat exchangers, ensuring that air is warmed and humidified before it reaches the delicate lungs. To accomplish this effective air conditioning, birds and mammals--including humans--rely on thin curls of bone and cartilage within their nasal cavities, called turbinates, which increase the surface area and allow for air to come into greater contact with the nasal walls.

The team used computed tomography (CT) scanning and a powerful engineering modeling approach called computational fluid dynamics to simulate how air moved through the nasal passages of two different ankylosaur species, the hippo-sized Panoplosaurus and the larger, rhino-sized Euoplocephalus. These tests examined how well ankylosaur noses transferred heat from the body to the inhaled air. The researchers found that ankylosaurs lacked turbinates and instead evolved longer, coiled noses. Despite this strange anatomy, these noses were just as efficient at warming and cooling respired air.

"A decade ago, my colleague and I published the discovery that ankylosaurs had extremely long nasal passages coiled up in their snouts," said study co-author Lawrence Witmer, professor of anatomy, Ohio University Heritage College of Osteopathic Medicine. "These convoluted airways resembled a child's 'crazy-straw'--completely unexpected and seemingly without reason, until now."

In Panoplosaurus, the nasal passages were a bit longer than the skull itself, and in Euoplocephalus they were almost twice as long as the skull, as well as coiled up the snout. To see if nasal passage length was the reason for this efficiency, Bourke ran alternative models with shorter, simpler nasal passages that ran directly from the nostril to the throat, as in most other animals. The results clearly showed that nose length was indeed the key to their air-conditioning ability.

"When we stuck a short, simple nose in their snouts, heat-transfer rates dropped by over 50% in both dinosaurs. They were less efficient and didn't work very well," said Bourke.

Another line of evidence that these noses were air conditioners that helped cool the brain came from analyses of blood flow. When blood vessels were reconstructed, based on bony grooves and canals, the team found a rich blood supply running right next to these convoluted nasal passages. According to Ruger Porter, lecturer at the Ohio University Heritage College of Osteopathic Medicine and another of the study's co-authors, hot blood from the body core would travel through these blood vessels and transfer its heat to the incoming air. Simultaneously, evaporation of moisture in the long nasal passages cooled the venous blood destined for the brain.

The complicated nasal airways of these dinosaurs were acting as radiators to cool down the brain with a constant flow of cooled venous blood, allowing them to keep a cool head at all times. This natural engineering feat also may have allowed the evolution of the great sizes of so many dinosaurs.

"This project is an excellent example of how advances in CT scanning, 3-D reconstruction, imaging, and computational fluid dynamics modeling can be used in biological research to test long-standing hypotheses," said Kathy Dickson, a program officer at the National Science Foundation (NSF) that funded the research. "From these new images and models, fossils can provide further insight into extinct organisms like the ankylosaur - in this case, offering an explanation of how unusual features actually function physiologically."

Credit: 
New York Institute of Technology

From eye drops to potential leukaemia treatment

An active ingredient in eye drops that were being developed for the treatment of a form of eye disease has shown promise for treating an aggressive form of blood cancer. Scientists at the Wellcome Sanger Institute, University of Cambridge, University of Nottingham and their collaborators have found that this compound, which targets an essential cancer gene, could kill leukaemia cells without harming non-leukaemic blood cells.

The results, published today (19 December) in Nature Communications, reveal a potential new treatment approach for an aggressive blood cancer with a poor prognosis.

Acute myeloid leukaemia (AML) is a form of blood cancer that affects people of all ages, often requiring months of intensive chemotherapy and prolonged hospital admissions. It develops in cells in the bone marrow, crowding out the healthy cells and in turn leading to life-threatening infections and bleeding.

Mainstream AML treatments have remained unchanged for over thirty years, with chemotherapy still the standard of care, and the majority of cases cannot be cured. A subtype of AML, driven by rearrangements in the MLL gene, has a particularly poor prognosis.

In a previous study, researchers at the Sanger Institute developed an approach, based on CRISPR gene editing technology, which helped them identify more than 400 genes as possible therapeutic targets for different subtypes of AML. One of the genes, SRPK1, was found to be essential for the growth of MLL-rearranged AML. SRPK1 is involved in a process called RNA splicing, which prepares RNA for translation into proteins, the molecules that conduct the majority of normal cellular processes, including growth and proliferation.

In a new study, Sanger Institute researchers and their collaborators set out to determine how inhibition of SRPK1 kills AML cells and whether it has therapeutic potential in this disease. They first showed that genetic disruption of SRPK1 stopped the growth of MLL-rearranged AML cells, and then went on to study the compound SPHINX31, an inhibitor of SRPK1 that was being used to develop an eye drop treatment for retinal neovascular disease - the growth of new blood vessels on the retinal surface that bleed spontaneously and cause vision loss.

The team found that the compound strongly inhibited the growth of several MLL-rearranged AML cell lines, but did not inhibit the growth of normal blood stem cells. They then transplanted patient-derived human AML cells into immunocompromised mice and treated them with the compound. Strikingly, the growth of AML cells was strongly inhibited and the mice did not show any noticeable side effects.

Dr George Vassiliou, joint leader of the research from the Wellcome Sanger Institute and the Wellcome-MRC Cambridge Stem Cell Institute, said: "We have discovered that inhibiting a key gene with a compound being developed for an eye condition can stop the growth of an aggressive form of acute myeloid leukaemia without harming healthy cells. This shows promise as a potential approach for treating this aggressive leukaemia in humans."

SRPK1 controls the splicing of RNA in the production of new proteins. An example of a gene that is affected when SRPK1 is blocked is BRD4, a well-known gene that maintains AML. Inhibiting SRPK1 causes the main form of BRD4 to switch to another form, a change that is detrimental to AML growth.

Dr Konstantinos Tzelepis, joint lead author from the Wellcome Sanger Institute and University of Cambridge, said: "Our study describes a novel mechanism required for leukaemia cell survival and highlights the therapeutic potential of SRPK1 inhibition in an aggressive type of AML. Targeting this mechanism may be effective in other cancers where BRD4 and SRPK1 play a role, such as metastatic breast cancer."

Professor David Bates, from the University of Nottingham and co-founder of biotech company Exonate, which develops eye drops for retinal diseases, said: "When Dr Vassiliou told me that SRPK1 was required for the survival of a form of AML, I immediately wanted to work with him to find out if our inhibitors could actually stop the leukaemia cells growing. The fact that the compound worked so effectively bodes well for its potential development as a new therapy for leukaemia. It will take some time, but there is real promise for a new treatment on the horizon for patients with this aggressive cancer."

Credit: 
Wellcome Trust Sanger Institute

Researchers develop a new houseplant that can clean your home's air

image: Researchers at the University of Washington have genetically modified a common houseplant -- pothos ivy -- to remove chloroform and benzene from the air around it.

Image: 
Mark Stone/University of Washington

We like to keep the air in our homes as clean as possible, and sometimes we use HEPA air filters to keep offending allergens and dust particles at bay.

But some hazardous compounds are too small to be trapped in these filters. Small molecules like chloroform, which is present in small amounts in chlorinated water, or benzene, which is a component of gasoline, build up in our homes when we shower or boil water, or when we store cars or lawn mowers in attached garages. Both benzene and chloroform exposure have been linked to cancer.

Now researchers at the University of Washington have genetically modified a common houseplant -- pothos ivy -- to remove chloroform and benzene from the air around it. The modified plants express a protein, called 2E1, that transforms these compounds into molecules that the plants can then use to support their own growth. The team will publish its findings Wednesday, Dec. 19 in Environmental Science & Technology.

"People haven't really been talking about these hazardous organic compounds in homes, and I think that's because we couldn't do anything about them," said senior author Stuart Strand, who is a research professor in the UW's civil and environmental engineering department. "Now we've engineered houseplants to remove these pollutants for us."

The team decided to use a protein called cytochrome P450 2E1, or 2E1 for short, which is present in all mammals, including humans. In our bodies, 2E1 turns benzene into a chemical called phenol and chloroform into carbon dioxide and chloride ions. But 2E1 is located in our livers and is turned on when we drink alcohol. So it's not available to help us process pollutants in our air.

"We decided we should have this reaction occur outside of the body in a plant, an example of the 'green liver' concept," Strand said. "And 2E1 can be beneficial for the plant, too. Plants use carbon dioxide and chloride ions to make their food, and they use phenol to help make components of their cell walls."

The researchers made a synthetic version of the gene that serves as instructions for making the rabbit form of 2E1. Then they introduced it into pothos ivy so that each cell in the plant expressed the protein. Pothos ivy doesn't flower in temperate climates so the genetically modified plants won't be able to spread via pollen.

"This whole process took more than two years," said lead author Long Zhang, who is a research scientist in the civil and environmental engineering department. "That is a long time, compared to other lab plants, which might only take a few months. But we wanted to do this in pothos because it's a robust houseplant that grows well under all sort of conditions."

The researchers then tested how well their modified plants could remove the pollutants from air compared to normal pothos ivy. They put both types of plants in glass tubes and then added either benzene or chloroform gas into each tube. Over 11 days, the team tracked how the concentration of each pollutant changed in each tube.

For the unmodified plants, the concentration of either gas didn't change over time. But for the modified plants, the concentration of chloroform dropped by 82 percent after three days, and it was almost undetectable by day six. The concentration of benzene also decreased in the modified plant vials, but more slowly: By day eight, the benzene concentration had dropped by about 75 percent.

In order to detect these changes in pollutant levels, the researchers used much higher pollutant concentrations than are typically found in homes. But the team expects that the home levels would drop similarly, if not faster, over the same time frame.
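Those percentages are consistent with simple first-order (exponential) removal, which would also explain the expectation that home-level concentrations fall at similar fractional rates: the removal rate is proportional to concentration, so the percentage decline does not depend on the starting level. A minimal sketch, assuming first-order kinetics (the article does not state the kinetics explicitly):

```python
import math

# First-order removal: C(t) = C0 * exp(-k * t). Fit k from the
# fractional drops reported in the article (first-order kinetics is
# our assumption, not the paper's stated model).
k_chloroform = -math.log(1 - 0.82) / 3  # 82% drop in 3 days
k_benzene    = -math.log(1 - 0.75) / 8  # 75% drop by day 8

print(f"chloroform: k = {k_chloroform:.2f}/day, "
      f"half-life = {math.log(2) / k_chloroform:.1f} days")
print(f"benzene:    k = {k_benzene:.2f}/day, "
      f"half-life = {math.log(2) / k_benzene:.1f} days")
```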

Plants in the home would also need to be inside an enclosure with something to move air past their leaves, like a fan, Strand said.

"If you had a plant growing in the corner of a room, it will have some effect in that room," he said. "But without air flow, it will take a long time for a molecule on the other end of the house to reach the plant."

The team is currently working to increase the plants' capabilities by adding a protein that can break down another hazardous molecule found in home air: formaldehyde, which is present in some wood products, such as laminate flooring and cabinets, and tobacco smoke.

"These are all stable compounds, so it's really hard to get rid of them," Strand said. "Without proteins to break down these molecules, we'd have to use high-energy processes to do it. It's so much simpler and more sustainable to put these proteins all together in a houseplant."

Civil and environmental engineering research technician Ryan Routsong is also a co-author. This research was funded by the National Science Foundation, Amazon Catalyst at UW and the National Institute of Environmental Health Sciences.

Credit: 
University of Washington

Mysteries of the primrose unraveled

image: Hand colored copper plate print, engraved by Sydenham Edwards for William Curtis' Flora Londinensis published between 1777 and 1798. This image of Primula vulgaris published on March 1, 1791, shows pin stigmas in the mouths of the intact flowers.

Image: 
University of East Anglia

Plant scientists at the University of East Anglia have succeeded in unravelling the complete genome sequence of the common primrose -- the plant whose reproductive biology captivated the Victorian naturalist Charles Darwin.

The research team has identified, for the first time, the landscape of genes which operate within the primrose's two different flowering forms that are involved in the reproductive process. This adds fresh insight to a puzzle that scientists have been grappling with for over 150 years.

Primula vulgaris plants flower in one of two ways: they either have a long style and low anthers, or a short style and elevated anthers -- forms known as pins and thrums. Darwin was intrigued as to why some species, such as the primrose, develop two different forms of flowers, and devoted a whole book to the subject. He concluded from his studies that the two forms provide a mechanism to promote outcrossing between individuals.

More recently, a cluster of genes known as the S (Style length) locus has been shown to be the control centre for the development of the two flower forms. The S locus is absent from half the individuals of the species; where it is present, it switches some genes on and others off, giving different patterns of gene expression in pin and thrum flowers.

The UEA team, based at the neighbouring genome-focused Earlham Institute, has previously sequenced the S locus and described aspects of its evolution. The new paper, published in Scientific Reports, describes the full sequence of the P. vulgaris genome and shows that the S locus controls hundreds of genes across the genome. The team also identifies genes that are activated in its absence, in the pin form of the flower.

"We started many years ago with a packet of seeds and a vision to understand the molecular genetics and developmental biology of the reproductive system Darwin described in 1862.", says Philip Gilmartin, of the University of East Anglia and Earlham Institute, whose fascination with primrose biology has been a career-long pursuit.

"Completion of the genome sequence paves the way to identify the genes that are regulated by the S locus, and adds more pieces to the puzzle. A long line of scientists, from Darwin in the 1860s through Bateson in the early 1900s, to Haldane and Fisher in the mid 1900's have been gripped and we continue to unravel the mystery piece by piece."

The team aims to continue its investigations, and to understand how the two different architectures of pin and thrum flowers are orchestrated by the S locus and the genes it regulates. The researchers are also collaborating with colleagues in Japan who are investigating a similar mechanism in buckwheat, the only crop plant with these two distinct forms, to see if similar genomic mechanisms are at work.

Credit: 
University of East Anglia

'Pause' in global warming was never real

Claims of a 'pause' in observed global warming are comprehensively disproved in a pair of new studies published today.

An international team of climate researchers reviewed existing data and studies and reanalysed them. They concluded there has never been a statistically significant 'pause' in global warming. This conclusion holds whether considering the 'pause' as a change in the rate of warming in observations or as a mismatch in rate between observations and expectations from climate models.

Their papers are published today in Environmental Research Letters.

Dr James Risbey, from CSIRO Australia, is the lead author of one of the studies, which reassessed the data and put it into historical context.

He said: "Many studies over the past decade have claimed to find a pause or slowdown in global warming and have typically posited this as evidence that is inconsistent with our understanding of global warming."

The study examined the literature on an alleged 'pause'. It looked at how the 'pause' had been defined, the time intervals used to characterise it, and the methods used to assess it. The study then tested historical and current versions of the earth's global mean surface temperature (GMST) datasets for pauses, both in terms of no warming trend and a substantially slower trend in GMST.

Dr Risbey said: "Our findings show there is little or no statistical evidence for a 'pause' in GMST rise. Neither the current data nor the historical data support it. Moreover, updates to the GMST data through the period of 'pause' research have made this conclusion stronger. But, there was never enough evidence to reasonably draw any other conclusion.

"Global warming did not pause, but we need to understand how and why scientists came to believe it had, to avoid future episodes like this. The climate-research community's acceptance of a 'pause' in global warming caused confusion for the public and policy system about the pace and urgency of climate change.

"That confusion in turn might have contributed to reduced impetus for action to prevent greenhouse climate change. The full costs of that are unknowable, but the risks are substantial. There are lessons here for the science, and for the future."

The group's companion study looks at the alleged mismatch between the rate of global warming in observations and climate models.

The team carried out a systematic comparison between temperatures and projections, using historical GMST products and historical versions of model projections from the times when claims of a divergence between observations and modelling were made.

The comparisons were made with a variety of statistical techniques to correct for problems in previous work.

Professor Stephan Lewandowsky, from the University of Bristol, is this paper's lead author. He said: "We found the impression of a divergence - i.e. a divergence between the rate of actual global warming and the model projections - was caused by various biases in the model interpretation and in the observations. It was unsupported by robust statistics."

Despite this, the authors point out that by the end of 2017, the 'pause' was the subject of more than 200 peer-reviewed scientific articles. Many of these articles do not give any reason for their choice of start year for the 'pause', and the range spans 1995 to 2004.

Professor Lewandowsky said: "This broad range may indicate a lack of formal or scientific procedures to establish the onset of the 'pause'. Moreover, each instance of the presumed onset was not randomly chosen but chosen specifically because of the low subsequent warming. We describe this as selection bias.

"This bias causes a problem. If a period is chosen because of its unusually low trend, this has implications for the interpretation of conventional significance levels ("p-values") of the trend. Selection of observations based on the same data that is then statistically tested inflates the actual p-value, giving rise to a larger proportion of statistical false positives than the researcher might expect. Very few articles on the 'pause' account for or even mention this effect, yet it has profound implications for the interpretation of the statistical results.

"This is important, because some of the biases that affect the datasets and projections were known, or knowable, at the time."

When the researchers reanalysed the data, accounting for the selection bias problem, they found no evidence that a divergence between models and observations existed at any time in the last decade.

They also offer some possible explanations for why some scientists believed climate warming lagged behind modelled warming.

Co-author Professor Kevin Cowtan, from the University of York, UK, said: "One cause may be that surface temperature data providers struggle to communicate the limitations of the data to climate scientists. This is difficult because users need to focus their expertise on their own problem areas rather than on the temperature data.

"Additionally, there can be delays of several years in updating surface temperature datasets. It takes time to find a bias, find a solution, and then for a paper to be published before most providers update their datasets. This process is good for transparency, but it may leave users in the position where they download data with knowable biases and unwittingly draw incorrect conclusions from those data.

Co-author Professor Naomi Oreskes, from Harvard University, USA, added: "A final point to consider is why scientists put such emphasis on the 'pause' when the evidence for it was so scant. An explanation lies in the constant public and political pressure from climate contrarians. This may have caused scientists to feel the need to explain what was occurring, which led them inadvertently to accept and reinforce the contrarian framework."

University of Bristol climate scientist Dr Dann Mitchell, who was not involved with either study, said: "As climate scientists we often look back at previous bodies of evidence and wonder why certain topics were so prominent in discussion, the so-called climate hiatus being an excellent example. Given the fast pace of increasing climate change understanding, the conclusions of this paper will be very relevant for the inevitable future 'apparent' climate contradictions that emerge over time."

Credit: 
IOP Publishing

Uncovering a key mechanism in assembly of Avian Sarcoma Virus, a relative of HIV-1

image: This is Jamil Saad.

Image: 
UAB

BIRMINGHAM, Ala. - A key step in retroviral growth inside a cell, as described by Jamil Saad, Ph.D., and colleagues, is portrayed on the cover of The Journal of Biological Chemistry. The cover is a visual image, in molecular detail, from their journal article inside, which examines avian sarcoma virus, or ASV.

The University of Alabama at Birmingham researchers used nuclear magnetic resonance, or NMR, to detail how the matrix domain of the ASV Gag protein binds to certain phospholipids. These phospholipids are vital for Gag protein binding to the plasma membrane of a cell, as the virus replicates and takes its first step toward virus formation and budding.

ASV, a retrovirus that causes cancer in chickens, was the first oncovirus to be described, more than a century ago. It belongs to the family Retroviridae and is closely related to HIV, the virus that causes AIDS. ASV is widely used as a model to study mechanisms of HIV infection and replication. By studying similarities and differences in the replication of the two viruses, researchers gain basic knowledge that can inform efforts to halt the replication and spread of HIV. Despite great similarities in the Gag proteins that initiate virus assembly, retroviruses have distinct assembly mechanisms that are incompletely understood.

The work led by Saad, associate professor of microbiology at UAB, and a companion paper, led by Carol Carter, Ph.D., professor of molecular genetics and microbiology at Stony Brook University, examined how the ASV Gag protein is targeted to the plasma membrane of the host cell to initiate virus assembly. Their findings elucidate the plasma membrane binding by the matrix domain of Gag, all the way from determining the precise molecular shape of the protein domain to studying its vital activity in living cells to initiate viral budding.

At UAB, Saad and colleagues elucidated the molecular determinants of ASV matrix interaction with lipids and membranes, and they provided a model of how the matrix binds to a cell membrane.

Important findings included:

Obtaining a significantly improved structural model of the matrix domain and identifying a membrane binding site that was not obvious in previously determined structures.

Providing compelling evidence that a cluster of four lysine amino acids in the matrix domain creates a basic surface, which acts as a single binding site that directly interacts with acidic membrane lipids called phosphoinositides.

Demonstrating that Gag-membrane interaction is governed by charge-charge interactions.

They also show that, although the HIV matrix domain uses more structural tools to bind to the membrane, both ASV and HIV matrix proteins share almost identical interacting motifs that drive assembly.

As part of the UAB experiments, the researchers found that replacing lysine residues in the binding site of matrix with a different amino acid greatly diminished binding to lipids and membranes.

In the companion paper, Carter and colleagues at Stony Brook University used those mutations in the matrix domain of the ASV Gag protein to show that disruption of the phosphoinositide binding site on the matrix domain inhibited Gag localization at the cell periphery in two different cell lines and severely reduced viral particle production, as compared with unmutated ASV.

"These studies solved a longstanding mystery on how a virus discovered a century ago utilizes the plasma membrane of the host cell to replicate," Saad said. "What is even more remarkable is how ASV and HIV-1 share very similar structural features that drive membrane targeting and assembly."

Credit: 
University of Alabama at Birmingham

Researchers find gender separation affects sense of smell

A University of Wyoming researcher and his team have discovered that separating male and female mice changes, over time, the way they perceive smells.

The study investigates how the olfactory sensory receptors in mice change as a function of exposure to odors emitted from members of the opposite sex, says Stephen Santoro, an assistant professor in the Department of Zoology and Physiology.

"The idea is that our experiences change our sensory system in a way that is semipermanent. This is probably true in humans as much as mice," Santoro says. "We found that mice that are housed with the opposite sex all of the time have olfactory sensory receptors that are similar in composition because they are smelling similar smells. On the other hand, mice that were housed separately by sex have sex-specific differences in their olfactory receptors. As a result, they may perceive odors differently."

The new study, titled "Sex Separation Induces Differences in the Olfactory Sensory Receptor Repertoires of Male and Female Mice," was published Dec. 4 in Nature Communications, an open access journal that publishes high-quality research from all areas of the natural sciences. Papers published by the journal represent important advances of significance to specialists within each field.

Carl van der Linden, a graduate student from Santa Ynez, Calif., in the UW Neuroscience Program, was the paper's lead author. Pooja Gupta, a postdoctoral researcher in the Department of Zoology and Physiology and who works in Santoro's lab, was a contributing author. Susanne Jakob, a preceptor in the Department of Stem Cell and Regenerative Biology at Harvard University; and Catherine Dulac, the Higgins Professor of Molecular and Cellular Biology and department chair at Harvard University, and a scientist at the Howard Hughes Medical Institute, were contributing authors.

Santoro is the paper's corresponding author. He began this research as a postdoctoral researcher in Dulac's lab at Harvard before bringing his work to UW.

"The olfactory system of mice and humans is very similar," Santoro says. "Mice are a very good model to understand how neural systems work, in general. They are a much better model for humans than flies and other common-model organisms."

Sensory activity plays pivotal roles in the development of the nervous system. Mouse odors are a complex mixture of volatile and non-volatile chemicals derived from skin secretions, urine, tears, saliva and feces, which are known to differ substantially in their chemical compositions between males and females.

"Human males and females smell different, too. Men give off odors from testosterone metabolites, for example," Santoro explains. "There are genetic differences in being able to detect this. Some people would say the smell is good, while others find it unpleasant or cannot detect it at all. These differences in perception are related to genetic differences in people's receptors. Some researchers speculate that these kinds of molecules might function as pheromones in humans."

Unlike most neurons in the mammalian nervous system, olfactory sensory neurons (OSNs) are continually born and replaced throughout life, a process that normally replaces damaged neurons in humans when we have a cold or use a zinc nasal spray, Santoro says. Changes in the abundance of specific OSN subtypes occur, in part, through a use-it-or-lose-it mechanism in which active OSNs are retained and silent OSNs are eliminated from the population, the paper concludes.

Credit: 
University of Wyoming

Hurricane Maria gives ecologists rare chance to study how tropical dry forests recover

To counteract the damage hurricanes have caused to their canopies, trees appear to adjust key characteristics of their newly grown leaves, according to a year-long field study presented at the British Ecological Society's annual conference today.

When Hurricane Maria, the worst natural disaster on record to affect the U.S. territory, hit Puerto Rico last year, it stripped numerous trees bare of their leaves and consequently disrupted their ability to absorb the light needed for growth and survival.

Ecologists from Clemson University took the opportunity to study how hurricanes affect tropical dry forests in the Caribbean and whether trees were capable of compensating for the significant damage by increasing resource acquisition in newly produced leaves.

For the study, the researchers examined the leaves of the 13 most dominant tree species one, eight and twelve months after Hurricane Maria struck and compared them with leaves that were collected before the hurricane. They analysed whether the immediate changes observed in leaves were temporary or maintained over multiple seasons.

"Our study took us to the Guánica State Forest in southwest Puerto Rico, which comprises one of the best parcels of native dry forest in the Caribbean. Rainfall here is extremely erratic, with huge variability within and between years. The forest also sits on limestone from an ancient coral reef which is extremely porous, meaning trees have little time to capture water as it travels through the underlying rock. As a result, organisms are uniquely adapted to cope with unpredictable water availability", said Tristan Allerton, PhD candidate at Clemson University.

Trees rely on exchanging gases through their leaves, simultaneously collecting CO2 from the atmosphere to convert into energy whilst trying to minimise water loss (leaf-gas exchange). To measure maximum leaf-gas exchange rates, the team attached a sensor to new leaves in the forest at several points during the day.

They also looked at the newly produced leaves' shape and structure, which play an important role in efficiently extracting gas from the atmosphere.

The preliminary findings suggest that 11 of 13 species studied were taking in CO2 at much higher rates immediately following Hurricane Maria. Many had also changed key characteristics of their leaves, including increasing leaf area relative to leaf biomass investment. In other words, trees were able to capture the same amount of light while spending less on leaf production.
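"Increasing leaf area relative to leaf biomass investment" is what ecologists quantify as specific leaf area (SLA): leaf area divided by leaf dry mass. A minimal illustration with made-up numbers, not measurements from the study:

```python
# Specific leaf area (SLA) = leaf area / leaf dry mass.
# The values below are invented for illustration only.
def sla(area_cm2: float, dry_mass_g: float) -> float:
    return area_cm2 / dry_mass_g

before = sla(area_cm2=20.0, dry_mass_g=0.20)  # 100 cm^2 per g
after = sla(area_cm2=24.0, dry_mass_g=0.16)   # 150 cm^2 per g
print(f"SLA before hurricane: {before:.0f} cm^2/g")
print(f"SLA after hurricane:  {after:.0f} cm^2/g")
# A higher SLA means more light-capturing surface is deployed
# per gram of carbon invested in the leaf.
```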

"A key finding was that the leaves of some of the species contained less chlorophyll than prior to the hurricane. Even though new leaves were better suited structurally to capture valuable resources, lower leaf quality could reduce leaf lifespan and the trees' ability to produce energy", added Professor Skip Van Bloem, Allerton's supervisor at Clemson University.

Overall, Caribbean tropical dry forests seem to be capable of tolerating major hurricanes, though the ecologists stressed that there may be "winners" and "losers" in terms of how species respond.

Currently it is unclear whether dominant evergreen species can exploit post-hurricane conditions to the same extent as deciduous species.

Allerton said: "Many of our evergreens displayed little change in gas exchange rates and in general the relative decline in new leaf chlorophyll after Maria was much greater than for deciduous species. Under normal conditions, evergreens renew their canopies over monthly/yearly timescales, therefore it's likely hurricane canopy damage is a more expensive process for these trees."

As climate change leads to expected increases in hurricane frequency and intensity, the species composition of tropical dry forests in the Caribbean is likely to change. One concern would be whether endemic species will disappear over time.

"This would be a huge shame as Caribbean dry forests are known to have a higher proportion of endemic species than mainland dry forests. Many trees found there are also incredibly ancient, making these forests a living museum of biodiversity", concluded Allerton.

Allerton will present the study on Tuesday 18 December 2018 at the British Ecological Society annual meeting in Birmingham, UK. The conference will bring together 1,200 ecologists from more than 40 countries to discuss the latest research.

Credit: 
British Ecological Society

Plant biologists identify mechanism behind transition from insect to wind pollination

image: This is a fly pollinator approaching male flowers of Thalictrum pubescens (tall meadow-rue) to collect pollen.

Image: 
David Timerman

TORONTO, ON (Canada) - New research by scientists at the University of Toronto (U of T) offers novel insights into why and how wind-pollinated plants have evolved from insect-pollinated ancestors.

Early seed plants depended on wind to carry pollen between plants, but about 100 million years ago, flowering plants evolved to attract insects that could transfer pollen with greater precision than random air currents. Although insect pollination is more economical, numerous lineages have since reverted to wind pollination, leaving many biologists to question why that would ever happen given the success of insect pollination. This apparent paradox perplexed even Charles Darwin, and still today, little is known about the conditions initiating this transition.

In a study published this month in Proceedings of the Royal Society B, the researchers describe for the first time a mechanism driving this reversion involving the vibration of stamens, the pollen-bearing organs of flowers.

"We found that plants in which stamens vibrate more vigorously in wind, disperse pollen by wind more readily, and that this characteristic of stamens is favoured under conditions where plants receive few visits from pollinators," said lead author David Timerman, a PhD candidate working with evolutionary biologist Spencer Barrett in the Department of Ecology & Evolutionary Biology in the Faculty of Arts & Science at U of T.

The discovery helps to explain the origins of wind pollination, which is represented in approximately 10 per cent of flowering plant species.

"It may also be useful for understanding how plants will cope with a reduction in pollinator services due to the global collapse of wild pollinator populations," said Timerman.

The reproductive structures of flowering plants are the most diverse of any group of living organisms. Flowers vary extensively in size, shape and structure, and much of this diversity is related to modes of pollination. Wind-pollinated species have independently evolved similar suites of floral traits adapted for releasing, dispersing and capturing pollen in air. One of these traits involves long flexible stamens that vibrate conspicuously in wind relative to related insect-pollinated species.

Why and how wind pollination has evolved in flowering plants from animal pollination is a long-standing fundamental question in the evolutionary biology of plants. While wind pollination has evolved from animal pollination on at least 65 occasions in flowering plants - trees, ragweed and many grasses among them - the mechanisms involved in the transition are not well understood.

Scientists have long held that wind-pollinated plants are 'aerodynamically engineered' for efficient pollen dispersal. But compared to animal-pollinated species, few studies have investigated the function of floral traits associated with wind pollination. Moreover, the modifications of flowers required for the evolutionary switch from insect to wind pollination have not been studied because, until now, there have been no experiments on transitional species that are both wind- and insect-pollinated.

"We took a novel approach to this problem by applying biomechanics to understand the key processes involved in the early stages of this transition, and the work provided several novel insights," said Timerman.

Timerman and Barrett examined the problem in a species called Thalictrum pubescens, of the buttercup family. The plant is an ambophilous species, meaning it is pollinated by both insects and wind. As such, they speculate that the species probably represents a transitional state in the evolution of wind pollination.

Timerman used an electrodynamic shaker to apply controlled vibration to stamens to measure their natural frequency of vibration, then used a custom-built wind tunnel to investigate how the natural frequency of vibration influences pollen release. Timerman also performed a manipulative field experiment at U of T's Koffler Scientific Reserve to test whether natural selection acts differently on stamen properties in the presence or absence of insect pollinators.
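The link between stamen geometry and vibration frequency can be sketched by idealizing the filament as a uniform cantilevered beam, whose first natural frequency scales as 1/L²: doubling the length quarters the frequency. The material values below are assumptions for illustration; the study measured frequencies directly with the shaker rather than computing them.

```python
import math

def cantilever_f1(E, rho, d, L):
    """First natural frequency (Hz) of a uniform circular cantilever,
    from Euler-Bernoulli beam theory. E: Young's modulus (Pa),
    rho: density (kg/m^3), d: diameter (m), L: length (m)."""
    I = math.pi * d**4 / 64  # second moment of area
    A = math.pi * d**2 / 4   # cross-sectional area
    lam1 = 1.8751            # first-mode eigenvalue for a cantilever
    return (lam1**2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A)) / L**2

# Assumed plant-tissue values, not measurements from the study:
E, rho, d = 5e6, 1000.0, 0.3e-3  # 5 MPa, water-like density, 0.3 mm filament
for L in (4e-3, 8e-3):           # 4 mm vs 8 mm filament length
    print(f"L = {L * 1e3:.0f} mm: f1 ~ {cantilever_f1(E, rho, d, L):.0f} Hz")
```

In this idealization, doubling the filament length drops the natural frequency by a factor of four, consistent with longer, more flexible stamens vibrating at lower frequencies and shedding pollen more readily in wind.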

Timerman measured variation in the natural frequency of the stamen's vibration across nine populations, and assessed the repeatability of vibration frequency over consecutive growing seasons. With all of the data they collected, the researchers analysed the effect of this parameter on pollen release in the wind tunnel, as well as male reproductive success of plants in the field with and without pollinators.

"Successful reproduction was greatest for plants whose stamens vibrated at a lower frequency when pollinators were absent, but this advantage diminished when pollinators were present," said Timerman. "Our biomechanical analysis of the wind-flower interface has identified this naturally occurring feature as a key trait for understanding early stages in the transition from insect to wind pollination."

Timerman says when animal pollinators do not provide adequate pollination services, natural selection should favour individuals with flexible stamens that vibrate readily, releasing pollen into the air.

"Wind is obviously a more consistent agent for pollen dispersal than relying on insects whose population sizes and behaviours fluctuate in time and space" says Timerman. "Further, many aspects of global environmental change are currently disrupting pollinator service to wild plants, leading to what has been termed 'the pollination crisis'.

"These situations could potentially favour the evolution of wind pollination through the mechanism that we have discovered."

Credit: 
University of Toronto

Personality and cybercrime: Being impulsive makes you a greater target

image: Thomas Holt, professor in the School of Criminal Justice, focuses his research on computer hacking, malware, and the role of the internet in facilitating all manner of crime and deviance.

Image: 
Michigan State University

Impulsive online shopping, music downloading and compulsive email use are all signs of a personality trait that makes you a target for malware attacks. New research from Michigan State University examines the behaviors - both obvious and subtle - that lead someone to fall victim to cybercrime involving Trojans, viruses and malware.

"People who show signs of low self-control are the ones we found more susceptible to malware attacks," said Tomas Holt, professor of criminal justice and lead author of the research. "An individual's characteristics are critical in studying how cybercrime perseveres, particularly the person's impulsiveness and the activities that they engage in while online that have the greatest impact on their risk."

Low self-control, Holt explained, comes in many forms. This type of person shows signs of short-sightedness, negligence, physical versus verbal behavior and an inability to delay gratification.

"Self-control is an idea that's been looked at heavily in criminology in terms of its connection to committing crimes," Holt said. "But we find a correlation between low self-control and victimization; people with this trait put themselves in situations where they are near others who are motivated to break the law."

The research, published in Social Science Computer Review, assessed the self-control of nearly 6,000 survey participants, as well as behaviors of their computers that could indicate malware infection. To measure self-control, Holt and his team asked participants a series of questions about how they might react in certain situations. To gauge infection, they asked about their computer having slower processing, crashing, unexpected pop-ups and the homepage changing on their web browser.

"The internet has omnipresent risks," Holt said. "In an online space, there is constant opportunity for people with low self-control to get what they want, whether that is pirated movies or deals on consumer goods."

As Holt explained, hackers and cybercriminals know that people with low self-control are the ones who will be scouring the internet for what they want - or think they want - which is how they know what sites, files or methods to attack.

Understanding the psychological side of self-control and the types of people whose computers become infected with malware - and who likely spread it to others - is critical in fighting cybercrime, Holt said. What people do online matters, and the behavioral factors at play are entirely related to risks.

Computer scientists, Holt said, approach malware prevention and education from a technical standpoint; they look for new software solutions to block infections or messaging about the infections themselves. This is important, but it is also essential to address the psychological side of messaging to those with low self-control and impulsive behaviors.

"There are human aspects of cybercrime that we don't touch because we focus on the technical side to fix it," he said. "But if we can understand the human side, we might find solutions that are more effective for policy and intervention."

Looking ahead, Holt hopes to help break the silos between computer and social sciences to think holistically about fighting cybercrime.

"If we can identify risk factors, we can work in tandem with technical fields to develop strategies that then reduce the risk factors for infection," Holt said. "It's a pernicious issue we're facing, so if we can attack from both fronts, we can pinpoint the risk factors and technical strategies to find solutions that improve protection for everyone."

Credit: 
Michigan State University

How much are we learning? Natural selection is science's best critic

Cold Spring Harbor, NY -- In 2003, the Human Genome Project revealed to the world the three billion chemical units within human DNA. Since that time, scientists have designed many ways to organize and assess this overwhelmingly large amount of information. Now, scientists at Cold Spring Harbor Laboratory (CSHL) have determined that evolution can help guide these efforts.

Researchers have already concluded that a mere one percent of the human genome is made up of the genes that make the proteins our bodies need to grow and function. However, they've also learned that roughly five percent of the human genome has remained the same, or been conserved, over countless generations of mutation and evolution.

"That suggests that an extra four percent of the genome is doing something that's really important, even though we don't know exactly what that is," explained Adam Siepel, a computational biologist and professor at CSHL.

To solve the mystery of the four percent, scientists have spent more than a decade developing powerful methods to look for distinct functions among various bits of the genome. And to understand what influence the genome has on an organism, they've had to look to evidence from the epigenome. The epigenome is a universe of chemical compounds that attach themselves to DNA, influencing how and when parts of the genome are used by cells.

Searching for patterns among epigenomic factors has allowed scientists to guess where the important parts of the genome may lie and whether they share biological function. However, this is no more certain than trying to judge the significance of a scene in a play from the props and costumes alone.

"This uncertainty about the true biological significance of many epigenomic measurements is a critical barrier not only for interpretation of the available data, but also for prospective decisions about how much new data to collect, of what type, and in what combinations," Siepel and his colleague Brad Gulko explained in the latest publication of Nature Genetics.

The Siepel lab has found a way around this barrier.

"So my lab and I decided to come at this from a different angle," added Siepel. "We asked, 'What if we let evolution do the work of telling us how much of the genome is important?' and, 'How much do we learn from each epigenomic data set?'"

The researchers used data from modern human populations to find evidence of recent natural selection. Then, they compared the genomes of humans and chimpanzees to get information that goes back five to seven million years to the divergence of humans from our great ape cousins.

"This allowed us to sort of chart how strong natural selection was during that whole period of time," Siepel explained.

The result was a way to guide future research. Siepel and his colleagues clustered sites within the genome based upon epigenomic features and how consequential each site has been for the survival of our species, according to evolutionary history. The resulting scores for each feature were then aggregated to create "fitness consequence maps," or FitCons maps.
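
The press release gives no implementation detail, so here is a deliberately tiny, hypothetical caricature of that clustering-and-scoring step; the feature names, sites and selection calls are all invented:

```python
# Deliberately tiny caricature of a FitCons-style map: sites are grouped
# by their epigenomic fingerprint, and each group is scored by the
# fraction of its sites showing evidence of selection. Feature names,
# sites and selection calls are all invented for this sketch.
from collections import defaultdict

# (epigenomic fingerprint, under_selection?) for a handful of toy sites.
sites = [
    (("open_chromatin", "transcribed"), True),
    (("open_chromatin", "transcribed"), True),
    (("open_chromatin", "silent"), False),
    (("closed_chromatin", "silent"), False),
    (("closed_chromatin", "silent"), True),
]

clusters = defaultdict(list)
for fingerprint, selected in sites:
    clusters[fingerprint].append(selected)

# Each cluster's fitness-consequence score is the fraction of its sites
# under selection; every site inherits its cluster's score.
fitcons = {fp: sum(flags) / len(flags) for fp, flags in clusters.items()}
for fp, score in sorted(fitcons.items(), key=lambda kv: -kv[1]):
    print(f"{'+'.join(fp)}: score {score:.2f}")
```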

If natural selection has been a powerful influence on a site in the genome--preserving it for countless generations despite mutation and evolution--that part of the genome should be important for survival. Moreover, the more strongly the sites flagged by an epigenomic analysis are enriched for such conserved sites, the more informative that analysis is likely to be.
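
One hedged way to make "informative" concrete - not necessarily the measure used in the paper - is the mutual information between an epigenomic annotation and selection status across sites; the counts below are invented:

```python
# Invented counts illustrating one possible yardstick: the mutual
# information (in bits) between an epigenomic annotation and selection
# status. Higher values mean the annotation tells us more about which
# sites matter.
from math import log2

# Joint counts: (annotated?, under_selection?) -> number of sites.
joint = {(True, True): 300, (True, False): 200,
         (False, True): 100, (False, False): 400}
total = sum(joint.values())

def marginal(index, value):
    """Probability that coordinate `index` of a site's label equals `value`."""
    return sum(n for k, n in joint.items() if k[index] == value) / total

mi = 0.0
for (a, s), n in joint.items():
    p_joint = n / total
    mi += p_joint * log2(p_joint / (marginal(0, a) * marginal(1, s)))
print(f"mutual information: {mi:.3f} bits")
```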

Siepel hopes that his fellow researchers will be able to consult the FitCons maps to determine which epigenetic markers, or combinations of markers, are likely to be most informative for further investigation.

"This is an effort to try to see what we can learn by considering evolutionary information alongside what we already know," he said.

Credit: 
Cold Spring Harbor Laboratory

Front and center: Food labels have effects on consumption and product formulation

BOSTON (Dec. 17, 2018)--Over the past two decades, labels such as the U.S. Nutrition Facts Panel on packaged foods, calorie counts on national restaurant menus, front-of-pack labels encouraging healthier eating, and "low-sodium" or "fat-free" identifiers have been developed in order to promote healthier choices. But do they work?

A new Food-PRICE systematic review and meta-analysis of interventional studies, led by researchers from the Friedman School of Nutrition Science and Policy at Tufts University and published online today in the American Journal of Preventive Medicine, assessed the effectiveness of multiple types of food labels. The researchers found that these approaches can shift some targets, but not others, for both consumer and industry behavior. The 60 interventional studies reviewed, published between 1990 and 2014, encompassed two million unique observations, including consumer-reported dietary intakes, purchases, and sales receipts.

"Many old and new food policies focus on labeling, whether on food packages or restaurant menus. Remarkably, the effectiveness of these labels, whether for changing consumers' choices or industry product formulations, has not been clear," said senior and corresponding author Dariush Mozaffarian, M.D., Dr.P.H., dean of the Friedman School. "Our findings provide new evidence on what might work, and what might not, when implementing food labeling."

In a pooled analysis of studies that included food labeling on menus, product packaging, or other point-of-purchase materials such as placards on supermarket shelves, the researchers found that labeling reduced consumers' intake of:

- calories by 6.6 percent;
- total fat by 10.6 percent;
- other unhealthy food options by 13 percent.

Labeling also increased consumers' vegetable consumption by 13.5 percent.

In contrast, labeling did not significantly impact consumer intakes of other targets such as total carbohydrate, total protein, saturated fat, fruits, whole grains, or other healthy options.

When industry responses were evaluated, the researchers found that labeling led to reductions of both trans fat and sodium in packaged foods by 64.3 percent and 8.9 percent, respectively. However, no significant effects of labeling were identified for industry formulations of total calories, saturated fat, dietary fiber, other healthy components (e.g., protein and unsaturated fat), or other unhealthy components (e.g., total fat, sugar, and dietary cholesterol), although relatively few studies evaluated these endpoints.

"For industry responses, it's interesting that the two altered components-trans fat and sodium-are additives," said Mozaffarian. "This suggests that industry may be more readily able to alter additives, as opposed to naturally occurring ingredients such as fat or calories, in response to labeling. It will be interesting to see whether this will translate to added sugar, newly added to the Nutrition Facts Panel on food labels in the United States."

The researchers also examined the effects of label type, placement, and other characteristics. They found no consistent differences by label placement (menu, package, or other point-of-purchase), label type (e.g., traffic light, nutrient content), type of labeled product, whether labeling was voluntary or mandatory, or several other factors - suggesting that the presence or absence of information may matter more to consumers and industry than the specific type of label.

The authors noted several limitations. While all the studies were interventional, many were non-randomized. Restaurant labeling studies often assessed consumer effects for a single meal rather than over the long term. Too few studies evaluated obesity or metabolic risk factors to draw meaningful conclusions about the effects of labeling on health outcomes. The included studies were also heterogeneous, owing to the varied nature of the interventions.

However, by merging findings from 60 interventional studies, the researchers were able to evaluate differences in both consumer and industry responses across 111 intervention arms in 11 countries on four continents. The studies were conducted in the United States/Canada, Europe/Australia, and Asia; the majority included both genders, and most evaluated adults. Most studies evaluated specific meals or products. The findings were centrally pooled in a meta-analysis, sketched in toy form below. Analyses were completed in 2017.
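
The release does not name the pooling model, so the sketch below uses generic inverse-variance weighting to combine per-arm percent changes into a single pooled estimate; all numbers are invented:

```python
# Generic inverse-variance pooling of per-arm percent changes; the
# review's actual statistical model may differ, and all numbers here
# are invented for illustration.
import math

# (percent change in intake, standard error) for hypothetical study arms.
arms = [(-5.0, 2.0), (-8.0, 3.0), (-6.5, 1.5), (-7.0, 2.5)]

weights = [1 / se ** 2 for _, se in arms]
pooled = sum(w * est for (est, _), w in zip(arms, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled change: {pooled:.1f}% (95% CI {lo:.1f}% to {hi:.1f}%)")
```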

Credit: 
Tufts University, Health Sciences Campus