Culture

Study: Water births are as safe as land births for mom, baby

ANN ARBOR---A new study found that water births are no more risky than land births, and that women in the water group sustained fewer first- and second-degree tears.

University of Michigan researchers analyzed 397 water births and 2,025 land births from two midwifery practices. There were no differences between water birth and land birth in neonatal intensive care admissions, and postpartum hemorrhage rates were similar for both groups.

"The long and short of it is that if you use proper techniques...the outcomes are very good," said Lisa Kane Low, professor in the U-M School of Nursing, and senior author on the paper. "They mirror what we see in international studies of water birth."

Ruth Zielinski, clinical associate professor of nursing and study co-author, said more facilities should offer water birth and have guidelines for implementing it.

In a water birth, the woman gives birth in a water-filled tub rather than a bed. Few U.S. hospitals or birth centers offer tub births because of perceived risk to the newborn, suggested mainly by case studies of neonatal infections or cord tearing. Professional organizations tend to agree that women in labor should have access to water for comfort, but not all support giving birth in the water, which means many hospitals require women to leave the tub before the birth.

During a water birth, babies take their first breath when removed from the tub. Until then, their lungs are filled with fluid, which is displaced when they hit the air and breathe. The still-attached umbilical cord provides oxygen.

"It's important not to re-submerge babies", Zielinski said. At U-M, they are birthed in the water, brought out almost immediately, and they're careful to not re-submerge them. Mom and baby exit the tub with help and warm blankets, typically prior to delivering the placenta so that blood loss can be more accurately calculated.

Zielinski said more studies are needed to understand the satisfaction level of women who have water births.

Credit: 
University of Michigan

Clemson geneticists identify small molecules that are potential indicators for disease

image: Robert Anholt and Trudy Mackay in their Clemson University Center for Human Genetics lab.

Image: 
Clemson University College of Science

CLEMSON, South Carolina - A critical question in medicine asks how individual variation in DNA can predict variation in health and disease. New research from the Clemson Center for Human Genetics identified hundreds of metabolites that might serve as intermediates to translate variation in the genome to variation in complex traits. Published recently in Genome Research, the findings could one day help doctors better monitor metabolite variation as an indicator for disease.

The central dogma of molecular biology spells out how DNA is transcribed into RNA, which is translated into proteins. Many of these proteins function as enzymes that catalyze chemical reactions among small molecules known as metabolites, which carry out the programs specified in our DNA. These metabolites, collectively referred to as the "metabolome," perform essential functions, from generating or storing energy to serving as building blocks for our cells.

"A lot of the reason the metabolome has not been emphasized in genome-wide studies is because of technology. Only recently has mass spectrometry advanced, such that it has become feasible to survey the entire metabolome on a fairly large scale," said Trudy Mackay, director of Clemson University's Center for Human Genetics.

In this groundbreaking study, led by Mackay's former postdoctoral researchers Shanshan Zhou and Fabio Morgante, the team measured hundreds of metabolites in the common fruit fly, Drosophila melanogaster, an experimental model system, to gauge how those metabolites fluctuated alongside variation in genetic information. Once the metabolites were identified, the rest of the study was conducted computationally, correlating variation in metabolites with DNA variants and with variation in a wide range of traits measured in 40 different fruit fly lines.

"What Shanshan did first was to ask: When we look at variation, are these metabolites all independent or do some of them vary together as you would expect if they fall in the same kind of pathway? Then, she put them together in different pathways to see - both on the level of individual metabolite or of groups of metabolites - if they are associated with variation in the genome of these fly lines," said Robert Anholt, Provost's Distinguished Professor of Genetics and Biochemistry who is also located at the Center for Human Genetics.

Zhou was able to draw correlations between polymorphic changes in the DNA - specific base-pair differences in the genetic code - and variation in the metabolome. She also found connections between variation in RNA transcripts and variation in metabolites. Taking the analysis one step further, Zhou considered whether metabolite levels could be associated with particular traits or characteristics, called phenotypes.
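The line-level logic of this kind of analysis can be illustrated with a rough sketch. The data below are synthetic placeholders, not the study's measurements or pipeline: the idea is simply to measure a metabolite and a phenotype across the same set of fly lines and test whether they co-vary.

```python
# Illustrative sketch only (synthetic data): correlating one metabolite's
# abundance with a phenotype measured across the same inbred fly lines,
# in the spirit of the line-level analysis described above.
import numpy as np

rng = np.random.default_rng(0)
n_lines = 40                                  # number of fly lines, as in the article

metabolite = rng.normal(size=n_lines)         # hypothetical metabolite abundances per line
phenotype = 0.5 * metabolite + rng.normal(scale=0.8, size=n_lines)   # hypothetical trait values

r = np.corrcoef(metabolite, phenotype)[0, 1]  # Pearson correlation across lines
print(f"metabolite-phenotype correlation across {n_lines} lines: r = {r:.2f}")
```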

"One of the traits studied was resistance to stress: how long can flies survive when you put them on a medium where there are no nutrients for them?" Anholt said. "Shanshan found different metabolomic transcriptional DNA networks between males and females, both of which are related to lipid metabolism."

Inter-male aggression, a trait that takes an abundance of energy to display, was another phenotype included in the study. As hypothesized, metabolites involved in rapid energy generation, such as those belonging to the citric acid cycle, featured prominently in this network.

"We knew that it is difficult to accurately predict the phenotype based only on information of DNA variants," Anholt said. "When Fabio Morgante analyzed the data, he found that the accuracy of prediction increases when you look at the metabolome, because the metabolome is much more closely related to the phenotype than the DNA."

Taken together, the results might one day facilitate earlier - or more accurate - diagnosis of illnesses that can be detected by metabolite variation.

"Let's say someone has family members in which there is a prevalence of a particular late-onset disease. It would be nice for that person to know: are they at-risk or not at-risk of developing the disease." Anholt said. "And maybe there is not a biomarker available for that risk today. But if you know the genetic variation and you know how genetic variation impacts variation in the metabolome, one could monitor the metabolome and find out a very early diagnosis if a particular metabolite changes."

Mackay emphasized that although the team's study identified hundreds of metabolites, the genetic basis of variation in those metabolites is still difficult to elucidate. Many more studies on the genotype-metabolite-phenotype relationship will need to be carried out to consolidate the genetic underpinnings of metabolite diversity.

Nevertheless, this comprehensive study fills in gaps of missing information that create more avenues for research in the future. As the technology to measure the metabolome improves, databases containing metabolomic information will expand.

Morgante, who completed his studies under Mackay and Anholt during their time at North Carolina State University, will soon join the researchers at the Center for Human Genetics as an assistant professor. Zhou is currently a research scientist at Covance Inc., a subsidiary of LabCorp.

Credit: 
Clemson University

Reducing wildfire risks for better management and resource allocation

Difficult to contain, wildfires consume everything in their path and wreak havoc on human and animal lives, homes and landscapes. From 1995 to 2015, wildfire management cost the U.S. $21 billion. Over the past 10 years, the National Interagency Fire Center reports 1.4 million fires, with an average of 67,000 wildfires and 7.0 million acres burned annually. Most of these wildfires are caused by human activity. Management resources are becoming strained, and funds earmarked for promoting forest health and fuel reduction are being diverted to fire response activities.

As wildfires become deadlier, larger and more expensive, there is strong interest in better risk governance. Managing future wildfire risk requires an interface between human decision processes and knowledge of fire-related climate trends, and the ability to anticipate wildfire potential and identify mitigation approaches is critical. Several presentations at the 2019 Society for Risk Analysis (SRA) Annual Meeting will explore analyses of past fire seasons, projections for the future and approaches to decision making aimed at mitigating risk.

One mitigation strategy that has been widely employed across the U.S. is prescribed burning. More than 40 million acres were burned by federal and state government agencies from 2003 to 2017, at an average cost of $101.34 per acre. Esther Jose and Dr. Jun Zhuang, professor at the University at Buffalo, reviewed the literature on prescribed fires to design a data-driven model that estimates the effectiveness of prescribed fires in each state and identifies the most significant factors behind that effectiveness.
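For scale, the two figures quoted above imply a total prescribed-burn outlay of roughly four billion dollars over the period; a quick back-of-the-envelope check using only those numbers:

```python
# Back-of-the-envelope check of the prescribed-fire figures quoted above.
acres_burned = 40_000_000       # acres treated by prescribed fire, 2003-2017 ("more than 40 million")
cost_per_acre = 101.34          # average cost in dollars per acre
total_cost = acres_burned * cost_per_acre
print(f"approximate total spend: ${total_cost / 1e9:.1f} billion")   # roughly $4 billion
```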

Jose and Zhuang's study, "An analysis on the effectiveness of prescribed fires," found that Oregon has conducted the most effective prescribed fires and that the U.S. Forest Service is the most successful agency in the country at conducting them. And while California has had some of the largest fires of the past five years, the state has been prevented from conducting sufficient prescribed burns by factors such as funding, weather and policy. Understanding when and where to conduct these burns is vital to limiting excess spending and the associated risks.

In another study, Harry Podschwit, University of Washington-Seattle, examined wildfire characteristics such as size, frequency and intensity as predictors of firefighting resource demand, considering their pitfalls from both a statistical and a conceptual perspective.

His study, "Wildfire characteristics as predictors of firefighting resource demand," explores some of the conceptual consequences of adopting certain wildfire characteristics as representations of resource demand. Podschwit also extracted a set of potential wildfire proxies from 2009-2015 wildfire data and compared them to preparedness levels to validate the appropriateness of the wildfire proxies in firefighting risk analysis.

"Our research will improve the quality of wildfire risk assessments by instructing of the pitfalls of using popular wildfire characteristics, such as size, as surrogates for wildfire risk," states Podschwit. "This research will explicitly identify which wildfire characteristics are good proxies for resource demand and lead to higher quality decisions and fire management resource allocations."

Credit: 
Society for Risk Analysis

Why doesn't deep-brain stimulation work for everyone?

image: Brain networks corresponding to functions such as vision (blue) and attention (green) mingle and share information in structures deep inside the brain, as seen, for example, in the bottom right corner of this color-coded composite MRI image. Researchers at Washington University School of Medicine in St. Louis have mapped nine functional networks in the deep-brain structures of 10 healthy people, an accomplishment that could lead to improvements in deep-brain stimulation therapy for severe cases of Parkinson's disease and other neurological conditions.

Image: 
Scott Marek

People with severe Parkinson's disease or other neurological conditions that cause intractable symptoms such as uncontrollable shaking, muscle spasms, seizures, obsessive thoughts and compulsive behaviors are sometimes treated with electric stimulators placed inside the brain. Such stimulators are designed to interrupt aberrant signaling that causes the debilitating symptoms. The therapy, deep-brain stimulation, can provide relief to some people. But in others, it can cause side effects such as memory lapses, mood changes or loss of coordination, without much improvement of symptoms.

Now, a study from researchers at Washington University School of Medicine in St. Louis may help explain why the effects of deep-brain stimulation can vary so much - and points the way toward improving the treatment. The stimulators typically are implanted in structures known as the thalamus and the basal ganglia that are near the center of the brain. These structures, the researchers found, serve as hubs where the neurological networks that control movement, vision and other brain functions cross paths and exchange information. Each person's functional networks are positioned a bit differently, though, so electrodes placed in the same anatomical spot may influence different networks in different people - alleviating symptoms in one but not in another, the researchers said.

The findings are published Dec. 10 in Neuron.

"The sites we target for deep-brain stimulation were discovered serendipitously," said co-author Scott Norris, MD, an assistant professor of neurology and of radiology who treats patients with movement disorders. "Someone had a stroke or an injury in a specific part of the brain, and suddenly their tremor, for example, got better, and so neurologists concluded that targeting that area might treat tremor. We've never really had a way to personalize treatment or to figure out if there are better sites that would be effective for more people and have fewer side effects."

What neurosurgeons lacked were individualized maps of brain functions in the thalamus and basal ganglia. These structures connect distant parts of the brain and have been linked to neurological and psychiatric conditions such as Parkinson's disease, Tourette's syndrome and obsessive-compulsive disorder. But their location deep inside the brain means that mapping is technically challenging and requires enormous amounts of data.

Nico Dosenbach, MD, PhD, an assistant professor of neurology and the study's senior author - along with co-first authors Deanna Greene, PhD, an assistant professor of psychiatry and of radiology, and Scott Marek, PhD, a postdoctoral researcher, and other colleagues - set out to create individual maps of the functional networks in the basal ganglia and thalamus. Such maps, they reasoned, might provide clues to why people with neurological and psychiatric conditions exhibit such a wide range of symptoms, and why electrodes placed in those structures produce variable results.

"Deep-brain stimulation is a very invasive treatment that is only done for difficult, severe cases," said Greene, who specializes in Tourette's syndrome. "So it is difficult to grapple with the fact that such an invasive treatment may only help half the people half the time."

Using data from a group of Washington University scientists who scanned themselves at night as part of the so-called Midnight Scan Club, the researchers analyzed 10 hours of MRI brain scan data on each of 10 individuals. From this, they created 3D maps color-coded by functional network for each individual. One of the functional networks is devoted to vision, two relate to movement, two involve paying attention, three relate to goal-directed behaviors, and the last network is the default network, which is active when the brain is at rest.
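The study's actual processing pipeline is far more involved, but the core idea of assigning a deep-brain location to a functional network can be sketched as a winner-take-all correlation: compare each subcortical voxel's signal over time with the average signal of each network and assign the voxel to the best match. The sketch below uses synthetic placeholder data and placeholder network labels, not the Midnight Scan Club scans.

```python
# Minimal sketch (not the study's pipeline): winner-take-all assignment of
# subcortical voxels to functional networks based on resting-state fMRI
# time-series correlation. All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_timepoints = 1000
networks = ["visual", "motor-1", "motor-2", "attention-1", "attention-2",
            "goal-1", "goal-2", "goal-3", "default"]     # nine networks, as in the article

# Average cortical time series for each network (placeholder data).
network_signals = rng.normal(size=(len(networks), n_timepoints))

# Time series for a handful of subcortical (thalamus/basal ganglia) voxels.
voxel_signals = rng.normal(size=(5, n_timepoints))

for i, voxel in enumerate(voxel_signals):
    # Correlation of this voxel with each network's average signal.
    corrs = [np.corrcoef(voxel, net)[0, 1] for net in network_signals]
    winner = networks[int(np.argmax(corrs))]
    print(f"voxel {i}: assigned to '{winner}' network (r = {max(corrs):.2f})")
```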

The researchers discovered that each functional network followed its own path through the deep structures of the brain, intermingling with other networks at defined meeting spots. Some of these spots - such as the motor integration zone, where a movement and a goal-directed network come together - were located in much the same place in all 10 people. The locations of other networks and their points of intersection varied more from person to person.

"I showed a neurosurgeon where we'd found the motor integration zone, and he said, 'Oh, that's where we put the electrodes for essential tremor, and it always works,'" said Dosenbach, who is also an assistant professor of occupational therapy, of pediatrics and of radiology. People with essential tremor experience uncontrollable shaking. "That's interesting because there isn't a lot of variability among people in terms of where the motor integration zone is located. So then we looked at a spot they target to treat Parkinson's disease. We saw that there was a great deal of variation across people in terms of what functional networks are represented there, and deep-brain stimulation is only about 40% to 50% successful there."

The findings suggest that the outcome of deep-brain stimulation may reflect how successfully a neurosurgeon taps into the correct functional network - and avoids tapping into the wrong one.

"Historically, our understanding of deep-brain stimulation has been based on averaging data across many people," Norris said. "What this study suggests is that a particular patient may do better if the wire is placed in relation to their personal functional brain map rather than in context of the population average. A personalized functional map - as opposed to an anatomical map, which is what we use today - could help us place a wire in the exact place that would provide the patient with the most benefit."

The researchers now are studying the relationship between the locations of an individual's functional networks and the outcomes of deep-brain stimulation to identify the networks that provide relief when stimulated, and those that cause side effects. In the future, the researchers hope to dig deeper into the networks with therapeutic effects, looking for other spots that might provide even better results than the traditional sites of electrode placement.

"A lot of what I do is basic science - understanding how the brain works," Dosenbach said. "But now we can make a map and give it to neurosurgeons and potentially improve treatment for these devastating conditions. We still have to prove the hypothesis that deep-brain stimulation outcomes are linked to functional networks, and translation will take time, but this really could make a difference in people's lives."

Credit: 
Washington University School of Medicine

Chiton mollusk provides model for new armor design

image: The chiton mollusk, which is about 1 to 2 inches long, has a series of eight large plates and is ringed by a girdle of smaller, more flexible scales. The mollusk is the inspiration behind a 3D-printed armor design.

Image: 
Virginia Tech

The motivations for using biology as inspiration to engineering vary based on the project, but for Ling Li, assistant professor of mechanical engineering in the College of Engineering, the combination of flexibility and protection seen in the chiton mollusk was all the motivation necessary.

"The system we've developed is based on the chiton, which has a unique biological armor system," Li said. "Most mollusks have a single rigid shell, such as the abalone, or two shells, such as clams.

But the chiton has eight mineralized plates covering the top of the creature and around its base it has a girdle of very small scales assembled like fish scales, that provide flexibility as well as protection."

Li's work, which was featured in the journal Nature Communications Dec. 10, is the result of a collaboration with researchers from various institutions, including the Massachusetts Institute of Technology, the Dana-Farber Cancer Institute at the Harvard Medical School, California State University, Fullerton, the Max Planck Institute of Colloids and Interfaces, Germany, and the Wyss Institute for Biologically Inspired Engineering at Harvard University.

Because the mechanical design of the chiton's girdle scales had not been studied in-depth before, the team of researchers needed to start with basic material and mechanical analysis with the mollusk before using that information as the bio-inspiration for the engineering research.

"We studied this biological material in a very detailed way. We quantified its internal microstructure, chemical composition, nano-mechanical properties, and three-dimensional geometry. We studied the geometrical variations of the scales across multiple chiton species, and we also investigated how the scales assemble together through 3D tomography analysis," Li said.

The team then developed a parametric 3D modeling methodology to mimic the geometry of individual scales. They assembled individual scale units on either flat or curved substrates, varying the scales' sizes, orientations and geometries, and used 3D printing to fabricate the bio-inspired scale armor models.
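As a toy illustration of what "parametric" means here (not the team's modeling code, and with arbitrary values), the sketch below lays out a staggered array of scales on a flat substrate, with scale length, overlap and tilt exposed as parameters that could be varied before 3D printing.

```python
# Toy parametric layout of armor scales on a flat substrate, illustrating the
# idea of exposing scale size, overlap and tilt as design parameters.
# This is not the researchers' modeling code; all values are arbitrary.
import math

def scale_positions(n_rows, n_cols, scale_length, overlap_fraction, tilt_deg):
    """Return (x, y, z, tilt) placements for a staggered array of scales."""
    pitch = scale_length * (1.0 - overlap_fraction)   # center-to-center spacing
    placements = []
    for row in range(n_rows):
        x_offset = 0.5 * pitch if row % 2 else 0.0    # stagger alternate rows
        for col in range(n_cols):
            x = col * pitch + x_offset
            y = row * pitch
            z = 0.0                                    # flat substrate
            placements.append((x, y, z, math.radians(tilt_deg)))
    return placements

# Example: small, highly overlapping scales for a region needing flexibility.
flexible_patch = scale_positions(n_rows=4, n_cols=6, scale_length=2.0,
                                 overlap_fraction=0.6, tilt_deg=15)
print(f"{len(flexible_patch)} scales placed; first placement: {flexible_patch[0]}")
```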

"We produced the chiton scale-inspired scale assembly directly with 3D multi-material printing, which consists of very rigid scales on top of a flexible substrate," Li explained. With these physical prototypes of controlled specimen geometries and sizes, the team conducted direct mechanical testing on them with controlled loading conditions. This allowed the researchers to understand the mechanisms behind the dual protection-flexibility performance of the biological armor system.

The way the scale armor works is that when in contact with a force, the scales converge inward upon one another to form a solid barrier. When not under force, they can "move" on top of one another to provide varying amounts of flexibility dependent upon their shape and placement.

"The strength comes from how the scales are organized, from their geometry," Li said. "Reza's [Mirzaeifar, assistant professor of mechanical engineering] team has done an amazing job by using computational modeling to further reveal how the scale armor becomes interlocked and rigid when the external load reaches a critical value."

The design of place-specific armor takes into account the size of scales used. Smaller scales, such as those around the girdle of the chiton, are more useful for regions requiring maximum flexibility, while larger scales are used for areas requiring more protection. "Working with Reza, our next step is to expand the space so we can design tailored armor for different body locations. The flexibility vs. protection needs of the chest, for example, will be different than for the elbow or knee, so we would need to design the scale assembly accordingly in terms of scale geometry, size, orientation, etc."

The work being featured began with Department of Defense funding when Li was a graduate research assistant at the Massachusetts Institute of Technology. Since he arrived at Virginia Tech in 2017, the work has continued without sponsorship as part of his start-up funding.

"We started with a pretty pure motivation - looking for multifunctional biological materials," Li said. "We wanted to integrate flexibility and protection and that's very hard to achieve with synthetic systems. We will continue with our research to explore the design space beyond the original biological model system and conduct testing under different load conditions."

Li admits the process, which has taken multiple years, is a long one, but the work is unique in its two-step approach: fundamental biological materials research first, followed by the bio-inspired engineering research.

"Having that level of familiarity with the subject has been very useful to the design and modeling of the armor," Li said. "I think this type of bio-inspired armor will represent a significant improvement to what is currently available."

Credit: 
Virginia Tech

Dementia study reveals how proteins interact to stop brain signals

Fresh insights into damaging proteins that build up in the brains of people with Alzheimer's disease could aid the quest for treatments.

A study in mice reveals how the two proteins work together to disrupt communication between brain cells.

Scientists observed how proteins - called amyloid beta and tau - team up to hamper key genes responsible for brain messaging. By changing how genes are expressed in the brain, the proteins can affect its normal function.

These changes in brain function were completely reversed when genetic tools were used to reduce the presence of tau, researchers at the University of Edinburgh found.

The study focused on the connection points between brain cells - known as synapses - that allow chemical and electrical messages to flow and are vital to healthy brain function.

Stopping the damage that the two proteins cause to synapses could help scientists prevent or reverse dementia symptoms, the researchers say.

In both the mouse model and in brain tissue from people with Alzheimer's disease, the team found clumps of amyloid beta and tau proteins in synapses.

When both amyloid beta and tau were present in the brain, genes that control the function of synapses were less active. And some of the genes that control the immune system in the brain were more active.

Consistent with this increased immune activity, the scientists observed immune cells called microglia that had engulfed synapses in the brains of mice. This adds to findings from recent studies suggesting that these immune cells consume synapses during Alzheimer's disease.

Alzheimer's disease is the most common form of dementia, affecting some 850,000 people in the UK - a figure predicted to rise to more than one million by 2025. It can cause severe memory loss and there is currently no cure.

Lead researcher Professor Tara Spires-Jones, of the UK Dementia Research Institute at the University of Edinburgh, said: "More work is needed to take what we've learned in this study and find therapeutics - but this is a step in the right direction, giving us new targets to work towards."

Credit: 
University of Edinburgh

Greenland ice losses rising faster than expected

image: The midnight sun casts a golden glow on an iceberg and its reflection in Disko Bay, Greenland. Much of Greenland's annual mass loss occurs through calving of icebergs such as this.

Image: 
Ian Joughin, University of Washington

Greenland is losing ice seven times faster than in the 1990s and is tracking the Intergovernmental Panel on Climate Change's high-end climate warming scenario, which would see 40 million more people exposed to coastal flooding by 2100.

A team of 96 polar scientists from 50 international organisations have produced the most complete picture of Greenland ice loss to date. The Ice Sheet Mass Balance Inter-comparison Exercise (IMBIE) Team combined 26 separate surveys to compute changes in the mass of Greenland's ice sheet between 1992 and 2018. Altogether, data from 11 different satellite missions were used, including measurements of the ice sheet's changing volume, flow and gravity.

The findings, published today in Nature, show that Greenland has lost 3.8 trillion tonnes of ice since 1992 - enough to push global sea levels up by 10.6 millimetres. The rate of ice loss has risen from 33 billion tonnes per year in the 1990s to 254 billion tonnes per year in the last decade - a seven-fold increase within three decades.
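The conversion behind those figures follows a commonly used approximation - about 360 billion tonnes of ice per millimetre of global mean sea-level rise (the rule of thumb is an assumption used here for illustration, not a number quoted from the paper):

```python
# Rough check of the mass-to-sea-level conversion behind the Greenland figures.
# ~360 Gt of ice ~ 1 mm of global mean sea-level rise (approximate rule of thumb).
GT_PER_MM = 360.0

total_loss_gt = 3_800          # 3.8 trillion tonnes lost, 1992-2018
print(f"sea-level contribution: {total_loss_gt / GT_PER_MM:.1f} mm")   # ~10.6 mm

rate_1990s, rate_recent = 33, 254      # Gt per year
print(f"increase in loss rate: {rate_recent / rate_1990s:.1f}x")       # ~7.7x, i.e. roughly seven-fold
```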

The assessment, led by Professor Andrew Shepherd at the University of Leeds and Dr Erik Ivins at NASA's Jet Propulsion Laboratory in California, was supported by the European Space Agency (ESA) and the US National Aeronautics and Space Administration (NASA).

In 2013, the Intergovernmental Panel on Climate Change (IPCC) predicted that global sea levels would rise by 60 centimetres by 2100, putting 360 million people at risk of annual coastal flooding. But this new study shows that Greenland's ice losses are rising faster than expected and are instead tracking the IPCC's high-end climate warming scenario, which predicts 7 centimetres more.

Professor Shepherd said: "As a rule of thumb, for every centimetre rise in global sea level another six million people are exposed to coastal flooding around the planet."

"On current trends, Greenland ice melting will cause 100 million people to be flooded each year by the end of the century, so 400 million in total due to all sea level rise."

"These are not unlikely events or small impacts; they are happening and will be devastating for coastal communities."

The team also used regional climate models to show that half of the ice losses were due to surface melting as air temperatures have risen. The other half has been due to increased glacier flow, triggered by rising ocean temperatures.

Ice losses peaked at 335 billion tonnes per year in 2011 - ten times the rate of the 1990s - during a period of intense surface melting. Although the rate of ice loss has since dropped to an average of 238 billion tonnes per year, this remains around seven times the 1990s rate, and the figure does not include all of 2019, which could set a new high due to widespread summer melting.

Dr Ivins said: "Satellite observations of polar ice are essential for monitoring and predicting how climate change could affect ice losses and sea level rise".

"While computer simulation allows us to make projections from climate change scenarios, the satellite measurements provide prima facie, rather irrefutable, evidence."

"Our project is a great example of the importance of international collaboration to tackle problems that are global in scale."

Guðfinna Aðalgeirsdóttir, Professor of Glaciology at the University of Iceland and lead author of the Intergovernmental Panel on Climate Change's sixth assessment report, who was not involved in the study, said:

"The IMBIE Team's reconciled estimate of Greenland ice loss is timely for the IPCC. Their satellite observations show that both melting and ice discharge from Greenland have increased since observations started."

"The ice caps in Iceland had similar reduction in ice loss in the last two years of their record, but this last summer was very warm here and resulted in higher loss. I would expect a similar increase in Greenland mass loss for 2019."

"It is very important to keep monitoring the big ice sheets to know how much they raise sea level every year."

Credit: 
University of Leeds

Report discusses potential role of coffee in reducing risk of Alzheimer's and Parkinson's

A new report from the Institute for Scientific Information on Coffee (ISIC) highlights the potential role of coffee consumption in reducing the risk of neurodegenerative disorders such as Alzheimer's and Parkinson's diseases1-3.

For the first time in history, most people can expect to live into their 60s and beyond; however, with increasing age, the risk of disease and disability rises4,5. The number of people affected by Alzheimer's disease is estimated to increase globally from today's 47 million to 75 million in 2030 and to 132 million in 20506.

Parkinson's disease, the second most common age-related neurodegenerative disorder, affects 7 million people globally7. Research has suggested that lifestyle may be an important part of the risk for neurodegenerative conditions for which there is currently no curative treatment8-10.

The new report, authored by Associate Professor Elisabet Rothenberg, Kristianstad University, discusses the role of dietary components, including coffee and caffeine, in reducing the risk of neurodegenerative disorders.

The report considers the mechanisms involved in the positive associations between coffee and Alzheimer's and Parkinson's diseases, which are not yet well understood. The roles of caffeine and of other plant-based compounds present in coffee, such as phytochemicals and polyphenols, are of particular academic interest11-13.

Key research findings highlighted in the report include:

Dietary pattern may have an impact on the risk of developing neurodegenerative disorders5,6

Coffee consumption may help reduce the risk of neurodegenerative conditions or relieve symptoms1-3

Considering PD, men might benefit more from coffee consumption than women, possibly because oestrogen may compete with caffeine9,10

Further research is required for better understanding of the associations11-13

Credit: 
Kaizo

Was Earth's oxygenation a gradual, not step-wise, process -- driven by internal feedbacks?

The oxygenation of Earth's surface - which transformed the planet into a habitable haven for all life as we know it - may have been the consequence of global biogeochemical feedbacks, rather than the product of discrete planetary-scale biological and tectonic revolutions as proposed, according to a new study. The findings have implications for the understanding of life's evolution on Earth and perhaps other planets. More than two billion years ago, oxygen levels in Earth's atmosphere and oceans began to rise, and with it, the evolution of progressively complex animal life began. The progressive path to oxygenation is widely considered to have occurred over three major steps - the "Great Oxidation Event," the "Neoproterozoic Oxygenation Event" and the "Paleozoic Oxygenation Event." While several biological and geological explanations for these stepwise increases in oxygen have been advanced, the geologically rapid yet rare oxygenation events do not correspond to known external tectonic or evolutionary processes. Now, Lewis Alcott and colleagues suggest that Earth's stepwise oxygenation may be better explained by internal feedbacks within long-term biogeochemical processes. Using a theoretical model, Alcott et al. identified a set of internal biogeochemical feedbacks involving the global phosphorus, carbon and oxygen cycles, which are capable of driving rapid shifts in ocean and atmospheric O2 levels. Based only on the gradual oxygenation of Earth's surface over time, the model produced the same three-step pattern observed in the geological record and did not require any biological advances beyond simple photosynthetic cyanobacteria. The findings dramatically increase the possibility of high-O2 worlds existing elsewhere in the universe, the authors suggest.

Credit: 
American Association for the Advancement of Science (AAAS)

Lower BMI means lower diabetes risk, even among non-overweight people

Lower body mass index (BMI) is consistently associated with reduced type II diabetes risk among people with varied family histories, genetic risk factors and weights, according to a new study published this week in PLOS Medicine by Manuel Rivas of Stanford University and colleagues.

Weight-loss interventions have shown demonstrable benefit for reducing the risk of type II diabetes in high-risk and pre-diabetic individuals but have not been well studied in people at lower risk of diabetes. In the new study, researchers examined the association between BMI, family history of diabetes and genetic risk factors affecting type II diabetes or BMI. They used data on 287,394 unrelated individuals of British ancestry recruited to the UK Biobank between 2006 and 2010, when aged between 40 and 69.

Nearly 5% of the participants had a diagnosis of type II diabetes, and diabetes prevalence was confirmed to be associated with higher BMI, a family history of type II diabetes and genetic risk factors. Moreover, a 1 kg/m2 reduction in BMI was associated with a 1.37-fold reduction (95% CI 1.12-1.68) in type II diabetes among non-overweight individuals with a BMI of less than 25 and no family history of diabetes, similar to the effect of BMI reduction in obese individuals with a family history (1.21-fold, 95% CI 1.13-1.29).
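To put the per-unit estimate in perspective, if the association were roughly log-linear (an assumption made here for illustration, not a claim from the paper), a multi-unit BMI difference would compound multiplicatively:

```python
# Illustrative only: compounding a per-unit fold reduction over several BMI units,
# assuming a log-linear association (an assumption, not a result from the paper).
fold_per_unit = 1.37        # reported fold reduction in risk per 1 kg/m2 lower BMI

for delta_bmi in (1, 2, 3):
    print(f"{delta_bmi} kg/m2 lower BMI -> ~{fold_per_unit ** delta_bmi:.2f}-fold lower risk")
```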

"These findings suggest that all individuals can substantially reduce their type II diabetes risk through weight loss," the authors say. However they also caution that the results must be taken with a grain of salt since they didn't study actual weight loss interventions. Although the new analysis "can determine that lower lifetime BMI is protective against diabetes, that does not necessarily imply weight loss later in life, after carrying excess weight for decades, would have the same result," they say.

Credit: 
PLOS

Stardust from red giants

Around 4.5 billion years ago, an interstellar molecular cloud collapsed. At its centre, the Sun was formed; around it, a disc of gas and dust appeared, out of which the Earth and the other planets would form. This thoroughly mixed interstellar material included exotic grains of dust: "Stardust that had formed around other suns," explains Maria Schoenbaechler, a professor at the Institute of Geochemistry and Petrology at ETH Zurich. These dust grains only made up a small percentage of the entire dust mass and were distributed unevenly throughout the disc. "The stardust was like salt and pepper," the geochemist says. As the planets formed, each one ended up with its own mix.

Thanks to extremely precise measurement techniques, researchers are nowadays able to detect the stardust that was present at the birth of our solar system. They examine specific chemical elements and measure the abundance of different isotopes - the different atomic flavours of a given element, which all share the same number of protons in their nuclei but vary in the number of neutrons. "The variable proportions of these isotopes act like a fingerprint," Schoenbaechler says: "Stardust has really extreme, unique fingerprints - and because it was spread unevenly through the protoplanetary disc, each planet and each asteroid got its own fingerprint when it was formed."
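Such fingerprints are typically reported as small deviations of an isotope ratio from a terrestrial standard, expressed in parts per ten thousand (epsilon notation). The sketch below shows only that arithmetic, with invented numbers rather than the measured palladium values:

```python
# Epsilon-notation isotope anomaly: deviation of a sample's isotope ratio from a
# terrestrial standard, in parts per ten thousand. The numbers below are invented
# for illustration; they are not the measured palladium data.
def epsilon_anomaly(ratio_sample, ratio_standard):
    return (ratio_sample / ratio_standard - 1.0) * 1e4

ratio_standard = 0.500000          # hypothetical terrestrial reference ratio
ratio_meteorite = 0.500020         # hypothetical ratio measured in a meteorite
print(f"anomaly: {epsilon_anomaly(ratio_meteorite, ratio_standard):.2f} epsilon units")  # 0.40
```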

Studying palladium in meteorites

Over the past ten years, researchers studying rocks from the Earth and meteorites have been able to demonstrate these so-called isotopic anomalies for more and more elements. Schoenbaechler and her group have been looking at meteorites that were originally part of asteroid cores that were destroyed a long time ago, with a focus on the element palladium.

Other teams had already investigated neighbouring elements in the periodic table, such as molybdenum and ruthenium, so Schoenbaechler's team could predict what their palladium results would show. But their laboratory measurements did not confirm the predictions. "The meteorites contained far smaller palladium anomalies than expected," says Mattias Ek, postdoc at the University of Bristol who made the isotope measurements during his doctoral research at ETH.

Now the researchers have come up with a new model to explain these results, as they report in the journal Nature Astronomy. They argue that stardust consisted mainly of material that was produced in red giant stars. These are aging stars that expand because they have exhausted the fuel in their core. Our sun, too, will become a red giant four or five billion years from now.

In these stars, heavy elements such as molybdenum and palladium were produced by what is known as the slow neutron capture process. "Palladium is slightly more volatile than the other elements measured. As a result, less of it condensed into dust around these stars, and therefore there is less palladium from stardust in the meteorites we studied," Ek says.

The ETH researchers also have a plausible explanation for another stardust puzzle: the higher abundance of material from red giants on Earth compared to Mars or Vesta or other asteroids further out in the solar system. This outer region saw an accumulation of material from supernova explosions.

"When the planets formed, temperatures closer to the Sun were very high," Schoenbaechler explains. This caused unstable grains of dust, for instance those with an icy crust, to evaporate. The interstellar material contained more of this kind of dust that was destroyed close to the Sun, whereas stardust from red giants was less prone to destruction and hence concentrated there. It is conceivable that dust originating in supernova explosions also evaporates more easily, since it is somewhat smaller. "This allows us to explain why the Earth has the largest enrichment of stardust from red giant stars compared to other bodies in the solar system" Schoenbaechler says.

Credit: 
ETH Zurich

No 'clouded' judgments: Geostationary satellite an alternative to monitor land surfaces

image: Seasonal variations in vegetation index observed by Himawari-8 (middle) and Suomi-NPP (bottom) at the Takayama in-situ observation site, and downward-looking camera images taken at the site (top).

Image: 
Tomoaki Miura and Kazuhito Ichii

Satellite remote sensing has widely been used to monitor and characterize the spatial and temporal changes of the Earth's vegetative cover. Satellites used in these analyses have conventionally been polar-orbiting satellites, which orbit from "pole to pole" and obtain only one to two images of the Earth per day. The utility of these polar-orbiting satellites has, however, often been limited because frequently occurring clouds block their view of the land surface.

New-generation geostationary satellites present an opportunity to observe land surfaces in a more efficient manner. Being in geostationary orbit, the Advanced Himawari Imager (AHI) sensor onboard Himawari-8, for example, can obtain multi-band color images over Japan every 10 minutes, increasing the chance of obtaining "cloud-free" observations. In a new study published in Scientific Reports, an international team of researchers, including Tomoaki Miura from University of Hawaii, Shin Nagai and Mika Takeuchi from Japan Agency for Marine-Earth Science and Technology, Kazuhito Ichii from Chiba University, and Hiroki Yoshioka from Aichi Prefectural University, examined this possibility and the utility of Himawari-8 AHI geostationary satellite data for capturing seasonal vegetation changes in Central Japan.

Their study found that Himawari-8 AHI acquired approximately 26 times more observations than the Suomi National Polar-orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS), one of the latest polar-orbiting satellite sensors, for the year 2016. As a result, there were a larger number of days with "cloud-free" observations with Himawari-8 AHI than with S-NPP VIIRS. The study demonstrated that the AHI geostationary sensor obtained one cloud-free observation every 4 days, whereas the VIIRS polar-orbiting sensor obtained one cloud-free observation every 7 to 16 days. Owing to this larger number of cloud-free observations, the AHI "vegetation index," a satellite measure of vegetation greenness, captured the temporal changes of vegetation from leaf expansion to leaf fall continuously throughout the growing season, corresponding to the vegetation phenology observed with in situ time-lapse digital images (Figure 1). There were, however, several periods in which even AHI was unable to obtain cloud-free observations due to persistent cloud cover during the summer and fall seasons.
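A vegetation index of this kind is, in essence, a ratio of reflectances in different spectral bands; the widely used normalized difference vegetation index (NDVI) is one common example, shown here with placeholder reflectance values rather than actual AHI or VIIRS data:

```python
# One common vegetation index (NDVI) computed from red and near-infrared
# reflectance; greener, denser vegetation gives values closer to 1.
# The reflectance values below are placeholders, not Himawari-8 AHI data.
import numpy as np

red = np.array([0.08, 0.10, 0.05])     # hypothetical red-band reflectances
nir = np.array([0.40, 0.30, 0.45])     # hypothetical near-infrared reflectances

ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))               # e.g. [0.67 0.5  0.8 ]
```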

"Detailed vegetation seasonal information from the Himawari-8 geostationary satellite can be useful for many applications such as short- term drought monitoring and assessing the impact of heavy rainfall events," said Prof Miura, the lead author of the study. "This study has shown that the Himawari-8 meteorological satellite can be used to monitor land surface and vegetation. With new-generation geostationary satellites, we may begin to see various types of vegetation changes that could not be seen with previous satellites. The new findings contribute to understanding land-atmosphere carbon dioxide budgets," said Prof Ichii of Chiba University, a co-author of this study.

The Himawari-8 AHI geostationary satellite also acquires multi-band color images over the tropical Southeast Asia region every 10 minutes. It is expected that AHI geostationary sensor data would contribute to improving our understanding of vegetation dynamics and the effect of climate change in this cloud-prone tropical region.

Credit: 
Chiba University

Scientists discover a novel method to combat antibiotic-resistant bacteria

image: AMPs target and kill bacteria in such variable ways that few bacteria ever become resistant to these molecules; this makes AMPs uniquely suited to treating antibiotic-resistant bacteria, also called 'superbugs'. However, as of now, no one has been able to artificially create effective AMPs for use as an antibiotic. The researchers' discovery has found a way to overcome this limitation and has huge potential in treating and preventing infections for post-surgery wounds, and in diabetic patients and those with weakened immune systems.

Image: 
Unilever

We humans are constantly battling with bacteria and other microbes. In this war against the microbial world, we added antibiotics to our arsenal in the early 1920s. Suddenly, fighting off infections became so easy that we thought we had won the war.

We could not have been more wrong.

Over the years, bacteria have evolved so many clever ways of protecting themselves against antibiotics that the World Health Organization (WHO) now fears we may soon slip back into a situation similar to the pre-antibiotic era. The death toll caused by antimicrobial resistance is estimated to rise to 10 million deaths annually by 2050, with India carrying one of the largest burdens of drug-resistant pathogens worldwide. To compound this problem, the global pipeline for developing next-generation antibiotics is precariously thin.

In the context of this alarming public health threat, scientists from the Institute for Stem Cell Science and Regenerative Medicine (inStem) and Unilever joined forces to develop innovative strategies to deal with antimicrobial resistance. Together, the team probed the cellular mechanisms that regulate the release of antimicrobial peptides (AMPs), which are natural antibiotics produced by skin cells to fight off bacteria. AMPs target and kill bacteria in such variable ways that few bacteria ever develop resistance to them, thus making AMPs uniquely suited to treating antibiotic-resistant bacterial infections. The scientists' work led to the discovery of a new signalling pathway in skin cells that controls the long-term release of AMPs from these cells. By tweaking this pathway, researchers can induce AMP release from skin cells without any exposure to bacteria! This has tremendous potential in preventing and treating infections for post-surgery wounds, and for diabetic patients and those with weakened immune systems.

Apart from their role as natural antibiotics, AMPs are also known to be involved in wound healing in the skin. This fact spurred Dr. Amitabha Majumdar (Unilever R&D) to hypothesise that the same machinery used to release AMPs during wound healing could be harnessed to control AMP release from skin cells for treating or preventing infections. To test this, Dr. Majumdar contacted Dr. Colin Jamora of the Joint IFOM-inStem Research Laboratory at inStem's Centre for Inflammation and Tissue Homeostasis (CITH) - a group that works extensively on the mechanisms of wound healing in the skin.

When the joint team of scientists probed the cellular mechanism regulating AMP release, they discovered a new signalling pathway for the long-term release of AMPs from skin cells. Usually, AMPs are released to fight off bacterial infections when direct contact between skin epidermal cells and bacteria occurs, and this process is triggered by a reduction in the levels of a protein called caspase-8.

Interestingly, the researchers found that reducing caspase-8 via molecular techniques is also enough to trigger the release of stored AMP from skin cells. Just by modulating caspase-8 levels in the skin, AMP release can be controlled to prevent a whole spectrum of infections; this may be especially useful for diabetics and patients with weakened immune systems who are highly susceptible to bacterial, yeast, fungal, and viral infections in post-surgery wounds.

"This fruitful collaboration illustrates how partnerships between academic institutions and industry benefits consumers and society", say Dr. Jamora and Dr. Majumdar.

Credit: 
National Centre for Biological Sciences

A study demonstrates the efficiency of a screening strategy to detect liver diseases

image: Estimates of survival of the cost-effectiveness model in the diagnosis of fibrosis.

Image: 
UPF

Non-alcoholic fatty liver disease (NAFLD) and alcohol-related liver disease (ALD) are currently the leading causes of chronic liver disease, liver cancer and liver-related mortality in developed countries.

Patients in advanced stages have a poor prognosis in terms of survival and quality of life. As with cancer, liver diseases in their early stages are asymptomatic, and therefore most patients are diagnosed at advanced stages. Yet there is a lack of public health screening strategies for the detection of liver fibrosis.

A study published recently in the Journal of Hepatology, drafted within the LiverScreen European consortium with the participation of the Centre for Research in Health Economics (CRES-UPF) and Hospital Clínic de Barcelona, together with experts from various international universities, proposes exploring the cost-effectiveness of transient elastography (TE) - a non-invasive test that evaluates the elasticity and stiffness of the liver by ultrasound - as a screening method for detecting liver fibrosis in primary care.

"The implementation of a screening programme to detect liver fibrosis, in primary care centres, is a highly cost-effective intervention. This approach could prove a valuable public health strategy. In fact, the UK's National Institute for Health and Care Excellence (NICE) already recommends it in patients at risk", says Miquel Serra-Burriel.

A study in patients from six countries

To carry out their research, the researchers included a total of 6,295 participants from six countries (France, Spain, Denmark, UK, Germany and Hong Kong). They correspond to seven independent prospective studies carried out previously (one cohort per country, except Spain with two), in which TE was used to diagnose liver fibrosis.

The cohorts of the different countries consisted of people of different ages and risk factors, such as high alcohol consumption. The cohorts from Denmark, France and the UK were designed to obtain biopsies to confirm liver fibrosis. "We used data from a subset of patients who had undergone a liver biopsy. After defining the best indicators, we applied them to our cohorts to evaluate the prevalence of significant fibrosis and accuracy of diagnosis", the CRES-UPF researcher affirms.

Then, these results were used to adjust an economic model that compares two different ways to detect significant fibrosis: the TE pathway compared to the standard care pathway, based on liver enzyme tests.

"The goal of using non-invasive detection of fibrosis by TE as a public health intervention was to achieve earlier, more reliable identification of the patient, refer her in a timely manner to the specialist, give her the appropriate treatment and include her in monitoring programmes", Miquel Serra-Burriel asserts.

A cost-effective public health intervention

The study shows that the detection of liver fibrosis using optimized algorithms is a highly cost-effective public health intervention, especially in the early stages of fibrosis, with a 12% probability of cost saving. "As expected, when we focus on patients with risk factors for chronic liver disease, including patients with diabetes, obesity and hazardous alcohol consumption, the screening programme is even more cost-effective", Miquel Serra-Burriel adds.

According to the authors of the study, implementing TE liver screening would require investing between 2,500 euros (at-risk population) and 6,500 euros (general population), adjusted for purchasing power parity (PPP), to gain one additional year of life adjusted for quality of life.
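Those figures are, in effect, incremental cost-effectiveness ratios (ICERs): the extra cost of the screening pathway divided by the extra quality-adjusted life years (QALYs) it yields. A minimal sketch with invented per-patient inputs (the study's actual model is far more detailed):

```python
# Minimal ICER sketch: incremental cost per quality-adjusted life year (QALY)
# of a screening pathway versus standard care. Inputs are invented placeholders;
# the study's economic model is far more detailed.
def icer(cost_new, cost_old, qaly_new, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient figures for an at-risk population.
cost_te, cost_standard = 1_200.0, 950.0      # euros
qaly_te, qaly_standard = 10.10, 10.00        # quality-adjusted life years

print(f"ICER: {icer(cost_te, cost_standard, qaly_te, qaly_standard):,.0f} euros per QALY gained")
```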

Compared with screening at later stages of chronic liver disease, screening for significant liver fibrosis is roughly ten times more cost-effective, highlighting the importance of early identification, referral and monitoring of these patients.

The authors state that future studies should test whether a two-step approach using serum biomarkers followed by TE would be cost-effective, and analyse the cost savings entailed in this combined system for screening the population.

A multidisciplinary international project

LiverScreen, which involves health professionals, economists, statisticians and quality controllers, has received funding within EIT Health 2018, a project which promotes healthy living, active ageing and improved health in the context of the European Commission's Horizon 2020 research and innovation programme. It has also received a Spanish Plan Nacional I+D+i grant, co-funded by the Carlos III Health Institute and the European Union's ERDF.

In addition to CRES-UPF, in Catalonia LiverScreen also involves institutions such as Hospital Clínic, the IDIBAPS research centre and the Catalan Health Institute (ICS).

Credit: 
Universitat Pompeu Fabra - Barcelona

Silver improves the efficiency of monograin layer solar cells

image: Next-generation lightweight flexible monograin layer solar cell developed by TalTech researchers.

Image: 
Professor Jüri Krustok

As a result of their two-year joint project, the materials researchers of Tallinn University of Technology have improved the efficiency of next generation solar cells by partial substitution of copper with silver in absorber material.

Economic development and the general growth in energy consumption have led to an increased demand for environmentally friendly energy production at lower cost. Most viable solutions can be found in the renewable energy sector. New technologies for energy production should provide clean, low cost, environmentally friendly solutions with versatile applications, making solar energy the best solution today. TalTech's material researchers are working on the development of the next-generation photovoltaics - monograin layer solar cells.

Senior Researcher at TalTech Laboratory of Photovoltaic Materials Marit Kauk-Kuusik says, "The production of traditional silicon solar cells that started back in the 1950s is still very resource and energy consuming. Our research is focused on the development of the next generation of solar cells, i.e. thin-film solar cells based on compound semiconductors."

A thin-film solar cell consists of several thin layers of semiconductor materials. For efficient thin-film solar cells, a semiconductor with very good light-absorbing properties must be used as the absorber. Silicon is not a suitable absorber for thin-film solar cells because its non-optimal light absorption requires a rather thick absorber layer. TalTech researchers are developing compound semiconductor materials called kesterites (Cu2ZnSn(Se,S)4), which in addition to excellent light absorption contain earth-abundant and low-cost chemical elements (e.g. copper, zinc, tin, sulphur and selenium). To produce kesterites, TalTech researchers use a monograin powder technology that is unique in the world.

"The monograin powder technology we are developing differs from other similar solar cell manufacturing technologies used in the world in terms of its method. Compared to vacuum evaporation or sputtering technologies, which are widely used to produce thin-film structures, the monograin powder technology is less expensive," Marit Kauk-Kuusik says.

Powder growth technology is the process of heating chemical components in a special chamber furnace at 750 degrees for four days. Thereafter the mass obtained is washed and sieved in special machines. The synthesized high-quality microcrystalline powder, monograin powder, is used for the production of solar cells. The powder technology differs from other production methods in particular due to its low cost, since it does not require any expensive high vacuum equipment.

The monograin powder consists of unique microcrystals that form parallel-connected miniature solar cells in a large module (covered with an ultra-thin buffer layer). This provides major advantages over the photovoltaic modules of the previous generation, i.e. silicon-based solar panels: the photovoltaic cells are lightweight, flexible and can be transparent, while being environmentally friendly and significantly less expensive.

The key indicator of the quality of photovoltaics is efficiency. Efficiency depends not only on the properties of the materials used and the structure of the solar cell, but also on solar radiation intensity, angle of incidence and temperature.

The ideal conditions for achieving maximum efficiency are found in cold, sunny mountains rather than in a hot desert, as one might expect, because heat degrades a solar cell's efficiency. It is possible to calculate the maximum theoretical efficiency for each solar panel; this has so far been impossible to achieve in practice, but it remains an objective to pursue.

"We have reached the point in our development where partial replacement of copper with silver in kesterite absorber materials can increase efficiency by 2%. This is because copper is highly mobile in nature, causing unstable solar cell efficiency. The replacement of 1% copper with silver improved the efficiency of monograin layer solar cells from 6.6% to 8.7%," Marit Kauk-Kuusik says.

The two TalTech materials research groups - the photovoltaic materials research group and the optoelectronic materials physics research group - published the article "The effect of Ag alloying of Cu2(Zn,Cd)SnS4 on the monograin powder properties and solar cell performance" in the high-quality scientific journal Journal of Materials Chemistry A.

The monograin layer solar cell technology is implemented by the Estonian-Austrian joint venture Crystalsol GmbH. In order to commercialize the photovoltaic technology developed by our researchers, the solar cell efficiency should be increased to 15%.

Credit: 
Estonian Research Council