Culture

Fundamental discoveries for future nanotools: Chemists distinguish multiple weak forces

image: Fold your own nanocube! Although the nanocubes used in the research project build themselves, you can fold your own using this model. Each yellow X represents a location where researchers at the University of Tokyo used different atoms with different levels of polarizability.

Image: 
Image by Shuichi Hiraoka, CC-BY-ND

The process of building a tiny cube has revealed some of the fundamental mysteries of how molecules bind together in natural environments. Researchers hope to apply this knowledge to future projects designing complex structures that can mimic life.

When two molecules surrounded by water move towards each other, part of their initial attraction sometimes comes from their tendency to exclude the surrounding water -- the hydrophobic effect.

Once the molecules are near each other, but not yet formally bound, a much weaker force becomes important -- the dispersion force.

"Our dream is to control the dispersion force and provide a simple design principle to use the dispersion force to build complex self-assembling structures," said Professor Shuichi Hiraoka, leader of the laboratory where the research was performed in the University of Tokyo Department of Basic Science.

Dispersion forces are one type of van der Waals forces, some of the weakest chemical interactions known in nature. Although weak, van der Waals forces are important; they help geckos walk up walls and were previously identified in 2018 by the same research group as locking together the gear- or snowflake-shaped molecules of the self-assembling nanocubes.

Measuring the dispersion force under natural conditions, such as when molecules are in solution with water, has been impossible. The force is so weak it cannot be identified separately from the other forces at play.

However, in new experiments, the research team used their self-assembling nanocubes as tools to amplify differences in the dispersion force.

The molecules that make up the sides of the cubes were modified to contain atoms selected for their polarizability, meaning their responsiveness to the surrounding electric field. Each fully assembled nanocube contained 18 of those polarizable atoms.

The combined effect of 18 atoms was enough to create measurable differences in the dispersion force depending on which polarizable atom was attached.

The dispersion force is calculated mathematically after using a technique called isothermal titration calorimetry to measure the amount of heat released when molecules bind together.
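As a rough sketch of the underlying thermodynamics (these are standard isothermal titration calorimetry relations, not the team's exact procedure, and the comparison step is an assumption for illustration), the fitted binding constant K_a and the measured heat give the free energy and entropy of assembly, and a dispersion contribution can be estimated by comparing nanocubes that differ only in the substituted atom X:

\Delta G^{\circ} = -RT \ln K_a = \Delta H^{\circ} - T\Delta S^{\circ}, \qquad \Delta\Delta G_{\mathrm{disp}} \approx \Delta G^{\circ}(\mathrm{X}) - \Delta G^{\circ}(\mathrm{H}),

where \Delta H^{\circ} is the heat of binding measured directly by the calorimeter.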

More polarizable atoms created stronger dispersion forces and made the nanocubes more stable. Depending on the estimated value of the hydrophobic effect, the dispersion force contributes 0.6 to 2.2 times as much attractive force and stability to the cube as the hydrophobic effect does.

Researchers plan to use this knowledge about more polarizable atoms creating stronger dispersion forces to design future artificial molecular structures with more complex shapes and increased functions.

"For example, we could design molecules with larger binding surface areas and place polar atoms along the edges to enhance overall stability through the attraction of dispersion forces," said Hiraoka.

Solving a mystery in drug design

Hiraoka states that the measurements for nanocubes built with normal hydrogen compared to deuterium, the "heavy" isotope of hydrogen, should be relevant to drug design theory. Research by other groups had led to conflicting reports among chemists on whether swapping hydrogen for the twice-as-heavy and larger deuterium would create a stronger dispersion force.

As a general rule, larger atoms are more polarizable, and the researchers' new data indicate that increased polarizability leads to stronger dispersion forces. However, some reports found that the smaller hydrogen actually produces a stronger dispersion force than the heavier deuterium, while others showed the opposite or a negligibly small difference between the two isotopes.

"In our experiments, the entropy-enthalpy difference is completely balanced. The free energy released by nanocubes with hydrogen or deuterium is essentially identical, so there may be no difference between them," said Hiraoka.

An essential difference between previous research and these experiments is that the UTokyo team used a more lifelike condition of being in solution with water and amplified the effect using the nanocube design.

Credit: 
University of Tokyo

Nurses sleep less before a scheduled shift, hindering patient care and safety

Nurses sleep nearly an hour and a half less before work days compared to days off, which hurts patient care and safety, finds a new study by researchers at NYU Rory Meyers College of Nursing. The findings are published in Sleep Health, the journal of the National Sleep Foundation.

"Nurses are sleeping, on average, less than recommended amounts prior to work, which may have an impact on their health and performance on the job," said Amy Witkoski Stimpfel, PhD, RN, assistant professor at NYU Rory Meyers College of Nursing and the study's lead author.

Nursing, especially in hospitals, is dominated by shift work, with nurses working outside of the traditional 9-to-5 day in order to be at the bedside around the clock. Research shows that shift work takes a toll on circadian rhythms and can impair the performance of workers.

In addition, 12-hour shifts are common and often result in unexpected overtime to finish patient care tasks or charting. Taken together with commute times and domestic responsibilities, nurses often have limited time to sleep before or between shifts.

Sleep deprivation hurts workers' ability to handle complex and stressful tasks, and work-related sleep loss has led to serious errors in other industries, with the nuclear meltdown at Chernobyl as a particularly devastating example. In healthcare, fatigued nurses may be at risk of making critical mistakes in administering medication or making clinical decisions.

In order to better understand nurses' sleep behaviors and patient outcomes, Witkoski Stimpfel and her colleagues studied sleep duration and work characteristics among registered nurses to determine whether sleep duration influences quality of care and patient safety. The researchers used data from two surveys of 1,568 nurses collected in 2015 and 2016.

The nurses were asked how much sleep they usually get, including naps, in the 24 hours prior to a scheduled shift, as well as how much sleep they usually get when they are not scheduled to work. They were also asked about the quality of patient care in their workplace. Patient safety was measured using the Agency for Healthcare Research and Quality (AHRQ) Hospital Survey on Patient Safety Culture.

Nurses reported getting, on average, just under 7 hours (414 minutes) of sleep prior to a work day and more than 8 hours (497 minutes) prior to a non-work day. Thus, the difference in sleep duration between work and non-work days was 83 minutes, or nearly an hour and a half less sleep before a work shift.

In addition, getting less sleep was associated with lower measures of patient safety and quality of care, a finding that may indicate several underlying issues. At the individual level, nurses who are sleeping less may be more fatigued at work, which may result in performance impairments. At the organizational level, if nurses are working in an environment that has frequent staffing shortages or high turnover resulting in unexpected overtime and long hours, patient safety may be compromised in part by tired, overworked nurses.

Can nurses "catch up" on sleep between shifts? Witkoski Stimpfel said it is unlikely.

"Research on chronic partial sleep deprivation in healthy adults shows that after several days of not getting enough sleep, more than one day of 'recovery sleep'--or more than 10 hours in bed--may be needed to return to baseline functioning. But considering a nurse's schedule, which often involves consecutive 12-hour shifts and may only offer one or two days off between shifts, the risk of complete recovery, or 'catching up,' is low," noted Witkoski Stimpfel.

The researchers note that more research on nurses' sleep is needed, but in the interim, healthcare leaders can use evidence-based scheduling strategies, limit the use of overtime, and provide professional development on the importance of sleep for nurses.

"It is in everyone's interest to have nurses well-rested so they can perform their critical function within the healthcare system and keep patients safe," said Christine Kovner, PhD, RN, FAAN, Mathey Mezey Professor of Geriatric Nursing at NYU Meyers and the study's coauthor.

Credit: 
New York University

Houston Methodist developed AI app to predict risk and prevent severe patient falls

New research going live in npj Digital Medicine on December 12, 2019, features a machine learning app aimed at protecting patients from severe fall-related injuries and death.

This AI technology was developed by Houston Methodist and tested over an eight-month period to help address the growing concern of severe falls among senior patients and the worry they cause care providers and caregivers.

HOW IT WORKS

The AI app predicts a patient's risk of being injured in a fall. Clinical parameters and patient demographics such as bone density, diagnoses and past procedures are used to populate the app, which then triggers tailored interventions to keep these high-risk patients from falling -- whether they are in the hospital or at home with caregivers. The technology can be integrated into the patient's electronic medical record (EMR), making things easier for clinicians: because it is part of the record, it automatically flags patients at high risk of a fall with harm when they are admitted to the hospital, which in turn triggers a prevention-focused intervention plan or clinical care path. The immediate benefit to the patient is avoiding falls, injuries and even death; the benefit for the hospital is avoiding the additional costs and/or lawsuits associated with such events.
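As a purely illustrative sketch of how such a flag might be wired into an admission workflow (this is not Houston Methodist's model; the feature set, classifier and 0.3 alert threshold below are assumptions), the idea is to score each admitted patient and trigger the care path when the predicted risk crosses a threshold:

# Illustrative sketch only -- not the Houston Methodist app. The features,
# classifier and alert threshold are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for EMR-derived training data: age, bone density
# T-score, number of prior falls, count of high-risk medications.
X = np.column_stack([
    rng.integers(40, 95, 500),
    rng.normal(-1.0, 1.0, 500),
    rng.integers(0, 4, 500),
    rng.integers(0, 6, 500),
])
# Synthetic label: whether a fall with injury occurred (illustration only).
y = (0.03 * X[:, 0] - 0.8 * X[:, 1] + 0.7 * X[:, 2]
     + rng.normal(0, 1, 500) > 4.5).astype(int)

model = GradientBoostingClassifier().fit(X, y)

def flag_on_admission(age, t_score, prior_falls, risky_meds, threshold=0.3):
    """Flag the patient for a prevention-focused care path if the predicted
    probability of an injurious fall exceeds the alert threshold."""
    prob = model.predict_proba([[age, t_score, prior_falls, risky_meds]])[0, 1]
    return prob >= threshold

print(flag_on_admission(82, -2.5, 2, 3))  # True would trigger the care path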

THE NEED

In 2015, the estimated medical costs attributable to fatal and nonfatal falls were approximately $50.0 billion. For nonfatal falls, Medicare paid approximately $28.9 billion, Medicaid $8.7 billion, and private and other payers $12.0 billion. Overall medical spending for fatal falls was estimated to be $754 million.

NEW RESEARCH & EXPERT AVAILABILITY

The manuscript titled, "Preventing Inpatient Falls with Injuries using Integrative Machine Learning Prediction: A Cohort Study," will be available for full review soon, and I'll follow up with a link. I just wanted to get this on your radar and see if you might be interested in learning more about this new AI app and research. I'd be happy to connect you to Dr. Stephen Wong, who spearheaded the study and can speak to the increased need for technology like this for clinicians and caregivers to help patients at risk for severe falls.

Credit: 
Houston Methodist

Breast cancer patients to be evaluated for genetic testing

According to a statement on behalf of the American College of Medical Genetics and Genomics (ACMG) published Dec. 13 in the organization's official journal, Genetics in Medicine, there is insufficient evidence to recommend universal genetic testing for BRCA1/2 alone or in combination with multi-gene panels for all breast cancer patients.

The guidance from the ACMG differs from a consensus guideline issued in February by the American Society of Breast Surgeons, which recommended genetic testing for all newly diagnosed patients with breast cancer. The ACMG recommends evaluations before genetic testing.

"What we are saying is that all women with breast cancer should be evaluated for the need for genetic testing based on existing clinical criteria," said one of the lead authors, Tuya Pal, MD, associate director of Cancer Health Disparities at Vanderbilt-Ingram Cancer Center.

The group wrote the statement on behalf of the ACMG Professional Practice and Guidelines Committee.

"We expect that the evidence to support testing may evolve at different rates for different genes, and we expect that therapeutic indications will play a major role in the incorporation of genes to multi-gene panels," Pal and co-authors stated in the paper. "Consequently, as guidelines for testing are developed, it is critical to ensure they are supported by evidence and resources supporting strategies that include screening, medical and/or surgical care as indicated. Ideally, professional societies should work together to weigh data, formulate and harmonize evidence-based recommendations and seek to reduce barriers to care."

The ACMG document stressed the importance of genetic testing and said all breast cancer patients should be evaluated to determine whether germline genetic testing for hereditary breast cancer is warranted. They noted that only a small proportion of the at-risk population for hereditary breast cancers has been tested, with one estimate indicating that less than 10% of adults with BRCA1/2 pathogenic or likely pathogenic variants in the U.S. have been identified. Testing rates are disproportionately lower among racial and ethnic minority populations.

"As genetic testing now has the potential to guide cancer care, it has become imperative to ensure that all populations may benefit from these tremendous advances and that existing disparities in testing do not widen," Pal said. "In order to ensure this, we need to be intentional in developing and disseminating efforts such that improved outcomes based on genetic testing are experienced across populations."

The ACMG document provided the following guidance for clinicians to consider:

Genetic testing for breast cancer patients is indicated based on patient characteristics, including age at diagnosis, family cancer history, and expression of estrogen and progesterone receptors and HER2.

In discussions with patients, clinicians should be aware of the current insufficient evidence to support genetic testing for all patients with breast cancer.

After identification of a pathogenic or likely pathogenic mutation in moderately penetrant breast cancer genes, clinicians should recognize that guidance is based on consensus recommendations and that enhanced screening, to date, has not been associated with enhanced survival or earlier stage diagnosis.

Whenever genetic testing is performed on a clinical basis, the testing should include full gene sequencing and be conducted in a lab certified or accredited by either the College of American Pathologists or Clinical Laboratory Improvement Amendments.

Patients should be counseled about the implications of genetic testing by trained genetics professionals or health care providers with special expertise in cancer genetics principles.

Patients who have a pathogenic or likely pathogenic variant in an established breast cancer associated gene should be educated about the importance of cascade testing of family members.

Credit: 
Vanderbilt University Medical Center

Gardens can be havens for soil animals in towns and cities

video: A healthy soil can't do without soil animals. They are invaluable, providing us with many free services. But how are they doing in our towns and cities?

Image: 
Netherlands Institute of Ecology (NIOO-KNAW)

The fifth edition of the Dutch Soil Animal Days saw earthworms almost grab top spot thanks to the wet autumn weather. But at the end of the day, woodlice once again emerged as the most-observed soil animal in Dutch gardens. Nearly 1000 'citizen scientists' sent in their observations this year. And a surprisingly high number of people tried to do something in return for the vital services these soil creatures provide for us.

People can't live without healthy soil full of soil life, yet it's not something you hear about very often. With today's announcement of the results of the 2019 Soil Animal Days, we're breaking the silence. "After five years", says lead researcher Gerard Korthals, "it's now clear that gardens and parks are important havens for common soil animals in the city, and even balconies can be of value. If they're maintained in a soil animal-friendly way."

So how is all that indispensable soil life doing in our cities and towns? Every year around World Animal Day, researchers led by the Netherlands Institute of Ecology (NIOO-KNAW) and the Centre for Soil Ecology (CSE) try to fill in the gaps in our knowledge by enlisting the help of citizen scientists. This year's results are now available:

MAP (NL) / ANIMATED INFOGRAPHIC / FULL RESULTS (NL)

Who's on top?

In the Soil Animal Top 3 for 2019, woodlice are once again number one. But not for all garden types: in green gardens and schoolyards they've been overtaken by earthworms, and in paved gardens they're neck-and-neck with spiders and their relatives. In the Top 3, arachnids and earthworms are now in joint second place, while snails are in third place, having apparently recovered after the bone-dry previous season.

One of the most striking findings of 2019's soaking wet Soil Animal Days is that ants were not spotted as often: only in less than 60% of gardens. In 2018, ants did very well: thanks to the warm, dry weather they were still quite active in the autumn. Meanwhile, centipedes were not easy to find in many places, but more millipedes were reported even though they're not so common. That's a good sign!

Raining cats and dogs... and soil animals?

For our citizen scientists, the wet weather during the 2019 Soil Animal Days was quite a challenge, but most soil animals didn't mind. With an average of 43 soil animals reported per participating garden, numbers were up and noticeably higher than the average over the past five years (37.5 soil animals per garden).

During the Soil Animal Days, many people in the Netherlands enthusiastically search their gardens, parks, schoolyards or balconies. 944 participants also helped out with the scientific part this year by handing in their results, covering 185 gardens across the country.

Grading gardens

Those who handed in their results were given a grade representing the 'soil animal-friendliness' of their own garden. Those grades varied wildly this year. There were also grades per general type of garden, indicating its potential to be a haven for soil animals. With an average of 8.8 out of 10 based on all participating gardens, that potential is definitely there. Green gardens and tiny forests raked in the highest scores in terms of their potential: 9.2.

Fifth anniversary

So are there any conclusions to be drawn after five years of Soil Animal Days in the Netherlands? Gerard Korthals and his fellow soil researcher Ron de Goede answer in the affirmative. "Green and half-green gardens, as well as parks and public gardens, are eldorados for earthworms, snails, spiders and woodlice in particular." These groups of soil animals are found in more than 80% of the gardens belonging to one of those types.

Another conclusion is that the weather is indeed an important factor when it comes to the survival and level of activity of soil animals. In 2017, arachnids came out on top after a wet season while in 2018 - a dry year - there were lots of woodlice and few slugs. "We can conclude that in dry years, in particular, the type of garden and the way in which it is maintained are key factors for the survival of soil animals in the city."

Don't ask what soil animals can do for you...

To mark the fifth edition of the Soil Animal Days, we published a festive booklet looking at some of the most unusual and surprising soil animals: Ondersteboven, with the velvet mite as our special 'ambassador'. The booklet is sent to anyone who tells us what they are doing for soil animals in return for all the services they provide for us, like turning autumn leaves into food for next year's plants, purifying our water and suppressing pathogens.

The most popular options to do something in return were (1) treating soil animals to a 'soil animal snack' by not removing dead leaves, and (2) refraining from using chemical pesticides and fertilisers. In addition, many people came up with creative suggestions such as not winterising their gardens, and letting children discover soil animals.

Are you near a garden or park in the Netherlands and would you like to take part? The next edition of the Dutch Soil Animal Days will be from 25 September-7 October 2020!

Credit: 
Netherlands Institute of Ecology (NIOO-KNAW)

Cheers! Maxwell's electromagnetism extended to smaller scales

image: This is an artistic illustration of nonclassical effects in nanoscale electromagnetism. When the confinement of electromagnetic fields in nanostructures becomes comparable to the electronic length scales in materials, the associated nonclassical effects can substantially affect the electromagnetic response. This illustration represents a film-coupled nanodisk (the nanostructure studied in this work); the insert in the magnifier shows the electronic length scales (in this case, the 'thickness' of the surface induced charge).

Image: 
Marin Soljačić Research Group

More than one hundred and fifty years have passed since the publication of James Clerk Maxwell's "A Dynamical Theory of the Electromagnetic Field" (1865). What would our lives be without this publication? It is difficult to imagine, as this treatise revolutionized our fundamental understanding of electric fields, magnetic fields, and light. The twenty original equations (nowadays elegantly reduced into four), their boundary conditions at interfaces, and the bulk electronic response functions (dielectric permittivity ε and magnetic permeability μ) are at the root of our ability to manipulate electromagnetic fields and light (see the equations below, here without external interface currents or charges).
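In standard textbook form (a reconstruction; the table from the original release is not reproduced), these are

\nabla \cdot \mathbf{D} = \rho, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{H} = \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t},

with \mathbf{D} = \varepsilon \mathbf{E} and \mathbf{B} = \mu \mathbf{H} in each bulk medium, together with the classical conditions at an interface between media 1 and 2 with unit normal \hat{\mathbf{n}} (no external surface charges or currents):

\hat{\mathbf{n}} \cdot (\mathbf{D}_1 - \mathbf{D}_2) = 0, \qquad \hat{\mathbf{n}} \cdot (\mathbf{B}_1 - \mathbf{B}_2) = 0, \qquad \hat{\mathbf{n}} \times (\mathbf{E}_1 - \mathbf{E}_2) = 0, \qquad \hat{\mathbf{n}} \times (\mathbf{H}_1 - \mathbf{H}_2) = 0.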

Therefore, wondering what our life would be without Maxwell's equations means to try to envision our life without most of current science, communications and technology.

On large (macro) scales, bulk response functions and the classical boundary conditions are sufficient for describing the electromagnetic response of materials, but as we consider phenomena on smaller scales, nonclassical effects become important. A conventional treatment of classical electromagnetism fails to account for the mere existence of effects such as nonlocality [1], spill-out [2], and surface-enabled Landau damping. Why does this powerful framework break down towards nanoscales [3]? The problem is that electronic length scales are at the heart of nonclassical phenomena, and they are not part of the classical model. Electronic length scales can be thought of as the Bohr radius or the lattice spacing in solids: small scales that are relevant for the quantum effects at hand.

Today, the path to understand and model nanoscale electromagnetic phenomena is finally open. In the breakthrough Nature paper "A General Theoretical and Experimental Framework for Nanoscale Electromagnetism", Yang et al. [4] present a model that extends the validity of the macroscopic electromagnetism into the nano regime, bridging the scale gap. On the theoretical side, their framework generalizes the boundary conditions by incorporating the electronic length scales in the form of so-called Feibelman d-parameters (d∥ and d⊥).
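For reference, d⊥ has a standard definition in the literature (stated here as background, not quoted from the paper): it is the centroid of the induced surface charge density ρ_ind along the surface normal z -- the 'thickness' of the induced charge sketched in the illustration above -- while d∥ is an analogous first moment of the tangential induced current:

d_{\perp} = \frac{\int z\, \rho_{\mathrm{ind}}(z)\, \mathrm{d}z}{\int \rho_{\mathrm{ind}}(z)\, \mathrm{d}z}.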

The d-parameters play a role that is analogous to that of the permittivity ε, but for interfaces. In terms of numerical modelling [5], all one needs to do is to pair each two-material interface with its associated Feibelman d-parameters and solve Maxwell's equations with the new boundary conditions.

On the experimental side, the authors investigate film-coupled nanoresonators, a quintessential multiscale architecture. The experimental setup was chosen because of its nonclassical nature. Even so, recently graduated postdoc and lead author Yi Yang comments: "When we built our experiment, we were lucky enough to run into the right geometry that enabled us to observe the pronounced nonclassical features, which were actually unexpected and excited everyone. These features eventually enabled us to measure the d-parameters, which are hard to compute for some important plasmonic materials like gold (as in our case)."

The new model and experiments are momentous both for fundamental science and for diverse applications. The work makes a hitherto unexplored connection between electromagnetism, material science, and condensed matter physics--one that could lead to further theoretical and experimental discoveries in all related fields, including chemistry and biology. Application-wise, this work points to the possibility of engineering the optical response beyond the classical regime - an example would be to explore how to extract more power from emitters using antennas.

MIT Professor Marin Soljacic is enthusiastic: "We expect this work to have substantial impact. The framework we present opens a new chapter for cutting-edge nanoplasmonics--the study of optical phenomena in the nanoscale vicinity of metal surfaces--and nanophotonics--the behavior of light on the nanometer scale--and for controlling the interaction of nanometer-scale objects with light."

Credit: 
Massachusetts Institute of Technology, Institute for Soldier Nanotechnologies

Single-cell analysis of the earliest cell fate decisions in development

image: The image shows that, for each single cell across four developmental time points (4.5 to 7.5 days after fertilisation, E4.5 to E7.5), we obtain three separate molecular profiles: chromatin accessibility, DNA methylation and RNA expression.

Image: 
The embryo and cell images were created by Veronique Juvin from SciArtWork. Data plots were produced by Ricard Argelaguet, EMBL-EBI.

Key points:

Uniting three different parameters in single cells taken from mouse embryos reveals more about how foundational cell identities are established in early development

The first single-cell multi-omics analysis of gastrulation allows researchers to connect gene expression, DNA methylation and chromatin accessibility to understand the role of the epigenome in regulating cell fate decisions in early development.

The findings suggest how embryonic cells may be diverted away from a default cell state and awoken to new developmental possibilities during gastrulation, and plot the timeline of the epigenetic events controlling cell identity.

The research brings together expertise in single-cell analysis, computational methods and machine learning from the Babraham Institute, European Bioinformatics Institute, CRUK Cambridge Institute and Wellcome - MRC Cambridge Stem Cell Institute.

Researchers at the Babraham Institute, EMBL's European Bioinformatics Institute (EMBL-EBI), CRUK Cambridge Institute and the Wellcome - MRC Cambridge Stem Cell Institute have provided the first single-cell epigenomic analysis of gastrulation, a crucial process in early embryo development. The researchers analysed over 1,000 cells from mouse embryos to understand the epigenetic priming events preceding gastrulation and the cell fate decisions these establish. The findings, published today (Wednesday) in Nature, uncover fundamental knowledge about the processes that programme cell fate in the early embryo to generate all the organs and tissues of the body.

Just as with the construction of a building, one of the first steps before creating the upright structure is to set the foundations and the floorplan. Mammalian development is not so different. The gastrulation process takes the embryo from a ball of largely undifferentiated cells and establishes the head-to-tail and front-to-back axes and three foundational cellular layers that are each responsible for creating specific parts of the embryo.

The three layers (called germ layers) established during gastrulation are the ectoderm, mesoderm and endoderm. The ectoderm gives rise to the skin and the nervous system, the mesoderm specifies the development of several cell types such as bone, muscle and connective tissue, and cells in the endoderm layer subsequently become the linings of the digestive and respiratory system, and form organs such as the liver and the pancreas.

Building on exciting progress in the use of cutting-edge single-cell and computational techniques to understand early development, the researchers used two pioneering methods to collect and analyse the gastrulation data. The first was a technique developed at the Babraham Institute called scNMT-seq (single-cell Nucleosome, Methylome and Transcriptome sequencing). This was used to obtain multiple biological read-outs from 1,105 single cells taken from mouse embryos at different development stages spanning the gastrulation process. In each cell, the researchers assessed gene expression activity, DNA methylation and chromatin accessibility to chart how these changed as cells went through the gastrulation process.

The second was a computational method developed by researchers at EMBL-EBI called Multi-Omics factor Analysis (MOFA). This machine-learning approach, originally developed for personalised medicine, allowed the researchers to unite the three streams of biological information profiled from each single cell.

Ricard Argelaguet, a PhD student at the EMBL-EBI co-supervised by John Marioni and Oliver Stegle, said "MOFA offers a principled approach for combining high-dimensional multi-omics data. It helped us to identify the elements in the genome that are associated with cell fate commitment as well as enabling us to understand how different molecular features interact with one another throughout gastrulation."
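As a highly simplified, illustrative stand-in for what MOFA does (this is not the MOFA implementation; the synthetic matrices, dimensions and use of ordinary factor analysis below are assumptions), the idea is to infer latent factors that capture coordinated variation shared across the three molecular layers measured in the same cells:

# Simplified stand-in for MOFA-style multi-omics integration -- not the
# authors' code. Shared latent factors are inferred by factor analysis on
# the standardized, concatenated per-cell profiles.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_cells = 200

# Synthetic stand-ins for the three per-cell molecular profiles.
rna  = rng.poisson(5.0, size=(n_cells, 300)).astype(float)      # expression counts
meth = rng.beta(2.0, 5.0, size=(n_cells, 200))                  # CpG methylation rates
acc  = rng.binomial(1, 0.3, size=(n_cells, 150)).astype(float)  # accessibility calls

# Standardize each "omic" separately, then concatenate the features.
views = [StandardScaler().fit_transform(v) for v in (rna, meth, acc)]
X = np.hstack(views)

# Ten latent factors: axes of variation spanning all three layers, which
# can then be inspected for association with germ layer or embryonic stage.
fa = FactorAnalysis(n_components=10, random_state=0)
factors = fa.fit_transform(X)   # (n_cells, 10) factor scores per cell
loadings = fa.components_       # (10, n_features) feature weights
print(factors.shape, loadings.shape)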

The researchers found that the three germ layers (ectoderm, mesoderm and endoderm) showed differences in the timing of epigenetic events in a hierarchical way. The analysis demonstrated that ectoderm cells were primed epigenetically at an earlier developmental stage. This finding may explain the existence of a pre-programmed default developmental path specifying skin and brain identities. Epigenetic events occurring later in mesoderm and endoderm cells may act to actively divert these cells away from this default path by making the cells receptive to signals promoting other cell identities.

"Through analysing the timeline of events, we identified that the diversification of the three gastrulation layers was mainly driven by epigenetic events affecting germ layer specific enhancers." said Dr Stephen Clark, lead researcher and one of the paper's four joint first authors. "We found that the epigenome of the ectoderm layer was established much earlier in development than the other two, even though all three cell types arise at a similar time."

Professor Wolf Reik, Head of the Epigenetics Programme at the Babraham Institute, said "This is the first comprehensive application of the single-cell method developed here at the Institute to a biological question and gives a new view on how cell fate is established. Our findings develop our understanding of the role of the epigenome in defining cell fate commitments at different stages of development, with important implications for stem cell biology and medicine. It is very enjoyable to see how the multidisciplinary research community that has come together in this project is now sharing the success of their efforts."

Dr John Marioni, Group Leader at the EMBL-EBI and the CRUK Cambridge Institute, said "The ability of a cell to commit to a specific fate requires integration of a complex array of different molecular signals. Analogously, understanding this process requires combining cutting-edge experimental and computational tools, as exemplified in this work. Importantly, our data and analyses provide a blueprint for how the epigenome might regulate cell fate choice in other contexts, providing an exciting launch pad for future studies."

A newly announced research initiative, the Wellcome-funded Human Developmental Biology Initiative, will utilise the same single-cell methods to analyse cells from human embryos. The initiative, which involves several of the paper's authors (Professor Wolf Reik, Dr Gavin Kelsey, Dr Peter Rugg-Gunn, each a group leader at the Babraham Institute, and Professor Bertie Göttgens, Principal Investigator, Wellcome - MRC Cambridge Stem Cell Institute) might provide insight as to whether the same epigenetic mechanisms of control work similarly in humans.

Single-cell multi-omics analysis is also likely to deliver significant impacts for human healthcare in future years. A pan-European research initiative, LifeTime, is bringing together life science experts with leaders in the pharma, clinical medicine and technology industries to map how innovation and cutting-edge technologies (including single-cell multi-omics methods) can be united to revolutionise healthcare.

Credit: 
Babraham Institute

Mountain goats' air conditioning is failing, study says

image: Glacier National Park's iconic mountain goats seek out vanishing snow patches where they cool and reduce their respiration.

Image: 
Wesley Sarmento - University of Montana

A new study in the journal PLOS ONE says Glacier National Park's iconic mountain goats are in dire need of air conditioning.

Researchers from the University of Montana, Glacier National Park, and Wildlife Conservation Society (WCS) found that mountain goats (Oreamnos americanus) in Glacier National Park seek out patches of snow in the summertime to reduce heat stress. When they do, their breathing rates go down -- a behavioral strategy that results in less energy expended.

The trouble is that Glacier has already lost some 75% of its glaciers, and many snow patches are rapidly dwindling. The park had over 100 glaciers when it was established in 1910. In 2015, only a couple dozen met the size criteria to be considered active glaciers.

The study's authors, Wesley Sarmento of the University of Montana, Mark Biel of Glacier National Park, and WCS Senior Scientist Joel Berger, have studied mountain goats in the field since 2013 to better understand thermal environments and how changes in those environments affect this cold-adapted species in Glacier.

To understand stressors to goats and ways in which they combat heat, the scientists performed observations of animals on and off ice patches on hot summer afternoons, and days with and without wind. To avoid higher temperatures, the goats sought snow patches for resting, and when they found them, breathing rates were reduced by as much as 15 percent.

The authors note that while people seek shade or air conditioning to stabilize their metabolic rates and animals like coyotes or marmots seek dens, mountain goats in the shade-less environs above treeline have less opportunity to reduce exposure to rising temperature. Goats that were observed resting in shade did not have significant reductions in respirations.

"10,000 years ago when the North American climate was cooler there were mountain goats in Grand Canyon, but certainly increasing temperatures and drier weather ultimately contributed to their extinction in that area," says Sarmento.

Says Biel: "This work is important to shed light on the impacts of a changing climate on these iconic animals and their habitat. How certain species may adapt as the changes continue is critical in understanding their persistence on the landscape into the future."

For people from Europe to the Americas and far beyond, high temperatures cause stress and death. In 2019, more than 1,000 people died from heat exposure in France and Spain.

Berger, who also holds the Barbara-Cox Chair of Wildlife Conservation at Colorado State University, draws analogies beyond alpine animals. "Just as people are feeling the heat of a warming planet with thousands and thousands struggling during summer without natural cooling systems, we're seeing very clearly that what happens to people is also happening to animals - we're all in this together."

The authors say that climate change will affect mountain goats in other areas including in Alaska's coastal mountains, where summer habitat is expected to shrink up to 86 percent in the next 70 years due to factors such as forest encroachment which in turn will fragment habitat, decrease food availability, and reduce predator detection.

Credit: 
Wildlife Conservation Society

Focus on food security and sustainability

The number of malnourished people is increasing worldwide. More than two billion people suffer from a lack of micronutrients. Infant mortality rates are unacceptably high. Against this background, there is a need for the global pooling of research efforts, more research funding and an international body for food security and agriculture that prepares policy decisions. This is what Prof. Joachim von Braun from the University of Bonn, Dr. Robin Fears from the European Academies Science Advisory Council (EASAC) and Prof. Volker ter Meulen, President of the InterAcademy Partnership (IAP) call for in the journal Science Advances.

The researchers argue that lack of healthy food and poorly managed agricultural systems on the one hand, and excessive consumption and food waste on the other, damage the planet and pose "an unprecedented threat to global food security". World leaders had started to recognize the challenges.

Academies of science, medicine and technology have recently joined forces to form the global network InterAcademy Partnership (IAP). An IAP project is working towards connecting the interfaces of food, food security and global environmental health. The organization brings together networks of experts from Africa, Asia, America and Europe to analyze food systems with a view to global environmental change.

Investment in research infrastructure

The authors highlight the urgent need for investment in research infrastructure to provide reliable data on population health, nutrition, agricultural practices, climate change, ecosystems, sustainability and human behavior. The political decision-makers must increase funding for agricultural and nutrition research.

Although a large body of scientific knowledge on nutrition and hunger is already available, increased international cooperation is required to close knowledge gaps. This also involves social science issues, for example how to transform the behavior of consumers and farmers and how to introduce previously neglected agricultural crops. The implementation of the sustainable development goals (SDGs) also requires the coherent consideration of research results.

The authors propose an international food security and agriculture body focused on preparing policy decisions. "Such a body would have the support of the large scientific community associated with it and could address the most pressing nutritional and agricultural issues," write the researchers. The tasks range from the question of how to balance nutritional and environmental goals to the analysis of how to motivate consumers to eat healthily and sustainably.

Credit: 
University of Bonn

ALMA spots most distant dusty galaxy hidden in plain sight

image: ALMA radio image of the dusty star-forming galaxy called MAMBO-9. The galaxy consists of two parts, and it is in the process of merging.

Image: 
ALMA (ESO/NAOJ/NRAO), C.M. Casey et al.; NRAO/AUI/NSF, B. Saxton

Astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have spotted the light of a massive galaxy seen only 970 million years after the Big Bang. This galaxy, called MAMBO-9, is the most distant dusty star-forming galaxy that has ever been observed without the help of a gravitational lens.

Dusty star-forming galaxies are the most intense stellar nurseries in the universe. They form stars at a rate up to a few thousand times the mass of the Sun per year (the star-forming rate of our Milky Way is just three solar masses per year) and they contain massive amounts of gas and dust. Such monster galaxies are not expected to have formed early in the history of the universe, but astronomers have already discovered several of them as seen when the cosmos was less than a billion years old. One of them is galaxy SPT0311-58, which ALMA observed in 2018.

Because of their extreme behavior, astronomers think that these dusty galaxies play an important role in the evolution of the universe. But finding them is easier said than done. "These galaxies tend to hide in plain sight," said Caitlin Casey of the University of Texas at Austin and lead author of a study published in the Astrophysical Journal. "We know they are out there, but they are not easy to find because their starlight is hidden in clouds of dust."

MAMBO-9's light was already detected ten years ago by co-author Manuel Aravena, using the Max-Planck Millimeter BOlometer (MAMBO) instrument on the IRAM 30-meter telescope in Spain and the Plateau de Bure Interferometer in France. But these observations were not sensitive enough to reveal the distance of the galaxy. "We were in doubt if it was real, because we couldn't find it with other telescopes. But if it was real, it had to be very far away," says Aravena, who was at that time a PhD student in Germany and is currently working for the Universidad Diego Portales in Chile.

Thanks to ALMA's sensitivity, Casey and her team have now been able to determine the distance of MAMBO-9. "We found the galaxy in a new ALMA survey specifically designed to identify dusty star-forming galaxies in the early universe," said Casey. "And what is special about this observation, is that this is the most distant dusty galaxy we have ever seen in an unobstructed way."

The light of distant galaxies is often obstructed by other galaxies closer to us. These galaxies in front work as a gravitational lens: they bend the light from the more distant galaxy. This lensing effect makes it easier for telescopes to spot distant objects (this is how ALMA could see galaxy SPT0311-58). But it also distorts the image of the object, making it harder to make out the details.

In this study, the astronomers saw MAMBO-9 directly, without a lens, and this allowed them to measure its mass. "The total mass of gas and dust in the galaxy is enormous: ten times more than all the stars in the Milky Way. This means that it has yet to build most of its stars," Casey explained. The galaxy consists of two parts, and it is in the process of merging.

Casey hopes to find more distant dusty galaxies in the ALMA survey, which will give insight into how common they are, how these massive galaxies formed so early in the universe, and why they are so dusty. "Dust is normally a by-product of dying stars," she said. "We expect one hundred times more stars than dust. But MAMBO-9 has not produced that many stars yet and we want to find out how dust can form so fast after the Big Bang."

"Observations with new and more capable technology can produce unexpected findings like MAMBO-9," said Joe Pesce, National Science Foundation Program Officer for NRAO and ALMA. "While it is challenging to explain such a massive galaxy so early in the history of the universe, discoveries like this allow astronomers to develop an improved understanding of, and ask ever more questions about, the universe."

The light from MAMBO-9 travelled about 13 billion years to reach ALMA's antennas (the universe is approximately 13.8 billion years old today). That means that we can see what the galaxy looked like in the past. Today, the galaxy would probably be even bigger, containing one hundred times more stars than the Milky Way, residing in a massive galaxy cluster.
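The two figures are consistent; as a quick check of the arithmetic,

t_{\mathrm{lookback}} \approx 13.8\ \mathrm{Gyr} - 0.97\ \mathrm{Gyr} \approx 12.8\ \mathrm{Gyr},

i.e. roughly 13 billion years of light travel time separate us from the galaxy as we see it.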

Credit: 
National Radio Astronomy Observatory

Researchers develop first mathematical proof for key law of turbulence in fluid mechanics

image: Mathematicians from UMD have developed the first rigorous proof for a fundamental law of turbulence. Batchelor's law, which helps explain how chemical concentrations and temperature variations distribute themselves in a fluid, can be seen at work in the variously sized swirls of mixing warm and cold ocean water.

Image: 
NOAA/Geophysical Fluid Dynamics Laboratory

What if engineers could design a better jet with mathematical equations that drastically reduce the need for experimental testing? Or what if weather prediction models could predict details in the movement of heat from the ocean into a hurricane? These things are impossible now, but could be possible in the future with a more complete mathematical understanding of the laws of turbulence.

University of Maryland mathematicians Jacob Bedrossian, Samuel Punshon-Smith and Alex Blumenthal have developed the first rigorous mathematical proof explaining a fundamental law of turbulence. The proof of Batchelor's law will be presented at a meeting of the Society for Industrial and Applied Mathematics on December 12, 2019.

Although all laws of physics can be described using mathematical equations, many are not supported by detailed mathematical proofs that explain their underlying principles. One area of physics that has been considered too challenging to explain with rigorous mathematics is turbulence. Seen in ocean surf, billowing clouds and the wake behind a speeding vehicle, turbulence is the chaotic movement of fluids (including air and water) that includes seemingly random changes in pressure and velocity.

Turbulence is the reason the Navier-Stokes equations, which describe how fluids flow, are so hard to analyze that there is a million-dollar reward for anyone who can prove whether their solutions always exist and remain smooth. To understand fluid flow, scientists must first understand turbulence.

"It should be possible to look at a physical system and understand mathematically if a given physical law is true," said Jacob Bedrossian, a professor of mathematics at UMD and a co-author of the proof. "We believe our proof provides the foundation for understanding why Batchelor's law, a key law of turbulence, is true in a way that no theoretical physics work has done so far. This work could help clarify some of the variations seen in turbulence experiments and predict the settings where Batchelor's law applies as well as where it doesn't."

Since Batchelor's law was introduced in 1959, physicists have debated its validity and scope. The law helps explain how chemical concentrations and temperature variations distribute themselves in a fluid. For example, stirring cream into coffee creates a large swirl with small swirls branching off of it and even smaller ones branching off of those. As the cream mixes, the swirls grow smaller and the level of detail changes at each scale. Batchelor's law predicts the detail of those swirls at different scales.
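In its usual statement (a standard textbook form, not taken from the new proof), the law says that in the so-called viscous-convective range of scales the power spectrum of scalar fluctuations -- the amount of 'detail' at wavenumber k -- falls off in proportion to 1/k:

E_{\theta}(k) \simeq C_B\, \chi \left( \frac{\nu}{\varepsilon} \right)^{1/2} k^{-1},

where \chi is the dissipation rate of the scalar fluctuations (cream concentration, temperature), \nu is the fluid's viscosity, \varepsilon is the turbulent energy dissipation rate and C_B is the Batchelor constant.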

The law plays a role in such things as chemicals mixing in a solution, river water blending with saltwater as it flows into the ocean and warm Gulf Stream water combining with cooler water as it flows north. Over the years, many important contributions have been made to help understand this law, including work at UMD by Distinguished University Professors Thomas Antonsen and Edward Ott. However, a complete mathematical proof of Batchelor's law has remained elusive.

"Before the work of Professor Bedrossian and his co-authors, Batchelor's law was a conjecture," said Vladimir Sverak, a professor of mathematics at the University of Minnesota who was not involved in the work. "The conjecture was supported by some data from experiments, and one could speculate as to why such a law should hold. A mathematical proof of the law can be considered as an ideal consistency check. It also gives us a better understanding of what is really going on in the fluid, and this may lead to further progress."

"We weren't sure if this could be done," said Bedrossian, who also has a joint appointment in UMD's Center for Scientific Computation and Mathematical Modeling. "The universal laws of turbulence were thought to be too complex to address mathematically. But we were able to crack the problem by combining expertise from multiple fields."

An expert in partial differential equations, Bedrossian brought in two UMD postdoctoral researchers who are experts in three other areas to help him solve the problem. Samuel Punshon-Smith (Ph.D. '17, applied mathematics and statistics, and scientific computation), now the Prager Assistant Professor at Brown University, is an expert in probability. Alex Blumenthal is an expert in dynamical systems and ergodic theory, a branch of mathematics that includes what is commonly known as chaos theory. The team represented four distinct areas of mathematical expertise that rarely interact to this degree. All were essential to solving the problem.

"The way the problem has been approached is indeed creative and innovative," Sverak said. "Sometimes the method of proof can be even more important than the proof itself. It is likely that ideas from the papers by Professor Bedrossian and his co-authors will be very useful in future research."

The new level of collaboration that the team brought to this issue sets the stage for developing mathematical proofs to explain other unproven laws of turbulence.

"If this proof is all we achieve, I think we've accomplished something," Bedrossian said. "But I'm hopeful that this is a warmup and that this opens a door to saying 'Yes, we can prove universality laws of turbulence and they are not beyond the realm of mathematics.' Now that we are equipped with a much clearer understanding of how to use mathematics to study these questions, we are working to build the mathematical tools required to study more of these laws."

Understanding the underlying physical principles behind more laws of turbulence could eventually help engineers and physicists in designing better vehicles, wind turbines and similar technologies or in making better weather and climate predictions.

Credit: 
University of Maryland

Mechanisms help pancreatic cancer cells avert starvation

A new study reveals the mechanism that helps pancreatic cancer cells avoid starvation within dense tumors by hijacking a process that pulls nutrients in from their surroundings.

Led by researchers at NYU Grossman School of Medicine, the study explains how changes in the gene RAS -- known to encourage the abnormal growth seen in 90 percent of pancreatic cancer patients -- also accelerate a process that supplies the building blocks required for that growth.

Called macropinocytosis, the process engulfs proteins and fats, which can be broken down into amino acids and metabolites used to build new proteins, DNA strands, and cell membranes. Cancer cells cannot multiply without these resources on hand, say the study authors.

Published online December 11 in the journal Nature, the new work identifies the key molecular steps that are marshalled by the cancer cells to boost macropinocytosis.

"We found a mechanism related to nutrient supply that we believe could be used to deny RAS mutant tumor cells of a key survival mechanism," says first study author Craig Ramirez, PhD, a postdoctoral fellow in the Department of Biochemistry and Molecular Pharmacology at NYU School of Medicine.

Theater of Operations

Specifically, the research team found that RAS mutations further activate the protein SLC4A7, which enables the protein called bicarbonate-dependent soluble adenylate cyclase to activate the enzyme protein kinase A. This in turn was found to change the location of a protein called v-ATPase.

By shifting where v-ATPase operates from the depths of cells to areas near their outer membranes, the reaction positions the enzyme to deliver the cholesterol needed by RAC1 to attach to cell membranes, the researchers say. Build-up of v-ATPase near outer membranes, and the related positioning of Rac1, enable membranes to temporarily bulge, roll over on themselves, and form nutrient-engulfing pockets (vesicles) during macropinocytosis.

In cell culture studies, treatment of mutant RAS cells with the SLC4 family inhibitor S0859 led to a significant reduction in RAS-dependent v-ATPase localization to outer membranes, as well as to the inhibition of macropinocytosis.

Furthermore, analysis of molecular data from human pancreatic ductal adenocarcinoma (PDAC) tissue revealed that the gene for SLC4A7 is expressed four-fold higher in tumors than in normal nearby pancreatic tissue.

The study team also showed that silencing the gene for SLC4A7 in pancreatic cancer cells slowed down or shrunk tumors in mice. After 14 days, 62 percent of tumors with silenced SLC4A7 showed reduced growth in mice compared with tumors with the active gene, and 31 percent of tumors showed shrinkage.

"We are now searching for drug candidates that might inhibit the action of SLC4A7 or v-ATPase as potential future treatments that block macropinocytosis," says study senior author Dafna Bar-Sagi, PhD, senior vice president, vice dean for science, and chief scientific officer at NYU Langone Health. "Both of these proteins are in principle good targets because they're linked to cancer growth and operate near the cancer cell surfaces, where a drug delivered through the bloodstream could reach them."

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Increasing transparency in the healthcare sector: More might not be better

Key takeaways from a new study in the INFORMS journal Operations Research:

Increasing quality transparency in the short-term typically improves social welfare and reduces inequality among patients.

Increasing transparency in the long-term can decrease social welfare and increase inequality.

The best solution is to target public reporting to specific patient populations and incentivize hospitals.

CATONSVILLE, MD, December 11, 2019 - More isn't always better. That's what researchers say when it comes to transparency in the U.S. healthcare system. This research, forthcoming in the INFORMS journal Operations Research, finds that in the short term, patients knowing more about hospital quality is a positive, but in the long term, the benefits may not be what you might think.

Advocates of public reporting of healthcare outcomes claim that increased transparency automatically leads to improved social welfare because patients are equipped with information to make better healthcare choices. But this new research demonstrates that there is more to the story. Unless public reporting is accompanied by other careful policy interventions, it can negatively affect social welfare.

The study, "Can Public Reporting Cure Healthcare? The Role of Quality Transparency in Improving Patient-Provider Alignment," conducted by Soroush Saghafian of Harvard University and Wallace Hopp of the University of Michigan, found solutions for policymakers. First, outcome information must not only be available, but understandable, to the public. Second, policymakers should target younger, more affluent or urban patients or those that require non-emergency treatment. Finally, public reporting efforts should be accompanied by policies aimed at incentivizing hospitals to make socially optimal investments.

"Implementing these guidelines is the best way the healthcare sector can achieve the full societal benefits of transparency," said Saghafian, a professor in the Harvard Kennedy School.

Currently, healthcare systems use various practices, such as requiring a referral from a primary care physician to see a specialist, to influence the alignment between patients and providers. However, while research shows patients value quality in providers, few make formal use of quality information in their choices.

"That could mean patients don't understand or aren't aware of the information being provided or that the healthcare sector is not using correct measures of quality," continued Saghafian.

"The point is, increasing quality transparency will eventually alter patient decisions and the healthcare market," he said. "We must understand the implications of this shift and use it to identify policy options for leveraging quality transparency to improve societal outcomes."

Currently, transparency in the short term increases social welfare and decreases inequality among patients, but it does so at a diminishing rate because higher levels of transparency become expensive. This study suggests that something less than full transparency is socially ideal.

Increasing quality transparency promotes increased medical specialization, resulting in decreased geographical specialization, and pushes hospitals to invest in their strengths rather than their weaknesses. Hospitals will also have more incentives to shift their investments toward quality improvement and away from pure marketing activities.

Credit: 
Institute for Operations Research and the Management Sciences

Teams of microbes are at work in our bodies. Here's how to figure out what they're doing

In the last decade, scientists have made tremendous progress in showing that the groups of bacteria and viruses that naturally coexist throughout the human body play an important role in vital functions like digestion, metabolism and even fighting off diseases. But understanding just how they do it remains an open question.

Researchers from Drexel University are hoping to help answer that question through a clever combination of high-throughput genetic sequencing and natural language processing computer algorithms. Their research, which was recently published in the journal PLOS ONE, reports a new method of analyzing the codes found in RNA that can delineate human microbial communities and reveal how they operate.

Much of the research on the human microbial environment - or microbiome - has focused on identifying all of the different microbe species. And the nascent development of treatments for microbiota-linked maladies operates under the idea that imbalances or deviations in the microbiome are the source of health problems, such as indigestion or Crohn's disease.

But to properly correct these imbalances it's important for scientists to have a broader understanding of microbial communities as they exist - both in the afflicted areas and throughout the entire body.

"We are really just beginning to scrape the surface of understanding the health effects of microbiota," said Gail Rosen, PhD, an associate professor in Drexel's College of Engineering, who was an author of the paper. "In many ways scientists have jumped into this work without having a full picture of what these microbial communities look like, how prevalent they are and how their internal configuration affects their immediate environment within the human body."

Rosen heads Drexel's Center for Biological Discovery from Big Data, a group of researchers that has been applying algorithms and machine learning to help decipher massive amounts of genetic sequencing information that has become available in the last handful of years. Their work and similar efforts around the world have moved microbiology and genetics research from the wet lab to the data center - creating a computational approach to studying organism interactions and evolution, called metagenomics.

In this type of research, a scan of a genetic sample - DNA or RNA - can be interpreted to reveal which organisms are likely present. The method presented by Rosen's group takes that one step further by analyzing the genetic code to spot recurring patterns, an indication that certain groups of organisms - microbes in this case - are found together so frequently that it's not a coincidence.

"We call this method 'themetagenomics,' because we are looking for recurring themes in microbiomes that are indicators of co-occurring groups of microbes," Rosen said. "There are thousands of species of microbes living in the body, so if you think about all the permutations of groupings that could exist you can imagine what a daunting task it is to determine which of them are living in community with each other. Our method puts a pattern-spotting algorithm to work on the task, which saves a tremendous amount of time and eliminates some guesswork."

Current methods for studying microbiota, gut bacteria for example, take a sample from an area of the body and then look at the genetic material that's present. This process inherently lacks important context, according to the authors.

"It's impossible to really understand what microbe communities are doing if we don't first understand the extent of the community and how frequently and where else they might be occurring in the body," said Steve Woloszynek, PhD, and MD trainee in Drexel's College of Medicine and co-author of the paper. "In other words, it's hard to develop treatments to promote natural microbial coexistence if their 'natural state' is not yet known."

Obtaining a full map of microbial communities, using themetagenomics, allows researchers to observe how they change over time - both in healthy people and those suffering from diseases. And observing the difference between the two provides clues to the function of the community, as well as illuminating the configuration of microbe species that enables it.

"Most metagenomics methods just tell you which microbes are abundant - therefore likely important - but they don't really tell you much about how each species is supporting other community members," Rosen said. "With our method you get a picture of the configuration of the community - for example, it may have E. coli and B. fragilis as the most abundant microbes and in pretty equal numbers - which may indicate that they're cross-feeding. Another community may have B. fragilis as the most abundant microbe, with many other microbes in equal, but lower, numbers - which could indicate that they are feeding off whatever B. fragilis is making, without any cooperation."

One of the ultimate goals of analyzing human microbiota is to use the presence of certain microbe communities as indicators to identify diseases like Crohn's or even specific types of cancer. To test their new method, the Drexel researchers put it up against similar topic modeling procedures that diagnose Crohn's and mouth cancer by measuring the relative abundance of certain genetic sequences.

The themetagenomics method proved to be just as accurate at predicting the diseases, but it does so much faster than the other topic modeling methods - minutes versus days - and it also teases out how each microbe species in the indicator community may contribute to the severity of the disease. With this level of granularity, researchers will be able to home in on particular genetic groupings when developing targeted treatments.
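
Once each sample is summarized by its theme proportions, checking whether those community-level features track a disease label is a routine classification task. The sketch below shows that assumed downstream step with placeholder random data; it is not the paper's benchmark or its actual classifier.

```python
# Assumed downstream step (placeholder data, not the study's benchmark):
# use per-sample theme proportions as features for a disease/control classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
theme_weights = rng.dirichlet(np.ones(10), size=100)   # stand-in for LDA output
labels = rng.integers(0, 2, size=100)                  # 1 = disease, 0 = control

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, theme_weights, labels, cv=5).mean())
```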

The group has made its themetagenomics analysis tools publicly available in hopes of speeding progress toward cures and treatments for these maladies.

"It's very early right now, but the more that we understand about how the microbiome functions - even just knowing that groups may be acting together - then we can look into the metabolic pathways of these groups and intervene or control them, thus paving the way for drug development and therapy research," Rosen said.

Credit: 
Drexel University

Earth was stressed before dinosaur extinction

image: Ben Linzmeier holds a fossilized clam shell found while trekking through the Lopez de Bertodano Formation, a well-preserved, fossil-rich area on the west side of Seymour Island in Antarctica.

Image: 
Northwestern University

Shells' chemistry shifted in response to an influx of carbon into the oceans, corresponding with the Deccan Traps eruptions

Researchers studied clam and snail shells collected from Seymour Island, Antarctica

Study is first to examine the shells' calcium isotope composition across this interval

Researcher: 'Each shell is a short, preserved snapshot of the ocean's chemistry'

EVANSTON, Ill. -- New evidence gleaned from Antarctic seashells confirms that Earth was already unstable before the asteroid impact that wiped out the dinosaurs.

The study, led by researchers at Northwestern University, is the first to measure the calcium isotope composition of fossilized clam and snail shells, which date back to the Cretaceous-Paleogene mass extinction event. The researchers found that -- in the run-up to the extinction event -- the shells' chemistry shifted in response to a surge of carbon in the oceans.

This carbon influx was likely due to long-term eruptions from the Deccan Traps, a 200,000-square-mile volcanic province located in modern India. During the years leading up to the asteroid impact, the Deccan Traps spewed massive amounts of carbon dioxide (CO2) into the atmosphere. The rising concentration of CO2 acidified the oceans, directly affecting the organisms living there.

"Our data suggest that the environment was changing before the asteroid impact," said Benjamin Linzmeier, the study's first author. "Those changes appear to correlate with the eruption of the Deccan Traps."

"The Earth was clearly under stress before the major mass extinction event," said Andrew D. Jacobson, a senior author of the paper. "The asteroid impact coincides with pre-existing carbon cycle instability. But that doesn't mean we have answers to what actually caused the extinction."

The study will be published in the January 2020 issue of the journal Geology, which comes out later this month.

Jacobson is a professor of Earth and planetary sciences in Northwestern's Weinberg College of Arts and Sciences. Linzmeier was a postdoctoral researcher with the Ubben Program for Climate and Carbon Science at the Institute for Sustainability and Energy at Northwestern when the research was conducted. He is now a postdoctoral fellow at the University of Wisconsin-Madison in the Department of Geoscience.

'Each shell is a snapshot'

Previous studies have explored the potential effects of the Deccan Traps eruptions on the mass extinction event, but many have examined bulk sediments and used different chemical tracers. By focusing on a specific organism, the researchers gained a more precise, higher-resolution record of the ocean's chemistry.

"Shells grow quickly and change with water chemistry," Linzmeier said. "Because they live for such a short period of time, each shell is a short, preserved snapshot of the ocean's chemistry."

Seashells are composed mostly of calcium carbonate, the same mineral found in chalk, limestone and some antacid tablets. Carbon dioxide in water dissolves calcium carbonate. During the formation of the shells, CO2 likely affects shell composition even without dissolving the shells.

For this study, the researchers examined shells collected from the Lopez de Bertodano Formation, a well-preserved, fossil-rich area on the west side of Seymour Island in Antarctica. They analyzed the shells' calcium isotope compositions using a state-of-the-art technique developed in Jacobson's laboratory at Northwestern. The method involves dissolving shell samples to separate calcium from various other elements, followed by analysis with a mass spectrometer.

"We can measure calcium isotope variations with high precision," Jacobson said. "And those isotope variations are like fingerprints to help us understand what happened."

Using this method, the team found surprising information.

"We expected to see some changes in the shells' composition, but we were surprised by how quickly the changes occurred," Linzmeier said. "We also were surprised that we didn't see more change associated with the extinction horizon itself."

A future warning

The researchers said that understanding how the Earth responded to past extreme warming and CO2 input can help us prepare for how the planet will respond to current, human-caused climate change.

"To some degree, we think that ancient ocean acidification events are good analogs for what's happening now with anthropogenic CO2 emissions," Jacobson said. "Perhaps we can use this work as a tool to better predict what might happen in the future. We can't ignore the rock record. The Earth system is sensitive to large and rapid additions of CO2. Current emissions will have environmental consequences."

Credit: 
Northwestern University