
Archaeological assessment reveals Earth's early transformation through land use

image: UMBC Logo

Image: 
UMBC

BALTIMORE, Md. - Humans have been transforming Earth's ecology for thousands of years, far longer than Earth scientists previously recognized. In work published on August 30, 2019 in Science, a global collaboration among more than 250 archaeologists, the ArchaeoGLOBE project, reveals the deep roots of Earth's reshaping into the human planet of the Anthropocene.

By hunting and foraging, farming and urbanizing Earth's land over the past 10,000 years, human societies drove
species extinct, deforested the planet, tilled up Earth's soils, and released carbon into the atmosphere, creating
the planet as we know it today. The earlier start times for land use revealed by this study have profound
implications for understanding and modelling contemporary global changes in land use, climate and
biodiversity around the world.

The project was developed as an extension of UMBC's NSF-funded GLOBE project, with the goal of engaging the local expertise of archaeologists around the world to create a global assessment of land use changes since the last ice age. Led by Erle C. Ellis, Professor of Geography and Environmental Systems at UMBC, and Lucas Stephens, together with archaeologists at Max Planck in Jena, the Smithsonian, University College London, the University of Washington, Arizona State University, and many other institutions, the work represents the first global assessment of land use by archaeologists - a collaborative achievement in itself. Three undergraduate research interns at UMBC - Alexa Thornton (Environmental Science and Geography - BS Cum Laude 2018), Santiago Munevar Garcia (Environmental Science and Geography - BS Magna Cum Laude 2018) and Jeremy Powell (Geography - BS 2018) - provided critical assistance with geospatial analysis and mapping of the data set.

"Archaeologists have unrivaled tools for investigating human changes in environments over the long term. The
ArchaeoGLOBE project brought this expertise together at global scale to understand the global environmental
changes caused by humans. Our hope is that this is only the first achievement of what will become a new,
"massively collaborative" scientific approach to understanding the global environmental changes caused by
humans over the long term."

Credit: 
University of Maryland Baltimore County

Moving faster in a crowd

image: Molecular model of the crowded interior of a bacterial cell. New research shows that particles can move more quickly through crowds if the crowding molecules are non-uniformly distributed.

Image: 
Adrian H Elcock, CC BY 2.0 (https://creativecommons.org/licenses/by/2.0/legalcode)

Cell particles move more quickly through a crowded cellular environment when the crowding molecules are non-uniformly distributed. New research also shows that particle transport in crowded cells can actually be faster than movement in a non-crowded environment as long as the particles are moving from densely crowded areas to less crowded areas. Understanding the rate at which particles move in these environments can help researchers to better understand cellular processes that require multiple molecules to "find" each other in the crowded environment of the cell. A paper describing the research, by a team of Penn State scientists, appears online in the journal ACS Nano.

"Crowding is common in living systems at different length scales, from busy hallways down to dense cellular cytoplasm," said Ayusman Sen, Verne M. Willaman Professor of Chemistry and Distinguished Professor of Chemistry and Chemical Engineering at Penn State and one of the leaders of the research team. "The insides of cells are very, very crowded with proteins, macromolecules and organelles. Molecules that are involved in chemical reactions required by the cell must be transported through this crowded, viscous environment to find their partner reagents. If the environment is uniformly crowded, movement slows, but we know that the inside of a cell is non-uniform; there are gradients of macromolecules and other species. So, we were interested in how these gradients would influence transport at the nanoscale."

The researchers compared the movement of various "tracer" colloids--insoluble particles suspended in a liquid--through different environments using microfluidics. A microfluidic device can be filled with different solutions in which the researchers establish gradients--from high to low--of "crowder" macro-molecules in the fluid. The tracers, which can be large or small, hard or soft and deformable, are fluorescently labeled allowing the researchers to track their movement with a confocal microscope.

"We were surprised to see that the tracers moved faster in gradients of crowders than they did through a fluid with no crowders at all," said Farzad Mohajerani, a graduate student in chemical engineering at Penn State and co-first author of the paper. "We think that the densely packed crowders actually put a pressure on the tracers to force them toward less dense areas. Large tracer molecules moved faster than small ones, and soft, deformable tracers moved faster than hard ones."

"The soft, deformable tracers are better representatives of actual species moving around in cells," said Matthew Collins, a graduate student in chemistry at Penn State and co-first author of the paper. "We think that they can move faster because, unlike hard particles, they can squeeze through tighter areas."

"Our experiments and model not only show that molecules can move faster through gradients of macromolecular crowding, we think that these rates of movement may increase further inside actual living cells where other active moving molecules could increase the crowding pressure," said Sen.

Credit: 
Penn State

Eliminating visual stimulation may help counter symptoms of spatial neglect after stroke

image: Dr. Chen is senior research scientist in the Center for Stroke Rehabilitation Research at Kessler Foundation.

Image: 
Kessler Foundation/Jody Banks

East Hanover, NJ. August 30, 2019. A recent report by Kessler researchers and clinicians described the effects of binocular occlusion in a patient with spatial neglect and severe posture impairment. The article, "Impact of eliminating visual input on sitting posture and head position in a patient with spatial neglect following cerebral hemorrhage: a case report," was epublished on July 18, 2019 by Physiotherapy Theory and Practice (DOI: 10.1080/09593985.2019.1645252). The authors are Peii Chen, PhD, of Kessler Foundation, Shannon E. Motisi, DPT, Christina Cording, DPT, Irene Ward, DPT, and Neil N. Jasey, MD, of Kessler Institute for Rehabilitation.

Complications of the hemorrhagic stroke of the right basal ganglia and surrounding area included marked right gaze, severe postural asymmetry, and inadequate responses to multi-step commands. These symptoms limited the ability of the 53-year-old woman to participate in therapy, and hindered the use of conventional paper-based tools for evaluating her for spatial neglect. Under these circumstances, the diagnosis of spatial neglect was based on the Kessler Foundation Neglect Assessment Process (KF-NAP®), i.e., observation of her performance of daily life activities.

The team found evidence in the literature indicating that some individuals performed better on tests of spatial neglect when visual stimuli were eliminated. They hypothesized that eliminating visual input would increase spatial awareness on the contralesional side, which, in turn, would reduce the rightward deviations in gaze and posture. Improving posture and gaze could improve the patient's ability to participate effectively in her rehabilitation.

To eliminate visual stimuli, the patient was blindfolded during therapy sessions, and the stroke team evaluated her behavior with and without binocular occlusion. They recorded three outcome measures: head position, weight distribution, and the degree of contact between buttocks and therapy mat (buttock-mat contact). These measurements were taken at baseline, during binocular occlusion, immediately after removal of the blindfold, and three minutes after removal.

All three measures showed immediate improvement during binocular occlusion, with return to baseline at three minutes after the intervention. "We documented that while the patient was blindfolded, she sat up straighter, and her posture was more centered," said Dr. Chen, senior research scientist in the Center for Stroke Rehabilitation Research. "It is unclear why visual stimuli were such a strong factor in this patient's symptoms of spatial neglect," she said.

Further research is needed to investigate the usefulness of this type of intervention in rehabilitative care. "It is important to look at ways to support upright and centered posture in our patients recovering from stroke," she noted, "because posture and positioning are essential to participating in many therapies and in performing activities of daily life."

Credit: 
Kessler Foundation

Common stomach bacteria is attracted to bleach

The widespread stomach pathogen Helicobacter pylori is attracted to bleach, according to new research by Arden Perkins of the University of Oregon in Eugene, Oregon, and colleagues. The findings were published on August 29 in the open-access journal PLOS Biology.

Helicobacter pylori (H. pylori) is found in the stomachs of about 50 percent of all humans. While most people show no symptoms, the bacterium can cause gastritis, ulcers, and even stomach cancer. Previous research has shown that an H. pylori protein known as transducer-like protein D (TlpD) is required for the bacterium to efficiently colonize the stomach, but the molecular function of TlpD has been unclear.

The authors of the new study hypothesized that TlpD might respond to hypochlorous acid (HOCl), commonly known as bleach. At sites of inflammation in the stomach, certain cells naturally produce bleach as an antimicrobial defense, but some microbes, including H. pylori, hijack the system by strategically navigating to these sites of inflammation and persisting there.

To examine how bleach might affect TlpD and influence H. pylori navigation, the researchers first performed a series of biochemical experiments outside of living cells. They found that bleach causes a specific change in the molecular shape of TlpD, and this shape change appears to serve as a signal that is transmitted to other proteins involved in H. pylori navigation.

Additional experiments with H. pylori cells showed that the bacterium is indeed attracted to bleach, and that a functional TlpD protein is required for this attraction. The researchers also found that H. pylori is highly resistant to bleach.

These findings suggest that bleach-sensing by TlpD enables H. pylori to turn the body's defenses to its own advantage by helping it to seek out inflamed stomach tissue for colonization. Further research will be needed to clarify the role of bleach in H. pylori infection.

Credit: 
PLOS

USF-led team deciphers sea level rise from the last time Earth's CO2 set record highs

image: Researchers study the levels of phreatic overgrowths on speleothems in the Teatro Room, Artà Cave

Image: 
A. Merino

TAMPA, Fla. (Aug. 30, 2019) - An international team of scientists has discovered evidence in the geological formations of a coastal cave showing that more than three million years ago - a time when the Earth was two to three degrees warmer than in the pre-industrial era - sea level was as much as 16 meters higher than it is today. Their findings have significant implications for understanding and predicting the pace of present-day sea level rise amid a warming climate.

The scientists from the University of South Florida, the University of New Mexico, Universitat de les Illes Balears and Columbia University published their findings in today's edition of the journal Nature. Their analysis of deposits from Artà Cave on the island of Mallorca in the western Mediterranean Sea serves as a target for future studies of ice sheet stability, ice sheet model calibrations and projections of future sea level rise, the scientists said.

"We can use knowledge gained from past warm periods to tune ice sheet models that are then used to predict future ice sheet response to current global warming," said USF Department of Geosciences Professor Bogdan Onac.

Sea level rises as a result of melting ice sheets, such as those that cover Greenland and Antarctica. However, how much and how fast sea level will rise during warming is a question scientists are still working to answer. Reconstructing ice sheet and sea-level changes during past periods when the climate was naturally warmer than today provides an Earth-scale laboratory experiment for studying this question, said USF PhD student Oana Dumitru, the lead author.

The project focused on cave deposits known as phreatic overgrowths on speleothems. These deposits formed in coastal caves at the interface between brackish water and cave air each time the ancient caves were flooded by rising sea levels. In Artà Cave, which lies within 100 meters of the coast, the water table is - and was in the past - coincident with sea level, said Professor Joan J. Fornós of Universitat de les Illes Balears.

The scientists discovered, analyzed, and interpreted six of the geologic formations found at elevations of 22.5 to 32 meters above present sea level. Careful sampling and laboratory analyses of 70 samples resulted in ages ranging from 4.4 to 3.3 million years old, indicating that the cave deposits formed during the Pliocene epoch.

Sea level changes at Artà Cave can be caused by the melting and growth of ice sheets or by uplift or subsidence of the island itself, said Jacky Austermann, an assistant professor at the Lamont-Doherty Earth Observatory of Columbia University and a member of the research team. She used numerical and statistical models to carefully analyze how much uplift or subsidence might have occurred since the Pliocene and subtracted this from the elevation of the formations the team investigated.
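
As a rough illustration of that subtraction (the study itself uses far more sophisticated geophysical and statistical modelling), the sketch below propagates assumed elevation and uplift uncertainties through the simple difference; every number is a placeholder, not a value from the paper.

```python
# Back-of-the-envelope illustration of correcting a cave-deposit elevation for land
# uplift to estimate past sea level. Values are placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(1)

elevation_m = 30.0      # present-day elevation of a deposit above sea level (assumed)
elevation_sigma = 0.5   # measurement uncertainty (assumed)
uplift_m = 8.0          # modeled uplift of the island since deposition (assumed)
uplift_sigma = 3.0      # model uncertainty on the uplift (assumed)

# Simple Monte Carlo propagation of the two uncertainties.
samples = (rng.normal(elevation_m, elevation_sigma, 100_000)
           - rng.normal(uplift_m, uplift_sigma, 100_000))

lo, mid, hi = np.percentile(samples, [16, 50, 84])
print(f"estimated past sea level: {mid:.1f} m (68% range {lo:.1f}-{hi:.1f} m)")
```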

One key interval of particular interest during the Pliocene is the mid-Piacenzian Warm Period - some 3.264 to 3.025 million years ago - when temperatures were 2 to 3°C higher than pre-industrial levels. The interval also marks the last time the Earth's atmospheric CO2 was as high as today, providing important clues about what the future holds in the face of current anthropogenic warming, Onac said.

This study found that during this period, global mean sea level was as high as 16.2 meters (with an uncertainty range of 5.6 to 19.2 meters) above present. This means that even if atmospheric CO2 stabilizes around current levels, the global mean sea level would still likely rise at least that high, if not higher, the scientists concluded. In fact, it is likely to rise higher because of the increase in the volume of the oceans due to rising temperature. The authors acknowledge that this sea level rise would not happen overnight but it would take hundreds to thousands of years to melt such large amounts of ice.

Considering the present-day melt patterns, this extent of sea level rise would most likely be caused by a collapse of both Greenland and the West Antarctic ice sheets, Dumitru said.

The authors also estimated that sea level was 23.5 meters higher than present about four million years ago during the Pliocene Climatic Optimum, when global mean temperatures were up to 4°C higher than pre-industrial levels.

Credit: 
University of South Florida

Emotion recognition deficits impede community integration after traumatic brain injury

image: Dr. Helen Genova is assistant director of the Center for Neuropsychology and Neuroscience Research at Kessler Foundation.

Image: 
Kessler Foundation/Jody Banks

East Hanover, NJ. August 30, 2019. Kessler Foundation researchers have found a correlation between deficits in facial emotion recognition and poor community integration in individuals with moderate to severe traumatic brain injury. Their findings have implications for the development of rehabilitative interventions to reduce social isolation in this population, improve outcomes, and increase quality of life.

The article, "Community integration in traumatic brain injury: The contributing factor of affect recognition deficits," was epublished ahead of print on June 10 in the Journal of the International Neuropsychological Society (doi: 10.1017/S1355617719000559) by Cambridge University Press. The authors are Allison S. Binder of Goodwill Industries of Central Texas, Austin, TX, and Kate Lancaster, PhD, Jeannie Lengenfelder, PhD, Nancy Chiaravalloti, PhD, and Helen Genova, PhD, of Kessler Foundation.

Among people with moderate to severe traumatic brain injury, social isolation is prevalent and contributes to poor rehabilitation outcomes. Social isolation manifests as a lack of community integration, which comprises the home, social settings, and educational and employment settings. Despite the importance of community integration to individuals and their families, the barriers and facilitators to community integration are poorly understood, and targeted interventions are needed. One potential barrier to community integration is impairment in the ability to accurately identify facial emotions, a deficit that leads to difficulties in social interactions.

This study compared two groups of participants -- 27 with moderate to severe traumatic brain injury and 30 healthy controls. All participants completed a questionnaire assessing community integration and two tests of facial emotion recognition. The TBI group reported lower levels of community integration compared to the healthy control group. Importantly, those individuals who had lower performance on the facial emotion recognition task displayed reduced integration into the community.

"Our findings suggest that deficits in facial emotion recognition may contribution to the social isolation experienced by so many people with traumatic brain injury," said Dr. Genova, assistant director of the Center for Neuropsychology and Neuroscience Research. "By incorporating appropriate interventions to improve facial emotion recognition into rehabilitative care, we may see improvement in community integration, and increases in quality of life for both individuals and their caregivers," she concluded.

Credit: 
Kessler Foundation

Many who die waiting for a kidney had multiple offers, new study finds

Patients who die waiting for a kidney, or who are removed from the transplant waitlist for poor health, are usually considered unfortunate victims of the ever-growing shortage of available organs.

Yet a new study has found that most candidates have had multiple opportunities to receive a transplant, but the offered organs were declined by their transplant team and subsequently transplanted in someone lower on the waitlist.

Among the candidates who received at least one offer during the study period, nearly one-third (approximately 10,000 people per year) died or were removed from the list without receiving a transplant.

Candidates who died without a transplant received a median of 16 offers (over 651 days) while waitlisted.

"Presumably, these offers were declined primarily because centers were expecting patients to get a better offer in a timely manner," says study leader Sumit Mohan, MD, associate professor of medicine at Columbia University Vagelos College of Physicians and Surgeons and of epidemiology at the Mailman School of Public Health.

"In some cases, a decline may have been the right decision, but our data suggest that many others probably would have been better served if their transplant center had accepted one of the offers."

76% of candidates receive at least one offer

With colleagues from Columbia University Irving Medical Center and Emory University, Mohan examined all 14 million kidney offers made between 2008 and 2015 to more than 350,000 waitlisted patients in the United States. The data came from the United Network for Organ Sharing.

The analysis revealed that most waitlisted patients--76%--received at least one viable offer of a kidney.

Kidney offers that were automatically declined due to a center's minimum acceptance criteria were counted as viable offers, but offers of kidneys that were eventually discarded and never transplanted were not.
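
As a sketch of what that counting rule looks like in practice, the snippet below filters a toy offers table with hypothetical column names; it is not the authors' actual UNOS data-processing code.

```python
# Hypothetical illustration of the viability rule described above, using made-up
# column names; this is not the study's actual UNOS data-processing code.
import pandas as pd

offers = pd.DataFrame({
    "candidate_id":     [1, 1, 2, 2, 3],
    "auto_declined":    [True, False, False, True, False],   # declined by a center's screening criteria
    "kidney_discarded": [False, False, True, False, False],  # organ never transplanted in anyone
})

# Offers auto-declined by a center's minimum criteria still count as viable,
# but offers of kidneys that were ultimately discarded do not.
viable = offers[~offers["kidney_discarded"]]

print(viable.groupby("candidate_id").size())  # viable offers per candidate
```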

Each year, thousands receive multiple offers, but no transplant

Of the 280,041 patients who received at least one offer, 30% (approximately 85,000 people) either died on the waitlist or were removed from the waitlist before receiving a kidney.

For candidates who received at least one offer but died without a transplant, the first offer arrived a median of just 78 days after a candidate joined the list.

The vast majority of organs, 84%, were declined at least once, including organs that appeared to be an ideal immunological match.

Concern about organ quality was the reason most often provided by the transplant team for declining the kidney. "But clearly these organs were transplantable, because all of them were eventually transplanted," Mohan says. "We know 93% of transplanted kidneys are still working after one year and 75% are still working after five years, which calls into question the validity of these decisions to decline offers of a kidney."

Patients unaware of declined offers

For the most part, Mohan says, patients are unaware of declined offers.

The transplant team has just 60 minutes to accept or decline an organ offer, and often patients are not told when an organ is declined on their behalf, even after the fact.

"While the time constraints preclude real time shared decision-making, making patients aware of these organ offers subsequently will potentially improve patient engagement resulting in a process that prioritizes stated patient preference of shorter time to transplant," Mohan says.

"It's better to get a less-than-perfect kidney sooner than to wait years for the perfect kidney to come along. Better communication between patients and transplant centers may prompt a reconsideration of how and when to decline offers."

Executive order

"The vast majority of kidneys are not going to the first matched candidate on the list," Mohan says.

"The current system, which allows centers to decline offers without patient involvement or awareness, appears to make an otherwise objective allocation system more subjective than intended."

The recent executive order directing the Secretary of Health and Human Services to improve the allocation and utilization of deceased-donor kidneys may create an opportunity for change.

Credit: 
Columbia University Irving Medical Center

Providing more testing choices does not increase colorectal cancer screening rates

Offering patients the choice between home screening or in-office colonoscopy does not increase participation in colorectal cancer screening, according to a new Penn Medicine study. However, the framing of choice did impact patient decision-making, as the proportion of colonoscopies -- the gold standard for colorectal cancer screening -- fell when the home screening option was presented as an available option. This study was published in JAMA Network Open.

"As clinicians, we should think carefully about the choices that we offer to patients: While they're well-meaning and seemingly more patient-centered, choices may actually be overwhelming and could impede decision-making," said the study's lead author, Shivan Mehta, MD, MBA, associate chief innovation officer at Penn Medicine and an assistant professor of Medicine. "It is important for us to simplify choices as much as possible, but also think about how we frame them."

One in three people in the United States are not up to date on their screening for colorectal cancer, the second-deadliest cancer in the United States, so doctors and researchers like Mehta and his team are working on ways to make tests more widespread and/or easier to complete. For this study, they explored whether offering fecal immunochemical testing (FIT) -- a stool test that can be completed at home and mailed to a lab -- as an alternative to colonoscopy would boost screening completion. FIT kits are often viewed as more convenient, but they need to be completed yearly to keep patients up to date. Colonoscopies are more comprehensive, allowing for the removal of potentially harmful tissue, and only need to be performed once a decade.

"We know from behavioral science that we all tend to overweigh present-time risks as compared to future benefits," Mehta explained. "In the short term, it's easier to get stool testing done, but the need to do it yearly presents more opportunities for a patient to get behind on their screening. Conversely, colonoscopies are more challenging in the short term, but they keep patients up-to-date longer."

A group of 438 patients overdue for screening was divided equally into three study arms. Each patient received a letter from their primary care physician about the benefits of screening. The first group also received a phone number to call to schedule a colonoscopy. If they didn't schedule within four weeks, they got a follow-up letter with the same information.

Patients in the second group received a number they could call for scheduling a colonoscopy, in addition to the letter. But if they, too, didn't schedule one within four weeks, they were then mailed a reminder letter along with a FIT kit (with instructions and a stamped envelope with which to return it).

Finally, patients in the third arm received the colonoscopy scheduling number and the FIT kit immediately. If neither screening was completed within four weeks, they then got a letter with information about both colonoscopy scheduling and the FIT kit.

The study showed that colonoscopy popularity fell as FIT kits became more readily available, with colonoscopies being used in 90 percent of the completed screenings in the first arm, 52 percent in the second, and just 38 percent in the third. However, overall screening rates did not vary significantly, with each group having roughly the same numbers.
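
For readers curious how such a comparison of overall screening rates might be run, here is an illustrative sketch using SciPy's chi-square test on arm-level counts; the completion numbers are placeholders, not the study's data.

```python
# Illustrative only: how one might compare completion rates across the three arms.
# The counts below are placeholders, NOT the study's results.
from scipy.stats import chi2_contingency

# rows = arms, columns = [completed screening, did not complete]
table = [
    [20, 126],   # arm 1: colonoscopy outreach only (hypothetical counts)
    [22, 124],   # arm 2: colonoscopy first, FIT mailed after 4 weeks (hypothetical)
    [23, 123],   # arm 3: colonoscopy and FIT offered together (hypothetical)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# A large p-value would be consistent with the paper's finding that overall
# screening rates did not differ meaningfully across the three arms.
```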

"One interpretation of our results is that any of these strategies can be deployed by health systems with reasonable effectiveness," Mehta said.

Moving forward, Mehta said he would like to examine the long-term effects of these choices with more participants, as there may be small but significant differences in response rate. In particular, he'd like to examine variations of the sequential choice option -- the second arm that offered colonoscopy information before mailing a FIT kit four weeks later.

"Specifically, we would like to explore how long we should wait before we offer stool testing when patients do not participate in colonoscopy," Mehta said. "This may offer a clue as to whether there is a better timing option that might increase screening rates while accounting for the need to repeat stool testing annually."

Credit: 
University of Pennsylvania School of Medicine

Defrosting surfaces in seconds

image: This image shows (a) a thin layer of ITO coating applied to the substrate to be de-iced; (b) the ITO heats up as current is applied, and water melts at the interface, allowing the ice to slide down under gravity; (c) time-lapse images during de-icing.

Image: 
Nenad Miljkovic

WASHINGTON, D.C., August 30, 2019 -- In the future, a delayed flight due to ice will be no cause for a meltdown.

A group of researchers at the University of Illinois at Urbana-Champaign and Kyushu University has developed a way to remove ice and frost from surfaces extremely efficiently, using less than 1% of the energy and less than 0.01% of the time needed for traditional defrosting methods.

The group describes the method in Applied Physics Letters, from AIP Publishing. Instead of conventional defrosting, which melts all the ice or frost from the top layer down, the researchers established a technique that melts the ice where the surface and the ice meet, so the ice can simply slide off.

"The work was motivated by the large energy efficiency losses of building energy systems and refrigeration systems due to the need to do intermittent defrosting. The systems must be shut down, the working fluid is heated up, then it needs to be cooled down again," said author Nenad Miljkovic, at UIUC. "This eats up a lot of energy when you think of the yearly operational costs of running intermittent defrosting cycles."

According to the authors, the biggest source of inefficiency in conventional systems is that much of the energy used for de-icing goes into heating other components of the system rather than directly heating the frost or ice. This increases energy consumption and system downtime.

Instead, the researchers proposed delivering a pulse of very high current where the ice and the surface meet to create a layer of water. To ensure the pulse reaches the intended space rather than melting the exposed ice, the researchers apply a thin coating of indium tin oxide (ITO) -- a conductive film often used for defrosting -- to the surface of the material. Then, they leave the rest to gravity.
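
A back-of-the-envelope calculation shows why heating only the interface is so much cheaper than melting the whole layer. The sketch below uses assumed ice and water-film thicknesses, not values from the paper, and ignores sensible heating for simplicity.

```python
# Rough illustration of why interfacial melting is so much cheaper than bulk melting.
# All thicknesses are assumed for illustration; they are not values from the paper.
RHO_ICE = 917.0        # kg/m^3
LATENT_FUSION = 334e3  # J/kg
AREA = 1.0             # m^2 of iced surface

ice_thickness = 2e-3       # 2 mm of frost/ice to remove (assumed)
interface_layer = 10e-6    # ~10 micron lubricating water film at the interface (assumed)

def melt_energy(thickness_m):
    """Energy to melt a slab of ice of the given thickness over AREA (ignores sensible heat)."""
    return RHO_ICE * AREA * thickness_m * LATENT_FUSION

bulk = melt_energy(ice_thickness)
interfacial = melt_energy(interface_layer)

print(f"melt everything:     {bulk/1e3:.1f} kJ")
print(f"melt interface only: {interfacial/1e3:.2f} kJ")
print(f"interfacial / bulk:  {interfacial/bulk:.1%}")
# With these assumed thicknesses the interfacial approach needs well under 1% of the
# energy of melting the full ice layer, consistent with the efficiency the authors report.
```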

To test this, the scientists defrosted a small glass surface cooled to minus 15.1 degrees Celsius -- about as cold as the warmest parts of Antarctica -- and to minus 71 degrees Celsius -- colder than the coldest parts of Antarctica. These temperatures were chosen to model heating, ventilation, air conditioning and refrigeration applications and aerospace applications, respectively. In all tests, the ice was removed with a pulse lasting less than one second.

In a real, 3D system, gravity would be assisted by air flow. "At scale, it all depends on the geometry," Miljkovic said. "However, the efficiency of this approach should definitely still be much better than conventional approaches."

The group hasn't studied more complicated surfaces like airplanes yet, but they think it's an obvious future step.

"They are a natural extension as they travel fast, so the shear forces on the ice are large, meaning only a very thin layer at the interface needs to be melted in order to remove the ice," Miljkovic said. "Work would be needed to figure out how we can coat curved components conformally with the ITO and to figure out how much energy we would need."

The researchers hope to work with external companies on scaling up their approach for commercialization.

Credit: 
American Institute of Physics

New insights into how diet & medication impact the influence of gut bacteria on our health

Research published in Cell on 29th August by the groups of Filipe Cabreiro from the MRC London Institute of Medical Sciences and Imperial College and Christoph Kaleta from Kiel University in Germany has demonstrated that diet can alter the effectiveness of a type-2 diabetes drug via its action on gut bacteria.

Bacteria that reside in our gastrointestinal tract, collectively referred to as the gut microbiome, produce numerous molecules capable of influencing health and disease. The function of the gut microbiome is known to be regulated by both diet and drugs such as metformin, which is used to treat type-2 diabetes and has been shown to extend the lifespan of several organisms. However, understanding the complicated, multi-directional relationships between diet, drugs and the gut microbiome represents a considerable challenge. "Disentangling this network of interactions is of utmost importance since the specific mechanism of action of metformin is still unclear," says Filipe Cabreiro.

A new screening technique

Cabreiro and his team developed an innovative four-way high-throughput screening technique to better understand how diet, drugs and the gut microbiome interact to influence host physiology. They used the nematode worm C. elegans colonized with the human gut bacteria E. coli as a simplified host-microbiome model and exposed it to metformin in the presence of hundreds of different nutritional compounds.

They found that metformin treatment altered the metabolism and lifespan of the C. elegans host and that these effects could be either enhanced or suppressed by specific nutrients. Crucially, it was revealed that the gut bacteria played a key role in mediating this phenomenon.

The importance of diet and gut bacteria explains why metformin was previously shown to have no effect on the lifespan of another commonly studied organism, the fruit fly. Helena Cochemé, who collaborated on this study, says: "As it turned out, the typical laboratory food of fruit flies is rich in sugars. After taking away the sugar, we also saw positive effects of metformin in fruit flies colonized with E. coli."

Bacterial nutrient signaling is a central modulator of microbe-host-drug interactions

Further analysis revealed that bacteria possess a sophisticated mechanism that enables them to coordinate nutritional and metformin signals and to rewire their own metabolism accordingly. As a result of this adaptation, the bacteria accumulate a metabolite called agmatine, which was shown to be required for the positive effects of metformin on host health.

What about humans?

Cabreiro collaborated with Christoph Kaleta from Kiel University to investigate whether the results found in C. elegans could also be observed in the more complex microbiota of humans. They analysed data related to the microbiome, nutrition and medication status of a large cohort of type-2 diabetic patients and healthy controls. "Intriguingly, we found that metformin treatment was strongly associated with an increased capacity for bacterial agmatine production," says Kaleta. Importantly, they could reproduce their findings in several independent cohorts of type-2 diabetic patients across Europe. Moreover, the bacterial species found to be major producers of agmatine were those known to be increased in the gut microbiome of metformin-treated type-2 diabetic patients.

Implications for metformin treatment

"Our results shed light on how the complex network of interactions between diet, microbiota and host impacts the efficacy of drugs," says Cabreiro. "With our high-throughput screening approach we now finally have a tool at hand that allows us to tackle this complexity". The findings of this study may help to inform dietary guidelines or the development of genetically engineered bacteria that could be used to enhance the beneficial effects of metformin. They may also provide a valuable insight into the evidence that suggests that metformin-treated type-2 diabetic patients are healthier and live longer than non-diabetic individuals.

Credit: 
Medical Research Council (MRC) Laboratory of Medical Sciences

Giving people a 'digital identity' could leave them vulnerable to discrimination, experts warn

Global efforts to give a digital identity to millions of people who lack key paper documents, such as birth certificates, could leave them vulnerable to persecution or discrimination, a new study warns.

Work is underway to use digital technology so refugees and others lacking vital legal papers can have access to services such as health and education. But this could also provide a new way for ethnic minorities to be discriminated against and marginalised by officials and governments if safeguards are not in place, according to new research.

The World Bank estimates that over one billion people currently lack official identity documents - either because they never had them or because they have lost them - and the United Nations Sustainable Development Goals include the aim of providing legal identity for all by 2030.

Without identity documents people can have difficulty accessing many basic services including healthcare, social protection, banking or education. Asylum-seekers without documentary evidence of their identity and age may incur significant problems in acquiring legal status in a host country.

However, the University of Exeter Law School study warns digital identity could allow for more "efficient" ways to discriminate against highly persecuted groups of people such as the Rohingya minority in Myanmar, as the technology would make their ethnic minority status more visible.

Dr Ana Beduschi, who led the research, said: "Technology alone cannot protect human rights or prevent discrimination. Depending on how digital identity technologies are designed and used, they may also hinder the rights of those that they intend to benefit. Having a digital identity may make people without legal documentation more visible and therefore less vulnerable to abuse and exploitation. However, it may also present a risk for their safety. If the information falls into the wrong hands, it may facilitate persecution by authorities targeting individuals based on their ethnicity.

"Giving people a digital identity will only help protect their human rights if those who provide it mitigate any risks of potential discrimination and promote high standards of privacy and data protection."

Governments and initiatives run by public-private partnerships are using technology such as blockchain and biometric data from fingerprints or iris scans to provide official identification, to control and secure external borders, and to distribute humanitarian aid to populations in need.

Data stored in the blockchain is encrypted and recorded in a way which makes it challenging to delete or tamper with. Those who support the use of blockchain say this means people can own and control their own digital identity, and can then decide with whom and when to share information contained in their 'digital wallet'.
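
The tamper-evidence comes from each block committing to the hash of the one before it. The toy Python hash chain below illustrates only that linking idea; real digital-identity systems add encryption, signatures and consensus on top.

```python
# Toy hash chain illustrating tamper-evidence; real blockchain identity systems are
# far more elaborate, but the linking idea is the same.
import hashlib
import json

def block_hash(record, prev_hash):
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["birth record", "school enrolment", "vaccination record"])
print("chain valid:", verify(chain))         # True

chain[0]["record"] = "altered birth record"  # tamper with an early entry
print("after tampering:", verify(chain))     # False: the stored hashes no longer match
```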

These schemes are not just used for those who are stateless or lacking papers. Countries such as Estonia and India already offer citizens the chance to have a digital identity, while Australia, Canada and the UK are currently exploring ways to do this.

Digital identity technology could also lead to indirect discrimination. For example, biometric data collected from older individuals are often of poorer quality, so relying solely on this information could create obstacles for them in joining and using digital identity programmes and accessing services.

The study warns that governments must not use digital identity information for unlawful surveillance. Any systems in place should comply with international human rights law and have safeguards built into domestic law on data storage, duration, usage, destruction and third-party access, as well as guarantees against arbitrariness and abuse.

Credit: 
University of Exeter

Skin cancer risk in athletes: The dangers of ultraviolet radiation

The dangers of ultraviolet radiation exposure, which most often comes from the sun, are well known. Speaking at The Physiological Society's Extreme Environmental Physiology conference next week, W. Larry Kenney of Penn State University will discuss how broad its effects can be, from premature aging to cancer, and how they can be influenced by different skin tones and the use of sunscreen.

Athletes ranging from hikers to tennis players and runners exceed the recommended ultraviolet exposure limit by up to eight-fold during the summer and autumn months. While regular physical activity is associated with a reduced risk of most cancers, skin cancer is an exception. For malignant skin cancer, those in the 90th percentile for physical activity have a higher risk than those in the 10th percentile. Sun protection in these groups is especially important, as multiple studies demonstrate an elevated risk of skin cancer for those who regularly participate in outdoor sports or exercise.

The ultraviolet radiation spectrum is categorized by wavelength as UV-A (320-400 nm), UV-B (290-320 nm), and UV-C (200-290 nm), and the biological effects vary by type. UV-A constitutes around 95% of the ultraviolet radiation that reaches the Earth's surface, with the remainder being UV-B. In the skin, UV-A can reach the blood circulation, but most UV-B is absorbed in the outer layers of the skin (the epidermis and upper dermis) because of its shorter wavelengths.
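
For concreteness, the quoted wavelength bands can be written as a small helper; this is only an illustration of the ranges above, nothing more.

```python
# Tiny helper encoding the UV bands quoted above (wavelengths in nanometres).
def uv_band(wavelength_nm):
    if 320 <= wavelength_nm <= 400:
        return "UV-A"
    if 290 <= wavelength_nm < 320:
        return "UV-B"
    if 200 <= wavelength_nm < 290:
        return "UV-C"
    return "outside the UV range discussed here"

for wl in (250, 300, 365):
    print(wl, "nm ->", uv_band(wl))
```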

Skin pigmentation is another factor that affects our response to sun exposure. UV radiation affects the body's ability to create two important substances, vitamin D and folate, which contribute to both a healthy pregnancy and early childhood development. UV exposure helps vitamin D to be synthesised, whereas it causes folate to break down.

There is a theory that suggests that early human populations, living in equatorial Africa, evolved skin pigmentation to protect themselves from folate degradation. This theory also says that depigmentation then occurred as humans moved away from the equator to allow for higher levels of vitamin D synthesis.

Commenting on his talk, Professor Kenney said:

"Sun protection in athletes is especially important as multiple studies demonstrate an elevated risk of skin cancer for those who regularly participate in outdoor sports or exercise. Surprisingly, fewer than 25% of surveyed athletes reported regular use of sunscreen, so there is clearly more awareness-raising that needs to be done."

Credit: 
The Physiological Society

Creation of new brain cells plays an underappreciated role in Alzheimer's disease

Much of the research on the underlying causes of Alzheimer's disease focuses on amyloid beta (Aβ), a protein that accumulates in the brain as the disease progresses. Excess Aβ proteins form clumps or "plaques" that disrupt communication between brain cells and trigger inflammation, eventually leading to widespread loss of neurons and brain tissue.

Aβ plaques will continue to be a major focus for Alzheimer's researchers. However, new work by neuroscientists at the University of Chicago looks at another process that plays an underappreciated role in the progression of the disease.

In a new study published in the Journal of Neuroscience, Sangram Sisodia, PhD, the Thomas Reynolds Sr. Family Professor of Neurosciences at UChicago, and his colleagues show how in genetic forms of Alzheimer's, a process called neurogenesis, or the creation of new brain cells, can be disrupted by the brain's own immune cells.

Some types of early onset, hereditary Alzheimer's are caused by mutations in two genes called presenilin 1 (PS1) and presenilin 2 (PS2). Previous research has shown that when healthy mice are placed into an "enriched" environment where they can exercise, play and interact with each other, they have a huge increase in new brain cells being created in the hippocampus, part of the brain that is important for memory. But when mice carrying mutations to PS1 and PS2 are placed in an enriched environment, they don't show the same increase in new brain cells. They also start to show signs of anxiety, a symptom often reported by people with early onset Alzheimer's.

This led Sisodia to think that something besides genetics had a role to play. He suspected that the process of neurogenesis in mice both with and without Alzheimer's mutations could also be influenced by other cells that interact with the newly forming brain cells.

Focus on the microglia

The researchers focused on microglia, a kind of immune cell in the brain that usually repairs synapses, destroys dying cells and clears out excess Aβ proteins. When the researchers gave the mice a drug that causes microglial cells to die, neurogenesis returned to normal. The mice with presenilin mutations were then placed into an enriched environment and they were fine; they didn't show any memory deficits or signs of anxiety, and they were creating the normal, expected number of new neurons.

"It's the most astounding result to me," Sisodia said. "Once you wipe out the microglia, all these deficits that you see in these mice with the mutations are completely restored. You get rid of one cell type, and everything is back to normal."

Sisodia thinks the microglia could be overplaying their immune system role in this case. Alzheimer's disease normally causes inflammation in the microglia, so when they encounter newly formed brain cells with presenilin mutations they may overreact and kill them off prematurely. He feels that this discovery about the microglia's role opens another important avenue toward understanding the biology of Alzheimer's disease.

"I've been studying amyloid for 30 years, but there's something else going on here, and the role of neurogenesis is really underappreciated," he said. "This is another way to understand the biology of these genes that we know significantly affect the progression of disease and loss of memory."

Credit: 
University of Chicago

Rethinking our resilience to wildfire

The 2017 wildfire season was the most extensive and expensive in U.S. history. Fires scorched 10 million acres in the western U.S. and federal fire-suppression expenditure surpassed a record $2.9 billion. There's no end to the record breaking in sight: climate change will continue to produce longer, drier fire seasons with substantial burning that will consume residential developments. In a paper released last week, researchers have found that society is doing little to adapt in the aftermath.

"Often, after a fire, a community rebuilds, allows everything to grow back and continues to function the same way," said Brian Buma, PhD, assistant professor in the department of integrative biology at University of Colorado Denver. "But humans have changed the playing field with climate change. Many times, something burns and the vegetation that comes back is coming back differently - or not at all - in the new climate. We can no longer force the systems to stay the same and we have to adapt with them. It may not be what we're used to, but it may be the new reality."

Buma, along with several biologists and sociologists across the U.S., laid out an appeal for the paradigm shift in a paper for the journal Nature Sustainability. It is an appeal not just to think about fire, but to fundamentally rethink our relationship with fire - one that, in the future, has to have fire as an integral component of the landscape (including around our houses, parks, and cities).

Promoting the right kind of resilience

Without adapting to the changing environment, Buma says our current pattern continues to put people at risk by replicating community vulnerabilities. The researchers found the key was promoting the right kind of resilience. In cases where emerging fire behavior is more or less consistent with historical patterns, supporting basic resilience (or recovery to the same state) is probably fine where natural patterns of recovery are likely sufficient and human exposure fairly low. In cases where fire behavior is somewhat new (like more frequent fires with similar intensity), adaptive planning - like reducing the flammability of houses - could reduce the potential for a future disaster.

But Buma and his associates found many cases where fire behaviors resulting from climate change are simply too novel, too intense - the ecosystem and human system simply can't adapt or recover. In that case, transformation as a path to resilience requires fundamentally altering our relationship to the changing dynamic role of fire. Instead of bouncing back, communities should think of it as "bouncing forward."

After a series of catastrophic wildfires in Santa Barbara, the community of Montecito began a partnership to acknowledge and tackle its susceptibility to wildfires. The community educated residents, removed vegetation, thinned trees, traded out flammable chaparral shrubs for grass, improved fire response through better infrastructure planning, and required defensible spaces and retrofitting around area homes. Their efforts paid off. When 2017's Thomas Fire hit, only seven houses burned, rather than the 400 to 500 homes the community was projected to lose.

Altering the social-ecological systems

Convincing people that we need to change is hard, says Buma, especially in a society where the pride of rebuilding is ingrained.

"We tend to force resilience on a system that can no longer be resilient," said Buma. "There are people who continually rebuild in flood-prone areas like the Mississippi Valley. In Colorado, we're developing neighborhoods in the fire-prone foothills of the Rocky Mountains without defensible spaces around the houses or mitigation plans in place. As a society, maybe we should fundamentally rethink how our neighborhoods interact with the inevitable future fire."

The paper outlines the ways in which we can adapt for the future. Decades of fire research have shown that communities need to reduce the woody fuel they've built up, schedule controlled burns and allow some fires to burn under moderate weather conditions. Communities also need to come together to further educate residents and mitigate other sources of fuel, even though that means intentionally altering social-ecological systems.

"It's a societal shift to get people to get behind the idea of prescribed burns around their property and the smoke that comes with it," said Buma. "But the idea is not a new one. Indigenous populations in the southwest used to set controlled fires regularly and many communities do it. Fire is here and we have to work with it."

"Even as ecologists, we have to abandon this idea of this pristine environment," said Buma. "It's okay to say that this is a managed environment and it will be different from the natural world. We began altering it when we prevented fires. We continue to alter it by warming the climate."

Credit: 
University of Colorado Denver

Amazon deforestation has a significant impact on the local climate in Brazil

The loss of forest cover in the Amazon has a significant impact on the local climate in Brazil, according to a new study.

The UN Environment Programme has warned that the Amazon wildfires threaten "...this precious natural resource..." and that the forest helps mitigate the effects of climate change.

Insight into the effects of deforestation in the Amazon - and the way it can intensify climate change, particularly at a local level - has been published open access in the journal Frontiers.

Using satellite data, Jess Baker and Professor Dominick Spracklen from the University of Leeds evaluated the climatic consequences of deforestation in the Amazon between 2001 and 2013.

They found that deforestation causes the local climate to warm - and that warming intensified as the severity of deforestation increased.

Intact forests in the region, with less than 5% canopy loss, had the most stable climate over the study period, showing only small increases in temperature.

Areas that had tree cover reduced to below 70% warmed 0.44°C more than neighbouring intact forests during the study period.

The differences between intact and disturbed forests were most pronounced during the driest part of the year, when temperature increases of up to 1.5°C were observed in areas affected by severe deforestation. This increase is additional to global temperature rises driven by climate change.
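
A stripped-down sketch of that intact-versus-disturbed comparison might look like the following; the temperature arrays are synthetic stand-ins, not the Leeds satellite analysis.

```python
# Illustrative sketch of comparing temperature changes over intact vs heavily
# deforested pixels. The arrays are synthetic; this is not the study's analysis code.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical land-surface temperature changes (deg C, 2001-2013) per pixel.
intact_dT = rng.normal(0.10, 0.05, 500)      # canopy loss < 5% (assumed values)
disturbed_dT = rng.normal(0.55, 0.15, 500)   # tree cover reduced below 70% (assumed values)

diff = disturbed_dT.mean() - intact_dT.mean()
print(f"intact pixels warmed by    {intact_dT.mean():.2f} C on average")
print(f"disturbed pixels warmed by {disturbed_dT.mean():.2f} C on average")
print(f"additional warming linked to deforestation: {diff:.2f} C")
# The study reports roughly 0.44 C of additional warming for heavily deforested areas
# relative to neighbouring intact forest; the numbers here are only placeholders.
```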

Study co-author Jess Baker from the School of Earth and Environment at Leeds said: "The Amazon wildfires have reminded us all of the important role that forests play in our global systems. But it cannot be overlooked that intact Amazon forests are also crucially important for Brazil's own local climate.

"A healthy intact Amazon forest helps regulate the local climate and can even act as a buffer to the warming effects of climate change, compared with disturbed forests."

Study co-author Dominick Spracklen, Professor of Biosphere-Atmosphere Interactions at Leeds, said: "Deforestation decreases the amount of water emitted to the atmosphere from the forest through a process called evapotranspiration.

"Evapotranspiration can be thought of as the forest 'sweating'; when the moisture emitted by the forests evaporates it cools the local climate. Deforestation reduces evapotranspiration, taking away this cooling function and causing local temperatures to rise.

"As temperatures rise this increases drought stress and makes forests more susceptible to burning."

Credit: 
University of Leeds