Culture

Drones can detect protected nightjar nests

Thermal-sensing cameras mounted on drones may offer a safer and more cost-effective way to locate nests of the elusive European nightjar in forestry work and construction areas, according to new research presented at the British Ecological Society's annual meeting in Birmingham today.

Ecologists from Cardiff University recently conducted a pilot study in Bryn, a conifer plantation in South Wales managed by Natural Resources Wales, to test the suitability of drones for detecting nest sites of the protected European nightjar.

"The current methods of searching for nightjar nests on foot are expensive and can pose a health and safety risk for people, particularly when accessing clearfell worksites", said Mike Shewring, a PhD student at Cardiff University.

"Nightjars are camouflaged to look just like a fallen log or dead wood. They nest on the ground and 'sit tight' when approached to avoid detection, which makes it nearly impossible to spot them during the day when they are inactive", he added.

To test the new method, the team used the drones to take thermal photographs at nest sites, where observations and radio tracking previously showed European nightjars were breeding between May and August. Images were taken at various heights (10, 20 and 50 metres) at dawn, midday and dusk. The nests were observed from a distance to see if the drones caused any disturbance.

From the photographs, the researchers could detect nests due to the high temperature contrast between the nightjar's body (40°C) and the colder background area. Images taken at 10 metres and during cooler times of the day (dawn and dusk) proved most useful. The known elevations at which the drones were flown allowed the team to estimate the body size of the nightjars to confirm the species.
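As an aside on the geometry: converting a warm blob's apparent size in pixels to a real-world body length only requires the flight altitude and the thermal camera's field of view. The sketch below shows the standard ground-sampling-distance calculation; the camera parameters and pixel counts are illustrative assumptions, not values reported by the study.

```python
import math

def ground_sampling_distance(altitude_m, fov_deg, image_width_px):
    """Ground width covered by one pixel (metres) for a nadir-pointing camera."""
    ground_width_m = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return ground_width_m / image_width_px

# Assumed camera: 640-pixel-wide thermal sensor with a 45-degree horizontal
# field of view, flown at the study's lowest altitude of 10 metres.
gsd = ground_sampling_distance(altitude_m=10, fov_deg=45, image_width_px=640)

# A warm blob spanning ~20 pixels then has an estimated length of:
blob_px = 20
print(f"~{blob_px * gsd * 100:.0f} cm")  # ~26 cm, in the right range for a nightjar-sized bird
```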

"Our preliminary findings demonstrate the potential of drones for surveying nightjars during their breeding season, allowing forestry managers to locate nests more accurately and plan their works adequately. This methodology could also have wider applications, since it could technically be adapted to detect any warm-blooded species", commented project supervisor Dr Robert Thomas.

All nightjars sat tight on their nests during the drone flights, as they usually do to avoid being detected by predators.

"We don't know whether the nightjars perceived the drones as a predator. This would be interesting to explore in future studies to ensure that the sight and sound of drones don't have any negative impacts on the birds' stress levels or metabolism", Shewring concluded.

Mike Shewring will present a poster on his work on Monday 17 December 2018 at the British Ecological Society annual meeting. The conference will bring together 1,200 ecologists from more than 40 countries to discuss the latest research.

Credit: 
British Ecological Society

Your co-worker's rudeness may impact how you sleep -- and thus how your partner does

Rudeness. Sarcastic comments. Demeaning language. Interrupting or talking over someone in a meeting. Workplace incivilities such as these are becoming increasingly common, and a new study from Portland State University and University of Illinois researchers found this behavior has the potential to not only negatively affect an employee's sleep but their partner's as well.

The study, recently published in the journal Occupational Health Science, builds on previous research by examining the relationship between workplace incivility -- a common stressful work event -- and employee sleep in the context of dual-earner couples. The researchers surveyed 305 couples in a variety of jobs.

Charlotte Fritz, the lead author of the study and associate professor of industrial and organizational psychology in PSU's College of Liberal Arts and Sciences, said that when one spouse experiences workplace incivility, they tend to ruminate more about work at home and report insomnia symptoms, whether trouble falling asleep or waking up in the middle of the night. But the study went a step further, examining sleep problems in the employee's spouse, and found that their sleep is also affected -- but only if the couple works in the same company or occupation.

"Because work-linked couples have a better idea of what's going on in each other's work, they can be better supporters," Fritz said. "They probably know more about the context of the incivil act and might be more pulled into the venting or problem-solving process."

Fritz recommends that organizations do everything in their power to create a culture of civility by imposing zero-tolerance policies or offering civility training. But given that workplace incivilities aren't completely avoidable, Fritz also suggests a number of strategies to help employees cope, including mentally detaching from work during non-work hours by spending time with family and friends or enjoying hobbies, and practicing meditation at work and home.

The same is true of the employee's spouse.

"Not talking about work or not supporting your spouse is not the solution," Fritz said. "They can talk about work, vent about it, discuss it, but then they should make an explicit attempt to unwind together and create good conditions for sleep."

Credit: 
Portland State University

Exploring ways to reduce child deaths in low-income countries

image: This is a mother and child leaving the hospital in Southern Mozambique.

Image: 
Marta Solano

In Mozambique, the probability of dying in the first month after hospital discharge is high, particularly for babies under three months of age, shows a study led by the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa" Foundation, in collaboration with the Manhiça Health Research Center (CISM). The study also shows that an algorithm based on a series of simple clinical parameters can identify the children at higher risk of dying, who would therefore benefit from proactive follow-up after discharge. The implementation of these models could contribute to reducing child mortality in low-income countries.

In the last 25 years, the reduction in mortality among children under five years of age has been remarkable but insufficient (50%, instead of the 75% target set by the Millennium Development Goals). In low-income countries, children are at increased risk of dying following hospitalisation, regardless of their illness, with an estimated risk ranging between 3% and 13% in the month following discharge. The challenge, therefore, is to identify the children at higher risk in order to follow them up closely after discharge, and thereby avoid a considerable number of pediatric deaths.

The research team performed a retrospective study analysing data from more than 20,000 pediatric hospital admissions over almost 20 years at the district hospital of Manhiça, in a semi-rural area of Southern Mozambique where almost half of the population is under 15 years of age. The researchers determined mortality during the first, second and third month after hospital discharge, and looked for indicators that would allow them to identify children at higher risk of dying.

"This is the largest study performed to date to evaluate mortality three months following hospital discharge in a rural area of a low-income country," explains Lola Madrid, ISGlobal researcher and first author of the study.

The results show that average mortality after discharge is high (3.6%), with half of the deaths occurring within the first 30 days. The risk is highest in babies under three months of age and decreases progressively with age. The study also identifies a series of clinical parameters (malnutrition, diarrhea, clinical pneumonia, etc.) that flag the children at highest mortality risk. Using all or some of these variables, the team developed a series of predictive models capable of identifying up to 80% of children at risk of dying after discharge.
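To make the idea concrete, a risk model of this kind can be sketched as a logistic regression over a few easy-to-collect parameters, with the highest-scoring children flagged for follow-up. Everything below -- variable names, coefficients, and data -- is a hypothetical stand-in, not the study's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic cohort with predictors of the kind named in the study:
# age in months, malnutrition (0/1), diarrhea (0/1), clinical pneumonia (0/1).
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 60, n),  # age_months
    rng.integers(0, 2, n),   # malnutrition
    rng.integers(0, 2, n),   # diarrhea
    rng.integers(0, 2, n),   # pneumonia
])
# Invented outcome: younger, sicker children die more often after discharge.
logit = -4.0 - 0.04 * X[:, 0] + 1.2 * X[:, 1] + 0.6 * X[:, 2] + 0.8 * X[:, 3]
died = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, died)
risk = model.predict_proba(X)[:, 1]

# Flag the top decile of predicted risk for proactive follow-up visits.
flagged = risk >= np.quantile(risk, 0.9)
print(f"deaths captured by follow-up list: {died[flagged].sum()} of {died.sum()}")
```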

The children thus identified could benefit from a close follow-up during the first 30 days by community health workers, or receive preventive antimicrobial therapies. "If these simple models, based on easy-to-obtain parameters like those used in our study, are validated in other contexts, they could represent a valuable tool to save neonatal and infant lives in countries with a high burden of child mortality," concludes Quique Bassat, ICREA researcher and study coordinator.

Credit: 
Barcelona Institute for Global Health (ISGlobal)

For these critically endangered marine turtles, climate change could be a knockout blow

image: Hawksbill turtle nests are under threat from rising air temperatures.

Image: 
Wikimedia Commons

TALLAHASSEE, Fla. -- Hawksbill turtles aren't the only marine turtles threatened by the destabilizing effects of climate change, but a new study from researchers at Florida State University shows that this critically endangered species could be at particular risk.

In a study published in the journal PLOS ONE, researchers from FSU's Department of Earth, Ocean and Atmospheric Science suggest that projected increases in air temperatures, rainfall inundation and blistering solar radiation could significantly reduce hawksbill hatching success at a selection of major nesting beaches.

Earth's history abounds with examples of climate shifts, but researchers say today's transforming climate, paired with unabated human development, imperils hawksbills and other marine turtles in new and alarming ways.

"Marine turtles have been around for millions of years, and during this time they have adapted to substantial climatic changes," said Assistant Professor of Oceanography Mariana Fuentes, co-author of the study. "In the past they have adapted by shifting their nesting grounds and nesting season to align with more favorable conditions. However, increasing impacts to nesting habitats from coastal construction, storms and sea level rise are jeopardizing their ability to adapt."

To evaluate climate change's effects on hawksbill hatching success, FSU researchers analyzed more than 5,000 nests from the five Brazilian beaches where a majority of the region's hawksbill nesting occurs. The team focused specifically on five climatic variables -- air temperature, rainfall, humidity, solar radiation and wind speed -- in order to render a more comprehensive model of the various and subtle effects of a changing climate on the sensitive incubation process.
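A model with that structure can be sketched as a binomial regression of hatching success on the five climatic variables. The example below uses fabricated nest data purely to show the shape of such an analysis; it is not the paper's fitted model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000  # roughly the number of nests analyzed

# Fabricated climate covariates for each nest's incubation window.
air_temp = rng.normal(28, 2, n)   # deg C
rainfall = rng.gamma(2, 40, n)    # mm
humidity = rng.normal(75, 8, n)   # %
solar = rng.normal(200, 30, n)    # W/m^2
wind = rng.normal(3, 1, n)        # m/s
X = sm.add_constant(np.column_stack([air_temp, rainfall, humidity, solar, wind]))

# Invented response: hotter, wetter, sunnier incubation lowers success.
p = 1 / (1 + np.exp(-(6 - 0.15 * air_temp - 0.005 * rainfall - 0.01 * solar)))
hatched = rng.binomial(100, p)  # hatchlings per illustrative clutch of 100 eggs

# Binomial GLM on (successes, failures) per nest.
fit = sm.GLM(np.column_stack([hatched, 100 - hatched]), X,
             family=sm.families.Binomial()).fit()
print(fit.summary())
```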

"Research is lacking on how climate change may influence hawksbills, and this population in particular," said former FSU graduate student Natalie Montero, who led the study. "We chose to study how climate change may impact hatchling production because significant changes to how many baby marine turtles are born can dramatically alter population stability."

As reptiles, marine turtles' body temperature regulation relies on external sources of heat. That makes hawksbills and their cousins especially dependent upon and responsive to air temperature. Nowhere is that responsiveness more apparent than in marine turtle nests, where extreme temperature fluctuations can influence egg incubation, dictate sex ratios and determine hatching success.

For some marine turtle species, rising temperatures may not necessarily mean less successful incubation. For example, a study from Montero and Fuentes published earlier this year revealed that, for loggerhead turtles on the temperate nesting beaches of North Florida, changing conditions could yield short-term increases in hatching success of 1 to 7.6 percent.

The outlook for the hawksbills, however, is not as rosy.

Montero and Fuentes found that rising air temperatures, accompanied by increased rainfall and solar radiation, are projected to reduce overall hatching success at the Brazilian nesting sites by up to 11 percent by the year 2100. Higher temperatures may warm nests beyond the threshold for healthy incubation, they said, and increased rainfall could saturate the soil and suffocate the embryos.

If the turtles do incubate successfully and hatch, they then have to contend with skyrocketing solar radiation, which could bake the sand and cause the nests to cave in -- a major hazard for the hatchlings as they seek the safety of the open sea.

While that may seem a dire and difficult future for a species whose numbers are already dwindling, Montero said there's still time for humans to soften the blow.

"Humans can help marine turtles in many ways," she said. "Reducing coastal construction and protecting more coastal habitat will help ensure present and future nesting habitat is available. Reducing human impacts on dune structure and beach vegetation is also important. Additionally, reducing trash and microplastics on the beach can create a higher quality nesting and incubating environment."

Credit: 
Florida State University

Stress in new mothers causes lasting health risks, depending on race, ethnicity, poverty

African-American women undergo more physical "wear-and-tear" during the first year after giving birth than Latina and white women, a consequence that may have long-lasting health effects, according to a study of a diverse group of more than 2,400 low-income women.

The study in today's (Friday, Dec. 14) American Journal of Perinatology involved women of diverse racial and ethnic backgrounds who were interviewed and evaluated at five different clinical sites in the United States.

In addition to insight into health risks facing new mothers, researchers united by the Eunice Kennedy Shriver National Institute of Child Health and Human Development's Community Child Health Network found evidence that women who breastfeed their babies may receive some protection from the health damage caused by physical stress of pregnancy and giving birth.

"All mothers are affected by stress, but low-income women and especially African-American and Hispanic women have more adverse health-risk profiles during their children's first years of life," said Sharon Landesman Ramey, a professor and distinguished research scholar at the Fralin Biomedical Research Institute at VTC and an author of the paper. "Our study was designed to look for biomarkers that are sensitive to psychological and physical stressors, and in turn determine whether those stressors contribute to poor outcomes for mothers and children."

Working groups of clinicians guided by scientists and community members recorded blood pressures, heart rates, cholesterol profiles, body mass indexes, waist-hip ratios, and other biomarkers in women six months and one year after they had given birth. In addition, women were assessed before they gave birth, some even before they became pregnant.

The readings were used to create a composite measurement of "allostatic load," which represents the cumulative physical and psychological strain on their bodies after delivery. The study may be the first to examine a large group of women at intervals over the course of a year using demographic, biometric, and biomarker measures.
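Allostatic load is typically operationalized as a count of biomarkers falling into a high-risk range. The sketch below shows that scoring approach; the cutoffs and readings are arbitrary illustrations, not the study's thresholds.

```python
# Hypothetical biomarker panel for one woman at six months postpartum.
readings = {
    "systolic_bp": 132,        # mmHg
    "diastolic_bp": 86,        # mmHg
    "pulse": 82,               # beats/min
    "total_cholesterol": 210,  # mg/dL
    "bmi": 31.5,               # kg/m^2
    "waist_hip_ratio": 0.88,
}

# Illustrative high-risk cutoffs: one point per biomarker at or above its cutoff.
cutoffs = {
    "systolic_bp": 130,
    "diastolic_bp": 85,
    "pulse": 90,
    "total_cholesterol": 200,
    "bmi": 30,
    "waist_hip_ratio": 0.85,
}

allostatic_load = sum(readings[k] >= cutoffs[k] for k in cutoffs)
print(f"allostatic load: {allostatic_load} of {len(cutoffs)}")  # 5 of 6 here
```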

Readings were taken to examine the post-delivery effects of pregnancy on mothers' cardiometabolic risks, in partnership with groups at five clinical sites in the U.S., including medical centers in Chicago, Baltimore, Los Angeles, Washington, D.C., and North Carolina.

More than one half of the women in the study were classified as overweight or obese prior to becoming pregnant.

"We thoroughly looked at the effects of stress alongside levels of poverty and minority status to understand poor health outcomes for moms and children," said Madeleine Shalowitz, a Research Professor of Pediatrics at the Pritzker School of Medicine at the University of Chicago, Program Director of Outcomes Research at NorthShore University HealthSystem Research Institute, and a co-investigator in the study. "Filling this knowledge gap could lead to health interventions to lower the risk of chronic disease for mothers, many of whom are planning to have more children. As many of these women will become pregnant again, improving their health will lead to a healthier next pregnancy."

During pregnancy, dramatic changes occur in a woman's immune and cardiovascular systems to support the developing fetus.

In healthy women, maternal physiology gradually returns to normal within a year after delivery, but a persistently elevated allostatic load increases a woman's risk for chronic diseases across a lifetime.

The researchers said the results of the study are important both for improving health generally and for designing interventions for disadvantaged groups.

As for why breastfeeding was shown to provide some health benefits 12 months postpartum, the study said it could be a cumulative effect reflecting longer durations of breastfeeding compounded by a better socioeconomic status for women who can afford to breastfeed for longer periods.

A large research network in the National Institutes of Health united with community members to execute the research.

"This study was extraordinary because it included community members as full partners at all of the clinical sites, working alongside health-care providers and scientists to determine how stress is affecting mothers and the next generation of babies," Landesman Ramey said. "The hope is that when people realize health-care providers, citizen activists, and neighbors approved the work and contributed to it, they will have more confidence in the information and it will be shared in a good way, to improve health outcomes."

The patients became involved through programs at medical centers representing diverse populations: three urban sites (Washington, D.C.; Baltimore, Maryland; and Los Angeles County, California), one suburban site (Lake County, Illinois), and one rural site comprising seven counties in eastern North Carolina. Among the participating organizations:

Baltimore City Healthy Start, Johns Hopkins University;

The Lake County Health Department and Community Health Center, NorthShore University HealthSystem in Illinois;

Healthy African-American Families, Cedars-Sinai Medical Center in Los Angeles;

North Carolina Division of Public Health, East Carolina University, and the Baby Love Plus Consortium, University of North Carolina, Chapel Hill;

Developing Families Center, Washington Hospital Center, Fralin Biomedical Research Institute.

Credit: 
Virginia Tech

Low-skilled, low-paid workers of the world don't unite, research shows

Workers in low-skilled, low-paid employment aren't prone to band together and form a common bond, new research has shown.

The belief that members of the "precariat" - the group of workers found in insecure, low-waged employment - are united by working contexts in order to rally against their bosses does not necessarily hold true, according to the study.

The research, led by Dr Constantine Manolchev from the University of Exeter Business School, showed that the idea of a formed and unified precariat tends to be overstated, and that workers don't necessarily unite with their peers to display group dissatisfaction in the workplace.

The research is published in the journal Economic and Industrial Democracy.

"The idea of the existence of a formed and unified 'precariat' is increasingly taken for granted," said the lead author of the report, Dr Constantine Manolchev from the University of Exeter Business School.

"Our research suggests that this tends to be over-stated. We need to also take into account personal life histories and working trajectories, individual experiences and aspirations; so their relationship with their boss, their own sense of pride in their job and their personal circumstances all play a part.

"What we've identified is that just because a worker is a part of that particular social group and has negative attitudes towards the workplace doesn't mean that they are necessarily united with their peers. We believe more research needs to be carried out in this area."

Precariat workers generally fall into three main groups: workers who have lost access to secure or meaningful employment; migrant and ethnic-minority workers who have left their home countries; and educated members of the group who lack access to a career path.

They also differ from one another in terms of their working relationships with managers, social status or meaningful social relationships. For example, while migrant workers often recognise that they are in low-paid UK jobs, their wage still equates to three to four times their salary at home, which gives them a different perspective to many other workers in similar roles.

The research was carried out with Professor Richard Saundry and Professor Duncan Lewis of the University of Plymouth. The team conducted 77 in-depth interviews with cleaners, care workers and farm workers in the south-west of England, and also reviewed existing research data in this area.

While there were common characteristics within the group, the researchers were unable to find evidence of a clear collective, or a 'class' interested in far-right messaging or engaging in populist politics for its own agenda.

Credit: 
University of Exeter

New discovery about how a baby's sex is determined

image: New discovery about how a baby's sex is determined

Image: 
Jens Jonsson

Medical researchers at Melbourne's Murdoch Children's Research Institute have made a new discovery about how a baby's sex is determined - it's not just about the X-Y chromosomes, but involves a 'regulator' that increases or decreases the activity of genes which decide if we become male or female.

The study, 'Human Sex Reversal is caused by Duplication or Deletion of Core Enhancers Upstream of SOX9' has been published in the journal Nature Communications. MCRI researcher and Hudson Institute PhD student, Brittany Croft, is the first author.

"The sex of a baby is determined by its chromosome make-up at conception. An embryo with two X chromosomes will become a girl, while an embryo with an X-Y combination results in a boy," Ms Croft said.

"The Y chromosome carries a critical gene, called SRY, which acts on another gene called SOX9 to start the development of testes in the embryo. High levels of the SOX9 gene are needed for normal testis development.

"However, if there is some disruption to SOX9 activity and only low levels are present, a testis will not develop resulting in a baby with a disorder of sex development."

Lead author of the study, Professor Andrew Sinclair, said that 90 percent of human DNA is made up of so-called 'junk DNA' or 'dark matter', which contains no genes but does carry important regulators that increase or decrease gene activity.

"These regulatory segments of DNA are called enhancers," he said. If these enhancers that control testis genes are disrupted it may lead to a baby being born with a disorder of sex development."

Professor Sinclair, who is also a member of the Paediatrics Department of the University of Melbourne, said this study sought to understand how the SOX9 gene was regulated by enhancers and whether disruption of the enhancers would result in disorders of sex development.

"We discovered three enhancers that, together ensure the SOX9 gene is turned on to a high level in an XY embryo, leading to normal testis and male development," he said.

"Importantly, we identified XX patients who would normally have ovaries and be female but carried extra copies of these enhancers, (high levels of SOX9) and instead developed testes. In addition, we found XY patients who had lost these SOX9 enhancers, (low levels of SOX9) and developed ovaries instead of testes."

Ms Croft said human sex reversal, as seen in these cases, is caused by the gain or loss of these vital enhancers that regulate the SOX9 gene; consequently, all three enhancers are required for normal testes and male development.

"This study is significant because in the past researchers have only looked at genes to diagnose these patients, but we have shown you need to look outside the genes to the enhancers," Ms Croft said.

Professor Sinclair said that across the human genome there were about one million enhancers controlling about 22,000 genes.

"These enhancers lie on the DNA but outside genes, in regions previously referred to as junk DNA or dark matter," he said. "The key to diagnosing many disorders may be found in these enhancers which hide in the poorly understood dark matter of our DNA."

Credit: 
University of Melbourne

Can stem cells help a diseased heart heal itself? Researcher achieves important milestone

image: Rutgers scientist Leonard Lee of Robert Wood Johnson Medical School is one step closer to developing an alternative to heart transplant surgery through a new model that develops heart cells from stem cells, aiming to eliminate or reduce the need to be connected to a ventricular assist device (VAD) following heart failure.

Image: 
John Emerson

A team of Rutgers scientists, including Leonard Lee and Shaohua Li, have taken an important step toward the goal of making diseased hearts heal themselves - a new model that would reduce the need for bypass surgery, heart transplants or artificial pumping devices.

The study, recently published in Frontiers in Cell and Developmental Biology, involved removing connective tissue cells from a human heart, "reverse-engineering" them into heart stem cells, then "re-engineering" them into heart muscle cells.

The Rutgers team's true breakthrough, however, is that the newly created cardiac muscle cells clumped together into a single unit that visibly pumps under the microscope.

Senior author Leonard Y. Lee, chair of the Department of Surgery at Rutgers Robert Wood Johnson Medical School, said cardiac cells made in this way don't normally come together and beat as one. His team succeeded in making this happen by over-expressing a protein in the cells called CREG.

According to Lee, fibroblasts -- cells found in connective tissue -- were isolated from the heart tissue and reverse-engineered, or transformed, into stem cells. This was done so that when the CREG protein was over-expressed, the stem cells would differentiate into cardiac cells.

"Heart failure has reached epidemic proportions. Right now, the only option to treat it is surgery, transplant, or connecting the patient with a blood-pumping machine," Lee said. "But transplantable hearts are in short supply and mechanical devices limit the patient's quality of life. So, we are working for ways to help hearts heal themselves."

Though still far off, Lee's ultimate goal is to be able to remove small amounts of a patient's native heart tissue, use CREG to convert the tissue into cardiac muscles that will work together cohesively, and re-introduce them into the patient's heart allowing it to heal itself.

More than six million Americans are living with heart failure, according to the American Heart Association. Most people hear the term "heart failure" and think the heart is no longer working at all, but it actually means that the heart is not pumping as well as it should. People with heart failure often experience fatigue and shortness of breath and have difficulty with everyday activities such as walking and climbing stairs.

Credit: 
Rutgers University

Scientists identify 66 alien species that pose greatest threat to European biodiversity

image: Northern Snakehead (Channa argus) tops the list of alien species, not yet established in the European Union, that pose the greatest potential threat to biodiversity in the region

Image: 
George Berninger Jr., CC BY-SA 4.0, Wikimedia Commons

Scientists have identified 66 alien plant and animal species, not yet established in the European Union, that pose the greatest potential threat to biodiversity and ecosystems in the region.

From an initial working list of 329 alien species considered to pose threats to biodiversity recently published by the EU, scientists have derived and agreed a list of eight species considered to be very high risk, 40 considered to be high risk, and 18 considered to be medium risk.

The research, led by Professor Helen Roy of the UK's Centre for Ecology & Hydrology, involved 43 researchers from across Europe and was funded by the European Commission. It is published in the journal Global Change Biology.

The authors developed a horizon-scanning approach in order to derive a ranked list of potential invasive alien species (IAS). Using this procedure, they worked collaboratively to reach consensus about the alien species most likely to arrive, establish, spread and have an impact on biodiversity in the region over the next decade.

The approach is unique in the continental scale examined, the breadth of taxonomic groups and environments considered, and the methods and data sources used. Species considered included plants, terrestrial invertebrates, marine species, freshwater invertebrates and vertebrates.
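In practice, a horizon-scanning consensus ranking can be assembled by having each expert score every species on the components named above -- likelihood of arrival, establishment, spread, and impact -- and combining the scores. The sketch below shows one such scheme with made-up scores; the multiplication-and-median rule is an illustrative choice, not necessarily the paper's exact protocol.

```python
from statistics import median

# Hypothetical expert scores (1 = low ... 3 = high) for four components:
# (arrival, establishment, spread, impact). One tuple per expert.
scores = {
    "Channa argus":   [(3, 3, 2, 3), (3, 2, 3, 3), (2, 3, 3, 3)],
    "Sciurus niger":  [(2, 3, 2, 2), (2, 2, 2, 3), (3, 2, 2, 2)],
    "Crepidula onyx": [(2, 2, 2, 2), (1, 2, 2, 2), (2, 2, 1, 2)],
}

def combined_risk(expert_scores):
    # Multiply the four components per expert, then take the median across experts.
    return median(a * e * s * i for a, e, s, i in expert_scores)

for species in sorted(scores, key=lambda sp: combined_risk(scores[sp]), reverse=True):
    print(f"{species}: {combined_risk(scores[species])}")
```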

The eight species that pose the highest risk are:

1. Channa argus. The northern snakehead is a species of fish native to southern and eastern China but now also widely distributed in Japan within shallow, marshy ponds and wetlands, where it preys on native fish species.

2. Limnoperna fortunei. The golden mussel is native to China and south-eastern Asia but became established in Hong Kong in 1965, and Japan and Taiwan in the 1990s. Subsequently, it invaded the United States and South America. It alters native fauna with an impact on the freshwater food web.

3. Orconectes rusticus. The rusty crayfish, native to the United States but now found in Canada, is a large and aggressive species of freshwater crayfish, which is more successful in deterring attack from predators than other crayfish and therefore outcompetes native species.

4. Plotosus lineatus. The striped eel catfish is native to the Indian Ocean but was first recorded in the Mediterranean in 2002 and subsequently spread rapidly along the entire Israeli coast. This venomous catfish now inhabits all sandy and muddy substrates there, contributing to species declines through competition and displacement.

5. Codium parvulum. This green seaweed, native to the Indo-Pacific Ocean and subsequently described from the Red Sea, has since been recorded off the northern shores of Israel in the Mediterranean and along the Lebanese coast. It is considered an ecosystem engineer, altering the structure and functionality of ecosystems.

6. Crepidula onyx. The onyx slipper snail is native to the southern coast of California and northern Pacific coast of Mexico. It is now widespread and considered highly invasive in Asia where it has been reported from Korea, Japan and Hong Kong. Slipper snails are sedentary filter-feeders and change native ecosystems.

7. Mytilopsis sallei. The black striped mussel, described from the Pacific coast of Panama, is a brackish-water species that invaded the Indo-Pacific Ocean during the 1900s and has reached Fiji, India, Malaysia, Taiwan, Japan, and Australia. In some of these coastal areas the species completely dominates, since it can survive extreme environmental conditions.

8. Sciurus niger. The fox squirrel, native to eastern and central North America, competes for resources with the native western gray squirrel (S. griseus) and Douglas squirrel (Tamiasciurus douglasii).

Other key findings include:

The highest proportion of the species identified originate in Asia, North America and South America.

Aquatic species are most likely to arrive via shipping, while terrestrial invertebrates are most likely to arrive along with goods such as plants.

The Mediterranean, Continental, Macaronesian and Atlantic biogeographic regions are predicted to be the most threatened across all taxonomic groups, while the Baltic, Black Sea and Boreal regions are least at risk. The Alpine region appears not to be under threat from any of the species considered.

The research provides a basis for full risk assessments that can comprehensively evaluate the threat posed by these species to EU biodiversity.

Professor Helen Roy of the Centre for Ecology & Hydrology said: "Preventing the arrival of invasive alien species is the most effective way of managing invasions. Predicting which species are likely to arrive and survive in new regions involves considering many interacting ecological and socio-economic factors including climate but also patterns of trade.

"Our collaborative approach involving experts spanning many disciplines has been critical to achieve the ranked list of alien species that pose the greatest threat to European biodiversity."

Credit: 
UK Centre for Ecology & Hydrology

Study shows in older people, type 2 diabetes is associated with a decline in brain function over 5 years

New research published in Diabetologia (the journal of the European Association for the Study of Diabetes [EASD]) shows that in older people living in the community, type 2 diabetes (T2D) is associated with a decline in verbal memory and fluency over 5 years.

However, contrary to previous studies, the decrease in brain volume often found in older people with T2D was not found to be directly associated with cognitive decline during this time period. Yet compared with people without T2D, those with T2D had evidence of greater brain atrophy at the beginning of the study.

Previous research has shown that T2D can double the risk of dementia in older people. In this new study, Dr Michele Callisaya (University of Tasmania, Hobart, TAS, and Monash University, Melbourne, VIC, Australia) and colleagues aimed to discover whether type 2 diabetes is associated with greater brain atrophy and cognitive decline, and whether the two are linked. It is the first study to compare decline in both cognition and brain atrophy between people with and without T2D.

The study recruited 705 people aged 55-90 years from the Cognition and Diabetes in Older Tasmanians (CDOT) study. There were 348 people with T2D (mean age 68 years) and 357 without (mean age 72 years) who underwent brain MRI (lateral ventricular and total brain volume -- measures of brain atrophy) and neuropsychological assessments (global function and seven cognitive domains) at three time points over a mean follow-up period of 4.6 years.

The results were adjusted for age, sex, education and vascular risk factors including past or current smoking, heart attack, stroke, high blood pressure, high cholesterol, and body mass index. The authors reported there were significant associations found between T2D and greater decline in both verbal memory and verbal fluency.

Although people with diabetes had evidence of greater brain atrophy at the start of the study, there was no difference in the rate of brain atrophy between those with and without diabetes over the time course in this study. There was also no evidence in the study that the rate of brain atrophy directly impacted on the diabetes-cognition relationship.

In people without type 2 diabetes, verbal fluency slightly increased on average each year (0.004 SD/units per year), whereas it declined in those with type 2 diabetes (−0.023 SD/units per year). The authors say: "Such accelerated cognitive decline may contribute to executive difficulties in everyday activities and health behaviours -- such as medication compliance -- which in turn may poorly influence future vascular health and cognitive decline, and possibly an earlier onset of dementia in those with type 2 diabetes."

They add: "Contrary to our hypotheses and results from previous cross-sectional studies, the rate of brain atrophy over these 5 years of study did not directly mediate associations between type 2 diabetes and cognitive decline. It is possible that greater accrual of cerebrovascular disease than occurred in our study may be more likely to reveal whether there is such a relationship."

They conclude: "In older community-dwelling people, type 2 diabetes is associated with a decline in verbal memory and fluency over approximately 5 years, but the effect of diabetes on brain atrophy may begin earlier, for example in midlife, given the evidence of greater brain atrophy in people with T2D at the start of the study. If this is the case, both pharmacological and lifestyle interventions to prevent brain atrophy in people with T2D may need to commence before older age."

Credit: 
Diabetologia

To repair DNA damage, plants need good contractors

image: Salk scientists led by Assistant Professor Julie Law reveal a complex network of genes that helps plants cope with DNA damage.

Image: 
Salk Institute

LA JOLLA--(December 13, 2018) When a building is damaged, a general contractor often oversees various subcontractors--framers, electricians, plumbers and drywall hangers--to ensure repairs are done in the correct order and on time.

Similarly, when DNA is damaged, a molecular general contractor oversees a network of genetic subcontractors to ensure that the diverse cellular tasks needed to protect and repair the genome are carried out correctly and on time.

Scientists have known for some time that a master gene named SOG1 acts like a general contractor for repair, coordinating with various genetic subcontractors of the plant cell to mount an effective DNA damage response. But, it wasn't clear which specific genes were among the subcontractors, nor how SOG1 interacted with them to oversee the DNA damage response.

Now, researchers at the Salk Institute report which genes are turned on or off, and in which order, to orchestrate the cellular processes required to protect and repair the genome in response to DNA damage. The research, which appeared in the journal Proceedings of the National Academy of Sciences during the week of October 10, 2018, reveals the genetic framework controlling a complex biological process that has broad implications for understanding how plants in particular, and organisms in general, cope with DNA damage to ensure long-term health and fitness.

"Just as a building with structural damage can be unsafe, cells with DNA damage that goes unnoticed or unrepaired can be dangerous," says Assistant Professor Julie Law, the senior author of the paper. "However, the timing and overall coordination of events occurring after the detection of damaged DNA remain poorly understood. Is SOG1 acting like a micromanager, directly pointing each subcontractor to a task, or does it have a more hands-off role? This paper brings us one step closer to understanding how the response to DNA damage is coordinated over time to maintain genome stability."

To better understand the dynamics of gene regulation throughout the DNA damage response and to determine the direct roles of SOG1 in this response, Law and her team conducted a series of experiments in Arabidopsis thaliana, a weed commonly used for genetics research. They grew two sets of Arabidopsis seedlings: one set was normal; the other set contained a mutated version of the SOG1 gene, rendering it nonfunctional.

The team exposed both sets of plants to strong ionizing radiation to generate DNA double-strand breaks. Then they analyzed gene expression changes compared to non-irradiated controls at six time points ranging from 20 minutes to 24 hours. They found the expression of approximately 2,400 genes went up or down in response to DNA damage during that time period, almost all of which depended on SOG1's presence. However, they found that only approximately 200--or about 8 percent--were directly activated by SOG1, which revealed the first layer of a complex gene regulation network and showed SOG1 to be taking a more "hands off" overseer role.

To understand what these approximately 2,400 genes were doing, Law's group fed their data into a software program called DREM, which identifies genes with similar patterns of expression over the entire 24-hour period of the study. This resulted in the identification of 11 groups of genes that act on different time scales and play known or predicted roles in different aspects of the plants' response to DNA damage.
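DREM's core step -- grouping genes whose expression follows similar temporal trajectories -- can be approximated with generic time-series clustering. The sketch below does this with k-means on synthetic expression profiles; it is a simplified stand-in for DREM's actual model, shown only to make the grouping idea concrete.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
t = np.array([0.33, 1.5, 3.0, 6.0, 12.0, 24.0])  # hours post-irradiation (6 time points)

# Synthetic log-fold-change profiles for 2,400 damage-responsive genes:
# an early spike, a gradual rise, and an early repression pattern.
early = np.exp(-((t - 1.5) ** 2))
late = t / 24.0
profiles = np.vstack(
    [rng.normal(early * s, 0.1) for s in rng.uniform(1, 3, 800)]
    + [rng.normal(late * s, 0.1) for s in rng.uniform(1, 3, 800)]
    + [rng.normal(-early * s, 0.1) for s in rng.uniform(1, 3, 800)]
)

# Group genes by the shape of their response over the 24-hour window.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
for k in range(3):
    print(f"cluster {k}: {(labels == k).sum()} genes")
```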

"It's exciting to get more clarity on the specific gene networks and subnetworks involved in the DNA damage response, as well as their timing, which had not been done before," says Law.

In addition to informing strategies to improve crop health by maintaining genome stability, this work may also shed light on conserved aspects of the DNA damage response in other organisms, as there are many parallels between SOG1 and a gene in animals that has a similar "general contractor" function--the p53 gene, a tumor suppressor known for its role in combating DNA damage to prevent cancer.

Next, the lab plans to study the roles of new factors implicated in the DNA damage response based on their expression profiles and to continue exploring the network of genes being directly or indirectly controlled by SOG1.

Credit: 
Salk Institute

Study compares dialysis reimbursement around the globe

image: Study Compares Dialysis Reimbursement Around the Globe

Image: 
van der Tol

Highlights

Dialysis reimbursement policies in most countries are focused on conventional in-center hemodialysis, although home hemodialysis and peritoneal dialysis might contribute to quality of life and cost savings.

The reimbursement for dialysis in low- and middle-income countries is insufficient to treat all patients with kidney failure and has a disproportionately high impact on public health expenditure in those countries.

Washington, DC (December 13, 2018) -- A new study has examined how countries around the world compare in providing reimbursement for dialysis care received by patients with kidney failure. The findings, which appear in an upcoming issue of the Clinical Journal of the American Society of Nephrology (CJASN), may help government officials make dialysis reimbursement more equitable and sustainable.

Worldwide, increasing numbers of patients are developing kidney failure and need to undergo kidney transplantation or dialysis. Transplantation is usually the best option, but most patients are treated with dialysis due to deficiencies in infrastructure, a scarcity of donor organs, and contraindications to transplantation. Some assessments of average dialysis costs have been published, but no comprehensive worldwide comparison of national government reimbursement for dialysis care has been done using data collected at a single point in time to provide a snapshot view.

To perform such a comparison, Arjan van der Tol, MD, PhD, Raymond Vanholder, MD, PhD (University Hospital Ghent, in Belgium), and their colleagues surveyed nephrologists in 90 countries (one per country). The online survey evaluated government reimbursement fees for hemodialysis and peritoneal dialysis, criteria that are used to reimburse dialysis, incentives for self-care dialysis, measures to prevent the development or progression of CKD, and the prevalence of dialysis per country.

Among the study's findings:

Of the 90 survey respondents, governments from 81 countries (90%) provided reimbursement for maintenance dialysis.

In all countries, strategies to decrease the financial burden of kidney failure -- such as programs to help prevent the progression of chronic kidney disease or to promote more cost-saving dialysis modalities (home hemodialysis or peritoneal dialysis) -- were underutilized.

The higher the Gross Domestic Product per capita, the greater the absolute expenditure for dialysis by national governments.

High-income countries spent higher absolute amounts on dialysis reimbursement, but the percentage of the total health care budget spent on dialysis was lower than in low- and middle-income countries.

In low-income countries, the absolute amounts of dialysis reimbursement were insufficient to provide equitable and sustainable access to dialysis care for all patients in need.

"Worldwide, we need better initiatives to improve care of patients with kidney failure with a focus on improving access to transplantation, increasing provision of prevention strategies to reduce the need of kidney replacement therapy, implementing cheaper ways to provide dialysis services to patients in need, and improving the quality of supportive renal care for end-stage kidney disease that does not involve dialysis," said Dr. van der Tol.

In an accompanying editorial, Edwina Brown, DM, FRCP (Hammersmith Hospital, London) noted that curtailing costs of dialysis is essential to enable dialysis provision to grow, but that "the environmental tapestry influencing dialysis modality distribution is much more complex than simply government policy or reimbursement."

Credit: 
American Society of Nephrology

Mass spectrometry sheds new light on thallium poisoning cold case

image: Hair samples from the victim are mounted on slides for analysis by mass spectrometry.

Image: 
Faye Levine

In 1994, Chinese university student Zhu Ling began experiencing stomach pain, hair loss and partial paralysis. By the time doctors diagnosed Ling with thallium poisoning about four months later, she was in a coma. Ling survived, but she suffered permanent neurological damage. A police investigation determined that Ling was intentionally poisoned, but the case remains unsolved.

Two decades after the poisoning, an associate of Ling's family asked Richard Ash, an associate research scientist in the University of Maryland's Department of Geology, to analyze several of Ling's hairs collected in 1994 and 1995 to establish a timeline of her poisoning. Although Ash specializes in analyzing geological samples, he frequently helps a variety of researchers by analyzing trace elements in samples using mass spectrometry, a technique that can measure elements at the parts per billion level.

In October 2018, Ash published the results in the journal Forensic Science International, revealing for the first time that the victim had been exposed to multiple doses of thallium over a long period of time.

"To my knowledge, this is the first use of mass spectrometry to reconstitute the timeline of a prolonged case of intentional heavy metal poisoning," Ash said. "The analysis showed that the victim was poisoned in many doses that increased in frequency and concentration over time."

Mass spectrometry makes finding even low concentrations of thallium in hair easy, but measuring the amount of thallium in a hair sample is challenging. Other laboratories turned away Min He, the Ling family associate and Ash's co-author of the study, because there was no established method for this type of analysis.
"Mass spectrometry is not good at measuring concentrations," Ash said. "To do that, you need an established standard reference material to compare your sample against.

And there are no such standards suitable for measuring thallium concentrations directly from a strand of hair."

Ash made his own standard using a standard reference material made out of orchard leaves, which was developed by the National Institute of Standards and Technology (NIST) to measure certain elements in biological samples. Ash added known quantities of thallium to the NIST material to create a new set of standards that allowed him to determine how much thallium was in the victim's hair.
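The spiked standards define a calibration curve -- instrument signal versus known thallium concentration -- from which unknown samples can be read off. A minimal sketch with invented numbers:

```python
import numpy as np

# Known Tl concentrations spiked into the orchard-leaf material (ppb) and the
# corresponding mass-spectrometer signals (counts/s). All values are invented.
conc_ppb = np.array([0.0, 5.0, 10.0, 25.0, 50.0])
signal = np.array([120.0, 5300.0, 10450.0, 26100.0, 52300.0])

# Fit a straight line: signal = slope * concentration + background.
slope, intercept = np.polyfit(conc_ppb, signal, 1)

# Read an unknown hair measurement off the calibration curve.
unknown_signal = 18200.0
unknown_conc = (unknown_signal - intercept) / slope
print(f"estimated Tl concentration: {unknown_conc:.1f} ppb")
```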

As an additional level of quality control, Ash tested his measurements derived from the NIST-based standard against a powdered hair standard from China and found that his measurement of elements in the victim's hair samples closely matched those in the powdered hair standard.

"To be honest, I was surprised that the new standard worked so well," Ash said. "Developing my own standard was a shot in the dark, but it paid off."

Next, Ash took advantage of the fact that human hair grows--and incorporates chemicals from the body--at a constant rate. Because of this characteristic, measuring the distribution of certain metals along a hair's length is an established method to determine the timing and dosage of a person's exposure to the metal.

To measure thallium in the victim's hair, Ash scanned the length of the hair with an ultraviolet laser. The laser's energy converted the outermost layer of the hair into tiny particles. Ash then used a mass spectrometer to analyze the particles for thallium. Using the growth rate of hair and the scanning speed of the laser, Ash converted his measurements into a timeline of thallium ingestion.
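The timeline conversion itself is simple arithmetic: elapsed scan time maps to a position along the hair via the laser's scan speed, and position maps to a calendar date via the hair growth rate. A sketch under stated assumptions (scalp hair growing roughly 1 cm per month is a standard figure; the scan speed here is invented):

```python
from datetime import date, timedelta

GROWTH_RATE_MM_PER_DAY = 10 / 30.0  # ~1 cm per month, a standard assumption
SCAN_SPEED_UM_PER_S = 50.0          # invented laser scan speed, for illustration

def date_of_ingestion(scan_seconds, collection_date):
    """Map elapsed scan time (from the root) to the date that hair segment grew."""
    distance_mm = scan_seconds * SCAN_SPEED_UM_PER_S / 1000.0
    days_before_collection = distance_mm / GROWTH_RATE_MM_PER_DAY
    return collection_date - timedelta(days=days_before_collection)

# A thallium spike seen 600 s into the scan of a hair collected on 1995-03-01
# sits 30 mm from the root, i.e. about 90 days of growth earlier:
print(date_of_ingestion(600, date(1995, 3, 1)))  # -> 1994-12-01
```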

Ash used this technique to analyze several of the victim's hairs, which were collected at different times in 1994 and 1995. One hair that began growing when Ling was asymptomatic revealed about four months of sporadic exposure to thallium, with increasing dosage and frequency until the hair fell out around December 1994. A second hair, which fell out around March 1995, showed about two weeks of constant ingestion of large doses of thallium.

Ash also discovered a large spike in thallium concentration in the first hair that corresponded to a brief period when Ling experienced vision loss but no gastrointestinal symptoms. As the study noted, this finding suggests that the victim might have been poisoned via eye contact at first, but later on, she was poisoned via oral ingestion.

By publishing his method and findings, Ash hopes that his work can help with heavy metal poisoning investigations in the future--and maybe even the decades-old case that introduced him to this research area in the first place.

"I hope that the new information our work has provided may one day lead to the perpetrator being brought to justice and Zhu Ling's family gaining some solace from seeing that," Ash said.

Credit: 
University of Maryland

Improved understanding of the pathology of dwarfism may lead to new treatment targets

image: Misfolded protein accumulation stimulates endoplasmic reticulum (ER) stress, oxidative stress, and inflammation, creating a self-perpetuating cellular stress loop. Tumor necrosis factor (TNF)-α inflammation increases TNF-related apoptosis-inducing ligand (TRAIL) and midline 1 (MID1), which results in up-regulation of mammalian target of rapamycin complex 1 (mTORC1) signaling. mTORC1 signaling increases protein synthesis, which most likely exacerbates the stress on the already stressed ER. S6K1, protein S6 kinase beta-1; PERK, protein kinase-like endoplasmic reticulum kinase; CHOP, CCAAT/enhancer-binding protein homologous protein; AKT, protein kinase B; PP2A, protein phosphatase 2A.

Image: 
<i>American Journal of Pathology</i>

Philadelphia, December 12, 2018 - Pseudoachondroplasia (PSACH) is a severe inherited dwarfing condition characterized by disproportionate short stature, joint laxity, pain, and early onset osteoarthritis. In PSACH, a genetic mutation leads to abnormal retention of cartilage oligomeric matrix protein (COMP) within the endoplasmic reticulum (ER) of cartilage-producing cells (chondrocytes), which interferes with function and cell viability. In a report in the American Journal of Pathology, investigators describe how this protein accumulation results in "ER stress" and initiates a host of pathologic changes. These findings may open up new ways to treat PSACH and other ER-stress-related conditions.

"This is the first study linking ER stress to midline 1 protein (MID1), a microtubule stabilizer that increases mammalian target of rapamycin complex 1 (mTORC1) signaling in chondrocytes and other cell types. This finding has significant implications for cellular functions including autophagy, protein synthesis, and potentially cellular viability. These results identify new therapeutic targets for this pathologic process in a wide spectrum of ER-stress disorders such as type 2 diabetes, Alzheimer disease, and tuberculosis," explained Karen L. Posey, PhD, Department of Pediatrics, McGovern Medical School at The University of Texas Health Science Center at Houston (UTHealth), Houston, TX, USA.

PSACH symptoms generally are recognized beginning at two years of age. Patients with PSACH have normal intelligence and craniofacial features. PSACH is caused by mutations in the gene encoding the cartilage oligomeric matrix protein (COMP). ER stress occurs when abnormal (unfolded or misfolded) COMP (MT-COMP) accumulates in the rough endoplasmic reticulum of chondrocytes. Rough ER, the portion of ER displaying ribosomes, is the network of membranous tubules within cells associated with protein and lipid synthesis and export.

In previous studies, Dr. Posey and her colleagues have investigated chondrocyte pathology in the growth plates of dwarf mice that express MT-COMP, in cultured rat chondrosarcoma (RCS) cells that express human MT-COMP, as well as in cultured cartilage nodules from PSACH patients. The mice replicate many of the clinical features and chondrocyte pathology reported in patients with PSACH.

In the current study, the researchers showed increased levels of MID1 protein in chondrocytes from the mutant dwarf mice as well as in cells from human PSACH patients. They also found that ER-stress-inducing drugs increased MID1 signaling, although oxidative stress did not.

The up-regulation of MID1 was associated with increased mTORC1 signaling in the growth plates of the dwarf mice. Rapamycin decreased intracellular retention of MT-COMP and decreased mTORC1 signaling. The mTOR pathway is activated during various cellular processes (eg, tumor formation and angiogenesis, insulin resistance, adipogenesis, and T-lymphocyte activation) and is dysregulated in diseases such as cancer and type 2 diabetes.

The results of this work show that MID1, mTORC1 signaling, the microtubule network, protein synthesis, inflammation, and autophagy form a complex, multifaceted response to protein accumulation in the ER when clearance efforts fail, and that MID1 may act as a pro-survival factor.

In this study, aspirin and resveratrol normalized levels of MID1 and mTORC1 signaling in growth plate chondrocytes from dwarf mice. The investigators therefore suggest that a combination of rapamycin with anti-inflammatory medications may be beneficial if side effects can be controlled.

"Our work identifies possible new treatment paths for PSACH, as well as other common conditions such as type 2 diabetes, Alzheimer disease, and cancer, all of which involve ER stress," added Dr. Posey. "We believe ER stress-reduction therapeutics will play an important role in the treatment of a wide variety of illnesses."

Credit: 
Elsevier

Fire's effects on soil moisture, runoff

image: The 2011 Las Conchas mega-fire in New Mexico burned more than 150,000 acres and threatened the Los Alamos National Laboratory.

Image: 
Brian Klieson.

Fire and water. Timeless, opposing forces, they are actually linked in powerful ways that can have major impacts on communities and ecosystems.

The 2011 Las Conchas mega-fire in New Mexico burned more than 150,000 acres and threatened the Los Alamos National Laboratory. Now, using data from the fire, researchers at Los Alamos have created an experimental model that will help us better understand the interactions of fire and water in the soil.

Adam Atchley, a researcher at Los Alamos National Laboratory, and his team set out with a goal: to evaluate how the soil's water balance changes before and after a fire, depending on the burn severity.

They designed an experimental model to simulate the effects of wildfire on the water balance of a burned site. The model used actual site condition measurements in the Las Conchas fire region. These measurements were taken several years before the fire by the Bandelier Fire Ecology Field Team. The model also incorporated burn severity data from the wildfire.

They found low- to moderate-severity wildfires result in wetter soil.

Water leaves soil in multiple ways. Water can move through plants, escaping as vapor through pores in the leaves; together with evaporation from the soil surface, this loss is called evapotranspiration. When vegetation is burned, evapotranspiration is typically reduced until plants regrow. As a result, less water is pulled out of the soil. It remains wetter.

Surface runoff is another mode of water movement. Wildfire can make the soil more vulnerable to this moisture loss. Fire removes the absorbent layers of fallen and decaying plant matter on the forest floor. These layers, called litter and duff, can store more moisture than soil can. Without these layers, heavy rain can provide more water than the ground can absorb. This contributes to surface runoff.

"It is well known that fire disturbances can have a strong effect on how water interacts with land," said Atchley. "Fire often dramatically increases flashy runoff responses to storms following the fire disturbance. But it also reduces evapotranspiration. What is not well understood, and is hard to measure, is how these two competing processes change the site water balance, or how wet or dry a burn site will be after the fire. Increasing the runoff would make the site drier overall while decreasing the evapotranspiration will keep water on the site and make it wetter," Atchley said.

The experimental model identified an important tipping point. In high-severity burn sites, increased runoff outweighs the effect of reduced evapotranspiration. Water runoff is greater than the water retained, leaving comparatively drier soils after the fire.

It's all a matter of degree of severity. "What we found," Atchley said, "is that burn sites will generally become wetter because the change in evapotranspiration is bigger than the change in runoff. However, in the case of high-burn severity, the site could become drier because the change in runoff shortly after the fire becomes bigger than the change in evapotranspiration."
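The tipping point Atchley describes reduces to a simple balance: the net change in water retained on a burn site is the evapotranspiration "saved" minus the extra water lost to runoff. The sketch below encodes that comparison; the fluxes are invented values chosen only to illustrate the crossover between low- and high-severity burns.

```python
def post_fire_moisture_change(et_reduction_mm, runoff_increase_mm):
    """Net change in water retained on site (mm/yr); positive means wetter soil."""
    return et_reduction_mm - runoff_increase_mm

# Invented annual fluxes, for illustration only.
scenarios = {
    "low severity":  {"et_reduction_mm": 60,  "runoff_increase_mm": 20},
    "high severity": {"et_reduction_mm": 120, "runoff_increase_mm": 180},
}

for name, fluxes in scenarios.items():
    net = post_fire_moisture_change(**fluxes)
    print(f"{name}: net {net:+d} mm/yr -> {'wetter' if net > 0 else 'drier'}")
```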

The soil and vegetation conditions that affect moisture after a wildfire will also change over time. These findings have important implications for initial site recovery and water-management planning after a wildfire.

Credit: 
American Society of Agronomy