Culture

Food insecurity in toddler years linked to poor health, but not obesity

Young children who grow up in homes with limited access to nutritious foods (known as food insecurity) are more likely to experience poor overall health, hospitalizations, and developmental problems, but they are not at higher risk of developing obesity, a new University of Maryland School of Medicine (UMSOM) study finds.

The research, published today in the journal Pediatrics, examined the impact of food insecurity among children from birth to age four and found that obesity rates generally did not differ among those who lived in households with food insecurity compared to those who had access to healthy foods.

"We did find, however, that growing up in a low-income community -- typically with a lack of access to healthy grocery stores, an overabundance of fast food chains, and few safe areas to play outdoors - increased a preschooler's risk of developing obesity regardless of food security," said study leader Maureen Black, PhD, a Professor of Pediatrics at UMSOM. "This is quite alarming and indicates a significant public health issue."

She and her colleagues analyzed data from 28,184 racially and ethnically diverse children under four years of age, primarily from low-income households in five U.S. cities that participate in Children's HealthWatch, an ongoing network of pediatric and public health researchers that monitors how economic hardships relate to the healthy development and growth of children. Data were stratified by year of age, from under one year up to four years, and the researchers found that food insecurity was not associated with a higher risk of obesity, with one exception: children aged two to three years who lived in food-insecure households had a 24 percent increased risk of obesity compared to those in food-secure households. However, even children from the poorest households, where both parents and children lacked access to proper nourishment, did not experience higher rates of obesity.
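
For readers unfamiliar with how such stratified comparisons are quantified, the sketch below computes an odds ratio within a single age stratum. The counts are invented for illustration; they are not Children's HealthWatch data.

    # Illustrative only: an odds ratio for obesity by food-security status
    # within one age stratum (e.g., 2- to 3-year-olds). Counts are
    # hypothetical, not data from the study.

    def odds_ratio(a, b, c, d):
        """Odds ratio from a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        return (a / b) / (c / d)

    # Hypothetical counts: food-insecure vs. food-secure toddlers
    print(round(odds_ratio(120, 680, 130, 910), 2))  # ~1.24, i.e., 24% higher odds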

"I am not sure what to make of this finding," said Dr. Black. "It could be this is the time when toddlers are first trying adult foods, which in food insecure households may be low-cost, low nutrient-dense foods. In addition, pickiness and hesitancy to try new foods peak during this age period and may be associated with excess snack food."

About 27 percent of the children in the study lived in households with food insecurity, including more than 13 percent in extremely deprived households with child food insecurity. The vast majority of households in the study qualified for federal and state food assistance programs that provided supplemental nutrition. The researchers assessed food security through Children's HealthWatch interviews with the children's mothers, using a standard questionnaire.

Earlier studies examining food insecurity among children under 4 were not stratified by age, potentially masking developmental differences in young children's experiences of food insecurity and susceptibility to growth issues. This new study found a steady rise in obesity rates as the children grew out of infancy: about 13 percent of those ages 1 to 2 years were obese compared to nearly 24 percent of those ages 3 to 4.

"Childhood obesity remains a persistent problem in this country, and we know it has led to earlier onset of high blood pressure and type 2 diabetes. These conditions, once rare in teenagers, now occur regularly in adolescents," said E. Albert Reece, MD, PhD, MBA, Executive Vice President for Medical Affairs, UM Baltimore, and the John Z. and Akiko K. Bowers Distinguished Professor and Dean, University of Maryland School of Medicine. "More public health efforts must be made to ensure that children in low-income communities are getting the proper nutrition they need."

While the study did not find a link between food insecurity and obesity risk, it did find that food insecurity was associated with significantly increased risks of a child being in poor health and experiencing a developmental delay, with the odds increasing with a child's age up to age four. For this reason, the study authors recommended that health care providers follow the American Academy of Pediatrics guidelines to screen for and manage food insecurity, which involves asking caregivers about their access to, and concerns about having, enough food, and connecting them with government food assistance programs like WIC and SNAP, along with local food pantries and supports that help families cope with economic hardships and associated stressors.

Dr. Black, who is also a fellow at RTI International, conducted the study with colleagues at Boston University, Drexel University, University of Arkansas for Medical Sciences, and Hennepin County Medical Center. Chloe Drennen, MD, a 2019 UMSOM graduate, was the first author of the study. The study was funded by multiple foundations and donors that support Children's HealthWatch.

Credit: 
University of Maryland School of Medicine

Raising a glass to grapes' surprising genetic diversity

Here's a discovery well worth toasting: A research team led by Professor Brandon Gaut with the University of California, Irvine and Professor Dario Cantu with the University of California, Davis has deciphered the genome of the Chardonnay grape. By doing so, they have uncovered something fascinating: grapes inherit different numbers of genes from their mothers and fathers. Their paper has just been published in Nature Plants.

The team devoted three years of study to what are known as structural variants, or chromosome changes, in the genomes of the Chardonnay and Cabernet Sauvignon grapes to determine their genetic similarity. Each of the fruits has about 37,000 genes.

"Each of us inherits one copy of their gene from their mother and one from their father," said Professor Gaut. "One would assume that the grapes inherit two copies of every gene, too, with one coming from each of their two parents. However, we found there was just one copy, not two, for 15 percent of the genes in Chardonnay, and it was also true of Cabernet Sauvignon grapes. Together, that means that grape varieties differ in the presence or absence of thousands of genes."

"These genetic differences probably contribute to many of the differences in taste between wines made from different grape varieties" said Professor Cantu. And they definitely contribute to one important feature of grapes: their color.

The research team showed that red grapes have mutated into white grapes on several different occasions. Each mutation included a large chromosomal change that altered the number of copies of key color genes. Fewer copies of the color genes cause white grapes.

In addition to providing key scientific knowledge to vintners, the scientists say their findings have important implications for understanding the nutritional value of other fruits and vegetables. Structural variants have gone largely unexplored in plant genomes, but Professor Gaut says the research is important for understanding what lies within the fruits and vegetables we eat. "For example, even between the various types of heirloom tomatoes, structural variations could account for differing nutritional values," he said. "Better understanding the genetic composition of species enables us to access tools that improve plant breeding."

Credit: 
University of California - Irvine

Swapping pollinators reduces species diversity, study finds

LAWRENCE, KANSAS -- University of Kansas plant biologists Carolyn Wessinger and Lena Hileman appreciate the sheer beauty of a field of colorful wildflowers as much as the next person. But what really gets their adrenaline pumping is understanding the evolutionary forces that render Earth's blooms in such a stunning array of shapes and hues.

A lot of that diversity stems from the birds and the bees.

Flowers depend on these and other pollinators to reproduce, and they can adapt strategically to attract these creatures - sometimes altering their traits so dramatically that they lure an altogether new pollinator.

But not all such strategies are created equal. In a new paper published in Evolution Letters, Wessinger and Hileman demonstrate that abandoning one pollinator for another to realize immediate benefits could compromise a flower's long-term survival. The research provides novel insights into fundamental biological processes that ultimately influence food security.

"Approximately 35% of the world's crop species rely on animal pollinators for productivity," said Hileman, professor of ecology & evolutionary biology. "Diverse natural communities of flowering plants and associated pollinators contribute to maintenance of pollinator populations important for crop reproduction."

The KU study focused on Penstemon, a genus commonly known as beardtongues. Most of the nearly 300 Penstemon species in North America are pollinated by bees, indicating a successful long-term partnership. But more than 15 lineages of this perennial wildflower have switched to attract hummingbird pollinators instead of bees.

"Because Penstemon has ditched bees for hummingbirds so many times, it is an ideal group of plants to examine the long-term consequences of switching to hummingbird pollination," said Wessinger, a postdoctoral fellow in Hileman's lab. "In a sense, nature has provided us with a large number of replicated experiments to study."

Those experiments reveal that once a branch of the Penstemon family tree shifts to hummingbird pollination - perhaps because bees have become less abundant or reliable pollinators - the evolution of new species on that branch slows or stops.

To demonstrate this pattern, the KU scientists collected and sequenced DNA representing the entire genome from about 100 related species of Penstemon, then reconstructed their relationships in a phylogenetic tree with common ancestors at its roots. The results surprised them: Any time a hummingbird-pollinated species emerged on the tree, it appeared alone or with just one or two other hummingbird-adapted species.

"This makes hummingbird pollination a surprisingly rare trait across the sampled Penstemon species, given the large number of times this pollination switch has occurred - 17 times in our sample," Wessinger said.

To make the flight from bee to hummingbird, Penstemon change in substantial ways. Think of it as a courtship story: Bee meets flower. Flower boasts bluish to purple petals visible to bee's eyes and a short, open structure ideally suited to accommodate a stocky insect seeking nectar. The flower's stamens are positioned to deposit pollen on the bee's back when it lands.

Pollination ensues. Ecosystem lives happily ever after.

Until nature introduces a plot twist. Over evolutionary time - thousands to millions of years - red flowers evolve, with long tubular petals and production of more nectar than the bee needs. The spark is gone.

But these new traits lure a different suitor. Hummingbird zips in craving a huge energy reward to fuel its hyper-speedy metabolism. It sips nectar through its long tongue while hovering - no landing platform required. Romance blossoms anew.

So what keeps hummingbird-pollinated Penstemon from continuing to form new species once they've made this transition?

Perhaps the switches have occurred so recently that there simply hasn't been enough time for hummingbird-pollinated species to become more common, the researchers posited. Through state-of-the-art statistical modeling, though, they ruled out that theory.

"Our findings indicate that the type of pollinator a flowering plant species relies upon will impact how likely that flowering plant lineage is to persist over long periods of evolutionary time," Hileman said. "This result suggests that the fact that we see so many insect-pollinated plants in nature is partly because they are highly successful at speciating and resisting extinction compared to hummingbird-adapted flowering plants."

One question the study leaves unanswered is why the hummingbird strategy reduces diversity. The researchers have a few theories that they hope to explore next. Their favorite? Species diversification often occurs because groups of flowers become isolated from one another by a canyon or mountain, for example. As the split populations experience different selective pressures over time, they might adapt and diverge enough to form new species.

"Perhaps hummingbirds move pollen great distances in the course of their seasonal migrations," Wessinger said. "Such long-distance pollination could help maintain genetic connections between geographically distant populations, hindering the speciation process."

Interestingly, similar studies conducted in tropical regions have produced opposite results. Hummingbird pollination in those areas has been linked to similar or higher diversification rates than bee pollination.

"I think that might have to do with differences in hummingbird behavior in the tropics versus North America," Wessinger said. "Tropical hummingbirds can often be territorial, taking up residence in a certain patch of flowers and keeping competitors at bay. They might not be moving great distances between plants."

Wessinger and Hileman's study is part of a broader body of research supported by $2.5 million in grants from the National Science Foundation. The funding is helping them and a Duke University collaborator dig deeper into the genetic processes that shape the complex bee-adapted and hummingbird-adapted flowers of Penstemon.

As they begin to explore why hummingbird pollination reduces diversity in Penstemon, they'll need to collect samples from geographically isolated species in the wild. Wessinger is excited to get out of the lab and into the field, where the thrill of discovery awaits.

"There's so much adrenaline," she said of searching for native flowers. "If I can't find a population, I get really frustrated. You're camping at high elevations; maybe you're not sleeping great. Everything feels overly emotional. But when you find what you're looking for, then it's just elation."

Credit: 
University of Kansas

Paid family leave improves vaccination rates in infants

BINGHAMTON, N.Y. -- Parents who take paid family leave after the birth of a newborn are more likely to have their child vaccinated on time compared to those who do not, according to new research from Binghamton University, State University of New York. The effect is stronger for families living below the poverty line.
 

"Currently, many people do not vaccinate their child within the recommended schedule and are late," said Solomon Polachek, professor of economics at Binghamton University. "Often this might be due to parental time constraints. When an infant is really young, these immunizations are critical, since infants are at a higher risk of infection and illness if not vaccinated properly."
 

In 2004, California became the first state to implement a Paid Family Leave (PFL) policy, allowing private-sector employees up to six weeks of leave with partial wage replacement to care for a newborn baby. This time not only helps parents settle into their new caregiving roles, but also allows them time to make vital parental decisions, such as ensuring their child is vaccinated on time.
 

Binghamton University PhD student Agnitra Roy Choudhury, who conducted this study under Polachek's direction, used data from the National Immunization Survey on the vaccination status of children aged 19 to 35 months. Specifically, the researchers looked at children born before and after the PFL policy was implemented in California and whether those children received vaccinations on time compared to children in other states during the same time period. Vaccinations studied included Hepatitis B (HepB), Diphtheria-Tetanus-Pertussis (DTP) and Haemophilus influenzae type b (HIB).
 

They found that the PFL policy in California granting six weeks of family leave with partial wage replacement reduced late vaccination rates in infants.
 

"The research finds that paid family leave (at least in California) increases the chance an infant will be inoculated for the second HepB injection by over 5 percent relative to states not implementing paid family leave, and for the DTP injection by about 1.5 percent," said Polachek. "The effects are bigger for poorer families, who are less likely to have access to paid family leave from their jobs alone."
 

According to Polachek, vaccinating infants on time is vital to their future health and well-being, since vaccines can ward off diseases that can impact future attendance at school. Not only do these outcomes lead to less learning for children, but they can also lead to lower earnings power.
 

"Poor school attendance and less early childhood learning can have consequences regarding the widening earnings distribution," said Polachek. "Paid family leave might be a viable national policy if it mitigates these detrimental effects."
 

Future research will focus on using more precise survey data and analyzing other states, such as New York, that have recently implemented PFL policies.

Credit: 
Binghamton University

Hospital infections declining in Canada

There is good news on the infection front: infections acquired by patients in Canadian hospitals are declining, with a 30% reduction between 2009 and 2017, according to new research in CMAJ (Canadian Medical Association Journal). However, continued focus is necessary to identify and prevent emerging antimicrobial-resistant pathogens and infections associated with medical devices, such as urinary or intravenous catheters.

Health care-associated infections are a substantial issue worldwide. In the United States, an estimated 5% of patients admitted to hospital in 2002 developed an infection, resulting in 1.7 million infections and 98,000 deaths.

A series of studies, conducted by a team of researchers with the Canadian Nosocomial Infection Surveillance Program (CNISP), included data from hospitals from 9 Canadian provinces in 2002 and 2009, and all 10 provinces in 2017. The proportion of patients with a hospital-acquired infection increased from 9.9% in 2002 to 11.3% in 2009, and decreased to 7.9% in 2017, a 30% decline. Urinary tract infections (32%) were the most common infection, followed by pneumonia (23%), surgical site infection (20%), bloodstream infection (15%) and Clostridioides difficile infection (9%). Infection rates in intensive care units declined 29%.
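
For clarity, the 30% figure is the relative (not absolute) change in prevalence between the 2009 and 2017 surveys, as this short worked example shows:

    # Relative decline in hospital-acquired infection prevalence, 2009 -> 2017
    p_2009, p_2017 = 11.3, 7.9  # percent of patients with an infection
    relative_decline = (p_2009 - p_2017) / p_2009 * 100
    print(f"{relative_decline:.0f}% decline")  # ~30%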

"There is no single reason for the overall decline in infection types, which suggests Canadian hospitals have used a variety of methods to prevent infection, such as better hand washing, antimicrobial stewardship to prevent C. difficile and other measures," says Dr. Geoffrey Taylor, University of Alberta Hospital, Edmonton, Alberta.

In a related commentary, Dr. Jennie Johnstone, Public Health Ontario, and coauthors write, "[a]lthough these rates are low, there are some concerning trends. The proportion of health care-associated infections caused by antimicrobial-resistant organisms was stable or increasing for all pathogens, and carbapenemase-producing Enterobacteriaceae, which are emerging antimicrobial-resistant pathogens, were identified for the first time in the 2017 survey."

"Without ongoing efforts to improve and reduce health care-associated infections and antimicrobial resistance and without frequent measurement of our performance as a country, it is likely that the gains seen in this study will not be sustained and that Canada's antimicrobial resistance problem may become unmanageable," write the commentary authors.

Credit: 
Canadian Medical Association Journal

New guideline on Parkinson's disease aimed at physicians and people with Parkinson's

A comprehensive new Canadian guideline provides practical guidance for physicians, allied health professionals, patients and families on managing Parkinson disease, based on the latest evidence. The guideline is published in CMAJ (Canadian Medical Association Journal), accompanied by an easy-to-reference infographic and podcast.

Parkinson disease is a debilitating, progressive neurological condition that affects quality of life for patients and their caregivers.

Since publication of the first Canadian guideline in 2012, there have been substantial advancements in the literature for Parkinson disease. The new guideline, funded by Parkinson Canada, is based on the latest evidence and advances in diagnosis, treatment and symptom management, and contains a new section on palliative care. Experts from various health disciplines from across Canada helped develop the guideline.

"We hope this guideline will help physicians and other health care professionals improve the care of people with Parkinson disease," says Dr. David Grimes, a neurologist at The Ottawa Hospital and the University of Ottawa Brain and Mind Research Institute, Ottawa, Ontario.

The guideline is divided into 5 sections for ease of use. Highlights:

Communication

People with Parkinson disease should be encouraged to participate in choices about their own care.

Communication should be both verbal and written.

Discussions should aim for balance between providing realistic information about prognosis and promoting optimism.

Families and caregivers should be informed about the condition and available support services.

Diagnosis and progression

Suspect Parkinson disease in anyone with tremor, stiffness, slowness, balance problems or gait disorders.

CT or MRI brain scanning should not be routinely used to diagnose Parkinson disease.

No therapies are effective for slowing or stopping brain degeneration in Parkinson disease.

Treatment

A regular exercise regimen begun early has proven benefit.

Patients with a possible diagnosis of Parkinson disease may benefit from a trial of dopamine replacement therapy to help with diagnosis.

Deep brain stimulation and gel infusion are now routinely used to manage motor symptoms.

Rehabilitation therapists experienced with Parkinson disease can help newly diagnosed patients, and others through all stages.

Nonmotor features

Botulinum toxin A helps control drooling.

Management of depression should be tailored to the individual and their current therapy.

Dementia should not exclude a diagnosis of Parkinson disease, even if present early.

Rapid eye movement sleep behaviour disorder can predate the diagnosis of Parkinson disease.

Palliative care

The palliative care needs of people with Parkinson disease should be considered throughout all phases of the disease.

If the patient asks, the option of medical assistance in dying should be discussed.

In addition to its usefulness to health care professionals, the guideline may be used by policy-makers, charities and funders as well as people with Parkinson disease and their families.

"A limitation to implementing the guideline is the lack of access to health care providers experienced in caring for people with Parkinson disease," says Dr. Grimes. "In addition to specialist physicians, we need more nurses, and speech, occupational and physical therapists with training in this area, as well as adequate palliative care for Parkinson patients."

The guideline, which draws upon recommendations from Scotland, the United Kingdom, the European Union and the United States, is focused on recommendations relevant to the Canadian health care system.

"The guideline provides evidence-based recommendations to improve the overall standard of care of individuals with Parkinson disease in Canada, not only for health care professionals, but also for policy-makers, patients themselves and their caregivers," writes Dr. Veronica Bruno, Department of Clinical Neurosciences, Movement Disorders Program and Hotchkiss Brain Institute, University of Calgary, Calgary, Alberta, and coauthor in a related commentary. "Managing the complexity of Parkinson disease requires clear, standardized procedures that can be used by all actors involved."

The guideline "represents a great effort to streamline the management of Parkinson disease across Canada," they write.

Credit: 
Canadian Medical Association Journal

Case management in primary care associated with positive outcomes

In a systematic review, researchers identified three characteristics of case management programs that consistently yielded positive results: case selection targeting frequent users with complex problems, high-intensity case management interventions and a multidisciplinary care plan. The review included data from 20 studies, 17 of them quantitative, of adult frequent users with chronic diseases in primary, secondary and tertiary care settings. Case management was delivered in a primary care setting in all of the studies. Health care system use, financial cost and patient outcomes were the primary outcomes assessed. All the case management interventions with positive outcomes included some method of identifying the patients most likely to benefit. Most interventions with positive outcomes were high intensity and included care plans developed by multidisciplinary teams. The authors suggest that policymakers and clinicians should focus on finding an appropriate method to identify patients most likely to benefit from case management. A high-intensity case management intervention and/or access to a multidisciplinary team may also improve outcomes.

Credit: 
American Academy of Family Physicians

High blood pressure treatment may slow cognitive decline

High blood pressure appears to accelerate cognitive decline among middle-aged and older adults, but treating high blood pressure may slow this decline, according to a preliminary study presented by researchers at Columbia University Mailman School of Public Health at the American Heart Association's Hypertension 2019 Scientific Sessions.

"The findings are important because high blood pressure and cognitive decline are two of the most common conditions associated with aging, and more people are living longer, worldwide," said L.H. Lumey, professor of Epidemiology at Columbia Mailman School and senior author. According to the American Heart Association's 2017 Hypertension Guidelines, high blood pressure affects approximately 80 million U.S. adults and one billion people globally. Moreover, the relationship between brain health and high blood pressure is a growing interest as researchers examine how elevated blood pressure affects the brain's blood vessels, which in turn, may impact memory, language, and thinking skills.

In this observational study, the researchers analyzed data collected on nearly 11,000 adults from the China Health and Retirement Longitudinal Study (CHARLS) between 2011 and 2015 to assess how high blood pressure and its treatment may influence cognitive decline. High blood pressure was defined as having a systolic blood pressure of 140 mmHg or higher and a diastolic blood pressure of 90 mmHg or higher, and/or taking antihypertensive treatment. Under the American Heart Association's guidelines, high blood pressure is defined as a systolic reading of 130 mmHg or higher or a diastolic reading of 80 mmHg or higher.
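
To make the two thresholds concrete, here is a small illustrative sketch (not code from the study) that classifies a reading under the study definition as stated above and under the AHA guideline definition:

    # Illustrative comparison of the two hypertension definitions above;
    # not code from the study.

    def hypertensive_study(systolic, diastolic, on_treatment=False):
        """Study definition as stated in the text: systolic >= 140 mmHg and
        diastolic >= 90 mmHg, and/or antihypertensive treatment."""
        return (systolic >= 140 and diastolic >= 90) or on_treatment

    def hypertensive_aha(systolic, diastolic):
        """AHA 2017 guideline: systolic >= 130 mmHg or diastolic >= 80 mmHg."""
        return systolic >= 130 or diastolic >= 80

    print(hypertensive_study(135, 85))  # False under the study definition
    print(hypertensive_aha(135, 85))    # True under the AHA guideline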

For the study in China, researchers interviewed participants at home about their high blood pressure treatment and education level, and noted whether they lived in a rural or urban environment. Participants were also asked to perform cognitive tests, such as immediately recalling words as part of a memory quiz. Among the study's findings:

Overall cognition scores declined over the four-year study.

Participants ages 55 and older who had high blood pressure showed a more rapid rate of cognitive decline compared with participants who were being treated for high blood pressure and those who did not have high blood pressure.

The rate of cognitive decline was similar between those taking high blood pressure treatment and those who did not have high blood pressure.

The study did not evaluate why or how high blood pressure treatments may have contributed to slower cognitive decline or if some treatments were more effective than others.

"We think efforts should be made to expand high blood pressure screenings, especially for at-risk populations, because so many people are not aware that they have high blood pressure that should be treated," said presenting study author Shumin Rui, a biostatistician at Columbia Mailman School. "This study focused on middle-aged and older adults in China, but we believe our results could apply to populations elsewhere as well. We need to better understand how high blood pressure treatments may protect against cognitive decline and look at how high blood pressure and cognitive decline are occurring together."

Credit: 
Columbia University's Mailman School of Public Health

Major environmental challenge as microplastics are harming our drinking water

Plastics in our waste streams are breaking down into tiny particles, with potentially catastrophic consequences for human health and our aquatic systems, finds research from the University of Surrey and Deakin's Institute for Frontier Materials.

Led by Dr Judy Lee and Marie Enfrin from the Department of Chemical and Process Engineering at the University of Surrey and Dr Ludovic Dumée at Deakin's Institute for Frontier Materials, the project investigated nano and microplastics in water and wastewater treatment processes. The team found that tiny pieces of plastic break down further during treatment processes, reducing the performance of treatment plants and impacting water quality. The study was published in the journal Water Research.

There has been substantial study of microplastics pollution, but the interaction of these particles with water and wastewater treatment processes had not been fully understood until now.

Approximately 300 million tons of plastic are produced globally each year, and up to 13 million tons of that is released into rivers and oceans, contributing to an accumulation projected to reach approximately 250 million tons by 2025. Since plastic materials are not generally degradable through weathering or ageing, this accumulation of plastic pollution in the aquatic environment is a major concern.

The research highlights the current difficulty in detecting the presence of nano and microplastics in treatment systems. In order to ensure water quality meets the required safety standards and to reduce threats to our ecosystems, new detection strategies are needed with the aim of limiting the number of nano and microplastics in water and wastewater treatment systems.

Dr Lee, Project Lead and Senior Lecturer at the University of Surrey, said: "The presence of nano and microplastics in water has become a major environmental challenge. Due to their small size, nano and microplastics can easily be ingested by living organisms and travel along water and wastewater treatment processes. In large quantities they impact the performance of water treatment processes by clogging up filtration units and increasing wear and tear on materials used in the design of water treatment units."

Credit: 
University of Surrey

New flying reptile species was one of largest ever flying animals

image: Cryodrakon boreas.

Image: 
David Maas

A newly identified species of pterosaur is among the largest ever flying animals, according to a new study from Queen Mary University of London.

Cryodrakon boreas, from the Azhdarchid group of pterosaurs (often incorrectly called 'pterodactyls'), was a flying reptile with a wingspan of up to 10 metres which lived during the Cretaceous period around 77 million years ago.

Its remains were discovered 30 years ago in Alberta, Canada, but palaeontologists had assumed they belonged to an already known species of pterosaur discovered in Texas, USA, named Quetzalcoatlus.

The study, published in the Journal of Vertebrate Paleontology, reveals it is actually a new species and the first pterosaur to be discovered in Canada.

Dr David Hone, lead author of the study from Queen Mary University of London, said: "This is a cool discovery, we knew this animal was here but now we can show it is different to other azhdarchids and so it gets a name."

Although the remains - consisting of a skeleton that includes parts of the wings, legs, neck and a rib - were originally assigned to Quetzalcoatlus, study of this and additional material uncovered over the years shows it is a different species, in light of the growing understanding of azhdarchid diversity.

The main skeleton is from a young animal with a wingspan of about 5 metres but one giant neck bone from another specimen suggests an adult animal would have a wingspan of around 10 metres.

This makes Cryodrakon boreas comparable in size to other giant azhdarchids including the Texan Quetzalcoatlus which could reach 10.5 m in wingspan and weighed around 250 kg.

Like other azhdarchids, these animals were carnivorous and preyed predominantly on small animals, likely including lizards, mammals and even baby dinosaurs.

Dr Hone added: "It is great that we can identify Cryodrakon as being distinct to Quetzalcoatlus as it means we have a better picture of the diversity and evolution of predatory pterosaurs in North America."

Unlike most pterosaur groups, azhdarchids are known primarily from terrestrial settings and, despite their likely capacity to cross oceanic distances in flight, they are broadly considered to be animals that were adapted for, and lived in, inland environments.

Despite their large size and a distribution across North and South America, Asia, Africa and Europe, few azhdarchids are known from more than fragmentary remains. This makes Cryodrakon an important animal since it has very well preserved bones and includes multiple individuals of different sizes.

Credit: 
Queen Mary University of London

Study shows shorter people are at higher risk of type 2 diabetes

Short stature is associated with a higher risk of type 2 diabetes, according to a new study in Diabetologia (the journal of the European Association for the Study of Diabetes). Tall stature is associated with a lower risk, with each 10cm difference in height associated with a 41% decreased risk of diabetes in men and a 33% decreased risk in women.

The increased risk in shorter individuals may be due to higher liver fat content and a less favourable profile of cardiometabolic risk factors, say the authors that include Dr Clemens Wittenbecher and Professor Matthias Schulze, of the German Institute of Human Nutrition Potsdam-Rehbruecke, Germany, and colleagues.

Short stature has been linked to higher risk of diabetes in several studies, suggesting that height could be used to predict the risk for the condition. It has been reported that insulin sensitivity and beta cell function are better in taller people. Short stature is related to higher cardiovascular risk, a risk that might in part be mediated by cardiometabolic risk factors relevant to type 2 diabetes - for example blood pressure, blood fats and inflammation.

This new study used data obtained in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam, a study that included 27,548 participants - 16,644 women aged between 35 and 65 years and 10,904 men aged between 40 and 65 years - recruited from the general population of Potsdam, Germany, between 1994 and 1998.

A variety of physical data were collected from participants, including body weight, total body height and sitting height (with leg length calculated as the difference between the two), waist circumference and blood pressure. For this study, a sub-cohort of 2,500 participants (approximately 10%) was randomly selected to be representative of the full study.

Participants who already had diabetes or were lost to follow-up were excluded, leaving 2,307 for analysis. In addition, 797 participants from the full cohort who went on to develop type 2 diabetes were included. An investigation of potential mediating factors was carried out for 2,662 of these participants (2,029 sub-cohort members and 698 diabetes cases).

The study found that the risk of future type 2 diabetes was lower by 41% for men and 33% for women for each 10cm larger height, when adjusted for age, potential lifestyle confounders, education and waist circumference.

The association of height with diabetes risk appeared to be stronger among normal-weight individuals, with an 86% lower risk per 10cm larger height in men, and 67% lower risk per 10cm larger height in women. In overweight/obese individuals, each 10cm larger height was associated with diabetes risk being 36% lower for men and 30% lower for women. The authors say: "This may indicate that a higher diabetes risk with larger waist circumference counteracts beneficial effects related to height, irrespective of whether larger waist circumference is due to growth or due to consuming too many calories."
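
To unpack the "X% lower risk per 10cm" phrasing: such estimates usually come from a regression coefficient on height, exponentiated to a risk or hazard ratio. A small arithmetic sketch (illustrative, not the study's model output):

    import math

    # "41% lower risk per 10cm" corresponds to a risk/hazard ratio of 0.59
    # per 10cm, i.e. a regression coefficient of ln(0.59) per 10cm of height.
    ratio_per_10cm = 1 - 0.41            # 0.59
    beta_per_cm = math.log(ratio_per_10cm) / 10

    # The same coefficient rescaled to 5cm of additional height:
    print(f"ratio per 10cm: {ratio_per_10cm:.2f}")
    print(f"ratio per 5cm:  {math.exp(beta_per_cm * 5):.2f}")  # ~0.77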

Larger leg length was associated with a lower risk of diabetes. A slight sex difference was noted - for men a larger sitting height at the cost of leg length related to increased risk, whilst amongst women both leg length and sitting height contributed to lower risk. The authors suggest that, among boys, growth before puberty, which relates more strongly to leg length, will have a more favourable impact on later diabetes risk than growth during puberty (assuming that truncal bones are the last to stop growing). For girls both growth periods seem to be important.

The authors also calculated to what extent the inverse associations of height and height components with type 2 diabetes risk are explainable by liver fat (measured as Fatty Liver index) and other cardiometabolic risk factors. When the results were adjusted for liver fat content, the men's reduced risk of diabetes per 10cm larger height was 34% (compared with 40% in the overall results), and the women's reduced risk was just 13% compared with 33% in the overall results.

Other biomarkers also affected the results: in men adjustment for glycated haemoglobin (a measure of blood sugar) and blood fats each reduced the risk difference by about 10%. In contrast, among women adjustment for adiponectin (a hormone involved in blood sugar control) (-30%) and C-reactive protein (a marker of inflammation) (-13%) reduced the associations of height with diabetes, in addition to the reductions observed by glycated haemoglobin and blood fats. Taken together, the authors say that a large proportion of the reduced risk attributable to increased height is related to taller people having lower liver fat and a 'healthier' cardiometabolic profile.

The authors say: "Our findings suggest that short people might present with higher cardiometabolic risk factor levels and have higher diabetes risk compared with tall people...

These observations corroborate that height is a useful predictive marker for diabetes risk and suggest that monitoring of cardiometabolic risk factors may be more frequently indicated among shorter persons, independent of their body size and composition. Specifically, liver fat contributes to the higher risk among shorter individuals and, because height appears to be largely unmodifiable during adulthood, interventions to reduce liver fat may provide alternative approaches to reduce risk associated with shorter height."

However they add: "Our study also suggests that early interventions to reduce height-related metabolic risk throughout life likely need to focus on determinants of growth in sensitive periods during pregnancy, early childhood, puberty and early adulthood, and should take potential sex-differences into account."

They conclude: "We found an inverse association between height and risk of type 2 diabetes among men and women, which was largely related to leg length among men. Part of this inverse association may be driven by the associations of greater height with lower liver fat content and a more favourable profile of cardiometabolic risk factors, specifically blood fats, adiponectin and C-reactive protein."

Credit: 
Diabetologia

Researchers find regulator of first responder cells to brain injury

image: Astrocyte, the most abundant cell type in the brain.

Image: 
Wikimedia Commons/Gerry Shaw

Astrocytes are the most abundant cells in the brain, yet there is still much to learn about them. For instance, it is known that when the brain is injured or diseased, astrocytes are the first responders. They become reactive and play roles that can be both beneficial and deleterious, but little is known about how these diverse responses to injury are regulated. Working with mouse models, a multi-institutional group led by researchers at Baylor College of Medicine has identified nuclear factor I-A (NFIA) as a central regulator of both the generation and activity of reactive astrocytes.

Unexpectedly, NFIA's role seems to depend on the type of injury and on the region of the central nervous system where the injury occurs. The report also begins to define the molecular mechanisms involved, and shows that NFIA also is abundant in reactive astrocytes found in human pediatric and adult neurological injuries, suggesting that NFIA may play similar roles in people. The study appears in The Journal of Clinical Investigation.

"Reactive astrocytes are associated with most forms of neurological disorders, from acute injury to degeneration, but their contributions to disease are only now coming to light," said corresponding author Dr. Benjamin Deneen, professor of neurosurgery and of the Center for Stem Cell and Regenerative Medicine at Baylor.

Looking to better understand the roles these important cells play in neurological disorders, Deneen and his colleagues looked into NFIA, a known regulator of astrocyte development, to determine its role in the generation and regulation of reactive astrocytes.

First, they determined that NFIA is abundant in human pediatric and adult reactive astrocytes found in a host of neurological injuries. Then, to explore the role NFIA plays in the response of reactive astrocytes to injury, the researchers turned to mouse models. They generated mice in which NFIA was specifically eliminated in astrocytes, and compared the reactive astrocyte response of these NFIA-deficient mice to that of mice with NFIA after different types of neurological injury.

"The results were surprising," said Deneen, Dr. Russell J. and Marian K. Blattner Chair and member of the Dan L Duncan Comprehensive Cancer Center at Baylor. "Until now, it was thought that regardless of the type of injury or where it occurred in the central nervous system, reactive astrocytes would respond in the same way. Knocking out NFIA allowed us to uncover a previously unknown layer of functional diversity in reactive astrocytes."

When white matter injuries occurred in the spinal cord of NFIA-deficient mice, reactive astrocytes were generated and migrated toward the injury, but were not able to remodel the injured blood-brain barrier as well as the reactive astrocytes of normal mice did. Consequently, the white matter was not repaired.

But when the researchers tested the response to a different form of injury in another region of the central nervous system, a stroke in the cerebral cortex, they observed something much different. While normal mice (with NFIA) responded to stroke by producing reactive astrocytes that migrated toward the injury to repair the bleeding, NFIA-deficient mice did not generate reactive astrocytes and the injury was not healed. In both cases, in the spinal cord and in the cerebral cortex, the injury was not properly repaired, but the underlying reasons for this were drastically different.

"These findings suggest that NFIA's function in reactive astrocytes is dependent upon the type of injury and brain region in which the injury occurs. In the cerebral cortex, NFIA is crucial for making reactive astrocytes, while in the spinal cord NFIA is important for sealing off leaking blood vessels. These results hint at an extensive reservoir of reactive astrocyte responses that vary based on form and location of injury," Deneen said.

In addition, the researchers began to define the molecular mechanisms underpinning the generation of reactive astrocytes. They found a direct connection between NFIA and thrombospondin 4. NFIA directly regulates the production of thrombospondin 4, a factor that had been previously identified in the lab of co-author Dr. Chay T. Kuo, associate professor of cell biology and neurobiology at Duke University, as an essential regulator of the generation of reactive astrocytes.

"Although our study was conducted in mice and much more research is needed, we think our findings may reflect what occurs in people, as NFIA also is abundantly present in reactive astrocytes in both pediatric and adult neurological injuries," Deneen said. "We also are interested in investigating the role of NFIA in reactive astrocytes in neurodegenerative diseases such as Alzheimer's and Parkinson's disease, as it's entirely possible it has a completely different set of functions in these diseases."

Credit: 
Baylor College of Medicine

Why don't the drugs work? Controlling inflammation can make antidepressants more effective

Research shows that controlling inflammation may be key to helping the brain develop the flexibility to respond to antidepressant drugs, potentially opening the way for treatment for many millions of people who do not respond to the drugs. This is experimental work in mice and has not yet been confirmed in humans. The findings are presented together for the first time at the ECNP Congress in Copenhagen, following a series of publications in peer-reviewed journals.

Group leader, Professor Igor Branchi (Istituto Superiore di Sanità, Rome), said:

"If confirmed in humans, these results may have fairly far-reaching implications. The work shows that neuroplasticity and inflammation are interdependent, and that to provide the right conditions for the antidepressant to work, inflammation need to be tightly controlled".

Depression is the leading cause of disability worldwide, and is a major burden on society. It is commonly treated with drugs called selective serotonin reuptake inhibitors (SSRIs). Unfortunately, SSRIs don't work for around one-third of people, meaning that these patients have very limited options for treatment.

Antidepressants are known to increase the ability of the brain to form new connections, a process called neuroplasticity. They have also been shown to help control levels of brain inflammation. In a new study, Italian scientists set out to test whether there would be an optimum balance between neuroplasticity and inflammation.

In a first study, the scientists fed mice the SSRI antidepressant fluoxetine (Prozac) to increase the neuroplasticity of their brains and found the mice had a change in the expression of inflammatory markers. When they then housed the mice for three weeks in a stressful environment to cause inflammation, giving the mice fluoxetine caused the inflammation to decrease. By contrast, when the mice were exposed to a relaxing environment, known to reduce inflammation, feeding them fluoxetine resulted in higher activity in genes associated with inflammation.

"The first step was to link the brain's ability to deal with change, the neuroplasticity, to inflammation", said lead researcher Dr Silvia Poggini (Istituto Superiore di Sanità, Rome). "Once we had shown that, the next step was to change the levels of the inflammation to see what happened to plasticity".

This led to a second study, where the researchers treated the mice either with lipopolysaccharide, a drug which increases inflammation, or with ibuprofen, which decreases inflammation. By doing this they were able to change the level of inflammation, rather like increasing or decreasing the volume of music. While doing this, they measured plasticity markers in the mice, to see whether changes in inflammation increased or decreased neural plasticity.

Dr Poggini continued,

"We found that neural plasticity in the brain was high* as long as we were able to keep inflammation under control. But both too high and too low inflammation levels meant that the neural plasticity was reduced - in line with the reduced efficacy of antidepressants in mice with altered levels of inflammation".

When the mice were tested for depression-like responses such as anhedonia and cognitive bias, the mice showed behavioural changes which reflected the expected balance between plasticity and inflammation.

Igor Branchi added:

"If the results can be translated to humans, then controlling inflammation might lead to more effective use of antidepressants. This may be done by drugs, but we may also consider preventing high inflammation arising in the first place, which may lead us to look at other parameters which lead to the stress which causes this problem. More generally, this work shows us that SSRI antidepressants are not one-size-fits-all drugs, and that we should look at other options for improving drug response".

Professor Carmine M. Pariante, Professor of Biological Psychiatry at King's College London, says: "This paper really provides new insight into the relationship between the brain, inflammation, and antidepressant response. We have known for some time that depressed patients with increased inflammation do not respond to antidepressant treatment, but this study finally proposes the biological mechanisms underpinning these effects. Most importantly, the study also proposes that low inflammation can be equally harmful in depressed patients, also preventing response to antidepressants, a finding that, if replicated in humans, will have important clinical implications."

Professor Pariante was not involved in this work; his is an independent comment.

Credit: 
European College of Neuropsychopharmacology

'Building blocks' of bird calls resemble human languages

image: Chestnut-crowned babbler.

Image: 
Niall Stopford

New study sheds light on whether animal vocalisations, like human words, are constructed from smaller building-blocks.

Findings raise the exciting possibility that the capacity to generate meaning from meaningless building blocks is widespread in animals.

"To our knowledge, this is the first time that the meaning-generating building blocks of a non-human communication system have been experimentally identified", says Prof Simon Townsend of the University of Warwick and Zurich

The 'building blocks' of bird calls resemble those of human languages, the new research has found.

Through analysis of the calls of the chestnut-crowned babbler - a highly social bird from the Australian Outback - the researchers claim to have gained new insight into the evolution of human language.

Human languages are composed of meaningful words, which themselves are built from different combinations of meaningless sounds. This new research into bird calls, or vocalisations, found that they can, like human languages, also be decomposed into perceptibly distinct meaningless sounds, or 'building blocks'.

Previous research demonstrated that chestnut-crowned babbler calls seemed to be composed of two different sounds "A" and "B" in different arrangements when performing specific behaviours.

When flying, the birds produced a flight call "AB", but when feeding chicks in the nest they emitted "BAB" provisioning calls. In the current study, the authors used playback experiments, previously used to test speech-sound discrimination in human infants, to probe the perception of the sound elements in babblers.
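
The combinatorial point is easy to see when the calls are written out as sequences of shared elements. In the toy sketch below, the 'A'/'B' labels follow the paper's notation; everything else is illustrative:

    # Toy illustration: two calls built from the same two meaningless
    # elements, differing only in arrangement.
    flight_call = ("A", "B")
    provisioning_call = ("B", "A", "B")

    print(set(flight_call) == set(provisioning_call))  # True: same building blocks
    print(flight_call != provisioning_call)            # True: different arrangements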

Co-author Prof Simon Townsend of the University of Warwick and University of Zurich said:

"To our knowledge, this is the first time that the meaning-generating building blocks of a non-human communication system have been experimentally identified".

He concluded: "Although the building blocks in the babbler system may be of a very simple kind, it might still help us understand how combinatoriality initially evolved in humans".

Commenting on the research, lead author Dr Sabrina Engesser of the University of Zurich said:

"Through systematic comparisons we tested which of the elements babblers perceived as equivalent or different sounds. In doing so, we were able to confirm that the calls could be broken up into two perceptually distinct sounds that are shared across the calls in different arrangements.

"Furthermore, none of comprising elements carried the meaning of the calls confirming the elements are meaningless" she added.

"This system is reminiscent of the way humans use sounds to form meaningful words" co-author Prof Andy Russell from the University of Exeter explained.

These findings raise the exciting possibility that the capacity to generate meaning from meaningless building blocks is widespread in animals, but the authors caution there are still considerable differences between such systems and word generation in language.

They emphasize that a focus on the acoustic distinctiveness of sounds in meaningful animal vocalisations offers a promising approach to investigate the building blocks of non-human animal communication systems.

Credit: 
University of Warwick

Teeth offer vital clues about diet during the Great Irish Famine

image: Researchers analysed calculus on teeth from the human remains of 42 people, aged approximately 13 years and older, who died in the Kilkenny Union Workhouse.

Image: 
Image courtesy of Jonny Geber/University of Edinburgh

Scientific analysis of dental calculus - plaque build-up - of the Famine's victims found evidence of corn (maize), oats, potato, wheat and milk foodstuffs.

Surprisingly, they also discovered egg protein in the calculus of three people - more associated with diets of non-labouring or better off social classes at the time.

Researchers analysed calculus on teeth from the human remains of 42 people, aged approximately 13 years and older, who died in the Kilkenny Union Workhouse and were buried in mass burial pits on its grounds.

The workhouse pits were discovered in 2005 and were found to contain the remains of nearly 1,000 people.

In the mid-19th century an estimated one million people died when a devastating famine hit Ireland after the potato crop failed in successive years.

Potatoes and milk were virtually the only source of food for a vast proportion of the population.

Many people were forced to seek refuge in the workhouses during the Famine, where they received meagre rations of food and shelter in return for work.

Researchers examined samples of calculus for microparticles and protein content linked to foodstuffs.

The microparticles showed a dominance of corn, as well as evidence of oats, potato and wheat.

The corn came from so-called Indian meal, which was imported in vast amounts to Ireland from the United States as relief food for the starving populace.

Analysis of the protein content identified milk, as well as the occasional presence of egg.

The study is a collaboration between researchers from the Universities of Edinburgh, Harvard, Otago in New Zealand, York, Zurich, and the Max Planck Institute for the Science of Human History in Germany.

One of the lead researchers, Dr Jonny Geber of the University of Edinburgh's School of History, Classics and Archaeology, said: "The results of this study are consistent with the historical accounts of the Irish labourer's diet before and during the Famine. They also show how the notoriously monotonous potato diet of the poor was opportunistically supplemented by other foodstuffs, such as eggs and wheat, when these were made available to them.

"The Great Irish Famine was one of the worst subsistence crises in history but it was foremost a social disaster induced by the lack of access to food and not the lack of food availability."

Credit: 
University of Edinburgh